Discover how to improve text2cypher, the generation of Cypher queries from natural language with Large Language Models (LLMs). We'll explore in-context learning, including few-shot examples and dynamic prompting with LangChain, and then dive into fine-tuning techniques such as PEFT and LoRA, covering dataset preparation and training with Unsloth. This session is for anyone refining LLMs for precise and efficient data retrieval from Neo4j.
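As a taste of the dynamic prompting idea covered in the session, here is a minimal sketch (not taken from the talk) of selecting few-shot Cypher examples by semantic similarity with LangChain. The example questions, Cypher queries, and schema string are placeholders you would replace with your own graph's examples.

```python
# Hypothetical sketch: dynamic few-shot prompting for text2cypher with LangChain.
# The example pairs and schema string below are placeholders, not from the talk.
from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate
from langchain_core.example_selectors import SemanticSimilarityExampleSelector
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

examples = [
    {"question": "How many movies did Tom Hanks act in?",
     "cypher": "MATCH (p:Person {name: 'Tom Hanks'})-[:ACTED_IN]->(m:Movie) RETURN count(m)"},
    {"question": "Which actors appeared in 'The Matrix'?",
     "cypher": "MATCH (p:Person)-[:ACTED_IN]->(m:Movie {title: 'The Matrix'}) RETURN p.name"},
]

# Instead of a static few-shot block, pick the k most semantically similar
# examples for each incoming question.
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples, OpenAIEmbeddings(), FAISS, k=2
)

example_prompt = PromptTemplate.from_template(
    "Question: {question}\nCypher: {cypher}"
)

prompt = FewShotPromptTemplate(
    example_selector=example_selector,
    example_prompt=example_prompt,
    prefix="You translate questions into Cypher for this schema:\n{schema}",
    suffix="Question: {question}\nCypher:",
    input_variables=["schema", "question"],
)

print(prompt.format(schema="(:Person)-[:ACTED_IN]->(:Movie)",
                    question="Who directed 'Inception'?"))
```

For the fine-tuning half, a rough outline of the LoRA path with Unsloth and TRL might look like the following. The base model name, dataset file, and hyperparameters are illustrative only; check the Unsloth and TRL documentation for the exact argument names your installed versions expect.

```python
# Hedged sketch: LoRA (PEFT) fine-tuning for text2cypher with Unsloth + TRL.
# Model name, dataset path, and hyperparameters are placeholders.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # placeholder base model
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Assumed dataset: a JSONL file with a "text" column of formatted
# question/Cypher training pairs (e.g. built from text2cypher examples).
dataset = load_dataset("json", data_files="text2cypher_train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```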
Guest: Geraldus Wilsen
LinkedIn: geraldus-wilsen
@geralduswilsen
Github: https://github.com/projectwilsen/neo4...
Tomaz Github: https://github.com/neo4j-labs/text2cy...
Blog: / geraldus-wilsen_how-to-fine-tune-llms-usin...
Few-Shot Prompting: https://blog.langchain.dev/few-shot-p...
llama 3.1 405b: https://build.nvidia.com/meta/llama-3...
0:00 Introduction
1:27 Importance of understanding Cypher
3:20 Introduction of the guest, Geraldus Wilsen
7:01 Inspiration for the talk
9:02 Overview of text2cypher
10:56 Explanation of in-context learning and the role of few-shot learning
12:33 Enhancing text2cypher with in-context learning and fine-tuning
17:50 Q&A break
23:40 text2cypher Demo
#neo4j #graphdatabase #genai #llm #graphrag