#langchain #langchaintutorial #ai #apikey #openai #llm #chatbot #chatmodel
📑 Useful Links:
LangChain Docs: https://www.langchain.com
Source Code: https://github.com/rehmat11872/Langchain-t...
In this quickstart we'll show you how to:
Get set up with LangChain, LangSmith and LangServe
Use the most basic and common components of LangChain: prompt templates, models, and output parsers
Use LangChain Expression Language, the protocol that LangChain is built on and which facilitates component chaining
Build a simple application with LangChain
Trace your application with LangSmith
Serve your application with LangServe
That's a fair amount to cover! Let's dive in.
Setup
Jupyter Notebook
This guide (and most of the other guides in the documentation) uses Jupyter notebooks and assumes the reader is using them as well. Jupyter notebooks are perfect for learning how to work with LLM systems because things often go wrong (unexpected output, the API being down, etc.), and working through guides in an interactive environment is a great way to understand them better.
You do not NEED to go through the guide in a Jupyter notebook, but it is recommended. See the Jupyter documentation for installation instructions.
Installation
To install LangChain run:
Pip:
pip install langchain
Conda:
conda install langchain -c conda-forge
For more details, see our Installation guide.
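To confirm the install worked inside your active virtual environment, a quick check like the following is enough (the version printed will depend on when you install):

import langchain  # should import without errors in the environment you just installed into

print(langchain.__version__)  # prints the installed LangChain version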
LangSmith
Many of the applications you build with LangChain will contain multiple steps with multiple LLM calls. As these applications get more and more complex, it becomes crucial to be able to inspect what exactly is going on inside your chain or agent. The best way to do this is with LangSmith.
Note that LangSmith is not required, but it is helpful. If you do want to use LangSmith, after you sign up, make sure to set these environment variables to start logging traces:
export LANGCHAIN_TRACING_V2="true"
export LANGCHAIN_API_KEY="..."
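If you prefer to set these from Python rather than exporting them in your shell (for example inside the Jupyter notebook used in this guide), a minimal sketch looks like this; it only affects the current process, and you supply your own LangSmith API key when prompted:

import os
import getpass

# Enable LangSmith tracing for this process and provide the API key interactively.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("LangSmith API key: ")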
Building with LangChain
LangChain enables building applications that connect external sources of data and computation to LLMs. In this quickstart, we will walk through a few different ways of doing that. We will start with a simple LLM chain, which relies only on information in the prompt template to respond. Next, we will build a retrieval chain, which fetches data from a separate database and passes that into the prompt template. We will then add in chat history to create a conversation retrieval chain, which lets you interact with the LLM in a chat manner so it remembers previous questions. Finally, we will build an agent, which uses an LLM to decide whether or not it needs to fetch data to answer a question. We will cover these at a high level, but there are a lot of details to all of these! We will link to relevant docs.
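As a taste of the simple LLM chain, here is a minimal sketch using an OpenAI chat model; it assumes you have installed the langchain-openai package (pip install langchain-openai) and exported OPENAI_API_KEY:

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Prompt template -> model -> output parser, chained with the | operator (LCEL).
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful technical assistant."),
    ("user", "{input}"),
])
llm = ChatOpenAI(model="gpt-3.5-turbo")
output_parser = StrOutputParser()

chain = prompt | llm | output_parser

# Invoke the chain with the variables the prompt template expects.
print(chain.invoke({"input": "How can LangSmith help with testing?"}))

The | operator here is LangChain Expression Language at work: each component is a Runnable, and piping them together produces the chain that is invoked above.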
If you enjoyed the video, 🙌😊 please share it with others and hit the subscribe button for more insightful content! 🚀
********************Follow me*********************
Connect with me.🌐
LinkedIn: https://www.linkedin.com/in/rehmat-qadeer-...
GitHub: https://github.com/rehmat11872
Keep watching for more exciting content! 🎥💻
Thanks a bunch for your support! 👍 Keep watching and stay tuned for more exciting updates! 📺😊 #TechEnthusiast #SubscribeNow