Faster LLM Function Calling — Dynamic Routes

Published: 15 January 2024
Channel: James Briggs
10,552 views
261 likes

LLM function calling can be slow, particularly for AI agents. Using Semantic Router's dynamic routes, we can make this much faster and scale to thousands of tools and functions. Here we see how to use it with OpenAI's GPT-3.5 Turbo, but the library also supports Cohere and Llama.cpp for local deployments.

In Semantic Router there are two types of routes to choose from. Both are built from the Route object; the only difference between them is that a static route returns just its Route.name when chosen, whereas a dynamic route also uses an LLM call to produce parameter input values.
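As a rough illustration of the idea (plain Python, not the actual semantic-router API), a static route carries only a name, while a dynamic route also holds a callable that asks an LLM for parameter values. Here the LLM and the semantic matching are both stubbed: the real library matches by embedding similarity, not keyword overlap.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Route:
    name: str
    # Example utterances the router matches a query against (semantic
    # similarity in the real library; naive word overlap in this sketch).
    utterances: list
    # Dynamic routes additionally carry an LLM-backed parameter generator.
    generate_params: Optional[Callable[[str], dict]] = None

def choose_route(query: str, routes: list):
    """Pick the route whose utterances share the most words with the query."""
    words = set(query.lower().split())
    best = max(routes, key=lambda r: max(
        len(words & set(u.lower().split())) for u in r.utterances))
    if best.generate_params is None:
        return best.name                                # static: name only
    return best.name, best.generate_params(query)       # dynamic: name + params

# Stand-in for the LLM call a dynamic route would make.
def fake_llm(query: str) -> dict:
    return {"code": "import math; output = math.sqrt(64)"}

static = Route(name="chitchat", utterances=["how are you", "nice weather"])
dynamic = Route(name="math", utterances=["what is the square root of"],
                generate_params=fake_llm)

print(choose_route("nice weather today", [static, dynamic]))
print(choose_route("what is the square root of 64", [static, dynamic]))
```

The point of the sketch is the return-type difference: the static route hands back a bare name, the dynamic route a name plus generated values.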

For example, a static route can tell us that a query is about mathematics simply by returning the route name (which could be "math", for example). A dynamic route can generate additional values: it may decide a query is about maths, but it can also generate Python code that we can later execute to answer the user's query. That output might look like "math", "import math; output = math.sqrt(64)".
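Once the dynamic route hands back a code string like that, running it and reading the result is straightforward. This is a simplified sketch: in practice you would sandbox or validate anything an LLM generates before executing it.

```python
# A code string as a dynamic route might return it (hypothetical output).
generated = "import math; output = math.sqrt(64)"

# Execute it in an isolated namespace, then read back the `output` variable.
namespace = {}
exec(generated, namespace)
print(namespace["output"])  # 8.0
```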

⭐ GitHub Repo:
https://github.com/aurelio-labs/seman...

📌 Code:
https://github.com/aurelio-labs/seman...

🔥 Semantic Router Course:
https://www.aurelio.ai/course/semanti...

👋🏼 AI Consulting:
https://aurelio.ai

👾 Discord:
  / discord  

Twitter:   / jamescalam  
LinkedIn:   / jamescalam  

00:00 Fast LLM Function Calling
00:56 Semantic Router Setup for LLMs
02:20 Function Calling Schema
04:04 Dynamic Routes for Function Calling
05:51 How we can use Faster Agents
