LLM vs RAG vs Tokens: Ollama, the local LLM manager for GenAI

Published: 29 June 2024
Channel: Bret Fisher Cloud Native DevOps
12,136 views · 1k likes

Watch the whole episode about Ollama: • Local GenAI LLMs with Ollama and Dock...

Matt Williams
============
  / technovangelist  

Nirmal Mehta
============
  / nirmalkmehta  
  / normalfaults  
https://hachyderm.io/@nirmal

Bret Fisher
===========
  / bretefisher  
  / bretfisher  
https://www.bretfisher.com

Join my Community 🤜🤛
================
💌 Weekly newsletter on upcoming guests and stuff I'm working on: https://www.bretfisher.com/newsletter/
💬 Join the discussion on our Discord chat server   / discord  
👨‍🏫 Coupons for my Docker and Kubernetes courses https://www.bretfisher.com/courses/
🎙️ Podcast of this show https://www.bretfisher.com/podcast

