How To Run ANY Open Source LLM LOCALLY In Linux

Published: 11 June 2024
on channel: Ksk Royal
5,809 views
179 likes

In this video, I will show you how to run ANY open source LLM (large language model) locally on Linux using Ollama & LMStudio. Ollama & LMStudio are the best tools that allow you to run various models such as Llama 3, Gemma, Mistral, CodeLlama & many more. Watch this video and learn how to run LLMs locally on a Linux computer.
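
For reference, a minimal sketch of the typical Ollama workflow covered in the video (the model names llama3 and gemma are the ones demonstrated; any model from the Ollama library works the same way):

    # Install Ollama on Linux using the official install script
    curl -fsSL https://ollama.com/install.sh | sh

    # Pull and run a model interactively in the terminal
    ollama run llama3
    ollama run gemma

    # List the models downloaded on your machine
    ollama list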

Timestamps
00:00 Introduction
00:38 Prerequisites
01:16 Installing Ollama
02:18 Download LLM
03:01 Testing LLAMA3 & Gemma
05:31 Customizing Model
06:55 Installing LMStudio
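
The Customizing Model step (05:31) is typically done with an Ollama Modelfile. Below is a minimal sketch; the model name "my-assistant", the temperature value, and the system prompt are illustrative examples, not taken from the video:

    # Modelfile: build a custom model on top of llama3
    FROM llama3
    # Make responses a bit more deterministic
    PARAMETER temperature 0.3
    # Give the model a persistent system prompt
    SYSTEM "You are a concise Linux assistant."

Then create and run the custom model:

    ollama create my-assistant -f Modelfile
    ollama run my-assistant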

Download
Ollama: https://ollama.com/download
LMStudio: https://lmstudio.ai/

Relevant Tech Videos
Dual Boot Ubuntu 24.04 LTS and Windows 11 - • How to Dual Boot Ubuntu 24.04 LTS and...
Clean Install Ubuntu 24.04 LTS - • How TO Install Ubuntu 24.04 LTS EASIL...
Install Ubuntu 24.04 LTS on VirtualBox - • How To Install Ubuntu 24.04 LTS in Vi...


~ Buy Me A Coffee - http://buymeacoffee.com/kskroyal
~ Connect On Instagram - @KSKROYALTECH
~ For Business Enquiries ONLY - [email protected]
~ My Website - https://kskroyal.com/

© KSK ROYAL 
    MereSai

