h2ogpt: Another Open-source large language model by H2O.ai team

Published: 01 January 1970
Channel: 650 AI Lab
3,532 views
68 likes

Another fully open-source large language model from the H2O.ai team, with 12B and 20B parameters and trained on the open-source Pile dataset, has been released with the following features:
Open-source repository with fully permissive, commercially usable code, data and models
Code for preparing large open-source datasets as instruction datasets for fine-tuning of large language models (LLMs), including prompt engineering
Code for fine-tuning large language models (currently up to 20B parameters) on commodity hardware and enterprise GPU servers (single or multi node)
Code for enabling LoRA (low-rank adaptation) and 8-bit quantization for memory-efficient fine-tuning and generation
Code to run a chatbot on a GPU server, with shareable end-point with Python client API
Code to evaluate and compare the performance of fine-tuned LLMs
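The LoRA technique mentioned above freezes the pretrained weight matrix and learns only a small low-rank update, which is what makes fine-tuning on commodity hardware feasible. The following is a minimal NumPy sketch of that idea (the matrix shapes and rank are illustrative assumptions, not h2ogpt's actual code or configuration):

```python
import numpy as np

# LoRA freezes the base weight W and trains only a low-rank pair (B, A),
# so the effective weight is W + B @ A with rank r << min(d_out, d_in).
# Illustrative sketch only; sizes and rank are assumptions.

rng = np.random.default_rng(0)
d_out, d_in, r = 1024, 1024, 8

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))                   # trainable up-projection, zero init
                                           # => adapter starts as a no-op

def adapted_forward(x):
    # Compute W @ x plus the low-rank correction without ever
    # materializing the full updated weight matrix.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
full_params = W.size          # parameters a full fine-tune would update
lora_params = A.size + B.size # parameters LoRA actually trains
print(f"trainable: {lora_params} vs full: {full_params} "
      f"({lora_params / full_params:.2%})")
```

With rank 8 on a 1024x1024 layer, the adapter trains about 1.6% of the original parameters, which is why it pairs well with 8-bit quantization of the frozen base weights.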

== Video Timeline ==
(00:00) Content Intro
(00:15) Introducing H2O
(01:30) H2O.ai Intro
(03:05) h2ogpt
(04:55) h2ogpt Chatbot
(09:35) h2ogpt Details
(10:40) H2O llmstudio Intro
(13:20) Conclusion

=== Resources ===
https://h2o.ai/
https://h2o.ai/events/h2o-world/
https://github.com/h2oai/h2ogpt
https://github.com/h2oai/h2o-llmstudio
https://huggingface.co/spaces/h2oai/h...
https://huggingface.co/spaces/h2oai/h...
https://www.kaggle.com/code/philippsi...

Please visit:
https://prodramp.com | @prodramp

Content Creator:
Avkash Chauhan (@avkashchauhan)

Tags:
#stablelm #stableai #finetunellm #openai #python #ai #langchain #chromadb
