h2ogpt: Another Open-source large language model by H2O.ai team

Channel: 650 AI Lab
Views: 3,532 · Likes: 68

Another fully open-source large language model from the H2O.ai team, with 12B and 20B parameter variants trained on the open-source Pile dataset, has been released with the following features:
Open-source repository with fully permissive, commercially usable code, data and models
Code for preparing large open-source datasets as instruction datasets for fine-tuning of large language models (LLMs), including prompt engineering (a toy prompt-formatting sketch follows this list)
Code for fine-tuning large language models (currently up to 20B parameters) on commodity hardware and enterprise GPU servers (single- or multi-node)
Code for enabling LoRA (low-rank adaptation) and 8-bit quantization for memory-efficient fine-tuning and generation (see the loading sketch after this list)
Code to run a chatbot on a GPU server, with a shareable endpoint and a Python client API (see the client example after this list)
Code to evaluate and compare the performance of fine-tuned LLMs
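
To make the instruction-dataset item above concrete, here is a toy sketch of the kind of prompt formatting such data preparation involves. The <human>/<bot> template is a guess at a typical conversational format, not taken from h2ogpt's actual data-prep code:

```python
# Hypothetical sketch: turn raw (instruction, response) pairs into the kind
# of single-string training examples used for instruction fine-tuning.
# The <human>:/<bot>: template is an assumed conversational format.
def format_example(instruction: str, response: str) -> str:
    """Render one instruction/response pair as a training prompt."""
    return f"<human>: {instruction.strip()}\n<bot>: {response.strip()}"

raw_pairs = [
    ("What is the Pile?", "A large open-source text dataset for LLM training."),
]
train_texts = [format_example(q, a) for q, a in raw_pairs]
print(train_texts[0])
```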
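
The LoRA plus 8-bit combination is what makes fine-tuning these models feasible on commodity GPUs. Below is a minimal loading sketch using the Hugging Face transformers, peft, and bitsandbytes stack; the checkpoint name is an assumption (one of the h2ogpt models published on Hugging Face), and h2ogpt's own training scripts may differ in detail:

```python
# A minimal sketch (not h2ogpt's actual training script) of loading a
# checkpoint in 8-bit and attaching LoRA adapters with the Hugging Face
# transformers + peft + bitsandbytes stack.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training

model_name = "h2oai/h2ogpt-oasst1-512-12b"  # assumed 12B checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,   # 8-bit quantization via bitsandbytes
    device_map="auto",   # place layers across available GPUs automatically
)

model = prepare_model_for_int8_training(model)  # stabilize 8-bit training

# LoRA trains small low-rank adapter matrices instead of the full weights,
# which is what makes fine-tuning a 12B/20B model practical on one node.
lora_config = LoraConfig(
    r=8,               # rank of the adapter matrices
    lora_alpha=16,     # adapter scaling factor
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% trainable
```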
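
For the chatbot endpoint, h2ogpt's UI is built on Gradio, so a Python client can talk to a running server roughly as sketched below. The URL and api_name are placeholders; a real client should first inspect the app's actual API:

```python
# Hedged sketch of querying a running h2ogpt chatbot with the Gradio
# Python client (pip install gradio_client). URL and api_name below are
# placeholders, not values confirmed against the h2ogpt documentation.
from gradio_client import Client

client = Client("http://localhost:7860")  # assumed server address

client.view_api()  # prints the endpoints and signatures the app exposes

# Hypothetical call; the exact arguments depend on the app's signature.
reply = client.predict("Explain LoRA in one paragraph.", api_name="/submit")
print(reply)
```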

== Video Timeline ==
(00:00) Content Intro
(00:15) Introducing H2O
(01:30) H2O.ai Intro
(03:05) h2ogpt
(04:55) h2ogpt Chatbot
(09:35) h2ogpt Details
(10:40) H2O llmstudio Intro
(13:20) Conclusion

=== Resources ===
https://h2o.ai/
https://h2o.ai/events/h2o-world/
https://github.com/h2oai/h2ogpt
https://github.com/h2oai/h2o-llmstudio
https://huggingface.co/spaces/h2oai/h...
https://huggingface.co/spaces/h2oai/h...
https://www.kaggle.com/code/philippsi...

Please visit:
https://prodramp.com | @prodramp

Content Creator:
Avkash Chauhan (@avkashchauhan)

Tags:
#stablelm #stableai #finetunellm #openai #python #ai #langchain #chromadb

