Enroll now: https://bit.ly/42rqWxv
In this course, you’ll learn how to deploy a large language model (LLM)-based application into production using a serverless architecture, which lets you deploy applications without managing and scaling the infrastructure they run on.
As a hands-on exercise, you'll learn to summarize audio files by pairing an LLM with an automatic speech recognition (ASR) model. You’ll build an event-driven system that automatically detects incoming customer inquiries, transcribes them with ASR, and summarizes them with an LLM using Amazon Bedrock. While the course uses the Amazon Titan model, you can choose from many different models on the platform.
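To give a flavor of the summarization step, here is a minimal sketch of calling a Titan text model through the Bedrock runtime API with boto3. The model ID, prompt wording, and generation settings are illustrative, not the course's exact code; the call requires valid AWS credentials and Bedrock model access.

```python
import json

def build_titan_request(prompt: str, max_tokens: int = 512) -> str:
    """Build the JSON request body in the format Amazon Titan text models expect."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.0,  # deterministic output suits summarization
        },
    })

def summarize(transcript: str) -> str:
    """Send a summarization prompt to a Titan model via Amazon Bedrock.

    Requires AWS credentials with Bedrock access; boto3 is imported lazily
    so the pure request-building helper above works without it installed.
    """
    import boto3
    bedrock = boto3.client("bedrock-runtime")
    response = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1",  # illustrative choice of model
        body=build_titan_request(f"Summarize this customer call:\n\n{transcript}"),
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]
```

In practice you would swap the prompt and `modelId` for whichever Bedrock model fits your use case.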
After finishing the course, you'll be able to:
Prompt an LLM and customize its responses using Amazon Bedrock.
Convert audio recordings into written transcripts with Amazon Transcribe, and summarize these transcripts using an LLM.
Enable logging for all the calls you make to LLMs to help you maintain security, audit, and compliance standards.
Deploy this audio summarizer as an event-driven serverless workflow using AWS Lambda.
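The event-driven workflow described above can be sketched as an AWS Lambda handler triggered when a new recording lands in an S3 bucket: it extracts the file location from the event, starts an Amazon Transcribe job, and would then pass the finished transcript to Bedrock for summarization. Bucket names, job naming, and the media format are assumptions for illustration; the completion-polling and summarization steps are elided.

```python
import urllib.parse

def parse_s3_event(event: dict) -> tuple:
    """Extract the bucket name and object key from an S3 event delivered to Lambda.

    S3 URL-encodes object keys in event notifications (spaces become '+'),
    so the key must be decoded before use.
    """
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])
    return bucket, key

def lambda_handler(event, context):
    """Triggered by an S3 upload: kick off transcription of the new audio file."""
    import boto3  # imported lazily so the pure helper above is testable offline
    bucket, key = parse_s3_event(event)
    transcribe = boto3.client("transcribe")
    transcribe.start_transcription_job(
        TranscriptionJobName=key.replace("/", "-"),  # naive unique name; a sketch
        Media={"MediaFileUri": f"s3://{bucket}/{key}"},
        MediaFormat="mp3",       # assumed format of the incoming recordings
        LanguageCode="en-US",
    )
    # ...wait for the job to complete, fetch the transcript JSON from S3,
    # then summarize it with Bedrock (see the Titan sketch earlier).
    return {"statusCode": 200}
```

Wiring the S3 bucket notification to this Lambda function is what makes the pipeline fully event-driven: no server runs while no inquiries arrive.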
The course is taught by Mike Chambers, co-instructor of the popular Coursera course, “Generative AI with Large Language Models.”
Learn more: https://bit.ly/42rqWxv
Video published by DeepLearningAI on 14 February 2024.