Enroll now: https://bit.ly/42rqWxv
In this course, you’ll learn how to deploy an application built on a large language model (LLM) into production using a serverless architecture, which lets you run your application without managing or scaling the underlying infrastructure.
As a hands-on exercise, you'll learn to summarize audio files by pairing an LLM with an automatic speech recognition (ASR) model. You’ll build an event-driven system that automatically detects incoming customer inquiries, transcribes them with ASR, and summarizes them with an LLM using Amazon Bedrock. While the course uses the Amazon Titan model, you can choose from many different models on the platform.
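To make the event-driven flow concrete, here is a minimal sketch of a Lambda handler triggered by a new recording landing in an S3 bucket, which then kicks off an Amazon Transcribe job. This is not the course's code: the bucket name `TRANSCRIPTS_BUCKET` and the job-naming scheme are hypothetical placeholders, and the `start_transcription_job` arguments should be checked against the current boto3 documentation.

```python
import urllib.parse


def build_transcribe_args(bucket, key, job_name, output_bucket):
    """Assemble keyword arguments for transcribe.start_transcription_job.

    Field names follow the Amazon Transcribe API; verify against the
    current boto3 docs before relying on them.
    """
    return {
        "TranscriptionJobName": job_name,
        "Media": {"MediaFileUri": f"s3://{bucket}/{key}"},
        "MediaFormat": key.rsplit(".", 1)[-1],  # e.g. "mp3" from "call.mp3"
        "LanguageCode": "en-US",
        "OutputBucketName": output_bucket,
    }


def lambda_handler(event, context):
    # Invoked by an S3 "ObjectCreated" event when a new recording arrives.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])

    import boto3  # assumed available in the Lambda runtime

    transcribe = boto3.client("transcribe")
    transcribe.start_transcription_job(
        **build_transcribe_args(
            bucket, key,
            job_name=f"summarize-{key}",        # hypothetical naming scheme
            output_bucket="TRANSCRIPTS_BUCKET",  # placeholder bucket name
        )
    )
    return {"statusCode": 200, "started": key}
```

A second Lambda (or the same one, re-triggered by the transcript object) would then pass the finished transcript to Bedrock for summarization, completing the pipeline.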
After finishing the course, you'll be able to:
Prompt an LLM and customize its responses using Amazon Bedrock.
Convert audio recordings into written transcripts with Amazon Transcribe, and summarize these transcripts using an LLM.
Enable logging for all the calls you make to LLMs to help you maintain security, audit, and compliance standards.
Deploy this audio summarizer as an event-driven serverless workflow using AWS Lambda.
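As a taste of the first two objectives above, here is a minimal sketch of prompting the Amazon Titan text model through the Bedrock runtime. The request/response field names follow the Titan text API as I understand it (`inputText`, `textGenerationConfig`, `results[0].outputText`), and the model ID shown is one of several Titan variants; verify both against the current Bedrock documentation.

```python
import json


def build_titan_request(transcript, max_tokens=512):
    """Build a JSON request body for the Amazon Titan text model.

    Field names per the Titan text API; check the Bedrock docs for the
    model you select, since each model family has its own body schema.
    """
    prompt = f"Summarize the following customer call transcript:\n\n{transcript}"
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.0,  # deterministic-leaning output for summaries
        },
    })


def summarize(transcript, model_id="amazon.titan-text-express-v1"):
    import boto3  # assumed configured with Bedrock access in this region

    bedrock = boto3.client("bedrock-runtime")
    response = bedrock.invoke_model(
        modelId=model_id,
        body=build_titan_request(transcript),
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]
```

In the course's event-driven design, a call like `summarize(transcript)` would run inside a Lambda function once the transcription job completes, and with Bedrock's model invocation logging enabled, each of these calls can be recorded for audit and compliance.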
The course is taught by Mike Chambers, co-instructor of the popular Coursera course, “Generative AI with Large Language Models.”
Learn more: https://bit.ly/42rqWxv
Video published by DeepLearningAI on 14 February 2024.