Generative AI Serverless - Build a Social Media Analytics using Bedrock RAG Knowledge base & Lambda!

Published: 10 August 2024
on channel: Cloud With Girish

In this video, I am going to show you how to build a serverless Generative AI Retrieval Augmented Generation (RAG) solution that implements a single-document knowledge base using Amazon Bedrock, Lambda, and an API. With this solution, I will create a social media analytics assistant that provides contextual responses based on the past 6 months of social media data in a CSV file stored in an S3 bucket. This data includes metrics for Twitter, Instagram, Facebook, and LinkedIn for the fictitious bank MyBankGB.
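To give a sense of the core call involved, here is a minimal sketch (not the exact code from the video) of querying a single document with Bedrock's RetrieveAndGenerate API and an external S3 source. The bucket name, file name, region, and model ARN are placeholders, so adjust them for your own setup.

```python
import boto3

# Bedrock Agent Runtime exposes the RetrieveAndGenerate API used for the
# single-document ("chat with your document") knowledge base flow.
bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Hypothetical S3 location of the 6-month social media metrics CSV.
DOCUMENT_S3_URI = "s3://mybankgb-social-data/social_media_metrics.csv"
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"

def ask_document(question: str) -> str:
    """Ask a question grounded in the CSV document stored in S3."""
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "EXTERNAL_SOURCES",
            "externalSourcesConfiguration": {
                "modelArn": MODEL_ARN,
                "sources": [
                    {"sourceType": "S3", "s3Location": {"uri": DOCUMENT_S3_URI}},
                ],
            },
        },
    )
    return response["output"]["text"]

# Example usage:
# print(ask_document("Which platform had the highest engagement for MyBankGB last quarter?"))
```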

'Chat with your document' is one of the latest Generative AI features Amazon has added to its already feature-rich set of GenAI, Knowledge Base, and RAG capabilities.

RAG, which stands for Retrieval Augmented Generation, is becoming increasingly popular in the world of Generative AI. It allows organizations to overcome the limitations of large language models (LLMs) and utilize contextual data for their Generative AI solutions.

Amazon Bedrock is a fully managed service that offers a variety of foundation models, such as Anthropic Claude, AI21 Jurassic-2, Stability AI, Amazon Titan, and others.

I will use the recently released Anthropic Claude 3 Sonnet foundation model and invoke it via Amazon Bedrock using Lambda and an API. As of May 2024, this is the only model AWS supports for the single-document knowledge base, or 'Chat with your document', function.
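Below is a rough sketch of how the Lambda function behind the API might look, assuming an API Gateway proxy integration that posts a JSON body with a "question" field; the model ARN and S3 URI are the same placeholders as in the earlier sketch, not values from the video.

```python
import json
import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime")

# Placeholder values; replace with your own model ARN and document location.
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"
DOCUMENT_S3_URI = "s3://mybankgb-social-data/social_media_metrics.csv"

def lambda_handler(event, context):
    """Handle an API Gateway proxy request containing {"question": "..."}."""
    body = json.loads(event.get("body") or "{}")
    question = body.get("question", "Summarize MyBankGB's social media performance.")

    # Same RetrieveAndGenerate call as the earlier sketch, grounded in the CSV in S3.
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "EXTERNAL_SOURCES",
            "externalSourcesConfiguration": {
                "modelArn": MODEL_ARN,
                "sources": [
                    {"sourceType": "S3", "s3Location": {"uri": DOCUMENT_S3_URI}},
                ],
            },
        },
    )

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"answer": response["output"]["text"]}),
    }
```

The Lambda execution role would also need permission to call Bedrock (e.g. bedrock:RetrieveAndGenerate and model invocation) and to read the CSV object from S3.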
