An Open-Source AI Chatbot Template Built With Next.js and the AI SDK by Vercel. Deployed on AWS using SST. LLM provider: AWS Bedrock.
Features · Model Providers · Deploy Your Own · Running locally
- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- SST
  - A framework for building, deploying, and managing serverless applications on AWS
  - Used here to deploy the application to AWS
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
  - Supports AWS Bedrock
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Amazon RDS for saving chat history and user data
  - Amazon S3 for efficient file storage
- NextAuth.js
  - Simple and secure authentication
This template ships with Amazon Nova Pro (`amazon.nova-pro-v1:0`) as the default model via AWS Bedrock. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
For AWS Bedrock, you need to create an IAM role with the `AmazonBedrockFullAccess` policy attached. Then, in the AWS Bedrock console, enable the `amazon.nova-pro-v1:0` model along with any other models you want to use.
Make sure you set your Bedrock credentials using SST Secrets. See here.
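A sketch of how that could look with SST secrets; the secret names `BedrockAccessKey` and `BedrockSecretKey` and the component name `Web` are illustrative, not prescribed by the template:

```typescript
// sst.config.ts (fragment): declare the secrets and link them to the
// Next.js app so their values are available to server code at runtime.
const accessKey = new sst.Secret("BedrockAccessKey");
const secretKey = new sst.Secret("BedrockSecretKey");

new sst.aws.Nextjs("Web", {
  link: [accessKey, secretKey],
});
```

Values are then set per stage with `npx sst secret set BedrockAccessKey <value> --stage dev`, and linked secrets can be read in server code via `Resource.BedrockAccessKey.value` from the `sst` package.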
We're deploying to AWS using SST, read more about how to set up your environment variables here.
```bash
sst deploy --stage dev
# or
sst deploy --stage production
```
- Configure your AWS credentials and CLI. Follow the instructions here.
- Configure your AWS Bedrock credentials. Follow the Model Provider instructions above.
- Set up your environment variables (see `sst.config.ts`).
- Run `pnpm install` to install the dependencies.
- Run `sst dev` to start the development server.
Your app template should now be running on `localhost:3000`.