BuzzAI (gt-chat) is a question-answering chatbot designed to answer questions about Georgia Tech. Powered by Next.js, FastAPI, and OpenAI, it provides a fast, intuitive interface for finding answers to commonly asked questions by sourcing from over 14k Georgia Tech websites. The webpages are collected, cleaned, and split into 49k 1024-character document chunks, then embedded into 1536-dimension vectors using OpenAI's text-embedding-ada-002 for about $4. The resulting FAISS vector index enables the chatbot to quickly find the most relevant information for a question and use it to generate an answer. You can try it out at gt-chat.org!
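The chunking step above can be sketched in a few lines; this is a minimal illustration, not the project's actual pipeline (the function name and lack of overlap are assumptions — only the 1024-character chunk size comes from the README):

```python
# Illustrative sketch of splitting cleaned page text into fixed-size chunks.
# The 1024-character size matches the README; everything else is assumed.
def split_into_chunks(text: str, chunk_size: int = 1024) -> list[str]:
    """Split text into consecutive chunks of at most chunk_size characters."""
    return [text[i : i + chunk_size] for i in range(0, len(text), chunk_size)]
```

Each chunk would then be embedded (e.g. with text-embedding-ada-002) and added to the FAISS index.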
To use BuzzAI, simply visit gt-chat.org, or clone the repository and run the chatbot locally by following the instructions below.
To start the frontend you will need:

- Node.js 12 or later installed
- The URL of the backend server set in the `DOMAIN` environment variable
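For example, during local development the backend URL could be supplied via a `.env.local` file; note that the exact value here is an assumption (port 8000 is a common FastAPI/uvicorn default), and only the `DOMAIN` variable name comes from this README:

```shell
# .env.local — illustrative; adjust to wherever your backend is running
DOMAIN=http://localhost:8000
```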
### Deployment
#### Installing
Clone the repository and then run the following command to install the required dependencies:

```shell
npm install
```
#### Running the Chatbot
To start the chatbot, run the following command:

```shell
npm run dev
```

This starts the Next.js development server; open http://localhost:3000 in your browser to use the chatbot.
The backend for the project is a Python FastAPI server that uses LangChain and the OpenAI API to generate answers for the `/qa` GET endpoint.
### Supabase
You will need to set up a Supabase project and create a table called `qa_log` with the following schema:

| Column Name | Data Type |
| --- | --- |
| id | uuid |
| created_at | timestamp |
| question | text |
| answer | text |
| success | boolean |
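The schema above could be created with DDL along these lines (column names and types come from the table above; the primary key and defaults are assumptions — `gen_random_uuid()` requires Postgres 13+ or the pgcrypto extension, both available on Supabase):

```sql
-- Illustrative DDL for the qa_log table; defaults are assumed
create table qa_log (
  id uuid primary key default gen_random_uuid(),
  created_at timestamp default now(),
  question text,
  answer text,
  success boolean
);
```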
You need the Supabase project URL and service key to set up the environment variables later.
### Deployment
Change the Railway build command to `bash build.sh` and it should work out of the box.
To run the server locally:
Step 1: Set up the Python environment and fetch OpenAI embeddings

```shell
python3 -m venv venv
bash build.sh
```
Step 2: Set up environment variables

```shell
export OPENAI_API_KEY=<your key>
export SUPABASE_URL=<supabase project url>
export SUPABASE_KEY=<supabase project *service* key, not anon key>
```
Step 3: Run the local server

```shell
python main.py
```
- Risingwave for the chatgpt-plugin repo that I used as a starting point for the backend.
- odysa for the BuckyAI repo that I used as a starting point for the frontend.
Pull requests are always welcome!