An open-source LLM chatbot that answers Rice students' questions.
Documentation · Architecture · Features · LLM · Running locally · Authors
Check our latest documentation here.
The application has three parts: the LLM, the vector DB, and the server manager. The server manager continuously listens to and scrapes various sources and updates the vector DB. Sources include OwlNest events, club Instagram accounts, and Esther courses. Beyond these built-in sources, any Rice student can add new ones: for example, if a site posts a finance career fair I want to share, I can submit it within Owlracle through a feature called Teaching. The server manager then automatically scrapes that information and surfaces it to the next student who asks a relevant question.
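The Teaching flow described above can be sketched as a minimal ingest pipeline: a submitted source is split into chunks, each chunk is embedded, and the vectors are stored for later retrieval. This is only an illustrative sketch, not Owlracle's actual code: the `teach` function, the toy hashing "embedding", and the in-memory store are all assumptions made for the example (the real app uses Supabase and Modal).

```typescript
// Toy sketch of the Teaching -> scrape -> vector DB flow.
// NOTE: teach(), embed(), and the in-memory store are illustrative
// stand-ins, not Owlracle's real API.

type Doc = { id: string; text: string; vector: number[] };

// Toy embedding: hash each word into a fixed-size bag-of-words vector.
// A real deployment would call an embedding model instead.
function embed(text: string, dim = 16): number[] {
  const v = new Array(dim).fill(0);
  for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    let h = 0;
    for (const c of word) h = (h * 31 + c.charCodeAt(0)) % dim;
    v[h] += 1;
  }
  return v;
}

// Split scraped text into fixed-size chunks so each fits the embedder.
function chunk(text: string, size = 80): string[] {
  const out: string[] = [];
  for (let i = 0; i < text.length; i += size) out.push(text.slice(i, i + size));
  return out;
}

// In-memory stand-in for the Supabase vector DB.
const store: Doc[] = [];

// "Teaching": a student submits a source; the server manager chunks,
// embeds, and stores it. Returns the number of chunks written.
function teach(sourceId: string, scrapedText: string): number {
  const chunks = chunk(scrapedText);
  chunks.forEach((text, i) =>
    store.push({ id: `${sourceId}#${i}`, text, vector: embed(text) })
  );
  return chunks.length;
}
```

In the real system the chunking, embedding model, and upsert all run server-side; the sketch only shows the shape of the data flow.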
- Prompt Engineering: codifies how the LLM reasons about a user's question given the context retrieved from the vector DB.
- Supabase VectorDB: stores embeddings and augments the LLM's information retrieval.
- Modal: manages (creates, updates, deletes) the VectorDB's contents.
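As a sketch of how the pieces above fit together, the snippet below retrieves the stored chunks most similar to a question (cosine similarity over a tiny in-memory store standing in for the Supabase vector DB) and assembles them into a prompt for the LLM. The embedding, the sample documents, and the prompt template are illustrative assumptions, not Owlracle's actual prompt or schema.

```typescript
// Toy retrieval-augmented prompt assembly.
// embed(), the sample store, and the template are illustrative assumptions.

function embed(text: string, dim = 16): number[] {
  const v = new Array(dim).fill(0);
  for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    let h = 0;
    for (const c of word) h = (h * 31 + c.charCodeAt(0)) % dim;
    v[h] += 1;
  }
  return v;
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Two sample chunks standing in for scraped sources.
const store = [
  "The finance career fair is Friday in the RMC Grand Hall.",
  "The chess club meets Tuesdays at 7pm.",
].map((text) => ({ text, vector: embed(text) }));

// Retrieve the top-k chunks for the question and build the LLM prompt.
function buildPrompt(question: string, k = 1): string {
  const q = embed(question);
  const context = [...store]
    .sort((a, b) => cosine(b.vector, q) - cosine(a.vector, q))
    .slice(0, k)
    .map((d) => d.text)
    .join("\n");
  return `Answer the student's question using only this context:\n${context}\n\nQuestion: ${question}`;
}
```

The prompt-engineering layer in the real app is richer than this single template, but the retrieve-then-prompt shape is the same.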
You will need the environment variables defined in .env.example
to run Owlracle. If you're interested in joining us, join the Discord, where we open-sourced the environment variables. After that, run the following commands:
pnpm i                              # run every time to install new node modules
npx prisma generate --data-proxy    # run only once
pnpm dev                            # run every time to start the app on localhost
Owlracle should now be running on localhost:3000.
This project was created by NICE team members, with contributions from:
- Peter Cao (Ye Cao) -- organizer
- Alexia Huang (Yuening Huang) -- frontend
- Ningzhi Xu -- backend
- Jinyu Pei -- backend
- Arihan Varanasi -- architecture & knowledge retrieval
- Jasmine Lu -- server manager