This repo contains the code for running an LLM API in 2 environments:

- `dev`: A development environment running locally on docker
- `prd`: A production environment running on AWS ECS
- Clone the git repo and run the following commands from the `llm-api` dir:
- Create + activate a virtual env:

      python3 -m venv aienv
      source aienv/bin/activate
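If you want to confirm the virtual env is active before installing anything, a quick check (a minimal sketch, not part of this repo) is to compare `sys.prefix` with `sys.base_prefix`:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the env directory while
    # sys.base_prefix still points at the base interpreter.
    return sys.prefix != sys.base_prefix

print("virtual env active:", in_virtualenv())
```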
- Install `phidata`:

      pip install phidata
- Setup workspace:

      phi ws setup
- Copy `workspace/example_secrets` to `workspace/secrets`:

      cp -r workspace/example_secrets workspace/secrets
- Optional: Create a `.env` file:

      cp example.env .env
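phidata reads the `.env` file for you, so no extra code is needed. Purely as an illustration of the simple `KEY=value` format the file uses, here is a minimal parser (the `load_env_file` helper is hypothetical, not part of phidata):

```python
def load_env_file(path: str) -> dict:
    """Parse simple KEY=value lines, skipping blanks and # comments."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Ignore empty lines, comments, and lines without an '='.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values
```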
- Install Docker Desktop
- Set your OpenAI key: set the `OPENAI_API_KEY` environment variable using

      export OPENAI_API_KEY=sk-***

  OR set it in the `.env` file.
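The API reads the key from the environment at runtime. A sketch of what such a check might look like (the `require_openai_key` helper is illustrative, not the repo's actual code):

```python
import os

def require_openai_key() -> str:
    # Fail fast with a clear message if the key is missing.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it or add it to the .env file"
        )
    return key
```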
- Start the workspace using:

      phi ws up

  Open localhost:8000/docs to view the FastAPI docs.
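If you'd rather verify from code that the workspace is serving, you can poll the docs URL with the stdlib (a minimal sketch; the host and port match the defaults above):

```python
from urllib.request import urlopen
from urllib.error import URLError

def docs_reachable(url: str = "http://localhost:8000/docs",
                   timeout: float = 2.0) -> bool:
    """Return True if the FastAPI docs page responds with HTTP 200."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused / timed out: the workspace is not up yet.
        return False
```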
- Stop the workspace using:

      phi ws down