Final project (work in progress) for 17-422/722 and 05-499/899, Spring 2024: Building User-Focused Sensing Systems.
This code is built upon the official repositories of CLAP and ViFi-CLIP.
The environment has been tested on a MacBook Pro with an M1 Pro chip.
`conda env create -f environment.yaml`
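After creation, activate the environment with `conda activate <env-name>`, where the environment name is whatever `environment.yaml` defines.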
- For CLAP, the weights are downloaded automatically.
- For Video Finetuned CLIP, go to `./ViFi-CLIP` and create a new folder named `ckpts`. Download this checkpoint and move it to `./ViFi-CLIP/ckpts` (see the sanity-check sketch after this list).
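As a quick sanity check after the download, here is a minimal sketch that confirms a checkpoint sits where the ViFi-CLIP notebook expects it. The `.pth` extension is an assumption; adjust the pattern to match the checkpoint you downloaded.

```python
# Sanity-check sketch: confirm a ViFi-CLIP checkpoint is present in ./ViFi-CLIP/ckpts.
# The *.pth pattern is an assumption; change it if your checkpoint uses another extension.
from pathlib import Path

ckpt_dir = Path("./ViFi-CLIP/ckpts")
ckpt_dir.mkdir(parents=True, exist_ok=True)  # create the folder if it is missing

checkpoints = sorted(ckpt_dir.glob("*.pth"))
if not checkpoints:
    raise FileNotFoundError(f"No checkpoint found in {ckpt_dir}; download it and move it here.")
print("Found checkpoint(s):", [p.name for p in checkpoints])
```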
- To run CLAP inference, open `clap_inference.ipynb`. Specify the label CSV file path and the video path in the first cell, then run all cells.
- To run Video Finetuned CLIP inference, open `ViFi-CLIP_inference.ipynb`. Specify the label CSV file path and the video path in the first cell, then run all cells. A sketch of that first cell follows this list.
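For reference, the first cell in either notebook only needs these two paths. This is a hypothetical sketch: the variable names and file paths below are placeholders, not the notebooks' actual identifiers.

```python
# Hypothetical first-cell contents: point the notebook at your label list and video.
# Variable names and paths are placeholders; use whatever the notebook actually defines.
LABEL_CSV_PATH = "labels.csv"           # CSV listing the class labels to score against
VIDEO_PATH = "videos/example_clip.mp4"  # video (or its audio track, for CLAP) to classify
```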
In the `smarthome-pi` folder, run `python3 main.py`.
In the `webapp` folder, run:
python3 manage.py makemigrations
python3 manage.py migrate
python3 manage.py runserver
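By default, the Django development server is then available at http://127.0.0.1:8000/.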