This project implements a Reverse Image Search Engine built on image embeddings computed over the Caltech101 dataset. Given a query image, the system returns visually similar images from the dataset. Because similarity is measured in embedding space rather than by category labels, visually related images can be retrieved even when they belong to different categories.
- Indexing: Every image in the dataset is passed through a deep neural network to produce an embedding, which is stored in an index for fast retrieval (see the sketch after this list).
- Querying: Users can input an image as a query and retrieve visually similar images from the dataset.
- Embedding Visualization: The project includes visualization techniques to explore and understand the image embeddings.
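The repository implements its own pipeline; purely as an illustration of the indexing and querying flow described above, here is a minimal sketch that uses a pretrained torchvision ResNet-50 as the embedding model and scikit-learn's `NearestNeighbors` as the index. The model choice, directory layout, and helper names are assumptions, not the project's actual implementation.

```python
# Minimal sketch (assumptions: torchvision ResNet-50 as the embedding model,
# scikit-learn NearestNeighbors as the index; NOT the repository's actual code).
import glob
import numpy as np
import torch
from PIL import Image
from sklearn.neighbors import NearestNeighbors
from torchvision import models, transforms

# Pretrained backbone with the classification head removed -> 2048-d embeddings.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)  # torchvision >= 0.13 weights API
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path):
    """Return an L2-normalised embedding for one image file."""
    img = Image.open(path).convert("RGB")
    with torch.no_grad():
        vec = backbone(preprocess(img).unsqueeze(0)).squeeze(0).numpy()
    return vec / np.linalg.norm(vec)

# Indexing: embed every dataset image and fit a nearest-neighbour index.
paths = sorted(glob.glob("caltech101/101_ObjectCategories/**/*.jpg", recursive=True))  # assumed layout
features = np.stack([embed(p) for p in paths])
index = NearestNeighbors(n_neighbors=5, metric="cosine").fit(features)

# Querying: embed the query image and retrieve its closest neighbours.
distances, ids = index.kneighbors(embed("query.jpg").reshape(1, -1))
for d, i in zip(distances[0], ids[0]):
    print(f"{paths[i]}  (cosine distance {d:.3f})")
```

Cosine distance over L2-normalised embeddings is a common choice here because it makes similarity independent of feature magnitude.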
This project uses the Caltech101 dataset, which contains 101 object categories with roughly 40 to 800 images per category. The dataset is split into training and testing sets, with 30 training images and up to 50 test images per category. The images vary in size and aspect ratio.
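The repository handles the dataset itself; as a rough illustration of how such a per-category split can be produced, the sketch below assumes the standard `101_ObjectCategories/<category>/*.jpg` directory layout of the extracted archive. The root path and counts are illustrative, not the project's actual code.

```python
# Minimal sketch of a per-category train/test split (assumed directory layout).
import random
from pathlib import Path

ROOT = Path("caltech101/101_ObjectCategories")   # assumed extraction path
TRAIN_PER_CLASS, TEST_PER_CLASS = 30, 50

train, test = [], []
for class_dir in sorted(p for p in ROOT.iterdir() if p.is_dir()):
    images = sorted(class_dir.glob("*.jpg"))
    random.Random(0).shuffle(images)             # deterministic shuffle per class
    train += [(p, class_dir.name) for p in images[:TRAIN_PER_CLASS]]
    test += [(p, class_dir.name) for p in images[TRAIN_PER_CLASS:TRAIN_PER_CLASS + TEST_PER_CLASS]]

print(f"{len(train)} training images, {len(test)} test images")
```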
To run the Reverse Image Search Engine, follow these steps:
- Clone the repository:

```bash
git clone https://github.com/ArnabKumarRoy02/Image-Search-Engine.git
```
- Create and activate a virtual environment:

```bash
cd Image-Search-Engine
conda create -n env
conda activate env
conda install python==3.8.16
```
- Install the dependencies:

```bash
pip install -r requirements.txt
```
- Launch the Flask app:

```bash
python app.py
```
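The actual `app.py` lives in the repository; purely as a hedged illustration of what a search endpoint in such an app can look like, the sketch below assumes precomputed index artefacts (`features.npy`, `paths.json`) and an `embed()` helper like the one sketched in the feature list above. All of these file and module names are assumptions, not the project's real interface.

```python
# Hypothetical sketch of a search endpoint; NOT the repository's actual app.py.
import json

import numpy as np
from flask import Flask, jsonify, request
from sklearn.neighbors import NearestNeighbors

from embeddings import embed  # hypothetical module exposing the embed() sketch above

app = Flask(__name__)

# Assumed artefacts produced by the indexing step.
features = np.load("features.npy")
with open("paths.json") as f:
    paths = json.load(f)
index = NearestNeighbors(n_neighbors=5, metric="cosine").fit(features)

@app.route("/search", methods=["POST"])
def search():
    # Save the uploaded query image, embed it, and return the nearest matches.
    upload = request.files["image"]
    upload.save("query_upload.jpg")
    vec = embed("query_upload.jpg").reshape(1, -1)
    distances, ids = index.kneighbors(vec)
    return jsonify([
        {"path": paths[i], "distance": float(d)}
        for d, i in zip(distances[0], ids[0])
    ])

if __name__ == "__main__":
    app.run(debug=True)
```

A client could then submit a query with, for example, `curl -F image=@query.jpg http://127.0.0.1:5000/search`.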
- Access the Reverse Image Search Engine by opening the provided URL in a web browser.
- Upload an image or provide the URL of an image as a query.
- Submit the query and wait for the system to retrieve visually similar images.
- Explore the search results and interact with the visualization features to gain insights into the embeddings.
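The in-app visualization features are provided by the repository itself; as a standalone illustration of one common way to inspect embeddings offline, the sketch below projects precomputed embeddings to 2-D with t-SNE (scikit-learn) and plots them with matplotlib. The `features.npy` and `labels.json` file names are assumptions carried over from the earlier sketches.

```python
# Hedged sketch: project precomputed embeddings to 2-D with t-SNE and plot them.
import json
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

features = np.load("features.npy")               # assumed embedding matrix
with open("labels.json") as f:
    labels = json.load(f)                        # one category name per row

coords = TSNE(n_components=2, init="pca", random_state=0).fit_transform(features)

# Colour points by category index to see whether categories form clusters.
categories = sorted(set(labels))
colors = [categories.index(l) for l in labels]
plt.figure(figsize=(8, 8))
plt.scatter(coords[:, 0], coords[:, 1], c=colors, cmap="tab20", s=4)
plt.title("t-SNE projection of Caltech101 image embeddings")
plt.show()
```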
Contributions are welcome! If you want to enhance the Reverse Image Search Engine, submit a pull request with your proposed changes. Please follow the existing code style and include appropriate tests.
This project is licensed under the MIT License.