The goal of this project is to implement a big data system that supports a real estate investor who is considering buying property for short-term rentals in Europe, guiding them toward the most promising area in which to buy.
The technologies used in the implementation of the project are listed below.
To get a local copy up and running, follow the steps below.
To start the system, first install the dependencies listed below, then run the scripts in the order described in the following steps.
Always pay attention to the directory from which you are executing the terminal commands.
All the dependencies needed to run the project are listed in the requirements.txt file, so simply run the terminal command
pip install -r requirements.txt
First, bring up the Docker Compose setup to get a working Redis instance.
cd redis
docker compose up
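If you want to check that Redis is reachable before moving on, a quick ping along these lines can help (a minimal sketch, assuming the redis-py package and the default port 6379):

```python
# Optional sanity check: verifies that the Dockerized Redis instance is reachable.
# Assumes the redis-py package and the default port 6379.
import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)
print(client.ping())  # prints True if the Redis container is up
```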
The APIs are built with FastAPI to expose the Redis database endpoints on localhost, so you need to start the Uvicorn server for the endpoints to be reachable.
uvicorn main:app --reload
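As a rough illustration of the kind of endpoint served here (a minimal sketch, not the project's actual main.py; the route and key scheme are assumptions):

```python
# Illustrative FastAPI + Redis sketch; the route and key scheme are assumptions,
# not the project's actual main.py.
import redis
from fastapi import FastAPI

app = FastAPI()
db = redis.Redis(host="localhost", port=6379, decode_responses=True)

@app.get("/records/{record_id}")
def read_record(record_id: str):
    # Return all fields of the hash stored under the given key (empty dict if missing).
    return db.hgetall(record_id)
```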
At this point, all the components needed to run the system are up.
For the purposes of this project, the data were collected manually by web scraping and are already available in the scraped.json file. However, it is possible to build a data stream processing system on top of this, since the Data Storage and Data Preparation scripts are already set up for it.
Make sure you are in the 'redis' folder. Then run the command:
python3 import_to_redis.py
This script imports all the real estate records contained in the scraped.json file into the Redis database.
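For reference, the import step amounts to something like the sketch below (illustrative only: it assumes scraped.json holds a list of flat JSON objects and the key scheme is hypothetical; the actual import_to_redis.py may differ):

```python
# Illustrative import sketch: stores each scraped record as a Redis hash.
# Assumes scraped.json holds a list of flat JSON objects; the key scheme is hypothetical.
import json
import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)

with open("scraped.json", encoding="utf-8") as f:
    records = json.load(f)

for i, record in enumerate(records):
    # One hash per listing, e.g. estate:0, estate:1, ...
    client.hset(f"estate:{i}", mapping={k: str(v) for k, v in record.items()})
```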
Return to the project root, then run the getData script, which retrieves the records from the database and merges all the JSON records into a single CSV file.
cd ..
python3 getData.py
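Conceptually, this step reads the stored records back and flattens them into a single table, along the lines of the sketch below (the key pattern, pandas usage, and output file name are assumptions; the real getData.py may work through the FastAPI endpoints instead):

```python
# Illustrative retrieval sketch: reads every estate:* hash back from Redis
# and merges the records into one CSV file. Key pattern and file name are assumptions.
import pandas as pd
import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)

rows = [client.hgetall(key) for key in client.scan_iter("estate:*")]
pd.DataFrame(rows).to_csv("data.csv", index=False)
```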
Two separate scripts perform data preparation and cleaning.
python3 dataPreparation.py
python3 dataCleaning.py
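Typical operations at this stage might look like the following (purely illustrative; the price column and the file names are assumptions, and the real scripts may apply different rules):

```python
# Illustrative cleaning sketch; the real scripts may apply different rules.
# The 'price' column and the file names are assumptions.
import pandas as pd

df = pd.read_csv("data.csv")
df = df.drop_duplicates()
df["price"] = pd.to_numeric(df["price"], errors="coerce")  # coerce malformed prices to NaN
df = df.dropna(subset=["price"])
df.to_csv("data_clean.csv", index=False)
```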
The final script generates the various plots that guide the investor in the purchase decision.
python3 dataVisualisation.py
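To give an idea of the kind of output, a comparison of areas could be produced along these lines (a sketch with assumed column and file names; the actual plots may differ):

```python
# Illustrative plot sketch: average price per city. Column and file names are assumptions.
import pandas as pd
import plotly.express as px

df = pd.read_csv("data_clean.csv")
avg = df.groupby("city", as_index=False)["price"].mean()
fig = px.bar(avg, x="city", y="price", title="Average price per city")
fig.write_html("avg_price_per_city.html")
```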
A web application built with Plotly Dash provides the front end of the system. Just run the .py script to start the server and consult the front end locally:
python3 webApp.py
or run it from your favorite IDE.
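For orientation, a Plotly Dash app of this kind boils down to something like the sketch below (not the project's webApp.py; column and file names are assumptions, and Dash >= 2.7 is assumed for app.run):

```python
# Minimal Plotly Dash sketch (illustrative; the real webApp.py has the full layout).
# Column and file names are assumptions; Dash >= 2.7 is assumed for app.run.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

df = pd.read_csv("data_clean.csv")
fig = px.scatter(df, x="city", y="price")

app = Dash(__name__)
app.layout = html.Div([
    html.H1("Short-term rental investment overview"),
    dcc.Graph(figure=fig),
])

if __name__ == "__main__":
    app.run(debug=True)
```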
- Scraping
- Data Collection
- Implementing Redis DB
- Dockerize DB
- Implementing FastAPI and various endpoints
- Implementing Redis UI
- Implementing the import-to-Redis script
- Implementing the script to retrieve data from Redis and convert JSON to CSV
- Implementing data cleaning and preparation scripts
- Implementing data visualization script
- Diagram
- Web App
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Luca Maccacaro - luca.maccacaro@studenti.unitn.it
Simone Bellavia - simone.bellavia@studenti.unitn.it
Project Link: https://github.com/lucamac99/big-data-project