This Python project pulls Clash of Clans statistics on a schedule using Cloud Scheduler, Pub/Sub, Cloud Functions, a Virtual Private Cloud, Cloud NAT, Cloud Router, BigQuery, and Data Studio.
- This project provides functions to pull fact data (clans, players, war logs, global rankings) and dimension data (locations, player labels, clan labels, leagues, etc.). The backend API code is in "apiHelpers.py"; get responses as pandas DataFrames using the "main.py" functions. A hedged usage sketch follows the cost note below.
- This project was deployed entirely in GCP. Cloud Scheduler and Pub/Sub trigger a Cloud Function twice daily. The Cloud Function runs Python code that pulls data from the Clash of Clans API through a static IP provided by the Virtual Private Cloud, Cloud NAT, and Cloud Router. The API data is stored in BigQuery and served as a Data Studio report emailed daily.
- Note: There is a small cost to running the VPC/NAT/Router combo; it is the only way to get the static egress IP needed for the Clash of Clans API call. I estimate ~$13/month. Depending on how many clans you lead, how serious a player you are, and whether you use the VPC for another project, it might be worth the investment.
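To make the data pull concrete, here is a minimal, self-contained sketch of the kind of request the helper functions wrap. The endpoint path and Bearer-token header follow the public Clash of Clans API conventions, but the clan tag and key are placeholders and this is not the repo's exact code; see apiHelpers.py and main.py for the real function signatures.

```python
# Minimal sketch of a Clash of Clans API pull (placeholders, not the repo's exact code).
import urllib.parse

import pandas as pd
import requests

API_KEY = "YOUR_CLASH_OF_CLANS_API_KEY"   # key tied to the calling machine's egress IP
CLAN_TAG = "#2PP"                         # placeholder clan tag

# The API expects the clan tag URL-encoded ("#" -> "%23") and a Bearer token header.
url = "https://api.clashofclans.com/v1/clans/" + urllib.parse.quote(CLAN_TAG)
headers = {"Authorization": f"Bearer {API_KEY}"}

response = requests.get(url, headers=headers, timeout=30)
response.raise_for_status()

# Flatten the JSON response into a DataFrame, the general shape the main.py helpers return.
clan_df = pd.json_normalize(response.json())
print(clan_df.head())
```

Run locally, this needs the API key created for your home IP address; the deployed Cloud Function uses the key tied to the NAT gateway's static IP (see the setup steps below).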
- Python
- Cloud Scheduler
- Pub/Sub
- Cloud Functions
- Virtual Private Cloud
- Cloud NAT
- Cloud Router
- BigQuery
- Data Studio
- Clash of Clans API
- Installing all Required Packages
pip install -r requirements.txt
- Open a Google Cloud Platform account and create a new project.
- Open the IAM & Admin page, set up permissions, and download a service account ".json" key. Replace the ".json" key in the \CloudFunction directory with this key.
- Ensure a default Virtual Private Cloud exists in GCP.
- Create a Serverless VPC Access Connector in GCP.
- Create a Cloud Router in GCP.
- Create a Cloud NAT in GCP.
- Create a Supercell account and create an API key to access Clash of Clans data. Use the static IP address from the NAT gateway step. Plug this into the Cloud Function's main.py as the API_KEY. Make another key with your home IP address if you want to test locally.
- Set up a Pub/Sub Topic.
- Set up a Cloud Scheduler job that publishes to the Pub/Sub topic twice daily to trigger the Cloud Function.
- Set up the Cloud Function with the following trigger, code, and connection settings (a hedged sketch of such a function follows this list).
- Use BigQuery to query your Clash of Clans stats over time (see the query sketch after this list).
- Create a Data Studio report connected to BigQuery. Set the report to be emailed daily.
- Test locally and adjust the Cloud Function Python script using the apiHelpers.py and main.py files. Integrate customized functions into your automated ETL process!
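As referenced in the Cloud Function step above, here is a hedged sketch of what a Pub/Sub-triggered function that loads API data into BigQuery could look like. The table ID, clan tag, entry-point name, and single-endpoint pull are assumptions for illustration; the project's actual logic lives in main.py.

```python
# Sketch of a Pub/Sub-triggered Cloud Function that appends API data to BigQuery.
# Table ID, clan tag, and entry-point name are placeholders, not the repo's exact code.
import pandas as pd
import requests
from google.cloud import bigquery

API_KEY = "YOUR_CLASH_OF_CLANS_API_KEY"               # key created for the NAT gateway's static IP
TABLE_ID = "your-project.clash_of_clans.clan_stats"   # placeholder BigQuery destination table


def pull_clan_stats(event, context):
    """Entry point for a background (Pub/Sub-triggered) Cloud Function."""
    # Pull one endpoint; the real project loops over clans, players, war logs, etc.
    url = "https://api.clashofclans.com/v1/clans/%232PP"
    headers = {"Authorization": f"Bearer {API_KEY}"}
    clan_json = requests.get(url, headers=headers, timeout=30).json()

    # Flatten to a DataFrame and stamp each snapshot so stats can be tracked over time.
    df = pd.json_normalize(clan_json)
    # Nested list fields (e.g. memberList) may need their own tables; drop them here for simplicity.
    df = df[[c for c in df.columns if not isinstance(df[c].iloc[0], list)]]
    df["snapshot_ts"] = pd.Timestamp.now(tz="UTC")

    # Append the snapshot to BigQuery; the client uses the Cloud Function's service account.
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_APPEND")
    client.load_table_from_dataframe(df, TABLE_ID, job_config=job_config).result()
```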
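And, for the BigQuery step, a short sketch of checking the stored stats over time; the table and column names are placeholders for whatever schema the load job above produces.

```python
# Sketch of querying the stored Clash of Clans stats over time (placeholder table and columns).
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT snapshot_ts, name, clanLevel, clanPoints
    FROM `your-project.clash_of_clans.clan_stats`
    ORDER BY snapshot_ts DESC
    LIMIT 100
"""
for row in client.query(query).result():
    print(row["snapshot_ts"], row["name"], row["clanPoints"])
```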
Jared Fiacco - jaredfiacco2@gmail.com
Other GCP Projects of Mine:
- Transcribe Podcasts, Save to GCP Firebase
- Monitor & Store Computer Statistics in GCP BigQuery using Pub/Sub
This project was inspired by: