As a Data Engineer and Data Scientist with extensive expertise in data pipeline development, data migration, data visualization, and statistical modelling, I bring broad, hands-on knowledge to my work. I specialize in Python, SQL, and SAS, and I have experience with cloud platforms such as Azure, GCP, Oracle Cloud, and Snowflake, as well as the Oracle ERP system. My responsibilities have included developing and maintaining ETL processes, creating data models, and delivering insights through data visualization.
In addition to my professional work, I am an active contributor to the open-source community, and several projects on my GitHub profile showcase my skills in data engineering and machine learning. Previously, I worked as a Statistical Programmer at ICON plc, where I developed software in Python, SAS, and SQL for clinical data reporting.
I hold an MSc in Biomedical Engineering from the University of Bristol, where I graduated with Distinction and was awarded a fully funded Think Big Scholarship. I am bilingual in English and Mandarin and have volunteered as a Postgraduate Student Ambassador at the University of Bristol.
- 👨‍💻 All of my projects are available at https://github.com/WilsonH918
- 📫 How to reach me: wilson.hh.hsieh@gmail.com
| Project Link | Tools | Project Description |
| --- | --- | --- |
| EnergyStocks_HistoricalPrice_DataPipeline | PySpark, SQL, AWS (Lambda, EC2, S3), Snowflake (CDC), PowerBI | A data pipeline that retrieves historical stock price data for S&P 500 listed energy companies, stores it in an AWS S3 bucket, and transforms it in a Snowflake data warehouse. AWS Lambda triggers a Python script that runs the pipeline on a schedule. |
| Data_Pipeline_Ethereum_Token | Python, Airflow (DAGs), PostgreSQL, Docker, Hadoop | Extracts ERC20 token data from Web3 via the Etherscan API and loads it through an ETL pipeline built with Apache Airflow. The extracted data is fed into a local PostgreSQL database on a daily schedule, using Docker, Airflow DAGs, PostgreSQL, and HDFS. |
| Real-time-Streaming-of-ERC20-Transactions-with-Kafka-and-Python | Python, SQL, Kafka, Docker, Web3 | Demonstrates how to build a real-time data pipeline for ERC20 token transactions. Apache Kafka, an open-source distributed streaming platform, streams live data from the Etherscan API, a blockchain explorer for the Ethereum network; the transactions are then stored in a local CSV file. |
| Thesis Code - Motion Heatmap and Machine Learning for Stair Climbing Detection | PySpark, Pandas, scikit-learn, TensorFlow, Matplotlib, Seaborn | The code used to generate the results in my thesis, "Motion Heatmap and Machine Learning for Stair Climbing Detection." The thesis presents a video dataset with bounding-box information and silhouette images, along with the methods used to process this data to detect human movements, trajectories over time, and room usage in the home environment. |
| ERC20_MyToken | Solidity, Python, Web3, Blockchain | A simple ERC20 token contract written in Solidity. It supports creating, transferring, and burning tokens, and includes an onlyOwner modifier that restricts access to certain functions. |
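The transform step of the EnergyStocks_HistoricalPrice_DataPipeline project can be sketched in plain Python: compute day-over-day returns per ticker, the kind of derivation the Snowflake warehouse layer performs. The row shape and field names here are illustrative assumptions, not the project's actual schema.

```python
# Sketch of a batch transform over raw price rows. The 'ticker'/'date'/
# 'close' fields are assumed stand-ins for the real pipeline's schema.

def daily_returns(rows):
    """Compute day-over-day percentage returns per ticker.

    rows: list of dicts with 'ticker', 'date' (ISO string), 'close'.
    Returns the rows with an added 'return_pct' field
    (None for each ticker's first trading day).
    """
    out = []
    prev_close = {}
    for r in sorted(rows, key=lambda r: (r["ticker"], r["date"])):
        prev = prev_close.get(r["ticker"])
        pct = None if prev is None else round((r["close"] - prev) / prev * 100, 4)
        out.append({**r, "return_pct": pct})
        prev_close[r["ticker"]] = r["close"]
    return out
```

In the real project this logic would live in Snowflake SQL rather than Python, but the per-ticker ordering and lag lookup are the same idea.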
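The daily load in Data_Pipeline_Ethereum_Token benefits from being idempotent, so a re-run of the DAG does not duplicate rows. A minimal sketch of that dedup-before-insert step, assuming a `tx_hash` key field (the real project would do this with psycopg2 or a Postgres operator):

```python
# Sketch of an idempotent daily load: only records whose key is not
# already in the target table are inserted. 'tx_hash' is an assumed key.

def incremental_load(existing_keys, new_records, key="tx_hash"):
    """Return records not yet loaded, plus the updated key set."""
    loaded = set(existing_keys)
    to_insert = [r for r in new_records if r[key] not in loaded]
    loaded.update(r[key] for r in to_insert)
    return to_insert, loaded
```

Making each scheduled run safe to repeat is what lets Airflow retry or backfill a day without corrupting the table.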
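The consumer side of the Kafka streaming project (append each streamed transaction to a CSV file) can be sketched with the standard library alone. An in-memory iterable stands in for the real `KafkaConsumer`, and the field names are assumptions:

```python
import csv

# Sketch of the CSV sink: each streamed ERC20 transaction record is
# written as one row. Field names are illustrative assumptions.

FIELDS = ["hash", "from", "to", "value", "timestamp"]

def write_transactions(messages, fh):
    """Write transaction dicts from a stream to an open CSV file handle."""
    writer = csv.DictWriter(fh, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    for msg in messages:  # in production: for msg in KafkaConsumer(...)
        writer.writerow(msg)
```

The `extrasaction="ignore"` option drops any extra fields the API returns, so schema drift upstream does not break the sink.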
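The motion-heatmap idea from the thesis project can be sketched as simple grid accumulation: each bounding box increments the cells it covers, so hot cells mark frequently occupied regions of the scene. The grid size and box format (x, y, w, h in cell units) are illustrative assumptions:

```python
# Sketch of motion-heatmap accumulation from per-frame bounding boxes.
# Boxes are (x, y, w, h) in grid-cell units; values are assumptions.

def accumulate_heatmap(boxes, width=8, height=8):
    """Return a height x width grid of occupancy counts."""
    grid = [[0] * width for _ in range(height)]
    for x, y, w, h in boxes:
        for row in range(max(0, y), min(height, y + h)):
            for col in range(max(0, x), min(width, x + w)):
                grid[row][col] += 1
    return grid
```

The resulting grid is what a Matplotlib/Seaborn heatmap would visualize, and per-region counts like these can feed the downstream stair-climbing classifier as features.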
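The core semantics of the ERC20_MyToken contract (transfer, burn, and an owner-only guard) can be mirrored in Python for illustration. This is a model of the behavior described above, not the Solidity source itself:

```python
# Illustrative Python mirror of the ERC20_MyToken contract's semantics:
# balances, transfer, burn, and an onlyOwner-style guard on mint.

class MyToken:
    def __init__(self, owner, initial_supply):
        self.owner = owner
        self.total_supply = initial_supply
        self.balances = {owner: initial_supply}

    def _only_owner(self, caller):
        if caller != self.owner:  # Solidity: the onlyOwner modifier
            raise PermissionError("caller is not the owner")

    def transfer(self, sender, to, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def mint(self, caller, to, amount):
        self._only_owner(caller)
        self.total_supply += amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def burn(self, caller, amount):
        if self.balances.get(caller, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[caller] -= amount
        self.total_supply -= amount
```

In Solidity, the balance checks become `require` statements and the guard is the `onlyOwner` modifier; the state transitions are the same.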