This simple RESTful API project, based on FastAPI, demonstrates multiple modern technologies, methodologies, and principles:
- Python programming language
- RESTful API using FastAPI framework
- input data validation using the pydantic library
- API designed using Clean Architecture with the help of the Repository Pattern (see the sketch after this list)
- Poetry for dependency management
- code checks using: flake8, black, isort
- unit tests using pytest and hypothesis
- code coverage using pytest-cov
- load tests using locust
- schema tests using schemathesis
- contract tests using pact-python
- cloud-based app deployed to Amazon Web Services (AWS)
- Serverless (Serverless Framework) - AWS Lambda, API Gateway
- Microservices / serverless architecture (AWS Lambdas creating REST API)
- Infrastructure as Code (IaC) (Serverless Framework - serverless.yml defines the infrastructure resources)
- DevOps-based workflow (common code base with Makefile commands bringing Developers and Operations teams together - used in CI/CD)
- CI/CD pipeline
  - code syntax verification (flake8, isort, black) (`make lint`)
  - security verification (bandit) (`make security`)
  - unit tests (pytest) (`make unittest`)
  - code coverage (coverage python module) (`make cov`)
  - deploy infrastructure (AWS, Serverless framework) (`make deploy`)
  - End-To-End tests (behave, selenium, pytest-bdd) (NOT IMPLEMENTED YET) (`make e2e-tests`)
  - load/performance tests (locust) (`make load-tests`)
  - destroy infrastructure (AWS, Serverless framework) (`make destroy`)
- CI/CD deployment configured using Git Flow principles
- deploying from the Command Line or from CI/CD
- single Makefile to control all deployment and code-checking commands
- available to deploy to multiple stages/environments (e.g. DEV, SIT, PROD) using the same command (e.g. `make deploy ENV=SIT`)
- integrated with GitHub Actions CI/CD pipelines
- Monitoring
  - basic monitoring based on CloudWatch Dashboards
  - getting logs from AWS CloudWatch
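To illustrate the pydantic validation and the Clean Architecture / Repository Pattern items above, here is a minimal, hypothetical sketch (the names `Item`, `ItemRepository`, and `InMemoryItemRepository` are illustrative only and not taken from this project's code) of how a pydantic model, a repository abstraction, and a FastAPI route can fit together:

```python
# Illustrative sketch only - names and structure are assumptions, not this project's code.
from abc import ABC, abstractmethod
from typing import Dict

from fastapi import Depends, FastAPI, HTTPException
from pydantic import BaseModel


class Item(BaseModel):
    """Input/output model validated by pydantic."""
    id: int
    name: str


class ItemRepository(ABC):
    """Repository abstraction - routes depend on this, not on a concrete database."""

    @abstractmethod
    def get(self, item_id: int) -> Item: ...

    @abstractmethod
    def add(self, item: Item) -> None: ...


class InMemoryItemRepository(ItemRepository):
    """Simple in-memory implementation, e.g. for local runs and unit tests."""

    def __init__(self) -> None:
        self._items: Dict[int, Item] = {}

    def get(self, item_id: int) -> Item:
        if item_id not in self._items:
            raise KeyError(item_id)
        return self._items[item_id]

    def add(self, item: Item) -> None:
        self._items[item.id] = item


app = FastAPI()
_repo = InMemoryItemRepository()


def get_repo() -> ItemRepository:
    # Dependency injection point: swap the implementation without touching the routes.
    return _repo


@app.post("/items", response_model=Item)
def create_item(item: Item, repo: ItemRepository = Depends(get_repo)) -> Item:
    repo.add(item)
    return item


@app.get("/items/{item_id}", response_model=Item)
def read_item(item_id: int, repo: ItemRepository = Depends(get_repo)) -> Item:
    try:
        return repo.get(item_id)
    except KeyError:
        raise HTTPException(status_code=404, detail="Item not found")
```

The point of the pattern is that the route functions only know the abstract repository, so a DynamoDB- or RDS-backed implementation could replace the in-memory one without changing the API layer.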
This app template is based on the Serverless Application Framework.
- Install dependencies: `make deps`
- Run the API locally: `make run`
- Check that the API is up and running by opening its root URL http://localhost:8000 in your web browser.
- Check that the OpenAPI docs are up and running by opening http://localhost:8000/docs in your web browser (a scripted check is sketched below).
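If you prefer a scripted check over the browser, here is a small sketch using the `requests` library (assuming the API is running locally on port 8000, as started by `make run`):

```python
# Quick smoke check of the locally running API (assumes http://localhost:8000).
import requests

root = requests.get("http://localhost:8000", timeout=5)
print("root endpoint status:", root.status_code)

docs = requests.get("http://localhost:8000/docs", timeout=5)
print("OpenAPI docs reachable:", docs.status_code == 200)
```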
- Set up AWS credentials for your terminal
- Install the Serverless Application Framework via npm - Instruction. You can use the `make serverless` command from the root directory of this project (or `sudo make serverless` if you see `EACCES: permission denied`).
- Deploy the default app: `make deploy`
You can work with the app on a specific stage (environment), e.g. `dev`, `uat`, `prd`, by passing the `ENV` variable into the `make` commands, e.g.:

    make deploy ENV=dev
    make deploy ENV=uat
    make deploy ENV=prd

or by exporting the `ENV` variable in your terminal and using the default commands, e.g.:

    export ENV=dev
    make deploy run
The default stage for the app is equal to the current branch name, e.g. `master`.

`make deploy` will build and deploy infrastructure and code as defined in the `serverless.yml` file.

By default, resources are deployed to the default stage (environment) based on the current branch name, e.g. `master`. Thanks to that, multiple users working on separate branches can deploy to separate AWS resources and avoid resource conflicts.
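For reference, a common way to expose a FastAPI app through AWS Lambda and API Gateway is the Mangum ASGI adapter; `serverless.yml` then points the Lambda handler at the wrapped app. This is an assumption for illustration only, not necessarily how this template wires its handler:

```python
# handler.py - hypothetical Lambda entry point; this template's actual handler may differ.
from fastapi import FastAPI
from mangum import Mangum

app = FastAPI()


@app.get("/")
def root() -> dict:
    return {"status": "ok"}


# Mangum translates API Gateway events into ASGI requests for FastAPI.
handler = Mangum(app)
```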
We have the following levels of tests in the application:

- `make code-checks` - checks code syntax using `flake8`, `black`, `isort` and security using `bandit`
- `make unittest cov` - triggers all unit tests of the code and shows code coverage
- `make e2e-tests` (NOT IMPLEMENTED YET) - behave-based tests run after deployment
- `make schema-tests` - schemathesis-based API schema tests (see the sketch after this list)
- `make contract-tests` - pact-based API contract tests
- `make load-tests` - locust-based load tests
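As an illustration of the schema tests, a minimal schemathesis sketch is shown below. It assumes the API exposes its OpenAPI spec at `/openapi.json` on localhost; the actual test files and target URL in this project may differ:

```python
# test_schema.py - minimal schemathesis sketch; the target URL is an assumption.
import schemathesis

schema = schemathesis.from_uri("http://localhost:8000/openapi.json")


@schema.parametrize()
def test_api_conforms_to_schema(case):
    # Sends each generated request and validates the response against the OpenAPI schema.
    case.call_and_validate()
```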
The CI/CD is based on Makefile targets and is integrated with GitHub Actions (however, it could easily be integrated with any other CI/CD tool, e.g. Jenkins, BitBucket Pipelines, GitLab, TravisCI, Bamboo).

It consists of the following steps.

You can run all the CI steps/commands below using the single `make ci` command:

- `make lint` => checks code syntax using `flake8`, `black`, `isort`
- `make security` => checks for code security breaches using the `bandit` tool
- `make unittest` => triggers unit tests and shows the report (a sample test is sketched after this list)
- `make cov` => shows unit tests code coverage
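As an example of the kind of unit test `make unittest` might run, here is a small pytest + hypothesis sketch. The `Item` model is hypothetical and defined inline for the example; it is not this project's code:

```python
# test_items.py - illustrative pytest + hypothesis property test; the model is hypothetical.
from hypothesis import given, strategies as st
from pydantic import BaseModel


class Item(BaseModel):
    id: int
    name: str


@given(item_id=st.integers(), name=st.text())
def test_item_roundtrips_through_validation(item_id: int, name: str) -> None:
    item = Item(id=item_id, name=name)
    # pydantic should preserve the validated values exactly.
    assert item.id == item_id
    assert item.name == name
```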
You can run all the CD steps/commands below using the single `make cd` command:

- `make deploy` => deploys the app to AWS
- `make e2e-tests` => runs End-to-End tests on the deployed app
- `make load-tests` => runs load tests on the deployed app (a minimal locustfile is sketched after this list)
- `make destroy` => (optional: works only on GitFlow feature branches) destroys AWS resources after finishing the e2e-tests
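For the load tests, a minimal locustfile sketch is shown below. The endpoint path is an assumption; adjust it (and the target host passed to locust) to match the deployed API:

```python
# locustfile.py - minimal load-test sketch; the endpoint path is an assumption.
from locust import HttpUser, task, between


class ApiUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between requests

    @task
    def get_root(self) -> None:
        self.client.get("/")
```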
Currently, CI/CD is integrated with GitHub Actions. However, you can set it up quickly with any other CI/CD tool and get pipelines and actions similar to the ones below.

To run the CI/CD pipelines, you need to add `AWS_KEY` and `AWS_SECRET` to the Secrets section of your GitHub project.

You can see the CI/CD pipelines of the project here.

The GitHub Actions pipelines look like the picture below:

Pipeline steps are configured in the pipeline config file.

A sample pipeline run, with details of each step, can be found by clicking on one of the pipelines in the Actions tab. It should look like the picture below:
We can automatically reformat the code according to `black` and `isort` rules:

    make format
To create a Pull Request, go to Pull Requests and follow these steps:
- Click 'New pull request'
- Select your branch and click on it.
- Make sure you selected your PR to be merged into `develop` (NOT `master`) (we will use GitFlow for releases later)
- Click 'Create Pull Request'
When you follow the above steps, the CI/CD pipeline will be triggered automatically and perform all the checks described in the CI/CD section above.

When everything is finished, you should see results like here, and if everything is green you can ask a colleague for a Code Review.

If something is not green, you should fix it before asking for a Code Review.

When your code is reviewed, you can click 'Merge pull request' and merge it into the `develop` branch.