
Child Mind Institute - MindLogger Backend API

This repository is used for the backend of the MindLogger application stack.


Getting Started

Contents

Features

See MindLogger's Knowledge Base article to discover the MindLogger application stack's features.

Technologies


Code quality tools:

Application

Prerequisites

Recommended Extras

Installing pyenv is recommended to automatically manage the Python version used in the virtual environment, as specified in the Pipfile.

Alternatively, on macOS you can use a tool like Homebrew to install multiple Python versions and specify which one to use when creating the virtual environment:

pipenv --python /opt/homebrew/bin/python3.11

Environment Variables

| Key | Default value | Description |
|-----|---------------|-------------|
| DATABASE__HOST | postgres | Database host |
| DATABASE__USER | postgres | User name for the Postgresql database user |
| DATABASE__PASSWORD | postgres | Password for the Postgresql database user |
| DATABASE__DB | mindlogger_backend | Database name |
| CORS__ALLOW_ORIGINS | * | The list of allowed origins. Sets the Access-Control-Allow-Origin header. Example: https://dev.com,http://localhost:8000 |
| CORS__ALLOW_ORIGINS_REGEX | - | Regex pattern of allowed origins |
| CORS__ALLOW_CREDENTIALS | true | Sets the Access-Control-Allow-Credentials header |
| CORS__ALLOW_METHODS | * | Sets the Access-Control-Allow-Methods header |
| CORS__ALLOW_HEADERS | * | Sets the Access-Control-Allow-Headers header |
| AUTHENTICATION__ACCESS_TOKEN__SECRET_KEY | secret1 | Access token's salt |
| AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY | secret2 | Refresh token's salt |
| AUTHENTICATION__REFRESH_TOKEN__TRANSITION_KEY | transition secret | Transition refresh token salt. Used when rotating the refresh token key: generate a new key for AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY and use the previous value as the transition key so that refresh tokens issued before the rotation are still accepted during the transition period (see AUTHENTICATION__REFRESH_TOKEN__TRANSITION_EXPIRE_DATE and the rotation sketch below the table) |
| AUTHENTICATION__REFRESH_TOKEN__TRANSITION_EXPIRE_DATE | transition expiration date | Transition expiration date. After this date the transition token is ignored |
| AUTHENTICATION__ALGORITHM | HS256 | The JWT algorithm |
| AUTHENTICATION__ACCESS_TOKEN__EXPIRATION | 30 | Time in minutes after which the access token expires |
| AUTHENTICATION__REFRESH_TOKEN__EXPIRATION | 30 | Time in minutes after which the refresh token expires |
| ADMIN_DOMAIN | - | Admin panel domain |
| RABBITMQ__URL | rabbitmq | RabbitMQ service URL |
| RABBITMQ__USE_SSL | True | RabbitMQ SSL setting; set to False for local development |
| MAILING__MAIL__USERNAME | mailhog | Mail service username |
| MAILING__MAIL__PASSWORD | mailhog | Mail service password |
| MAILING__MAIL__SERVER | mailhog | Mail service URL |
| MULTI_INFORMANT__TEMP_RELATION_EXPIRY_SECS | 86400 | Expiry (seconds) of the temporary multi-informant "take now" participant relation |
| SECRETS__SECRET_KEY | - | Secret key for data encryption. Use this key only for local development |
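
For example, rotating the refresh token key could look like the sketch below (the values and the date format are illustrative placeholders, not the project's actual configuration):

```python
# Sketch of refresh-token key rotation using the variables described above.
# Keys are placeholders; generate real ones with `openssl rand -hex 32`.
old_key = "previous-refresh-token-secret"
new_key = "freshly-generated-refresh-token-secret"

rotation_env = {
    "AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY": new_key,
    # The old key moves here so refresh tokens issued before the rotation
    # are still accepted until the transition expire date.
    "AUTHENTICATION__REFRESH_TOKEN__TRANSITION_KEY": old_key,
    # Date format here is an assumption for illustration only.
    "AUTHENTICATION__REFRESH_TOKEN__TRANSITION_EXPIRE_DATE": "2025-01-01",
}
print(rotation_env["AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY"])
```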
✋ Mandatory:

Note that some environment variables use a double underscore (__) instead of a single underscore (_).

Pydantic supports nested settings models, and the double underscore acts as the nesting delimiter, which keeps the settings code cleaner.
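
A minimal sketch of how this works, assuming pydantic-settings v2 syntax (these are not the project's actual settings classes):

```python
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class DatabaseSettings(BaseModel):
    host: str = "postgres"
    user: str = "postgres"
    password: str = "postgres"
    db: str = "mindlogger_backend"


class Settings(BaseSettings):
    # "__" is the nesting delimiter, so DATABASE__HOST targets database.host
    model_config = SettingsConfigDict(env_nested_delimiter="__")

    database: DatabaseSettings = DatabaseSettings()


# With DATABASE__HOST=localhost exported, this prints "localhost"
print(Settings().database.host)
```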

Installation

Create .env file for future needs

Creating a .env file is highly recommended, as it is needed for setting up the project with both the local and Docker approaches. Use .env.default to get started:

cp .env.default .env

🛑 NOTE: Make sure to set RABBITMQ__USE_SSL=False for local development

Generate secret keys, update .env with values

openssl rand -hex 32

Generate keys and update the following .env values:

  • AUTHENTICATION__ACCESS_TOKEN__SECRET_KEY
  • AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY
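
If openssl is not available, Python's standard library can produce an equivalent key; this is only an alternative sketch, not a project requirement:

```python
# Prints a 32-byte hex key, equivalent to `openssl rand -hex 32`
import secrets

print(secrets.token_hex(32))
```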

Required Services

  • Postgres
  • Redis
  • RabbitMQ
  • Mailhog - Only used for running mail services locally

Running required services using Docker is highly recommended even if you intend to run the app locally.

🛑 NOTE: Make sure to update your environment variables to point to the correct hostname and port for each service.

Run services using Docker

  • Run Postgres

    docker-compose up -d postgres
  • Run Redis

    docker-compose up -d redis
  • Run RabbitMQ

    docker-compose up -d rabbitmq
  • Alternatively, you can run all required services:

    docker-compose up

Run services manually

For manual installation refer to each service's documentation:

Install all project dependencies

Pipenv is used as the default dependency manager. Create your virtual environment:

NOTE: Pipenv is used as the default dependency manager. When developing on the API, be sure to work from within an active pipenv shell. If using VS Code, open the terminal from within the active shell; ideally, avoid using the integrated terminal during this process.

# Activate your environment
pipenv shell

If pyenv is installed, Python 3.11 should automatically be installed in the virtual environment. You can check that the correct version of Python is active by running:

python --version

If the active version is not 3.11, you can manually specify a version while creating your virtual environment:

pipenv --python /opt/homebrew/bin/python3.11

Install all dependencies

# Install all deps from Pipfile.lock
# to install venv to current directory use `export PIPENV_VENV_IN_PROJECT=1`
pipenv sync --dev --system

🛑 NOTE: If you don't use pipenv for some reason, remember that the variables from your .env file will not be exported automatically.

🔗 Pipenv docs

In that case you have to export them yourself:

# Manual exporting in Unix (like this)
export PYTHONPATH=src/
export BASIC_AUTH__PASSWORD=1234
...

...or by sourcing the .env file in Bash:

set -o allexport; source .env; set +o allexport

🛑 NOTE 2: Do not forget about environment variables! All environment variables for the Postgres database running in Docker are already passed to docker-compose.yaml from the .env file.

🛑 NOTE 3: If you get an error running pipenv sync --dev related to the dependency greenlet, install it by running:

pipenv install greenlet

🛑 NOTE 4: If the application can't find the RabbitMQ service even though it's running normally, change your RABBITMQ__URL to your local IP address instead of localhost.

Run the migrations

alembic upgrade head

🛑 NOTE: If you run into role-based errors, e.g. role "postgres" does not exist, check whether that program is running anywhere else (e.g. via Homebrew) by running ps -ef | grep {program-that-errored}. You can attempt to kill the process with kill -9 {PID-to-program-that-errored}, then rerun the previous check to confirm the program has stopped.

Running the app

Running locally

This option allows you to run the app for development purposes without having to manually build the Docker image (i.e. when developing on the Web or Admin projects).

  • Make sure all required services are properly setup

  • If you're running required services using Docker, disable the app service from docker-compose before running:

    docker-compose up -d

    Alternatively, you may run these services using make (i.e. when developing the API):

    • You'll need to edit /etc/hosts with sudo and append the following entries:
    #mindlogger
    127.0.0.1 postgres
    127.0.0.1 rabbitmq
    127.0.0.1 redis
    127.0.0.1 mailhog
    

    Then run the following command from within the active virtual environment shell...

    make run_local

🛑 NOTE: Don't forget to set the PYTHONPATH environment variable, e.g. export PYTHONPATH=src/

  • To test that the API is up and running navigate to http://localhost:8000/docs in a browser.

In this project we use a simplified form of imports: from apps.application_name import class_name, function_name, module_name.

To make this work, the src/ folder must be on the PYTHONPATH.

P.S. You don't need this additional step if you run the application via the Docker container 🤫

uvicorn src.main:app --proxy-headers --port {PORT} --reload

Alternatively, you may run the application using make:

make run

Running via Docker

docker-compose up

Additional docker-compose up flags that might be useful for development

-d  # Run docker containers as daemons (in the background)
--no-recreate  # If containers already exist, don't recreate them

Stop the application 🛑

docker-compose down

Additional docker-compose down flags that might be useful for development

-v  # Remove with all volumes

Running using Makefile

You can use the Makefile to work with the project (run the application / code quality tools / tests ...)

For local usage:

# Run the application
make run

# Check the code quality
make cq

# Check tests passing
make test

# Check everything in one hop
make check

Docker development

Build application images

docker-compose build

✅ Make sure that you have completed the .env file. It is used by default in the docker-compose.yaml file for building.

✅ Check the build with the docker images command. You should see a record for fastapi_service.

💡 If you would like to debug the application inside a Docker container, make sure that you use COMPOSE_FILE=docker-compose.dev.yaml in .env. It keeps stdin open and allocates a tty.

Testing

The pytest framework is used to write unit tests. Currently PostgreSQL is used as the test database, with the run configuration defined in pyproject.toml:

DATABASE__HOST=postgres
DATABASE__PORT=5432
DATABASE__PASSWORD=postgres
DATABASE__USER=postgres
DATABASE__DB=test

🛑 NOTE: To run tests locally without changing DATABASE__HOST, add the row below to the /etc/hosts file (macOS, Linux). It will automatically redirect postgres to localhost.

127.0.0.1       postgres

Adjust your database for use with tests

⚠️ Remember that you only have to do this once, before the first test run.

# Connect to the database with Docker
docker-compose exec postgres psql -U postgres postgres

# Or connect to the database locally
psql -U postgres postgres


# Create the test database
psql# create database test;

# Create arbitrary database
psql# create database test_arbitrary;

# Create user test
psql# create user test;

# Set password for the user
psql# alter user test with password 'test';

Test coverage

To correctly calculate test coverage, you need to run coverage with the --concurrency=thread,gevent parameter:

coverage run --branch --concurrency=thread,gevent -m pytest
coverage report -m

Running tests via Docker

(This is how tests are run on CI)

# Check the code quality
make dcq

# Check tests passing
make dtest

# Check everything in one hop
make dcheck

Scripts

Using pre-commit hooks

It is good practice to use Git hooks to produce better commits.

For increased security during development, install git-secrets to scan code for AWS keys.

Please use this link for that: https://github.com/awslabs/git-secrets#installing-git-secrets

.pre-commit-config.yaml is placed in the root of the repository.

👉 Once you have installed git-secrets and pre-commit, simply run the following command:

make aws-scan

👉 Then all your staged changes will be checked via git hooks on every git commit.

Alembic (migration)

Add a new migration file 🔨

alembic revision --autogenerate -m "Add a new field"

Upgrade to the latest migration 🔨

alembic upgrade head

Downgrade to a specific revision 🔨

alembic downgrade 0e43c346b90d

✅ This hash is taken from the generated file in the migrations folder


Removing the migration 🔨

💡 Do not forget that alembic saves the migration version into the database.

delete from alembic_version;

Upgrade arbitrary servers

alembic -c alembic_arbitrary.ini upgrade head

Database relation structure

erDiagram

User_applet_accesses ||--o{ Applets: ""

    User_applet_accesses {
        int id
        datetime created_at
        datetime updated_at
        int user_id FK
        int applet_id FK
        string role
    }

    Users {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        string email
        string full_name
        string hashed_password
    }

 Users||--o{ Applets : ""

    Applets {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        string display_name
        jsonb description
        jsonb about
        string image
        string watermark
        int theme_id
        string version
        int creator_id FK
        text report_server_id
        text report_public_key
        jsonb report_recipients
        boolean report_include_user_id
        boolean report_include_case_id
        text report_email_body
    }

Applet_histories }o--|| Users: ""

    Applet_histories {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        jsonb description
        jsonb about
        string image
        string watermark
        int theme_id
        string version
        int account_id
        text report_server_id
        text report_public_key
        jsonb report_recipients
        boolean report_include_user_id
        boolean report_include_case_id
        text report_email_body
        string id_version
        string display_name
        int creator_id FK
    }

Answers_activity_items }o--|| Applets: ""
Answers_activity_items }o--|| Users: ""
Answers_activity_items }o--|| Activity_item_histories: ""

    Answers_activity_items {
        int id
        datetime created_at
        datetime updated_at
        jsonb answer
        int applet_id FK
        int respondent_id FK
        int activity_item_history_id_version FK
    }

Answers_flow_items }o--|| Applets: ""
Answers_flow_items }o--|| Users: ""
Answers_flow_items ||--o{ Flow_item_histories: ""

    Answers_flow_items {
        int id
        datetime created_at
        datetime updated_at
        jsonb answer
        int applet_id FK
        int respondent_id FK
        int flow_item_history_id_version FK
    }

Activities }o--|| Applets: ""

    Activities {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        UUID guid
        string name
        jsonb description
        text splash_screen
        text image
        boolean show_all_at_once
        boolean is_skippable
        boolean is_reviewable
        boolean response_is_editable
        int ordering
        int applet_id FK
    }

Activity_histories }o--|| Applets: ""

    Activity_histories {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        UUID guid
        string name
        jsonb description
        text splash_screen
        text image
        boolean show_all_at_once
        boolean is_skippable
        boolean is_reviewable
        boolean response_is_editable
        int ordering
        int applet_id FK
    }

Activity_item_histories }o--|| Activity_histories: ""

    Activity_item_histories {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        jsonb question
        string response_type
        jsonb answers
        text color_palette
        int timer
        boolean has_token_value
        boolean is_skippable
        boolean has_alert
        boolean has_score
        boolean is_random
        boolean is_able_to_move_to_previous
        boolean has_text_response
        int ordering
        string id_version
        int activity_id FK
    }

Activity_items }o--|| Activities: ""

    Activity_items {
        int id
        datetime created_at
        datetime updated_at
        jsonb question
        string response_type
        jsonb answers
        text color_palette
        int timer
        boolean has_token_value
        boolean is_skippable
        boolean has_alert
        boolean has_score
        boolean is_random
        boolean is_able_to_move_to_previous
        boolean has_text_response
        int ordering
        int activity_id FK
    }



Flows }o--|| Applets: ""

    Flows {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        string name
        UUID guid
        jsonb description
        boolean is_single_report
        boolean hide_badge
        int ordering
        int applet_id FK
    }

Flow_items }o--|| Flows: ""
Flow_items }o--|| Activities: ""

    Flow_items {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        int ordering
        int activity_flow_id FK
        int activity_id FK
    }

Flow_item_histories }o--|| Flow_histories: ""
Flow_item_histories }o--|| Activity_histories: ""

    Flow_item_histories {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        string id_version
        int activity_flow_id FK
        int activity_id FK
    }

Flow_histories }o--|| Applet_histories: ""

    Flow_histories {
        int id
        datetime created_at
        datetime updated_at
        boolean is_deleted
        string name
        UUID guid
        jsonb description
        boolean is_single_report
        boolean hide_badge
        int ordering
        string id_version
        int applet_id FK
    }



Arbitrary setup

You can connect an arbitrary file storage and database by filling in special fields in the user_workspaces table.

PostgreSQL

Add your database connection string to database_uri in the following format:

postgresql+asyncpg://<username>:<password>@<hostname>:<port>/<database>

AWS S3 and GCP S3

For an AWS S3 bucket, the following fields are required: storage_region, storage_bucket, storage_access_key, storage_secret_key.

Azure Blob

For Azure Blob, put your connection string into the storage_secret_key field.
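
A hypothetical sketch of the fields described above (all values are placeholders; the authoritative column set is the user_workspaces table in your deployment):

```python
# Placeholder values only; these fields are filled in the user_workspaces table.
arbitrary_workspace = {
    # PostgreSQL
    "database_uri": "postgresql+asyncpg://user:password@db.example.com:5432/arbitrary_db",
    # AWS S3 / GCP S3
    "storage_region": "us-east-1",
    "storage_bucket": "my-arbitrary-bucket",
    "storage_access_key": "<access-key>",
    "storage_secret_key": "<secret-key>",
    # Azure Blob: put the connection string into storage_secret_key instead
}
print(arbitrary_workspace["database_uri"])
```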

License

Common Public Attribution License Version 1.0 (CPAL-1.0)

Refer to LICENSE.md

OpenTelemetry

If the app is running in Docker

  • Make sure that OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://opentelemetry:4317 is already set in .env. Run the opentelemetry Docker container:
docker-compose up -d opentelemetry

If the app is running locally

  • Make sure that OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4317 is exported in your environment:
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4317

or, if you use pipenv to autoload environment variables, make sure that OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4317 is added to the .env file.

  • The same as for the containerized app, bring up the opentelemetry container:
```bash
docker-compose up -d opentelemetry
```
  • Start your app