Movember TrueNTH USA Shared Services
Pick any path for installation
$ export PROJECT_HOME=~/truenth_ss
$ sudo apt-get install postgresql python3-venv python3-dev
$ sudo apt-get install libffi-dev libpq-dev build-essential redis-server
$ git clone https://github.com/uwcirg/truenth-portal.git $PROJECT_HOME
This critical step enables isolation of the project from the system python, making dependency maintenance easier and more stable. It does require that you activate the virtual environment before you interact with python or the installer scripts. The virtual environment can be installed anywhere; the nested 'env' pattern is used here.
$ python3 -m venv $PROJECT_HOME/env
Required to interact with the python installed in this virtual environment. Forgetting this step will result in obvious warnings about missing dependencies. This needs to be done in every shell session that you work from.
$ cd $PROJECT_HOME
$ source env/bin/activate
To create the postgresql database that backs your Shared Services issue the following commands:
$ sudo -u postgres createuser truenth-dev --pwprompt # enter password at prompt
$ sudo -u postgres createdb truenth-dev --owner truenth-dev
Building the schema and populating with basic configured values is done via the :ref:`flask sync <flask-sync>` command. See details below.
The default version of pip provided in the virtual environment is often out of date. Update it first for best results:
$ pip install --upgrade pip setuptools
Create a configuration file if one does not already exist
$ cp $PROJECT_HOME/instance/application.cfg{.default,}
See :ref:`OAuth Config <oauthconfig>`
Instruct pip
to install the correct version of all dependencies into the
virtual environment. This idempotent step can be run anytime to confirm the
correct libraries are installed:
$ pip install --requirement requirements.txt
To install additional dependencies necessary for development, replace the named requirements file:
$ pip install --requirement requirements.dev.txt
A number of built-in and custom extensions for command line interaction are available via the click command line interface; several are documented below.
To use or view the usage of the available commands:
- :ref:`activate-venv`
- Set FLASK_APP environment variable to point at manage.py
export FLASK_APP=manage.py
- Issue the flask --help or flask <cmd> --help commands for more details, e.g.:
flask sync --help
Note: All flask commands mentioned within this document require the first two steps listed above.
The idempotent sync function takes the necessary steps to build tables, upgrade the database schema, and run seed to populate static data. It is safe to run on existing or brand new databases.
flask sync
Especially useful in bootstrapping a new install, a user may be added and blessed with the admin role from the command line. Be sure to use a secure password.
flask add-user --email user@server.com --password reDacted! --role admin
Users who forget their passwords should be encouraged to use the forgot password link from the login page. In rare instances when direct password reset is necessary, an admin may perform the following:
flask password-reset --email forgotten_user@server.com --password $NEW_PASSWORD --actor $ADMIN_EMAIL
To update your Shared Services installation run the deploy.sh script (this process wraps together pulling the latest from the repository and the :ref:`pip <pip>` and :ref:`flask sync <flask-sync>` commands listed above).
This script will:
- Update the project with the latest code
- Install any dependencies, if necessary
- Perform any database migrations, if necessary
- Seed any new data to the database, if necessary
$ cd $PROJECT_HOME
$ ./bin/deploy.sh
To see all available options run:
$ ./bin/deploy.sh -h
To run the flask development server, run the below command from an activated virtual environment:
$ flask run
By default the flask dev server runs without the debugger and listens on port 5000 of localhost. To override these defaults, call flask run as follows:
$ FLASK_DEBUG=1 flask run --port 5001 --host 0.0.0.0
Start a celery worker to process background tasks:
$ celery worker --app portal.celery_worker.celery --loglevel=info
Alternatively, install an init script and configure it. See Daemonizing Celery.
Should the need ever arise to purge the queue of jobs, run the following destructive command. All tasks are designed to be idempotent, so purging is safe, and worth trying if the server is struggling.
$ celery purge --force --app portal.celery_worker.celery
Without running purge, celery will resume any unfinished tasks when it restarts.
The value of SQLALCHEMY_DATABASE_URI defines which database engine and database to use. Alternatively, the following environment variables may be used (and if defined, will be preferred):
PGDATABASE
PGUSER
PGPASSWORD
PGHOST
At this time, only PostgreSQL is supported.
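For example, to target the database created in the earlier createuser/createdb steps via environment variables (the password here is a placeholder; use the one you entered at the createuser prompt):

```shell
# Each of these, when set, is preferred over the corresponding
# piece of SQLALCHEMY_DATABASE_URI.
export PGDATABASE=truenth-dev   # database name from createdb
export PGUSER=truenth-dev       # role name from createuser
export PGPASSWORD=changeme      # placeholder -- use your real password
export PGHOST=localhost
```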
Thanks to Alembic and Flask-Migrate, database migrations are easily managed and run.
Note: Alembic tracks the current version of the database to determine which migration scripts to apply. After the initial install, stamp the current version for subsequent upgrades to succeed:
flask db stamp head
Note: The :ref:`flask sync <flask-sync>` command covers this step automatically.
Anytime a database might need an upgrade, run the manage script with the db upgrade arguments (or run the deployment script):
This is an idempotent process, meaning it's safe to run again on a database that has already received the upgrade.
flask db upgrade
Note: The :ref:`flask sync <flask-sync>` command covers this step automatically.
Update the python source files containing table definitions (typically classes derived from db.Model) and run the manage script to sniff out the code changes and generate the necessary migration steps:
flask db migrate
Then execute the upgrade as previously mentioned:
flask db upgrade
To run the tests, repeat the postgres createuser && createdb commands as above, with the values for {user, password, database} as defined in the TestConfig class within portal/config/config.py.
All test modules under the tests directory can be executed via py.test (again from the project root with the virtual environment activated):
$ py.test
Alternatively, run a single module's worth of tests, telling py.test not to suppress standard out (vital for debugging) and to stop on the first error:
$ py.test -sx tests/test_intervention.py
The test runner Tox is configured to run the portal test suite and test other parts of the build process, each configured as a separate Tox "environment". To run all available environments, execute the following command:
$ tox
To run a specific tox environment, "docs" or the docgen environment in this case, invoke tox with the -e option, e.g.:
$ tox -e docs
Tox will also run the environment specified by the TOXENV environment variable, as configured in the TravisCI integration.
Tox will pass any options after -- to the test runner, py.test. To run tests only from a certain module (analogous to the above py.test invocation):
$ tox -- tests/test_intervention.py
This project includes integration with the TravisCI continuous integration platform. The full test suite (every Tox virtual environment) is automatically run for the last commit pushed to any branch, and for all pull requests. Results are reported as passing with a ✔ and failing with a ✖.
UI integration/acceptance testing is performed by Selenium and is included in the test suite and continuous integration setup. Specifically, Sauce Labs integration with TravisCI allows Selenium tests to be run with any number of browser/OS combinations and captures video from running tests.
UI tests can also be run locally (after installing xvfb and geckodriver) by passing Tox the virtual environment that corresponds to the UI tests (ui).
- Install xvfb:
$ sudo apt-get install xvfb
- Install geckodriver from https://github.com/mozilla/geckodriver/releases. For example:
$ wget https://github.com/mozilla/geckodriver/releases/download/v0.21.0/geckodriver-v0.21.0-linux64.tar.gz
$ tar -xvzf geckodriver-v0.21.0-linux64.tar.gz
$ rm geckodriver-v0.21.0-linux64.tar.gz
$ chmod +x geckodriver
$ sudo mv geckodriver /usr/local/bin/
$ tox -e ui
Project dependencies are hard-coded to specific versions (see requirements.txt) known to be compatible with Shared Services, to prevent dependency updates from breaking existing code.
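A pinned entry in requirements.txt names an exact version rather than a range. The package names and version numbers below are illustrative, not the project's actual pins:

```
Flask==1.0.2
celery==4.2.1
requests==2.20.0
```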
If pyup.io integration is enabled the service will create pull requests when individual dependencies are updated, allowing the project to track the latest dependencies. These pull requests should be merged without need for review, assuming they pass continuous integration.
Docs are built separately via sphinx. Change to the docs directory and use the contained Makefile to build; then view in a browser, starting with the docs/build/html/index.html file:
$ cd docs
$ make html
Download PostgreSQL via: https://www.postgresql.org/download/windows/
To create the postgresql database, in pgAdmin click "databases" and "create" and enter the desired characteristics of the database, including the owner. To create the user, similarly in pgAdmin, click "login roles" and "create" and enter the desired characteristics of the user. Ensure that it has permission to login.
Ensure that a C++ compiler is installed -- if not, download the Microsoft Visual C++ Compiler from: https://www.microsoft.com/en-us/download/details.aspx?id=44266
Ensure that setuptools is up-to-date by running:
$ python -m pip install --upgrade pip setuptools
Ensure that ez_setup is installed by running:
$ pip install ez_setup
Install requirements by running:
$ pip install --requirement requirements.txt
In $PATH\data\pg_hba.conf, change the bottom few lines to read:

# TYPE  DATABASE  USER  ADDRESS       METHOD
# IPv4 local connections:
host    all       all   127.0.0.1/32  trust
# IPv6 local connections:
host    all       all   ::1/128       trust
Copy the default configuration file to the named configuration file
$ copy $PROJECT_HOME/instance/application.cfg.default $PROJECT_HOME/instance/application.cfg
In application.cfg (below), fill in the values for SQLALCHEMY_DATABASE_URI for user, password, localhost, portnum, and dbname.
user, password, and dbname were set up earlier in pgAdmin.
portnum can also be found in pgAdmin.
localhost should be 127.0.0.1
SQLALCHEMY_DATABASE_URI = 'postgresql://user:password@localhost:portnum/dbname'
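For example, with the user, password, and database created earlier in pgAdmin (all values below are illustrative placeholders; PostgreSQL's default port is 5432, but confirm yours in pgAdmin):

```
SQLALCHEMY_DATABASE_URI = 'postgresql://truenth-dev:changeme@127.0.0.1:5432/truenth-dev'
```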
To test that the database is set up correctly, from a virtual environment run:
$ python ./bin/testconnection.py