A project to publish website analytics for the US federal government.
For a detailed description of how the site works, read 18F's blog post on analytics.usa.gov.
Other organizations have reused this project for their own analytics dashboards; this blog post details their implementations and lessons learned.
This app uses Jekyll to build the site, and Sass, Bourbon, and Neat for CSS.
The JavaScript is a webpack-bundled aggregation of several modules, using d3 for the visualizations. Learn more in the webpack configuration section below.
There are two ways to develop locally: with Docker or without it.
You need Docker and docker-compose.
To build and run the app with docker-compose, run:
docker-compose up -d
You can then access the app at http://localhost:4000. Because the local filesystem is mounted into the Docker container, you can edit files locally as you develop.
Note: this does not yet run the webpack script.
To see the Jekyll logs, run:
docker-compose logs -f
Run Jekyll with development settings:
make dev
(This runs `bundle exec jekyll serve --watch --config=_config.yml,_development.yml`.)
Install the npm dependencies and run the webpack build in watch mode:
npm install
npm run build-dev
- Ensure that data is being collected for a specific agency's Google Analytics ID. Visit 18F's analytics-reporter for more information. Make a note of the URL path used for data collection.
- Create a new HTML file in the `_agencies` directory. The name of the file will be the URL path.
touch _agencies/agencyx.html
- Create a new HTML file in the `_data_pages` directory. Use the same name you used in step 2. This will be the data download page for this agency.
touch _data_pages/agencyx.html
- Set the required data for the new files. (Both files need this data.) Example:
---
name: Agency X # Name of the page
slug: agencyx # Same as the name of the html files. Used to generate data page links.
layout: default # type of layout used. available layouts are in `_layouts`
---
- Agency page: Below the data you just entered, include the page content you want. The `_agencies` page will use the `charts.html` partial and the `_data_pages` pages will use the `data_download.html` partial. Example:
{% include charts.html %}
The development settings assume data is available at `/fakedata`. You can change this in `_development.yml`.
If you are also working with local data, e.g. using analytics-reporter, you will need to make the data available over HTTP with CORS enabled. Various tools can do this; this project recommends the Node module `serve`:
npm install -g serve
Generate data to a directory:
analytics --output [dir]
Then run `serve` from the output directory:
serve --cors
The data will be available at http://localhost:3000 over CORS, with no path prefix. For example, device data will be at http://localhost:3000/devices.json.
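As a quick illustration of how a page component might consume one of these reports, here is a minimal sketch (not the project's actual code; the base URL, the report field names, and the `logDevices` helper are assumptions for this example):

```javascript
// Minimal sketch: fetch the devices report from a locally served data
// directory (see `serve --cors` above). The base URL and the report's
// field names (`data`, `device`, `visits`) are assumptions for this example.
const DATA_URL = 'http://localhost:3000';

function logDevices(report) {
  report.data.forEach((row) => {
    console.log(`${row.device}: ${row.visits} visits`);
  });
}

fetch(`${DATA_URL}/devices.json`)
  .then((response) => {
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return response.json();
  })
  .then(logDevices)
  .catch((err) => console.error('Could not load the devices report:', err));
```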
- Index: the entry point for the webpack bundler; includes the main DOM selection and the rendering queue of components
- lib/barchart: the d3 configuration of the bar charts
- lib/blocks: an object of the specific components
- lib/consoleprint: the console messages displayed to users
- lib/exceptions: agency data to be changed by discrete exception rules
- lib/formatters: methods to help format the display of visualization scales and values
- lib/renderblock: the d3 manipulator to load and render data for a component block
- lib/timeseries: the d3 configuration of the timeseries charts
- lib/transformers: helper methods to manipulate and consolidate raw data into proportional data
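As a rough sketch of how these pieces fit together (the import path, the block list, and the `renderBlock` signature below are illustrative assumptions, not the project's actual API), the entry point pairs DOM selections with the reports that feed them and renders each block in turn:

```javascript
// Illustrative sketch of an entry point in the spirit of the Index module.
// The import path and renderBlock's signature are assumptions, not the
// project's actual API; the real list of components lives in lib/blocks.
import renderBlock from './lib/renderblock';

const blocks = [
  { selector: '#devices', report: 'devices.json' },
  { selector: '#browsers', report: 'browsers.json' },
];

// Rendering queue: renderblock loads each report and uses d3 to draw the
// corresponding chart into the selected DOM node.
blocks.forEach(({ selector, report }) => renderBlock(selector, report));
```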
To deploy to analytics.usa.gov after building the site with the details in `_config.yml`:
make deploy_production
To deploy to analytics-staging.app.cloud.gov after building the site with the details in `_config.yml` and `_staging.yml`:
make deploy_staging
NOTE: 18F does not use Docker in production!
If you are using Docker in production and want to deploy just the static pages, you can build an nginx container with the static files built in by running the following command:
make docker-build-production PROD_IMAGE=yourvendor/your-image-name PROD_TAG=production
The resulting image will be an nginx server image that you can safely push and deploy to your server.
The image accepts an environment variable to specify the S3 URL from which data at `/data/*` is served:
docker run -p 8080:80 -e S3_BUCKET_URL=https://s3-us-gov-west-1.amazonaws.com/your-s3-bucket/data yourvendor/your-image-name:production
This repo has git tags, and the tags for Docker images built from this repo correspond to those git tags. In the examples below, `<version>` refers to the tag value of the current commit. When building a new version, be sure to increment the git tag appropriately.
There are two images to build: `<version>` and `<version>-production`.
To build the images:
docker build -f ./Dockerfile -t 18fgsa/analytics.usa.gov:<version> .
docker build -f ./Dockerfile.production -t 18fgsa/analytics.usa.gov:<version>-production .
To push the images:
docker push 18fgsa/analytics.usa.gov:<version>
docker push 18fgsa/analytics.usa.gov:<version>-production
Environment | Branch | URL |
---|---|---|
Production | master | https://analytics.usa.gov |
Staging | master | https://analytics-staging.app.cloud.gov |
The application compiles ES6 modules into web-friendly JavaScript via webpack and babel-loader.
The webpack configuration is set in webpack.config.js.
The current configuration uses babel preset-env.
Webpack also lints the code with ESLint, leveraging the Airbnb linting preset.
The webpack config uses UglifyJSPlugin to minify the bundle.
The resulting uglified bundle is built into `assets/bundle.js`.
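For orientation, a stripped-down configuration along those lines might look like the sketch below. This is an illustrative approximation, assuming the standalone `uglifyjs-webpack-plugin`, `babel-loader` with preset-env, and `eslint-loader` packages; consult the repository's webpack.config.js for the real settings.

```javascript
// Illustrative webpack configuration sketch, not the repo's actual file.
const path = require('path');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');

module.exports = {
  entry: './js/index.js', // assumed entry point
  output: {
    path: path.resolve(__dirname, 'assets'),
    filename: 'bundle.js',
  },
  module: {
    rules: [
      // Lint before transpiling (enforce: 'pre' runs eslint-loader first).
      {
        test: /\.js$/,
        exclude: /node_modules/,
        enforce: 'pre',
        loader: 'eslint-loader',
      },
      // Transpile ES6 with babel preset-env.
      {
        test: /\.js$/,
        exclude: /node_modules/,
        loader: 'babel-loader',
        options: { presets: ['env'] }, // babel-preset-env (assumed version)
      },
    ],
  },
  plugins: [
    // Minify the bundle.
    new UglifyJsPlugin(),
  ],
};
```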
Command | Purpose |
---|---|
npm run build-dev | A watch command that rebuilds the webpack bundle with the development configuration (i.e. no minification) |
npm run build-prod | A webpack command to build a minified and transpiled bundle.js |
This project is in the worldwide public domain. As stated in CONTRIBUTING:
This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.
All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.