| Product | Datastore Services API |
| --- | --- |
| Description | Node.js app that provides various admin API endpoints for management of the Datastore, and an API endpoint to check for the existence of IATI Identifiers in the Datastore |
| Website | https://datastore.iatistandard.org |
| Related | IATI/datastore-search |
| Documentation | https://developer.iatistandard.org/ |
| Technical Issues | https://github.com/IATI/datastore-services/issues |
| Support | https://iatistandard.org/en/guidance/get-support/ |
See the OpenAPI specification at `postman/schemas/index.yaml`. To view it locally in Swagger UI, you can use the `42crunch.vscode-openapi` VSCode extension.
- nvm - Node version manager
- Node LTS
  - This will be the latest LTS version supported by Azure Functions, set in `.nvmrc`
  - Once you've installed nvm, run `nvm use`, which looks at `.nvmrc` for the Node version; if that version isn't installed, it will prompt you to install it with `nvm install <version> --latest-npm`
- npm >= 8
  - nvm installs the version of npm packaged with Node, so make sure to use the `--latest-npm` flag to get the latest version; if you forgot to do that, install the latest version of npm with `npm i -g npm`
- Azure Functions Core Tools v3
- Azure CLI version 2.4 or later
- Clone the repo
- Follow the nvm/Node prerequisite instructions above
- Prepare your environment variables as described below
- Run `npm i`
- Run `npm start` to run the function locally using the Azure Functions Core Tools
Needs a `local.settings.json` with an Azure Storage account connection string to work locally. Use an account you've created yourself for this local testing.
```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "AzureWebJobsStorage": "<connection string>"
  }
}
```
```
cp .env.example .env
```
- `APPINSIGHTS_INSTRUMENTATIONKEY` - needs to be set for running locally, but in my experience it will not actually report telemetry to the AppInsights instance
Others:

```
PGDATABASE=<dbname>
PGHOST=<host>
PGPASSWORD=
PGPORT=5432
PGSSL=true
PGUSER=<username>@<host>
SOLR_URL=
SOLR_USERNAME=
SOLR_PASSWORD=
```
The `DOWNLOAD_CONTAINER_NAME` must be a container with Public Access for Blobs to allow unauthenticated download from the browser.

```
STORAGE_CONNECTION_STRING=
DOWNLOAD_CONTAINER_NAME=
```
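For context, a rough sketch of how these two values could be used together with `@azure/storage-blob` (the function, variable names, and paths here are illustrative, not the repo's actual upload code):

```js
// Rough sketch of publishing a generated file to the public download container.
// Illustrative only - not the repo's actual upload code.
import { BlobServiceClient } from "@azure/storage-blob";
import config from "./config/config.js"; // illustrative path

async function uploadDownloadFile(fileName, content) {
    const blobService = BlobServiceClient.fromConnectionString(config.STORAGE_CONNECTION_STRING);
    const container = blobService.getContainerClient(config.DOWNLOAD_CONTAINER_NAME);

    // Upload under a unique name; because the container allows public blob access,
    // the returned URL can be fetched from a browser without authentication.
    const blob = container.getBlockBlobClient(fileName);
    await blob.upload(content, Buffer.byteLength(content));
    return blob.url; // e.g. https://<account>.blob.core.windows.net/<container>/<fileName>
}
```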
Add in:

- `.env.example`
- `.env`
- `/config/config.js`

Import:

```js
import config from "./config.js";

let myEnvVariable = config.ENV_VAR;
```
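A minimal sketch of what `/config/config.js` might look like under this pattern, assuming it loads `.env` via `dotenv` and re-exports the values (the repo's actual file may differ):

```js
// config/config.js - minimal sketch of the assumed pattern, not the repo's actual file
import "dotenv/config"; // loads the variables defined in .env into process.env

const config = {
    PGDATABASE: process.env.PGDATABASE,
    PGHOST: process.env.PGHOST,
    SOLR_URL: process.env.SOLR_URL,
    // ...add each new environment variable here so it is available on the config object
    MY_NEW_VAR: process.env.MY_NEW_VAR, // hypothetical new variable
};

export default config;
```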
- Set a breakpoint
- Press F5 to start the Azure Function and attach the VSCode debugger
  - Configuration is contained in `.vscode/launch.json` and `.vscode/tasks.json`
- Trigger a request that will hit your breakpoint
- Enjoy!
- To show linting inline, install the ESLint extension for VSCode
- Linting is done with eslint following the airbnb-base style and using Prettier, implemented following this guide
- If you use VSCode, formatting will happen automagically on save due to the `"editor.formatOnSave": true` setting in `.vscode/settings.json`
Downloads the results of the provided Solr query to Azure Blob storage and returns the URL it can be downloaded from.
- Request
  - `query` - Solr query
  - `format` - JSON, CSV, or XML
Example:

```json
{
  "query": "activity/select?q=reporting_org_ref:\"GB-CHC\" AND recipient_country_code:PK&fl=reporting_org_ref,iati_identifier",
  "format": "CSV"
}
```
- Returns 200 Response

Example:

```json
{
  "req": {
    "query": "activity/select?q=reporting_org_ref:\"GB-CHC\" AND recipient_country_code:PK&fl=reporting_org_ref,iati_identifier",
    "format": "CSV"
  },
  "solrResponseMeta": {
    "numFound": 206,
    "start": 0,
    "numFoundExact": true,
    "docs": []
  },
  "fileName": "a1410f0b-6f42-4680-8421-d2b5313d3f02.csv",
  "url": "https://name.blob.core.windows.net/dss-downloads/a1410f0b-6f42-4680-8421-d2b5313d3f02.csv",
  "blobRequestId": "d953a662-f01e-0001-6511-0ec25c000000"
}
```
- XML - has to paginate, requesting at most `SOLR_MAX_ROWS` rows per call, because requesting all rows in one call caused Solr to return 5xx errors (see the sketch below)
  - This is likely due to using either the Velocity or XSLT response writer to generate the raw XML
- JSON, CSV, EXCEL - requests all rows in one call and streams the response directly to Blob storage; Solr seems to handle requesting all rows well for these formats
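To illustrate the XML pagination approach, a simplified sketch (`runSolrQuery` and `appendToBlob` are hypothetical helpers, and `SOLR_MAX_ROWS` here is a placeholder for the configured value):

```js
// Simplified sketch of paginated XML fetching - illustrative, not the repo's actual code
const SOLR_MAX_ROWS = 1000; // placeholder; the real value comes from configuration

async function downloadXmlInPages(numFound, runSolrQuery, appendToBlob) {
    // Request SOLR_MAX_ROWS rows at a time instead of everything at once,
    // since requesting all rows in a single call caused Solr to return 5xx errors for XML.
    for (let start = 0; start < numFound; start += SOLR_MAX_ROWS) {
        const pageXml = await runSolrQuery({ start, rows: SOLR_MAX_ROWS });
        await appendToBlob(pageXml); // append each page of raw XML to the download blob
    }
}
```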
Create a new HTTP-triggered route with:

```
func new --name <routename> --template "HTTP trigger" --authlevel "Function"
```
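For reference, the "HTTP trigger" template under the v3 Node.js programming model scaffolds a new folder containing a `function.json` binding definition and a handler roughly along these lines (a sketch of the template output, not code from this repo):

```js
// <routename>/index.js - sketch of the scaffolded HTTP trigger handler
module.exports = async function (context, req) {
    context.log("HTTP trigger function processed a request.");

    // req holds the parsed request (query string, body, headers, ...)
    context.res = {
        status: 200,
        body: { message: "Hello from <routename>" },
    };
};
```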
- Install newman globally: `npm i -g newman`
- Start the function: `npm start` (or by whatever method you prefer, if you want debugging)
- Run the tests: `npm run int:test`
You can run the tests locally using Newman, but against the dev or production Azure servers. This is useful because it can be faster than running via Postman, and it doesn't count towards any Postman quotas.
Replace `KEY` with the Azure key for the dev Function app (available in the Postman web interface):

```
npm run int:test:dev -- --env-var "keyValue=KEY"
```
Replace `KEY` with the Azure key for the prod Function app (available in the Postman web interface):

```
npm run int:test:prod -- --env-var "keyValue=KEY"
```
Integration tests are written in Postman v2.1 format and run with Newman. The easiest way to add tests is to write them in Postman (using the website or the desktop client) and then export the collection using the default filename, saving the result into this repo in `integration-tests/`.
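For example, a "Tests" script attached to a request in Postman typically looks something like this (an illustrative snippet, not one of the existing tests in this repo):

```js
// Postman "Tests" tab script - illustrative example only
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

pm.test("Response contains a download URL", function () {
    const body = pm.response.json();
    pm.expect(body.url).to.be.a("string");
});
```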
Follows the IATI Development Process