1.1. Installing for Linux (Production)
Commands in this section are root commands and should be preceded with sudo if the current user is not root.
A basic HADatAc installation involves the following five steps:
- acquiring the required software for installing/running the HADatAc Console
- downloading the HADatAc code from this GitHub repository
- deploying the HADatAc database dependencies
- deploying HADatAc
- setting up the proxy
Before anything else, verify that the host machine has a git client to retrieve the code and the sbt utility to run the console. From the command line, check that a git client is installed:
git
If you do not have git, install it: http://git-scm.com/downloads. Still from the command line, check that sbt is installed:
sbt
If not, install it: http://www.scala-sbt.org/.
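If either tool is missing and the host is Debian/Ubuntu-based, git can typically be installed straight from the package manager; sbt usually is not in the default repositories, so follow the instructions at http://www.scala-sbt.org/ for your distribution. A minimal sketch:
apt-get update
apt-get install -y git
# sbt: add the repository described at http://www.scala-sbt.org/ first, then install it with your package manager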
You will also need:
- Java JDK 8: http://java.com/en/download/ (or use your distribution's Java); make sure the machine is not running Java 9 (a quick version check is shown below)
- wget
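To confirm the Java requirement, check the version reported on the command line; the package names below are an illustrative example for Debian/Ubuntu and may differ on your distribution:
java -version
# the reported version should be 1.8.x (Java 8), not 9
apt-get install -y openjdk-8-jdk wget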
The commands in this section are to be issued as the root user.
Create both /data and /data/git folders:
mkdir -p /data/git
Go to the git directory:
cd /data/git
Once you are in /data/git, clone the HADatAc code by typing the following command:
git clone https://github.com/paulopinheiro1234/hadatac.git
After cloning HADatAc, you should have a hadatac folder under /data/git. Go into the hadatac directory by typing the following:
cd /data/git/hadatac
Once you are in /data/git/hadatac (you can check if you are in the right folder by typing pwd), type the following:
./production_install_hadatac.sh
Note: use chmod a+x *.sh if the script above cannot be executed.
This process will take a while, so let it run. It will download and install a SOLR instance and a Blazegraph instance, and then initialize them.
While the script is executing, you will be asked for an installation folder. Use /data/hadatac as the installation folder.
Open your browser to verify that the SOLR installation was successful:
- Open another tab with the url: http://localhost:8983/solr/
Open your browser to verify that the Blazegraph installation was successful:
- Open another tab with the url: http://localhost:8080/bigdata/
Once inside Blazegraph's web page (link above), select the 'Namespaces' tab and verify that the required namespaces store, store_sandbox, and store_users have been created.
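On a headless server without a browser, roughly the same checks can be made from the command line, assuming the default ports used above; these calls use Solr's core-admin API and Blazegraph's namespace listing:
# list the SOLR cores created by the installer
curl "http://localhost:8983/solr/admin/cores?action=STATUS"
# list the Blazegraph namespaces (should include store, store_sandbox, and store_users)
curl "http://localhost:8080/bigdata/namespace"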
Go back to the hadatac folder:
cd /data/git/hadatac
Now type the following:
./deploy_hadatac.sh
In your browser, open the hadatac url: http://localhost:9000/hadatac/
Wait for the page to open; it will take a while.
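Optionally, the same URL can be polled from the command line to confirm that the application is responding before the proxy is configured:
# repeat until an HTTP response comes back; startup takes a while
curl -I http://localhost:9000/hadatac/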
In the instructions below we use apache2 to proxy hadatac from localhost:9000 to (domain)/hadatac.
Assuming that apache2 is installed, change to the apache2 folder in /etc:
cd /etc/apache2
In the sites-available folder, edit the conf file. Look for
<VirtualHost *:80>
Right before the closing tag </VirtualHost>, add the following content:
ProxyRequests Off
<Proxy *>
Order deny,allow
Allow from all
</Proxy>
ProxyPass /hadatac http://127.0.0.1:9000/hadatac
ProxyPassReverse /hadatac http://127.0.0.1:9000/hadatac
Save the file.
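For reference, the edited virtual host should end up looking roughly like the sketch below (the ServerName is illustrative):
<VirtualHost *:80>
    ServerName example.org
    # ... existing directives ...
    ProxyRequests Off
    <Proxy *>
        Order deny,allow
        Allow from all
    </Proxy>
    ProxyPass /hadatac http://127.0.0.1:9000/hadatac
    ProxyPassReverse /hadatac http://127.0.0.1:9000/hadatac
</VirtualHost>
On Debian/Ubuntu, the proxy modules must also be enabled and apache2 restarted for the change to take effect:
a2enmod proxy proxy_http
systemctl restart apache2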
Go to http://(domain)/hadatac, where (domain) is the name of your domain.
Copyright (c) 2019, HADatAc.org