Dev #97

Merged: 125 commits, Nov 27, 2024
3972f83
Removed Bio Field from BackupData because of real data now
flo0852 Nov 15, 2024
eb2b564
Added endpoint for batching
flo0852 Nov 15, 2024
2351502
No more dummy data
flo0852 Nov 15, 2024
ea8c26f
Added test for batched create
flo0852 Nov 15, 2024
da7e755
Add model for results table
Omega65536 Nov 16, 2024
4ef4cf2
Add analyzer API endpoint to update the backend database
Omega65536 Nov 17, 2024
dde4b59
adapt to new data format
engelharddirk Nov 17, 2024
d7b5950
add timeline chart
chrisklg Nov 18, 2024
39cbc34
added env readout
heskil Nov 18, 2024
cf1a382
adjusted and fixed analyzer demo
heskil Nov 18, 2024
b422031
fix of broken backup
heskil Nov 18, 2024
0358946
Merge remote-tracking branch 'origin/29-backend-backup-size-timeline-…
chrisklg Nov 18, 2024
77213e4
Minor improvements
Omega65536 Nov 18, 2024
5e4d11b
added swagger specification for analyzer
heskil Nov 19, 2024
6685358
Refactor analyzer API to use a class, add a simple test
Omega65536 Nov 19, 2024
8e96571
Merge branch '29-backend-backup-size-timeline-api' of github.com:amos…
Omega65536 Nov 19, 2024
536cb71
added helper methods for aggregating the backup data
chrisklg Nov 19, 2024
b9c41f0
Merge remote-tracking branch 'origin/dev' into 29-backend-backup-size…
flo0852 Nov 19, 2024
3fe0550
Fixed package lock
flo0852 Nov 19, 2024
9c684ca
added separate Observable and Subject for the charts
chrisklg Nov 19, 2024
3a39125
Update apps/analyzer/metadata_analyzer/main.py
heskil Nov 19, 2024
7c65aef
Merge branch '29-backend-backup-size-timeline-api' into 72-backend-al…
flo0852 Nov 20, 2024
48c8f79
Added Alert Entity
flo0852 Nov 20, 2024
8882910
Added basic methods for alerting
flo0852 Nov 20, 2024
1ba5613
Fixes
flo0852 Nov 20, 2024
d3ad9c1
Send a mail on creating an Alert + more detailed mail
flo0852 Nov 21, 2024
236bdc2
Created and adjusted unit tests
flo0852 Nov 21, 2024
b729c0c
Added integration tests for alerting endpoints
flo0852 Nov 21, 2024
ffb3607
Added files for alerting
flo0852 Nov 21, 2024
cb1ab6e
Basic Functionality and design of alert panel
flo0852 Nov 21, 2024
ec2a4ba
Improved design
flo0852 Nov 21, 2024
b863b64
Introduced dynamic status check depending on criticality of alert
flo0852 Nov 21, 2024
e4d6f9f
Improved design
flo0852 Nov 21, 2024
9ff4e0f
Merge pull request #67 from amosproj/29-backend-backup-size-timeline-api
flo0852 Nov 21, 2024
e726ac7
Merge remote-tracking branch 'origin/dev' into 72-backend-alert-handling
flo0852 Nov 21, 2024
c6db41e
Merge branch '72-backend-alert-handling' into 48-frontend-blueprint-a…
flo0852 Nov 21, 2024
1e6a138
Placed alert panel on the overview page
flo0852 Nov 21, 2024
d9d2728
added reactive forms for filtering
chrisklg Nov 22, 2024
8e9ac3f
added backup filter type
chrisklg Nov 22, 2024
677cb07
created clarity datagrid filter logic
chrisklg Nov 22, 2024
37aeabe
created datagrid filtering; updated timeline amchart logic
chrisklg Nov 22, 2024
2e67116
created chart-service to initialize charts
chrisklg Nov 22, 2024
ad34181
Start implementing the simple rule-based analyzer
Omega65536 Nov 22, 2024
dcef967
changed timeline chart from line to column chart
chrisklg Nov 23, 2024
566504c
Made ApiResponse generic, don't use extra call to get total number of…
flo0852 Nov 23, 2024
230c428
Fixed sorting with pagination
flo0852 Nov 23, 2024
4d718fe
Show pagination navigation
flo0852 Nov 23, 2024
d1ad257
Fixed sorting on filter + only backend filtering + fixed date filtering
flo0852 Nov 23, 2024
50d8ffe
Let charts use own api call
flo0852 Nov 23, 2024
d340d54
Fixed pagination
flo0852 Nov 23, 2024
50fde42
Removed workaround things
flo0852 Nov 23, 2024
4e42eac
Added functionality to filter by alerts in the last x days
flo0852 Nov 23, 2024
f6372cc
Merge branch '72-backend-alert-handling' into 48-frontend-blueprint-a…
flo0852 Nov 23, 2024
1301ef2
Only show alerts from the last 7 days
flo0852 Nov 23, 2024
ace9d9a
Auto stash before merge of "47-display-backup-size-timeline" and "ori…
chrisklg Nov 23, 2024
270aff0
Add analyzer database as docker container
ddeli Nov 23, 2024
de8e385
env.docker specified for nest backend
ddeli Nov 23, 2024
73c732d
Add Mail service details to docker
ddeli Nov 23, 2024
d39b997
Adhere to naming convention, fix tests, refactor SimpleRuleBasedAnalyzer
Omega65536 Nov 23, 2024
88f12f7
fixed migrations error
flo0852 Nov 23, 2024
bb02c42
Merge branch '72-backend-alert-handling' into 48-frontend-blueprint-a…
flo0852 Nov 23, 2024
c788ea8
Prepare Analyzer Container, setup Docker env
ddeli Nov 23, 2024
1577776
Update SimpleRuleBasedAnalyzer to send alerts to the backend
Omega65536 Nov 23, 2024
c7205eb
Add basic test for the simpleRuleBasedAnalyzer
Omega65536 Nov 23, 2024
4c93573
fix first couple of column chart issues; create grouping methods
chrisklg Nov 23, 2024
4974d81
Setup Analyzer Container, use env var for main.py flask host
ddeli Nov 24, 2024
c4c6e7e
remove unused code
chrisklg Nov 24, 2024
3b8d221
add no data handler for charts
chrisklg Nov 24, 2024
a911019
edit chart positions
chrisklg Nov 24, 2024
fd48620
Fix Port for analyzer db in env.docker.example
ddeli Nov 24, 2024
109361f
Ignore time
flo0852 Nov 24, 2024
912c953
Removed unused method
flo0852 Nov 24, 2024
c666dba
clean up the code
chrisklg Nov 24, 2024
ba58a45
Merge remote-tracking branch 'origin/47-display-backup-size-timeline'…
chrisklg Nov 24, 2024
150c44a
Merge pull request #66 from amosproj/47-display-backup-size-timeline
flo0852 Nov 24, 2024
fe062db
Merge remote-tracking branch 'origin/dev' into 48-frontend-blueprint-…
flo0852 Nov 24, 2024
8b6cfbf
Adjusted page structure
flo0852 Nov 24, 2024
23abbe2
Fixed division by zero
flo0852 Nov 24, 2024
c30ead8
Merge pull request #78 from amosproj/72-backend-alert-handling
flo0852 Nov 24, 2024
ccfced2
Add script for dmp auto load into analyzer db container
ddeli Nov 24, 2024
a54b34e
Merge branch 'dev' into 74-improve-local-setup---containerization-ana…
ddeli Nov 24, 2024
2590222
Refactor and add more tests
Omega65536 Nov 24, 2024
d3af9d9
Raise exceptions when backend API call fails
Omega65536 Nov 24, 2024
9c4f74e
Fix Container connection issue and auto import database dump
ddeli Nov 24, 2024
927ea29
Update README with Docker build instructions
ddeli Nov 24, 2024
7508dfc
Added destroy for observable
flo0852 Nov 25, 2024
3f2a33e
Merge remote-tracking branch 'origin/dev' into 48-frontend-blueprint-…
flo0852 Nov 25, 2024
6d57f74
Merge pull request #79 from amosproj/48-frontend-blueprint-alert-panel
flo0852 Nov 25, 2024
d61b84d
Fix for multi os docker setup
ddeli Nov 25, 2024
95a1f0a
Update README.MD
ddeli Nov 25, 2024
7e75f31
Add ANALYZER_URL to dockerfile and env.docker
ddeli Nov 25, 2024
f7a02fd
Merge remote-tracking branch 'origin/dev' into 74-improve-local-setup…
ddeli Nov 25, 2024
48015dd
Add backend_test_pipeline.yml, temporarily commenting out failing tests
ddeli Nov 25, 2024
22e4c4e
Add specific branch to auto test rule
ddeli Nov 25, 2024
c99a0c7
Test-Commit to trigger pipeline
ddeli Nov 25, 2024
321d871
FIX Move ci scripts to the right folder
ddeli Nov 25, 2024
86bd656
FIX ci script
ddeli Nov 25, 2024
609f1ba
FIX ci script
ddeli Nov 25, 2024
9d1df41
FIX ci script changed node version
ddeli Nov 25, 2024
e15e134
FIX ci script test command
ddeli Nov 25, 2024
3379e5e
Test commit for multiple ci scripts
ddeli Nov 25, 2024
702f2d5
Add analyzer test pipeline script
ddeli Nov 25, 2024
55c935a
Fix analyzer test pipeline script
ddeli Nov 25, 2024
511f54f
Fix analyzer test pipeline script
ddeli Nov 25, 2024
888bf8a
Add limit option to the API to limit the number of triggered alerts, …
Omega65536 Nov 26, 2024
fc5f11f
migrate frontend test runner to vitest, fix some frontend tests
engelharddirk Nov 26, 2024
544bcc4
fix testDataComponentSpec
engelharddirk Nov 26, 2024
36334be
fix testUploadComponentSpec and testUploadServiceSpec
engelharddirk Nov 26, 2024
7c015be
fix testDataComponentSpec
engelharddirk Nov 26, 2024
56d1e2f
add clarity to tests, fix some warning logs
engelharddirk Nov 26, 2024
8904930
added rule based analysis of diff backups
heskil Nov 26, 2024
44d882d
added tests for diff analysis
heskil Nov 26, 2024
78e928a
removed debug prints
heskil Nov 26, 2024
0541eaa
revert test script
engelharddirk Nov 26, 2024
c087a4a
added basic rulebased analysis to inc backups
heskil Nov 26, 2024
b6bf530
small fix and removal of debug prints
heskil Nov 26, 2024
17613e3
Set dev as target branch
ddeli Nov 26, 2024
6eaa891
Merge branch 'dev' into test-fix-frontend-tests
engelharddirk Nov 26, 2024
ef31447
Delete unused lines
ddeli Nov 26, 2024
3807711
Merge pull request #81 from amosproj/74-improve-local-setup---contain…
ddeli Nov 26, 2024
bc288cf
Merge pull request #80 from amosproj/32-github-actions-backend-and-an…
ddeli Nov 26, 2024
fc4c8f0
Merge pull request #89 from amosproj/test-fix-frontend-tests
flo0852 Nov 26, 2024
c37e6d6
Merge remote-tracking branch 'origin/dev' into 50-analysis-module-fir…
flo0852 Nov 26, 2024
1e439ee
Workaround for floats
flo0852 Nov 26, 2024
6b06808
Merge pull request #90 from amosproj/50-analysis-module-first-rule-ba…
heskil Nov 26, 2024
31 changes: 26 additions & 5 deletions .env.docker.example
Original file line number Diff line number Diff line change
@@ -1,6 +1,27 @@
#Copy and rename this file to .env.docker
DATABASE_HOST="host.docker.internal"
DATABASE_PORT=5433
DATABASE_USER="postgres"
DATABASE_PASSWORD="postgres"
DATABASE_DATABASE="postgres"

#Backend
BACKEND_DATABASE_HOST="backendDatabase"
BACKEND_DATABASE_PORT=5432
BACKEND_DATABASE_USER="postgres"
BACKEND_DATABASE_PASSWORD="postgres"
BACKEND_DATABASE_DATABASE="postgres"
ANALYZER_URL="http://localhost:8000"

#Analyzer
ANALYZER_FLASK_RUN_HOST="0.0.0.0"
ANALYZER_FLASK_RUN_PORT="8000"
BACKEND_URL="http://backend:3000/api/"
ANALYZER_DATABASE_HOST="analyzerDatabase"
ANALYZER_DATABASE_PORT=5432
ANALYZER_DATABASE_USER="postgres"
ANALYZER_DATABASE_PASSWORD="postgres"
ANALYZER_DATABASE_DATABASE="postgres"

#Mailing
MAIL_HOST=smtp.example.com
MAIL_PORT=465
MAIL_USER=user@example.com
MAIL_PASSWORD=topsecret
MAIL_FROM=noreply@example.com
MAILING_LIST=example1@example.com,example2@example.com
43 changes: 43 additions & 0 deletions .github/workflows/analyzer_test_pipeline.yml
@@ -0,0 +1,43 @@
name: Analyzer Tests

on:
  push:
    branches:
      - dev
    paths-ignore:
      - 'deliverables/**'
  pull_request:
    branches:
      - dev
    paths-ignore:
      - 'deliverables/**'

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      # Step 1: Checkout the code
      - name: Checkout code
        uses: actions/checkout@v3

      # Step 2: Set up Python
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      # Step 3: Install dependencies
      - name: Install dependencies
        run: |
          npm ci
          cd ./apps/analyzer/metadata_analyzer
          python -m pip install --upgrade pip
          pip install pipx
          pipx install poetry
          poetry install

      # Step 4: Run tests
      - name: Run tests
        #working-directory: rootfolder/apps/analyzer
        run: npx nx run metadata-analyzer:test
37 changes: 37 additions & 0 deletions .github/workflows/backend_test_pipeline.yml
@@ -0,0 +1,37 @@
name: Backend Tests

# Trigger on push to dev and on pull request creation, excluding "deliverables" folder
on:
  push:
    branches:
      - dev
    paths-ignore:
      - 'deliverables/**'
  pull_request:
    branches:
      - dev
    paths-ignore:
      - 'deliverables/**'

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
      # Step 1: Checkout the code
      - name: Checkout code
        uses: actions/checkout@v3

      # Step 2: Set up Node.js environment
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      # Step 3: Install dependencies
      - name: Install dependencies
        run: npm ci

      # Step 4: Run tests
      - name: Run tests
        run: npx nx run metadata-analyzer-backend:test
13 changes: 13 additions & 0 deletions .github/workflows/ci_test2.yml
@@ -0,0 +1,13 @@
name: CI Test 2

on:
  push:
    branches:
      - dev

jobs:
  hello:
    runs-on: ubuntu-latest
    steps:
      - name: Hello World
        run: echo "CI Test 2"
7 changes: 6 additions & 1 deletion .gitignore
@@ -50,4 +50,9 @@ apps/*/.env
# Test results
apps/analyzer/.coverage
reports/*
coverage/*
coverage/*

# DB Dumps
*.dmp
*.sql
!00-init-roles.sql
1 change: 1 addition & 0 deletions 00-init-roles.sql
@@ -0,0 +1 @@
CREATE ROLE root WITH SUPERUSER LOGIN PASSWORD 'root';
11 changes: 8 additions & 3 deletions Dockerfile
@@ -1,9 +1,14 @@
# Container for the shared node module
FROM node:18-alpine
FROM node:18-bullseye



WORKDIR /app

COPY . .
COPY package*.json ./
#ENV NODE_ENV=development

RUN npm i -g nx@20.0.5
RUN npm install
RUN npm i
COPY . .
#RUN npm ci
9 changes: 3 additions & 6 deletions Documentation/README.md
@@ -11,11 +11,8 @@ cd ./apps/analyzer/metadata_analyzer ; poetry install

- `npm ci`: dependency install

- copy `.env.example` file in backend and rename to `.env` (adjust database properties according to database setup if
necessary)
- copy `.env.example` file in analyzer and rename to `.env` (adjust port properties according to backend setup if
necessary)
- To insert dummy data into table backupData you can use the SQL script `dummyData.sql` in `apps/backend/src/app/utils`
- copy `.env.example` file in backend and rename to `.env` (adjust database properties according to database setup if necessary)
- copy `.env.example` file in analyzer and rename to `.env` (adjust port properties according to backend setup if necessary)

### Running the code locally:

@@ -29,7 +26,7 @@ cd ./apps/analyzer/metadata_analyzer ; poetry install
- the entity files need to be annotated with `@Entity(<table-name>)`
- append the entity file to the `entities` array in `db-config.service.ts`
- run the following command to generate a migration file:
- `nx run metadata-analyzer-backend:migrations:generate --name <migration-name>`
- `nx run metadata-analyzer-backend:migrations:generate --name <migration-name>`
- append the generated file to the `migrations` array in `db-config.service.ts`

### Running tests
41 changes: 32 additions & 9 deletions README.md
@@ -1,31 +1,54 @@
# AMOS Backup Metadata Analyzer


## Prerequisites

Make sure the following are installed on your machine:

- **Node 20**
- **Docker**
- **Docker Compose**

## Setup Instructions
## Docker Build Setup Instructions

1. **Clone the repository**:

```bash
git clone https://github.com/amosproj/amos2024ws02-backup-metadata-analyzer.git

```

2. **Change directory**:

```bash
cd ./amos2024ws02-backup-metadata-analyzer/
cd ./amos2024ws02-backup-metadata-analyzer/

```

3. **Setup .env files**:

```bash
cp .env.docker.example .env.docker
cp apps/backend/.env.example apps/backend/.env
cp .env.docker.example .env.docker

```

4. **Copy database dump into project**:

Copy the database dump .dmp file in the projects root folder and rename it to **db_dump.sql**

5. **Clean Docker node_modules**:

4. **Docker compose up**:
```bash
docker-compose --env-file .env.docker up --build
docker volume rm amos2024ws02-backup-metadata-analyzer_mono-node-modules
```

6. **Build and start Docker container**:

```bash
docker compose --env-file .env.docker up --build

```

5. **Docker compose down**:
7. **Stop Docker Container**:
```bash
docker-compose --env-file .env.docker down
docker compose --env-file .env.docker down
```
8 changes: 7 additions & 1 deletion apps/analyzer/.env.example
@@ -1,2 +1,8 @@
FLASK_RUN_HOST="localhost"
FLASK_RUN_PORT="8000"
FLASK_RUN_PORT="8000"
BACKEND_URL = "http://localhost:3000/api/"
DATABASE_HOST="localhost"
DATABASE_PORT=5432
DATABASE_USER="postgres"
DATABASE_PASSWORD="postgres"
DATABASE_DATABASE="postgres"
29 changes: 29 additions & 0 deletions apps/analyzer/Dockerfile
@@ -0,0 +1,29 @@
# Use Alpine 3.17, which supports Python 3.11
FROM node:18-alpine3.17

# Install Python 3.11 and other dependencies
RUN apk add --no-cache python3 py3-pip python3-dev gcc musl-dev libffi-dev openssl-dev bash

# Create the virtual environment
RUN python3 -m venv /app/.venv

# Set the virtual environment path
ENV PATH="/app/.venv/bin:$PATH"

# Install Poetry
RUN pip install --no-cache --upgrade pip setuptools && \
pip install poetry

# Copy the dependency files
WORKDIR /app
COPY pyproject.toml poetry.lock ./

# Install dependencies with Poetry
RUN poetry config virtualenvs.create false && \
poetry install --no-interaction --no-ansi

#Copy the remaining code
COPY . .

# Standard command to start the application
#CMD ["/app/.venv/bin/python3", "main.py" , "--host", "0.0.0.0"]
70 changes: 70 additions & 0 deletions apps/analyzer/metadata_analyzer/analyzer.py
@@ -0,0 +1,70 @@
class Analyzer:
    def init(database, backend, simple_analyzer, simple_rule_based_analyzer):
        Analyzer.database = database
        Analyzer.backend = backend
        Analyzer.simple_analyzer = simple_analyzer
        Analyzer.simple_rule_based_analyzer = simple_rule_based_analyzer

    def analyze():
        data = list(Analyzer.database.get_results())
        converted_data = []

        for elem in data:
            if elem.data_size != None:
                converted_data.append(Analyzer._convert_result(elem))

        result = Analyzer.simple_analyzer.analyze(converted_data)

        return result

    # Convert a result from the database into the format used by the backend
    def _convert_result(result):
        return {
            "id": result.uuid,
            "sizeMB": result.data_size / 1_000_000,
            "creationDate": result.start_time.isoformat(),
        }

    def update_data():
        results = list(Analyzer.database.get_results())

        # Batch the api calls to the backend for improved efficiency
        batch = []
        count = 0
        for result in results:
            # Only send 'full' backups
            if result.fdi_type != "F":
                continue

            # Only send backups where the relevant data is not null
            if result.data_size is None or result.start_time is None:
                continue

            batch.append(Analyzer._convert_result(result))
            count += 1

            # Send a full batch
            if len(batch) == 100:
                Analyzer.backend.send_backup_data_batched(batch)
                batch = []

        # Send the remaining results
        if len(batch) > 0:
            Analyzer.backend.send_backup_data_batched(batch)

        return {"count": count}

    def simple_rule_based_analysis(alert_limit):
        data = list(Analyzer.database.get_results())
        result = Analyzer.simple_rule_based_analyzer.analyze(data, alert_limit)
        return result

    def simple_rule_based_analysis_diff(alert_limit):
        data = list(Analyzer.database.get_results())
        result = Analyzer.simple_rule_based_analyzer.analyze_diff(data, alert_limit)
        return result

    def simple_rule_based_analysis_inc(alert_limit):
        data = list(Analyzer.database.get_results())
        result = Analyzer.simple_rule_based_analyzer.analyze_inc(data, alert_limit)
        return result
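
The batching pattern in `update_data` can be exercised in isolation. Below is a hedged, stand-alone sketch of that loop: `StubBackend` and `send_in_batches` are hypothetical names introduced only for illustration, standing in for the real `Backend.send_backup_data_batched` call so the example runs without a live server.

```python
# Stand-alone sketch of update_data's batching pattern: accumulate
# results and flush every 100 items, then flush the remainder.
# StubBackend and send_in_batches are hypothetical illustration names.
class StubBackend:
    def __init__(self):
        self.batches = []

    def send_backup_data_batched(self, batch):
        # Record each batch instead of POSTing it to the backend
        self.batches.append(list(batch))

def send_in_batches(results, backend, batch_size=100):
    batch, count = [], 0
    for result in results:
        batch.append(result)
        count += 1
        if len(batch) == batch_size:  # send a full batch
            backend.send_backup_data_batched(batch)
            batch = []
    if batch:  # send the remaining results
        backend.send_backup_data_batched(batch)
    return {"count": count}

backend = StubBackend()
summary = send_in_batches(range(250), backend)
# 250 results → two full batches of 100 plus a final batch of 50
```

This mirrors why the PR batches API calls: one POST per 100 rows instead of one per row.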
15 changes: 15 additions & 0 deletions apps/analyzer/metadata_analyzer/backend.py
@@ -0,0 +1,15 @@
import requests

class Backend:
    def __init__(self, backend_url):
        self.backend_url = backend_url

    def send_backup_data_batched(self, batch):
        url = self.backend_url + "backupData/batched"
        r = requests.post(url, json=batch)
        r.raise_for_status()

    def create_alert(self, alert):
        url = self.backend_url + "alerting"
        r = requests.post(url, json=alert)
        r.raise_for_status()
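
How this client is used can be sketched without a network. Note the assumptions: the real class calls the `requests` module directly, whereas this sketch injects a hypothetical `FakeSession` through the constructor so the POSTs can be observed; the payload shapes are illustrative, not taken from the backend's actual schema.

```python
# Hedged sketch of the Backend client from the diff. FakeSession stands
# in for the `requests` module; the injected-session constructor is a
# deviation from the real class, made only so the example is runnable.
class FakeSession:
    def __init__(self):
        self.calls = []

    def post(self, url, json=None):
        self.calls.append((url, json))
        class Resp:
            def raise_for_status(self):
                pass  # pretend the backend answered 2xx
        return Resp()

class Backend:
    def __init__(self, backend_url, session):
        self.backend_url = backend_url
        self.session = session  # `requests` in production

    def send_backup_data_batched(self, batch):
        r = self.session.post(self.backend_url + "backupData/batched", json=batch)
        r.raise_for_status()  # surfaces HTTP errors as exceptions

    def create_alert(self, alert):
        r = self.session.post(self.backend_url + "alerting", json=alert)
        r.raise_for_status()

session = FakeSession()
backend = Backend("http://backend:3000/api/", session)
backend.send_backup_data_batched([{"id": "a", "sizeMB": 1.0}])
backend.create_alert({"type": "SIZE_DECREASED"})  # payload shape assumed
# session.calls[0][0] == "http://backend:3000/api/backupData/batched"
```

Because both methods call `raise_for_status()`, a failed backend call raises instead of failing silently, matching the "Raise exceptions when backend API call fails" commit above.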