# Udacity Cloud DevOps using Microsoft Azure Nanodegree Program Capstone Project: Ensuring Quality Releases
- Introduction
- Getting Started
- Install our dependencies
- Configure storage account and state backend for Terraform
- Create a Service Principal for Terraform
- Configure Pipeline Environment
- Configure an Azure Log Analytics Workspace
- Create Postman Test Suites
- Create a Selenium test for a website
- Create a Test Suite with JMeter
- Enable Monitoring & Observability
- Wrap Up
- Future Work
- References
## Introduction

This is the capstone project for the Udacity Cloud DevOps using Microsoft Azure Nanodegree Program. In this project we create disposable test environments and run a variety of automated tests with the click of a button. Additionally, we monitor and provide insight into an application's behavior, and determine root causes by querying the application's custom log files.
For this project we use the following tools:
- Azure DevOps: For creating a CI/CD pipeline to run Terraform scripts and execute tests with Selenium, Postman, and JMeter.
- Terraform: For creating Azure infrastructure as code (IaC).
- Postman: For creating a regression test suite and publishing the results to Azure Pipelines.
- Selenium: For creating a UI test suite for a website.
- JMeter: For creating a stress test suite and an endurance test suite.
- Azure Monitor: For configuring alerts to trigger given a condition from an App Service.
For this project we will follow these steps:
- Install our dependencies
- Configure storage account and state backend for Terraform
- Create a Service Principal for Terraform
- Configure Pipeline Environment
- Configure an Azure Log Analytics Workspace
- Create Postman Test Suites
- Create a Selenium test for a website
- Create a Test Suite with JMeter
- Enable Monitoring & Observability
## Getting Started

For a successful run of this project, we need to do the following prework (a quick way to verify the installs follows the list):
- Install Visual Studio Code: https://code.visualstudio.com/
- Create an Outlook Account: https://outlook.live.com/
- Create a free Azure Account: https://azure.microsoft.com/
- Create an Azure Devops account: https://azure.microsoft.com/services/devops/
- Install Azure CLI: https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest
- Install Terraform: https://learn.hashicorp.com/tutorials/terraform/install-cli#install-terraform
- Install the Java Development Kit: https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html
- Install JMeter: https://jmeter.apache.org/download_jmeter.cgi
- Install Postman: https://www.postman.com/downloads/
- Install Python: https://www.python.org/downloads/
- Install Selenium for Python: https://pypi.org/project/selenium/
- Install Chromedriver: https://sites.google.com/a/chromium.org/chromedriver/downloads
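Once the prework is done, you can sanity-check that the main tools are on the PATH by printing their versions. A minimal sketch; command names may differ slightly depending on how you installed each tool:

```bash
# Print versions to confirm each tool is installed and reachable
az --version
terraform -version
java -version
jmeter --version
python --version
```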
## Configure storage account and state backend for Terraform

Terraform state is used to reconcile deployed resources with Terraform configurations. State allows Terraform to know what Azure resources to add, update, or delete. By default, Terraform state is stored locally when you run the `terraform apply` command. This configuration isn't ideal for the following reasons:

- Local state doesn't work well in a team or collaborative environment.
- Terraform state can include sensitive information.
- Storing state locally increases the chance of inadvertent deletion.

Terraform supports persisting state in remote storage. One such supported backend is Azure Storage, and this section shows how to configure and use it for that purpose.
For the project we will use Azure Storage to persist the Terraform state remotely. For this we will run the `terraformconfig.sh` file, which has the necessary configuration for creating a blob storage container to store the state.
First, we have to ensure we are logged in to the Azure CLI by running the following command:

```bash
az login
```
Once that's done, we can run the `terraformconfig.sh` file with the following command:

```bash
sh terraformconfig.sh
```
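For reference, the script follows the steps from the official "Store Terraform state in Azure Storage" tutorial. A minimal sketch of what it does; the resource names here are placeholders, not necessarily the ones used in the repository:

```bash
#!/bin/bash
# Create a resource group, a storage account, and a blob container
# to hold the Terraform state.
RESOURCE_GROUP_NAME=tstate-rg        # placeholder name
STORAGE_ACCOUNT_NAME=tstate$RANDOM   # storage account names must be globally unique
CONTAINER_NAME=tstate

az group create --name $RESOURCE_GROUP_NAME --location eastus

az storage account create --resource-group $RESOURCE_GROUP_NAME \
  --name $STORAGE_ACCOUNT_NAME --sku Standard_LRS --encryption-services blob

ACCOUNT_KEY=$(az storage account keys list \
  --resource-group $RESOURCE_GROUP_NAME \
  --account-name $STORAGE_ACCOUNT_NAME \
  --query '[0].value' -o tsv)

az storage container create --name $CONTAINER_NAME \
  --account-name $STORAGE_ACCOUNT_NAME --account-key $ACCOUNT_KEY

# The three values the section below refers to:
echo "storage_account_name: $STORAGE_ACCOUNT_NAME"
echo "container_name: $CONTAINER_NAME"
echo "access_key: $ACCOUNT_KEY"
```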
We will get 3 outputs:
- storage_account_name
- container_name
- access_key
We will use these fields, plus a `key` field that we will define as `test.terraform.tfstate`. We will store these 4 values in a file named `azurecreds.conf`, and we will add the file name to a `.gitignore` file so that it isn't stored in our repository.
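For example, assuming both files live at the repository root:

```bash
# Make sure the credentials file is never committed
echo "azurecreds.conf" >> .gitignore
```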
In the `main.tf` in the `environments\test` directory, input the following fields:

```terraform
terraform {
  backend "azurerm" {
    storage_account_name = ""
    container_name       = ""
    key                  = ""
    access_key           = ""
  }
}
```
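With the backend block in place, the values from `azurecreds.conf` can also be supplied at initialization time instead of being hard-coded. A sketch, assuming the command is run from a directory where the relative path to the file resolves:

```bash
# Initialize the azurerm backend with the values stored in azurecreds.conf
terraform init -backend-config=azurecreds.conf
```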
## Create a Service Principal for Terraform

In the `main.tf` file we need the following data:
- tenant_id
- subscription_id
- client_id
- client_secret
For this we have to obtain our subscription_id with the following command:

```bash
az account show
```
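If you prefer to extract just the subscription id, the `--query` flag can do it in one step:

```bash
# Print only the subscription id
az account show --query id -o tsv
```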
Copy the "id" field. Now it is time to create the service principal, input the following command:
az ad sp create-for-rbac --role="Contributor" --scopes="/subscriptions/your-subscription-id"
We will get an output similar to this:

```json
{
  "appId": "00000000-0000-0000-0000-000000000000",
  "displayName": "azure-cli-2017-06-05-10-41-15",
  "name": "9d778b04-cfbe-4f86-947e-000000000000",
  "password": "0000-0000-0000-0000-000000000000",
  "tenant": "00000000-0000-0000-0000-000000000000"
}
```
These values map to the Terraform variables like so:
- `appId` is the `client_id` defined above.
- `password` is the `client_secret` defined above.
- `tenant` is the `tenant_id` defined above.
We will add these values to our `azurecreds.conf` file, so at the end we will have data similar to this in our conf file:
```
subscription_id = "12345678-b866-4328-925f-123456789"
client_id = "00000000-0000-0000-0000-000000000000"
client_secret = "0000-0000-0000-0000-000000000000"
tenant_id = "00000000-0000-0000-0000-000000000000"
storage_account_name = "tstate12345"
container_name = "tstate"
key = "test.terraform.tfstate"
access_key = "qwewqeDddsad13334324asdd7IuD4RK21jNFWq4XUwAQyYtxxneepnXxLWk49OYvTEoPydmRclPEwSBRrrreqr=="
```
## Configure Pipeline Environment

We are now ready to configure an Azure DevOps pipeline. We will need to install the Terraform extension from Microsoft DevLabs to use Terraform in our DevOps project; install it from the following URL: https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks
Now we need to create a new service connection in the project by going to Project Settings -> Service connections -> New service connection -> Azure Resource Manager -> Service Principal (Manual) -> Choose the subscription -> Fill in the data from your `azurecreds.conf` file -> Name the new service connection `azurerm-sc`. This service connection will be used in the `azure-pipelines.yml` file.
The next step is to upload our `azurecreds.conf` to Azure DevOps as a Secure File. To do this, navigate to Pipelines -> Library -> Secure Files -> + Secure File -> Upload File. Now the file should be uploaded.
Further ahead, when the pipeline is created, remember to go into the "Pipeline permissions" menu by clicking on the file name in the "Secure Files" menu and add the pipeline that we will be using.
To access the VM that Terraform creates, we will also need to upload a private key to Secure Files. To create the key pair, please follow the official documentation from Microsoft at https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/install-ssh-key?view=azure-devops and save the public key in a variable group.
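A typical way to generate such a key pair locally (the file name `id_rsa` matches the example below; the comment string is arbitrary):

```bash
# Create a 4096-bit RSA key pair: id_rsa (private) and id_rsa.pub (public)
ssh-keygen -t rsa -b 4096 -f id_rsa -C "azure-devops-pipeline"
```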
Our Secure Files should look something like this; in this case, the private key is named `id_rsa`.
We will also need a variable group; we will add the following data in a variable group named `azurecreds`:
- client_id: 'your-client-id'
- client_secret: 'your-client-secret' (click on the lock to change it to a secret variable)
- subscription_id: 'your-subscription-id'
- tenant_id: 'your-tenant-id'
- public_key: 'your-public-key'
We are ready to run the Provision stage of our pipeline.
If all the configuration is correct, the `terraform apply` command should be successful, and our resources should be deployed to the cloud.
After Terraform deploys the VM in Azure, we need to manually register the virtual machine in Pipelines -> Environments -> TEST -> Add resource -> Virtual Machines -> Linux. Then copy the registration script, manually ssh into the virtual machine, paste the script on the terminal, and run it.
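To paste and run the registration script, first open a shell on the VM with the private key created earlier; the user name and IP address below are placeholders for the values of your deployment:

```bash
# Connect to the deployed VM using the private key from Secure Files
ssh -i id_rsa azureuser@<vm-public-ip>
```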
This enables Azure Pipelines to run commands on that virtual machine. After a successful Deploy run, it should look something like this:
## Configure an Azure Log Analytics Workspace

To run the Deploy stage of our pipeline, we must configure an Azure Log Analytics Workspace before running the Deploy Virtual Machine task. To do this, run the `setup-log-analytics.sh` file in the deployments directory; modify it as needed and refer to the official Microsoft documentation if necessary: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/quick-create-workspace-cli
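At its core, the script issues a single CLI call. A minimal sketch; the resource group and workspace names are placeholders:

```bash
# Create a Log Analytics workspace in an existing resource group
az monitor log-analytics workspace create \
  --resource-group my-resource-group \
  --workspace-name udacity-log-analytics
```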
After that, navigate to the Azure Portal, go to the resource group where the workspace was created, click on the resource, and navigate to Settings -> Agents management. Navigate to Linux servers, where you will find the script to install the Linux agent on our virtual machine.
For security, copy the Workspace ID and the Primary Key, set them up in our variable group, and reference them as environment variables in the pipeline.
We are ready to run the Deploy stage of the pipeline!
If everything worked as intended, we should see "1 Linux computers connected" under Agents management in the Log Analytics Workspace.
## Create Postman Test Suites

For this part we will use Postman and Newman to test each endpoint of the web app available in the `fakerestapi` folder. We will use Postman to build the tests, and when we are ready we will export their definitions as .json files and run them with Newman in the Azure Pipeline.
We created both regression tests and validation tests, as well as an environment to store our variables. We also configured the publishing of test results to Test Plans in Azure DevOps.
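Locally, the exported collections can be exercised with Newman much like the pipeline does. A sketch, where the file names are assumptions (use the ones you exported from Postman):

```bash
# Run a collection against its environment and emit a JUnit report
# that Azure DevOps can publish to Test Plans
newman run regression-tests.json -e test-environment.json \
  --reporters cli,junit --reporter-junit-export regression-results.xml
```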
Something to note is that the API we are testing (http://dummy.restapiexample.com) is quite unstable, as it receives a lot of traffic, so some calls to the endpoints might fail with a 429 error code.
After we run the Postman tests in our pipeline we can get Test Results in Test Plans -> Runs -> Postman Test Results (The name that we defined). We should get a result similar to this in the Run Summary page:
We can navigate to the Test Results page to see exactly what tests passed and what failed.
In the Pipeline run we can check the logs of the publishing of the test results.
Please do note that this API is not very stable, so passing test results may vary depending on when you run them.
## Create a Selenium test for a website

For the next part of the project we will explain our tests, which can be found in the `selenium-test.py` file.
First we have to create our tests and later configure them to run in "headless" mode. If you are running the file on a PC, comment out these lines in the `selenium-test.py` file:
```python
from selenium import webdriver
from selenium.webdriver import ChromeOptions

options = ChromeOptions()                    # Chrome-specific settings
options.add_argument("--headless")           # run without opening a browser window
driver = webdriver.Chrome(options=options)   # start Chrome with those options
```
For these tests we used the website https://www.saucedemo.com/. We tested logging in, adding 6 items to the cart, and removing those 6 items.
In the Azure Pipeline job section, we can check the logs of the Selenium test.
We also defined an artifact that contains the logs for all Selenium runs.
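The suite can also be run locally, assuming ChromeDriver is on the PATH. A minimal sketch that keeps a copy of the output as a log file:

```bash
# Run the Selenium suite and capture its output
python selenium-test.py | tee selenium-test.log
```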
## Create a Test Suite with JMeter

In this step we will create both endurance tests and stress tests with Apache JMeter.
Open `stress-test.jmx` and `endurance-test.jmx`, navigate to the View Results Tree tab, run the test, and then navigate to Tools -> Generate HTML Report. Browse for the results file, the user.properties file, and the output directory, and generate the HTML reports for both tests.
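The same runs can be reproduced from the command line, which is effectively what the pipeline does. A sketch with assumed output file names:

```bash
# Run the stress test in non-GUI mode, log results to a CSV,
# and generate the HTML dashboard report in one pass
jmeter -n -t stress-test.jmx -l stress-test-results.csv -e -o stress-test-report
```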
By running the tests in Azure Pipelines, we can get summaries for both of them.
We also have JMeter artifacts, which we can download if needed.
## Enable Monitoring & Observability

In this final section, we will enable monitoring and observability on our virtual machine and App Service to observe the effects of our tests.
We can set up alerts to fire when a resource meets certain conditions; we will set up an email alert to fire when requests are greater than or equal to 10.
In udacitytest-AppService, go to Monitoring, click on Alerts, then + New alert rule. In Condition we will select `Requests (Platform)`; in Alert logic we will select `Greater than or equal to` as the Operator and 10 as the Threshold value. Leave the rest of the configuration as is.
In Actions we will add an action group: create an action group named `actionGroupUdacity`. In Notifications, select `Email/SMS message/Push/Voice` as the Notification Type, add our email, and name the notification `emailNotification`.
We will leave the other configurations as is and jump to Review + Create, where we will hit Create.
Next, we will fill in the Alert Rule Details: we will name the rule `10Requests` and leave the severity as `3 - Informational`.
Finally, hit Create Alert Rule.
The Alert Rules should look like this:
When the alert fires, we should get an email similar to this:
As we selected that we should get an alert if the AppService gets 10 requests or more, let's look at the requests graph available in the Azure Portal.
In this graph we can see that the number of requests went above 10.
We can also see the severity of our alerts in the Alerts section of the App Service.
As we previously configured Azure Log Analytics, we can check the outputs of the Selenium test suite in the Azure Portal. For this we will configure custom logs.
To configure custom logs go to your Log Analytics Workspace -> Settings -> Custom logs -> Upload sample log.
We will download and use the `selenium-test.log` artifact that we set up earlier in the pipeline; it should contain logs similar to this:
```
2021-07-08 05:42:26 Browser started successfully. Navigating to the demo page to login.
2021-07-08 05:42:27 Login successful with username standard_user and password secret_sauce
2021-07-08 05:42:28 Sauce Labs Bike Light added to shopping cart!
2021-07-08 05:42:28 Sauce Labs Bolt T-Shirt added to shopping cart!
2021-07-08 05:42:28 Sauce Labs Onesie added to shopping cart!
2021-07-08 05:42:29 Test.allTheThings() T-Shirt (Red) added to shopping cart!
2021-07-08 05:42:29 Sauce Labs Backpack added to shopping cart!
2021-07-08 05:42:29 Sauce Labs Fleece Jacket added to shopping cart!
2021-07-08 05:42:29 6 items added to cart successfully.
2021-07-08 05:42:29 Sauce Labs Bike Light removed from shopping cart!
2021-07-08 05:42:29 Sauce Labs Bolt T-Shirt removed from shopping cart!
2021-07-08 05:42:30 Sauce Labs Onesie removed from shopping cart!
2021-07-08 05:42:30 Test.allTheThings() T-Shirt (Red) removed from shopping cart!
2021-07-08 05:42:30 Sauce Labs Backpack removed from shopping cart!
2021-07-08 05:42:30 Sauce Labs Fleece Jacket removed from shopping cart!
2021-07-08 05:42:30 6 items removed from cart successfully.
2021-07-08 05:42:30 Selenium Tests DONE
```
In Record delimiter we will select `New line`. In Collection paths we will select `Linux`, and in Path we will put the path where the logs are located; in our case, `/home/george/azagent/_work/1/s/log/selenium/selenium-test.log`. In Details, we will define the Custom log name as `SeleniumTestLogs`.
Finally, in Review + Create we will create the custom log.
We can query it in the Logs section of the Log Analytics Workspace by writing `SeleniumTestLogs_CL`.
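The same table can also be queried from the command line. A sketch, where the workspace GUID is a placeholder:

```bash
# Query the custom log table through the Azure CLI
az monitor log-analytics query \
  --workspace <workspace-guid> \
  --analytics-query "SeleniumTestLogs_CL | sort by TimeGenerated desc"
```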
## Wrap Up

Finally, we can see that our complete pipeline is correctly executed!
## Future Work

- We could cause errors or other scenarios for the App Service/VM and demonstrate those behaviors in the test suites as well as in Azure Monitor and Log Analytics.
- We could create a VM Scale Set in Terraform and complete each of the steps with the VM Scale set.
## References

- Udacity Project Starter Files
- Visual Studio Code
- Outlook
- Azure
- Azure DevOps
- Azure Command Line Interface
- Terraform
- Terraform Azure Documentation
- Java Development Kit
- JMeter
- Postman
- Python
- Selenium for Python
- Chromedriver
- Tutorial: Store Terraform state in Azure Storage
- Get subscription id with Azure CLI
- Azure Provider: Authenticating using a Service Principal with a Client Secret
- Terraform - Microsoft DevLabs
- Install SSH Key Task
- Azure CLI Authentication does not work when using the Azure CLI task from Azure DevOps
- Resources in YAML
- Terraform on Azure Pipelines Best Practices
- Use secure files
- Create a Log Analytics workspace with Azure CLI 2.0
- Install Log Analytics agent on Linux computers
- Sauce Demo
- Running collections on the command line with Newman
- Dummy Rest API Example
- Collect custom logs with Log Analytics agent in Azure Monitor