
Add autoStop listener to avoid running tests when certain statistics criteria are not met

Include automatic support for AzureEngine and fix jsr223Sampler naming
rabelenda-abstracta committed Aug 18, 2023
1 parent a08df93 commit 72874bd
Showing 38 changed files with 2,149 additions and 68 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/release.yml
@@ -43,7 +43,7 @@ jobs:
branch: master
file_pattern: 'pom.xml */pom.xml README.md docs/index.md docs/guide/**'
- name: package release
run: mvn --batch-mode --no-transfer-progress clean install --settings .github/settings.xml
run: mvn --batch-mode --no-transfer-progress clean install -Dfailsafe.skipAfterFailureCount=1 --settings .github/settings.xml
env:
BZ_TOKEN: ${{ secrets.BZ_TOKEN }}
OCTOPERF_API_KEY: ${{ secrets.OCTOPERF_API_KEY }}
75 changes: 75 additions & 0 deletions docs/guide/autostop.md
@@ -0,0 +1,75 @@
## Auto Stop

As previously shown, it is quite easy to check, after test plan execution, whether the collected metrics are the expected ones and pass or fail the test accordingly.

But what if you want to stop your test plan as soon as the metrics deviate from the expected ones? This helps avoid unnecessary resource usage and, especially when conducting tests at scale, incurring additional costs.

With JMeter DSL you can easily define auto-stop conditions over collected metrics that, when met, stop the test plan and throw an exception that makes your test fail.

Here is an example:

```java
import static us.abstracta.jmeter.javadsl.JmeterDsl.*;
import static us.abstracta.jmeter.javadsl.core.listeners.AutoStopListener.AutoStopCondition.*;

import java.io.IOException;
import java.time.Duration;
import org.junit.jupiter.api.Test;
import us.abstracta.jmeter.javadsl.core.TestPlanStats;

public class PerformanceTest {

  @Test
  public void testPerformance() throws IOException {
    TestPlanStats stats = testPlan(
        threadGroup(2, Duration.ofMinutes(1),
            httpSampler("http://my.service")
        ),
        autoStop()
            .when(errors().total().isGreaterThan(0)) // when any sample fails, the test plan stops and an exception pointing to this condition is thrown
    ).run();
  }

}

```

Check [AutoStopListener](/jmeter-java-dsl/src/main/java/us/abstracta/jmeter/javadsl/core/listeners/AutoStopListener.java) for details on available options for auto-stop conditions.

`autoStop` is inspired by the [JMeter AutoStop Plugin](https://jmeter-plugins.org/wiki/AutoStop/), but provides a lot more flexibility.

::: tip
`autoStop` will only consider samples within its scope.

If you place it as a test plan child, it will evaluate metrics for all samples. If you place it as a thread group child, it will only evaluate metrics for samples of that thread group. If you place it as a controller child, it will only evaluate samples within that controller. And if you place it as a sampler child, it will only evaluate samples for that particular sampler.

Additionally, you can use the `samplesMatching(regex)` method to only evaluate metrics for a subset of samples within a given scope (e.g. all samples with a label starting with `users`), as sketched after this tip.
:::
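
For illustration, here is a minimal sketch of scoping `autoStop` to a thread group and to a subset of samples. It reuses the imports and test structure from the example above; the exact chaining of `samplesMatching` before `when`, and the sampler labels used, are assumptions for illustration, so check `AutoStopListener` for the actual signatures.

```java
testPlan(
    threadGroup(2, Duration.ofMinutes(1),
        httpSampler("users", "http://my.service/users"),
        httpSampler("orders", "http://my.service/orders"),
        // placed as a thread group child: only samples of this thread group are evaluated
        autoStop()
            // assumed usage: only evaluate samples whose label starts with "users"
            .samplesMatching("users.*")
            .when(errors().total().isGreaterThan(0))
    )
).run();
```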

::: tip
You can add multiple `autoStop` elements within a test plan. The first one containing a condition that is met will trigger the auto-stop.

To identify which `autoStop` element triggered the stop, you can specify a name, like `autoStop("login")`; that name will be included in the exception thrown by `autoStop` when the test plan is stopped.

Additionally, you can specify several conditions on an `autoStop` element. When any of those conditions is met, the test plan is stopped, as sketched after this tip.
:::
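
For example, here is a sketch of a named `autoStop` with more than one condition. Expressing multiple conditions by chaining `when` calls is an assumption based on the description above; check `AutoStopListener` for the actual API.

```java
testPlan(
    threadGroup(2, Duration.ofMinutes(1),
        httpSampler("http://my.service/login")
    ),
    // "login" will be included in the exception message if this element stops the test plan
    autoStop("login")
        .when(errors().total().isGreaterThan(0))
        .when(errors().perSecond().isGreaterThan(5))
).run();
```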

::: tip
By default, `autoStop` will evaluate each condition for each sample and stop the test plan as soon as a condition is met.

This behavior is different from the [JMeter AutoStop Plugin](https://jmeter-plugins.org/wiki/AutoStop/), which evaluates and resets aggregations (it only provides average aggregation) every second.

To change this behavior you can use the `every(Duration)` method (after specifying the aggregation method, e.g. `errors().perSecond().every(Duration.ofSeconds(5))`) to specify that the condition should only be evaluated, and the aggregation reset, every given period.

**This is particularly helpful for some aggregations (like `mean`, `perSecond`, and `percent`) which may get "stuck" due to historical values collected for the metric.**

As an example to illustrate this issue, consider a scenario where after 10 minutes you have 10k requests with an average sample time of 1 second, but in the last 10 seconds you get 10 requests with an average of 10 seconds. The overall average barely moves (from 1 second to roughly 1.01 seconds in this case), but you would still want to stop the test plan, since the average over the last seconds is way above the expected value. This is a clear scenario where you would like to use the `every()` method, as sketched after this tip.
:::
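
As a sketch of the above (assuming the condition-builder methods shown in the earlier examples), the following evaluates the per-second error rate, and resets its aggregation, every 5 seconds:

```java
testPlan(
    threadGroup(2, Duration.ofMinutes(1),
        httpSampler("http://my.service")
    ),
    autoStop()
        // evaluate the condition, and reset the aggregation, every 5 seconds
        .when(errors().perSecond().every(Duration.ofSeconds(5)).isGreaterThan(1))
).run();
```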

::: tip
By default, `autoStop` stops the test plan as soon as a condition is met, but in many cases it is better to require the condition to hold for some period of time, to avoid stopping on an intermittent or short-lived deviation. To only stop the test plan once the condition has held for a given period, add `holdsFor(Duration)` at the end of your condition, as sketched after this tip.
:::
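
For example, a sketch (under the same assumptions as the previous examples) that only stops the test plan if the error rate stays above the threshold for 10 consecutive seconds:

```java
testPlan(
    threadGroup(2, Duration.ofMinutes(1),
        httpSampler("http://my.service")
    ),
    autoStop()
        // only stop if more than 1 error per second is sustained for 10 seconds
        .when(errors().perSecond().isGreaterThan(1).holdsFor(Duration.ofSeconds(10)))
).run();
```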

::: warning
`autoStop` automatically works with `AzureEngine`, but no support has been implemented yet for `BlazeMeterEngine` or `OctoPerfEngine`. If you need such support, please create [an issue in the GitHub repository](https://github.com/abstracta/jmeter-java-dsl/issues).
:::
1 change: 1 addition & 0 deletions docs/guide/index.md
@@ -19,6 +19,7 @@ For an intro to JMeter concepts and components, you can check [JMeter official d
<!-- @include: recorder/index.md -->
<!-- @include: jmx2dsl.md -->
<!-- @include: scale/index.md -->
<!-- @include: autostop.md -->
<!-- @include: thread-groups/index.md -->
<!-- @include: debugging/index.md -->
<!-- @include: reporting/index.md -->
2 changes: 2 additions & 0 deletions docs/guide/response-processing/lambdas.md
@@ -108,6 +108,8 @@ Here are the steps to run test plans containing Java lambdas in `BlazeMeterEngin
<configuration>
<outputDirectory>${project.build.directory}/libs</outputDirectory>
<!-- include here, separating by commas, any additional dependencies (just the artifacts ids) you need to upload to BlazeMeter -->
<!-- AzureEngine automatically uploads JMeter DSL artifacts, so only transitive or custom dependencies would be required -->
<!-- if you would like BlazeMeterEngine and OctoPerfEngine to automatically upload JMeter DSL artifacts, please create an issue in the GitHub repository -->
<includeArtifactIds>jmeter-java-dsl</includeArtifactIds>
</configuration>
</execution>
AzureClient.java
@@ -60,6 +60,7 @@ public class AzureClient extends BaseRemoteEngineApiClient {
private String managementToken;
private String loadTestToken;
private LoadTestApi loadTestApi;
private String dataPlaneUrl;

public AzureClient(String tenantId, String clientId, String clientSecret) {
super(LOG);
@@ -231,15 +232,15 @@ Call<Void> uploadTestFile(@Path("testId") String testId, @Path("fileName") Strin
@GET("tests/{testId}" + API_VERSION)
Call<LoadTest> findTestById(@Path("testId") String testId);

@PATCH("/test-runs/{testRunId}" + API_VERSION)
@PATCH("test-runs/{testRunId}" + API_VERSION)
@Headers(MERGE_PATCH_CONTENT_TYPE_HEADER)
Call<TestRun> createTestRun(@Path("testRunId") String testRunId,
@Query("tenantId") String tenantId, @Body TestRun testRun);

@GET("/test-runs/{testRunId}" + API_VERSION)
@GET("test-runs/{testRunId}" + API_VERSION)
Call<TestRun> findTestRunById(@Path("testRunId") String id);

@POST("/test-runs/{testRunId}:stop" + API_VERSION)
@POST("test-runs/{testRunId}:stop" + API_VERSION)
Call<Void> stopTestRun(@Path("testRunId") String id);

}
@@ -271,7 +272,7 @@ private <T> Optional<T> execOptionalApiCall(Call<T> call) throws IOException {
throw buildRemoteEngineException(response.code(), errorBody.string());
}
}
return Optional.of(response.body());
return Optional.ofNullable(response.body());
}

public Location findLocation(Subscription subscription) throws IOException {
@@ -294,15 +295,19 @@ public LoadTestResource findTestResource(String name, ResourceGroup resourceGrou
resourceGroup.getName(), name, RequestOrigin.MANAGEMENT))
.map(r -> {
r.setResourceGroup(resourceGroup);
initLoadTestApi(r);
setLoadTestResource(r);
return r;
})
.orElse(null);
}

private void initLoadTestApi(LoadTestResource r) {
loadTestApi = buildApiFor(String.format("https://%s/", r.getDataPlaneUri()),
LoadTestApi.class);
private void setLoadTestResource(LoadTestResource r) {
dataPlaneUrl = String.format("https://%s", r.getDataPlaneUri());
loadTestApi = buildApiFor(dataPlaneUrl + "/", LoadTestApi.class);
}

public String getDataPlaneUrl() {
return dataPlaneUrl;
}

public void createTestResource(LoadTestResource testResource) throws IOException {
@@ -312,16 +317,19 @@ public void createTestResource(LoadTestResource testResource) throws IOException
RequestOrigin.MANAGEMENT));
testResource.setId(created.getId());
testResource.setProvisioningState(created.getProvisioningState());
initLoadTestApi(created);
setLoadTestResource(created);
}

public LoadTest findTestByName(String testName, LoadTestResource testResource)
throws IOException {
return execApiCall(loadTestApi.findTestByName(testName)).getFirstElement()
.map(t -> {
t.setTestResource(testResource);
return t;
})
return execApiCall(loadTestApi.findTestByName(testName)).stream()
/*
this is necessary because API returns all tests elements with name starting with the given
string
*/
.filter(t -> testName.equals(t.getDisplayName()))
.peek(t -> t.setTestResource(testResource))
.findAny()
.orElse(null);
}

AzureEngine.java
@@ -26,6 +26,8 @@
import us.abstracta.jmeter.javadsl.azure.api.TestRun;
import us.abstracta.jmeter.javadsl.core.BuildTreeContext;
import us.abstracta.jmeter.javadsl.core.DslJmeterEngine;
import us.abstracta.jmeter.javadsl.core.DslTestPlan;
import us.abstracta.jmeter.javadsl.core.engines.TestStopper;
import us.abstracta.jmeter.javadsl.engines.BaseRemoteEngine;

/**
@@ -312,11 +314,25 @@ public AzureEngine monitoredResources(String... resourceIds) {
return this;
}

@Override
protected HashTree buildTree(DslTestPlan testPlan, BuildTreeContext context) {
HashTree ret = super.buildTree(testPlan, context);
if (isAutoStoppableTest(ret)) {
AzureTestStopper.addClientSecretVariableToTree(ret, context);
}
return ret;
}

@Override
protected AzureClient buildClient() {
return new AzureClient(tenantId, clientId, clientSecret);
}

@Override
protected TestStopper buildTestStopper() {
return new AzureTestStopper();
}

@Override
protected AzureTestPlanStats run(File jmxFile, HashTree tree, BuildTreeContext context)
throws IOException, InterruptedException, TimeoutException {
@@ -343,9 +359,13 @@ protected AzureTestPlanStats run(File jmxFile, HashTree tree, BuildTreeContext c
clearTestFiles(loadTest);
updateAppComponents(loadTest);
}
uploadTestFiles(loadTest, jmxFile, context);
uploadTestFiles(jmxFile, tree, context, loadTest);
awaitValidatedTestFile(loadTest);
TestRun testRun = new TestRun(loadTest.getTestId(), solveTestRunName());
if (isAutoStoppableTest(tree)) {
AzureTestStopper.setupTestRun(testRun, tenantId, clientId, clientSecret,
apiClient.getDataPlaneUrl());
}
testRun = apiClient.createTestRun(testRun);
if (!testRun.isAccepted()) {
throw new IllegalStateException(
@@ -458,13 +478,16 @@ private void updateAppComponents(LoadTest loadTest) throws IOException {
}
}

private void uploadTestFiles(LoadTest loadTest, File jmxFile, BuildTreeContext context)
throws IOException {
private void uploadTestFiles(File jmxFile, HashTree tree, BuildTreeContext context,
LoadTest loadTest) throws IOException {
LOG.info("Uploading test script and asset files");
context.processAssetFile(jmxFile.getPath());
for (File f : assets) {
context.processAssetFile(f.getPath());
}
for (File f : findDependencies(tree, context)) {
context.processAssetFile(f.getPath());
}
for (Map.Entry<String, File> asset : context.getAssetFiles().entrySet()) {
apiClient.uploadTestFile(asset.getValue(), asset.getKey(), loadTest.getTestId());
}
@@ -497,6 +520,7 @@ private TestRun awaitTestEnd(TestRun testRun)
LOG.info("Test run stopped.");
throw new TimeoutException("Test execution timed out after " + prettyTimeout);
}
AzureTestStopper.handleTestEnd(testRun);
return awaitVirtualUsers(testRun);
}

@@ -507,7 +531,7 @@ private TestRun awaitVirtualUsers(TestRun testRun) throws InterruptedException,
testRun = apiClient.findTestRunById(testRun.getId());
}
if (!testRun.isSuccess()) {
throw new IllegalStateException("Test has been " + testRun.getStatus().toLowerCase());
throw new IllegalStateException("Test " + testRun.getStatus().toLowerCase());
}
return testRun;
}
(remaining changed files not shown)
