Add support for Dart and Mocha JSON test results (#404)
EnricoMi authored Mar 17, 2023
1 parent acf931c commit d2532d3
Showing 36 changed files with 1,478 additions and 82 deletions.
1 change: 1 addition & 0 deletions .github/workflows/ci-cd.yml
@@ -567,6 +567,7 @@ jobs:
files: |
test-files/**/*.xml
test-files/**/*.trx
test-files/**/*.json
junit_files: "test-files/junit-xml/**/*.xml"
nunit_files: "test-files/nunit/**/*.xml"
xunit_files: "test-files/xunit/**/*.xml"
31 changes: 18 additions & 13 deletions README.md
@@ -9,16 +9,18 @@
![Ubuntu badge](misc/badge-ubuntu.svg)
![macOS badge](misc/badge-macos.svg)
![Windows badge](misc/badge-windows.svg)
   
![JUnit badge](misc/badge-junit-xml.svg)
![NUnit badge](misc/badge-nunit-xml.svg)
![XUnit badge](misc/badge-xunit-xml.svg)
![TRX badge](misc/badge-trx.svg)
![Dart badge](misc/badge-dart.svg)
![Mocha badge](misc/badge-mocha.svg)


[![Test Results](https://gist.githubusercontent.com/EnricoMi/612cb538c14731f1a8fefe504f519395/raw/tests.svg)](https://gist.githubusercontent.com/EnricoMi/612cb538c14731f1a8fefe504f519395/raw/tests.svg)

This [GitHub Action](https://github.com/actions) analyses test result files and
publishes the results on GitHub. It supports the [TRX file format and JUnit, NUnit and XUnit XML formats](#generating-test-result-files),
publishes the results on GitHub. It supports [JSON (Dart, Mocha), TRX (MSTest, VS) and XML (JUnit, NUnit, XUnit) file formats](#generating-test-result-files),
and runs on Linux, macOS and Windows.

You can add this action to your GitHub workflow for ![Ubuntu Linux](https://badgen.net/badge/icon/Ubuntu?icon=terminal&label) (e.g. `runs-on: ubuntu-latest`) runners:
@@ -31,6 +33,7 @@ You can add this action to your GitHub workflow for ![Ubuntu Linux](https://badg
files: |
test-results/**/*.xml
test-results/**/*.trx
test-results/**/*.json
```
Use this for ![macOS](https://badgen.net/badge/icon/macOS?icon=apple&label) (e.g. `runs-on: macos-latest`)
@@ -44,6 +47,7 @@ and ![Windows](https://badgen.net/badge/icon/Windows?icon=windows&label) (e.g. `
files: |
test-results/**/*.xml
test-results/**/*.trx
test-results/**/*.json
```
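For context, the truncated snippets above fit into a complete job like the following minimal sketch; the checkout step, the test command, and the `@v2` version tag are assumptions, not part of this diff:

```yaml
name: CI

on: [push]

jobs:
  test-results:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run tests
        run: ./run-tests.sh   # hypothetical command producing files under test-results/
      - name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()          # publish results even when the test step failed
        with:
          files: |
            test-results/**/*.xml
            test-results/**/*.trx
            test-results/**/*.json
```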

See the [notes on running this action as a composite action](#running-as-a-composite-action) if you run it on Windows or macOS.
@@ -59,17 +63,18 @@ This behaviour is configurable.*
## Generating test result files

Supported test result files can be generated by many test environments. Here is a small overview; it is by no means complete.
Check your favorite development and test environment for its TRX file or JUnit, NUnit, XUnit XML file support.

|Test Environment |Language| JUnit<br/>XML | NUnit<br/>XML | XUnit<br/>XML | TRX<br/>file |
|-----------------|:------:|:---------:|:---------:|:---------:|:---:|
|[Jest](https://jestjs.io/docs/configuration#default-reporter)|JavaScript|:heavy_check_mark:| | | |
|[Maven](https://maven.apache.org/surefire/maven-surefire-plugin/examples/junit.html)|Java, Scala, Kotlin|:heavy_check_mark:| | | |
|[Mocha](https://mochajs.org/#xunit)|JavaScript|:heavy_check_mark:| |[not xunit](https://github.com/mochajs/mocha/issues/4758)| |
|MSTest |.Net|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|
|[pytest](https://docs.pytest.org/en/latest/how-to/output.html#creating-junitxml-format-files)|Python|:heavy_check_mark:| | | |
|[sbt](https://www.scala-sbt.org/release/docs/Testing.html#Test+Reports)|Scala|:heavy_check_mark:| | | |
|Your favorite<br/>environment|Your favorite<br/>language|probably<br/>:heavy_check_mark:| | | |
Check your favorite development and test environment for its support of JSON, TRX, or JUnit/NUnit/XUnit XML files.

|Test Environment |Language| JUnit<br/>XML | NUnit<br/>XML | XUnit<br/>XML | TRX<br/>file | JSON<br/>file |
|-----------------|:------:|:---------:|:---------:|:---------:|:---:|:---:|
|[Dart](https://github.com/dart-lang/test/blob/master/pkgs/test/doc/json_reporter.md)|Dart, Flutter| | | | | :heavy_check_mark: |
|[Jest](https://jestjs.io/docs/configuration#default-reporter)|JavaScript|:heavy_check_mark:| | | | |
|[Maven](https://maven.apache.org/surefire/maven-surefire-plugin/examples/junit.html)|Java, Scala, Kotlin|:heavy_check_mark:| | | | |
|[Mocha](https://mochajs.org/#xunit)|JavaScript|:heavy_check_mark:| |[not xunit](https://github.com/mochajs/mocha/issues/4758)| | :heavy_check_mark: |
|MSTest |.Net|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:| |
|[pytest](https://docs.pytest.org/en/latest/how-to/output.html#creating-junitxml-format-files)|Python|:heavy_check_mark:| | | | |
|[sbt](https://www.scala-sbt.org/release/docs/Testing.html#Test+Reports)|Scala|:heavy_check_mark:| | | | |
|Your favorite<br/>environment|Your favorite<br/>language|probably<br/>:heavy_check_mark:| | | | |
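The two new JSON columns correspond to reporters that are typically invoked like this (assuming the standard Dart and Mocha CLIs; the output paths are only examples):

```shell
# Dart: the JSON reporter streams one event object per line to stdout
dart test --reporter json > test-results/dart.json

# Mocha: the built-in json reporter writes a single JSON document
mocha --reporter json > test-results/mocha.json
```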

## What is new in version 2

20 changes: 20 additions & 0 deletions misc/badge-dart.svg
20 changes: 20 additions & 0 deletions misc/badge-mocha.svg
126 changes: 126 additions & 0 deletions python/publish/dart.py
@@ -0,0 +1,126 @@
import json
from collections import defaultdict
from typing import Dict, Any, List

from junitparser.junitparser import etree

from publish.junit import JUnitTree


def is_dart_json(path: str) -> bool:
if not path.endswith('.json'):
return False

try:
with open(path, 'rt') as r:
line = r.readline()
event = json.loads(line)
# {"protocolVersion":"0.1.1","runnerVersion":"1.23.1","pid":1705,"type":"start","time":0}
return event.get('type') == 'start' and 'protocolVersion' in event
except BaseException:
return False


def parse_dart_json_file(path: str) -> JUnitTree:
    tests: Dict[int, Dict[Any, Any]] = defaultdict(dict)
    suites: Dict[int, Dict[Any, Any]] = defaultdict(dict)
    suite_tests: Dict[int, List[Any]] = defaultdict(list)
suite_start = None
suite_time = None

with open(path, 'rt') as r:
for line in r:
# https://github.com/dart-lang/test/blob/master/pkgs/test/doc/json_reporter.md
event = json.loads(line)
type = event.get('type')

if type == 'start':
suite_start = event.get('time')
elif type == 'suite' and 'suite' in event and 'id' in event['suite']:
suite = event['suite']
id = suite['id']
suites[id]['path'] = suite.get('path')
suites[id]['start'] = event.get('time')
elif type == 'testStart' and 'test' in event and 'id' in event['test']:
test = event['test']
id = test['id']
tests[id]['name'] = test.get('name')
tests[id]['suite'] = test.get('suiteID')
tests[id]['line'] = test.get('line') # 1-based
tests[id]['column'] = test.get('column') # 1-based
tests[id]['url'] = test.get('url')
tests[id]['start'] = event.get('time')
if test.get('suiteID') is not None:
suite_tests[test.get('suiteID')].append(tests[id])
elif type == 'testDone' and 'testID' in event:
id = event['testID']
tests[id]['result'] = event.get('result')
tests[id]['hidden'] = event.get('hidden')
tests[id]['skipped'] = event.get('skipped')
tests[id]['end'] = event.get('time')
elif type == 'error' and 'testID' in event:
id = event['testID']
tests[id]['error'] = event.get('error')
tests[id]['stackTrace'] = event.get('stackTrace')
tests[id]['isFailure'] = event.get('isFailure')
            elif type == 'print' and 'testID' in event and event.get('messageType') == 'skip':
                id = event['testID']
                tests[id]['reason'] = event.get('message')
elif type == 'done':
suite_time = event.get('time')

def create_test(test):
testcase = etree.Element('testcase', attrib={k: str(v) for k, v in dict(
name=test.get('name'),
file=test.get('url'),
line=test.get('line'),
time=(test['end'] - test['start']) / 1000.0 if test.get('start') is not None and test.get('end') is not None else None,
).items() if isinstance(v, str) and v or v is not None})

test_result = test.get('result', 'error')
if test_result != 'success':
result = etree.Element('error' if test_result != 'failure' else test_result, attrib={k: v for k, v in dict(
message=test.get('error')
).items() if v})
result.text = etree.CDATA('\n'.join(text
for text in [test.get('error'), test.get('stackTrace')]
if text))
testcase.append(result)
elif test.get('skipped', False):
result = etree.Element('skipped', attrib={k: v for k, v in dict(
message=test.get('reason')
).items() if v})
testcase.append(result)

return testcase

def create_suite(suite, tests):
testsuite = etree.Element('testsuite', attrib={k: str(v) for k, v in dict(
name=suite.get('path'),
time=(suite['end'] - suite['start']) / 1000.0 if suite.get('start') is not None and suite.get('end') is not None else None,
tests=str(len(tests)),
failures=str(len([test for test in tests if test.get('isFailure', False)])),
errors=str(len([test for test in tests if not test.get('isFailure', True)])),
skipped=str(len([test for test in tests if test.get('skipped', False)])),
).items() if isinstance(v, str) and v or v is not None})

testsuite.extend(create_test(test) for test in tests)

return testsuite

    # do not count hidden tests, unless they did not succeed
visible_tests = [test for test in tests.values() if test.get('hidden') is not True or test.get('result') != 'success']
testsuites = etree.Element('testsuites', attrib={k: str(v) for k, v in dict(
time=(suite_time - suite_start) / 1000.0 if suite_start is not None and suite_time is not None else None,
tests=str(len(visible_tests)),
failures=str(len([test for test in visible_tests if test.get('isFailure', False)])),
errors=str(len([test for test in visible_tests if not test.get('isFailure', True)])),
skipped=str(len([test for test in visible_tests if test.get('skipped', False)])),
).items() if v is not None})

testsuites.extend([create_suite(suite, [test
for test in suite_tests[suite_id]
if test.get('hidden') is not True])
for suite_id, suite in suites.items()])

xml = etree.ElementTree(testsuites)
return xml
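Taken together, the two functions above implement a detect-then-fold pattern: `is_dart_json` peeks only at the first line of the file, and `parse_dart_json_file` folds the line-delimited event stream into JUnit elements. Below is a compressed, self-contained sketch of the same flow; the sample events are invented, it uses the stdlib `xml.etree` instead of junitparser's etree, and it handles only a subset of the event fields:

```python
import json
import tempfile
import xml.etree.ElementTree as ET

# Lines in the shape emitted by `dart test --reporter json`: a start event
# carrying the protocolVersion, then suite/testStart/testDone events.
lines = [
    '{"protocolVersion":"0.1.1","runnerVersion":"1.23.1","pid":1,"type":"start","time":0}',
    '{"type":"suite","suite":{"id":0,"path":"test/calc_test.dart"},"time":2}',
    '{"type":"testStart","test":{"id":1,"suiteID":0,"name":"adds"},"time":3}',
    '{"type":"testDone","testID":1,"result":"success","hidden":false,"skipped":false,"time":15}',
]

with tempfile.NamedTemporaryFile('wt', suffix='.json', delete=False) as f:
    f.write('\n'.join(lines))
    path = f.name

# Detection, mirroring is_dart_json: only the first line is read.
with open(path) as r:
    first = json.loads(r.readline())
assert first.get('type') == 'start' and 'protocolVersion' in first

# Folding, mirroring parse_dart_json_file (heavily simplified).
suites, tests = {}, {}
with open(path) as r:
    for line in r:
        event = json.loads(line)
        kind = event['type']
        if kind == 'suite':
            suites[event['suite']['id']] = event['suite']['path']
        elif kind == 'testStart':
            t = event['test']
            tests[t['id']] = {'name': t['name'], 'start': event['time']}
        elif kind == 'testDone':
            t = tests[event['testID']]
            t['result'] = event['result']
            t['time'] = (event['time'] - t['start']) / 1000.0  # ms -> s

suite = ET.Element('testsuite', name=suites[0], tests=str(len(tests)))
for t in tests.values():
    ET.SubElement(suite, 'testcase', name=t['name'], time=str(t['time']))

print(ET.tostring(suite).decode())
```

Like the real converter, the sketch converts millisecond timestamps to seconds and attaches test cases to their suite; everything else (errors, skips, hidden tests) is omitted.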
75 changes: 75 additions & 0 deletions python/publish/mocha.py
@@ -0,0 +1,75 @@
import json

from junitparser.junitparser import etree

from publish.junit import JUnitTree


def is_mocha_json(path: str) -> bool:
if not path.endswith('.json'):
return False

try:
with open(path, 'rt') as r:
results = json.load(r)
            return isinstance(results.get('stats'), dict) and 'suites' in results['stats'] and \
                isinstance(results.get('tests'), list) and \
                all(isinstance(test, dict) and test.get('fullTitle') for test in results['tests'])
except BaseException:
return False


def parse_mocha_json_file(path: str) -> JUnitTree:
with open(path, 'rt') as r:
results = json.load(r)

stats = results.get('stats', {})
skippedTests = {test.get('fullTitle') for test in results.get('pending', [])}
suite = etree.Element('testsuite', attrib={k: str(v) for k, v in dict(
time=stats.get('duration'),
timestamp=stats.get('start')
).items() if v})

tests = 0
failures = 0
errors = 0
skipped = 0
for test in results.get('tests', []):
tests = tests + 1
testcase = etree.Element('testcase',
attrib={k: str(v) for k, v in dict(
name=test.get('fullTitle'),
file=test.get('file'),
time=test.get('duration')
).items() if v}
)

err = test.get('err')
if err:
if err.get('errorMode'):
errors = errors + 1
type = 'error'
else:
failures = failures + 1
type = 'failure'

                result = etree.Element(type, attrib={k: v for k, v in dict(
                    message=(err.get('message') or '').translate(dict.fromkeys(range(32))),
                    type=err.get('errorMode')
                ).items() if v})
result.text = etree.CDATA('\n'.join(text.translate(dict.fromkeys(range(32)))
for text in [err.get('name'), err.get('message'), err.get('stack')]
if text))
testcase.append(result)
elif test.get('fullTitle') in skippedTests:
skipped = skipped + 1
result = etree.Element('skipped')
testcase.append(result)

suite.append(testcase)

suite.attrib.update(dict(tests=str(tests), failures=str(failures), errors=str(errors), skipped=str(skipped)))
xml = etree.ElementTree(suite)

return xml
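Unlike Dart's line-delimited stream, Mocha's `json` reporter emits a single JSON document: `stats` summarises the run, `tests` lists every test, and skipped tests additionally appear in `pending` (matched by `fullTitle` above). The sketch below uses invented sample data to mirror the shape check in `is_mocha_json` and to show why error text is passed through `translate` before being wrapped in CDATA:

```python
import json

# A pruned Mocha JSON report (sample data, not real reporter output).
report = {
    "stats": {"suites": 1, "tests": 2, "duration": 12, "start": "2023-03-17T00:00:00.000Z"},
    "tests": [
        {"fullTitle": "calc adds", "duration": 5, "err": {}},
        {"fullTitle": "calc divides", "err": {"message": "boom\x00", "stack": "at divide"}},
    ],
    "pending": [],
}

# Shape check mirroring is_mocha_json: a stats dict with 'suites',
# plus a list of test dicts that all carry a fullTitle.
looks_like_mocha = (
    isinstance(report.get('stats'), dict) and 'suites' in report['stats']
    and isinstance(report.get('tests'), list)
    and all(isinstance(t, dict) and t.get('fullTitle') for t in report['tests'])
)
print(looks_like_mocha)  # → True

# Most codepoints below 32 are not valid XML 1.0 character data, which is
# why the converter strips them via translate() before building CDATA.
msg = report['tests'][1]['err']['message'].translate(dict.fromkeys(range(32)))
print(msg)  # → boom
```

Note that `dict.fromkeys(range(32))` maps every codepoint below 32 to `None`, so `str.translate` simply deletes them.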
32 changes: 23 additions & 9 deletions python/publish_test_results.py
@@ -103,12 +103,14 @@ def get_number_of_files(files: List[str], label: str = 'file') -> str:
return number_of_files


def parse_xml_files(files: Iterable[str], large_files: bool, drop_testcases: bool,
progress: Callable[[ParsedJUnitFile], ParsedJUnitFile] = lambda x: x) -> Iterable[ParsedJUnitFile]:
def parse_files_as_xml(files: Iterable[str], large_files: bool, drop_testcases: bool,
progress: Callable[[ParsedJUnitFile], ParsedJUnitFile] = lambda x: x) -> Iterable[ParsedJUnitFile]:
junit_files = []
nunit_files = []
xunit_files = []
trx_files = []
dart_json_files = []
mocha_json_files = []
unknown_files = []

def parse(path: str) -> JUnitTree:
@@ -131,17 +133,29 @@ def parse(path: str) -> JUnitTree:
trx_files.append(path)
return parse_trx_file(path, large_files)

from publish.dart import is_dart_json, parse_dart_json_file
if is_dart_json(path):
dart_json_files.append(path)
return parse_dart_json_file(path)

from publish.mocha import is_mocha_json, parse_mocha_json_file
if is_mocha_json(path):
mocha_json_files.append(path)
return parse_mocha_json_file(path)

unknown_files.append(path)
raise RuntimeError(f'Unsupported file format: {path}')

try:
return progress_safe_parse_xml_file(files, parse, progress)
finally:
for flavour, files in [
('JUnit', junit_files),
('NUnit', nunit_files),
('XUnit', xunit_files),
('JUnit XML', junit_files),
('NUnit XML', nunit_files),
('XUnit XML', xunit_files),
('TRX', trx_files),
('Dart JSON', dart_json_files),
('Mocha JSON', mocha_json_files),
('unsupported', unknown_files)
]:
if files:
@@ -156,9 +170,9 @@ def parse(path: str) -> JUnitTree:
def parse_files(settings: Settings, gha: GithubAction) -> ParsedUnitTestResultsWithCommit:
# expand file globs
files = expand_glob(settings.files_glob, None, gha)
junit_files = expand_glob(settings.junit_files_glob, 'JUnit', gha)
nunit_files = expand_glob(settings.nunit_files_glob, 'NUnit', gha)
xunit_files = expand_glob(settings.xunit_files_glob, 'XUnit', gha)
junit_files = expand_glob(settings.junit_files_glob, 'JUnit XML', gha)
nunit_files = expand_glob(settings.nunit_files_glob, 'NUnit XML', gha)
xunit_files = expand_glob(settings.xunit_files_glob, 'XUnit XML', gha)
trx_files = expand_glob(settings.trx_files_glob, 'TRX', gha)

elems = []
@@ -172,7 +186,7 @@ def parse_files(settings: Settings, gha: GithubAction) -> ParsedUnitTestResultsW
progress_item_type=Tuple[str, Any],
logger=logger) as progress:
if files:
elems.extend(parse_xml_files(files, settings.large_files, settings.ignore_runs, progress))
elems.extend(parse_files_as_xml(files, settings.large_files, settings.ignore_runs, progress))
if junit_files:
elems.extend(parse_junit_xml_files(junit_files, settings.large_files, settings.ignore_runs, progress))
if xunit_files:
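The order of the checks in `parse` matters: a file is claimed by the first detector that recognizes it, so cheap probes such as the first-line Dart check run before `is_mocha_json`, which loads the whole document. The first-match cascade can be sketched on its own, with hypothetical stand-in detectors in place of the real `is_dart_json`/`is_mocha_json`:

```python
from typing import Callable, List, Tuple

def classify(path: str, content: str) -> str:
    # Hypothetical stand-ins for the real format detectors; each is tried
    # in order, and the first match wins — as in parse() above.
    detectors: List[Tuple[str, Callable[[str, str], bool]]] = [
        ('JUnit XML',  lambda p, c: c.lstrip().startswith('<testsuite')),
        ('TRX',        lambda p, c: p.endswith('.trx')),
        ('Dart JSON',  lambda p, c: p.endswith('.json')
                                    and '"protocolVersion"' in c.splitlines()[0]),
        ('Mocha JSON', lambda p, c: p.endswith('.json') and '"stats"' in c),
    ]
    for flavour, matches in detectors:
        if matches(path, content):
            return flavour
    return 'unsupported'

print(classify('results.json', '{"protocolVersion":"0.1.1","type":"start"}'))  # → Dart JSON
print(classify('results.json', '{"stats":{"suites":1},"tests":[]}'))           # → Mocha JSON
```

Because both Dart and Mocha reports share the `.json` extension, only this ordering (content probe before the broader shape check) keeps the two formats apart.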
2 changes: 2 additions & 0 deletions python/test/files/dart/json/README.md
@@ -0,0 +1,2 @@
Example test results from https://github.com/dart-code-checker/dart-code-metrics @ 96001b5e78937be84270b6744898c82f9f0d9ddd
