Commit
Improved doc (#44)
Co-authored-by: GerardoPardo <gerardo@rti.com>
angelrti and GerardoPardo authored Jun 25, 2024
1 parent 3b0db76 commit f1e1fa9
Showing 6 changed files with 154 additions and 141 deletions.
4 changes: 2 additions & 2 deletions doc/conf.py
@@ -54,12 +54,12 @@ def generate_test_description_rst():
# write new group heading
if last_test_group != current_test_group:
test_description_rst += f'{current_test_group}\n'
test_description_rst += '-' * len(current_test_group) + '\n'
test_description_rst += '-' * len(current_test_group) + '\n\n'
last_test_group = current_test_group

# write test name heading
test_description_rst += f'{test_name}\n'
test_description_rst += '~' * len(test_name) + '\n'
test_description_rst += '~' * len(test_name) + '\n\n'

# write test name and description
test_description_rst += f"{test_value['title']}\n\n"
66 changes: 36 additions & 30 deletions doc/introduction.rst
@@ -5,51 +5,57 @@
Introduction
============

The |INTEROPERABILITY_TESTS_CP| provides a testing framework for different
implementations of the Data Distribution Service® (DDS) standard in terms of
interoperability. This ensures that DDS implementations are tested across a
variety of Quality of Service (QoS) policies and other features among different
products.
The |INTEROPERABILITY_TESTS_CP| is a collection of test cases that validate the
wire protocol interoperability of implementations of the Data Distribution Service®
(DDS) standard. The test suite verifies that different DDS implementations communicate as
expected across a variety of scenarios, including various Quality of Service (QoS)
settings.

The DDS Interoperability Tests are publicly available on this repository:
The |INTEROPERABILITY_TESTS| are publicly available on this repository:
https://github.com/omg-dds/dds-rtps/

Test Descriptions
-----------------

All these tests are based on an application that allows users to create
different test scenarios by setting various QoS policies and enabling/disabling
DDS features, such as content filtering. This application is called |SHAPE_APP|.
More information about the options of this application is available in the
README file's Shape Application parameters section:
The test cases are implemented using an application, called |SHAPE_APP|, that is compiled
against each of the DDS implementations, resulting in a different binary executable for each
DDS implementation.

The |SHAPE_APP| has a set of command-line parameters controlling its behavior (e.g. publish versus subscribe),
the QoS settings (e.g. reliability, durability, ownership), and the use of various other features (e.g. Content Filters).
More information on the command-line options can be found in this README file:
`Shape Application Parameters Section in README
<https://github.com/omg-dds/dds-rtps/?tab=readme-ov-file#shape-application-parameters>`__.
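
As a rough illustration of this deployment model, the sketch below launches two of these binaries, one publishing and one subscribing on the same topic. The binary names and the "-P"/"-S"/"-t" flags are hypothetical placeholders; the authoritative parameter list is the README section linked above.

    # Hypothetical sketch only: run one Shape Application binary per vendor,
    # one as publisher and one as subscriber of the same topic. Executable
    # names and flags are illustrative, not the actual README parameters.
    import subprocess

    publisher = subprocess.Popen(
        ["./shape_app_vendor_a", "-P", "-t", "Square"],
        stdout=subprocess.PIPE, text=True)
    subscriber = subprocess.Popen(
        ["./shape_app_vendor_b", "-S", "-t", "Square"],
        stdout=subprocess.PIPE, text=True)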

A test suite is composed of a set of test cases, which are run with the
|SHAPE_APP| acting as a **Publisher** or a **Subscriber** application.

A test scenario or test case is determined by the parameters used in the
|SHAPE_APP| and the expected test result (return code). The produced return code
Each test case is defined in terms of a specific deployment of the |SHAPE_APP| executables,
the parameters used to run each executable, and the expected test result (return code). The produced return code
depends on the output of the |SHAPE_APP|. More information is available in the
README file's Return Code section:
`Return Code Section in README
<https://github.com/omg-dds/dds-rtps/?tab=readme-ov-file#return-code>`__.
<https://github.com/omg-dds/dds-rtps/?tab=readme-ov-file#return-code>`__

A test case minimally runs two instances of the |SHAPE_APP|, one acting as a **Publisher** and the other as a **Subscriber** application.
Some test cases may run additional instances when required to exercise a particular behavior.

For example, a specific test case may specify running the binary for Implementation1 with parameters
that configure it to publish data with a certain QoS against the binary for Implementation2 with parameters
that cause it to subscribe to data with a different QoS. The expected result may state that all
data published by one application should be received by the other one in the correct order without any duplicates.

A test case *passes* if the communication between the **Publisher** and **Subscriber** application(s)
matches what is expected for that scenario. The "expected" behavior may involve receiving all samples sent,
or specific subsets (e.g. when using content filters or exclusive ownership QoS); it may also involve verifying
that samples are received in a specific order, or not received at all (e.g. when the QoS is incompatible). Where needed,
a test case defines a ``check_function`` to parse the output printed by the |SHAPE_APP| and determine
whether the test case *passes* or *fails*.
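
As an illustration, a minimal sketch of such a check is shown below. This is hypothetical code, not the function used in the repository, and it assumes an output format of one sample per line.

    # Hypothetical sketch of a check function: the subscriber must have received
    # every published sample exactly once and in publication order. The assumed
    # output format (one sample per line) is illustrative only.
    def check_received_in_order(publisher_output: str, subscriber_output: str) -> bool:
        published = [line.strip() for line in publisher_output.splitlines() if line.strip()]
        received = [line.strip() for line in subscriber_output.splitlines() if line.strip()]
        # No duplicates, no losses, and the relative order is preserved.
        return received == published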

The different test cases that are currently tested are defined in a
The test cases included in the |INTEROPERABILITY_TESTS| are defined in a
`test suite <https://github.com/omg-dds/dds-rtps/blob/master/test_suite.py#L332>`__
that is part of this repository. By default, a test case is considered as
*passed* if there is communication between the **Publisher** and **Subscriber**
applications. Additionally, some test cases may require additional checks to
ensure that the behavior is correct. Each test case may include a
``checking_function`` to do so. These ``checking functions`` are defined in the test
suite as well and determine whether the test case should be considered as
*passed* or *error* depending on some additional checks.
that is part of this repository.
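
For illustration only, a test case entry could conceptually look like the sketch below. The dictionary layout, flag strings, and return-code values are hypothetical; the real definitions live in test_suite.py.

    # Hypothetical sketch of a test case definition: the parameters for each
    # Shape Application instance, the return code each instance is expected to
    # produce, and an optional function that checks the printed output.
    example_test_case = {
        "Test_Reliability_Sketch": {
            "apps": ["-P -t Square -r", "-S -t Square -r"],  # publisher / subscriber parameters (illustrative)
            "expected_codes": ["OK", "OK"],                   # expected result per instance (illustrative)
            "check_function": None,  # or a callable such as check_received_in_order from the sketch above
        },
    }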

Test Performed
--------------

The |INTEROPERABILITY_TESTS| run the test suite mentioned above with all
combinations of all DDS implementations as **Publisher** and as **Subscriber**
applications. The products used are the |SHAPE_APPS| uploaded to the
`latest release of the repository <https://github.com/omg-dds/dds-rtps/releases>`__,
including a test of the same product as **Publisher** and as **Subscriber**.
The |INTEROPERABILITY_TESTS| runs the test cases using all permutations of DDS implementations as **Publisher** and as **Subscriber**
applications. The products included in the tests are the |SHAPE_APPS| uploaded to the
`latest release of the repository <https://github.com/omg-dds/dds-rtps/releases>`__.
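
As a small sketch (hypothetical executable names), enumerating those pairs amounts to taking every ordered pair of products, including a product paired with itself:

    # Hypothetical sketch: exercise every ordered (publisher, subscriber) pair of
    # Shape Application executables, including a product paired with itself.
    import itertools

    shape_executables = ["shape_vendor_a", "shape_vendor_b", "shape_vendor_c"]  # illustrative names
    for publisher, subscriber in itertools.product(shape_executables, repeat=2):
        print(f"Run the test suite with publisher={publisher}, subscriber={subscriber}")
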
2 changes: 1 addition & 1 deletion doc/test_description.template.rst
@@ -30,7 +30,7 @@ parameters the ShapeDemo Application configures:

The type used in these tests is the following:

.. code-block:: C
.. code-block::
@appendable
struct ShapeType {
31 changes: 19 additions & 12 deletions doc/test_results.template.rst
@@ -6,24 +6,31 @@
Test Results
============

The test results are organized in the following spreadsheet in different tabs.
The first tab presents a comprehensive summary of the tests per product. The
first table delineates the number of passed tests versus total tests, offering a
quick overview of vendor compliance. The second table delineates the tests
performed between products acting as **Publishers** (rows) and **Subscribers**
(columns), providing insights into interoperability between different DDS
implementations.

The subsequent tabs represent individual test case results per product. Each tab
The test results are presented in a spreadsheet containing multiple tabs.

The first tab presents a summary of the tests per product:

* The first table shows the number of passed tests versus total tests, offering
  a quick overview of vendor compliance. The result is reported using the notation
  (NumberPassedTests / NumberTotalTests). Colors are also used to highlight the
  level of interoperability (green being the best and red the worst).
* The second table shows more detail on the interoperability between each pair
  of products: one product acting as **Publisher** (rows) and one as **Subscriber**
  (columns). The result is again reported using the notation
  (NumberPassedTests / NumberTotalTests).

The second tab contains a summary of the test descriptions. The full details can
be found in the `Test Descriptions section <https://omg-dds.github.io/dds-rtps/test_description.html>`__.

The remaining tabs in the spreadsheet (see tab selection at the bottom of the
spreadsheet) contain the individual test case results per product. Each tab
is named after the respective product and contains two tables:

* Left-side table: Current product as publisher and all products as subscribers.
* Righ-side table: Current product as subscriber and all products as publishers.
* Right-side table: Current product as subscriber and all products as publishers.

Access the report at: |LINK_XLSX_URL|

**NOTE**: for a detailed test description visit :ref:`section-test-descriptions`

.. raw:: html

<iframe src="|LINK_XLSX_URL|" width="100%" height="800"></iframe>
4 changes: 2 additions & 2 deletions doc/vars.rst
@@ -4,8 +4,8 @@

.. |SHAPE_APP| replace:: ``Shape Application``
.. |SHAPE_APPS| replace:: ``Shape Applications``
.. |INTEROPERABILITY_TESTS_CP| replace:: OMG® DDS® Interoperability Tests
.. |INTEROPERABILITY_TESTS| replace:: OMG DDS Interoperability Tests
.. |INTEROPERABILITY_TESTS_CP| replace:: OMG® DDS® Interoperability Testsuite
.. |INTEROPERABILITY_TESTS| replace:: OMG DDS Interoperability Testsuite
.. |COPYRIGHT_YEAR| replace:: 2024
.. |COPYRIGHT_HEADER_RTI| replace:: © |COPYRIGHT_YEAR| Real-Time Innovations, Inc.
.. |COPYRIGHT_HEADER_ATOSTEK| replace:: © |COPYRIGHT_YEAR| Atostek Oy.