feat: Add tests for import of ONNX Nodes test suite

Context

Fetch and test all single-node ONNX tests as provided in the ONNX source tree (https://github.com/onnx/onnx/tree/v1.16.2/onnx/backend/test/data/node).

Refer to https://github.com/onnx/onnx/blob/main/docs/OnnxBackendTest.md for a definition of an ONNX Node test.

The tests are built on utility classes for managing ONNX test cases, plus a pytest utility class for generating XFAIL/SKIPPED and filtered tests in the pytest reports.

The current tests provided by this MR have the following properties:

  1. tests from the ONNX Node tests such that the input model contains a single operator/node
  2. two test functions, run for each test in 1.:
     2.1. test only the ONNX import into aidge (onnx_load)
     2.2. test the import plus execution on cpu (forward()), comparing the result against the one generated by onnxruntime

This is the first base for a systematic test of ONNX provided test cases, and as of now:

  • failing tests have been analyzed quickly and marked xfail (i.e. they should be fixed in the future and require attention); the same applies to forward execution failures
  • by default, input models for which aidge_import does not support an operator are filtered out (i.e. the test is not executed); an optional variable can show them as skipped for information
  • among the tests which run forward, some are flaky, produce segfaults, or run into infinite loops; these are put in the skip list, since marking them xfail would block the test run or make the results flaky

The idea from now on, when there is a new contribution to aidge_import is that:

  • adding a new operator in import: this automatically runs the related ONNX tests => the developer has to fix them or mark them explicitly as xfail with a reason (for instance "attribute myattr not supported"), otherwise the tests will not pass
  • fixing an operator: this should change the test case from xfail to xpass, and the pytest report will show an additional xpass. The number of xpass should always be zero => the developer must verify that the fix actually resolves the initial xfail reason and remove the test case from the xfail list.
  • in the last case, if the test is in the skip list (flaky, segfault, ...), the transition to xpass is not automatic; care must therefore be taken to rerun the test by commenting it out in the skip list, and to remove the entry if the test is fixed
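One standard way to enforce the zero-xpass rule is pytest's `xfail_strict` option, which reports any unexpected pass as a failure. This is only an illustration of the available mechanism, not necessarily what this MR configures:

```toml
# pyproject.toml -- hypothetical illustration, not part of this MR:
# with strict xfail, any XPASS is reported as a test failure, so a fixed
# operator forces the developer to remove the stale xfail entry.
[tool.pytest.ini_options]
xfail_strict = true
```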

The xfail/skip tests are handled reasonably easily with lists, as described in the test_onnx_nodes_import.py and pytest_utils.py files.
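The list-driven marking can be sketched as follows. This is a hedged reconstruction of the (regexp_on_test_ident, reason) convention described above; the entries here are made-up examples, and the real lists and helper live in test_onnx_nodes_import.py and pytest_utils.py:

```python
import re
import pytest

# Hypothetical example entries, not the real project lists:
XFAILS = [
    (r"^test_abs_", "abs import incomplete"),
    (r"^test_add_bcast$", "broadcast not supported"),
]
SKIPS = [
    (r"^test_loop", "infinite loop in forward()"),
]

def marks_for(test_ident: str):
    """Return the pytest marks matching a test identifier against the
    (regexp_on_test_ident, reason) lists."""
    marks = []
    for pattern, reason in XFAILS:
        if re.search(pattern, test_ident):
            marks.append(pytest.mark.xfail(reason=reason))
    for pattern, reason in SKIPS:
        if re.search(pattern, test_ident):
            marks.append(pytest.mark.skip(reason=reason))
    return marks

# Typical use when building parametrized test cases:
# params = [pytest.param(t, marks=marks_for(t), id=t) for t in all_tests]
```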

Here is the current output of this MR with pytest (i.e. only pass ., xfail x, and some skips s due to segfault/timeout):

cd aidge_onnx/unit_tests
pytest test_*.py
==================================================================================================================================== test session starts ====================================================================================================================================
platform linux -- Python 3.12.3, pytest-8.3.3, pluggy-1.5.0
rootdir: /home/cguillon/work/aidge/aidge/aidge_onnx/dev-onnx-import-tests
configfile: pyproject.toml
collected 442 items                                                                                                                                                                                                                                                                         

test_converter_register.py ..                                                                                                                                                                                                                                                         [  0%]
test_generic.py .                                                                                                                                                                                                                                                                     [  0%]
test_import_export.py .                                                                                                                                                                                                                                                               [  0%]
test_models.py .                                                                                                                                                                                                                                                                      [  1%]
test_node_conv.py .                                                                                                                                                                                                                                                                   [  1%]
test_onnx_nodes_import.py .....xx.xxxx....xx...xx.x............................................xxxxxxxx.xx........xxxxxx.....x.x....x.....x..x................xxxxxxxx.........x.xxxxxxxxxx..xx....x........xxxxxxxxxxxxxxxx..............x.........x..x.xxxxxxx.xxxxxxx.xxxxx.ss..ss [ 58%]
...ss.x.x.x.....x..xx.xx.x.x.xxx....xxxxxxxxxxxxxxx...xxxxxxxxxxxxxxx..xxx..xxxxxxx.x...xxxx.x.xx.xxxxxxxxx.........x.xxxxxxxxxx.xxx.xxsx.xx.xx.xxxxxxxxxxxxxxxxx.x..x....x....x.........                                                                                             [100%]

======================================================================================================================= 231 passed, 7 skipped, 204 xfailed in 11.85s ========================================================================================================================

Note that approximately half of the tests pass, and the other half are marked expected to fail (i.e. they should pass, but the import or implementation is incomplete).

The onnx_test_cases.py file, which defines the ONNXTestCases class for fetching/querying ONNX test cases, can also be executed to print summary information. Here is the help text, followed below by an output extract:

> cd aidge_onnx/unit_tests
> python onnx_test_cases.py -h
usage: onnx_test_cases.py [-h] [--no-summary] [--tag TAG] [--module] [--onnx-path ONNX_PATH] [--debug] [test_data_path]

Generate tests information from ONNX data tests dir

positional arguments:
  test_data_path        optional test data path in onnx sources (default: backend/test/data/node)

options:
  -h, --help            show this help message and exit
  --no-summary          do not print tests summary (default: False)
  --tag TAG             onnx tag to fetch (default: v1.16.2)
  --module              Use locally installed module (default: False)
  --onnx-path ONNX_PATH
                        Optional path to onnx sources, otherwise fetch sources (default: None)
  --debug               debug mode (default: False)
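The command-line interface above can be reconstructed with argparse roughly as follows. This is a sketch derived from the help text only; the real parser in onnx_test_cases.py may differ in details:

```python
import argparse

def build_parser():
    """Rebuild the onnx_test_cases.py CLI as shown in its help text
    (illustrative sketch; defaults taken from the -h output above)."""
    p = argparse.ArgumentParser(
        description="Generate tests information from ONNX data tests dir",
        formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    p.add_argument("test_data_path", nargs="?",
                   default="backend/test/data/node",
                   help="optional test data path in onnx sources")
    p.add_argument("--no-summary", action="store_true",
                   help="do not print tests summary")
    p.add_argument("--tag", default="v1.16.2", help="onnx tag to fetch")
    p.add_argument("--module", action="store_true",
                   help="Use locally installed module")
    p.add_argument("--onnx-path",
                   help="Optional path to onnx sources, otherwise fetch sources")
    p.add_argument("--debug", action="store_true", help="debug mode")
    return p
```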

Output tests cases summary information for the ONNX Nodes tests v1.16.2 (the defaults):

> cd aidge_onnx/unit_tests
> python onnx_test_cases.py
onnx test models summary:
  op_type_count: 188
  models_count: 1282
  simple_models_count: 1045
  op_type_tests:
    - abs:
      models: 12
      datasets: 12
      single_models: 1
      single_datasets: 1
...
    - add:
      models: 70
      datasets: 70
      single_models: 3
      single_datasets: 3
...

It shows that the tests contain 1282 ONNX model files, covering 188 unique ONNX operator types. There are 1045 "simple" models, i.e. models with a single node; these are the ones currently tested by this MR.

It also shows that abs (resp. add) is present in 12 (resp. 70) models, and that there is 1 (resp. 3) single-node model/dataset for these op types.
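The per-operator summary above can be reproduced conceptually with a stdlib-only sketch. The records below are made-up stand-ins; the real ONNXTestCases class derives the operator lists by loading each .onnx model file:

```python
from collections import defaultdict

# Hypothetical records: (op_types_in_model, is_single_node) per test model.
MODELS = [
    (["Abs"], True),
    (["Add"], True), (["Add"], True), (["Add"], True),
    (["Add", "Relu"], False),
]

def summarize(models):
    """Count total and single-node ("simple") models per operator type,
    mirroring the op_type_tests summary printed above."""
    summary = defaultdict(lambda: {"models": 0, "single_models": 0})
    simple_count = 0
    for op_types, is_single in models:
        if is_single:
            simple_count += 1
        for op in set(op_types):
            summary[op]["models"] += 1
            if is_single:
                summary[op]["single_models"] += 1
    return {"models_count": len(models),
            "simple_models_count": simple_count,
            "op_type_tests": dict(summary)}
```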

Modified files

Add test_onnx_nodes_import.py:

  • actual definition of the tests: as a reminder, pytest executes every function whose name starts with test_,
    • there are two such functions: test_onnx_import_nodes() and test_onnx_import_nodes_forward().
    • each one runs many test cases (~200 each), one for each ONNX test case for which aidge_import supports the operator
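At its core, the forward test amounts to comparing the aidge forward() outputs against the onnxruntime-generated references, tensor by tensor. A minimal sketch of that comparison step, assuming the tolerance values (which are not stated in this MR):

```python
import numpy as np

def compare_outputs(reference, produced, rtol=1e-4, atol=1e-5):
    """Compare forward() outputs against reference outputs tensor by
    tensor. Tolerances here are illustrative assumptions, not the MR's
    actual values."""
    if len(reference) != len(produced):
        return False
    return all(
        ref.shape == out.shape and np.allclose(ref, out, rtol=rtol, atol=atol)
        for ref, out in zip(reference, produced)
    )
```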

Add onnx_test_cases.py: manages test case fetching and discovery from the ONNX sources

Add pytest_utils.py: a generic class to ease marking tests as xfail/skipped/filtered from simple lists of (regexp_on_test_ident, reason).

Detailed major modifications

No behavior changes, tests only.

TODO

Nope.

Future work

This can be extended in the future to add tests:

  • from the ONNX Nodes test suite but not limited to a single node in the model
  • from the ONNX Models test suite to test actual models

Changes waiting for this MR

I have a stack of patches to the import for some operators so that they at least pass the import step; I intend to contribute them once these tests are merged.

Edited by Christophe Guillon