Error in aidge_benchmark due to unsupported ONNX IR version

Required prerequisites

  • Make sure you've read the documentation. Your issue may be addressed there.
  • Search the issue tracker and discussions to verify that this hasn't already been reported. +1 or comment there if it has.

What commit version of aidge do you use

  • aidge_core: dev (b300211f)
  • aidge_onnx: dev (38440515c6bfb8f43178197373baaaa1136d78ca)
  • aidge_backend_cpu: dev (382f6d471afa648757a72a89f1a31c29c7d88e89)

onnxruntime version is 1.22

Problem description

Running the command:

aidge_benchmark --config-file add.json --time --compare --modules aidge_backend_cpu onnxruntime --save-directory benchmark_results

results in the following error:

Loading modules...
 ├───aidge_backend_cpu [ ok ]

Starting tests...
▷ dim size -- 1
 ├─┬─aidge_backend_cpu
 │ ├───time [ 1.24e-04 ± 2.23e-05 ] (seconds)
 │ └───comp Traceback (most recent call last):
  File "/tmp/cleanenv/bin/aidge_benchmark", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/tmp/cleanenv/lib/python3.12/site-packages/aidge_core/benchmark/benchmark.py", line 378, in main
    ref = compute_output(ref_module_name, model, input_data, ref_module)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/cleanenv/lib/python3.12/site-packages/aidge_core/benchmark/benchmark.py", line 84, in compute_output
    return benchmark_onnxruntime.compute_output(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/cleanenv/lib/python3.12/site-packages/aidge_core/benchmark/benchmark_onnxruntime.py", line 35, in compute_output
    sess = ort.InferenceSession(model.SerializeToString())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/cleanenv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/tmp/cleanenv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 552, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : /onnxruntime_src/onnxruntime/core/graph/model.cc:181 onnxruntime::Model::Model(onnx::ModelProto&&, const onnxruntime::PathString&, const onnxruntime::IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&, const onnxruntime::ModelOptions&) Unsupported model IR version: 11, max supported IR version: 10

Reproducible example code

See the command above.

Edited by Jerome Hue