Add support for QLinearConv
The goal of this issue is to add support for the QLinearConv operator: https://github.com/onnx/onnx/blob/main/docs/Operators.md#QLinearConv
To do so, I propose adding a function that fuses each Conv + Scaling pair into a QLinearConv MetaOperator. Calling this function will be left to the user, depending on how they want to export their quantized network (exporting each operator independently, or using QOp from ONNX).
Then we will need to add the support for the export of this new MetaOperator.
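To make the target semantics concrete, here is a minimal NumPy sketch (not Aidge code) of what the ONNX spec defines for QLinearConv: dequantize the quantized input and weights, convolve in floating point, then requantize the result. The function name and the single-channel "valid" convolution are simplifications for illustration.

```python
import numpy as np

def qlinear_conv_ref(x_q, x_scale, x_zp, w_q, w_scale, w_zp, y_scale, y_zp):
    """Illustrative QLinearConv semantics (single channel, no padding/stride).

    Per the ONNX operator definition, the quantized output is equivalent to
    requantizing the float convolution of the dequantized input and weights.
    """
    x = (x_q.astype(np.int32) - x_zp) * x_scale   # dequantize input
    w = (w_q.astype(np.int32) - w_zp) * w_scale   # dequantize weights
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    y = np.zeros((oh, ow), dtype=np.float32)
    for i in range(oh):
        for j in range(ow):
            # plain float cross-correlation over one window
            y[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    # requantize: rescale, shift by the output zero point, saturate to uint8
    return np.clip(np.rint(y / y_scale) + y_zp, 0, 255).astype(np.uint8)
```

This reference behavior is what the exported MetaOperator will have to reproduce, and it can serve as a ground truth when comparing inference results between the exported model and the Aidge model.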
TODO:
- Create a simple network with one Conv
- Quantize this network with random data
- Fuse Conv and Scaling into a QLinearConv MetaOperator using the `aidge_core.fuse_to_metaops()` function
- Add support in export node for a QLinearConv MetaOperator
- Perform inferences in the exported model and the Aidge model
- Verify the models by comparing inference results
- Add quantize and dequantize operators to the MetaOperator
- Add a method for the removal of quantize and dequantize operators (they will cancel each other out)
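The last two items rely on the fact that a Dequantize applied right after a Quantize reproduces its input up to quantization error, so adjacent pairs between fused operators can be removed. A small NumPy sketch of this cancellation (helper names are illustrative, following the ONNX QuantizeLinear/DequantizeLinear definitions):

```python
import numpy as np

def quantize(x, scale, zero_point):
    # affine quantization to uint8, as in ONNX QuantizeLinear
    return np.clip(np.rint(x / scale) + zero_point, 0, 255).astype(np.uint8)

def dequantize(x_q, scale, zero_point):
    # inverse affine mapping, as in ONNX DequantizeLinear
    return (x_q.astype(np.float32) - zero_point) * scale

scale, zp = 0.05, 128
x = np.array([-1.0, 0.0, 0.5], dtype=np.float32)

# Dequantize(Quantize(x)) == x up to half a quantization step,
# which is why back-to-back pairs can safely be removed from the graph
x_rt = dequantize(quantize(x, scale, zp), scale, zp)
```

The removal pass only has to match such back-to-back pairs that share the same scale and zero point; pairs with different quantization parameters are a genuine requantization and must be kept.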
Note: due to the future removal of the Scaling node, the regex to use in `aidge_core.fuse_to_metaops()` may change, but this is a minor change.
Edited by Noam Zerah