Unexpected output of pooling layer in quantized network CPP export
Required prerequisites
- Make sure you've read the documentation. Your issue may be addressed there.
- Search the issue tracker and discussions to verify that this hasn't already been reported. +1 or comment there if it has.
What commit version of aidge do you use
- aidge_core: dev
Problem description
In a simple CNN, after quantizing the network to int8, I generate a CPP export. I get some compilation warnings (not really a problem in itself) that could be fixed. The forward function declares an additional output for each MaxPool layer; here is an example:
int8_t* _3_MaxPooling2D_1_output_0 = (int8_t*) (mem + _3_MAXPOOLING2D_1_OUTPUT_0_MEM_OFFSET);
int64_t* _3_MaxPooling2D_1_output_1 = (int64_t*) (mem + _3_MAXPOOLING2D_1_OUTPUT_1_MEM_OFFSET);
pooling_forward<_2_PADCONVACT_1_OUTPUT_0_NB_CHANNELS,
_2_PADCONVACT_1_OUTPUT_0_IN_HEIGHT,
_2_PADCONVACT_1_OUTPUT_0_IN_WIDTH,
_3_MAXPOOLING2D_1_OUTPUT_0_NB_OUTPUTS,
_3_MAXPOOLING2D_1_OUTPUT_0_OUT_HEIGHT,
_3_MAXPOOLING2D_1_OUTPUT_0_OUT_WIDTH,
_3_MAXPOOLING2D_1_PADDING_Y,
_3_MAXPOOLING2D_1_PADDING_X,
_3_MAXPOOLING2D_1_STRIDE_Y,
_3_MAXPOOLING2D_1_STRIDE_X,
_3_MAXPOOLING2D_1_KERNEL_HEIGHT,
_3_MAXPOOLING2D_1_KERNEL_WIDTH,
_3_MAXPOOLING2D_1_POOLING_TYPE,
_3_MAXPOOLING2D_1_ACTIVATION,
_2_PADCONVACT_1_OUTPUT_0_MEM_CONT_OFFSET,
_2_PADCONVACT_1_OUTPUT_0_MEM_CONT_SIZE,
_2_PADCONVACT_1_OUTPUT_0_MEM_WRAP_OFFSET,
_2_PADCONVACT_1_OUTPUT_0_MEM_WRAP_SIZE,
_2_PADCONVACT_1_OUTPUT_0_MEM_STRIDE,
_3_MAXPOOLING2D_1_OUTPUT_0_MEM_CONT_OFFSET,
_3_MAXPOOLING2D_1_OUTPUT_0_MEM_CONT_SIZE,
_3_MAXPOOLING2D_1_OUTPUT_0_MEM_WRAP_OFFSET,
_3_MAXPOOLING2D_1_OUTPUT_0_MEM_WRAP_SIZE,
_3_MAXPOOLING2D_1_OUTPUT_0_MEM_STRIDE>
(_2_PadConvAct_1_output_0, _3_MaxPooling2D_1_output_0);
The variable _3_MaxPooling2D_1_output_1 is never used later in the code, and it is declared as int64_t, which is odd for an int8 quantization (given the type, it could be the pooling operator's indices output). It also looks like this reserves some memory for nothing.
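If the compilation warnings are about this unused variable, a minimal workaround sketch (a hand edit of the generated forward function, not an aidge fix) is to discard the variable explicitly right after its declaration; mem and the _MEM_OFFSET macro below are the ones from the generated code above, and the (void) cast is plain C++:

int8_t* _3_MaxPooling2D_1_output_0 = (int8_t*) (mem + _3_MAXPOOLING2D_1_OUTPUT_0_MEM_OFFSET);
int64_t* _3_MaxPooling2D_1_output_1 = (int64_t*) (mem + _3_MAXPOOLING2D_1_OUTPUT_1_MEM_OFFSET);
// Hypothetical hand edit: explicitly mark the second MaxPool output as unused,
// which silences GCC/Clang's -Wunused-variable on this declaration.
(void) _3_MaxPooling2D_1_output_1;

This only hides the warning, of course; the int64_t buffer is still reserved in mem for nothing.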