FIX: Update gradient tensor dimensions
Context
When scheduler.forward() is called, the gradient tensors (if they already exist) are never resized to account for changes in the batch size, as described in !20
Modified files
To solve the issue, gradient tensors are now resized in Aidge::Optimizer::resetGrad
TODO
In the Aidge::Tensor class, it is unclear whether a tensor and its associated gradient tensor (if any) must always have the same dimensions. If so, calling Tensor::resize is problematic: the tensor's dimensions are updated, but the gradient tensor's dimensions are left unchanged.
The issue could also be solved by changing the behavior of Tensor::resize so that it resizes both the tensor and its gradient tensor (if any) in a single call. This has to be discussed internally within the Aidge development team to decide which option is best.