Update torch interop to work with the new backprop.
Context
Currently, loss functions only compute the loss value but do not create the gradient of the graph output node.
This MR fixes this issue, following the solution described in aidge_learning#1 (closed).
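To illustrate the idea behind the fix, here is a minimal sketch (hypothetical names, not the actual aidge API): the loss function both returns the scalar loss and seeds the gradient on the graph output node, so backpropagation has a starting point.

```python
import numpy as np

class Node:
    """Minimal stand-in for a graph output node (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.grad = None  # previously left unset by the loss function

def mse_loss(output: Node, target: np.ndarray) -> float:
    """Compute the MSE loss AND seed dL/dy on the output node."""
    diff = output.value - target
    loss = float(np.mean(diff ** 2))
    # Seeding the output gradient is the new behavior:
    # backprop can now propagate from the graph output node.
    output.grad = 2.0 * diff / diff.size
    return loss

out = Node(np.array([1.0, 2.0, 3.0]))
loss = mse_loss(out, np.array([1.0, 2.0, 4.0]))
```

Before this change, the equivalent of `output.grad` stayed empty after the loss call, which is what broke the backward pass.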
In this MR I update the torch interoperability to adapt it to these changes.