Update torch interop to work with the new backprop.
Context
Currently, loss functions only compute the loss value but do not create the gradient of the graph output node.
This MR fixes this issue following the solution described in aidge_learning#1 (closed).
Accordingly, this MR updates the torch interoperability layer to adapt to these changes.
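For illustration, the change described above can be sketched as follows: instead of returning only a scalar loss, the loss function also produces the gradient of the loss with respect to the graph's output node, which is the tensor the backward pass starts from. This is a minimal standalone sketch (not Aidge's actual API; the function name and signature are hypothetical) using mean squared error:

```python
# Hypothetical sketch, not Aidge's real interface: a loss function that
# returns both the scalar loss and the gradient with respect to the
# network output, i.e. the seed gradient for backpropagation.

def mse_loss_with_grad(pred, target):
    """Return (loss, d_loss/d_pred) for mean squared error."""
    n = len(pred)
    diffs = [p - t for p, t in zip(pred, target)]
    loss = sum(d * d for d in diffs) / n
    # Gradient w.r.t. each output element: 2 * (pred - target) / n.
    # This tensor is attached to the graph output node and seeds backprop.
    grad = [2.0 * d / n for d in diffs]
    return loss, grad

loss, grad = mse_loss_with_grad([1.0, 2.0], [0.0, 2.0])
# loss = 0.5, grad = [1.0, 0.0]
```

This mirrors what torch does implicitly: calling `loss.backward()` on a scalar seeds the backward pass with a gradient of 1.0 at the loss node, which then propagates to the graph output.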