Add backward functions for ReLU, Sigmoid and Tanh
Context
Add some backward functions that were missing.
Detailed modifications
- Use a prototype for the backward functions that may fit all needs: input, output and grad_output are needed to compute grad_input in the general case (input was missing; output is kept to save computation). This may lead to unused parameters, but that is probably not an issue.
- Add the missing backward functions for Sigmoid and Tanh.
- Use input instead of output when testing the sign in ReLU (both versions should give the same results, but using input is preferred).
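The proposed prototype can be sketched as follows. This is a minimal NumPy sketch, not the project's actual API: the function names and array-based signature are assumptions, chosen only to illustrate the uniform `(input, output, grad_output) -> grad_input` convention described above.

```python
import numpy as np

# Hypothetical prototype: every backward takes (input, output, grad_output)
# and returns grad_input, even when some arguments are unused for a given
# operator (e.g. output is unused for ReLU, input for Sigmoid/Tanh).

def relu_backward(input, output, grad_output):
    # Sign test on input (rather than output), as proposed above.
    return grad_output * (input > 0)

def sigmoid_backward(input, output, grad_output):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)); output is reused
    # so the forward pass does not have to be recomputed.
    return grad_output * output * (1.0 - output)

def tanh_backward(input, output, grad_output):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return grad_output * (1.0 - output ** 2)
```

Keeping one signature for all three operators means some parameters go unused per operator, but it lets the caller dispatch every backward the same way.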
TODO
For this merge request:
- Feel free to adapt the proposed modifications to match the code requirements.
- Check whether my understanding of how the forward methods should work is correct.
For the future:
- Check the forward function of the Sqrt operator, which does not seem to be correct.
- Homogenize the way forward and backward are coded for all available operators.
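One way to verify both the backward implementations and a suspect forward (such as Sqrt) is a finite-difference gradient check. The sketch below is a generic helper, not part of the project; it compares an analytic gradient against central differences, here using NumPy's `tanh` as a stand-in forward.

```python
import numpy as np

def numerical_grad(f, x, eps=1e-6):
    # Central finite differences, element-wise, for an element-wise f:
    # grad[i] ~= (sum(f(x + eps*e_i)) - sum(f(x - eps*e_i))) / (2*eps)
    grad = np.zeros_like(x)
    for i in np.ndindex(x.shape):
        x_plus = x.copy();  x_plus[i] += eps
        x_minus = x.copy(); x_minus[i] -= eps
        grad[i] = (f(x_plus).sum() - f(x_minus).sum()) / (2 * eps)
    return grad

# Example: check the analytic tanh gradient (with grad_output of ones)
# against the numerical estimate.
x = np.array([-0.5, 0.0, 1.5])
analytic = 1.0 - np.tanh(x) ** 2
numeric = numerical_grad(np.tanh, x)
```

Running the same check on each operator's forward/backward pair would catch both a wrong derivative formula and a broken forward such as the suspected Sqrt.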