Fix gradient tensor in Backward method
Context
Solves issue !272 for the CPU backend.
Modified files
The CPU kernels for the following operators are modified: Atan, Clip, Heaviside, LeakyReLU, Ln, MaxPooling, Mul, Pow, ReLU, Sigmoid, Sqrt, Sub, Tanh.
Detailed major modifications
Modify the Backward() method of each operator:
- Remove the per-kernel initialization of the upstream gradient tensor (where applicable)
- Replace "gradient = value" with "gradient += value", so that gradient contributions accumulate instead of overwriting each other
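The pattern above can be sketched as follows. This is a minimal, hypothetical ReLU backward kernel (the function name, signature, and use of `std::vector` are illustrative assumptions, not the project's actual API): when a tensor feeds several downstream operators, each consumer's Backward() must add its contribution into the shared gradient buffer, so the kernel uses `+=` and leaves zero-initialization to the caller.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch of a ReLU Backward kernel.
// x  : forward input
// dy : upstream gradient from one consumer
// dx : gradient buffer shared by all consumers; assumed to be
//      zero-initialized once by the caller, which is why the
//      per-kernel initialization was removed.
void ReluBackward(const std::vector<float>& x,
                  const std::vector<float>& dy,
                  std::vector<float>& dx) {
  for (std::size_t i = 0; i < x.size(); ++i) {
    // Accumulate (+=) rather than assign (=), so an earlier
    // consumer's contribution already in dx is preserved.
    dx[i] += (x[i] > 0.0f) ? dy[i] : 0.0f;
  }
}
```

With plain assignment, the last operator to run Backward() would silently discard the gradients written by the others; accumulation makes the result independent of the order in which consumers are processed.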