
Fix gradient tensor in Backward method

Olivier Antoni requested to merge (removed):Fix_backward into dev

Context

Solves issue !272 for the CPU backend.

Modified files

The CPU kernels for the following operators are modified: Atan, Clip, Heaviside, LeakyReLU, Ln, MaxPooling, Mul, Pow, ReLU, Sigmoid, Sqrt, Sub, Tanh.

Detailed major modifications

The Backward() method of each kernel is modified to:

  • Remove the upstream gradient tensor initialization (if applicable)
  • Replace "gradient = value" with "gradient += value", so that contributions from multiple consumers accumulate instead of overwriting each other
