
Update how loss functions work

Cyril Moineau requested to merge bindLoss into dev

Context

Currently, loss functions only compute the loss value; they do not create the gradient of the graph's output node.

This MR aims to fix this issue, following the solution described in aidge_learning#1 (closed).
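
To illustrate the intended behavior, here is a minimal sketch of a loss that returns the scalar loss and also produces the gradient with respect to the graph output, which is what gets attached to the output node. This is plain C++ with toy types, not the Aidge API; `LossResult` and `mse` are hypothetical names used only for illustration.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical result type: the scalar loss plus the gradient w.r.t. the output.
struct LossResult {
    float loss;
    std::vector<float> outputGrad;
};

// Mean squared error: loss = mean((pred - target)^2),
// d(loss)/d(pred_i) = 2 * (pred_i - target_i) / N.
// The gradient is what the loss now attaches to the graph output node.
LossResult mse(const std::vector<float>& pred, const std::vector<float>& target) {
    const std::size_t n = pred.size();
    LossResult res{0.0f, std::vector<float>(n)};
    for (std::size_t i = 0; i < n; ++i) {
        const float diff = pred[i] - target[i];
        res.loss += diff * diff;
        res.outputGrad[i] = 2.0f * diff / static_cast<float>(n);
    }
    res.loss /= static_cast<float>(n);
    return res;
}
```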

Modified files

  • Tensor (see the sketch after this list):
    • Remove the multi-line comment in grad.
    • Add a setGrad function.
    • Rename initGradient -> initGrad for naming consistency.
    • Bind setGrad and initGrad.
  • Producer: remove debug prints.
  • SequentialScheduler: update backward to match the new loss behavior (the gradient is removed from the argument list).
  • Filler: add an assert to avoid division by zero.
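
A sketch of the Tensor-side changes, using a toy Tensor class since the actual Aidge signatures may differ: `initGrad` (renamed from `initGradient`) lazily allocates a zero-filled gradient, and the new `setGrad` lets a loss function install a gradient directly.

```cpp
#include <cassert>
#include <memory>
#include <utility>
#include <vector>

class Tensor {
public:
    explicit Tensor(std::vector<float> data) : mData(std::move(data)) {}

    // Renamed from initGradient: lazily create a zero-filled gradient.
    void initGrad() {
        if (!mGrad) {
            mGrad = std::make_shared<Tensor>(std::vector<float>(mData.size(), 0.0f));
        }
    }

    // New setter: install a gradient computed elsewhere (e.g. by a loss function).
    void setGrad(std::shared_ptr<Tensor> grad) {
        assert(grad && grad->mData.size() == mData.size() && "gradient size mismatch");
        mGrad = std::move(grad);
    }

    std::shared_ptr<Tensor> grad() const { return mGrad; }

private:
    std::vector<float> mData;
    std::shared_ptr<Tensor> mGrad;
};
```

The "Bind" item refers to exposing these two methods through the Python bindings as well.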

Detailed major modifications

  • Scheduler.backward no longer compiles the gradient; the loss function now creates the gradient of the graph output node instead (see the sketch below).
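
A sketch of the resulting call order, with stub types standing in for the real Aidge classes: the loss seeds the output gradient via `setGrad`, and `backward()` is then invoked without a gradient argument.

```cpp
#include <memory>
#include <utility>
#include <vector>

struct Tensor {
    std::vector<float> data;
    std::shared_ptr<Tensor> grad;
    void setGrad(std::shared_ptr<Tensor> g) { grad = std::move(g); }
};

struct Scheduler {
    // Before this MR, backward(gradient) received and compiled the gradient.
    // Now it assumes the loss already set the gradient on the output node.
    void backward() { /* propagate from each output tensor's grad */ }
};

int main() {
    auto output = std::make_shared<Tensor>();
    output->data = {0.5f, -1.0f};

    // The loss computes d(loss)/d(output) and installs it on the output node.
    output->setGrad(std::make_shared<Tensor>(Tensor{{0.2f, -0.4f}, nullptr}));

    Scheduler scheduler;
    scheduler.backward(); // gradient removed from the argument list
}
```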
