Add knowledge distillation (KD) loss to aidge_learning

Merged Lucas RAKOTOARIVONY requested to merge lrakotoarivony/aidge_learning:main into dev
1 unresolved thread

Context

This merge request adds the Knowledge Distillation (KD) loss to aidge for incremental learning.
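
The MR does not quote the exact formulation, but the classic KD loss (Hinton et al., 2015) matches the student's temperature-softened output distribution to the teacher's. A minimal NumPy sketch of that standard formulation, assuming this is the variant implemented (names here are illustrative, not the aidge API):

```python
import numpy as np

def softmax(logits, T):
    # Temperature-scaled softmax; higher T gives softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's and student's softened
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures.
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    return -(T * T) * np.mean(np.sum(p_teacher * log_p_student, axis=-1))

# Toy example: a batch of 2 samples, 3 classes.
student = np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
teacher = np.array([[2.5, 0.3, 0.0], [0.1, 2.0, 0.2]])
print(kd_loss(student, teacher))  # scalar loss value
```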

Modified files

  • aidge_learning/src/loss/distillation/KD.cpp: definition of the KD loss (closely modeled on the BCE loss implementation)
  • aidge_learning/include/aidge/loss/LossList.hpp: adds the KD loss to the loss list
  • aidge_learning/python_binding/learning/loss/pybind_Loss.cpp: adds the Python binding for the KD loss (see the usage sketch after this list)
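
For reference, calling the new binding might look like the sketch below. The module path, function name, and signature are assumptions based on how other losses in aidge_learning are exposed, not confirmed by this MR:

```python
import numpy as np
import aidge_core
import aidge_learning

# HYPOTHETICAL usage: the exact name and arguments of the KD binding are
# not quoted in this MR; this mirrors how other losses exposed by
# pybind_Loss.cpp are invoked on aidge tensors.
student_pred = aidge_core.Tensor(np.array([[2.0, 0.5, 0.1]], dtype=np.float32))
teacher_pred = aidge_core.Tensor(np.array([[2.5, 0.3, 0.0]], dtype=np.float32))

kd = aidge_learning.loss.KD(student_pred, teacher_pred)  # assumed entry point
print(kd)
```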

TODO

Integrate the merge request into the main branch



Activity

  • added 1 commit

  • Olivier BICHLER reset approvals from @olivierbichler by pushing to the branch

  • Olivier BICHLER approved this merge request

  • Olivier BICHLER enabled an automatic merge when all merge checks for 915aebf8 pass

  • Olivier BICHLER mentioned in commit 9c7303e5
