Add knowledge distillation (KD) loss to aidge_learning
Context
This merge request adds a Knowledge Distillation (KD) loss to aidge to support incremental learning: the KD term penalizes divergence between the current model's outputs and those of a previously trained model, which helps mitigate catastrophic forgetting when training on new data.
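The description does not spell out the exact formulation. As background, one plausible form, consistent with the "inspired by BCE loss" note in the file list below and with incremental-learning methods such as iCaRL, treats the teacher's sigmoid outputs as soft targets for a binary cross-entropy over the student's logits. The form actually implemented in `KD.cpp` may differ:

```math
\mathcal{L}_{\mathrm{KD}} = -\frac{1}{N}\sum_{i=1}^{N} \Big[ \sigma(z_i^{t}) \log \sigma(z_i^{s}) + \big(1-\sigma(z_i^{t})\big) \log\big(1-\sigma(z_i^{s})\big) \Big]
```

where z^s and z^t are the student and teacher logits and σ is the sigmoid function.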
Modified files
- `aidge_learning/src/loss/distillation/KD.cpp`: definition of the KD loss (closely modeled on the BCE loss); see the sketch after this list
- `aidge_learning/include/aidge/loss/LossList.hpp`: adds the KD loss to the loss list
- `aidge_learning/python_binding/learning/loss/pybind_Loss.cpp`: adds the Python binding for the KD loss; see the binding sketch after this list
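To make the idea concrete, here is a minimal, self-contained sketch of a BCE-style KD loss on raw logits. It is illustrative only: the function and variable names are hypothetical, and this is not the actual implementation in `KD.cpp`, which operates on aidge tensors.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// BCE-style knowledge distillation: the teacher's sigmoid outputs act as
// soft targets for a binary cross-entropy on the student's logits.
// Hypothetical sketch, not the aidge implementation.
double kdLoss(const std::vector<double>& studentLogits,
              const std::vector<double>& teacherLogits) {
    auto sigmoid = [](double z) { return 1.0 / (1.0 + std::exp(-z)); };
    const double eps = 1e-12;  // guards against log(0)
    double sum = 0.0;
    for (std::size_t i = 0; i < studentLogits.size(); ++i) {
        const double t = sigmoid(teacherLogits[i]);  // soft target from the teacher
        const double s = sigmoid(studentLogits[i]);  // student probability
        sum -= t * std::log(s + eps) + (1.0 - t) * std::log(1.0 - s + eps);
    }
    return sum / static_cast<double>(studentLogits.size());
}

int main() {
    const std::vector<double> teacher{2.0, -1.0, 0.5};
    const std::vector<double> student{1.5, -0.5, 0.2};
    std::printf("KD loss: %f\n", kdLoss(student, teacher));
    return 0;
}
```

The path `python_binding/learning/loss/pybind_Loss.cpp` suggests the bindings are written with pybind11. A generic sketch of how a loss function like the one above could be exposed (module and argument names are hypothetical, and the real binding works on aidge tensors rather than `std::vector`):

```cpp
#include <pybind11/pybind11.h>
#include <pybind11/stl.h>  // std::vector <-> Python list conversion

namespace py = pybind11;

// Hypothetical binding sketch; not the contents of pybind_Loss.cpp.
// Assumes the kdLoss function from the sketch above is visible here.
PYBIND11_MODULE(kd_example, m) {
    m.def("kd_loss", &kdLoss,
          py::arg("student_logits"), py::arg("teacher_logits"),
          "BCE-style knowledge distillation loss (illustrative).");
}
```

Once built, such a binding would be callable from Python as, e.g., `kd_loss(student_logits, teacher_logits)`.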
TODO
- Integrate the merge request into the main branch: NOT DONE