Clarification on tensor operators like exp, log, and clipping
What commit version of aidge do you use?
- aidge_core: 0.6.2 (dev)
- aidge_...: 0.3.0 (dev)
I am using the AIDGE Python API for deep learning tasks, and I need clarification regarding the exponential (exp()), natural logarithm (log()), and clipping operations.
My questions are:
1. Do aidge_core.Tensor objects have direct methods for exp() and log() (e.g., my_tensor.exp(), my_tensor.log())? I can only see operators like sqrt() or abs().
2. If not, are there aidge_core functions (e.g., aidge_core.exp(my_tensor), aidge_core.log(my_tensor)) that perform these operations on existing aidge_core.Tensor objects?
3. Crucially, are these exp() and log() operations (in whichever form they exist) fully integrated into AIDGE's automatic differentiation graph, so that gradients are correctly computed and propagated during backpropagation?
4. Does an element-wise clipping operator exist in the AIDGE Python API (e.g., aidge_core.clip(tensor, min_val, max_val) or a method like tensor.clip(min_val, max_val))?
5. Given the availability of these operators, is it possible to construct a custom loss function such as the one used in Proximal Policy Optimization (PPO), which chains differentiable operations like min(surr1, surr2), where surr1 and surr2 themselves depend on exp and clip? (See the sketch at the end of this post.)
I am trying to build a reinforcement learning algorithm (PPO) and need to confirm that these fundamental operations support automatic differentiation, so that complex loss functions can be defined on top of them.
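For reference, here is the forward computation I want to reproduce with differentiable AIDGE operators. This is a plain NumPy sketch that only documents the math: the function name, signature, and eps default are my own, and the NumPy calls (exp, clip, minimum) stand in for whatever the AIDGE equivalents may be.

```python
import numpy as np

def ppo_clipped_surrogate(log_prob_new, log_prob_old, advantage, eps=0.2):
    """Clipped surrogate objective from PPO, negated so it can be minimized."""
    # Probability ratio r_t = pi_new(a|s) / pi_old(a|s), computed via exp()
    ratio = np.exp(log_prob_new - log_prob_old)
    surr1 = ratio * advantage
    # Clipping the ratio to [1 - eps, 1 + eps] bounds the policy update
    surr2 = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # PPO maximizes min(surr1, surr2); negate the mean to obtain a loss
    return -np.mean(np.minimum(surr1, surr2))

# Dummy per-sample values, just to show the expected shapes
loss = ppo_clipped_surrogate(
    log_prob_new=np.array([-0.9, -1.2]),
    log_prob_old=np.array([-1.0, -1.0]),
    advantage=np.array([0.5, -0.3]),
)
```

Every operation in this function (exp, multiply, clip, minimum, mean, negate) would need to be differentiable in AIDGE for gradients to flow back into the policy network.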
Thank you in advance.