Add: backward pass for Pop operator
This adds the backward pass implementation for the Pop operator. During the backward pass, incoming gradients are stacked into the correct positions of the input tensor, based on the sequence of pops performed during the forward pass.
Key changes:
- Implemented Pop_OpImpl::backward() method to handle gradient propagation
- Added a `BackwardStep` counter attribute to track the backward pass position
- Extended unit tests to verify gradient computation
Edited by Jerome Hue