
Add: backward pass for Pop operator

Merged Jerome Hue requested to merge jeromeh/aidge_core:pop-backward into dev

This MR adds the backward pass implementation for the Pop operator. During backward, incoming gradients are stacked back into the positions of the input tensor from which the forward pass popped the corresponding slices.

Key changes:

  • Implemented Pop_OpImpl::backward() method to handle gradient propagation
  • Added BackwardStep counter attribute to track backward pass position
  • Extended unit tests to verify gradient computation
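To illustrate the idea behind the changes above, here is a minimal, framework-agnostic sketch (not the actual Aidge `Pop_OpImpl` code): the forward pass pops one slice of the stacked input per step, and the backward pass uses a `BackwardStep`-style counter to accumulate each incoming gradient into the slot that produced the matching forward output, counting down from the last forward step since gradients arrive in reverse order. All names (`PopSketch`, `forwardStep`, `backwardStep`) are hypothetical.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical minimal model of a Pop operator on a 1-D "stack" of scalars.
struct PopSketch {
    std::size_t forwardStep  = 0;  // analogous to the existing forward step counter
    std::size_t backwardStep = 0;  // analogous to the new BackwardStep attribute

    // Forward: pop the next slice of the stacked input, in order.
    double forward(const std::vector<double>& input) {
        return input[forwardStep++];
    }

    // Backward: gradients arrive in reverse step order, so the target slot
    // is counted down from the last forward step; accumulate, don't overwrite.
    void backward(std::vector<double>& inputGrad, double outputGrad) {
        const std::size_t idx = forwardStep - 1 - backwardStep;
        inputGrad[idx] += outputGrad;
        ++backwardStep;
    }
};
```

With three forward steps, the first `backward()` call carries the gradient of the last popped slice and lands in the last slot, mirroring how the real implementation tracks its position with the `BackwardStep` counter.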
