backward() segfault unless we init input grad
Calling the backward() function of an operation generates a segfault unless I call initGrad() on the input before the backward() call.
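A minimal reproduction sketch of the problem and workaround, assuming the Aidge C++ API around the time of this report: the ReLU operator, header paths, tensor values and the CPU backend (aidge_backend_cpu) are illustrative choices, and only backward(), initGrad() and setGrad() come from this issue.

```cpp
// Minimal reproduction sketch (not a verified test case). Assumptions:
// the operator choice, header paths, tensor values and CPU backend are
// illustrative; only backward(), initGrad() and setGrad() come from this
// issue, as the API stood before the fix.
#include <memory>

#include "aidge/data/Tensor.hpp"    // assumed header path
#include "aidge/operator/ReLU.hpp"  // assumed header path
#include "aidge/backend/cpu.hpp"    // assumed: CPU kernels from aidge_backend_cpu

int main() {
    auto input = std::make_shared<Aidge::Tensor>(
        Aidge::Array1D<float, 3>{{-1.0f, 0.0f, 2.0f}});

    auto op = std::make_shared<Aidge::ReLU_Op>();
    op->associateInput(0, input);
    op->setDataType(Aidge::DataType::Float32);
    op->setBackend("cpu");
    op->forward();

    // Seed the output gradient; per the discussion below, no initGrad()
    // is needed on the output before setGrad().
    op->getOutput(0)->setGrad(std::make_shared<Aidge::Tensor>(
        Aidge::Array1D<float, 3>{{1.0f, 1.0f, 1.0f}}));

    // Workaround reported here: without this call, backward()
    // dereferences the input's never-allocated gradient tensor
    // and segfaults.
    input->initGrad();

    op->backward();
    return 0;
}
```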
Activity
- Houssem ROUIS changed the description
- Houssem ROUIS assigned to @olivierbichler
- Houssem ROUIS (Author): I was wrong about calling initGrad() before setGrad() for the output; it is not necessary. So currently it is only necessary for the input.
- Olivier BICHLER (Maintainer): @cmoineau Is there a specific reason why the content of initGrad() could not be put in grad() (therefore implementing simple lazy initialization)?
- Cyril Moineau (Maintainer): No particular reason, we could do this. Just the values returned will not be initialized?
- Maintainer: This has been fixed in MR aidge_core!143 (merged) with lazy initialization of the gradient tensor.
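For reference, a conceptual sketch of the lazy-initialization approach that resolved this: an assumption of what the change does in spirit, with illustrative member names, not the actual aidge_core!143 diff.

```cpp
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Conceptual sketch only: a stand-in Tensor showing grad() allocating
// the gradient on first access, so callers no longer need initGrad().
class Tensor {
public:
    explicit Tensor(std::vector<std::size_t> dims) : mDims(std::move(dims)) {}

    // Lazy initialization: allocate the gradient tensor, shaped like
    // this tensor, the first time it is requested.
    std::shared_ptr<Tensor> grad() {
        if (!mGrad) {
            mGrad = std::make_shared<Tensor>(mDims);
            // Caveat raised by Cyril above: the freshly allocated
            // values are not zero-filled unless done explicitly here.
        }
        return mGrad;
    }

private:
    std::vector<std::size_t> mDims;
    std::shared_ptr<Tensor> mGrad;
};
```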
- Olivier BICHLER closed
- Olivier BICHLER mentioned in merge request aidge_core!143 (merged)
- Maxence Naud mentioned in issue aidge_core#133