backward() segfault unless we init input grad
Calling the backward() function of an operation generates a segfault unless I call initGrad() on the input before the backward() call.
assigned to @olivierbichler
I was wrong about calling initGrad() before setGrad() for the output: it is not necessary there. So currently it is only necessary for the input.
@cmoineau Is there a specific reason why the content of initGrad() could not be put in grad() (therefore implementing simple lazy initialization)?
No particular reason, we could do this. It's just that the returned values will not be initialized?
This has been fixed in MR aidge_core!143 (merged) with lazy initialization of the gradient tensor.
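For illustration, a minimal sketch of the lazy-initialization pattern discussed above: the gradient tensor is allocated (zero-filled) on first access to grad(), so callers no longer need an explicit initGrad() before backward(). The `Tensor` class, its members, and `grad()` here are simplified stand-ins, not the actual aidge_core implementation.

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Hypothetical, simplified Tensor illustrating lazy gradient allocation.
class Tensor {
public:
    explicit Tensor(std::size_t size) : mData(size, 0.0f) {}

    // Lazy initialization: create the zero-filled gradient tensor on
    // first access instead of requiring a prior initGrad() call.
    // Repeated calls return the same gradient tensor.
    std::shared_ptr<Tensor> grad() {
        if (!mGrad) {
            mGrad = std::make_shared<Tensor>(mData.size());
        }
        return mGrad;
    }

    std::vector<float>& data() { return mData; }

private:
    std::vector<float> mData;
    std::shared_ptr<Tensor> mGrad;
};
```

With this pattern, a backward pass can safely accumulate into `input->grad()->data()` even if the user never initialized the gradient explicitly, which removes the segfault scenario reported above.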
closed
mentioned in merge request aidge_core!143 (merged)
mentioned in issue aidge_core#133