Scheduler backward
Example of the expected Python user interface:

```python
model = load_onnx("model.onnx")
# set backend, datatype, dimensions, data format
model.forward_compile()
# set backend, datatype, dimensions, data format for the gradient
model.backward_compile()  # here
# set learning parameters
myLoss = MSE()
myLR = ConstantLR(1e-3)
param = model.parameters()  # here
opt = Adam(param, myLR)
# get data
data = DataBase("path/to/dataset", transformations=[resize, normalize, …])
provider = DataProvider(data, batchsize=8)
# learn
sch = SequentialScheduler(model)
for x1, x2, label in provider:
    y = sch.forward([x1, x2])
    l = myLoss(y, label)
    opt.zero_grad()  # here
    sch.backward(l)  # here
    opt.update()
```
Graph manipulation functions:

- `parameters()` to extract parameters of type `Producer` from a `GraphView` (see the sketch after this list)
- `producers()` to extract any operator of type `Producer` from a `GraphView`
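A minimal sketch of how `parameters()` could be expressed in terms of `producers()`. The classes below are simplified stand-ins, not the actual Aidge API:

```python
# Hypothetical, simplified types -- a sketch of the intended relationship
# between producers() and parameters(), not the actual Aidge implementation.

class Node:
    def __init__(self, op_type, output_tensor=None):
        self.op_type = op_type              # e.g. "Producer", "Conv", "FC"
        self.output_tensor = output_tensor

class GraphView:
    def __init__(self, nodes):
        self.nodes = nodes

    def producers(self):
        # every operator of type Producer in the GraphView
        return [n for n in self.nodes if n.op_type == "Producer"]

    def parameters(self):
        # the parameter Tensors held by those Producer operators
        return [n.output_tensor for n in self.producers()]
```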
Backward function:

- `backward()` function in `SequentialScheduler`. The choice was made to start with reversing the `SequentialScheduler` list (see the sketch after this list).
- `backward()` function in `OperatorTensor`s:
  - Activations
    - LeakyReLU
    - ReLU
    - Sigmoid
    - Sqrt
  - Arithmetic
    - Add
    - Sub
    - Mul
    - Div
    - Pow
  - Layers
    - FC
    - Conv
- `backward()` in `GenericOperator`
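A minimal sketch of the two `backward()` levels listed above, using hypothetical NumPy-based operators rather than the actual Aidge classes: the scheduler replays its forward scheduling list in reverse, and each operator implements its local gradient rule (ReLU for the activation case, Mul for the two-input arithmetic case):

```python
import numpy as np

class ReLUOp:
    """Activation example: dL/dx = dL/dy * 1[x > 0]."""
    def forward(self, x):
        self.x = x                        # cache input for the backward pass
        return np.maximum(x, 0.0)

    def backward(self, grad_out):
        return grad_out * (self.x > 0.0)

class MulOp:
    """Arithmetic example with two inputs: d(a*b)/da = b, d(a*b)/db = a.
    In a real graph, multi-input ops need graph-aware gradient accumulation."""
    def forward(self, a, b):
        self.a, self.b = a, b
        return a * b

    def backward(self, grad_out):
        return grad_out * self.b, grad_out * self.a

class SequentialScheduler:
    """Single-chain sketch: backward() reverses the forward scheduling list."""
    def __init__(self, ops):
        self.static_scheduling = ops      # ordered as executed by forward()

    def forward(self, x):
        for op in self.static_scheduling:
            x = op.forward(x)
        return x

    def backward(self, grad):
        for op in reversed(self.static_scheduling):
            grad = op.backward(grad)
        return grad

# usage: gradients flow back through the reversed scheduling list
sch = SequentialScheduler([ReLUOp(), ReLUOp()])
y = sch.forward(np.array([-1.0, 2.0]))
dx = sch.backward(np.ones_like(y))
```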
Instantiate gradient:

- `instanciateGraphView()` to initialize the Tensors' gradients with the same datatype/backend (a sketch follows this list)
- `compile_backward()`: should the computational graph for the gradient be independent of the forward graph?
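A sketch of the intended behaviour, with a hypothetical helper and a simplified `Tensor` stand-in (not the actual `instanciateGraphView()` signature): every forward Tensor gets a zero-filled gradient Tensor sharing its datatype and backend.

```python
import numpy as np

class Tensor:
    """Simplified stand-in for an Aidge Tensor (assumption for this sketch)."""
    def __init__(self, data, backend="cpu"):
        self.data = np.asarray(data)
        self.backend = backend
        self.grad = None                  # filled by the step below

def instanciate_gradients(tensors):
    # allocate each gradient with the same shape, datatype and backend
    # as its forward Tensor
    for t in tensors:
        t.grad = Tensor(np.zeros_like(t.data), backend=t.backend)
```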
Unit tests for everything:

- unit tests (a possible gradient-check pattern is sketched below)
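One possible pattern for these tests (a sketch, not an existing Aidge test): compare each operator's analytic backward rule against a central finite-difference approximation of the gradient.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_backward(x, grad_out):
    # analytic gradient rule under test
    return grad_out * (x > 0.0)

def test_relu_backward_matches_finite_differences():
    rng = np.random.default_rng(0)
    x = rng.standard_normal(16)
    grad = relu_backward(x, np.ones_like(x))

    eps = 1e-6
    fd = np.empty_like(x)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        # central difference of the summed output w.r.t. x[i]
        fd[i] = (relu(xp).sum() - relu(xm).sum()) / (2 * eps)

    np.testing.assert_allclose(grad, fd, atol=1e-4)
```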
Other:

- Move the `forwardDims()` member function out of `GraphView`, as it is not about topology but Tensors (see the sketch below)
- Move the `compile()` member function out of `GraphView` for the same reason?
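A sketch of what that refactoring could look like. All signatures and accessors here are hypothetical, only meant to illustrate the two member functions becoming free functions operating on a `GraphView`'s Tensors:

```python
def forward_dims(graph_view):
    """Would replace GraphView.forwardDims(): propagate Tensor dimensions
    node by node, since this touches Tensors rather than topology."""
    for node in graph_view.get_ordered_nodes():  # hypothetical accessor
        node.compute_output_dims()               # hypothetical per-op hook

def compile_graph_view(graph_view, backend, datatype):
    """Would replace GraphView.compile(): set backend/datatype on every
    Tensor, then propagate dimensions."""
    for tensor in graph_view.tensors():          # hypothetical accessor
        tensor.set_backend(backend)
        tensor.set_datatype(datatype)
    forward_dims(graph_view)
```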