Commit c2ff03e3 authored by Maxence Naud, committed by Maxence Naud

UPD: documentation for 'GraphView::forwardDims()' in CPP and Python

parent af4985ff
2 merge requests: !318 [Upd] release verision 0.5.0, !313 [UPD] add some logging informations
@@ -247,13 +247,51 @@ public:
const std::vector<std::vector<DimSize_t>> dims = {});
/**
* @brief Compute dimensions of input/output Tensors for each Operator of the
* GraphView object's Nodes, by calling Node::forwardDims().
 * This function verifies the following conditions:
 * - Every node will forwardDims() regardless of whether dims were previously forwarded or not;
 * - forwardDims() calls are made in node-dependency order, because if dims have changed
 * at any point in the graph, the change must be propagated correctly to all succeeding nodes;
 * - It handles cyclic dependencies correctly (currently only induced by the Memorize_Op).
* @brief Compute and propagate Tensor dimensions through the GraphView.
*
* This function computes dimensions of input/output Tensors for each of the
* Node's associated Operator in the GraphView by propagating dimensions from
* inputs through the entire network.
* It handles:
* - Dimension propagation in dependency order
* - Cyclic dependencies (specifically for Memorize_Op)
* - Input dimension validation and setting
* - Optional vs mandatory inputs
*
 * @note Dimensions will be propagated through every Node regardless of whether
 * dims were previously forwarded or not.
* @details The algorithm works in several phases:
* 1. Input Dimension Setup:
* - Validates/sets provided input dimensions
* - Checks compatibility with existing tensors
*
* 2. Connection Verification:
* - Ensures all node connections are valid
* - Verifies mandatory inputs are present
*
* 3. Dimension Propagation:
* - Propagates dimensions through the graph in topological order
* - Detects and handles circular dependencies induced by Memorize_Op
*
* Example:
* @code
* auto graph = std::make_shared<GraphView>();
* // ... build graph ...
*
* // Forward with default empty dimensions
* bool success = graph->forwardDims();
*
* // Forward with specific input dimensions
* std::vector<std::vector<DimSize_t>> inputDims = {
* {1, 3, 224, 224}, // First input
* {1, 64, 112, 112} // Second input
* };
* success = graph->forwardDims(inputDims);
* @endcode
*
* @param dims Vector of dimension vectors for graph inputs. Empty by default.
* @param allowDataDependency Whether to allow data-dependent dimension computation. False by default.
* @return true if dimension propagation succeeded, false otherwise.
*/
bool forwardDims(const std::vector<std::vector<DimSize_t>>& dims = {}, bool allowDataDependency = false);
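To complement the example above, here is a minimal sketch of how the allowDataDependency flag is meant to be used. The data-dependent operator, the input shape, and the elided graph construction are illustrative assumptions, not part of this commit:

    // Hypothetical graph containing an operator whose output shape depends on
    // runtime tensor values (e.g. a Reshape whose target shape is a graph input).
    auto graph = std::make_shared<GraphView>();
    // ... build graph ...

    // With the default allowDataDependency == false, propagation may stop at the
    // data-dependent node and report failure.
    bool done = graph->forwardDims({{1, 3, 224, 224}});

    // Allowing data dependency lets forwardDims() evaluate the producers needed
    // to resolve such shapes.
    done = graph->forwardDims({{1, 3, 224, 224}}, /*allowDataDependency=*/true);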
@@ -127,7 +127,66 @@ void init_GraphView(py::module& m) {
.def("clone", &GraphView::clone)
.def("get_nodes", &GraphView::getNodes)
.def("get_node", &GraphView::getNode, py::arg("node_name"))
.def("forward_dims", &GraphView::forwardDims, py::arg("dims")=std::vector<std::vector<DimSize_t>>(), py::arg("allow_data_dependency") = false)
.def("forward_dims", &GraphView::forwardDims, py::arg("dims")=std::vector<std::vector<DimSize_t>>(), py::arg("allow_data_dependency") = false,
R"mydelimiter(
Compute and propagate Tensor dimensions through the GraphView.
This function computes dimensions of input/output Tensors for each of the
Node's associated Operator in the GraphView by propagating dimensions from
inputs through the entire network.
It handles:
* Dimension propagation in dependency order
* Cyclic dependencies (specifically for Memorize_Op)
* Input dimension validation and setting
* Optional vs mandatory inputs

Note
----
Dimensions will be propagated through every Node regardless of whether
dims were previously forwarded or not.

The algorithm works in several phases:
1. Input Dimension Setup:
* Validates/sets provided input dimensions
* Checks compatibility with existing tensors
2. Connection Verification:
* Ensures all node connections are valid
* Verifies mandatory inputs are present
3. Dimension Propagation:
* Propagates dimensions through the graph in topological order
* Detects and handles circular dependencies induced by Memorize_Op

Parameters
----------
dims : List[List[int]], optional
Vector of dimension vectors for graph inputs. Empty by default.
allow_data_dependency : bool, optional
Whether to allow data-dependent dimension computation, by default False

Returns
-------
bool
True if dimension propagation succeeded, False otherwise.

Examples
--------
>>> graph = GraphView()
>>> # ... build graph ...
>>>
>>> # Forward with default empty dimensions
>>> success = graph.forward_dims()
>>>
>>> # Forward with specific input dimensions
>>> input_dims = [
... [1, 3, 224, 224], # First input
... [1, 64, 112, 112] # Second input
... ]
>>> success = graph.forward_dims(input_dims)
)mydelimiter")
.def("compile", &GraphView::compile, py::arg("backend"), py::arg("datatype"), py::arg("device") = 0, py::arg("dims")=std::vector<std::vector<DimSize_t>>())
.def("__call__", &GraphView::operator(), py::arg("connectors"))
.def("set_datatype", &GraphView::setDataType, py::arg("datatype"))