hsj0429 opened 4 years ago
I have read the source code of MindSpore, and I want to know how the MindSpore IR deals with control flow (such as conditional and loop statements) in the graph rewriter. Do you have more docs about this?
There is no detailed documentation about control flow at the moment. One way to work around this is to dump the IR and inspect it yourself.
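While the documentation is pending, it may help to know that functional graph IRs commonly lower an `if` statement into a node that selects between two branch subgraphs, and a `while` loop into a condition function plus a body function that thread loop state through each iteration. The sketch below illustrates that idea in plain Python; it is a conceptual analogy only, not MindSpore's actual implementation, and the names `switch` and `while_loop` are made up for illustration.

```python
# Conceptual sketch (NOT MindSpore's implementation) of how a functional
# graph IR can model Python control flow as ordinary nodes.

def switch(cond, true_branch, false_branch):
    # `if` lowered to branch *selection*: pick one branch function
    # based on cond, then call only the chosen branch.
    branch = true_branch if cond else false_branch
    return branch()

def while_loop(cond_fn, body_fn, state):
    # `while` lowered to a condition function and a body function that
    # thread loop-carried state through each iteration.
    while cond_fn(state):
        state = body_fn(state)
    return state

# y = relu(x) if x > 0 else -x, expressed as branch selection:
x = -3.0
y = switch(x > 0, lambda: max(x, 0.0), lambda: -x)
print(y)            # 3.0

# Summing 0..4, expressed with explicit loop state (i, total):
result = while_loop(lambda s: s[0] < 5,
                    lambda s: (s[0] + 1, s[1] + s[0]),
                    (0, 0))
print(result[1])    # 10
```

Dumping the IR of a small network that contains an `if` or a loop and comparing it against this mental model is a reasonable way to reverse-engineer the actual representation.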
Dump IR:
Install MindSpore from source. Please refer to the MindSpore installation guide.
When running build.sh, add the option "-D":
-D Enable dumping of function graph IR, default on
In your network script, set the context to save IR files. Please refer to the Context API. Example:
import numpy as np
from mindspore import nn, Tensor, context

class Net(nn.Cell):
    def __init__(self):
        super(Net, self).__init__()
        self.conv = nn.Conv2d(1, 1, 3)
        self.relu = nn.ReLU()

    def construct(self, x):
        for i in range(3):
            x = self.relu(self.conv(x))
        return x

if __name__ == '__main__':
    context.set_context(mode=context.GRAPH_MODE, device_target='CPU',
                        save_graphs=True, save_graphs_path='.')
    net = Net()
    x = Tensor(np.ones((1, 1, 32, 32)).astype(np.float32) * 0.01)
    ret = net(x)
Run your script, and the IR file "0_parse.dot" will be saved to the specified path.
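Once you have the dumped file, a quick way to look for control-flow handling is to scan it for node labels that look control-flow related. Below is a minimal sketch that assumes the dump is in Graphviz dot format; the labels "Switch" and "Partial" are illustrative guesses, so adapt them to whatever your dump actually contains. The embedded `sample_dot` string stands in for reading the real file.

```python
# Sketch: scan a dumped IR .dot file for control-flow-looking nodes.
# `sample_dot` stands in for open('0_parse.dot').read(); the labels
# below are hypothetical examples, not guaranteed MindSpore node names.
sample_dot = """digraph mindspore {
  node0 [label="Conv2D"];
  node1 [label="Switch"];
  node2 [label="Partial"];
}"""

control_nodes = [line.strip()
                 for line in sample_dot.splitlines()
                 if "Switch" in line or "Partial" in line]
for node in control_nodes:
    print(node)
```

You can also render the .dot file with Graphviz (`dot -Tpng 0_parse.dot -o graph.png`) to see how the loop in `construct` was represented.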
This is a great question and we will further improve our documentation.