[tools/onnx-subgraph] add multi subgraphs inference code #14769
Conversation
Add code for multi-subgraph inference.

ONE-DCO-1.0-Signed-off-by: Youxin Chen <yx113.chen@samsung.com>
@@ -116,7 +163,16 @@ def prepare_initial_input_data(onnx_model_path, default_input_data):
    "x": np.random.rand(1, 3, 256, 256).astype(np.float32),
}
initial_input_data = prepare_initial_input_data(args.single, default_input_data)

Please remove this line; it doesn't seem to be related to the change.
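The hunk above only shows the call site; the helper itself is outside the quoted lines. As a hedged sketch (not the PR's actual implementation), `prepare_initial_input_data` could enumerate the model's declared inputs via onnxruntime and fall back to random tensors for anything missing from the defaults:

```python
# Hypothetical sketch only -- the real helper is in the PR diff, which is
# not fully visible in this hunk. Assumed behavior: list the model's
# inputs with onnxruntime and fill any input missing from
# default_input_data with random data.
import numpy as np
import onnxruntime as ort

def prepare_initial_input_data(onnx_model_path, default_input_data):
    session = ort.InferenceSession(onnx_model_path,
                                   providers=["CPUExecutionProvider"])
    input_data = {}
    for inp in session.get_inputs():
        if inp.name in default_input_data:
            input_data[inp.name] = default_input_data[inp.name]
        else:
            # Treat symbolic/unknown dimensions as size 1.
            shape = [d if isinstance(d, int) else 1 for d in inp.shape]
            input_data[inp.name] = np.random.rand(*shape).astype(np.float32)
    return input_data
```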
model_input_data = {name: input_data[name] for name in input_names}
outputs = session.run(None, model_input_data)
current_model_outputs = dict(zip(output_names, outputs))
if output_names_to_collect is not None:
We can hoist this `if` line above the `for` loop. When `collected_outputs` is `None`, the return is `{}`.
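A sketch of the suggested structure: check once before the loop and return early. Everything outside the quoted hunk (the function name, `model_paths`, the chaining of subgraph outputs into later inputs) is an assumption for illustration:

```python
import onnxruntime as ort

def infer_models_and_collect(model_paths, input_data, output_names_to_collect):
    # Hoisted check per the review comment: if there is nothing to
    # collect, return {} without running the loop at all.
    if output_names_to_collect is None:
        return {}

    collected_outputs = {}
    for model_path in model_paths:  # model_paths is assumed context
        session = ort.InferenceSession(model_path,
                                       providers=["CPUExecutionProvider"])
        input_names = [i.name for i in session.get_inputs()]
        output_names = [o.name for o in session.get_outputs()]

        model_input_data = {name: input_data[name] for name in input_names}
        outputs = session.run(None, model_input_data)
        current_model_outputs = dict(zip(output_names, outputs))

        # Make this subgraph's outputs available as inputs to later ones.
        input_data.update(current_model_outputs)
        for name in output_names_to_collect:
            if name in current_model_outputs:
                collected_outputs[name] = current_model_outputs[name]
    return collected_outputs
```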
The updated change doesn't reflect my comment correctly. Please read it again.
# Perform inference using multiple split subgraph models
output_multiple = model_inference.infer_multiple_onnx_models(
    initial_input_data, output_names_list)
print("Multiple subgraph inference completed!")
Q) Is `infer_single_onnx_model` for the "source" ONNX model and `infer_multiple_onnx_models` for our "target" split models?
Yes, we use this code to verify the split models' inference results, then compare the output data against the "source" ONNX model to evaluate accuracy. The comparison code will come in the next PR.
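Since the comparison lands in a later PR, here is only a hedged sketch of the idea (function and variable names are assumptions): report per-tensor error between the single-model and multi-subgraph outputs.

```python
import numpy as np

def compare_outputs(output_single, output_multiple, atol=1e-5):
    # Both arguments are assumed to be {output_name: np.ndarray} dicts,
    # as returned by the single-model and multi-subgraph inference paths.
    for name, expected in output_single.items():
        actual = output_multiple[name]
        max_diff = float(np.max(np.abs(expected - actual)))
        ok = np.allclose(expected, actual, atol=atol)
        print(f"{name}: max abs diff = {max_diff:.3e}, allclose = {ok}")
```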
Oh, yes, I've moved the parameter exception checking out of the loop now, thank you :)
update code as per the review comment
move parameter exception checking out of the loop
LGTM
related issue: #14534
historical full changes PR: #14613
add code for multi subgraphs inference
ONE-DCO-1.0-Signed-off-by: Youxin Chen yx113.chen@samsung.com