How to handle a model larger than 2 GB #32

Open
Worromots opened this issue Jun 26, 2024 · 4 comments

Comments

@Worromots

File "/workdir/common/anaconda3/envs/qpixart/lib/python3.9/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/mnt/dolphinfs/hdd_pool/docker/user/hadoop-platcv/tsl/q_mtpixart/Dipoorlet/dipoorlet/main.py", line 120, in
act_clip_val, weight_clip_val = tensor_calibration(onnx_graph, args)
File "/mnt/dolphinfs/hdd_pool/docker/user/hadoop-platcv/tsl/q_mtpixart/Dipoorlet/dipoorlet/tensor_cali/tensor_cali_base.py", line 6, in tensor_calibration
act_clip_val = tensor_cali_dispatcher(args.act_quant, onnx_graph, args)
File "/mnt/dolphinfs/hdd_pool/docker/user/hadoop-platcv/tsl/q_mtpixart/Dipoorlet/dipoorlet/utils.py", line 297, in wrapper
return dispatch(args[0])(*(args[1:]), **kw)
File "/mnt/dolphinfs/hdd_pool/docker/user/hadoop-platcv/tsl/q_mtpixart/Dipoorlet/dipoorlet/tensor_cali/basic_algorithm.py", line 35, in find_clip_val_hist
stats_min_max = forward_get_minmax(onnx_graph, args)
File "/mnt/dolphinfs/hdd_pool/docker/user/hadoop-platcv/tsl/q_mtpixart/Dipoorlet/dipoorlet/forward_net.py", line 200, in forward_get_minmax
ort_session = ort.InferenceSession(net.SerializeToString(), providers=providers)
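
The trace ends at ort.InferenceSession(net.SerializeToString(), ...), which fails for models larger than 2 GB because protobuf caps a single serialized message at 2 GB. A minimal sketch of the usual ONNX Runtime workaround, assuming the graph can be re-saved with external data and then loaded by file path (the file names below are hypothetical, not Dipoorlet's):

import onnx
import onnxruntime as ort

# Hypothetical paths, for illustration only.
src_path = "model.onnx"            # the >2GB model to calibrate
dst_path = "model_external.onnx"   # same graph, weights stored in a side file

net = onnx.load(src_path)
providers = ["CPUExecutionProvider"]

# SerializeToString() on a >2GB graph trips protobuf's 2GB message limit.
# Saving with external data keeps the weights outside the .onnx protobuf.
onnx.save_model(
    net,
    dst_path,
    save_as_external_data=True,
    all_tensors_to_one_file=True,
    location="model_external.data",
)

# Load by file path so onnxruntime can resolve the external data itself.
ort_session = ort.InferenceSession(dst_path, providers=providers)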

@gushiqiao
Contributor

Are you running a transformer-type model?

@Worromots
Author

Worromots commented Aug 26, 2024 via email

@gushiqiao
Contributor

For models larger than 2 GB, Dipoorlet relies on "onnxruntime.transformers.optimizer" for optimization; the relevant code is here: https://github.com/ModelTC/Dipoorlet/blob/c89130744d45ae2e7cb77081b8799c2ff31ee08d/dipoorlet/__main__.py#L85.
As for DiT, it depends on whether "onnxruntime.transformers.optimizer" supports it. At the moment Dipoorlet only supports optimization for UNet, which you can see here:

parser.add_argument("--model_type", help="Transformer model type", choices=["unet"], default=None)
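
For reference, a minimal sketch of how "onnxruntime.transformers.optimizer" is typically driven for a UNet-style graph; the file names are hypothetical and the exact options Dipoorlet passes may differ (see the __main__.py line linked above):

from onnxruntime.transformers.optimizer import optimize_model

# Hypothetical paths; Dipoorlet wires these up from its own arguments.
input_path = "unet.onnx"
output_path = "unet_opt.onnx"

# optimize_model fuses transformer subgraphs (attention, layer norm, ...)
# for the given model_type; "unet" is the only choice Dipoorlet exposes.
optimizer = optimize_model(input_path, model_type="unet", opt_level=0)

# A >2GB optimized graph has to be written with external data, otherwise
# saving runs into the same protobuf 2GB limit.
optimizer.save_model_to_file(output_path, use_external_data_format=True)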
