Use turbine for gen_sharktank imports. #1904
Conversation
The vk failure seems gen-related.
Yep, needed to set the jobs to use IMPORTER=1, since that's where shark-turbine is added to the requirements.
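A minimal sketch of the gating pattern implied here, assuming the turbine import path is only reachable when IMPORTER=1 pulls shark-turbine into the requirements; the names and layout are illustrative, not the actual setup/CI wiring.

```python
# Illustrative only: shark-turbine is installed only by the importer
# requirements, so the turbine-based import path is gated on IMPORTER=1.
import os

USE_TURBINE = os.environ.get("IMPORTER", "0") == "1"

if USE_TURBINE:
    # Present only when the importer requirements were installed.
    from shark_turbine import aot
```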
Force-pushed from edc4361 to 3be95a7.
"windows not yet supported for torch.compile" blocks turbine gen from working in windows CI
Nice. I think I can use transformerbuilder for the dict_input models and leave the rest as aot.export, since it's so simple. When I have a chance, we'll gut SHARK/tank and just use the turbine tests for most of the way.
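For the simple single-input models, a hedged sketch of that export path, assuming shark_turbine.aot.export(module, *example_args) and ExportOutput.save_mlir from the turbine AOT API; the model and shapes are placeholders, and the dict_input models would still go through the transformer builder.

```python
import torch
from shark_turbine import aot

class Tiny(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(16, 4)

    def forward(self, x):
        return self.linear(x)

# Export a single-input module straight to MLIR, the artifact gen_sharktank uploads.
exported = aot.export(Tiny(), torch.randn(1, 16))
exported.save_mlir("tiny.mlir")
```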
Closing, as model testing/validation will live in SHARK-Turbine.
Needs the changes in nod-ai/SHARK-ModelDev#102 for a successful run on certain models.
TODO: turbine integration for models that accept multiple inputs.
TODO: complete sets of artifacts for the turbine path (needs hash generation for each .mlir; see the sketch below).
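One possible shape for the per-.mlir hash TODO: a content hash written alongside each generated artifact so stale uploads can be detected. Purely illustrative; the real artifact layout may differ.

```python
import hashlib
from pathlib import Path

def write_mlir_hash(mlir_path: str) -> str:
    # Hash the artifact bytes and drop a sidecar file next to it.
    digest = hashlib.sha256(Path(mlir_path).read_bytes()).hexdigest()
    Path(mlir_path + ".sha256").write_text(digest + "\n")
    return digest
```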