[Pytorch 2.10] Variant configuration #14
base: master
Conversation
Changes vs. wheelnext variants v0.0.3 and pytorch 2.9:
* Changed schema version to v0.0.3
* Changed `plugin-use` to `install-time`
* Triton versions bumped to 3.6.0 across all 3 backends (cuda, rocm, xpu)
* Bumped XPU dependency versions to match pytorch 2.10
* Added new XPU platforms supported in pytorch 2.10

See: wheelnext/pep_817_wheel_variants#103

Signed-off-by: Dmitry Rogozhkin <dmitry.v.rogozhkin@intel.com>
Force-pushed from ffce1d6 to 1b9ff33
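To make the change list above concrete, here is a minimal sketch of a check one might run after editing the config. The file name and the key names (`schema-version`, `plugin-use`, per-backend `triton` pins) are assumptions for illustration and may not match the actual PEP 817 / variantlib schema.

```python
# Minimal sketch (not the repo's actual tooling): load a hypothetical
# variant-config TOML and sanity-check the fields listed in the change list.
# The file name and key names are assumptions; the real schema may differ.
import tomllib

with open("torch-2.10-variants.toml", "rb") as f:  # hypothetical file name
    cfg = tomllib.load(f)

def check(label, got, want):
    status = "ok" if got == want else f"expected {want!r}, found {got!r}"
    print(f"{label}: {status}")

check("schema-version", cfg.get("schema-version"), "v0.0.3")
check("plugin-use", cfg.get("plugin-use"), "install-time")
for backend in ("cuda", "rocm", "xpu"):
    check(f"{backend}.triton", cfg.get(backend, {}).get("triton"), "3.6.0")
```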
I added the NVIDIA configuration. @atalman, please merge when you are ready and satisfied.
I am getting
This doesn't exist anymore. Update variantlib. You may need to run `uv lock --upgrade`.
I believe we need to bump a version here: wheelnext/variantlib#130
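As a quick follow-up to the re-lock suggestion above, a sketch of how one might confirm the locked environment actually picked up a new enough variantlib; the minimum version below is a placeholder, not the one tracked in wheelnext/variantlib#130.

```python
# Sketch: report which variantlib version the re-locked environment has.
# The minimum is a placeholder assumption, and the parsing assumes a plain
# X.Y.Z version string.
from importlib.metadata import PackageNotFoundError, version

MIN_VERSION = (0, 0, 3)  # placeholder minimum

try:
    installed = version("variantlib")
except PackageNotFoundError:
    raise SystemExit("variantlib is not installed; run `uv lock --upgrade` and re-sync")

installed_tuple = tuple(int(part) for part in installed.split(".")[:3])
status = "ok" if installed_tuple >= MIN_VERSION else "too old, bump the pin and re-lock"
print(f"variantlib {installed}: {status}")
```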
@DEKHTIARJonathan, looks like I can't merge this PR. Could you please give me rights to merge?
Signed-off-by: Dmitry Rogozhkin <dmitry.v.rogozhkin@intel.com>
This PR starts cooking the torch variant config for the upcoming 2.10. I (@dvrogozh) have done the common and xpu parts so far. The CUDA and ROCm parts are copied from the 2.9 config and must be revised (@DEKHTIARJonathan, @jithunnair-amd).
Changes vs. wheelnext variants v0.0.3 and pytorch 2.9:
* Changed `plugin-use` to `install-time`

See: wheelnext/pep_817_wheel_variants#103
CC: @DEKHTIARJonathan @atalman @jithunnair-amd @mgorny
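Since the CUDA and ROCm sections are copied from the 2.9 config and still need review, here is a small sketch of one way to flag values that came over verbatim; both file names are hypothetical.

```python
# Sketch: compare the copied CUDA/ROCm sections of a hypothetical 2.10 config
# against the 2.9 one and list values carried over unchanged that may still
# need revising. Both file names are assumptions for illustration.
import tomllib

def load(path):
    with open(path, "rb") as f:
        return tomllib.load(f)

old = load("torch-2.9-variants.toml")   # hypothetical file name
new = load("torch-2.10-variants.toml")  # hypothetical file name

for backend in ("cuda", "rocm"):
    old_section = old.get(backend, {})
    new_section = new.get(backend, {})
    carried_over = sorted(k for k, v in new_section.items() if old_section.get(k) == v)
    if carried_over:
        print(f"[{backend}] unchanged from the 2.9 config: {', '.join(carried_over)}")
```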