
fix: using te and fsdp leads to multiple device found error #1453

Merged: 1 commit into Lightning-AI:main on Nov 19, 2024

Conversation

@kshitij12345 (Collaborator) commented:

Fixes the error below.

As reported by the Mixology team, running

    NVFUSER_DISABLE=multidevice torchrun --standalone --max-restarts=0 --no-python --nproc-per-node=8 python /opt/pytorch/lightning-thunder/thunder/benchmarks/benchmark_litgpt.py --model_name Llama-3-8B --distributed_mode fsdp --shard_mode zero2 --compile inductor --checkpoint_activations False --low_precision_mode fp8-delayed-te --micro_batch_size 1 --bucketing_mode block

fails with

    RuntimeError: FSDP only supports single device modules but got params on {device(type='cuda', index=2), device(type='meta')}
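For context, here is a minimal, hypothetical sketch (not this PR's code) of the failure mode: FSDP validates that every parameter of a wrapped module lives on a single device, so a module that mixes a materialized CUDA parameter with a parameter still on the meta device, as the error set `{cuda, meta}` above indicates happened here, trips exactly this check. The module and file names below are illustrative only.

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

class MixedDeviceModule(nn.Module):
    """Module whose parameters span two devices: one real, one meta."""
    def __init__(self) -> None:
        super().__init__()
        self.materialized = nn.Linear(8, 8, device="cuda")    # real storage on the local GPU
        self.unmaterialized = nn.Linear(8, 8, device="meta")  # shape only, no storage

if __name__ == "__main__":
    # Launched via e.g.: torchrun --nproc-per-node=1 repro.py
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank())
    # Raises: RuntimeError: FSDP only supports single device modules but got
    # params on {device(type='cuda', index=0), device(type='meta')}
    model = FSDP(MixedDeviceModule())
```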

@kshitij12345 changed the title from "fix : te and fsdp leading multiple device found error" to "fix: using te and fsdp leads to multiple device found error" on Nov 19, 2024
@kshitij12345 marked this pull request as ready for review on Nov 19, 2024 at 10:36
@t-vi (Collaborator) left a comment:

Thank you @kshitij12345

@t-vi enabled auto-merge (squash) on Nov 19, 2024 at 10:58
@t-vi merged commit f206afa into Lightning-AI:main on Nov 19, 2024
44 checks passed
@kshitij12345 deleted the fix-te-fsdp-device-mismatch branch on Nov 19, 2024 at 11:24