
examples/huggingface failed · Issue #1115 · pytorch/PiPPy

Open

Description

@yaxan

When trying to run the examples, I always run into this error:

File "/home/ubuntu/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1512, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
No module named 'pippy.IR'

Activity

yaxan (Author) commented on May 13, 2024

From what I can tell, Hugging Face's accelerate has the following check:

if is_pippy_available():
    from pippy.IR import Pipe, PipeSplitWrapper, annotate_split_points
    from pippy.PipelineStage import PipelineStage

which doesn't reflect the PRs that made IR and PipelineStage private.
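
As a quick sanity check (a minimal sketch of my own, not part of Accelerate), you can ask the installed pippy package which submodules and public names it actually exposes:

# Minimal diagnostic: does the installed pippy still provide the old submodules,
# and which public names does the top-level package export?
import importlib
import importlib.util

print("pippy.IR found:", importlib.util.find_spec("pippy.IR") is not None)
print("pippy.PipelineStage found:", importlib.util.find_spec("pippy.PipelineStage") is not None)

pippy = importlib.import_module("pippy")
print(sorted(name for name in dir(pippy) if not name.startswith("_")))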

kwen2501 (Contributor) commented on May 13, 2024

Hi, thanks for the report.

Sorry, PiPPy is currently undergoing refactoring and clean-up, as we are planning to migrate it into PyTorch as torch.distributed.pipelining.
See here: https://github.com/pytorch/pytorch/tree/main/torch/distributed/pipelining
During the refactoring, some APIs have unfortunately been made private to follow PyTorch's more rigorous API rules.

If you are trying PiPPy through Hugging Face Accelerate, the current "stable" version is torchpippy's 0.2.0 binary release, which can be installed from PyPI: https://pypi.org/project/torchpippy/

pip install torchpippy

yaxan (Author) commented on May 13, 2024

> Hi, thanks for the report.
>
> Sorry, PiPPy is currently undergoing refactoring and clean-up, as we are planning to migrate it into PyTorch as torch.distributed.pipelining. See here: https://github.com/pytorch/pytorch/tree/main/torch/distributed/pipelining During the refactoring, some APIs have unfortunately been made private to follow PyTorch's more rigorous API rules.
>
> If you are trying PiPPy through Hugging Face Accelerate, the current "stable" version is torchpippy's 0.2.0 binary release, which can be installed from PyPI: https://pypi.org/project/torchpippy/
>
> pip install torchpippy

Do you foresee any issues with me opening a PR to update accelerate's inference.py file to

if is_pippy_available():
    from pippy import Pipe, annotate_split_points, SplitPoint, PipelineStage

?

kwen2501 (Contributor) commented on May 13, 2024

If the proposed line works for both the stable version and the nightly version (it seems to), then it should be fine.

We will also contact them about migrating to torch.distributed.pipelining.
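
For reference, a minimal sketch (an assumption on my part, not Accelerate's actual code) of a version-tolerant import that prefers the proposed top-level names and falls back to the older module layout quoted above:

# Prefer the new top-level imports; fall back to the older pippy.IR /
# pippy.PipelineStage layout used by accelerate's current check.
try:
    from pippy import Pipe, annotate_split_points, SplitPoint, PipelineStage
except ImportError:
    from pippy.IR import Pipe, PipeSplitWrapper, annotate_split_points
    from pippy.PipelineStage import PipelineStage
    SplitPoint = None  # older layout exposes PipeSplitWrapper instead of a SplitPoint enum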

yaxan (Author) commented on May 14, 2024

> If the proposed line works for both the stable version and the nightly version (it seems to), then it should be fine.
>
> We will also contact them about migrating to torch.distributed.pipelining.

I'd imagine that's a better long-term solution than what I was going to do.
A bit of a question for my own understanding: is adding PipelineStage to torch.distributed.pipelining still in the works?

kwen2501 (Contributor) commented on Jun 10, 2024

The migration is mostly done.
Our new documentation is here:
https://pytorch.org/docs/main/distributed.pipelining.html
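
For anyone landing here, a minimal toy sketch of the migrated API, based on the linked docs (the model, sizes, and argument names such as mb_args and split_spec are my own illustration and may differ between releases, so check the documentation for your version):

# Split a two-layer toy model into two pipeline stages with torch.distributed.pipelining.
import torch
from torch.distributed.pipelining import pipeline, SplitPoint

class Toy(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layer0 = torch.nn.Linear(16, 16)
        self.layer1 = torch.nn.Linear(16, 16)

    def forward(self, x):
        return self.layer1(self.layer0(x))

model = Toy()
example = torch.randn(4, 16)  # one example micro-batch

# Ask the tracer to begin a new stage at layer1.
pipe = pipeline(
    model,
    mb_args=(example,),
    split_spec={"layer1": SplitPoint.BEGINNING},
)
print(pipe)  # shows the resulting stages; stages are then run with a schedule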

yaxan (Author) commented on Jun 10, 2024

> The migration is mostly done. Our new documentation is here: https://pytorch.org/docs/main/distributed.pipelining.html

Sounds great. I'll open a PR for Accelerate following the release of PyTorch 2.4.

kwen2501 (Contributor) commented on Jun 11, 2024

Wow, looking forward to that. Thanks a lot!
Cc @muellerzr @SunMarc
