This repository has been archived by the owner on Nov 24, 2021. It is now read-only.

ERROR: Failed building wheel for tokenizers, even though setuptools-rust is installed #4

Open
RansSelected opened this issue Mar 3, 2021 · 4 comments


@RansSelected

Hi!

While trying to run pip install docly, I get this error:

  Building wheel for tokenizers (PEP 517) ... error
  ERROR: Command errored out with exit status 1:
   command: /Users/krisku/opt/miniconda3/envs/auto_doc/bin/python /Users/krisku/opt/miniconda3/envs/auto_doc/lib/python3.9/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/tmpb3b95crr
       cwd: /private/var/folders/mr/5nhwtm3n4dvfdkh3sf2xs41h0000gn/T/pip-install-syhlo38v/tokenizers_eb68e965c7654f6bbe50cefd6fff55af
  Complete output (36 lines):
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/tokenizers
  copying tokenizers/__init__.py -> build/lib/tokenizers
  creating build/lib/tokenizers/models
  copying tokenizers/models/__init__.py -> build/lib/tokenizers/models
  creating build/lib/tokenizers/decoders
  copying tokenizers/decoders/__init__.py -> build/lib/tokenizers/decoders
  creating build/lib/tokenizers/normalizers
  copying tokenizers/normalizers/__init__.py -> build/lib/tokenizers/normalizers
  creating build/lib/tokenizers/pre_tokenizers
  copying tokenizers/pre_tokenizers/__init__.py -> build/lib/tokenizers/pre_tokenizers
  creating build/lib/tokenizers/processors
  copying tokenizers/processors/__init__.py -> build/lib/tokenizers/processors
  creating build/lib/tokenizers/trainers
  copying tokenizers/trainers/__init__.py -> build/lib/tokenizers/trainers
  creating build/lib/tokenizers/implementations
  copying tokenizers/implementations/byte_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/sentencepiece_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/base_tokenizer.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/__init__.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/char_level_bpe.py -> build/lib/tokenizers/implementations
  copying tokenizers/implementations/bert_wordpiece.py -> build/lib/tokenizers/implementations
  copying tokenizers/__init__.pyi -> build/lib/tokenizers
  copying tokenizers/models/__init__.pyi -> build/lib/tokenizers/models
  copying tokenizers/decoders/__init__.pyi -> build/lib/tokenizers/decoders
  copying tokenizers/normalizers/__init__.pyi -> build/lib/tokenizers/normalizers
  copying tokenizers/pre_tokenizers/__init__.pyi -> build/lib/tokenizers/pre_tokenizers
  copying tokenizers/processors/__init__.pyi -> build/lib/tokenizers/processors
  copying tokenizers/trainers/__init__.pyi -> build/lib/tokenizers/trainers
  running build_ext
  running build_rust
  error: Can not find Rust compiler
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly

even though setuptools-rust is installed.

MacOS Catalina 10.15.7
Python 3.9.1
conda 4.9.2 + pip 21.0.1

I'd appreciate your help.
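
(Editor's note: the build stops at running build_rust with "error: Can not find Rust compiler". setuptools-rust is only the Python-side build glue; it still requires an actual Rust toolchain on the PATH of the shell that runs pip. A quick check, assuming a typical rustup-based setup, is:

  # Confirm the Rust toolchain is visible to the shell running pip;
  # installing setuptools-rust alone does not provide rustc or cargo.
  rustc --version
  cargo --version

If either command is not found, the tokenizers wheel build will fail exactly as shown above.)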

@rcshubhadeep
Contributor

rcshubhadeep commented Mar 3, 2021 via email

@AABur

AABur commented Apr 4, 2021

I have the same issue.

Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (PEP 517) ... error
  ERROR: Command errored out with exit status 1:
   command: /Users/aabur/.asdf/installs/python/3.9.2/bin/python /Users/aabur/.asdf/installs/python/3.9.2/lib/python3.9/site-packages/pip/_vendor/pep517/_in_process.py build_wheel /var/folders/9h/cmzj5d8j247_lyg2h4jsm7_80000gn/T/tmpll84tz3q
       cwd: /private/var/folders/9h/cmzj5d8j247_lyg2h4jsm7_80000gn/T/pip-install-9r0vuary/tokenizers_31b0f24eb85e40d58ab1580a2ccf79bd
  Complete output (226 lines):
...
...
...
  ----------------------------------------
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly

Rust and setuptools-rust are installed.

macOS Big Sur 11.2.3
Python 3.9.2
pip 21.0.1

@rcshubhadeep
Contributor

Could you guys resolve this?

@lcihaeon

lcihaeon commented Aug 3, 2021

I followed the instructions for the same issue from another repo and it worked for me; a sketch of the steps follows the link:

huggingface/transformers#2831 (comment)
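
(Editor's note: the exact content of the linked comment is not reproduced here, so the following is an assumption about the usual resolution it points to: install Rust with the standard rustup installer, make sure ~/.cargo/bin is on PATH, then retry the install.

  # Assumed steps, based on the standard rustup install for the
  # "Can not find Rust compiler" failure:
  curl https://sh.rustup.rs -sSf | sh
  source "$HOME/.cargo/env"          # puts ~/.cargo/bin on PATH for this shell
  pip install --upgrade setuptools-rust
  pip install docly                  # retries the tokenizers build with rustc available

Opening a fresh terminal after the rustup install achieves the same PATH update as sourcing ~/.cargo/env.)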
