Convert tokenizers with openvino_tokenizers #500
Conversation
@apaniukov will cover this functionality. This PR is just to initiate discussion.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
        save_model(ov_tokenizer, tokenizer_path)
        save_model(ov_detokenizer, detokenizer_path)
    except Exception as exception:
        print("[ WARNING ] OpenVINO tokenizer/detokenizer models couldn't be exported because of exception:", exception)
print("[ WARNING ] OpenVINO tokenizer/detokenizer models couldn't be exported because of exception:", exception) | |
logger.warning(OpenVINO tokenizer/detokenizer models couldn't be exported because of exception:", exception) |
Fixed.
    try:
        # TODO: Avoid loading the tokenizer again if loaded before
        tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
        tokenizer_export(tokenizer, output)
    except:
        print("[ WARNING ] Could not load tokenizer using specified model ID or path. OpenVINO tokenizer/detokenizer models won't be generated.")
The tokenizer is already loaded in the maybe_load_preprocessors function, so I recommend checking that function's result instead. Also, you should probably take the trust_remote_code parameter into account if you want to load the tokenizer explicitly.
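A minimal sketch of what reusing the already-loaded tokenizer could look like, assuming maybe_load_preprocessors returns a list of loaded preprocessor objects and accepts trust_remote_code, and reusing the tokenizer_export helper from this PR (import paths and exact signatures are assumptions, not lines copied from the diff):

    import logging

    from optimum.utils.save_utils import maybe_load_preprocessors  # import path assumed
    from transformers import PreTrainedTokenizerBase

    logger = logging.getLogger(__name__)

    # model_name_or_path, trust_remote_code and output come from the surrounding export code.
    preprocessors = maybe_load_preprocessors(model_name_or_path, trust_remote_code=trust_remote_code)

    # Reuse the tokenizer that was already loaded instead of calling
    # AutoTokenizer.from_pretrained a second time.
    tokenizer = next((p for p in preprocessors if isinstance(p, PreTrainedTokenizerBase)), None)

    if tokenizer is not None:
        tokenizer_export(tokenizer, output)
    else:
        logger.warning(
            "Could not load tokenizer using specified model ID or path. "
            "OpenVINO tokenizer/detokenizer models won't be generated."
        )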
Reused the tokenizer from the maybe_load_preprocessors result. But a new problem is that there are two identical tokenizers - one from AutoTokenizer and one from AutoProcessor. I haven't figured out how to deduplicate them yet.
Here is the PR: slyalin#2
@@ -46,6 +47,24 @@
logger = logging.getLogger(__name__)


def tokenizer_export(
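The hunk is truncated in this view. Pieced together from the excerpts quoted earlier in the thread, tokenizer_export presumably looks roughly like the sketch below; the convert_tokenizer import, the with_detokenizer flag, and the output file names are assumptions based on the openvino_tokenizers package rather than lines copied from the diff:

    import logging
    from pathlib import Path

    from openvino import save_model
    from openvino_tokenizers import convert_tokenizer  # import path assumed
    from transformers import PreTrainedTokenizerBase

    logger = logging.getLogger(__name__)


    def tokenizer_export(tokenizer: PreTrainedTokenizerBase, output: Path):
        try:
            # Convert the Hugging Face tokenizer into OpenVINO tokenizer and detokenizer models.
            ov_tokenizer, ov_detokenizer = convert_tokenizer(tokenizer, with_detokenizer=True)
            tokenizer_path = output / "openvino_tokenizer.xml"      # file name assumed
            detokenizer_path = output / "openvino_detokenizer.xml"  # file name assumed
            save_model(ov_tokenizer, str(tokenizer_path))
            save_model(ov_detokenizer, str(detokenizer_path))
        except Exception as exception:
            logger.warning(
                "OpenVINO tokenizer/detokenizer models couldn't be exported because of exception: %s",
                exception,
            )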
I think it should be in the convert.py file together with the other conversion functions.
Moved.
Closing in favor of #513
What does this PR do?
Exports the tokenizer and detokenizer as OpenVINO models using openvino_tokenizers from https://github.com/openvinotoolkit/openvino_contrib/tree/master/modules/custom_operations/user_ie_extensions/tokenizer/python. Enabled by default as part of the optimum-cli export openvino command-line tool. Compatible with https://github.com/openvinotoolkit/openvino.genai/tree/master/text_generation/causal_lm/cpp.
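For context, a minimal sketch of how the exported tokenizer model could be consumed from Python once the export has produced it; the file name, the output tensor names, and the assumption that importing openvino_tokenizers registers the custom string operations come from the openvino_tokenizers package, not from this diff:

    from openvino import Core

    import openvino_tokenizers  # noqa: F401 -- importing is assumed to register the tokenizer operations

    core = Core()

    # "openvino_tokenizer.xml" is assumed to sit next to the model exported by optimum-cli.
    compiled_tokenizer = core.compile_model("openvino_tokenizer.xml", "CPU")

    # The converted tokenizer takes a batch of strings and is assumed to expose
    # "input_ids" and "attention_mask" outputs, mirroring the Hugging Face tokenizer.
    result = compiled_tokenizer(["Why is the Sun yellow?"])
    print(result["input_ids"])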
Before submitting