Stop sequences fail for some sequences #48

@joehoover

Description

Observed Behavior

Some stop sequence inputs (e.g. `"}`) trigger an error:

```
Prediction failed.

E2102 TritonTokenizerError: Tokenizer error: in ensemble 'ensemble', Failed to process the request(s) for model instance 'preprocessing_0_126', message: ValueError: To standardize tokenizer behavior, we prepend '!' to the string representation of each stop sequence. We then strip the corresponding first token from the stop sequence IDs. However, the first token of the stop sequence IDs was not '{arbitrary_start_sequence_id}', which suggests there is a problem with the tokenizer that you are using.

At:
  /src/triton_model_repo/preprocessing/1/model.py(287): _to_word_list_format
  /src/triton_model_repo/preprocessing/1/model.py(182): execute
```
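For context, here is a minimal sketch of the failure mode the error message describes. It uses a toy greedy longest-match tokenizer with a hand-made vocabulary, not the real Llama tokenizer, and the `stop_sequence_ids` helper is a hypothetical stand-in for the preprocessing logic: the trick of tokenizing `'!' + stop_seq` and stripping the first token breaks when a BPE-style merge fuses `!` with the first character of the stop sequence, so the first token ID is no longer the ID of a lone `!`.

```python
# Toy illustration (NOT the real Llama tokenizer or Triton preprocessing code)
# of why prepending '!' and stripping the first token can fail.
VOCAB = {'!': 0, '"': 1, '}': 2, '!"': 3, 'stop': 4}

def tokenize(text):
    """Greedy longest-match tokenization over the toy VOCAB."""
    ids, i = [], 0
    while i < len(text):
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                ids.append(VOCAB[piece])
                i += length
                break
        else:
            raise ValueError(f"untokenizable input at {i}: {text[i:]!r}")
    return ids

def stop_sequence_ids(stop_seq):
    """Hypothetical stand-in for the preprocessing trick: tokenize
    '!' + stop_seq, then strip the leading '!' token."""
    exclamation_id = tokenize('!')[0]
    ids = tokenize('!' + stop_seq)
    if ids[0] != exclamation_id:
        # Analogous condition to the one raising TritonTokenizerError above:
        # '!' merged with the start of the stop sequence into one token.
        raise ValueError("first token is not '!'")
    return ids[1:]

print(stop_sequence_ids('stop'))  # '!' stays a separate token, so this works
try:
    stop_sequence_ids('"}')       # '!"' merges into a single token, so this fails
except ValueError as e:
    print("error:", e)
```

Under this toy vocabulary, `stop` works because `!` tokenizes on its own, while `"}` fails because the merge `!"` swallows the sentinel, which mirrors the behavior reported above.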

Expected Behavior

All stop sequence inputs should be handled and applied, such that generation stops when those sequences are encountered.

Reproduce

This request against llama-3-70b triggers the error reliably:

https://replicate.com/p/1w0ht542kdrgj0cg7c2vpkr4a0

This request ran against:
