[Bug]: logit_bias not working #7793

Closed
tengerye opened this issue Jan 15, 2025 · 1 comment
Labels
bug (Something isn't working), mlops, user request

Comments

tengerye commented Jan 15, 2025

What happened?

If I add logit_bias to the completion call, I get the error below; removing it lets the call succeed:

    from litellm import completion

    A8_SETUP = {
        'api_base': 'http://IP_ADDRESS:8082/v1',
        'model': 'openai/meta-llama/Llama-3.1-8B-Instruct',
        'api_key': ".",
        'max_tokens': 512,
    }


    messages = [
        { "content": "Hello, how are you?","role": "user"}
    ]

    response = completion(
        messages=messages,
        # logit_bias={50256: -100},  # Uncommenting this line reproduces the error; the call succeeds without it.
        **A8_SETUP
    )

    r = response.choices[0].message.content
    print(f"Response from second server: \n{r}")

Relevant log output

Traceback (most recent call last):
  File "/home/ubuntu/projects/litellm/tests/a8client_tests/test_logit_bias.py", line 31, in <module>
    test_server2()
  File "/home/ubuntu/projects/litellm/tests/a8client_tests/test_logit_bias.py", line 20, in test_server2
    response = completion(
  File "/opt/conda/lib/python3.10/site-packages/litellm/utils.py", line 1030, in wrapper
    raise e
  File "/opt/conda/lib/python3.10/site-packages/litellm/utils.py", line 906, in wrapper
    result = original_function(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/litellm/main.py", line 2967, in completion
    raise exception_type(
  File "/opt/conda/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2189, in exception_type
    raise e
  File "/opt/conda/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 414, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Failed to deserialize the JSON body into the target type: logit_bias: invalid type: map, expected a sequence at line 1 column 126
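
The deserialization error comes from the server behind the custom api_base rather than from LiteLLM itself: that server's schema apparently expects logit_bias as a sequence, while the OpenAI spec (and therefore LiteLLM) sends it as a map. Until the backend accepts the map form, one possible mitigation is to let LiteLLM drop parameters the provider cannot handle. This is a sketch under the assumption that drop_params applies to logit_bias for this custom openai/ endpoint; it may simply forward the param unchanged:

    from litellm import completion

    response = completion(
        messages=messages,
        logit_bias={50256: -100},
        # Assumption: drop_params strips params the provider is not known to support.
        # Verify the behavior against your LiteLLM version before relying on it.
        drop_params=True,
        **A8_SETUP,
    )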

Are you a ML Ops Team?

Yes

What LiteLLM version are you on?

fe60a38

Twitter / LinkedIn details

No response

tengerye added the bug (Something isn't working) label on Jan 15, 2025
tengerye (Author) commented:

It does not work for any provider.
