What happened?
If I add logit_bias to the completion call, it fails with the error below; removing it makes the call succeed:
from litellm import completion

A8_SETUP = {
    'api_base': 'http://IP_ADDRESS:8082/v1',
    'model': 'openai/meta-llama/Llama-3.1-8B-Instruct',
    'api_key': ".",
    'max_tokens': 512,
}

messages = [
    {"content": "Hello, how are you?", "role": "user"}
]

response = completion(
    messages=messages,
    # logit_bias={50256: -100},  # Uncommenting this line triggers the error below; leaving it out makes the call succeed.
    **A8_SETUP
)

r = response.choices[0].message.content
print(f"Response from second server: \n{r}")
Relevant log output
Traceback (most recent call last):
  File "/home/ubuntu/projects/litellm/tests/a8client_tests/test_logit_bias.py", line 31, in <module>
    test_server2()
  File "/home/ubuntu/projects/litellm/tests/a8client_tests/test_logit_bias.py", line 20, in test_server2
    response = completion(
  File "/opt/conda/lib/python3.10/site-packages/litellm/utils.py", line 1030, in wrapper
    raise e
  File "/opt/conda/lib/python3.10/site-packages/litellm/utils.py", line 906, in wrapper
    result = original_function(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/litellm/main.py", line 2967, in completion
    raise exception_type(
  File "/opt/conda/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2189, in exception_type
    raise e
  File "/opt/conda/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 414, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Failed to deserialize the JSON body into the target type: logit_bias: invalid type: map, expected a sequence at line 1 column 126
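For context, the deserialization error suggests the backend server expects logit_bias as a sequence, while the OpenAI spec (which LiteLLM follows) sends it as a map of token ID to bias. A minimal sketch of the two request-body shapes; the sequence layout shown ([token_id, bias] pairs) is only an assumption inferred from the error message, not confirmed against the server:

# OpenAI-style shape that LiteLLM forwards: a map of token_id -> bias.
openai_style_body = {"logit_bias": {50256: -100}}

# Shape the server's error appears to ask for: a sequence.
# The exact element layout ([token_id, bias] pairs) is a guess, not confirmed.
sequence_style_body = {"logit_bias": [[50256, -100]]}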
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
fe60a38
Twitter / LinkedIn details
No response