Did you check docs and existing issues?
I have updated the package to the latest version before submitting this issue
(optional) I have used the develop branch
I have searched the existing issues of NeMo-Guardrails
Python version (python --version)
Python 3.10.11
Operating system/version
Microsoft Windows 11 Enterprise
NeMo-Guardrails version (if you must use a specific version and not the latest)
0.11.1
Describe the bug
Trying to use Azure OpenAI o3-mini (or any o1 model) causes a bad request error.
Steps To Reproduce
Put this model configuration into config.yml:

models:
  - type: main
    engine: azure
    model: o3-mini
    api_version: 2024-12-01-preview
    ...
Start a chat with nemoguardrails chat --config your/path --verbose --debug-level=DEBUG
Ask anything that uses the main model.
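The same failure also reproduces from the Python API instead of the CLI (a minimal sketch; the config directory path and the prompt are placeholders):

from nemoguardrails import LLMRails, RailsConfig

# "./config" stands in for the directory containing the config.yml above.
config = RailsConfig.from_path("./config")
rails = LLMRails(config)

# Any message that reaches the main model triggers the 400 error shown below.
response = rails.generate(messages=[{"role": "user", "content": "Hello"}])
print(response)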
Expected Behavior
Receive a chat completion without error.
Actual Behavior
HTTPStatusError: Client error '400 Bad Request' for url
'https://...openai.azure.com/openai/.../o3-mini/completions?api-version=...'
Stack trace ends with nemoguardrails.actions.llm.utils.LLMCallException: LLM Call Exception: Error code: 400 - {'error': {'code': 'OperationNotSupported', 'message': 'The completion operation does not work with the specified model, o3-mini. Please choose different model and try again. You can learn more about which models can be used with each operation here: https://go.microsoft.com/fwlink/?linkid=2197993.'}}
The request URL shows the call being routed to the legacy completions endpoint rather than chat completions, which the o-series models do not support.
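For comparison, the behavior can be isolated outside of NeMo-Guardrails with the openai SDK (a hedged diagnostic sketch; the endpoint, API key, and deployment name are placeholders, and the deployment name is assumed to match the model name):

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-12-01-preview",
)

# Chat completions is the operation the o-series models accept.
chat = client.chat.completions.create(
    model="o3-mini",  # deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(chat.choices[0].message.content)

# The legacy completions operation is the one the failing request above was
# routed to; calling it directly returns the same 400 OperationNotSupported.
client.completions.create(model="o3-mini", prompt="Hello")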