
bug: BadRequestError as gpt-5 only supports temperature=1 #1512

@kevin-yauris

Description


Did you check the docs?

  • I have read all the NeMo-Guardrails docs

Is your feature request related to a problem? Please describe.

Hi, I am trying to use the latest model for the guardrails, since I assume a better LLM means better performance.
I am trying to use the openai engine with gpt-5-mini:

DEFAULT_CONFIG_DICT = {"models": [{"type": "main", "engine": "openai", "model": "gpt-5-mini"}]}

but I keep getting this error:

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.001 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
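
A minimal sketch of the kind of override I was hoping would work, assuming the model's parameters block is forwarded to the underlying LangChain ChatOpenAI constructor (I have not confirmed this, and it may not help if the internal guardrail calls still pin their own temperature):

# Hypothetical workaround sketch (untested): force temperature=1 via the model's
# parameters block so the OpenAI API no longer rejects the request.
DEFAULT_CONFIG_DICT = {
    "models": [
        {
            "type": "main",
            "engine": "openai",
            "model": "gpt-5-mini",
            "parameters": {"temperature": 1},
        }
    ]
}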

Since I cannot make it work with the built-in engine, I tried a custom engine that calls the needed model behind the scenes, but the performance is really slow (on average 30 seconds) compared to the built-in engine (under 5 seconds).
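
For reference, the custom engine roughly follows the custom LLM provider pattern; this is a simplified, illustrative sketch (the class and provider names are mine, and the actual API call is omitted). One guess about the latency gap is that only implementing the synchronous _call keeps the internal guardrail calls from running concurrently, but I have not verified that.

from typing import Any, List, Optional

from langchain_core.language_models.llms import LLM
from nemoguardrails.llm.providers import register_llm_provider


class GPT5MiniLLM(LLM):
    """Minimal wrapper that always calls gpt-5-mini with temperature=1."""

    @property
    def _llm_type(self) -> str:
        return "gpt5_mini_custom"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # The actual gpt-5-mini API call (with temperature=1) goes here.
        raise NotImplementedError


register_llm_provider("gpt5_mini_custom", GPT5MiniLLM)

In the config above, the engine value then becomes "gpt5_mini_custom".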

Is there any way to use gpt-5-mini, or any other recent model, with the NeMo Guardrails built-in engine?
If not, is the custom engine expected to be this slow, and how can it be optimized?

Thank you.

Describe the solution you'd like

To be able to use gpt-5-mini without customization.

Describe alternatives you've considered

A custom model engine.

Additional context
