Integrate CustomLLM with hallucination rail #584
Unanswered

sivakumarl asked this question in Q&A
            Replies: 1 comment
-
The hallucination guardrail has code that specifically limits it to working with OpenAI; see line 69 in https://github.com/NVIDIA/NeMo-Guardrails/blob/develop/nemoguardrails/library/hallucination/actions.py. I made a copy of this guardrail and commented that section out so that I can use it with Azure OpenAI GPT-4o, and it works perfectly. I suspect you could do the same, but you may also have to modify other parts of the code to ensure various LLM-specific parameters are set correctly.
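To illustrate the kind of change being described, here is a hedged sketch of an engine gate and how a copied rail might relax it. The function name, parameter, and the exact class names checked are illustrative assumptions, not the upstream code; consult the referenced actions.py for the real check before patching.

```python
# Illustrative sketch only: the stock hallucination action refuses to run
# unless the LLM is an OpenAI engine. A copied rail can relax this check.
def engine_is_supported(llm, allow_any_engine=False):
    """Return True if the hallucination rail should run for this LLM.

    `allow_any_engine` is a hypothetical switch standing in for
    "comment that section out" in the workaround described above.
    """
    if allow_any_engine:
        # The workaround: skip the vendor check so Azure OpenAI, TGI, etc.
        # can be used. LLM-specific parameters (temperature, number of
        # extra generations) may still need adjusting per engine.
        return True
    # Upstream restricts the rail to OpenAI-backed LLMs, roughly like this:
    return type(llm).__name__ in ("OpenAI", "ChatOpenAI")
```

With the switch enabled, the rail proceeds for any engine; whether the rest of the action behaves well then depends on the engine honoring the sampling parameters the rail sets.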
-
Hi everyone,
We are trying to integrate CustomLLM (TGI serving Mistral 7B) with the hallucination rail, but it looks like the current library doesn't support this. Could you please help me determine whether it is feasible to integrate CustomLLM?
Regards,
Siva Kumar.
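For context on the setup being asked about: NeMo Guardrails selects its main LLM through the `models` section of config.yml, so a self-hosted TGI endpoint is typically wired in there. The engine name and parameter fields below are assumptions for illustration; check the NeMo Guardrails configuration documentation for the exact schema your version supports.

```yaml
# Hypothetical sketch of pointing the main model at a TGI server.
models:
  - type: main
    engine: huggingface_endpoint   # assumed engine name, verify against docs
    parameters:
      endpoint_url: http://localhost:8080   # your TGI server
```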