Joint Tabular/Semantic QA over Tesla 10K with AzureChatOpenAI #8073
Replies: 1 comment
-
🤖 Hi,

The error you're encountering seems to be related to the output format of the AzureChatOpenAI model. LlamaIndex expects the output to be a JSON string, and if it's not in that format, it can lead to the ValueError you're seeing. To resolve this, you can modify the chat method so that the function output is serialized to a JSON string:

import json

class OpenAIAgent:
    # ... other methods ...

    def chat(self, message):
        # ... code to process the message and call the function ...

        # Convert the output to a JSON string
        output_json = json.dumps(output)
        return output_json

This will ensure that the output of the function call is a JSON string, which should resolve the ValueError you're encountering. Also, make sure that the environment variables required by the AzureOpenAI class are set correctly.
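For example, a minimal sketch of setting those variables before constructing the model (the exact names depend on your openai/langchain versions; the values below are placeholders, not taken from the original thread):

import os

# Placeholder values; replace with your own Azure OpenAI resource details.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
os.environ["OPENAI_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-key>"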
I hope this helps! Let me know if you have any other questions.
-
Hi,
I am trying to replicate the code from https://docs.llamaindex.ai/en/latest/examples/query_engine/sec_tables/tesla_10q_table.html#joint-tabular-semantic-qa-over-tesla-10k, but using AzureChatOpenAI as below:
from langchain.chat_models import AzureChatOpenAI
from llama_index.llms.langchain import LangChainLLM
from dotenv import load_dotenv
import os

load_dotenv()

model = AzureChatOpenAI(
    temperature=0.2,
    deployment_name=os.getenv("AZURE_OPENAI_DEPLOYMENT"),
    verbose=True,
)
llm = LangChainLLM(llm=model)
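As a quick sanity check (a minimal sketch, using placeholder prompt text), the wrapped model can be queried directly before any documents are parsed; complete() is part of LlamaIndex's standard LLM interface:

# Confirm the Azure deployment responds through the LangChainLLM wrapper.
print(llm.complete("Reply with the word OK."))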
And passing the llm to UnstructuredElementNodeParser:

from llama_index.core.node_parser import UnstructuredElementNodeParser

node_obj = UnstructuredElementNodeParser(llm=llm)
raw_nodes_2021 = node_obj.get_nodes_from_documents(docs_2021)
But this ends up with: ValueError: Could not extract json string from output: The answer cannot be provided as there is no table mentioned in the context information
Any suggestions on how I can effectively use AzureChatOpenAI with this?