Replies: 1 comment 1 reply
From our experience building MuK AI Studio (muk_ai), which aligns with what you have planned, we've found that avoiding frameworks often leads to unnecessary complexity and redundant work. In our development we started with Gemini and later integrated ChatGPT. Along the way we ran into challenges such as standardizing AI provider interfaces and managing execution loops, exactly the kinds of problems that existing frameworks (LangChain, in our case) have already solved effectively.

That said, muk_ai is not as modular as what you envision. Our focus was not on supporting multiple frameworks or making them highly modular, but on implementing the functionality our customers needed in a structured way. For example, adding tools for execution (not complete):

```python
from langchain_core.tools import StructuredTool
from langchain.agents import AgentExecutor, create_tool_calling_agent

tools = []
# we loop through all available tools here (we implemented the tools as an Odoo model)
for record in tool_records:
    tools.append(StructuredTool.from_function(
        func=tool_fn,
        name=record.type,
        description=record.description,
    ))

# and pass them to an executor (without needing to know which model will execute them)
agent = create_tool_calling_agent(model, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=False, stream_runnable=False)
```
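One detail worth watching when building one tool callable per Odoo record in a loop is Python's late-binding closures: if the function body references the loop variable directly, every tool ends up bound to the last record. A small factory function avoids this. The sketch below is plain Python with no LangChain dependency; `Record` and the returned `tool_fn` bodies are hypothetical stand-ins for the real Odoo model and tool logic:

```python
from dataclasses import dataclass

@dataclass
class Record:
    # hypothetical stand-in for the Odoo tool model
    type: str
    description: str

def make_tool_fn(record):
    # factory: each closure captures its own `record`,
    # avoiding the late-binding pitfall of loop variables
    def tool_fn(query: str) -> str:
        return f"{record.type} handled: {query}"
    return tool_fn

records = [Record("search", "Search records"), Record("create", "Create a record")]
tool_fns = [make_tool_fn(r) for r in records]
```

Each `tool_fn` produced this way can then be passed as `func=` to `StructuredTool.from_function` as in the snippet above.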
---
There are multiple ways to structure AI interactions in Odoo, from LangChain-based agents and LlamaIndex retrieval to PydanticAI for validation.
My point of view is that we should not tie the Odoo LLM framework to a single AI framework, but instead support each of them via extension modules.