A collection of configuration documents (JSON or natural language prose) compatible with most large language models (LLMs) that support external function calling (e.g., OpenAI, Mistral). Each document describes the behavior of an AI Agent (Assistant).
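The internal schema of these documents is defined by OPAL and is not reproduced here; purely as an illustration, the sketch below shows the kind of JSON structure such a document might carry, expressed as a Python dictionary. Every field name in it (`name`, `instructions`, `functions`, and so on) is a hypothetical placeholder rather than part of the actual OPAL format.

```python
import json

# Hypothetical sketch of an assistant configuration document.
# Field names are illustrative only; the actual OPAL schema may differ.
assistant_config = {
    "name": "Example Query Assistant",
    "description": "Answers questions by calling OpenAPI-described query services.",
    "instructions": (
        "Translate user questions into SQL or SPARQL, call the matching "
        "function, and summarize the results in plain language."
    ),
    "functions": [
        {
            "name": "execute_sparql",  # hypothetical function name
            "description": "Run a SPARQL query against a SPARQL endpoint.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "SPARQL query text"}
                },
                "required": ["query"],
            },
        }
    ],
}

print(json.dumps(assistant_config, indent=2))
```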
To enable conversational interaction, via LLMs, with functionality encapsulated in OpenAPI-compliant web services.
These configurations are created using the OpenLink AI Layer (OPAL) Personal Assistant and deployed via webpage widgets, the OpenAI CustomGPT store, or as OpenAPI-compliant web services whose JSON or YAML description documents can be consumed in other environments (e.g., LangChain and LlamaIndex).
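As a rough sketch of how another environment might consume such a deployment, the snippet below fetches an OpenAPI description document and lists the operations it exposes. The URL is a placeholder; frameworks such as LangChain and LlamaIndex ship their own higher-level loaders for this step.

```python
import requests
import yaml  # pip install pyyaml

# Hypothetical URL of a deployed assistant's OpenAPI description document.
SPEC_URL = "https://example.com/assistant/openapi.yaml"

spec = yaml.safe_load(requests.get(SPEC_URL, timeout=30).text)

# List the operations the service exposes; a framework like LangChain or
# LlamaIndex would typically turn these into callable tools.
for path, item in spec.get("paths", {}).items():
    for method in ("get", "post", "put", "delete", "patch"):
        operation = item.get(method)
        if operation:
            print(f"{method.upper():6} {path}  ->  {operation.get('operationId', '(no id)')}")
```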
Enables the execution of SQL, SPARQL, or GraphQL queries directly from various LLMs that support external function integration (e.g., OpenAI and Mistral).
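A minimal sketch of this pattern using the OpenAI Python SDK's tool-calling interface is shown below. The `execute_sparql` function and its parameters are hypothetical stand-ins for whatever functions a given assistant configuration actually exposes.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical tool definition: lets the model request execution of a SPARQL query.
tools = [
    {
        "type": "function",
        "function": {
            "name": "execute_sparql",
            "description": "Execute a SPARQL query and return the result bindings.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "The SPARQL query to run."}
                },
                "required": ["query"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "List 5 products and their prices."}],
    tools=tools,
)

# If the model decided to call the function, the generated query appears here.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```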
Provides expert product assistance for our high-performance Virtuoso Data Spaces management platform. For instance, a user might instruct an LLM with a prompt like, "I want to buy the cheapest Virtuoso online offer." The assistant understands and executes this request without requiring the user to describe how the underlying steps are handled: translating natural language, building context (including document lookups and database queries), and executing queries in declarative languages such as SQL or SPARQL.
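The query-execution half of that flow can be sketched as follows: once the model emits a tool call carrying a generated SPARQL query (as in the previous example), the application runs the query against a SPARQL endpoint over the standard SPARQL Protocol and returns the bindings to the model as the tool result. The endpoint URL and the example query below are placeholders.

```python
import json
import requests

# Hypothetical SPARQL endpoint; a real deployment would point at its own data space.
SPARQL_ENDPOINT = "https://example.com/sparql"

def run_sparql(query: str) -> dict:
    """Execute a SPARQL query via the standard SPARQL Protocol and return JSON results."""
    resp = requests.post(
        SPARQL_ENDPOINT,
        data={"query": query},
        headers={"Accept": "application/sparql-results+json"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

# Placeholder query standing in for one the model generated from the user's prompt.
query = (
    "SELECT ?offer ?price WHERE { ?offer <http://schema.org/price> ?price } "
    "ORDER BY ASC(?price) LIMIT 1"
)
results = run_sparql(query)

# The bindings would be sent back to the model as the tool-call result so it can
# answer the user in plain language.
print(json.dumps(results["results"]["bindings"], indent=2))
```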
Provides expert product assistance for our high-performance data access drivers.