A comprehensive Next.js project showcasing how to integrate the Tiptap AI Agent extension with a custom backend and AI model provider. This repository contains 12 complete demos demonstrating different integration patterns and capabilities. It follows the Custom LLM Integration guides available in our docs.
```shell
npx degit https://github.com/ueberdosis/ai-agent-custom-llm-demos
```
This project contains Tiptap Pro extensions that are published in Tiptap's private npm registry. To install them, configure your `.npmrc` file with the necessary authentication details. Follow the private registry guide to set it up.
```shell
npm install
```
Create a `.env.local` file:

```shell
OPENAI_API_KEY=your_openai_api_key_here
# Optional, only needed for the Anthropic Claude Messages API
ANTHROPIC_API_KEY=your_anthropic_api_key_here
```
```shell
npm run dev
```
Visit http://localhost:3000 to see all available demos.
This repository contains three categories of demos, each with implementations for all four supported adapters:
Simple implementations showing the fundamental integration patterns:
- Vercel AI SDK
- OpenAI Responses API
- OpenAI Chat Completions API
- Anthropic Claude Messages API
Demonstrations of custom client-side tools that interact with the editor:
- Vercel AI SDK + Replace Tool
- OpenAI Responses API + Replace Tool
- OpenAI Chat Completions API + Replace Tool
- Anthropic Claude Messages API + Replace Tool
Examples of server-side tools that fetch external data:
- Vercel AI SDK + Weather Tool
- OpenAI Responses API + Weather Tool
- OpenAI Chat Completions API + Weather Tool
- Anthropic Claude Messages API + Weather Tool
Each demo follows a consistent three-layer architecture:
```mermaid
flowchart LR
    A["Frontend<br/>(React + Tiptap)"] --> B["Backend Services<br/>(Next.js Server Actions)"]
    B --> C["LLM Provider<br/>(OpenAI/Anthropic/etc.)"]
```
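A minimal sketch of this three-layer flow, with the provider call mocked so the example is self-contained. All names here (`chatAction`, `callProvider`, `sendFromFrontend`) are illustrative, not the demos' actual exports:

```typescript
// Illustrative sketch of the three-layer architecture; the LLM provider
// is mocked, and all names are hypothetical, not the demo's real code.
type ChatMessage = { role: "user" | "assistant"; content: string };

// Layer 3: the LLM provider (mocked here instead of OpenAI/Anthropic).
async function callProvider(messages: ChatMessage[]): Promise<string> {
  const users = messages.filter((m) => m.role === "user");
  const last = users[users.length - 1];
  return `Echo: ${last ? last.content : ""}`;
}

// Layer 2: a Next.js Server Action would look roughly like this
// (the real project adds the "use server" directive).
async function chatAction(messages: ChatMessage[]): Promise<ChatMessage> {
  const content = await callProvider(messages);
  return { role: "assistant", content };
}

// Layer 1: the frontend (React + Tiptap) appends the reply to its state.
async function sendFromFrontend(history: ChatMessage[], text: string) {
  const next = [...history, { role: "user" as const, content: text }];
  const reply = await chatAction(next);
  return [...next, reply];
}
```

In the actual demos, layer 2 also translates between the AI Agent toolkit's message format and each provider's request shape.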
```text
├── app/                                  # Next.js pages
│   ├── basic/                            # Basic demos (getting started)
│   │   ├── openai-responses-api/         # OpenAI Responses API demo
│   │   ├── openai-chat-completions-api/  # OpenAI Chat Completions demo
│   │   ├── vercel-ai-sdk/                # Vercel AI SDK demo
│   │   └── anthropic-messages/           # Anthropic Claude demo
│   ├── client-side-tools/                # Client-side tools demos
│   │   ├── openai-responses-api/         # With custom replace tool
│   │   ├── openai-chat-completions-api/  # With custom replace tool
│   │   ├── vercel-ai-sdk/                # With custom replace tool
│   │   └── anthropic-messages/           # With custom replace tool
│   └── server-side-tools/                # Server-side tools demos
│       ├── openai-responses-api/         # With weather tool
│       ├── openai-chat-completions-api/  # With weather tool
│       ├── vercel-ai-sdk/                # With weather tool
│       └── anthropic-messages/           # With weather tool
└── src/
    ├── services/                         # Next.js Server Actions
    │   ├── basic/                        # Basic demo services
    │   ├── client-side-tools/            # Client-side tools services
    │   └── server-side-tools/            # Server-side tools services
    └── view/                             # React components
        ├── basic/common/                 # Basic demo UI
        ├── client-side-tools/common/     # Client-side tools UI
        └── server-side-tools/common/     # Server-side tools UI
```
These demos show the fundamental integration patterns for each adapter:
- `/basic/openai-responses-api` - OpenAI Responses API integration
- `/basic/openai-chat-completions-api` - OpenAI Chat Completions integration
- `/basic/vercel-ai-sdk` - Vercel AI SDK integration
- `/basic/anthropic-messages` - Anthropic Claude Messages integration
- Service: `src/services/basic/[adapter].ts`
- Route: `app/basic/[adapter]/page.tsx`
- Frontend: `src/view/basic/common/index.jsx`
- How to set up the AI Agent toolkit with different adapters
- How to format chat messages for each LLM provider
- How to handle API responses and errors
- Basic system prompt configuration
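Message formatting is the part that varies most between providers. A rough sketch of the kind of mapping each adapter performs, assuming a hypothetical generic message type (`DemoMessage` and both helper names are mine, not the demos' actual code):

```typescript
// Hypothetical generic message shape used only for this sketch.
type DemoMessage = { type: "user" | "ai"; text: string };

// OpenAI Chat Completions expects { role: "user" | "assistant", content }.
function toOpenAIMessages(history: DemoMessage[]) {
  return history.map((m) => ({
    role: m.type === "ai" ? ("assistant" as const) : ("user" as const),
    content: m.text,
  }));
}

// Anthropic's Messages API uses the same role names but takes the
// system prompt as a separate top-level field, not as a message.
function toAnthropicRequest(system: string, history: DemoMessage[]) {
  return {
    system,
    messages: history.map((m) => ({
      role: m.type === "ai" ? ("assistant" as const) : ("user" as const),
      content: m.text,
    })),
  };
}
```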
These demos show how to implement custom client-side tools that interact with the editor:
- `/client-side-tools/openai-responses-api` - With custom replace tool
- `/client-side-tools/openai-chat-completions-api` - With custom replace tool
- `/client-side-tools/vercel-ai-sdk` - With custom replace tool
- `/client-side-tools/anthropic-messages` - With custom replace tool
- Service: `src/services/client-side-tools/[adapter].ts`
- Route: `app/client-side-tools/[adapter]/page.tsx`
- Frontend: `src/view/client-side-tools/common/app-with-replace-tool.jsx`
- Tool Handler: `src/view/client-side-tools/common/replace-tool-handler.js`
- How to create custom client-side tool handlers
- How to define tool schemas for validation
- How to modify editor content programmatically
- How to integrate custom tools with the AI Agent provider
The replace tool allows users to ask the AI to replace all occurrences of a word or phrase in the document. Try asking: "Replace all instances of 'company' with 'organization'"
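A client-side tool has two parts: a schema the LLM sees, and a handler that runs in the browser. A minimal sketch of what the replace tool could look like; the real handler edits Tiptap editor content, while this version operates on a plain string to stay self-contained, and all names are illustrative:

```typescript
// Hypothetical schema for the "replace" tool, in JSON Schema style.
const replaceToolSchema = {
  name: "replace",
  description: "Replace all occurrences of a word or phrase in the document",
  parameters: {
    type: "object",
    properties: {
      find: { type: "string" },
      replaceWith: { type: "string" },
    },
    required: ["find", "replaceWith"],
  },
};

// Handler sketch: the real demo mutates the editor document instead.
function handleReplaceTool(
  doc: string,
  args: { find: string; replaceWith: string },
): string {
  // split/join replaces every occurrence and avoids regex-escaping issues
  return doc.split(args.find).join(args.replaceWith);
}
```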
These demos demonstrate server-side tools that fetch external data:
- `/server-side-tools/openai-responses-api` - With weather tool
- `/server-side-tools/openai-chat-completions-api` - With weather tool
- `/server-side-tools/vercel-ai-sdk` - With weather tool
- `/server-side-tools/anthropic-messages` - With weather tool
- Service: `src/services/server-side-tools/[adapter].ts`
- Route: `app/server-side-tools/[adapter]/page.tsx`
- Frontend: `src/view/server-side-tools/common/index.jsx`
- How to implement server-side tools that call external APIs
- How to handle tool call loops and responses
- How to format tools for different LLM providers
- How to manage asynchronous tool execution
The weather tool provides dummy weather data for various cities. Try asking: "What's the weather like in Tokyo?"
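The core of a server-side tool is the call loop: ask the LLM, execute any tool it requests, feed the result back, and repeat until the LLM produces a final answer. A sketch with the LLM mocked (all names and the transcript format are illustrative; real adapters parse provider-specific tool-call fields):

```typescript
// Sketch of a server-side tool-call loop with a dummy weather tool.
type ToolCall = { name: string; args: { city: string } };
type LlmResult = { text?: string; toolCall?: ToolCall };

const dummyWeather: Record<string, string> = {
  Tokyo: "22°C, sunny",
  Berlin: "14°C, cloudy",
};

function getWeather(city: string): string {
  return dummyWeather[city] ?? "no data";
}

// Mock LLM: requests the tool once, then answers using its result.
async function mockLlm(transcript: string[]): Promise<LlmResult> {
  const toolResult = transcript.find((t) => t.startsWith("tool:"));
  if (!toolResult) {
    return { toolCall: { name: "get_weather", args: { city: "Tokyo" } } };
  }
  return { text: `The weather in Tokyo is ${toolResult.slice(5)}.` };
}

async function runToolLoop(prompt: string): Promise<string> {
  const transcript = [`user:${prompt}`];
  for (let i = 0; i < 5; i++) {        // cap iterations to avoid infinite loops
    const result = await mockLlm(transcript);
    if (result.toolCall) {
      const output = getWeather(result.toolCall.args.city);
      transcript.push(`tool:${output}`); // feed tool output back to the LLM
      continue;
    }
    return result.text ?? "";
  }
  throw new Error("too many tool calls");
}
```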
This project uses the following npm packages for the AI Agent functionality:
- `@tiptap-pro/extension-ai-agent` - Client-side library
- `@tiptap-pro/extension-ai-agent-server` - Server-side library
- `@tiptap-pro/extension-ai-changes` - Review changes made by the AI Agent
Each demo implements the same functionality using different adapters:
- OpenAI Responses API: Uses the newer responses format
- OpenAI Chat Completions API: Uses the traditional chat completions format
- Vercel AI SDK: Leverages the Vercel AI SDK for provider abstraction
- Anthropic Claude: Direct integration with Claude's Messages API
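The request shapes differ accordingly. A rough, field-subset comparison (model ids are examples only):

```typescript
// Chat Completions: the traditional "messages" array.
const chatCompletionsRequest = {
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
};

// Responses API: conversation goes in "input" instead of "messages".
const responsesApiRequest = {
  model: "gpt-4o",
  input: [{ role: "user", content: "Hello" }],
};

// Anthropic Messages API: max_tokens is required and the system
// prompt is a top-level field rather than a message.
const anthropicRequest = {
  model: "claude-3-5-sonnet-latest", // example model id
  max_tokens: 1024,
  system: "You are a helpful assistant.",
  messages: [{ role: "user", content: "Hello" }],
};
```

The Vercel AI SDK abstracts these differences behind a single interface, which is why its adapter code tends to be the shortest.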
- Start with Basic Demos: Understand the fundamental integration patterns
- Explore Client-Side Tools: Learn how to extend the editor with custom functionality
- Study Server-Side Tools: See how to integrate external data sources
- Compare Adapters: Notice the differences and similarities between implementations
- `npm run dev` - Start development server
- `npm run build` - Create production build
- `npm start` - Start production server
- `npm run lint` - Run ESLint
- Tiptap AI Agent Documentation
- Custom LLM Integration Guide
- Client-Side Tools Guide
- Server-Side Tools Guide
For support, refer to the following resources:
- Tiptap Documentation
- Tiptap Community
- Contact your Tiptap account manager for enterprise support
This project is licensed under the MIT License.