A comprehensive reference implementation demonstrating how to build custom ChatGPT applications using the OpenAI Apps SDK, Model Context Protocol, and Next.js. This project serves as both a working foundation and an interactive gallery of widget patterns showcasing the complete capabilities of the SDK.
This starter enables you to build custom applications that extend ChatGPT by:
- Exposing tools that ChatGPT can intelligently invoke during conversations
- Rendering rich, interactive React components directly within ChatGPT
- Creating bidirectional communication between your widgets and the AI model
- Persisting state across conversation sessions
- Integrating external data sources and APIs into ChatGPT workflows
Think of it as building mini-applications that live inside ChatGPT—where the model decides when to invoke your functionality, and your UI handles the presentation and interaction.
ChatGPT apps are built on a three-layer architecture:
┌─────────────────────────────────────────────────────────┐
│ Your MCP Server │
│ (Registers tools, handles requests, serves resources) │
└────────────────────┬────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ ChatGPT Runtime │
│ (Orchestrates tool calls, fetches resources) │
└────────────────────┬────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ Your Widget │
│ (React component with window.openai bridge access) │
└─────────────────────────────────────────────────────────┘
- User interacts with ChatGPT
- ChatGPT decides to invoke your tool
- Your MCP server returns structured data and a widget reference
- ChatGPT fetches your widget HTML and renders it in a sandboxed iframe
- Widget receives data via `window.openai` and presents it visually
- Widget can call tools, send messages, or update state back through the bridge
MCP is the standard protocol for exposing tools and resources to AI models. Your server implements MCP endpoints that:
- Register tools with input/output schemas and metadata
- Register resources (HTML pages containing your widgets)
- Handle tool invocations and return structured responses
- Control data visibility (what the model sees vs. what the widget sees); a minimal server sketch follows this list
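For a concrete picture, here is a hedged sketch of such a route built with `mcp-handler` (the adapter listed in the tech stack below). The tool name and schema are illustrative, and the exact signatures should be checked against the `mcp-handler` documentation:

```ts
// app/mcp/route.ts: hedged sketch of a minimal MCP server route.
import { createMcpHandler } from "mcp-handler";
import { z } from "zod";

const handler = createMcpHandler((server) => {
  // Register a tool with an input schema; ChatGPT can invoke it mid-conversation.
  server.tool(
    "echo",
    "Echoes a message back to the conversation",
    { message: z.string() },
    async ({ message }) => ({
      content: [{ type: "text" as const, text: `You said: ${message}` }],
    })
  );
});

// The MCP endpoint answers both GET and POST in the Next.js App Router.
export { handler as GET, handler as POST };
```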
When your React widget renders inside ChatGPT, it gains access to a powerful API for bidirectional communication:
Methods:
- `callTool(name, params)` - Invoke MCP tools from your widget
- `sendFollowUpMessage(text)` - Insert messages into the conversation
- `requestDisplayMode(mode)` - Change layout (inline, picture-in-picture, fullscreen)
- `setWidgetState(state)` - Persist component state across sessions
- `openExternal(url)` - Navigate to external URLs
Reactive Properties:
- `theme` - ChatGPT's current theme (light/dark)
- `locale` - User's language preference
- `displayMode` - Current layout mode
- `toolOutput` - Structured data from tool responses
- `widgetState` - Persisted component state
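Taken together, the bridge surface can be captured in a TypeScript declaration. This is a hedged sketch assembled from the methods and properties listed above, not the SDK's official typings:

```ts
// Hypothetical typing of the window.openai bridge, assembled from the
// methods and reactive properties documented above.
declare global {
  interface Window {
    openai: {
      // Methods
      callTool(name: string, params: Record<string, unknown>): Promise<unknown>;
      sendFollowUpMessage(text: string): Promise<void>;
      // "pip" is assumed here as the token for picture-in-picture.
      requestDisplayMode(mode: "inline" | "pip" | "fullscreen"): Promise<void>;
      setWidgetState(state: unknown): Promise<void>;
      openExternal(url: string): void;
      // Reactive properties
      theme: "light" | "dark";
      locale: string;
      displayMode: "inline" | "pip" | "fullscreen";
      toolOutput: unknown;
      widgetState: unknown;
    };
  }
}

export {};
```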
OpenAI extends standard MCP with specialized metadata fields that control behavior:
Tool Metadata (_meta in tool descriptors):
- `openai/outputTemplate` - Links the tool to a widget resource URI
- `openai/widgetAccessible` - Enables widget-to-tool communication
- `openai/toolInvocation/invoking` - Loading state text (≤64 chars)
- `openai/toolInvocation/invoked` - Completion state text (≤64 chars)
Resource Metadata (_meta in resource descriptors):
- `openai/widgetDescription` - Human-readable widget summary
- `openai/widgetPrefersBorder` - UI rendering hint
- `openai/widgetCSP` - Content Security Policy configuration
- `openai/widgetDomain` - Optional dedicated subdomain
Tool responses support three fields with different visibility scopes:
```ts
{
  structuredContent: { /* ... */ }, // Visible to: Model + Widget
  content: "Human readable text",   // Visible to: Model + Transcript
  _meta: { /* ... */ }              // Visible to: Widget only
}
```

This separation allows you to:

- Share data with both the model and your widget via `structuredContent`
- Provide narrative context in the transcript via `content`
- Pass widget-specific metadata (API keys, debug info) via `_meta`
```
app/
├── mcp/
│   └── route.ts             # MCP server - registers tools and resources
├── widgets/
│   ├── read-only/           # Simple data display widget
│   ├── widget-accessible/   # Interactive counter using callTool()
│   ├── send-message/        # Message injection example
│   ├── open-external/       # External link handling
│   ├── display-mode/        # Layout mode transitions
│   └── widget-state/        # State persistence example
├── hooks/
│   ├── use-tool-output.ts   # Access tool response data
│   ├── use-call-tool.ts     # Call tools from widgets
│   ├── use-send-message.ts  # Send follow-up messages
│   ├── use-widget-state.ts  # Persist component state
│   ├── use-display-mode.ts  # React to layout changes
│   └── ...                  # Additional hooks wrapping window.openai
├── components/
│   └── ui/                  # Radix UI + Tailwind components
├── layout.tsx               # Root layout with SDK bootstrap
└── page.tsx                 # Home page
middleware.ts                # CORS handling for RSC fetching
next.config.ts               # Asset prefix configuration
baseUrl.ts                   # Environment-aware URL detection
```
The heart of your application is `app/mcp/route.ts`. This 600+ line file demonstrates:
- Tool registration with OpenAI-specific metadata
- Resource registration linking to widget HTML
- Tool handlers that process requests and return structured data
- Response field configuration for visibility control
Critical for iframe rendering, `next.config.ts` sets `assetPrefix` to ensure `/_next/` static assets load from the correct origin:

```ts
import type { NextConfig } from "next";
import { baseURL } from "./baseUrl";

const nextConfig: NextConfig = {
  assetPrefix: baseURL, // Prevents 404s in iframe
};

export default nextConfig;
```

Without this, Next.js attempts to load assets from the iframe URL, causing failures.
`middleware.ts` handles the OPTIONS preflight requests required for cross-origin React Server Component fetching during client-side navigation.
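A hedged sketch of what that middleware can look like; the permissive header values are illustrative and should be tightened for production:

```ts
// middleware.ts: hedged sketch of CORS preflight handling for RSC fetches.
import { NextRequest, NextResponse } from "next/server";

export function middleware(request: NextRequest) {
  // Answer preflights so cross-origin RSC fetches from the iframe succeed.
  if (request.method === "OPTIONS") {
    return new NextResponse(null, {
      status: 204,
      headers: {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "*",
      },
    });
  }

  // Attach CORS headers to normal responses as well.
  const response = NextResponse.next();
  response.headers.set("Access-Control-Allow-Origin", "*");
  return response;
}
```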
The NextChatSDKBootstrap component patches browser APIs to work within ChatGPT's iframe:
- `fetch` - Rewrites same-origin requests to use the correct base URL (see the sketch after this list)
- `history.pushState`/`replaceState` - Prevents full-origin URLs in history
- HTML element observer - Prevents ChatGPT from modifying the root element
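To make the `fetch` patch concrete, here is a simplified illustration (not the actual `NextChatSDKBootstrap` source); `baseUrl` stands in for the server origin passed to the component:

```ts
// Simplified illustration of the fetch patch applied inside the iframe.
declare const baseUrl: string; // assumption: the origin passed to the bootstrap

const originalFetch = window.fetch.bind(window);

window.fetch = ((input: RequestInfo | URL, init?: RequestInit) => {
  const url = new URL(
    input instanceof Request ? input.url : String(input),
    window.location.origin
  );
  // Same-origin requests would resolve against the sandboxed iframe origin
  // and 404; retarget them at the app's real base URL instead.
  if (url.origin === window.location.origin) {
    return originalFetch(baseUrl + url.pathname + url.search, init);
  }
  return originalFetch(input, init);
}) as typeof window.fetch;
```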
Required configuration:
```tsx
<html lang="en" suppressHydrationWarning>
  <head>
    <NextChatSDKBootstrap baseUrl={baseURL} />
  </head>
  <body>{children}</body>
</html>
```

Note: `suppressHydrationWarning` is required because ChatGPT modifies the initial HTML before Next.js hydrates.
React hooks wrap `window.openai` for clean, testable component code. Each hook provides reactive access to a specific bridge capability.
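Internally, a hook of this kind can subscribe to bridge updates with `useSyncExternalStore`. A hedged sketch of a `useToolOutput`-style hook, assuming the host dispatches an `openai:set_globals` event when bridge globals change:

```ts
import { useSyncExternalStore } from "react";

// Hedged sketch of a hook wrapping window.openai.toolOutput. The
// "openai:set_globals" event name is an assumption about how the host
// announces changes to the bridge globals.
export function useToolOutput<T>(): T | null {
  return useSyncExternalStore(
    (onChange) => {
      window.addEventListener("openai:set_globals", onChange);
      return () => window.removeEventListener("openai:set_globals", onChange);
    },
    () => ((window as any).openai?.toolOutput as T | undefined) ?? null,
    () => null // server snapshot: the bridge does not exist during SSR
  );
}
```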
In components, the hooks read naturally:

```ts
const weather = useToolOutput<WeatherData>();
const callTool = useCallTool();
const [state, setState] = useWidgetState();
const sendMessage = useSendMessage();
const theme = useTheme();
```

Install dependencies and start the dev server:

```bash
pnpm install
pnpm dev
```

The application runs on http://localhost:3009. The MCP server is available at http://localhost:3009/mcp.
Navigate to the home page and browse the widget gallery. Each example demonstrates a specific capability:
- Read-Only Widget - Simple data display
- Widget-Accessible Tools - Interactive components using `callTool()`
- Message Injection - Using `sendFollowUpMessage()`
- External Links - Opening URLs with `openExternal()`
- Display Mode - Requesting layout changes
- State Persistence - Saving component state with `setWidgetState()`
- Widget Descriptions - Metadata optimization
- Content Security Policy - Network permission configuration
Deploy to Vercel with one click.
The baseUrl.ts configuration automatically detects Vercel environments:
- Production URLs via `VERCEL_PROJECT_PRODUCTION_URL`
- Preview/branch URLs via `VERCEL_BRANCH_URL`
- Local development fallback (see the sketch after this list)
- Deploy your application to a publicly accessible URL
- In ChatGPT, navigate to Settings → Connectors → Create
- Add your MCP server URL with the `/mcp` path (e.g., `https://your-app.vercel.app/mcp`)
Note: Connecting MCP servers to ChatGPT requires developer mode access. See the connection guide for setup instructions.
Register a new tool in app/mcp/route.ts:
```ts
{
  name: "my_tool",
  description: "What this tool does",
  inputSchema: { /* ... */ },
  outputSchema: { /* ... */ },
  _meta: {
    "openai/outputTemplate": "template://widgets/my-widget",
    "openai/widgetAccessible": true, // Enable callTool()
    "openai/toolInvocation/invoking": "Loading...",
    "openai/toolInvocation/invoked": "Loaded"
  }
}
```

Build a React component in `app/widgets/my-widget/page.tsx`:
```tsx
export default function MyWidget() {
  const data = useToolOutput<MyDataType>();
  const theme = useTheme();
  const callTool = useCallTool();

  return (
    <div className={theme === 'dark' ? 'dark' : ''}>
      {/* Your UI here */}
    </div>
  );
}
```

Add resource registration in `app/mcp/route.ts`:
```ts
{
  uri: "template://widgets/my-widget",
  name: "My Widget",
  mimeType: "text/html+skybridge",
  _meta: {
    "openai/widgetDescription": "Clear description for the model",
    "openai/widgetPrefersBorder": true,
    "openai/widgetCSP": {
      connect_domains: ["api.example.com"],
      resource_domains: ["cdn.example.com"]
    }
  }
}
```

Implement the tool handler:
```ts
if (request.params.name === "my_tool") {
  return {
    structuredContent: { /* Data for model and widget */ },
    content: "Human-readable summary",
    _meta: { /* Widget-only metadata */ }
  };
}
```

This project includes comprehensive documentation:
- ARCHITECTURE.md - Complete system architecture, data flow diagrams, and metadata reference
- GUIDE.md - Interactive example gallery with categorized patterns
- OPENAI_WIDGET_ARCHITECTURE.md - Physical architecture, component lifecycle, and implementation patterns
- WINDOW_OPENAI_COMPLETE.md - Complete `window.openai` API reference and best practices
- Framework: Next.js 15.5.4 with Turbopack
- Runtime: React 19.1.0 with React Server Components
- Backend Protocol: Model Context Protocol (MCP) 1.20.0
- MCP Handler: mcp-handler 1.0.2
- UI Components: Radix UI + Tailwind CSS 4
- Validation: Zod 3.24.2
- Icons: Lucide React
- Use hooks to wrap `window.openai` for cleaner, testable components
- Set `widgetDescription` to reduce redundant model narration
- Configure CSP explicitly for any external resources your widget needs
- Keep `widgetState` under ~4k tokens for optimal performance
- Use `structuredContent` for data shared between model and widget
- Use `_meta` for widget-only data like API keys or debug information
- Set `readOnlyHint: true` for non-mutating operations to help model planning (see the sketch after this list)
- Keep status messages under 64 characters for the `invoking` and `invoked` states
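For the `readOnlyHint` item above, a hedged sketch of where the annotation sits on a tool descriptor (`annotations.readOnlyHint` is part of the standard MCP tool definition; the tool shown is illustrative):

```ts
// Hedged sketch: flagging a non-mutating tool with the standard MCP
// readOnlyHint annotation so the model can plan around it safely.
{
  name: "get_weather",
  description: "Look up current conditions; performs no mutations",
  annotations: { readOnlyHint: true },
  inputSchema: { /* ... */ }
}
```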
- OpenAI Apps SDK Documentation
- Custom UX Guide
- API Reference
- Model Context Protocol
- Next.js Documentation
Here's how all pieces connect for a weather widget:
Tool Registration:
```ts
{
  name: "get_weather",
  inputSchema: { location: "string" },
  _meta: {
    "openai/outputTemplate": "template://widgets/weather",
    "openai/widgetAccessible": true
  }
}
```

Resource Registration:
```ts
{
  uri: "template://widgets/weather",
  _meta: {
    "openai/widgetDescription": "Shows current weather with forecast",
    "openai/widgetCSP": {
      connect_domains: ["api.weather.com"]
    }
  }
}
```

Tool Response:
```ts
{
  structuredContent: {
    temperature: 72,
    condition: "sunny",
    location: "San Francisco"
  },
  content: "The weather in San Francisco is 72°F and sunny."
}
```

Widget Component:
```tsx
export default function WeatherWidget() {
  const weather = useToolOutput<WeatherData>();
  const callTool = useCallTool();

  // Tool output may not have arrived yet on first render.
  if (!weather) return null;

  async function refresh() {
    await callTool("get_weather", { location: weather.location });
  }

  return (
    <Card>
      <h2>{weather.location}</h2>
      <p>{weather.temperature}°F - {weather.condition}</p>
      <Button onClick={refresh}>Refresh</Button>
    </Card>
  );
}
```

This is a production-ready starting point for building sophisticated ChatGPT applications. Explore the examples, understand the patterns, and build something unique.