A complete ChatGPT App implementation using the OpenAI Apps SDK (MCP), with OAuth2 authentication via Privy.io.
- Backend: Express + MCP Server (TypeScript/Bun)
- OAuth UI: React + Privy + React Router
- Widgets: React components (rendered in ChatGPT)
- Auth: OAuth2 with PKCE + Privy.io
- Package Manager: Bun
```
mcp2/
├── src/
│   ├── server/          # Express + MCP server
│   │   ├── oauth/       # OAuth2 endpoints
│   │   ├── mcp/         # MCP tools & resources
│   │   ├── api/         # Backend API integration
│   │   └── middleware/  # Auth middleware
│   ├── client/          # OAuth authorization UI
│   └── widgets/         # ChatGPT widget components
├── dist/
│   ├── client/          # Built OAuth UI
│   ├── widgets/         # Built widget bundles
│   └── server/          # Compiled server
└── package.json
```
```shell
# Install Bun
curl -fsSL https://bun.sh/install | bash

# Install dependencies
bun install
```

```shell
# Generate RSA key pair for JWT signing
openssl genrsa -out private-key.pem 2048
openssl rsa -in private-key.pem -pubout -out public-key.pem

# Base64 encode for .env
echo "JWT_PRIVATE_KEY=$(cat private-key.pem | base64)"
echo "JWT_PUBLIC_KEY=$(cat public-key.pem | base64)"

# Clean up PEM files
rm private-key.pem public-key.pem
```

```shell
# Copy the environment template
cp .env.example .env
```
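At runtime, the server needs to turn those base64 values back into PEM strings before handing them to a JWT library. A minimal sketch of the round trip (`decodeKey` is an illustrative helper, not part of this repo):

```typescript
// decodeKey: convert a base64-encoded PEM (as stored in .env) back to a PEM string.
// Buffer is available globally in Bun and Node.
function decodeKey(b64: string): string {
  return Buffer.from(b64, "base64").toString("utf8");
}

// Round-trip a fake PEM to show the encoding scheme; real values come from
// process.env.JWT_PRIVATE_KEY / process.env.JWT_PUBLIC_KEY.
const fakePem = "-----BEGIN PRIVATE KEY-----\nMIIB...\n-----END PRIVATE KEY-----";
const encoded = Buffer.from(fakePem).toString("base64");
console.log(decodeKey(encoded) === fakePem); // true
```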
```shell
# Edit .env with your values:
# - PRIVY_APP_ID (from Privy dashboard)
# - PRIVY_APP_SECRET (from Privy dashboard)
# - JWT_PRIVATE_KEY (from step 3)
# - JWT_PUBLIC_KEY (from step 3)
# - BACKEND_API_URL (your existing backend)
```

IMPORTANT: Widgets must be built before starting the server!

```shell
# First time: build widgets (required!)
bun run build:widgets

# Then start the development server
bun run dev
```

The server will start at http://localhost:3002.

Note: `bun run dev` does NOT automatically build widgets. You must build them separately!
There are three development workflows:

```shell
# 1. Build widgets once
bun run build:widgets

# 2. Start the server with auto-reload
bun run dev

# 3. Rebuild widgets manually when you change widget code
bun run build:widgets
```

```shell
# Terminal 1: Build widgets in watch mode (auto-rebuilds on changes)
bun run dev:widgets

# Terminal 2: Run the server with auto-reload
bun run dev
```

```shell
# Runs both the server AND widget watch mode simultaneously
bun run dev:all
```

```shell
# Type check
bun run type-check

# Run tests
bun run test

# Build everything for production
bun run build
```

Server: `src/server/index.ts`
- OAuth endpoints: `/authorize`, `/token`, `/.well-known/*`
- MCP endpoint: `/mcp`
- Health check: `/health`
OAuth UI: `src/client/src/App.tsx`
- Authorization page with Privy login
- Consent screen
- Built with Vite + React + React Router

Widgets: `src/widgets/src/`
- ListView: interactive list with actions
- Built as standalone bundles
- Communicate with the host via the `window.openai` API
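As a sketch of that bridge: a widget reads the tool's structured result off the injected global before rendering. The exact shape of `window.openai` is defined by the Apps SDK; the field names below (`toolOutput`, `items`) are assumptions for illustration, and `readItems` is a hypothetical helper:

```typescript
// Assumed shape of the host-injected bridge (illustrative, not the full API).
type ToolOutput = { items?: { id: string; title: string }[] };
type OpenAiBridge = { toolOutput?: ToolOutput };

// Defensive accessor: the bridge may be absent when the widget loads
// outside ChatGPT (e.g. in a local preview).
function readItems(bridge: OpenAiBridge | undefined): { id: string; title: string }[] {
  return bridge?.toolOutput?.items ?? [];
}

// In a widget: const items = readItems((window as any).openai);
console.log(readItems(undefined).length); // 0
console.log(readItems({ toolOutput: { items: [{ id: "1", title: "First" }] } }).length); // 1
```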
```shell
# Terminal 1: Run the server
bun run dev

# Terminal 2: Run the MCP Inspector
bunx @modelcontextprotocol/inspector http://localhost:3002/mcp
```

```shell
# Expose the local server
ngrok http 3002

# Copy the HTTPS URL (e.g., https://abc123.ngrok.app)
# Use this URL in ChatGPT Settings → Connectors
```
1. Enable Developer Mode:
   - ChatGPT Settings → Apps & Connectors → Advanced settings
   - Enable "Developer mode"
2. Create a Connector:
   - Settings → Connectors → Create
   - Name: "Your App Name"
   - Description: "What your app does"
   - Connector URL: `https://your-server.com/mcp` (or your ngrok URL)
3. Test the OAuth Flow:
   - Start a new ChatGPT conversation
   - Click + → More → select your connector
   - You'll be redirected to `/authorize`
   - Log in with Privy
   - Grant consent
   - ChatGPT receives the OAuth token
4. Test the Tools:
   - Ask ChatGPT: "Show me my items"
   - The `get-items` tool will be called
   - The widget will render in ChatGPT
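For reference, what makes step 4 render a widget is a tool result that pairs model-visible text with structured output and the widget template in `_meta`. A sketch of that shape (field values are illustrative; `buildGetItemsResult` and the `ui://` URI are assumptions, following the `openai/outputTemplate` convention used elsewhere in this README):

```typescript
// Illustrative shape of an MCP tool result that carries widget data:
// `content` is what the model sees; `structuredContent` is what the widget reads.
function buildGetItemsResult(items: { id: string; title: string }[]) {
  return {
    content: [{ type: "text" as const, text: `Found ${items.length} item(s)` }],
    structuredContent: { items },
    _meta: { "openai/outputTemplate": "ui://widget/list-view.html" },
  };
}

const result = buildGetItemsResult([{ id: "1", title: "First item" }]);
console.log(result.content[0].text); // "Found 1 item(s)"
```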
```shell
# Build everything
bun run build

# Run the production server
bun run start

# Or preview locally
bun run preview
```

```shell
# Build the image
docker build -t chatgpt-app .

# Run the container
docker run -p 3000:3000 --env-file .env chatgpt-app
```

```shell
# Install flyctl
curl -L https://fly.io/install.sh | sh

# Create the app
fly launch

# Set secrets
fly secrets set PRIVY_APP_ID=xxx
fly secrets set PRIVY_APP_SECRET=xxx
fly secrets set JWT_PRIVATE_KEY=xxx
fly secrets set JWT_PUBLIC_KEY=xxx
fly secrets set BACKEND_API_URL=xxx

# Deploy
fly deploy
```

1. ChatGPT redirects the user to `/authorize?client_id=...&code_challenge=...`
2. The server serves the React UI (Privy login)
3. The user authenticates with Privy
4. The frontend shows the consent screen
5. The user approves; the server generates an authorization code
6. The frontend redirects back to ChatGPT with the code
7. ChatGPT exchanges the code for an access token at `/token`
8. The server validates PKCE and issues a JWT
9. ChatGPT uses the JWT for `/mcp` requests
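The PKCE check in step 8 boils down to hashing the verifier and comparing it to the stored challenge. A sketch for the S256 method (function names are illustrative; the test vector is from RFC 7636, Appendix B):

```typescript
import { createHash } from "node:crypto";

// S256 method from RFC 7636: challenge = BASE64URL(SHA-256(verifier)).
function challengeFromVerifier(verifier: string): string {
  return createHash("sha256").update(verifier).digest("base64url");
}

// At /token: recompute the challenge from the client's code_verifier and
// compare it with the code_challenge stored when /authorize was called.
function verifyPkce(storedChallenge: string, codeVerifier: string): boolean {
  return challengeFromVerifier(codeVerifier) === storedChallenge;
}

// Test vector from RFC 7636, Appendix B.
const verifier = "dBjftJeZ4CVP-mB92K27uhbUJU1p1r_wW1gFWFOEjXk";
console.log(verifyPkce("E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM", verifier)); // true
```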
1. Define the tool in `src/server/mcp/tools.ts`:

```typescript
{
  name: 'my-new-tool',
  description: 'What the tool does',
  inputSchema: {
    type: 'object',
    properties: {
      param: { type: 'string' }
    },
    required: ['param']
  }
}
```

2. Implement the handler:

```typescript
async function handleMyNewTool(args: any, auth: any) {
  // Validate auth
  // Call backend API
  // Return structured response
}
```

3. Point the tool at its widget template via `_meta`:

```typescript
_meta: {
  'openai/outputTemplate': 'ui://widget/my-widget.html',
}
```

4. Create the widget:

```shell
mkdir -p src/widgets/src/MyWidget
```

```typescript
// src/widgets/src/MyWidget/index.tsx
import React from 'react';
import ReactDOM from 'react-dom/client';
import { MyWidget } from './MyWidget';

const root = ReactDOM.createRoot(document.getElementById('root')!);
root.render(<MyWidget />);
```

5. Add the widget entry to the Vite config:

```typescript
// Update src/widgets/vite.config.ts
build: {
  lib: {
    entry: {
      'my-widget': 'src/MyWidget/index.tsx'
    }
  }
}
```

6. Register the widget resource on the server:

```typescript
// src/server/mcp/resources.ts
await registerMyWidget(server, widgetPath);
```

| Variable | Description | Required |
|---|---|---|
| `PRIVY_APP_ID` | Your Privy app ID | ✅ |
| `PRIVY_APP_SECRET` | Your Privy app secret | ✅ |
| `VITE_PRIVY_APP_ID` | Privy app ID (for the frontend) | ✅ |
| `JWT_PRIVATE_KEY` | Base64-encoded RSA private key | ✅ |
| `JWT_PUBLIC_KEY` | Base64-encoded RSA public key | ✅ |
| `SERVER_BASE_URL` | Your server URL | ✅ |
| `BACKEND_API_URL` | Your existing backend URL | ✅ |
| `PORT` | Server port (default: 3000) | ❌ |
| `NODE_ENV` | Environment (development/production) | ❌ |
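A small startup check that fails fast when a required variable is missing can save debugging time later. A sketch (`missingVars` is an illustrative helper; the variable list mirrors the required rows of the table above):

```typescript
// Required environment variables, per the table above.
const REQUIRED = [
  "PRIVY_APP_ID",
  "PRIVY_APP_SECRET",
  "VITE_PRIVY_APP_ID",
  "JWT_PRIVATE_KEY",
  "JWT_PUBLIC_KEY",
  "SERVER_BASE_URL",
  "BACKEND_API_URL",
] as const;

// Return the names of all missing variables at once,
// instead of failing on them one at a time later.
function missingVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}

// At startup: const missing = missingVars(process.env);
//             if (missing.length) throw new Error(`Missing: ${missing.join(", ")}`);
console.log(missingVars({ PRIVY_APP_ID: "app_123" })); // the six remaining names
```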
```shell
# Build widgets first
bun run build:widgets

# Restart the server
bun run dev
```

- Check that `SERVER_BASE_URL` matches your actual URL
- Verify the Privy app ID is correct
- Check that the JWT keys are properly base64-encoded
- Ensure the redirect URI is registered in ChatGPT

- Verify the JWT keys are correct (public/private pair)
- Check that the token hasn't expired (1 hour default)
- Ensure the `aud` claim matches your server URL

```shell
# Ensure the server is running
bun run dev

# Try:
bunx @modelcontextprotocol/inspector http://localhost:3002/mcp
```

MIT
Contributions welcome! Please open an issue or PR.