This file provides comprehensive guidance to Claude Code when working with code in this repository.
Cap is the open source alternative to Loom. It's a Turborepo monorepo with a Tauri v2 desktop app (Rust + SolidStart) and a Next.js web app. The Next.js app at apps/web is the main web application for sharing and management; the desktop app at apps/desktop is the cross‑platform recorder/editor (macOS and Windows).
- Core Purpose: Screen recording with instant sharing capabilities
- Target Users: Content creators, developers, product managers, support teams
- Key Features: Instant recording, studio mode, AI-generated captions, collaborative comments
- Business Model: Freemium SaaS with usage-based pricing
- `apps/web/` — Next.js web application (sharing, management, dashboard)
- `apps/desktop/` — Tauri desktop app (recording, editing)
- `apps/discord-bot/` — Discord integration bot
- `apps/storybook/` — UI component documentation
- `packages/database/` — Drizzle ORM, auth, email templates
- `packages/ui/` — React components for web app
- `packages/ui-solid/` — SolidJS components for desktop
- `packages/utils/` — Shared utilities, types, constants
- `packages/env/` — Environment variable validation
- `packages/web-*` — Effect-based web API layers
- `crates/media*/` — Video/audio processing pipeline
- `crates/recording/` — Core recording functionality
- `crates/rendering/` — Video rendering and effects
- `crates/camera*/` — Cross-platform camera handling
- `crates/scap-*/` — Screen capture implementations
- `**/tauri.ts` — Auto-generated IPC bindings (DO NOT EDIT)
- `**/queries.ts` — Auto-generated query bindings (DO NOT EDIT)
- `apps/web/actions/**/*.ts` — Server Actions ("use server")
- `packages/database/schema.ts` — Database schema definitions
- `*.config.*` — Configuration files (Next.js, Tailwind, etc.)
```bash
pnpm dev:web                           # Start Next.js dev server (apps/web only)
pnpm run dev:desktop                   # Start Tauri desktop dev (apps/desktop)
pnpm build                             # Build all packages/apps via Turbo
pnpm lint                              # Lint with Biome across the repo
pnpm format                            # Format with Biome
pnpm typecheck                         # TypeScript project references build
pnpm db:generate                       # Generate Drizzle migrations
pnpm db:push                           # Push schema changes to MySQL
pnpm db:studio                         # Open Drizzle Studio
pnpm --dir packages/database db:check  # Verify database schema

# Web app (apps/web)
cd apps/web && pnpm dev                # Start Next.js dev server

# Desktop (apps/desktop)
cd apps/desktop && pnpm dev            # Start SolidStart + Tauri dev
pnpm tauri:build                       # Build desktop app (release)
```

- Do not start additional development servers or localhost services unless explicitly asked. Assume the developer already has the environment running and focus on code changes.
- Prefer `pnpm dev:web` or `pnpm run dev:desktop` when you only need one app. Avoid starting multiple overlapping servers.
- Avoid running Docker or external services yourself unless requested; root workflows handle them as needed.
- Database: MySQL via Docker Compose; schema managed through Drizzle migrations
- Storage: S3-compatible (AWS, Cloudflare R2, etc.) for video/audio files
- NEVER EDIT: `tauri.ts`, `queries.ts` (auto-generated on app load)
- NEVER EDIT: Files under `apps/desktop/src-tauri/gen/`
- Icons: Auto-imported in desktop app; do not import manually
- Regeneration: These files update automatically when Rust types change
- Node Version: Must use Node 20 (specified in package.json engines)
- PNPM Version: Locked to 10.5.2 for consistency
- Turbo Cache: May need clearing if builds behave unexpectedly (`rm -rf .turbo`)
- Database Migrations: Always run `pnpm db:generate` before `pnpm db:push`
- Desktop Icons: Use `unplugin-icons` auto-import instead of manual imports
- `apps/web` — Next.js 14 (App Router) web application
- `apps/desktop` — Tauri v2 desktop app with SolidStart (SolidJS)
- `packages/database` — Drizzle ORM (MySQL) + auth utilities
- `packages/ui` — React UI components for the web
- `packages/ui-solid` — SolidJS UI components for desktop
- `packages/utils` — Shared utilities and types
- `packages/env` — Zod-validated build/server env modules
- `crates/*` — Rust crates for media, rendering, recording, camera, etc.
- Package Manager: pnpm (`pnpm@10.5.2`)
- Build System: Turborepo
- Frontend (Web): React 19 + Next.js 14.2.x (App Router)
- Desktop: Tauri v2, Rust 2024, SolidStart
- Styling: Tailwind CSS (web consumes `@cap/ui/tailwind`)
- Server State: TanStack Query v5 on web; `@tanstack/solid-query` on desktop
- Database: MySQL (PlanetScale) with Drizzle ORM
- AI Integration: Groq preferred, OpenAI fallback; invoked in Next.js Server Actions
- Analytics: PostHog
- Payments: Stripe
- AI on the Server: All Groq/OpenAI calls execute in Server Actions under `apps/web/actions`. Never call AI from client components.
- Authentication: NextAuth with a custom Drizzle adapter. Session handling via NextAuth cookies; API keys are supported for certain endpoints.
- API Surface: Prefer Server Actions. When routes are necessary, implement under `app/api/*` (Hono-based utilities present), set proper CORS, and revalidate precisely.
- Desktop IPC: Use `tauri_specta` for strongly typed commands/events; do not modify generated bindings.
Rust (emit):

```rust
use serde::Serialize;
use specta::Type;
use tauri_specta::Event;

#[derive(Serialize, Type, Event, Debug, Clone)]
pub struct UploadProgress {
	progress: f64,
	message: String,
}

UploadProgress { progress: 0.0, message: "Starting upload...".to_string() }
	.emit(&app)
	.ok();
```

Frontend (listen; generated bindings):

```ts
import { events } from "./tauri"; // auto-generated

await events.uploadProgress.listen((event) => {
	// update UI with event.payload
});
```

- Follow Local Patterns: Study neighboring files and shared packages first
- Database Changes: Always `pnpm db:generate` → `pnpm db:push` → test
- Strict Typing: Use existing types; validate config via `@cap/env`
- Component Consistency: Use `@cap/ui` (React) or `@cap/ui-solid` (Solid)
- No Manual Edits: Never touch auto-generated bindings or schemas
```ts
"use server";

import { db } from "@cap/database";
import { getCurrentUser } from "@cap/database/auth/session";

export async function updateVideo(data: FormData) {
	const user = await getCurrentUser();
	if (!user?.id) throw new Error("Unauthorized");
	// Database operations with Drizzle
	return await db().update(videos).set({ ... }).where(eq(videos.id, id));
}
```

```rust
// Rust side - emit events
UploadProgress { progress: 0.5, message: "Uploading...".to_string() }
	.emit(&app)
	.ok();
```

```ts
// Frontend side - listen to events (auto-generated)
import { events, commands } from "./tauri";

// Call commands
await commands.startRecording({ ... });

// Listen to events
await events.uploadProgress.listen((event) => {
	setProgress(event.payload.progress);
});
```

```ts
// Queries with Server Actions
const { data, isLoading } = useQuery({
	queryKey: ["videos", userId],
	queryFn: () => getUserVideos(),
	staleTime: 5 * 60 * 1000,
});

// Mutations with cache updates
const updateMutation = useMutation({
	mutationFn: updateVideo,
	onSuccess: (updated) => {
		queryClient.setQueryData(["video", updated.id], updated);
	},
});
```

- `NEXT_PUBLIC_WEB_URL`
- `NEXT_PUBLIC_POSTHOG_KEY`, `NEXT_PUBLIC_POSTHOG_HOST`
- `NEXT_PUBLIC_DOCKER_BUILD` (enables Next.js standalone output)
- Core: `DATABASE_URL`, `WEB_URL`, `NEXTAUTH_SECRET`, `NEXTAUTH_URL`
- S3: `CAP_AWS_BUCKET`, `CAP_AWS_REGION`, `CAP_AWS_ACCESS_KEY`, `CAP_AWS_SECRET_KEY`, optional `CAP_AWS_ENDPOINT`, `CAP_AWS_BUCKET_URL`
- AI: `GROQ_API_KEY`, `OPENAI_API_KEY`
- Email/Analytics: `RESEND_API_KEY`, `RESEND_FROM_DOMAIN`, `POSTHOG_PERSONAL_API_KEY`, `DUB_API_KEY`, `DEEPGRAM_API_KEY`
- OAuth: `GOOGLE_CLIENT_ID`/`SECRET`, `WORKOS_CLIENT_ID`, `WORKOS_API_KEY`
- Stripe: `STRIPE_SECRET_KEY_TEST`, `STRIPE_SECRET_KEY_LIVE`, `STRIPE_WEBHOOK_SECRET`
- CDN signing: `CLOUDFRONT_KEYPAIR_ID`, `CLOUDFRONT_KEYPAIR_PRIVATE_KEY`
- Optional S3 endpoints: `S3_PUBLIC_ENDPOINT`, `S3_INTERNAL_ENDPOINT`
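The variables above are validated at startup by `packages/env` (Zod-based). As a rough illustration of the fail-fast pattern — a hand-rolled sketch, not the actual `@cap/env` schemas; the helper names here are hypothetical:

```typescript
// Hypothetical sketch of fail-fast env validation; not the real @cap/env module.
type ServerEnv = {
	DATABASE_URL: string;
	NEXTAUTH_SECRET: string;
	CAP_AWS_BUCKET: string;
};

function requireEnv(source: Record<string, string | undefined>, key: string): string {
	const value = source[key];
	if (!value) throw new Error(`Missing required environment variable: ${key}`);
	return value;
}

function loadServerEnv(source: Record<string, string | undefined>): ServerEnv {
	return {
		DATABASE_URL: requireEnv(source, "DATABASE_URL"),
		NEXTAUTH_SECRET: requireEnv(source, "NEXTAUTH_SECRET"),
		CAP_AWS_BUCKET: requireEnv(source, "CAP_AWS_BUCKET"),
	};
}
```

The point is that a missing variable should fail loudly at boot rather than surfacing as an undefined value deep inside a request handler.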
- Package-Specific: Check each `package.json` for test commands
- Web App: Uses Vitest for utilities; no comprehensive frontend tests yet
- Desktop: Vitest for SolidJS components in some packages
- Tasks Service: Jest for API endpoint testing
- Rust: Standard Cargo test framework for crates
- Turborepo Caching: Aggressive caching across all packages
- Cache Invalidation: Prefer targeted `--filter` over global rebuilds
- Docker Builds: `NEXT_PUBLIC_DOCKER_BUILD=true` enables standalone output
- Development: Incremental builds via TypeScript project references
- Bundle Analysis: Check Next.js bundle size regularly
- Database Queries: Monitor with Drizzle Studio
- S3 Operations: Watch for excessive uploads/downloads
- Desktop Memory: Rust crates handle heavy media processing
- "Cannot find module": Check workspace dependencies in package.json
- TypeScript errors: Run `pnpm typecheck` to see project-wide issues
- Turbo cache issues: Clear with `rm -rf .turbo`
- Node version mismatch: Ensure Node 20 is active
- Migration failures: Check `packages/database/migrations/meta/`
- Connection errors: Verify Docker containers are running
- Schema drift: Run `pnpm --dir packages/database db:check`
- IPC binding errors: Restart dev server to regenerate `tauri.ts`
- Rust compile errors: Check Cargo.toml dependencies
- Permission issues: macOS/Windows may require app permissions
- Recording failures: Verify screen capture permissions
- Auth failures: Check NextAuth configuration and database
- S3 upload errors: Verify AWS credentials and bucket policies
- Server Action errors: Check network tab for detailed error messages
- Hot reload issues: Restart Next.js dev server
- Use TanStack Query v5 for all client-side server state and fetching.
- Use Server Components for initial data when possible; pass `initialData` to client components and let React Query take over.
- Mutations should call Server Actions directly and perform precise cache updates (`setQueryData`/`setQueriesData`) rather than broad invalidations.
Basic query pattern:

```ts
import { useQuery } from "@tanstack/react-query";

function Example() {
	const { data, isLoading, error } = useQuery({
		queryKey: ["items"],
		queryFn: fetchItems,
		staleTime: 5 * 60 * 1000,
		gcTime: 10 * 60 * 1000,
	});

	if (isLoading) return <Skeleton />;
	if (error) return <ErrorState onRetry={() => { /* refetch */ }} />;
	return <List items={data} />;
}
```

Server Action mutation with targeted cache updates:

```ts
"use client";

import { useMutation, useQueryClient } from "@tanstack/react-query";
import { updateItem } from "@/actions/items"; // 'use server'

function useUpdateItem() {
	const qc = useQueryClient();
	return useMutation({
		mutationFn: updateItem,
		onSuccess: (updated) => {
			qc.setQueriesData({ queryKey: ["items"] }, (old: any[] | undefined) =>
				old?.map((it) => (it.id === updated.id ? { ...it, ...updated } : it))
			);
			qc.setQueryData(["item", updated.id], updated);
		},
	});
}
```

- Minimize `useEffect` usage: compute during render, handle logic in event handlers, and ensure cleanups for any subscriptions/timers.
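To make the "compute during render" guidance concrete, here is a minimal sketch (the `Video` shape and function name are illustrative, not Cap's actual types):

```typescript
// Hypothetical example of deriving a value during render instead of syncing it with useEffect.
type Video = { id: string; title: string; archived: boolean };

function visibleVideos(videos: Video[], search: string): Video[] {
	const query = search.trim().toLowerCase();
	return videos.filter(
		(video) => !video.archived && video.title.toLowerCase().includes(query)
	);
}
```

Inside a component, call `visibleVideos(videos, search)` directly in the render body; there is no need to mirror the result into state with a `useEffect`/`setState` pair.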
- Prefer Server Components for SEO/initial rendering; hydrate interactivity in client components.
- Co-locate feature components, keep components focused, and use Suspense boundaries for long fetches.
- Styling: Tailwind CSS only; stay consistent with spacing and tokens.
- Loading: Use static skeletons that mirror content; no bouncing animations.
- Performance: Memoize expensive work; code-split naturally; use Next/Image for remote assets.
- `apps/web/lib/server.ts` builds a `ManagedRuntime` from `Layer.mergeAll` so database, S3, policy, and tracing services are available to every request. Always run server-side effects through `EffectRuntime.runPromise`/`runPromiseExit` from this module so cookie-derived context and `VideoPasswordAttachment` are attached automatically.
- `apps/web/lib/EffectRuntime.ts` exposes a browser runtime that merges the RPC client and tracing layers. Client code should lean on `useEffectQuery`, `useEffectMutation`, and `useRpcClient`; never call `ManagedRuntime.make` yourself inside components.
- Next.js API folders under `apps/web/app/api/*` wrap Effect handlers with `@effect/platform`'s `HttpApi`/`HttpApiBuilder`. Follow the existing pattern: declare a contract class via `HttpApi.make`, configure groups/endpoints with `Schema`, and only export the `handler` returned by `apiToHandler(ApiLive)`.
- Inside `HttpApiBuilder.group` blocks, acquire services (e.g., `Videos`, `S3Buckets`) with `yield*` inside `Effect.gen`. Provide layers using `Layer.provide` rather than manual `provideService` calls so dependencies stay declarative.
- Map domain-level errors to transport errors with `HttpApiError.*`. Keep error translation exhaustive (`Effect.catchTags`, `Effect.tapErrorCause(Effect.logError)`) to preserve observability.
- Use `HttpAuthMiddleware` for required auth and `provideOptionalAuth` when guests are allowed. The middleware/utility already hydrates `CurrentUser`, so avoid duplicating session lookups in route handlers.
- Shared HTTP contracts that power the desktop app live in `packages/web-api-contract-effect`; update them alongside route changes to keep schemas in sync.
- Server components that need Effect services should call `EffectRuntime.runPromise(effect.pipe(provideOptionalAuth))`. This keeps request cookies, tracing spans, and optional auth consistent with the API layer.
- Prefer lifting Drizzle queries or other async work into `Effect.gen` blocks and reusing domain services (`Videos`, `VideosPolicy`, etc.) rather than writing ad-hoc logic.
- React Query hooks should wrap Effect workflows with `useEffectQuery`/`useEffectMutation` from `apps/web/lib/EffectRuntime.ts`; these helpers surface Fail/Die causes consistently and plug into tracing/span metadata.
- When a mutation or query needs the RPC transport, resolve it through `useRpcClient()` and invoke the strongly-typed procedures exposed by `packages/web-domain` instead of reaching into fetch directly.
- Data fetching: `@tanstack/solid-query` for server state.
- IPC: Call generated `commands` and `events` from `tauri_specta`. Listen directly to generated events and prefer the typed interfaces.
- Windowing/permissions are handled in Rust; keep UI logic in Solid and avoid mixing IPC with rendering logic.
- CRITICAL: NO CODE COMMENTS: Never add any form of comments to code. This includes:
  - Single-line comments: `//` (JavaScript/TypeScript/Rust), `#` (Python/Shell)
  - Multi-line comments: `/* */` (JavaScript/TypeScript/Rust)
  - Documentation comments: `///`, `//!` (Rust), `/** */` (JSDoc)
  - Any other comment syntax in any language
- Code must be self-explanatory through naming, types, and structure. Use docs/READMEs for explanations when necessary.
- Directory naming: lower-case-dashed
- Components: PascalCase; hooks: camelCase starting with `use`
- Strict TypeScript; avoid `any`; leverage shared types
- Use Biome for linting/formatting; match existing formatting
All Rust code must respect these workspace-level lints defined in Cargo.toml. Violating any of these will fail CI:
Rust compiler lints:

- `unused_must_use = "deny"` — Always handle `Result`/`Option` or types marked `#[must_use]`; never ignore them.
Clippy lints (all denied — code MUST NOT contain these patterns):

- `dbg_macro` — Never use `dbg!()` in code; use proper logging (`tracing::debug!`, etc.) instead.
- `let_underscore_future` — Never write `let _ = async_fn()`, which silently drops futures; await or explicitly handle them.
- `unchecked_duration_subtraction` — Use `duration.saturating_sub(other)` instead of `duration - other` to avoid panics on underflow.
- `collapsible_if` — Merge nested `if` statements: write `if a && b { }` instead of `if a { if b { } }`.
- `clone_on_copy` — Don't call `.clone()` on `Copy` types (integers, bools, etc.); just copy them directly.
- `redundant_closure` — Use function references directly: `iter.map(foo)` instead of `iter.map(|x| foo(x))`.
- `ptr_arg` — Accept `&[T]` or `&str` instead of `&Vec<T>` or `&String` in function parameters for flexibility.
- `len_zero` — Use `.is_empty()` instead of `.len() == 0` or `.len() > 0`/`.len() != 0`.
- `let_unit_value` — Don't assign `()` to a variable: write `foo();` instead of `let _ = foo();` or `let x = foo();` when the return is unit.
- `unnecessary_lazy_evaluations` — Use `.unwrap_or(val)` instead of `.unwrap_or_else(|| val)` when the default is a simple/cheap value.
- `needless_range_loop` — Use `for item in &collection` or `for (i, item) in collection.iter().enumerate()` instead of `for i in 0..collection.len()`.
- `manual_clamp` — Use `value.clamp(min, max)` instead of manual `if` chains or `.min(max).max(min)` patterns.
Examples of violations to avoid:

```rust
dbg!(value);
let _ = some_async_function();
let duration = duration_a - duration_b;
if condition {
	if other_condition {
		do_something();
	}
}
let x = 5.clone();
vec.iter().map(|x| process(x))
fn example(v: &Vec<i32>) { }
if vec.len() == 0 { }
let _ = returns_unit();
option.unwrap_or_else(|| 42)
for i in 0..vec.len() { println!("{}", vec[i]); }
value.min(max).max(min)
```

Correct alternatives:

```rust
tracing::debug!(?value);
some_async_function().await;
let duration = duration_a.saturating_sub(duration_b);
if condition && other_condition {
	do_something();
}
let x = 5;
vec.iter().map(process)
fn example(v: &[i32]) { }
if vec.is_empty() { }
returns_unit();
option.unwrap_or(42)
for item in &vec { println!("{}", item); }
value.clamp(min, max)
```

- Video Storage: S3-compatible storage with signed URLs
- Database: MySQL with connection pooling via PlanetScale
- Authentication: NextAuth with custom Drizzle adapter
- API Security: CORS policies, rate limiting via Hono middleware
- Recording Permissions: Platform-specific (macOS Screen Recording, Windows)
- Data Retention: User-controlled deletion of recordings
- Sharing Controls: Password protection, expiry dates on shared links
- Analytics: PostHog with privacy-focused configuration
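The sharing controls above can be modeled as a small access check. A hypothetical sketch — the field names, status values, and function are illustrative and do not reflect Cap's actual schema:

```typescript
// Hypothetical access check for a shared link; field and status names are illustrative.
type SharedLink = { passwordHash: string | null; expiresAt: Date | null };

type LinkAccess = "ok" | "password-required" | "expired";

function checkLinkAccess(link: SharedLink, now: Date, passwordVerified: boolean): LinkAccess {
	if (link.expiresAt && now.getTime() >= link.expiresAt.getTime()) return "expired";
	if (link.passwordHash && !passwordVerified) return "password-required";
	return "ok";
}
```

Checking expiry before the password requirement means an expired link never prompts for a password.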
- Transcription: Deepgram API for captions generation
- Metadata Generation: Groq (primary) + OpenAI (fallback) for titles/descriptions
- Processing Location: All AI calls in Next.js Server Actions only
- Privacy: Transcripts stored in database, audio sent to external APIs
```
Desktop Recording → Local Files → Upload to S3 →
Background Processing (tasks service) →
Transcription/AI Enhancement → Database Storage
```
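One way to picture the stages above is as a linear status progression. This sketch is purely illustrative; the status names do not mirror the real database schema:

```typescript
// Hypothetical linear status model for the processing pipeline; names are illustrative.
type ProcessingStatus = "recorded" | "uploading" | "processing" | "transcribing" | "complete";

const nextStatus: Record<ProcessingStatus, ProcessingStatus | null> = {
	recorded: "uploading",
	uploading: "processing",
	processing: "transcribing",
	transcribing: "complete",
	complete: null,
};

function advance(status: ProcessingStatus): ProcessingStatus {
	return nextStatus[status] ?? status;
}
```

Modeling the pipeline as an explicit transition table makes it easy to reject out-of-order updates from background workers.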
- TanStack Query: https://tanstack.com/query/latest
- React Patterns: https://react.dev/learn/you-might-not-need-an-effect
- Tauri v2: https://github.com/tauri-apps/tauri
- tauri_specta: https://github.com/oscartbeaumont/tauri-specta
- Drizzle ORM: https://orm.drizzle.team/
- SolidJS: https://solidjs.com/
- Self-hosting: https://cap.so/docs/self-hosting
- API Documentation: Generated from TypeScript contracts
- Architecture Decisions: See individual package READMEs
- Monorepo Guide: Turborepo documentation
- Effect System: Used in web-backend packages
- Media Processing: FFmpeg documentation for Rust bindings
Always format code before completing work:
- TypeScript/JavaScript: Run `pnpm format` to format all code with Biome
- Rust: Run `cargo fmt` to format all Rust code with rustfmt
These commands should be run regularly during development and always at the end of a coding session to ensure consistent formatting across the codebase.