Conversation
…d rate limiting

- Add clustering support based on available CPU cores and environment settings
- Integrate PostHog analytics for API request and server metrics tracking
- Implement rate limiting with IP validation and bounded in-memory storage
- Enhance VercelRequest and VercelResponse interfaces with robust parsing and security headers
- Improve CORS handling with origin allowlists and credential support
- Validate and sanitize API endpoint paths to prevent directory traversal attacks
- Add request body size limit and enforce request timeout handling
- Provide structured logging for requests, responses, errors, and server lifecycle events
- Add health endpoint with uptime, metrics, environment, and version info
- Support graceful shutdown with analytics capture on termination signals
- Update create-checkout-session API with stricter CORS origin checks and OPTIONS method handling
- Refine hono-polar API subscription syncing with date object conversions and improved checkout flow
- Enhance secret-chat API error handling with detailed status codes and messages
- Update service worker cache revision for production deployment
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
High Priority Fixes:

- Replace vulnerable regex patterns in IP validation with safe string operations
- Secure cookie parsing with Object.create(null) to prevent prototype pollution
- Enhanced file system operations with additional validation layers
- Add PostHog analytics payload size limits (32KB) and comprehensive PII sanitization
- Implement error message sanitization to prevent information leakage

Security Improvements:

- Safe IPv4/IPv6 validation without regex DoS vulnerability
- Cookie name/value validation with length limits and safe patterns
- Multi-layer path traversal protection for API endpoint resolution
- PII pattern detection and redaction for analytics
- Development vs production error handling with safe messaging
- ESLint security rule compliance with appropriate exemptions for validated cases

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
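The cookie-parsing hardening described above can be sketched as follows. This is a minimal illustration of the Object.create(null) technique with length limits and a safe name pattern; `parseCookies`, the limits, and the token regex are illustrative assumptions, not the PR's actual implementation.

```typescript
// Parse a Cookie header into a null-prototype map so keys like
// "__proto__" or "constructor" become plain own properties and can
// never pollute Object.prototype.
const COOKIE_NAME = /^[!#$%&'*+\-.^_`|~0-9A-Za-z]+$/; // RFC 6265 token chars
const MAX_PAIR_LENGTH = 4096;                          // assumed length limit

function parseCookies(header: string): Record<string, string> {
  const jar: Record<string, string> = Object.create(null);
  for (const pair of header.split(';')) {
    if (pair.length > MAX_PAIR_LENGTH) continue; // enforce length limit
    const eq = pair.indexOf('=');
    if (eq < 0) continue;                        // skip malformed pairs
    const name = pair.slice(0, eq).trim();
    const value = pair.slice(eq + 1).trim();
    if (!COOKIE_NAME.test(name)) continue;       // safe-pattern check
    jar[name] = value;                           // own property only
  }
  return jar;
}
```

Because the jar has a null prototype, even a `__proto__` cookie is stored as an ordinary own key rather than rewriting the prototype chain.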
…ration limits

- Updated regex patterns for sanitizing metadata, navigation, images, stylesheets, scripts, fonts, and meta tags to prevent potential vulnerabilities.
- Implemented iteration limits to avoid catastrophic backtracking in regex operations.
- Added validation checks for extracted URLs and text to ensure safety and compliance with length restrictions.

This commit addresses security concerns and improves the robustness of HTML content extraction.
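The iteration-limit and URL-validation ideas in this commit can be illustrated with a sketch like the one below; `extractImageUrls` and the specific caps are hypothetical names and values chosen for the example, not code from the commit.

```typescript
// Bounded HTML extraction: cap the number of regex matches processed so a
// hostile document cannot drive the loop into unbounded work, and validate
// every extracted URL for scheme and length before accepting it.
const MAX_ITERATIONS = 1000;  // assumed iteration limit
const MAX_URL_LENGTH = 2048;  // assumed length restriction

function extractImageUrls(html: string): string[] {
  const urls: string[] = [];
  const imgSrc = /<img\b[^>]*?\bsrc=["']([^"']+)["']/gi;
  let match: RegExpExecArray | null;
  let iterations = 0;
  while ((match = imgSrc.exec(html)) !== null) {
    if (++iterations > MAX_ITERATIONS) break;   // iteration limit
    const url = match[1];
    if (url.length > MAX_URL_LENGTH) continue;  // length restriction
    if (!/^https?:\/\//i.test(url)) continue;   // scheme allowlist
    urls.push(url);
  }
  return urls;
}
```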
- Resolved CORS configuration conflict in api-dev-server.ts using secure whitelist approach
- Resolved git provider detection conflict in lib/deployment/netlify.ts using comprehensive URL parsing
- Fixed regex escape character issue in netlify.ts for security compliance

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
**HIGH RISK - CORS Misconfiguration Fixed:**

- Separate trusted origins from allowed origins in api-dev-server.ts
- Only enable credentials for explicitly trusted domains
- Prevent credential hijacking via dynamic origin setting

**MEDIUM RISK - URL Validation Bypass Fixed:**

- Replace vulnerable substring matching with secure hostname validation
- Use proper URL parsing to prevent domain spoofing attacks
- Affected files: netlify.ts and vercel.ts deployment services

**MEDIUM RISK - Information Exposure Prevention:**

- Enhanced error sanitization in both development and production modes
- Remove ALL sensitive paths, environment variables, credentials from error messages
- Stricter character limits and complete information sanitization

Security improvements protect against:

- Credential theft via CORS misconfiguration
- Domain spoofing attacks (evil.com/github.com bypasses)
- Internal system information disclosure through error messages

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
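The URL-validation fix above (replacing substring matching with hostname comparison) can be sketched like this; `isAllowedHost` is a hypothetical helper for illustration, not the code shipped in netlify.ts or vercel.ts.

```typescript
// Substring checks such as url.includes("github.com") accept spoofed URLs
// like "https://evil.com/github.com". Parsing the URL and comparing the
// hostname exactly (or as a proper subdomain) closes that bypass.
function isAllowedHost(rawUrl: string, allowedHost: string): boolean {
  let parsed: URL;
  try {
    parsed = new URL(rawUrl);
  } catch {
    return false; // not a valid absolute URL
  }
  if (parsed.protocol !== "https:") return false;
  const host = parsed.hostname.toLowerCase();
  // Exact match, or a real subdomain ending in ".allowedHost"
  return host === allowedHost || host.endsWith("." + allowedHost);
}
```

Note that `"github.com.evil.com"` fails the `.endsWith(".github.com")` test, so suffix spoofing is also rejected.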
- Fix Promise being passed to Convex in EnhancedChatInterface.tsx by properly consuming textStream
- Fix 404 error on tRPC billing endpoint by correcting URL path to /hono/trpc/
- Add robust array checks to prevent Se.map undefined errors
- Improve metadata handling with proper values instead of undefined
- Enhanced error handling and logging for tRPC requests

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Take the more secure origin validation from main branch that includes additional isValidOrigin checks for both trusted and allowed origins.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Replace `existsSync` with `statSync` for safer file existence checks in API endpoints.
- Improve error handling for non-file responses and inaccessible directories.
- Update CORS handling in `create-checkout-session` to validate origins more robustly.
- Introduce helper functions for validating environment variables in `hono-polar` API.
- Refactor input sanitization and validation logic in various components for enhanced security.
- Clean up unused imports and optimize component structures across multiple files.
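A minimal sketch of the `statSync`-based check mentioned in the first bullet; `isReadableFile` is an illustrative name, and the exact validation layers in the PR may differ.

```typescript
import { statSync } from "node:fs";

// existsSync answers only "does something exist here?", which is true for
// directories and special files too. statSync lets us require a regular
// file and treat any error (missing path, permission denied, broken
// symlink) uniformly as "not available".
function isReadableFile(path: string): boolean {
  try {
    return statSync(path).isFile();
  } catch {
    return false;
  }
}
```

A caller can then return a 404 for anything that is not a plain, accessible file, instead of accidentally serving a directory.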
- Update analytics configuration to enable based on the presence of the PostHog API key.
- Improve IP hashing method for better privacy using SHA-256.
- Refine IP validation logic with comprehensive regex for IPv6 support.
- Enhance error responses in API endpoints to include decoded endpoint information.
- Implement structured logging for API requests and errors, ensuring sensitive data is scrubbed.
- Update README to reflect enhanced security features and request timeout settings.
- Add new environment variables for Sentry error monitoring and privacy consent management.
- Optimize dependency versions in package.json and bun.lock for improved stability.
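The SHA-256 IP hashing mentioned above could look roughly like this. `hashIp`, the `HASH_SALT` environment variable, and the 16-character truncation are assumptions made for the sketch, not details from the commit.

```typescript
import { createHash } from "node:crypto";

// Hash client IPs with SHA-256 plus a server-side salt before they reach
// logs or analytics, so raw addresses are never stored. A salt prevents
// trivial dictionary reversal of the small IPv4 space.
function hashIp(ip: string, salt: string = process.env.HASH_SALT ?? ""): string {
  return createHash("sha256")
    .update(salt)
    .update(ip)
    .digest("hex")
    .slice(0, 16); // truncated digest is enough for correlation
}
```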
- Simplified regex patterns for email and phone number validation.
- Improved handling of sensitive headers in the scrubHeaders function.
- Updated error sanitization methods to prevent sensitive data leaks.
- Enhanced URL validation patterns for GitHub links.
- Refined object sanitization to prevent prototype pollution and ensure safe handling of sensitive fields.
…for credentials transfer

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
- Resolved IP validation function conflicts by choosing the newer, more robust implementation
- Resolved cookie parsing conflicts by keeping enhanced prototype pollution protection
- Resolved file existence check conflicts by using the safer validation approach
- Resolved HTML sanitization conflicts by keeping the sanitize-html library approach
- All conflicts resolved while maintaining security best practices

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
…m our branch

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude <noreply@anthropic.com>
…ranch

Merged security enhancements while preserving functionality:

- Enhanced error sanitization with comprehensive path and credential redaction
- Improved CORS handling, enabling credentials only for trusted origins
- Upgraded IP hashing from base64 to SHA-256 for better anonymization
- Tightened error message length limits and expanded error pattern matching
- More comprehensive PII detection and sanitization patterns
- Regenerated bun.lock to resolve dependency conflicts

All security improvements from dev-branch have been prioritized while maintaining existing functionality from the qoder branch.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Introduced `ZapdevDeploymentSecrets` to handle sensitive deployment tokens securely.
- Updated `ZapdevDeploymentConfig` to exclude sensitive access tokens, ensuring they are retrieved at runtime.
- Enhanced deployment manager to utilize the new secrets structure for Netlify and Vercel services.
- Improved error handling and logging in various API endpoints for better security and maintainability.
- Updated README to reflect changes in deployment configuration and security practices.
…pdate service worker registration

- Removed commented-out critical CSS resources for clarity.
- Added module preload link for improved loading performance of the main module.
- Updated service worker registration to use a more generic '/sw.js' path for better maintainability.
Resolved merge conflicts prioritizing security improvements and maintaining functionality:

- API-SERVER-README.md: Fixed malformed line in graceful shutdown feature
- api/deploy.ts: Used main branch approach with runtime secrets validation
- api/domains.ts: Kept deployment manager configuration comment from main
- api/success.ts: Maintained strict authentication requirements from qoder
- lib/deployment/types.ts: Used main branch configuration comment
- lib/deployment/vercel.ts: Kept enhanced extractRepoPath with security improvements
- src/components/PrivacyConsentBanner.tsx: Used function declaration style from main
- src/components/SubscriptionUpgradeModal.tsx: Removed SafeText dependency
- src/components/auth/EnhancedSignUp.tsx: Kept clean interface without unused onComplete prop
- src/components/auth/PrivacyAwareSignInButton.tsx: Used function declaration style
- src/main.tsx: Removed unused imports from qoder branch
- src/utils/security.ts: Kept simpler script tag regex pattern from qoder
- src/utils/text-sanitizer.ts: Preserved Zod validation enhancements from qoder

All conflicts resolved while maintaining security improvements and code quality.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Fix TypeScript errors in API files (type assertions, unused parameters)
- Update AI SDK usage from v2 to v3 (maxTokens → maxCompletionTokens)
- Fix tRPC context function signature for Hono adapter
- Resolve duplicate interface declarations in deployment types
- Add missing required properties to analytics events
- Fix Vite PWA plugin Cache-Control header configuration
- Remove invalid regex pattern in workbox cacheableResponse config

All TypeScript build errors resolved. Vercel deployment now succeeds.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
| GitGuardian id | GitGuardian status | Secret | Commit | Filename |
|---|---|---|---|---|
| 20372498 | Triggered | Generic High Entropy Secret | 72993ac | .env.deployment.template |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely, following best practices for secrets management.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit hooks to catch secrets before they leave your machine and to ease remediation
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Caution: Review failed. The pull request is closed.

Walkthrough

Introduces runtime secrets into deployment management, adds environment validation and request validation to the deploy API, enhances domain analytics, tightens auth and downstream calls in the success API, updates CORS type-casting, adjusts tRPC context typing, renames a chat token limit param, tweaks domains handler params, and refines service worker caching headers.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant API as api/deploy.ts
    participant Env as validateEnvironmentVariables
    participant Manager as ZapdevDeploymentManager
    participant Netlify
    participant Vercel
    participant Analytics as PostHogAnalytics
    rect rgb(245,248,252)
        note over API: Entry, CORS/method handling
        Client->>API: POST /api/deploy (body)
        API->>Env: validate tokens/platform
        Env-->>API: validatedEnv or error
        API->>API: validateDeployRequest(body)
        API->>Manager: new Manager({config, secrets, analytics, logger})
    end
    alt action=deploy
        API->>Manager: deploy(config)
        Manager->>Netlify: deploy (if platform=netlify)
        Manager->>Vercel: deploy (if platform=vercel)
        Manager-->>API: result
        API-->>Client: 200 result
    else action=list
        API->>Manager: listDeployments()
        Manager-->>API: deployments + platforms
        API-->>Client: 200 list
    else action=setup/verify/delete
        API->>Manager: domain action
        Manager->>Analytics: domain_* event
        Manager-->>API: result
        API-->>Client: status result
    end
    opt error paths
        API-->>Client: 400 (known) or 500 (unknown)
    end
```
```mermaid
sequenceDiagram
    participant Client
    participant API as api/success.ts
    participant Auth as verifyAuth
    participant SubAPI as /api/get-subscription
    participant Logger as logger
    Client->>API: GET /api/success (Authorization)
    alt Missing Authorization
        API-->>Client: 401 Unauthorized
    else Has Authorization
        API->>Auth: verifyAuth(token)
        alt invalid
            API-->>Client: 401 Unauthorized
        else valid
            API->>SubAPI: GET with Authorization
            alt SubAPI ok
                SubAPI-->>API: planId/status
                API-->>Client: 200 planId/status
            else SubAPI not ok
                API->>Logger: log sanitized error
                API-->>Client: 200 {planId: "free", status: "none"}
            end
        end
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
📜 Recent review details — Configuration used: CodeRabbit UI · Review profile: CHILL · Plan: Pro

📒 Files selected for processing (4)
Actionable comments posted: 21
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (8)
lib/deployment/types.ts (1)
167-199: Security regression: “non-sensitive” config type includes access tokens; conflicts with comments and ZapdevDeploymentSecrets

The comment at Line 167 states “non-sensitive settings”, and the Netlify/Vercel sections above explicitly say accessToken moved to ZapdevDeploymentSecrets. However, ZapdevDeploymentConfig still embeds accessToken for both providers. This invites accidental serialization/logging of secrets and contradicts the intended separation.
Proposed fix: remove tokens from ZapdevDeploymentConfig and keep them solely in ZapdevDeploymentSecrets. Optionally mark previous fields as deprecated to ease migration.
```diff
-// Configuration for zapdev deployment service (non-sensitive settings)
+// Configuration for zapdev deployment service (non-sensitive settings)
 export interface ZapdevDeploymentConfig {
-  // Include sensitive deployment configuration
-  netlify: {
-    accessToken: string;
-    teamId?: string;
-  };
-  vercel: {
-    accessToken: string;
-    teamId?: string;
-  };
+  netlify: {
+    /** @deprecated Access tokens must live in ZapdevDeploymentSecrets */
+    // accessToken?: never;
+    teamId?: string;
+  };
+  vercel: {
+    /** @deprecated Access tokens must live in ZapdevDeploymentSecrets */
+    // accessToken?: never;
+    teamId?: string;
+  };
   // Main domain for custom subdomains
   baseDomain: string; // "zapdev.link"
   // Reference to secure credential storage
   // Secrets must be retrieved from ZapdevDeploymentSecrets at runtime
   secretsRef?: string; // Optional reference to secret storage key
```

Follow-up: ensure api/deploy.ts and lib/deployment/manager.ts exclusively source tokens from ZapdevDeploymentSecrets. I can help generate a small codemod to catch lingering config.accessToken usages if useful.
src/utils/security.ts (1)
95-105: Script-tag regex weakened; restore tolerant close-tag to avoid evasion

Changing from a tolerant `</script...>` to a strict `</script>` can miss variants like `</script >` and some malformed-but-recognized closers. Recommend using a boundary + optional whitespace pattern.

```diff
-  /<script[^>]*>[\s\S]*?<\/script>/is,
+  /<script\b[^>]*>[\s\S]*?<\/script\s*>/is,
```

Optional hardening (nice-to-have):

- Add `/<style\b[^>]*>[\s\S]*?<\/style\s*>/is` if style injection is a concern.
- Add stricter SVG handling if needed (e.g., `/<svg\b[^>]*>[\s\S]*?<\/svg\s*>/is`), though your `on\w+=` and `data:svg` checks already catch many cases.

I can add unit tests covering `</script >`, mixed-case tags, and nested/script-like strings if you’d like.

src/components/auth/EnhancedSignUp.tsx (1)
23-42: Broken flow: handlePrivacyComplete is never called, so the Clerk signup modal may never open

The only place that calls `openSignUp` is inside `handlePrivacyComplete`, but that function isn’t invoked or passed down to `PrivacyConsentStep`. If `showConsent` is true, users see the consent UI and then get stuck; if `showConsent` is false (already consented), the component returns `null` and never opens the modal.

Fix by reacting to consent state and opening the modal once consent exists, and also when consent isn’t required. Guard to avoid double-open.

Apply this diff:

```diff
@@
-import React, { useState } from 'react';
-import { SignUp, useClerk } from '@clerk/clerk-react';
+import React, { useEffect, useRef, useState } from 'react';
+import { useAuth, useClerk } from '@clerk/clerk-react';
@@
-export function EnhancedSignUp({ redirectUrl = '/chat' }: EnhancedSignUpProps) {
+export const EnhancedSignUp: React.FC<EnhancedSignUpProps> = ({ redirectUrl = '/chat' }) => {
   const [step, setStep] = useState<'privacy' | 'signup' | 'welcome'>('privacy');
   const { showConsent, hasConsent, PrivacyConsentStep, closeConsent } = useSignupPrivacyConsent();
   const { openSignUp } = useClerk();
+  const { isLoaded, isSignedIn } = useAuth();
+  const openedRef = useRef(false);

-  const handlePrivacyComplete = () => {
-    setStep('signup');
-    closeConsent();
-    // Open Clerk's signup modal after privacy consent
-    openSignUp({
-      redirectUrl,
-      appearance: {
-        elements: {
-          modalContent: 'bg-gray-900 border border-gray-700',
-          modalCloseButton: 'text-gray-400 hover:text-white',
-          formButtonPrimary: 'bg-primary hover:bg-primary/90',
-          card: 'bg-gray-900 border border-gray-700',
-          formFieldInput: 'bg-gray-800 border-gray-600 text-white',
-          formFieldLabel: 'text-gray-300',
-          headerTitle: 'text-white',
-          headerSubtitle: 'text-gray-300'
-        }
-      }
-    });
-  };
+  // Redirect authenticated users to /chat
+  useEffect(() => {
+    if (!isLoaded) return;
+    if (isSignedIn) {
+      window.location.assign('/chat');
+    }
+  }, [isLoaded, isSignedIn]);
+
+  // Open signup when consent is not required or has been granted
+  useEffect(() => {
+    if (openedRef.current) return;
+    if (!showConsent || hasConsent) {
+      openedRef.current = true;
+      setStep('signup');
+      try {
+        closeConsent();
+        openSignUp({
+          redirectUrl,
+          appearance: {
+            elements: {
+              modalContent: 'bg-gray-900 border border-gray-700',
+              modalCloseButton: 'text-gray-400 hover:text-white',
+              formButtonPrimary: 'bg-primary hover:bg-primary/90',
+              card: 'bg-gray-900 border border-gray-700',
+              formFieldInput: 'bg-gray-800 border-gray-600 text-white',
+              formFieldLabel: 'text-gray-300',
+              headerTitle: 'text-white',
+              headerSubtitle: 'text-gray-300'
+            }
+          }
+        });
+      } catch {
+        // Optionally: add toast error reporting here (Sonner) per guidelines
+      }
+    }
+  }, [showConsent, hasConsent, closeConsent, openSignUp, redirectUrl]);
@@
-}
+};
```

src/utils/text-sanitizer.ts (1)
49-112: Existing sanitization functions bypass new validation logic

The original `sanitizeText`, `sanitizeCode`, `sanitizeOutput`, and `createSanitizedHtml` functions don't use the new validation logic with `MALICIOUS_PATTERNS` checking. This creates an inconsistent security posture.

Consider updating the existing functions to use the new validation:

```diff
 export function sanitizeText(text: string | null | undefined, fallback = ''): string {
   if (text == null) {
     return fallback;
   }

   if (typeof text !== 'string') {
     return escapeHtml(String(text));
   }

+  // Check for malicious patterns
+  const validation = validateInput(text);
+  if (!validation.isValid) {
+    console.warn('Potentially unsafe content detected:', validation.error);
+    return fallback;
+  }
+
   return escapeHtml(text);
 }
```

lib/deployment/manager.ts (1)
414-427: Fix path-type import inconsistency that can break NodeNext builds

Two inline type imports reference different module specifiers: `import('./types.js')` (Line 414) vs `import('./types')` (Line 425). With `"moduleResolution": "NodeNext"`, the latter can fail at build/runtime.

Apply this diff to standardize on the `.js` path:

```diff
-  status: import('./types').DeploymentStatus;
+  status: import('./types.js').DeploymentStatus;
```

api/deploy.ts (3)
121-123: Eager environment validation at module load will take the entire endpoint down

Calling `validateEnvironmentVariables()` at import time throws if tokens are missing, preventing even OPTIONS/GET error responses. This is a hard failure on Vercel.

Move validation into `getDeploymentManager()` or the request path. Example:

```diff
-// Validate environment variables
-const validatedEnv = validateEnvironmentVariables();
+// Lazy-init validated env
+let validatedEnv: ReturnType<typeof validateEnvironmentVariables> | null = null;
```

And inside `getDeploymentManager()`:

```diff
   if (!deploymentManager) {
+    if (!validatedEnv) validatedEnv = validateEnvironmentVariables();
     const secrets = {
       netlify: { accessToken: validatedEnv.netlifyAccessToken, teamId: process.env.NETLIFY_TEAM_ID },
       vercel: { accessToken: validatedEnv.vercelAccessToken, teamId: process.env.VERCEL_TEAM_ID },
     };
     deploymentManager = new ZapdevDeploymentManager({ config: deploymentConfig, secrets, analytics: { track: analytics.track.bind(analytics) }, logger });
   }
```
101-112: Over-strict: requiring both Netlify and Vercel tokens to be set

The validator fails unless BOTH tokens exist, even if only one platform is needed. This contradicts the manager’s ability to operate with either.

- Require at least one token.
- Default platform to whichever token is present, else throw.

```diff
-  if (!netlifyAccessToken) { missingVars.push('NETLIFY_ACCESS_TOKEN'); }
-  if (!vercelAccessToken) { missingVars.push('VERCEL_ACCESS_TOKEN'); }
+  const hasNetlify = !!netlifyAccessToken;
+  const hasVercel = !!vercelAccessToken;
+  if (!hasNetlify && !hasVercel) {
+    const errorMessage = 'Missing required environment variables: at least one of NETLIFY_ACCESS_TOKEN or VERCEL_ACCESS_TOKEN must be set.';
+    logger.error('Environment validation failed', new Error(errorMessage));
+    throw new Error(errorMessage);
+  }
@@
-  const finalDefaultPlatform = defaultPlatform && validPlatforms.includes(defaultPlatform)
-    ? defaultPlatform
-    : 'vercel';
+  const finalDefaultPlatform = defaultPlatform && validPlatforms.includes(defaultPlatform)
+    ? defaultPlatform
+    : (hasVercel ? 'vercel' : 'netlify');
```

Also applies to: 114-119
124-135: Do not put access tokens into the config object; keep them only in the secrets object

`ZapdevDeploymentConfigWithSecrets` embeds tokens into `deploymentConfig`. Configs are more likely to be serialized, logged, or returned by mistake. You already pass a proper `secrets` object to the manager—duplicate storage increases leakage risk.

Remove token fields from the config entirely and drop the `ZapdevDeploymentConfigWithSecrets` type:

```diff
-// Extended configuration interface that includes runtime secrets
-interface ZapdevDeploymentConfigWithSecrets extends ZapdevDeploymentConfig {
-  netlify: {
-    accessToken: string;
-    teamId?: string;
-  };
-  vercel: {
-    accessToken: string;
-    teamId?: string;
-  };
-}
-
-// Deployment manager configuration (with runtime secrets)
-const deploymentConfig: ZapdevDeploymentConfigWithSecrets = {
+// Deployment manager configuration (no secrets)
+const deploymentConfig: ZapdevDeploymentConfig = {
   baseDomain: 'zapdev.link',
   netlify: {
-    accessToken: validatedEnv.netlifyAccessToken,
     teamId: process.env.NETLIFY_TEAM_ID,
   },
   vercel: {
-    accessToken: validatedEnv.vercelAccessToken,
     teamId: process.env.VERCEL_TEAM_ID,
   },
   defaults: {
     platform: validatedEnv.defaultPlatform,
     buildCommand: process.env.DEFAULT_BUILD_COMMAND || 'npm run build',
     outputDirectory: process.env.DEFAULT_OUTPUT_DIR || 'dist',
     nodeVersion: process.env.DEFAULT_NODE_VERSION || '18.x',
   },
 };
```

Also applies to: 136-153
🧹 Nitpick comments (48)
.claude/settings.local.json (3)
96-96: 30s build timeout is likely too short; consider relaxing or wildcarding the command.

Bun/Next/Vite builds can exceed 30 seconds under CI load. An overly strict, exact-match allow entry may cause flakiness or future edits to be blocked.
Apply one of the following:
```diff
- "Bash(timeout 30 bun run build)",
+ "Bash(timeout 180 bun run build)",
```

Or make the permission resilient to minor flag changes:

```diff
- "Bash(timeout 30 bun run build)",
+ "Bash(timeout * bun run build:*)",
```

Please confirm which pattern the permissions parser supports (exact match vs. globbing within parentheses and after), and choose accordingly.
97-97: Allowing ripgrep is useful; scope it to safe defaults to reduce noisy output and accidental secret exposure.

Unconstrained `rg:*` can traverse large trees and print entire files (including .env-like files) if invoked carelessly. Since `grep:*` already exists, adding `rg` is consistent, but we can tame defaults.

Consider constraining to typical, read-only search flags and excluding common heavy/secret paths:

```diff
- "Bash(rg:*)"
+ "Bash(rg -n -C3 --hidden --max-columns 200 --glob '!**/node_modules/**' --glob '!**/.git/**' --glob '!.env*':*)"
```

If the permissions grammar doesn’t support embedded flags or globs, keep `Bash(rg:*)` and add a deny rule for `.env*` and VCS dirs, or rely on repo conventions. Please confirm the grammar and we can adjust.
1-3: Sanity check: is tracking a local Claude settings file intentional?

File name suggests a local, tool-specific config. If that’s by design for this repo, all good; otherwise consider ignoring `.claude/*.local.json` via VCS to avoid leaking tool policy changes into PRs.

api/success.ts (6)
19-24: Auth header normalization is correct; consider centralizing to avoid duplication across API routes.

Good handling of Node’s `IncomingHttpHeaders` union (`string | string[] | undefined`). To keep this consistent across routes, consider a tiny shared helper like `getAuthorizationHeader(req)` in `api/_utils/auth.ts` and reuse it here and elsewhere.
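A sketch of that suggested helper — the name `getAuthorizationHeader` comes from the comment above, but the implementation here is an assumption, not code from the PR:

```typescript
// Normalize Node's string | string[] | undefined header union into a
// single string (or undefined), taking the first value when the client
// sent the header more than once.
type HeaderValue = string | string[] | undefined;

function getAuthorizationHeader(req: { headers: Record<string, HeaderValue> }): string | undefined {
  const raw = req.headers["authorization"];
  if (Array.isArray(raw)) return raw[0];
  return raw;
}
```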
37-41: Harden the internal fetch: add Accept header, timeout, and robust URL join.

Small hardening to make this call more resilient and type-friendly:

- Add `Accept: application/json` to document expectations.
- Add a fetch timeout to avoid hanging executions (especially relevant on serverless).
- Build the URL via `new URL()` to avoid accidental double slashes.

Apply:

```diff
-  const baseUrl = process.env.PUBLIC_ORIGIN || 'http://localhost:3000';
-  const response = await fetch(`${baseUrl}/api/get-subscription`, {
-    headers: { Authorization: authorization },
-  });
+  const baseUrl = process.env.PUBLIC_ORIGIN || 'http://localhost:3000';
+  const url = new URL('/api/get-subscription', baseUrl).toString();
+  const controller = new AbortController();
+  const timeout = setTimeout(() => controller.abort(), 10_000); // 10s
+  const response = await fetch(url, {
+    method: 'GET',
+    headers: {
+      Authorization: authorization,
+      Accept: 'application/json',
+    },
+    signal: controller.signal,
+  });
+  clearTimeout(timeout);
```
42-56: Confirm intent: returning 200 on subscription fetch failure may mask issues; at least signal fallback in the payload.

If this is deliberate to keep UX smooth, consider explicit signaling so the client/UI can reflect the degraded state and optionally trigger a retry.

Apply:

```diff
-    // Return a basic success response even if subscription fetch fails
+    // Return a basic success response even if subscription fetch fails
+    // Signal fallback so the client can react (e.g., retry, banner)
     return res.status(200).json({
       success: true,
       message: 'Success endpoint called, subscription sync may be delayed',
       planId: 'free',
-      status: 'none'
+      status: 'none',
+      fallback: true
     });
```

Optionally, treat network errors in the catch block the same way (see a suggested change on Lines 67-78).
Is the 200-on-failure behavior a product decision? If not, a 502 with a client-visible retry path might be preferable.
58-65: Parse the JSON defensively and narrow types before use.

`response.json()` returns `unknown` at runtime. Guard the shape to avoid surprises and align with strict typing. No external deps required.

Apply:

```diff
-    const subscription = await response.json();
-
-    return res.status(200).json({
-      success: true,
-      message: 'Subscription status synced successfully',
-      planId: subscription.planId || 'free',
-      status: subscription.status || 'none'
-    });
+    const raw = (await response.json()) as unknown;
+    const { planId, status } = isSubscriptionPayload(raw) ? raw : {};
+    return res.status(200).json({
+      success: true,
+      message: 'Subscription status synced successfully',
+      planId: typeof planId === 'string' && planId ? planId : 'free',
+      status: typeof status === 'string' && status ? status : 'none',
+    });
```

Add this small type guard (place near the bottom of the file or in a shared utils module):

```typescript
function isSubscriptionPayload(x: unknown): x is { planId?: string; status?: string } {
  return !!x && typeof x === 'object' && (
    'planId' in x || 'status' in x
  );
}
```
67-78: Network errors are handled as 500; consider aligning with the non-OK branch and return a marked fallback.

If the intent is “best-effort” sync, match the behavior of the non-OK flow to reduce variance in client handling. You still log the sanitized error for observability.

Apply:

```diff
   } catch (error) {
     logSanitizedError('Success endpoint error', error instanceof Error ? error : new Error(String(error)), {
       method: req.method,
       url: req.url,
       hasAuth: !!authorization
     });
-
-    return res.status(500).json({
-      error: 'Internal server error',
-      message: 'Failed to process success callback'
-    });
+    // Degraded mode: return a marked fallback like the non-OK branch above
+    return res.status(200).json({
+      success: true,
+      message: 'Success endpoint processed with degraded subscription sync',
+      planId: 'free',
+      status: 'none',
+      fallback: true
+    });
   }
```
5-5: Add an explicit return type to the route handler.

To satisfy “explicit return types for public APIs” without refactoring the early `return res.status().json(...)` lines, annotate as `Promise<VercelResponse | void>`.

Apply:

```diff
-export default async function handler(req: VercelRequest, res: VercelResponse) {
+export default async function handler(req: VercelRequest, res: VercelResponse): Promise<VercelResponse | void> {
```

api/secret-chat.ts (2)
27-27: Potential type compatibility issue with Headers constructor

The casting of `req.headers` to `Record<string, string>` may cause runtime issues since Vercel request headers can contain `string | string[]` values. The Headers constructor expects `string | string[]` for values, but the cast forces all values to be strings only.

Consider removing the type assertion to allow proper header handling:

```diff
-  const authResult = await verifyAuth({ headers: new Headers(req.headers as Record<string, string>) });
+  const authResult = await verifyAuth({ headers: new Headers(req.headers) });
```

If the `verifyAuth` function specifically requires the current signature, consider updating it to handle the proper header types instead.
**85-87**: Prefer nullish coalescing for usage properties

While `|| 0` works, the nullish coalescing operator `??` is more consistent with modern TypeScript patterns and clearer about the intent (only `null`/`undefined` fall back to 0, not other falsy values).

```diff
   usage: {
-    promptTokens: usage?.inputTokens || 0,
-    completionTokens: usage?.outputTokens || 0,
-    totalTokens: usage?.totalTokens || 0,
+    promptTokens: usage?.inputTokens ?? 0,
+    completionTokens: usage?.outputTokens ?? 0,
+    totalTokens: usage?.totalTokens ?? 0,
   },
```

API-SERVER-README.md (1)
**19-21**: Nit: tighten bullet punctuation and parallelism for the two new items

Make the two bullets grammatically parallel and end both with periods (or neither) to match the surrounding list style. The current wording triggers lint/grammar tools.

```diff
- - **Request Timeout**: Configurable timeout protection (default: 30s)
- - **Graceful Shutdown**: Clean shutdown with analytics reporting
+ - **Request timeout**: Configurable timeout protection (default: 30s).
+ - **Graceful shutdown**: Clean shutdown with analytics reporting.
```

If you prefer Title Case, apply it consistently to all bullets in the section instead.
src/utils/security.ts (1)
**28-43**: Optional: avoid trimming in sanitizeText or make it opt-in

Trimming may surprise users by removing intentional leading/trailing spaces. Consider an optional parameter to control trimming, defaulting to false in non-UI contexts.

```diff
-export const sanitizeText = (text: string): string => {
+export const sanitizeText = (text: string, opts: { trim?: boolean } = {}): string => {
   return text
     .replace(/[<>'"&]/g, (char) => {
       switch (char) {
         case '<': return '&lt;';
         case '>': return '&gt;';
         case "'": return '&#39;';
         case '"': return '&quot;';
         case '&': return '&amp;';
         default: return char;
       }
     })
-    .trim();
+    [opts.trim ? 'trim' : 'toString']();
 };
```

Call sites can then enable trimming where appropriate.
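For reference, a runnable sketch of the opt-in-trim variant (the entity map is an assumption mirroring the escaper above; align it with the real `sanitizeText` before adopting):

```typescript
// Sketch: sanitizeText with opt-in trimming (trim defaults to false).
// The entity map below is assumed to mirror the original escaper.
const sanitizeText = (text: string, opts: { trim?: boolean } = {}): string => {
  const escaped = text.replace(/[<>'"&]/g, (char) => {
    switch (char) {
      case '<': return '&lt;';
      case '>': return '&gt;';
      case "'": return '&#39;';
      case '"': return '&quot;';
      case '&': return '&amp;';
      default: return char;
    }
  });
  return opts.trim ? escaped.trim() : escaped;
};

console.log(sanitizeText('  <b>hi</b>  '));                 // leading/trailing spaces preserved
console.log(sanitizeText('  <b>hi</b>  ', { trim: true })); // trimmed only on request
```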
Dockerfile.dev (2)
**29-30**: Dev UX: consider a non-root user to avoid host permission issues when mounting volumes

Running as root in dev can cause file-ownership friction on bind mounts. The Bun image typically includes a `bun` user.

```diff
 # Start the development server with hot reloading
-CMD ["bun", "run", "dev:api"]
+USER bun
+CMD ["bun", "run", "dev:api"]
```

If the image variant lacks the user, you can add one:

```diff
+RUN adduser -D -u 10001 appuser && chown -R appuser:appuser /app
+USER appuser
```
**11-14**: Lockfile name verified – `bun.lock` is correct

The repository contains `bun.lock` (no `bun.lockb`), so the Dockerfile.dev line remains valid.

- File affected: `Dockerfile.dev` (lines 11–14)
- No change needed for the lockfile name
- Optional improvement: ensure reproducible installs by pinning the lockfile during install:

```dockerfile
RUN bun install --frozen-lockfile
```

api/create-checkout-session.ts (1)
**24-24**: Fix type assertions – `requestOrigin` is already validated

The type assertions `requestOrigin as string` are unnecessary and potentially unsafe. The `isOriginAllowed` helper already validates that `requestOrigin` exists before returning `true`, so when we reach these lines, `requestOrigin` is guaranteed to be a string.

Apply this diff to remove the unnecessary type assertions:

```diff
- res.setHeader('Access-Control-Allow-Origin', requestOrigin as string);
+ res.setHeader('Access-Control-Allow-Origin', requestOrigin!);
```

The non-null assertion `!` is safer here since we know from the `isOriginAllowed` check that `requestOrigin` is defined.

Also applies to: 36-36
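An alternative that avoids both `as string` and `!` entirely is to shape the helper as a type guard, so the compiler narrows `requestOrigin` itself. A minimal sketch, assuming a Set-based allowlist (the values and helper names here are illustrative, not the project's actual code):

```typescript
// Sketch: an allowlist check written as a type guard, so TypeScript narrows
// `origin` to `string` at the call site — no assertion needed.
const ALLOWED_ORIGINS = new Set(['https://zapdev.link', 'http://localhost:5173']); // hypothetical values

function isOriginAllowed(origin: string | undefined): origin is string {
  return origin !== undefined && ALLOWED_ORIGINS.has(origin);
}

function corsOriginHeader(origin: string | undefined): string | null {
  if (isOriginAllowed(origin)) {
    return origin; // narrowed to string here
  }
  return null; // caller skips the Access-Control-Allow-Origin header
}

console.log(corsOriginHeader('https://zapdev.link')); // "https://zapdev.link"
console.log(corsOriginHeader(undefined));             // null
```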
src/components/auth/EnhancedSignUp.tsx (4)
**14-20**: Type the component as React.FC and keep props explicit

Guidelines require proper React component typing. Switch to `React.FC<EnhancedSignUpProps>` and keep the explicit prop type. The diff above applies this change.
**62-107**: Add a completion callback to PrivacyConsentStep (if supported) to advance the flow

If `PrivacyConsentStep` supports an `onComplete`/`onAccept` prop, pass a callback that flips `step` and triggers the signup. Otherwise, the `useEffect` approach in the earlier diff guarantees progress without needing a prop. If `PrivacyConsentStep` does expose a callback, wire it for clearer intent.
**109-111**: Avoid returning null; provide loading feedback while the modal opens

Per UI guidelines, show loading feedback during async transitions. Render a lightweight placeholder (skeleton or spinner) when not showing the consent step and before Clerk's modal appears.

Apply this minimal placeholder:

```diff
-  // Once privacy consent is handled, Clerk modal will handle the rest
-  return null;
+  // Once privacy consent is handled, Clerk modal will handle the rest
+  return (
+    <div className="flex items-center justify-center py-12 text-sm text-muted-foreground">
+      Preparing signup…
+    </div>
+  );
```
**8-10**: Remove unused import SignUp

`SignUp` is imported but unused. Drop it to keep the bundle lean and silence TS/ESLint warnings.

```diff
-import { SignUp, useClerk } from '@clerk/clerk-react';
+import { useClerk } from '@clerk/clerk-react';
```

src/components/ui/LazyLoader.tsx (3)
**25-51**: Harden IntersectionObserver setup (SSR-safe + proper cleanup)

- If `IntersectionObserver` isn't available (older browsers, some bots), the content will never render. Bail out by marking as loaded.
- Prefer `observer.disconnect()` in cleanup to ensure we drop all targets and GC cleanly.

Apply this diff:

```diff
 useEffect(() => {
-  const observer = new IntersectionObserver(
+  if (typeof window === 'undefined' || !('IntersectionObserver' in window)) {
+    setHasLoaded(true);
+    return;
+  }
+  const observer = new IntersectionObserver(
     ([entry]) => {
       if (entry.isIntersecting && !hasLoaded) {
         setIsIntersecting(true);
         setHasLoaded(true);
         observer.unobserve(entry.target);
       }
     },
     {
       rootMargin,
       threshold,
     }
   );

   const currentRef = ref.current;
   if (currentRef) {
     observer.observe(currentRef);
   }

   return () => {
-    if (currentRef) {
-      observer.unobserve(currentRef);
-    }
+    observer.disconnect();
   };
 }, [rootMargin, threshold, hasLoaded]);
```
**52-62**: Improve accessibility of the fallback

Add `role="status"` and `aria-busy="true"` to communicate loading state to assistive tech. Keep the visual skeleton intact.

```diff
 const defaultFallback = (
   <div
     className={cn(
       "flex items-center justify-center bg-gray-50 animate-pulse",
       className
     )}
     style={{ minHeight }}
+    role="status"
+    aria-busy="true"
   >
     <div className="text-gray-400 text-sm">Loading...</div>
   </div>
 );
```
**21-23**: Consider removing redundant isIntersecting state

`hasLoaded` already gates rendering; `isIntersecting` isn't used elsewhere. Simplify the state to reduce re-renders.

```diff
-  const [isIntersecting, setIsIntersecting] = useState(false);
   const [hasLoaded, setHasLoaded] = useState(false);
@@
   if (entry.isIntersecting && !hasLoaded) {
-    setIsIntersecting(true);
     setHasLoaded(true);
     observer.unobserve(entry.target);
   }
@@
-  {isIntersecting || hasLoaded ? children : (fallback || defaultFallback)}
+  {hasLoaded ? children : (fallback || defaultFallback)}
```

src/components/ResourcePreloader.tsx (3)
**15-21**: Nit: dns-prefetch targets should be origins, not full URLs

Most UA examples use `href="//origin"` or `href="https://origin"` without paths. Your entries are already bare origins:

```ts
const DNS_PREFETCH_DOMAINS = [
  'https://fonts.googleapis.com',
  'https://fonts.gstatic.com',
  'https://api.groq.com',
  'https://api.clerk.dev',
];
```

Additionally, you can skip `dns-prefetch` when using `preconnect` for the same origins (which you already do) to reduce tag noise.
**12-13**: Preloading the Google Fonts stylesheet needs `rel="preload"` + onload swap to be effective

Preloading a stylesheet cross-origin without applying it yields limited benefit. If you intend to apply it, use the standard pattern, or rely on your existing CSS imports and keep only `preconnect`.

Example pattern:

```html
<link rel="preload" as="style" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;600&display=swap" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Inter:wght@400;600&display=swap"></noscript>
```
**65-72**: Use the PWA plugin's registration helper instead of manual SW registration

Your project is already configured with `vite-plugin-pwa` in `vite.config.ts` (lines 51–56), which:

- Generates `sw.js` in production builds
- Injects a manifest into `sw-custom.js` in development
- Uses `injectRegister: 'auto'` and `registerType: 'autoUpdate'`

Manually calling `navigator.serviceWorker.register('/sw.js')` in `src/components/ResourcePreloader.tsx` always points at `/sw.js`, even in dev—resulting in 404s for `sw-custom.js`. Instead, let the plugin handle registration and updates via its virtual helper:

- Remove the manual block in `ResourcePreloader.tsx` (around lines 65–72)
- Add at the top of your app (e.g. in `main.tsx`):

```ts
import { registerSW } from 'virtual:pwa-register'

const updateSW = registerSW({
  immediate: true,
  onNeedRefresh() { /* prompt user to reload */ },
  onOfflineReady() { /* show offline indicator */ },
})
```

This ensures the correct file is registered (whether `sw.js` or `sw-custom.js`), handles auto-updates, and keeps your registration logic in sync with your Vite PWA config.

src/components/ui/HeroImagePreloader.tsx (2)
**28-35**: Use window.requestIdleCallback explicitly to satisfy TS and avoid global name shadowing

Access the method via `window` and provide a minimal timeout budget.

```diff
- if ('requestIdleCallback' in window) {
-   requestIdleCallback(preloadImages);
+ if ('requestIdleCallback' in window && typeof window.requestIdleCallback === 'function') {
+   window.requestIdleCallback(preloadImages, { timeout: 2000 });
  } else {
    // Fallback for browsers without requestIdleCallback
    setTimeout(preloadImages, 1);
  }
```
**4-9**: Consider whether preloading favicon.svg is necessary

Favicons are typically handled via `<link rel="icon">` and cached; preloading them may be redundant. Keep hero/LCP images in the list and drop non-critical assets.

src/components/ui/CriticalCSS.tsx (2)
**93-101**: Preload both font weights you declare as critical

You embed the 400 and 600 weights but only preload 400. Preloading both avoids FOIT when the bold variant first appears.

```diff
 // Preload critical resources
 const preloadLink = document.createElement('link');
 preloadLink.rel = 'preload';
 preloadLink.href = '/fonts/inter-v12-latin-regular.woff2';
 preloadLink.as = 'font';
 preloadLink.type = 'font/woff2';
 preloadLink.crossOrigin = 'anonymous';
 document.head.appendChild(preloadLink);
+
+const preloadLink600 = document.createElement('link');
+preloadLink600.rel = 'preload';
+preloadLink600.href = '/fonts/inter-v12-latin-600.woff2';
+preloadLink600.as = 'font';
+preloadLink600.type = 'font/woff2';
+preloadLink600.crossOrigin = 'anonymous';
+document.head.appendChild(preloadLink600);
```
**85-92**: Guard against duplicate injections across hot reloads

Add an existence check before injecting the style or preloads to avoid duplicates during HMR and navigation.

```diff
- const style = document.createElement('style');
- style.textContent = criticalCSS;
- style.id = 'critical-css';
- document.head.insertBefore(style, document.head.firstChild);
+ if (!document.getElementById('critical-css')) {
+   const style = document.createElement('style');
+   style.textContent = criticalCSS;
+   style.id = 'critical-css';
+   document.head.insertBefore(style, document.head.firstChild);
+ }
```

Also applies to: 102-109
src/components/LazyComponents.tsx (2)
**64-68**: Default export doesn't include LazyEnhancedChatInterface

The default export object is missing `LazyEnhancedChatInterface`, which is exported individually but not included in the default export. This could be an oversight.

Consider adding it to maintain consistency:

```diff
 export default {
+  LazyEnhancedChatInterface,
   SuspendedWebsiteCloneDialog,
   SuspendedSmartPrompts,
   SuspendedLivePreview,
 };
```
**24-31**: Extract `OptimizedFallback` into its own reusable component

The `OptimizedFallback` definition is currently inlined within `src/components/LazyComponents.tsx` (lines 24–31) and duplicated across multiple `<Suspense>` fallbacks. To improve maintainability and enable reuse, extract it into a standalone file and import it where needed.

Key changes:

- Create a new file `src/components/ui/OptimizedFallback.tsx` with the exported component:

```tsx
// src/components/ui/OptimizedFallback.tsx
import React from 'react';
import { cn } from '@/lib/utils';

export const OptimizedFallback = ({
  className,
  children,
}: {
  className?: string;
  children?: React.ReactNode;
}) => (
  <div className={cn("animate-pulse", className)}>
    <div className="h-8 bg-gray-200 rounded w-full mb-4"></div>
    <div className="h-4 bg-gray-200 rounded w-3/4 mb-2"></div>
    <div className="h-4 bg-gray-200 rounded w-1/2"></div>
    {children}
  </div>
);
```

- In `src/components/LazyComponents.tsx`, remove the inline `OptimizedFallback` and add:

```diff
-const OptimizedFallback = ({ className, children }: { className?: string; children?: React.ReactNode }) => (…);
+import { OptimizedFallback } from '@/components/ui/OptimizedFallback';
```

- Update all `<Suspense>` fallbacks to use the imported component:

```tsx
<Suspense fallback={<OptimizedFallback className="min-h-[200px]" />} />
<Suspense fallback={<OptimizedFallback className="min-h-[150px]" />} />
<Suspense fallback={<OptimizedFallback className="min-h-[300px]" />} />
```

This refactoring centralizes the fallback UI, reduces duplication, and makes future style tweaks or enhancements trivial.
src/hooks/usePerformanceMonitoring.ts (1)
48-51: Unused error parameter in catch blockThe error parameter in the catch block is not being used or logged, which could make debugging harder.
Consider logging the error in development mode:
```diff
 } catch (error) {
+  if (process.env.NODE_ENV === 'development') {
+    console.warn('Failed to load web-vitals library:', error);
+  }
   // web-vitals library not available, use Performance API
   measureWithPerformanceAPI();
 }
```

src/components/ui/OptimizedImage.tsx (1)
**84-98**: Improve image format handling logic

The current check `src.includes('lovable-uploads')` on lines 87 and 94 is fragile and makes assumptions about URL patterns. This could break if the CDN changes or for other image sources.

Consider a more robust approach to determine whether format conversion is supported:

```diff
+ // Check if the source supports format conversion
+ const supportsFormatConversion = useCallback((url: string) => {
+   // External URLs or specific CDNs might not support format conversion
+   const unsupportedPatterns = ['lovable-uploads', 'githubusercontent', 'unsplash'];
+   return !unsupportedPatterns.some(pattern => url.includes(pattern));
+ }, []);

  return (
    <div className={cn("relative overflow-hidden", className)}>
      {/* ... */}
      <picture>
        {/* AVIF format for modern browsers */}
        <source
-         srcSet={src.includes('lovable-uploads') ? src : srcSet.replace('webp', 'avif')}
+         srcSet={supportsFormatConversion(src) ? avifSrc : src}
          type="image/avif"
          sizes={sizes}
        />
        {/* WebP format */}
        <source
-         srcSet={src.includes('lovable-uploads') ? src : srcSet}
+         srcSet={supportsFormatConversion(src) ? srcSet : src}
          type="image/webp"
          sizes={sizes}
        />
```

k8s-deployment.yaml (1)
**148-149**: Security: Review rate limiting configuration

The Ingress rate limit is set to 1000 requests per 1-minute window, which might be too permissive for a production API. Consider adjusting based on your actual traffic patterns and security requirements.

Consider more conservative rate limiting:

```diff
- nginx.ingress.kubernetes.io/rate-limit: "1000"
- nginx.ingress.kubernetes.io/rate-limit-window: "1m"
+ nginx.ingress.kubernetes.io/rate-limit: "100"
+ nginx.ingress.kubernetes.io/rate-limit-window: "1m"
+ # Add burst handling for legitimate traffic spikes
+ nginx.ingress.kubernetes.io/limit-burst-multiplier: "2"
```

lib/deployment/vercel.ts (1)
**600-602**: Consider edge cases in Git URL normalization

While the URL normalization handles common cases well, it might fail for URLs with ports (e.g., `git@github.com:22:owner/repo.git`).

Add handling for port numbers in the SCP-like format:

```diff
 const normalized = sanitized.startsWith('git@')
-  // Convert scp-like git@host:owner/repo.git to ssh://git@host/owner/repo.git for URL parsing
-  ? sanitized.replace(/^git@/, 'ssh://git@').replace(':', '/')
+  // Convert scp-like git@host:owner/repo.git to ssh://git@host/owner/repo.git for URL parsing
+  // Handle optional port: git@host:port:owner/repo.git -> ssh://git@host:port/owner/repo.git
+  ? sanitized.replace(/^git@([^:]+):(\d+:)?/, (_, host, port) =>
+      port ? `ssh://git@${host}:${port.slice(0, -1)}/` : `ssh://git@${host}/`)
   : sanitized;
```

api/domains.ts (1)
**496-504**: Consider implementing persistent storage for subdomain availability

The `checkSubdomainAvailability` function currently only checks against reserved subdomains. For production use, this should check against actually registered subdomains.

The TODO comments indicate this needs a proper implementation. Would you like me to help implement a database-backed availability check or create an issue to track this enhancement?

Consider adding a database check:

```ts
async function checkSubdomainAvailability(subdomain: string): Promise<boolean> {
  // Check reserved subdomains
  if (RESERVED_SUBDOMAINS.includes(subdomain.toLowerCase())) {
    return false;
  }

  // TODO: Check database for registered subdomains
  // const existing = await db.subdomains.findOne({ subdomain: subdomain.toLowerCase() });
  // return !existing;

  // TODO: Query DNS to verify subdomain doesn't already exist
  // const dnsExists = await checkDNSRecord(`${subdomain}.zapdev.link`);
  // return !dnsExists;

  return true;
}
```

lib/deployment/manager.ts (3)
**335-337**: Avoid hard-coding base domain in analytics properties

`domain.replace('.zapdev.link', '')` bakes in the old base domain and will break if `this.config.baseDomain` changes.

Use the configured base domain instead:

```diff
- subdomain: domain.replace('.zapdev.link', ''),
+ subdomain: domain.endsWith(`.${this.config.baseDomain}`)
+   ? domain.slice(0, -(`.${this.config.baseDomain}`).length)
+   : domain,
```
**493-507**: Prefer platform-returned DNS records over placeholders

`createDNSInstructions` unconditionally adds canned CNAME records and then appends platform-provided `dnsRecords`. That can produce conflicting instructions (duplicate/incorrect targets).

- If `platformDnsRecords` exist, use only those.
- Otherwise, fall back to the generic guidance.

```diff
- if (platform === 'netlify') {
-   instructions.push({
-     type: 'CNAME',
-     name: subdomain,
-     value: 'your-netlify-site.netlify.app',
-     description: `Point ${subdomain}.zapdev.link to your Netlify site`
-   });
- } else if (platform === 'vercel') {
-   instructions.push({
-     type: 'CNAME',
-     name: subdomain,
-     value: 'cname.vercel-dns.com',
-     description: `Point ${subdomain}.zapdev.link to Vercel's CDN`
-   });
- }
-
- // Add platform-specific DNS records if available
- if (platformDnsRecords) {
+ // Prefer authoritative records from the platform when available
+ if (platformDnsRecords?.length) {
    platformDnsRecords.forEach(record => {
      instructions.push({
        type: record.type,
        name: record.name,
        value: record.value,
        description: `${platform} DNS record: ${record.type} record for ${record.name}`
      });
    });
+ } else {
+   // Fallback generic guidance
+   if (platform === 'netlify') {
+     instructions.push({
+       type: 'CNAME',
+       name: subdomain,
+       value: 'your-netlify-site.netlify.app',
+       description: `Point ${subdomain}.${this.config.baseDomain} to your Netlify site`
+     });
+   } else if (platform === 'vercel') {
+     instructions.push({
+       type: 'CNAME',
+       name: subdomain,
+       value: 'cname.vercel-dns.com',
+       description: `Point ${subdomain}.${this.config.baseDomain} to Vercel's CDN`
+     });
+   }
+ }
```

Also applies to: 509-519
**255-261**: Analytics property "project_name" is misleading (carries ID)

You're sending `project_name: projectId || 'unknown'`. That's not a name and will skew analytics.

Either remove `project_name` or rename it to `project_identifier` (and keep `project_id` as-is). If you intend to send a human-readable name, plumb it through the method signature.

Also applies to: 281-289
DEPLOYMENT-DOCKER-K8S.md (1)
**24-37**: Docker examples are solid; minor polish suggestions

- Prefer the modern CLI: `docker compose` over `docker-compose`.
- Consider noting that the healthcheck relies on curl in the image (see docker-compose.yml comment).

Also applies to: 41-68, 90-94
docker-compose.yml (2)
**25-29**: Healthcheck may fail if the image lacks curl

Many minimal base images don't ship curl by default. If your Dockerfile doesn't add it, healthchecks will flap.

Two options:

- Install curl in the image (recommended for ops parity), or
- Use CMD-SHELL and a POSIX sh + wget (if available).

Example:

```diff
- healthcheck:
-   test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
+ healthcheck:
+   test: ["CMD-SHELL", "curl -fsS http://localhost:3000/health >/dev/null || exit 1"]
    interval: 30s
    timeout: 10s
    retries: 3
    start_period: 40s
```

If `curl` is not available, switch to `wget -qO-` accordingly.
**21-23**: Log volume can hit permissions issues with non-root containers

Mounting `./logs:/app/logs` is good, but if the container runs as a non-root user (recommended), ensure the host directory exists and is writable.

Document or create the directory with appropriate ownership, or mount a named volume and chown it in the entrypoint.
api/deploy.ts (5)
**488-491**: Use the configured base domain, not a literal

``const domain = `${subdomain}.zapdev.link` `` duplicates configuration and can drift.

Refer to `deploymentConfig.baseDomain`:

```diff
- const domain = `${subdomain}.zapdev.link`;
+ const domain = `${subdomain}.${deploymentConfig.baseDomain}`;
```
**31-59**: Don't swallow analytics errors twice; let the manager log

`PostHogAnalytics.track` catches and suppresses errors, and later the manager also wraps `.track` with its own try/catch. The inner catch prevents the manager from logging failures.

Remove the inner try/catch or at least rethrow. Example:

```diff
- try {
-   const payload = { /* ... */ };
-   await fetch(`${this.host}/capture/`, { /* ... */ });
- } catch {
-   // Silently fail analytics
- }
+ const payload = { /* ... */ };
+ await fetch(`${this.host}/capture/`, { /* ... */ });
```
**330-336**: Avoid logging raw client IP (PII) in routine INFO logs

Storing IPs can introduce privacy obligations. Unless you have a DPA/legal basis, prefer masking or hashing.

Mask the last octet (`a.b.c.xxx`), or hash via a stable salt before logging.
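As a sketch of the masking/hashing idea (the helper name and the `LOG_IP_SALT` env var are assumptions for illustration, not existing project code):

```typescript
// Sketch: mask the last IPv4 octet; for anything else (e.g. IPv6),
// log a truncated salted hash instead of the raw address.
import { createHash } from 'node:crypto';

function maskIpForLogs(ip: string, salt = process.env.LOG_IP_SALT ?? 'dev-salt'): string {
  const octets = ip.split('.');
  if (octets.length === 4 && octets.every((o) => /^\d{1,3}$/.test(o))) {
    return `${octets[0]}.${octets[1]}.${octets[2]}.xxx`; // keep network prefix, drop host octet
  }
  return createHash('sha256').update(salt + ip).digest('hex').slice(0, 12);
}

console.log(maskIpForLogs('203.0.113.42')); // "203.0.113.xxx"
```

A stable salted hash still lets you correlate requests from the same client in logs without storing the raw address.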
**407-421**: Default subdomain generation may violate your own validation

When `subdomain` is omitted, you derive it from `projectName` without trimming leading/trailing hyphens. The manager will reject invalid cases later, causing a 400. You can fail fast and give a clearer message.

Sanitize further:

```diff
- subdomain: subdomain || projectName.toLowerCase().replace(/[^a-z0-9-]/g, '-'),
+ subdomain: subdomain || projectName
+   .toLowerCase()
+   .replace(/[^a-z0-9-]/g, '-')
+   .replace(/^-+/, '')
+   .replace(/-+$/, '')
+   .slice(0, 63),
```
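The sanitization steps above can be sketched as a small standalone helper (the function name is illustrative; the 63-character cap matches the DNS label length limit):

```typescript
// Sketch: derive a DNS-safe subdomain from a project name.
function deriveSubdomain(projectName: string): string {
  return projectName
    .toLowerCase()
    .replace(/[^a-z0-9-]/g, '-') // collapse disallowed chars to hyphens
    .replace(/^-+/, '')          // strip leading hyphens
    .replace(/-+$/, '')          // strip trailing hyphens
    .slice(0, 63);               // DNS label length limit
}

console.log(deriveSubdomain('My Cool App!')); // "my-cool-app"
```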
**8-17**: Standardize ESM import extensions across API and lib

The library files in `lib/deployment/*.ts` consistently import their own modules with the `.js` extension (e.g. `import … from './types.js'`), whereas the API endpoints in `api/*.ts` omit it. Under NodeNext's strict ESM resolution, these mixed imports will break at runtime.

Please update the API imports to match the `.js` extensions used in the libs:

api/domains.ts

- Line 9:

```diff
-import { ZapdevDeploymentManager } from '../lib/deployment/manager';
+import { ZapdevDeploymentManager } from '../lib/deployment/manager.js';
```

- Line 16:

```diff
-} from '../lib/deployment/types';
+} from '../lib/deployment/types.js';
```

api/deploy.ts

- Line 9:

```diff
-import { ZapdevDeploymentManager } from '../lib/deployment/manager';
+import { ZapdevDeploymentManager } from '../lib/deployment/manager.js';
```

- Line 17:

```diff
-} from '../lib/deployment/types';
+} from '../lib/deployment/types.js';
```

Adopting this change will ensure consistent module resolution and prevent build failures under ESM/NodeNext.
Claude finished @otdoges's task — Code Review: Fix Vercel Deployment Build Errors

I've completed a comprehensive review of this PR. While the changes successfully address the Vercel deployment build errors, I've identified several security concerns and code quality issues that should be addressed before merging.

Review Summary

✅ Positive Aspects:
🚨 Critical Security Issues:

1. Environment Validation Logic (api/deploy.ts:94-100)

```ts
// PROBLEMATIC: Requires BOTH tokens even when only one platform is needed
if (!netlifyAccessToken) missingVars.push('NETLIFY_ACCESS_TOKEN');
if (!vercelAccessToken) missingVars.push('VERCEL_ACCESS_TOKEN');
```

Issue: The validation fails unless BOTH Netlify and Vercel tokens are present, but the deployment manager can work with either platform.

2. Eager Environment Validation (api/deploy.ts:123)

```ts
const validatedEnv = validateEnvironmentVariables(); // At module load!
```

Issue: This will crash the entire endpoint if tokens are missing, preventing even OPTIONS requests.

3. Duplicate Secret Storage (api/deploy.ts:126-135)

```ts
interface ZapdevDeploymentConfigWithSecrets extends ZapdevDeploymentConfig {
  netlify: { accessToken: string; teamId?: string; };
  vercel: { accessToken: string; teamId?: string; };
}
```

Issue: Access tokens are stored in both places.

4. Type Assertion Issues

api/create-checkout-session.ts:24, 36:

```ts
res.setHeader('Access-Control-Allow-Origin', requestOrigin as string);
```

Issue: Unsafe type assertion when `requestOrigin` may be undefined.

api/secret-chat.ts:27:

```ts
const authResult = await verifyAuth({ headers: new Headers(req.headers as Record<string, string>) });
```

Issue: Forces headers to string-only when they can be string arrays.

api/hono-trpc.ts:34-36:

```ts
createContext: async (opts: FetchCreateContextFnOptions, c) => {
  const context = await createContext({ req: opts.req });
  return context as Record<string, unknown>; // Unnecessary cast
},
```

Issue: Unused parameter `c` and an unnecessary cast.

5. Duplicate Authorization Header Processing (api/success.ts:19-31)

```ts
// Lines 19-23: First processing
const rawAuthHeader = Array.isArray(req.headers['authorization'])
  ? req.headers['authorization'][0] : req.headers['authorization'];

// Lines 28-32: Identical duplicate processing
const rawAuthHeader = Array.isArray(req.headers['authorization'])
  ? req.headers['authorization'][0] : req.headers['authorization'];
```

Issue: Code duplication that suggests a copy-paste error.

6. Hard-coded Domain References

api/deploy.ts:489, lib/deployment/manager.ts:336:

```ts
const domain = `${subdomain}.zapdev.link`; // Hard-coded!
subdomain: domain.replace('.zapdev.link', ''), // Hard-coded!
```

Issue: Should use `deploymentConfig.baseDomain` instead of the literal.

Performance & Code Quality Issues

7. Incomplete Fetch in Analytics (api/deploy.ts:56)

```ts
await fetch(`${this.host}/capture/`, { /* ... */ }); // Missing catch!
```

Issue: The `fetch` call has no error handling (see the `// Missing catch!` marker).

8. Inconsistent ESM Import Extensions

The codebase mixes imports with and without the `.js` extension.

Fix: Add `.js` extensions to the API imports.

Minor Issues
Recommendations
Overall Assessment: The PR fixes the immediate build issues but introduces several security and maintainability concerns that should be addressed before production deployment.
Summary
Test plan
🤖 Generated with Claude Code
Summary by CodeRabbit
New Features
Bug Fixes
Performance