Conversation
…d rate limiting

- Add clustering support based on available CPU cores and environment settings
- Integrate PostHog analytics for API request and server metrics tracking
- Implement rate limiting with IP validation and bounded in-memory storage (see the sketch after this commit message)
- Enhance VercelRequest and VercelResponse interfaces with robust parsing and security headers
- Improve CORS handling with origin allowlists and credential support
- Validate and sanitize API endpoint paths to prevent directory traversal attacks
- Add request body size limit and enforce request timeout handling
- Provide structured logging for requests, responses, errors, and server lifecycle events
- Add health endpoint with uptime, metrics, environment, and version info
- Support graceful shutdown with analytics capture on termination signals
- Update create-checkout-session API with stricter CORS origin checks and OPTIONS method handling
- Refine hono-polar API subscription syncing with date object conversions and improved checkout flow
- Enhance secret-chat API error handling with detailed status codes and messages
- Update service worker cache revision for production deployment
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
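As a minimal sketch of the bounded in-memory rate limiting this commit describes (the Map-based store, window length, and caps here are illustrative assumptions, not the PR's actual code):

```ts
// Sketch: fixed-window rate limiter with a bounded store so memory cannot
// grow without limit. WINDOW_MS, MAX_REQUESTS, MAX_TRACKED_IPS are illustrative.
const WINDOW_MS = 60_000;
const MAX_REQUESTS = 100;
const MAX_TRACKED_IPS = 10_000;

const hits = new Map<string, { count: number; resetAt: number }>();

export function allowRequest(ip: string, now = Date.now()): boolean {
  const entry = hits.get(ip);
  if (!entry || now >= entry.resetAt) {
    // Bound memory: evict the oldest tracked IP before adding a new one
    if (!entry && hits.size >= MAX_TRACKED_IPS) {
      const oldest = hits.keys().next().value;
      if (oldest !== undefined) hits.delete(oldest);
    }
    hits.set(ip, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_REQUESTS;
}
```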
High Priority Fixes:
- Replace vulnerable regex patterns in IP validation with safe string operations
- Secure cookie parsing with Object.create(null) to prevent prototype pollution (sketch below)
- Enhanced file system operations with additional validation layers
- Add PostHog analytics payload size limits (32KB) and comprehensive PII sanitization
- Implement error message sanitization to prevent information leakage

Security Improvements:
- Safe IPv4/IPv6 validation without regex DoS vulnerability
- Cookie name/value validation with length limits and safe patterns
- Multi-layer path traversal protection for API endpoint resolution
- PII pattern detection and redaction for analytics
- Development vs production error handling with safe messaging
- ESLint security rule compliance with appropriate exemptions for validated cases

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
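To illustrate the Object.create(null) technique named above, here is a minimal sketch; the parser shape, charset, and length caps are assumptions, not the PR's code:

```ts
// Sketch: parse cookies into a null-prototype object so hostile names like
// "__proto__" or "constructor" cannot pollute Object.prototype.
export function parseCookies(header: string): Record<string, string> {
  const cookies: Record<string, string> = Object.create(null);
  for (const pair of header.split(';')) {
    const idx = pair.indexOf('=');
    if (idx < 0) continue;
    const name = pair.slice(0, idx).trim();
    const value = pair.slice(idx + 1).trim();
    // Illustrative validation: safe character set plus length limits
    if (!/^[\w.-]{1,64}$/.test(name) || value.length > 4096) continue;
    try {
      cookies[name] = decodeURIComponent(value);
    } catch {
      // skip values with malformed percent-encoding
    }
  }
  return cookies;
}
```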
…ration limits

- Updated regex patterns for sanitizing metadata, navigation, images, stylesheets, scripts, fonts, and meta tags to prevent potential vulnerabilities.
- Implemented iteration limits to avoid catastrophic backtracking in regex operations (see the sketch below).
- Added validation checks for extracted URLs and text to ensure safety and compliance with length restrictions.

This commit addresses security concerns and improves the robustness of HTML content extraction.
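A minimal sketch of the iteration-limit pattern this commit describes; the cap value and helper name are illustrative assumptions:

```ts
// Sketch: cap match iterations so a hostile document cannot drive an
// unbounded exec() loop (and bound total regex work) during HTML extraction.
const MAX_ITERATIONS = 1_000; // illustrative cap

export function extractAll(pattern: RegExp, input: string): string[] {
  const out: string[] = [];
  const flags = pattern.flags.includes('g') ? pattern.flags : pattern.flags + 'g';
  const re = new RegExp(pattern.source, flags);
  let match: RegExpExecArray | null;
  let iterations = 0;
  while ((match = re.exec(input)) !== null) {
    if (++iterations > MAX_ITERATIONS) break;       // bail out instead of looping forever
    if (match.index === re.lastIndex) re.lastIndex++; // avoid zero-length-match loops
    out.push(match[0]);
  }
  return out;
}
```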
- Resolved CORS configuration conflict in api-dev-server.ts using secure whitelist approach
- Resolved git provider detection conflict in lib/deployment/netlify.ts using comprehensive URL parsing
- Fixed regex escape character issue in netlify.ts for security compliance

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
**HIGH RISK - CORS Misconfiguration Fixed:**
- Separate trusted origins from allowed origins in api-dev-server.ts
- Only enable credentials for explicitly trusted domains
- Prevent credential hijacking via dynamic origin setting

**MEDIUM RISK - URL Validation Bypass Fixed:**
- Replace vulnerable substring matching with secure hostname validation (see the sketch below)
- Use proper URL parsing to prevent domain spoofing attacks
- Affected files: netlify.ts and vercel.ts deployment services

**MEDIUM RISK - Information Exposure Prevention:**
- Enhanced error sanitization in both development and production modes
- Remove ALL sensitive paths, environment variables, credentials from error messages
- Stricter character limits and complete information sanitization

Security improvements protect against:
- Credential theft via CORS misconfiguration
- Domain spoofing attacks (evil.com/github.com bypasses)
- Internal system information disclosure through error messages

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
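A minimal sketch of hostname validation versus substring matching, using github.com as an illustrative host; this is not the PR's actual implementation:

```ts
// Vulnerable: substring matching lets "https://evil.com/github.com" pass.
const looksLikeGithub = (url: string) => url.includes('github.com');

// Safer sketch: parse the URL and compare the hostname exactly.
export function isGithubUrl(url: string): boolean {
  try {
    const { hostname } = new URL(url);
    return hostname === 'github.com' || hostname.endsWith('.github.com');
  } catch {
    return false; // unparseable URLs are rejected, not guessed at
  }
}
```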
| GitGuardian id | GitGuardian status | Secret | Commit | Filename | |
|---|---|---|---|---|---|
| 20372498 | Triggered | Generic High Entropy Secret | 72993ac | .env.deployment.template | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely. Learn the best practices here.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following these best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Walkthrough

The PR adds comprehensive SEO assets and components, strengthens server error handling, CORS, and IP/rate limiting, enhances HTML parsing/sanitization, introduces performance monitoring utilities, refines git provider detection for deployments, updates public metadata files (robots/sitemaps/humans), and adjusts CLA settings.
Sequence Diagram(s)

```mermaid
sequenceDiagram
autonumber
participant Client
participant Server
participant CORS
participant RateLimit
participant ErrorSanitizer
Client->>Server: HTTP request
Server->>CORS: Validate Origin against CONFIG.CORS_ORIGINS
CORS-->>Server: Allow-Origin header (and conditional Allow-Credentials if trusted)
Server->>RateLimit: validateAndNormalizeIP + check/update counters
alt Over limit
Server-->>Client: 429 with generic message
else OK
Server->>Server: Handle route (10MB body cap)
alt Success
Server-->>Client: 200 JSON
else Error in dev
Server->>ErrorSanitizer: Sanitize stack, paths, secrets
ErrorSanitizer-->>Server: { error, code: DEVELOPMENT_ERROR }
Server-->>Client: 500 sanitized dev error
else Error in prod
Server->>ErrorSanitizer: Match known patterns
ErrorSanitizer-->>Server: { error/code or GENERIC_ERROR }
Server-->>Client: 500 safe message
end
end
```

```mermaid
sequenceDiagram
autonumber
participant App
participant SEO
participant Helmet
participant DOM
App->>SEO: Render with SEOProps
SEO->>SEO: Merge with defaults/config
SEO->>Helmet: Set meta, OG/Twitter, canonical, robots
SEO-->>DOM: Inject head tags
SEO->>DOM: useEffect update title/description/canonical
```

```mermaid
sequenceDiagram
autonumber
participant App
participant PerfOpt
participant PerfObserver
participant Window
App->>PerfOpt: Mount PerformanceOptimizer
PerfOpt->>PerfObserver: Observe FCP/LCP/FID/CLS
PerfOpt->>Window: Set zapdevPerformanceMetrics
PerfObserver-->>PerfOpt: Metrics updates
PerfOpt-->>App: onMetricsUpdate(metrics)
PerfOpt->>DOM: Apply lazy loading, preloads, resource hints
```

```mermaid
sequenceDiagram
autonumber
participant Service
participant Parser
participant URLAPI
participant Regex
Service->>Parser: extractGitProvider(url)
alt URL parse ok
Parser->>URLAPI: new URL(url)
URLAPI-->>Parser: hostname
else Git-style URL
Parser->>Regex: Extract host from git@host:/ssh://
Regex-->>Parser: hostname
end
Parser->>Parser: Map host → provider (gh/gl/bb else github)
Parser-->>Service: provider
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Claude finished @otdoges's task — View job

PR Code Review

I've completed a comprehensive review of this large pull request (2,483 additions, 180 deletions across 15 files). Here are my findings:

Review Tasks
🚨 Critical Security Issues
🎯 SEO Implementation Review

✅ Strengths
Actionable comments posted: 13
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (15)
lib/deployment/netlify.ts (4)
447-455: Type mismatch and potential runtime failure: `deploy.site_id` not in `NetlifyDeploy`
`deleteDeployment` relies on `deploy.site_id`, but `NetlifyDeploy` lacks this field. Under `strict` TS, this won't compile; at runtime, an unexpected shape would silently break deletion.

Fix by extending the interface and guarding the read:
```diff
 interface NetlifyDeploy {
   id: string;
   url: string;
   deploy_url: string;
   admin_url: string;
   state: 'new' | 'building' | 'ready' | 'error' | 'uploading';
   name: string;
   created_at: string;
   build_id?: string;
   error_message?: string;
+  // Present on Netlify deploy objects; required for site deletion
+  site_id?: string;
 }
@@
-    const siteId = deploy.site_id;
+    const siteId = deploy.site_id;
     if (!siteId) {
       return { success: false, error: 'Site ID not found in deployment data' };
     }
```

Also applies to: 21-31
215-221: Missing error handling when fetching latest deploys
`deploysResponse.ok` isn't checked. A 4xx/5xx will cause `.json()` to throw or return an error body, making `latestDeploy` undefined without context.

Add proper checks:
```diff
-      const deploys: NetlifyDeploy[] = await deploysResponse.json();
+      if (!deploysResponse.ok) {
+        const error = await deploysResponse.text();
+        throw new DeploymentError(
+          `Failed to fetch deploys: ${error}`,
+          'netlify',
+          'DEPLOYS_LIST_FAILED'
+        );
+      }
+      const deploys: NetlifyDeploy[] = await deploysResponse.json();
```
549-553: Robust repo path extraction; support nested groups and more URL forms

Current regex only captures two path segments (`owner/repo`) and can misbehave on nested GitLab groups (`group/subgroup/repo`) or odd URL variants. Prefer structured parsing first and SSH fallback:

```diff
-  private extractRepoPath(url: string): string {
-    // Extract owner/repo from git URL
-    const match = url.match(/[:/]([^/]+\/[^/]+?)(?:\.git)?(?:[?#]|$)/);
-    return match?.[1] || url;
-  }
+  private extractRepoPath(url: string): string {
+    // Prefer structured parsing; fallback for git@host:... forms
+    try {
+      const normalized = url.startsWith('git@') ? url.replace(/^git@/, 'ssh://git@') : url;
+      const u = new URL(normalized);
+      // Drop leading slash and .git; keep nested groups if present
+      return u.pathname.replace(/^\/+/, '').replace(/\.git$/i, '');
+    } catch {
+      // git@host:owner/repo(.git)
+      const m = url.match(/^[\w.-]+@[^:]+:(.+?)(?:\.git)?(?:[?#]|$)/);
+      return m?.[1] ?? url;
+    }
+  }
```
129-147: Refactor Netlify deployment to use the digest-based file upload workflow

Netlify's Create a deploy endpoint does not accept raw file contents in the JSON body; it requires a digest mapping file paths to content hashes, followed by separate uploads of whatever files Netlify reports as missing. The current code (lines 129–147 and 180–201) must be updated accordingly.

• lib/deployment/netlify.ts, lines 129–147:
– Remove the direct `files: { path: content }` payload.
– Compute a content hash for each file (Netlify's file digest uses SHA-1) and build a `{ [path]: sha }` map.
– POST this digest to `/sites/{site.id}/deploys`.

• lib/deployment/netlify.ts, lines 180–201:
– Parse the deploy response for the list of required (missing) SHAs.
– Upload each missing file's content to the deploy's file upload endpoint (ideally in parallel).
– Handle errors per upload and only finalize once all files are uploaded.

Please implement the hashing step, adjust the deploy request, and add the subsequent upload logic (a sketch follows below) so that deployments succeed on larger projects.
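A minimal sketch of that flow, assuming Netlify's file digest API (path → SHA-1 map, required hashes uploaded via PUT) and a global `fetch` (Node 18+); the function shape and inputs are illustrative, not the PR's code:

```ts
import { createHash } from 'node:crypto';

// Illustrative inputs: siteId, token, and files as { "/path": content }
async function deployWithDigest(
  siteId: string,
  token: string,
  files: Record<string, string>
): Promise<void> {
  // 1. Build the path -> SHA-1 digest the deploy endpoint expects
  const digest: Record<string, string> = {};
  for (const [path, content] of Object.entries(files)) {
    digest[path] = createHash('sha1').update(content, 'utf8').digest('hex');
  }

  // 2. Create the deploy with the digest only (no raw contents)
  const res = await fetch(`https://api.netlify.com/api/v1/sites/${siteId}/deploys`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ files: digest })
  });
  if (!res.ok) throw new Error(`Deploy creation failed: ${await res.text()}`);
  const deploy = (await res.json()) as { id: string; required?: string[] };

  // 3. Upload only the files whose hashes Netlify reports as missing, in parallel
  const required = new Set(deploy.required ?? []);
  await Promise.all(
    Object.entries(files)
      .filter(([path]) => required.has(digest[path]))
      .map(([path, content]) =>
        fetch(`https://api.netlify.com/api/v1/deploys/${deploy.id}/files${path}`, {
          method: 'PUT',
          headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/octet-stream' },
          body: content
        }).then(r => {
          if (!r.ok) throw new Error(`Upload failed for ${path}`);
        })
      )
  );
}
```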
lib/deployment/vercel.ts (3)
534-536: Query-string bug: `limit` missing when no `teamId`
`/v6/deployments${this.teamQuery}&limit=${limit}` fails if `teamId` is absent (the URL becomes `.../deployments&limit=...`). Build the query string robustly:

```diff
-    const response = await fetch(`${this.baseUrl}/v6/deployments${this.teamQuery}&limit=${limit}`, {
+    const qs = new URLSearchParams();
+    if (this.teamId) qs.set('teamId', this.teamId);
+    qs.set('limit', String(limit));
+    const response = await fetch(`${this.baseUrl}/v6/deployments?${qs.toString()}`, {
       headers: this.headers
     });
```
216-221: Inconsistent provider: `gitSource.type` is hardcoded to `'github'`

When deploying from Git, `gitSource.type` is always `'github'` even for GitLab/Bitbucket URLs. Use the same provider extraction used elsewhere:

```diff
-      gitSource: {
-        type: 'github', // Could be enhanced to support other providers
+      gitSource: {
+        type: this.extractGitProvider(gitRepo.url),
         repo: this.extractRepoPath(gitRepo.url),
         ref: gitRepo.branch || 'main'
       },
```
420-436: Incorrect A record for Vercel apex domains

The A record value should be `76.76.21.21` (Vercel's Anycast IP). Current code hardcodes `76.76.19.61`, which is not the documented address and may break DNS.

Change:

```diff
-        dnsRecords = [{
-          type: 'A',
-          name: config.subdomain,
-          value: '76.76.19.61' // Vercel's A record
-        }];
+        dnsRecords = [{
+          type: 'A',
+          name: config.subdomain,
+          value: '76.76.21.21' // Vercel Anycast A record
+        }];
```

References: Vercel guidance cites `76.76.21.21` for apex A records. See official guides and examples. (examples.vercel.com)

convex/messages.ts (4)
7-7: DOMPurify import is not safe in Convex/server context.

The default DOMPurify package expects a DOM; this server file lacks JSDOM wiring and will crash or balloon bundle size. Use `sanitize-html` (already used elsewhere) for server-side sanitization.

```diff
-import DOMPurify from 'dompurify';
+import sanitizeHtml from 'sanitize-html';
```
20-30: Replace DOMPurify usage with sanitize-html.

Keep a strict whitelist; avoid DOM APIs in server code.
```diff
-const sanitizeHTML = (input: string): string => {
-  // Use DOMPurify for robust XSS protection
-  return DOMPurify.sanitize(input, {
-    ALLOWED_TAGS: ['b', 'i', 'em', 'strong', 'u', 'br', 'p'],
-    ALLOWED_ATTR: [],
-    KEEP_CONTENT: true,
-    RETURN_DOM: false,
-    RETURN_DOM_FRAGMENT: false,
-    SANITIZE_DOM: true,
-  });
-};
+const sanitizeHTML = (input: string): string => {
+  if (typeof input !== 'string') return '';
+  return sanitizeHtml(input, {
+    allowedTags: ['b', 'i', 'em', 'strong', 'u', 'br', 'p'],
+    allowedAttributes: {},
+    disallowedTagsMode: 'discard'
+  });
+};
```
900-907: Avoid DOM `File` type in Convex/server API.

Convex server runtimes may not include DOM lib types. Accept a POJO instead (name/size/type) or a Blob/ArrayBuffer and validate separately.
```diff
-export const validateFileUpload = (
-  file: File,
+type UploadFileMeta = { name: string; size: number; type: string };
+export const validateFileUpload = (
+  file: UploadFileMeta,
```
955-1046: Move browser storage helpers out of `convex/` (server) module.

These reference `window`/`localStorage` and don't belong in server-side Convex code. Extract to a client util, and import where needed.

```diff
-export const browserStorage = {
+// Move to src/utils/storage.ts (client-only)
+export const browserStorage = {
 ...
-};
+};
```

Client util (new file suggestion):

```ts
// src/utils/storage.ts
export const browserStorage = { /* same implementation */ };
export const sessionStorage = { /* same implementation */ };
```

src/lib/firecrawl.ts (1)
2-5: `@sentry/react` does not export `logger`; this will be undefined at runtime.

Replace with a tiny wrapper using Sentry's public API (capture/addBreadcrumb) and console.
```diff
-import * as Sentry from '@sentry/react'
-
-const { logger } = Sentry
+import * as Sentry from '@sentry/react'
+const log = {
+  info: (message: string, extra?: Record<string, unknown>) => {
+    Sentry.addBreadcrumb({ level: 'info', message, data: extra })
+    console.info(message, extra)
+  },
+  error: (message: string, extra?: Record<string, unknown>) => {
+    Sentry.captureMessage(message, { level: 'error' })
+    console.error(message, extra)
+  }
+}
```

Update calls:
```diff
-logger.info('Starting Firecrawl crawl', { url, options: body })
+log.info('Starting Firecrawl crawl', { url, options: body })
-logger.error('Firecrawl crawl failed', { status: res.status, text })
+log.error('Firecrawl crawl failed', { status: res.status, text })
-logger.info('Firecrawl crawl completed', { url, pageCount: pages.length, crawlTime })
+log.info('Firecrawl crawl completed', { url, pageCount: pages.length, crawlTime })
```

…and similarly for the scrape/page/website analysis log calls.
api-dev-server.ts (3)
85-89: Incorrect Content-Length for UTF-8 payloads.

Using `payloadString.length` counts UTF-16 code units, not bytes. Use `Buffer.byteLength` to avoid truncated or overlong requests.

```diff
       headers: {
         'Content-Type': 'application/json',
-        'Content-Length': payloadString.length.toString()
+        'Content-Length': Buffer.byteLength(payloadString, 'utf8').toString()
       },
```
191-195: "hashIP" isn't a hash (base64-encoding is reversible). Use a real one-way hash.

This can leak IPs to analytics. Replace with SHA-256 (salt optional).
```diff
+import crypto from 'node:crypto';
 ...
   private hashIP(ip: string): string {
-    // Simple hash for privacy
-    return Buffer.from(ip).toString('base64').substring(0, 8);
+    // One-way hash for privacy
+    const hash = crypto.createHash('sha256').update(ip, 'utf8').digest('hex');
+    return hash.substring(0, 16); // short, non-reversible id
   }
```

Also applies to: 639-647
339-342: Rate limiter rejects requests with invalid IPs (self-DoS vector).

If `remoteAddress` is empty or fails to parse (proxies, tests), you instantly 429. Better: allow the request but skip counting.

```diff
     if (!validIP) {
       logger.warn('Invalid IP for rate limiting', { rawIP });
-      return false; // Reject requests with invalid IPs
+      return true; // Allow request but do not count it toward limits
     }
```
🧹 Nitpick comments (28)
.claude/settings.local.json (1)
94-96: Tighten permissions scope and deduplicate entries

I ran the duplicate-entry check and found the following repeated permissions in `.claude/settings.local.json`:
- Bash(git add:*)
- Bash(git commit:*)
- Bash(pnpm audit:*)
- Bash(pnpm install:*)
- Bash(pnpm run lint:*)
- Bash(pnpm run:*)
- WebFetch(domain:docs.useautumn.com)
To reduce supply-chain risk, I recommend:
- Constrain `git fetch` to shallow clones.
- Enforce a timeout on all build commands for parity (bun, pnpm, npm).
- Remove duplicate permission lines.
Minimal diff suggestion:
```diff
--- a/.claude/settings.local.json
+++ b/.claude/settings.local.json
@@ Lines 94-96:
-      "Bash(git fetch:*)",
-      "Bash(timeout 30 bun run build)"
+      "Bash(git fetch --depth=1:*)",
+      "Bash(timeout 30 bun run build)",
+      "Bash(timeout 30 pnpm run build:*)",
+      "Bash(timeout 30 npm run build:*)"
```

After applying the above, please remove the duplicate lines listed above to keep the `permissions.allow` list clean.

public/sitemap-blog.xml (1)
8-240: Consider generating this sitemap with real per-post lastmod values

All entries share `<lastmod>2024-12-21</lastmod>`. Search engines use lastmod to prioritize recrawls; stale dates across the board reduce its utility. Suggest generating from your blog source (file mtime or CMS updated_at) and emitting ISO 8601 timestamps.

If static is intentional, at least refresh the dates periodically.
I can add a small script to aggregate slugs and real lastmod into XML at build time.
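A possible shape for that script, sketched here with assumed paths (`src/content/blog`, `public/sitemap-blog.xml`) and file mtimes as lastmod; adjust to the real content source:

```ts
// scripts/generate-blog-sitemap.ts — illustrative build-time sitemap generator
import { readdirSync, statSync, writeFileSync } from 'node:fs';
import { join, basename } from 'node:path';

const POSTS_DIR = 'src/content/blog'; // assumed location of blog sources
const SITE = 'https://zapdev.com';

const urls = readdirSync(POSTS_DIR)
  .filter(f => f.endsWith('.md') || f.endsWith('.mdx'))
  .map(f => {
    const slug = basename(f).replace(/\.mdx?$/, '');
    const lastmod = statSync(join(POSTS_DIR, f)).mtime.toISOString();
    return `  <url>\n    <loc>${SITE}/blog/${slug}</loc>\n    <lastmod>${lastmod}</lastmod>\n  </url>`;
  })
  .join('\n');

writeFileSync(
  'public/sitemap-blog.xml',
  `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`
);
```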
lib/deployment/vercel.ts (1)
566-588: Provider parsing hardening looks solid

Switching to URL parsing with a minimal SSH fallback reduces spoofing risk and keeps behavior aligned with the Netlify implementation in this PR.
Consider centralizing these helpers (provider + repo path) in a shared module to keep Netlify/Vercel parity.
public/sitemap-pages.xml (1)
7-254: Prefer dynamic `lastmod` and avoid over-optimistic `priority`
lastmod(2024-12-21). For frequently changing pages (docs, changelog, status), consider accurate timestamps. Also, multiple pages are set to 0.8–1.0 priority; Google treats this as a hint, but wide high values dilute intent. Recommend calibrating priorities and generating this from source metadata at build time.I can wire a small Node script to read routes/content and output XML automatically.
docs/SEO_IMPLEMENTATION_GUIDE.md (4)
145-166: Modernize meta tag example: drop keywords, add canonical, use absolute OG image URL.

The keywords meta tag is obsolete. Canonical is referenced elsewhere but missing here. OG images should be absolute and typically raster (png/jpg/webp).
```diff
 <!-- Basic Meta Tags -->
 <title>Page Title | ZapDev</title>
 <meta name="description" content="Page description" />
-<meta name="keywords" content="keyword1, keyword2" />
 <meta name="author" content="ZapDev" />
 <meta name="robots" content="index, follow" />
+<link rel="canonical" href="https://zapdev.com/page" />

 <!-- Open Graph -->
 <meta property="og:title" content="Page Title" />
 <meta property="og:description" content="Page description" />
 <meta property="og:type" content="website" />
 <meta property="og:url" content="https://zapdev.com/page" />
-<meta property="og:image" content="/og-image.svg" />
+<meta property="og:image" content="https://zapdev.com/og-image.jpg" />
+<meta property="og:site_name" content="ZapDev" />

 <!-- Twitter Cards -->
 <meta name="twitter:card" content="summary_large_image" />
 <meta name="twitter:site" content="@zapdev" />
 <meta name="twitter:title" content="Page Title" />
 <meta name="twitter:description" content="Page description" />
+<meta name="twitter:image" content="https://zapdev.com/og-image.jpg" />
```
33-35: Qualify AI bot blocking claims.

Robots.txt is advisory; many crawlers respect it, many do not. Consider clarifying that "AI Bot Protection" reduces compliant bot crawling but does not prevent data access.
216-223: Add HSTS and Permissions-Policy to security headers section.

Reflect the production headers you ship (see convex/messages.ts security headers). Include HSTS guidance.
```diff
 Content-Security-Policy: default-src 'self'
 X-Content-Type-Options: nosniff
 X-Frame-Options: DENY
 X-XSS-Protection: 0
 Referrer-Policy: strict-origin-when-cross-origin
+Strict-Transport-Security: max-age=31536000; includeSubDomains
+Permissions-Policy: camera=(), microphone=(), geolocation=()
```
4-4: Tighten wording ("SEO optimization" tautology).

Use "comprehensive SEO program" or "comprehensive SEO work" to avoid repetition.
public/humans.txt (2)
13-13: Update "Last update" date.

It shows 2024/12/21. Consider updating to the current release date or automate this to avoid stale metadata.
106-106: Tone: "amazing" is informal.

Use "excellent" or "outstanding" for a more professional tone in public docs.
```diff
- Built with love using amazing open source technologies
+ Built with love using excellent open-source technologies
```

convex/messages.ts (2)
1069-1100: Trusted Types policy also belongs on the client.

Same rationale as storage: keep browser-only concerns outside Convex server code.
10-18: Crypto fallback for non-WebCrypto environments.

If `crypto.getRandomValues` is unavailable, offer a small fallback using Node's crypto to avoid throwing in supported server contexts.

```diff
 const generateSecureToken = async (length: number): Promise<string> => {
   const array = new Uint8Array(length);
-  if (typeof crypto !== 'undefined' && crypto.getRandomValues) {
+  if (typeof crypto !== 'undefined' && 'getRandomValues' in crypto) {
     crypto.getRandomValues(array);
   } else {
-    throw new Error('Secure random number generation is not available in this context');
+    const { randomBytes } = await import('node:crypto');
+    const buf = randomBytes(length);
+    for (let i = 0; i < length; i++) array[i] = buf[i];
   }
   return Array.from(array, byte => byte.toString(16).padStart(2, '0')).join('');
 };
```

public/robots.txt (2)
84-92: Remove Crawl-delay for Googlebot (ignored) and fractional values.

Google ignores `Crawl-delay`. Fractional delays are non-standard. Use Search Console for Google crawl rate; keep `Crawl-delay` only for engines that honor it.

```diff
 User-agent: Googlebot
 Allow: /
-Crawl-delay: 0.5
```
51-58: Blocking `/_next` is fine, but broad "Allow: /" lines are redundant.

The many Allow entries after a global Allow are cosmetic. Consider trimming for maintainability; Disallow entries are sufficient.
src/lib/firecrawl.ts (2)
126-134: Add timeouts to fetch calls to avoid hanging requests.

Abort long-running Firecrawl requests to improve UX and resource safety.
```diff
-const res = await fetch(endpoint, {
+const ctrl = new AbortController()
+const t = setTimeout(() => ctrl.abort(), 20000) // 20s timeout
+const res = await fetch(endpoint, {
   method: 'POST',
   headers: {
     'Content-Type': 'application/json',
     'Authorization': `Bearer ${apiKey}`,
   },
-  body: JSON.stringify(body),
+  body: JSON.stringify(body),
+  signal: ctrl.signal
 })
+clearTimeout(t)
```

Repeat for the scrape endpoint.
Also applies to: 205-213
320-384: Heuristics for jQuery detection produce many false positives.
`$` appears in currency and math. Require more specific signals (e.g., `window.jQuery`, `jQuery.fn.jquery`, `<script src*=jquery>` in HTML) to cut noise.
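A sketch of what those stricter signals could look like when scanning fetched HTML; the function name and exact patterns are illustrative:

```ts
// Illustrative detector: require concrete jQuery markers in the page source
// instead of counting bare "$" characters (which also match currency and math).
export function detectJquery(html: string): boolean {
  return (
    /<script[^>]+src=["'][^"']*jquery[^"']*["']/i.test(html) || // script tag loading jQuery
    /window\.jQuery|jQuery\.fn\.jquery/.test(html) ||           // runtime globals referenced inline
    /\bjQuery\s*\(/.test(html)                                  // explicit jQuery(...) calls
  );
}
```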
index.html (2)

98-131: Avoid synthetic AggregateRating/Review in JSON-LD unless verifiable.

Google requires ratings/reviews be representative and visible to users. If these are placeholders, remove until you have real, user-visible content to avoid rich result eligibility issues.
- "aggregateRating": { - "@type": "AggregateRating", - "ratingValue": "4.8", - "ratingCount": "150", - "bestRating": "5", - "worstRating": "1" - }, - "review": [ - { - "@type": "Review", - "reviewRating": { - "@type": "Rating", - "ratingValue": "5", - "bestRating": "5" - }, - "author": { - "@type": "Person", - "name": "Sarah Chen" - }, - "reviewBody": "ZapDev transformed our development process. We built our MVP in days instead of months!" - }, - { - "@type": "Review", - "reviewRating": { - "@type": "Rating", - "ratingValue": "5", - "bestRating": "5" - }, - "author": { - "@type": "Person", - "name": "Marcus Rodriguez" - }, - "reviewBody": "The AI code generation is incredible. It understands context and generates production-ready code." - } - ],
44-48: Minor: add crossorigin to all third-party preconnects for consistency.

You already set it for fonts.gstatic.com. Consider adding it to cdn.gpteng.co as well.
```diff
-    <link rel="preconnect" href="https://cdn.gpteng.co">
+    <link rel="preconnect" href="https://cdn.gpteng.co" crossorigin>
```

api-dev-server.ts (1)
619-623: Don’t return internal error details to clients.
`internalError()` currently returns `{ error: 'Internal Server Error', message }`. To avoid leakage, drop `message` in production.

```diff
   private internalError(message: string): void {
     this.statusCode = 500;
-    this.json({ error: 'Internal Server Error', message });
+    this.json({ error: 'Internal Server Error' });
   }
```

src/components/PerformanceOptimizer.tsx (3)
116-125: TTFB measurement should use PerformanceNavigationTiming when available.
`performance.timing` is deprecated. Prefer `performance.getEntriesByType('navigation')[0].responseStart`.

```diff
-    if (performance.timing) {
-      const navigationStart = performance.timing.navigationStart;
-      const responseStart = performance.timing.responseStart;
-      if (navigationStart && responseStart) {
-        metricsRef.current.ttfb = responseStart - navigationStart;
-        onMetricsUpdate?.(metricsRef.current);
-      }
-    }
+    const nav = performance.getEntriesByType?.('navigation')[0] as PerformanceNavigationTiming | undefined;
+    if (nav) {
+      metricsRef.current.ttfb = nav.responseStart;
+      onMetricsUpdate?.(metricsRef.current);
+    } else if (performance.timing) {
+      const { navigationStart, responseStart } = performance.timing;
+      if (navigationStart && responseStart) {
+        metricsRef.current.ttfb = responseStart - navigationStart;
+        onMetricsUpdate?.(metricsRef.current);
+      }
+    }
```
264-271: Intervals aren't cleared on unmount (memory monitor).

Store the interval id and clear it during cleanup.
```diff
+  const memIntervalRef = useRef<number | null>(null);
 ...
-      setInterval(() => {
+      memIntervalRef.current = window.setInterval(() => {
         const memory = (performance as Performance & { memory: { usedJSHeapSize: number } }).memory;
         if (memory.usedJSHeapSize > 50 * 1024 * 1024) { // 50MB
           console.warn('High memory usage detected:', memory);
         }
-      }, 10000);
+      }, 10000);
 ...
-      if (observerRef.current) {
+      if (observerRef.current) {
         observerRef.current.disconnect();
       }
+      if (memIntervalRef.current) {
+        clearInterval(memIntervalRef.current);
+        memIntervalRef.current = null;
+      }
```

Also applies to: 285-290
308-379: Fast Refresh warning: export utilities from a separate file.

Mixing component and non-component exports breaks React Fast Refresh in dev.
Apply this diff to keep the component file clean, then create a new file (shown below):
```diff
-// Utility functions for performance optimization
-export const performanceUtils = {
-  ...
-};
+// Moved to src/utils/performanceUtils.ts
```

New file (outside selected range):
```ts
// src/utils/performanceUtils.ts
export type Json = string | number | boolean | null | Json[] | { [k: string]: Json };

export const performanceUtils = {
  debounce: <T extends (...args: unknown[]) => unknown>(func: T, wait: number) => {
    let timeout: ReturnType<typeof setTimeout>;
    return (...args: Parameters<T>) => {
      clearTimeout(timeout);
      timeout = setTimeout(() => func(...args), wait);
    };
  },
  throttle: <T extends (...args: unknown[]) => unknown>(func: T, limit: number) => {
    let inThrottle = false;
    return (...args: Parameters<T>) => {
      if (!inThrottle) {
        func(...args);
        inThrottle = true;
        setTimeout(() => (inThrottle = false), limit);
      }
    };
  },
  preloadImage: (src: string): Promise<void> =>
    new Promise((resolve, reject) => {
      const img = new Image();
      img.onload = () => resolve();
      img.onerror = reject;
      img.src = src;
    }),
  preloadScript: (src: string): Promise<void> =>
    new Promise((resolve, reject) => {
      const script = document.createElement('script');
      script.src = src;
      script.onload = () => resolve();
      script.onerror = reject;
      document.head.appendChild(script);
    }),
  getMetrics: () =>
    (window as Window & { zapdevPerformanceMetrics?: Record<string, unknown> }).zapdevPerformanceMetrics,
  areMetricsGood: (m: { fcp: number; lcp: number; fid: number; cls: number }) =>
    m.fcp < 1800 && m.lcp < 2500 && m.fid < 100 && m.cls < 0.1
};
```

src/config/seo.ts (3)
1-49: Strongly type the SEO config and exported generators.

This module is public surface; add explicit types and `as const` to prevent accidental mutation and provide intellisense across the app.

```diff
-export const seoConfig = {
+export type StructuredData = Record<string, unknown>;
+export interface PageSEO {
+  title: string;
+  description: string;
+  keywords: string[];
+  ogType?: 'website' | 'article' | 'product' | 'profile';
+  structuredData?: StructuredData;
+  canonical?: string;
+}
+
+export const seoConfig = {
   // Site Information
   site: {
     name: 'ZapDev',
     url: 'https://zapdev.com',
     description: 'AI-powered development platform for building full-stack web applications',
     language: 'en',
     defaultLocale: 'en_US',
     supportedLocales: ['en', 'es', 'fr', 'de', 'ja', 'zh'],
     twitterHandle: '@zapdev',
     facebookPage: 'zapdev',
     linkedinCompany: 'zapdev'
   },
```
188-231: Explicit return types for generators; align with strict TS guideline.

Add input/output types to `blogPost`, `featurePage`, and `useCasePage`.

```diff
-  blogPost: (post: {
+  blogPost: (post: {
     title: string;
     description: string;
     author: string;
     publishedAt: string;
     updatedAt?: string;
     tags: string[];
     slug: string;
-  }) => ({
+  }): PageSEO => ({
     ...
     structuredData: {
       '@context': 'https://schema.org',
       '@type': 'BlogPosting',
       ...
     } as StructuredData
   }),
   ...
-  featurePage: (feature: {
+  featurePage: (feature: {
     name: string;
     description: string;
     benefits: string[];
     useCases: string[];
-  }) => ({
+  }): PageSEO => ({
     ...
     structuredData: {
       '@context': 'https://schema.org',
       '@type': 'SoftwareApplication',
       ...
     } as StructuredData
   }),
   ...
-  useCasePage: (useCase: {
+  useCasePage: (useCase: {
     title: string;
     description: string;
     industry: string;
     solutions: string[];
     benefits: string[];
-  }) => ({
+  }): PageSEO => ({
     ...
     structuredData: {
       '@context': 'https://schema.org',
       '@type': 'Service',
       ...
     } as StructuredData
   }),
```

Also applies to: 233-256, 258-285
338-357: CSP can be tightened; include analytics endpoints you actually use.
- Dropping `'unsafe-inline'` for scripts/styles (use nonces or hashes) is strongly recommended.
- Include `https://app.posthog.com` in `connect-src` if analytics are enabled server-side or client-side.

```diff
-    'script-src': ["'self'", "'unsafe-inline'", 'https://cdn.gpteng.co'],
-    'style-src': ["'self'", "'unsafe-inline'", 'https://fonts.googleapis.com'],
+    'script-src': ["'self'", 'https://cdn.gpteng.co'], // prefer nonces/hashes if inline is required
+    'style-src': ["'self'", 'https://fonts.googleapis.com'], // prefer nonces if inline styles are used
-    'connect-src': ["'self'", 'https://api.openai.com', 'https://api.anthropic.com'],
+    'connect-src': ["'self'", 'https://api.openai.com', 'https://api.anthropic.com', 'https://app.posthog.com'],
```

src/components/SEO.tsx (3)
86-101: Duplicate setting via DOM + Helmet; rely on Helmet only.

The useEffect manually mutates title/meta/canonical which Helmet already manages, risking race conditions in SSR/CSR transitions.
```diff
-  useEffect(() => {
-    // Update document title for better UX
-    document.title = finalTitle;
-
-    // Update meta description for dynamic content
-    const metaDescription = document.querySelector('meta[name="description"]');
-    if (metaDescription) {
-      metaDescription.setAttribute('content', finalDescription);
-    }
-
-    // Update canonical link
-    const canonicalLink = document.querySelector('link[rel="canonical"]');
-    if (canonicalLink) {
-      canonicalLink.setAttribute('href', finalCanonical);
-    }
-  }, [finalTitle, finalDescription, finalCanonical]);
+  // Helmet below handles title/meta/canonical; no manual DOM updates needed.
```
178-181: Use explicit crossOrigin value.

React expects a specific value; an empty string is ambiguous.
```diff
-          <link rel="preconnect" href="https://fonts.gstatic.com" crossOrigin="" />
+          <link rel="preconnect" href="https://fonts.gstatic.com" crossOrigin="anonymous" />
```
190-259: SEOPresets: consider consolidating with src/config/seo.ts to avoid drift.

You now have two sources of truth. Import from the central config to keep metadata consistent.
I can wire `SEOPresets` to read from `seoConfig.pages` and map fields accordingly so we don't maintain two sets.
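A sketch of that wiring, assuming `seoConfig.pages` entries carry title/description/keywords and an optional canonical path (these field names are assumptions about the config shape):

```tsx
// Illustrative: derive SEOPresets from the central config instead of duplicating values.
import { seoConfig } from '@/config/seo';

type PresetName = keyof typeof seoConfig.pages;
type Preset = { title: string; description: string; keywords: string[]; canonical: string };

export const SEOPresets = Object.fromEntries(
  (Object.keys(seoConfig.pages) as PresetName[]).map(name => {
    const page = seoConfig.pages[name]; // assumed: { title, description, keywords, canonical? }
    return [name, {
      title: page.title,
      description: page.description,
      keywords: page.keywords,
      canonical: `${seoConfig.site.url}${page.canonical ?? ''}`
    }];
  })
) as Record<PresetName, Preset>;
```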
Summary by CodeRabbit
New Features
Bug Fixes
Documentation
Chores