feat(server): implement secure, scalable API server with analytics an…#72
Jackson57279 wants to merge 3 commits into main from …
Conversation
feat(server): implement secure, scalable API server with analytics and rate limiting

- Add clustering support based on available CPU cores and environment settings
- Integrate PostHog analytics for API request and server metrics tracking
- Implement rate limiting with IP validation and bounded in-memory storage
- Enhance VercelRequest and VercelResponse interfaces with robust parsing and security headers
- Improve CORS handling with origin allowlists and credential support
- Validate and sanitize API endpoint paths to prevent directory traversal attacks
- Add request body size limit and enforce request timeout handling
- Provide structured logging for requests, responses, errors, and server lifecycle events
- Add health endpoint with uptime, metrics, environment, and version info
- Support graceful shutdown with analytics capture on termination signals
- Update create-checkout-session API with stricter CORS origin checks and OPTIONS method handling
- Refine hono-polar API subscription syncing with date object conversions and improved checkout flow
- Enhance secret-chat API error handling with detailed status codes and messages
- Update service worker cache revision for production deployment
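A minimal TypeScript sketch of the bounded, in-memory, per-IP rate limiting described in the bullets above. The limits, the configuration shape, and the RateLimiter class name are illustrative assumptions, not the PR's actual code:

```typescript
// Hypothetical sketch only: a per-IP counter with a fixed time window and a bounded
// number of tracked IPs, so the store cannot grow without limit.
type Bucket = { count: number; resetAt: number };

const RATE_LIMIT = { windowMs: 60_000, maxRequests: 1_000, maxTrackedIps: 10_000 }; // assumed values

class RateLimiter {
  private buckets = new Map<string, Bucket>();

  /** Returns true if the request is allowed, false if the IP is over quota. */
  check(ip: string, now = Date.now()): boolean {
    const bucket = this.buckets.get(ip);

    if (!bucket || now >= bucket.resetAt) {
      // Keep the store bounded: evict the oldest entry before inserting a new one.
      if (!bucket && this.buckets.size >= RATE_LIMIT.maxTrackedIps) {
        const oldest = this.buckets.keys().next().value;
        if (oldest !== undefined) this.buckets.delete(oldest);
      }
      this.buckets.set(ip, { count: 1, resetAt: now + RATE_LIMIT.windowMs });
      return true;
    }

    bucket.count += 1;
    return bucket.count <= RATE_LIMIT.maxRequests;
  }
}

// Usage: respond with 429 Too Many Requests when check() returns false.
const limiter = new RateLimiter();
const allowed = limiter.check('203.0.113.7');
```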
Caution: Review failed. The pull request is closed.

Walkthrough

Adds a cross‑platform deployment system (types, manager, Netlify/Vercel services), new deployment and domain APIs, a production‑grade API server with analytics/health/rate‑limiting, Live Preview and Smart Prompts UI, numerous security/typing hardenings, docs and env templates, and assorted client/server fixes.

Changes
Sequence Diagram(s)

sequenceDiagram
autonumber
participant Client
participant APIServer as API Server
participant Router as Endpoint Resolver
participant Handler as API Handler
participant Analytics as PostHog
participant RateLimiter as RateLimiter
Client->>APIServer: HTTP request (/api/*)
APIServer->>RateLimiter: check/update quota
alt rate limited
RateLimiter-->>APIServer: blocked
APIServer-->>Client: 429 Too Many Requests
APIServer-->>Analytics: capture api_error (async)
else allowed
APIServer->>Router: sanitize & resolve path
Router-->>Handler: dynamic import
APIServer->>Handler: invoke with EnhancedRequest/Response
Handler-->>APIServer: response
APIServer-->>Client: JSON/HTML
APIServer-->>Analytics: capture api_request & server_metrics (fire-and-forget)
end
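The "sanitize & resolve path" step in the diagram above is where directory-traversal attempts are rejected before a handler is dynamically imported. A hedged sketch of one way such a check can work; the ./api root, the .js handler extension, and the function name are assumptions rather than the PR's code:

```typescript
import path from 'node:path';

// Assumed handler location; the real server may lay files out differently.
const API_ROOT = path.resolve('./api');

function resolveEndpoint(requestPath: string): string | null {
  // Strip the '/api/' prefix and any query string.
  const raw = requestPath.replace(/^\/api\//, '').split('?')[0];

  // Allow only simple segment characters; no dots, backslashes, or encoded tricks.
  if (!/^[a-zA-Z0-9_/-]+$/.test(raw)) return null;

  const candidate = path.resolve(API_ROOT, `${raw}.js`);

  // The resolved file must stay inside the API root.
  if (!candidate.startsWith(API_ROOT + path.sep)) return null;
  return candidate;
}

// resolveEndpoint('/api/deploy')     -> '<root>/api/deploy.js'
// resolveEndpoint('/api/../secrets') -> null (rejected)
```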
sequenceDiagram
autonumber
participant User
participant DeployAPI as /api/deploy
participant DomainAPI as /api/domains
participant Manager as ZapdevDeploymentManager
participant Netlify
participant Vercel
participant Analytics as PostHog
User->>DeployAPI: POST action=deploy/status/list/delete
DeployAPI->>Manager: dispatch(platform, action, payload)
alt platform = netlify
Manager->>Netlify: deploy/status/...
Netlify-->>Manager: DeploymentResult
else platform = vercel
Manager->>Vercel: deploy/status/...
Vercel-->>Manager: DeploymentResult
end
Manager-->>DeployAPI: result
DeployAPI-->>User: JSON { success, url, status, ... }
DeployAPI-->>Analytics: track deployment_* (async)
User->>DomainAPI: POST action=check/setup/verify
DomainAPI->>Manager: validate & delegate
Manager->>Netlify/Vercel: setupCustomDomain/verify
Netlify/Vercel-->>Manager: dnsInstructions/verified
Manager-->>DomainAPI: response
DomainAPI-->>User: JSON { domain, dnsInstructions, verified }
DomainAPI-->>Analytics: track domain_* (async)
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60–90 minutes

Possibly related PRs
Claude finished @otdoges's task —— View job

PR Review: Secure API Server Implementation
const query: { [key: string]: string | string[] } = {};
for (const [key, value] of searchParams.entries()) {
  const safeKey = String(key);
  const existingValue = query[safeKey];
Check failure
Code scanning / ESLint
Detects "variable[key]" as a left- or right-hand assignment operand. Error
if (Array.isArray(existingValue)) {
  (existingValue as string[]).push(value);
} else {
  query[safeKey] = [existingValue as string, value];
Check failure
Code scanning / ESLint
Detects "variable[key]" as a left- or right-hand assignment operand. Error
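For context on the rule being reported here (it also flags the steps[currentStep] and slots[index] snippets below): security/detect-object-injection fires on reads and writes through a computed key. A hedged sketch of one way to collect query parameters without that sink, using a Map; the function name and return shape are assumptions, not the code under review:

```typescript
// Sketch only: accumulate repeated query keys into arrays without assigning
// through a computed property on a plain object.
function collectQuery(searchParams: URLSearchParams): Record<string, string | string[]> {
  const entries = new Map<string, string | string[]>();

  for (const [key, value] of searchParams.entries()) {
    const existing = entries.get(key);
    if (existing === undefined) {
      entries.set(key, value);
    } else if (Array.isArray(existing)) {
      existing.push(value);
    } else {
      entries.set(key, [existing, value]);
    }
  }

  // Convert to a plain object only at the boundary.
  return Object.fromEntries(entries);
}

// collectQuery(new URLSearchParams('a=1&a=2&b=3')) -> { a: ['1', '2'], b: '3' }
```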
// Mark current step as error
if (steps[currentStep]) {
  updateStepStatus(steps[currentStep]?.id || '', 'error');
  const currentStepData = steps.length > currentStep ? steps[currentStep] : null;
Check failure
Code scanning / ESLint
Detects "variable[key]" as a left- or right-hand assignment operand. Error
>(({ index, className, ...props }, ref) => {
  const inputOTPContext = React.useContext(OTPInputContext)
  const slot = inputOTPContext.slots[index]
  const slot = inputOTPContext?.slots?.[index] || null
Check failure
Code scanning / ESLint
Detects "variable[key]" as a left- or right-hand assignment operand. Error
/(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)(?:\/.*)?$/,
/(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)\.git$/,
/git@github\.com:([^/]+)\/([^/]+)\.git$/,
/^(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)(?:\/.*)?$/,
Check failure
Code scanning / ESLint
Detects potentially unsafe regular expressions, which may take a very long time to run, blocking the event loop. Error
/(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)\.git$/,
/git@github\.com:([^/]+)\/([^/]+)\.git$/,
/^(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)(?:\/.*)?$/,
/^(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)\.git$/,
Check failure
Code scanning / ESLint
Detects potentially unsafe regular expressions, which may take a very long time to run, blocking the event loop. Error
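These patterns are flagged because the optional prefixes and nested groups can backtrack badly on crafted input. The Copilot autofixes further down take the same approach for extractGitProvider; as a generic illustration, a hedged sketch of extracting owner/repo with the WHATWG URL API instead of a regex (function name and return shape are assumptions):

```typescript
// Sketch only: linear-time parsing of an HTTPS GitHub repository URL.
// SSH-style remotes (git@github.com:owner/repo.git) would need separate handling.
function parseGitHubRepo(input: string): { owner: string; repo: string } | null {
  let url: URL;
  try {
    // Accept "github.com/owner/repo" without a scheme by adding one for parsing.
    url = new URL(/^[a-z][a-z0-9+.-]*:\/\//i.test(input) ? input : `https://${input}`);
  } catch {
    return null;
  }

  const host = url.hostname.toLowerCase();
  if (host !== 'github.com' && host !== 'www.github.com') return null;

  const [owner, repoRaw] = url.pathname.split('/').filter(Boolean);
  if (!owner || !repoRaw) return null;

  return { owner, repo: repoRaw.replace(/\.git$/, '') };
}

// parseGitHubRepo('https://github.com/vercel/next.js') -> { owner: 'vercel', repo: 'next.js' }
```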
const allowedOrigin = CONFIG.CORS_ORIGINS.includes('*') ||
  (origin && CONFIG.CORS_ORIGINS.includes(origin)) ? (origin || '*') : null;

if (allowedOrigin) res.setHeader('Access-Control-Allow-Origin', allowedOrigin);
Check failure
Code scanning / CodeQL
CORS misconfiguration for credentials transfer High
This autofix suggestion was applied.
Copilot Autofix
AI 6 months ago
To fix the issue, we must ensure that if "Access-Control-Allow-Credentials: true" is set, then "Access-Control-Allow-Origin" must never be "*" or "null". Instead, the allowed origins should be an explicit, immutable whitelist of trusted values (ideally HTTPS origins under your control), and never influenced by untrusted data. In this context, the code should reject "*" and "null" from the set of allowed origins, either by sanitizing CONFIG.CORS_ORIGINS on startup or by ensuring that requests with these origins never receive the header.
The best way to fix this code is to:

- Filter out "*" and "null" from the allowed origins whitelist at startup (CONFIG.CORS_ORIGINS).
- Change the logic at request time: Only set Access-Control-Allow-Origin if the incoming request's origin matches an entry in the sanitized whitelist.
- If the origin is not in the whitelist, do not set Access-Control-Allow-Origin. Optionally, you may set a 403 Forbidden response, or simply continue without CORS headers.

Specifically, update the handling for lines 642–647:

- On server startup, remove "*" and "null" from CONFIG.CORS_ORIGINS.
- At request time, check that origin is non-empty and exactly matches one of the allowed origins before setting the CORS headers.
- Remove any logic that would set the header to "*" or an echo of user-controlled input outside the whitelist.

No new imports are necessary, only edits to this file. All edits are in api-dev-server.ts.
@@ -14,7 +14,11 @@
   ENABLE_ANALYTICS: process.env.NODE_ENV !== 'production' ? false : (process.env.POSTHOG_API_KEY ? true : false),
   MAX_WORKERS: Number(process.env.MAX_WORKERS) || Math.min(4, os.cpus().length),
   TIMEOUT: Number(process.env.REQUEST_TIMEOUT) || 30000,
-  CORS_ORIGINS: (process.env.CORS_ORIGINS || 'http://localhost:5173,http://localhost:3000').split(','),
+  // Sanitize: Remove "*" and "null" from allowed CORS origins
+  CORS_ORIGINS: (process.env.CORS_ORIGINS || 'http://localhost:5173,http://localhost:3000')
+    .split(',')
+    .map(o => o.trim())
+    .filter(o => o && o !== '*' && o !== 'null'),
   RATE_LIMIT: {
     windowMs: 60 * 1000, // 1 minute
     maxRequests: 1000, // per IP
@@ -638,15 +642,17 @@
 // Production-Ready Server with PostHog Analytics
 const server = createServer(async (req: IncomingMessage, res: ServerResponse) => {
-  // Enhanced CORS
+  // Enhanced CORS (only allow credentialed requests if origin matches sanitized whitelist)
   const origin = req.headers.origin;
-  const allowedOrigin = CONFIG.CORS_ORIGINS.includes('*') ||
-    (origin && CONFIG.CORS_ORIGINS.includes(origin)) ? (origin || '*') : null;
+  // Only allow origins that are explicitly whitelisted and valid
+  const allowedOrigin = origin && CONFIG.CORS_ORIGINS.includes(origin) ? origin : null;

-  if (allowedOrigin) res.setHeader('Access-Control-Allow-Origin', allowedOrigin);
-  res.setHeader('Access-Control-Allow-Credentials', 'true');
-  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS, PUT, DELETE');
-  res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization, X-Requested-With');
+  if (allowedOrigin) {
+    res.setHeader('Access-Control-Allow-Origin', allowedOrigin);
+    res.setHeader('Access-Control-Allow-Credentials', 'true');
+    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS, PUT, DELETE');
+    res.setHeader('Access-Control-Allow-Headers', 'Content-Type, Authorization, X-Requested-With');
+  }

   if (req.method === 'OPTIONS') {
     res.writeHead(204);
}

private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
This autofix suggestion was applied.
Copilot Autofix
AI 6 months ago
The correct way to fix this issue is to parse the URL using a reliable URL parser (such as the built-in URL class in Node.js/TypeScript), and then check the host or hostname property of the parsed URL object to determine the git provider. Specifically, the checks should be for exact matches to 'github.com', 'gitlab.com', or 'bitbucket.org' (case-insensitive if appropriate), rather than substring matching. This change should be applied to the extractGitProvider function in lib/deployment/netlify.ts. No changes are needed elsewhere.
If the URLs being handled may be "git remote" style URLs (like git@github.com:owner/repo.git), which are not valid for the URL parser, a fallback or additional parsing for that style will be needed (e.g. using regex for those cases). For most web-based URLs, however, the standard URL class suffices.
To implement the fix:

- Import Node.js URL if not already available.
- In extractGitProvider, parse the URL and check the hostname for exact matches.
- For invalid URLs, optionally fallback to substring matching or default to 'github'.
@@ -15,6 +15,9 @@
   DeploymentStatus
 } from './types.js';

+// For URL parsing
+// Node.js global URL is available, but add import if needed:
+// import { URL } from 'url';
 interface NetlifyDeploy {
   id: string;
   url: string;
@@ -520,10 +523,29 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
-    return 'github'; // default
+    try {
+      // Try to parse the URL (supports https/git:///ssh; fallback for git@ style)
+      let hostname = '';
+      try {
+        // Handle web and git URLs (e.g., https://github.com/..., git://github.com/...)
+        hostname = new URL(url).hostname.toLowerCase();
+      } catch {
+        // Fallback for git@github.com:owner/repo.git
+        // E.g. git@github.com:owner/repo.git or ssh://git@github.com/owner/repo.git
+        const match = url.match(/@([a-zA-Z0-9.\-]+)[/:]/);
+        if (match) hostname = match[1].toLowerCase();
+      }
+      if (hostname === 'github.com') return 'github';
+      if (hostname === 'gitlab.com') return 'gitlab';
+      if (hostname === 'bitbucket.org') return 'bitbucket';
+      return 'github'; // default
+    } catch {
+      // Fallback to original heuristic
+      if (url.includes('github.com')) return 'github';
+      if (url.includes('gitlab.com')) return 'gitlab';
+      if (url.includes('bitbucket.org')) return 'bitbucket';
+      return 'github';
+    }
   }

   private extractRepoPath(url: string): string {
private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
  if (url.includes('gitlab.com')) return 'gitlab';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
Copilot Autofix
AI 6 months ago
The correct fix is to properly parse the provided git URL (which could be HTTP(s) or SSH) and inspect the host component rather than using substring matches. This means:

- Parse the input URL string using the WHATWG URL API if possible. If the URL is SSH-style (e.g. git@github.com:owner/repo.git), then use regex to extract the host.
- Once parsed/extracted, compare the host exactly (e.g., host is github.com), or in the case of subdomain tolerance, use a whitelist of acceptable hosts.
- Only then return the provider value.

The required code changes are within the extractGitProvider(url: string): string method in lib/deployment/netlify.ts:

- Replace each url.includes('...') check with one that extracts and inspects the host (parse as a URL, or fallback to SSH regex).
- No additional packages are needed, as URL is part of the standard library.
@@ -520,10 +520,29 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
-    return 'github'; // default
+    // Parse both HTTP(S) URLs and SSH git URLs
+    let host = '';
+    try {
+      // Try regular URL parsing
+      const parsed = new URL(url);
+      host = parsed.hostname;
+    } catch {
+      // Fallback for SSH-like URLs: git@github.com:owner/repo.git
+      const match = url.match(/^([^@]+)@([^:]+):/);
+      if (match) {
+        host = match[2];
+      }
+    }
+    switch (host) {
+      case 'github.com':
+        return 'github';
+      case 'gitlab.com':
+        return 'gitlab';
+      case 'bitbucket.org':
+        return 'bitbucket';
+      default:
+        return 'github'; // default
+    }
   }

   private extractRepoPath(url: string): string {
private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
  if (url.includes('gitlab.com')) return 'gitlab';
  if (url.includes('bitbucket.org')) return 'bitbucket';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
Copilot Autofix
AI 6 months ago
The ideal way to fix this problem is to parse the URL with the built-in URL class (in Node.js/global TS/JS) and then check the hostname property for a strict, whitelisted set, including proper handling of subdomains if needed. Specifically:
- In extractGitProvider(url: string), parse the string as a URL using the URL constructor.
- Compare the hostname of the parsed URL against an exact list: github.com, gitlab.com, bitbucket.org.
- For non-matching hosts or invalid URLs, default to 'github' as in the original code.

Required changes:

- Edit the extractGitProvider method in lib/deployment/netlify.ts (lines 522-527).
- Add a try-catch to handle cases where the input may not be a valid URL.
- No new libraries are needed; use the global URL class.
@@ -520,9 +520,14 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
+    try {
+      const hostname = new URL(url).hostname.toLowerCase();
+      if (hostname === 'github.com') return 'github';
+      if (hostname === 'gitlab.com') return 'gitlab';
+      if (hostname === 'bitbucket.org') return 'bitbucket';
+    } catch (_) {
+      // Ignore parse errors, fall through
+    }
     return 'github'; // default
   }
}

private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
Copilot Autofix
AI 6 months ago
The correct fix is to parse the URL using the standard URL class provided in modern Node.js/TypeScript environments, extract the hostname, and then strictly compare it to trusted git provider domain names. We should update the extractGitProvider method so that it parses the input URL string and selects the provider based on exact host matches, e.g., 'github.com', 'gitlab.com', 'bitbucket.org'.

To do this:

- At the start of extractGitProvider, attempt to parse url using the global URL class.
- Use the .hostname property for comparison against provider domains.
- Fall back gracefully if parsing fails or for incomplete URLs (e.g., SSH URLs like git@github.com:org/repo.git), by checking if the string starts with the provider host segment at expected positions.
- No new package is required; use the built-in global URL class.

Change required:

- Modify lines 566–570 in file lib/deployment/vercel.ts to parse the URL and match the host value instead of using .includes.
- There are no extra imports required, as the URL class is available globally in Node.js v10+ and modern runtimes.
- The fix should preserve functionality for typical SSH and HTTPS git URLs.
@@ -564,9 +564,19 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
+    try {
+      // Use URL class to parse the host, for input like https://github.com/org/repo.git
+      const host = new URL(url).hostname.toLowerCase();
+      if (host === 'github.com') return 'github';
+      if (host === 'gitlab.com') return 'gitlab';
+      if (host === 'bitbucket.org') return 'bitbucket';
+    } catch {
+      // Support SSH git URLs: git@github.com:org/repo.git
+      // Check for 'git@provider:' pattern
+      if (/^git@github\.com:/.test(url)) return 'github';
+      if (/^git@gitlab\.com:/.test(url)) return 'gitlab';
+      if (/^git@bitbucket\.org:/.test(url)) return 'bitbucket';
+    }
     return 'github'; // default
   }
private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
  if (url.includes('gitlab.com')) return 'gitlab';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
Copilot Autofix
AI 6 months ago
To fix this problem, we should parse the provided URL using the standard URL class (or a well-known library, but URL is native). Then, compare the hostname (or host) property of the parsed URL with the canonical hosts for GitHub (github.com), GitLab (gitlab.com), or Bitbucket (bitbucket.org). This change should be made in the extractGitProvider method. Since TS/JS's standard library includes the URL class, we don't need external dependencies. The fix is to replace substring checks with explicit hostname matching after URL parsing.
Specifically:

- Change the logic in extractGitProvider to parse the input URL with the URL constructor, extract the hostname (or use a fallback for e.g. scp-like git URLs), and compare it directly to the known hostnames.
- If parsing fails (invalid URL), optionally fall back to original logic or a safe default.
No imports are needed; URL is built in.
@@ -564,9 +564,25 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
+    try {
+      let hostname = '';
+      try {
+        // Try native URL parsing
+        hostname = new URL(url).hostname;
+      } catch {
+        // Handle scp-like SSH URLs (e.g. git@github.com:user/repo.git)
+        // Extract by regex: git@hostname:...
+        const match = url.match(/^git@([^:]+):/);
+        if (match) {
+          hostname = match[1];
+        }
+      }
+      if (hostname === 'github.com') return 'github';
+      if (hostname === 'gitlab.com') return 'gitlab';
+      if (hostname === 'bitbucket.org') return 'bitbucket';
+    } catch {
+      // ignore and fall back
+    }
     return 'github'; // default
   }
private extractGitProvider(url: string): string {
  if (url.includes('github.com')) return 'github';
  if (url.includes('gitlab.com')) return 'gitlab';
  if (url.includes('bitbucket.org')) return 'bitbucket';
Check failure
Code scanning / CodeQL
Incomplete URL substring sanitization High
Copilot Autofix
AI 6 months ago
To fix the problem, we should parse the incoming URL using the standard URL class to robustly extract the actual host, and then perform a check against this value—rather than using includes, which matches anywhere in the string. This ensures only the true host part of the URL is checked. In TypeScript/Node.js, the URL class from the standard library is available, so we do not need any new dependencies.
Specifically, update extractGitProvider to attempt to parse the URL, get its host (or hostname), and compare this value directly (and only) to the list of allowed provider hosts (github.com, gitlab.com, bitbucket.org). If parsing fails (invalid URL), we fall back to the original logic for backward compatibility.
You only need to change the extractGitProvider method (lines 566–571).
@@ -564,9 +564,18 @@
   }

   private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
+    try {
+      const parsedUrl = new URL(url);
+      const hostname = parsedUrl.hostname.toLowerCase();
+      if (hostname === 'github.com') return 'github';
+      if (hostname === 'gitlab.com') return 'gitlab';
+      if (hostname === 'bitbucket.org') return 'bitbucket';
+    } catch (e) {
+      // If URL parsing fails, fall back to original behaviour
+      if (url.includes('github.com')) return 'github';
+      if (url.includes('gitlab.com')) return 'gitlab';
+      if (url.includes('bitbucket.org')) return 'bitbucket';
+    }
     return 'github'; // default
   }
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Claude finished @otdoges's task —— View job

PR Review: Secure API Server Implementation
Actionable comments posted: 32
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (23)
src/components/ui/input-otp.tsx (1)
49-55: Wrap OTP characters with SafeText for XSS protection

To ensure every user‐generated character is properly sanitized per our XSS guidelines, wrap the char render in the SafeText component instead of rendering it directly.

• File: src/components/ui/input-otp.tsx
• Lines 49–55

  {char}
- {char}
+ <SafeText>{char}</SafeText>
  {hasFakeCaret && (
    <div className="pointer-events-none absolute inset-0 flex items-center justify-center">
      <div className="h-4 w-px animate-caret-blink bg-foreground duration-1000" />
    </div>
  )}

• Add import (near the top of the file, alongside the other UI imports):

import { SafeText } from "@/components/ui/SafeText";

This change aligns with our security rules requiring all user‐provided content—even single characters—to be sanitized via the SafeText component.

src/components/Navigation.tsx (2)
160-163: Wrap displayed user.fullName with SafeText for XSS protection

The SafeText component lives in src/components/ui/SafeText.tsx and is exported as a named export. Replace the default import and direct interpolation with a named import and wrap both instances of the first name in <SafeText>.

• File: src/components/Navigation.tsx
  – Lines 160–163
  – Lines 274–275

Apply this diff (adjust @/ alias if needed):

-import SafeText from "@/components/SafeText";
+import { SafeText } from "@/components/ui/SafeText";
@@ -157,7 +157,7 @@
   <User className="w-4 h-4 mr-2" />
-  {user?.fullName?.split(' ')[0] || 'Profile'}
+  <SafeText>{user?.fullName?.split(' ')[0] ?? 'Profile'}</SafeText>
 </Button>
 </motion.div>
 )}
@@ -271,7 +271,7 @@
   <User className="w-4 h-4 mr-2" />
-  {user?.fullName?.split(' ')[0] || 'Profile'}
+  <SafeText>{user?.fullName?.split(' ')[0] ?? 'Profile'}</SafeText>
 </Button>
 </Menu.Items>
 )}
151-191: Ensure authentication loading state is handled before rendering auth actions

Your useAuth hook already returns an isLoading flag, so you can now render a loading placeholder while the authentication status is being determined. Per our authentication‐patterns guideline, show a skeleton or spinner when isLoading is true, then render the unauthenticated or authenticated UI.

• File: src/components/Navigation.tsx
  – At the top of your render (before checking isAuthenticated), destructure isLoading from the hook:

    const { isLoading, isAuthenticated, user } = useAuth();

  – Replace the auth‐actions block with a loading state when isLoading is true. For example:

    {isLoading ? (
      <div className="flex items-center gap-2">
        <Skeleton className="h-9 w-24 rounded-md" />
      </div>
    ) : isAuthenticated ? (
      /* existing Profile + Sign Out buttons */
    ) : (
      /* existing Get Started button */
    )}

  – Import your Skeleton (or Spinner) component at the top of the file.

This ensures users see immediate feedback during auth initialization and complies with our loading‐state requirements.
src/components/UserSync.tsx (1)
22-27: Sanitize fields and avoid email-derived usernames (PII).
- fullName, username, and avatarUrl should be sanitized/validated before persisting.
- Falling back to ${email.split('@')[0]}_${Date.now()} leaks PII (email local part) and is unstable across renders. Prefer a deterministic, non-PII fallback derived from Clerk user id.
- Avoid storing empty strings; use undefined for optional fields.
- upsertUser({ - email, - fullName: clerkUser.fullName || undefined, - avatarUrl: clerkUser.imageUrl || undefined, - username: clerkUser.username || `${email.split('@')[0]}_${Date.now()}`, - bio: '', - }).catch((error) => { + upsertUser({ + email, + fullName: sanitizeText(clerkUser.fullName ?? '') || undefined, + avatarUrl: isValidHttpsUrl(clerkUser.imageUrl) ? clerkUser.imageUrl : undefined, + username: + sanitizeText( + clerkUser.username || + `user_${clerkUser.id.replace(/[^a-zA-Z0-9]/g, '').slice(-6).toLowerCase()}` + ), + bio: undefined, + }).catch((error) => {And add the helper imports (adjust path to your shared utils):
-import { useEffect } from 'react'; +import { useEffect } from 'react'; +import { sanitizeText } from '@/lib/sanitizeText';Add a small URL validator in this file:
function isValidHttpsUrl(value?: string | null): boolean { if (!value) return false; try { const u = new URL(value); return u.protocol === 'https:'; } catch { return false; } }Optionally clamp lengths with your security constants (if available) before passing to
sanitizeTextto enforce size limits.If you share your sanitize/validation utility paths and constants (e.g., MAX_DISPLAY_NAME_LENGTH, MAX_USERNAME_LENGTH), I can tailor the exact import lines.
src/lib/github-token-storage.ts (4)
7-10: Do not rely on Sentry.logger; use captureException/captureMessage or a project logger.

@sentry/react does not expose a public logger API for app use. Destructuring logger will be undefined in many setups and can crash at runtime. Prefer Sentry.captureException / Sentry.captureMessage (and/or your structured logger) instead. Also consider importing a cross‑env wrapper (e.g., your app logger) rather than @sentry/react in a shared lib that may run in SSR/Node.
-const { logger } = Sentry; +// Use Sentry.captureException / Sentry.captureMessage or your app's logger utility instead.Optionally, if you want a single surface, centralize logging in a small wrapper and import that here.
27-30: Avoid logging any part of secrets; remove token prefix from logs.

Logging even a 10‑char prefix of a GitHub token is sensitive and can leak secrets into analytics/log pipelines. Reject the token silently or log a generic message without token material.
Proposed safe change:
- if (token.includes('<script>') || token.includes('javascript:') || token.includes('data:')) { - logger.warn('Potentially malicious token detected', { tokenPrefix: token.substring(0, 10) }); - return false; - } + if (token.includes('<') || token.includes(':')) { + // Do not log token material + Sentry.captureMessage('Rejected GitHub token: invalid characters detected', 'warning'); + return false; + }
47-51: Replacelogger.*calls withcaptureException/captureMessage(and console) to avoid runtime errors and PII leakage.Current calls to
logger.info/warn/errorwill likely fail; also ensure errors are captured by Sentry and avoid including token content in messages.- logger.info('GitHub token stored securely'); + // Avoid logging token contents + console.info('GitHub token stored securely'); } catch (error) { - logger.error('Failed to store GitHub token:', error); + Sentry.captureException(error); + console.error('Failed to store GitHub token'); throw new Error('Failed to securely store GitHub token'); }- } catch (error) { - logger.error('Failed to retrieve GitHub token:', error); - return null; + } catch (error) { + Sentry.captureException(error); + console.error('Failed to retrieve GitHub token'); + return null; }- logger.info('GitHub token cleared from storage'); + console.info('GitHub token cleared from storage'); } catch (error) { - logger.error('Failed to clear GitHub token:', error); + Sentry.captureException(error); + console.error('Failed to clear GitHub token'); }- logger.info('Successfully migrated GitHub token from localStorage to secure storage'); + console.info('Migrated GitHub token from localStorage to secure storage');- } catch (error) { - logger.error('Failed to migrate GitHub token:', error); + } catch (error) { + Sentry.captureException(error); + console.error('Failed to migrate GitHub token'); }Also applies to: 78-80, 89-92, 116-117, 121-123
110-116: GuardlocalStorageaccess for SSR/Node environments.This library may be imported in server contexts. Direct
localStorageaccess will throwReferenceError. Add a runtime guard and usewindow.localStorageexplicitly.export async function migrateFromLocalStorage(): Promise<boolean> { try { - const legacyToken = localStorage.getItem('github_access_token'); + if (typeof window === 'undefined' || typeof window.localStorage === 'undefined') { + return false; + } + const legacyToken = window.localStorage.getItem('github_access_token'); if (legacyToken && validateGitHubToken(legacyToken)) { await setGitHubToken(legacyToken); - localStorage.removeItem('github_access_token'); + window.localStorage.removeItem('github_access_token'); console.info('Migrated GitHub token from localStorage to secure storage'); return true; }src/components/GitHubIntegration.tsx (5)
27-31: Critical: logger is imported from the wrong source (Sentry), breaking error capture
@sentry/reactdoes not exportlogger. Our repo's logger lives insrc/lib/error-handler.ts(see referenced snippet) and conditionally forwards realErrorinstances to the error handler/Sentry. Importing from Sentry and destructuring{ logger }will yieldundefined/wrong object and silently drop error reporting.Apply this diff:
-import * as Sentry from '@sentry/react'; - -const { logger } = Sentry; +import { logger } from '@/lib/error-handler';
150-156: Do not surface raw exception messages to end-usersPer guidelines, avoid leaking internal/3P API error details in UI strings. Replace with actionable, generic text; keep full details in logs. Also ensure
operationStatus.messagenever displays raw backend messages.Apply this diff:
- setOperationStatus({ - stage: 'error', - message: error instanceof Error ? error.message : 'Failed to load repository', - progress: 0 - }); - toast.error(error instanceof Error ? error.message : 'Failed to load repository'); + setOperationStatus({ + stage: 'error', + message: 'Failed to load repository. Please verify the URL and your token, then try again.', + progress: 0 + }); + toast.error('Failed to load repository. Please verify the URL and your token, then try again.');- setOperationStatus({ - stage: 'error', - message: error instanceof Error ? error.message : 'Failed to create pull request', - progress: 0 - }); - toast.error(error instanceof Error ? error.message : 'Failed to create pull request'); + setOperationStatus({ + stage: 'error', + message: 'Failed to create pull request. Please check your permissions and try again.', + progress: 0 + }); + toast.error('Failed to create pull request. Please check your permissions and try again.');Additionally, for the status chip text:
-<span className="text-sm">{operationStatus.message}</span> +<span className="text-sm">{operationStatus.message}</span>(Keep as-is if you ensure
operationStatus.messageis always a safe, pre-curated string, as above.)Also applies to: 248-254, 476-477
498-506: Wrap user-generated content with SafeText (XSS defense, mandated by guidelines)
currentRepo.full_name, currentRepo.description, and currentRepo.language are user-controlled (GitHub data). Even though React escapes, our standard requires SafeText for all user-generated text.

Add the import (see next comment for import location), then apply:
- <h3 className="font-semibold">{currentRepo.full_name}</h3> - <p className="text-sm text-muted-foreground">{currentRepo.description}</p> + <h3 className="font-semibold"><SafeText text={currentRepo.full_name} /></h3> + <p className="text-sm text-muted-foreground"><SafeText text={currentRepo.description ?? ''} /></p>- {currentRepo.language && ( - <Badge variant="secondary">{currentRepo.language}</Badge> - )} + {currentRepo.language && ( + <Badge variant="secondary"><SafeText text={currentRepo.language} /></Badge> + )}Also applies to: 508-513
72-76: Add explicit return types, error handling, and input validation

The async helpers in src/components/GitHubIntegration.tsx should be annotated with Promise<void>, wrapped in try/catch to surface errors, and sanitize/validate user-provided URLs. A search in .ts/.tsx files didn’t reveal any shared URL validator, so you’ll need to implement one (e.g. via Zod’s url() or a simple regex) and trim inputs before use.

Locations to update:
src/components/GitHubIntegration.tsxlines 72–76 (checkGitHubSetup)- lines 104–109 (
parseAndLoadRepo)- lines 159–168 (
createPullRequest)Suggested diffs:
- const checkGitHubSetup = async () => { + const checkGitHubSetup = async (): Promise<void> => { try { const isSetup = await initializeGitHub(); setIsTokenSetup(isSetup); } catch (error) { console.error("Failed to initialize GitHub:", error); // optionally surface to user/UI }- const parseAndLoadRepo = async () => { + const parseAndLoadRepo = async (): Promise<void> => { try { + const repoUrl = githubUrl.trim(); + if (!isValidGitHubUrl(repoUrl)) { + throw new Error("Invalid repository URL"); + } const repo = parseRepo(repoUrl); await loadRepository(repo); } catch (error) { console.error("Error parsing or loading repo:", error); }- const createPullRequest = async () => { + const createPullRequest = async (): Promise<void> => { try { + const repoUrl = githubUrl.trim(); + if (!isValidGitHubUrl(repoUrl)) { + throw new Error("Invalid repository URL"); + } const pr = await githubService.createPR(repoUrl, prDetails); handlePrCreated(pr); } catch (error) { console.error("Failed to create pull request:", error); }You’ll also need to add or import a helper such as:
import { z } from "zod"; const isValidGitHubUrl = (url: string): boolean => z.string().url().regex(/github\.com\/[^/]+\/[^/]+/).safeParse(url).success;
170-208: Sanitize and bound PR title/description using existing security utilitiesBefore generating branch names, composing commits, or opening PRs, we should sanitize the PR title and description and enforce length limits to prevent abuse and unexpected API errors. We already have
sanitizeText,MAX_TITLE_LENGTH, andMAX_MESSAGE_LENGTHdefined insrc/utils/security.ts, so let’s reuse those.Please apply the following changes around lines 170–208 (and similarly at 215–225) of
src/components/GitHubIntegration.tsx:--- a/src/components/GitHubIntegration.tsx +++ b/src/components/GitHubIntegration.tsx @@ -import { toast } from 'sonner'; +import { toast } from 'sonner'; +import { sanitizeText, MAX_TITLE_LENGTH, MAX_MESSAGE_LENGTH } from '@/utils/security'; @@ try { - const originalOwner = currentRepo.owner.login; + const originalOwner = currentRepo.owner.login; const originalRepo = currentRepo.name; @@ setOperationStatus({ @@ // Fork the repository @@ const userLogin = forkedRepo.owner.login; @@ setOperationStatus({ @@ - // Create a new branch - const branchName = githubService.generateBranchName(prTitle); + // Sanitize inputs and generate a safe branch name + const sanitizedTitle = sanitizeText(prTitle.trim()).substring(0, MAX_TITLE_LENGTH); + const sanitizedDescription = sanitizeText(prDescription.trim()).substring(0, MAX_MESSAGE_LENGTH); + const branchName = githubService.generateBranchName(sanitizedTitle); await githubService.createBranch(userLogin, originalRepo, branchName, currentRepo.default_branch); @@ - await githubService.updateFiles( - userLogin, - originalRepo, - branchName, - changes, - prTitle - ); + await githubService.updateFiles( + userLogin, + originalRepo, + branchName, + changes, + sanitizedTitle + ); @@ - const prOptions: CreatePullRequestOptions = { + const prOptions: CreatePullRequestOptions = { owner: originalOwner, repo: originalRepo, - title: prTitle, - body: prDescription, + title: sanitizedTitle, + body: sanitizedDescription, headBranch: branchName, baseBranch: currentRepo.default_branch, originalOwner, };• We import
sanitizeText,MAX_TITLE_LENGTH, andMAX_MESSAGE_LENGTHfrom@/utils/security.
• We trim and sanitize bothprTitleandprDescription, then clamp them to their respective max lengths.
• We generate the branch name and pass only the sanitized title toupdateFiles.
• We open the pull request with sanitized title and description.This will ensure all user inputs are properly escaped and bounded before interacting with GitHub’s APIs.
src/lib/firecrawl.ts (2)
84-90: Leaking secret: VITE_FIRECRAWL_API_KEY will be exposed to the browser bundle.

Using import.meta.env.VITE_... makes the Firecrawl API key public to all clients. This is a high-severity secret exposure.

Recommended approach:

- Move Firecrawl calls to a server-side API route (e.g., /api/firecrawl/*) that holds the secret in server env (no VITE_ prefix).
- The browser calls your API route without the key.
- If client-side crawl is strictly required, use a short-lived server-issued token and rate-limit aggressively.
274-276: Possible division by zero when there are no crawled pages.If
crawlResult.totalPagesis 0,avgLoadTimebecomesInfinity. Guard this case.Apply this diff:
- avgLoadTime: crawlResult.crawlTime / crawlResult.totalPages, + avgLoadTime: crawlResult.totalPages > 0 + ? crawlResult.crawlTime / crawlResult.totalPages + : 0,api/secret-chat.ts (1)
6-7: Edge runtime mismatch: handler uses NodeVercelRequest/VercelResponse.
export const runtime = 'edge'requires a Web Fetch API handler(req: Request) => Response. Current code imports@vercel/nodetypes and usesres.status().json(...), which is incompatible and will fail in Edge.Apply this refactor to convert the handler to Edge-compatible Fetch API:
-import type { VercelRequest, VercelResponse } from '@vercel/node'; +// Edge runtime uses the standard Fetch API types (Request/Response) export const runtime = 'edge'; -export default async function handler(req: VercelRequest, res: VercelResponse) { - if (req.method !== 'POST') { - return res.status(405).json({ error: 'Method not allowed' }); - } +export default async function handler(req: Request): Promise<Response> { + if (req.method !== 'POST') { + return new Response(JSON.stringify({ error: 'Method not allowed' }), { + status: 405, + headers: { 'content-type': 'application/json' }, + }) + } @@ - const authResult = await verifyAuth({ headers: new Headers(req.headers as Record<string, string>) }); + const authResult = await verifyAuth({ headers: req.headers }); @@ - const { messages, apiKey, model = 'gemini-2.0-flash-exp' }: ChatRequest = req.body; + const { messages, apiKey, model = 'gemini-2.0-flash-exp' }: ChatRequest = await req.json(); @@ - const isStreaming = req.headers['accept']?.includes('text/stream') || - req.headers['x-stream'] === 'true'; + const accept = req.headers.get('accept') ?? '' + const isStreaming = accept.includes('text/stream') || req.headers.get('x-stream') === 'true' @@ - return result.toTextStreamResponse(); + return result.toTextStreamResponse() @@ - return res.status(200).json({ - message: { - role: 'assistant', - content: text, - }, - usage: { - promptTokens: usage?.inputTokens || 0, - completionTokens: usage?.outputTokens || 0, - totalTokens: usage?.totalTokens || 0, - }, - }); + return new Response(JSON.stringify({ + message: { role: 'assistant', content: text }, + usage: { + promptTokens: usage?.inputTokens ?? 0, + completionTokens: usage?.outputTokens ?? 0, + totalTokens: usage?.totalTokens ?? 0, + }, + }), { status: 200, headers: { 'content-type': 'application/json' } }) @@ - } catch (error: unknown) { - const errorMessage = error instanceof Error ? error.message : String(error); - console.error('Secret chat API error:', error); - - // Handle specific API errors - if (errorMessage.includes('API key')) { - return res.status(401).json({ - error: 'Invalid or expired API key. Please check your Gemini API key.' - }); - } - - if (errorMessage.includes('quota') || errorMessage.includes('rate limit')) { - return res.status(429).json({ - error: 'API quota exceeded or rate limit reached. Please try again later.' - }); - } - - if (errorMessage.includes('model')) { - return res.status(400).json({ - error: 'Invalid model specified. Please use a valid Gemini model.' - }); - } - - return res.status(500).json({ - error: 'An error occurred while processing your request. Please try again.' - }); - } + } catch (error: unknown) { + const errorMessage = error instanceof Error ? error.message : String(error) + console.error('Secret chat API error:', error) + + if (errorMessage.includes('API key')) { + return new Response(JSON.stringify({ + error: 'Invalid or expired API key. Please check your Gemini API key.' + }), { status: 401, headers: { 'content-type': 'application/json' } }) + } + if (errorMessage.includes('quota') || errorMessage.includes('rate limit')) { + return new Response(JSON.stringify({ + error: 'API quota exceeded or rate limit reached. Please try again later.' + }), { status: 429, headers: { 'content-type': 'application/json' } }) + } + if (errorMessage.includes('model')) { + return new Response(JSON.stringify({ + error: 'Invalid model specified. Please use a valid Gemini model.' 
+ }), { status: 400, headers: { 'content-type': 'application/json' } }) + } + return new Response(JSON.stringify({ + error: 'An error occurred while processing your request. Please try again.' + }), { status: 500, headers: { 'content-type': 'application/json' } }) + } }Also applies to: 19-23
src/components/WebsiteCloneDialog.tsx (1)
401-413: Sanitize and safely render analysis fields (user-generated content)Fields like title, description, url, technologies, components, designPatterns, and color values originate from external content and should be rendered via SafeText (or sanitized) to prevent XSS. Also validate the URL before window.open.
Recommendations:
- Replace direct string rendering with SafeText for all analysis.* user-facing fields.
- Guard window.open with a protocol check (http/https only) and encode the URL.
Example (illustrative; adjust imports/paths to your SafeText and sanitize helpers):
// imports // import { SafeText } from '@/components/SafeText'; // import { sanitizeText } from '@/lib/security'; // render title/description/url: <h3 className="text-lg font-semibold text-gradient-static mb-2"> {/* fallback handled by SafeText */} <SafeText text={analysis.title ?? ''} /> </h3> <p className="text-muted-foreground text-sm mb-3"> <SafeText text={analysis.description ?? ''} /> </p> <div className="flex items-center gap-2 text-xs text-muted-foreground"> <Globe className="w-3 h-3" /> <SafeText text={analysis.url} /> <ExternalLink className="w-3 h-3 ml-2 cursor-pointer hover:text-blue-400" onClick={() => { try { const u = new URL(analysis.url); if (u.protocol === 'https:' || u.protocol === 'http:') window.open(u.href, '_blank', 'noopener,noreferrer'); } catch { /* ignore invalid */ } }} /> </div> // lists: {analysis.technologies?.map((tech, i) => ( <Badge key={i} variant="secondary" className="text-xs"> <SafeText text={tech} /> </Badge> ))}If SafeText is unavailable, sanitize with sanitizeText() before rendering and ensure no usage of dangerouslySetInnerHTML. I can push a patch once you confirm the SafeText import path.
Would you like me to wire up SafeText/sanitizeText across the component and add unit tests to exercise common payloads?
Also applies to: 424-449, 455-467, 479-483, 496-501, 513-523
src/hono-server.ts (3)
41-53: Polar client is used but never importedgetPolarClient() constructs new Polar(...) but Polar is undefined; import the SDK client from @polar-sh/sdk. The @polar-sh/hono package provides Hono adapters (Checkout/CustomerPortal/Webhooks), while the API client comes from @polar-sh/sdk. (docs.polar.sh, npmjs.com)
Apply this diff:
@@ -import { Hono, Context, Next } from 'hono'; +import { Hono, Context, Next } from 'hono'; +import { Polar } from '@polar-sh/sdk'Also applies to: 1-7
183-187: Do not trust customerId from query params for Customer PortalAllowing a user-supplied customerId enables account takeover of the portal. Resolve the Polar customer ID from the authenticated user on the server (e.g., via your DB mapping) and ignore the query param.
- getCustomerId: (event) => { - // Extract customer ID from query params or user context - const customerId = event.req.query('customerId'); - return customerId || ''; - }, + getCustomerId: async (event) => { + // Derive from authenticated user → server-side mapping only + const user = event.get<{ id: string }>('user') + // TODO: Implement a secure lookup: user.id -> polarCustomerId + const polarCustomerId = await /* fetch from DB using user.id */ + return polarCustomerId + },I can wire this to your Convex DB (user -> Polar customer mapping) if you share the schema.
56-76: Use the correct Clerk JWT key (not the issuer domain) forverifyToken()The
verifyToken()SDK method requires either ajwtKey(your JWKS public key) or asecretKey(your Clerk Secret Key) – passing the issuer domain will cause verification to fail. You should:
- Remove
CLERK_JWT_ISSUER_DOMAINentirely from this call.- Use
process.env.CLERK_JWT_KEY(your JWKS public key) or, as a fallback,process.env.CLERK_SECRET_KEY.- Continue to pass
audiencefromprocess.env.CLERK_JWT_AUDIENCEif applicable, sinceVerifyTokenOptionssupports anaudience?: string | string[](clerk.com).- (Optional) Supply
authorizedParties: process.env.CLERK_AUTHORIZED_PARTIES?.split(',')for CSRF hardening, matching the SDK’sauthorizedParties: string[]option (clerk.com).Please update
src/hono-server.ts(lines 56–76) accordingly:@@ async function verifyClerkToken(authHeader: string): Promise<{ id: string; email?: string } | null> { - const issuer = process.env.CLERK_JWT_ISSUER_DOMAIN; - - if (!issuer || !token) { - return null; - } + if (!token) { + return null; + } const audience = process.env.CLERK_JWT_AUDIENCE; const { verifyToken } = await import('@clerk/backend'); - - const options: { jwtKey?: string; audience?: string } = { jwtKey: issuer }; - if (audience) options.audience = audience; + const jwtKey = process.env.CLERK_JWT_KEY; + const secretKey = process.env.CLERK_SECRET_KEY; + + const options: { + jwtKey?: string; + secretKey?: string; + audience?: string | string[]; + authorizedParties?: string[]; + } = {}; + + if (jwtKey) { + options.jwtKey = jwtKey; + } else if (secretKey) { + options.secretKey = secretKey; + } + + if (audience) { + options.audience = audience; + } + + if (process.env.CLERK_AUTHORIZED_PARTIES) { + options.authorizedParties = process.env.CLERK_AUTHORIZED_PARTIES.split(','); + } const verified = await verifyToken(token, options) as { sub?: string; email?: string }; if (!verified.sub) return null;This ensures you’re supplying a valid JWKS public key (or secret) to the SDK as documented.
src/components/E2BCodeExecution.tsx (2)
60-71: Fix missing dependencies in autoRun effect (and avoid suppressing lint).The effect intentionally ignores deps; this can cause stale code/language execution. Use useCallback for handleExecute and include it as a dependency.
Apply this diff:
-useEffect(() => { - if (autoRun) { - // fire and forget, errors are handled inside - // slight delay to ensure mount animation completes - const t = setTimeout(() => { - handleExecute(); - }, 200); - return () => clearTimeout(t); - } -// eslint-disable-next-line react-hooks/exhaustive-deps -}, [autoRun]); +useEffect(() => { + if (!autoRun) return; + const t = setTimeout(() => { + handleExecute(); + }, 200); + return () => clearTimeout(t); +}, [autoRun, handleExecute]);(See next comment to wrap handleExecute in useCallback.)
72-114: Wrap handleExecute in useCallback to stabilize dependencies.Stabilizes the function reference and eliminates stale-closure risks.
Apply this diff:
-const handleExecute = async () => { +const handleExecute = React.useCallback(async () => { setIsExecuting(true); setE2bFailed(false); setShowFailsafe(false); try { const executionResult = await onExecute(code, language); setResult(executionResult); if (executionResult.success) { toast.success('Code executed successfully with E2B!'); } else { toast.error('E2B execution failed. WebContainer failsafe available.'); setE2bFailed(true); } } catch (error) { console.error('E2B execution error:', error); const errorMessage = String(error); // Check if it's an E2B-related error const isE2bError = errorMessage.includes('E2B') || errorMessage.includes('sandbox') || errorMessage.includes('timeout') || errorMessage.includes('connection') || errorMessage.includes('network'); if (isE2bError) { toast.error('E2B service unavailable. Activating WebContainer failsafe...'); setE2bFailed(true); setShowFailsafe(true); } else { toast.error('Code execution failed'); } setResult({ success: false, error: errorMessage, logs: [] }); } finally { setIsExecuting(false); } -}; +}, [code, language, onExecute]);src/components/ChatInterface.tsx (1)
149-151: Avoid duplicating security constants; import MAX_MESSAGE_LENGTH from utils.There’s an exported MAX_MESSAGE_LENGTH in src/utils/security.ts. Import and use it to prevent drift.
Apply this diff:
+import { MAX_MESSAGE_LENGTH } from '@/utils/security'; @@ -// Security constants for input validation -const MAX_MESSAGE_LENGTH = 10000;
♻️ Duplicate comments (12)
src/components/ui/input-otp.tsx (1)
36-37: Silence security/detect-object-injection by validating array and using .at() (keeps your new null-safety, avoids the lint error).Nice guard against missing context. However, ESLint flags Line 36 (“Generic Object Injection Sink”) because of computed bracket access
slots?.[index]. While this is a false positive for array indexing, it will still fail CI. Validate thatslotsis an array and switch to.at(); also add light range/shape defaults to keep rendering predictable.Apply this diff:
- const slot = inputOTPContext?.slots?.[index] || null - const { char, hasFakeCaret, isActive } = slot || {} + const slots = inputOTPContext?.slots + const slot = + Array.isArray(slots) && Number.isInteger(index) && index >= 0 + ? slots.at(index) ?? null + : null + const { char = "", hasFakeCaret = false, isActive = false } = slot ?? {}If you must keep bracket indexing due to browser targets without
.at(), add an inline eslint disable with justification and retain theArray.isArray/bounds checks:// eslint-disable-next-line security/detect-object-injection -- safe: validated array + bounded numeric index const slot = Array.isArray(slots) && Number.isInteger(index) && index >= 0 ? slots[index] ?? null : nullsrc/components/LoadingStates.tsx (1)
52-62: Harden getSizeClass: remove object-injection sink and tighten typesIndexing
sizeMap[safeSize]with a string key plusRecord<string, string>triggers the ESLint security rule (security/detect-object-injection) and matches the prior GHAS alert on Line 61. We don’t need dynamic keys here—sizeis a fixed union. Switch on the union (or, at minimum, narrow the type and guard), and keep the default in a branch, not via||. This also aligns with our strict TypeScript guidelines.Apply this diff:
- const getSizeClass = (size: string) => { - const safeSize = size || 'md'; - const sizeMap: Record<string, string> = { - xs: sizeClasses.xs, - sm: sizeClasses.sm, - md: sizeClasses.md, - lg: sizeClasses.lg, - xl: sizeClasses.xl - }; - return sizeMap[safeSize] || sizeClasses.md; - }; + const getSizeClass = (size?: LoadingSpinnerProps['size']): string => { + switch (size) { + case 'xs': + return sizeClasses.xs; + case 'sm': + return sizeClasses.sm; + case 'lg': + return sizeClasses.lg; + case 'xl': + return sizeClasses.xl; + case 'md': + case undefined: + default: + return sizeClasses.md; + } + };src/lib/github-service.ts (1)
129-133: Fix potentially unsafe regular expressions flagged by static analysis.The regular expressions on lines 131-132 are flagged as potentially unsafe by ESLint security checks. These patterns could be vulnerable to ReDoS (Regular Expression Denial of Service) attacks with maliciously crafted input.
The issue stems from the nested quantifiers and alternations that can cause exponential backtracking. Apply this diff to use safer, more specific patterns:
- /^(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)(?:\/.*)?$/, - /^(?:https?:\/\/)?(?:www\.)?github\.com\/([^/]+)\/([^/]+)\.git$/, + /^https?:\/\/(?:www\.)?github\.com\/([^/]+)\/([^/]+?)(?:\.git)?(?:\/.*)?$/, /^git@github\.com:([^/]+)\/([^/]+)\.git$/,This consolidates the first two patterns into one safer regex that:
- Requires the protocol to be present (removes optional
(?:https?:\/\/)?)- Uses a non-greedy quantifier
+?for the repo name- Combines the
.gitand path suffix handlingAlternatively, consider using URL parsing without regex for HTTP(S) URLs:
// For HTTP(S) URLs, use URL parsing instead of regex if (url.startsWith('http://') || url.startsWith('https://')) { const parsedUrl = new URL(url); if (allowedDomains.includes(parsedUrl.hostname.toLowerCase())) { const pathParts = parsedUrl.pathname.split('/').filter(Boolean); if (pathParts.length >= 2) { const owner = pathParts[0]; const repo = pathParts[1].replace(/\.git$/, ''); // Continue with validation... } } } // Only use regex for git@ format else if (url.startsWith('git@')) { const gitPattern = /^git@github\.com:([^/]+)\/([^/]+)\.git$/; const match = url.match(gitPattern); // Process match... }src/components/LivePreview.tsx (1)
127-133: Address missing dependency warning by depending on createSecurePreviewUrl.

This fixes the static analysis warning reported previously. Apply this diff:

```diff
-}, [code, language]);
+}, [code, language, createSecurePreviewUrl]);
```

lib/deployment/netlify.ts (1)
522-527: Use robust host parsing for git provider detection (fixes CodeQL finding)

includes('github.com') etc. is vulnerable to crafted URLs. Parse the host (and SCP-like syntax) and validate against a whitelist. Apply this diff:

```diff
-  private extractGitProvider(url: string): string {
-    if (url.includes('github.com')) return 'github';
-    if (url.includes('gitlab.com')) return 'gitlab';
-    if (url.includes('bitbucket.org')) return 'bitbucket';
-    return 'github'; // default
-  }
+  private extractGitProvider(url: string): 'github' | 'gitlab' | 'bitbucket' {
+    const scpLike = /^git@([^:]+):/i.exec(url)?.[1]?.toLowerCase();
+    const host = scpLike ?? (() => {
+      try { return new URL(url).hostname.toLowerCase(); } catch { return ''; }
+    })();
+    switch (host) {
+      case 'github.com': return 'github';
+      case 'gitlab.com': return 'gitlab';
+      case 'bitbucket.org': return 'bitbucket';
+      default: throw new DeploymentError(`Unsupported git host: ${host || 'unknown'}`, 'netlify', 'UNSUPPORTED_GIT_HOST');
+    }
+  }
```

Optionally tighten extractRepoPath to only extract when host is recognized, as sketched below.
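A minimal sketch of that optional extractRepoPath tightening, written as a standalone helper under the assumption that it should return an "owner/repo" path only for whitelisted hosts; the actual method in lib/deployment/netlify.ts may have a different signature:

```ts
// Sketch only: extract "owner/repo" from a git URL, but only for recognized hosts.
function extractRepoPath(url: string): string | null {
  const allowedHosts = ['github.com', 'gitlab.com', 'bitbucket.org'];

  // SCP-like syntax: git@github.com:owner/repo.git
  const scpLike = /^git@([^:]+):(.+)$/i.exec(url);
  let host = scpLike?.[1]?.toLowerCase() ?? '';
  let path = scpLike?.[2] ?? '';

  if (!scpLike) {
    try {
      const parsed = new URL(url);
      host = parsed.hostname.toLowerCase();
      path = parsed.pathname;
    } catch {
      return null; // not a parseable URL
    }
  }

  if (!allowedHosts.includes(host)) return null;

  const [owner, repo] = path.replace(/^\//, '').replace(/\.git$/, '').split('/');
  return owner && repo ? `${owner}/${repo}` : null;
}
```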
api/domains.ts (1)

522-535: Dynamic indexing is safe here; ESLint warning is a false positive

safePlatform is validated by isPlatform before indexing baseInstructions, so this is not an injection sink.
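A minimal sketch of the validate-then-index pattern described above; the Platform union, isPlatform guard, and baseInstructions values are assumptions inferred from the comment, not the exact code in api/domains.ts:

```ts
// Sketch only: names and shapes are assumptions based on the review comment.
type Platform = 'netlify' | 'vercel';

const baseInstructions: Record<Platform, string> = {
  netlify: 'Point your CNAME at <your-site>.netlify.app', // illustrative value
  vercel: 'Point your CNAME at cname.vercel-dns.com',      // illustrative value
};

// Type guard: narrows an arbitrary string to the Platform union.
function isPlatform(value: string): value is Platform {
  return value === 'netlify' || value === 'vercel';
}

function getInstructions(platform: string): string | null {
  if (!isPlatform(platform)) return null;
  // Safe: platform is now one of two literal keys, not arbitrary user input.
  return baseInstructions[platform];
}
```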
api/deploy.ts (1)

228-232: Duplicate subdomain validation logic

The subdomain validation regex here duplicates the validation logic in the manager. Consider using the manager's validation or extracting it to a shared utility.

Consider extracting the validation to a shared utility function:

```diff
-// Validate subdomain format
-if (subdomain && !/^[a-z0-9-]{3,63}$/i.test(subdomain)) {
-  return res.status(400).json({
-    error: 'Invalid subdomain format. Must be 3-63 characters, alphanumeric and hyphens only.'
-  });
-}
+// Validate subdomain format using the manager's validation
+if (subdomain && !deploymentManager.isValidSubdomain(subdomain)) {
+  return res.status(400).json({
+    error: 'Invalid subdomain format. Must be 3-63 characters, alphanumeric and hyphens only.'
+  });
+}
```

Note: This would require exposing the isValidSubdomain method as public in the manager class.
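A minimal sketch of what exposing that validation could look like; the class name ZapdevDeploymentManager comes from the PR walkthrough, but placing the method there and its exact shape are assumptions:

```ts
// Sketch only: the manager owns the canonical subdomain rule so routes can reuse it.
export class ZapdevDeploymentManager {
  /** Public so API routes can call this instead of duplicating the regex. */
  public isValidSubdomain(subdomain: string): boolean {
    // 3-63 characters, alphanumeric and hyphens only (same rule as api/deploy.ts).
    return /^[a-z0-9-]{3,63}$/i.test(subdomain);
  }
}
```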
lib/deployment/vercel.ts (1)

567-570: Security: URL validation vulnerability for Git provider detection

The current implementation has a security vulnerability where malicious URLs like malicious.com?github.com or github.com.attacker.com could bypass the validation. Apply this diff to fix the URL validation:

```diff
 private extractGitProvider(url: string): string {
-  if (url.includes('github.com')) return 'github';
-  if (url.includes('gitlab.com')) return 'gitlab';
-  if (url.includes('bitbucket.org')) return 'bitbucket';
+  try {
+    const parsedUrl = new URL(url);
+    const hostname = parsedUrl.hostname.toLowerCase();
+
+    if (hostname === 'github.com' || hostname === 'www.github.com') return 'github';
+    if (hostname === 'gitlab.com' || hostname === 'www.gitlab.com') return 'gitlab';
+    if (hostname === 'bitbucket.org' || hostname === 'www.bitbucket.org') return 'bitbucket';
+  } catch {
+    // Invalid URL, fall through to default
+  }
   return 'github'; // default
 }
```

api-dev-server.ts (4)
546-546: False positive: File path operations are properly validated

The static analysis warnings about non-literal fs operations are false positives. The code properly validates and sanitizes paths before using them with fs operations.

Also applies to: 623-624
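A minimal sketch of the validate-before-fs pattern this comment refers to; the resolveApiModule name, the `.ts` extension, and the base-directory convention are assumptions for illustration, not the server's actual helper:

```ts
import { promises as fs } from 'fs';
import path from 'path';

// Sketch only: resolve a request path to a file inside a fixed base directory,
// rejecting traversal attempts before any fs call is made.
async function resolveApiModule(requestPath: string, baseDir: string): Promise<string | null> {
  // Allow only a conservative character set for the endpoint name.
  if (!/^[a-zA-Z0-9/_-]+$/.test(requestPath)) return null;

  const candidate = path.resolve(baseDir, `${requestPath}.ts`);

  // The resolved path must stay inside baseDir (blocks ../ traversal).
  if (!candidate.startsWith(path.resolve(baseDir) + path.sep)) return null;

  try {
    await fs.access(candidate); // safe: candidate was validated above
    return candidate;
  } catch {
    return null;
  }
}
```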
123-125: Security: ReDoS vulnerability in IP validation regexes

The IPv4 and IPv6 regex patterns are vulnerable to ReDoS attacks due to nested quantifiers.

Replace with safer validation using the built-in net module:

```diff
-// Basic IPv4 validation
-const ipv4Regex = /^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/;
-// Basic IPv6 validation (simplified)
-const ipv6Regex = /^(?:[0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}$|^::1$|^::$/;
-
-const trimmedIP = ip.trim();
-
-if (ipv4Regex.test(trimmedIP) || ipv6Regex.test(trimmedIP)) {
-  return trimmedIP;
-}
+import { isIPv4, isIPv6 } from 'net';
+
+const trimmedIP = ip.trim();
+
+if (isIPv4(trimmedIP) || isIPv6(trimmedIP)) {
+  return trimmedIP;
+}
```
462-466: Security: CORS credentials vulnerability

Setting Access-Control-Allow-Credentials to true while allowing wildcard origins creates a security vulnerability. Apply this diff to fix the CORS configuration:

```diff
-const allowedOrigin = CONFIG.CORS_ORIGINS.includes('*') ||
-  (origin && CONFIG.CORS_ORIGINS.includes(origin)) ? (origin || '*') : null;
-
-if (allowedOrigin) res.setHeader('Access-Control-Allow-Origin', allowedOrigin);
-res.setHeader('Access-Control-Allow-Credentials', 'true');
+const allowedOrigin = origin && CONFIG.CORS_ORIGINS.includes(origin) ? origin : null;
+
+if (allowedOrigin) {
+  res.setHeader('Access-Control-Allow-Origin', allowedOrigin);
+  res.setHeader('Access-Control-Allow-Credentials', 'true');
+} else if (CONFIG.CORS_ORIGINS.includes('*')) {
+  res.setHeader('Access-Control-Allow-Origin', '*');
+  // Don't set credentials when using wildcard
+}
```
280-283: Object injection vulnerability in query/cookie parsing

The code uses user-controlled input as object keys without validation, which could lead to prototype pollution.

Apply this diff to prevent prototype pollution:

```diff
 private parseQuery(searchParams: URLSearchParams): { [key: string]: string | string[] } {
   const query: { [key: string]: string | string[] } = {};
   for (const [key, value] of searchParams.entries()) {
     const safeKey = String(key);
+    // Prevent prototype pollution
+    if (safeKey === '__proto__' || safeKey === 'constructor' || safeKey === 'prototype') {
+      continue;
+    }
     const existingValue = query[safeKey];
```

Apply similar protection in parseCookies and parseBody methods:

```diff
 // In parseCookies
 try {
   const decodedValue = decodeURIComponent(rawValue);
+  // Prevent prototype pollution
+  if (trimmedName === '__proto__' || trimmedName === 'constructor' || trimmedName === 'prototype') {
+    skippedCount++;
+    return;
+  }
   cookies[trimmedName] = decodedValue;
```

```diff
 // In parseBody
 for (const [key, value] of params.entries()) {
   const safeKey = String(key);
+  // Prevent prototype pollution
+  if (safeKey === '__proto__' || safeKey === 'constructor' || safeKey === 'prototype') {
+    continue;
+  }
   result[safeKey] = value;
```

Also applies to: 300-308, 325-325
High Priority Fixes:

- Replace vulnerable regex patterns in IP validation with safe string operations
- Secure cookie parsing with Object.create(null) to prevent prototype pollution
- Enhanced file system operations with additional validation layers
- Add PostHog analytics payload size limits (32KB) and comprehensive PII sanitization
- Implement error message sanitization to prevent information leakage

Security Improvements:

- Safe IPv4/IPv6 validation without regex DoS vulnerability
- Cookie name/value validation with length limits and safe patterns
- Multi-layer path traversal protection for API endpoint resolution
- PII pattern detection and redaction for analytics
- Development vs production error handling with safe messaging
- ESLint security rule compliance with appropriate exemptions for validated cases

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
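A minimal sketch of the Object.create(null) cookie-parsing approach listed above, assuming a plain name=value Cookie header; the function name, length limits, and name pattern are illustrative, not the server's actual implementation:

```ts
// Sketch only: parse a Cookie header into a null-prototype map so keys like
// "__proto__" cannot reach Object.prototype.
function parseCookies(header: string | undefined): Record<string, string> {
  const cookies: Record<string, string> = Object.create(null);
  if (!header) return cookies;

  for (const pair of header.split(';')) {
    const eq = pair.indexOf('=');
    if (eq === -1) continue;

    const name = pair.slice(0, eq).trim();
    const rawValue = pair.slice(eq + 1).trim();

    // Conservative name pattern and length limits (illustrative values).
    if (!/^[\w.-]{1,128}$/.test(name) || rawValue.length > 4096) continue;

    try {
      cookies[name] = decodeURIComponent(rawValue);
    } catch {
      // Skip values that are not valid percent-encoding.
    }
  }
  return cookies;
}
```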
|
| GitGuardian id | GitGuardian status | Secret | Commit | Filename | |
|---|---|---|---|---|---|
| 20372496 | Triggered | Generic High Entropy Secret | 72993ac | .env.deployment.template | View secret |
| 20372498 | Triggered | Generic High Entropy Secret | 72993ac | .env.deployment.template | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secrets safely; learn the best practices here.
- Revoke and rotate these secrets.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following these best practices for managing and storing secrets including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
|
Claude finished @otdoges's task — View job

PR Review: Secure API Server Implementation
|
```ts
// Skip potentially sensitive keys
const isSensitive = PII_PATTERNS.some(pattern => pattern.test(key));
if (isSensitive) {
  sanitized[key] = '[REDACTED]';
```

Check failure: Code scanning / ESLint (Error): Detects "variable[key]" as a left- or right-hand assignment operand.
```ts
if (isDevelopment && CONFIG.NODE_ENV === 'development') {
  // Remove sensitive paths and internal details but keep useful info
  const sanitized = errorMessage
    .replace(/\/[^/\s]*\/([^/\s]*\/)*[^/\s]*(\.(js|ts|json))/g, '[FILE_PATH]')
```

Check failure: Code scanning / ESLint (Error): Detects potentially unsafe regular expressions, which may take a very long time to run, blocking the event loop.
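One way to redact file paths without a backtracking-prone regex is to process the message token by token; a sketch, assuming the goal is simply to replace absolute .js/.ts/.json paths with a placeholder:

```ts
// Sketch only: linear-time path redaction without nested quantifiers.
// Note: runs of whitespace are collapsed to single spaces by split/join.
function redactFilePaths(message: string): string {
  return message
    .split(/\s+/)
    .map((token) =>
      token.startsWith('/') && /\.(js|ts|json)$/.test(token) ? '[FILE_PATH]' : token
    )
    .join(' ');
}
```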

…d rate limiting