feat: implement @mention feature for calendar notes #500
ngoiyaeric wants to merge 3 commits into main from
Conversation
Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
Walkthrough

Adds inline …

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as Browser User
    participant UI as Calendar Notepad Component
    participant Server as Server Actions
    participant DB as Database
    User->>UI: Type "@" + query
    UI->>Server: searchUsers(query)
    Server->>DB: SELECT users WHERE email ILIKE query%
    DB-->>Server: matching users
    Server-->>UI: user suggestions
    UI->>User: show dropdown
    User->>UI: select suggestion
    UI->>UI: insert formatted @mention, hide dropdown
    User->>UI: Save note
    UI->>Server: saveNote(content)
    Server->>Server: extractAndValidateMentions(content)
    Server->>DB: syncUserWithDatabase() / validate mention IDs
    Server->>DB: INSERT/UPDATE calendar_notes (with userTags)
    DB-->>Server: saved
    Server-->>UI: success
    UI->>User: note persisted / rendered with mentions
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
The current implementation has scaling and correctness problems: the UI triggers searchUsers() on every keystroke without debouncing and can display stale results, while server-side mention validation does a full users table scan. Mention parsing/matching is also too restrictive and ambiguous (@\w+ + startsWith) and can silently tag the wrong user. Finally, the migration adds users.email without indexes/constraints, which will hurt prefix-search performance.
Summary of changes
This PR adds an @mention feature for calendar notes end-to-end:
- UI (Calendar Notepad)
  - Adds an autocomplete dropdown that calls the new server action searchUsers() while typing @....
  - Highlights @mentions and #location tokens when rendering note content.
  - Shows a Users icon when note.userTags exists.
- Server actions / persistence
  - Adds mention parsing/validation in saveNote() by extracting @prefix tokens and mapping them to user IDs.
  - Adds syncUserWithDatabase() and searchUsers() in lib/actions/users.ts.
- Database
  - Adds email column to users via 0002_add_email_to_users.sql.
  - Updates Drizzle migration snapshots and journal.
- Dependencies
  - Bumps drizzle-orm from ^0.29.0 to ^0.45.1.
```tsx
const handleNoteContentChange = async (e: React.ChangeEvent<HTMLTextAreaElement>) => {
  const value = e.target.value;
  setNoteContent(value);

  const cursorPosition = e.target.selectionStart;
  const textBeforeCursor = value.substring(0, cursorPosition);
  const words = textBeforeCursor.split(/\s/);
  const lastWord = words[words.length - 1];

  if (lastWord.startsWith("@")) {
    const query = lastWord.slice(1);
    setMentionQuery(query);
    const results = await searchUsers(query);
    setUserSuggestions(results);
    setShowSuggestions(results.length > 0);
  } else {
    setShowSuggestions(false);
  }
};
```
Mention search triggers a server request on every keystroke (no debouncing / cancellation)
handleNoteContentChange calls searchUsers(query) for every change while the last word starts with @. This can spam server actions and also introduces race conditions where a slower response overwrites a newer query’s suggestions.
Because this runs in a client component, you should debounce input and ignore out-of-order responses (or use AbortController if your server action/fetch supports it).
Suggestion
Debounce the query and guard against stale responses. For example:
```tsx
const [mentionQuery, setMentionQuery] = useState("")
const latestQueryRef = useRef("")

useEffect(() => {
  if (!mentionQuery) {
    setShowSuggestions(false)
    setUserSuggestions([])
    return
  }
  latestQueryRef.current = mentionQuery
  const handle = setTimeout(async () => {
    const q = latestQueryRef.current
    const results = await searchUsers(q)
    // ignore stale responses
    if (latestQueryRef.current !== q) return
    setUserSuggestions(results)
    setShowSuggestions(results.length > 0)
  }, 150)
  return () => clearTimeout(handle)
}, [mentionQuery])
```

Then in onChange, only compute/set mentionQuery and avoid awaiting network work in the event handler.
Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this suggestion.
```tsx
const handleSelectUser = (email: string) => {
  const prefix = email.split('@')[0];
  setNoteContent(prev => prev.replace(/@\w*$/, `@${prefix} `));
  setShowSuggestions(false);
};
```
Selecting a user replaces the last token in the entire textarea, not the token at the cursor
prev.replace(/@\w*$/, ...) only matches an @... at the end of the entire string. If the cursor is in the middle of the textarea, or there is punctuation after the mention, the replacement will fail or replace the wrong text.
This makes the autocomplete feel broken in common editing scenarios (e.g., adding a mention mid-note).
Suggestion
Track the selection start/end and replace relative to the cursor, not the end of the string. One approach:
```tsx
const [selectionStart, setSelectionStart] = useState<number | null>(null)

const handleNoteContentChange = (e: React.ChangeEvent<HTMLTextAreaElement>) => {
  setNoteContent(e.target.value)
  setSelectionStart(e.target.selectionStart)
  // compute mentionQuery from textBeforeCursor and setMentionQuery(...)
}

const handleSelectUser = (email: string) => {
  const prefix = email.split("@")[0]
  setNoteContent(prev => {
    const pos = selectionStart ?? prev.length
    const before = prev.slice(0, pos)
    const after = prev.slice(pos)
    // replace the last @token immediately before cursor
    const updatedBefore = before.replace(/(^|\s)@[^\s@]*$/, `$1@${prefix} `)
    return updatedBefore + after
  })
  setShowSuggestions(false)
}
```
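Stripped of React state, the cursor-relative replacement reduces to pure string logic, which makes it easy to check in isolation. A minimal sketch (function and parameter names are illustrative, not from the PR):

```typescript
// Sketch: replace the @token immediately before the cursor with the chosen
// mention, leaving any text after the cursor intact.
function insertMention(text: string, cursor: number, prefix: string): string {
  const before = text.slice(0, cursor);
  const after = text.slice(cursor);
  // only the last @token directly before the cursor is rewritten
  const updatedBefore = before.replace(/(^|\s)@[^\s@]*$/, `$1@${prefix} `);
  return updatedBefore + after;
}
```

Because the match is anchored to the end of the text before the cursor, a mention typed mid-note no longer clobbers an unrelated `@token` at the end of the textarea.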
```tsx
const renderContent = (text: string) => {
  if (!text) return null;
  return text.split(/(@\w+|#location)/g).map((part, i) => {
    if (part.startsWith('@')) return <span key={i} className="text-primary font-medium">{part}</span>;
    if (part === '#location') return <span key={i} className="text-primary font-medium">{part}</span>;
    return part;
  });
};
```
Mention tokenization is too restrictive (@\w+ only)
Both the UI highlighter and the server-side extractor use @\w+, which excludes valid/expected characters commonly present in emails/usernames (e.g., ., -). This means mentions like @john.doe won't highlight correctly and won’t be validated.
Also, the UI uses split(/(@\w+|#location)/g) which can produce surprising results if the mention is followed by punctuation.
Suggestion
Broaden the mention pattern to match the same character set you intend to support (and keep it consistent with the server extractor). For example, support GitHub-like handles: [a-zA-Z0-9_.-]+.
```ts
const mentionPattern = /@([a-zA-Z0-9_.-]+)/g
```

Update both renderContent and the server-side extractAndValidateMentions to use the same pattern.
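As a quick illustration of the broadened character set, a split-based tokenizer mirroring the renderContent approach (a sketch; names here are illustrative):

```typescript
// Sketch: tokenizer using the broadened mention pattern, mirroring the
// split()-with-capturing-group approach used by the highlighter.
const MENTION_OR_LOCATION = /(@[a-zA-Z0-9_.-]+|#location)/g;

function tokenize(text: string): string[] {
  // a capturing group in the separator keeps the matched tokens in the result
  return text.split(MENTION_OR_LOCATION).filter(part => part !== "");
}
```

With this pattern, `@john.doe` comes back as a single token instead of being split at the dot.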
```tsx
const [dateOffset, setDateOffset] = useState(0)
const [taggedLocation, setTaggedLocation] = useState<any | null>(null)

const [showSuggestions, setShowSuggestions] = useState(false);
const [userSuggestions, setUserSuggestions] = useState<any[]>([]);
const [mentionQuery, setMentionQuery] = useState("");
```
Multiple new any usages were introduced (taggedLocation: any, userSuggestions: any[], handleFlyTo(location: any)). Even if tsc passes, this makes the new feature brittle and harder to evolve safely.
Since the suggestions are coming from the users table, you can type the shape minimally (e.g., { id: string; email: string | null }).
Suggestion
Replace any with small, local types to keep the feature maintainable.
```ts
type UserSuggestion = { id: string; email: string | null };

const [userSuggestions, setUserSuggestions] = useState<UserSuggestion[]>([]);
```

For locations, consider a minimal GeoJSON type instead of any.
```ts
async function extractAndValidateMentions(content: string): Promise<string[]> {
  const mentionRegex = /@(\w+)/g;
  const matches = Array.from(content.matchAll(mentionRegex));
  const potentialEmails = matches.map(match => match[1]);

  if (potentialEmails.length === 0) return [];

  try {
    // Fetch all users to match against prefixes
    // In a larger system, we would use a more optimized search
    const users = await db.select({ id: usersSchema.id, email: usersSchema.email })
      .from(usersSchema)
      .execute();

    const validatedIds: string[] = [];
    potentialEmails.forEach(mention => {
      const found = users.find(u => u.email?.toLowerCase().startsWith(mention.toLowerCase()));
      if (found) validatedIds.push(found.id);
    });

    return Array.from(new Set(validatedIds));
  } catch (error) {
```
Mention validation does a full table scan of users
extractAndValidateMentions loads all users then does a find() for each mention. This will not scale and will get slower as the user table grows.
Given you already implemented searchUsers() with ilike(email, '${prefix}%'), the validation should query only the prefixes present in the note (or use OR over prefixes).
Suggestion
Avoid fetching the full users table. Instead, query only candidates for the extracted prefixes.
A simple improvement is to issue one query per unique mention prefix (usually small), or build a single OR clause:
```ts
const prefixes = Array.from(new Set(potentialMentions.map(m => m.toLowerCase())))
const candidates = await db
  .select({ id: usersSchema.id, email: usersSchema.email })
  .from(usersSchema)
  .where(or(...prefixes.map(p => ilike(usersSchema.email, `${p}%`))))
```

Then map prefix -> best match deterministically.
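One possible way to make that prefix-to-user mapping deterministic, once the candidate rows are in hand (a sketch with illustrative names, not code from the PR): accept a prefix only when exactly one candidate matches, and skip ambiguous or unmatched prefixes rather than guess.

```typescript
type Candidate = { id: string; email: string | null };

// Sketch: resolve each mention prefix to a user id only when the match
// is unambiguous; zero or multiple hits are skipped.
function resolveMentions(prefixes: string[], candidates: Candidate[]): string[] {
  const ids = new Set<string>();
  for (const prefix of prefixes) {
    const hits = candidates.filter(
      c => c.email !== null && c.email.toLowerCase().startsWith(prefix.toLowerCase())
    );
    // exactly one hit -> deterministic; otherwise skip rather than guess
    if (hits.length === 1) ids.add(hits[0].id);
  }
  return Array.from(ids);
}
```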
```ts
const validatedIds: string[] = [];
potentialEmails.forEach(mention => {
  const found = users.find(u => u.email?.toLowerCase().startsWith(mention.toLowerCase()));
  if (found) validatedIds.push(found.id);
});
```
Prefix matching is ambiguous and can tag the wrong user
The validation logic treats @jo as a prefix and tags the first user whose email starts with that prefix. With multiple users sharing a prefix (e.g., john@... and joel@...), this can silently tag the wrong person.
At minimum, you need deterministic selection rules, and ideally you should persist an unambiguous identifier (e.g., tag by user id using the autocomplete selection, or require full email/local-part match).
Suggestion
Make tagging unambiguous:
- Prefer storing a stable token in content on selection (e.g., @[userId] or @{userId}) while rendering it as @prefix for display.
- If you must keep plain text @prefix, require an exact match on local-part (email.split('@')[0]) rather than startsWith, and ignore partials.

Example exact match approach:

```ts
const found = users.find(u => u.email && u.email.split('@')[0].toLowerCase() === mention.toLowerCase())
```
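If notes store a stable token, save-time extraction needs no disambiguation query at all: the id is already in the text. A minimal sketch, assuming a hypothetical @[userId] token format (the bracketed format is an assumption, not what the PR currently stores):

```typescript
// Sketch: extract user ids from stable @[userId] tokens.
// The @[...] convention is assumed for illustration only.
const STABLE_MENTION = /@\[([^\]\s]+)\]/g;

function extractUserIds(content: string): string[] {
  const ids = Array.from(content.matchAll(STABLE_MENTION), m => m[1]);
  return Array.from(new Set(ids)); // dedupe repeated mentions
}
```

The server would then only need to verify that the extracted ids exist, a cheap indexed lookup, instead of matching free-text prefixes.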
```ts
export async function saveNote(noteData: NewCalendarNote | CalendarNote): Promise<CalendarNote | null> {
  // Ensure current user is synced
  await syncUserWithDatabase();

  const userId = await getCurrentUserIdOnServer();

  const userTags = await extractAndValidateMentions(noteData.content);
  if (!userId) {
    console.error('saveNote: User not authenticated');
```
saveNote syncs user before checking authentication
saveNote calls syncUserWithDatabase() before it checks getCurrentUserIdOnServer(). If the user isn't authenticated, you're still doing extra work and potentially additional auth/session lookups.
This is minor but easy to fix and reduces unnecessary calls.
Suggestion
Check userId first, then sync (or have syncUserWithDatabase return the id and use that):
```ts
const userId = await getCurrentUserIdOnServer()
if (!userId) return null
await syncUserWithDatabase()
```

Or:

```ts
const userId = await syncUserWithDatabase()
if (!userId) return null
```
```diff
@@ -0,0 +1 @@
+ALTER TABLE "users" ADD COLUMN "email" text;
```
Migration adds users.email without uniqueness / indexing
You’re now depending on email/prefix lookup for mentions (searchUsers via ilike and validation via prefix matching). Without an index, ilike 'prefix%' can degrade as the table grows.
Also, if you expect 1:1 mapping between Supabase users and rows, email duplication can cause ambiguous mentions and inconsistent results.
Suggestion
Add constraints/indexes that match how you query:

- If email should be unique: `CREATE UNIQUE INDEX users_email_unique_idx ON users (email);`
- If not unique, at least add an index: `CREATE INDEX users_email_idx ON users (email);`

For prefix search on Postgres, consider varchar_pattern_ops or pg_trgm + GIN depending on desired search behavior.
Actionable comments posted: 11
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
lib/actions/users.ts (1)

20-51: 🛠️ Refactor suggestion | 🟠 Major

Inconsistent user management: in-memory store vs. database.

The file retains an in-memory usersStore with getUsers and addUser functions that simulate database operations, while the new syncUserWithDatabase and searchUsers functions use the actual database. This creates an inconsistent dual approach that may confuse maintainers. Consider removing the in-memory simulation if it's no longer needed, or clearly document the distinction if both approaches are intentionally preserved for different use cases.
lib/actions/calendar.ts (1)

97-106: 🛠️ Refactor suggestion | 🟠 Major

Reorder operations: authenticate before database queries.

extractAndValidateMentions (line 102) performs a database query before the authentication check (line 103). If the user is unauthenticated, we've already incurred unnecessary database access.

♻️ Proposed reordering

```diff
 export async function saveNote(noteData: NewCalendarNote | CalendarNote): Promise<CalendarNote | null> {
   // Ensure current user is synced
   await syncUserWithDatabase();

   const userId = await getCurrentUserIdOnServer();
-
-  const userTags = await extractAndValidateMentions(noteData.content);
   if (!userId) {
     console.error('saveNote: User not authenticated');
     return null;
   }
+
+  const userTags = await extractAndValidateMentions(noteData.content);
```
🤖 Fix all issues with AI agents
In `@components/calendar-notepad.tsx`:
- Around line 183-196: The suggestion list currently maps userSuggestions and
directly uses user.email for display and passing to handleSelectUser, but email
can be null; update the rendering in the showSuggestions block to skip or filter
out entries where user.email is null (e.g., filter userSuggestions by user.email
truthiness or add a conditional render inside the map), and ensure
handleSelectUser is only called with a non-null string; reference:
showSuggestions, userSuggestions, handleSelectUser, and user.email.
- Around line 218-222: The button rendering the MapPin when note.locationTags
exists is missing an explicit type and may submit surrounding forms (see the JSX
using note.locationTags and the handleFlyTo handler and MapPin icon); update
that button element to include type="button" so clicking it invokes handleFlyTo
without triggering form submission.
- Around line 94-112: handleNoteContentChange triggers searchUsers on every
keystroke and can produce out-of-order results; debounce the query or serialize
requests to prevent stale responses from overwriting newer ones: implement a
debounce around invoking searchUsers (or add a sequential requestId token stored
in a ref) inside handleNoteContentChange so only the latest search result calls
setUserSuggestions/setShowSuggestions and only for the current requestId, and
ensure setMentionQuery is updated with the debounced value; update references to
handleNoteContentChange, searchUsers, setUserSuggestions, setShowSuggestions and
setMentionQuery accordingly.
- Around line 27-29: The state userSuggestions is currently typed as any[];
define a proper TypeScript interface (e.g., UserSuggestion or UserRecord) that
matches the database user schema (fields like id, name, email, avatarUrl, etc.),
then replace useState<any[]>([]) with useState<UserSuggestion[]>([]) and update
any usages such as setUserSuggestions and places that read from userSuggestions
to use the new type; ensure imports/types are declared or exported where needed
so compiler and IDE get correct autocomplete for functions/components that
reference userSuggestions or mentionQuery.
In `@drizzle/migrations/0002_add_email_to_users.sql`:
- Line 1: The migration should make the "email" column NOT NULL and UNIQUE to
prevent duplicate emails; update the ALTER TABLE in the migration to add the
column with NOT NULL and a UNIQUE constraint (or create a UNIQUE index) for
"email". Also update the schema definition for the Users table in
lib/db/schema.ts (the Users/UsersTable export around line 12) to mark the email
field as non-nullable and unique so the TypeScript schema matches the DB change
and functions like extractAndValidateMentions and the calendar-notepad user
suggestions resolve unambiguously.
In `@drizzle/migrations/meta/0002_snapshot.json`:
- Around line 260-273: Add a database index on the "email" column to avoid
full-table scans for `@mention/autocomplete` queries by adding an entry for
"email" under the "indexes" object in the migration snapshot (the key names to
edit are "email" and "indexes"); also verify whether "email" should be UNIQUE
and if so add the corresponding entry under "uniqueConstraints" in the snapshot
to reflect the migration's UNIQUE constraint so the schema and constraints
remain consistent (update the JSON where "uniqueConstraints" and "indexes" are
defined).
- Line 75: The calendar_notes table currently has no indexes; add indexes for
the commonly filtered columns to improve query performance by updating the
migration/snapshot to define indexes on calendar_notes.user_id and
calendar_notes.date (or a composite index on (user_id, date) if queries often
filter by both). Modify the migration snapshot/definition that contains the
calendar_notes schema to include either individual indexes (e.g., index on
user_id and index on date) or a composite index, and generate/apply a new
migration that creates those indexes so the database and the Drizzle migration
metadata stay in sync.
In `@lib/actions/calendar.ts`:
- Around line 70-72: The current mentionRegex only captures word characters and
misses common email/name characters like dots and hyphens; update the
mentionRegex (used where mentionRegex, matches and potentialEmails are defined)
to allow letters, digits, dot, underscore, hyphen and plus so `@john.doe` and
similar tokens are fully captured, then regenerate matches and potentialEmails
from that updated pattern to ensure full local-part prefixes are extracted.
- Around line 69-93: The extractAndValidateMentions function currently fetches
all users which is inefficient; modify it to perform targeted DB queries for the
extracted mention prefixes (potentialEmails) instead of selecting the entire
users table: for example, perform a single query that filters users by email
ILIKE each mention% (or use a WHERE email ILIKE ANY(...) / multiple ORs) via the
same db/select pipeline that references usersSchema, then map the matching rows
to ids (dedupe) and return them; update the logic around potentialEmails,
validatedIds and the error handling to reflect the new query approach and avoid
loading all users into memory.
In `@lib/actions/users.ts`:
- Around line 57-80: The syncUserWithDatabase function may insert or update an
undefined email from getSupabaseUserAndSessionOnServer; update the logic in
syncUserWithDatabase so db.insert(usersSchema).values(...) only provides the
email field if user.email is defined (or explicitly set it to null if your
schema expects nullable), and change the update branch to only call .set({
email: user.email }) when user.email !== undefined (or use user.email ?? null if
you prefer an explicit null). Ensure comparisons against existingUser.email
handle undefined (e.g., check existingUser.email !== user.email && user.email
!== undefined) so you don't overwrite or insert an undefined email via
db.insert/db.update on usersSchema.
In `@lib/db/schema.ts`:
- Around line 10-14: The users table's email column is currently nullable and
not constrained, which can cause missed or ambiguous `@mention` matches; update
the users pgTable definition (the users export) to make the email column
non-nullable and add a uniqueness constraint (e.g., change text('email') to a
not-null text and add a unique constraint/index on the email column) so emails
mirror Supabase auth uniqueness and cannot be null or duplicated.
📜 Review details
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
bun.lock is excluded by !**/*.lock
📒 Files selected for processing (10)
- components/calendar-notepad.tsx
- drizzle/migrations/0002_add_email_to_users.sql
- drizzle/migrations/meta/0000_snapshot.json
- drizzle/migrations/meta/0001_snapshot.json
- drizzle/migrations/meta/0002_snapshot.json
- drizzle/migrations/meta/_journal.json
- lib/actions/calendar.ts
- lib/actions/users.ts
- lib/db/schema.ts
- package.json
🧰 Additional context used
🧬 Code graph analysis (3)
lib/actions/calendar.ts (5)
- lib/db/schema.ts (1): users (10-14)
- lib/db/index.ts (1): db (25-25)
- lib/types/index.ts (2): NewCalendarNote (92-92), CalendarNote (79-90)
- lib/actions/users.ts (1): syncUserWithDatabase (57-81)
- lib/auth/get-current-user.ts (1): getCurrentUserIdOnServer (112-127)

components/calendar-notepad.tsx (2)
- lib/actions/users.ts (1): searchUsers (86-101)
- components/chat.tsx (1): Chat (29-216)

lib/actions/users.ts (3)
- lib/auth/get-current-user.ts (1): getSupabaseUserAndSessionOnServer (22-103)
- lib/db/index.ts (1): db (25-25)
- lib/auth/use-current-user.ts (1): fetchUser (11-17)
🪛 Biome (2.3.13)
components/calendar-notepad.tsx
[error] 219-220: Provide an explicit type prop for the button element.
The default type of a button is submit, which causes the submission of a form when placed inside a form element. This is likely not the behaviour that you want inside a React application.
Allowed button types are: submit, button or reset
(lint/a11y/useButtonType)
🔇 Additional comments (6)
drizzle/migrations/meta/0001_snapshot.json (1)

- 1-281: LGTM! The migration snapshot correctly defines the schema baseline with proper table structures, foreign key constraints with cascade deletes, and appropriate column definitions.

drizzle/migrations/meta/0000_snapshot.json (1)

- 1-178: LGTM! The snapshot format updates are consistent with the Drizzle ORM version upgrade. The changes include proper namespace prefixes (public.), updated dialect identifier, and additional metadata fields required by the newer version.

lib/actions/users.ts (1)

- 86-101: LGTM! The searchUsers function correctly handles the @ prefix stripping, uses parameterized ilike for safe prefix matching, and limits results to 5. Error handling returns an empty array gracefully.

components/calendar-notepad.tsx (1)

- 186-189: 🧹 Nitpick | 🔵 Trivial — Add type="button" to suggestion buttons for consistency, to prevent any potential form submission behavior. Note: type="button" is already present in the code at line 188, so this is likely an incorrect or invalid review comment.

package.json (1)

- 65-65: Version bump is intentional and properly executed. The drizzle-orm upgrade from ^0.29.0 to ^0.45.1 is part of the Supabase integration work and is correctly paired with drizzle-kit ^0.31.1. Both packages are upgraded together, following the recommended migration path. The breaking changes in this version range (0.30.0–0.44.0) either do not apply to this PostgreSQL-based codebase or are already handled: migrations infrastructure is in place, timestamp columns use { withTimezone: true }, and no PostgreSQL index definitions require migration. No further action needed.

drizzle/migrations/meta/_journal.json (1)

- 11-25: LGTM! The new migration journal entries are properly sequenced with incrementing indices and timestamps. The version upgrade from "5" to "7" in newer entries aligns with the drizzle-orm dependency update mentioned in the PR context.
```tsx
const [showSuggestions, setShowSuggestions] = useState(false);
const [userSuggestions, setUserSuggestions] = useState<any[]>([]);
const [mentionQuery, setMentionQuery] = useState("");
```
🧹 Nitpick | 🔵 Trivial
Consider adding proper TypeScript types for user suggestions.
userSuggestions is typed as any[]. For better type safety and IDE support, consider defining a proper type matching the database user schema.
♻️ Proposed typing
```diff
+type UserSuggestion = { id: string; email: string | null };
+
 export function CalendarNotepad({ chatId }: CalendarNotepadProps) {
   // ...
-  const [userSuggestions, setUserSuggestions] = useState<any[]>([]);
+  const [userSuggestions, setUserSuggestions] = useState<UserSuggestion[]>([]);
```

🤖 Prompt for AI Agents
In `@components/calendar-notepad.tsx` around lines 27 - 29, The state
userSuggestions is currently typed as any[]; define a proper TypeScript
interface (e.g., UserSuggestion or UserRecord) that matches the database user
schema (fields like id, name, email, avatarUrl, etc.), then replace
useState<any[]>([]) with useState<UserSuggestion[]>([]) and update any usages
such as setUserSuggestions and places that read from userSuggestions to use the
new type; ensure imports/types are declared or exported where needed so compiler
and IDE get correct autocomplete for functions/components that reference
userSuggestions or mentionQuery.
```tsx
const handleNoteContentChange = async (e: React.ChangeEvent<HTMLTextAreaElement>) => {
  const value = e.target.value;
  setNoteContent(value);

  const cursorPosition = e.target.selectionStart;
  const textBeforeCursor = value.substring(0, cursorPosition);
  const words = textBeforeCursor.split(/\s/);
  const lastWord = words[words.length - 1];

  if (lastWord.startsWith("@")) {
    const query = lastWord.slice(1);
    setMentionQuery(query);
    const results = await searchUsers(query);
    setUserSuggestions(results);
    setShowSuggestions(results.length > 0);
  } else {
    setShowSuggestions(false);
  }
};
```
Potential race condition with async autocomplete on every keystroke.
handleNoteContentChange is an async function called on every keystroke. Rapid typing can cause out-of-order responses where an older search result overwrites a newer one. Consider debouncing the search or tracking request ordering.
🔧 Proposed debounce approach
```diff
+import { useRef } from "react";
+
 export function CalendarNotepad({ chatId }: CalendarNotepadProps) {
   // ... existing state ...
+  const searchTimeoutRef = useRef<NodeJS.Timeout | null>(null);

   const handleNoteContentChange = async (e: React.ChangeEvent<HTMLTextAreaElement>) => {
     const value = e.target.value;
     setNoteContent(value);

     const cursorPosition = e.target.selectionStart;
     const textBeforeCursor = value.substring(0, cursorPosition);
     const words = textBeforeCursor.split(/\s/);
     const lastWord = words[words.length - 1];

     if (lastWord.startsWith("@")) {
       const query = lastWord.slice(1);
       setMentionQuery(query);
-      const results = await searchUsers(query);
-      setUserSuggestions(results);
-      setShowSuggestions(results.length > 0);
+      if (searchTimeoutRef.current) clearTimeout(searchTimeoutRef.current);
+      searchTimeoutRef.current = setTimeout(async () => {
+        const results = await searchUsers(query);
+        setUserSuggestions(results);
+        setShowSuggestions(results.length > 0);
+      }, 150);
     } else {
       setShowSuggestions(false);
     }
   };
```
};📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```tsx
import { useRef } from "react";

export function CalendarNotepad({ chatId }: CalendarNotepadProps) {
  // ... existing state ...
  const searchTimeoutRef = useRef<NodeJS.Timeout | null>(null);

  const handleNoteContentChange = async (e: React.ChangeEvent<HTMLTextAreaElement>) => {
    const value = e.target.value;
    setNoteContent(value);

    const cursorPosition = e.target.selectionStart;
    const textBeforeCursor = value.substring(0, cursorPosition);
    const words = textBeforeCursor.split(/\s/);
    const lastWord = words[words.length - 1];

    if (lastWord.startsWith("@")) {
      const query = lastWord.slice(1);
      setMentionQuery(query);
      if (searchTimeoutRef.current) clearTimeout(searchTimeoutRef.current);
      searchTimeoutRef.current = setTimeout(async () => {
        const results = await searchUsers(query);
        setUserSuggestions(results);
        setShowSuggestions(results.length > 0);
      }, 150);
    } else {
      setShowSuggestions(false);
    }
  };
```
🤖 Prompt for AI Agents
In `@components/calendar-notepad.tsx` around lines 94 - 112,
handleNoteContentChange triggers searchUsers on every keystroke and can produce
out-of-order results; debounce the query or serialize requests to prevent stale
responses from overwriting newer ones: implement a debounce around invoking
searchUsers (or add a sequential requestId token stored in a ref) inside
handleNoteContentChange so only the latest search result calls
setUserSuggestions/setShowSuggestions and only for the current requestId, and
ensure setMentionQuery is updated with the debounced value; update references to
handleNoteContentChange, searchUsers, setUserSuggestions, setShowSuggestions and
setMentionQuery accordingly.
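The "sequential requestId token" idea the prompt describes can be factored into a small reusable guard. The sketch below is illustrative only: `createLatestOnly` is a hypothetical helper, not repo code, and it composes with (rather than replaces) the debounce above.

```typescript
// Sketch of a stale-response guard: wrap an async search so that only the
// most recently issued call is allowed to deliver its result.
// createLatestOnly is an illustrative name, not part of the repo.
function createLatestOnly<T>(search: (q: string) => Promise<T>) {
  let latest = 0; // monotonically increasing request token
  return async (q: string): Promise<T | undefined> => {
    const token = ++latest;
    const result = await search(q);
    // A newer request started while we awaited: drop this stale result.
    return token === latest ? result : undefined;
  };
}
```

The component would then call the wrapped function and only invoke `setUserSuggestions` when the result is defined, so a slow response for an earlier query can never overwrite a newer one.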
{showSuggestions && (
  <div className="absolute bottom-full mb-2 w-full bg-background border rounded-md shadow-lg z-50 p-1 max-h-40 overflow-y-auto">
    {userSuggestions.map(user => (
      <button
        key={user.id}
        type="button"
        onClick={() => handleSelectUser(user.email)}
        className="w-full text-left px-3 py-2 hover:bg-accent rounded-sm text-sm truncate"
        title={user.email}
      >
        {user.email}
      </button>
    ))}
  </div>
Add null check for user.email before rendering.
The searchUsers function returns users from the database where email could be null. Clicking a user with a null email would pass null to handleSelectUser, causing unexpected behavior.
🛡️ Proposed fix with null filtering
{showSuggestions && (
<div className="absolute bottom-full mb-2 w-full bg-background border rounded-md shadow-lg z-50 p-1 max-h-40 overflow-y-auto">
- {userSuggestions.map(user => (
+ {userSuggestions.filter(user => user.email).map(user => (
<button
key={user.id}
type="button"
onClick={() => handleSelectUser(user.email)}
className="w-full text-left px-3 py-2 hover:bg-accent rounded-sm text-sm truncate"
title={user.email}
>
{user.email}
</button>
))}
</div>
)}
🤖 Prompt for AI Agents
In `@components/calendar-notepad.tsx` around lines 183 - 196, The suggestion list
currently maps userSuggestions and directly uses user.email for display and
passing to handleSelectUser, but email can be null; update the rendering in the
showSuggestions block to skip or filter out entries where user.email is null
(e.g., filter userSuggestions by user.email truthiness or add a conditional
render inside the map), and ensure handleSelectUser is only called with a
non-null string; reference: showSuggestions, userSuggestions, handleSelectUser,
and user.email.
{note.locationTags && (
  <button onClick={() => handleFlyTo(note.locationTags)} className="text-muted-foreground hover:text-foreground ml-2">
    <MapPin className="h-5 w-5" />
  </button>
)}
Add explicit type="button" to prevent form submission.
The button element lacks an explicit type prop. In form contexts, buttons default to type="submit", which may cause unintended form submissions.
🔧 Proposed fix
- <button onClick={() => handleFlyTo(note.locationTags)} className="text-muted-foreground hover:text-foreground ml-2">
+ <button type="button" onClick={() => handleFlyTo(note.locationTags)} className="text-muted-foreground hover:text-foreground ml-2">
<MapPin className="h-5 w-5" />
  </button>
🧰 Tools
🪛 Biome (2.3.13)
[error] 219-220: Provide an explicit type prop for the button element.
The default type of a button is submit, which causes the submission of a form when placed inside a form element. This is likely not the behaviour that you want inside a React application.
Allowed button types are: submit, button or reset
(lint/a11y/useButtonType)
🤖 Prompt for AI Agents
In `@components/calendar-notepad.tsx` around lines 218 - 222, The button rendering
the MapPin when note.locationTags exists is missing an explicit type and may
submit surrounding forms (see the JSX using note.locationTags and the
handleFlyTo handler and MapPin icon); update that button element to include
type="button" so clicking it invokes handleFlyTo without triggering form
submission.
@@ -0,0 +1 @@
+ALTER TABLE "users" ADD COLUMN "email" text;
🧩 Analysis chain
🏁 Script executed:

find . -type f -name "*.ts" -o -name "*.sql" -o -name "*schema*" -o -name "*snapshot*" | grep -E "(schema|drizzle|migration)" | head -20

Repository: QueueLab/QCX
Length of output: 1240

🏁 Script executed:

# List the drizzle migrations directory structure
ls -la drizzle/migrations/ 2>/dev/null || echo "Checking for drizzle directory..."
git ls-files | grep -E "(drizzle|migration)" | head -20

Repository: QueueLab/QCX
Length of output: 778

🏁 Script executed:

# Look for schema definition files
git ls-files | grep -E "(schema|drizzle)" | sort

Repository: QueueLab/QCX
Length of output: 543

🏁 Script executed:

cat -n drizzle/migrations/0002_add_email_to_users.sql

Repository: QueueLab/QCX
Length of output: 107

🏁 Script executed:

cat -n lib/db/schema.ts

Repository: QueueLab/QCX
Length of output: 4672

🏁 Script executed:

cat -n drizzle/migrations/meta/0002_snapshot.json

Repository: QueueLab/QCX
Length of output: 9192

🏁 Script executed:

rg -i "mention" --type=ts --type=tsx --type=js --type=jsx -A 2 -B 2 | head -50

Repository: QueueLab/QCX
Length of output: 83

🏁 Script executed:

# Search for email usage and constraints in the codebase
rg "email" --type=ts --type=tsx -B 2 -A 2 | head -100

Repository: QueueLab/QCX
Length of output: 83

🏁 Script executed:

rg -i "mention" -A 2 -B 2 | head -80

Repository: QueueLab/QCX
Length of output: 3336

🏁 Script executed:

# Search for email usage in codebase (excluding node_modules)
rg "email" --glob="!node_modules" -B 2 -A 2 | head -100

Repository: QueueLab/QCX
Length of output: 6538

🏁 Script executed:

# Check for Supabase auth references
rg -i "supabase|auth" --glob="!node_modules" -B 1 -A 1 | head -60

Repository: QueueLab/QCX
Length of output: 3559
Add UNIQUE constraint on email column for mentions feature.
The application uses email addresses to identify and mention users in calendar notes (extractAndValidateMentions in lib/actions/calendar.ts and user suggestions in components/calendar-notepad.tsx). Without a UNIQUE constraint, duplicate emails would cause ambiguity when resolving @mentions. Additionally, email should be NOT NULL since it's used as a user identifier.
Proposed fix
-ALTER TABLE "users" ADD COLUMN "email" text;
+ALTER TABLE "users" ADD COLUMN "email" text NOT NULL UNIQUE;

(Note: adding a NOT NULL column without a default fails on a table that already contains rows; backfill existing users' emails first, or add the column nullable and tighten the constraint in a follow-up migration.)

Also update the schema definition in lib/db/schema.ts line 12:

- email: text('email'),
+ email: text('email').notNull().unique(),
🤖 Prompt for AI Agents
In `@drizzle/migrations/0002_add_email_to_users.sql` at line 1, The migration
should make the "email" column NOT NULL and UNIQUE to prevent duplicate emails;
update the ALTER TABLE in the migration to add the column with NOT NULL and a
UNIQUE constraint (or create a UNIQUE index) for "email". Also update the schema
definition for the Users table in lib/db/schema.ts (the Users/UsersTable export
around line 12) to mark the email field as non-nullable and unique so the
TypeScript schema matches the DB change and functions like
extractAndValidateMentions and the calendar-notepad user suggestions resolve
unambiguously.
| "email": { | ||
| "name": "email", | ||
| "type": "text", | ||
| "primaryKey": false, | ||
| "notNull": false | ||
| } | ||
| }, | ||
| "indexes": {}, | ||
| "foreignKeys": {}, | ||
| "compositePrimaryKeys": {}, | ||
| "uniqueConstraints": {}, | ||
| "policies": {}, | ||
| "checkConstraints": {}, | ||
| "isRLSEnabled": false |
🧹 Nitpick | 🔵 Trivial
Consider adding an index on the email column for mention search performance.
The @mention autocomplete feature will search users by email. Without an index, this becomes a full table scan that degrades as the user base grows.
Additionally, confirm whether uniqueConstraints should be populated here to match the UNIQUE constraint suggested for the migration.
🤖 Prompt for AI Agents
In `@drizzle/migrations/meta/0002_snapshot.json` around lines 260 - 273, Add a
database index on the "email" column to avoid full-table scans for
`@mention/autocomplete` queries by adding an entry for "email" under the "indexes"
object in the migration snapshot (the key names to edit are "email" and
"indexes"); also verify whether "email" should be UNIQUE and if so add the
corresponding entry under "uniqueConstraints" in the snapshot to reflect the
migration's UNIQUE constraint so the schema and constraints remain consistent
(update the JSON where "uniqueConstraints" and "indexes" are defined).
async function extractAndValidateMentions(content: string): Promise<string[]> {
  const mentionRegex = /@(\w+)/g;
  const matches = Array.from(content.matchAll(mentionRegex));
  const potentialEmails = matches.map(match => match[1]);

  if (potentialEmails.length === 0) return [];

  try {
    // Fetch all users to match against prefixes
    // In a larger system, we would use a more optimized search
    const users = await db.select({ id: usersSchema.id, email: usersSchema.email })
      .from(usersSchema)
      .execute();

    const validatedIds: string[] = [];
    potentialEmails.forEach(mention => {
      const found = users.find(u => u.email?.toLowerCase().startsWith(mention.toLowerCase()));
      if (found) validatedIds.push(found.id);
    });

    return Array.from(new Set(validatedIds));
  } catch (error) {
    console.error("Error validating mentions:", error);
    return [];
  }
Performance concern: fetches all users for mention validation.
The function queries the entire users table to validate mentions. For applications with many users, this becomes inefficient. Consider using targeted queries similar to searchUsers in lib/actions/users.ts:
⚡ Proposed optimization
async function extractAndValidateMentions(content: string): Promise<string[]> {
const mentionRegex = /@(\w+)/g;
const matches = Array.from(content.matchAll(mentionRegex));
const potentialEmails = matches.map(match => match[1]);
if (potentialEmails.length === 0) return [];
try {
- // Fetch all users to match against prefixes
- // In a larger system, we would use a more optimized search
- const users = await db.select({ id: usersSchema.id, email: usersSchema.email })
- .from(usersSchema)
- .execute();
-
- const validatedIds: string[] = [];
- potentialEmails.forEach(mention => {
- const found = users.find(u => u.email?.toLowerCase().startsWith(mention.toLowerCase()));
- if (found) validatedIds.push(found.id);
- });
-
- return Array.from(new Set(validatedIds));
+ const validatedIds: string[] = [];
+ for (const mention of potentialEmails) {
+ const [found] = await db.select({ id: usersSchema.id })
+ .from(usersSchema)
+ .where(ilike(usersSchema.email, `${mention}%`))
+ .limit(1);
+ if (found) validatedIds.push(found.id);
+ }
+ return Array.from(new Set(validatedIds));
} catch (error) {
console.error("Error validating mentions:", error);
return [];
}
}

🤖 Prompt for AI Agents
In `@lib/actions/calendar.ts` around lines 69 - 93, The extractAndValidateMentions
function currently fetches all users which is inefficient; modify it to perform
targeted DB queries for the extracted mention prefixes (potentialEmails) instead
of selecting the entire users table: for example, perform a single query that
filters users by email ILIKE each mention% (or use a WHERE email ILIKE ANY(...)
/ multiple ORs) via the same db/select pipeline that references usersSchema,
then map the matching rows to ids (dedupe) and return them; update the logic
around potentialEmails, validatedIds and the error handling to reflect the new
query approach and avoid loading all users into memory.
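Whichever fetch strategy is chosen, the matching step itself is pure string logic and can be unit-tested in isolation. The sketch below mirrors the `{ id, email }` row shape from the select above; `validateMentions` is an illustrative helper name, not repo code.

```typescript
// Sketch of the validation step: given mention prefixes and (id, email) rows
// (however they were fetched), return unique user ids whose email starts with
// a mention, case-insensitively. Null emails are skipped defensively.
type UserRow = { id: string; email: string | null };

function validateMentions(prefixes: string[], rows: UserRow[]): string[] {
  const ids: string[] = [];
  for (const mention of prefixes) {
    const found = rows.find(
      r => r.email?.toLowerCase().startsWith(mention.toLowerCase())
    );
    if (found) ids.push(found.id);
  }
  return Array.from(new Set(ids)); // dedupe repeated mentions
}
```

Keeping this step pure makes it easy to swap the surrounding fetch (full scan today, targeted `ilike` query later) without touching the matching logic.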
const mentionRegex = /@(\w+)/g;
const matches = Array.from(content.matchAll(mentionRegex));
const potentialEmails = matches.map(match => match[1]);
Regex may not match email prefixes containing dots or hyphens.
The regex /@(\w+)/g only captures word characters ([a-zA-Z0-9_]). Email local parts often contain dots (e.g., john.doe@example.com). A user typing @john.doe would only have john captured, potentially matching the wrong user or failing to match.
Consider extending the regex to include common email characters:
🔧 Proposed fix
- const mentionRegex = /@(\w+)/g;
+ const mentionRegex = /@([\w.+-]+)/g;
🤖 Prompt for AI Agents
In `@lib/actions/calendar.ts` around lines 70 - 72, The current mentionRegex only
captures word characters and misses common email/name characters like dots and
hyphens; update the mentionRegex (used where mentionRegex, matches and
potentialEmails are defined) to allow letters, digits, dot, underscore, hyphen
and plus so `@john.doe` and similar tokens are fully captured, then regenerate
matches and potentialEmails from that updated pattern to ensure full local-part
prefixes are extracted.
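A quick, self-contained illustration of the difference between the two patterns (plain string logic, no repo code involved):

```typescript
// Contrast the review's two mention patterns on a dotted email prefix.
const narrow = /@(\w+)/g;     // word characters only
const wide = /@([\w.+-]+)/g;  // also dot, plus, and hyphen

function extract(content: string, re: RegExp): string[] {
  return Array.from(content.matchAll(re), m => m[1]);
}

const note = "cc @john.doe and @mary_smith";
// narrow truncates at the dot: ["john", "mary_smith"]
// wide captures the full local part: ["john.doe", "mary_smith"]
```

With the narrow pattern, `@john.doe` degrades to the prefix `john`, which the prefix-matching validation could resolve to a different user whose email merely starts with "john".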
export const users = pgTable('users', {
  id: uuid('id').primaryKey().defaultRandom(), // Assuming Supabase user IDs are UUIDs
  // email: text('email'), // Supabase handles this in auth.users
  email: text('email'), // Supabase handles this in auth.users
  // Other profile fields if necessary
});
🧹 Nitpick | 🔵 Trivial
Consider adding a unique constraint and NOT NULL for the email column.
The email column is nullable and has no uniqueness constraint. Since this column is used for @mention searches via ilike, having null emails or duplicates could lead to:
- Missed search results when emails are null
- Ambiguous mention matches if duplicate emails exist
If emails are synced from Supabase auth (which enforces uniqueness), consider mirroring those constraints here.
♻️ Proposed schema refinement
export const users = pgTable('users', {
id: uuid('id').primaryKey().defaultRandom(), // Assuming Supabase user IDs are UUIDs
- email: text('email'), // Supabase handles this in auth.users
+ email: text('email').unique(), // Supabase handles this in auth.users
// Other profile fields if necessary
});
🤖 Prompt for AI Agents
In `@lib/db/schema.ts` around lines 10 - 14, The users table's email column is
currently nullable and not constrained, which can cause missed or ambiguous
`@mention` matches; update the users pgTable definition (the users export) to make
the email column non-nullable and add a uniqueness constraint (e.g., change
text('email') to a not-null text and add a unique constraint/index on the email
column) so emails mirror Supabase auth uniqueness and cannot be null or
duplicated.
@coderabbitai when I add people on the team in settings the site completely refreshes losing all contexts, then after the users that have been added cannot be instantiated in the notes section, do an in depth analysis of this code issue. |
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain (script bodies collapsed; output lengths: 610, 26639, 13120, 5921)
🔍 Root Cause Analysis

Issue 1: Page Refresh Losing Context

When you add users in settings, the revalidatePath('/settings') call in the old implementation forces a full page reload, losing client-side context.

Here's the problematic code in lib/actions/users.ts:

// This is the OLD in-memory implementation still being used
let usersStore: Record<string, Array<User>> = {
  'default-user': [
    { id: '1', email: 'admin@example.com', role: 'admin' },
    { id: '2', email: 'editor@example.com', role: 'editor' },
  ],
};

export async function addUser(userId: string = 'default-user', newUser: { email: string; role: UserRole }): Promise<{ user?: User; error?: string }> {
  // ... validation ...
  const userToAdd: User = { ...newUser, id: Math.random().toString(36).substr(2, 9) };
  usersStore[userId].push(userToAdd); // ❌ Only adds to memory
  revalidatePath('/settings'); // ❌ Causes page refresh
  return { user: userToAdd };
}

Issue 2: Users Not Available for Mentions

The calendar notepad uses searchUsers, which queries the database:

export async function searchUsers(query: string) {
  // ...
  const results = await db.select()
    .from(usersSchema)
    .where(ilike(usersSchema.email, `${searchTerm}%`)) // ❌ Queries DB, not memory
    .limit(5);
  return results;
}

The disconnect: Users added via settings are stored only in the in-memory usersStore, so the database-backed search never sees them.

💡 Solution Recommendations

You need to replace the old in-memory implementation:

1. Replace
I have completed an in-depth analysis and fixed the issues you mentioned:
Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@coderabbitai now resolution search has stopped working! can you make the add user and tag user as well as add user to notes feature work in memory as a flag for now before the backend is connected |
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain (script body collapsed; output length: 14570)
Plan: Add In-Memory User Management with Feature Flag

I'll implement a feature flag system that allows you to switch between in-memory and database modes. By default, it will use in-memory storage until you're ready to connect the backend.

Summary of changes:

✅ In-memory mode by default
✅ All features work in memory
✅ Sample data included - 3 users pre-loaded: admin@example.com, editor@example.com, viewer@example.com
✅ Easy to switch - when the backend is ready, just change the flag

The in-memory store persists during the server's runtime, so you won't lose users between operations. When you're ready to connect the real backend, simply change the flag.
|
Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
Actionable comments posted: 9
🤖 Fix all issues with AI agents
In `@drizzle/migrations/0003_bent_cammi.sql`:
- Line 1: The migration adds a nullable "role" column but lib/db/schema.ts
defines role as non-null with a default; update the migration to add the column
with the same NOT NULL and DEFAULT value used in lib/db/schema.ts (e.g., ALTER
TABLE "users" ADD COLUMN "role" text NOT NULL DEFAULT
'<exact_default_from_lib/db/schema.ts>'), ensuring the literal default matches
the schema file so existing rows get the default and runtime assumptions stay
consistent.
In `@drizzle/migrations/meta/0003_snapshot.json`:
- Around line 260-271: The snapshot currently marks users.email and users.role
as nullable; update the migration that defines the users table so the email
column has notNull: true and the role column has notNull: true plus the proper
default (e.g., default: 'user' or your app's default), then re-run your
migration/snapshot generation command (drizzle-kit or your migration tool) to
regenerate the 0003 snapshot so it reflects NOT NULL and the role default;
specifically locate the migration that creates/updates the users table (the
users table definition and its columns email and role) and change the column
metadata to notNull: true and add the default for role before regenerating the
snapshot.
In `@lib/actions/users.ts`:
- Around line 21-27: The module-level usersStore currently seeds hardcoded
default users which is unsafe for serverless and production; change usersStore
to initialize as an empty Record<string, Array<User>> and only populate sample
users when an explicit dev/test flag is set (e.g., NODE_ENV === 'development' or
a dedicated isDev/isTest toggle), add a clear processLogger.warn or similar
warning when the fallback in-memory store is activated so production incidents
are visible, and update the README/comments to state this fallback is intended
for local/dev testing only; ensure all references to usersStore (the
module-level variable) and any functions that call it check the dev/test flag
before using seeded data.
- Around line 34-50: getUsers currently ignores the userId when db is present,
causing different semantics vs the in-memory fallback; update getUsers to treat
userId as a tenant/context key: change the DB path (the query built from
usersSchema in getUsers) to filter by userId (e.g., add a where clause using the
schema's user id column) and return only that user's records, and keep the
existing in-memory fallback using usersStore[userId]; ensure the try/catch still
falls back to simulateDBDelay and usersStore[userId] on DB error so both code
paths return the same tenant-scoped results.
- Around line 56-73: The DB catch block currently logs the error and lets
execution fall through to the in-memory path, masking failures; change the catch
in the users add flow (the try around db.select/db.insert against usersSchema
and db) to surface the failure by returning an error result or rethrowing the
error instead of continuing: include the original error message (or error
object) in the returned error payload (e.g., return { error: 'Database error',
details: String(error) }) or throw error after logging so callers do not get a
successful in-memory response when db.insert or db.select fails. Ensure this
change is applied to the block that uses db.select().from(usersSchema)... and
db.insert(usersSchema)...returning() so the in-memory fallback is not executed
on DB errors.
- Around line 96-105: The DB update block using db.update(usersSchema).set({
role: newRole }).where(eq(usersSchema.email, userEmail)).returning() currently
falls through when the returned variable updated is falsy; detect that case
immediately after the update and short-circuit by returning or throwing a "user
not found" result (consistent with the function's error handling) instead of
continuing to the in-memory fallback branch—ensure you reference the updated
variable and userEmail to decide and avoid touching the in-memory store when the
DB reports zero rows affected.
- Around line 262-284: The current getSelectedModel and saveSelectedModel
functions use modelConfigPath to read/write a local config file which won't
persist in serverless/ephemeral environments; replace this file-based
persistence with a durable store (e.g., the app's database or environment-backed
config): change getSelectedModel to query the user/settings table or a shared
settings row for the selectedModel (falling back to the file only for
local/dev), and change saveSelectedModel to upsert the selectedModel into that
DB/settings store and return detailed error info on failure; ensure to keep
noStore() behavior where needed and update call sites that import
getSelectedModel/saveSelectedModel to work with the async DB-backed API.
- Around line 152-184: The code runs deletes and upserts one-by-one and without
a transaction, which can cause N+1 queries and partial commits; wrap the whole
operation in a transaction (use db.transaction or equivalent) and perform
set-based operations: compute incomingEmails from formData.users, use
tx.delete(usersSchema).where(email not in incomingEmails) to delete missing
users in one statement, fetch existingUsers via tx.select().from(usersSchema),
split formData.users into toInsert and toUpdate arrays by comparing to
existingUsers, then inside the same transaction perform bulk insert for toInsert
(assign id with crypto.randomUUID() when missing) and either use a single upsert
API (insert...onConflict...doUpdate) if supported or run Promise.all of
tx.update(...) calls for toUpdate so updates run in parallel but still inside
the transaction; return the final tx.select().from(usersSchema) and ensure the
transaction rolls back on error.
In `@lib/db/index.ts`:
- Around line 12-26: The current PoolConfig sets ssl.rejectUnauthorized = false
when databaseUrl includes 'supabase.co', which disables TLS verification; update
the poolConfig in the if (databaseUrl.includes('supabase.co')) branch to load
Supabase's root CA and keep verification enabled: read the CA file (e.g., via
fs.readFileSync) and set ssl = { ca: <caPem>, rejectUnauthorized: true }, and if
the CA file is not present fall back only when an explicit env flag (e.g.,
process.env.ALLOW_INSECURE_SSL === 'true') is set to allow rejectUnauthorized:
false; ensure this change is applied where Pool is constructed and dbInstance =
drizzle(...) is created so the Pool uses the trusted CA.
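The CA-pinned configuration that prompt describes can be sketched as a pure config builder. This is a hedged illustration: the CA path `certs/supabase-ca.pem` and the `ALLOW_INSECURE_SSL` env flag are assumptions, not existing repo config, and the returned object simply matches the shape pg's `PoolConfig` accepts.

```typescript
import { readFileSync } from "node:fs";

// Sketch: build a pool config that keeps TLS verification ON by trusting the
// provider's root CA, with an explicit opt-in escape hatch for local use.
type SslConfig = { ca?: string; rejectUnauthorized: boolean };
type PoolConfigSketch = { connectionString: string; ssl?: SslConfig };

function buildPoolConfig(
  databaseUrl: string,
  readCa: () => string = () => readFileSync("certs/supabase-ca.pem", "utf8"),
  allowInsecure: boolean = process.env.ALLOW_INSECURE_SSL === "true",
): PoolConfigSketch {
  const config: PoolConfigSketch = { connectionString: databaseUrl };
  if (databaseUrl.includes("supabase.co")) {
    try {
      // Verification stays enabled; we just pin the trusted CA.
      config.ssl = { ca: readCa(), rejectUnauthorized: true };
    } catch {
      if (allowInsecure) {
        config.ssl = { rejectUnauthorized: false }; // explicit opt-in only
      } else {
        throw new Error("CA certificate missing and insecure SSL not allowed");
      }
    }
  }
  return config;
}
```

The result would be passed to `new Pool(...)` where `dbInstance = drizzle(...)` is created, so the pool uses the pinned CA by default and only degrades when the operator explicitly opts in.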
📜 Review details
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (8)
components/settings/components/user-management-form.tsx
drizzle/migrations/0003_bent_cammi.sql
drizzle/migrations/meta/0003_snapshot.json
drizzle/migrations/meta/_journal.json
lib/actions/calendar.ts
lib/actions/users.ts
lib/db/index.ts
lib/db/schema.ts
🧰 Additional context used
🧬 Code graph analysis (2)
lib/db/index.ts (1)
lib/db/migrate.ts (1)
runMigrations(8-34)
lib/actions/users.ts (2)
lib/db/index.ts (1)
db(33-33)

lib/auth/get-current-user.ts (1)
getSupabaseUserAndSessionOnServer(22-103)
🔇 Additional comments (8)
drizzle/migrations/meta/_journal.json (1)

11-31: Journal entries look consistent. Appended migration entries align with the new migration artifacts.

lib/db/schema.ts (1)

12-13: Good schema tightening for required user fields. Non-null email and role align with expected user metadata.

components/settings/components/user-management-form.tsx (1)

146-149: Nice fix to prevent accidental form submission. Adding type="button" on action buttons avoids unintended submit behavior.

lib/actions/calendar.ts (4)

4-5: Imports are appropriate for mention validation.

25-58: Graceful DB-unavailable guard is good. Returning an empty list with a warning keeps behavior predictable when db is unset.

129-163: User tag persistence in update/insert paths looks correct.

112-127: No changes needed: syncUserWithDatabase() is already safe to call unconditionally. The function has an explicit if (db) guard at line 204 of lib/actions/users.ts that prevents database operations when db is unavailable. All database queries and mutations are wrapped within this guard, so the function handles db unavailability gracefully without throwing. The unconditional call in saveNote() is correct.

lib/actions/users.ts (1)

241-260: LGTM! The search implementation correctly:

- Validates minimum query length
- Strips the @ prefix for autocomplete compatibility
- Uses ilike for case-insensitive prefix matching (drizzle-orm parameterizes the value safely)
- Limits results to 5 for autocomplete performance
- Falls back to in-memory search across all contexts
✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.
@@ -0,0 +1 @@
+ALTER TABLE "users" ADD COLUMN "role" text;
\ No newline at end of file
Add NOT NULL + default to match the app schema.
lib/db/schema.ts sets role as non-null with a default, but this migration creates a nullable column. Existing rows can remain null, causing runtime mismatches and unexpected nulls in a field treated as non-null.
🛠️ Proposed migration adjustment
-ALTER TABLE "users" ADD COLUMN "role" text;
+ALTER TABLE "users" ADD COLUMN "role" text NOT NULL DEFAULT 'viewer';
🤖 Prompt for AI Agents
In `@drizzle/migrations/0003_bent_cammi.sql` at line 1, The migration adds a
nullable "role" column but lib/db/schema.ts defines role as non-null with a
default; update the migration to add the column with the same NOT NULL and
DEFAULT value used in lib/db/schema.ts (e.g., ALTER TABLE "users" ADD COLUMN
"role" text NOT NULL DEFAULT '<exact_default_from_lib/db/schema.ts">'), ensuring
the literal default matches the schema file so existing rows get the default and
runtime assumptions stay consistent.
| "email": { | ||
| "name": "email", | ||
| "type": "text", | ||
| "primaryKey": false, | ||
| "notNull": false | ||
| }, | ||
| "role": { | ||
| "name": "role", | ||
| "type": "text", | ||
| "primaryKey": false, | ||
| "notNull": false | ||
| } |
There was a problem hiding this comment.
Snapshot reflects nullable fields that should be non-null.
users.email and users.role are marked nullable here, but the app schema treats them as required. After fixing the migration, regenerate the snapshot so it reflects NOT NULL (and role default).
🤖 Prompt for AI Agents
In `@drizzle/migrations/meta/0003_snapshot.json` around lines 260 - 271, The
snapshot currently marks users.email and users.role as nullable; update the
migration that defines the users table so the email column has notNull: true and
the role column has notNull: true plus the proper default (e.g., default: 'user'
or your app's default), then re-run your migration/snapshot generation command
(drizzle-kit or your migration tool) to regenerate the 0003 snapshot so it
reflects NOT NULL and the role default; specifically locate the migration that
creates/updates the users table (the users table definition and its columns
email and role) and change the column metadata to notNull: true and add the
default for role before regenerating the snapshot.
| // In-memory store for fallback/mock mode | ||
| let usersStore: Record<string, Array<User>> = { | ||
| 'default-user': [ // Simulate a default user having some initial users | ||
| 'default-user': [ | ||
| { id: '1', email: 'admin@example.com', role: 'admin' }, | ||
| { id: '2', email: 'editor@example.com', role: 'editor' }, | ||
| ], | ||
| }; |
There was a problem hiding this comment.
In-memory store is ephemeral in serverless deployments.
Module-level state (usersStore) resets on cold starts and diverges across server instances. The hardcoded default users (admin@example.com, editor@example.com) could leak into production if the DB becomes temporarily unavailable.
Consider:
- Initializing with an empty store and populating only in explicit dev/test modes.
- Adding a warning log when fallback activates in production.
- Documenting that this fallback is intended for development only.
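The three suggestions above could look roughly like this — a sketch assuming `NODE_ENV` gating is acceptable; the function name is hypothetical:

```typescript
// Seed demo users only in dev/test; in production the fallback starts empty
// and logs a warning so the fallback activation is visible in logs.
type UserRole = 'admin' | 'editor' | 'viewer';

interface User {
  id: string;
  email: string;
  role: UserRole;
}

function createFallbackStore(nodeEnv: string | undefined): Record<string, User[]> {
  if (nodeEnv !== 'development' && nodeEnv !== 'test') {
    console.warn('[users] In-memory fallback active outside dev/test; data is ephemeral.');
    return {}; // no hardcoded users can leak into production
  }
  return {
    'default-user': [
      { id: '1', email: 'admin@example.com', role: 'admin' },
      { id: '2', email: 'editor@example.com', role: 'editor' },
    ],
  };
}
```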
🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 21 - 27, The module-level usersStore
currently seeds hardcoded default users which is unsafe for serverless and
production; change usersStore to initialize as an empty Record<string,
Array<User>> and only populate sample users when an explicit dev/test flag is
set (e.g., NODE_ENV === 'development' or a dedicated isDev/isTest toggle), add a
clear processLogger.warn or similar warning when the fallback in-memory store is
activated so production incidents are visible, and update the README/comments to
state this fallback is intended for local/dev testing only; ensure all
references to usersStore (the module-level variable) and any functions that call
it check the dev/test flag before using seeded data.
| export async function getUsers(userId: string = 'default-user'): Promise<{ users: User[] }> { | ||
| if (db) { | ||
| try { | ||
| const results = await db.select().from(usersSchema); | ||
| return { users: results as User[] }; | ||
| } catch (error) { | ||
| console.error('Error fetching users from DB:', error); | ||
| } | ||
| } | ||
|
|
||
| // Fallback | ||
| await simulateDBDelay(); | ||
| if (!usersStore[userId]) { | ||
| usersStore[userId] = []; | ||
| } | ||
| console.log(`[Action: getUsers] Fetched users for ${userId}:`, usersStore[userId]); | ||
| return { users: usersStore[userId] }; | ||
| } |
There was a problem hiding this comment.
userId parameter ignored in DB path creates inconsistent behavior.
When DB is available, getUsers returns all users globally (line 37). When DB is unavailable, it returns users keyed by userId (line 49). This semantic mismatch means callers will get different results depending on which code path executes.
If userId is intended as a tenant/context key, the DB query should filter by it. If global user list is intended, remove the parameter and the in-memory keying.
🔧 Proposed fix if userId should be ignored (simplify in-memory to match DB)
-let usersStore: Record<string, Array<User>> = {
- 'default-user': [
+let usersStore: Array<User> = [
{ id: '1', email: 'admin@example.com', role: 'admin' },
{ id: '2', email: 'editor@example.com', role: 'editor' },
- ],
-};
+];
export async function getUsers(userId: string = 'default-user'): Promise<{ users: User[] }> {
if (db) {
// ... DB path unchanged
}
await simulateDBDelay();
- if (!usersStore[userId]) {
- usersStore[userId] = [];
- }
- return { users: usersStore[userId] };
+ return { users: usersStore };
}🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 34 - 50, getUsers currently ignores the
userId when db is present, causing different semantics vs the in-memory
fallback; update getUsers to treat userId as a tenant/context key: change the DB
path (the query built from usersSchema in getUsers) to filter by userId (e.g.,
add a where clause using the schema's user id column) and return only that
user's records, and keep the existing in-memory fallback using
usersStore[userId]; ensure the try/catch still falls back to simulateDBDelay and
usersStore[userId] on DB error so both code paths return the same tenant-scoped
results.
| if (db) { | ||
| try { | ||
| const existing = await db.select().from(usersSchema).where(eq(usersSchema.email, newUser.email)).limit(1); | ||
| if (existing.length > 0) { | ||
| return { error: 'User with this email already exists.' }; | ||
| } | ||
|
|
||
| const [inserted] = await db.insert(usersSchema).values({ | ||
| id: crypto.randomUUID(), | ||
| email: newUser.email, | ||
| role: newUser.role | ||
| }).returning(); | ||
|
|
||
| return { user: inserted as User }; | ||
| } catch (error) { | ||
| console.error('Error adding user to DB:', error); | ||
| } | ||
| } |
There was a problem hiding this comment.
Silent fallback to in-memory after DB error masks failures.
When the DB operation fails (line 70-72), the error is logged but execution falls through to the in-memory path, returning a success response. The caller has no indication that:
- The DB operation failed
- Data was stored in ephemeral memory instead
This can cause data loss and inconsistent state between what the user sees and what's persisted.
🛡️ Proposed fix to surface fallback status
export async function addUser(userId: string = 'default-user', newUser: { email: string; role: UserRole }): Promise<{ user?: User; error?: string }> {
+ let usedFallback = false;
if (db) {
try {
// ... existing DB logic
return { user: inserted as User };
} catch (error) {
console.error('Error adding user to DB:', error);
+ usedFallback = true;
}
}
// Fallback
await simulateDBDelay();
// ... existing fallback logic
- return { user: userToAdd };
+ return { user: userToAdd, ...(usedFallback && { warning: 'Stored in memory only due to DB error' }) };
}🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 56 - 73, The DB catch block currently logs
the error and lets execution fall through to the in-memory path, masking
failures; change the catch in the users add flow (the try around
db.select/db.insert against usersSchema and db) to surface the failure by
returning an error result or rethrowing the error instead of continuing: include
the original error message (or error object) in the returned error payload
(e.g., return { error: 'Database error', details: String(error) }) or throw
error after logging so callers do not get a successful in-memory response when
db.insert or db.select fails. Ensure this change is applied to the block that
uses db.select().from(usersSchema)... and db.insert(usersSchema)...returning()
so the in-memory fallback is not executed on DB errors.
| const [updated] = await db.update(usersSchema) | ||
| .set({ role: newRole }) | ||
| .where(eq(usersSchema.email, userEmail)) | ||
| .returning(); | ||
|
|
||
| if (updated) return { user: updated as User }; | ||
| } catch (error) { | ||
| console.error('Error updating role in DB:', error); | ||
| } | ||
| } |
There was a problem hiding this comment.
Missing "user not found" handling when DB update affects zero rows.
If updated is falsy (no matching email in DB), the code falls through to the in-memory path instead of returning an error. This causes incorrect behavior: a user that doesn't exist in DB could be "updated" in memory, or the in-memory store could report "not found" for a user that exists in DB under a different context.
🐛 Proposed fix
const [updated] = await db.update(usersSchema)
.set({ role: newRole })
.where(eq(usersSchema.email, userEmail))
.returning();
- if (updated) return { user: updated as User };
+ if (updated) {
+ return { user: updated as User };
+ }
+ return { error: 'User not found.' };
} catch (error) {
console.error('Error updating role in DB:', error);
+ // Only fall through to in-memory on connection/query errors, not "not found"
}
}📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| const [updated] = await db.update(usersSchema) | |
| .set({ role: newRole }) | |
| .where(eq(usersSchema.email, userEmail)) | |
| .returning(); | |
| if (updated) return { user: updated as User }; | |
| } catch (error) { | |
| console.error('Error updating role in DB:', error); | |
| } | |
| } | |
| const [updated] = await db.update(usersSchema) | |
| .set({ role: newRole }) | |
| .where(eq(usersSchema.email, userEmail)) | |
| .returning(); | |
| if (updated) { | |
| return { user: updated as User }; | |
| } | |
| return { error: 'User not found.' }; | |
| } catch (error) { | |
| console.error('Error updating role in DB:', error); | |
| // Only fall through to in-memory on connection/query errors, not "not found" | |
| } | |
| } |
🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 96 - 105, The DB update block using
db.update(usersSchema).set({ role: newRole }).where(eq(usersSchema.email,
userEmail)).returning() currently falls through when the returned variable
updated is falsy; detect that case immediately after the update and
short-circuit by returning or throwing a "user not found" result (consistent
with the function's error handling) instead of continuing to the in-memory
fallback branch—ensure you reference the updated variable and userEmail to
decide and avoid touching the in-memory store when the DB reports zero rows
affected.
| if (db) { | ||
| try { | ||
| const existingUsers = await db.select().from(usersSchema); | ||
| const incomingEmails = new Set(formData.users.map(u => u.email)); | ||
|
|
||
| // Delete missing | ||
| const toDelete = existingUsers.filter((u: any) => u.email && !incomingEmails.has(u.email)); | ||
| for (const u of toDelete) { | ||
| if (u.email) await db.delete(usersSchema).where(eq(usersSchema.email, u.email)); | ||
| } | ||
|
|
||
| // Simulate saving systemPrompt and selectedModel | ||
| // ... (logic for other settings) | ||
| // Upsert incoming | ||
| for (const u of formData.users) { | ||
| if (!u.email) continue; | ||
| const existing = existingUsers.find((ex: any) => ex.email === u.email); | ||
| if (existing) { | ||
| await db.update(usersSchema) | ||
| .set({ role: u.role }) | ||
| .where(eq(usersSchema.id, existing.id)); | ||
| } else { | ||
| await db.insert(usersSchema).values({ | ||
| id: u.id || crypto.randomUUID(), | ||
| email: u.email, | ||
| role: u.role | ||
| }); | ||
| } | ||
| } | ||
|
|
||
| // For users, the frontend form already constructs the 'users' array. | ||
| // Here, we could compare the incoming users list with the stored one | ||
| // and make granular calls to addUser, updateUserRole, removeUser if needed, | ||
| // or simply replace the user list if that's the desired behavior. | ||
| // For simplicity in this simulation, let's assume the form sends the complete new user list. | ||
| const updatedUsers = await db.select().from(usersSchema); | ||
| return { success: true, message: 'Settings and users updated successfully.', users: updatedUsers as User[] }; | ||
| } catch (error) { | ||
| console.error('Error updating settings and users in DB:', error); | ||
| } |
There was a problem hiding this comment.
Bulk update lacks transaction and uses N+1 queries.
The current implementation:
- Executes deletes and upserts one-by-one in a loop (N+1 queries)
- Has no transaction boundary—partial failures leave the database in an inconsistent state
If an insert fails mid-loop (line 172-177), previously deleted users are lost and only some new users are created.
🔧 Proposed fix using transaction
if (db) {
try {
+ return await db.transaction(async (tx) => {
const existingUsers = await tx.select().from(usersSchema);
const incomingEmails = new Set(formData.users.map(u => u.email));
// Delete missing
const toDelete = existingUsers.filter((u: any) => u.email && !incomingEmails.has(u.email));
- for (const u of toDelete) {
- if (u.email) await db.delete(usersSchema).where(eq(usersSchema.email, u.email));
- }
+ if (toDelete.length > 0) {
+ const emailsToDelete = toDelete.map(u => u.email).filter(Boolean);
+ // Use inArray for batch delete
+ await tx.delete(usersSchema).where(inArray(usersSchema.email, emailsToDelete));
+ }
// Upsert incoming (consider using onConflictDoUpdate for true upsert)
for (const u of formData.users) {
// ... existing upsert logic but use `tx` instead of `db`
}
const updatedUsers = await tx.select().from(usersSchema);
return { success: true, message: 'Settings and users updated successfully.', users: updatedUsers as User[] };
+ });
} catch (error) {🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 152 - 184, The code runs deletes and
upserts one-by-one and without a transaction, which can cause N+1 queries and
partial commits; wrap the whole operation in a transaction (use db.transaction
or equivalent) and perform set-based operations: compute incomingEmails from
formData.users, use tx.delete(usersSchema).where(email not in incomingEmails) to
delete missing users in one statement, fetch existingUsers via
tx.select().from(usersSchema), split formData.users into toInsert and toUpdate
arrays by comparing to existingUsers, then inside the same transaction perform
bulk insert for toInsert (assign id with crypto.randomUUID() when missing) and
either use a single upsert API (insert...onConflict...doUpdate) if supported or
run Promise.all of tx.update(...) calls for toUpdate so updates run in parallel
but still inside the transaction; return the final tx.select().from(usersSchema)
and ensure the transaction rolls back on error.
| // Keep the model config logic | ||
| const modelConfigPath = path.resolve(process.cwd(), 'config', 'model.json'); | ||
|
|
||
| export async function getSelectedModel(): Promise<string | null> { | ||
| noStore(); | ||
| console.log(`[DEBUG] getSelectedModel - Reading from path: "${modelConfigPath}"`); | ||
| try { | ||
| const data = await fs.readFile(modelConfigPath, 'utf8'); | ||
| console.log(`[DEBUG] getSelectedModel - Raw file content: "${data}"`); | ||
| const config = JSON.parse(data); | ||
| return config.selectedModel || null; | ||
| } catch (error) { | ||
| console.error('Error reading model config:', error); | ||
| console.log(`[DEBUG] getSelectedModel - Error reading file:`, error); | ||
| return null; | ||
| } | ||
| } | ||
|
|
||
| export async function saveSelectedModel(model: string): Promise<{ success: boolean; error?: string }> { | ||
| console.log(`[DEBUG] saveSelectedModel - Received model selection: "${model}"`); | ||
| console.log(`[DEBUG] saveSelectedModel - Writing to path: "${modelConfigPath}"`); | ||
| try { | ||
| const data = JSON.stringify({ selectedModel: model }, null, 2); | ||
| await fs.writeFile(modelConfigPath, data, 'utf8'); | ||
| console.log(`[DEBUG] saveSelectedModel - Successfully wrote to file.`); | ||
| revalidatePath('/settings'); | ||
| return { success: true }; | ||
| } catch (error) { | ||
| console.error('Error saving model config:', error); | ||
| console.log(`[DEBUG] saveSelectedModel - Error writing to file:`, error); | ||
| return { success: false, error: 'Failed to save selected model.' }; | ||
| } | ||
| } |
There was a problem hiding this comment.
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check if model config is used elsewhere and how it's initialized
rg -n "modelConfigPath|getSelectedModel|saveSelectedModel" --type ts -A 2Repository: QueueLab/QCX
Length of output: 2482
🏁 Script executed:
#!/bin/bash
# Check if the file exists and what deployment platform is being used
cat package.json | grep -A 5 -E '"name"|"build"|"deploy"' | head -20

Repository: QueueLab/QCX
Length of output: 591
🏁 Script executed:
#!/bin/bash
# Look for deployment config files (vercel.json, netlify.toml, etc)
fd -e json -e toml -e yaml | grep -E "(vercel|netlify|deployment|config)" | head -20

Repository: QueueLab/QCX
Length of output: 110
File-based config may not persist in serverless environments.
saveSelectedModel writes to the local filesystem, which is typically read-only or ephemeral in serverless deployments (Vercel, AWS Lambda, etc.). The functions are actively used in settings and agent tools for model selection, but changes will be lost on redeployment or cold starts.
Consider storing model selection in the database or using environment variables for production deployments.
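A durable alternative is a small settings abstraction backed by the database. A minimal sketch, where `SettingsStore` and `memoryStore` are hypothetical — in the app the store would wrap the Drizzle-backed settings table:

```typescript
// Persist the selected model through a pluggable store instead of the filesystem.
interface SettingsStore {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
}

const MODEL_KEY = 'selectedModel';

async function getSelectedModel(store: SettingsStore): Promise<string | null> {
  return store.get(MODEL_KEY);
}

async function saveSelectedModel(
  store: SettingsStore,
  model: string
): Promise<{ success: boolean }> {
  await store.set(MODEL_KEY, model);
  return { success: true };
}

// In-memory implementation for local dev and tests; production would use the DB.
function memoryStore(): SettingsStore {
  const m = new Map<string, string>();
  return {
    get: async (k) => m.get(k) ?? null,
    set: async (k, v) => {
      m.set(k, v);
    },
  };
}
```

The file-based path could remain as a dev-only fallback behind an environment check, mirroring the in-memory pattern used elsewhere in this PR.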
🤖 Prompt for AI Agents
In `@lib/actions/users.ts` around lines 262 - 284, The current getSelectedModel
and saveSelectedModel functions use modelConfigPath to read/write a local config
file which won't persist in serverless/ephemeral environments; replace this
file-based persistence with a durable store (e.g., the app's database or
environment-backed config): change getSelectedModel to query the user/settings
table or a shared settings row for the selectedModel (falling back to the file
only for local/dev), and change saveSelectedModel to upsert the selectedModel
into that DB/settings store and return detailed error info on failure; ensure to
keep noStore() behavior where needed and update call sites that import
getSelectedModel/saveSelectedModel to work with the async DB-backed API.
| if (databaseUrl) { | ||
| const poolConfig: PoolConfig = { | ||
| connectionString: databaseUrl, | ||
| }; | ||
| } | ||
|
|
||
| const pool = new Pool(poolConfig); | ||
| if (databaseUrl.includes('supabase.co')) { | ||
| poolConfig.ssl = { | ||
| rejectUnauthorized: false, | ||
| }; | ||
| } | ||
|
|
||
| try { | ||
| const pool = new Pool(poolConfig); | ||
| dbInstance = drizzle(pool, { schema, logger: process.env.NODE_ENV === 'development' }); | ||
| } catch (error) { |
There was a problem hiding this comment.
🧩 Analysis chain
🌐 Web query:
Supabase PostgreSQL SSL connection rejectUnauthorized configuration
💡 Result:
For Supabase Postgres, rejectUnauthorized is a Node.js TLS option (used by drivers like pg, Sequelize, etc.). You generally only set rejectUnauthorized: false when you can’t validate the server cert (e.g., missing CA), but the recommended approach is to use Supabase’s root CA and keep verification on.
Recommended (verify server cert)
- Download the Server root certificate from your Supabase Dashboard (Database → SSL configuration). (supabase.com)
- Connect with full verification (Postgres
sslmode=verify-full). (supabase.com)
pg (node-postgres) example
import fs from "fs";
import { Pool } from "pg";
const pool = new Pool({
connectionString: process.env.DATABASE_URL, // keep this free of sslmode/sslrootcert if you set ssl below
ssl: {
ca: fs.readFileSync("./prod-supabase.cer").toString(),
rejectUnauthorized: true,
},
});

pg passes the ssl object to Node's TLSSocket, where `rejectUnauthorized` controls certificate verification. (node-postgres.com)
Important: if your connectionString includes sslmode, sslrootcert, etc., pg will overwrite the ssl object you provided. (node-postgres.com)
“Make it connect” (not recommended for production)
If you’re seeing cert errors and just need a quick workaround, you can disable verification:
ssl: { rejectUnauthorized: false }

This is the common pattern for self-signed / unverified cert situations in pg, but it weakens security. (node-postgres.com)
Supabase-side note (enforcement)
Supabase can be configured to reject non-SSL connections (“SSL enforcement”). This is separate from rejectUnauthorized (which is client-side cert verification). (supabase.com)
Citations:
- 1: https://supabase.com/docs/guides/database/psql?utm_source=openai
- 2: https://supabase.com/docs/guides/database/psql?utm_source=openai
- 3: https://node-postgres.com/features/ssl?utm_source=openai
- 4: https://node-postgres.com/features/ssl?utm_source=openai
- 5: https://node-postgres.com/features/ssl?utm_source=openai
- 6: https://supabase.com/docs/guides/platform/ssl-enforcement?utm_source=openai
🌐 Web query:
Node.js pg pool SSL rejectUnauthorized Supabase best practices
💡 Result:
Best practice (Node.js pg / Pool) for Supabase SSL
1) Don’t use rejectUnauthorized: false in production.
That setting disables certificate verification (you’re encrypted, but vulnerable to MITM). Instead, verify the server cert using Supabase’s CA cert and keep rejectUnauthorized: true (default). Supabase recommends the strongest client mode, verify-full, which requires trusting their CA certificate. [1][2]
2) Download the Supabase database CA certificate and pass it as ssl.ca.
You can download the “Server root certificate” / SSL certificate from your Supabase project dashboard (Database settings / SSL Configuration). [2][3]
import fs from "node:fs";
import { Pool } from "pg";
const pool = new Pool({
connectionString: process.env.DATABASE_URL, // keep this "clean" (no sslmode/sslrootcert params)
ssl: {
ca: fs.readFileSync("./prod-ca-2021.cer", "utf8"), // from Supabase dashboard
// rejectUnauthorized: true, // optional; true is the default
},
});

3) Avoid mixing sslmode / sslrootcert in the connection string with an ssl object.
node-postgres warns that if sslmode, sslrootcert, etc. are present in the connection string, it can overwrite/replace the ssl config you pass in code—leading to accidental misconfiguration. Best practice: keep TLS options in the ssl object (as above). [4]
4) If you use Supabase “SSL enforcement”, still verify properly client-side.
Supabase notes that enforcement ensures SSL is used, but for strongest protection you should use verify-full semantics by trusting the Supabase CA cert (i.e., don’t skip verification). [2]
5) Pooling note (common with Supabase):
If you’re using Supabase’s pooler (Supavisor / “session pooler”), use the pooler host/port shown in the “Connect” panel; Supabase docs highlight pooled connections typically use port 6543. [3]
Sources
[1] Supabase Docs — “Connecting with SSL” (psql example with sslmode=verify-full + sslrootcert)
[2] Supabase Docs — SSL Enforcement + recommendation to use verify-full and download CA cert
[3] Supabase Docs — Connecting to Postgres (mentions downloading cert; pooling port 6543)
[4] node-postgres Docs — SSL + warning about connectionString params overwriting the ssl object
🌐 Web query:
Supabase postgres client certificate SSL requirements
💡 Result:
Supabase Postgres connections use TLS/SSL server authentication, not client-certificate (mTLS) authentication.
- SSL is (optionally) enforced server-side per project, and Supabase recommends using Postgres
sslmode=verify-fullwhen enforcement is enabled. To do that you download the Supabase database CA/root certificate from the Dashboard (Database Settings → SSL Configuration) and trust it locally. [1] - Typical connection parameters are
sslmode=verify-full+sslrootcert=/path/to/prod-supabase.cer(example shown forpsql). [2] - Client certificates (
sslcert/sslkey) are only used if the Postgres server requests them (i.e., when the server is configured to verify client cert identity). Postgres/libpq documents these parameters, but Supabase’s published connection guidance only calls forsslmodeandsslrootcert, not client cert/key. [2][3]
Bottom line: For Supabase you generally need the server root CA cert (sslrootcert) and a secure SSL mode (ideally verify-full); you do not need a client certificate unless you’re running your own Postgres configured for mTLS. [1][2][3]
Sources: [1] Supabase “Postgres SSL Enforcement” docs. [2] Supabase “Connecting with PSQL” docs. [3] PostgreSQL libpq SSL support (client certificates).
Use Supabase's root CA certificate instead of disabling TLS verification.
Setting rejectUnauthorized: false disables certificate verification and creates a MITM vulnerability. Supabase's official recommendation is to download the root CA certificate from your Supabase Dashboard (Database Settings → SSL Configuration) and trust it locally:
import fs from "fs";
const poolConfig: PoolConfig = {
connectionString: databaseUrl,
ssl: {
ca: fs.readFileSync("./prod-supabase.cer", "utf8"), // from Supabase dashboard
rejectUnauthorized: true, // default; explicit for clarity
},
};

If the CA certificate is unavailable at runtime, gate insecure SSL behind an explicit environment flag as a temporary workaround only.
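The env-flag gating suggested above can be isolated into a small pure helper (`ALLOW_INSECURE_SSL` and the function name are assumptions, not existing code in this PR):

```typescript
// Build pg SSL options: verify against the provided CA by default, and only
// fall back to unverified TLS when an explicit escape hatch is enabled.
interface SslOptions {
  ca?: string;
  rejectUnauthorized: boolean;
}

function buildSslOptions(caPem: string | null, allowInsecure: boolean): SslOptions | undefined {
  if (caPem) {
    return { ca: caPem, rejectUnauthorized: true }; // verify-full-style behavior
  }
  if (allowInsecure) {
    // Temporary workaround only: encrypted but unverified (MITM risk).
    return { rejectUnauthorized: false };
  }
  return undefined; // let the driver defaults apply
}
```

At the call site this would be driven by, e.g., `buildSslOptions(caPem, process.env.ALLOW_INSECURE_SSL === 'true')`.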
🤖 Prompt for AI Agents
In `@lib/db/index.ts` around lines 12 - 26, The current PoolConfig sets
ssl.rejectUnauthorized = false when databaseUrl includes 'supabase.co', which
disables TLS verification; update the poolConfig in the if
(databaseUrl.includes('supabase.co')) branch to load Supabase's root CA and keep
verification enabled: read the CA file (e.g., via fs.readFileSync) and set ssl =
{ ca: <caPem>, rejectUnauthorized: true }, and if the CA file is not present
fall back only when an explicit env flag (e.g., process.env.ALLOW_INSECURE_SSL
=== 'true') is set to allow rejectUnauthorized: false; ensure this change is
applied where Pool is constructed and dbInstance = drizzle(...) is created so
the Pool uses the trusted CA.
This change implements the planned @mention feature for calendar notes.
It includes:
PR created automatically by Jules for task 5204406483989686825 started by @ngoiyaeric
Summary by CodeRabbit
New Features
Bug Fixes
Chores