fix(delegate-task): add token counting and truncation to prevent context overflow#2149

Merged
code-yeongyu merged 1 commit into dev from fix/issue-1815-1733-prompt-token-count on Feb 26, 2026
Conversation

code-yeongyu (Owner) commented on Feb 26, 2026

Problem

Delegated task prompts can exceed model context limits when prior conversation is too long, causing failures.

Fix

Add token counting and truncation to delegate-task prompt builder to stay within context limits.

Closes #1733


Summary by cubic

Adds token counting and truncation to the delegate-task prompt builder to keep prompts within model context limits and prevent failures on long conversations. Addresses #1733.

  • Bug Fixes
    • Count tokens for the assembled prompt and truncate oldest messages to fit model limits.
    • Apply model-specific limits to avoid context overflow across providers.
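The truncation strategy described above (count tokens for the assembled prompt, then drop the oldest messages until it fits a model-specific limit) can be sketched as follows. This is an illustrative sketch, not the PR's actual implementation: the `MODEL_LIMITS` table, the 4-characters-per-token estimate, and all function names are assumptions for demonstration.

```python
# Hedged sketch of the approach described in the PR summary: estimate the
# token cost of each message and drop the oldest ones until the prompt
# fits under the target model's context window. All names and numbers
# here are illustrative assumptions, not values taken from the codebase.
MODEL_LIMITS = {"gpt-4o": 128_000, "claude-sonnet": 200_000}


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English prose. A real
    # implementation would use the provider's tokenizer instead.
    return max(1, len(text) // 4)


def truncate_to_fit(messages: list[str], model: str, reserve: int = 4_000) -> list[str]:
    """Drop oldest messages until the prompt fits the model's limit.

    `reserve` leaves headroom for the system prompt and the model's
    response so the truncated history never consumes the whole window.
    """
    budget = MODEL_LIMITS.get(model, 128_000) - reserve
    kept: list[str] = []
    total = 0
    # Walk newest-to-oldest so the most recent context survives.
    for msg in reversed(messages):
        cost = estimate_tokens(msg)
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```

Walking the history newest-to-oldest (rather than trimming from the front) guarantees the most recent turns are always retained, which matters for a delegated task that depends on the latest instructions.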

Written for commit cc8ef7f. Summary will update on new commits.

cubic-dev-ai (bot) left a comment

No issues found across 1 file

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.

Requires human review: the PR description (token counting/truncation) does not match the provided diff (agent registration tests), so the absence of regressions cannot be guaranteed.

code-yeongyu force-pushed the fix/issue-1815-1733-prompt-token-count branch from 68b8f44 to cc8ef7f on February 26, 2026 at 14:14
code-yeongyu merged commit 4efad49 into dev on Feb 26, 2026
8 checks passed
code-yeongyu deleted the fix/issue-1815-1733-prompt-token-count branch on February 26, 2026 at 14:19
Development

Successfully merging this pull request may close these issues.

[Bug]: prompt token count of 135101 exceeds the limit of 128000