fix: handle array content format in message conversion#17

Open
lukedd312 wants to merge 1 commit into atalovesyou:main from lukedd312:fix/handle-array-content-format
Conversation

@lukedd312
Summary

  • The OpenAI Chat API accepts content as either a plain string or an array of content parts ([{ type: "text", text: "..." }])
  • Many upstream callers (OpenClaw, LiteLLM, etc.) send the array form even for text-only messages
  • Without this fix, messagesToPrompt() produces [object Object] as the prompt instead of the actual user message

Changes

  • Add OpenAIContentPart type and widen OpenAIChatMessage.content to string | OpenAIContentPart[]
  • Add extractContent() helper that normalises both representations to a plain string
  • Apply extractContent() to all message roles in messagesToPrompt()
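A minimal sketch of what the extractContent() helper could look like. The names OpenAIContentPart and extractContent come from the PR description, but the body below is an assumption about the implementation (text parts concatenated, non-text parts dropped), not the actual diff:

```typescript
// Content part shape from the OpenAI Chat API's array form.
type OpenAIContentPart = { type: string; text?: string };

// Normalise both accepted representations of `content` to a plain string.
function extractContent(content: string | OpenAIContentPart[]): string {
  if (typeof content === "string") return content;
  // Keep only text parts and join their text; ignore anything else.
  return content
    .filter((part) => part.type === "text" && typeof part.text === "string")
    .map((part) => part.text)
    .join("\n");
}

console.log(extractContent([{ type: "text", text: "Hello" }])); // "Hello"
```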

Test plan

  • tsc --noEmit passes
  • Verified with OpenClaw sending content: [{ type: "text", text: "..." }] — messages now display correctly
  • Plain string content still works as before

🤖 Generated with Claude Code

The OpenAI Chat API accepts `content` as either a plain string or an
array of content parts (`[{ type: "text", text: "..." }]`).  Many
upstream callers — including OpenClaw, LiteLLM, and other OpenAI-
compatible routers — send the array form even for text-only messages.

Without this fix, `messagesToPrompt()` passes the raw array object
into template literals, producing `[object Object]` as the prompt
text instead of the actual user message.

Changes:
- Add `OpenAIContentPart` type and widen `OpenAIChatMessage.content`
  to accept `string | OpenAIContentPart[]`
- Add `extractContent()` helper that normalises both representations
- Apply `extractContent()` to all message roles in `messagesToPrompt()`

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
bwiedmann added a commit to bwiedmann/claude-max-api-proxy that referenced this pull request Feb 15, 2026
… message conversion

# Conflicts:
#	src/adapter/openai-to-cli.ts
#	src/types/openai.ts
Grivn added a commit to mnemon-dev/claude-max-api-proxy that referenced this pull request Feb 20, 2026
Fixes applied (from upstream PRs atalovesyou#7, atalovesyou#11, atalovesyou#12, atalovesyou#13, atalovesyou#17, atalovesyou#20):

- fix: normalizeModelName crash on undefined model (issue atalovesyou#21)
- fix: [object Object] serialization for array content parts
- fix: E2BIG error by passing prompt via stdin instead of CLI arg
- fix: ensureString on result.result to prevent non-string output
- feat: CLAUDE_DANGEROUSLY_SKIP_PERMISSIONS env var for headless mode
- feat: OPENCLAW_PROXY=1 env injection for hook isolation
- feat: model aliases for claude-*-4-5, claude-*-4-6 generations
- feat: claude-proxy/ provider prefix support
- feat: developer role support in message conversion
- perf: increase subprocess timeout from 5min to 15min

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>