
fix: handle OpenAI content array format in messagesToPrompt#27

Open
darrenwadley-ui wants to merge 1 commit into atalovesyou:main from darrenwadley-ui:fix/content-array-serialization

Conversation

@darrenwadley-ui

Summary

  • Fix messagesToPrompt to handle the case where msg.content is an array of content parts instead of a plain string
  • Add extractContent() helper that normalizes both formats to a string
  • Replace all three msg.content references in messagesToPrompt with extractContent(msg.content)

Bug

The OpenAI chat completions API allows message content to be either a string or an array of content part objects:

{
  "role": "user",
  "content": [{"type": "text", "text": "Hello, how are you?"}]
}

Clients like OpenClaw always send content in the array format. When the proxy receives this, JavaScript's template literal coerces the array to a string, producing [object Object] instead of the actual text. This means Claude CLI receives garbage input like:

[object Object]

instead of:

Hello, how are you?
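The coercion is easy to reproduce in isolation. This is a minimal sketch of the failure mode, not code from the proxy itself: interpolating an array of content-part objects into a template literal stringifies each element via the default `Object.prototype.toString`, yielding `[object Object]`.

```javascript
// Minimal repro of the coercion bug: an OpenAI-style content array
// interpolated directly into a template literal.
const content = [{ type: "text", text: "Hello, how are you?" }];

// Array.prototype.toString joins elements with ",", and each plain
// object stringifies as "[object Object]".
const prompt = `Human: ${content}`;

console.log(prompt); // "Human: [object Object]" — not the user's text
```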

How to reproduce

Send a request to the proxy with content as an array:

curl -X POST http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-opus-4",
    "messages": [
      {"role": "user", "content": [{"type": "text", "text": "Hello!"}]}
    ]
  }'

Before this fix, Claude CLI would receive [object Object] as the prompt.

Fix

Added an extractContent() helper function that:

  1. Returns the string as-is if content is already a string
  2. Extracts and joins .text fields from content part arrays
  3. Falls back to String(content ?? "") for any other shape

All three uses of msg.content in messagesToPrompt (system, user, assistant cases) now go through this helper.
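A minimal sketch of a helper matching the three-step behavior described above (the exact merged implementation may differ):

```javascript
// Normalize OpenAI-style message content to a plain string.
// Handles: plain string, array of content parts, and any other shape.
function extractContent(content) {
  // 1. Already a string — return as-is.
  if (typeof content === "string") return content;

  // 2. Array of content parts — keep text parts and join with newlines.
  if (Array.isArray(content)) {
    return content
      .filter((part) => part && part.type === "text" && typeof part.text === "string")
      .map((part) => part.text)
      .join("\n");
  }

  // 3. Anything else — coerce safely, treating null/undefined as "".
  return String(content ?? "");
}
```

With this in place, `extractContent(msg.content)` replaces every direct `msg.content` interpolation in `messagesToPrompt`, so both string and array payloads produce the same prompt text.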

Test plan

  • Send a request with content as a plain string — should work as before
  • Send a request with content as [{type: "text", text: "..."}] — should now correctly extract the text
  • Send a request with multiple content parts — should join them with newlines

🤖 Generated with Claude Code

OpenAI chat completions API allows message content to be either a string
or an array of content parts [{type: "text", text: "..."}]. Clients like
OpenClaw always send content as an array, causing the proxy to pass
[object Object] to Claude CLI instead of the actual text.

Add extractContent() helper that normalizes both formats to a string.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
chrisle added a commit to chrisle/claude-max-api-proxy that referenced this pull request Feb 27, 2026
Users can now send large prompts without E2BIG errors, use array-based
content format for multimodal support, properly handle system prompts,
and receive token usage data in streaming responses. Model names with
any provider prefix are now correctly normalized.

Changes:
- Fix array content serialization preventing [object Object] bugs
- Pass prompts via stdin to prevent E2BIG errors with large requests
- Add proper system prompt support via --append-system-prompt flag
- Include usage data in final streaming chunks
- Support any provider prefix in model names (claude-max/, etc.)
- Add CLAUDE_DANGEROUSLY_SKIP_PERMISSIONS env var for service mode
- Update to Claude 4.5/4.6 model support

Incorporates improvements from PRs atalovesyou#12, atalovesyou#13, atalovesyou#16, atalovesyou#24, atalovesyou#27 at
https://github.com/atalovesyou/claude-max-api-proxy