fix: preserve raw string tool outputs #1608
Merged
gold-silver-copper merged 1 commit into 0xPlaygrounds:main on Apr 7, 2026
Conversation
Summary
Fixes #1605.
This changes tool output serialization in `rig-core` so that `Tool::Output = String` is preserved as raw text instead of being JSON-stringified again. Non-string tool outputs are still serialized to JSON text as before. That fixes the double-escaped tool result behavior seen with OpenAI-compatible providers, and it also restores the intended behavior for string-returning tools that deliberately emit structured JSON for downstream parsing.
What changed
- `rig-core` passes `String` outputs verbatim
- `ToolResultContent::from_tool_output` unchanged

Why this approach
The bug was caused at the generic tool serialization boundary, not in provider-specific message conversion.
Previously, all tool outputs went through `serde_json::to_string(&output)`. For `String` outputs, that produced a JSON string literal, which then surfaced in tool result messages as quoted/escaped text.

Fixing this at the serializer boundary is lower-risk than adding parser-side unwrapping logic:
- it allows `String` tools that return structured JSON payloads to continue flowing into existing `from_tool_output` parsing

Tests
Added regression coverage for:
- a `String` tool output remaining parseable as tool result content

OpenAI’s chat tool-message schema uses `role: "tool"`, a required `tool_call_id`, and `content` as the tool result payload, which indicates provider-facing tool results are raw text/content rather than a JSON-stringified string literal.[1] vLLM follows that same shape: its tool-calling docs say chat templates must handle tool-role messages,[2] and its official OpenAI-compatible example appends tool results as `{"role": "tool", "content": result, "tool_call_id": call.id, "name": call.function.name}`.[3]

So the Rig fix is aligned with the strongest available evidence:
- `String` tool outputs should stay raw when sent back through OpenAI-compatible providers like vLLM.

Sources: