
# Streaming in chat completions

## Stream options

```ts
export interface StreamOptions {
  // Request token usage to be reported while streaming.
  includeUsage?: boolean;
}
```
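`includeUsage` likely corresponds to OpenAI's `stream_options.include_usage`, which attaches token usage to the final streamed chunk. A hedged sketch of picking that usage out of a chunk stream — the `Chunk` and `Usage` shapes below are assumptions modeled on OpenAI's streaming format, not this library's types:

```ts
// Assumed shapes, modeled on OpenAI's streaming response format.
interface Usage {
  promptTokens: number;
  completionTokens: number;
}

interface Chunk {
  content?: string;
  usage?: Usage; // assumed present only on the final chunk when includeUsage is true
}

// Scan the stream and keep the usage payload, if the provider sends one.
async function lastUsage(stream: AsyncIterable<Chunk>): Promise<Usage | undefined> {
  let usage: Usage | undefined;
  for await (const chunk of stream) {
    if (chunk.usage) usage = chunk.usage;
  }
  return usage;
}
```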

## Text streaming

```ts
const { textStream } = await openai()
  .chat("gpt-4o-mini")
  .messages([user("hi")])
  .stream()
  .run();

for await (const text of textStream) {
  process.stdout.write(text);
}
```
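If you need the full reply as well as the live output, you can accumulate the deltas as they arrive. A minimal, self-contained sketch — a mock async iterable stands in for a real `textStream`, since any `AsyncIterable<string>` behaves the same way:

```ts
// Stand-in for a real textStream (hypothetical data, no network call).
async function* mockTextStream(): AsyncGenerator<string> {
  for (const piece of ["Hel", "lo", ", wor", "ld!"]) {
    yield piece;
  }
}

// Write each delta to stdout (as above) while also accumulating the full reply.
async function collectText(stream: AsyncIterable<string>): Promise<string> {
  let full = "";
  for await (const text of stream) {
    process.stdout.write(text);
    full += text;
  }
  return full;
}
```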

## Streaming with tools

```ts
const { toolCallStream } = await openai()
  .chat("gpt-4o-mini")
  .tool(weatherTool)
  .messages([user("What's the weather like in Boston, Beijing, Tokyo today?")])
  .stream()
  .run();

for await (const toolCalls of toolCallStream) {
  console.log(toolCalls);
}
```
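Providers typically stream tool-call arguments as JSON string fragments keyed by index. If `toolCallStream` yields OpenAI-style deltas rather than complete calls, they can be merged before parsing. A hypothetical sketch — the `ToolCallDelta` shape below is an assumption modeled on OpenAI's chunk format, not this library's type:

```ts
// Hypothetical delta shape, modeled on OpenAI's streaming tool-call chunks.
interface ToolCallDelta {
  index: number;
  id?: string;
  name?: string;
  argumentsFragment?: string; // partial JSON, arrives piece by piece
}

interface MergedToolCall {
  id?: string;
  name?: string;
  arguments: string; // complete JSON once the stream ends
}

// Fold streamed deltas into complete tool calls, grouped by index.
function mergeToolCallDeltas(deltas: ToolCallDelta[]): MergedToolCall[] {
  const calls: MergedToolCall[] = [];
  for (const d of deltas) {
    const call = (calls[d.index] ??= { arguments: "" });
    if (d.id) call.id = d.id;
    if (d.name) call.name = d.name;
    if (d.argumentsFragment) call.arguments += d.argumentsFragment;
  }
  return calls;
}
```

Once merged, `JSON.parse(call.arguments)` yields the structured arguments for each tool invocation.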

## Structured output streaming

```ts
const { objectStream } = await openai()
  .chat("gpt-4o-mini")
  .messages([user("generate a person with name and age in json format")])
  .responseSchema(personSchema)
  .objectStream()
  .run();

for await (const object of objectStream) {
  console.log(object);
}
```
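Structured-output streams commonly yield progressively more complete partial objects, with the last yield being the final object; whether `objectStream` behaves this way is an assumption here. A self-contained sketch of keeping only the latest value, using a mock stream:

```ts
// Hypothetical shape matching a personSchema with name and age.
interface Person {
  name?: string;
  age?: number;
}

// Keep the latest (most complete) object from a stream of partials.
async function lastObject<T>(stream: AsyncIterable<T>): Promise<T | undefined> {
  let last: T | undefined;
  for await (const obj of stream) {
    last = obj;
  }
  return last;
}
```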

## Chunk stream

The raw chunk objects as emitted by the provider:

```ts
const { stream } = await openai()
  .chat("gpt-4o-mini")
  .messages([user("hi")])
  .stream()
  .run();

for await (const chunk of stream) {
  console.log(chunk);
}
```
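When the provider is OpenAI, each raw chunk follows the `chat.completion.chunk` format, so the text delta lives at `choices[0].delta.content`. A sketch of pulling it out — the `RawChunk` interface below is a minimal assumed slice of that format:

```ts
// Minimal slice of OpenAI's chat.completion.chunk shape (assumed).
interface RawChunk {
  choices: { delta: { content?: string } }[];
}

// Extract the text delta; it may be absent (e.g. role-only or final chunks).
function deltaText(chunk: RawChunk): string {
  return chunk.choices[0]?.delta?.content ?? "";
}
```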