Official docs - https://platform.openai.com/docs/api-reference/chat
- Types included
- Docs included
- Streaming included
npm i chatgpt-wrapper
or
yarn add chatgpt-wrapper
// CommonJS
const { ChatGPT } = require('chatgpt-wrapper');

// ESM
import { ChatGPT } from 'chatgpt-wrapper';

// With types
import { ChatGPT, Message, ReqBody, ResBody } from 'chatgpt-wrapper';
- API_KEY (Required): Visit your API Keys page to retrieve the API key.
- ORG (Optional): For users who belong to multiple organizations, you can specify which organization is used for an API request. Usage from these requests will count against the specified organization's subscription quota. Get your Org ID here.
- URL (Optional): API endpoint. Defaults to the 'Create chat completion' method.
- MODEL (Optional): Model used for requests where none is specified. Defaults to 'gpt-3.5-turbo'. See the Models list.
const chat = new ChatGPT({
  API_KEY: '...', // Your API KEY (Required)
  ORG: '...',     // Your organization (Optional)
  URL: '...',     // API endpoint (Optional)
  MODEL: '...',   // Custom default model (Optional)
});
Don't forget to catch errors from your requests, since the OpenAI API sometimes returns an error message instead of a response. API errors are of the APIError type.
try {
  const answer = await chat.send('question');
  // ...
} catch (err) {
  // handle error
}

chat.send('question')
  .then((answer) => { /* ... */ })
  .catch((err) => { /* handle error */ });
send(content: ReqBody | string, fetchOptions: RequestInit = {}): Promise<ResBody>
- content - string or ReqBody
- fetchOptions - [optional] node-fetch options
Use this method to send a request to the ChatGPT API.
A raw string is equivalent to:
{
  model: 'gpt-3.5-turbo',
  messages: [{
    role: 'user',
    content: 'YOUR STRING',
  }],
}
Examples:
const answer = await chat.send('what is JavaScript');
console.log(answer.choices[0].message);
chat.send('what is JavaScript').then((answer) => {
  console.log(answer.choices[0].message);
});

const answer = await chat.send({
  model: 'gpt-3.5-turbo-0301',
  messages: [{
    role: 'user',
    content: 'what is JavaScript',
  }],
  max_tokens: 200,
});
console.log(answer.choices[0].message);
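Since fetchOptions is passed straight to node-fetch, you can also control the request itself, for example cancel it with an AbortController the same way the stream abort example below does. A minimal sketch (the 10 second timeout value is only illustrative):

const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 10000); // illustrative timeout

try {
  const answer = await chat.send('what is JavaScript', {
    signal: controller.signal,
  });
  console.log(answer.choices[0].message);
} catch (err) {
  // handle error (an aborted request also lands here)
} finally {
  clearTimeout(timer);
}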
stream(content: ReqBody | string, fetchOptions: RequestInit = {}): Promise<ResBody>
- content - string or ReqBody
- fetchOptions - [optional] node-fetch options
Use this method to send a request to the ChatGPT API and get a stream response back.
A raw string is equivalent to:
{
  model: 'gpt-3.5-turbo',
  stream: true,
  messages: [{
    role: 'user',
    content: 'YOUR STRING',
  }],
}
Examples:
(async () => {
  const answer = await chat.stream('what is JavaScript in 200 words');
  answer.pipe(process.stdout);
})();
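If you want the whole streamed answer as a single string instead of piping it, you can collect the chunks yourself. A minimal sketch, assuming the returned value is a standard Node.js readable stream emitting text chunks (which the .pipe(process.stdout) example above implies):

(async () => {
  const answer = await chat.stream('what is JavaScript in 200 words');

  // Collect streamed chunks into one string.
  let text = '';
  for await (const chunk of answer) {
    text += chunk.toString();
  }
  console.log(text);
})();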
Since you can pass options to fetch, you can abort the request with an AbortController. See the fetch docs.
Example:
const controller = new AbortController();
const doStop = () => controller.abort();
// ...
const answer = await chat.stream('generate some long story', {
  signal: controller.signal,
});
answer.pipe(process.stdout);
Now, if you call doStop(), the controller will abort the request along with the stream.
Types:

- Message - message in chat format. Source: index.ts#L4
- Function model description. See more. Source: index.ts#L46
- ReqBody - request body. Source: index.ts#L70
- ResBody - response body. Source: index.ts#L188
- APIError - OpenAI API error. Source: index.ts#L263
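If you use TypeScript, these exported types can annotate your own code. A rough sketch, assuming Message, ReqBody, and ResBody mirror the request and response shapes used in the send() examples above (the buildRequest helper and the OPENAI_API_KEY variable name are illustrative, not part of the library):

import { ChatGPT, Message, ReqBody, ResBody } from 'chatgpt-wrapper';

// Hypothetical helper: build a typed request body from chat messages.
function buildRequest(messages: Message[], maxTokens = 200): ReqBody {
  return {
    model: 'gpt-3.5-turbo',
    messages,
    max_tokens: maxTokens,
  };
}

(async () => {
  // OPENAI_API_KEY is just an example environment variable name.
  const chat = new ChatGPT({ API_KEY: process.env.OPENAI_API_KEY ?? '' });

  const history: Message[] = [
    { role: 'user', content: 'what is JavaScript' },
  ];

  const answer: ResBody = await chat.send(buildRequest(history));
  console.log(answer.choices[0].message);
})();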