
Version 2.0 #32

Merged · 32 commits · Nov 7, 2023
Changes from 1 commit

Commits (32):
f445a81
Switch to @dexaai eslint config
rileytomasek Oct 21, 2023
f9c8082
Upgrade dev dependencies
rileytomasek Oct 21, 2023
b0ea006
Reformat files with new Prettier version
rileytomasek Oct 21, 2023
fd2d413
Require Node version 18 or higher
rileytomasek Oct 21, 2023
77012e3
Switch to ECMAScript modules + TypeScript build
rileytomasek Oct 21, 2023
8abdb12
Upgrade dependencies
rileytomasek Oct 21, 2023
dba3876
Remove barely used or unused dependencies
rileytomasek Oct 21, 2023
3ed2f47
Clean up client initialization args for Ky
rileytomasek Oct 21, 2023
795690e
Remove deprecated edit endpoint
rileytomasek Oct 21, 2023
9d61726
Remove Zod dependency & use OpenAI types
rileytomasek Oct 21, 2023
2d2f693
Move closer to raw OpenAI API endpoints/shapes
rileytomasek Oct 22, 2023
7251318
Use Error class from OpenAI Node package
rileytomasek Oct 22, 2023
caa4a24
Add jitter to failed request retries
rileytomasek Oct 22, 2023
b9e2abe
Bump version to 2.0.0-beta.1
rileytomasek Oct 22, 2023
de6c343
Standardize param/response types and export
rileytomasek Oct 22, 2023
0362174
Export error classes for instanceof checks
rileytomasek Oct 22, 2023
debbe5d
Bump to 2.0.0-beta.2
rileytomasek Oct 22, 2023
a3503a3
Support passing custom headers to a single request
rileytomasek Oct 22, 2023
d7c0620
Include OpenAI library types
rileytomasek Oct 23, 2023
a699bc2
Update readme
rileytomasek Oct 23, 2023
37eeec0
Update package sizes in readme
rileytomasek Oct 23, 2023
193ec64
Export type for chat stream chunk response
rileytomasek Oct 24, 2023
8bb9276
Add streaming fix from v1 branch
rileytomasek Oct 30, 2023
fc73480
Bump version to beta v5
rileytomasek Oct 30, 2023
9e93ad0
Bump to beta 6
rileytomasek Nov 1, 2023
ffa29ed
Fix absolute import in openai-types (#33)
transitive-bullshit Nov 3, 2023
861a725
Bump to beta.7
rileytomasek Nov 3, 2023
0be310c
feat: Update openai to latest after dev day 2023
transitive-bullshit Nov 7, 2023
9802b23
Clean up ky imports
transitive-bullshit Nov 7, 2023
079184a
feat: bump to beta 8
transitive-bullshit Nov 7, 2023
1adb936
Merge branch 'master' into v2
transitive-bullshit Nov 7, 2023
bae240a
v2 remove all absolute imports from openai-types and package.json scr…
transitive-bullshit Nov 7, 2023
Support passing custom headers to a single request
rileytomasek committed Oct 22, 2023
commit a3503a330c6c44dac728c1afc17a3edd2b6c4be4
32 changes: 24 additions & 8 deletions src/openai-client.ts
@@ -39,8 +39,11 @@ export type ConfigOpts = {
   kyOptions?: KyOptions;
 };
 
+/** Override the default Ky options for a single request. */
+type RequestOpts = { headers?: KyOptions['headers'] };
+
 export class OpenAIClient {
-  api: ReturnType<typeof createApiInstance>;
+  private api: ReturnType<typeof createApiInstance>;
 
   constructor(opts: ConfigOpts = {}) {
     const process = globalThis.process || { env: {} };
@@ -58,9 +61,16 @@ export class OpenAIClient {
     });
   }
 
+  private getApi(opts?: RequestOpts) {
+    return opts ? this.api.extend(opts) : this.api;
+  }
+
   /** Create a completion for a chat message. */
-  async createChatCompletion(params: ChatParams): Promise<ChatResponse> {
-    const response: OpenAI.ChatCompletion = await this.api
+  async createChatCompletion(
+    params: ChatParams,
+    opts?: RequestOpts,
+  ): Promise<ChatResponse> {
+    const response: OpenAI.ChatCompletion = await this.getApi(opts)
       .post('chat/completions', { json: params })
       .json();
     return response;
@@ -69,8 +79,9 @@ export class OpenAIClient {
   /** Create a chat completion and stream back partial progress. */
   async streamChatCompletion(
     params: ChatStreamParams,
+    opts?: RequestOpts,
   ): Promise<ChatStreamResponse> {
-    const response = await this.api.post('chat/completions', {
+    const response = await this.getApi(opts).post('chat/completions', {
       json: { ...params, stream: true },
       onDownloadProgress: () => {}, // trick ky to return ReadableStream.
     });
@@ -85,8 +96,9 @@ export class OpenAIClient {
   /** Create completions for an array of prompt strings. */
   async createCompletions(
     params: CompletionParams,
+    opts?: RequestOpts,
   ): Promise<CompletionResponse> {
-    const response: OpenAI.Completion = await this.api
+    const response: OpenAI.Completion = await this.getApi(opts)
       .post('completions', { json: params })
       .json();
     return response;
@@ -95,8 +107,9 @@ export class OpenAIClient {
   /** Create a completion for a single prompt string and stream back partial progress. */
   async streamCompletion(
     params: CompletionStreamParams,
+    opts?: RequestOpts,
   ): Promise<CompletionStreamResponse> {
-    const response = await this.api.post('completions', {
+    const response = await this.getApi(opts).post('completions', {
       json: { ...params, stream: true },
       onDownloadProgress: () => {}, // trick ky to return ReadableStream.
     });
@@ -107,8 +120,11 @@ export class OpenAIClient {
   }
 
   /** Create an embedding vector representing the input text. */
-  async createEmbeddings(params: EmbeddingParams): Promise<EmbeddingResponse> {
-    const response: OpenAI.CreateEmbeddingResponse = await this.api
+  async createEmbeddings(
+    params: EmbeddingParams,
+    opts?: RequestOpts,
+  ): Promise<EmbeddingResponse> {
+    const response: OpenAI.CreateEmbeddingResponse = await this.getApi(opts)
       .post('embeddings', { json: params })
       .json();
     return response;