chore: Improve OpenAI mock server streams #1890
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@           Coverage Diff           @@
##             main    #1890   +/-  ##
=======================================
  Coverage   96.87%   96.87%
=======================================
  Files         209      209
  Lines       39768    39768
=======================================
  Hits        38527    38527
  Misses       1241     1241
```
```diff
@@ -105,28 +128,25 @@ function goodStream(dataToStream, chunkTemplate) {
        this.push(null)
      }
    }
-  })
+  }).pause()
```
OK, so because the stream was finished we couldn't destroy it, which caused the OpenAI library to not handle the errors properly? Do we need to pause the stream in the good case, though?
Streams start filling up their internal buffer as soon as they are created. Hitting pause on one until the `stream.pipe(res)` happens just gives some breathing room between stream creation and consumption. I am convinced it is necessary for the error-case stream. I am open to removing `.pause()` from the normal case, though, if you feel strongly about it.
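For context, here is a minimal sketch of the pattern under discussion, mirroring the `goodStream` helper shown in the diff above. It assumes Node's `stream.Readable`; the route-handler usage at the bottom is illustrative, not the PR's exact code:

```js
const { Readable } = require('node:stream')

// Illustrative version of the helper: the stream is created and then
// immediately paused so it sits idle until a consumer pipes from it.
function goodStream(dataToStream, chunkTemplate) {
  return new Readable({
    read() {
      for (const data of dataToStream) {
        this.push(chunkTemplate(data)) // queue each formatted chunk
      }
      this.push(null) // signal a clean end-of-stream
    }
  }).pause() // breathing room until .pipe(res) resumes the stream
}

// Later, in the route handler, piping resumes the paused stream:
// goodStream(chunks, template).pipe(res)
```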
OK, makes sense. I don't feel strongly about it, just calling it out. I'll approve.
The OpenAI folks helped me realize that I hadn't quite designed the error-stream case correctly in our mock server. This PR refactors the mock server to induce an error on the client side that the OpenAI client handles correctly and, in turn, bubbles up to our code as a legitimate connection error.
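As a rough illustration of that approach (an assumption-laden sketch, not the PR's actual implementation; the handler shape and helper names are hypothetical):

```js
// Hypothetical error-case handler: send one well-formed chunk so the
// client begins consuming, then sever the connection instead of ending
// the response cleanly. The abrupt close surfaces in the OpenAI client
// as a genuine connection error, which bubbles up to the caller.
function streamThenDie(res, chunks, chunkTemplate) {
  res.writeHead(200, { 'content-type': 'text/event-stream' })
  res.write(chunkTemplate(chunks[0])) // a valid first chunk
  res.destroy() // kill the socket mid-stream to simulate a failure
}
```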