Reraise exception from background task #2696

Open · wants to merge 2 commits into master
30 changes: 8 additions & 22 deletions starlette/middleware/base.py
@@ -60,32 +60,20 @@ async def wrapped_receive(self) -> Message:
     if getattr(self, "_body", None) is not None:
         # body() was called, we return it even if the client disconnected
         self._wrapped_rcv_consumed = True
-        return {
-            "type": "http.request",
-            "body": self._body,
-            "more_body": False,
-        }
+        return {"type": "http.request", "body": self._body, "more_body": False}
Author comment: Ignore this. It was just driving me crazy.
     elif self._stream_consumed:
         # stream() was called to completion
         # return an empty body so that downstream apps don't hang
         # waiting for a disconnect
         self._wrapped_rcv_consumed = True
-        return {
-            "type": "http.request",
-            "body": b"",
-            "more_body": False,
-        }
+        return {"type": "http.request", "body": b"", "more_body": False}
Author comment on lines -73 to +69: Ignore this. It was just driving me crazy.

     else:
         # body() was never called and stream() wasn't consumed
         try:
             stream = self.stream()
             chunk = await stream.__anext__()
             self._wrapped_rcv_consumed = self._stream_consumed
-            return {
-                "type": "http.request",
-                "body": chunk,
-                "more_body": not self._stream_consumed,
-            }
+            return {"type": "http.request", "body": chunk, "more_body": not self._stream_consumed}
Author comment on lines -84 to +76: Ignore this. It was just driving me crazy.

         except ClientDisconnect:
             self._wrapped_rcv_disconnected = True
             return {"type": "http.disconnect"}
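The `http.request` messages built in this hunk follow the ASGI convention: a downstream consumer keeps calling `receive()` and accumulating `body` until a message arrives with `more_body` false (or an `http.disconnect`). A minimal, self-contained sketch of such a consumer — stdlib `asyncio` and a hypothetical `read_body` helper, not Starlette's actual code:

```python
import asyncio

async def read_body(receive) -> bytes:
    """Drain an ASGI receive channel into a single bytes body."""
    body = b""
    while True:
        message = await receive()
        if message["type"] == "http.disconnect":
            break  # client went away; return what we have
        body += message.get("body", b"")
        if not message.get("more_body", False):
            break  # final chunk received
    return body

async def main() -> bytes:
    # Simulated receive() yielding the message shapes wrapped_receive returns.
    chunks = [
        {"type": "http.request", "body": b"hel", "more_body": True},
        {"type": "http.request", "body": b"lo", "more_body": False},
    ]
    it = iter(chunks)

    async def receive():
        return next(it)

    return await read_body(receive)

print(asyncio.run(main()))
# → b'hello'
```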
@@ -148,6 +136,8 @@ async def coro() -> None:
     try:
         await self.app(scope, receive_or_disconnect, send_no_error)
     except Exception as exc:
+        # import traceback
+        # traceback.print_exc()
Author comment on lines +139 to +140: If you uncomment those lines, you can see that we have the exceptions in `app_exc`, but we don't reraise them at any point.
app_exc = exc

task_group.start_soon(close_recv_stream_on_response_sent)
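The bug under discussion reduces to a small stdlib-`asyncio` sketch (not Starlette's actual code; the `coro` name just mirrors the diff): an exception caught inside the background coroutine is only *stored* in `app_exc`, so unless the caller re-raises it explicitly, it is silently lost.

```python
import asyncio

async def main() -> None:
    app_exc = None  # holds the exception captured by the background task

    async def coro() -> None:
        nonlocal app_exc
        try:
            raise RuntimeError("boom in background task")
        except Exception as exc:
            app_exc = exc  # captured here, but nobody re-raises it yet

    task = asyncio.create_task(coro())
    await task  # coro swallowed its own exception, so this returns normally

    # The fix this PR argues for: re-raise after the task finishes,
    # otherwise the error never surfaces to the caller.
    if app_exc is not None:
        raise app_exc

try:
    asyncio.run(main())
except RuntimeError as e:
    print(f"re-raised: {e}")
# → re-raised: boom in background task
```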
@@ -175,6 +165,8 @@ async def body_stream() -> typing.AsyncGenerator[bytes, None]:
             if not message.get("more_body", False):
                 break

+    await anyio.sleep(0)
Author comment: It looks like this is enough to make the switch, so we have time to run the coro task after the break above. This shouldn't be the way to solve this issue... Also, it breaks one of the tests.

     if app_exc is not None:
         raise app_exc

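Why a zero-length sleep helps at all: `await anyio.sleep(0)` (like `asyncio.sleep(0)`) is a checkpoint that yields control to the event loop, giving other ready tasks — here, the `coro` task that may set `app_exc` — a chance to run before execution continues. A stdlib illustration of that scheduling behavior (assuming asyncio semantics, not anyio's; the task names are hypothetical):

```python
import asyncio

async def main() -> list:
    order = []

    async def background() -> None:
        order.append("background ran")

    # Schedule the task but don't await it; it only runs once we yield.
    asyncio.create_task(background())
    order.append("before checkpoint")
    await asyncio.sleep(0)  # checkpoint: event loop runs ready tasks here
    order.append("after checkpoint")
    return order

print(asyncio.run(main()))
# → ['before checkpoint', 'background ran', 'after checkpoint']
```

Without the `sleep(0)` the background task would not have run by the time `main` returns its list, which mirrors why the re-raise above never fires without a checkpoint.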
@@ -211,13 +203,7 @@ def __init__(
 async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
     if self.info is not None:
         await send({"type": "http.response.debug", "info": self.info})
-    await send(
-        {
-            "type": "http.response.start",
-            "status": self.status_code,
-            "headers": self.raw_headers,
-        }
-    )
+    await send({"type": "http.response.start", "status": self.status_code, "headers": self.raw_headers})

     async for chunk in self.body_iterator:
         await send({"type": "http.response.body", "body": chunk, "more_body": True})
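The `__call__` above emits the standard ASGI response sequence: one `http.response.start`, then `http.response.body` chunks with `more_body: True`, closed by a final message with `more_body: False`. A toy app with a recording `send` callable (hypothetical names, not from this PR) showing that sequence end to end:

```python
import asyncio

async def app(scope, receive, send) -> None:
    """Minimal ASGI app emitting start + chunked body + terminating message."""
    assert scope["type"] == "http"
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"text/plain")]})
    for chunk in (b"hello, ", b"world"):
        await send({"type": "http.response.body", "body": chunk, "more_body": True})
    # Final empty-body message signals the end of the response.
    await send({"type": "http.response.body", "body": b"", "more_body": False})

async def main() -> list:
    sent = []

    async def record(message) -> None:
        sent.append(message)  # capture messages instead of a real transport

    await app({"type": "http"}, None, record)
    return sent

messages = asyncio.run(main())
print([m["type"] for m in messages])
# → ['http.response.start', 'http.response.body', 'http.response.body', 'http.response.body']
```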