I am writing a function that retrieves the files from Azure Blob Storage and returns a zip file containing them. HTTP Streams seem like a perfect match for this use case: I can just start loading files one by one from storage and stream the archive to the client on the fly.
The proof of concept works, but I'm facing the following issues:
There seems to be no way to detect that the client has closed the connection. The function keeps writing data somewhere, and only after everything has been written does an error appear in the log ([Error] Executed '<function name>' (Failed, Id=<id>, Duration=54018ms)).
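For contrast, with a plain node:http server (outside the Functions host) a client disconnect is observable: the ServerResponse emits 'close' when the underlying socket is torn down, so the producer can stop. A minimal sketch of that, with an illustrative producer loop and a client that aborts mid-response:

```typescript
import * as http from "node:http";

const server = http.createServer((req, res) => {
  // Illustrative producer: keep writing chunks until told to stop.
  const timer = setInterval(() => {
    if (!res.writableEnded) res.write("chunk\n");
  }, 100);
  // 'close' fires when the connection is torn down (including a premature
  // client disconnect), so the producer can be stopped here.
  res.on("close", () => {
    clearInterval(timer);
    console.log("producer stopped"); // prints once the client goes away
    server.close();
  });
});

server.listen(0, () => {
  const port = (server.address() as any).port;
  // Simulated slow/impatient client: abort after the first chunk arrives.
  const req = http.get({ port }, (res) => {
    res.once("data", () => req.destroy());
  });
});
```

This is exactly the signal the Functions HTTP stream does not surface today.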
There is no way to implement backpressure. The "drain" event is never emitted on the stream I return in the response body: the data is consumed immediately and in full whenever I write to the stream. I tested with slow clients: the function keeps sending data while the memory used by the function app rapidly increases. Out of curiosity I looked around the codebase; the cause appears to be this code, where the response is written unconditionally, without checking the value returned by ServerResponse.write.
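The contract being asked for can be seen on a bare PassThrough: write() returns false once the internal buffer exceeds highWaterMark, and 'drain' fires after a consumer catches up. A minimal sketch of the producer side (buffer sizes are arbitrary):

```typescript
import { PassThrough } from "node:stream";
import { once } from "node:events";

async function main() {
  const pt = new PassThrough({ highWaterMark: 16 });

  // This write overfills the 16-byte buffer, so write() returns false,
  // signalling the producer to pause.
  const keepWriting = pt.write(Buffer.alloc(64));
  console.log(keepWriting); // false

  // Start consuming the readable side so the buffer can drain.
  pt.resume();
  await once(pt, "drain"); // resolves once the buffered data is flushed
  console.log("drained");
}

main();
```

A writer that ignores the false return value, as the linked code does, buffers without bound, which matches the memory growth observed with slow clients.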
For completeness, this is the PoC code I have:
```typescript
import { InvocationContext, HttpHandler, HttpRequest } from "@azure/functions";
import * as archiver from "archiver";
import * as stream from "stream";

// `containerClient` (a ContainerClient from @azure/storage-blob) and `input`
// ({ filenames, archiveName }) are assumed to be defined elsewhere in the app.
export const Download: HttpHandler = async (request: HttpRequest, context: InvocationContext) => {
    const ptStream = new stream.PassThrough({ highWaterMark: 64 * 1024 });
    ptStream.on("drain", () => {
        context.log("This is never called");
    });

    const archive = archiver("zip", { zlib: { level: 1 } });
    archive.pipe(ptStream);

    const queue = [...input.filenames];
    const processNext = async () => {
        const filename = queue.pop();
        const blockBlobClient = containerClient.getBlockBlobClient(filename);
        const downloadResponse = await blockBlobClient.download();
        const blobStream = downloadResponse.readableStreamBody as NodeJS.ReadableStream;
        archive.append(blobStream, { name: filename });
    };

    // Append blobs one at a time; finalize the archive once the queue is empty.
    archive.on("entry", (e) => {
        if (queue.length == 0) {
            return archive.finalize();
        }
        processNext();
    });
    processNext();

    return {
        body: ptStream,
        status: 200,
        headers: {
            "Content-Type": "application/zip",
            "Content-Disposition": `attachment; filename=${input.archiveName}`
        }
    };
};
```