This repository has been archived by the owner on Apr 12, 2021. It is now read-only.

Retrieving and processing multiple batches in parallel #21

Open
joeyeng opened this issue Apr 9, 2021 · 1 comment

Comments

joeyeng commented Apr 9, 2021

I see in issue #13 that QueueBatch doesn't respect host.json for the maxPollingInterval setting. Does it also ignore the rest of the host.json settings for queues?

"visibilityTimeout": "00:00:05",
"batchSize": 16,
"maxDequeueCount": 5,
"newBatchThreshold": 8,
"messageEncoding": "base64"

In particular, I'm interested in setting the number of batches that can be retrieved and processed in parallel; the equivalent setting for single messages is the batchSize parameter. I'm also interested in newBatchThreshold, which fetches a new batch once the number of messages still being processed drops below that threshold. Is there an equivalent for these settings on the attribute, or does QueueBatch already respect them from host.json?
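For reference, and assuming the standard Azure Functions v2+ host.json layout rather than anything QueueBatch-specific, these queue settings normally sit under extensions.queues; a minimal sketch combining the values quoted above with an assumed maxPollingInterval:

{
  "version": "2.0",
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:05",
      "batchSize": 16,
      "maxDequeueCount": 5,
      "newBatchThreshold": 8,
      "messageEncoding": "base64"
    }
  }
}

With the built-in Storage Queues trigger, batchSize caps how many messages are retrieved per get, and newBatchThreshold triggers the next fetch once the count of in-flight messages drops below it, so up to batchSize + newBatchThreshold messages can be processed concurrently per function instance. Whether QueueBatch honors any of these values is the open question here.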

Scooletz (Owner) commented
Hi there. Thank you for raising this, but it's been a while since I took a look at this repo and I can't answer your question off the top of my head. This reminded me that I probably should have archived it a while ago so as not to set the wrong expectations.

Just so I don't leave you hanging: when I was working on it I didn't do much to respect host.json, so this is highly likely a no.
