Demonstrate the relation between chunk size and computation time / number of tasks with a simple example? Maybe even memory usage.
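A minimal sketch of that demonstration: the same computation on the same array, with only the chunk size varying, changes how many chunks (and therefore tasks) Dask has to schedule. The array shape and chunk sizes below are illustrative assumptions, not recommendations.

```python
import dask.array as da

shape = (10_000, 10_000)  # ~800 MB of float64 in total

for chunk in (10_000, 2_000, 500):
    x = da.ones(shape, chunks=(chunk, chunk))
    result = (x + 1).mean()
    print(
        f"chunks=({chunk}, {chunk}): "
        f"{x.npartitions} chunks, "
        f"{len(result.__dask_graph__())} tasks in the graph"
    )
```

Smaller chunks mean more tasks: scheduling overhead grows, but each task needs less memory, which is the trade-off the tutorial would illustrate with timings.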
This would be huge! This comes up often in Satpy where users want to process satellite images on their local machine but they only have 8GB or 16GB of memory. If someone can make a good diagram showing chunks being processed by a worker thread/process and how changing the size of all chunks or number of workers contributes to the overall memory usage that would be such a help when explaining this to users.
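A back-of-envelope version of that diagram in code: peak memory scales roughly with chunk size × number of workers × chunks held in flight per worker. The "chunks in flight" factor below is an assumption for illustration, not a Dask guarantee; the real number depends on the task graph and scheduler.

```python
import numpy as np

chunk_shape = (2_000, 2_000)              # hypothetical chunk size
itemsize = np.dtype("float64").itemsize   # 8 bytes per element
chunk_bytes = int(np.prod(chunk_shape)) * itemsize

n_workers = 4                 # e.g. worker threads on a laptop
chunks_in_flight = 2          # assumed chunks resident per worker at once

estimate = chunk_bytes * n_workers * chunks_in_flight
print(f"one chunk: {chunk_bytes / 1e6:.0f} MB; "
      f"rough peak: {estimate / 1e9:.2f} GB")
```

Halving the chunk edge length cuts per-chunk memory by 4×, which is exactly the lever users with 8 GB or 16 GB machines need to understand.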
From the Pangeo working meeting discussion with @mgrover1 @jmunroe @norlandrhagen
Here's an outline for an intermediate tutorial on Dask chunking, aimed specifically at Xarray users:
- Motivation: why care about chunk size?
- Keeping track of chunks
- Why is it important to choose appropriate chunks early in the pipeline?
- Specify chunks when reading data
  - `chunks="auto"`
  - `.chunks`
  - during data read: `open_dataset`, `open_mfdataset`
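A sketch of what the read-time chunking section could show, using a small synthetic file so it is self-contained (the file names, variable name, and dimension names are placeholders):

```python
import numpy as np
import xarray as xr

# Create and save a small stand-in dataset:
ds = xr.Dataset({"tas": (("time", "lat", "lon"), np.zeros((40, 20, 30)))})
ds.to_netcdf("example.nc")

# chunks="auto" lets Dask pick chunk sizes; a dict chunks per dimension:
auto = xr.open_dataset("example.nc", chunks="auto")
manual = xr.open_dataset("example.nc", chunks={"time": 10})

# .chunks shows the resulting chunk layout along each dimension:
print(manual["tas"].chunks)

# open_mfdataset reads many files into one lazily chunked dataset:
ds.isel(time=slice(0, 20)).to_netcdf("part_0.nc")
ds.isel(time=slice(20, 40)).to_netcdf("part_1.nc")
combined = xr.open_mfdataset(
    "part_*.nc", combine="nested", concat_dim="time", chunks={"time": 10}
)
print(combined["tas"].chunks)
```

Passing `chunks` at open time means every downstream operation inherits a sensible chunking, which is the "choose chunks early in the pipeline" point above.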