This repository contains a Rust library for downloading, uploading, and manipulating datasets in the Precomputed format popularized by CloudVolume, Neuroglancer, and TensorStore.
| Feature | Description | Status |
|---|---|---|
| Download info | Download and parse the `info` manifest files | ✅ |
| Download chunks | Download data chunks from a Precomputed dataset | ✅ |
| Upload chunks | Upload data chunks to a Precomputed dataset | 🚧 |
| Create datasets | Create new Precomputed datasets with specified parameters | 🚧 |
| Multiscale | Handle multiscale datasets | 🚧 |
| Compression | Support for compressed data chunks (e.g., gzip) | 🚧 |
| Authentication | Support for authenticated access to private datasets | 🚧 |
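For reference, a Precomputed dataset is described by a JSON `info` manifest stored at the dataset root. The sketch below shows a minimal single-scale manifest for an EM image volume; the specific sizes, resolution, and chunk dimensions are illustrative values, not taken from any particular dataset:

```json
{
  "type": "image",
  "data_type": "uint8",
  "num_channels": 1,
  "scales": [
    {
      "key": "64_64_30",
      "size": [1024, 1024, 256],
      "resolution": [64, 64, 30],
      "voxel_offset": [0, 0, 0],
      "chunk_sizes": [[64, 64, 64]],
      "encoding": "raw"
    }
  ]
}
```

Each entry in `scales` describes one mip level; chunk filenames are derived from the scale's `key` and the chunk's voxel extent.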
Example: open a remote volume at a given resolution and download a small cutout.

```rust
use precomputed_rs::{PrecomputedVolume, Parallelism};
use ndarray::s;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let uri = "https://bossdb-open-data.s3.amazonaws.com/witvliet2020/Dataset_8/em";
    // or use `let uri = "file:///path/to/data";` for local filesystem access
    let mip = (64.0, 64.0, 30.0); // voxel resolution (x, y, z) of the desired mip level
    let cv = PrecomputedVolume::new_at_mip(uri, mip)?
        .fill_missing(0.0)            // substitute this value for missing chunks
        .parallel(Parallelism::Auto); // let the library choose the worker count

    // Download a 512 x 512 x 16 voxel region starting at the volume origin.
    let data = cv.cutout(s![0..512, 0..512, 0..16])?;
    println!("dtype = {:?}", data.dtype());
    println!("shape = {:?}", data.shape());
    Ok(())
}
```