v3.1.0
What's Changed
[NEW] Apache Iceberg sink
A new sink that writes batches of data to an Apache Iceberg table.
It serializes incoming data batches into Parquet format and appends them to the
Iceberg table, updating the table schema as necessary.
Currently, it supports Apache Iceberg tables hosted in AWS, with AWS Glue as the data catalog.
To learn more about the Iceberg sink, see the docs.
Added by @tomas-quix in #555
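A minimal sketch of wiring the new sink into an application (the exact class names, import path, and parameters are assumptions based on the community sinks layout — see the docs for the authoritative API):

```python
from quixstreams import Application
# Assumed import path for the community Iceberg sink
from quixstreams.sinks.community.iceberg import AWSIcebergConfig, IcebergSink

# AWS connection details for the table's storage and catalog (placeholder values)
config = AWSIcebergConfig(
    aws_s3_uri="s3://my-bucket/my-table-path",
    aws_region="us-east-1",
    aws_access_key_id="<key-id>",
    aws_secret_access_key="<secret>",
)

# Sink batches to an Iceberg table registered in the AWS Glue data catalog
iceberg_sink = IcebergSink(
    table_name="glue.my_table",
    config=config,
    data_catalog_spec="aws_glue",
)

app = Application(broker_address="localhost:9092")
topic = app.topic("input-topic")

sdf = app.dataframe(topic=topic)
sdf.sink(iceberg_sink)  # batches are serialized to Parquet and appended
```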
Docs
- Update import paths in sources docs by @daniil-quix in #570
- Fix missing imports in Windowing docs by @daniil-quix in #574
- Update README.md by @mikerosam in #582
- Iceberg sink docs by @daniil-quix in #586
- Chore/docs updates by @daniil-quix in #577
Dependencies
- Update pydantic-settings requirement from <2.6,>=2.3 to >=2.3,<2.7 by @dependabot in #583
- Bump testcontainers from 4.8.1 to 4.8.2 by @dependabot in #579
Misc
- Abstract away the state update cache by @daniil-quix in #576
- Add Conda release script by @gwaramadze in #571
- app: Add option to select store backend by @quentin-quix in #544
- Refactor WindowedRocksDBPartitionTransaction.get_windows by @gwaramadze in #558
New Contributors
- @tomas-quix made their first contribution in #555
Full Changelog: v3.0.0...v3.1.0