Add transport compression to attic push #43

Open

DarkKirb wants to merge 2 commits into main
Conversation

@DarkKirb (Contributor) commented Apr 7, 2023

My buildserver is behind a 4 Mbps uplink, so it would be very beneficial for me if attic supported transport compression. This pull request adds transport compression to both the server and the client.

It adds support for the Content-Encoding header to /api/v1/upload-path, with the following compression formats:

  • Deflate (standard)
  • Gzip (standard)
  • Brotli (standard)
  • xz (supported by attic already)
  • zstd (supported by attic already)

As per the standard, it compresses the whole request body, including the narinfo file; X-Attic-Nar-Info-Preamble-Size refers to the byte offset in the decompressed body. Ideally this would reuse the request decompression middleware from tower-http, but I didn't figure out how to get it to work because of differing body types.
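For illustration, here is a minimal sketch of the kind of server-side dispatch this implies, assuming the tokio-flavoured `async_compression` decoders; the helper name and exact wiring are hypothetical and not the PR's actual code:

```rust
use async_compression::tokio::bufread::{
    BrotliDecoder, GzipDecoder, XzDecoder, ZlibDecoder, ZstdDecoder,
};
use tokio::io::{AsyncBufRead, AsyncRead};

/// Hypothetical helper: wrap the raw request body reader in a decoder
/// selected from the Content-Encoding header. HTTP "deflate" is the zlib
/// container format, hence ZlibDecoder.
fn wrap_decoder<R>(encoding: Option<&str>, body: R) -> Box<dyn AsyncRead + Send + Unpin>
where
    R: AsyncBufRead + Send + Unpin + 'static,
{
    match encoding {
        Some("deflate") => Box::new(ZlibDecoder::new(body)),
        Some("gzip") => Box::new(GzipDecoder::new(body)),
        Some("br") => Box::new(BrotliDecoder::new(body)),
        Some("xz") => Box::new(XzDecoder::new(body)),
        Some("zstd") => Box::new(ZstdDecoder::new(body)),
        // No Content-Encoding (or "identity"): pass the body through as-is.
        _ => Box::new(body),
    }
}
```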

The client portion of the change adds a configuration option that controls transport compression to the server. The option is scoped per server, since, for example, an attic server running on the LAN might not need transport compression.
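As a rough sketch of what a per-server option could look like (the struct and field names here are illustrative, not necessarily the PR's):

```rust
use serde::Deserialize;

/// Illustrative per-server client configuration; the real attic config
/// struct and option name in the PR may differ.
#[derive(Debug, Deserialize)]
pub struct ServerConfig {
    /// Base endpoint of the attic server.
    pub endpoint: String,
    /// Transport compression to use for uploads to this server,
    /// e.g. "zstd" or "gzip". Left unset for servers where compression
    /// is not worthwhile (for example, on a fast LAN).
    #[serde(default)]
    pub compression: Option<String>,
}
```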

The client compression code is a bit suboptimal, because async_compression and reqwest use different types, and the more efficient ways of adapting an AsyncRead into a Stream require tokio (which, from what I can tell, would make it harder to adapt a stream into an AsyncRead first).
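One common pattern for this adaptation, sketched here under the assumption of a tokio-based AsyncRead source (this is not the PR's exact code, and the function is hypothetical):

```rust
use async_compression::tokio::bufread::ZstdEncoder;
use reqwest::Body;
use tokio::io::{AsyncRead, BufReader};
use tokio_util::io::ReaderStream;

/// Hypothetical helper: compress a NAR reader on the fly with zstd and
/// adapt it into a streaming reqwest request body.
fn compressed_body<R>(nar: R) -> Body
where
    R: AsyncRead + Send + Sync + Unpin + 'static,
{
    // async_compression's bufread encoders want an AsyncBufRead source.
    let encoder = ZstdEncoder::new(BufReader::new(nar));
    // ReaderStream turns the AsyncRead encoder into a Stream of Bytes,
    // which reqwest can send as a chunked request body.
    Body::wrap_stream(ReaderStream::new(encoder))
}
```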

@DarkKirb (Contributor, Author) commented

I have deployed it in prod and, with default settings, it seems to give me roughly a 2x throughput improvement on most larger derivations (1 MB/s instead of 500 kB/s) and no change on derivations that are mostly incompressible data (tarballs, for example). I think I can still improve the compression ratio and therefore the speed.

@gador commented Jul 9, 2024

This looks like a really cool feature. Can you rebase it to the current main branch?
