Manually backing up website via SFTP and it's taking many, many hours #16985
Unanswered
warrenlain asked this question in Q&A
Replies: 0 comments
I'm currently running into very slow behavior in Cyberduck while doing a manual backup of my website, which is not necessarily large but is made up of many small files. It has been preparing the download (without actually downloading anything) for the past six hours, and the estimated size has slowly grown over that period (10GB now). I'm okay with leaving it running overnight, but at this point I'm not sure even that will be enough time.
I'm connected over SFTP, if that context helps, and I'm using what should be the latest macOS version of Cyberduck, Version 9.1.3 (42945). Has Cyberduck attempted to address this? Am I just doing this wrong (dragging the whole group of top-level folders)? Or is this just expected behavior?
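In case it clarifies what I mean by "many small files": my rough mental model of the work involved is something like the paramiko sketch below. This is not Cyberduck's actual code, just a minimal illustration, and the host, credentials, and paths are made up. My understanding is that each small file needs its own SFTP open/read/close round trips, which is why I suspect the bottleneck is per-file overhead rather than the total 10GB.

```python
# Minimal sketch of a recursive SFTP download (NOT Cyberduck's implementation),
# just to illustrate the per-file round trips when a site is many small files.
# The host, credentials, and paths below are hypothetical.
import os
import stat

import paramiko


def download_tree(sftp: paramiko.SFTPClient, remote_dir: str, local_dir: str) -> None:
    """Recursively copy remote_dir to local_dir, one SFTP request per entry."""
    os.makedirs(local_dir, exist_ok=True)
    for entry in sftp.listdir_attr(remote_dir):
        remote_path = f"{remote_dir}/{entry.filename}"
        local_path = os.path.join(local_dir, entry.filename)
        if stat.S_ISDIR(entry.st_mode):
            download_tree(sftp, remote_path, local_path)
        else:
            # Each small file costs its own open/read/close exchanges,
            # which is where the hours go on a high-latency link.
            sftp.get(remote_path, local_path)


if __name__ == "__main__":
    transport = paramiko.Transport(("example.com", 22))        # hypothetical host
    transport.connect(username="user", password="password")    # hypothetical credentials
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        download_tree(sftp, "/var/www/mysite", "./site-backup")  # hypothetical paths
    finally:
        sftp.close()
        transport.close()
```

If that picture is roughly right, is there a recommended way to handle this in Cyberduck (more concurrent transfers, or some other setting), or is the per-file cost simply unavoidable over SFTP?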
The last time this was discussed under "Issues" the thread was closed, and that was back in 2023, so forgive me if this is not supposed to be posted here.