Conversation

@matkoniecz
based on https://www.gwern.net/Archiving-URLs

If this is intended as source-available code or as a backup, then please ignore this PR.

But I am currently looking for such a tool, and while https://www.gwern.net/Archiving-URLs is magnificent, this repository does not seem to have any info on how it can be used.

Also, sorry if I misinterpreted anything.

> the Python package archivenow does something similar & as of January 2021 is probably better.

@matkoniecz
Author

BTW, this article mentions:

> It was to fix these problems that I began working on archiver—which would run constantly archiving URLs in the background, archive them into the IA as well, and be smarter about media file downloads. It has been much more satisfactory.

Is that published in this repo, and is the

> Background daemon which archives a list of URLs to the Internet Archive, archive.is, and other services

description simply outdated? Or is local backup part of "other services"?

@gwern
Owner

gwern commented Sep 12, 2023

Hm, I'm not sure about that. archivenow no longer seems well maintained despite a promising beginning; it hasn't had any patches in 2 years despite a lot of open issues, so recommending it is a bit of a pan/fire situation.


'local backup' falls under "other services": it is whatever shell command you specify.
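To make that concrete, here is a minimal sketch (not gwern's actual archiver implementation; the function name and template format are invented for illustration) of the idea: loop over a list of URLs and run a user-specified shell command on each, which is how a local backup or any other service could be plugged in.

```python
import shlex
import subprocess


def archive_urls(urls, shell_command_template):
    """Run a user-supplied shell command once per URL.

    `shell_command_template` should contain `{url}` where the URL goes,
    e.g. "wget --page-requisites {url}" for a local backup.
    Returns a list of (url, exit_code, stdout) tuples.
    """
    results = []
    for url in urls:
        # Quote the URL so shell metacharacters in it cannot inject commands.
        cmd = shell_command_template.format(url=shlex.quote(url))
        proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        results.append((url, proc.returncode, proc.stdout.strip()))
    return results


if __name__ == "__main__":
    # Harmless stand-in for a real archiving command:
    for url, code, out in archive_urls(
        ["https://example.com/a", "https://example.com/b"],
        "echo archived {url}",
    ):
        print(url, code, out)
```

A real daemon would additionally deduplicate URLs, retry on failure, and sleep between batches, but the pluggable-shell-command core is the part relevant to "other services" here.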
