Bulk upload
If you have a well-formatted bibliography file (e.g. a BibTeX file), use the import feature in the Zotero client. If you only have a smallish number of DOIs/PMIDs, use the Zotero client for those too.
We have found it can be troublesome to upload large numbers of papers (a thousand or so) to Zotero, so we have a simple upload script that helps.
Call it like:
python uploadPapers.py --input <input file> --type doi --collection <zotero collection>
This does need the file tree to be in place to work, so manually make sure the following directories exist:
/cache/shortname/processed/upload
/cache/shortname/raw/doi
/cache/shortname/raw/pubmed/xml
/cache/shortname/upload
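For example, a minimal sketch for creating these directories from Python, assuming "shortname" is replaced with your project's short name:

# Create the expected directory tree; "shortname" is a placeholder for your project's short name
import os

shortname = "shortname"  # replace with your project's short name
for d in [
    f"/cache/{shortname}/processed/upload",
    f"/cache/{shortname}/raw/doi",
    f"/cache/{shortname}/raw/pubmed/xml",
    f"/cache/{shortname}/upload",
]:
    os.makedirs(d, exist_ok=True)  # does nothing if the directory already exists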
The input file goes in /cache/shortname/upload. It has to be a file containing a list of dictionary items, like:
[
{"DOI": "10.1182/blood-2011-03-339630"},
{"DOI": "10.1182/blood-2012-03-413591"},
{"DOI": "10.1183/09031936.00068014"}
]
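If you start from a plain text file with one DOI per line, a small sketch like the one below could produce an input file in this format. The file names (dois.txt, dois.json) are just illustrative, not anything the script expects:

# Convert a plain list of DOIs (one per line) into the expected list-of-dictionaries format.
# "dois.txt" and the output path are illustrative examples only.
import json

with open("dois.txt") as f:
    dois = [line.strip() for line in f if line.strip()]

items = [{"DOI": doi} for doi in dois]

with open("/cache/shortname/upload/dois.json", "w") as f:
    json.dump(items, f, indent=2)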
Sometimes this fails with errors like "ssl.SSLError: ('The read operation timed out',)". Running it again may get past this; it is not an error tied to a specific item. It may make sense to split the input file into chunks of, say, 200, although that would take a bit of management: if one chunk fails, you would not easily be able to figure out where it went wrong. You might need to upload each chunk to its own library and then move the items across once it has finished.
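If you do decide to chunk the input, something along these lines would split it into files of 200 items each (again, the file names are illustrative):

# Split the input list into chunks of 200 items, writing one file per chunk.
# Input and output file names here are examples only.
import json

CHUNK_SIZE = 200

with open("/cache/shortname/upload/dois.json") as f:
    items = json.load(f)

for i in range(0, len(items), CHUNK_SIZE):
    chunk = items[i:i + CHUNK_SIZE]
    with open(f"/cache/shortname/upload/dois_chunk_{i // CHUNK_SIZE:03d}.json", "w") as out:
        json.dump(chunk, out, indent=2)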
You can rerun the script; Zotero will return a status of "unchanged" for items that already exist, although it does actually add them to Zotero again!