The easiest way to download Instagram photos, posts, and videos.
$ gem install instagram-crawler
export sessionid=[your instagram sessionid]
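Once the gem is installed and the session cookie is exported, you can sanity-check the setup with the documented version and help flags (optional, just a quick check):
instagram-crawler -v
instagram-crawler -h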
-u || --username
instagram-crawler -u <user_name>
-a || --after
instagram-crawler -u <user_name> -d -a 20181120
-b || --before
instagram-crawler -u <user_name> -d -b 20181120
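The two date filters should compose into a date range; for example, to download only files posted between two dates (the dates below are illustrative):
instagram-crawler -u <user_name> -d -a 20181101 -b 20181120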
-l || --log
instagram-crawler -u <user_name> -l
-P || --proxyname
-p || --port
instagram-crawler -u <user_name> -P http://example.com -p 1234
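Per the help output below, the port defaults to 8080, so -p can presumably be omitted when your proxy listens on that port:
instagram-crawler -u <user_name> -P http://example.com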
instagram-crawler -h | --help
Usage:
instagram-crawler [options]
See https://github.com/mgleon08/instagram-crawler for more information.
options:
-u, --username USERNAME Instagram username
-d, --download Download files
-a, --after DATE Download files after this date (YYYYMMDD)
-b, --before DATE Download files before this date (YYYYMMDD)
-l, --log Generate a log file in the current directory
-P, --proxyname PROXYNAME Specify proxyname of your proxy server
-p, --port PORT Specify port of your proxy server (default port: 8080)
-v, --version Show the instagram-crawler version
-h, --help Show this message
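Putting the options together, a single run can download a user's files within a date range, write a log, and route traffic through a proxy (a sketch composed from the documented flags; the username, dates, and proxy are placeholders):
instagram-crawler -u <user_name> -d -a 20181101 -b 20181201 -l -P http://example.com -p 1234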
# Make sure the sessionid environment variable is already set
# You can set sessionid in your local shell and pass it to Docker via -e sessionid=$sessionid
# $PWD/instagram-crawler is the path where downloaded files are stored
# Pull the image
docker pull mgleon08/instagram-crawler
# Run the container
docker run -it --rm -v $PWD/instagram-crawler:/instagram-crawler -e sessionid=$sessionid --name marvel mgleon08/instagram-crawler -u marvel -a 20181124 -d -l
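# After the container exits, the downloaded files should appear in the mounted host directory (assuming the volume mapping above)
ls $PWD/instagram-crawler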
9. You must not access Instagram's private API by any means other than the Instagram application itself.
10. You must not crawl, scrape, or otherwise cache any content from Instagram, including but not limited to user profiles and photos.
Bug reports and pull requests are welcome on GitHub at https://github.com/mgleon08/instagram-crawler/pulls
- Copyright (c) 2018 Leon Ji. See LICENSE.txt for further details.
- The gem is available as open source under the terms of the MIT License.