bulk-downloader-for-reddit vs SCrawler
| | bulk-downloader-for-reddit | SCrawler |
|---|---|---|
| Mentions | 80 | 25 |
| Stars | 2,197 | 1,004 |
| Growth | - | - |
| Activity | 0.0 | 8.5 |
| Latest commit | 2 months ago | 8 days ago |
| Language | Python | Visual Basic .NET |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bulk-downloader-for-reddit
- BDFR skipping Reddit hosted videos
- Limited Reddit access?
Until now, I ran a script every day using bulk-downloader-for-reddit to archive about ten subreddits. It usually took less than an hour each time, even when it had to download video files of a few hundred MB.
- Any methods with Unraid to Automatically download all saved items from Reddit
Look into bdfr; it's a CLI Reddit downloader.
- Goodbye everyone
I'm with you! I've already mass edited all my comments on both and removed all my submissions on this account, and my other/older one. Just waiting on bdfr to finish pulling all of my saved things for me, then I'll delete both. I may just be a tiny drop in the user ocean, but some of my posts definitely helped others and contributed useful things.
- POLL RESULTS - Reddit API changes and the future of /r/ErgoMechKeyboards
FWIW I just backed up my sub with bulk-downloader-for-reddit : https://www.reddit.com/r/DataHoarder/comments/1479c7b/historic_reddit_archives_ongoing_archival_effort/ -> https://github.com/aliparlakci/bulk-downloader-for-reddit
- How to keep my saved
A GDPR export should have all the data, but they’re taking their time processing. An alternative is to use an archiving tool like BDFR. Unfortunately, it’s limited to 1000 posts due to API limitations.
- The future of r/ObscureMedia and Reddit
r/DataHoarder was the big reason I started using Reddit regularly. While I don't speak for all of them, many will suggest bdfr or HTTrack for all your scraping needs.
- Reddit limits the use of the API to 1,000; let's work together to save the content of the StableDiffusion subreddit as a team
- Information is currently available.
https://github.com/aliparlakci/bulk-downloader-for-reddit, or MONOLITH, which has a Chrome extension.
- So, how is everyone?
I’m going to delete my Reddit account soon to further the protest, since by this point I don’t use it much anymore. I already backed up my entire post history using this tool since there are some important memories there for me, so I basically have nothing to lose now.
SCrawler
- Could someone please review my reddit-img-dl command
- SCrawler: a downloader for Reddit, Twitter, Instagram, YouTube, and other sites. New major update.
- How can I mass download all posts to imgur from my reddit account?
- Downloading an entire subreddit - in 2023?
SCrawler
- Just getting into data hoarding
https://github.com/AAndyProgram/SCrawler. How can I run this on macOS?
- SCrawler: a downloader for Reddit, Twitter, Instagram, and other sites.
- Website that can mass download images/gifs/videos at original quality
- It finally happened. Something I archived was erased from the Internet.
Maybe SCrawler? If not, it shouldn't be too difficult to scrape with the API and a custom script.
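The "API and a custom script" approach mentioned above can be sketched with Reddit's public JSON listings. This is an illustrative sketch, not a complete scraper: the endpoint shape (`/r/<subreddit>/new.json` with `limit` and `after` parameters) is Reddit's public listing API, but rate limits and authentication requirements have tightened since the 2023 API changes, and a real tool should honor them.

```python
import json
import urllib.request

# Reddit rejects requests with a default/empty User-Agent, so set one.
USER_AGENT = "sub-archiver-sketch/0.1"


def listing_url(subreddit, after=None, limit=100):
    """Build the public JSON listing URL for a subreddit's newest posts."""
    url = f"https://www.reddit.com/r/{subreddit}/new.json?limit={limit}"
    if after:
        # 'after' is the fullname (e.g. "t3_abc123") of the last post seen,
        # used as a pagination cursor.
        url += f"&after={after}"
    return url


def extract_posts(listing):
    """Pull (fullname, title, url) tuples out of one parsed listing page."""
    children = listing["data"]["children"]
    return [
        (c["data"]["name"], c["data"]["title"], c["data"]["url"])
        for c in children
    ]


def fetch_page(url):
    """Download and parse one listing page (network call)."""
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

To walk a subreddit, fetch a page, feed the last post's fullname back in as `after`, and repeat. Note that listings stop at roughly 1,000 items, which is the same cap the BDFR comment above runs into.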
- SCrawler Alternative for Linux
- GitHub - AAndyProgram/SCrawler: Media downloader from any sites, including Twitter, Reddit, Instagram, XVIDEOS etc.
What are some alternatives?
gallery-dl - Command-line program to download image galleries and collections from several image hosting sites
ripme - Downloads albums in bulk
UltimaScraper - Scrape content from OnlyFans and Fansly
TumblThree - A Tumblr and Twitter Blog Backup Application
youtube-dl - Command-line program to download videos from YouTube.com and other video sites
lux - 👾 Fast and simple video download library and CLI tool written in Go
bdfr-html - Converts the output of the bulk downloader for reddit to a set of HTML pages.
facebook-scraper - Scrape Facebook public pages without an API key
redditDataExtractor - The reddit Data Extractor is a cross-platform GUI tool for downloading almost any content posted to reddit. Downloading from specific users, specific subreddits, and users by subreddit is supported, with filters on the content. Some intelligence is built in to attempt to avoid downloading duplicate external content.
reddit-save - A Python tool for backing up your saved and upvoted posts on reddit to your computer.
timesearch - The subreddit archiver