reddit-html-archiver vs bulk-downloader-for-reddit

| | reddit-html-archiver | bulk-downloader-for-reddit |
|---|---|---|
| Mentions | 12 | 80 |
| Stars | 165 | 2,205 |
| Growth | - | - |
| Activity | 1.8 | 0.0 |
| Last commit | almost 4 years ago | 3 months ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
reddit-html-archiver
- /r/planetside will be going private on June 12th, and will not be coming back until Reddit reverses course on API pricing
Other options, like https://github.com/libertysoft3/reddit-html-archiver, are not working anymore (I tried using it to create a self-hosted /r/planetside backup).
- This Reddit Community Has Been Archived
Well done, now you should make it sane. No need to reinvent the wheel here. Just rewrite reddit-html-archiver to use the raw json from redarcs rather than the pushshift api.
- r/okbuddyretard will be "completely wiped from existence" according to one of the mods
I've seen several banned subs archived using https://github.com/libertysoft3/reddit-html-archiver
- What are Your favorite tools to backup reddit data? (Text Posts, Media Content, Comments..)
- Archiving as much of Soundgasm as possible
https://github.com/libertysoft3/reddit-html-archiver can accomplish step 1 out of the box. Parse for every line including soundgasm and/or other domains you are targeting, and maybe run a dedupe on the list before downloading, to lighten the load on yt-dl, since it wasn't optimized for that last I checked (which was years ago, FWIW).
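The parse-and-dedupe step described in that comment can be sketched in a few lines of Python. The domain list and file names here are placeholders, not part of either tool:

```python
def extract_targets(lines, domains=("soundgasm.net",)):
    """Return URLs mentioning any target domain, deduplicated, order preserved."""
    seen = set()
    targets = []
    for line in lines:
        for token in line.split():
            # Keep any whitespace-separated token that contains a target domain,
            # skipping tokens we have already collected.
            if any(domain in token for domain in domains) and token not in seen:
                seen.add(token)
                targets.append(token)
    return targets


# Example: write the deduplicated list to a batch file that yt-dl can
# consume with its --batch-file / -a option:
# with open("archive_dump.txt") as src, open("urls.txt", "w") as dst:
#     dst.write("\n".join(extract_targets(src)))
```

Deduplicating before the download step avoids re-fetching the same audio when a link appears in multiple archived threads.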
- I’m leaving Reddit. If there’s a mass movement to do something about what’s happening, let me know.
- /r/NoNewNormal has been banned by Reddit. A good reminder that Reddit is run by fascists, and that all the subreddits that petitioned for this are book-burners. Are you a developer? Help us program the alternative. See comments for details.
- Welcome my r/NoNewNormal brethren
- r/NoNewNormal has been banned!
- Is there a way I can archive the r/lounge subreddit?
You could try using https://github.com/libertysoft3/reddit-html-archiver which is the software we use to power our reddit archiving efforts over at https://the-eye.eu/r/
bulk-downloader-for-reddit
- BDFR skipping Reddit hosted videos
- Limited Reddit access?
Until now, I ran a script every day using bulk-downloader-for-reddit to archive about ten subreddits. It usually took less than an hour each time, even when it had to download video files of a few hundred MB.
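A daily run like the one described can be sketched as a thin Python wrapper around the bdfr CLI, suitable for a cron job. The subcommand and flag names below are assumptions based on bdfr's documented usage; verify them against `bdfr --help` for your version:

```python
import subprocess

# Placeholder list of subreddits to archive; the commenter used about ten.
SUBREDDITS = ["DataHoarder", "selfhosted"]


def build_command(subreddit, out_dir="archive"):
    # bdfr's `download` subcommand saves posts and media into out_dir.
    # The `--subreddit` flag is an assumption -- check `bdfr --help`.
    return ["bdfr", "download", out_dir, "--subreddit", subreddit]


def run_all():
    # Invoke bdfr once per subreddit; check=True stops on the first failure.
    for sub in SUBREDDITS:
        subprocess.run(build_command(sub), check=True)
```

Calling `run_all()` from a daily cron entry reproduces the workflow the commenter describes.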
- Any methods with Unraid to automatically download all saved items from Reddit
Look into bdfr; it's a CLI Reddit downloader.
- Goodbye everyone
I'm with you! I've already mass edited all my comments on both and removed all my submissions on this account, and my other/older one. Just waiting on bdfr to finish pulling all of my saved things for me, then I'll delete both. I may just be a tiny drop in the user ocean, but some of my posts definitely helped others and contributed useful things.
- POLL RESULTS - Reddit API changes and the future of /r/ErgoMechKeyboards
FWIW I just backed up my sub with bulk-downloader-for-reddit : https://www.reddit.com/r/DataHoarder/comments/1479c7b/historic_reddit_archives_ongoing_archival_effort/ -> https://github.com/aliparlakci/bulk-downloader-for-reddit
- How to keep my saved
A GDPR export should have all the data, but they’re taking their time processing. An alternative is to use an archiving tool like BDFR. Unfortunately, it’s limited to 1000 posts due to API limitations.
- The future of r/ObscureMedia and Reddit
r/DataHoarder was the big reason I started using Reddit regularly. While I don't speak for all of them, many will suggest bdfr or HTTrack for all your scraping needs.
- Reddit limits the use of the API to 1000; let's work together to save the content of the StableDiffusion subreddit as a team
- Information is currently available.
https://github.com/aliparlakci/bulk-downloader-for-reddit, or Monolith, which has a Chrome extension
- So, how is everyone?
I’m going to delete my Reddit account soon to further the protest, since by this point I don’t use it much anymore. I already backed up my entire post history using this tool since there are some important memories there for me, so I basically have nothing to lose now.
What are some alternatives?
redscarepod-archive
gallery-dl - Command-line program to download image galleries and collections from several image hosting sites
saidit - The reddit open source fork powering SaidIt
UltimaScraper - Scrape content from OnlyFans and Fansly
redditPostArchiver - Easily archive important Reddit post threads onto your computer
youtube-dl - Command-line program to download videos from YouTube.com and other video sites
eternity - bypass Reddit's 1000-item listing limits by externally storing your Reddit items (saved, created, upvoted, downvoted, hidden) in your own database
bdfr-html - Converts the output of the bulk downloader for reddit to a set of HTML pages.
ripme - Downloads albums in bulk
redditDataExtractor - The reddit Data Extractor is a cross-platform GUI tool for downloading almost any content posted to reddit. Downloads from specific users, specific subreddits, and users by subreddit are supported, with filters on the content. Some intelligence is built in to attempt to avoid downloading duplicate external content.
gwaripper - Tool for conveniently downloading audios from r/gonewildaudio and similar subreddits
reddit-save - A Python tool for backing up your saved and upvoted posts on reddit to your computer.