Reddit-Archive-Host vs reddit-html-archiver

| | Reddit-Archive-Host | reddit-html-archiver |
|---|---|---|
| Mentions | 4 | 12 |
| Stars | 53 | 165 |
| Growth | - | - |
| Activity | 0.0 | 1.8 |
| Last commit | over 5 years ago | almost 4 years ago |
| Language | Python | Python |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Reddit-Archive-Host mentions:

- sub status update: "Check out this tool: https://github.com/DrPugsley/Reddit-Archive-Host"
- What are Your favorite tools to backup reddit data? (Text Posts, Media Content, Comments..)
- Is there a way to get into reddit premium only subreddits?: "https://github.com/DrPugsley/Reddit-Archive-Host this seems promising, but there are probably better scripts too"
- Are there any coding geeks here, that can figure out how to make an offline version of /r/Aspergers, for whenever the internet goes down?: "https://github.com/DrPugsley/Reddit-Archive-Host works pretty well"
reddit-html-archiver mentions:

- /r/planetside will be going private on June 12th, and will not be coming back until Reddit reverses course on API pricing: "Other options, like https://github.com/libertysoft3/reddit-html-archiver, are not working anymore (I tried it to create a self-hosted /r/planetside backup)."
- This Reddit Community Has Been Archived: "Well done, now you should make it sane. No need to reinvent the wheel here. Just rewrite reddit-html-archiver to use the raw JSON from redarcs rather than the Pushshift API."
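The suggestion in that comment, reading raw JSON dumps instead of querying the Pushshift API, might be sketched roughly as below. This assumes the dumps are newline-delimited JSON with Pushshift-style field names; `iter_submissions` and `to_post_record` are hypothetical helper names, not part of reddit-html-archiver:

```python
import json
from typing import Iterator


def iter_submissions(dump_path: str) -> Iterator[dict]:
    """Yield one submission dict per line from a newline-delimited
    JSON dump (the format Pushshift-style archives typically use)."""
    with open(dump_path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            yield json.loads(line)


def to_post_record(raw: dict) -> dict:
    """Map a raw submission onto the minimal fields an HTML writer
    needs. Key names mirror common Pushshift keys; adjust to the dump."""
    return {
        "id": raw.get("id"),
        "title": raw.get("title", ""),
        "author": raw.get("author", "[deleted]"),
        "score": raw.get("score", 0),
        "selftext": raw.get("selftext", ""),
        "created_utc": raw.get("created_utc"),
    }
```

The upside of this approach is that the archiver no longer depends on a live API: the dump file is the single source of truth and the run is fully reproducible offline.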
- r/okbuddyretard will be "completely wiped from existence" according to one of the mods: "I've seen several banned subs archived using https://github.com/libertysoft3/reddit-html-archiver"
- What are Your favorite tools to backup reddit data? (Text Posts, Media Content, Comments..)
- Archiving as much of Soundgasm as possible: "https://github.com/libertysoft3/reddit-html-archiver can accomplish step 1 out of the box. Parse every line containing soundgasm and/or the other domains you are targeting, and maybe run a dedupe on the list before downloading to lighten the load on yt-dl, since it wasn't optimized for that the last time I checked (which was years ago, fwiw)."
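The parse-and-dedupe step that comment describes might look like the following sketch. `extract_target_urls` is an illustrative name, and the `TARGET_DOMAINS` set is an assumption to be adjusted for whatever hosts you are actually targeting:

```python
from urllib.parse import urlparse

# Assumed target set; extend with the other domains you are after.
TARGET_DOMAINS = {"soundgasm.net"}


def extract_target_urls(lines, domains=TARGET_DOMAINS):
    """Keep only URLs whose host matches a target domain, deduplicated
    while preserving first-seen order (so yt-dl isn't fed repeats)."""
    seen = set()
    out = []
    for line in lines:
        for token in line.split():
            if not token.startswith(("http://", "https://")):
                continue
            host = urlparse(token).netloc.lower()
            if any(host == d or host.endswith("." + d) for d in domains):
                if token not in seen:
                    seen.add(token)
                    out.append(token)
    return out
```

Writing the result one URL per line gives a batch file you can hand to the downloader, e.g. `yt-dlp -a urls.txt`.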
- I’m leaving Reddit. If there’s a mass movement to do something about what’s happening, let me know.
- /r/NoNewNormal has been banned by Reddit. A good reminder that Reddit is run by fascists, and that all the subreddits that petitioned for this are book-burners. Are you a developer? Help us program the alternative. See comments for details.
- Welcome my r/NoNewNormal brethren
- r/NoNewNormal has been banned!
- Is there a way I can archive the r/lounge subreddit?: "You could try using https://github.com/libertysoft3/reddit-html-archiver, which is the software we use to power our reddit archiving efforts over at https://the-eye.eu/r/"
What are some alternatives?
eternity - bypass Reddit's 1000-item listing limits by externally storing your Reddit items (saved, created, upvoted, downvoted, hidden) in your own database
redscarepod-archive
redditSavedDownloader - Script to export your saved submissions and comments
saidit - The open-source Reddit fork powering SaidIt
export-saved-reddit - Export saved Reddit posts into a HTML file for import into Google Chrome.
redditPostArchiver - Easily archive important Reddit post threads onto your computer
ripme - Downloads albums in bulk
gwaripper - Tool for conveniently downloading audios from r/gonewildaudio and similar subreddits
reddit_export_userdata - Export userdata from your reddit accounts. Submissions, comments, saved, upvoted contents are supported.
redditDataExtractor - A cross-platform GUI tool for downloading almost any content posted to reddit. It supports downloads from specific users, specific subreddits, and users by subreddit, with filters on the content, and has some built-in intelligence to avoid downloading duplicate external content.