reddit_export_userdata vs reddit-html-archiver

| | reddit_export_userdata | reddit-html-archiver |
|---|---|---|
| Mentions | 4 | 12 |
| Stars | 12 | 165 |
| Growth | - | - |
| Activity | 10.0 | 1.8 |
| Last commit | over 3 years ago | almost 4 years ago |
| Language | Python | Python |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
reddit_export_userdata
- Looking For An App That Will Download Whole Webpages Offline (Specifically Reddit Threads)
  You can use a cron script to run a regular export of your Reddit saves: https://github.com/dbeley/reddit_export_userdata (see the cron sketch after this list)
- What are Your favorite tools to backup reddit data? (Text Posts, Media Content, Comments..)
- What are the best programs to batch convert URLs or HTML files to PDFs?
  Here's the script: https://github.com/dbeley/reddit_export_userdata
- Save your Reddit Data (saves, etc.)
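One mention above suggests scheduling the exporter with cron. Below is a minimal wrapper sketch that a crontab entry could call; the repository path, entry-point script name, and crontab line are illustrative assumptions, so check the reddit_export_userdata README for the actual invocation.

```python
#!/usr/bin/env python3
"""Minimal wrapper for running reddit_export_userdata on a cron schedule.

Example crontab line (every Sunday at 03:00; paths are hypothetical):
    0 3 * * 0 /usr/bin/python3 /home/me/run_export.py >> /home/me/export.log 2>&1
"""
import datetime
import subprocess

# Hypothetical clone location of https://github.com/dbeley/reddit_export_userdata
REPO_DIR = "/home/me/reddit_export_userdata"

def main():
    stamp = datetime.datetime.now().strftime("%Y-%m-%d %H:%M")
    print(f"[{stamp}] starting Reddit export")
    # Hypothetical entry point and arguments; substitute the script the repo actually ships.
    subprocess.run(["python3", "reddit_export_userdata.py"], cwd=REPO_DIR, check=True)

if __name__ == "__main__":
    main()
```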
reddit-html-archiver
- /r/planetside will be going private on June 12th, and will not be coming back until Reddit reverses course on API pricing
  Other options, like https://github.com/libertysoft3/reddit-html-archiver, are not working anymore (I tried it to create a self-hosted /r/planetside backup).
- This Reddit Community Has Been Archived
  Well done, now you should make it sane. No need to reinvent the wheel here. Just rewrite reddit-html-archiver to use the raw JSON from redarcs rather than the Pushshift API (see the dump-reading sketch after this list).
- r/okbuddyretard will be "completely wiped from existence" according to one of the mods
  I've seen several banned subs archived using https://github.com/libertysoft3/reddit-html-archiver
- What are Your favorite tools to backup reddit data? (Text Posts, Media Content, Comments..)
- Archiving as much of Soundgasm as possible
  https://github.com/libertysoft3/reddit-html-archiver can accomplish step 1 out of the box. Parse every line for soundgasm and/or other domains you are targeting, and maybe run a dedupe on the list before downloading to lighten the load on yt-dl, since it wasn't optimized for that last I checked (which was years ago, fwiw). (See the URL-collection sketch after this list.)
- I’m leaving Reddit. If there’s a mass movement to do something about what’s happening, let me know.
- /r/NoNewNormal has been banned by Reddit. A good reminder that Reddit is run by fascists, and that all the subreddits that petitioned for this are book-burners. Are you a developer? Help us program the alternative. See comments for details.
- Welcome my r/NoNewNormal bretheren
- r/NoNewNormal has been banned!
- Is there a way I can archive the r/lounge subreddit?
  You could try using https://github.com/libertysoft3/reddit-html-archiver, which is the software we use to power our Reddit archiving efforts over at https://the-eye.eu/r/
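On the suggestion to read raw JSON from redarcs instead of the Pushshift API: per-subreddit dumps of that kind are typically newline-delimited JSON compressed with zstandard, so the rewrite would mostly mean swapping API calls for a local file iterator. A minimal reading sketch, assuming that dump format (the file name is hypothetical and the `zstandard` package is a third-party dependency):

```python
import io
import json
import zstandard  # pip install zstandard

def iter_records(dump_path):
    """Yield one JSON object per line from a zstd-compressed NDJSON dump.

    Assumes the dump uses the pushshift-style format: one submission or
    comment per line, compressed with a long zstd window.
    """
    with open(dump_path, "rb") as fh:
        dctx = zstandard.ZstdDecompressor(max_window_size=2**31)
        with dctx.stream_reader(fh) as reader:
            for line in io.TextIOWrapper(reader, encoding="utf-8", errors="replace"):
                line = line.strip()
                if line:
                    yield json.loads(line)

if __name__ == "__main__":
    # Hypothetical file name; adjust to the dump you actually downloaded.
    for post in iter_records("okbuddyretard_submissions.zst"):
        print(post.get("id"), post.get("title"))
```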
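For the Soundgasm workflow (collect matching links from an archive, dedupe, then hand the list to a downloader), a minimal sketch is below. The target domain list, input directory, and downstream yt-dlp usage are assumptions; the script simply scans files for URLs on the listed domains and prints each one once.

```python
import re
import sys
from pathlib import Path

# Hypothetical domain list; add whatever other hosts you are targeting.
DOMAINS = ("soundgasm.net",)
URL_RE = re.compile(r"https?://[^\s\"'<>)\]]+")

def collect_urls(root):
    """Scan every readable file under `root` and yield each matching URL once."""
    seen = set()
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
        except OSError:
            continue
        for url in URL_RE.findall(text):
            if any(domain in url for domain in DOMAINS) and url not in seen:
                seen.add(url)
                yield url

if __name__ == "__main__":
    # Usage: python collect_urls.py path/to/archive > urls.txt
    # Then feed the deduplicated list to yt-dlp, e.g.: yt-dlp -a urls.txt
    for url in collect_urls(sys.argv[1] if len(sys.argv) > 1 else "."):
        print(url)
```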
What are some alternatives?
eternity - bypass Reddit's 1000-item listing limits by externally storing your Reddit items (saved, created, upvoted, downvoted, hidden) in your own database
redscarepod-archive
redditSavedDownloader - Script to export your saved submissions and comments
saidit - The reddit open source fork powering SaidIt
ripme - Downloads albums in bulk
redditPostArchiver - Easily archive important Reddit post threads onto your computer
single-file-cli - CLI tool for saving a faithful copy of a complete web page in a single HTML file (based on SingleFile)
bulk-downloader-for-reddit - Downloads and archives content from reddit
export-saved-reddit - Export saved Reddit posts into an HTML file for import into Google Chrome.
gwaripper - Tool for conveniently downloading audios from r/gonewildaudio and similar subreddits