| | expanse | reddit_export_userdata |
|---|---|---|
| Mentions | 19 | 4 |
| Stars | 338 | 12 |
| Growth | - | - |
| Activity | 6.3 | 10.0 |
| Latest commit | 7 months ago | over 3 years ago |
| Language | JavaScript | Python |
| License | GNU Affero General Public License v3.0 | - |
- Stars - the number of stars that a project has on GitHub.
- Growth - month-over-month growth in stars.
- Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
expanse

- How to export Reddit data in a usable format?
  > There is https://github.com/jc9108/expanse
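For context on what "export in a usable format" involves, here is a minimal sketch of the general approach using the PRAW library: fetch the authenticated user's saved items and dump them to JSON. This is an illustration only, not how expanse itself is implemented, and the credential values are placeholders for your own Reddit app's.

```python
# Minimal sketch, assuming the third-party PRAW library, of exporting
# your saved Reddit items to JSON. Not how expanse works internally.
import json


def item_to_record(item):
    """Flatten a saved submission or comment into a plain dict."""
    return {
        "id": item.id,
        "subreddit": str(item.subreddit),
        "permalink": item.permalink,
        # Submissions carry a title, comments carry a body.
        "title": getattr(item, "title", None),
        "body": getattr(item, "body", None),
    }


def export_saved(credentials, path="saved.json"):
    """Fetch every saved item for the authenticated user and dump to JSON."""
    import praw  # third-party: pip install praw

    reddit = praw.Reddit(**credentials)
    records = [item_to_record(i) for i in reddit.user.me().saved(limit=None)]
    with open(path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)
    return len(records)
```

Usage would look like `export_saved({"client_id": "...", "client_secret": "...", "username": "...", "password": "...", "user_agent": "saved-export-sketch/0.1"})`, with real values from a Reddit "script" app.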
- Expanse
  > Don't just delete your account. Use a tool to delete all your data before deleting your account, so Reddit doesn't get money for your posts from 5 years ago showing up in Google. Take your value with you when you leave.
- How can I download all my saved posts on Reddit?
- Best tools for downloading Reddit before API access is cut off?
  > I found this earlier but haven't used it yet, so I don't know whether it works: https://github.com/jc9108/expanse
- Self-hosted Reddit archiver
  > I'm using jc9108/expanse to archive what I post and what I save.
- Search your reddit saved & upvoted posts via Spyglass
- Update on Expanse, the Reddit Personal Data Archiver
- My Docker containers seem to share a Postgres database even if I didn't set them up that way
  > I have some services I host with Docker on WSL2. It looks like paperless-ngx and expanse, for example, somehow share the same Postgres database? Both are exposed publicly and both have database issues.
- Using infinity
- Cherry - an open source self-hostable bookmark service
  > Check out jc9108's GitHub for expanse. It's a Reddit archive Docker container.
reddit_export_userdata

- Looking For An App That Will Download Whole Webpages Offline (Specifically Reddit Threads)
  > You can use a cron script to run a regular export of your Reddit saves: https://github.com/dbeley/reddit_export_userdata
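As a sketch of the cron approach suggested above, the crontab entry below runs an export nightly. The install path, log path, and entry-point invocation are assumptions; adjust them to wherever you cloned the repository and to its actual run instructions.

```shell
# Hypothetical crontab entry: run a Reddit export every night at 03:30.
# Paths and the script invocation are placeholders, not the repo's
# documented command; check reddit_export_userdata's README.
30 3 * * * cd /home/user/reddit_export_userdata && python reddit_export_userdata.py >> /home/user/logs/reddit_export.log 2>&1
```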
- What are your favorite tools to back up Reddit data? (Text posts, media content, comments...)
- What are the best programs to batch convert URLs or HTML files to PDFs?
  > Here's the script: https://github.com/dbeley/reddit_export_userdata
- Save your Reddit data (saves, etc.)
What are some alternatives?

- reddit-shreddit - Program to delete your ENTIRE Reddit post and comment history, with a daily job to keep your history limited to X days.
- eternity - Bypass Reddit's 1000-item listing limits by externally storing your Reddit items (saved, created, upvoted, downvoted, hidden) in your own database.
- bdfr-html - Converts the output of the bulk downloader for reddit to a set of HTML pages.
- redditSavedDownloader - Script to export your saved submissions and comments.
- RedditDownloader - Scrapes Reddit to download media of your choice.
- reddit-html-archiver - Archive Reddit data as offline-friendly web pages.
- reddelete - Scramble your past data and automate the deletion of your Reddit post and comment history.
- ripme - Downloads albums in bulk.
- export-saved-reddit - Export saved Reddit posts into an HTML file for import into Google Chrome.
- single-file-cli - CLI tool for saving a faithful copy of a complete web page in a single HTML file (based on SingleFile).
- cherry - A self-hostable bookmark service.
- bulk-downloader-for-reddit - Downloads and archives content from Reddit.