bulk-downloader-for-reddit
expanse
| | bulk-downloader-for-reddit | expanse |
|---|---|---|
| Mentions | 80 | 19 |
| Stars | 2,203 | 334 |
| Growth | - | - |
| Activity | 0.0 | 6.3 |
| Latest commit | 3 months ago | 7 months ago |
| Language | Python | JavaScript |
| License | GNU General Public License v3.0 only | GNU Affero General Public License v3.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bulk-downloader-for-reddit
- BDFR skipping Reddit hosted videos
- Limited Reddit access?
Until now, I ran a script every day using bulk-downloader-for-reddit to archive about ten subreddits. It usually took less than an hour each time, even when it had to download video files of a few hundred MB.
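A daily archiving job like the one described above can be sketched as a crontab entry driving the bdfr CLI. This is a hedged sketch, not the commenter's actual script: the target directory, subreddit names, and schedule are placeholder assumptions, and it presumes bdfr is installed (`pip install bdfr`) and configured with Reddit credentials.

```shell
# Hypothetical crontab entry: archive a few subreddits every night at 03:00.
# /mnt/archive/reddit and the subreddit names are example values.
0 3 * * * bdfr download /mnt/archive/reddit --subreddit DataHoarder --subreddit selfhosted --limit 100
```

`bdfr download` fetches linked media; `bdfr archive` instead saves post/comment metadata, and `bdfr clone` does both, so the subcommand can be swapped depending on what you want to keep.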
- Any methods with Unraid to Automatically download all saved items from Reddit
Look into bdfr; it's a CLI Reddit downloader.
- Goodbye everyone
I'm with you! I've already mass-edited all my comments and removed all my submissions on this account and my other/older one. I'm just waiting on bdfr to finish pulling all of my saved things for me, then I'll delete both. I may just be a tiny drop in the user ocean, but some of my posts definitely helped others and contributed useful things.
- POLL RESULTS - Reddit API changes and the future of /r/ErgoMechKeyboards
FWIW I just backed up my sub with bulk-downloader-for-reddit : https://www.reddit.com/r/DataHoarder/comments/1479c7b/historic_reddit_archives_ongoing_archival_effort/ -> https://github.com/aliparlakci/bulk-downloader-for-reddit
- How to keep my saved
A GDPR export should have all the data, but they’re taking their time processing. An alternative is to use an archiving tool like BDFR. Unfortunately, it’s limited to 1000 posts due to API limitations.
- The future of r/ObscureMedia and Reddit
r/DataHoarder was the big reason I started using Reddit regularly. While I don't speak for all of them, many will suggest bdfr or HTTrack for all your scraping needs.
- Reddit limits the use of the API to 1000; let's work together to save the content of the StableDiffusion subreddit as a team
- Information is currently available.
https://github.com/aliparlakci/bulk-downloader-for-reddit, or Monolith, which has a Chrome extension
- So, how is everyone?
I’m going to delete my Reddit account soon to further the protest, since by this point I don’t use it much anymore. I already backed up my entire post history using this tool since there are some important memories there for me, so I basically have nothing to lose now.
expanse
- How to export Reddit data in a usable format?
There is https://github.com/jc9108/expanse
- Don't just delete your account. Use a tool to delete all your data before deleting your account, so Reddit doesn't get money for your posts from 5 years ago showing up in Google. Take your value with you when you leave.
Expanse
- How can I download all my saved posts on Reddit?
- Best tools for downloading Reddit before API access is cut off?
I found this earlier but haven't used it yet, so I don't know whether it works: https://github.com/jc9108/expanse
- self hosted Reddit archiver
I'm using jc9108/expanse to archive what I post and what I save.
- Search your reddit saved & upvoted posts via Spyglass
- Update on Expanse, the Reddit Personal Data Archiver
- My Docker containers seem to share a Postgres database even if I didn't set them up that way
I have some services I host with WSL2 Docker. It looks like paperless-ngx and expanse, for example, share the same Postgres database somehow? Both are exposed publicly, and both have database issues somehow.
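The usual way to avoid the accidental sharing described above is to give each application its own Postgres service, volume, and network in Compose, so nothing overlaps. The sketch below is a hypothetical fragment, not expanse's or paperless-ngx's official configuration; service names, credentials, and image tags are placeholder assumptions.

```yaml
# Hypothetical docker-compose sketch: one isolated Postgres per app.
# Distinct service names, volumes, and networks prevent cross-talk.
services:
  expanse-db:
    image: postgres:15
    environment:
      POSTGRES_DB: expanse
      POSTGRES_PASSWORD: change-me   # placeholder credential
    volumes:
      - expanse-pgdata:/var/lib/postgresql/data
    networks: [expanse-net]
  paperless-db:
    image: postgres:15
    environment:
      POSTGRES_DB: paperless
      POSTGRES_PASSWORD: change-me   # placeholder credential
    volumes:
      - paperless-pgdata:/var/lib/postgresql/data
    networks: [paperless-net]
volumes:
  expanse-pgdata:
  paperless-pgdata:
networks:
  expanse-net:
  paperless-net:
```

Each app container then joins only its own network (`expanse-net` or `paperless-net`) and connects to its own `*-db` host, so neither can reach the other's database.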
- Using infinity
- Cherry - an open source self-hostable bookmark service
Check out jc9108's GitHub for expanse. It's a Reddit archiver Docker container.
What are some alternatives?
gallery-dl - Command-line program to download image galleries and collections from several image hosting sites
reddit-shreddit - Program to delete ENTIRE Reddit user post and comments history, AND daily job to keep user history limited to X days.
UltimaScraper - Scrape content from OnlyFans and Fansly
bdfr-html - Converts the output of the bulk downloader for reddit to a set of HTML pages.
youtube-dl - Command-line program to download videos from YouTube.com and other video sites
RedditDownloader - Scrapes Reddit to download media of your choice.
reddelete - scramble your past data and automate the deletion of your reddit posts and comment history
redditDataExtractor - The reddit Data Extractor is a cross-platform GUI tool for downloading almost any content posted to reddit. Downloads from specific users, specific subreddits, users by subreddit, and filtering of content are supported. Some intelligence is built in to attempt to avoid downloading duplicate external content.
export-saved-reddit - Export saved Reddit posts into an HTML file for import into Google Chrome.
reddit-save - A Python tool for backing up your saved and upvoted posts on reddit to your computer.
cherry - Cherry is a self-hostable bookmark service