twitter-archiver vs snscrape
| | twitter-archiver | snscrape |
|---|---|---|
| Mentions | 4 | 29 |
| Stars | 284 | 4,010 |
| Growth | - | - |
| Activity | 10.0 | 8.5 |
| Last commit | about 1 month ago | 21 days ago |
| Language | JavaScript | Python |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
twitter-archiver
snscrape
- API to scrape tweets
- Twitter scraping for complete profiles (very large data sets)?
  Try Snscrape.
- snscrape getting blocked from twitter
- [Project] Topic modelling of tweets from the same user
- Show HN: Twitter API Reverse Engineered
- Recommend tool other than Maltego?
  If you're looking for small, standalone tools, you can have a look at snscrape (https://github.com/JustAnotherArchivist/snscrape) and WhatsMyName (https://github.com/WebBreacher/WhatsMyName).
- Anyone familiar with the open-source Twitter API alternative Twint?
- snscrape is a great program for scraping social media
- “Starting February 9, we will no longer support free access to the Twitter API”
  I've been using https://github.com/JustAnotherArchivist/snscrape with 5 threads, no VPN, just my laptop, and I'm getting about 1 million tweets a day (vs. 1,000 before being rate-limited on the API).
- Twitter archiver: Make your own simple, public, searchable Twitter archive
  I hope you’ll get a better answer, but a minimal solution is to use snscrape (https://github.com/JustAnotherArchivist/snscrape) to download a raw JSON dump of a user timeline. Some caveats: 1. This doesn’t work for all users (e.g. some accounts seem to remain deindexed after unsuspension). 2. While you can shed about 90% of the file size by removing fluff columns (e.g. those related to processing media and emojis), keep a backup: when importing such JSON I unwittingly did a lossy data type conversion, which is irreparable once the tweets themselves are deleted.
$ snscrape --progress --jsonl twitter-user jack > jack.json
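The column-stripping step mentioned above can be sketched in a few lines of Python. The `KEEP` whitelist below is a guess at useful fields, not snscrape's actual schema; adjust it after inspecting your own dump, and keep the original file as a backup.

```python
import json

# Hypothetical field whitelist -- inspect your dump and adjust as needed.
KEEP = {"url", "date", "content", "id", "user"}

def slim_tweets(in_path, out_path):
    """Read a `snscrape --jsonl` dump and write a copy keeping only KEEP fields.

    The dropped fields are unrecoverable once the tweets themselves are
    deleted, so never overwrite the original dump.
    """
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            tweet = json.loads(line)
            slim = {k: v for k, v in tweet.items() if k in KEEP}
            dst.write(json.dumps(slim, ensure_ascii=False) + "\n")
```

Because the dump is one JSON object per line, this streams and never loads the whole (potentially multi-gigabyte) file into memory.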
What are some alternatives?
- facebook_page_scraper - Scrapes the front end of Facebook pages with no limitations and can turn the data into structured JSON or CSV
- TWINT - An advanced Twitter scraping and OSINT tool written in Python that doesn't use Twitter's API, allowing you to scrape a user's followers, following, tweets and more while evading most API limitations
- instagram_hunter - A simple tool that helps you find Instagram accounts
- reddit-detective - Play detective on Reddit: discover political disinformation campaigns, secret influencers and more
- Socialhome - A federated social home
- webtoondl - Python webcomics scraper
- privalise - A social media network focused on data security, built as a community-driven web app
- linkedin-visualizer - The missing feature in LinkedIn
- LinkScope_Client - Repository for the LinkScope Client software
- nider - Python package to add text to images, textures and different backgrounds
- tweety - Twitter scraper
- TweetScraper - A simple crawler/spider for Twitter Search that doesn't use the API