crawlab vs turbo-tor-crawl

| | crawlab | turbo-tor-crawl |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 10,803 | 6 |
| Growth | 1.0% | - |
| Activity | 6.0 | 10.0 |
| Last commit | 13 days ago | over 1 year ago |
| Language | Go | Go |
| License | BSD 3-clause "New" or "Revised" License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
crawlab
-
Self-hosted web scraper?
Haven't tried but this project https://github.com/crawlab-team/crawlab looks promising.
-
CI/CD in Action: Manage auto builds of large open-source projects with GitHub Actions?
The new version of Crawlab, v0.6, splits general functionality into separate modules, so the whole project consists of a few interdependent sub-projects. For example, the main project crawlab depends on the front-end project crawlab-ui and the back-end project crawlab-core. The benefits are higher decoupling and better maintainability.
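A minimal sketch of how that split looks from the main project's side, assuming the sub-projects map onto Go modules whose paths mirror the GitHub repositories (the version shown is illustrative, not taken from the real project):

```
// go.mod of the main crawlab project (illustrative sketch)
module github.com/crawlab-team/crawlab

go 1.16

// the back-end core is pulled in as an ordinary module dependency
require github.com/crawlab-team/crawlab-core v0.6.0 // hypothetical version
```

The front-end crawlab-ui is a Vue project, so it would be consumed as built static assets rather than as a Go dependency.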
-
CI/CD in Action: How to use Microsoft's GitHub Actions the right way?
GitHub Actions is the official CI/CD workflow service provided by GitHub. It aims to make it easy for open-source project contributors to manage operational maintenance, and to help open-source communities embrace cloud-native DevOps. GitHub Actions is integrated into most of my open-source projects, including Crawlab and ArtiPub. As a contributor, I find GitHub Actions not only easy to use but also free, which matters most. I hope this article helps open-source contributors who are not familiar with GitHub Actions get real ideas on how to utilize it and make an impact.
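As an illustration of the kind of workflow the article describes, here is a minimal GitHub Actions configuration for building and testing a Go project on every push (a generic sketch, not taken from Crawlab's actual workflows; the workflow name and Go version are assumptions):

```yaml
# .github/workflows/build.yml — minimal CI sketch for a Go project
name: build
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4        # fetch the repository
      - uses: actions/setup-go@v5        # install the Go toolchain
        with:
          go-version: '1.22'
      - run: go build ./...              # compile all packages
      - run: go test ./...               # run the test suite
```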
-
Golang in Action: How to quickly implement a minimal task scheduling system
Task scheduling is one of the most important features in software systems: assigning and executing long-running tasks or scripts according to certain specifications. In the web crawler management platform Crawlab, task scheduling is a core module, and you may wonder how to build one from scratch. This article introduces how to build a simple but useful task scheduler in Go.
turbo-tor-crawl
-
Event Horizon
Event Horizon is a project that showcases interesting and safe places in the darknet, aiming to break down the stereotype, established in society, that the dark web is nothing but immoral horrors. It features many interesting sites with attached screenshots, some descriptions, and sometimes files. The project also has its own Telegram bot for screenshots of onion resources and an onion crawler for finding them, whose sources can be found on GitHub. Horizon was previously taken down and resumed last year; publication activity has since fallen, so the project is presumably on pause. Links: Telegram, Telegram-bot, Tor-Crawler.
What are some alternatives?
slrp - rotating open proxy multiplexer
Pholcus - Pholcus is a distributed high-concurrency crawler software written in pure golang
crawlab-core - Backend core modules for Crawlab
DHT - BitTorrent DHT Protocol && DHT Spider.
ant - A web crawler for Go
colly - Elegant Scraper and Crawler Framework for Golang
docker-base-images
Geziyor - Geziyor, blazing fast web crawling & scraping framework for Go. Supports JS rendering.
crawlab-ui - 🎉 A Vue.js 3.0 UI Library made by Crawlab team
cariddi - Take a list of domains, crawl URLs and scan for endpoints, secrets, API keys, file extensions, tokens and more
artipub - Article publishing platform that automatically distributes your articles to various media channels
Gin - Gin is an HTTP web framework written in Go (Golang). It features a Martini-like API with much better performance -- up to 40 times faster. If you need smashing performance, get yourself some Gin.