slrp vs crawlab

| | slrp | crawlab |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 149 | 10,803 |
| Growth | - | 1.0% |
| Activity | 8.0 | 6.0 |
| Last commit | 24 days ago | 13 days ago |
| Language | Go | Go |
| License | MIT License | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
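The site does not publish the exact formula behind its activity score, but the idea of weighting recent commits more heavily than older ones can be sketched with a simple exponential-decay score. Everything below (the function name, the half-life parameter, the sample commit ages) is an illustrative assumption, not the site's actual method:

```go
package main

import (
	"fmt"
	"math"
)

// activityScore weights recent commits more heavily than older ones
// using exponential decay: a commit that is age days old contributes
// exp(-age/scale), where scale is derived from a chosen half-life.
// This is an illustrative formula only, not the comparison site's
// published method.
func activityScore(commitAgesDays []float64, halfLifeDays float64) float64 {
	scale := halfLifeDays / math.Ln2
	total := 0.0
	for _, age := range commitAgesDays {
		total += math.Exp(-age / scale)
	}
	return total
}

func main() {
	// Two hypothetical projects with the same commit count but
	// different recency: the recently active one scores higher.
	recent := []float64{1, 3, 5, 10}
	stale := []float64{60, 90, 120, 150}
	fmt.Printf("recent: %.2f\n", activityScore(recent, 30))
	fmt.Printf("stale:  %.2f\n", activityScore(stale, 30))
}
```

With a 30-day half-life, four commits from the last two weeks outscore four commits spread over the last five months, which matches the intuition that "recent commits have higher weight than older ones."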
slrp
- SLRP – rotating open proxy multiplexer
-
Webscraping Proxy Library for Scrapy
You may also look at this project: https://github.com/nfx/slrp - a rotating open proxy multiplexer.
crawlab
-
Self-hosted web scraper?
Haven't tried but this project https://github.com/crawlab-team/crawlab looks promising.
-
CI/CD in Action: Manage auto builds of large open-source projects with GitHub Actions?
The new version of Crawlab, v0.6, splits general functionality into separate modules, so that the whole project consists of a few dependent sub-projects. For example, the main project crawlab depends on the front-end project crawlab-ui and the back-end project crawlab-core. The benefits are higher decoupling and better maintainability.
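The dependency relationship described above can be pictured as a `go.mod` sketch. The module paths follow the GitHub repositories named in this comparison, but the version number is hypothetical and the front-end is noted only as a comment, since a Vue.js UI is built and served as static assets rather than imported as a Go module:

```
module github.com/crawlab-team/crawlab

go 1.19

require (
	// Back-end core modules; version shown here is hypothetical.
	github.com/crawlab-team/crawlab-core v0.6.0
)

// crawlab-ui is a Vue.js front-end: it is built separately and
// served as static assets, so it does not appear as a Go dependency.
```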
-
CI/CD in Action: How to use Microsoft's GitHub Actions in a right way?
GitHub Actions is the official CI/CD workflow service provided by GitHub. It aims to make it easy for open-source project contributors to manage operational maintenance, and to let open-source communities embrace cloud-native DevOps. GitHub Actions is integrated into most of my open-source projects, including Crawlab and ArtiPub. As a contributor, I think GitHub Actions is not only easy to use but also free (which is most important). I hope this article will help open-source project contributors who are not familiar with GitHub Actions get practical ideas on how to use it and make an impact.
-
Golang in Action: How to quickly implement a minimal task scheduling system
Task scheduling is one of the most important features in software systems: assigning and executing long-running tasks or scripts according to certain specifications. In the web crawler management platform Crawlab, task scheduling is a core module, and you may wonder how to build such a module from scratch. This article will show you how to build a simple but useful task scheduler with Go.
What are some alternatives?
advanced-scrapy-proxies - Scrapy rotation proxy package with advanced functions
crawlab-core - Backend core modules for Crawlab
katana - A next-generation crawling and spidering framework.
ant - A web crawler for Go
Pholcus - Pholcus is a distributed high-concurrency crawler software written in pure golang
docker-base-images
Ferret - Declarative web scraping
turbo-tor-crawl - Recursive hostnames crawler
lux - 👾 Fast and simple video download library and CLI tool written in Go
crawlab-ui - 🎉 A Vue.js 3.0 UI Library made by Crawlab team
artipub - Article publishing platform that automatically distributes your articles to various media channels
Gin - Gin is an HTTP web framework written in Go (Golang). It features a Martini-like API with much better performance -- up to 40 times faster. If you need smashing performance, get yourself some Gin.