proxy-list vs Proxy-List
| | proxy-list | Proxy-List |
|---|---|---|
| Mentions | 5 | 4 |
| Stars | 2,204 | 574 |
| Growth | - | - |
| Activity | 4.0 | 10.0 |
| Latest commit | about 1 year ago | 9 months ago |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
proxy-list
- Proxylist Sources
- Where do you find proxies for proxychains?
  Fetch them daily from this repo: https://github.com/clarketm/proxy-list. I created a cron job which pulls the repository and then updates my config.
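The pull-and-update approach described above can also be done in-process. A minimal sketch, assuming the list is published as one `host:port` entry per line and assuming the standard raw.githubusercontent.com URL for the file (both the URL constant and the function names here are illustrative, not part of the project):

```python
import urllib.request

# Assumed raw URL for the daily list (illustrative).
RAW_LIST_URL = "https://raw.githubusercontent.com/clarketm/proxy-list/master/proxy-list-raw.txt"

def parse_proxy_list(text):
    """Parse a raw proxy list: one host:port per line, blank lines ignored."""
    proxies = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        host, _, port = line.partition(":")
        if host and port.isdigit():
            proxies.append((host, int(port)))
    return proxies

def fetch_proxies(url=RAW_LIST_URL, timeout=10):
    """Download the daily list and return it as (host, port) tuples."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return parse_proxy_list(resp.read().decode("utf-8"))
```

Running `fetch_proxies()` from a daily cron job would give the same effect as pulling the repository.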
- Does anyone know how to use JDownloader 2 to bypass MEGA download limits?
  Surfshark doesn't provide proxies, and if they did, it would be one domain. https://github.com/clarketm/proxy-list might be worth a try.
- Help scraping StockX / Goat / eBay
  There is a daily-updated list here: https://github.com/clarketm/proxy-list/blob/master/proxy-list-raw.txt
- Is it normal that NOT EVEN ONE proxy from GitHub or other sites works?

```python
import requests

def filterOut_workingProxies(proxies):
    """Keep only the proxies that can reach Google within the timeout."""
    workingProxies = []
    for p in proxies:
        print(f"testing: {p}...")
        # per https://stackoverflow.com/a/61466680
        proxy = {'http': 'http://' + p,
                 'https': 'https://' + p}
        try:
            resp = requests.get("https://www.google.com",
                                proxies=proxy, timeout=3)
            print(resp.status_code)
            if 200 <= resp.status_code < 300:  # any 2xx counts as working
                print("+++++ WORKING\n")
                workingProxies.append(proxy)
        except requests.RequestException:
            pass
    return workingProxies

# [...]

# Daily-updating proxy list sources:
urls_github = [
    "https://github.com/ShiftyTR/Proxy-List/blob/master/proxy.txt",
    "https://github.com/clarketm/proxy-list/blob/master/proxy-list-raw.txt",
]
url_fplNet = "https://free-proxy-list.net"
```
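One likely reason the quoted snippet finds no working proxies at all: the `urls_github` entries point at github.com `blob` pages, which serve HTML, not the raw list. A small helper, assuming GitHub's usual raw.githubusercontent.com URL scheme (the function name `to_raw_url` is illustrative, not from the original post):

```python
def to_raw_url(blob_url):
    """Convert a github.com blob URL into its raw.githubusercontent.com
    equivalent, so that downloading it yields plain text instead of HTML."""
    return (blob_url
            .replace("https://github.com/", "https://raw.githubusercontent.com/")
            .replace("/blob/", "/"))
```

Mapping each `urls_github` entry through `to_raw_url` before fetching should return the actual `host:port` lines.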
Proxy-List
- Proxylist Sources
- Free Proxy list
- Is it normal that NOT EVEN ONE proxy from GitHub or other sites works?
  (Quotes the same post and code snippet as in the proxy-list section above.)
What are some alternatives?
mubeng - An incredibly fast proxy checker & IP rotator with ease.
openproxylist - List of Free HTTPS, SOCKS4, SOCKS5 & V2Ray Proxy (Daily Updates!)
fresh-proxies - fresh-proxies
XenProxy - Free Proxies already checked with a timeout of 5000 ms
Awesome-Warez - All your base are belong to us!
free-proxy-list - 🔥Free proxy servers list / Updated hourly!
proxy-list - A list of free, public, forward proxy servers. UPDATED DAILY!
socks5_list - Auto-updated SOCKS5 proxy list + proxies for Telegram
awesome-web-scraping - List of libraries, tools and APIs for web scraping and data processing.
proxy-list - Automatically updated list of free proxies
Sneaks-API - A StockX, FlightClub, Goat, and Stadium Goods API all in one. This sneaker API allows users to search sneakers and track and compare prices, while providing additional info such as product links and images.
proxy-list - Get a proxy list that is updated every hour.