proxy-list VS free-proxy-list

Compare proxy-list vs free-proxy-list and see how they differ.

proxy-list

A list of free, public, forward proxy servers. UPDATED DAILY! (by clarketm)

free-proxy-list

🔥Free proxy servers list / Updated hourly! (by a2u)
                 proxy-list          free-proxy-list
Mentions         5                   1
Stars            2,204               346
Growth           -                   -
Activity         4.0                 4.0
Latest Commit    about 1 year ago    12 months ago
License          MIT License         GNU General Public License v3.0 only
Mentions - the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.

proxy-list

Posts with mentions or reviews of proxy-list. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-02-15.
  • Proxylist Sources
    21 projects | /r/privatepub | 15 Feb 2023
  • Where do you find proxies for proxychains?
    1 project | /r/Hacking_Tutorials | 12 May 2022
    Fetch them from this Repo daily https://github.com/clarketm/proxy-list. I created a cron Job which Pulls the repository and then Updates my config.
  • Does anyone know how do use Jdownloader 2 to bypass MEGA dl limits?
    1 project | /r/Piracy | 23 Jan 2022
    surfshark doesn't provide proxies, and if they did it would be one domain. https://github.com/clarketm/proxy-list might be worth a try
  • Help scraping StockX / Goat / eBay
    2 projects | /r/webscraping | 16 Jan 2022
    There is a daily update list here https://github.com/clarketm/proxy-list/blob/master/proxy-list-raw.txt
  • Is it normal that NOT EVEN ONE proxy from github or other sites ... work ??
    2 projects | /r/webscraping | 20 Dec 2021
    import requests

    def filterOut_workingProxies(proxies):
        # Keep only the proxies that can fetch google.com within 3 seconds
        workingProxies = []
        for p in proxies:
            print(f"testing: {p}...")
            proxy = {
                'http': 'http://' + p,   # see https://stackoverflow.com/a/61466680
                'https': 'https://' + p,
            }
            try:
                resp = requests.get("https://www.google.com", proxies=proxy, timeout=3)
                print(resp.status_code)
                if str(resp.status_code)[0] == '2':  # any 2xx response counts
                    print("+++++ WORKING\n")
                    workingProxies.append(proxy)
            except requests.RequestException:
                pass
        return workingProxies

    # [...]

    # Daily-updating proxy lists:
    urls_github = [
        "https://github.com/ShiftyTR/Proxy-List/blob/master/proxy.txt",
        "https://github.com/clarketm/proxy-list/blob/master/proxy-list-raw.txt",
    ]
    url_fplNet = "https://free-proxy-list.net"
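The raw list linked in these posts is plain text with one host:port entry per line, so fetching candidates before testing them needs only the standard library. A minimal sketch, assuming the raw-file URL for the blob linked above (the exact raw path is an assumption, not stated in the posts):

```python
from urllib.request import urlopen

# Assumed raw-content URL for the proxy-list-raw.txt blob mentioned above.
RAW_LIST_URL = "https://raw.githubusercontent.com/clarketm/proxy-list/master/proxy-list-raw.txt"

def parse_proxies(text: str) -> list[str]:
    """Split a raw proxy list into host:port entries, dropping blank lines."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def fetch_proxies(url: str = RAW_LIST_URL, timeout: float = 10.0) -> list[str]:
    """Download the list and return its entries."""
    with urlopen(url, timeout=timeout) as resp:
        return parse_proxies(resp.read().decode("utf-8"))
```

The returned entries can then be passed to a checker such as the filter function quoted in the post above.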

free-proxy-list

Posts with mentions or reviews of free-proxy-list. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-02-15.

What are some alternatives?

When comparing proxy-list and free-proxy-list you can also consider the following projects:

mubeng - An incredibly fast proxy checker & IP rotator with ease.

socks5_list - Auto-updated SOCKS5 proxy list + proxies for Telegram

fresh-proxies - fresh-proxies

getproxy - Scripts that automatically collect proxy links from the Internet, types http, https, socks4, socks5

Proxy-List - Free proxy list UPDATED HOURLY! -- for api visit

PROXY-List - Get PROXY List that gets updated everyday

Awesome-Warez - All your base are belong to us!

http-proxy-list - It is a lightweight project that, every 10 minutes, scrapes lots of free-proxy sites, validates if it works, and serves a clean proxy list.

proxy-list - A list of free, public, forward proxy servers. UPDATED DAILY!

Proxy-Master - maybe the best free proxy list?

awesome-web-scraping - List of libraries, tools and APIs for web scraping and data processing.