Why is Python popular despite being accused of being slow?

This page summarizes the projects mentioned and recommended in the original post on /r/programming

  • Nim

    Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).

  • Scrapy

    Scrapy, a fast high-level web crawling & scraping framework for Python.

  • I use it regularly for things like web scraping (Scrapy is a joy) and data manipulation. For instance, I just wrote some fairly complicated scripts for address matching, pairing up a couple of UK datasets that lack a common identity field. Human-entered addresses are decidedly fuzzy, so you end up with a lot of arbitrary rules, and Python is just fast to develop against. I don't really care if the script takes a couple of hours to run on the full datasets (35 million addresses) as opposed to half that time in something else that's more of a pain to tweak.
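
    The fuzzy address-matching workflow described in that comment can be sketched with the standard library's difflib. The normalization rules, function names, and similarity threshold below are illustrative assumptions, not the commenter's actual code; a real pipeline at 35-million-address scale would likely use a dedicated library such as rapidfuzz plus blocking/indexing to avoid all-pairs comparison:

    ```python
    from difflib import SequenceMatcher

    def normalize(address: str) -> str:
        # One of the "arbitrary rules" the comment alludes to:
        # lowercase, strip punctuation, collapse whitespace.
        cleaned = "".join(c if c.isalnum() or c.isspace() else " " for c in address.lower())
        return " ".join(cleaned.split())

    def best_match(address: str, candidates: list[str], threshold: float = 0.8):
        """Return the candidate most similar to `address`, or None if nothing clears the threshold."""
        target = normalize(address)
        scored = ((SequenceMatcher(None, target, normalize(c)).ratio(), c) for c in candidates)
        score, winner = max(scored)
        return winner if score >= threshold else None

    dataset_b = ["10 Downing Street, London", "221B Baker Street, London"]
    print(best_match("10, downing st. london", dataset_b))
    ```

    The point of the comment survives in the sketch: the matching logic is a handful of readable lines, so iterating on the fuzzy rules is cheap even if the full run is slow.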

  • Poetry

    Python packaging and dependency management made easy

  • Ray

    Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.

  • Perhaps with traditional approaches, but that is changing. Take a look at Ray (from some of the people who originally created Spark). ML use cases are so aggressively focused on Python that there's starting to be a lot of investment in fixing these problems, because it's cheaper than shifting the user base to a "better" language.

  • FrameworkBenchmarks

    Source for the TechEmpower Framework Benchmarks project

  • Web Frameworks vs. Node (I/O, Web) - Before you decide to flame, note that I'm speaking about web frameworks only. With uvicorn and async libraries like Starlette, Python web frameworks are as fast as, and sometimes faster than, their Node.js equivalents. Yes, uvicorn is built on uvloop, which is blazing fast. https://www.techempower.com/benchmarks/#section=test&runid=a979de55-980d-4721-a46f-77298b3f3923&hw=ph&test=fortune&l=v2p4an-e7&a=2
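
    The uvicorn/Starlette point rests on the ASGI interface: an async callable the server drives on an event loop (uvloop, in uvicorn's case). Here is a minimal raw ASGI application as a sketch of that interface; the app itself is illustrative, not from the post:

    ```python
    # A minimal ASGI application: an async callable taking (scope, receive, send).
    # Servers like uvicorn call it once per request on top of the event loop.
    async def app(scope, receive, send):
        assert scope["type"] == "http"
        await send({
            "type": "http.response.start",
            "status": 200,
            "headers": [(b"content-type", b"text/plain")],
        })
        await send({"type": "http.response.body", "body": b"Hello from ASGI"})
    ```

    Frameworks like Starlette layer routing and request/response objects over this protocol; uvicorn then serves the callable (e.g. `uvicorn module:app`), which is where the benchmark numbers in the linked TechEmpower run come from.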

  • Dask

    Parallel computing with task scheduling

  • Not everyone has the same "parallelism" needs. I have used mpi4py to distribute scientific computations using numpy over thousands of cores on hundreds of servers, with much less effort than doing the same thing in C/C++ and almost no performance penalty (I could batch my data into big enough chunks). Today there are higher-level distributed computing packages like Dask that are even easier to use.
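
    The chunked-parallelism pattern that comment describes (batch data into big enough chunks, fan the chunks out to workers, combine the partial results) looks roughly like this with the standard library's concurrent.futures; mpi4py and Dask extend the same map-over-chunks idea across many machines. The worker function and chunk size here are illustrative assumptions:

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def process_chunk(chunk):
        # Stand-in for a numpy-heavy computation on one batch of records.
        return sum(x * x for x in chunk)

    def parallel_sum_of_squares(data, chunk_size=4):
        # Batch the data into chunks big enough to amortize scheduling overhead,
        # then map the chunks across worker processes and combine partial results.
        chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
        with ProcessPoolExecutor() as pool:
            return sum(pool.map(process_chunk, chunks))

    if __name__ == "__main__":
        print(parallel_sum_of_squares(range(16)))
    ```

    The chunk size is the knob the comment highlights: too small and inter-process (or inter-node) overhead dominates; big enough and the Python-level coordination cost becomes negligible next to the per-chunk work.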

NOTE: The number of mentions on this list reflects mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.


Related posts

  • What have you automated using Python?

    15 projects | /r/Python | 31 Aug 2022
  • Scrapy: A Fast and Powerful Scraping and Web Crawling Framework

    1 project | news.ycombinator.com | 16 Feb 2024
  • Seven Python Projects to Elevate Your Coding Skills

    3 projects | dev.to | 15 Feb 2024
  • Turning webpages into pdf

    2 projects | /r/learnpython | 6 Jul 2023
  • Implementing case sensitive headers in Scrapy (not through `_caseMappings`)

    4 projects | /r/scrapy | 3 Jul 2023