-
Nim
Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).
-
Ray
Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
I use it regularly for things like web scraping (Scrapy is a joy) and data manipulation. For instance, I just wrote some fairly complicated scripts for address matching, pairing up a couple of UK datasets that lack a common identifier field. Human-entered addresses are decidedly fuzzy, so you end up with a lot of arbitrary rules, and Python is just fast to develop against. I don't really care if the script takes a couple of hours to run on the full datasets (35 million addresses) as opposed to half that time in something else that's more of a pain to tweak.
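A rough sketch of the kind of fuzzy address matching described above, using only the stdlib difflib module; the sample addresses, the normalisation rules, and the 0.8 similarity cutoff are my own illustrative choices, not the commenter's actual script:

```python
# Fuzzy address matching sketch using stdlib difflib.
# Addresses, normalisation rules, and the 0.8 cutoff are illustrative assumptions.
from difflib import SequenceMatcher

def normalise(addr: str) -> str:
    """Lower-case and strip punctuation so trivial differences don't count."""
    return "".join(ch for ch in addr.lower() if ch.isalnum() or ch.isspace())

def best_match(target: str, candidates: list[str], cutoff: float = 0.8):
    """Return the candidate most similar to target, or None if all fall below cutoff."""
    target = normalise(target)
    scored = [
        (SequenceMatcher(None, target, normalise(c)).ratio(), c)
        for c in candidates
    ]
    score, match = max(scored)
    return match if score >= cutoff else None

candidates = ["12 High Street, Leeds", "12a High St, Leeds", "3 Mill Lane, York"]
print(best_match("12 High St., Leeds", candidates))
```

In a real pipeline the "arbitrary rules" (expanding "St" to "Street", dropping flat numbers, matching on postcode first) would live in `normalise`, which is exactly the part that is cheap to iterate on in Python.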
Perhaps with traditional approaches, but that is changing. Take a look at Ray (from some of the people who originally created Spark). ML use cases are so aggressively focused on Python that there's starting to be a lot of investment in fixing these problems, because it's cheaper than shifting the user base to a "better" language.
Web Frameworks Vs Node (I/O, Web) - Before you decide to flame, note that I'm speaking about web frameworks only. With uvicorn and async libraries like Starlette, Python web frameworks are as fast as, and sometimes faster than, their Node.js equivalents. Yes, uvicorn is built on uvloop, which is blazing fast. https://www.techempower.com/benchmarks/#section=test&runid=a979de55-980d-4721-a46f-77298b3f3923&hw=ph&test=fortune&l=v2p4an-e7&a=2
Not everyone has the same "parallelism" needs. I have used mpi4py to distribute scientific computations using numpy over thousands of cores on hundreds of servers with much less effort than doing the same thing in C/C++, and with almost no performance penalty (I could batch my data into big enough chunks). Today there are higher-level distributed computing packages like dask that are even easier to use.
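A toy dask sketch of that higher-level style (assumes `pip install "dask[array]"`); the array sizes and the mean-of-a-symmetrised-matrix computation are illustrative, not the commenter's actual workload:

```python
# Toy dask.array sketch; sizes and the computation itself are illustrative.
import dask.array as da

# A 4000x4000 array split into 1000x1000 chunks; each chunk can be processed
# on a different core, or on another machine with dask.distributed.
x = da.random.random((4_000, 4_000), chunks=(1_000, 1_000))

# Nothing runs yet: this builds a lazy task graph over the chunks.
# .compute() executes the graph in parallel and returns a plain number.
result = (x + x.T).mean().compute()
print(result)
```

The numpy-like API is the point: the same chunked-batching idea the commenter used with mpi4py, but with the scheduling handled for you.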