If data science uses a lot of computational power, then why is Python the most used programming language?

This page summarizes the projects mentioned and recommended in the original post on /r/learnmachinelearning

  • serving

    A flexible, high-performance serving system for machine learning models

  • You serve models via TensorFlow Serving (https://www.tensorflow.org/tfx/guide/serving), which is written entirely in C++ (https://github.com/tensorflow/serving/tree/master/tensorflow_serving/model_servers); there is no Python on the serving path or in the shipped product.
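
To make the "no Python on the serving path" point concrete, here is a minimal sketch of a client calling TensorFlow Serving's REST predict endpoint. The model name ("my_model"), the input shape, and the localhost:8501 address are illustrative assumptions rather than details from the post; the Python below is only a convenience client (curl would work just as well), while the inference itself runs inside the C++ server binary.

```python
# Minimal sketch: query a TensorFlow Serving instance over its REST API.
# Assumptions (illustrative, not from the post): a SavedModel exported as
# "my_model" and a tensorflow/serving server on localhost:8501.
import json
import urllib.request

# TensorFlow Serving exposes /v1/models/<model_name>:predict over REST.
url = "http://localhost:8501/v1/models/my_model:predict"

# The request body is JSON with an "instances" list of input examples.
payload = json.dumps({"instances": [[1.0, 2.0, 3.0, 4.0]]}).encode("utf-8")

request = urllib.request.Request(
    url, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(request) as response:
    # The C++ server returns {"predictions": [...]}; Python never touches
    # the model itself here.
    print(json.loads(response.read())["predictions"])
```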

  • julia

    The Julia Programming Language

  • For example, Julia (https://julialang.org/) is arguably as high level as Python and is targeted at scientific computing and data science, yet it can be faster than C++ at execution in some circumstances. (Julia compiles on the fly to native code via LLVM, a common compiler backend that is also used in many C/C++ compilers; it is the same backend NVIDIA uses in its CUDA compiler, https://developer.nvidia.com/cuda-llvm-compiler.)

  • XLA.jl

    (Discontinued) Julia on TPUs

  • Julia can also use the same infrastructure that Python/TensorFlow use to access the same hardware (e.g. Julia on TPUs: https://github.com/JuliaTPU/XLA.jl).

  • tensorflow

    An Open Source Machine Learning Framework for Everyone

  • For reference: in TensorFlow and JAX, for example, the tensor computation gets compiled to the intermediate XLA format (https://www.tensorflow.org/xla), then passed to the XLA compiler (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/compiler/xla/service), to the new TFRT runtime (https://github.com/tensorflow/runtime/blob/master/documents/tfrt_host_runtime_design.md), or to a compiler for more esoteric hardware (https://github.com/pytorch/glow).
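
As a hedged sketch of that pipeline (my example, not from the post, and assuming a recent JAX release where the ahead-of-time lowering API is available): `jax.jit` traces the Python function once, lowers it to XLA's intermediate representation, and the XLA compiler then emits native code for the chosen backend, so the Python interpreter is no longer involved in the hot loop.

```python
# Minimal sketch of the Python -> XLA path described above, using JAX.
import jax
import jax.numpy as jnp

def affine(w, x, b):
    # Plain NumPy-style Python; JAX traces it into an XLA computation.
    return jnp.dot(w, x) + b

w = jnp.ones((4, 4))
x = jnp.ones((4,))
b = jnp.zeros((4,))

# Lower the traced function to XLA's intermediate representation and print
# it; this is what the XLA compiler turns into native code for the target
# backend (CPU, GPU, or TPU).
print(jax.jit(affine).lower(w, x, b).as_text())

# Calling the jitted function runs the XLA-compiled kernel, not the
# Python function body.
jitted = jax.jit(affine)
print(jitted(w, x, b))
```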

  • runtime

    A performant and modular runtime for TensorFlow (by tensorflow)

  • glow

    Compiler for Neural Network hardware accelerators (by pytorch)

NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.
