The prospects for 128 bit processors (John R. Mashey, 1995)

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com.

  • llama.cpp

    LLM inference in C/C++

  • You don't need 128 bits for memory addressing, but for data processing, yes, and in fact 128 bits is far less than we're already using! If you look at https://github.com/ggerganov/llama.cpp you'll see this line:

    > AVX, AVX2 and AVX512 support for x86 architectures

    Guess what the 512 in AVX512 stands for? ;)

    On GPUs I'm pretty sure the same thing is in play, but I'm less familiar with them. A quick search turns up, e.g., https://developer.nvidia.com/blog/implementing-high-precisio... which makes me think yes.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Suggest a related project

Related posts

  • Show HN: Open-Source Load Balancer for Llama.cpp

    6 projects | news.ycombinator.com | 1 Jun 2024
  • RAG with llama.cpp and external API services

    2 projects | dev.to | 31 May 2024
  • Ask HN: I have many PDFs – what is the best local way to leverage AI for search?

    10 projects | news.ycombinator.com | 30 May 2024
  • Deploying llama.cpp on AWS (with Troubleshooting)

    1 project | dev.to | 28 May 2024
  • Devoxx Genie Plugin : an Update

    6 projects | dev.to | 28 May 2024