Someone’s Been Messing with My Subnormals

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • BinaryBuilderBase.jl

  • The Julia package ecosystem has a lot of safeguards against silent incorrect behavior like this. For example, if you try to add a package binary build which would use fast-math flags, it will throw an error and tell you to repent:

    https://github.com/JuliaPackaging/BinaryBuilderBase.jl/blob/...

    In user code you can use `@fastmath`, but it works at the semantic level: it will change `sin` to `sin_fast`, but it will not recurse down into other people's functions, because at that point you're just asking for trouble. In summary, fast math is overused, and many times people actually want other, safer optimizations (such as automatic FMA). People really need to stop throwing global changes around willy-nilly, and programming languages need to help them avoid such global issues, both semantically and through package-ecosystem norms.

  • MuladdMacro.jl

    This package contains a macro for converting expressions to use `muladd` calls and fused multiply-add (FMA) operations for high performance in the SciML scientific machine learning ecosystem.

  • But if what you want is automatic FMA, then why carry every other possible behavior along with it? Just because you want FMA, suddenly NaNs are turned into Infs, subnormal numbers are flushed to zero, sin(x) is handled inaccurately at small values, and so on. To me that's painting numerical handling with far too broad a brush. FMA only increases numerical accuracy, it never decreases it, so bundling it with unsafe transformations leaves you uncertain whether the result has become more or less accurate.

    For reference, to handle this well we use MuladdMacro.jl, a semantic transformation that turns x*y+z into muladd expressions. It does not recurse into functions, so it does not change the definitions of the functions called inside the macro scope.

    https://github.com/SciML/MuladdMacro.jl

    This is something that will always increase both performance and accuracy, because it is targeted to do only a transformation with that property: muladd in Julia applies an FMA only when hardware FMA exists, so it effectively never falls back to software FMA emulation.

  • herbie

    Optimize floating-point expressions for accuracy

  • Here is a really cool automatic tool that rewrites floating-point expressions to be more accurate: https://herbie.uwplse.org/

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Suggest a related project

Related posts

  • std::clamp generates less efficient assembly than std::min(max, std::max(min, v))

    4 projects | news.ycombinator.com | 16 Jan 2024
  • Herbie: Find and fix floating-point accuracy problems

    1 project | news.ycombinator.com | 28 Nov 2023
  • Towards a New SymPy

    5 projects | news.ycombinator.com | 8 Sep 2023
  • Q: Automated floating point error analysis

    1 project | /r/compsci | 14 Feb 2023
  • The comment with the most upvotes decides what language I write my finals in this year will be.

    14 projects | /r/ProgrammerHumor | 17 Sep 2022