haveged
Entropy daemon ![Continuous Integration](https://github.com/jirka-h/haveged/workflows/Continuous%20Integration/badge.svg)
> When are we going to get a distributed peer-to-peer randomness service?
The problem with that is trust: if you need “cryptographically secure” random numbers, how do you trust the sources? One of them could poison the well with bad entropy and increase, by a small but significant amount, the chance of guessing your keys.
OK, so you could combine data from many sources, but that would add latency, so it's not an option where performance matters. And even then, if someone gains control of a significant portion of the distributed system (just by standing up lots of hosts), the issue persists.
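One reason combining sources helps at all: if you mix them through a cryptographic hash, the output stays unpredictable as long as at least one input is honest, even if the others are attacker-controlled. A minimal sketch (names like `combine` are illustrative, not from any particular library):

```python
import hashlib
import os

def combine(*sources: bytes) -> bytes:
    """Hash-combine entropy inputs into one 32-byte seed.

    Length-prefixing each input avoids ambiguity between, e.g.,
    combine(b"ab", b"c") and combine(b"a", b"bc"). The result is
    unpredictable if ANY single input is unpredictable.
    """
    h = hashlib.sha256()
    for s in sources:
        h.update(len(s).to_bytes(8, "big"))
        h.update(s)
    return h.digest()

good = os.urandom(32)          # one honest local source
poisoned = b"\x00" * 32        # hypothetical attacker-controlled source
seed = combine(good, poisoned)  # still a safe 32-byte seed
```

The catch discussed above remains: waiting on many remote sources costs latency, and the hash doesn't help if *all* the inputs you gathered are controlled by one party.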
You could also run a battery of statistical tests, but again that is work that will hurt performance. And if you are going to that sort of effort anyway, you could set up your own random sources (a couple of active Linux boxes; on older Linux kernels running haveged, while on newer ones that isn't needed: https://github.com/jirka-h/haveged/issues/57) and use those tests to make sure those sources are statistically sound.
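To give a flavour of what such a statistical test looks like, here is the simplest one, a monobit frequency check (the first test in suites like NIST SP 800-22): over a large sample, the fraction of set bits should be very close to 0.5. This is only a sketch of one sanity check, not a full test suite:

```python
import os

def monobit_ratio(data: bytes) -> float:
    """Fraction of 1-bits in the sample; ~0.5 for good entropy."""
    ones = sum(bin(b).count("1") for b in data)
    return ones / (len(data) * 8)

# 1 MiB from the kernel CSPRNG as a stand-in for a source under test
sample = os.urandom(1 << 20)
ratio = monobit_ratio(sample)
print(f"set-bit ratio: {ratio:.5f}")
```

Passing this check is necessary but nowhere near sufficient: a counter run through a weak hash can pass monobit while being trivially predictable, which is why real suites run dozens of tests.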
So such a service isn't really needed, and where it might be needed it isn't likely to be trusted. It could exist, but as a plaything rather than a serious service.
On my little home server (not a particularly up-to-date CPU) running kernel 5.10, I can pull >3 Gbit/s from /dev/random. Heck, the Pi 400 that is currently my router can hand out ~230 Mbit/s of entropy.
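Those throughput figures are easy to reproduce. A rough benchmark, assuming a Linux box (reading /dev/urandom here, since on kernels >= 5.6 /dev/random draws from the same pool and no longer blocks once seeded; your numbers will vary with CPU):

```python
import time

def throughput_mbit(path: str = "/dev/urandom",
                    total: int = 64 << 20,
                    chunk: int = 1 << 20) -> float:
    """Read `total` bytes in `chunk`-sized reads; return Mbit/s."""
    read = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while read < total:
            read += len(f.read(min(chunk, total - read)))
    elapsed = time.perf_counter() - start
    return (read * 8) / elapsed / 1e6

print(f"{throughput_mbit():.0f} Mbit/s")
```

Python adds its own overhead, so a `dd if=/dev/urandom of=/dev/null bs=1M count=1024` run will usually report somewhat higher numbers for the same machine.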
-
Can anyone recommend between Librandombytes and libsodium randombytes?
https://github.com/jedisct1/libsodium/tree/master/src/libsod...
-