A 20-year-old pocket calculator with a fingernail-sized solar cell will leave your brain in the dust at arithmetic. A computer of the same age can destroy you in complex scientific computations or in chess, given a few minutes to think. I'm using a decade-old laptop to run a Tencent artificial neural network that can fix misshapen (but already beautiful) portraits in seconds – the original images were generated in 30 seconds by something like an A100, consuming less power than a modern gaming PC. A Stable Diffusion network is qualitatively superior, has human-level visual imagination and artistic ability, and fits on a 3090. Gato has roughly the same hardware requirement and is, in a sense, already a weak AGI, capable of piloting a robot appendage as well as dealing with text and images and a bunch of other tasks...

Those tasks are getting increasingly humanlike, but the compute requirement is not growing anywhere near as fast as your idea implies. Crucially, your appeal to Moore's law is misguided: lately we're seeing real successes in making ML more algorithmically efficient, so progress is not coming solely from throwing more compute at the problem. For a concrete example, consider FlashAttention, which computes exact attention without ever materializing the full quadratic score matrix.

Corporations will easily afford supercomputers, so your specific claim is irrelevant – but why shouldn't we expect that a competent human-level agent can be run on a small basement server even today, or on a high-end gaming PC in a decade? Our intuitive sense of a task's complexity apparently has no relation to its actual compute requirement; at best, it indicates the mismatch between the task's format and our survival-specialized substrate. A vastly better substrate than typical modern chips is possible, but it's not clear current chips aren't already all-around superior to the human brain.
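To make the algorithmic-efficiency point concrete, here is a minimal NumPy sketch of the core idea behind FlashAttention – streaming over key/value blocks with an online softmax, so only running statistics are kept instead of the full N×N score matrix. This is an illustrative single-head, unbatched toy (the real thing is a fused GPU kernel), and all names and the block size are my own:

```python
import numpy as np

def attention_naive(Q, K, V):
    # Standard attention: materializes the full N x N score matrix,
    # so extra memory is O(N^2) in sequence length.
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def attention_tiled(Q, K, V, block=16):
    # FlashAttention-style streaming: visit K/V in blocks, keeping only a
    # running row-max m and softmax denominator l -> O(N) extra memory,
    # while producing exactly the same output as the naive version.
    N, d = Q.shape
    O = np.zeros_like(Q, dtype=np.float64)
    m = np.full(N, -np.inf)   # running max of scores per query row
    l = np.zeros(N)           # running softmax denominator per query row
    for j in range(0, K.shape[0], block):
        Kj, Vj = K[j:j + block], V[j:j + block]
        S = Q @ Kj.T / np.sqrt(d)            # only N x block scores in memory
        m_new = np.maximum(m, S.max(axis=-1))
        scale = np.exp(m - m_new)            # rescale old partial sums
        P = np.exp(S - m_new[:, None])
        l = l * scale + P.sum(axis=-1)
        O = O * scale[:, None] + P @ Vj
        m = m_new
    return O / l[:, None]
```

Both functions return the same result; the tiled version just never holds more than one block of scores at a time, which is where the memory (and, on GPUs, bandwidth) savings come from.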
-
Midjourney + GFPGAN, something like 10 watt-hours.