NAI stepped in as the coolest AI company around here. (All your content is yours, and everything is encrypted so they can't read it. Which might sound obvious, right? No kidding. But that's far from the norm in the AI industry.) They used GPT-Neo first, not GPT-2. There was an epic movement going on at the same time, if not before: the ongoing legend of EleutherAI. Then came GPT-J; then, out of nowhere, Facebook dropped their own open-source model, Fairseq; then GPT-NeoX finished training.
-
gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Assuming they would ever want to release it in the first place: the NAI team spent months finetuning every model, so handing out weights would be giving away the core of their business. Beyond that, the VRAM requirements alone for NeoX (Krake minus the finetune) would make it unrunnable by almost anyone who hasn't built and budgeted hardware specifically for it. The "almost" is in there only because crazy people and hardware reviewers exist.
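To put the VRAM claim in rough numbers, here's a back-of-the-envelope sketch (my own arithmetic, not from the thread): it counts only the memory needed to hold GPT-NeoX-20B's weights at a few common precisions, ignoring activations and KV cache, which add more on top.

```python
# Rough VRAM estimate for just the weights of a 20B-parameter model.
# Assumptions (mine): inference only; activation/KV-cache overhead ignored.

def weight_vram_gib(n_params: float, bytes_per_param: int) -> float:
    """GiB needed just to store the model weights."""
    return n_params * bytes_per_param / 2**30

n = 20e9  # GPT-NeoX-20B parameter count
for label, width in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    print(f"{label}: ~{weight_vram_gib(n, width):.1f} GiB")
```

Even at fp16 that's around 37 GiB for weights alone, which already rules out every consumer GPU of the era without multi-GPU model parallelism.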
Related posts
- [R] You can't train GPT-3 on a single GPU, but you *can* tune its hyperparameters on one
- Sequence-to-Sequence Toolkit Written in Python
- Show HN: LlamaGym – fine-tune LLM agents with online reinforcement learning
- Lightning AI Studios – A persistent GPU cloud environment
- Nvidia's 900 tons of GPU muscle bulks up server market, slims down wallets