automl
gpt-3
DISCONTINUED
| | automl | gpt-3 |
|---|---|---|
| Mentions | 7 | 39 |
| Stars | 6,125 | 9,406 |
| Growth | 0.5% | - |
| Activity | 5.7 | 3.5 |
| Latest commit | 4 months ago | over 3 years ago |
| Language | Jupyter Notebook | - |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
automl
-
Android QR Code Detection with TensorFlow Lite
EfficientDet-D0 has accuracy comparable to YOLOv3.
-
[R] Google AI Introduces Two New Families of Neural Networks Called ‘EfficientNetV2’ and ‘CoAtNet’ For Image Recognition
Code for https://arxiv.org/abs/2104.00298 found: https://github.com/google/automl/efficientnetv2
gpt-3
-
[D] Dumb question: is GPT3 model open-sourced?
And from skimming their GH page, it seems it'd be costly to host as well
-
[D] In AI, is bigger always better? Article in Nature; Bing summary and comment
Found relevant code at https://github.com/openai/gpt-3 + all code implementations here
-
SerpApi Changelog: December, 2022
This is a Telegram bot that lets you chat with the chatGPT language model using your local browser. The bot uses Playwright to run chatGPT in Chromium, and can parse code and text, as well as send messages. It also includes a /draw command that allows you to generate pictures using stable diffusion. More features are coming soon.
-
DeepMind’s New Language Model, Chinchilla (70B Parameters), Which Outperforms GPT-3
It implies our models are wrong.
Consider that a human adolescence is ~9.46×10^6 minutes and a fast speaking rate is ~200 words/minute. That sets an upper bound of 1.9 billion words heard during adolescence, i.e., human adults are trained on a corpus of fewer than 1.9B words.
To some extent, more data can offset worse models, but I don't think that's the regime we're currently in. GPT-3 was trained on (among other languages) 181 billion English words - about 100 times more words than a human will hear by the time they reach adulthood. How is the human brain able to achieve a higher level of success with 1% of the data?
1. https://github.com/openai/gpt-3/blob/master/dataset_statisti...
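The back-of-envelope arithmetic above can be checked directly. This is a sketch of my own; the ~18-year adolescence and 200 words/minute figures are the commenter's assumptions, and the 181B English-word count comes from the GPT-3 dataset statistics linked above:

```python
# Upper bound on words a human hears by adulthood (commenter's assumptions).
MINUTES_OF_ADOLESCENCE = 18 * 365 * 24 * 60  # ~9.46e6 minutes in ~18 years
WORDS_PER_MINUTE = 200                       # fast speaking rate

human_words = MINUTES_OF_ADOLESCENCE * WORDS_PER_MINUTE
gpt3_english_words = 181e9                   # from GPT-3 dataset statistics

print(f"human upper bound: {human_words / 1e9:.2f}B words")     # ~1.89B
print(f"GPT-3 / human ratio: {gpt3_english_words / human_words:.0f}x")  # ~96x
```

So GPT-3 saw roughly two orders of magnitude more English text than a human plausibly hears by adulthood, which is the gap the comment is pointing at.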
-
Dall-E 2
For discussion's sake:
- BFN reached out to A16Z, Worldcoin, and Khosla Ventures; they largely declined to comment, which would mean that at least one person probably had a bit of runway from at least when the requests for comment were submitted. So yeah, you're probably right.
- Going by the GitHub repos for GPT-2 and GPT-3, those may have been hard launches:
Feb 14 2019, predating the first press for GPT-2 by a few days (was probably made public Feb 14 though) - https://github.com/openai/gpt-2/commit/c2dae27c1029770cea409...
May 28 2020, timed alongside the press news for GPT-3 - https://github.com/openai/gpt-3/commit/12766ba31aa6de490226e...
- Would it really have to be a conspiracy? Sounds like only one person would have to target a specific date or date range, and without really giving a reason.
One of the things that puts a hole in my own thinking here is that Sam Altman's name isn't really tied to the Dall-E 2 release. It's just OpenAI, and all the press around Sam's name today still almost exclusively surfaces Worldcoin stories. So if this was actually intended to bury another story, Sam's name would have to have been included in all the press blasts to be successful.
-
[R] Google AI Introduces Two New Families of Neural Networks Called ‘EfficientNetV2’ and ‘CoAtNet’ For Image Recognition
Code for https://arxiv.org/abs/2005.14165 found: https://github.com/openai/gpt-3
- [N] OpenAI has released the encoder and decoder for the discrete VAE used for DALL-E
- What open secret in your profession would surprise those outside it?
What are some alternatives?
dalle-mini - DALL·E Mini - Generate images from a text prompt
DALL-E - PyTorch package for the discrete VAE used for DALL·E.
DALLE-mtf - Open-AI's DALL-E for large scale training in mesh-tensorflow.
stylegan2-pytorch - Simplest working implementation of StyleGAN2, a state-of-the-art generative adversarial network, in PyTorch. Enabling everyone to experience disentanglement
v-diffusion-pytorch - v objective diffusion inference code for PyTorch.
simple-faster-rcnn-pytorch - A simplified implementation of Faster R-CNN that replicates the performance of the original paper
tensorrtx - Implementation of popular deep learning networks with TensorRT network definition API
dalle-2-preview
gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"
jukebox - Code for the paper "Jukebox: A Generative Model for Music"
TFLiteClassification - TensorFlow Lite Image Classification Python Implementation
FLAML - A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.