Code for the paper "Language Models are Unsupervised Multitask Learners"
Eventually there will be so much software that writing code will need to be automated by machines. GPT-2 is open source and it's able to write computer programs, although not very well. GPT-3 does it better, and it's given away for free, so imagine how much more advanced the paid offerings and the trade-secret models are. You'd think programmers would move on to a higher level of abstraction, where programmers program the computer programs that do the programming, but that's probably not going to be the case. For example, take a look at the GPT-2 source code (https://github.com/openai/gpt-2/blob/master/src/model.py): it's only a few hundred lines of code.
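To make the "GPT-2 can write code, although not very well" claim concrete, here is a minimal sketch of sampling a code completion from GPT-2. It uses the Hugging Face transformers port of the model rather than the original TensorFlow code in the openai/gpt-2 repo, and the prompt and sampling settings are arbitrary illustrative choices, not anything prescribed by the repo or the paper.

```python
# Minimal sketch: sample a code completion from the Hugging Face port of GPT-2.
# Assumes `pip install transformers torch`; prompt and sampling settings are
# illustrative only, not defaults from the openai/gpt-2 repository.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k sampling so the continuation isn't purely greedy.
output_ids = model.generate(
    **inputs,
    max_length=64,
    do_sample=True,
    top_k=40,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Run as-is with the smallest 124M checkpoint, this tends to produce something that looks like Python but rarely runs correctly, which is consistent with the "although not very well" caveat above.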
I get an incredibly long error simply for trying to install an older version of Numpy
1 project | reddit.com/r/learnprogramming | 9 Jan 2022
Downloaded GPT-2, Encode.py, and Train.py not found.
2 projects | reddit.com/r/GPT3 | 8 Jan 2022
I used AI to generate a new Beatles track from scratch. Behold, "Magic Tree".
1 project | reddit.com/r/beatles | 1 Jan 2022
Any reason not to use the "Mirai" server software?
9 projects | reddit.com/r/admincraft | 29 Dec 2021
Paperback - An Encrypted Paper-based Backup Solution
1 project | reddit.com/r/coolgithubprojects | 29 Dec 2021