gpt-2-output-dataset
mesh-transformer-jax
| | gpt-2-output-dataset | mesh-transformer-jax |
|---|---|---|
| Mentions | 11 | 52 |
| Stars | 1,882 | 6,213 |
| Growth | 1.2% | - |
| Last commit | 5 months ago | over 1 year ago |
| Activity | 2.9 | 0.0 |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gpt-2-output-dataset
- Being accused of using ChatGPT in my assignment, what should I do?
Especially: "Our classifier is not fully reliable. In our evaluations on a “challenge set” of English texts, our classifier correctly identifies 26% of AI-written text (true positives) as “likely AI-written,” while incorrectly labeling human-written text as AI-written 9% of the time (false positives). Our classifier’s reliability typically improves as the length of the input text increases. Compared to our previously released classifier, this new classifier is significantly more reliable on text from more recent AI systems." Many other classifiers are similar, e.g. the posts below.
- Has OpenAI made GPT-2 available for download? I mean the (pre-)trained model, not the source code. How large is it in terms of MB of traffic and MB on disk?
Links found by search: https://github.com/openai/gpt-2-output-dataset (a dataset? I want GPT, the pre-trained model).
- GPTZero case study discovers it's only accurate on less than 50% of text
- [P] I launched “CatchGPT”, a supervised model trained with millions of text examples, to detect GPT created content
- Detect ChatGPT Generated Content
- Meet the villain:
Source: the literal source code and paper by the original creators of the detector that most of these knockoff detectors are based on.
- Originality.ai is a HUGE scam.
OpenAI published a detector themselves that seems to be quite good: https://github.com/openai/gpt-2-output-dataset/tree/master/detector
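For reference, the detector in that repo is a fine-tuned RoBERTa classifier, and a checkpoint is mirrored on the Hugging Face Hub. A minimal sketch of querying it via transformers; the model ID and label names come from the Hub mirror rather than the repo's own scripts, so treat them as assumptions:

```python
# Minimal sketch: scoring text with the RoBERTa-based GPT-2 output detector.
# The model ID below is assumed to be the Hub mirror of the checkpoint in
# the linked repo; label names may differ across mirrors.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="openai-community/roberta-base-openai-detector",
)
print(detector("The quick brown fox jumps over the lazy dog."))
# e.g. [{'label': 'Real', 'score': 0.97}]  # 'Real' = human, 'Fake' = generated
```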
- [Hobby Scuffles] Week of December 19, 2022
Here is OpenAI's own detector, but it's not impossible to beat with some fairly basic techniques like automatic paraphrasing.
- Meta announces a GPT3-size language model you can download
- GPT-3 output detection
To a certain extent, detection worked for GPT-2 because of its smaller training dataset of just 40 GB. Even for that model, researchers running detection found accurate results only in the:
mesh-transformer-jax
- Large Language Models: Comparing Gen2/Gen3 Models (GPT-3, GPT-J, MT5 and More)
GPT-J is an LLM case study with two goals: training an LLM on a data source containing unique material, and using the training framework Mesh Transformer JAX to achieve high training efficiency through parallelization. There is no research paper about GPT-J, but its GitHub pages provide the model, various checkpoints, and the complete training source code.
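The parallelization point is the heart of the framework: JAX makes it straightforward to shard a computation across accelerators. A minimal sketch of the idea using jax.pmap; this illustrates device parallelism in general, not the repo's actual sharding code, which uses finer-grained model parallelism:

```python
# Illustration of device parallelism in JAX (not mesh-transformer-jax's
# actual sharding): pmap compiles the function once and runs one shard of
# the leading batch axis on each available device in parallel.
import jax
import jax.numpy as jnp

@jax.pmap
def step(x):
    # Stand-in for a transformer forward pass.
    return jnp.tanh(x) * 2.0

n_devices = jax.device_count()
batch = jnp.ones((n_devices, 4))   # one shard per device on axis 0
print(step(batch).shape)           # (n_devices, 4)
```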
- [R] Parallel Attention and Feed-Forward Net Design for Pre-training and Inference on Transformers
This idea has already been proposed in ViT-22B and GPT-J-6B.
- Show HN: Finetune LLaMA-7B on commodity GPUs using your own text
- [D] An Instruct Version of GPT-J Using Stanford Alpaca's Dataset
Sure. Here's the repo I used for the fine-tuning: https://github.com/kingoflolz/mesh-transformer-jax. I used 5 epochs, and apart from that I kept the default parameters in the repo.
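One practical wrinkle: the repo's training configs are, as far as I can tell, step-based rather than epoch-based, so "5 epochs" has to be converted into a step count. A back-of-the-envelope sketch; the batch size is an assumed placeholder, while 52,002 is the published size of the Stanford Alpaca dataset:

```python
# Converting "5 epochs" into a step count for a step-based training config.
# 52,002 is the published Stanford Alpaca dataset size; the batch size is
# an assumed placeholder, not a value from the repo.
num_examples = 52_002
batch_size = 16   # assumed global batch size
epochs = 5

steps_per_epoch = num_examples // batch_size   # 3250
total_steps = steps_per_epoch * epochs
print(total_steps)                             # 16250
```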
- Boss wants me to use ChatGPT for work, but I refuse to input my personal phone number. Any advice?
- Let's build GPT: from scratch, in code, spelled out by Andrej Karpathy
You can skip to step 4 using something like GPT-J as far as I understand: https://github.com/kingoflolz/mesh-transformer-jax#links
The pretrained model is already available.
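Since the pretrained weights are public, "skipping to step 4" can be as simple as pulling the checkpoint through Hugging Face transformers instead of the JAX codebase. A minimal sketch; EleutherAI/gpt-j-6b is the public Hub checkpoint, the prompt is arbitrary, and note the full-precision model needs roughly 24 GB of memory (6B parameters at 4 bytes each):

```python
# Minimal sketch: text generation with the public GPT-J checkpoint via
# transformers, rather than training it with mesh-transformer-jax.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6b"   # public Hub checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("JAX is a library for", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```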
- Best coding model?
The GitHub repo suggests it may be possible to change the number of checkpoint shards to make it run on a GPU.
- Ask HN: What language models can I fine-tune at home?
- Self-hosted / open-source ChatGPT alternative?
GPT-J, which uses mesh-transformer-jax: https://github.com/kingoflolz/mesh-transformer-jax
- GPT-J, an open-source alternative to GPT-3
They hinted at it in the screenshot, but the goods are linked from the https://6b.eleuther.ai page: https://github.com/kingoflolz/mesh-transformer-jax#gpt-j-6b (Apache 2)
What are some alternatives?
metaseq - Repo for external large-scale work
DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
oxen-release - Lightning fast data version control system for structured and unstructured machine learning datasets. We aim to make versioning datasets as easy as versioning code.
tensorflow - An Open Source Machine Learning Framework for Everyone
gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
KoboldAI-Client
alpaca-lora - Instruct-tune LLaMA on consumer hardware
Finetune_LLMs - Repo for fine-tuning causal LLMs
cedille-ai - ✒️ Cedille is a large French language model (6B), released under an open-source license