CodeGenX vs transformers

| | CodeGenX | transformers |
|---|---|---|
| Mentions | 7 | 176 |
| Stars | 511 | 125,369 |
| Growth | 0.0% | 1.7% |
| Activity | 2.6 | 10.0 |
| Latest commit | almost 2 years ago | 1 day ago |
| Language | JavaScript | Python |
| License | Mozilla Public License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
CodeGenX
-
Top OpenAI Tools, Examples & Use Cases
GitHub link: https://github.com/DeepGenX/CodeGenX
-
[P] CodeParrot 🦜: Train your own CoPilot from scratch!
Looks good, I am currently working on a similar open-source project called CodeGenX. It's a fine-tuned version of GPT-J. Can I ask how long this took to train, and what hardware was used?
-
CodeGenX, a code generation system powered by GPT-J
Hey Everyone!
We are delighted to announce the first version of CodeGenX. CodeGenX is a free and open-source code-generation system (similar to GitHub Copilot) powered by a fine-tuned version of GPT-J.
You can find details on installation and instructions for usage here: https://docs.deepgenx.com. Our GitHub repository can be found here: https://github.com/DeepGenX/CodeGenX.
I think it is important to identify what CodeGenX can and cannot do, so we know what to expect. We have found that CodeGenX works well for common use cases that occur repeatedly in the training set. It understands context and outputs code formatted like your previous code, using the appropriate variables. It does not shine, however, where logical or mathematical computation is needed. This does not mean it understands no logic or math at all; it is simply a weak point of the model.
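To make that distinction concrete, here is a hypothetical illustration; this is not actual CodeGenX output, just the kind of split described above:

```python
import json

# Hypothetical illustration, not actual CodeGenX output.
# Repetitive boilerplate like this matches the "common use case" strength:
def read_json(path):
    with open(path) as f:        # a plausible completion: context-aware,
        return json.load(f)      # consistent with the surrounding style

# By contrast, prompts that require genuine logical or mathematical
# computation (e.g. "return the 1000th prime number") are the weak point
# described above.
```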
Here are some links that have additional information:
Website: https://deepgenx.com
- CodeGenX, Another Open Source Alternative to GitHub Copilot
-
CodeGenX, an open-source GitHub Copilot alternative
CodeGenX is an extension that will help you write your code better and faster. It uses a fine-tuned version of GPT-J to generate code that's readable and efficient. You can find more details and a usage guide on our website; if you're interested in the source code, take a look at the GitHub repo. If you have any questions or feedback, you can contact us via Discord or by replying to this post.
-
Looking for a volunteer for some help on the website of an Open-Source Project.
Hope you're doing well. We (a group of volunteers) are currently working on an open-source project called CodeGenX. The project is a code generation extension for VS Code (similar to GitHub Copilot) based on a custom fine-tuned version of GPT-J. The problem is that most of us working on the project are Machine Learning Engineers/Data Scientists and do not have much experience in web development.
transformers
-
AI enthusiasm #9 - A multilingual chatbot📣🈸
transformers is a package by Hugging Face that helps you interact with models on the HF Hub (GitHub)
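As a minimal sketch of that interaction (the pipeline downloads a default checkpoint from the Hub on first use):

```python
from transformers import pipeline

# Pulls a default sentiment-analysis model from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")
print(classifier("transformers makes Hub models one import away."))
# [{'label': 'POSITIVE', 'score': ...}]
```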
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options.
The Flax ecosystem
https://github.com/google/flax?tab=readme-ov-file
or dm-haiku
https://github.com/google-deepmind/dm-haiku
were some of the best-developed communities in the JAX AI field (a minimal Flax example follows below)
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
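For reference, a minimal Flax sketch, assuming jax and flax are installed; the module and layer sizes here are arbitrary:

```python
import jax
import jax.numpy as jnp
from flax import linen as nn

class MLP(nn.Module):
    hidden: int
    out: int

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.hidden)(x))
        return nn.Dense(self.out)(x)

model = MLP(hidden=64, out=10)
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 32)))  # initialize
y = model.apply(params, jnp.ones((4, 32)))                     # forward pass
```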
-
Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding
The HuggingFace transformers library already has support for a similar method called prompt lookup decoding that uses the existing context to generate an ngram model: https://github.com/huggingface/transformers/issues/27722
I don't think it would be that hard to switch it out for a pretrained ngram model.
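A minimal sketch of using that feature; the prompt_lookup_num_tokens argument is available in recent transformers releases, and gpt2 here is just a small stand-in model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Repetition in the prompt gives the ngram lookup something to match.
prompt = "The quick brown fox jumps over the lazy dog. The quick brown"
inputs = tok(prompt, return_tensors="pt")

# prompt_lookup_num_tokens enables prompt lookup decoding: candidate
# continuations are copied from the existing context and verified in parallel.
out = model.generate(**inputs, max_new_tokens=20, prompt_lookup_num_tokens=3)
print(tok.decode(out[0], skip_special_tokens=True))
```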
-
AI enthusiasm #6 - Finetune any LLM you want💡
Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check out their work and give them a star on GitHub, if you please ❤️
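In that spirit, a minimal fine-tuning sketch with the Trainer API; the model and dataset here are placeholder choices, not necessarily the ones used in those tutorials:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

# A 1% slice of IMDB keeps the demo quick; tokenize on the fly.
ds = load_dataset("imdb", split="train[:1%]").map(
    lambda ex: tok(ex["text"], truncation=True), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=ds,
    tokenizer=tok,  # enables dynamic padding via the default collator
)
trainer.train()
```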
-
Schedule-Free Learning – A New Way to Train
* Superconvergence + LR range finder + fast.ai's Ranger21 optimizer was the go-to setup for CNNs, and worked fabulously well, but on transformers the learning rate range finder said 1e-3 was best, whilst 1e-5 actually worked better. However, the 1cycle learning rate schedule stuck (sketched below). https://github.com/huggingface/transformers/issues/16013
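A sketch of that 1cycle schedule using PyTorch's built-in OneCycleLR; the model, data, and step count are placeholders:

```python
import torch

model = torch.nn.Linear(10, 2)
opt = torch.optim.AdamW(model.parameters(), lr=1e-5)
# LR warms up to max_lr, then anneals back down over the cycle.
sched = torch.optim.lr_scheduler.OneCycleLR(opt, max_lr=1e-3, total_steps=1000)

for step in range(1000):
    loss = model(torch.randn(16, 10)).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
    sched.step()
```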
-
Gemma doesn't suck anymore – 8 bug fixes
Thanks! :) I'm pushing them into transformers, pytorch-gemma and collaborating with the Gemma team to resolve all the issues :)
The RoPE fix should already be in transformers 4.38.2: https://github.com/huggingface/transformers/pull/29285
My main PR for transformers which fixes most of the issues (some still left): https://github.com/huggingface/transformers/pull/29402
- HuggingFace Transformers: Qwen2
- HuggingFace Transformers Release v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2
- HuggingFace: Support for the Mixtral Moe
-
Paris-Based Startup and OpenAI Competitor Mistral AI Valued at $2B
If you want to tinker with the architecture Hugging Face has a FOSS implementation in transformers: https://github.com/huggingface/transformers/blob/main/src/tr...
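A hedged sketch of loading that implementation; the checkpoint name is the public Mistral 7B release, the download is large, and device_map="auto" assumes the accelerate package is installed:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mistralai/Mistral-7B-v0.1"  # public checkpoint on the Hub
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name, torch_dtype=torch.float16, device_map="auto")

inputs = tok("Mistral is", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=20)
print(tok.decode(out[0], skip_special_tokens=True))
```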
If you want to reproduce the training pipeline, you couldn't do that even if you wanted to because you don't have access to thousands of A100s.
What are some alternatives?
django-select2 - This is a Django integration for Select2
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
kube-shell - Kubernetes shell: an integrated shell for working with the Kubernetes CLI
sentence-transformers - Multilingual Sentence & Image Embeddings with BERT
autoComplete.js - Simple autocomplete pure vanilla JavaScript library.
llama - Inference code for Llama models
jQuery-Autocomplete - Ajax Autocomplete for jQuery allows you to easily create autocomplete/autosuggest boxes for text input fields
transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
gpt-3-tailwindcss - GPT-3 Tailwind CSS Code Generator
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
hash - 🚀 The open-source, self-building database. From @hashintel
huggingface_hub - The official Python client for the Hugging Face Hub.