| | oppia | transformers |
|---|---|---|
| Mentions | 4 | 176 |
| Stars | 5,616 | 125,369 |
| Growth | 0.1% | 1.7% |
| Activity | 9.8 | 10.0 |
| Latest commit | 7 days ago | 2 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
oppia
-
23 issues to grow yourself as an exceptional open-source Python expert
Repo: https://github.com/oppia/oppia
-
Day 1 - Open Source
So tomorrow I will start looking at the code and try to get myself a task assigned. This whole searching and setup took me 2-3 hrs; maybe it's a lot, but it takes what it takes :) This issue looks like something I can start with! Let's see if we can make good progress tomorrow!
-
Ask HN: How Do You Learn?
In the exploration I linked, the only types of interaction offered to the learner were either ok/proceed, or 'answer this textual multiple choice question'. This may make it seem like Oppia doesn't do much more than software for interactive fiction.
BUT Oppia has lots of other interaction types: https://github.com/oppia/oppia/tree/develop/extensions/inter...
For example, you can input music notes: https://github.com/oppia/oppia/issues/4842
Or ask the learner to enter a fraction, or to sort some objects.
transformers
-
AI enthusiasm #9 - A multilingual chatbot
transformers is a package by Hugging Face that helps you interact with models on the HF Hub (GitHub).
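As a hedged illustration of that one-line access to Hub models (the model ID, prompt, and settings below are placeholder choices, not from the post):

```python
from transformers import pipeline

# Download a model from the HF Hub and wrap it in a ready-to-use pipeline.
# "gpt2" is only an example ID; any compatible Hub model works here.
generator = pipeline("text-generation", model="gpt2")

result = generator("Bonjour, je suis", max_new_tokens=20)
print(result[0]["generated_text"])
```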
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options:
The Flax ecosystem
https://github.com/google/flax?tab=readme-ov-file
or dm-haiku
https://github.com/google-deepmind/dm-haiku
are some of the best-developed communities in the JAX AI field.
Perhaps the "trax" repo? https://github.com/google/trax
Some HF examples: https://github.com/huggingface/transformers/tree/main/exampl...
Sadly, much of the work seems to be proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
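For a feel of the Flax option mentioned above, here is a minimal model-definition sketch (the MLP and layer sizes are arbitrary illustrations, not tied to any of the linked repos):

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    """A tiny two-layer network, just to show Flax's module style."""
    @nn.compact
    def __call__(self, x):
        x = nn.Dense(features=128)(x)    # hidden layer (size is arbitrary)
        x = nn.relu(x)
        return nn.Dense(features=10)(x)  # output layer

model = MLP()
x = jnp.ones((1, 784))                         # dummy input
params = model.init(jax.random.PRNGKey(0), x)  # initialize parameters
logits = model.apply(params, x)                # forward pass
```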
-
Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding
The HuggingFace transformers library already has support for a similar method called prompt lookup decoding that uses the existing context to generate an n-gram model: https://github.com/huggingface/transformers/issues/27722
I don't think it would be that hard to switch it out for a pretrained ngram model.
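For context, switching prompt lookup decoding on in transformers is a single argument to `generate`; a minimal sketch, assuming a recent transformers version and an illustrative model choice:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")           # illustrative model
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Repetitive context gives the n-gram lookup something to match against.
prompt = "The quick brown fox jumps over the lazy dog. The quick brown"
inputs = tok(prompt, return_tensors="pt")

# prompt_lookup_num_tokens enables prompt lookup decoding: candidate
# continuations are copied from n-gram matches in the existing context,
# then verified by the model, so the output is unchanged (lossless).
out = model.generate(**inputs, prompt_lookup_num_tokens=3, max_new_tokens=20)
print(tok.decode(out[0], skip_special_tokens=True))
```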
-
AI enthusiasm #6 - Finetune any LLM you want
Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check their work and give them a star on GitHub, if you please ❤️
-
Schedule-Free Learning - A New Way to Train
* Superconvergence + LR range finder + Fast AI's Ranger21 optimizer was the go-to combination for CNNs, and it worked fabulously well, but on transformers the learning rate range finder said 1e-3 was the best, whilst 1e-5 actually worked better. However, the 1-cycle learning rate schedule stuck. https://github.com/huggingface/transformers/issues/16013
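To make the 1-cycle reference concrete, here is a minimal PyTorch sketch of the one-cycle policy (the toy model, step count, and LR values are assumptions for illustration; 1e-3 echoes the range-finder value above):

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(10, 2)                        # toy stand-in model
opt = torch.optim.AdamW(model.parameters(), lr=1e-5)

# One-cycle policy: the LR warms up to max_lr, then anneals back down
# over the course of the whole run.
sched = OneCycleLR(opt, max_lr=1e-3, total_steps=1000)

for _ in range(1000):
    opt.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()    # dummy loss
    loss.backward()
    opt.step()
    sched.step()                                      # advance the LR schedule
```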
-
Gemma doesn't suck anymore - 8 bug fixes
Thanks! :) I'm pushing them into transformers, pytorch-gemma and collaborating with the Gemma team to resolve all the issues :)
The RoPE fix should already be in transformers 4.38.2: https://github.com/huggingface/transformers/pull/29285
My main PR for transformers which fixes most of the issues (some still left): https://github.com/huggingface/transformers/pull/29402
- HuggingFace Transformers: Qwen2
- HuggingFace Transformers Release v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2
- HuggingFace: Support for the Mixtral Moe
-
Paris-Based Startup and OpenAI Competitor Mistral AI Valued at $2B
If you want to tinker with the architecture Hugging Face has a FOSS implementation in transformers: https://github.com/huggingface/transformers/blob/main/src/tr...
If you want to reproduce the training pipeline, you couldn't do that even if you wanted to because you don't have access to thousands of A100s.
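For reference, loading that implementation through the transformers auto classes looks roughly like this (the Hub ID and prompt are assumptions for illustration; the truncated link above points at the actual model code):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # assumed Hub ID for the base model
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tok("The capital of France is", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=10)
print(tok.decode(out[0], skip_special_tokens=True))
```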
What are some alternatives?
futurecoder - 100% free and interactive Python course for beginners
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
TkinterMapView - A Python Tkinter widget to display tile-based maps like OpenStreetMap or Google Satellite Images.
sentence-transformers - Multilingual Sentence & Image Embeddings with BERT
athens - Athens is a knowledge graph for research and notetaking. Athens is open-source, private, extensible, and community-driven.
llama - Inference code for Llama models
Learning-Python - This repo is made for the Learning Python blog course; all relevant material for the course is provided. For any suggestions, feedback, or doubts, feel free to contact me via LinkedIn or Gmail.
transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
Basic-Algorithms - Basic algorithms and data structures written in different programming languages
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
zim-plugin-instantsearch - Search as you type in Zim, in a similar manner to OneNote's Ctrl+E.
huggingface_hub - The official Python client for the Huggingface Hub.