iris vs returnn-experiments

| | iris | returnn-experiments |
|---|---|---|
| Mentions | 8 | 2 |
| Stars | 757 | 152 |
| Growth | - | 1.3% |
| Activity | 1.9 | 6.4 |
| Latest commit | 3 months ago | 7 months ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
iris
From Deep to Long Learning
Yeah, after all, these LLMs are predicting one sequence of tokens from another sequence of tokens, and the tokens could be anything; it just "happens" that text carries the most knowledge and is the easiest to input. Then there are image, sound, and video, but tokens could also be learned from world experience in RL (a rough sketch follows below):
Transformers are Sample-Efficient World Models:
https://github.com/eloialonso/iris#transformers-are-sample-e...
- What is the next booming topic in Deep RL?
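To make the "tokens could be anything" point concrete, here is a rough sketch of the IRIS-style setup: frames are quantized into discrete codes (in IRIS via a discrete autoencoder) and a causal transformer predicts the next codes. This is not the actual IRIS code; `VOCAB`, `TOKENS_PER_OBS`, and all layer sizes are illustrative assumptions, and the action tokens that IRIS interleaves with observation tokens are left out.

```python
import torch
import torch.nn as nn

VOCAB = 512          # size of the discrete observation codebook (assumption)
TOKENS_PER_OBS = 16  # tokens per frame after quantization (assumption)

class TokenWorldModel(nn.Module):
    """GPT-style autoregressive model over observation tokens."""
    def __init__(self, d_model=256, n_layers=4, n_heads=4, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, 4 * d_model,
                                           batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, VOCAB)

    def forward(self, tokens):  # tokens: (batch, seq) of int64 codes
        seq = tokens.shape[1]
        pos = torch.arange(seq, device=tokens.device)
        x = self.embed(tokens) + self.pos(pos)
        # causal mask so each position only attends to the past
        mask = torch.triu(torch.full((seq, seq), float("-inf")), diagonal=1)
        x = self.blocks(x, mask=mask)
        return self.head(x)  # next-token logits, shape (batch, seq, VOCAB)

# toy usage: a batch of 2 rollouts, 3 frames of 16 tokens each
tokens = torch.randint(0, VOCAB, (2, 3 * TOKENS_PER_OBS))
logits = TokenWorldModel()(tokens)
```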
Most Popular AI Research Sept 2022 - Ranked Based On Total GitHub Stars
Transformers are Sample Efficient World Models https://github.com/eloialonso/iris https://arxiv.org/abs/2209.00588v1
- [D] Most Popular AI Research Sept 2022 - Ranked Based On GitHub Stars
Minimal PyTorch re-implementation of GPT
This is actually a pretty neat, self-contained implementation that can super easily be extended beyond stereotypical natural language models, for example to create world models for video games [1] or to create robot models that learn to imitate from large, chaotic human demonstration data [2] (disclaimer: I'm an author on the second one). Basically, GPT (or minGPT) models are EXCELLENT sequence modelers, almost to the point where you can throw any sensible sequence data at them and hope to get interesting results, as long as you don't overfit (see the sketch after this post).
Even though I have only been working on machine learning for around six years, it's crazy to see how fast the landscape has changed recently, including diffusion models and transformers. It's not too much to say that we might expect more major breakthroughs by the end of this decade, and end up in a place we can't even imagine right now!
[1] https://github.com/eloialonso/iris
- Transformers are Sample Efficient World Models
- [R] Transformers are Sample Efficient World Models: With the equivalent of only two hours of gameplay in the Atari 100k benchmark, IRIS outperforms humans on 10 out of 26 games and surpasses MuZero.
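To make "throw any sensible sequence data at them" concrete: the only real prerequisite is a tokenizer for your data. Below is a minimal sketch that quantizes a continuous signal into tokens; the uniform binning, bin count, and value range are arbitrary illustrative choices, not anything from minGPT or the cited papers.

```python
import torch

def to_tokens(signal, n_bins=256, lo=-1.0, hi=1.0):
    """Uniformly quantize values in [lo, hi] into integer tokens 0..n_bins-1."""
    signal = signal.clamp(lo, hi)
    return ((signal - lo) / (hi - lo) * (n_bins - 1)).long()

# a fake 1-D "robot trajectory": 1000 timesteps of a noisy sine wave
t = torch.linspace(0, 20, 1000)
trajectory = torch.sin(t) + 0.05 * torch.randn(1000)

tokens = to_tokens(trajectory)     # shape (1000,), dtype int64
# next-token training pairs, exactly like language modeling:
x, y = tokens[:-1], tokens[1:]     # feed x, predict y
```

The resulting (x, y) pairs can then go into any GPT-style next-token trainer exactly as if they were text.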
returnn-experiments
Show HN: WhisperFusion – Ultra-low latency conversations with an AI chatbot
The code is all released already. You find it here: https://github.com/rwth-i6/returnn-experiments/tree/master/2...
This is TensorFlow-based, but I also have another PyTorch-based implementation that is already public (inside our other repo, i6_experiments). It's currently not so easy to set up, but I'm working on a simpler pipeline in PyTorch.
We don't have the models online yet, but we can upload them later. I'm not sure how useful they are outside of research, though, as they are specifically for those research tasks (Librispeech, Tedlium) and probably don't perform too well on other data.
Minimal PyTorch re-implementation of GPT
This works for an architecture that has been well tuned and studied before, like LSTM or Transformer.
Once you do research on the model, testing out things, it often tends to become such a kwargs monster in many frameworks.
Having everything (relevant) in one file (including the hyperparameters, right in the config file itself) allows you to copy the file for every experiment and modify it in place. This avoids the kwargs mess. But then the config files are very complex and can become messy in other ways (especially for research projects). Example: https://github.com/rwth-i6/returnn-experiments/blob/master/2...
Such an approach makes things much more flexible and does not mess with the baseline code. As you say, it's more like an evolutionary, DNA-like approach, where you then tend to do crossovers with other evolved, well-performing configs, etc.
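A minimal sketch of that copy-the-whole-file pattern (the layout and names here are illustrative, not the actual RETURNN config schema):

```python
# exp_baseline.py -- one self-contained experiment file, copied and edited
# in place per experiment. Names are illustrative, not the actual RETURNN
# config schema.
import torch.nn as nn

# hyperparameters live next to the model definition, so diffing two
# experiment files shows the complete delta between them
learning_rate = 1e-3
batch_size = 32
hidden_dim = 512
num_layers = 6

def build_model(vocab_size: int) -> nn.Module:
    # the entire architecture is defined here; to try a variant,
    # copy this file to exp_variant.py and edit it directly
    layers = [nn.Embedding(vocab_size, hidden_dim)]
    for _ in range(num_layers):
        layers += [nn.Linear(hidden_dim, hidden_dim), nn.ReLU()]
    layers += [nn.Linear(hidden_dim, vocab_size)]
    return nn.Sequential(*layers)
```

Diffing exp_baseline.py against a copied exp_variant.py then shows the complete difference between the two experiments.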
What are some alternatives?
setfit - Efficient few-shot learning with Sentence Transformers
minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Text2Light - [SIGGRAPH Asia 2022] Text2Light: Zero-Shot Text-Driven HDR Panorama Generation
WhisperFusion - WhisperFusion builds upon the capabilities of WhisperLive and WhisperSpeech to provide seamless conversations with an AI.
block-recurrent-transformer-pytorch - Implementation of Block Recurrent Transformer - Pytorch
machine-learning-articles - 🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
motion-diffusion-model - The official PyTorch implementation of the paper "Human Motion Diffusion Model"
CSL - [COLING 2022] CSL: A Large-scale Chinese Scientific Literature Dataset 中文科学文献数据集
VToonify - [SIGGRAPH Asia 2022] VToonify: Controllable High-Resolution Portrait Video Style Transfer
storydalle
rliable - [NeurIPS'21 Outstanding Paper] Library for reliable evaluation on RL and ML benchmarks, even with only a handful of seeds.