pwnagotchi-custom-plugins vs transformers

| | pwnagotchi-custom-plugins | transformers |
|---|---|---|
| Mentions | 5 | 178 |
| Stars | 89 | 125,741 |
| Growth | - | 2.0% |
| Activity | 0.0 | 10.0 |
| Last commit | over 3 years ago | 3 days ago |
| Language | Python | Python |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pwnagotchi-custom-plugins
-
So I'm trying to get the twitter.py plugin working for my pwnagotchi, but I can't get it to start. Is there anyone with knowledge of or help for this plugin?
Yes, it has internet access, as I was able to git clone it and auto-update the pwnagotchi. I am now using https://github.com/dadav/pwnagotchi-custom-plugins/blob/master/twitter.py, as it is updated, but I still can't get it to work.
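A common reason a plugin "can't start" is that it was never enabled in the config. A minimal sketch of what that could look like in /etc/pwnagotchi/config.toml; the twitter option names here are assumptions based on the old core twitter plugin, so check the plugin source for the exact keys used by dadav's version:

```toml
# Assumed keys -- verify against the plugin source before using.
main.plugins.twitter.enabled = true
main.plugins.twitter.consumer_key = "YOUR_CONSUMER_KEY"
main.plugins.twitter.consumer_secret = "YOUR_CONSUMER_SECRET"
main.plugins.twitter.access_token_key = "YOUR_ACCESS_TOKEN_KEY"
main.plugins.twitter.access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"
```

After editing, restarting the pwnagotchi service and checking its log usually shows why a plugin failed to load.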
- Plug-ins
-
GPIO buttons. Can you program the pwnagotchi to have GPIO buttons? I would like to have an on/off button, a screen off/blank button, and lastly a switch between auto and manual mode. For auto and manual, either one button that toggles between them or two separate buttons. Potentially a Bluetooth on/off too.
You might want to take a look at gpio_shutdown.py and gpio_buttons.py here https://github.com/dadav/pwnagotchi-custom-plugins ;)
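For a sense of how such plugins work: pwnagotchi plugins are Python classes that hook lifecycle events, and GPIO handling is typically done with RPi.GPIO edge-detect callbacks. A minimal sketch in that spirit (the pin number, plugin name, and shutdown behavior are assumptions, not the exact code from that repo):

```python
import logging
import subprocess

import RPi.GPIO as GPIO
import pwnagotchi.plugins as plugins


class GPIOShutdown(plugins.Plugin):
    __author__ = 'example'
    __version__ = '0.1.0'
    __license__ = 'GPL3'
    __description__ = 'Shut down the unit when a GPIO button is pressed.'

    def on_loaded(self):
        # Pin 21 is an arbitrary example; read it from self.options in practice.
        pin = int(self.options.get('gpio', 21))
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
        # Fire on the falling edge (button pulls the pin to ground).
        GPIO.add_event_detect(pin, GPIO.FALLING,
                              callback=self._shutdown, bouncetime=600)
        logging.info("gpio_shutdown: watching pin %d", pin)

    def _shutdown(self, channel):
        logging.warning("gpio_shutdown: button pressed, shutting down")
        subprocess.run(['shutdown', '-h', 'now'])
```

A mode-toggle or screen-blank button would work the same way, with the callback performing that action instead of shutting down.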
- Help! Pwnagotchi Plugin ApFaker Not Working
-
retrying access point
So add the plugin repositories to your config as shown in the snippet below, then run sudo pwnagotchi plugins update.
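For readability, here is the same config as it would appear in your config file (typically /etc/pwnagotchi/config.toml; adjust the path if yours differs):

```toml
main.custom_plugin_repos = [
  "https://github.com/evilsocket/pwnagotchi-plugins-contrib/archive/master.zip",
  "https://github.com/dadav/pwnagotchi-custom-plugins/archive/master.zip",
  "https://github.com/daddy-makes-stuff-and-things/pwnagotchi_plugins/archive/master.zip"
]
```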
transformers
-
XLSTM: Extended Long Short-Term Memory
Fascinating work, very promising.
Can you summarise how the model in your paper differs from this one?
https://github.com/huggingface/transformers/issues/27011
-
AI enthusiasm #9 - A multilingual chatbot📣🈸
transformers is a package by Hugging Face that helps you interact with models on the HF Hub (GitHub).
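A minimal sketch of that interaction using the pipeline API (the task and checkpoint here are just examples):

```python
from transformers import pipeline

# Downloads the model from the HF Hub on first use and caches it locally.
# "distilbert-base-uncased-finetuned-sst-2-english" is an example checkpoint.
classifier = pipeline("sentiment-analysis",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

print(classifier("Multilingual chatbots are fun to build!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```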
-
Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options: the Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) and dm-haiku (https://github.com/google-deepmind/dm-haiku) were some of the best-developed communities in the JAX AI field; a minimal Flax sketch follows below.
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
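To make the Flax mention concrete, a minimal flax.linen module sketch (layer sizes and inputs are arbitrary):

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class MLP(nn.Module):
    hidden: int = 64

    @nn.compact
    def __call__(self, x):
        x = nn.Dense(self.hidden)(x)
        x = nn.relu(x)
        return nn.Dense(1)(x)


model = MLP()
# init builds the parameter pytree; apply runs the forward pass with it.
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8)))
out = model.apply(params, jnp.ones((4, 8)))
print(out.shape)  # (4, 1)
```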
-
Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding
The HuggingFace transformers library already has support for a similar method called prompt lookup decoding, which uses the existing context to build an n-gram model: https://github.com/huggingface/transformers/issues/27722
I don't think it would be that hard to swap it out for a pretrained n-gram model.
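In recent transformers versions, prompt lookup decoding is exposed as a generate() argument; a minimal sketch (the checkpoint and token counts are arbitrary examples):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any causal LM works; gpt2 is just a small example checkpoint.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tok("The quick brown fox jumps over the lazy dog. The quick brown",
             return_tensors="pt")

# prompt_lookup_num_tokens enables prompt lookup decoding: candidate tokens
# are copied from n-grams already present in the prompt, then verified.
out = model.generate(**inputs, max_new_tokens=20, prompt_lookup_num_tokens=3)
print(tok.decode(out[0], skip_special_tokens=True))
```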
-
AI enthusiasm #6 - Finetune any LLM you want💡
Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check out their work and give them a star on GitHub, if you please ❤️
-
Schedule-Free Learning – A New Way to Train
* Superconvergence + LR range finder + Fast AI's Ranger21 optimizer was the go-to combination for CNNs, and it worked fabulously well; but on transformers, the learning-rate range finder said 1e-3 was best, whilst 1e-5 was actually better. However, the one-cycle learning-rate schedule stuck. https://github.com/huggingface/transformers/issues/16013
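For reference, the one-cycle schedule the comment refers to is available in PyTorch; a minimal sketch (the model, max_lr, and step count are arbitrary, with max_lr=1e-3 mirroring what a range finder might suggest):

```python
import torch

model = torch.nn.Linear(10, 2)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)

steps = 1000
# OneCycleLR warms the LR up to max_lr, then anneals it back down.
sched = torch.optim.lr_scheduler.OneCycleLR(opt, max_lr=1e-3, total_steps=steps)

for step in range(steps):
    loss = model(torch.randn(8, 10)).pow(2).mean()  # dummy loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    sched.step()  # advance the schedule once per optimizer step
```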
-
Gemma doesn't suck anymore – 8 bug fixes
Thanks! :) I'm pushing them into transformers, pytorch-gemma and collabing with the Gemma team to resolve all the issues :)
The RoPE fix should already be in transformers 4.38.2: https://github.com/huggingface/transformers/pull/29285
My main PR for transformers which fixes most of the issues (some still left): https://github.com/huggingface/transformers/pull/29402
- HuggingFace Transformers: Qwen2
- HuggingFace Transformers Release v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2
- HuggingFace: Support for the Mixtral Moe