Top 23 Jupyter Notebook Transformer Projects
-
nn
🧑‍🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Project mention: Can't remember name of website that has explanations side-by-side with code | /r/learnmachinelearning | 2023-03-28
Hey, are you talking about https://nn.labml.ai/ ?
-
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
-
BigDL
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max). A PyTorch LLM library that seamlessly integrates with HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, ModelScope, etc.
Project mention: BigDL-LLM: running LLM on your laptop using INT4 | news.ycombinator.com | 2023-07-03
-
pytorch-sentiment-analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
-
Promptify
Prompt Engineering | Prompt Versioning | Use GPT or other prompt-based models to get structured output. Join our Discord for Prompt Engineering, LLMs, and other recent research.
Project mention: Promptify 2.0: More Structured, More Powerful LLMs with Prompt-Optimization, Prompt-Engineering, and Structured Json Parsing with GPT-n Models! 🚀 | /r/ArtificialInteligence | 2023-07-31
First up, a huge Thank You for making Promptify a hit with over 2.3k+ stars on GitHub! 🌟
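The structured-output idea behind Promptify can be sketched in plain Python: ask the model for JSON matching a schema, then validate the raw reply before trusting it. This is an illustrative sketch only — the prompt template and helper below are hypothetical, not Promptify's actual API.

```python
import json

# Hypothetical NER prompt asking the model for machine-readable output
# (illustrative only; not Promptify's real prompt format).
NER_PROMPT = (
    "Extract named entities from the text below. Reply ONLY with a JSON "
    'list of {"entity": ..., "type": ...} objects.\n\nText: {text}'
)

def parse_structured_reply(raw_reply: str) -> list:
    """Validate a model reply as a JSON list of entity records."""
    data = json.loads(raw_reply)
    if not isinstance(data, list):
        raise ValueError("expected a JSON list")
    for item in data:
        if not {"entity", "type"} <= set(item):
            raise ValueError(f"missing keys in {item!r}")
    return data

# A mocked model reply stands in for a real GPT call.
mock_reply = '[{"entity": "HuggingFace", "type": "ORG"}]'
entities = parse_structured_reply(mock_reply)
print(entities)
```

The validation step is the point: because the model's reply is parsed and checked rather than used as free text, downstream code can rely on a stable structure.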
-
hands-on-llms
🦖 Learn about LLMs, LLMOps, and vector DBs for free by designing, training, and deploying a real-time financial advisor LLM system ~ source code + video & reading materials
There are 3 courses I usually recommend to folks looking to get into MLE/MLOps who already have a technical background.

The first is a higher-level look at MLOps processes, common challenges and solutions, and other important project considerations. It's one of Andrew Ng's courses from DeepLearning.AI, and you can audit it for free if you don't need the certificate:
- Machine Learning in Production

For a more hands-on, in-depth tutorial, I'd recommend this course from NYU (free on GitHub), including slides, scripts, and full-code homework:
- Machine Learning Systems

And the title basically says it all, but this is also a really good one:
- Hands-on Train and Deploy ML

Pau Labarta, who made that last course, has a series of good (free) hands-on courses on GitHub. If you're interested in getting started with LLMs (since every company in the world seems to be clamoring for them right now), this course just came out from Pau and Paul Iusztin:
- Hands-on LLMs

For LLMs I also like this DLAI course (which includes Prompt Engineering too):
- Generative AI with LLMs

It can also be helpful to start learning how to use MLOps tools and platforms. I'll suggest Comet because I work there and am most familiar with it (and also because it's a great tool). Cloud and DevOps skills are also helpful. Make sure you're comfortable with git, and make sure you're learning how to actually deploy your projects. Good luck! :)
-
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
-
Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.
-
adaptnlp
An easy-to-use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.
-
diffusers-interpret
Diffusers-Interpret 🤗🧨🕵️‍♀️: Model explainability for 🤗 Diffusers. Get explanations for your generated images.
-
uni2ts
Project mention: Moirai: A Time Series Foundation Model for Universal Forecasting | news.ycombinator.com | 2024-03-25
Code is available! https://github.com/SalesforceAIResearch/uni2ts
-
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
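The paper's core trick is grounding: free-form steps generated by an LLM are mapped onto a fixed set of admissible actions the agent can actually execute. A minimal sketch of that mapping, using stdlib string similarity as a stand-in for the paper's sentence-embedding cosine similarity (the action list here is made up for illustration):

```python
import difflib

# Hypothetical admissible actions an embodied agent can execute.
ADMISSIBLE_ACTIONS = [
    "walk to kitchen",
    "open fridge",
    "grab milk",
    "close fridge",
    "pour milk into glass",
]

def ground_step(generated_step: str) -> str:
    """Map a free-form generated step to the closest admissible action.

    The paper scores candidates with sentence-embedding cosine similarity;
    difflib's character-level ratio is used here only as a lightweight
    stand-in for that idea.
    """
    matches = difflib.get_close_matches(
        generated_step, ADMISSIBLE_ACTIONS, n=1, cutoff=0.0
    )
    return matches[0]

# A free-form plan step from an LLM gets grounded to an executable action.
print(ground_step("go to the kitchen"))  # walk to kitchen
```

Grounding is what makes the LLM's "plan" usable: the agent never receives raw generated text, only the nearest action from its known repertoire.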
-
Jupyter Notebook Transformers related posts
- OpenChat 3.2 SUPER is Here!
- Refact LLM: New 1.6B code model reaches 32% HumanEval and is SOTA for the size
- OpenChat: Advancing Open-Source Language Models with Imperfect Data
- Creating a new Finetuned model
- Is this claim meaningful? https://news.ycombinator.com/item?id=36555000
- Bard is getting better at logic and reasoning
- How to annotate compound words to build NER models?
Index
What are some of the best open-source Transformer projects in Jupyter Notebook? This list will help you:
| # | Project | Stars |
|---|---|---|
| 1 | nn | 46,249 |
| 2 | Transformers-Tutorials | 7,259 |
| 3 | BigDL | 4,870 |
| 4 | pytorch-sentiment-analysis | 4,185 |
| 5 | Promptify | 2,958 |
| 6 | adapters | 2,352 |
| 7 | hands-on-llms | 2,110 |
| 8 | ZoeDepth | 1,853 |
| 9 | transformers-interpret | 1,191 |
| 10 | mup | 1,139 |
| 11 | course-content-dl | 701 |
| 12 | Transformer-MM-Explainability | 695 |
| 13 | gpt2bot | 424 |
| 14 | adaptnlp | 414 |
| 15 | optimum-intel | 299 |
| 16 | browser-ml-inference | 294 |
| 17 | diffusers-interpret | 249 |
| 18 | uni2ts | 217 |
| 19 | ocrpy | 217 |
| 20 | language-planner | 213 |
| 21 | mgpt | 192 |
| 22 | HugsVision | 188 |
| 23 | converse | 176 |