Jupyter Notebook Transformers

Open-source Jupyter Notebook projects categorized as Transformers

Top 23 Jupyter Notebook Transformer Projects

  • nn

    🧑‍🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠

    Project mention: Can't remember name of website that has explanations side-by-side with code | /r/learnmachinelearning | 2023-03-28

    Hey are you talking about https://nn.labml.ai/ ?

  • Transformers-Tutorials

    This repository contains demos I made with the Transformers library by HuggingFace.

    Project mention: FLaNK Stack Weekly for 07August2023 | dev.to | 2023-08-07

  • BigDL

    Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max). A PyTorch LLM library that seamlessly integrates with HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, ModelScope, etc.

    Project mention: BigDL-LLM: running LLM on your laptop using INT4 | news.ycombinator.com | 2023-07-03
  • pytorch-sentiment-analysis

    Tutorials on getting started with PyTorch and TorchText for sentiment analysis.

  • Promptify

    Prompt engineering | Prompt versioning | Use GPT or other prompt-based models to get structured output. Join our Discord for prompt engineering, LLMs, and other recent research.

    Project mention: Promptify 2.0: More Structured, More Powerful LLMs with Prompt-Optimization, Prompt-Engineering, and Structured Json Parsing with GPT-n Models! 🚀 | /r/ArtificialInteligence | 2023-07-31

    First up, a huge thank you for making Promptify a hit with over 2.3k stars on GitHub! 🌟
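
The core idea Promptify advertises, getting structured output from a prompt-based model, can be sketched generically. This is an illustrative simplification, not Promptify's actual API: the prompt template, the mocked model reply, and the `parse_structured_output` helper below are all assumptions for the sake of the example.

```python
import json

# Hypothetical prompt template that instructs the model to reply with strict JSON.
PROMPT = (
    "Extract the named entities from the text below and respond ONLY with "
    'JSON of the form {"entities": [{"text": ..., "type": ...}]}.\n\n'
    "Text: Apple hired Jane Doe in Paris."
)

def parse_structured_output(raw_response: str) -> dict:
    """Parse the model's reply as JSON and minimally validate its shape."""
    result = json.loads(raw_response)
    if "entities" not in result:
        raise ValueError("model reply missing 'entities' key")
    return result

# Mocked model reply standing in for a real GPT call.
raw = ('{"entities": [{"text": "Apple", "type": "ORG"}, '
       '{"text": "Jane Doe", "type": "PERSON"}]}')
parsed = parse_structured_output(raw)
print(parsed["entities"])
```

In practice a library like Promptify wraps exactly this loop, templating the prompt, calling the model, and validating the parsed JSON, so the caller works with Python objects rather than free text.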

  • adapters

    A Unified Library for Parameter-Efficient and Modular Transfer Learning

  • hands-on-llms

    🦖 Learn about LLMs, LLMOps, and vector DBs for free by designing, training, and deploying a real-time financial advisor LLM system ~ source code + video & reading materials

    Project mention: Where to start | /r/mlops | 2023-09-13

    There are 3 courses that I usually recommend to folks looking to get into MLE/MLOps who already have a technical background.

    The first is a higher-level look at the MLOps process, common challenges and solutions, and other important project considerations. It's one of Andrew Ng's courses from DeepLearning.AI, but you can audit it for free if you don't need the certificate:

    - Machine Learning in Production

    For a more hands-on, in-depth tutorial, I'd recommend this course from NYU (free on GitHub), including slides, scripts, and full-code homework:

    - Machine Learning Systems

    And the title basically says it all, but this is also a really good one:

    - Hands-on Train and Deploy ML

    Pau Labarta, who made that last course, actually has a series of good (free) hands-on courses on GitHub. If you're interested in getting started with LLMs (since every company in the world seems to be clamoring for them right now), this course just came out from Pau and Paul Iusztin:

    - Hands-on LLMs

    For LLMs I also like this DLAI course (which includes prompt engineering too):

    - Generative AI with LLMs

    It can also be helpful to start learning how to use MLOps tools and platforms. I'll suggest Comet because I work there and am most familiar with it (and also because it's a great tool). Cloud and DevOps skills are also helpful. Make sure you're comfortable with git, and make sure you're learning how to actually deploy your projects. Good luck! :)

  • ZoeDepth

    Metric depth estimation from a single image

    Project mention: Software 3D scanner. Free on Prusa Printables | /r/prusa3d | 2023-04-27
  • transformers-interpret

    Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.

  • mup

    maximal update parametrization (µP)

    Project mention: Announcing xAI July 12th 2023 | /r/xdotai | 2023-07-13

    Our team is led by Elon Musk, CEO of Tesla and SpaceX. We have previously worked at DeepMind, OpenAI, Google Research, Microsoft Research, Tesla, and the University of Toronto. Collectively we contributed some of the most widely used methods in the field, in particular the Adam optimizer, Batch Normalization, Layer Normalization, and the discovery of adversarial examples. We further introduced innovative techniques and analyses such as Transformer-XL, Autoformalization, the Memorizing Transformer, Batch Size Scaling, and μTransfer. We have worked on and led the development of some of the largest breakthroughs in the field including AlphaStar, AlphaCode, Inception, Minerva, GPT-3.5, and GPT-4.
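
The µTransfer result behind mup is that hyperparameters tuned on a narrow model transfer to a wider one once per-layer quantities are rescaled with width. A minimal sketch of the commonly cited rule for hidden weight matrices under Adam (learning rate ∝ 1/width) is below; this is an illustration of the scaling idea, not the mup library's actual API, and `mup_hidden_lr` is a hypothetical helper name.

```python
def mup_hidden_lr(base_lr: float, base_width: int, width: int) -> float:
    """Rescale an Adam learning rate tuned at base_width for a wider model.

    Under µP, the learning rate for hidden weight matrices scales like
    1/width, so the tuned value is multiplied by base_width / width.
    """
    return base_lr * base_width / width

# Tune at width 256, then transfer to width 4096 without a new search:
tuned_lr = 3e-4
wide_lr = mup_hidden_lr(tuned_lr, base_width=256, width=4096)
print(wide_lr)
```

The practical payoff is that an expensive hyperparameter search can be run once on a small proxy model and reused at the target width, which is why µTransfer is attractive for large-scale pretraining.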

  • course-content-dl

    Neuromatch Academy (NMA) deep learning course

  • Transformer-MM-Explainability

    [ICCV 2021- Oral] Official PyTorch implementation for Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Including examples for DETR, VQA.

  • gpt2bot

    Your new Telegram buddy powered by transformers

  • adaptnlp

    An easy-to-use natural language processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.

  • optimum-intel

    🤗 Optimum Intel: Accelerate inference with Intel optimization tools

  • browser-ml-inference

    Edge Inference in Browser with Transformer NLP model

  • diffusers-interpret

    Diffusers-Interpret 🤗🧨🕵️‍♀️: Model explainability for 🤗 Diffusers. Get explanations for your generated images.

  • uni2ts

    Unified Training of Universal Time Series Forecasting Transformers

    Project mention: Moirai: A Time Series Foundation Model for Universal Forecasting | news.ycombinator.com | 2024-03-25

    Code is available! https://github.com/SalesforceAIResearch/uni2ts

  • ocrpy

    OCR, Archive, Index and Search: an implementation-agnostic OCR framework.

  • language-planner

    Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"

  • mgpt

    Multilingual Generative Pretrained Model

  • HugsVision

    HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision.

  • converse

    Conversational text analysis using various NLP techniques

NOTE: The open-source projects on this list are ordered by number of GitHub stars. The number of mentions indicates how often a repo was mentioned in the last 12 months, or since we started tracking (Dec 2020). The latest post mention was on 2024-03-25.

Index

What are some of the best open-source Transformer projects in Jupyter Notebook? This list will help you:

Project Stars
1 nn 46,249
2 Transformers-Tutorials 7,259
3 BigDL 4,870
4 pytorch-sentiment-analysis 4,185
5 Promptify 2,958
6 adapters 2,352
7 hands-on-llms 2,110
8 ZoeDepth 1,853
9 transformers-interpret 1,191
10 mup 1,139
11 course-content-dl 701
12 Transformer-MM-Explainability 695
13 gpt2bot 424
14 adaptnlp 414
15 optimum-intel 299
16 browser-ml-inference 294
17 diffusers-interpret 249
18 uni2ts 217
19 ocrpy 217
20 language-planner 213
21 mgpt 192
22 HugsVision 188
23 converse 176