[R] Google Replaces BERT Self-Attention with Fourier Transform: 92% Accuracy, 7 Times Faster on GPUs

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning
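For readers arriving without the paper: the model in the headline is FNet, which swaps each self-attention sublayer of a BERT-style encoder for an unparameterized 2D discrete Fourier transform (one DFT along the sequence dimension, one along the hidden dimension), keeping only the real part. Because the transform has no learned weights, the token-mixing step is just an FFT, which is where the speedup comes from. A minimal PyTorch sketch of that mixing step (the surrounding LayerNorm and feed-forward blocks stay as in BERT):

```python
import torch
import torch.nn as nn

class FourierMixing(nn.Module):
    """Token-mixing sublayer in the style of FNet: a 2D DFT over the
    sequence and hidden dimensions, keeping only the real part.
    No learned parameters, unlike a self-attention sublayer."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden)
        return torch.fft.fft2(x, dim=(-2, -1)).real

# Drop-in replacement for the attention sublayer of an encoder block.
x = torch.randn(8, 128, 768)   # (batch, seq_len, hidden)
mixed = FourierMixing()(x)     # same shape: (8, 128, 768)
```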

  • aphantasia

    CLIP + FFT/DWT/RGB = text to image/video

  • One of the public Colab notebooks using CLIP relies on Fourier transforms for image generation, and it really is very fast (a minimal sketch of the idea follows below). https://github.com/eps696/aphantasia
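aphantasia's actual parameterization is more elaborate than this, but the core trick is to optimize spectral coefficients rather than raw pixels and decode them with an inverse FFT; low frequencies dominate early, which tends to make the optimization fast and smooth. A minimal sketch (the target-image loss is a runnable stand-in; aphantasia instead scores the decoded image against a text prompt with CLIP):

```python
import torch

H, W = 256, 256

# Optimize spectral coefficients instead of pixels. Last axis holds
# (real, imag) pairs for the half-spectrum of a real 2D FFT.
spectrum = torch.randn(3, H, W // 2 + 1, 2) * 0.01
spectrum.requires_grad_(True)

def decode(spec: torch.Tensor) -> torch.Tensor:
    # Interpret the last axis as a complex number and invert the real 2D FFT.
    img = torch.fft.irfft2(torch.view_as_complex(spec.contiguous()), s=(H, W))
    return torch.sigmoid(img)  # squash to [0, 1] pixel range

# Stand-in objective so the sketch runs end to end; in aphantasia the
# loss would be CLIP similarity between decode(spectrum) and a prompt.
target = torch.rand(3, H, W)
opt = torch.optim.Adam([spectrum], lr=0.05)
for _ in range(50):
    loss = (decode(spectrum) - target).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```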

  • sentencepiece

    Unsupervised text tokenizer for Neural Network-based text generation.

  • I don't think this is valid in the context of this article. The input tokens are not one-hot encodings of the input characters; they are learned embeddings over a 32K SentencePiece vocabulary (§4.1.1). Since "STOP" and "SPOT" are probably fairly common words in the training data, it's safe to assume each word gets its own unique vector rather than being represented by the four "subword units" of its character decomposition (see the sketch below).
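The claim is easy to check with the sentencepiece library. The model file below is hypothetical (FNet's actual 32K vocabulary isn't distributed here), but any similarly sized SentencePiece model trained on a large English corpus should behave the same way:

```python
import sentencepiece as spm

# Hypothetical model file standing in for the 32K vocabulary from §4.1.1.
sp = spm.SentencePieceProcessor(model_file="en_32k.model")

# If "STOP" and "SPOT" are common enough in the training corpus, each
# comes back as a single learned token, not four character-level pieces.
print(sp.encode("STOP", out_type=str))  # e.g. ['▁STOP']
print(sp.encode("SPOT", out_type=str))  # e.g. ['▁SPOT']
```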

