[R] Google’s H-Transformer-1D: Fast One-Dimensional Hierarchical Attention With Linear Complexity for Long Sequence Processing

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • long-range-arena

    Long Range Arena for Benchmarking Efficient Transformers

NOTE: The number of mentions on this list counts mentions in common posts plus user-suggested alternatives; a higher number therefore indicates a more popular project.

Related posts

  • [R] The Annotated S4: Efficiently Modeling Long Sequences with Structured State Spaces

    1 project | /r/MachineLearning | 16 Jan 2022
  • [D] Is there a repo on which many light-weight self-attention mechanism are introduced?

    2 projects | /r/MachineLearning | 26 Dec 2021
  • [2107.11906] H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences

    1 project | /r/MachineLearning | 4 Aug 2021
  • Show HN: The “tl;dr” of Recent Transformer Papers

    1 project | news.ycombinator.com | 15 Aug 2021
  • Show HN: “Tl;Dr” on Transformers Papers

    1 project | news.ycombinator.com | 12 Aug 2021