| | long-range-arena | hn-search |
|---|---|---|
| Mentions | 6 | 1,637 |
| Stars | 684 | 525 |
| Growth | 1.0% | 0.2% |
| Activity | 0.0 | 2.9 |
| Latest commit | 5 months ago | 6 months ago |
| Language | Python | TypeScript |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
long-range-arena
- The Secret Sauce behind 100K context window in LLMs: all tricks in one place
https://github.com/google-research/long-range-arena
- [R] The Annotated S4: Efficiently Modeling Long Sequences with Structured State Spaces
The Structured State Space for Sequence Modeling (S4) architecture is a new approach to very long-range sequence modeling tasks for vision, language, and audio, showing a capacity to capture dependencies over tens of thousands of steps. Especially impressive are the model’s results on the challenging Long Range Arena benchmark, showing an ability to reason over sequences of 16,000+ elements with high accuracy.
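As a companion to the excerpt above, here is a minimal NumPy sketch of the discretized linear state-space recurrence that S4 builds on. This is not the actual S4 code: the full model adds a structured (HiPPO-initialized) state matrix and an FFT-based convolution kernel for efficiency, both omitted here. It only illustrates how a discretized SSM carries state across a very long input.

```python
# Minimal sketch of the linear state-space recurrence behind S4.
# NOT the full S4 model (no HiPPO structure, no FFT convolution kernel);
# it just shows how a discretized SSM steps through a long sequence.
import numpy as np

def discretize(A, B, dt):
    """Bilinear (Tustin) discretization of continuous parameters A, B."""
    n = A.shape[0]
    I = np.eye(n)
    inv = np.linalg.inv(I - (dt / 2) * A)
    Ad = inv @ (I + (dt / 2) * A)
    Bd = inv @ (dt * B)
    return Ad, Bd

def ssm_scan(Ad, Bd, C, u):
    """Compute y_k = C x_k with x_k = Ad x_{k-1} + Bd u_k over a 1-D input u."""
    x = np.zeros(Ad.shape[0])
    ys = []
    for u_k in u:
        x = Ad @ x + (Bd * u_k).ravel()
        ys.append(C @ x)
    return np.array(ys)

# Toy example: a roughly stable random state matrix, 16k-step input.
rng = np.random.default_rng(0)
n = 8
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = rng.standard_normal(n)
Ad, Bd = discretize(A, B, dt=0.01)
y = ssm_scan(Ad, Bd, C, rng.standard_normal(16_000))
print(y.shape)  # (16000,)
```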
- [D] Is there a repo in which many lightweight self-attention mechanisms are introduced?
1.1 Long Range Arena: A Benchmark for Efficient Transformers. From the authors of the above, who proposed a benchmark for modeling long-range interactions. It also includes a repository.
- [R] Google’s H-Transformer-1D: Fast One-Dimensional Hierarchical Attention With Linear Complexity for Long Sequence Processing
- [2107.11906] H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences
- [R][D] Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Zhou et al., AAAI 2021 Best Paper. ProbSparse self-attention reduces complexity to O(n log n), a generative-style decoder obtains the sequence output in one step, and self-attention distilling further reduces memory.
I think the paper is written in a clear style, and I like that the authors included many experiments, including hyperparameter effects, ablations, and extensive baseline comparisons. One thing I would have liked is a comparison of their Informer against more efficient transformers (they compared only against LogTrans and Reformer) on the LRA (https://github.com/google-research/long-range-arena) benchmark.
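As a rough illustration of the ProbSparse idea mentioned above, here is a hedged NumPy sketch (function name and structure are my own): score each query by how far its maximum attention score sits above its mean, run softmax attention only for the top u ≈ c·ln(L) queries, and give the rest the mean of V. The actual Informer implementation additionally samples keys when estimating this measure to reach O(n log n); this toy version computes it exactly for clarity.

```python
# Toy sketch of ProbSparse self-attention (Informer, Zhou et al., AAAI 2021).
# The sparsity measure M(q, K) = max - mean of the scaled dot products picks
# out "active" queries; lazy queries fall back to the mean of V. The real
# implementation samples keys when estimating M; here it is computed exactly.
import numpy as np

def probsparse_attention(Q, K, V, c=5):
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                 # (L, L) -- exact, for clarity
    M = scores.max(axis=1) - scores.mean(axis=1)  # sparsity measure per query
    u = min(L, max(1, int(c * np.log(L))))        # number of active queries
    top = np.argsort(-M)[:u]

    out = np.tile(V.mean(axis=0), (L, 1))         # lazy queries get mean(V)
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))  # row-wise softmax
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V                              # active queries attend fully
    return out

rng = np.random.default_rng(0)
Q = rng.standard_normal((512, 64))
print(probsparse_attention(Q, Q, Q).shape)        # (512, 64)
```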
hn-search
- Rule of Thumb: Anything that looks fancy is not worth your time
- Ads with psychological tricks
Truly good websites have around 2 facts per 10-word sentence, and cut straight to the chase. Also: good websites give you the names of all their competitors/alternative websites before showing their own stuff, and give you further reading.
Right now the world of technology is supposedly more innovative than ever, but somehow Wikipedia (https://www.wikipedia.org/) and Hacker News Search (https://hn.algolia.com/) beat billion-dollar search engines.
Articles written decades ago are still unsurpassed in terms of quality and ease of understanding, but the best that modern websites can do is textbook explanations. It is time society graduated from boilerplate buzzword textbook culture.
Now the gems of the internet are slowly being buried beneath mountains of trash.
If something sounds like boilerplate, it isn't good enough.
Don't bother saying something that has been said before, and better.
- What makes a translation great
>for more detail: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
Oh, I see. We actually discussed Pound about four years ago - just a little back and forth about the ABC of Reading: https://news.ycombinator.com/item?id=24196681
>What's your explanation of why Pound went Fascist?
I'm not sure I particularly have one; I haven't read any of his longer political or cultural (i.e. non-literary) works. I just think it's silly to correlate an approach to translation that you dislike with fascism. Especially as I'm not sure it even makes sense on its own terms: I can only read your comment as 'lazy translator? Figures that he would be a fascist', but if I imagine the type of translation a fascist would approve of, the approach I picture is fastidious, fussy, concerned with fidelity to the point of stickler-ishness. (Isn't that from where we get 'grammar nazi'?)
And oh, well, since you ask I'll take a shy at it: my vague sense is that he became fascist because he saw a society in decline as it became more and more a sham society: opulence without virtue, power without vigour, money no longer tied to actually existing goods. (Of course, all of this shades easily into antisemitism.) He saw fascism as the answer; it's easier to see in retrospect that it wasn't.
- Zed Decoded: Linux When? – Zed Blog
"multiplayer notepad" goes back 15 years at least - https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu... notepad&sort=byDate&type=comment
It was used on a popular website that opened a text document anyone viewing could type into, but I can't remember the name. That became a thing in Google Docs, Microsoft Office, Floobits, and lots of self-hosted and cloned sites.
- Louis Rossmann: YouTube's Legal Team sent me a letter [video]
If you see a post that ought to have been moderated but hasn't been, the likeliest explanation is that we didn't see it. You can help by flagging it or emailing us at hn@ycombinator.com.
https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
- An Oil Price-Fixing Conspiracy Caused 27% of All Inflation in 2021
Ok, but please don't post unsubstantive comments to Hacker News.
I understand the reason for repeating these sentiments—it's the same reason why they get upvoted to the top of threads*—but repetition of this kind is what we're most trying to avoid here.
https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...
https://news.ycombinator.com/newsguidelines.html
* I've marked this one off topic now.
- Validating app for manufacturers enhancing process reliability and efficiency
I was looking for it in the guidelines. There are a couple of conventions for postings. Consider a few prior examples: [https://hn.algolia.com/?q=show+hn]
- Show HN: Hacker Search – A semantic search engine for Hacker News
Yeah, there are only three stories coming up from the site search:
https://hn.algolia.com/?q=postgres+clustering
Only one is semantically correct; the others pick up the wrong sense of clustering (i.e. k-means instead of multi-master writes).
But yeah, if one doesn't test the hard cases, how does one know it preserves semantics :D
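A quick way to reproduce this kind of test case programmatically is to hit the public search API that backs hn.algolia.com (documented at https://hn.algolia.com/api); the query below is the "postgres clustering" example from the comment above.

```python
# Query the public HN Search (Algolia) API and print the top story hits,
# to eyeball which results actually match the intended sense of the query.
import requests

resp = requests.get(
    "https://hn.algolia.com/api/v1/search",
    params={"query": "postgres clustering", "tags": "story"},
    timeout=10,
)
resp.raise_for_status()
for hit in resp.json()["hits"][:5]:
    print(hit.get("points"), hit.get("title"))
```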
- Longevity of Recordable CDs, DVDs and Blu-Rays
- The Scientific Method Part 5: Illusions, Delusions, and Dreams
Like dismissing the work of Feyerabend or Wittgenstein, seemingly without having read either:
https://hn.algolia.com/?dateRange=pastMonth&page=0&prefix=tr...
- Any Google Analytics Alternatives?
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
What are some alternatives?
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
duckduckgo-locales - Translation files for https://duckduckgo.com
attention-is-all-you-need-pytorch - A PyTorch implementation of the Transformer model in "Attention is All You Need".
v - Simple, fast, safe, compiled language for developing maintainable software. Compiles itself in <1s with zero library dependencies. Supports automatic C => V translation. https://vlang.io
HJxB - Continuous-Time/State/Action Fitted Value Iteration via Hamilton-Jacobi-Bellman (HJB)
parser - 📜 Extract meaningful content from the chaos of a web page
jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
readability - A standalone version of the readability lib
flaxmodels - Pretrained deep learning models for Jax/Flax: StyleGAN2, GPT2, VGG, ResNet, etc.
yq - Command-line YAML, XML, TOML processor - jq wrapper for YAML/XML/TOML documents
LFattNet - Attention-based View Selection Networks for Light-field Disparity Estimation
milkdown - 🍼 Plugin driven WYSIWYG markdown editor framework.