-
HMT-pytorch
Official Implementation of "HMT: Hierarchical Memory Transformer for Long Context Language Processing"
Code: https://github.com/OswaldHe/HMT-pytorch
This looks really interesting. I've added the paper to my reading list and look forward to playing with the code. I'm curious to see what kinds of improvements we can get by augmenting Transformers and other generative language/sequence models with this and other mechanisms implementing hierarchical memory.[a]
We sure live in interesting times!
---
[a] In the past, I experimented a little with transformers that had access to external memory using https://github.com/lucidrains/memorizing-transformers-pytorch and also using routed queries with https://github.com/glassroom/heinsen_routing. Both approaches seemed to work, but I never attempted to build any kind of hierarchy on top of them.
-
memorizing-transformers-pytorch
Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in Pytorch
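For a concrete picture of the mechanism, here is a minimal single-head sketch of kNN-memory attention in PyTorch. This is not the library's API: the names are made up for illustration, causal masking is omitted, and an exact brute-force top-k search stands in for the approximate nearest-neighbor index the real implementation uses.

    import torch
    import torch.nn.functional as F

    class KNNMemoryAttention(torch.nn.Module):
        """Single-head attention over the local context plus retrieved memories.

        Illustrative sketch only: real implementations use an approximate
        nearest-neighbor index (e.g. faiss); here the search is exact and
        brute-force to keep the example dependency-free.
        """
        def __init__(self, dim, topk=8):
            super().__init__()
            self.to_qkv = torch.nn.Linear(dim, dim * 3, bias=False)
            self.scale = dim ** -0.5
            self.topk = topk
            # External memory: (key, value) pairs saved from earlier segments.
            self.register_buffer("mem_k", torch.empty(0, dim))
            self.register_buffer("mem_v", torch.empty(0, dim))

        def forward(self, x):  # x: (seq, dim), one unbatched sequence
            q, k, v = self.to_qkv(x).chunk(3, dim=-1)
            if self.mem_k.shape[0] >= self.topk:
                # "Retrieval": top-k memory slots by dot-product similarity.
                idx = (q @ self.mem_k.t()).topk(self.topk, dim=-1).indices
                mk, mv = self.mem_k[idx], self.mem_v[idx]  # (seq, topk, dim)
                # Attend jointly over retrieved memories and the local context.
                k_all = torch.cat([mk, k.unsqueeze(0).expand(len(q), -1, -1)], dim=1)
                v_all = torch.cat([mv, v.unsqueeze(0).expand(len(q), -1, -1)], dim=1)
                attn = F.softmax((q.unsqueeze(1) @ k_all.transpose(1, 2)).squeeze(1) * self.scale, dim=-1)
                out = (attn.unsqueeze(1) @ v_all).squeeze(1)
            else:
                out = F.softmax(q @ k.t() * self.scale, dim=-1) @ v
            # Memorize this segment's keys/values for future retrieval.
            self.mem_k = torch.cat([self.mem_k, k.detach()], dim=0)
            self.mem_v = torch.cat([self.mem_v, v.detach()], dim=0)
            return out

    mod = KNNMemoryAttention(dim=64)
    for segment in torch.randn(4, 128, 64):  # four 128-token segments
        out = mod(segment)                   # (128, 64)

Fed segment by segment, the module attends over the current context plus the top-k most similar keys from everything it has seen so far, which is what lets the effective context grow well beyond the attention window.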
-
heinsen_routing
Reference implementation of "An Algorithm for Routing Vectors in Sequences" (Heinsen, 2022) and "An Algorithm for Routing Capsules in All Domains" (Heinsen, 2019), for composing deep neural networks.
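Routing is harder to summarize in one line, so here is a sketch of the classic routing-by-agreement step from Sabour et al. (2017), the simplest member of the family these papers build on. To be clear, this is not Heinsen's algorithm, just an illustration of the shared idea: input vectors vote for output vectors, and credit iteratively flows toward the outputs their votes agree on.

    import torch

    def squash(s, dim=-1, eps=1e-8):
        # Capsule nonlinearity: keeps direction, maps the norm into [0, 1).
        n2 = (s * s).sum(dim=dim, keepdim=True)
        return (n2 / (1.0 + n2)) * s / (n2.sqrt() + eps)

    def routing_by_agreement(u_hat, iters=3):
        # u_hat: (n_in, n_out, dim) "votes" from each input for each output.
        b = torch.zeros(u_hat.shape[:2])                  # routing logits
        for _ in range(iters):
            c = b.softmax(dim=-1)                         # each input splits its credit
            s = (c.unsqueeze(-1) * u_hat).sum(dim=0)      # weighted sum of votes
            v = squash(s)                                 # candidate outputs (n_out, dim)
            b = b + (u_hat * v.unsqueeze(0)).sum(dim=-1)  # reinforce agreeing routes
        return v

    votes = torch.randn(32, 8, 16)     # 32 inputs voting for 8 outputs
    out = routing_by_agreement(votes)  # (8, 16)

A few iterations of this loop are enough for the outputs to settle on coherent clusters of votes, which is the property that makes routing a candidate building block for composing networks.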