Is the attention layer even necessary? (https://arxiv.org/abs/2105.02723)
Why do you think https://github.com/asarigun/GraphMixerNetworks is a good alternative to the do-you-even-need-attention approach?
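For context on what "not needing attention" means here: the linked paper replaces self-attention with a plain linear layer applied across the token dimension, followed by the usual position-wise feed-forward. A minimal numpy sketch of that idea (all shapes, weights, and the ReLU choice are illustrative assumptions, not the paper's exact architecture):

```python
import numpy as np

def token_mixing_block(x, w_tok, w1, w2):
    """One attention-free block in the spirit of arXiv:2105.02723:
    a linear map mixes information across tokens (standing in for
    self-attention), then a position-wise feed-forward mixes channels.
    Weights here are illustrative placeholders, not trained values."""
    # Mix across the token dimension -- this is the attention replacement.
    mixed = w_tok @ x                    # (n_tokens, d)
    x = x + mixed                        # residual connection
    # Position-wise feed-forward over channels (ReLU used for brevity).
    hidden = np.maximum(0.0, x @ w1)     # (n_tokens, d_ff)
    x = x + hidden @ w2                  # (n_tokens, d)
    return x

rng = np.random.default_rng(0)
n_tokens, d, d_ff = 4, 8, 16
x = rng.normal(size=(n_tokens, d))
out = token_mixing_block(
    x,
    rng.normal(size=(n_tokens, n_tokens)) * 0.1,
    rng.normal(size=(d, d_ff)) * 0.1,
    rng.normal(size=(d_ff, d)) * 0.1,
)
print(out.shape)  # (4, 8): token count and width are preserved
```

Note the fixed `(n_tokens, n_tokens)` mixing matrix: unlike attention, the mixing weights do not depend on the input, which is the core trade-off the paper probes.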