YES! I think this is the model. It relies ENTIRELY on attention, using a combination of absolute and relative positional coordinates as a background.
Took you guys long enough.
THIS IS IT. The RIGHT direction in Neural Network architecture. For 29 years, with only Danko Nikolic's Practopoiesis/Anapoiesis bucking the trend, I've looked for someone to notice the importance of "noticing" for proper Neural Network functioning.
While "Attention" isn't _all_ you need, attention is key to agility in constantly changing contexts, while also maintaining an intermediate view attached to slower evolutionary processes: three tiers interacting continually at their respective paces.
This paper starts with an absolute positional grid. The next influential paper will add relative attention alongside absolute attention.
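The "absolute grid" here refers to the paper's sinusoidal positional encodings, which stamp each token with a fixed coordinate pattern before attention is applied. A minimal sketch of that scheme (NumPy, with illustrative sizes chosen here, not taken from the paper):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Absolute positional encoding from 'Attention Is All You Need'.

    Each position gets a unique pattern of sines and cosines at
    geometrically spaced frequencies; a fixed relative offset between
    positions corresponds to a fixed linear transform of these vectors,
    which is what lets attention recover relative spacing from an
    absolute grid.
    """
    positions = np.arange(seq_len)[:, None]       # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]      # even dims: (1, d_model/2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                  # even indices: sine
    pe[:, 1::2] = np.cos(angles)                  # odd indices: cosine
    return pe

# Example with small, arbitrary sizes:
pe = sinusoidal_positional_encoding(seq_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```

These vectors are simply added to the token embeddings, so the attention layers see position as part of each token's coordinates rather than through any recurrence.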
The Transformer. Watch this AI.
I take it back. Attention _is_ all you need.