I think with his new job, this site may be getting neglected. Conceptually, it’s this:
Slow learning and fast learning.
Evolution is slow learning. DNA is faster learning. The nervous system, interacting with itself and with the environment, is the fastest learning.
All layers interact with and update each other, and each interacts with the environment and the environment with them.
So basically, at any given moment, everything you’ve learned, whether evolutionary, genetic, or neurological, is used both in decision making and in forming new connections. It is all a dance, as it were, traversing these levels.
This allows for an agile intelligence that is continually capable of adaptation, and that adaptation never stops.
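The nested timescales described above can be sketched as a toy simulation. This is only an illustration, not the author’s system: all names and numbers here are invented. An outer “evolutionary” loop selects genomes, each genome (the “DNA” layer) fixes an agent’s starting point and learning rate, and an inner “neural” loop adapts quickly within a lifetime. The layers interact: lifetime learning affects fitness, which steers evolution.

```python
# Hypothetical sketch of learning at three timescales.
import random

TARGET = 0.7  # the "environment": agents try to output this value

def lifetime(weight, lr, steps=50):
    """Fastest timescale: within-lifetime (neural) adaptation toward the target."""
    for _ in range(steps):
        weight += lr * (TARGET - weight)
    return weight, abs(TARGET - weight)

def evolve(generations=20, pop_size=10):
    """Slowest timescale: selection over genomes (initial weight + learning rate)."""
    pop = [(random.uniform(0, 1), random.uniform(0.01, 0.3))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted((lifetime(w0, lr)[1], w0, lr) for w0, lr in pop)
        survivors = scored[:pop_size // 2]  # lower error = fitter
        # Middle timescale: "DNA" is copied with small mutations.
        pop = [(w0 + random.gauss(0, 0.05),
                max(0.001, lr + random.gauss(0, 0.02)))
               for _, w0, lr in survivors for _ in range(2)]
    return min(lifetime(w0, lr)[1] for w0, lr in pop)

random.seed(0)
final_error = evolve()
```

Run it and the best agent’s lifetime error shrinks across generations: evolution does not learn the answer directly, it learns to produce agents that learn it fast.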
An AI computer would be a “box that can learn.”
This is a basic overview. It’s missing a point or two but it’s a good start. https://www.youtube.com/watch?v=WIzsz03X8qc
Having worked on and off with AI as a hobby through the decades, I’ve found certain aspects never sat right with me: preloading huge databases first, assuming certain thinking patterns are superior to others, agility with a lack of memory, or static tables used to bootstrap that can’t be updated. Stuff like that.
They each use biology as a basis.
Most AI uses an ancient view of the “neuron” developed in the 1940s, with weights and such. Useful? Oh yes. But it’s limited.
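That 1940s view is the classic weighted-sum-and-threshold unit from the textbooks. A minimal sketch, with the weights and wiring chosen here purely for illustration:

```python
# The classic artificial "neuron": a weighted sum of inputs passed through
# a hard threshold. Fires (1) if the sum exceeds the threshold, else 0.
def neuron(inputs, weights, bias):
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# Example: a two-input neuron hand-wired to behave like logical AND.
and_weights = [1.0, 1.0]
and_bias = -1.5
outputs = [neuron([a, b], and_weights, and_bias)
           for a in (0, 1) for b in (0, 1)]
# outputs == [0, 0, 0, 1]
```

Everything is frozen numbers and a fixed rule; nothing in the unit itself grows, rewires, or remembers, which is exactly the limitation being pointed at.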
But that’s where the money and institutional support are, and we’ll still make progress, even if far more slowly than I’d like to see.