Looking at it again: GWT is “theatre of the mind” – representation. I *think* 4E is not representational. It’s not that you can’t model, but the model would be included within the active system, tied into body movements and engagement with space, etc.

Ah here: this is an example of utilizing embodied cognition two ways: one WITH internal representation (which would be GWT) and one without internal representation, but both would be embodied.

===

“We agree that the mental representation approach could just as well be defined as a moderate embodied cognition perspective, contrasting with the more radical direct contact approach”

Ok, so from this perspective, GWT could fall within embodied cognition as an optional feature.

—-

I *think* so. I haven’t seen both mentioned together in a single paper yet, which is surprising, but they may operate in different spheres.

—-

 

This is fascinating. This makes no mention of anything from 4E cognition at all: http://www.scholarpedia.org/article/Models_of_consciousness
====
ah ha! I think they’re both under:
 http://www.scholarpedia.org/article/Category:Computational_neuroscience
====
I’m not sure. There’s a fascinating disconnect. Looking at “Consciousness”, within which GWT is hiding, I find:
http://www.scholarpedia.org/article/Contextual_emergence
which _seems_ as if it might belong in the 4E family but not quite. It’s right on the verge.
====
Found a philosophy paper which attempts to tie “contextual emergence” into “extended cognition” theory, which is certainly part of the 4E family: https://philpapers.org/rec/POCCEA
=====
ah ha! I got it. Dennett is the crossover point. Dennett brings up intentionality: the “intentional stance”.
“Intentionality: According to Dennett (1989), the intentional stance can be applied to the prediction of any system’s behavior that is too complex to be treated as either a physical or a designed system. Intentional systems in this sense are systems whose behavior is predictable upon ascribing beliefs and desires to their internal states. Examples for intentional systems range from thermostats and chess computers over “magnetic snakes” (Snezhko et al. 2006) to “true believers”, e.g. human beings.” https://plato.stanford.edu/entries/intentionality/
—-
Oh no: David Marr’s [1982] account of vision, which I’ve been working with for a month or so with the lines and things, might be going head-to-head with Gibson’s [1979] ecological alternative to cognitivism. Did I find the main players?
 
[looking at: https://s3.amazonaws.com/academia.edu.documents/43018059/Disclosing_the_World.pdf]
====
YES! I FOUND THE MAIN PLAYERS. Gibson (enacted) vs Marr (computational)
 
[copied this]
David Marr’s approach to vision is based on the assumption that the function of the visual system is to invert the imaging process. Marr assumed that the visual process infers 3D representations from the information provided by the eyes. Marr’s originality was in decomposing the task into hierarchical levels:
- the functional (computational) level, where the task can be understood geometrically
- the algorithmic level
- the hardware or neuronal level
Gibson rejected conceiving of vision as an inversion of the imaging process and refused representational approaches to vision. Gibson assumed that what is perceived, the phenomenology of vision, can be explained by high-level invariants of the dynamic optical array. The images on the retinas sample the dynamic optical array at different orientations and locations. Gibson’s originality also consists in connecting the needs of the organism directly to aspects of the stimulus, so an invariant of the DOA can also be an affordance, which puts Gibson in line with the bio-semiotic approach of von Uexküll.
At the end of the 1970s, when Marr did his research, the fashion was computation, processing, information. Most first-generation cognitive scientists did not understand the potential of Gibson’s approach, and Gibson did not understand the role computation could play in his own theory. So there was no dialogue between the two approaches.
Since then, the enactive approach to cognitive science has moved in the direction of Gibson.
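Marr’s “algorithmic level” can be made concrete with his own edge-finding proposal, the Marr–Hildreth operator: convolve the image with a Laplacian of Gaussian and take the zero-crossings of the response as edges. A minimal numpy sketch (the toy image, sigma, and kernel size are my own choices for illustration):

```python
import numpy as np

def log_kernel(sigma, size=None):
    """Laplacian-of-Gaussian kernel (the Marr-Hildreth operator)."""
    if size is None:
        size = int(6 * sigma) | 1  # odd width covering ~3 sigma each side
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    s2 = sigma ** 2
    g = np.exp(-(x ** 2 + y ** 2) / (2 * s2))
    return ((x ** 2 + y ** 2 - 2 * s2) / s2 ** 2) * g

def convolve2d(img, k):
    """Naive 'valid' 2-D convolution; slow, but fine for a tiny demo."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def zero_crossings(r):
    """Mark pixels where the LoG response changes sign: Marr's edges."""
    z = np.zeros_like(r, dtype=bool)
    z[:-1, :] |= np.signbit(r[:-1, :]) != np.signbit(r[1:, :])
    z[:, :-1] |= np.signbit(r[:, :-1]) != np.signbit(r[:, 1:])
    return z

# Toy image: dark left half, bright right half -> one vertical edge.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
edges = zero_crossings(convolve2d(img, log_kernel(sigma=1.0)))
```

The point for the Marr/Gibson contrast: this is an explicit intermediate representation (the “raw primal sketch”), computed before anything gets interpreted.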
=====
YES! GIBSON – check this out. Key name. Source of “Affordances”, all the 4E cognition stuff.
 
http://gibsonenv.stanford.edu/
====
FASCINATING: it almost goes out of its way to NOT use representationalism, since for Gibson perception is direct apprehension of the optic flow rather than manipulation of toy models in the mind.
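Gibson’s claim that action-relevant quantities can be picked up directly from optic flow has a classic concrete case: Lee’s “tau”, the time-to-contact read off the rate of optical expansion, with no 3-D reconstruction step. A minimal sketch (the scene numbers below are hypothetical, just to make the invariant visible):

```python
def image_size(obj_width, Z, f=1.0):
    """Projected size of an object of width obj_width at depth Z."""
    return f * obj_width / Z

v = 2.0    # approach speed (m/s) - hypothetical
Z0 = 10.0  # initial distance (m) - hypothetical
w = 0.5    # object width (m) - hypothetical
dt = 0.01  # time between "retinal" samples (s)

# Two successive samples of the expanding image:
theta1 = image_size(w, Z0)
theta2 = image_size(w, Z0 - v * dt)

# tau = theta / (d theta / dt): uses only image measurements,
# never the depth Z or speed v themselves.
tau = theta1 / ((theta2 - theta1) / dt)
true_ttc = Z0 / v  # ground truth the observer never computes
```

Here `tau` comes out within about 1% of the true 5-second time-to-contact, even though nothing in its formula mentions distance or speed: that is the flavor of a Gibsonian invariant.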
—-
Now I have to compare what it uses for its computer vision against what I’ve learned from the Marr family of computer vision, and see whether there’s overlap or a strong attempt to avoid Marr’s work.
Sneaky! They gloss over the very thing I want to find out about: the semantic parsing, which shows “what’s salient” about each area, i.e. the overlap with my research on Marr and computer vision. Going deeper now: http://buildingparser.stanford.edu/dataset.html
—-
Ah, it’s annotated / labelled BY HAND. lol
===
BUT they had some automation and referred to this paper, which I’ll read now: https://www.semanticscholar.org/paper/RGB-(D)-scene-labeling%3A-Features-and-algorithms-Ren-Bo/e36111cc8acd864f047ff138cd6edc668891c32c
====
https://www.researchgate.net/post/In_what_way_exactly_was_David_Marrs_approach_different_from_that_of_James_Gibson_in_the_field_of_Vision_and_Perception
 
In a nutshell, the proponents of James Gibson’s ecological theory of perception, when opposing the computational/information-processing theories that rely on representationalism, would say something like this: “Do not try to understand what is in your head; try to understand what your head is in instead.”
=====
