
a) BIMANUAL COORDINATION: rSTG, SMA
Playing the piano, or rhythmic tapping with the fingers of both hands off each other, likely strengthens it.
 
“fMRI allowed us to pinpoint three areas located in the right hemisphere that were more strongly activated in the coordination condition: the superior temporal gyrus (STG), the SMA, and the primary motor cortex (M1).”
 
b) That image of preterm brain functional connectivity differences (from some verbal task, I think): there's BA22, which is the "right STG", connected to the SMA in preterm babies.
 
and
 
c) Sergent et al. (1992) studied musical perception in professional pianists using PET. They reported that listening to a musical piece activated the right STG (BA22), an activation not detected in a scale-listening task.
 
I’m just trying to understand my own brain, not make declarations about anybody else’s. I’m trying to connect music with being a very preterm baby, with anxiety, with social things, and with difficulty generalizing/inferring (which may itself be a social thing), plus other stuff I haven’t gotten to yet. The constant overlap of ADHD/anxiety/autism/preterm/OCD was a bit much, and I had to see what was core.

==

I may have hit gold!
 
“Vividness of auditory imagery correlated with gray matter volume in the supplementary motor area (SMA), parietal cortex, medial superior frontal gyrus, and middle frontal gyrus. An analysis of functional responses to different types of human vocalizations revealed that the SMA and parietal sites that predict imagery are also modulated by sound type. Using representational similarity analysis, we found that higher representational specificity of heard sounds in SMA predicts vividness of imagery, indicating a mechanistic link between sensory- and imagery-based processing in sensorimotor cortex. Vividness of imagery in the visual domain also correlated with SMA structure, and with auditory imagery scores. Altogether, these findings provide evidence for a signature of imagery in brain structure, and highlight a common role of perceptual–motor interactions for processing heard and internally generated auditory information.”
 
Feel the Noise: Relating Individual Differences in Auditory Imagery to the Structure and Function of Sensorimotor Systems
 
https://scholar.harvard.edu/files/danaboebinger/files/lima_etal_2015.pdf
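To make sure I actually follow what "representational specificity" in the SMA means here, I wrote a toy Python sketch of representational similarity analysis. Everything in it is made up by me (fake voxel patterns standing in for SMA responses, three invented sound categories); it just shows the shape of the method, not the paper's actual pipeline.

```python
# Toy RSA sketch: how category-specific are a region's response patterns?
# All data and parameters are synthetic placeholders, not from Lima et al.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_sounds, n_voxels = 12, 200          # 12 sound stimuli, 200 "SMA" voxels
sound_type = np.repeat([0, 1, 2], 4)  # 3 vocalization categories, 4 sounds each

# Fake neural patterns: each sound's pattern = its category template + noise.
templates = rng.normal(size=(3, n_voxels))
patterns = templates[sound_type] + rng.normal(scale=1.0, size=(n_sounds, n_voxels))

# Neural representational dissimilarity matrix (RDM): 1 - correlation between patterns.
neural_rdm = squareform(pdist(patterns, metric="correlation"))

# Model RDM: 0 if two sounds share a category, 1 otherwise.
model_rdm = (sound_type[:, None] != sound_type[None, :]).astype(float)

# Compare upper triangles; a higher correlation means more distinct (more
# "representationally specific") patterns for different sound types.
iu = np.triu_indices(n_sounds, k=1)
rho, p = spearmanr(neural_rdm[iu], model_rdm[iu])
print(f"neural-vs-model RDM correlation: rho={rho:.2f}, p={p:.3f}")
```

The point of the toy: "specificity" is just how well the region's pattern geometry separates the sound types, which the paper then relates across people to how vivid their imagery is.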
==

Wernicke’s area encodes music timbre information during musical imagery

In this study, the response in Wernicke’s area was highly correlated with music timbre features of the stimulus as imagined or heard (Fig. 4 and Supplementary Information Fig. 2). This observation offers insights about the previously undiscovered function of Wernicke’s area. At least in musicians, Wernicke’s area may encode musical information, whether it is internally generated (imagery) or externally stimulated (perception), beyond its known function in language comprehension.

The musical information represented by the activity of Wernicke’s area reflected low-level features of the sound, such as spectral or amplitude fluctuations. During musical imagery, the sound was not present; therefore, the mentally generated acoustic features were likely driven by top-down processes, as opposed to the bottom-up processes that occurred when the sound was physically presented. In this context, the top-down and bottom-up processes involved distinct areas and networks. When the music was presented externally as a sound wave, its processing involved the primary auditory cortex and the ventral auditory pathway. In contrast, when the music was imagined, the primary auditory cortex was not involved in representing it (Fig. 4). This distinction suggests that the primary auditory cortex encodes the auditory features of music only when it is perceived and not when it is imagined.
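To get a feel for what "correlated with low-level features such as spectral or amplitude fluctuations" looks like in practice, here is another toy Python sketch: fake music, its amplitude envelope, and a made-up neural time course that partly tracks it. Again, everything is synthetic and mine; it is not the study's analysis pipeline.

```python
# Toy sketch: correlate a region's time course with the amplitude envelope of music.
# All signals below are synthetic placeholders.
import numpy as np
from scipy.signal import hilbert, resample
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

sr = 22050                        # audio sample rate (Hz)
t = np.arange(0, 30, 1 / sr)      # 30 s of fake "music": an amplitude-modulated tone
audio = np.sin(2 * np.pi * 440 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 0.5 * t))

# Low-level feature: amplitude envelope via the Hilbert transform.
envelope = np.abs(hilbert(audio))

# Pretend the neural signal is sampled at 10 Hz (e.g. a power time course)
# and partly tracks the envelope, plus noise.
neural_rate = 10
envelope_ds = resample(envelope, int(30 * neural_rate))
neural = 0.7 * envelope_ds + rng.normal(scale=0.3, size=envelope_ds.shape)

r, p = pearsonr(envelope_ds, neural)
print(f"envelope-to-region correlation: r={r:.2f}, p={p:.3g}")
```

During imagery there is no audio to extract an envelope from, which is why (as the paragraph above says) any such correspondence has to come from top-down, internally generated features rather than bottom-up ones.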
