Here we investigate whether the well-known laterality of spoken language to the dominant left hemisphere could be explained by the learning of sensorimotor links between a word's articulatory program and its corresponding sound structure. Human-specific asymmetry of acoustic-articulatory connectivity is evident structurally, at the neuroanatomical level, in the arcuate fascicle, which connects superior-temporal and frontal cortices and is more developed in the left hemisphere. Because these left-lateralised fronto-temporal fibres provide a substrate for auditory-motor associations, we hypothesised that learning of acoustic-articulatory coincidences produces laterality, whereas perceptual learning does not. Twenty subjects studied a large (n = 48) set of novel meaningless syllable combinations (pseudowords) in a perceptual learning condition, where they carefully listened to repeatedly presented novel items, and, crucially, in an articulatory learning condition, where each item had to be repeated immediately, so that articulatory and auditory speech-evoked cortical activations coincided. In the 14 subjects who successfully completed the learning routine and could reliably recognise the learnt items, both perceptual and articulatory learning led to an increase of pseudoword-elicited event-related potentials (ERPs), reflecting the formation of new memory circuits. Importantly, after articulatory learning, pseudoword-elicited ERPs were more strongly left-lateralised than after perceptual learning. Source localisation confirmed that perceptual learning led to increased activation in superior-temporal cortex bilaterally, whereas items learnt in the articulatory condition activated bilateral superior-temporal auditory areas in combination with left precentral motor areas. These results support a new explanation of the laterality of spoken language based on the neuroanatomy of sensorimotor links and Hebbian learning principles.
DOI: http://dx.doi.org/10.1016/j.cortex.2011.02.006