Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional MRI to study seven prelingually deaf signers, ten hearing non-signers and nine hearing signers. The visually presented tasks included mouth-movement matching, random-dot motion matching and sign-related motion matching. The mouth-movement tasks included conditions with and without visual phonetics, and the difference between them was used to measure lip-reading effects. During the mouth-movement matching tasks, the deaf subjects showed more prominent activation of the left planum temporale (PT) than the hearing subjects. During dot-motion matching, the deaf subjects showed greater activation in the right PT. Sign-related motion, with or without a lexical component, activated the left PT more in the deaf signers than in the hearing signers. These areas showed lip-reading effects in hearing subjects. These findings suggest that cross-modal plasticity is induced by auditory deprivation independently of lexical processing or visual phonetics, and that this plasticity is mediated in part by the neural substrates of audio-visual cross-modal integration.
Sadato N, Okada T, Honda M, Matsuki KI, Yoshida M, Kashikura KI, Takei W, Sato T, Kochiyama T, Yonekura Y: Cross-modal integration and plastic changes revealed by lip movement, random-dot motion and sign languages in the hearing and deaf. Cereb Cortex. 2005;15:1113-1122.