The visual cortex does not fall silent in blindness. About half the human neocortex is thought to be devoted to visual processing, yet in blind people, deprived of the visual input that normally drives those regions, the same areas become responsive to auditory, tactile, and other nonvisual tasks, a phenomenon called crossmodal plasticity.
What kinds of computations take place under these circumstances? How do crossmodally recruited brain regions represent information? To what extent are the connections mediating this plasticity present in sighted people as well? These and other basic questions remain unresolved despite decades of research on crossmodal plasticity.
Braille reading as a model for studying crossmodal plasticity. Blind persons reading Braille, a system of raised dots read by touch, recruit visual cortical areas, yet it remains unclear what exactly those areas are doing. I am currently using MEG and fMRI to trace how Braille information propagates through the brain as it is transformed from a dot pattern into meaningful alphabetic information, and comparing this to the analogous processing stream for printed letters in sighted people.
Human echolocation. Like many bats and marine mammals, some blind humans echolocate. Using acoustic reflections from tongue clicks or other sound pulses, such as cane taps or footfalls, practitioners achieve remarkable precision in navigation and object perception. What information does echolocation afford its practitioners? What factors contribute to learning it? How broadly is it accessible to blind and sighted people? Past and ongoing work at MIT and UC Berkeley investigates the spatial resolution available to people using echoes to perceive their environments, and how well echoes support orientation and mobility in blind persons. Previous work has shown that blindfolded sighted people can readily learn some echolocation tasks with a small amount of training, but that blind practitioners hold a clear expertise advantage, sometimes distinguishing the positions of objects to a precision of better than 2 degrees.
Mobility and assistive technology. Inspired by both human and non-human echolocators, we are investigating artificial echoes as a perceptual aid for blind persons. Ultrasonic echoes, inaudible to humans, carry higher-resolution spatial information than audible echoes; an assistive device could make the perceptual advantages of ultrasonic echolocation available to human listeners. Initial tests of a prototype showed that untrained listeners can make spatial judgments using its echoes.
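The resolution advantage of ultrasound follows from basic acoustics: the spatial detail an echo can resolve is limited by its wavelength, and wavelength shrinks as frequency rises. A rough calculation makes this concrete; the two frequencies below are illustrative assumptions for a tongue click and an ultrasonic transducer, not measured parameters of any device described here.

```python
# Sketch of the wavelength-resolution argument, under assumed frequencies.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def wavelength_mm(frequency_hz: float) -> float:
    """Wavelength in millimetres for a sound of the given frequency."""
    return SPEED_OF_SOUND / frequency_hz * 1000.0

# Illustrative frequencies (assumptions, not device specifications):
audible_click_hz = 4_000       # near the energy peak of a palatal click
ultrasonic_pulse_hz = 40_000   # a common ultrasonic transducer frequency

print(f"audible click:    ~{wavelength_mm(audible_click_hz):.0f} mm")   # ~86 mm
print(f"ultrasonic pulse: ~{wavelength_mm(ultrasonic_pulse_hz):.1f} mm")  # ~8.6 mm
```

On these assumed numbers, the ultrasonic pulse's wavelength is an order of magnitude shorter, so its echoes can in principle encode correspondingly finer surface and position detail, which is the information an assistive device would need to translate into an audible signal.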