SUMMARY

Although we take language and music for granted, both play central roles in our mental lives and in our social lives. Language and music provide us ways to communicate with other people—and with ourselves. They facilitate social identification, parenting, and cultural transmission.

10-1 Sound Waves: The Stimulus for Audition

The stimulus for the auditory system is the mechanical energy of sound waves that results from changes in air pressure. The ear transduces three fundamental physical qualities of sound wave energy: frequency (repetition rate), amplitude (size), and complexity. Perceptually, neural networks then translate these energies into the pitch, loudness, and timbre of the sounds that we hear.
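The three physical qualities and their perceptual correlates can be illustrated in a short sketch. The sample rate, duration, and harmonic mix below are arbitrary illustrative values, not parameters from the text: changing the frequency changes pitch, scaling the amplitude changes loudness, and adding harmonic components changes complexity, and hence timbre.

```python
import math

SAMPLE_RATE = 8000  # samples per second; an arbitrary choice for this sketch

def tone(freq_hz, amplitude, harmonics=(1.0,), duration_s=0.01):
    """Generate samples of a sound wave.

    freq_hz    -> perceived pitch (repetition rate)
    amplitude  -> perceived loudness (size of the pressure change)
    harmonics  -> relative strengths of overtones; their mix determines
                  the wave's complexity, heard as timbre
    """
    n = int(SAMPLE_RATE * duration_s)
    return [
        amplitude * sum(
            strength * math.sin(2 * math.pi * freq_hz * (k + 1) * t / SAMPLE_RATE)
            for k, strength in enumerate(harmonics)
        )
        for t in range(n)
    ]

pure = tone(440, 1.0)                      # a pure sine wave: one frequency component
complex_tone = tone(440, 1.0, (1.0, 0.5))  # adds a second harmonic: same pitch, different timbre
```

The two tones repeat at the same rate (same pitch) but have different waveform shapes, which the auditory system hears as a difference in timbre.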

10-2 Functional Anatomy of the Auditory System

Beginning in the ear, mechanical and electrochemical systems combine to transform sound waves into auditory perceptions—what we hear. Changes in air pressure are conveyed in a mechanical chain reaction from the eardrum to the bones of the middle ear to the oval window of the cochlea and the cochlear fluid behind it in the inner ear. Movements of the cochlear fluid produce movements in specific regions of the basilar membrane, changing the electrochemical activity of the auditory receptors, the inner hair cells on the basilar membrane. These hair cells send neural impulses through the auditory nerve into the brain.


10-3 Neural Activity and Hearing

The basilar membrane has a tonotopic organization. High-frequency sound waves maximally stimulate hair cells at the base, whereas low-frequency sound waves maximally stimulate hair cells at the apex, enabling cochlear neurons to code sound frequencies.
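This frequency-to-place mapping can be sketched numerically. The sketch below uses the Greenwood function, an empirical fit not mentioned in the text, with commonly cited human parameter values; treat the exact numbers as assumptions for illustration.

```python
def greenwood_freq(x):
    """Characteristic frequency (Hz) at position x along the basilar
    membrane, where x runs from 0 (apex) to 1 (base).

    Greenwood function with commonly cited human parameters
    (A = 165.4, a = 2.1, k = 0.88); an illustrative assumption,
    not a value from the text.
    """
    return 165.4 * (10 ** (2.1 * x) - 0.88)

low = greenwood_freq(0.0)   # near 20 Hz: low frequencies at the apex
high = greenwood_freq(1.0)  # above 20 kHz: high frequencies at the base
```

The monotonic rise from apex to base is what lets a cochlear neuron's place of origin code the frequency of the sound.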

Sound waves are analyzed tonotopically at all levels of the auditory system, which also detects amplitude and location. The firing rate of cochlear neurons codes sound amplitude: louder sounds produce higher firing rates than softer sounds do. Location is detected by brainstem structures that compute differences in the arrival times and loudness of a sound at the two ears.
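The arrival-time difference the brainstem exploits can be estimated with a simple spherical-head model (Woodworth's model, an assumption for illustration; the head radius and speed of sound below are likewise assumed average values, not figures from the text):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average human head radius (m)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def interaural_time_difference(azimuth_rad):
    """Woodworth spherical-head estimate of the interaural time
    difference (seconds) for a sound source at the given azimuth
    (0 = straight ahead, pi/2 = directly to one side).
    """
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

ahead = interaural_time_difference(0.0)          # 0 s: equal arrival times
side = interaural_time_difference(math.pi / 2)   # maximum delay, under 1 ms
```

Even the largest delay is well under a millisecond, which is why dedicated brainstem circuits are needed to detect it.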

Cochlear hair cells synapse with bipolar neurons that form the cochlear nerve, which in turn forms part of the eighth cranial nerve. The cochlear nerve takes auditory information to three structures in the hindbrain: the cochlear nucleus, the superior olive, and the trapezoid body. Cells in these areas are sensitive to differences in both sound wave intensity and arrival times at the two ears. In this way, they enable the brain to locate a sound.

The auditory pathway continues from the hindbrain areas to the inferior colliculus of the midbrain, then to the medial geniculate nucleus in the thalamus, and finally to the auditory cortex. As in vision, dorsal and ventral pathways exist in the auditory cortex: one for pattern recognition, the other for controlling movements in auditory space. Cells in the cortex are responsive to specific sound categories, such as species-specific communication sounds.

10-4 Anatomy of Language and Music

Despite differences in the patterns and structures of speech sounds, all human languages have the same basic foundation—syntax and grammar—a commonality that implies an innate template for creating language. Auditory areas of the left hemisphere cortex play a special role in analyzing language-related information, whereas those in the right hemisphere play a special role in analyzing music-related information. The right temporal lobe also analyzes prosody, the melodic qualities of speech.

Among several left-hemisphere language-processing areas, Wernicke’s area identifies speech syllables and words and so is critically engaged in speech comprehension. Broca’s area matches speech sound patterns to the motor behaviors necessary to make them and so plays a major role in speech production. Broca’s area also discriminates between closely related speech sounds. Aphasia is an inability to speak (Broca’s aphasia) or to comprehend language (Wernicke’s aphasia) despite normal cognition and intact vocal mechanisms.

Auditory analysis of music draws more on right-hemisphere than on left-hemisphere activity. But music production is not localized to the right hemisphere: it recruits the left hemisphere as well. Music perception engages both right temporal and frontal regions.

Because music engages both right- and left-hemisphere activity, it is a powerful tool for reaching the injured or dysfunctional brain. Music therapy is playing an increasingly important role in treatment.

10-5 Auditory Communication in Nonhuman Species

Nonhuman animals have evolved specialized auditory structures and behaviors. Regions of songbirds’ brains are specialized for producing and comprehending song. In many species, these regions are lateralized to the left hemisphere, much as language areas are lateralized to the left hemisphere in most humans. Similarities in the development of song in birds and the development of language in humans, as well as similarities in the neural mechanisms underlying both the production and the perception of birdsong and language, are striking.

Both owls and bats can fly and catch prey at night using only auditory information to guide their movement. Bats evolved a type of biosonar that allows them to map the objects in their auditory world, as humans map their visual worlds. Although some blind humans employ this strategy, the mainly auditory reality of bats, dolphins, and other echolocators is one most people can only try to imagine.