Reyna Gordon

Ph.D., Complex Systems & Brain Sciences, August 2010

Current Position: Postdoctoral Fellow, Vanderbilt University

 

Dissertation: 

NEURAL AND BEHAVIORAL CORRELATES OF SONG PROSODY

This dissertation studies the neural basis of song, a universal human behavior. The fine temporal resolution of electroencephalography (EEG) was used to investigate the relationship between words and melodies in the perception of song at phonological, semantic, melodic, and rhythmic levels of processing. The observations help shed light on a ubiquitous human experience and also inform the discussion of whether language and music share neural resources or recruit domain-specific neural mechanisms.

Experiment 1 was designed to determine whether words and melody in song are processed interactively or independently. Participants listened to sung words in which the melodies and/or the words were similar or different, and performed a same/different task while attending to the linguistic and musical dimensions in separate blocks of trials. Event-related brain potentials (ERPs) and behavioral data converged in showing interactive processing between the musical and linguistic dimensions of sung words, regardless of the direction of attention. In particular, the N400 component, a well-established marker of semantic processing, was modulated by musical melody. The observation that variations in musical features affect lexico-semantic processing in sung language was a novel finding with implications for shared neural resources between language and music.

Experiment 2 was designed to explore the idea that well-aligned textsettings, in which strong syllables occur on strong beats, capture listeners’ attention and help them to better understand song lyrics. EEG was recorded while participants listened to sung sentences that were Well-aligned, Misaligned, or Varied, and performed a lexical decision task on subsequently presented visual targets. Results showed that induced beta and evoked gamma power were modulated differently for well-aligned and misaligned syllables, and that task performance was adversely affected when visual targets followed misaligned and varied sentences. These findings suggest that alignment of linguistic stress and musical meter in song enhances beat tracking and linguistic segmentation by entraining periodic neural oscillations to events occurring on strong metrical positions. A series of follow-up studies has been outlined to further investigate the relationship between rhythmic attending in speech and music, as well as the influence of metrical alignment in songs on childhood language acquisition.
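
For readers unfamiliar with the induced/evoked distinction mentioned above, the sketch below illustrates in general terms how non-phase-locked (induced) power can be separated from phase-locked (evoked) power in epoched EEG data. It is a minimal illustration assuming the open-source MNE-Python toolbox and a hypothetical `epochs` object with illustrative frequency settings; it is not the analysis pipeline used in the dissertation.

    import numpy as np
    from mne.time_frequency import tfr_morlet

    # Assumed: `epochs` is an mne.Epochs object of single-trial EEG data.
    # Illustrative frequency range spanning beta and low gamma bands.
    freqs = np.arange(15.0, 45.0, 2.0)
    n_cycles = freqs / 2.0

    # Evoked (phase-locked) power: time-frequency transform of the trial average.
    evoked = epochs.average()
    evoked_power = tfr_morlet(evoked, freqs=freqs, n_cycles=n_cycles,
                              return_itc=False)

    # Induced (non-phase-locked) power: remove the evoked response from each
    # trial, then average the single-trial time-frequency transforms.
    induced_epochs = epochs.copy().subtract_evoked(evoked)
    induced_power = tfr_morlet(induced_epochs, freqs=freqs, n_cycles=n_cycles,
                               return_itc=False)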