We are interested in the neural control of speech. In particular, we study speech feedback, the sensory information we receive when we make speech movements. This feedback can be auditory (hearing your voice), somatosensory (feeling the contact between your lips, tongue, and palate), or proprioceptive (sensing where your tongue or jaw is in space). In the BLAB Lab, we focus on the relationship between auditory processing and motor programming to investigate speech and language.
Current research projects
Neuroimaging of speech feedback control in aphasia
This project investigates how individuals with aphasia, a communication disorder, use feedback while speaking. We use magnetoencephalography (MEG) to measure brain activity during speech and to test for sensitivity to small changes in speech sounds.

Neuroimaging of speech feedback control in second-language learners
This study considers speakers who are learning French, a language with a number of vowels that do not occur in English. A person's ability to use auditory feedback to detect errors while speaking a second language may be linked to how native-like they sound.

Probing speech motor plans via cognitive inhibition
Using a modified version of the well-known Stroop task (e.g. naming the ink color of the word "red" printed in blue), we measure how speech is altered when a speaking task involves interference between two competing words. This provides evidence that motor speech planning is affected by cognitive inhibition.

Characterizing auditory representations of speech goals
What features of sound are important when we monitor our own speech feedback? This study investigates how changing our communicative goal (e.g. speaking vs. singing) changes what aspects of the feedback we are sensitive to (e.g. vowels vs. pitch).

Exploring speech-motor learning with visual feedback (Voystick)
By plotting a person's vowel formants as a cursor on a screen and prompting them to move it toward targets in their vowel space, we can assess their ability to produce non-native vowels, or vowels near or beyond the edges of their typical vowel space, as sketched in the example below.
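
To illustrate the underlying mapping, here is a minimal sketch (not the lab's actual Voystick implementation) of how the first two vowel formants, estimated with standard LPC analysis, could be converted into a 2-D cursor position. The function names, formant ranges, and parameter values are hypothetical assumptions for illustration only.

```python
import numpy as np
import librosa


def estimate_formants(frame, sr, lpc_order=12):
    """Estimate candidate formant frequencies (Hz) for one audio frame via LPC."""
    a = librosa.lpc(frame, order=lpc_order)             # LPC polynomial coefficients
    roots = [r for r in np.roots(a) if np.imag(r) > 0]  # one root per conjugate pair
    freqs = np.arctan2(np.imag(roots), np.real(roots)) * sr / (2 * np.pi)
    return sorted(f for f in freqs if f > 90)           # drop spurious near-DC roots


def formants_to_cursor(f1, f2, f1_range=(200, 1000), f2_range=(600, 3000)):
    """Map (F1, F2) linearly into normalized [0, 1] x [0, 1] screen coordinates.

    Follows the usual vowel-chart convention: F2 decreases rightward (front
    vowels on the left) and F1 increases downward (open vowels at the bottom).
    The ranges are hypothetical values for a typical adult vowel space.
    """
    x = (f2_range[1] - f2) / (f2_range[1] - f2_range[0])
    y = (f1 - f1_range[0]) / (f1_range[1] - f1_range[0])
    return float(np.clip(x, 0, 1)), float(np.clip(y, 0, 1))


# Example: drive the cursor from one 25 ms frame of a recorded vowel.
y, sr = librosa.load("vowel.wav", sr=16000)  # hypothetical recording
frame = y[: int(0.025 * sr)]
f1, f2, *_ = estimate_formants(frame, sr)
print(f"F1={f1:.0f} Hz, F2={f2:.0f} Hz -> cursor at {formants_to_cursor(f1, f2)}")
```

In the actual task, formant tracking would run continuously in real time so the cursor follows the speaker's voice; the fixed ranges above simply normalize the cursor within one assumed vowel space.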