
Putting Your Best Face Forward

Researchers who study how babies acquire language have demonstrated that infants begin learning very early which sounds are relevant to the language that surrounds them. In fact, babies learn to categorize sounds so early that infants raised bilingually can even differentiate between the sounds of two languages they have never heard before.

One theory of how babies recognize a sound as belonging to one language or another is that they use not only their ears but also their eyes to understand language. They pay attention to visual cues, to the movement of the face, and especially the lips, as a person speaks.

The idea that visual information affects speech perception is not new, and it is probably something you have noticed yourself. Watching a dubbed film, for example, is disorienting for most viewers at first because they cannot reconcile the sounds they are hearing with the shapes of the mouths speaking the film’s original language. That incongruity makes words hard to understand. About 35 years ago, in fact, researchers Harry McGurk and John MacDonald demonstrated that humans use both visual and auditory information in speech perception. For most of us, vision triumphs; what we see can alter what we hear.

It’s called the McGurk effect.

SPEAKING UP
What does the McGurk effect mean to you as a presenter? => http://bit.ly/kTRboF
