Although voice assistants such as Google Home, Alexa, Cortana, Bixby, and Siri are all getting progressively smarter in their responses, they share one common disadvantage: understanding thicker accents and dialects. According to a study, enthusiasts intent on continuing to use the devices could begin to lose their regional accents in the long term, shifting towards a more Received Pronunciation way of interacting.
Since voice assistants rose in popularity, videos across the internet have comically mocked the struggles of those with thicker accents and different dialects when interacting with the helpers. Once the novelty laughter wears off, however, owners still have to readjust their speech patterns completely for the device to hear them and respond accurately.
The Life Science Centre in Newcastle upon Tyne, United Kingdom, asked 536 visitors about their experience with voice assistants, with 375 respondents owning a compatible device. As many as 79 percent of owners admitted to having to tone down their accent for the device to react as intended, often repeating themselves multiple times.
While the majority of participants in the study were likely Geordies, a colloquial term for those sporting Newcastle’s signature accent, the problem isn’t confined to regional Brits. Any English-speaking country could face the same woes with virtual assistants, including New Zealand and countries within the Caribbean.
“Ask anyone with a regional accent and they’ll tell you the struggles of using automated voice recognition,” explains Life Science Centre chief executive, Linda Conlon. “The same people who decades ago were frustrated as teens trying to get cinema listings from an automated telephone system are now having the same issues with their smartphones or smart speakers – the technology has moved forward, but the inclusivity to cater for regional accents has not.”
According to Amazon, Alexa is “designed to get smarter every day. As more people speak to Alexa, with various different accents, the more she adapts to speech patterns, vocabulary, and personal preferences.”
It does beg the question of how long owners are meant to use their expensive devices without them working properly before the devices suddenly acquire the ability to understand a new speech pattern without manual coaching. And by that point, will it be too late, considering many users already feel the need to adjust?
KitGuru Says: Personally, I don’t think it’s fair to expect widespread adoption of a device that isn’t capable of fully catering to the masses. It feels much more like a beta test for those whose voices fall outside what these assistants already understand, with no end date in sight.