Should AI Impersonate People?

Dr. David Hagenbuch, Ethicist and Professor of Marketing, Messiah University, Author of Honorable Influence, Founder of MindfulMarketing.org

“Imitation is the sincerest form of flattery”—it’s a high compliment when people respect someone’s work enough to replicate it. But when one of the world’s largest companies’ smart speakers starts imitating people’s voices, has flattery drifted into deceit?

It’s difficult to keep pace with innovation in artificial intelligence (AI), but one particular advance that’s certainly worth attention is the impending ability of Amazon’s Alexa to mimic voices.

After hearing no more than a minute of audio, the smart speaker reportedly will be able to deliver a plausible impersonation.

Alexa’s voice is apparently one that appeals to a very large number of consumers:

A 2021 Statista study showed that Alexa was the most widely used assistant across four of six age demographics. So, why would Amazon want to mess with the sound that’s helped it sell so many smart speakers?

According to Amazon senior vice president Rohit Prasad, the change “is about making memories last,” particularly remembrances of those who’ve passed.

In many ways that motive makes the voice-mimicking technology seem like a great idea.

For those who have lost loved ones, one of the greatest blessings would be to hear their dearly departed’s voice again. Since my father passed away last August, I’ve thought several times how nice it would be to talk with him again—to hear his opinion about the latest news, to ask him questions that only he could answer.

On a lighter side, and also related to Alexa’s voice imitation, I’ve always enjoyed good impressionists. It’s fun to hear comedians who can act and sound like famous people. One of my favorites is Frank Caliendo, who is best known for impressions of famous sports figures; his John Madden and Charles Barkley impressions are great!

So, I can see why Alexa doing impressions of people we knew and loved could be popular.  However, AI impersonations should also give us pause for at least four reasons:

1.  More than a voice:  Of course, just because someone, or something, sounds like a person we know doesn’t mean they are that person.  Every individual is a unique curation of beliefs, affections, and experiences that influence what they say and even how they say it.

Frank Caliendo may sound like Charles Barkley, but he obviously isn’t the NBA legend and popular sports broadcaster.  Consequently, Caliendo can never truly say what Barkley would say, and neither can AI.  Only people themselves know what they would say.

2.  Respect for the deceased:  Per the previous point, if AI speaks for anyone, beyond playing back a recording of them speaking, it’s putting words in that person’s mouth.  A living person could conceivably give such permission, but how would a dead person do the same, short of adding some kind of addendum to their last will and testament allowing AI impersonation?

I’m not sure it would be fair to ask anyone before their passing to give a smart speaker carte blanche use of their voice.  As hard as it is to let go of people we loved, it’s something we must do.  The longer we’d allow AI to speak for a loved one, the greater the probability that the technology would say things that tarnish their memory.

3.  Vulnerable consumers:  Given how good machines already are at imitating life, it will likely become increasingly easy for techno-fakes to fool us.  However, certain groups of people are at much greater risk of being duped than the average individual, namely children and older people.

It’s scary to think how those with heinous motives might use AI voice imitation to make young children believe they’re hearing the words of a trusted parent, grandparent, etc.  Similarly, the Mindful Marketing article “Preying on Older People” described how senior citizens are already frequent targets of phone scammers pretending to be someone they’re not.  AI voice imitation could open the floodgates for such abuse.

4.  Distorting the truth:  Thanks to fake news, native advertising, deepfake video, and the like, the line between what’s real and what’s not is becoming more and more difficult to discern.  University of Maryland professor of psychology Arie Kruglanski warns that a truthless future is not a sustainable one: “Voluminous research in psychology, my own field of study, has shown that the idea of truth is key to humans interacting normally with the world and other people in it. Humans need to believe that there is truth in order to maintain relationships, institutions and society.”

“In the extreme, a lost sense of reality is a defining feature of psychosis, a major mental illness.  A society that has lost its shared reality is also unwell.”

While examples of the innovation in imitation are fascinating, it’s concerning that in the not-too-distant future, fakes may become undetectable.  At that point, it seems our world will be well on the path to what Kruglanski forewarned: ‘losing its sense of reality’ and becoming ‘unwell.’

In the 1994 movie Speed, Sandra Bullock and Keanu Reeves try to stop a city bus that’s triggered to explode if it drops below 50 mph.  AI deception can feel like that runaway bus, barreling forward with no way to stop it or even slow it down.

However, large corporations like Amazon share the driver’s seat and have some control over the AI vehicle.  Although having them put the brakes on innovation may be too much to ask, they can at least integrate some form of notification to clearly indicate when people are seeing or hearing a fake and not the real thing.

Even with such notifications, Alexa’s application of voice impersonation is fraught with potential for abuse.  For the four reasons outlined above, Amazon should shutter plans for its smart speaker to imitate people and thereby avoid talk of “Single-Minded Marketing.”

David Hagenbuch

About the Author: Dr. David Hagenbuch is a Professor of Marketing at Messiah University, the author of Honorable Influence, and the founder of MindfulMarketing.org, which aims to encourage ethical marketing.
