Ghost in the Machine

“Humans struggle with mortality, especially in Western culture. For centuries we have tried to find ways to memorialize the dead, from death masks, to locks of hair, to old photos, to watching old movies,” Andrew Selepak, a social media professor at the University of Florida, told Lifewire via email. “Deepfakes use the latest technology to create a new death mask of a deceased loved one. But, depending on one’s perspective, is it creepy or a way to memorialize and hold on to someone you love after they have died?” One can only wonder how people might feel if Alexa pulled the same trick in Grandma’s voice.

Deepfakes

The apparent ease with which Alexa learns to mimic a voice points to more nefarious uses of voice cloning: deepfakes. “Deepfake audio is not new, even if it is little understood and little known. The technology has been available for years to recreate an individual’s voice with artificial intelligence and deep learning using relatively little actual audio from the person,” says Selepak. “Such technology could also be dangerous and destructive. A disturbed individual could recreate the voice of a dead ex-boyfriend or girlfriend and use the new audio to say hateful and hurtful things.” And that’s just in the context of Alexa. Deepfake audio could go far beyond that, convincing people that prominent politicians believe things they don’t, for example. On the other hand, the more accustomed we become to these deepfakes—perhaps in the form of these Alexa voices—the more skeptical we may be of the more nefarious fakes. Then again, given how easily lies spread on Facebook, perhaps not. And if the tech for deepfakes is readily available, why not use it to comfort ourselves?