AI-TechTalker

AI Impersonating Your Loved Ones

TechTalker
Apr 23, 2024

Photo: silver microphone near an audio mixer, by Jacek Dylag on Unsplash

Casey Kasem was a well-known celebrity whose voice was so distinctive and refined that millions tuned into the radio weekly to hear him count down the Top 40 songs of the week. In the 1990s, Casey Kasem called a travel agency and asked them to send airline tickets in the names of others whom he personally vouched for. The travel agency might not have done that for you or me, but for Casey Kasem they promptly mailed the tickets to the specified address. It wasn't until later, when the travel agency tried to collect payment for the tickets, that they realized Casey Kasem had never called them. It was an impersonator who had convincingly emulated Mr. Kasem's voice. Because few people were known to be able to imitate Mr. Kasem's distinctive voice, Mr. Kasem suspected the culprit and confronted him. The culprit confessed and was eventually sentenced to probation.

Today, impersonating a voice as unique and refined as Casey Kasem's is child's play for AI. And while voice cloning can be put to many good uses, as Casey Kasem learned first-hand it can also be used for fraud. Many believe such fraudulent uses of AI-generated voices are already widespread. It is time to beware.
