Thursday, 18 May 2023 15:08

Apple Rolling out a Feature that Lets Your iPhone Sound Just Like You, What Could Go Wrong?

Written by Sean Kalinich

I’ll take stupid features for $500, Alex. It seems that Apple is looking to deploy a feature that would allow your phone to sound and reply just like you do. The feature, called “Personal Voice,” uses a form of AI to replicate the sound and speech pattern of your voice in as little as 15 minutes (cue GEICO joke here). The feature is part of an update to Apple’s built-in accessibility toolkit and, on the surface, is intended to help people who have speech challenges. Personal Voice can be used for in-person conversations and via phone calls. The feature is tied to something called Live Speech, which allows someone to type messages and have them spoken by the phone.

Now, I am sure I am not the only one thinking about the potential issues here. The ability of a phone or device to communicate for people with speech challenges is a great idea. Allowing it to sound exactly like someone is a security nightmare just waiting for you to go to sleep. AI voice technologies have already been used in some very disturbing fraud and phishing attacks, and this is sure to become just another path to the same.

As there is not much detail on how the AI learns the voice, what it takes to create the voice profile, and so on, I can imagine attackers using this function to create a voice print of someone for malicious purposes. We know that threat actors are already taking advantage of other AI technologies to impersonate people; having this one so readily available on a phone just makes things easier. And we are not just talking about cybercrime here. Imagine if someone who simply does not like you makes a print of your voice and then, as a malicious joke, has you “call” someone and say terrible things.

When I talk about a failure of imagination, this is exactly the type of shit I mean. There seems to have been no thought about how quickly this is likely to be abused, at a time when AI is already being seriously abused. Good lord, will someone please send these people some recent news, along with links to descriptions of some of the terrible fraud calls that have been made using voice-cloning technology? These concerns do not even get into the fact that your voice print could potentially become the property of Apple if you use this feature.

Sean Kalinich

