
AI can steal your voice, and there’s not much you can do about it | Voice cloning programs — most of which are free — have flimsy barriers to prevent nonconsensual impersonations, a new report finds
https://www.nbcnews.com/tech/security/ai-voice-cloning-software-flimsy-guardrails-report-finds-rcna195131

“Most leading artificial intelligence voice cloning programs have no meaningful barriers to stop people from nonconsensually impersonating others, a Consumer Reports investigation found.
Voice cloning AI technology has made remarkable strides in recent years, and many services can effectively mimic a person’s cadence with only a few seconds of sample audio. A flashpoint moment came during the Democratic primaries last year, when [robocalls of a fake Joe Biden](https://www.nbcnews.com/politics/2024-election/fake-joe-biden-robocall-tells-new-hampshire-democrats-not-vote-tuesday-rcna134984) spammed the phones of voters telling them not to vote.
Most ethical and safety checks in the industry at large are self-imposed. Biden had included some safety demands in his [executive order on AI](https://www.nbcnews.com/tech/tech-news/biden-signs-executive-order-ai-rcna122468), which he signed in 2023, though President Donald Trump revoked that order when he took office.
Voice cloning technology works by taking an audio sample of a person speaking and then extrapolating that person’s voice into a synthetic audio file. Without safeguards in place, anyone who registers an account can simply upload audio of an individual speaking, such as from a TikTok or YouTube video, and have the service imitate them.”
If you want to protect yourself from potential identity theft via this method, never answer calls from people you don’t know, and contact any institutions you work with, like banks or credit providers, to set up an authentication system so nobody can interact with your accounts through social engineering. These institutions will work with you to protect your accounts, because this is a threat to them too.
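To make the "authentication system" idea concrete: one common arrangement is a shared secret agreed with the institution in person, combined with a per-call challenge, so that a cloned voice alone is not enough to pass verification. The sketch below is purely illustrative (the function names and flow are assumptions, not any real bank’s API); it uses Python’s standard `hmac` module with a constant-time comparison.

```python
# Illustrative sketch of challenge-response caller verification.
# Assumption: the account holder and institution agreed on a shared
# secret in person; nothing here reflects a real institution's system.
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    """Institution generates a fresh random challenge for each call."""
    return secrets.token_hex(8)

def response_for(shared_secret: str, challenge: str) -> str:
    """Caller derives a short response from the secret and challenge."""
    digest = hmac.new(shared_secret.encode(), challenge.encode(),
                      hashlib.sha256).hexdigest()
    return digest[:8]  # short enough to read out over the phone

def verify(shared_secret: str, challenge: str, given: str) -> bool:
    """Constant-time comparison, so timing leaks nothing either."""
    expected = response_for(shared_secret, challenge)
    return hmac.compare_digest(expected, given)

# The secret is agreed offline and never spoken on a recorded line.
secret = "correct horse battery staple"
challenge = make_challenge()
print(verify(secret, challenge, response_for(secret, challenge)))  # True
print(verify(secret, challenge, "00000000"))                       # False
```

The point of the design is that a scammer who has cloned your voice still cannot answer a challenge they have never seen, because the response depends on a secret that was never part of any audio sample.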
With Photoshop, someone can make you look like you went somewhere you didn’t.
There are a few scams based on impersonating relatives in trouble asking for money, or employers, celebrities, and similar. Sometimes they are very well targeted to their victims using information gathered from public sources such as social media, and can be fairly convincing. Voice cloning will make them very convincing.
This highlights the urgent need for stronger regulations and security measures in voice cloning technology.
Voice AI generation isn’t that advanced yet. There are still a lot of minute speaking qualities it isn’t able to replicate. If someone were to use AI audio in court, it would probably not hold up. However, the government does tend to have more advanced technology than the rest of us, so it’s possible you could get framed by the government or something.
Wait a few years and a few pictures will be enough to create thousands of hours of video of you in every situation possible, and there’s nothing you will be able to do against that.
Your voice, your appearance, and even your personality (provided you’re a public person with a lot of data available) will be taken and used against your will; such is the nature of GenAI and AGI. You could create laws that punish it, but there’s nothing you can do against personal use.
It’s probably easier for society to adapt to this fact than to try to prevent it.
What is the solution here? A signed contract saying you have the permission of the voice’s owner?