Voice hacking is an increasingly common technique in which cybercriminals use voice synthesis technology to imitate another person's voice in order to obtain sensitive information or carry out fraudulent actions. In this article, we'll explore the different ways hackers use voice hacking and how you can protect yourself against these attacks.
What is voice hacking?
Voice hacking refers to techniques that manipulate a person's voice using advanced technologies. This includes voice cloning, in which artificial intelligence is used to replicate a specific person's voice and create authentic-sounding recordings. It is also known as voice deepfake or deep voice, and it can generate fake messages that appear to come from a real person.
Scammers can record a person's voice without their consent and use that audio to create fake messages that sound authentic. This type of attack can compromise the security of devices such as virtual assistants (Alexa, Siri, Google Home) and other IoT devices.
Types of voice hacking scams

Vishing: a scam technique in which criminals use phone calls to trick victims into giving away sensitive information. Scammers often pose as representatives of trusted institutions, such as banks, customer service departments, or even government agencies. During the call, they use social engineering to gain the victim's trust, convincing them that they must hand over sensitive data such as bank account numbers, passwords, or credit card details. Criminals can then use this information to make fraudulent transactions or commit other types of financial fraud.
Voice spoofing: cybercriminals employ advanced voice cloning software, also known as voice deepfake software, to replicate a specific person's voice. This technology uses artificial intelligence to analyze recordings of a person's voice and create a synthetic version that sounds almost identical to the original. Criminals can use this cloned voice to make phone calls or send voice messages that appear to come from someone the victim trusts, such as a company executive or family member. This allows them to trick victims into transferring money, providing sensitive information, or making online purchases under false pretenses.
Voice assistants and IoT devices: attackers can exploit vulnerabilities in devices such as virtual assistants (Alexa, Google Home, Siri) or Internet-connected home security systems to execute unauthorized commands. This can include making online purchases, disabling security alarms, accessing personal information stored on the device, or even spying on users by activating microphones. These attacks can be carried out by spoofing voice commands or by taking advantage of security gaps in the device's software.
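One common mitigation against spoofed voice commands is to require a second factor (such as a spoken PIN or app confirmation) before a device executes a sensitive action. The sketch below is a minimal, hypothetical illustration of that idea; the command names, the PIN mechanism, and the handler function are assumptions for illustration, not the API of any real voice assistant.

```python
# Hypothetical sketch: gate sensitive voice commands behind a second factor.
# Command names and the PIN check are illustrative assumptions only.

SENSITIVE_COMMANDS = {"disable_alarm", "unlock_door", "make_purchase"}

def handle_command(command, pin=None, expected_pin="4921"):
    """Execute a voice command, refusing sensitive ones without a valid PIN."""
    if command in SENSITIVE_COMMANDS:
        if pin != expected_pin:
            # A cloned voice alone cannot supply the second factor.
            return "rejected: second factor required"
        return f"executed (verified): {command}"
    # Harmless commands need no extra verification.
    return f"executed: {command}"

print(handle_command("play_music"))                 # executed: play_music
print(handle_command("disable_alarm"))              # rejected: second factor required
print(handle_command("disable_alarm", pin="4921"))  # executed (verified): disable_alarm
```

The point of the design is that even a perfect voice clone fails the check, because the second factor is something the attacker does not have.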
Some known cases
These cases highlight the importance of being cautious with phone calls and voice devices, and show why we need to verify before we act.
Voice of a family member
A user contacted INCIBE's Cybersecurity Helpline after receiving a suspicious call. On the call, she heard her husband's voice asking her to send a message to a specific number. Suspecting that something was wrong, the user called her husband on his usual number, and he confirmed that he had not made the call. It was concluded that the voice had been generated using artificial intelligence, possibly from recordings obtained in earlier suspicious calls.
Jennifer DeStefano Case
In 2023, Jennifer DeStefano received a call from someone who, using voice cloning technology, imitated her daughter's voice and told her that she had been kidnapped. A man on the call demanded a $50,000 ransom. Terrified, Jennifer tried to buy time and seek help, only to later discover that her daughter was safe and had never been kidnapped. The case highlights the danger of voice cloning being used to scam and emotionally manipulate people.
Ruth Card Case
A 73-year-old Canadian woman received a call from someone who sounded like her grandson, telling her that he was in custody and needed money to post bail. Worried, Ruth and her husband withdrew 3,000 Canadian dollars from the bank. However, before handing over the money, they realized they had been victims of a scam. The voice they heard was not their grandson's, but an artificial intelligence-generated clone.
UK energy firm scammed
In 2019, a UK energy company was scammed out of €220,000 when attackers used artificial intelligence software to clone the voice of the parent company’s CEO. Using this cloned voice, the scammers made a phone call to the director of a subsidiary, requesting an urgent transfer of funds to an account in Hungary. The voice was so convincing that the director, confident in the authenticity of the request, ordered the transfer without hesitation. The fraud was discovered too late to recover the money, highlighting the need for additional checks and awareness of these threats in companies.