
Scammers are using AI to impersonate senior officials, warns FBI

Cybercriminals are using AI-based tools to clone the voices of senior US officials in order to scam people.

Malwarebytes

The FBI has issued a warning about an ongoing malicious text and voice messaging campaign that impersonates senior US officials.

The targets are predominantly current or former US federal or state government officials and their contacts. In the course of this campaign, the cybercriminals have used text messages as well as Artificial Intelligence (AI)-generated voice messages.

After establishing contact, the criminals often send targets a malicious link which the sender claims will take the conversation to a different platform. On this messaging platform, the attacker may push malware or introduce hyperlinks that direct targets to a site under the criminals’ control in order to steal login information, like usernames and passwords.
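One practical defense against this kind of link-based credential theft is to inspect where a link actually points before following it. As a minimal illustration (all domains here are hypothetical), a few lines of Python show how the hostname in a URL can be extracted and compared against the domain you expect, rather than trusting the link’s display text:

```python
from urllib.parse import urlparse

def extract_host(url: str) -> str:
    """Return the hostname a link actually points to, lowercased."""
    return (urlparse(url).hostname or "").lower()

# A lookalike link: the display text may say one thing, the URL another.
link = "https://login.example-secure-portal.xyz/session"
host = extract_host(link)
print(host)  # login.example-secure-portal.xyz

# Simple check against the domain you expected to be talking to.
expected = "example.com"
is_expected = host == expected or host.endswith("." + expected)
print(is_expected)  # False: this link does not belong to example.com
```

Real phishing sites often rely on exactly this mismatch, so automated checks of this kind are a common building block in mail and messaging filters; this sketch is only meant to show the idea, not a complete detection system.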

The AI-generated audio used in the vishing campaign is designed to impersonate public figures or a target’s friends or family to increase the believability of the malicious schemes. A vishing attack is a type of phishing attack in which a threat actor uses social engineering tactics via voice communication to scam a target—the word “vishing” is a combination of “voice” and “phishing.”

Due to the rapid developments in AI, vishing attacks are becoming more common and more convincing. We have seen reports about callers pretending to be employers, family, and now government officials. What they have in common is that they are after information they can use to steal money or sensitive information from the victim.

How to stay safe

Because these campaigns are very sophisticated and targeted, it’s important to stay vigilant. Some recommendations:

  • Independently verify the identity of the person contacting you, via a different method.
  • Carefully examine the origin of the message. The criminals typically use software to generate phone numbers that are not attributed to a specific mobile phone or subscriber.
  • Listen closely to the tone and word choice of the caller. Do they match those of the person allegedly calling you? And pay attention to any kind of voice call lag time.
  • AI-generated content has advanced to the point that it is often difficult to identify. When in doubt about the authenticity of someone wishing to communicate with you, contact your relevant security officials or the FBI for help.

If you believe you have been the victim of the campaign described above, contact your relevant security officials and report the incident to your local FBI Field Office or the Internet Crime Complaint Center (IC3) at www.ic3.gov. Be sure to include as much detailed information as possible.

