AI voice scams target US officials at federal, state level to steal data

The FBI is warning about sophisticated smishing and voice-phishing campaigns that target current and former government officials, combining AI-generated voices with social engineering to steal sensitive information. This escalation represents a concerning evolution in government-targeted scams, as cybercriminals impersonate senior officials to establish trust before directing victims to malicious links that compromise personal accounts.

The big picture: Since April, cybercriminals have been targeting U.S. federal and state employees with texts and AI-generated voice messages that impersonate senior officials to establish rapport and ultimately gain access to sensitive information.

  • Once scammers compromise one account, they use the stolen information to target additional government officials or their contacts in a chain-like attack pattern.
  • The compromised information can be leveraged to impersonate legitimate contacts and extract further information or funds from unsuspecting victims.

Key warning signs: The FBI advises vigilance for several telltale indicators of these sophisticated impersonation attempts.

  • Requests to switch to different messaging platforms should be treated with immediate suspicion.
  • AI-generated media often contains noticeable imperfections, including distorted extremities, unrealistic or irregular facial features, inaccurate shadows, and unnatural movements.
  • Voice calls may exhibit lag time, voice matching issues, or unnatural speech patterns that differ subtly from the person being impersonated.

Recommended protections: The FBI outlines several defensive measures to avoid falling victim to these scams.

  • Never share sensitive information with new online or phone contacts without independent verification of their identity.
  • Avoid clicking links in unsolicited messages and refuse requests for money transfers via any method.
  • Implement two-factor authentication for all accounts, never share 2FA codes, and establish secret verification phrases with family members.

The simplest solution: The FBI notes that the most effective protection is simply to ignore calls from unknown numbers, and to report any suspected scam attempts to your local FBI field office or the Internet Crime Complaint Center.

