The FBI Is Watching Out for AI Attacks in the Upcoming Election

March 5, 2024 / meritsolutions

The FBI expects the 2024 election to be a target for cyberattacks, especially those generated by artificial intelligence, FBI Director Christopher Wray told a national security conference. Specifically, Wray called these threats “fast-moving,” in the face of technology that could make election interference more accessible than ever for bad actors:

“The U.S. has confronted foreign malign influence threats in the past … But this election cycle, the U.S. will face more adversaries, moving at a faster pace, and enabled by new technology.”

“As intelligence professionals, we’ve got to highlight threats in specific, evidence-based ways so that we’re usefully arming our partners and, in particular, the public against the kinds of foreign influence operations they’re likely to confront.”

Christopher Wray

In particular, Wray is concerned about generative AI, as we all probably should be. While this technology can be useful to many different kinds of people, it can also be put to malicious use. When it comes to cybersecurity (in this case, election interference), attackers can use generative AI to improve their tactics regardless of skill level: highly trained attackers can benefit from AI in their schemes just as much as low-skilled ones can.

This won’t be the first year we’ve experienced interference in our elections. The past two election cycles were both targeted by actors from various nations hoping to influence the outcomes.

However, we’ve seen recently that the threat can come from “inside the house.” Last month, voters in New Hampshire received robocalls from what sounded like President Biden, urging them not to vote in the upcoming primary. Of course, the call was a fake: Joe Biden never made such a statement. It was generative AI, deployed by a political operative and, of all people, a New Orleans street magician.

If a couple hobbyists with some AI software and a phone book could make national news with a robocall scam that sounds like the current President of the United States, what could an entire country do?
