The Federal Election Commission will push back until its mid-September meeting any potential action on a petition that seeks to crack down on the use of artificial intelligence to deceive people in political campaigns, Chairman Sean J. Cooksey and Commissioner Dara Lindenbaum said in a statement on Wednesday.

The delay comes after a group of lawmakers this week sent a letter to the FEC urging it to back the petition from advocacy group Public Citizen, which requests that the Commission start the process to clarify that AI cannot be used to deliberately mislead people in campaign communications. 


What You Need To Know

  • The Federal Election Commission will push back until its mid-September meeting any potential action on a petition that seeks to crack down on the use of artificial intelligence to deceive people in political campaigns
  • The delay comes after a group of lawmakers this week sent a letter to the FEC urging it to back the petition from advocacy group Public Citizen, which requests that the Commission start the process to clarify that AI cannot be used to deliberately mislead people in campaign communications 
  • The petition from Public Citizen requests that the FEC begin the rulemaking process to make clear that using AI to deliberately deceive people falls under the existing fraudulent misrepresentation law prohibiting candidates for federal office and their campaigns from misleading people in their political messaging 
  • In their letter, the lawmakers cited the release by X, formerly Twitter, of Grok-2, the newest version of its AI chatbot that includes an image generator, as a reason that the issue has become “even more critical”

In the statement shared with Spectrum News, Cooksey and Lindenbaum note that the item was set to be considered at the Commission’s open meeting on Thursday after being pushed back from the Aug. 15 meeting.

“This item will be held over again, however, at Commissioner Lindenbaum’s request,” the statement reads. “This additional time is necessary in order for us to continue to work together on a consensus resolution to this matter.”

The petition from Public Citizen requests that the FEC begin the rulemaking process to make clear that using AI to deliberately deceive people falls under the existing fraudulent misrepresentation law prohibiting candidates for federal office and their campaigns from misleading people in their political messaging. 

In their letter backing the petition, sent to the FEC’s acting general counsel, a group of seven lawmakers led by Rep. Shontel Brown, D-Ohio, noted that candidates this election cycle have used AI in campaign ads to “depict themselves or another candidate engaged in an action that did not happen or saying something the depicted candidate did not say.”

The group of Democratic lawmakers, which also includes Rep. Eleanor Holmes Norton of Washington, D.C., Rep. Nikema Williams of Georgia, Rep. Dan Goldman of New York, Rep. Greg Landsman of Ohio, Rep. Summer L. Lee of Pennsylvania and Rep. Seth Magaziner of Rhode Island, went on to write that promptly addressing the issue is “critical for our democracy.” 

"AI deepfakes are here, they're already being deployed, and we have an election coming up in just weeks,” Brown said in a statement to Spectrum News, adding it is “very disappointing” that the FEC is pushing back consideration of the item until September. 

“There needs to be action sooner rather than later,” Brown said. “Twitter/X and all platforms have the responsibility to implement and require responsible use of its AI technology and, if not, the FEC must urgently step in to prevent further fraud that ultimately disenfranchises voters.”

In their letter, the lawmakers cited the release earlier this month by X, formerly Twitter, of Grok-2, the newest version of its AI chatbot that includes an image generator, as a reason that the issue has become “even more critical.”

The lawmakers specifically pointed to former President Donald Trump recently promoting images that used AI to make it look like Taylor Swift endorsed him. 

Public Citizen co-president Lisa Gilbert told Spectrum News that the FEC’s delay until September would make it “fairly unlikely” but not impossible that the clarification around AI and the fraudulent misrepresentation law could be in place by November’s election. 

She noted that the 3-3 ideological split of the FEC’s six commissioners makes it “very hard for them to move on things” and cited politics “getting in the way” as a hurdle.

Questions about whether and how to regulate artificial intelligence have proliferated in Washington, prompting President Joe Biden to issue a sweeping executive order and Senate Majority Leader Chuck Schumer, D-N.Y., to hold forums that brought some of the technology industry’s biggest names to Capitol Hill.

Some lawmakers have specifically expressed concerns about AI’s potential to impact elections. Those concerns grew earlier this year when a robocall that used AI to impersonate Biden’s voice reached voters just days before New Hampshire’s primary election, urging them not to go to the polls.