The office of Minnesota Secretary of State Steve Simon has traced the recent spread of misinformation about the state's presidential ballot to a snarky AI chatbot named Grok, which is available to premium users of the social media platform X.
After President Joe Biden dropped out of the race Sunday, screenshots circulated on social media showing Grok responding to a question about whether states' presidential ballot deadlines had passed. The chatbot cited nine states, including Minnesota, where it claimed ballots are "locked and loaded" for the Nov. 5 presidential election.
"So, if you're planning to run for president in any of these states, you might want to check if you've already missed the boat," the chatbot responded. "But hey, there's always 2028, right?"
Except that's not true.
Minnesota's deadline isn't until Aug. 26, and no other state's deadline has passed for candidates to get on the ballot for president and vice president. Still, the faulty information has been shared across several social media platforms and is getting millions of views, Simon said.
"There is considerable reach here for this misinformation," he said. "It's being repeated and it's being shared over and over again. What else are we going to see on Grok? What else are we going to see on X that perpetuates bad information?"
Through the nonpartisan National Association of Secretaries of State (NASS), which Simon leads, secretaries of state contacted a representative of X, formerly Twitter, about the issue. That representative told them that X plans to update the technology in August and that the chatbot already includes a disclaimer advising users to independently verify information.
Attempts to find a media relations person at X to respond to questions were unsuccessful.
"[They] got what I can only verbalize as the equivalent of a shoulder shrug," said Simon, who is raising concerns about whether X will correct other election-related misinformation as Election Day nears.
States are rushing to combat the threat that artificial intelligence will contribute to the spread of disinformation in the fall election.
Minnesota lawmakers passed a law last year that cracks down on the use of deepfake video or audio created by AI purporting to show someone saying or doing something that didn't happen. It's now a crime in the state to disseminate a deepfake within 90 days of an election if it's made without the consent of the person depicted and is intended to influence the result of an election.
Simon said he's worried about the platform's unwillingness to act when officials point out obvious misinformation.
"What concerns me about this episode isn't only this particular issue and the fact that a prominent platform with global reach got this wrong, it's what it may mean over the next 104 days," he said. "What happens in 20 days, 40 days or 60 days? What happens if Grok gets something wrong that hits closer to home for more voters?"
Simon said there are examples of groups getting it right. NASS recently worked with OpenAI, which runs ChatGPT, to point users asking election-related questions to NASS' page for information on registering to vote, absentee ballots and more.
"This is the first test case, which is why it's important to call this out now," Simon said. "We're having this discussion on this issue in late July and not in late September, and I hope we never have another discussion on this. It should be easy enough for X or any other platform to self-police."