
Following the 2020 presidential election, elections officials in Washington state received so many public records requests about the state’s voter registration database that lawmakers changed the law, redirecting those requests to the Secretary of State’s office to ease the burden on local elections workers.
Democratic state lawmaker Patty Kuderer, who supported the legislation, says county auditors testified about how long it was taking them to respond to public records requests. Processing those requests can be costly, and some smaller counties lack the staff needed to handle them; a flood of requests could easily overwhelm them.
Experts now worry that election deniers could use generative AI to mass-produce FOIA requests at an even greater rate, burying the election workers legally required to respond to them in paperwork and gumming up the electoral process. Specialists who spoke to WIRED expressed concern that governments are ill-equipped to push back against election deniers and that generative AI companies lack the guardrails necessary to stop users from trying to stifle the work of election workers in a crucial election year, when threats are greater and resources more strained than ever.
Chatbots like OpenAI’s ChatGPT and Microsoft’s Copilot can generate FOIA requests in seconds, right down to citing state-level public records laws. According to Zeve Sanderson, executive director of New York University’s Center for Social Media and Politics, this could make it easier than ever for individuals to flood local elections officials with requests, and harder for those officials to ensure that elections run smoothly.
“We know that FOIA requests have been used in bad faith previously in a number of different contexts, not just elections, and that [large language models] are really good at doing stuff like writing FOIAs,” says Sanderson. “At times, the records requests themselves seemed designed to require work to respond to. If someone is working to fulfill a records request, they are not working to run elections.”
WIRED was easily able to use Meta’s Llama 2, OpenAI’s ChatGPT, and Microsoft’s Copilot to generate FOIA requests for a number of battleground states, specifically requesting information on voter fraud. The FOIA created by Copilot asks about voter fraud during the 2020 election, even though WIRED provided only a generic prompt and didn’t ask for anything related to 2020. The generated text also included the specific email and mailing addresses to which the requests could be sent.
When WIRED asked whether Microsoft had put guardrails in place to prevent election deniers from abusing its tools, Caitlin Roulston, the company’s director of communications, said Microsoft was “aware of the potential for abuse and [has] detections in place to help prevent bots from scraping our services to create and spread spam.” Roulston did not say what those measures were, or why Copilot specifically generated a FOIA request asking about voter fraud in the 2020 election. Google’s Gemini would not return a FOIA request. Neither OpenAI nor Meta responded to requests for comment.
It can be challenging to tell whether a given request was written by a person or generated by a chatbot. However, under the new law in Washington state, government officials are permitted to deny a “bot request,” defined as a request for “public records that an agency reasonably believes was generated by a computer program or script” and that they believe would interfere with the agency’s other functions.
“I think it’s safe to say that most state and local governments are underfunded and lack the tools to identify when a request is coming from a real person or if it’s AI generated or otherwise,” says Rebecca Green, codirector of the election law program at William & Mary Law School. Without those tools, and depending on what state law requires, local officials are “really left hanging in the wind” to determine whether a request was submitted by a human and how to comply with the law.
Last year, several companies, including Microsoft, OpenAI, and Google, voluntarily pledged to develop watermarking systems as a way to designate content created by AI. Watermarking text-based outputs can mean, for instance, having the model use a particular word more frequently than would occur naturally, so that a computer can statistically recognize the text as AI generated. But many of these systems are still in development, and state and local governments would still need the training and technology to properly scan for and identify the watermarks.
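Detecting a text watermark is, at bottom, a statistical test. As a rough, heavily simplified sketch, assuming a “green list” scheme of the kind described in academic research rather than any vendor’s actual system, and using a made-up hashing rule and sample text, a detector might check whether a passage uses the favored words more often than chance would predict:

```python
import hashlib
import math

def is_green(prev_token: str, token: str) -> bool:
    # Hypothetical "green list" rule: hash the previous and current token
    # together and treat roughly half of all possible words as "green".
    # A watermarking generator would bias its sampling toward green tokens.
    digest = hashlib.sha256(f"{prev_token}|{token}".encode()).digest()
    return digest[0] % 2 == 0

def watermark_z_score(tokens: list[str]) -> float:
    # Count how many tokens land on the green list, then compare against the
    # ~50% rate expected of unwatermarked (human-written) text. A large
    # positive z-score suggests the text came from a watermarked generator.
    n = len(tokens) - 1
    green = sum(is_green(prev, tok) for prev, tok in zip(tokens, tokens[1:]))
    return (green - 0.5 * n) / math.sqrt(0.25 * n)

if __name__ == "__main__":
    sample = ("Pursuant to the state public records act I request all "
              "correspondence regarding ballot tabulation").split()
    print(f"watermark z-score: {watermark_z_score(sample):.2f}")
```

Real schemes operate on the model’s own tokens and secret keys held by the AI company; without that key and the accompanying tooling, an overwhelmed county records office could not run even this simple check.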
“We don’t have the luxury of time to figure this out if we want safe and secure elections,” Green says.
Even though bad actors might abuse chatbots, says David Levine, senior elections integrity fellow at the Alliance for Securing Democracy and a former county elections director, they can still be useful tools to help people with legitimate inquiries navigate the public records process. “People ought to be able to get access to information,” he says. And one could imagine scenarios in which people who aren’t sure how to request information use chatbots to obtain records that deepen their understanding of how elections work.
Local officials are legally required to respond to records requests within a certain window of time, Levine says, and it’s difficult to know which requests are made in good faith and which aren’t.
If Mike Lindell and his associates want to try to conduct what amounts to a DDoS-style attack using FOIA requests, Levine says, election officials need to at least be aware of that possibility and trying to plan for it.
FOIA requests are just one way election deniers have pursued election workers. In the wake of former president Donald Trump’s false claims that the 2020 presidential election was rigged, election workers across the country have faced a deluge of violent threats and intimidation, and local elections officials continue to be targeted. A law passed earlier this year, also in Washington state, makes it a felony to harass election workers. Kuderer, the Democratic lawmaker, says that the threats to elections officials spurred state lawmakers to allocate more money for local auditors to “beef up security, if they feel they need to do that.”
The threats and scrutiny have, in some cases, driven elections workers to quit. Turnover among election officials has increased since 2020, according to a report released this week by the Bipartisan Policy Center.
With many experienced election officials leaving the field and being replaced by newcomers, Levine says, there is a real concern that those replacements are still getting up to speed as election day draws nearer. That task becomes even more challenging when they are dealing with a flood of FOIA requests. It’s one thing to respond to numerous FOIA requests well in advance of an election; it’s another entirely to respond to a flurry of them during early voting or on election day itself.