Attorney General Clark Leads Bipartisan Coalition Urging Tech Companies to Stop the Spread of Deepfake Non-consensual Intimate Imagery
Letter cites report that 98% of fake online videos are deepfake non-consensual intimate imagery
MONTPELIER, Vt. – Attorney General Charity Clark today led a bipartisan coalition of 47 state attorneys general in calling on major search engines and payment platforms to take stronger action against the growing spread of computer-generated deepfake non-consensual intimate imagery, sometimes known as “deepfake pornography.” In a letter to search engines, the coalition outlines the failures of these companies to limit the creation of deepfakes and calls for stronger safeguards – such as warning users and redirecting them away from harmful content – to better protect the public. In a separate letter to payment platforms, the coalition urges these companies to take stronger action to protect the public by identifying and removing payment authorization for deepfake non-consensual intimate imagery content.
“Deepfakes pose a growing threat to all of us, but especially to women and girls, and tech companies must do more to stop the spread of these harmful materials,” said Attorney General Clark. “Vermont law already includes non-consensual deepfakes in its revenge pornography statute. It’s time for search engines and payment platforms to take responsibility for their role in spreading this harm and crack down on the proliferation of deepfakes. I am proud to have led these bipartisan letters.”
The spread of computer-generated non-consensual intimate imagery online poses significant harm to the public – particularly women and girls. The problem is growing: deepfakes have been used to embarrass, intimidate, and exploit people around the world, from celebrities like Taylor Swift to teenagers in New Jersey, Florida, Washington, Kentucky, South Korea, and Spain. Although deepfake non-consensual intimate imagery overwhelmingly targets women and girls, men and boys have been victimized as well. A recent report found that 98% of fake videos online are deepfake non-consensual intimate imagery.
In their letters, the coalition points to existing industry practices that can be deployed to address these deepfakes. For example, search engines already limit access to harmful content such as searches for “how to build a bomb” and “how to kill yourself.” The attorneys general urge these companies to adopt similar measures for searches such as “how to make deepfake pornography,” “undress apps,” “nudify apps,” or “deepfake porn.” The coalition also urges payment platforms to deny sellers the ability to use their services when the platforms learn of connections to deepfake non-consensual intimate imagery tools or content, and to remove those sellers from their networks.
Joining the coalition in sending these letters, which were authored by Attorney General Clark and co-sponsored by Kentucky Attorney General Russell Coleman, Massachusetts Attorney General Andrea Campbell, New Jersey Attorney General Matthew Platkin, Pennsylvania Attorney General Dave Sunday, and Utah Attorney General Derek Brown, are the attorneys general of Alaska, American Samoa, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Iowa, Louisiana, Maine, Maryland, Michigan, Minnesota, Mississippi, Missouri, Nebraska, Nevada, New Hampshire, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Puerto Rico, Rhode Island, South Carolina, South Dakota, Tennessee, U.S. Virgin Islands, Virginia, Washington, West Virginia, Wisconsin, and Wyoming.
Copies of the coalition’s letters are available here: https://ago.vermont.gov/sites/ago/files/2025-08/Deepfake%20NAAG%20Letters%20(combined).pdf