The Take It Down Act, a legislative measure signed into law by President Trump, targets the distribution of nonconsensual deepfakes, a form of manipulated media that misrepresents individuals without their approval.

Representative Madeleine Dean, serving Berks and Montgomery counties, has been listed as a co-sponsor of the bill.

In a significant move, President Donald Trump has signed the Take It Down Act, bipartisan legislation that imposes stricter penalties for the distribution of non-consensual intimate imagery, including deepfakes created with artificial intelligence. The bill was sponsored by Senators Ted Cruz and Amy Klobuchar and gained the support of First Lady Melania Trump.

The Take It Down Act criminalizes publishing, or threatening to publish, intimate images without consent, encompassing both real and AI-generated content. It requires websites and social media platforms to remove such material within 48 hours of notice from the victim and to delete duplicate content. While many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, this federal law is a rare instance of the federal government imposing content rules on internet companies.

The legislation garnered strong bipartisan support, with Melania Trump advocating for it on Capitol Hill in March, saying it was disheartening to witness the ordeals faced by teenagers, particularly girls, after such content is distributed. Elliston Berry and her mother visited Senator Cruz's office after Snapchat refused for nearly a year to remove an AI-generated deepfake of Berry; her case served as the inspiration for the measure. Meta, which owns and operates Facebook and Instagram, supports the legislation.

Critics of the bill argue that its language is too broad, potentially leading to censorship and First Amendment issues. Free speech advocates and digital rights groups are concerned about the law's potential to suppress legitimate images, including legal pornography, LGBTQ content, and speech from government critics. The takedown provision applies to a wide range of content and lacks critical safeguards against frivolous or bad-faith takedown requests, commented technology journalist Steve Gillmor. Online companies, particularly smaller ones with limited resources, may choose to remove content to avoid legal risk, which could lead to the premature censorship of lawful content.

The legislation pressures platforms to proactively monitor and police user speech, including encrypted communications, which raises privacy and security concerns. The reliance on automated content filters, known for both overblocking and underblocking, increases the risks of erroneous removal of protected speech. The act grants enforcement authority to the Federal Trade Commission (FTC), which some critics fear may lead to inconsistent or politicized enforcement, further limiting the protection of free speech.

  1. Despite the federal law's intent to curb the publication of non-consensual intimate images, critics argue that the bill's language could potentially lead to broad censorship, covering content such as legal pornography, LGBTQ material, and speech from government critics.
  2. The Take It Down Act, which mandates websites and social media platforms to remove intimate content within 48 hours of notice, also raises concerns about privacy and security, as it pressures these platforms to proactively monitor and police user speech, including encrypted communications.
  3. Technology journalist Steve Gillmor comments on the lack of critical safeguards against frivolous or bad-faith takedown requests in the bill, expressing worry that online companies, especially smaller ones with limited resources, may choose to remove content to avoid legal risks, potentially leading to the premature censorship of lawful content.
