How have DMCA takedowns gone? How many upstanding developers have been nuked, without recourse, by some invalid or malevolent false report? How many times[0] has HN loudly lamented DMCA takedowns and decried how awful it is to have state censorship, one without any meaningful adversarial process?
Well, here we are again: "The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests".
If courts don't strike this down, this might end up becoming DMCA for the whole of society. It'll certainly be used by politicians to silence critics. It'll be overrun by trolls. You'll see the results, and you'll hate them.
> "And I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”"
The bill does have a logical flaw that nullifies it for defense purposes, but I also don’t think that enforcement is really the point of the law. It is much more likely that the intended purpose is intimidation and acquiescence, i.e., making the currently-still-painless violation of rights the easiest route for entities to follow.
It has been the MO for many decades now to undermine, infiltrate, and subvert the fundamental laws in the USA that restrict the government and authoritarians from infringing on the inalienable, God-given rights of the people.
DMCA does have an adversarial process. You're more likely thinking of e.g. YouTube's automatic DMCA-like process.
But that adversarial process has timelines that keep content removed for a minimum of ten days, which is plenty of time to cause damage.
- "The takedown provision applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the bill. The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored."
So sexually or intimately themed memes that use realistic-looking imagery may end up subject to takedown. I also find myself disagreeing with the EFF here, despite generally being a tremendous supporter of their work. In particular, their main argument is that existing laws already address this issue without introducing new, potentially abusable legislation:
- "If a deepfake is used for criminal purposes, then criminal laws will apply. If a deepfake is used to pressure someone to pay money to have it suppressed or destroyed, extortion laws would apply. For any situations in which deepfakes were used to harass, harassment laws apply. There is no need to make new, specific laws about deepfakes in either of these situations."
But I think on this issue one should not need to suffer some form of measurable loss to have intimate/sexual images removed from a site, let alone have to go through the legal system and file a lawsuit to achieve it.
[1] - https://en.wikipedia.org/wiki/TAKE_IT_DOWN_Act
[2] - https://www.eff.org/deeplinks/2025/04/congress-passes-take-i...
I have friends who were victims of revenge porn, and I think this law would help them. I’m looking forward to this becoming law.
I understand the horrors, and the so-called purpose, this new act is trying to address, but the genie is out of the Stable Diffusion (and successors) bottle.
Just this weekend, Trump signed an executive order declaring and celebrating World Intellectual Property Day [0], the kind of day on which the IIPA and the DMCA would conceive their lovechild, while creepy uncles ACTA, TPP, TRIPS, and SOPA/PIPA lurked, and maybe even watched, from the attic.
The TAKE IT DOWN Act fits right in with this family reunion, handing those in power another tool to censor the internet under the guise of protecting "intimate images." Its net is so vague and so wide that it'll have even more platforms spooked into nuking fair use or satire just to dodge legal heat. It's no coincidence that censorship goes hand in hand with IP laws: control the content, monitor what people are submitting. This is exactly why people are building distributed tech like IPFS, ActivityPub, and federated networks to keep information flowing, not locked behind government filters or takedown ultimatums.
Before you know it, it will be obscene to disagree with those in charge.
[0] https://www.whitehouse.gov/presidential-actions/2025/04/worl...
Why do you believe it runs afoul of the First Amendment?
The actual problem is the fair-use problem, because it prevents you from e.g. creating a third-party Netflix or Twitter client without the corporation's approval. Which in turn forces you to use their app, puts them in control of recommendations, and keeps you within a walled garden instead of exposing you to other voices, etc. It's a mechanism to monopolize the marketplace of ideas.
Of course, Apple et al have turned this against the creators in order to extract their vig, which is a related problem.
> the Ninth Circuit Court of Appeals ruled that software source code was speech protected by the First Amendment and that the government's regulations preventing its publication were unconstitutional.
Similarly, absent some Constitutional protection, states can restrict who can purchase lock picks.
That law is consistent with trade secret law in general. The First Amendment does not require trade secrets to lose all protection. If it did, you could freely disclose your own employer’s secrets without penalty.
Yes; in order for trade secrets to be protected, they have to be secret.
> Did Bunner work at the DVD Consortium?
I have no knowledge of this.
yeah, when people got unlawfully sent to a prison in another country even though they were US citizens, lawmakers sprang into action (/s)
that video of donald sucking elon's toes probably counts.
Take It Down Act: A Flawed Attempt to Protect Victims That'll Lead to Censorship - https://news.ycombinator.com/item?id=43296886 - March 2025 (38 comments)
The Take It Down Act isn't a law, it's a weapon - https://news.ycombinator.com/item?id=43293573 - March 2025 (30 comments)
The "Take It Down" Act - https://news.ycombinator.com/item?id=43274656 - March 2025 (99 comments)
There is no line.
https://www.congress.gov/bill/119th-congress/senate-bill/146...
I can't identify where the EFF's concerns are coming from. There's a specific limitation of liability for online platforms, and the entire process appears to be complaint-driven and requires quite a bit of evidence from the complainant.
What actually concerns me in this bill?
> (B) INVOLVING MINORS.—Except as provided in subparagraph (C), it shall be unlawful for any person, in interstate or foreign commerce, to use an interactive computer service to knowingly publish a digital forgery of an identifiable individual who is a minor with intent to—
> “(i) abuse, humiliate, harass, or degrade the minor; or
> “(ii) arouse or gratify the sexual desire of any person.
> “(C) EXCEPTIONS.—Subparagraphs (A) and (B) shall not apply to—
> “(i) a lawfully authorized investigative, protective, or intelligence activity of—
> “(I) a law enforcement agency of the United States, a State, or a political subdivision of a State; or
> “(II) an intelligence agency of the United States;
Wut? Why do you need this? Are we the baddies?
The limitation on liability is only saying they're not responsible for the consequences of taking something down, not for the consequences of leaving something up.
That, plus the FTC being able to sue any company over the inevitable false negatives, means that the only reasonable response to takedown requests is to be extremely cautious about rejecting them. It'll inevitably be abused for spurious takedowns, far more than the DMCA already is.
Hardly seems difficult. I think a lot of services have TOSes which cover this type of content. The text of the bill also plainly defines what is covered.
> are trivial to automate
And removal is trivial to automate. I'm pretty sure providers already have systems which cover this case. Those that don't likely don't allow posting of pornographic material whether it's consensual or not.
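For illustration, here's a minimal sketch of what that kind of automation could look like, assuming a simple exact-hash blocklist; real systems reportedly use perceptual hashes (e.g. PhotoDNA or PDQ) so re-encoded copies still match, and every name below is hypothetical:

    import hashlib

    # Hashes of images named in verified takedown requests.
    # In practice this would be a database table fed by the
    # takedown-request workflow, not an in-memory set.
    takedown_hashes: set[str] = set()

    def register_takedown(image_bytes: bytes) -> None:
        """Record the hash of an image subject to a takedown."""
        takedown_hashes.add(hashlib.sha256(image_bytes).hexdigest())

    def should_block_upload(image_bytes: bytes) -> bool:
        """Reject uploads that byte-for-byte match a taken-down image."""
        return hashlib.sha256(image_bytes).hexdigest() in takedown_hashes

An exact hash only catches identical re-uploads, which is why production systems lean on perceptual hashing instead, but the plumbing is the same.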
> they're not responsible for the consequences of taking something down
So the market for "intimate depictions" got a little harder to participate in. This is a strange hill to fight over.
> It'll inevitably be abused for spurious takedowns
Of pornographic content. The law is pretty well confined to "visual depictions." I can see your argument on its technical merits; I just can't rationalize it into the real world other than for some absurdly narrow cases.
The whole point of this discussion is that this is going to be used to censor everything, not just 'intimate visual depictions'.
It doesn't; however, you would immediately have a civil case against the person and/or their representative for their false claim. That's not spelled out in the bill but it should be obvious.
> The whole point of this discussion is that this is going to be used to censor everything
That's the claim. You may accept it without objection. I simply do not. Now I'm offering a slightly modified discussion. Is that alright?
> not just 'intimate visual depictions'.
I'm sure you would agree that any automation would obviously only be able to challenge images. This does create a vulnerability, to be sure, but I do not agree that it automatically creates the wholesale censorship of political speech that you or the EFF envision here.
It also makes an effort to be scoped only to sites which rely on user-generated content, effectively limiting it to social media platforms and certain types of adult-content websites. Due to their nature, it's already likely that these social media platforms do _not_ allow adult content on their websites and have well-developed mechanisms to handle this precise problem.
The bill could be refined for civil liberties' sake; however, in its current state, I fail to see the extreme danger of it.
Wow. Not sure if this is ludicrously bad faith or just ludicrously naive/ignorant/unthinking but it's ludicrous either way. Plenty enough to nullify every other thing you attempt to say on the topic.
Great. So a motivated bad actor can send out 10,000,000 bogus takedowns for images promoting political positions and people they disagree with, and they have to be taken down immediately, and then all 10 million people affected have to individually figure out who the hell actually submitted the takedowns, and have enough money for lawyers, and enough time to engage in a civil suit, and in the end they might get money, if they can somehow prove that taking down those specific images damaged them personally, rather than just a cause they believe in, but will they get the images restored?
This just smacks of obliviousness and being woefully out of touch, even before we get to
> That's not spelled out in the bill but it should be obvious.
...which almost makes it sound like this whole thing is just an elaborate troll.
... or to finally hire enough moderators to make competent judgments and avoid getting countersued for restricting free speech.
> IN GENERAL.—The term “covered platform” means a website, online service, online application, or mobile application—
> (i) that serves the public; and
> (ii) (I) that primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or
> (II) for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions.
If you publish the content on your own website or on certain public websites, you don't even have to respond to these requests. You do, however, open yourself to criminal and civil liability for publicly hosting any nonconsensual visual images. Your provider is also excluded and cannot legally be compelled to participate in their removal.
Assume they are a serial liar. Or that they are a political opponent with zero shame.