Don't use LLMs for law enforcement or for automated decision-making about people (or their health), disclose what's LLM-generated, and you'll be golden.
Example: the GDPR is ridiculously easy to comply with maliciously. Just say you need the data for security reasons, and that you need all the data, all the time; otherwise you can't catch the bad guys. Several data-harvesting companies are built on this loophole.
At the same time, if you tried to comply with the spirit of it (meaning customer identity must not be recoverable even by combining data sources), basically all standard logging practices are off limits: server logging, customer telemetry, etc., even when properly anonymized, because you can reverse-engineer what people were doing from timestamps, access patterns, and correlation IDs.
Removing all of these renders logging useless. And the fun part is that even if we were hell-bent on complying with the GDPR and dropped all logs, once a failure or security incident hit, we'd be on the hook for not having done due-diligence logging, at minimum to reassure customers or users that we know what went wrong and how to fix it.
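To make the re-identification point concrete, here's a minimal sketch with made-up log records (the names, correlation IDs, and events are all hypothetical): even after user IDs are stripped, a correlation ID plus one externally known timestamped event re-links an entire session to a person.

```python
# Hypothetical "anonymized" logs: user IDs removed, but timestamps
# and correlation IDs remain, as in most standard logging setups.
anonymized_logs = [
    {"ts": "2024-05-01T10:00:01", "corr_id": "c-71", "event": "login"},
    {"ts": "2024-05-01T10:00:05", "corr_id": "c-71", "event": "view_invoice"},
    {"ts": "2024-05-01T10:02:11", "corr_id": "c-99", "event": "login"},
    {"ts": "2024-05-01T10:03:42", "corr_id": "c-71", "event": "download_report"},
]

# Side channel: a support ticket says Alice reported a problem at 10:03:42.
known_event = {"ts": "2024-05-01T10:03:42", "user": "alice"}

# Link the known timestamp to a correlation ID...
corr_id = next(r["corr_id"] for r in anonymized_logs if r["ts"] == known_event["ts"])

# ...then pull everything that "anonymous" session did.
session = [r["event"] for r in anonymized_logs if r["corr_id"] == corr_id]
print(session)  # every action of user c-71 is now attributable to Alice
```

One leaked linkage is enough; no user ID ever needs to appear in the logs themselves.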
Then there was the whole “pay or okay” controversy around paywalls or tracking ads.
My observation: saying no to tech rarely works. Building a more compelling alternative does. But the EU would rather regulate than build.
[1] https://www.politico.eu/article/europe-cookie-law-messed-up-...
This is the fault of people using unnecessary cookies.