GDPR meant nothing: chat control ends privacy for the EU
🤖 Article status notice: This is an automatically generated transcript
This video transcript has been created using automated tools and has not been checked for accuracy by a human.
As the article incorporates text from a large language model, it may include inaccuracies or hallucinated information. Please keep this in mind if you are using this article as a source for information.
If you determine any of the contents to be inaccurate, please add a notice using the {{Important}}
template at the top of the page (more information here) and contact a moderator to correct or replace this entry.
To contact a moderator, you can use the #appeals
channel on our Discord server (join using this link) or use the talk pages on the wiki and leave a message for any of the moderators. List of current moderators.
Header Information
- Channel: Louis Rossmann
- Video: GDPR meant nothing: chat control ends privacy for the EU
- Date: 2025-08-16
- Description:
https://www.youtube.com/watch?v=NE06Tw9UWM8
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:52022PC0209
https://data.consilium.europa.eu/doc/document/ST-12611-2023-INIT/en/pdf
https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52022PC0209
https://old.reddit.com/r/europe/comments/1mntbvn/eu_chat_control_proposal_would_scan_all_your/
Tools Used to Create this Transcription
Transcription Generator: https://tactiq.io/tools/youtube-transcript
AI Summary: https://chatgpt.com
AI Summary Prompt: "You are an expert at analyzing and summarizing transcripts that are easy to digest and understand by people unfamiliar with the subject matter. Please summarize this transcript:"
AI Disclaimer
The Summary and Transcription below were generated using artificial intelligence (AI). While efforts have been made to ensure accuracy and coherence, the following points should be noted:
- The transcript is machine-generated and is likely to contain inaccuracies, omissions, or misinterpretations due to the limitations of automated transcription technology.
- The summary, created using AI, is derived from this transcript and will likely not capture the nuances, tone, and context of the original content.
- Users should exercise caution and verify the information, considering the compounded limitations of two layers of AI processing.
AI Summary
Louis Rossmann criticizes a proposed European Union law called “chat control”, which would require scanning encrypted messages on apps like Signal, WhatsApp, and Messenger for child sexual abuse material (CSAM) and grooming. While framed as protecting children, Rossmann argues this would effectively destroy privacy by forcing governments or companies either to decrypt private messages or to scan them before encryption, completely undermining the purpose of end-to-end encryption.
He highlights several major issues:
- Privacy violations: It’s like having a police officer listening to every conversation everywhere you go.
- False positives and overload: AI or scanning systems would misinterpret messages, generate false reports, and overwhelm law enforcement with useless data.
- Double standards: Government officials’ communications are exempt from the law, while ordinary citizens’ messages would be monitored.
- AI unreliability: Automated systems misinterpret normal conversations (like technical IT jargon) as suspicious, risking innocent people being flagged.
- Erosion of trust: People who value privacy are often viewed as “suspicious,” even though companies like AT&T and Verizon have already been caught selling customer data with minimal fines.
Rossmann is frustrated that despite the EU’s reputation for protecting privacy (as with GDPR and consumer protections), it now risks being worse than the U.S. in this area. He urges EU citizens to act quickly, directing them to fightchatcontrol.eu to contact their representatives and protest before the law is passed.
Ultimately, he frames the issue as a fight for fundamental freedoms: if governments can monitor every private message “for safety,” then privacy no longer exists. He stresses that defending privacy doesn’t mean supporting crime—it means protecting ordinary people from unjust surveillance.
AI Transcription
Hey everybody, how's it going? Hope you're having a lovely day. Today I want to follow up on a video that I did two years ago about how the EU seems to be trying their best to ruin the internet. It's something that confuses me a lot as an American, because when I look at it, whether or not we always agree on every single issue, the EU seems to at least try to care about protecting things like privacy with the GDPR. When it comes to consumer rights, you want companies to offer real warranties, not that 3-to-9-month bulls--t where the product can die on month 11 and you're screwed. And when a company does something wrong in the EU, you fine them a real amount of money. You don't play that s--t of 0.02% of their yearly profit when they have sold data to bail bondsmen and bounty hunters and advertisers.[1] You want companies to be held accountable and responsible the same way that I want telemarketers to be held accountable and responsible. This is why I'm really confused about what you guys are doing with this chat control bill.
So, this is something that I went over in summer of 2023.[2] This was one of my most famous videos for this gaffe that I made where I said, "Oh my god, who voted for you?" And somebody said, "How did she get elected? Louis, do we have news for you." We were referring to an EU commissioner who was suggesting that you could sniff encrypted communications. This is from an interview; Mullvad put this on their blog:[3]
"It's about sniffing, checking out, you could say. It's not as if you could read the communication. I mean, it's like a police dog being able to smell if there's something there."
And Mullvad says
"It's not possible to 'sniff' end-to-end encrypted communication without looking at the encrypted communication."
They want this chat control thing to get pushed through because it's supposed to protect the children. It's supposed to protect against grooming. It's supposed to find CSAM, which is child sexual abuse material. Yeah, let's get rid of privacy altogether. This is in the context of messaging apps. So, we're talking about Signal, Facebook Messenger, WhatsApp, stuff like that. They want to make sure that no CSAM, no child abuse material, no sort of grooming is going on. But how do you do that? The way that you do that is by detecting those types of messages. How do you detect those types of messages? Either A, you decrypt them, or B, you read them before they're encrypted. Either way, that completely f--king defeats the purpose of what Signal does. Signal is there so that when I send a message to my friend, the only people reading the message are myself and my friend. And maybe a reporter at The Atlantic, if you work for the White House, but I'm sorry, I can't help myself. The whole point of this is privacy. That's the entire point. You are removing that if you want to have a dog sniffing the communications next to us.
When I go to the airport and there's a dog there, that dog is employed by the TSA or the police. So what you're suggesting is that when I'm sitting in my own house, having a conversation with my friend who's also sitting in their own house which they paid for, we should have the police right next to us. That there should be a policeman next to me every f--king place that I go. I send text messages when I'm on vacation. I send text messages on the beach. I send text messages when I'm at work. I send text messages in my house. I send text messages when I'm exploring and going to Diana's Baths in New Hampshire. So, what you're suggesting is that the police be next to me everywhere that I go when I communicate, for the rest of my life. That's a bulls--t analogy. And even if you take it as a real analogy, that actually makes it worse. Let's go through this and let's just go through why this is a horrible f--king idea. And then this is where I ask all you guys in the EU, what the f--k are you doing?
Firstly, when you look at the actual numbers, about 10% of these were false positives immediately.[4] So 10% of what gets submitted is just not child abuse material. About 20% were marked as child abuse material and the rest of it was just a bunch of random bulls--t that was never actually acted on. If you were to create some method of scanning what people do with their end-to-end encrypted communications, what you're going to do at that point is you're going to overrun law enforcement agencies to the point where they can't respond to anything. How many people does the EU have like 440-450 million people? You're going to scan the communications of every single one of these people.
One of the ways that they propose doing this is using AI.[5]
Suitable technology for recognizing known CSAM is hash values and PhotoDNA. For new CSAM and grooming, AI is used, similar to the technology used for recognizing spam and virus content.
So, you are going to use AI to figure out whether or not something that I sent or a message I sent is that type of material. There's a great post on Reddit that I have to highlight from Rizzan8.[6]
Think about chats about games like Counter-Strike, like where to put a bomb, or with some OS-related queries: how to kill a child when its parent no longer runs.
If you are not a Linux systems administrator, you probably think that that's something nobody should ever say. If you're a Linux systems administrator, that's probably just run-of-the-mill talk for you.
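To make that false-positive problem concrete, here is a minimal Python sketch of a context-blind keyword scanner. This is purely illustrative: the term list is invented, and no real proposal works on bare keywords (they use ML classifiers), but the failure mode the transcript describes is the same, since neither approach understands context like "child process":

```python
# Toy context-blind scanner. SUSPICIOUS_TERMS is a made-up example list;
# real systems use ML classifiers, but likewise lack conversational context.
SUSPICIOUS_TERMS = {"kill", "child"}

def flags_message(message: str) -> bool:
    """Flag a message if every 'suspicious' term appears somewhere in it."""
    words = set(message.lower().split())
    return SUSPICIOUS_TERMS.issubset(words)

# Ordinary Linux sysadmin talk trips the filter...
print(flags_message("how to kill a child process when its parent no longer runs"))  # True
# ...while plenty of genuinely harmful conversations contain no flagged terms.
print(flags_message("see you at the usual place tonight"))  # False
```

At EU scale (roughly 450 million people's messages), even a tiny false-positive rate on a scanner like this produces the flood of useless reports Rossmann describes.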
I use AI for a lot of different things. A lot of the times it's very useful, but also a lot of the times it gives me really stupid s--t. And if that is going to be what is trusted with sending my personal private communications to law enforcement, get the f--k out of here with that s--t. Let's just put aside the fact that that's not even going to work based on the numbers alone. You're going to overload the police with so much s--t they're just not going to be able to go through any of this to begin with. Look at what you're giving up in order to do that. How far are you willing to go to do this whole protect the children thing? There are kids that get hit in parking lots by cars. Kids get hit by cars all the time. Let's just get rid of cars. Let's get rid of roads. Hell, let's not let them go outside. Cuz if you don't let the kids go outside until they're 18, there's no chance of them getting hurt. I'm being dead serious. Your proposal is to have a f--king Bonzi buddy sitting next to me, reading every single f--king thing that I say and reporting it to the police to figure out whether or not I'm a child abuser to protect kids. That's insane.
One of my personal favorites, going through this law and its proposals when I read through the actual source text, is rules for thee, but not for me. Section 12A, or recital 12A, I forget.[7] They have all these different weird names for s--t in the EU.
In light of the more limited risk of their misuse for the purpose of child sexual abuse, and of the need to preserve confidential information, including classified information, information covered by professional secrecy and trade secrets, electronic communications services that are not publicly available, such as those used for national security purposes, should be excluded from the scope of this regulation. Accordingly, this regulation should not apply to interpersonal communications services that are not available to the general public and the use of which is generally restricted to persons involved in the activities of a particular company, organization, body, or authority.
The people that govern you are not going to have their personal communications spied on and sent to f--king Bonzi Buddy to figure out if it is abuse. But yours are.
All options focused on the objective of ensuring the detection, removal, and reporting of previously known and new CSAM and grooming by relevant online service providers established in the EU and in third countries, insofar as they offer their services in the Union.[8]
So why is this important? Well, lots of methods that are used right now go through databases of known CSAM and ask, "Okay, does this image, does this video, match something that is in a database of stuff that is known to be child abuse material?" That's one thing, because it's not looking for the s--t that you say. It's looking for a very specific hash of a specific file. Here, it's trying to find new stuff, which means that it's not just doing a dumb determination of 'does this picture match the hash of a known piece of child pornography.' It's doing a 'let me think to myself and figure out if this is child pornography,' and those are two very f--king different things. To be clear, I don't want my device or any of my messages being scanned by any of these f--ks for any of these things. But it's much more dangerous when you're going to have some sort of algorithm or AI try to figure out if what I am doing is that. Again, that's where you get into the question of when somebody says something like 'how to kill a child when its parent no longer runs.' How many of you have asked AI a question and gotten a stupid answer? How many of you want to get reported to the police based on that Bonzi Buddy giving you a stupid answer?

The thing that makes me sad about this is that it most likely will go through, and there's probably nothing you can do about it. So, I had hired somebody a few months ago for a temporary position, somebody who's what I would consider a quote 'normie,' to do kind of more public-relations-type stuff: try to get the issues that I talk about here featured in the normal media. I already have an engaged audience of 2 million people. I'm speaking to the choir, but I wanted to figure out how to speak to others. And this was a communications chain back and forth with this employee at one point in time:
I was told that Signal is the sus WhatsApp. Like anyone that doesn't want anyone to know what they're speaking to because they're having sus relations. They use Signal. Very he doesn't have to know types. You know what I mean.
So my response was as follows:
AT&T has been selling its customers location data to creditors, bounty hunters, landlords, prison officials, and all sorts of third parties.[1]
And then I followed up with all of the documents.
AT&T, Sprint, T-Mobile, and Verizon collected user data and sold it to advertisers, bounty hunters, and bail bondsmen. When this came to light during an FCC investigation, AT&T was fined a dollar amount that was 0.02% of their net profit for a year. This is akin to me stealing the text conversations you have with your boyfriend and selling it to Jerry Springer and getting a $20 fine.
The response:
This is nuts and infuriating. Why is there no rule passed against this?
I said:
There is a rule. That's why they were fined $57 million. That's 0.02% of AT&T's yearly net profit. Horrible.
And this comes back to a theme I talk about on my channel all the time, which is we spend so much time beating each other up that we don't spend time focusing on the people who are actually screwing us, the people that are selling our data to bail bondsmen that get bulls--t fines for it and continue to do it over and over again. She was very, very aggravated when she sent that message of like, 'how is this a thing? How is there not a rule against this?' And another answer I could have given in that moment is when people like us who are privacy advocates say that's wrong, you say that we're sus. We spend too much time beating each other up rather than focusing on the people who are actually screwing us. And as long as that happens, whether it's in the EU or the US or anywhere else about any topic, we're not going to get real change. If you want your government to install a Bonzi buddy next to you and a little doggy that follows you every f--king place that you go, like that Metalhead episode of Black Mirror, be my guest.[9] But I don't want it to be that way. Which is why I try to foster a world where we're not beating the s--t out of each other, but rather working together for collective change. Because this is the default thought process for most people. You want privacy. You must be dangerous.
Another experience that I had, and this is funny, dates back a long time ago on a dating app. I was having a conversation with somebody. It was actually going really, really well. At some point, I said, "Hey, here's my number. I'm on Signal. If you want to take a walk downtown sometime, I'd love to talk more." And she sent something back, this paragraph-long lecture of, okay, so a man that I have never met wants to speak to me on a highly encrypted messaging app, and blah blah blah. And I'm like, yes. Yes. I don't want AT&T and T-Mobile f--king selling my text message data and my location and everything without my consent.
And what I've learned over and over again is that we're the psychos. We're the crazy people because I don't want AT&T and Verizon and T-Mobile selling my data to f--king advertisers, bounty hunters, and bail bondsmen. I'm crazy because I believe the only person who should be reading my messages are myself and the person I send it to. I'm crazy. I've learned that this is not only a thing in the EU, it is a thing around the world. I'm the crazy one. And if you're watching this, you're likely the crazy one, too.
I don't like being pessimistic about things, but this has been a conversation that's been going on for two years with Chat Control at the very least. I think it was originally proposed in 2022. You had that ridiculous conversation where she said, "We don't want to read your messages. We just want to sniff it like a dog at the airport" in 2023, yet it's still a proposal that's being taken seriously. And it's about a month and a half to two months away from being enforced. And when you take a look at who's for the proposal and who's against the proposal, you're massively outnumbered. Three member states opposing, 15 member states supporting, nine member states undecided. So even if every single member state that's undecided opposes it, that's still 15 to 12.
So there's a website called fightchatcontrol.eu. It's a very well-organized website. It tells you how to find your representatives, it tells you how to take action, and it informs you on everything. And it has a lot of really, really good citations and sources if you want to find material. What I would suggest you do if you live in the EU is go through this website immediately. And if you want to live in a world where you have privacy, then you should care about this. And if you think that I'm being ridiculous, then show me your phone. No, see, I want to see all of your messages. Right now.
Because what I did in both of those instances was wrong. I responded to my employee here poorly. I also responded to the person on that dating app very poorly. The right thing to say in that instance is, "Give me your phone so I can read your messages." That's the only right way to go about this. Let me look through your phone right now. I want to see your emails. I want to see your messages. I want to see all of your interpersonal communications. Oh, you don't want me seeing them? Hm. Are you doing something wrong? That's the only way to get people to understand. The moment anybody makes you feel weird about this, the moment anybody implies that you are a creep or a weirdo or a conspiracy theorist or a tinfoil hat or whatever because you want to have end-to-end encrypted communications that are not read by something, look them straight in the eye, ask for the password to their computer and phone, and demand that they put them in front of you right now. "Don't be a wise ass, Louis. I'm just telling you that I don't want you to see my conversations, but I'm okay with the government seeing them." All right, cool. Let's go down the street. The sheriff's office is right down the road. How about you hit Ctrl+P and print out the 350 pages of conversations that you've had with your girlfriend or your boyfriend, and why don't you just leave them on the table over there and tell them that they can read through it at any time? Oh, you're opposed to that? Yes. Why? Cuz they don't need to know. Exactly. They don't need to know. That's privacy. That's how you get people to get it. And if you don't get people to get it sooner, then you're never going to have it later, because they're two months away from installing a Minority Report-level Bonzi Buddy next to you everywhere you f--king go and speak. You do something now or you're f--ked. There's no other way I can put it.
Honestly, yeah, I'm probably going to be provocative in the title of this video. I think that the EU is worse than the United States when it comes to privacy. I don't give a f--k about the GDPR. It's one thing for a company to be able to do this s--t, it's another when it's the government doing it. At the very least, if I wanted to live in the f--king woods somewhere and not use Facebook or Instagram or Google or ever join a forum, I could. Even in the EU, if I did all that s--t, if I went into a hut in the woods and I used Signal on my phone, you'd still be reading my s--t.
Until you fix this, the EU is actually worse than the US when it comes to privacy. Fight me. I'll defend that take. What's the point? I've wasted so much of my life clicking decline on cookie notices, when I used to just be able to decline cookies in a separate web browser that I used when I wasn't logging into websites. All this time that was wasted over this, for nothing. You're worse than the US. That's f--king shameful. You should not be behind the US in consumer rights or privacy, ever.
Go to that website and figure out what the f--k you need to do. Figure out who your representatives are and raise some hell. And don't give me this, "Louis, I don't know." No, f--k that s--t. Even in the video I did on turning a profile picture into Clippy, you have no idea how many emails I got going, "Louis, where do I find the profile picture? Where do I find Clippy? Where do you change it?" It's like, how the f--k do you people even get up in the morning? I swear to Christ, sometimes I think that. No, it's just too important. Figure it out. Get together. I'm not in the EU. I don't vote there. I don't live there. I don't work there. I can't help you any more than I am right now. I can bring this issue to your attention, and I can suggest that you bring it to the attention of your neighbors and elected officials. And unelected officials, because apparently they're the ones that make your laws for you.
But do something, because if you don't do something there, this s--t's going to spread over here. If our government sees that you got away with it, then we're f--ked too. Or it could be the opposite, actually, because a lot of our government looks at what the EU is doing and they're like:
EU stifling standards concerning the internet is what killed tech in Europe. How harmful would it be to winning the race for AI if America goes down the road of the EU?[10]
This could actually be a good thing for us because if you do this, maybe our government will do the exact opposite to say, "Look, we're not like the socialist EU." Maybe this could be a good thing. I try to find the positive in things when I feel like I'm hopelessly f--ked.
Clippy didn't sic a virtual dog on you that followed you everywhere you went, logged your personal communications, and reported them to the police under the guise of wanting to protect the children. Clippy just wanted to help. Clippy protected the children, too, while preserving their privacy. That's it for today, and as always, I hope you learned something. I'll see you in the next video. Bye now.
[Outro]
We also have the duty not to infringe the IP rights in the process. It is in fact the manufacturers who have the relevant rights, not consumers.
References
- ↑ 1.0 1.1 Rossmann, Louis (2024-04-30). "No Escape: EVERY US Carrier Sold Your Location Data with 0.02% Penalties from the FCC!". YouTube. Retrieved 2025-08-17.
- ↑ Rossmann, Louis (2023-03-29). "EU commissioner thinks you can sniff encrypted communications like a dog at the airport 🤔". YouTube. Retrieved 2025-08-17.
- ↑ "The European Commission does not understand what is written in its own chat control bill". mullvad.net. 2023-03-28. Retrieved 2025-08-17.
- ↑ "An Garda Síochána unlawfully retains files on innocent people who it has already cleared of producing or sharing of child sex abuse material". iccl.ie. 2022-10-19. Retrieved 2025-08-17.
- ↑ "Chat control: Internal documents show how divided the EU member states are". patrick-breyer.de. 2022-09-15. Retrieved 2025-08-17.
- ↑ "Comment on 'EU "Chat Control" proposal would scan ALL your private messages and photos - only 3 member states oppose this mass surveillance'". reddit.com. 2025-08-11. Retrieved 2025-08-17.
- ↑ "Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse — Presidency compromise texts". data.consilium.europa.eu. 2023-09-08. Retrieved 2025-08-17.
- ↑ "Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse". eur-lex.europa.eu. 2022-05-11. Retrieved 2025-08-17.
- ↑ "Metalhead (Black Mirror)". wikipedia.org. Retrieved 2025-08-17.
- ↑ "'Who's Winning—America Or China?': Ted Cruz Questions Sam Altman About AI Race". YouTube. 2025-05-11. Retrieved 2025-08-17.