If a person buys a Google Android phone or an Apple iPhone, they are offered a default email service, search engine, and calendar app. “The defaults just so happen to be the services that [these companies] themselves provide,” complained Yen in 2021. He was well aware that people favor convenience. “What we know from studies is that 95 percent of people will not change the defaults.”

But 2022 was the year the European Union finally took action. In March, the bloc’s lawmakers agreed on new rules designed to loosen the grip Big Tech has on European consumers and to help homegrown internet companies compete with American giants for customers. The Digital Markets Act will obligate companies that run phone operating systems to offer “choice screens,” giving users more control over which services they use. Technically, the DMA entered into force in November, although it may not take full effect until March 2024.

Proton is headquartered in Geneva, Switzerland, which is not an EU member. But Yen thinks this law will help European companies like Proton finally have a voice in Brussels. Europe’s momentum in rewriting the rules of the internet, however, is not all good for Proton, which has grown to 70 million accounts. The company is warily watching a wave of proposals in the UK and the EU that privacy advocates warn threaten encryption, such as the UK’s Online Safety Bill and the EU’s proposals to combat child sexual abuse material.

Yen spoke in October at WIRED’s business conference, WIRED Smarter. At the event, we talked about how he is thinking about the breakthroughs and concerns emerging from Europe’s increasing focus on technology legislation. This interview has been edited for clarity and length.

WIRED: You’ve been a big advocate for Europe’s Digital Markets Act. Now that the new rules have passed, are you concerned about enforcement?

Andy Yen: I had a call earlier this year with Margrethe Vestager, who is the head of competition at the European Commission.
And I can tell you, the political will to enforce this is there. You can see the fire inside her. She wanted to get it done.

But is political will the same as having the resources to force big tech companies to comply?

That’s exactly the problem. The combined market cap of these big tech companies a couple of months ago was $7 trillion, which is bigger than most European countries’ GDP. They [Big Tech] are throwing literally hundreds of millions of euros at this problem. And as much as Ms. Vestager is committed to fighting this, she is facing an uphill battle against the enormous resources of entrenched powers. So it will be a tough fight. But what is making me very optimistic is that, for the first time, I’m seeing the commission reach out to small companies like Proton to really understand what the issue is and get to the heart of it.

It’s a shift. Instead of just listening to whatever Big Tech’s consultants and lawyers are spewing out, they’re taking time to talk to small companies, and, for the first time, maybe ever, I feel like we have a voice in Brussels.

When did that shift happen? After the DMA was passed?

Just within the past year. I think it really shows a shift in the mindset in Brussels that has, so far, not happened in the US. In the US, the antitrust fight is much tougher.

What about other European regulation? I know there’s a lot of concern about the legislation drafted by EU Home Affairs commissioner Ylva Johansson, which proposes forcing encrypted platforms to carry out automated searches for child sexual abuse material. Is that something you think could affect you?

Of course, it could potentially impact us. There’s also the Online Safety Bill here in the UK. It seems like it’s coming back from the dead. But if these things go through, there’s the risk that encryption will be demonized at a time when you’re having breakthroughs in these other areas.
The problem with these pieces of legislation is that they are written too broadly; they are trying to cover too many unrelated issues. I’ll give you an example from the UK’s online safety debate. Part of its focus is content moderation on social media. But there’s a difference between messaging on social media and private messaging. The two things should be decoupled.

So, no one is saying that there are no problems and that we shouldn’t try to fix them. But I think we need to define clearly what we’re trying to solve and how the remedy is geared toward the actual problem. Otherwise you end up with legislation that has a lot of unforeseen consequences.

That might be the case with the Online Safety Bill in the UK, which is trying to tackle lots of different things. But the EU’s chat control proposal very much argues that encrypted messaging creates a space where there is a concern child abuse is taking place. How do you approach that debate? Because it is so emotional.

Typically, the purpose of legislation is to step in when markets don’t create the right incentive structures to enforce an outcome that will be good for society, right? And if you look at, let’s say, the child sexual abuse control debate, is there any company in the world that is incentivized not to tackle this problem? I would say no. It’s a huge problem from a PR standpoint, from a business standpoint. So Big Tech and small tech companies like Proton are already putting all the resources we can into combating this issue. Given that is already the case, legislation perhaps isn’t necessary, because the incentives to tackle the problem are already there.

You said companies like Proton are finally being listened to when it comes to competition policy. But how does the EU’s approach to CSAM compare? Do you feel like you’re being listened to on this issue?

It is a debate. The issue that I see here is that politicians feel pressure to confront the issue.
They’re getting pressure, also from law enforcement, to tackle the issue. But I think law enforcement is using this as a Trojan horse; they really want to [break encryption] for other purposes. At the same time, when I talk to people in Brussels, they say, “We’re not trying to break encryption, we know encryption is very important.” And it’s the typical issue where they need to show that they’re doing something, they want to do something, but at the same time there is no easy, obvious solution to the problem. So they’re kind of stuck.

How worried are you about the EU CSAM proposal? There’s a lot of opposition to the idea. Do you think that it’s going to pass, or is it just too unpopular?

Actually, I’m quite concerned, because in the past this has come up, but it was wrapped around terrorism. This time they’ve bundled it around child abuse, which is a very toxic topic. Because of how the public debate works, some of the people who would otherwise stand against this will not be able to have a rational debate. It’s much harder to get into the details, because lots of people don’t even want to debate this issue; it’s very upsetting. You want to have a nuanced discussion about it, and then the response is “think of the children.” It’s difficult to have a proper discussion about it. I think it would be bad for democracy if we don’t have that debate. But it is very clever packaging, for sure.