Hey everyone
Welcome to another Nexus post.
Over 200 people read each of our posts, but only a small fraction are actually subscribed.
If you’ve been finding value in what we’re building here at Nexus, hitting that subscribe button goes a long way.
It tells us you’re not just lurking - you’re listening. And it helps us keep making real, grounded explainers about how tech actually shows up in our lives.
If you want more tech content that’s made for Pakistan (not just copy-pasted from the Valley), share this with a friend. Or hit reply - we read every one.
Alright. Let’s get into it.
They Say It’s for Safety. But It Could Break the Internet.
Governments in the UK and EU are pressuring tech companies to open a secret door into your private messages, all in the name of “protecting the public.”
It sounds noble. But behind the buzzwords is a global push to weaken the very thing that makes digital life secure: encryption.
This isn’t just some niche tech debate. If regulators succeed, everything from your banking details to your private conversations could be exposed - not just to them, but to anyone who finds that backdoor.
In this piece, we’ll break it all down:
What encryption actually is.
Why backdoors defeat the whole point of end-to-end encryption.
And why even Apple can’t help - and frankly, shouldn’t.
Let’s get into it.
How Encryption Works And Why Backdoors Break It
Before we talk about why backdoors are such a dangerous idea, we’ve got to get into some nerdy basics. Don’t worry, I’ll keep it simple.
Because here’s the thing:
If you don’t understand how encryption actually works, you won’t understand why breaking it, even just a little, ruins the whole system.
So let’s break down what encryption does, how it protects you, and why a backdoor isn’t just a harmless key for the good guys - it’s a ticking security time bomb.
What Encryption Really Does
Encryption scrambles your data, turning plain text into unreadable gibberish using complex math. It’s like speaking in a secret language only you and the intended recipient understand.
Whether it’s a message, a photo, or a file, once encrypted, no one else can make sense of it - not your ISP, not the app provider, not even a hacker who intercepts it.
An example:
You can also encrypt your entire hard drive. In that case, the system locks everything until you enter a password - that password unlocks the decryption key, turning the gibberish back into usable data.
At its core, encryption protects access to sensitive information. No password? No access. That’s what makes it so powerful.
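If you like seeing things in code, here’s a minimal sketch of that idea in Python using the cryptography library’s Fernet recipe (a ready-made symmetric cipher). The message and variable names are made up for illustration - the point is that the key is everything.

```python
from cryptography.fernet import Fernet

# Generate a random secret key - whoever holds this can decrypt.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt: readable text in, unreadable gibberish out.
token = cipher.encrypt(b"my bank PIN is 4242")
print(token)  # something like b'gAAAAABo...' - meaningless on its own

# Decrypt: only works with the same key.
print(cipher.decrypt(token))  # b'my bank PIN is 4242'

# Without the key, your ISP, the app provider, or an interceptor
# sees only the token - there is no way back to the original text.
```

Full-disk encryption works on the same principle, except the key is derived from (and protected by) your password.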
What Is End-to-End Encryption (E2E)?
End-to-End Encryption (E2E) means only two people can read a message: the sender and the receiver. That’s it. Not your internet provider. Not the app company. Not even the cloud server handling it.
Apps like WhatsApp, Signal, and iMessage use E2E to keep your messages locked down from everyone else.
Here’s how it works:
E2E uses two keys - a public key (which anyone can see) and a private key (which only you hold). The sender encrypts the message using the receiver’s public key, and only the matching private key can decrypt it. Like a locked mailbox that anyone can drop letters into, but only the owner can open.
This is what separates E2E from basic encryption. With regular encryption, messages might be scrambled in transit to protect them, but the server (or the company running it) can still read them. With E2E, even the server is blind. That’s why it’s so strong.
And trust me, this is just scratching the surface. If you’re curious about the magic behind those keys, check out Diffie-Hellman key exchange. That’s where the real crypto sorcery begins.
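Here’s a rough sketch of that locked-mailbox idea, again in Python with the cryptography library - it uses RSA with OAEP padding just to keep the example short. Real messaging apps don’t encrypt every message with RSA like this; they use Diffie-Hellman-style key agreement (the sorcery mentioned above) to derive shared symmetric keys. But the principle is the same: public key to lock, private key to unlock.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The receiver generates a key pair. The public key can be handed to
# anyone; the private key never leaves their device.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# The sender encrypts with the receiver's PUBLIC key...
ciphertext = public_key.encrypt(b"Meet me at 7", oaep)

# ...and only the matching PRIVATE key can turn it back into text.
print(private_key.decrypt(ciphertext, oaep))  # b'Meet me at 7'
```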
What Governments Want: A Backdoor
The problem?
Regulators and law enforcement say they need access to encrypted content - for crime prevention, child safety, counterterrorism, and so on.
What they’re really asking for is a backdoor, a secret key that would let them unlock encrypted messages when “needed.”
But here’s the hard truth:
There’s no such thing as a backdoor for just the good guys.
If a key exists, anyone can potentially get it - hackers, foreign governments, blackmailers. Cryptography doesn’t understand “just this once” or “only for the right reasons.”
One Hole Is All It Takes
Imagine installing a secret door to your home, meant only for firefighters in case of an emergency. You leave it unlocked… just in case.
How long before thieves find it?
That’s what encryption backdoors are: a vulnerability waiting to be exploited. Once that hole exists, the system is no longer secure. Period.
And all this? It isn’t theoretical - it’s already happened.
We’ve seen, more than once, what happens when trusted systems get compromised:
The NSA’s hacking tools leaked and got weaponized - the EternalBlue exploit went on to fuel the WannaCry ransomware outbreak worldwide.
Stuxnet, built for state cyberwarfare, escaped its intended target and inspired new generations of malware.
My point is:
Once you crack the lock, you can’t choose who walks through.
What Governments Are Demanding and Why That’s Dangerous
Case in point: UK vs Apple
In early 2025, the UK secretly issued a Technical Capability Notice under its Investigatory Powers Act, ordering Apple to build a backdoor into its iCloud Advanced Data Protection (ADP) system.
The goal?
Give law enforcement complete access to encrypted backups - not just in the UK, but globally.
Apple responded by pulling ADP for UK users and taking the government to court; meanwhile, security experts warned that a true backdoor would require weakening encryption across the board, not just for British users.
The move sparked backlash in the US from both Democrats and Republicans, who warned it would set a dangerous precedent: allow it once, and it becomes far easier for future governments to demand the same or worse - broken encryption, eroded privacy.
Even WhatsApp and other tech giants joined Apple in court, arguing that undermining encryption can make everyone in the country less secure.
EU: Client-Side Scanning
The European Commission is pushing for “client-side scanning” under its CSAM regulation and upcoming Encryption Roadmap. In practice, platforms would have to scan your messages on your device before they’re ever encrypted.
Security experts warn that this violates the core promise of E2E, introduces serious security vulnerabilities, and sets the stage for mass surveillance.
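To see what they mean, here’s a purely illustrative Python sketch of what client-side scanning boils down to: your device checks each message against a list you never get to see, before encryption ever happens. The blocklist, the reporting hook, and the exact-hash matching are all hypothetical simplifications - real proposals lean on perceptual hashing and ML classifiers, which cast an even wider net.

```python
import hashlib

# Hypothetical blocklist pushed to your device. You never see what's
# on it, or who gets to add entries to it.
BLOCKED_HASHES = {
    # SHA-256 of an empty message, just so the demo below triggers.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def report_to_authority(content: bytes) -> None:
    # Hypothetical reporting hook - a real deployment would forward
    # the flagged content (or a match report) off your device.
    print("flagged before encryption:", content)

def send_message(plaintext: bytes, encrypt_and_send) -> None:
    # 1. Scan on-device, BEFORE encryption, while the text is readable.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKED_HASHES:
        report_to_authority(plaintext)
        return
    # 2. Only then is the message end-to-end encrypted and sent.
    encrypt_and_send(plaintext)

send_message(b"", lambda m: print("encrypted and sent"))       # flagged
send_message(b"hello", lambda m: print("encrypted and sent"))  # sent
```

Notice that the “end-to-end” guarantee is untouched on paper - and yet the content was inspected in the clear before it ever got encrypted. That’s the experts’ objection in a nutshell.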
India & Others: Traceability & Key Disclosure
India’s government wants messaging apps to reveal who originally sent a message, basically a way to track users, even in encrypted chats.
They’ve also suggested laws such as mandatory key disclosure rules that could force people or companies to give up their encryption keys, which would break the whole idea of private communication.
All of this is framed as “saving children” or “catching terrorists.” Noble on the surface. Dangerous underneath.
All this? It’s not just regulation - it’s rewriting the rules of trusted encryption.
And once that path is taken…
There’s no turning back.
Locked by Design: Why Tech Giants Can’t Help Governments
Apple vs FBI (San Bernardino, 2016)
After a terrorist attack in San Bernardino, California, the FBI recovered the shooter’s iPhone. They asked Apple to create a custom version of iOS that would let them bypass the phone’s security protections.
Apple said no. Not because they supported criminals, but because creating that tool would break the security of every iPhone.
Tim Cook called it dangerous and unprecedented, a move that would set the stage for mass surveillance.
The FBI eventually backed down after finding another way into the phone, but the case sparked a global debate over encryption and government access.
WhatsApp vs India
We touched on this earlier: India wanted traceability, which would mean breaking WhatsApp’s end-to-end encryption or building a surveillance layer into every conversation.
WhatsApp refused. Their argument:
"You can’t have privacy for billions of people if you’re forced to watch everything they say."
Private Means Private, Even From the App
These companies weren’t refusing to help just to make a point about privacy - they literally can’t unlock your messages. That’s the design.
End-to-end encryption means no one, not Apple, not Signal, not even the server, can access what’s inside.
That’s what we call real privacy.
And that’s how it should be as well.
What Happens If Backdoors Become Law?
Software Gets Weaker
You can’t build a secure system with a hole in it. That’s not security - that’s sabotage.
If governments ask developers to bake vulnerabilities disguised as “access points” into their systems, then the security of those applications becomes meaningless.
Think about it:
Once you weaken encryption for one party, you weaken it for everyone. Engineers know this. Hackers know this. It isn’t a question of if someone breaks in - it’s a question of when.
Apps Leave Countries
Apps like Signal, Proton, and even WhatsApp have threatened to pull out of regions like the UK or India rather than compromise user privacy.
Why?
Because once they comply with one bad law, the precedent is set. And then every other government will demand the same.
Trust in Tech Crumbles
Once users know their encrypted chats can be accessed, trust in secure communication dies.
People stop sharing. Companies stop innovating.
Paranoia spreads, and this time, it’s justified.
Business Impact
Tech companies rely on encryption to protect IP, contracts, and customer data.
We’re not just talking about WhatsApp.
Banks, hospitals, cloud services - everything that runs on encryption - would be compromised.
This isn’t a messaging app issue.
It’s an economic risk.
And this isn’t even the full list.
Backdoors in one country trigger a global ripple effect, paving the way for authorities in other countries to follow.
And guess what?
These problems are just the tip of the iceberg - the ones I could list off the top of my head.
In reality, there are dozens more technical, legal, and ethical issues - all waiting to explode.
Why This Isn’t Just a Tech Problem, It’s a Human One
This fight isn’t just about stopping crime. It’s about protecting everyone else.
End-to-end encryption doesn’t just guard drug lords and hackers.
It also protects:
Journalists exposing corruption
Protesters speaking out
Lawyers fighting cases
You, living a private life that doesn’t need broadcasting
Pakistan’s Perspective
In Pakistan, vague laws like PECA (the Prevention of Electronic Crimes Act) already give authorities wide powers.
Surveillance is growing, from social media monitoring to telecom tracking.
They track what you say, what you share, even what you delete.
You don’t need to be guilty to get caught in the trap; just being visible is enough.
And once encryption breaks?
It’s not just criminals who lose privacy.
It’s everyone.
Do you see the matrix, Neo?
They’re laying a blueprint to control everyone while operating from the shadows, masking their true agendas behind labels like “child safety.”
Strong Encryption = Strong Democracy
If you’ve read this far, then you already know:
Backdoors don’t make us safer. They make us exposed.
And the truth is - privacy and security aren’t enemies, they’re allies.
You need privacy to be safe.
Governments shouldn’t demand master keys; they should protect locks.
And Big Tech? They're not shielding criminals. They're shielding you.
Most importantly, if you value privacy, now’s the time to care.
Ask your platforms what they’re doing to protect your data.
Other than that, I’ve got a couple of questions for you:
What happens when trust in tech dies?
And what will you do when it’s your door they’re trying to unlock?
Drop your thoughts in the comments - I’d love to hear what you think.
And if you found value in this piece, I’d really appreciate your feedback.
As always - Stay safe. Stay secure.
Until next time…
Further Learning
Free Speech is Only as Strong as the Weakest Link | Electronic Frontier Foundation
WhatsApp tells BBC it backs Apple in legal row with UK over user data
Apple and the long secret arm of the UK Government | Privacy International
Offline: Imprisoned Bloggers and Technologists | Electronic Frontier Foundation
EU chat control law proposes scanning your messages — even encrypted ones | The Verge