UK forces Apple to provide encrypted data to security agencies—is America next?

FBI Warns iPhone, Android Users—We Want ‘Lawful Access’ To All Your Encrypted Data
By Zak Doffman, Contributor. Feb 24, 2025
The furor after Apple removed full iCloud security for U.K. users may feel a long way from American users this weekend. But it’s not — far from it. What has just shocked the U.K. is exactly what the FBI told me it also wants in the U.S. “Lawful access” to any encrypted user data. The bureau’s quiet warning was confirmed just a few weeks ago.
The U.K. news cannot be seen in isolation and follows years of battling between big tech and governments over warranted, legal access to encrypted messages and content to fuel investigations into serious crimes such as terrorism and child abuse.
As I reported in 2020, “it is looking ever more likely that proponents of end-to-end security, the likes of Facebook and Apple, will lose their campaign to maintain user security as a priority.” It has taken five years, but here we now are.
The last few weeks may have seemed to signal a unique fork in the road between the U.S. and its primary Five Eyes ally, the U.K. But it isn’t. In December, the FBI and CISA warned Americans to stop sending texts and use encrypted platforms instead. And now the U.K. has forced open iCloud by threatening to mandate a backdoor. But the devil’s in the detail — and we’re fast approaching a dangerous pivot.
While CISA — America’s cyber defense agency — appears to advocate for fully secure messaging platforms, such as Signal, the FBI’s view appears to be different. When December’s encryption warnings hit in the wake of Salt Typhoon, the bureau told me while it wants to see encrypted messaging, it wants that encryption to be “responsible.”
What that means in practice, the FBI said, is that while “law enforcement supports strong, responsibly managed encryption, this encryption should be designed to protect people’s privacy and also managed so U.S. tech companies can provide readable content in response to a lawful court order.” That’s what has just happened in the U.K. Apple’s iCloud remains encrypted, but Apple holds the keys and can facilitate “readable content in response to a lawful court order.”
There are three primary providers of end-to-end encrypted messaging in the U.S. and U.K.: Apple, Google and Meta. The U.K. has just pushed Apple to compromise iMessage. And it is more than likely that “secret” discussions are also ongoing with the other two. It makes no sense to single out Apple, as that would simply push bad actors to other platforms — which will happen anyway, as is obvious to any security professional.
In doing this, the U.K. has changed the art of the possible, bringing new optionality to security agencies across the world. And it has done this against the backdrop of that U.S. push for responsible encryption and Europe’s push for “chat control.” The U.K. has suddenly given America’s security agencies a precedent to do the same.
“The FBI and our partners often can’t obtain digital evidence, which makes it even harder for us to stop the bad guys,” warned former director Christopher Wray, in comments the bureau directed me towards. “The reality is we have an entirely unfettered space that’s completely beyond fully lawful access — a place where child predators, terrorists, and spies can conceal their communications and operate with impunity — and we’ve got to find a way to deal with that problem.”
The U.K. has just found that way. It was first, but unless a public backlash sees Apple’s move reversed, it will not be the last. In December, the FBI’s “responsible encryption” caveat was lost in the noise of Salt Typhoon, but it shouldn’t be lost now. The tech world can act shocked and dispirited at the U.K. news, but it has been coming for years. While the legalities are different in the U.S., the targeted outcome would be the same.
Ironically, because the U.S. and U.K. share intelligence information, some American lawmakers have petitioned the Trump administration to threaten the U.K. with sanctions unless it backtracks on the Apple encryption mandate. But that’s a political view not a security view. It’s more likely this will go the other way now. As EFF has warned, the U.K. news is an “emergency warning for us all,” and that’s exactly right.
“The public should not have to choose between safe data and safe communities, we should be able to have both — and we can have both,” Wray said. “Collecting the stuff — the evidence — is getting harder, because so much of that evidence now lives in the digital realm. Terrorists, hackers, child predators, and more are taking advantage of end-to-end encryption to conceal their communications and illegal activities from us.”
The FBI’s formal position is that it is “a strong advocate for the wide and consistent use of responsibly managed encryption — encryption that providers can decrypt and provide to law enforcement when served with a legal order.”
The challenge is that while the bureau says it “does not want encryption to be weakened or compromised so that it can be defeated by malicious actors,” it does want “providers who manage encrypted data to be able to decrypt that data and provide it to law enforcement only in response to U.S. legal process.”
That’s exactly the argument the U.K. has just run.
Somewhat cynically, the media backlash that Apple’s move has triggered is likely to have an impact, and right now it seems more likely we will see some sort of reversal of Apple’s move, rather than more of the same. The UK government is now exposed as the only western democracy compromising the security of tens of millions of its citizens.
Per The Daily Telegraph, “the [UK] Home Office has increasingly found itself at odds with Apple, which has made privacy and security major parts of its marketing. In 2023, the company suggested that it would prefer to shut down services such as iMessage and FaceTime in Britain than weaken their protections. It later accused the Government of seeking powers to ‘secretly veto’ security features.”
But now this quiet battle is front page news around the world. The UK either needs to dig in and ignore the negative response to Apple’s forced move, or enable a compromise in the background that recognizes the interests of the many.
As The Telegraph points out, the U.S. will likely be the deciding factor in what happens next. “The Trump administration is yet to comment. But [Tim] Cook, who met the president on Thursday, will be urging him to intervene,” and perhaps more interestingly, “Elon Musk, a close adviser to Trump, criticised the UK on Friday, claiming in a post on X that the same thing would have happened in America if last November’s presidential election had ended differently.”
Former UK cybersecurity chief Ciaran Martin thinks the same. “If there’s no momentum in the U.S. political elite and US society to take on big tech over encryption, which there isn’t right now, it seems highly unlikely in the current climate that they’re going to stand for another country, however friendly, doing it.”
Meanwhile the security industry continues to rally en masse against the change.
“Apple’s decision,” an ExpressVPN spokesperson told me, “is deeply concerning. By removing end-to-end encryption from iCloud, Apple is stripping away its UK customers’ privacy protections. This will have serious consequences for Brits — making their personal data more vulnerable to cyberattacks, data breaches, and identity theft.”
It seems inconceivable the UK will force all encrypted platforms to remove that security wrap, absent which the current move becomes pointless. The reality is that the end-to-end encryption ship has sailed. It has become ubiquitous. New measures need to be found that rely on metadata — already provided — instead of content.
Given the FBI’s stated position, what the Trump administration does in response to the UK is critical. Conceivably, the U.S. could use this as an opportunity to revisit its own encryption debate. That was certainly on the cards under a Trump administration pre Salt Typhoon. But the furor triggered by Apple now makes that unlikely. However the original secret/not secret news leaked, it has changed the dynamic completely.
British soldiers told to stop using WhatsApp and use Signal instead for security
George Grylls, Political Reporter Monday March 21 2022, 5.00pm GMT, The Times
British soldiers have been told to stop using WhatsApp over fears that Russia is intercepting their messages. Photo: BENOIT TESSIER/REUTERS
British soldiers are being encouraged to use the Signal messaging app instead of WhatsApp, amid reports that Russian forces used insecure UK numbers to direct airstrikes in Ukraine.
Signal has a higher level of encryption than WhatsApp.
Military sources said that secure channels should be used to discuss sensitive matters but denied that the advice had been issued in response to security breaches resulting from the use of British phones in Ukraine.
Swedish Army Requires Signal for calls & messages
by Lars Wilderang, 2025-02-11
Translated from the Swedish original
In a new instruction for fully encrypted applications, the Swedish Armed Forces have introduced a mandatory requirement that the Signal app be used for messages and calls with counterparts both within and outside the Armed Forces, provided they also use Signal.
The instruction, FM2025-61:1, specifies that Signal should be used to defend against interception of calls and messages via the telephone network, and to make phone-number spoofing more difficult.
It states, among other things:
“The intelligence threat to the Armed Forces is high, and interception of phone calls and messages is a known tactic used by hostile actors. […] Use a fully encrypted application for all calls and messages to counterparts both within and outside the Armed Forces who are capable of using such an application. Designated application: The Armed Forces use Signal as the fully encrypted application.”
The choice of Signal is also justified:
“The main reason for selecting Signal is that the application has widespread use among government agencies, industry, partners, allies, and other societal actors. Contributing factors include that Signal has undergone several independent external security reviews, with significant findings addressed. The security of Signal is therefore assumed to be sufficient to complicate the interception of calls and messages.
Signal is free and open-source software, which means no investments or licensing costs for the Armed Forces.”
Signal supports both audio and video calls, group chats, direct messages, and group calls, as well as a simple, event-based social media feature.
The app is available for iPhone, iPad, and Android, as well as desktop operating systems such as macOS, Windows, and Linux.
Since Signal can be used for phone calls, the instruction is essentially an order for the Armed Forces to stop using regular telephony and instead make calls via the Signal app whenever possible (e.g., not to various companies and agencies that don’t have Signal), and no SMS or other inferior messaging services should be used.
Note that classified security-protected information should not be sent via Signal; this is about regular communication, including confidential data that is not classified as security-sensitive, as stated in the instruction. The same applies to files.
The instruction is a public document and not classified.
Signal is already used by many government agencies, including the Government Offices of Sweden and the Ministry for Foreign Affairs. However, the EU, through the so-called Chat Control (2.0), aims to ban the app, and the Swedish government is also mulling a potential ban, even though the Armed Forces now consider Signal a requirement for all phone calls and direct messaging where possible.
Furthermore, it should be noted that all individuals, including family and relationships, should already use Signal for all phone-to-phone communication to ensure private, secure, verified, and authentic communication. For example, spoofing a phone number is trivial, particularly for a foreign power with a state-run telecom operator, which can, with just a few clicks, reroute all mobile calls to your phone through a foreign country’s network or even to a phone under the control of a foreign intelligence service. There is zero security in how a phone call is routed or identified via caller ID. For instance, if a foreign power knows the phone number of the Swedish Chief of Defence’s mobile, all calls to that number could be rerouted through a Russian telecom operator. This cannot happen via Signal, which cannot be intercepted in this way.
Signal is, by the way, blocked in a number of countries with questionable views on democracy, such as Qatar (Doha), which can be discovered when trying to change flights there. This might serve as a wake-up call.
https://cornucopia.se/2025/02/forsvarsmakten-infor-krav-pa-signal-for-samtal-och-meddelanden/
Recent news: if VPNs are targeted, cloud accounts could be compromised too. “Massive brute force attack uses 2.8 million IPs to target VPN devices”: https://www.bleepingcomputer.com/news/security/massive-brute-force-attack-uses-28-million-ips-to-target-vpn-devices/
It’s not directly about the state actors themselves, but about the consequences: common hacks into state actors’ resources leave the data open to misuse, and the state actors take no responsibility if they are hacked, right?
When an AI system is given access to such data, it can uncover hidden patterns or vulnerabilities that humans might miss. This ability can lead to serious consequences, such as exposing sensitive information or breaking security measures, even if the data is encrypted or anonymized. AI might also exploit weaknesses in the data, resulting in data breaches, privacy violations, or malicious manipulation. AI could leak personal details or confidential information, leading to significant risks like reputational damage or financial loss. The ability of AI to operate beyond traditional oversight makes these risks harder to predict and control.
Thank you for the link. Do you think I should ask the same question there as well, or just read the posts there to gain more knowledge on the risks?
My concern: if an AI system is granted access to such data, it can detect patterns or vulnerabilities that humans might overlook, leading to data breaches or exploitation.
Thank you for the strategy, I appreciate it very much. All the best!
Thank you for that. I'm afraid I did mention the word "encrypted" in my post:
(ie. if I choose to store them online/cloud encrypted, I face significant privacy concerns. While they might be secure now, there’s always the potential for a very near future breaches or compromises, especially with the evolving risks associated with AI training and data misuse),
but I hadn't detailed or highlighted it clearly enough.
Yes "encrypt them first then upload them" is the situation I meant
P.S. I have edited the post now.
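To make the "encrypt them first, then upload them" workflow concrete, here is a minimal, self-contained Python sketch of the idea: derive a key from a passphrase, encrypt locally, and upload only the ciphertext, so the cloud provider never holds the key. This is a toy illustration only — the hand-rolled SHA-256 keystream is for demonstration, not real protection; for actual family photos, assume a vetted tool such as GnuPG, Cryptomator, or VeraCrypt (as discussed elsewhere in this thread) would be used instead.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + a block counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(plaintext: bytes, passphrase: str) -> bytes:
    salt = os.urandom(16)
    nonce = os.urandom(16)
    # scrypt is deliberately slow and memory-hard, which blunts brute-force attacks.
    key = hashlib.scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return salt + nonce + ct  # prepend the parameters needed for decryption

def decrypt(blob: bytes, passphrase: str) -> bytes:
    salt, nonce, ct = blob[:16], blob[16:32], blob[32:]
    key = hashlib.scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
```

The key design point is that encryption happens entirely on your own machine; what leaves for the cloud is opaque bytes, and losing the passphrase means losing the data, so the passphrase itself needs an offline backup.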
Dilemma: Online vs. Offline Privacy & Security of Personal Family Photos/Videos – Balancing Risks & Protection
Dear Friends,
I just wanted to take a moment to sincerely thank everyone for your incredibly thoughtful and detailed responses about the films in general. However, I find myself in a difficult situation when it comes to safeguarding the PERSONAL FAMILY PHOTOS and VIDEOS.
- On one hand, if I choose to store them online/cloud encrypted (edit: encrypt first, then upload), I face significant privacy concerns. While they might be secure now, there’s always the potential for near-future breaches or compromises, especially with the evolving risks associated with AI training and data misuse.
The idea of the personal moments being used in ways I can’t control or predict is deeply unsettling.
- On the other hand, keeping these files offline doesn’t feel like a perfect solution either. There are still considerable risks of losing them due to physical damage, especially since I live in an area prone to earthquakes. The possibility of losing IRREPLACEABLE MEMORIES due to natural disasters or other unforeseen events is always a WORRY.
How can I effectively balance these privacy, security, and physical risks to ensure the long-term safety and integrity of the FAMILY’S PERSONAL MEMORIES?
Are there strategies or solutions that can protect them both digitally and physically, while minimizing these threats?
I have tried to post my thank-you message, but it has been "hanging" after I click the reply button for about three days now.
I sincerely thank you for your incredibly thoughtful and detailed response. Not only did you take the time to explain everything so clearly, but the way you included your personal experience really made a difference.
It’s rare to come across someone who is willing to share such in-depth insight, and I truly appreciate how much effort you put into helping me understand things from a practical standpoint. Your advice has been extremely helpful,
Thank you again for being so generous with your time and knowledge!
I agree that I have to rely on encryption for the films I keep online/in the cloud, and that seems acceptable.
However, when it comes to personal family photos and videos, I’m facing a dilemma.
- If they’re stored online, they’re vulnerable to potential compromises (PRIVACY CONCERN), maybe not now, but in the future, especially with the risks posed by AI training.
- On the other hand, if they’re kept offline, I’m still at risk of losing them due to physical factors, especially since I live in an earthquake-prone area.
Thank you everyone so much for your responses. You’ve truly opened my eyes to so many aspects I hadn’t even considered before.
Your insights were not only thoughtful but also incredibly helpful. It’s rare to come across such comprehensive answers that cover so many angles, and I really appreciate the time and effort you took to share them.
Each of you has given me a lot to think about, and I’m grateful for the depth of understanding you provided. Thanks again!
As a first step, I'd like to pick one of the programs to start with:
Cryptomator
gocryptfs (not so Windows-friendly)
GnuPG
VeraCrypt (slower than TrueCrypt, and since it’s offered as a replacement, it makes me suspicious, especially because TrueCrypt mysteriously vanished without any explanation. Some people believe VeraCrypt might have backdoors, whereas the abandoned TrueCrypt perhaps did not.)
TrueCrypt (I have used it occasionally on my Windows PC, although it is no longer updated)
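Whichever of the programs above is chosen, one habit applies to all of them: record a checksum of the encrypted file before upload, then re-verify it after download to detect silent corruption or tampering while it sat in the cloud. A minimal Python sketch of that verification step (the function name `sha256_file` is my own, not from any of the tools listed):

```python
import hashlib

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    # Stream the file in 1 MiB chunks so large video masters
    # don't need to fit in memory at once.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Usage: store the digest separately (e.g. in a local notes file) at upload
# time; after re-downloading, compare sha256_file(downloaded) to the stored
# value. Any mismatch means the ciphertext changed in transit or at rest.
```

A hash catches corruption but not a substitution by someone who also controls the stored digest, which is why the digest should live somewhere other than the same cloud account.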
I appreciate your suggestion very much.
I wonder what the difference between gocryptfs and others like TrueCrypt would be. I need to search and compare the pros and cons of each, particularly in terms of security, ease of use, and performance.
How do you ensure privacy and security on cloud platforms in an age of compromised encryption, backdoors, and AI-driven hacking threats to encryption and user confidentiality?
Let’s say you’ve created a film and need to securely upload the master copy to the cloud. You want to encrypt it before uploading to prevent unauthorized access. What program would you use to achieve this?
Now, let’s consider the worst-case scenario: the encryption software itself could have a backdoor, or perhaps you're worried about AI-driven hacking techniques targeting your encryption.
Additionally, imagine your film is being used to train AI databases or is exposed to potential brute-force attacks while stored in the cloud.
What steps would you take to ensure your content is protected against a wide range of threats and prevent it from being accessed, leaked, or released without your consent?
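On the brute-force part of that threat model, the practical defense is passphrase entropy: the cost of guessing grows exponentially with the number of random words or characters, regardless of how fast the attacker's hardware is. A back-of-the-envelope sketch, assuming a diceware-style wordlist of 7,776 words and a (generously fast) hypothetical attacker testing 10^12 guesses per second:

```python
import math

def passphrase_entropy_bits(words: int, wordlist_size: int = 7776) -> float:
    # Each word is drawn uniformly and independently from the wordlist,
    # so entropy adds: words * log2(wordlist_size).
    return words * math.log2(wordlist_size)

def years_to_brute_force(bits: float, guesses_per_second: float = 1e12) -> float:
    # Expected time to find the passphrase: on average half the
    # keyspace must be searched before hitting the right guess.
    return (2 ** bits / 2) / guesses_per_second / (3600 * 24 * 365)

six_words = passphrase_entropy_bits(6)   # about 77.5 bits
```

Under these assumptions a six-word diceware passphrase holds out for thousands of years even at a trillion guesses per second, and a slow key-derivation function (like the scrypt parameters most of the tools above use) multiplies that further — which is why the passphrase, not the cipher, is usually the weakest link worth strengthening.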