In theory it adds a speed bump. Apple, as the cloud service provider, can respond to a legal order by saying they don't have the key. The police can then ask for a booby-trapped update pushed to just your phone, which may or may not happen. Or they can lobby the legislature for an encryption backdoor in all devices, which would force them to show their hand in terms of "lawful intercept" capability.
If you want maximum security, use an air-gapped computer. But that won't let you send messages on the go.
> If you want maximum security, use an air-gapped computer. But that won't let you send messages on the go.
You can, with some inconvenience, use optical diodes: a trusted input device transmits data over one diode to an untrusted network device, which carries it over Tor; received messages are pushed over a second diode to a display device that decrypts them. Even if you receive an exploit or malware, there is no physical connection that would allow unencrypted data to be exfiltrated.
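A minimal sketch of the sending side in C, assuming a serial link as the diode medium (the device path and the framing here are illustrative assumptions; the actual one-way guarantee comes from the hardware, which physically lacks a return wire):

```c
/* Trusted side of a hypothetical serial data diode: the TX pin is wired
 * to the untrusted machine's RX pin and there is no return path, so even
 * a fully compromised network box gets no channel back into this one. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    /* payload is already encrypted; plaintext never leaves this machine */
    const char frame[] = "ciphertext-frame\n";

    /* open write-only: this process never reads from the link at all */
    int fd = open("/dev/ttyUSB0", O_WRONLY | O_NOCTTY); /* assumed path */
    if (fd < 0) { perror("open"); return 1; }

    if (write(fd, frame, strlen(frame)) != (ssize_t)strlen(frame))
        perror("write");
    close(fd);
    return 0;
}
```

The receiving box forwards the ciphertext over Tor; the second diode repeats the trick in the other direction toward the display device.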
If you want maximum security, then obviously don't use Apple services, or any other provider that has the capability to fetch your data under any circumstances.
Starting in May next year, the Digital Markets Act [1] requires Apple to "allow the installation of third-party software applications [...] by means other than the relevant core platform services of that gatekeeper."
I'm still on the fence about whether this will end up being a net good, but people don't seem to consider the potential knock-on effects. Apple puts some nice pro-consumer requirements, along with some less nice anti-developer ones, on apps in the App Store: easy subscription management, privacy disclosures, parental controls, etc. If the developers of an app decide to make it available only outside the App Store, you as a consumer may be forced to choose between using that app and getting those benefits.
> If the developers of an app decide to make it available only outside the App Store, you as a consumer may be forced to choose between using that app and getting those benefits.
And Apple already chooses the reverse for you, by not allowing apps you may want and by charging a 30% tax on the rest. There is a vast disparity between the two behaviors!
Downloading apps won't help on an iPhone, which, I must say, isn't even yours: you don't get to decide which apps you can install on your phone; Apple does. Factually speaking, you're merely renting the iPhone from Apple, which, as the actual device owner, decides the terms under which you can use it.
In practice this distinction is meaningless. In fact I trust Apple more than my own government. To take your argument to an absurd logical conclusion, I don’t own ANYTHING because my government can take it.
It is known that Apple will do quite a lot of what governments ask of it. It removes apps from national App Stores on a simple request from countries like China or Russia. (Well, now Apple might ignore Russian takedown requests, but prior to the war with Ukraine they were very receptive to their demands.)
This is why side-loading and the option for alternative app stores is so crucial. If Apple bans Signal or other E2EE messenger apps from your national app store, you can't get them. Full stop.
If people in China and other privacy-hostile countries can side-load from alternative app stores (like F-Droid for Android), the government and Apple no longer control user access to particular undesirable apps.
There are obviously reverse concerns on this side of the coin, but the overall concept has arguably always existed with jailbreaking (the Cydia store, AltStore(?)), and I haven't heard any stories about people becoming massively compromised in the way all the naysayers and Apple would have us believe.
Yes, I have heard of the GDPR and in my opinion it has improved/consolidated my digital privacy rights and not affected the "web browsing experience" in any negative way. I believe you are referring to the ePrivacy Directive (aka cookie law). As you may know, it's only mandatory to inform the user when the website is collecting information from the user beyond what is necessary for technical purposes - and in that case I do want the option to refuse that.
They don't have to lobby anyone for this. Apple has operations in Australia. We have laws here under which the government can force you to put a backdoor in software or hardware, and you are not allowed to tell even your employer that you have been asked to do so.
Tbh, in theory Apple isn't allowed to tell you whether they have done it or not. So their phones have probably been backdoored for a few years now at the request of the Australian government.
I would not be surprised if there is a backdoor already, either explicitly ordered or secretly inserted like Dual_EC_DRBG. They're not burning a zero-day vulnerability or a certificate authority just to convict one defendant. They're saving those for something like Stuxnet.
Nothing is secure. Once we remember that, we'll stop nitpicking improvements.
Use your own server? Great, it's secure software-wise, but if someone breaks into your house, it's all of a sudden the worst liability ever. The next thing you know, your entire identity, your photos, everything is stolen. You have excellent technical security but perhaps the weakest physical security.
So, new plan: you use a self-hosted Nextcloud instance on a VPS somewhere. That's actually not much smarter than using iCloud - VPS providers handle data warrants all the time. They also move your data around as they upgrade hardware, relocate servers, and so forth.
So, new plan: you use iCloud E2E encryption. You have to trust that Apple does as they say, and that their implementation actually works as advertised. Maybe you don't want to do that, so new plan:
You use a phone running GrapheneOS, with data stored on a VPS, with your own E2E setup. Great - except you need to trust your software and all the dependencies it relies on. Are you sure GrapheneOS isn't a CIA plant like ArcaneOS was? Are you sure your VPN isn't a plant, like Crypto AG? And even if the VPN is legitimate, how do you know the NSA doesn't have wiretaps on the data going in and out, letting it greatly narrow the pool of suspects? Are you sure that, even if the GrapheneOS developers are legitimate, the CIA hasn't stolen the signing key long ago? Apple's signing key might be buried in an HSM in Apple Park, requiring a raid, but with the lead GrapheneOS developer being publicly known, perhaps a stealth hotel visit would do the trick.
So, new plan: you build GrapheneOS yourself, from source code. Except, can you really read it all? Are you sure it is safe? After all, Linux was nearly backdoored in 2003 with only two inconspicuous lines hidden deep in the kernel (shown below). So... if you read it all and verify that it is perfect, can you trust your compiler? Your compiler could have a backdoor (remember Ken Thompson's "login" demo from "Reflections on Trusting Trust"?), so you've got to check that too.
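For the curious, the reported 2003 change (caught in review of the kernel's CVS mirror) hinged on a single character, an `=` where `==` belongs:

```c
/* The two lines reportedly slipped into sys_wait4() in 2003. It looks
 * like a routine options-validity check... */
if ((options == (__WCLONE|__WALL)) && (current->uid = 0))
        retval = -EINVAL;
/* ...but 'current->uid = 0' is an assignment, not a comparison: calling
 * wait4() with that odd flag combination silently makes the caller root,
 * and since the assignment evaluates to 0 (false), -EINVAL is never even
 * returned, so nothing looks wrong from user space. */
```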
At this point, you realize that maybe your code and compiler are clean - but it's all written in C, so maybe there are memory overflows that haven't been detected yet, and the CIA could get in that way (kind of like with Pegasus). In which case, you might as well carefully rewrite everything in Rust and Go, just to be sure. But at that point, you realize that your GrapheneOS phone relies on Google's proprietary bootloader, which is always signed by Google and cannot be replaced. Can you trust it?
You can't, and then you realize that the chip itself could have countless backdoors that no software can fix (say, Intel ME, or even just a secret register bit). So, new plan: you design and build your own CPU, your own GPU, and your own silicon for your own device. Now it's your own chip, with your own software. Surely that's safe.
But then you realize there's no way, even after delidding the chip, to verify that the fabrication plant didn't tweak your design. In which case, you might need your own fabrication plant... but then there's the risk of insider attacks... and how do you even know the chip-making machines themselves are fully safe? How do you know the CIA didn't come knocking, make a few minor changes to your design, and then gag the factory with a National Security Letter so it could never give you a whiff of it?
But even if you managed to get that far - great, you've got a secure device - how do you know that you can securely talk to literally anyone else? Fake HTTPS certificates from shady vendors are a thing (TrustCor?). You've got the most secure device in the world, and it's terrified to talk to anybody or anything. You might as well start your own certificate authority now and have everyone trust you. Except... aren't those people... in the same boat now... as yourself? And also, how do you know the NSA hasn't broken RSA and the entire encryption ecosystem with those supercomputers and mathematicians of theirs? How do you know we aren't all using a whole new Dual_EC_DRBG, and that Curve25519 isn't rigged?
The rabbit hole will never end. This doesn't mean that we should just give up - but it does mean we shouldn't be so ready to nitpick the flaws in every step forward, as there will be no perfect solution.
Oh, and did I mention that your cell service provider knows your location and your identity at all times, regardless of how secure your device is?
Edit @INeedMoreRAM:
For Nextcloud: from a technical perspective it's fantastic, but your data is basically always going to be vulnerable to a technical breach of Linode, an insider threat within Linode, or a warrant being served (either a real warrant or a fraudulent one, which does happen).
You could E2E-encrypt it with Nextcloud (https://nextcloud.com/endtoend/), which would solve the Linode side of the problem, but there are limitations you need to look into. Also, if a warrant were served (most likely an authentic one if police physically show up - at least more likely than a fraudulent one used to seize your hosted data), your home could be raided, your recovery keys found, and the data accessed that way. Of course, you could destroy the keys and rely only on your memory - but what a thing to do to your family if you die unexpectedly. Ultimately, there's no perfect silver bullet.
Personally... it's old school, but I use encrypted Blu-rays. They take forever to burn, but they come in sizes up to 100 GB (and 128 GB in rare Japanese versions), they are stored physically in my home, offline, and I replace them every 5 years. This is coupled with a NAS. It's not warrant-proof, but I'm not doing anything illegal - it is, however, fake-warrant-resistant and resistant to insider threats at tech companies, and I live in an area where I feel relatively safe (even though this is certainly not break-in-proof). You could also use encrypted tape.
I run Nextcloud on a Raspberry Pi at home with fail2ban, brute-force protection, MFA, and E2EE, backed up remotely with encrypted Borg backups. The 4 TB SSD safely serves my friends and family too. My laptop's and GrapheneOS phone's files, apps, and settings are backed up to it automatically every day. I have too many Nextcloud apps installed to list, but it is basically an all-in-one solution for your cloud needs.
Both Nextcloud and GrapheneOS are FOSS, which addresses your concern about them being a government trap.
My partner is able to access my Bitwarden account if I were ever to be indisposed.
Sure, nothing is perfect, but tell me how this is not a better solution than trusting the closed-source ecosystem of the biggest corporation in the world.
“Both Nextcloud and GrapheneOS are FOSS, which addresses your concern about them being a government trap.”
I was merely referring to the fact that unless you build the code yourself, you have no certainty that a government has not stolen a FOSS signing key and shipped a custom hacked build to your device. Unlikely? Yes. Possible? Yes. Also, as the 2003 Linux incident showed, a backdoor can be as hidden as a deliberately missing equals sign in one line of code - so a sneaky government commit with the smallest backdoor could go undetected even in FOSS. I still think it's better than proprietary - don't get me wrong - but it's not invincible, which was my main point about how security never ends.
Right, but nobody can write all the code they need for every service. I agree nothing is invincible. We put varying degrees of trust in the people and processes of the communities that maintain the software. FOSS requires much less trust than proprietary software developed by megatech.
> Use your own server? Great, it's secure software-wise, but if someone breaks into your house, it's all of a sudden the worst liability ever.
This doesn't invalidate the rest of your point, but if your data isn't encrypted at rest on your own hardware, that one particular point? That's your own fault.
You will need some kind of remote mounting mechanism. Imagine you are abroad and the power at home goes out for a short period. How do you boot remotely and mount the encrypted filesystem?
Not an easy task. You will need something like a dropbear SSH server in the initramfs that you dial into to enter your encryption key. Many moving parts. Don't get me started on having to update the packages for security fixes.
I've been running my own Nextcloud instance on a Linode with 2FA, and your response made me question how secure it really is.
Even though I get an A+ on the Nextcloud Security Scan (https://scan.nextcloud.com/), have 2FA, and custom IP blocking set up in my .htaccess file, it's disheartening to know that I'm not as secure as I thought I was.
I removed all my photos/files from iCloud for privacy reasons, and now I feel helpless contemplating how Linode may just hand my data over if served a warrant.
Any other Nextcloud hardening tips you'd recommend besides Fail2ban and reverse proxying? May I ask what your workflow looks like for preserving files over time?
Nextcloud has three recommended add-ons that you can install in a few clicks:
-Brute force protection
-End-to-end encryption (see the sketch after this list)
-Multi-factor Authentication
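On the end-to-end encryption point, the principle that matters is that the key never leaves the client, so the host (Linode, Apple, anyone) only ever stores ciphertext. Here is a minimal sketch of that idea in C with libsodium - Nextcloud's real E2EE module works differently, this just illustrates the property:

```c
/* Client-side encryption sketch: the server receives only ct + nonce;
 * the key stays on this machine, so a warrant against the host yields
 * nothing but ciphertext. Build with: cc demo.c -lsodium */
#include <sodium.h>
#include <stdio.h>

int main(void) {
    if (sodium_init() < 0) return 1;

    unsigned char key[crypto_secretbox_KEYBYTES];
    unsigned char nonce[crypto_secretbox_NONCEBYTES];
    crypto_secretbox_keygen(key);          /* never uploaded anywhere */
    randombytes_buf(nonce, sizeof nonce);  /* public, sent alongside ct */

    const unsigned char msg[] = "family photos";
    unsigned char ct[crypto_secretbox_MACBYTES + sizeof msg];
    crypto_secretbox_easy(ct, msg, sizeof msg, nonce, key);

    printf("uploading %zu ciphertext bytes\n", sizeof ct);
    return 0;
}
```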