Apple started using a new technology for uploads to their iCloud service.
No one with half a brain uploads anything to an Apple server except Apple fanboys. No one even thinks about it since Edward Snowden came on the scene. Only fanboys do something so foolish, as only fanboys refuse to believe anything negative about the company they so love (a company that continually shafts them).
The Beeb came out with an update on the above story just today.
Quoting from the first piece:
'But there are concerns that the technology could be expanded and used by authoritarian governments to spy on its own citizens.'
Gosh. Gasp. Ya think?
This is mostly about scans of data uploads - something like what Google and YouTube already do: scan audio for words they don't like and pieces of music they want to block, scan photographs to read book spines and magazine covers, all to fatten the dossier they can share with companies that collect data on your shopping habits.
Not to mention of course that these revelations don't come from whistleblowers. Apple wanted this information out there. They have people from Facebook helping them. As Rick Falkvinge said over ten years ago: watch out when they start talking about child porn - that means they're up to something, that's always the thin end of their wedge.
What's more worrying than Apple scanning uploads is the fact that Apple can spy on their users at any time, not just on uploads. This is something that at least some people have been aware of for quite some time. At least ten years. And this spying could be going on right now and you'd be mostly unaware of it. People mostly don't want to think about things like that. And the danger increases geometrically when you're dealing with a box that's been hermetically sealed on Planet Groovy.
FAANG (Facebook, Amazon, Apple, Netflix, Google) would love to scan your hard drives. Google tried once before with Google Desktop and got booted on their butts.
But how about today?
Facebook and Twitter can't do a satisfactory scan: they lack admin access, and anything they try in userland would stick out.
Microsoft can do it by running surreptitious traffic through a catch-all DLL. They started doing that years ago.
Apple can scan your devices at any time. Firewalls can perhaps pick this up, and in some cases stop it, but won't you even then be looking for the proverbial needle in the proverbial stack of proverbial needles?
Jeffrey Paul published a study of what happens with a brand-new Apple box that's been totally stripped - totally - so it can't possibly phone home. Except it does anyway. Lots. And there's seemingly no way you can do anything about it. If Jeffrey Paul, a hyper-paranoid user you might say, can't stop it, and if he can still say 'things are pretty OK', then you have more to worry about than you ever imagined.
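Spotting that traffic really is a needle-in-the-haystack exercise. For illustration only - a minimal sketch, assuming a hypothetical firewall log format (the tuples and timestamps below are made up) - this is the kind of filtering you'd be doing by hand:

```python
# Hypothetical firewall log: (timestamp, process, destination host) tuples.
# The format and the entries are invented for illustration.
log = [
    ("09:14:01", "firefox", "mozilla.org"),
    ("09:14:02", "trustd", "ocsp.apple.com"),
    ("09:14:05", "curl", "example.com"),
    ("09:14:07", "nsurlsessiond", "gateway.icloud.com"),
    ("09:14:09", "apt", "deb.debian.org"),
]

def phone_home_entries(entries, vendor_domains=("apple.com", "icloud.com")):
    """Return log entries whose destination falls under a vendor domain."""
    hits = []
    for ts, proc, host in entries:
        if any(host == d or host.endswith("." + d) for d in vendor_domains):
            hits.append((ts, proc, host))
    return hits

for hit in phone_home_entries(log):
    print(hit)
```

Easy enough on five lines of toy data. Against the thousands of connections a real box makes in a day - with the vendor free to rename daemons and rotate endpoints - it's the stack of needles all over again.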
'Instead of focusing on making it easy for people to report content that's shared with them, Apple built software that can scan all the private photos on your phone - even photos you haven't shared with anyone. That's not privacy.' - Will Cathcart, WhatsApp
Long ago yours truly cohosted a discussion on privacy and security in Stockholm. I opened things with the stereotypical review of the technical aspects, then my cohost took over and told a story.
Back in the days before WW2, he said, the Dutch government announced a new national census. One of the questions in that census was about ethnicity. The people of the Netherlands protested. Their leaders responded.
'We understand what you're getting at', they told the people, 'but trust us - we'd never misuse that information.'
'And they didn't', said my colleague. 'But then the Nazis moved in.'
'Anyone here not heard of Anne Frank?'
This drives home an important point. Perhaps the most important point of all, ever. Namely that it's never been about people misusing information about you that they shouldn't have - it's about that information being 'out there' in the first place.
Once you grasp that, you've found the kernel of the issue and you'll know how to proceed.
So: can Apple spy on their users? Can Microsoft?
All bets are off with Microsoft. Their systems are closed source. They can channel traffic so it stays under the radar.
But how about Apple?
For a spy operation to be thorough, one has to be able to reach all parts of a file system. Apple desktop users can secure things beyond prying eyes in system areas accessible only to those with an admin password. (The days when Maccies believed you could hide things on a Unix system are hopefully long gone, and the carpetbaggers trying to peddle that snake oil are hopefully gone as well.)
But things took a turn for the worse some years ago, as a few of you may remember. A review of the issue at the time indicated that things began back in the days of 10.5 Leopard.
At issue was the software update install process. As files will often require root privileges to be overwritten, the user had to submit a root password. That's the way it always worked, back in the day when you got your OS on a disc sent to you by post. But no longer. Suddenly, and without fanfare of any kind, things went smoothly - without a password prompt. How was that possible?
What we found at the time - without digging all the way down - was that Apple somehow connected to a system root process to do the dirty work.
The Apple fanboy army didn't react. Anything written on the topic they misconstrued to the extreme - the one thing that makes them indispensable to Apple.
But this served only to bring attention to an issue that was already there. Namely that Apple, as vendor, would have countless root:wheel processes that could do this at any time. Already from Day One.
So how do you secure a system? How do you secure your own system? How do you secure your own life?
If you don't understand that this is a major issue, not even after Edward Snowden, then there's little hope for you. For those of you who are concerned: it's an uphill battle, it's always going to be an uphill battle, and the fight will never go away - the bad guys are always coming back with new tricks.
Joe and Josephine Bloggs? They're gleefully using Alexa. They don't worry about what's going on. Would it have been possible to make an Alexa that was privacy-aware? Certainly. But FAANG never bothered.
'But then the Nazis moved in.' Remember that.
Your information is out there. You know that someone has that information. Information that can be used against you. In ways you can't even imagine. Seemingly innocent information. But you just wait.
Stupid promises from a menacing Queen Bee in California that they'll never hurt you are useless, and they miss the point. It should never be anyone else's prerogative what they do with your information. They shouldn't have that information.
You might try doing as Andrew did.
'On the other hand, working in Linux I feel so clean. Nothing is hidden. Everything is laid bare; it's only up to me to find it and understand it. There's no more nasty, greasy worry about what my OS is really doing beneath its happy face. Has MS already installed one of its spyware components in a trojan security patch? No more. And when it's done, I'll know exactly what I have, and it will be what I want, no more or less, because I chose and configured every piece of it.'
It's not known if Andrew ever reached his goal. And don't forget Ken Thompson's bombshell, when he revealed he'd had an undetected backdoor hidden in the Unix C compiler all along.
Andrew and a crew of invincible code auditors from OpenBSD would need to go over a complete system with a very fine-toothed comb for a very long time to offer even reasonable assurance that the system was not compromised - much less that it was 100% safe.
But using only open source, abandoning corporate solutions, and taking the DIY approach: that's a huge step in the right direction.
Some twenty years ago, when we began this journey, the issue at hand was security. Microsoft systems were inherently insecure and could not be made secure. It took years of teaching systems programming and reflecting on our own materials to fully understand that - something few others understand even to this day. (Or care about, sadly.)
Our friend Freddie - the one who'd done the groundwork on tracing the ILOVEYOU worm back to its origins - said it once at a pub night in Stockholm's Silicon Valley, after putting down way too many pints of the local lager. He said it almost casually, in the men's room of the establishment, an innocent reflection on whatever we'd all been talking about at the table with the group:
'You know what? Privacy and security: they're gonna be the buzzwords for the first ten years of this new millennium.'
Did anyone believe it? Did Freddie believe it? Freddie was totally hammered that evening.
But look where we are today. Look how much we've learned about what those technology companies are up to, as we down those pints after yet another hard day's work. No one seems to mind the occasional tingle of grass growing between their toes, but the bad guys never let that happen.
We have to stay one step ahead of the bad guys. We have to make sure we can count on that extra step being there. We have to fix that matter 'mathematically'. We have to have robust assurances that our systems work, that our privacy and security are intact, that they cannot in any way be compromised.