How Private Is Your iPhone Data, And How To Protect Your iPhone Privacy

How private is your iPhone, and the personal data stored on it? We examine the iPhone's built-in privacy measures, explain how to protect your iPhone privacy, and argue that Apple is more deserving of your trust - and your data - than Google. Latest: iMessage's end-to-end encryption has been improved.

The biggest political battle of the second half of the 2010s may well be privacy. At time of writing it's US presidential primary season, and privacy is one of the few areas of genuine disagreement. (Ted Cruz is against expansion of governmental surveillance, Trump and Rubio are loudly in favour of it, and Bernie Sanders has called NSA activities "Orwellian". Hillary Clinton's position, as on many things, remains somewhat unclear.)

Most of all this battle will be fought in the realm of technology, where corporate behemoths Apple and Google represent (at least in the mind of the average tech user) opposite ends of the spectrum. Apple makes lots of noise about protecting its users' privacy, while Google… well, we'll talk about that in a moment.
Still, talk is cheap. If you're wondering how seriously Apple takes privacy - and about the safeguards in place to protect the data stored on your iPhone or other Apple device or service, such as the potentially sensitive medical data stored by CareKit apps - then wonder no longer, because we've put together the reasons why we believe that Apple respects customers' data privacy more than Google. Read next: How to protect your privacy on Mac
Jump to the latest details & timeline of Apple's privacy battle with the FBI

iPhones are equipped with a number of powerful privacy measures:


The iPhone is not easy to break into, and quite aside from Apple's corporate position on privacy, the smartphone itself has several protective features that help to safeguard your privacy.

Best iPhone privacy measures: Passcodes


First up: we always recommend that readers set a passcode for their iPhones. This simple measure can be surprisingly effective at stopping people from getting at your data, as the FBI discovered recently.
How to improve your iPhone privacy: Simple as an iPhone passcode may be - we'd recommend a custom alphanumeric code rather than four digits, but even the latter is a deterrent to casual identity theft - it takes a lot of work to crack one. This is largely because iOS builds in delays: each passcode check is deliberately engineered to take 80 milliseconds, longer than it strictly needs to, and if you get the code wrong six times in a row the iPhone is locked for a minute, with further incorrect guesses resulting in longer delays. These measures prevent hackers from using brute force to machine-guess hundreds of codes in quick succession. See also: How to make iPhone data go further.
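To see why those 80 milliseconds matter, here's a back-of-the-envelope sketch in Swift - our own illustration, not Apple's code. Even ignoring the escalating lockouts, the per-guess delay alone separates a crackable four-digit PIN from an effectively uncrackable alphanumeric code:

```swift
import Foundation

// Rough worst-case brute-force times at 80 ms per passcode check.
// This ignores the escalating lockouts, which slow real attacks even further.
let secondsPerGuess = 0.080

let fourDigitCodes = pow(10.0, 4)   // 10,000 possible 4-digit passcodes
let sixCharCodes   = pow(62.0, 6)   // ~5.7e10 six-character alphanumeric codes

print(fourDigitCodes * secondsPerGuess / 60)         // ≈ 13 minutes, worst case
print(sixCharCodes * secondsPerGuess / 31_557_600)   // ≈ 144 years, worst case
```

In other words, the built-in delay turns a lunch-break attack on a four-digit code into a multi-lifetime project against a decent alphanumeric one.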
The six-wrong-attempts delay is always active, but there's a second, more drastic measure you can choose to enable if you carry highly sensitive or business-critical data. If you want, iOS will erase your data if someone (including you!) gets the passcode wrong 10 times in a row. Go to Settings > Touch ID & Passcode, enter your passcode and then scroll down to Erase Data. But only enable this if you're willing to run the risk of accidentally erasing everything after a few drinks.

Best iPhone privacy measures: Touch ID


The iPhone 5s and later come with Touch ID fingerprint scanners. You can use your fingerprint to unlock the device itself, and third-party developers have for some time been able to build Touch ID into their apps - enabling you to fingerprint-protect password keepers, banking data, health data and so on. As of iOS 9.3, you can use Touch ID - and passwords, for that matter - to protect individual notes in the Notes app.
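For developers, the hook is Apple's LocalAuthentication framework. Here's a minimal sketch of the standard pattern; the `unlockVault` function is our hypothetical stand-in for whatever data your app protects:

```swift
import LocalAuthentication

// Hypothetical placeholder for the app's own protected functionality.
func unlockVault() {
    print("Vault unlocked")
}

func authenticateAndUnlock() {
    let context = LAContext()
    var error: NSError?

    // Bail out gracefully if the hardware is absent or no fingerprint is enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Touch ID unavailable: \(error?.localizedDescription ?? "unknown error")")
        return
    }

    // Prompt for a fingerprint; the reason string appears in the system dialog.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved passwords") { success, evalError in
        if success {
            unlockVault()
        } else {
            print("Authentication failed: \(evalError?.localizedDescription ?? "cancelled")")
        }
    }
}
```

Note that the app never sees the fingerprint itself; it only receives a yes/no answer from the system.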
Fingerprints aren't necessarily more secure than passcodes and passwords - a reasonably long alphanumeric passcode is extraordinarily time-consuming to crack - but they are far more convenient, which makes it much more likely that we will use them.
But the benefits of Touch ID are not straightforward, and my colleague Glenn Fleishman discusses this in a separate article, The scary side of Touch ID. As he puts it:
"Someone might be able to coerce a password from you with a wrench... But it still requires that threat and your acquiescence. [...] Mobile fingerprint sensors change that equation dramatically. An individual who wants some of your information must only get hold of your device, ensure it hasn't been rebooted, and hold an appropriate digit still for long enough to validate one's fingerprint.
"As I touch, touch, touch, I think about about Hong Kong and mainland China; about Afghanistan and Iraq; about Ferguson, Missouri, and police overreach and misconduct; and extrajudicial American operations abroad and domestic warrantless procedures and hearings about which we know few details. I think about the rate of domestic violence in this country.
"As a nonconsensual method of validating your identity wherever you're carrying a device, coupled with software that likewise recognises it, Touch ID requires a bit more thought than just registering your fingerprints."
How to improve your iPhone privacy: Here's a small related item of interest to anyone who wishes to keep their iPhone as private as possible. It's been ruled, in the US at least, that police can force a suspect to use Touch ID to unlock a device - following the reasoning that a fingerprint is a piece of physical evidence - whereas a passcode is viewed as knowledge and is protected by the Fifth Amendment... not that there is any logical way for police to extract this information short of waterboarding.
In other words, for the extremely privacy-conscious, securing an iPhone with a passcode alone is actually a better choice than using Touch ID.

Best iPhone privacy measures: iMessage


One of the most quietly secure aspects of the iPhone is the iMessage platform. It works across Apple's hardware - iPhone, iPad, iPod touch and Mac - to provide a well-integrated messaging service. As long as you have either an iCloud email account or an iPhone with a data plan, you'll be able to use iMessage.
If you message someone and the text bubbles are green, you are sending ordinary SMS messages (which is only possible from an iPhone, since SMS can only be sent between mobile numbers). If the bubbles are blue, iMessage has cleverly detected that you and the recipient are both Apple users, and the messages are sent over an internet connection instead - 3G, 4G or Wi-Fi. These don't count against your text-message allowance from your mobile operator, and work like WhatsApp or Facebook Messenger chats.
Aside from the excellent integration and automatic detection, a great aspect of iMessage is that it is an encrypted platform: Apple has engineered a way for your messages to be readable only by you and the recipient. This is known as end-to-end encryption. To get an idea of the scale of iMessage, it processes peak traffic of 'more than 200,000 messages per second, across 1 billion deployed devices'. This is according to a recent report by researchers at Johns Hopkins University, spotted by Patently Apple, which praises Apple for improving the security of iMessage in the March 2016 iOS 9.3 and OS X 10.11.4 updates.
End-to-end encryption is just one approach to securing personal messaging, but it is preferable because not even the company providing the service is able to read or intercept messages. The same Johns Hopkins report did, however, find that because Apple does not rotate its encryption keys as regularly as other secure messaging services, larger volumes of historical message data could be at risk should the encryption ever be broken by a malicious hacker.
All of this is, however, very theoretical: Apple's iMessage is an excellently secure messaging service, and the highest praise we can give it is that from a user's perspective it just works. Without requiring any input from its users, Apple runs one of the largest, most secure messaging networks on the planet.
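To make the end-to-end idea concrete, here's a conceptual sketch in Swift using Apple's modern CryptoKit framework. This is emphatically not Apple's actual iMessage protocol - the key setup and the "iMessage-sketch" label are our own illustration - but it shows the core property: a relay server that sees only ciphertext and public keys cannot read the message.

```swift
import CryptoKit
import Foundation

do {
    // Each party holds a private key; only the public halves are ever shared.
    let alice = Curve25519.KeyAgreement.PrivateKey()
    let bob = Curve25519.KeyAgreement.PrivateKey()

    // Derive a shared symmetric key from one private key and the other's public key.
    func deriveKey(_ mine: Curve25519.KeyAgreement.PrivateKey,
                   _ theirs: Curve25519.KeyAgreement.PublicKey) throws -> SymmetricKey {
        let secret = try mine.sharedSecretFromKeyAgreement(with: theirs)
        return secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                              salt: Data(),
                                              sharedInfo: Data("iMessage-sketch".utf8),
                                              outputByteCount: 32)
    }

    // Sender encrypts: a server relaying this sealed box cannot decrypt it.
    let sealed = try AES.GCM.seal(Data("Meet at 6?".utf8),
                                  using: deriveKey(alice, bob.publicKey))

    // Recipient derives the identical key from his private key and Alice's public key.
    let plaintext = try AES.GCM.open(sealed, using: deriveKey(bob, alice.publicKey))
    print(String(data: plaintext, encoding: .utf8)!) // "Meet at 6?"
} catch {
    print("Crypto error: \(error)")
}
```

The key-rotation caveat from the Johns Hopkins report maps directly onto this sketch: if the same long-term keys are reused for years, one compromised key unlocks a long history of stored ciphertext.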

Best iPhone privacy measures: Secure Enclave


We'll come back to Apple's privacy battle with the FBI in more detail in a bit, but it's worth discussing one technical aspect of that case here. The iPhone belonging to one of the shooters in the San Bernardino case (or rather, belonging to his employer) is a 5c model, and this - the company claims - is crucial to Apple's ability to open it up. iPhones more recent than this are equipped with security measures that mean even Apple's own engineers wouldn't be able to access the data inside.
As well as introducing Touch ID, the iPhone 5s was the first iPhone to feature a security measure that Apple calls the Secure Enclave. This is an area of the processor chip - essentially a separate processor in its own right - that stores fingerprint data and other security-critical information. It is also a crucial part of the encryption setup.
"The Secure Enclave uses a secure boot system to ensure that it the code it runs can't be modified," explains Mike Ash, "and it uses encrypted memory to ensure that the rest of the system can't read or tamper with its data. This effectively forms a little computer within the computer that's difficult to attack."
(I'm obliged to Mike for virtually all of my understanding of the Secure Enclave's technicalities, but he acknowledges in turn that his findings partly derive from Apple's published security guide: the security measures mean that a lot of the Secure Enclave's details remain unverifiable.)
The generally agreed plan for Apple to break into the shooter's iPhone 5c involves the company's engineers creating and installing a custom build of iOS - one without the security measures that prevent brute-forcing of the passcode. On devices that have a Secure Enclave, it is surmised, the enclave's own OS features defensive measures that would delete the keys to the encrypted data if new firmware were installed - which is why the same approach wouldn't work on the iPhone 5s and later.
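The Secure Enclave isn't just for Apple's own use: apps can anchor their own keys in it. Here's a minimal sketch using the modern CryptoKit framework - `signReceipt` is our hypothetical example, and it requires a real device with a Secure Enclave, i.e. the iPhone 5s or later:

```swift
import CryptoKit
import Foundation

// Hypothetical example: sign a piece of data with a key that lives entirely
// inside the Secure Enclave. The private key is generated in the enclave and
// never leaves it; the app only ever holds an opaque handle to it.
func signReceipt(_ receipt: Data) throws -> Data {
    let key = try SecureEnclave.P256.Signing.PrivateKey()
    let signature = try key.signature(for: receipt)
    return signature.derRepresentation
}
```

Even if the rest of the OS were compromised, an attacker could ask the enclave to sign things but could never extract the key itself - the same design principle that keeps your fingerprint data out of reach.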

Apple is publicly committed to user privacy


Update 30 November 2016: Despite Apple's stance on user privacy, it would seem that the company keeps constant logs of your iPhone calls in iCloud (as reported by Forbes). The information comes from Elcomsoft, a Russian maker of iPhone forensics tools, which found that iCloud stores around four months of data - from call logs to user data - in real time. The only way to prevent this is to disable iCloud completely, as there is no setting for turning off these automatic uploads. This suggests that Apple isn't fully disclosing all the data stored in iCloud, and isn't as transparent as we might have thought.
Following the San Bernardino shootings of December 2015, the FBI obtained a warrant to search an iPhone 5c belonging to one of the shooters, Syed Rizwan Farook (the phone was technically the property of Farook's employers, which was a factor in obtaining permission to do this). Yet the FBI were unable to get into the device because it was locked with a passcode, and sought - and obtained - a court order instructing Apple to open the phone up.
But Apple refused, and published its reasons in an open letter on 16 February 2016 from the CEO, Tim Cook.
"The implications of the government's demands are chilling," the letter reads. "If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge.
"Opposing this order is not something we take lightly."
Indeed, at its 21 March 'Let us loop you in' launch event, Apple took time before mentioning any of its new products to reiterate its determination to stare down the FBI. 
"We did not expect to be in this position, at odds with our own government," said Tim Cook. "But we have a responsibility to help you protect your data and protect your privacy. We owe it to our customers and we owe it to our country. We will not shrink from this responsibility."
Apple has talked about the importance of data privacy many, many times in the past, but this is the clearest statement yet that the company is prepared to take concrete action on that principle.
I personally feel that Cook has been outmanoeuvred to a certain extent. It's about the worst case on which to make a stand that you could imagine: the most deadly domestic terrorist attack the US has faced since 9/11, a subject on which the US public will surely, surely take the side of law enforcement. (Sure enough, a Pew Research Center poll found that 51 percent of Americans think Apple should hack the phone, compared to 35 percent who think it should not.)
And it's the worst time: presidential primary season, when Republicans are queueing up to act tough (Donald Trump has asked who Apple think they are for making this statement, but then again this is the genius who said they should make "their damn computers and things" on home soil) and Democrats won't dare support an unpopular cause.
But this makes the move even more admirable. I don't think Apple is doing this because it's a good strategic move - although caring about your customers is a pretty good business model that's served Apple well over the years - but because it believes this is the right thing to do.
Lots of tech companies talk about privacy, and indeed in this case many other major tech firms, including Microsoft and even Google, have come out in solidarity with Apple's stance. But there's a difference between saying and doing.
I also couldn't help but notice that there was a fair gap between Apple's statement and most of the supportive comments, as if the other companies were waiting to see who else would commit themselves before jumping in. Indeed, NSA whistleblower Edward Snowden tweeted on 17 Feb 2016 at 4:43pm that "silence means @google picked a side, but it's not the public's", and Google boss Sundar Pichai's admittedly admirable response came more than seven hours later.
Apple is powerful enough to stand up to overreaching governmental prying, and it has a business model that depends on loyal customers who love the company and its products so much that they are willing to pay more than the going rate for their smartphone. It also makes sense for the company, from a PR point of view, to act in a way that highlights the contrast with Google's philosophy.
Apple has the means, and it has the motive, to safeguard its users' privacy.
Tim Cook has also explained Apple's stance in an interview with ABC News.

Latest developments in Apple/FBI privacy battle


Update, 6 May 2016: Up until this point Apple has given the impression that iPhone models equipped with a Secure Enclave - the iPhone 5s and later, in other words, but not the iPhone 5c at the centre of the San Bernardino case - are effectively uncrackable if protected by a passcode, and that even Apple's own staff cannot bypass iOS's anti-brute-force protections. But a new revelation puts that theory in doubt.
According to the LA Times, police hired a hacker earlier this year to break into a passcode-protected iPhone 5s - a device with a Secure Enclave - and the hacker was successful. (The phone was owned by April Jace, the victim in a high-profile suspected murder case.) This occurred during the same period when Apple and the FBI were disputing whether Apple should be obliged to open up an iPhone 5c in a separate case.
The LAPD's actions are outlined in a search warrant written up by LAPD detective Connie Zych, who stated that the department found a "forensic cellphone expert" who could "override the locked iPhone function". The force has thus far declined to provide any more detail than that - the identity of the expert, the method used, the information recovered - and as with the FBI case, conspiracy theorists will speculate about whether it actually happened.
It's understood that the phone was running iOS 7 or earlier, and thus did not enjoy the additional encryption measures added with iOS 8. But this is still a blow to Apple's reputation as a maker of ultra-private smartphones, at least until more detail emerges.
Of course, it's also an eye-opener for anyone who still believed that US law enforcement only wants to break into citizens' phones if they're involved in terrorist plots.
Update, 14 April: You remember that iPhone everyone was so excited about opening up? It turns out there was nothing useful on there after all.
CBS News quotes "a law enforcement source" as stating that so far, "nothing of real significance" has been found on the San Bernardino shooter's iPhone 5c, which was finally cracked last month by - it is alleged - a team of professional hackers. The FBI continues to analyse the data, and may yet make discoveries that aid in the prevention of future attacks, but that now seems unlikely. As it always was, incidentally, given that the attackers were acknowledged to be self-radicalised and not part of a cell.
So months and months of legal wrangling, threats and political grandstanding, and what are we left with? An old phone with the passcode deactivated, and an apparent software vulnerability that threatens the security of millions of iPhone owners around the world, but which Apple can't patch because the FBI won't tell them about it.
Update, 13 April: A further development. The Washington Post is now alleging that US law enforcement officials didn't hire Cellebrite at all; they hired a team of professional hackers.
Whether this is quite the ethical misstep that the word 'hacker' might imply is debatable: many hackers earn a reasonably respectable living seeking out software vulnerabilities and then selling that knowledge back to the vendor rather than using it for nefarious purposes. But the fact that the FBI still refuses to tell Apple about the vulnerability that was used to crack the iPhone - and thereby allow it to safeguard the millions of iPhone 5c models around the world from being cracked in the same way - raises broader questions about surveillance culture and the state's approach to its citizens' privacy.
This will also worry people who own an iPhone 5c, of course.
Update, 29 March: And that seems to be that. As predicted last week, the US Department of Justice and FBI have conveniently found another way into the phone and withdrawn their case against Apple. In a statement, Apple said: "From the beginning, we objected to the FBI's demand that Apple build a backdoor into the iPhone because we believed it was wrong and would set a dangerous precedent. As a result of the government's dismissal, neither of these occurred. This case should never have been brought."
It's still not completely clear how the FBI got into that rogue iPhone 5c - possibly the Israeli firm Cellebrite mentioned below - but the fact that it was able to do so without Apple's help obviously undermines the arguments it used in court.
Update, 24 March: Extraordinarily, the FBI appears to have backed down.
On Monday night, shortly before Apple was scheduled to start setting out its defence, the Department of Justice's legal team asked the judge to postpone the hearing on the grounds that it had found a third party who could help them break into the phone. (And presumably on the unstated grounds that it was no longer sure it could win the case, and didn't want to set a precedent.) The third party has since been revealed to be an Israeli forensic software firm named Cellebrite. The case is not officially over, but it looks like Apple has won. We offer them our sincere and hopefully not premature congratulations.
Update, 1 March: In a separate case that is likely to have a bearing on the San Bernardino judgement, a New York judge has sided with Apple and struck down an order for the company to hack a different iPhone, belonging this time to a drug dealer. "I conclude that none of those factors justifies imposing on Apple the obligation to assist the government's investigation against its will," wrote the judge. "I therefore deny the motion."
Both cases depend on the All Writs Act of 1789, and similar arguments are likely to be made when Apple appears again to justify its case against the FBI.

Google has a long-term record of privacy-hostile behaviour


Google, by contrast, has both the means and the motive to pose a threat to its users' privacy.
Google's business model is very different to Apple's. Apple sells products, and premium-priced products at that; this is a strategy that depends on loyalty and love from your customers, but requires little sucking up to anyone else… except possibly the media. (And only the mainstream media; you probably wouldn't believe how aloof Apple is towards the tech press, who it feels confident will write about its products regardless of how they are treated.) Generally speaking, it is in Apple's best interests to treat its customers well. From time to time it may choose to make it relatively difficult for users to customise their watches, for example, or to download unauthorised software, but on the whole such tactics are intended to preserve a better user experience.
But Google gives away most of its best products, making money instead from the user data it collects in return. What Google actually sells isn't a search engine, or a mobile operating system; it's carefully targeted user eyeballs. As the old adage says, if you're not paying for a service then you're not a customer, you're a product.
Google is essentially an advertising business, and it has far less motivation than Apple to worry about the happiness of its users; in turn, it has far more motivation to erode user privacy.
And Google has a truly vast network of data sources. Granted, if Apple turned into a surveillance power overnight it could potentially gain access to a large quantity of personal data from your iPhone and Mac. (Although even there it faces limits; as we discuss above, the firm claims that, in contrast with Syed Rizwan Farook's 5c, its most modern iPhones contain security measures that would prevent even Apple's own engineers from opening them up.) But Google has a search engine, a web analytics service, a social network and a desktop operating system; it has YouTube and Gmail; and its mapping service, web browser and mobile operating system each have far more users than Apple's equivalents.
Google is tapped into every aspect of our lives. It's Skynet. It's the nearest thing to an all-knowing Big Brother that human society has known.
That's just the theory, but there's plenty of practical evidence to back it up: indeed, there are far more incidents of Google acting in a privacy-hostile manner than I can list here. But just as a taster:
Google has been criticised for too readily providing governments with information about their citizens; for prohibiting anonymous or pseudonymous accounts on various of its services; for installing cookies with a lifespan of 32 years; for refusing to offer a Do Not Track feature long after every other major browser maker had done so; for harvesting data from (admittedly unencrypted) private Wi-Fi networks across 30 countries without permission; and for making Gmail users' contact lists public by default at the launch of Google Buzz.
In 2007 Privacy International gave Google (and Google alone) its lowest possible ranking: 'Hostile to Privacy'. In 2009 Google CEO and part-time Indiana Jones villain Eric Schmidt responded to privacy concerns by saying that "if you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."
These cases and arguments represent merely the tip of the iceberg when it comes to Google and privacy. Those who are interested can read more on the subject at Slate, the Economist, Wired and even the dedicated Wikipedia page on the subject. Also, for balance, take a look at Google's own privacy policy page.
But my own conclusion is that these are not isolated incidents. They speak to a deeper truth. In my opinion, Google is institutionally and constitutionally an anti-privacy organisation, and everything I know about the two companies leads me to believe that Apple is far more deserving of your trust, and your data.
