Tag Archives: iOS

iOS Backdoors

In the last few days there has been increasing noise about some iOS backdoors. Apple does not deny they exist, but contests how they can be used.

This is not new, and the security researcher who presented these findings highlighted that they are likely related to methods used by certain forensic software sold to law enforcement.
What is “concerning” is the following:
– These backdoors are actively maintained and developed by Apple; how much more data will they allow to be extracted from iOS devices in the future?
– These backdoors provide access to SMS, contacts, and other potentially sensitive data on the phone; they also allow full disk encryption to be bypassed. This highlights the fact that unless your phone is off, the data on it is not really encrypted per se, but only protected by access control (a PIN);
– If they can be used by law enforcement, they can be used by “greyer” parties too.

A few links to get further:
Summary of this story

Zdziarski presentation, detailed and very informative

Apple response, not surprising and not really addressing the points of concern

A new iOS 6.1 hack

As seen on The Hacker News, there is currently a way to bypass the iPhone lock screen (possibly iPads with a SIM too?) on devices running iOS 6.1.x.

I had to change the steps listed in The Hacker News slightly for it to work:
- Go to the emergency call screen, hold down the power button and tap cancel.
- Dial 112 and tap the green call button, then immediately the red one.
- Return to the lock screen by pressing the power button.
- Go to the passcode screen by pressing the home button.
- Keep holding down the power button for …1…2…3… seconds and, just before the “slide to power off” slider appears, tap the emergency call button and… voilà!
- Then, without releasing the power button, press the home button and let go…

From there you gain full access to the phone application and can change/add/delete contacts, as well as use the phone to make calls, but you cannot end a call you started with that technique.



Carrier IQ, an interesting story of deception or what we could call the Facebook syndrome

It all started with some findings published by Trevor Eckhart on his website a few weeks ago.

He found that a California-based company called Carrier IQ (CIQ) had developed software that acted as a *keylogger* and was installed by default on many different mobile devices: Android, BlackBerry, Nokia phones, iPhones (iOS 3.x to 5.x), and also tablets.

The important point here is that this software is intentionally installed/provided by the device manufacturers or network carriers. It is quite amazing how widespread the use of that spying software is (the BBC reported 140 million devices). It is not limited to one type of device or provider. What they collect might differ (apparently much less on iOS than on Android), but it shows a systemic desire from the companies who make and sell those devices to gather usage and user information.

This is what I would call the Facebook syndrome!

The official stance from CIQ was that their software was only used to improve the “network experience” by sending some information back to carriers and phone manufacturers, such as signal strength, network information, etc.
They explicitly stated that they “do not and cannot look at the contents of messages, photos, videos, etc., using this tool”.

This is not what you would expect from software that logs every key pressed on your device…

Again, it is important to note that by default their software is not hidden (there is a visible check-mark in the status bar), but this can be modified by third parties. And it is being modified!

One example given by Trevor is Verizon in the US: although you can opt out, by default the phones they sell will record and transmit (?) the following personal user information: any URL accessed, including potential search queries, and the location of the device. This could be considered a significant invasion of personal privacy.

So how did CIQ react to Trevor’s post?
By sending him a Cease and Desist letter on the 16th of November!

They claimed Trevor was infringing their copyright (because he referenced some of their publicly available training material) and making false allegations.

As reported by The Register on the 24th of November, they eventually withdrew their legal threats thanks to the legal help of the EFF, who nicely summarise the case on their website, and also to a new post showing exactly what Trevor meant by calling CIQ’s software a “rootkit” (I called it a “keylogger” earlier, but rootkit is more accurate and also has wider security implications).

Trevor’s second CIQ article goes into detail as to why CIQ’s software is indeed a rootkit, with a video showing the different steps required to reproduce his tests. It also describes how the data is collected even if you are off the network and how, at least on an HTC phone, the data is not really anonymised.

Since then, another mobile phone hacker has published some findings about CIQ, this time confirming that Apple has included CIQ software in all its iOS versions from iOS 3 to the latest iOS 5. However, it seems that the information logged on Apple devices is much less than what is logged on Android: no URLs nor SMS, and the location is only sent if you have allowed it to be; furthermore, that information is not transmitted by default but only if the user manually chooses to send diagnostic information to Apple.

All this has generated an increasing level of noise and attention:

As pointed out in a viaForensics article, it is not clear when, or even whether, the data CIQ logs on the phone is transmitted, or whether it just remains on the device. And if transmitted, to where? But if it is being transmitted, I have a little story for you…

A few years ago I went on holiday and decided to take out an international data plan. I had an iPhone 3G at the time, and I monitored my data consumption every day with the built-in iOS bandwidth statistics. I stopped using data on my phone when I reached 90% of my allowed, pre-paid consumption.

I was therefore very surprised to be charged for going over my data allowance by a good margin! How could I have miscalculated my data consumption by so much? When I complained to my provider, they eventually claimed that the built-in iOS bandwidth statistics only showed average figures and were not accurate. I also read on some forums at the time that Apple claimed their figures should be taken as estimates only. With that in mind, I decided not to pursue it further, accepted the extra fee and promised myself never to use data roaming again.

Now, it would be interesting to know whether all the network traffic generated by CIQ is counted in those mobile OS bandwidth statistics or if, like the information it gathers, it is also hidden from view.
After all, if the provider goes to such lengths to hide the data they collect from you, they probably don’t want you to see that sealed fat envelope leaving your phone!

If that’s the case, how legal is this? Not only is spying on and gathering user information questionable, but doing so could be at the user’s expense! Couldn’t it be considered a hidden cost of their service? Could it explain the unexplainable extra fee I had to pay?

So I have three final comments to make:

  1. Mobile device companies are like any others: they want users’ personal information. But unlike others, they have full control of the device you discuss your life on.
  2. Opting in to usage statistics should be just that, an optional choice! And it should be made clear that it could result in extra costs, especially when roaming!
  3. If CIQ’s data consumption is also hidden from mobile OS statistics, then this is an extra hidden cost to the user.
Now, where have I kept my beloved 10-year-old Nokia 8210?

UPDATE, 12th of December 2011: Carrier IQ has responded to the issues discovered by Trevor with a 19-page document. I am not sure I find it very convincing.

New iOS Security attack, this time it looks bad!

Another attack on iOS security was published today, and it features two recurring themes from the attacks I described in previous posts, namely weaknesses in the Keychain and in the iOS encryption implementation.

But this time they have been used differently, and they seem to give an attacker access to any passwords stored on an iOS device, even if it is passcode protected.
One main difference with this attack is that the attacker only requires the iOS device and nothing else (as opposed to previous attacks, which also required the relevant synced PC).

It also seems to prove Zdziarski’s concerns over the iOS encryption controls to be true.
The attack uses some jailbreaking techniques to access the iOS device’s boot process/RAM, bypassing the passcode, and then uses the OS itself to run a script that accesses the local keychain and all the passwords it may contain (email, VPN, web apps, etc.).
It seems that the encryption keys are not tied to the user’s passcode, which means that if someone can bypass the passcode, then even though the data is in theory still encrypted, the attacker can use the iOS device itself to decrypt the data for him!
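As a rough illustration of why this matters, here is a toy Python sketch (this is NOT Apple's actual key hierarchy; all names and values are invented for illustration) contrasting a key derived from a device secret alone with one also entangled with the passcode:

```python
import hashlib

# Toy model: a hypothetical per-device hardware secret.
DEVICE_KEY = b"unique-per-device-hardware-key"  # made-up stand-in value

def file_key_device_only() -> bytes:
    # The design the attack exploits: the key depends only on the device,
    # so any code running on the device can reproduce it. Bypassing the
    # passcode is then enough to decrypt everything.
    return hashlib.sha256(DEVICE_KEY).digest()

def file_key_with_passcode(passcode: str) -> bytes:
    # Alternative design: the key is also derived from the passcode, so
    # code execution alone is not enough; the passcode must still be guessed.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_KEY, 100_000)

# First design: no secret is needed beyond what the device already holds.
k = file_key_device_only()

# Second design: a wrong passcode yields a different, useless key.
assert file_key_with_passcode("1234") != file_key_with_passcode("0000")
```

In the second design, stealing the device and running code on it still leaves the attacker facing a passcode guess, which is exactly the property the described attack shows is missing.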

Earlier I said it was “bad, but not that bad”. Now, it may be THAT bad! ;)

All the details, video and whitepaper, are available here:
Fraunhofer Institute

Follow-up on Apple iOS Full Disk Encryption

Regarding my previous post, I wanted to put into perspective some of the risks I described.
In a nutshell, it is bad, but not that bad! :)

Escrow keybag
There is indeed a forensic issue with the escrow keybag feature, but because it requires the attacker to have both the targeted mobile device and the computer used to sync it with, the attacker would first need to break the computer’s security to access its filesystem.

Because that computer is used to sync the mobile device, most of the information the device contains is likely to be on the computer as well.
For example, email accounts are likely to have been set up both on the computer and the mobile device, office files are likely to have been created on the computer, etc.

Therefore, gaining access to the computer’s filesystem is likely to already give you access to most of the mobile device’s data.
Having said that, there is no guarantee this will always be the case, and some information, such as call history, text messages, internet history, etc., would only be available on the mobile device (and its hopefully encrypted backup).

The point is that although the escrow keybag can indeed be used to bypass a mobile device’s protection and is therefore a security risk, it should be put into context with the difficulty of successfully gaining access to it in the first place.
In other words, it is bad, but not that bad!

Full Disk Encryption
The statement that I reproduced about the level of security offered by the iOS full disk encryption control should also be put into a wider context.
Jonathan Zdziarski claims it is inadequate because it automatically decrypts data whenever the data is requested; the way I understand it, its level of security is therefore dependent on the strength of the passcode used and on the device’s OS security (sandboxing, access control, etc.).

But this is also true of any full disk encryption control, on any platform.
If you gain knowledge of, or access to, the passcode you can then access the data.
And if malware runs on your full-disk-encrypted device, it will not be prevented from accessing any data associated with your credentials.
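To put numbers on the passcode-strength point, here is a quick back-of-the-envelope calculation, assuming the roughly 20 guesses per second rate reported for the iOS data-protection key derivation (that rate is taken from the article discussed in my earlier post; the passcode formats below are just examples):

```python
# Worst-case time for an on-device brute force at a fixed guessing rate.
GUESS_RATE = 20  # passcode guesses per second (reported figure, assumed here)

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every possible passcode at GUESS_RATE guesses/second."""
    return alphabet_size ** length / GUESS_RATE

# 4-digit PIN: 10**4 combinations -> 500 seconds, under 10 minutes.
print(worst_case_seconds(10, 4))  # 500.0

# 6-character lowercase+digit passcode: 36**6 combinations -> a few years.
print(worst_case_seconds(36, 6) / (3600 * 24 * 365))
```

So even with a deliberately slow key derivation, a short numeric PIN falls in minutes; the control is only as strong as the passcode feeding it.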

I therefore do not believe this is an Apple specific security risk.
In other words, it is bad, but not that bad!

Apple and their elusive Full Disk Encryption solution

I have been researching how Apple has been implementing their full disk encryption control over the weekend and what I found puzzled me:

Although technically Apple provides a hardware full disk encryption solution, in the traditional security sense of the term there is currently no full disk encryption available on the iPhone/iPad! Does that sound like a paradox? Let me explain…

The closest analogy I can think of would be someone selling you a house and claiming the whole house was protected with alarms in every room. The only problem is that the alarms would only work when nobody was in the house… meaning the only protection your house effectively had was a simple front door key.

The following information can be found in this article:
iPhone full disk encryption seems to have been implemented with one purpose in mind: fast/instantaneous remote wipe, since a wipe only has to erase the 256-bit encryption key.
Jonathan Zdziarski found that “the iPhone OS automatically decrypts data when a request for data is made, effectively making the encryption worthless for protecting data”.
This is where the new iOS 4.x “data protection” security feature comes into play: it allows an app to derive a key from the user’s passcode/password and encrypt the app’s data. But so far only the built-in email app does this. APIs are available, but each app needs to use them if it wants its data encrypted.
From the article referenced at the beginning of this post, I found the following two characteristics of that API the most interesting:
- Positive: there is protection against brute force attacks, as an attacker can only guess about 20 passwords per second due to how keys are generated (which compares well to other software, such as encrypted PGP files, where 900 passwords can be guessed per second).
- Negative: there is however a security weakness called the escrow keybag, which is “a collection of keys necessary to decrypt every file on the device without requiring the user’s password. This was done to allow computers to sync with the iPhone without asking the user for the password”. A company called ElcomSoft may be using this weakness in their iPhone password recovery solution.
The last point is of forensic significance. If both the iOS device and the computer used to sync it are either seized or stolen, then it is possible to find the plist/lockdown files on the computer, bypass the passcode used on the iOS device and dump all its data, unencrypted, for analysis.
This is true for the latest iPhones (3GS and 4) with the latest firmware. For older models/versions there are other, easier techniques to obtain the data.
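To make the escrow keybag problem concrete, here is a toy Python model (this is NOT Apple's real implementation; the key derivation and storage details are invented for illustration):

```python
import os
import hashlib

def derive_class_key(passcode: str, salt: bytes) -> bytes:
    # Normal unlock path: the key protecting file keys is derived from
    # the user's passcode (toy stand-in for the real derivation).
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 10_000)

salt = os.urandom(16)
class_key = derive_class_key("1234", salt)

# During the first sync, the unlocked key material is escrowed on the
# paired computer (conceptually, the plist/lockdown files mentioned above):
escrow_keybag = {"class_key": class_key}

# An attacker holding BOTH the device and the computer simply reads the
# escrowed copy and never needs to guess the passcode:
recovered = escrow_keybag["class_key"]
assert recovered == class_key
```

The design trade-off is clear: escrowing the keys makes password-free syncing convenient, but it turns the paired computer into a second copy of the device's secrets.
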
Windows 7 is not better though:
There might be some light for Android-based phones with the Motorola enterprise offering: http://www.networkworld.com/news/2010/100710-droid-pro-enterprise.html
The link below should be read with a caveat, as I don’t really agree with the author’s message, which sounds as if all those phones are very secure and enterprise ready. They can be, but companies need to be aware of the limitations of the security controls that have been implemented. He still provides a good overview of each type of mobile phone’s capabilities (page 3), hence why I am providing the link:
Finally, these guys have a nice white paper on iPhone forensics, which was updated recently, in November 2010:
VIAFORENSICS

Cellular Network Attacks

A few websites have been running a story today about an attack to be announced/demoed at next week’s Black Hat conference.

Instead of targeting the OS or a specific app, this attack would target bugs directly in a component used to send and receive calls: the baseband chip. Although technically it is still a software attack (against the code used to control that chip), it would bypass any security measures in place at the OS level and, in particular, would be outside Apple’s or Google’s control. Such an attack could be used to intercept calls or to spy on a phone user by activating the phone’s microphone…

But then surely you would also need to find a bug in the microphone chip, or escalate your privileges at the OS level from the baseband chip bug?
In any case, eavesdropping on calls would at least be possible.

What makes this news interesting is both that duplicating a cell tower is becoming easier/cheaper (about $2k) and that you cannot secure and control everything, even in closed systems such as iOS devices. Until they start manufacturing every single component themselves, phone manufacturers will have to rely on a multitude of other vendors, all with different security agendas.

Now, if I were working for a state security agency, I would invest in some key communication component companies… As hacking becomes more and more lucrative/political, how long until the “bad guys” start thinking alike? But then you would call me paranoid ;)

Below is a link to the first website where I read this story today:
MACWORLD