Apple App Store – Walled garden, or pit of snakes; the security flaws

Some might be familiar with the name Charlie Miller. He is a well-known software security expert, best known of late for his work on Apple products. His previous accomplishments include hacking the smart batteries in the Intel MacBook line, which were all protected by the same two passwords and could be accessed from software (good one, Apple: create a situation where some internet script kiddie could disable my battery remotely…). This time around, he has turned his eye to Apple's prized feature: the App Store.

Whatever you think of the walled garden approach Apple adopt, there is no doubt that the App Store is a commercial success (for Apple, at least; for the devs it's mostly a gambling exercise where a few make millions and the rest lose their shirts). It works well for the consumer, as Apple personally go through each submitted app, making sure it meets the standard they expect. Apparently, that inspection is supposed to cover security. However, Charlie Miller has put a chink in that assertion by releasing an app capable of receiving remote commands and putting them into effect on your device. More importantly, this app, called InstaStock and designed as a simple stock ticker, got right through the fabled verification process without a hitch.

The flaw has its roots in how Apple enforce code-signing, and in Apple's desire to speed up the phone's browser to compete with other devices. Code-signing is a technique used in all sorts of software and security; in basic terms, Apple wrap approved software in a cryptographic signature, and any software without a valid signature is refused. That is also why you can't just download some app straight onto your iPhone: it isn't signed, and therefore the phone won't run it without a jailbreak. However, to speed up JavaScript in the browser, Apple added a special exception allowing the browser to run unsigned code in an area of memory, and that exception opened a hole. Whilst Apple had protected the exception with other checks, blocking untrusted websites from using it, Miller found a way around that:

“Apple runs all these checks to make sure only the browser can use the exception,” he says. “But in this one weird little corner case, it’s possible. And then you don’t have to worry about code-signing any more at all.”
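To make that code-signing rule concrete, here is a minimal sketch of my own (not Miller's exploit, and assuming a POSIX-style environment with mmap available) of what happens when an ordinary, signed app asks the kernel for a page of memory it can both write to and execute. On a stock iOS device, code-signing enforcement is understood to refuse such a request from normal App Store apps, which is exactly why the browser's special JIT exception, and any bug that extends it to other apps, matters so much.

/*
 * A minimal sketch, not Miller's exploit: a normal, signed app asking
 * the kernel for a page of memory that is both writable and executable.
 * Under iOS code-signing enforcement this request is refused for
 * ordinary apps, which is why the browser's JIT exception (and any bug
 * that extends it) is such a big deal.
 */
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    size_t len = 4096;

    /* Ask for an anonymous page we could write code into and then jump to. */
    void *page = mmap(NULL, len, PROT_READ | PROT_WRITE | PROT_EXEC,
                      MAP_ANON | MAP_PRIVATE, -1, 0);

    if (page == MAP_FAILED) {
        /* The expected outcome for a sandboxed, signed iOS app. */
        printf("writable+executable mapping refused: %s\n", strerror(errno));
        return 1;
    }

    /* If this succeeds outside the browser's special region, unsigned
     * code can be run and the code-signing guarantee is gone. */
    printf("got a writable+executable page at %p\n", page);
    munmap(page, len);
    return 0;
}

Run the same request on a typical desktop system and it will usually succeed; the refusal is an iOS policy decision enforced by the operating system, not something inherent to the hardware.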

Miller has promised not to reveal more detail about the bug until his talk at the SyScan conference in Taiwan next week, in order to give Apple more time to fix the flaw.

Using the flaw, he got the aforementioned app placed into the store and demonstrated that it could connect to a remote machine, download instructions and execute them at will. Functions such as photos, contacts, sound, vibration and other iOS features are accessible, according to Forbes.

“Now you could have a program in the App Store like Angry Birds that can run new code on your phone that Apple never had a chance to check,” says Miller. “With this bug, you can’t be assured of anything you download from the App Store behaving nicely.”

Whilst many will point out that Android already has this kind of malicious application, Google do not purport to guarantee the safety of their Market: they encourage you to be vigilant, and they use a permissions system to tell you exactly which services and functions a program requires. Apple, on the other hand, present a model where worries over safety can be ignored, because they have checked everything and it all just works.

“Android has been like the Wild West,” says Miller. “And this bug basically reduces the security of iOS to that of Android.”

Worse, when the deception was pointed out to Apple, instead of a response along the lines of “whoa, dude, thanks. We’ll get this patched right up. Cheers for the heads-up”, the app was pulled (no big deal, obviously) and Miller was struck from the developer programme. Miller announced the news on Twitter this afternoon, saying “OMG, Apple just kicked me out of the iOS Developer program. That’s so rude!” As Apple notes in its letter to Miller, he violated sections 3.2 and 6.1 of Apple’s iOS Developer Program License Agreement, which respectively cover interfering with Apple’s software and services, and hiding features from the company when submitting them.

“I don’t think they’ve ever done this to another researcher. Then again, no researcher has ever looked into the security of their App Store. And after this, I imagine no other ones ever will,” Miller said in an e-mail to CNET. “That is the really bad news from their decision.”

The real shame in all this is that Apple’s walled garden gives its users a totally false sense of security. Whilst, for the App Store, the Android Market and any other app store, 99% of apps will be genuine and safe, you can never be 100% sure. Users should be taking their own precautions, and should not be lulled into complacency. Apple’s insistence on an ‘it just works’ approach creates an expectation: that when Apple assert an app is safe (by publishing it on their store), it must be.

In computer terms, you’d call the Apple model gateway security: you secure the entrance, and therefore anything that gets inside must be safe. Unfortunately, that leaves one big, central point of failure: the gateway. And any knowledgeable computer user knows it isn’t enough just to use the firewall on your router; you need antivirus and firewall protection on the PCs too.

And a final observation: if some nice, white-hat hacker finds a flaw and tells you about it for free, ‘thanks’ will go down much better than a swift kicking. I know you have an image to maintain, Apple, and you can’t allow people to lose confidence in your garden, but at least give him some credit.