On the day I got my iPhone I submitted a bug report to Apple. It wasn’t truly a bug, but I didn’t know of a better way to express my disappointment at the absence of a software development kit for the iPhone. The device just seemed too unique not to be able to create applications for it. Perhaps a bug report was a bit of a low blow, but I never expected I'd hear anything back. However, the day after Apple announced they were going to release an iPhone dev kit in February of '08, I got an email in response to my "bug." Now, this email was identical to what Apple posted in the "Hot News" portion of their Web site, and while I'd seen it before on many of the Apple news sites, this time I actually read it. One big section stood out in particular:
“It will take until February to release an SDK because we’re trying to do two diametrically opposed things at once—provide an advanced and open platform to developers while at the same time protect iPhone users from viruses, malware, privacy attacks, etc. This is no easy task. Some claim that viruses and malware are not a problem on mobile phones—this is simply not true. There have been serious viruses on other mobile phones already, including some that silently spread from phone to phone over the cell network. As our phones become more powerful, these malicious programs will become more dangerous. And since the iPhone is the most advanced phone ever, it will be a highly visible target.
Some companies are already taking action. Nokia, for example, is not allowing any applications to be loaded onto some of their newest phones unless they have a digital signature that can be traced back to a known developer. While this makes such a phone less than “totally open,” we believe it is a step in the right direction. We are working on an advanced system which will offer developers broad access to natively program the iPhone’s amazing software platform while at the same time protecting users from malicious programs.”
Now, I'm the known Apple fanboy around my office, but as a security analyst I've taken a more measured, if not skeptical, approach to Apple. This posting, however, was quite a statement: it showed some amazing insight on Apple's part and made me – at least from a security analyst's perspective – take notice. Reading the text, a number of things caught my attention:
“...we’re trying to do two diametrically opposed things at once—provide an advanced and open platform to developers while at the same time protect iPhone users...”
This was the first line to really strike me, because it was something my father (an armchair security analyst) and I had been discussing only an hour before. Security and usability – that is, usability for both the developer and the end user – are always at odds with each other. If the first step to recovery is admitting you have a problem, then this is step number one for anyone interested in making a good product more secure. By recognizing the need to help developers write secure mobile code, and by building in features that support it, Apple is making an effort to protect users by aiding those developers.
“There have been serious viruses on other mobile phones already...”
“As our phones become more powerful, these malicious programs will become more dangerous.”
Both of these points are very true yet rarely acknowledged by major mobile providers or carriers. For example, I once had a gentleman in tech support at a mobile provider tell me that cell phone malcode is impossible. Strange, then, that we have already seen SymbOS.Cabir, SymbOS.Skulls, WCE.Duts, and SymbOS.Commwarrior, just to name a few. As Apple states, today's mobile phones demonstrate computing power and functionality equal to, if not greater than, desktops from five years ago. Most smartphones come out of the box sporting email clients, Web browsers, and document readers – all running on the same TCP/IP networks as desk-bound workstations. Gone are the days of cell phones that simply made phone calls. With all of these new capabilities comes an attack surface on par with most traditional operating systems – one subject to its own unique constraints, though: smaller user interfaces, slower processors, and less spacious storage.
“We are working on an advanced system which will offer developers broad access to natively program the iPhone’s amazing software platform while at the same time protecting users from malicious programs.”
In my eyes this is the most intriguing line of the announcement. Code signing solutions have been proposed before and have had limited success at best, especially in open-source projects. Too many certificates end up floating around, and it's too easy for one of them to be used to sign a rootkit or an exploit of some kind. Ollie Whitehouse had a great analysis of this in his blog post “Driver Signing on Vista 64-bit – Using the Process against Itself.” The only recent system I've seen that's been truly revolutionary in using software signing, among other technologies, to protect the user from inadvertently installing malware is Bitfrost, created by Ivan Krstić for the OLPC project. So, it will be interesting to see Apple's answer to this difficult and complicated question.
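To make the trade-off concrete, here is a toy sketch of the signing-and-verification flow that schemes like the one Apple describes rely on. This is not Apple's actual mechanism: real code signing uses public-key certificates chained to a trusted authority, while this sketch substitutes an HMAC key for a developer credential so it stays in the Python standard library. All names here are hypothetical.

```python
import hashlib
import hmac

def sign(app_binary: bytes, dev_key: bytes) -> bytes:
    """Developer signs an app with their credential (HMAC stands in
    for a real public-key signature)."""
    return hmac.new(dev_key, app_binary, hashlib.sha256).digest()

def verify(app_binary: bytes, signature: bytes, trusted_keys: list) -> bool:
    """Device installs the app only if some trusted credential signed it."""
    return any(
        hmac.compare_digest(
            hmac.new(key, app_binary, hashlib.sha256).digest(), signature
        )
        for key in trusted_keys
    )

trusted = [b"known-developer-key"]
app = b"legitimate app bytes"
sig = sign(app, b"known-developer-key")

assert verify(app, sig, trusted)            # traceable developer: accepted
assert not verify(b"malware", sig, trusted) # unsigned/tampered code: rejected

# The weakness noted above: once any trusted credential leaks,
# malware can be signed and verifies cleanly.
leaked_sig = sign(b"malware", b"known-developer-key")
assert verify(b"malware", leaked_sig, trusted)
```

The last two lines are the whole problem in miniature: verification only establishes that *some* trusted credential signed the code, so the scheme is only as strong as the weakest key in circulation – which is exactly why "too many certificates floating around" undermines it.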
All in all, this email speaks volumes about Apple's acknowledgement of the importance and difficulty of mobile security, and their willingness to raise the bar above the status quo. So, I give a "tip of the hat" to Mr. Jobs, the iPhone dev team, and Apple's security team. You have a difficult challenge ahead of you, but I'm glad you're stepping up to the plate. Here's hoping you hit a home run.
Message Edited by Scott Roberts on 03-06-2008 09:03 AM