According to the folks at Wired, the walls of the App Store have been breached as developers of the latest version of Lyrics (1.0.1) have included an Easter egg in their iPhone app.
The Easter egg allows users to turn the profanity filter off and thus view the objectionable and explicit lyric content. We have seen Easter eggs in iPhone apps before, but this one was a little sneaky: the app was initially rejected from the App Store because it didn't have a ‘profanity filter’.
Easter eggs are hidden features, messages, images, etc. that can only be seen if the user knows the required commands or actions.
In the case of the Lyrics iPhone app, the profanity filter can be turned off by scrolling to the bottom of the "About" screen three times. Once done, a filter button appears with on/off options. Turn it off, and the app works flawlessly, with lyrics unfiltered.
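For the curious, a trigger like this is simple to build. Here is a minimal, hypothetical sketch in Swift of a "scroll to the bottom three times" unlock — this is not the Lyrics app's actual source, just an illustration of how little code such an Easter egg needs:

```swift
// Hypothetical sketch: reveal a hidden setting after the user
// scrolls a screen to its bottom three times.
// Not the Lyrics app's real implementation.
struct EasterEggUnlock {
    private var bottomHits = 0
    private let requiredHits = 3

    /// True once the hidden filter toggle should be revealed.
    var isUnlocked: Bool { bottomHits >= requiredHits }

    /// Call each time the user scrolls the About screen to its bottom
    /// (e.g. from a scroll-view delegate callback).
    mutating func registerScrollToBottom() {
        bottomHits += 1
    }
}

var unlock = EasterEggUnlock()
for _ in 1...3 { unlock.registerScrollToBottom() }
print(unlock.isUnlocked)  // prints "true" — the toggle would now appear
```

In a real app, the counter would be bumped from the scroll view's delegate when the content offset reaches the bottom; nothing about the state lives anywhere Apple's reviewers could spot without the source code, which is exactly the reviewers' problem.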
According to Jelle Prins, developer of Lyrics:
“It’s almost impossible for Apple to see if there’s an Easter egg because they can’t really see the source code,” Prins said. “In theory a developer could make a simple Easter egg in their app and provide a user with whatever content they want.”
We can probably give the developers the benefit of the doubt on this one, as the Easter egg had no malicious intent and the idea was to give users a choice. Ideally, Apple shouldn't have rejected the app in the first place; rejecting apps because they show dirty words is quite lame, and it will hopefully get sorted out with the age-based parental controls in iPhone OS 3.0. Having said that, I think it was quite sneaky of the developers to include an Easter egg that turns off the very feature Apple had explicitly required before it would approve the app.
The iPhone app is currently not available on the App Store. The developer has confirmed that it wasn't Apple who removed it; they took it offline temporarily for the following reason:
"The Lyrics application will be temporary offline while we make the necessary changes to our filter (will be much better, Apple will love it!) and take care of the licensing issues."
This incident raises the concern that a developer with malicious intent could easily use an Easter egg to violate users' privacy by accessing the camera, address book, etc. Nullriver CEO Adam Dann points out:
“If people start putting in naked pictures of their ex-girlfriend as an Easter egg to get revenge, or something like that, that isn’t quite right,” Dann said. “It has the potential to really mess things up for everybody.”
Jonathan Zdziarski, iPhone hacking expert and author of the book iPhone Forensics, gives some examples:
"An audio app with a malicious Easter egg could potentially allow a developer to record a user’s conversations without him or her knowing about it. And a harmful photo app could snap photos with your camera even when a user is not pressing the shutter button. Third, a malicious app could steal your address book contacts."
This again raises questions about Apple's approval process for iPhone apps. If Apple wants to prevent such potential time bombs from exploding and avoid a PR nightmare, it will have to closely scrutinize each and every iPhone app, including its source code. But with more than 46,000 apps on the App Store and submissions growing at a rapid pace, that seems like a daunting task, and it would further delay app approvals.
Do let us know what you think in our comments section below.