I will try to put aside my feelings as much as I can in this piece. Having studied multi-touch technology for my diploma thesis and being an Apple user for several years, I am bound to be a bit biased, so let's get the obvious out of the way.
I believe Apple has sought to redefine computer interfaces by means of multitouch technology. Its work started with the iPhone, which it iterated on three times, both in software and in hardware. How did it do that? A good product, a large developer ecosystem, and the App Store for easy customer consumption of software. Many people believe, with justified reasoning, that Apple imposed the App Store as a gatekeeper for developer money and platform control.
In light of the recently revealed tablet, another idea cropped up in my mind. Some of the tech pundits said that the tablet was the thing Apple set out to build in the first place. After seeing the deluge of applications and software experiments that flooded the App Store, I think the iPhone served as an excellent test bed for multitouch applications. 120,000 apps? Three billion downloads? Apple turned the App Store and the iPhone into a giant usability test. Apple itself built very, very little software for the iPhone, purposefully so. It makes the simplest thing that works, then polishes the hell out of it. Compared to the 120,000 third-party apps, Apple's code is just a blip. Developers and users all took part willingly in this giant test, while Apple observed, took notes and made a fair bit of cash.
Of course, the iPhone is a cool device; it was marketed well, and a ton of logistics lay behind its financial success, but I see all that as a secondary driver of Apple's intentions. David Bowie once said that every artist needs to release, every once in a while, an album with a marketable sound, so that he can then write three more that truly express his creativity.
I believe the iPhone was one of Apple's commercial albums. It will soon become a commodity, its price driven lower year after year, while the tablet will be the start of a series of masterpieces. In fact, I will liken this first iteration to an early sketch that a master makes before taking on the masterpiece; a study, if you will.
If you were to say, "I will redefine the way computing is done and introduce a completely new way of interacting", how would you go about it? Would you develop and refine UI guidelines and samples, conduct tests in secret for 20 years, and then release the result as the perfect way of operating computers by touch? Of course not. Market reality, coupled with society's push towards commercial success, prevents such things from happening.
How would you make sure that others follow your lead? How would you get developers to shift their mindset towards creating user interfaces centered on touch? Sure enough, some developers had been waiting for years to create such interfaces, with great attention to detail and high esteem for user-centered design. On the whole, however, software has become a messy affair. Quality tends to get sacrificed in favor of features and time to market; marketers and project managers drive products instead of designers and engineers. I have seen it many times first-hand.
When the iPhone SDK was introduced, developers complained that their favorite language was not allowed on the platform, that their toolkit was not used, or that other favorite toys were missing. There were harsh criticisms of the App Store, mostly related to Apple not allowing this or that feature for developers. Some developers were quite vocal about their displeasure with Apple, even going so far as to threaten to leave the OS X ecosystem for good. Flash, Java and open-source supporters all cried out that Apple was playing a monopoly card and that Steve Jobs was a control freak.
Apple mostly ignored these complaints and provided some excellent APIs in the form of Cocoa Touch, along with a lot of ready-made controls for developers. It took a while to create them, but their quality ensured that many developers would use them. Some happily, some grudgingly, they took to Xcode and started building interfaces that are not only usable, but simple and functional, and that stay close to the look and feel of iPhone OS, ensuring users' familiarity with the way iPhone applications work.
Two years and 75 million devices later, proficient developers and consumers alike have declared: "whining will only get you so far; good stuff will get you our cash".
In the past two years, Apple has taken its time with releasing new features. Copy-and-paste and other missing features became a running joke among tech writers and blogs. The App Store was seen (and still is, in my opinion) as a walled garden. Many have seen greed and a need for control in this. I think there is another point to it: Apple constrained developers to think harder about touch-based input and about how to design quality apps around this new way of interacting with a device. Apple set the example with the few applications it released on the device, and then, slowly, with the approved ones. Hence the big usability test I was talking about earlier.
Multitouch, like the mouse before it, is definitely not new. [Sidenote: in an upcoming article about the history of user interfaces, I will detail this subject further.] The public at large, however, was largely unaware of it until the release of the iPhone. Many developers were in the same boat. Software for mobiles was, for the most part, an attempt at shrinking existing desktop interfaces onto tiny mobile screens. Scrollbars, windows and tiny buttons made their way into the devices, conjuring up the stylus as a way of "manipulating" the interface. The iPhone was the first "smart" phone to do away with all of that. Almost everyone who uses one, even temporarily, praises its ease of use.
The elephant in the room, however, was the lack of multitasking. We've become so used to it in computing, and the iPhone does so many things akin to a "real" computer, that it wasn't long before all the techies were wondering when Apple would "get to it" and flip the switch, enabling the much-needed multitasking we so dearly love.
Having jailbroken my iPhone, I wasn't bothered too much by this absence. It is a multitasking OS; it's running Unix, for crying out loud. Still, a little worry inside my head was saying, "Apple is being sort of mean, not allowing this for third-party apps". For some time, my concerns were alleviated by Apple's defence: "it's a phone, you don't really need multitasking. Besides, running many things in the background will only drain your battery too quickly".
And then came January 27. The device that had been hyped for years was finally unveiled by Steve in typical Apple fashion. To say that the reveal was not polarizing would be like ignoring the proximity of ice aboard the Titanic. My good friend and colleague Rares promptly sent me a funny take on the iPad launch commercial from College Humor. Numerous pundits started pouring out vitriol towards Apple and the new device, as if its failure would doom mankind. No camera! It's just a big iPod! Where's the innovation? Couldn't you pick a name that sounds less like a feminine product? Many bloggers took to critique, with the notable exception of a few people who were actually there.
I will admit, I was pretty upset about the lack of certain features. As a multitouch enthusiast, I loved my iPhone, but I was expecting a "real multitouch computer" from Apple any day now: one that would allow me to develop freely on the device, one as close as possible to the computers we have now, one that would finally bring us a large multitouch surface on a modern operating system.
Sigh. The iPad is clearly none of those things. I was a bit confused at first about the "why". I had heard from a few sources that Apple was working on iPhone OS 4.0, which would finally enable multitasking (although only on select devices). Surely a bigger, more capable device can handle that? And if that device is indeed more capable, then surely the App Store is not needed anymore?
But that is not the device Apple chose to make. Apple decided to take a risk. What is that risk? Building a multitouch, consumer-oriented operating system. I believe the iPad is the "grandma" computer.
Fraser Speirs put it best:
The people whose backs have been broken under the weight of technological complexity and failure immediately understand what's happening here. Those of us who patiently, day after day, explain to a child or colleague that the reason there's no Print item in the File menu is because, although the Pages document is filling the screen, Finder is actually the frontmost application and it doesn't have any windows open, understand what's happening here.
Remember the large usability test with the iPhone I mentioned earlier? It turns out 80 million people or so are doing just fine without multitasking on their phones. I recalled a few links, and an actual study, that didn't speak too highly of our habit of juggling tasks. Thinking more about it, I believe we don't really multitask at the computer, since we actually focus our attention on one thing at a time. It is true, however, that our time is segmented more and more as we answer an IM, receive an email, take a Skype call. There's still no consensus on whether this is beneficial or just illusory, since we get the impression that we are getting more things done at the same time.
That said, our brain is a massively parallel, mean multitasking machine. What do we multitask? Our movements, our perceptions, our internal activities, our alerts (we're hungry, we're sleepy, and so on) are all managed by the brain without any of us spending time concentrating on them. That's plenty to manage on its own while we do the most mundane of tasks. I'll focus on the perceptual, or sensory, tasks our brain performs. More specifically, listening to music.
Whenever I see the complaint about the absence of multitasking, I hear of Pandora, Last.fm, or whatever other third-party music application users want streaming in the background. Which, if you really think about it, makes sense. Listening to music is, after all, one of the conscious tasks that our brain juggles pretty well alongside others in a low-stress situation.
The other type of multitasking I've thought about is the reference-creation type of activity. We do these every day, and have done since before computers were around. We write an essay with an open book in front of us. We paint a landscape on the canvas next to us. We write a document while surfing the web, and we write code while looking at the documentation. Sure, you can multiply the number of open documents and web pages (I should know; I keep up to 30 tabs open at a time), but the basic action stays the same, reduced to two items: look something up in the reference, create something at the destination. An Alt/Command-Tab away, information is at our fingertips. With two monitors, this setup is quite nice to work with. On the iPhone, however, where memory and display size come at a premium, you cannot (yet) afford to keep two applications open. Still, I will venture to say it comes very close to the "Alt-Tab" way of working. You pop up Safari, open a website, hit the Home button, open the email or notepad app, with or without a copy/paste operation in between, then another tap of the Home button and you're back where you were in Safari.

"You do realize, Mircea, that the app is removed from memory when you hit the Home button! This is most definitely not multitasking!" I would argue that, at this point, it is just a technicality. iPhone apps with well-implemented session saving pick up from where you left them. As an added benefit, you can even turn the device off, and the next day they will again pick up exactly where you left them, which you cannot say about too many of their desktop counterparts.
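The session-saving idea boils down to very little code. Here is a toy sketch in Python, purely for illustration; the class, the state file and the method names are my own invention, not Apple's API:

```python
import json
from pathlib import Path

STATE_FILE = Path("session_state.json")  # hypothetical storage location

class NotesApp:
    """Toy app that persists its state on 'exit' and restores it on launch."""

    def __init__(self):
        # On launch, pick up exactly where the user left off.
        if STATE_FILE.exists():
            self.state = json.loads(STATE_FILE.read_text())
        else:
            self.state = {"open_note": None, "scroll_position": 0}

    def on_home_button(self):
        # The app is removed from memory, but its session survives on disk.
        STATE_FILE.write_text(json.dumps(self.state))

app = NotesApp()
app.state["open_note"] = "groceries"
app.state["scroll_position"] = 42
app.on_home_button()          # user taps Home; the app "exits"

relaunched = NotesApp()       # next day, the app launches again
print(relaunched.state)       # same note, same scroll position
```

The point is that, to the user, a fast relaunch plus a faithful restore is indistinguishable from the app having been running all along.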
Whereas multitasking was once seen as the benefit of keeping several apps in memory and avoiding a lengthy start of said software, "multitasking" on the iPhone requires that an app exit from memory precisely so that the next one loads faster. And fast is the key word here. At the base level, CPUs still execute one instruction at a time, and multitasking is an "illusion" given to us by the speed with which the CPU manages to interleave several operations (and yes, I know about multi-core/parallel processing; I'll talk about that in a future post).
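That illusion is easy to demonstrate. Below is a toy round-robin scheduler in Python, a sketch of the general idea rather than how any real OS kernel works: a single loop plays the part of the CPU, running exactly one step of one task per turn, yet the interleaved trace looks like the tasks ran "at the same time":

```python
def task(name, steps):
    """A toy task that performs a fixed number of small steps."""
    for i in range(steps):
        yield f"{name}: step {i}"    # one 'instruction' per time slice

def scheduler(tasks):
    """Round-robin: give each runnable task one time slice, in turn."""
    trace = []
    while tasks:
        current = tasks.pop(0)       # take the next runnable task
        try:
            trace.append(next(current))
            tasks.append(current)    # re-queue it for another slice
        except StopIteration:
            pass                     # task finished; drop it
    return trace

trace = scheduler([task("email", 2), task("music", 2)])
print(trace)
# → ['email: step 0', 'music: step 0', 'email: step 1', 'music: step 1']
```

Run the slices fast enough and the user perceives "email" and "music" as concurrent, even though only one instruction stream ever executes at once.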
The third kind of multitasking that we use every day is manifested through notifications. Email, IM, system updates, browser updates. For this part, I'm still on the fence. I like them while I'm browsing the web. I'm not particularly fond of them when reading a book. I'll have to explore this in more detail. What do you think?
Apple took a risk and pulled a Nintendo with the iPad. The Nintendo Wii's story is much like the iPad's. Launched in the same generation as its competitors, the Xbox 360 and the PS3, it was scarce on features: no high-definition output, limited storage, poor graphics capabilities. Some even said it was "two GameCubes (previous-generation hardware) duct-taped together", and name jokes were in abundance. Sounds familiar? Yet Nintendo innovated in a key aspect: the controller, the way you interface with a console. By adding a few accelerometers, Nintendo managed a one-to-one mapping of human arm motion to the action reflected inside a game. All of a sudden, people who until then had shown no passion for gaming started playing tennis, bowling and a host of other games. Nintendo focused on simple games, simple graphics, simple motion controls. And they nailed it. The Wii is the number one console this generation, despite the hordes of gamers who predicted its demise. It opened gaming to an entirely new demographic, while Nintendo blissfully ignored complaints and raked in the cash. Now, three years later, both Sony and Microsoft will launch their responses this fall: the PlayStation Motion Controller and Project Natal.
Will the same pattern repeat with the iPad? Will Apple, by focusing on ease of use, natural mapping of inputs and speedy, responsive interfaces, open up computing to a new segment of users? Will I have to buy one for my mother so that I don't spend half an hour guiding her through a four-click browser update?
Time will tell. Apple has taken on the risky business of migrating us away from the user interface paradigm we've been used to for 30 years, and it will be a long journey to get everyone on board.
That is, of course, unless the iPad is doomed to failure and all these words were spent on nothing.
I'll give Apple until the third iteration to fix the shortcomings and bring multitouch to an OS oriented towards professional applications; the one that we, the techies, have been waiting for all along.
Don't disappoint me, damnit.