(Originally published on Quora on September 20th, 2013)
A lot has been written about skeuomorphism, the idea that digital objects should look like their physical counterparts. Speculation ran wild from the day that Jony Ive, Apple’s former head of industrial design, took the helm of software design as well, after Scott Forstall was let go.
Steve Jobs, the rumors said, was an advocate of skeuomorphism, and so was Forstall. But Ive was a famous objector, and had many supporters within the Apple design team. On its face, skeuomorphism sounds ridiculous: why must a digital calendar, a thing far more complex and flexible than a paper calendar, be made to look like its inferior progenitor? Why must we have leather, and stitches, and yellowing paper textures in our modern, digital note-taking apps? In the early days of mobile computing, these things may have made some sense, to help train a generation of new users. But are they still necessary today?
There is a lot to like about the recent release of iOS 7: new features, and new design elements that make users’ lives easier. But one thing that seems evident is that Apple has taken the backlash against skeuomorphism a couple of steps too far. iOS 7 does away not only with skeuomorphism, but with a lot of physicality per se. Buttons no longer look like you can press them. Icons are flat and cast no shadow. Toolbars are flat, and practically indistinguishable from the content below, and folders no longer look like containers. It’s true that some new physical behaviors have been added, like the frosted glass effect in the notification center, or the great new way of flipping through open apps. But for the most part, elements have become more abstract, less substantial, harder to separate from their background, and therefore less “real”.
Physicality, the idea that digital objects should have physical characteristics so that they can be interacted with physically, is older than even the Graphical User Interface. Text cursors occupied physical positions on the screen. Then graphics came, and with them the concepts of background and foreground, and various windows and menus that could be layered on top of each other, like physical objects. Drag and drop was introduced to mimic the way we manipulate objects in real life, and as soon as our screens were good enough, toolbars and buttons started employing shading and color to mimic volume and three-dimensionality, and to help users quickly make sense of a screen as if it were a topographical model.
When touch interfaces hit the mainstream with the iPhone, a new level of physicality was reached. Icons were made three-dimensional, to convey that they are for tapping. Toolbars were shaded and gradated to give them the illusion of volume so that they were easy to separate from the content; alert dialogs had their own texture and lighting, and cast a shadow on the content below. Buttons were made in various forms: as protruding elements on the toolbar, as an integral part of the content, or as big, puffy overlays on top of the content to stress an important but temporary selection. All digital objects employed shading, depth perception, volume, texture, three-dimensionality, reflectivity, and sometimes even weight to give the user the illusion that they are physical, and thus to subconsciously clue him in on how to manipulate them.
But a lot of this physicality has disappeared with iOS 7. Why? Did the Apple design team simply decide that it had no use for it? Did they have brilliant new ideas that made the old ideas obsolete? Having played with iOS 7 for a day now, and knowing what I know about interface design, I find that hard to believe.
Physicality is one of the most powerful tools in the UX designer’s arsenal. It is the primary way for a designer to answer the following questions for the user, subconsciously and instantly:
- Where does this object end and another begin?
- Will it do anything when I tap it, or is it informational only?
- Will it scroll away with the rest of the page, or will it stay?
- Is it a permanent feature of this screen, or a temporary occurrence?
- What should I pay attention to right now?
- What part of the screen is normal app content, and what part is an alert intruding from the outside?
- What is part of the content and what is part of the controls and navigation?
In the absence of physicality, which subconsciously answers all of these questions, the user must resort to higher-level thinking, and to memory (if he has previously used the app). The designer who gives up physicality, on the other hand, is forced to invent new conventions in the form of color coding, layout conventions, and wording conventions, which the user must memorize in order to use the app. In other words, rejecting physicality is rejecting one of the principal ways in which a designer can connect an entirely new piece of software to a universe of pre-existing experiences and natural laws that the user has already internalized.
But is the attack on skeuomorphism per se even warranted? Skeuomorphism may seem dishonest, a fake digital version of a physical product. But so is every graphical interface, or even the basic notion that your computer is displaying objects that can be manipulated. Are those really characters you’re reading on the screen? Or are they a lie, produced by arranging combinations of little lights together? Do web pages really exist, or are they mere electrical processes running through wires, designed to fool us into seeing words and images?
Software is a cognitive creation, much like a novel. And just like objects in a novel, digital objects should look and feel like whatever it is that helps human beings understand and interact with them. They have no other purpose that can determine a more “honest” form. Viewed from that perspective, skeuomorphism isn’t a lie: it’s a metaphor. A metaphor that helps us interact with an object that has no independent existence, and no independent qualities. It’s not as critical as physicality but it is, sometimes, a good idea.
Physicality, on the other hand, is at the very heart of computing. It is the future of computing. It’s important that Apple does not throw this baby out with some of the cheesier skeuomorphic bathwater.