I mean, it's OK to hate skeuomorphs, a valid critical position, but nobody is going to die.
And the whole thing feels like Louis CK's riff about airplanes. You are riffling through entire bookstores and museums on an affordable 150dpi Star Trek pad in the bathroom while complaining that innovation has stopped because the graphics look too familiar.
Two limited defenses of skeuomorphs. One: In a world where all of tech turns over on a timescale of months, orientation is important. It may be more important that people can glance at your calendar app and tell that it's supposed to be a calendar than that the calendar app perform optimally in the hands of a trained expert.
The other is: Skeuomorphs generally mimic designs that are at least decades old, sometimes centuries old. Be cautious about casually discarding the work of tens of generations of designers in the name of neophilia.
(Book page-flipping animations are less defensible with the latter argument, but the former still applies.)
Design space is dizzyingly unconstrained, and finding an optimal design is an intractable problem. A skeuomorph (sticking to the definition in play here) has taken its initial design from a particular source, a physical object. Odds are that the optimal design does not resemble that physical object. So what? Physical objects are a legitimate source of design ideas. Starting with a highly optimized solution to a similar, more tightly constrained problem is a common and effective pattern for generating good solutions to an intractable problem.
Another way of saying "hating on skeuomorphs" is "stigmatizing skeuomorphs," and the likely outcome of that is that designers, mindful of their professional image, will tend to eschew skeuomorphs in favor of inferior non-skeuomorphic designs until the winds shift back. (Isn't that the point of hating on skeuomorphs? To influence which designs are perceived as good or bad?) All that means to users is that they will be forced to use inferior designs because of a point of fashion they aren't even aware of. Another way of saying it is that analyzing whether a design is a skeuomorph or not is like analyzing a musician's influences: it might give you a clue to whether the creator keeps up with current trends, but it won't tell you anything about whether the product is good or not.
Sure, it's legitimate to say that other sources of inspiration should be mined as well. Constraining a problem to non-skeuomorphic solutions is a legitimate creative exercise, just like constraining the problem in any other way. It's a creative exercise, though. It isn't a legitimate rule of thumb for designing products any more than "use skeuomorphs!" is a good rule of thumb. Blindly changing features to make them less skeuomorphic is likely to make the product worse, not better. How to implement features in a shipping product such as Google Calendar should be based on usability, not whether you think the designer's thought process reflects a certain heuristic, overdependence on which has historically inhibited the emergence of new and better designs. Research efforts should be criticized on that basis, not shipping products.
So: a skeuomorph is obviously not a problem if it does not cause gratuitous errors or slowdowns, and it obviously is if it does.
And in any case design also concerns matters of taste, on which opinions will naturally differ.
"In Google Calendar, pick the '4 Weeks' option at the top instead of 'Month', and it functions exactly as you describe, with the current week at the top. -benjymous"
I didn't have a 4 Weeks view - turns out there's a setting which controls what the button between Month and Agenda does, in Settings -> General -> Custom View. There's also a setting above that for which view is your default when you reload the calendar.
Example: Wednesday is April 2... Was Monday March 30 or 31? It forces me to switch my entire calendar to get this info. I want to see days in March and May on my April calendar.
For me personally, the agenda view is hard to read because my brain works on a week/month scale, not a day scale. In day view, I also like to see all my events as blocks spread throughout the day with gaps, instead of a linear list of events without gaps. It helps me visualise free time.
It works pretty well (I just remember the first two lines). I hope it saves you some time!
He's criticizing the case where the current date is April 30 and the calendar is still showing March 26, but not May 7, because you are on the April "page" still.
He spends most of the article telling us that skeuomorphs are bad, giving the example of a calendar app that looks like a desk calendar, and then finishes by praising an app with a screen that skeuomorphically flips up just like a wall calendar.
Or, in summary: mimicking because it is good design that fits what you're trying to do is good; mimicking simply for the sake of mimicking is bad.
Someone should start an open source lab that sets out to explore and test new design patterns in that space, without any preconceived notions of what should and should not work.
Unfortunately, the "old-fashioned, physical object" this design is based on is the human brain, or more precisely, the useful cultural artifacts that happen to be engrained in pretty much all the human brains on the planet. The monthly and weekly calendar reflects how we think about time. You think you can stop me from thinking in weeks and months by changing one calendar application? You'd have to work a little harder than that. Here's what you need to do:
1. Stop my company from scheduling my paychecks to coincide with the beginning and middle of each month. Stop them from organizing my work days into groups of five days in the middle of groups of seven, and stop people where I work from informally scheduling things with respect to month boundaries.
2. Do the same to the companies that employ all the people I occasionally synchronize my social schedule with.
3. Stop all the businesses I use from scheduling lessons and classes on a monthly basis and varying their hours on a weekly basis. Persuade the state government that it should be as easy to buy liquor on Sunday as on any other day. (That should be easy once you've abolished the days of the week; see 5.)
4. Detach holidays from dates. It would be so much nicer if Thanksgiving was the 329th day of the year instead of the fourth Thursday in November. That way I could forget about months and weeks, and my online calendar could make better use of screen space. (Again, this will follow easily once you've accomplished number 5.)
5. Abolish the days of the week and the months of the year. Prevent anyone from referring to Monday, Tuesday, January, etc., or to weeks and months at all, so I never have to think, "We're shipping on the first Monday of next month. How many days until then?"
After all that, I'll no longer want a calendar app that orients me with respect to weeks and months. You can get rid of your "skeuomorph," and I won't mind that my calendar app gives me no visual, non-verbal cues about what day of the week or week of the month it is.
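Incidentally, the "first Monday of next month" question in point 5 is exactly the kind of week-and-month arithmetic we lean on constantly. A throwaway sketch (the function name is mine, not from any calendar app) of what the calendar's grid is sparing us from computing by hand:

```python
import datetime

def days_until_first_monday_of_next_month(today):
    """How many days until we ship, if we ship on the first
    Monday of next month?"""
    # First day of next month, rolling the year over if needed
    if today.month == 12:
        first = datetime.date(today.year + 1, 1, 1)
    else:
        first = datetime.date(today.year, today.month + 1, 1)
    # weekday() is 0 for Monday; advance to the first Monday
    first_monday = first + datetime.timedelta(days=(7 - first.weekday()) % 7)
    return (first_monday - today).days
```

Glancing at a month grid answers this instantly; without the week/month conventions, everyone would be running this little computation in their heads.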
So, sarcasm aside, you DO need to show month boundaries. It is not helpful to propose doing away with the one convention for that without proposing any replacement for it.
Also, it is not helpful to propose doing away with the past entirely. People are oriented by the past as well as by the future. Oh, dear, it's been a week since I told Doug that information would be available soon. I had better drop him a line and explain that it's delayed. My tooth has been hurting ever since I went to the dentist; how long has that been? When's the last time I worked out?
Showing the past is a valuable function! If you don't recognize that showing the past is part of the function of the calendar, then criticizing a calendar for showing too much of the past rings a little hollow, because you aren't balancing the valuable functions of the interface. You're just picking one and throwing out another.
A better statement of the problem is that the only visual cues for month boundaries are the top and bottom of the displayed grid of days on the screen, and therefore the past-to-future ratio varies dramatically through the month. During some parts of the month, you see very little of the past, and during other parts, you see very little of the future.
A useful suggestion would be to detach month boundaries from the edges of the displayed grid and show them in some other way, perhaps by using color or shading. That way you can keep the balance of past to future close to an optimal value; perhaps the current week could be the second or third from the top. I'm not a UI designer, but I think that is a more useful analysis of the problem, even if it doesn't show off a new word I learned a few weeks ago from a magazine article.
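As a rough illustration of that suggestion (the function name and the one-week-back / three-weeks-forward split are my own assumptions, not anything from an actual calendar app), here's how such a rolling grid could be computed, pinning the current week near the top and leaving month boundaries to be drawn by shading:

```python
import datetime

def rolling_grid(today, weeks_before=1, weeks_after=3):
    """Build a week-by-day grid around `today`, ignoring month-page
    boundaries. The current week always lands at row `weeks_before`,
    so the past-to-future balance stays constant; a renderer could
    shade cells where the month changes instead of cutting the grid."""
    # Monday of the current week (weekday() is 0 for Monday)
    week_start = today - datetime.timedelta(days=today.weekday())
    first = week_start - datetime.timedelta(weeks=weeks_before)
    return [
        [first + datetime.timedelta(weeks=w, days=d) for d in range(7)]
        for w in range(weeks_before + weeks_after + 1)
    ]
```

With `weeks_before=1`, the current week is always the second row, whatever the date, which is exactly the "second or third from the top" idea above.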
Now I certainly don't think retro design is "crippling" the basic OS experience. I personally find the whole desktop metaphor silly, outdated, and not so useful for a generation that has grown up with computers and doesn't need a clock to look like a wall clock in order to understand that it's telling us the time. OSX widgets may be immature and tasteless, but they're not a hindrance.
If retro design is crippling anything, IMO, it's digital audio workstations (DAWs). Whenever I launch Cubase I have to deal with virtual mixer boards and virtual synthesizers that mimic the interfaces of 40-year-old hardware, to the point that I have to turn virtual knobs with my mouse. It's ridiculous, and more often than not it's frustrating. Now, I understand how this choice of design eases the learning curve, but I'd much rather jump over a few hurdles than run straight into a wall.
This is why programs like vim and emacs are still relevant decades after their invention. They don't insult the user's intelligence, they don't pretend to be something they're not, and they take full advantage of the platform they're designed for.
If somebody could create a DAW that adheres to this kind of philosophy, I don't care if it uses ncurses as an interface; I'd adopt it in a heartbeat.
I liked that quote. Having said that, I take the point that skeuomorphs are not exactly destroying people's minds.
Another quote I found extremely interesting was this one, from a professional designer's blog:
"It shows the care and attention paid to the printing of a photographic image, but also shows how the analogue process of printing a photograph shares a lot with the digital process of adjusting an image in Photoshop." [ my italics ]
See the quote in context at the link below
http://www.wemadethis.co.uk/blog/2012/01/shaped-by-war/
Basically, are we reaching the point where metaphors that originally made software more accessible (Photoshop like a wet darkroom) actually lose their meaning? My colleagues who teach photography often illustrate aspects of the photographic printing process using Photoshop (a reverse metaphor).
Does anyone have any academic references on the anthropology of interfaces?
What a calendar should be, if we're going to truly abolish the tragedy of skeuomorphism, is a smoothly flowing timeline, with you at the front, diminishing logarithmically into the future. This way, you can clearly see your appointments. Should make the author happy.
The point is, sometimes we need to artificially break things up into manageable pieces. We think in milestones - the beginning of a season, midnight as a landmark showing how late you're staying up. Some skeuomorphic designs are only "incidentally" skeuomorphic, in that they solve a problem the right way, and just happen to resemble how people used to solve that problem.
Magazine, how retro
Worst case is when trying to highlight, as a single mark (with a note attached), a section of text which spans more than two pages (especially when a "page" is a large-font phone-screen size): I can't tap-and-drag from one end of the section to the next because the "page" paradigm intervenes. Web page? Word processor? No problem; we're used to highlighting via scrolling. E-book? Scroll? Nope, gotta match that page-flipping animation.
I think that's why the page paradigm has persisted. It's all about familiarity.
This article demonstrates how it's almost the opposite for non-game interfaces. Instead, it's the plucky upstarts that can take the risk on novel interfaces since they don't have large numbers of users to annoy (and I bet if Google tried to get radical with Calendar now, there'd be a real storm of hate over it).
What other 'retro' tech hasn't changed much? The automobile?
Apple went over the top with the new iCal GUI, but it wasn't an interaction problem. Users have mental models from cultural experience that dictate how they want to use things. In other words, "that software feels intuitive."
The issue I suspect the author actually has with the new iCal is that it goes over the top visually. Software doesn't have to mimic the gaudy ornamentation of a real world object to resemble it.
I wish that there was an OS with the core of iOS and the appearance of WP7[1].
1. except Helvetica.
Personally I prefer the regular week view.
It's "crippling innovation"... what!? Retro design isn't an institution, or a person, or even a "thing". It's a concept that designers draw on in order to progress towards a more usable interface.
You know what "cripples" innovation - massive design changes that happen in extended batch schedules. Technically this doesn't cripple innovation, it simply decreases innovative adoption and user acceptance.
That question needs to be answered before assuming innovation is being crippled.