If you need to reach across the screen without taking up screen real estate, use your other hand. Or use a stylus. Problem solved.
Edit: Not to mention that, for the most part, iOS does a pretty good job of letting me be pretty accurate with my bulky fingers. It's not perfect, but it's certainly not terrible.
I don't pretend to have the answers to all these issues, but something tells me that smartphones can do much better than the current state of the art when it comes to game controls, pointing at a precise location, etc. And it goes without saying that this won't work with the existing mobile OSs as-is.
The only game that properly makes use of it, as far as I'm aware, is Tearaway, and even there it only appears in a few places and was kind of awkward to control properly.
It is in fact awkward to use, IMO, and not just because of the software or Sony's implementation.
There are at least two big problems with it:
One is that it is really difficult to do Wacom-pen-style hovering of a "cursor" on a capacitive surface with finger input in a way that works well universally for everyone without a lengthy and awkward calibration. And because touching the back with your fingers is an indirect method of interaction, you really need some sort of non-action hover indicator for this setup to work well.
The other is this: put your hands in the positions shown in the original article. Now try moving your index and middle fingers around as if touching the back surface of a device, and try not to move your thumbs (and wrists) all over the place involuntarily. For most people this is difficult. When you are tightly gripping a device this becomes less of a problem, but it still contributes to the whole thing feeling very uncomfortable and unstable.
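To make the first problem concrete, here is a minimal sketch of why hover detection needs per-user calibration. The signal values, thresholds, and calibration scheme are all hypothetical; real capacitive controllers are far more involved:

```python
# Hypothetical sketch of why hover detection on a capacitive rear
# panel needs per-user calibration: the raw signal for "hovering"
# vs. "touching" varies with finger size and grip, so one fixed
# pair of thresholds misclassifies some users. All numbers invented.

def classify(raw, hover_threshold, touch_threshold):
    """Map a raw capacitance reading to an interaction state."""
    if raw >= touch_threshold:
        return "touch"   # real input action
    if raw >= hover_threshold:
        return "hover"   # draw the non-action cursor indicator only
    return "none"

# One fixed pair of thresholds for everyone...
FIXED = (40, 80)

# ...fails for a light-touch user whose firmest contact only reads 72:
print(classify(72, *FIXED))  # "hover": the touch is never registered

# Per-user calibration instead places the thresholds between that
# user's own observed hover and touch readings (collecting these
# samples is exactly the lengthy, awkward calibration step).
def calibrate(hover_samples, touch_samples):
    hover_floor = min(hover_samples) * 0.8
    midpoint = (max(hover_samples) + min(touch_samples)) / 2
    return (hover_floor, midpoint)

thresholds = calibrate(hover_samples=[35, 42, 50], touch_samples=[70, 85, 90])
print(classify(72, *thresholds))  # "touch": now registered for this user
```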
I'm nearly positive that various companies like Apple must have tested something like this out (either before/after the PS Vita) for a phone and just found it to be a poor solution when implemented in a real-world prototype.
paper: http://www.cliftonforlines.com/papers/2007_wigdor_lucidtouch...
On a tangent [Tearaway Spoiler Alert]...when beginning the game, Tearaway asks you to select your skin tone from a few presets. Knowing nothing about the game at that point, I thought that was really strange. I recall thinking, "I don't really care whether my character has my skin tone...and it's really odd that they'd presume that I would." But it turns out, that's not what the skin tone selection was for. In Tearaway, you use the rear touchpad to punch your fingers through the paper backdrops of the game to manipulate things. The first time I did this and they showed "my" fingers in the game, I was startled for a split second and then laughed out loud. Screenshot of the effect: http://media.officialplaystationmagazine.co.uk/files/2013/11...
It's more magical live as your virtual in-game fingers track the position and angle (angle presumably by extrapolating based upon the current finger position and average hand size/grip) of your actual fingers surprisingly well.
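A rough guess at how that extrapolation could work (purely speculative, not Sony's actual method): if you assume an average grip, the knuckle acts like a fixed pivot behind the panel, and the finger's rendered angle is just the direction from that pivot to the contact point.

```python
import math

# Speculative sketch of finger-angle extrapolation from a single
# rear-touch contact point, assuming an average grip: the knuckle
# is treated as a fixed pivot, so the rendered finger angle is the
# direction from pivot to contact. The pivot position is invented.

ASSUMED_PIVOT = (0.0, -0.08)  # assumed knuckle location below the panel edge (metres)

def finger_angle_deg(contact_x, contact_y, pivot=ASSUMED_PIVOT):
    """Angle of the rendered finger, in degrees, from the assumed pivot."""
    dx = contact_x - pivot[0]
    dy = contact_y - pivot[1]
    return math.degrees(math.atan2(dy, dx))

print(finger_angle_deg(0.0, -0.03))   # about 90.0: finger pointing straight up
print(finger_angle_deg(0.05, -0.03))  # about 45.0: finger angled to the right
```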
The whole Notion Ink tablet debacle made me quite jaded towards miracle tech. I'm lucky it predated Kickstarter, because I would most likely have backed it at up to 50% over retail. It's probably why I don't own a 1st-gen Pebble.
Here is a good overview of the concept's rise and fall: http://www.engadget.com/tag/NotionInk/
You are right about the Apple patent. Apple bought FingerWorks in 2005 and killed all the well-loved product lines (of note, a keyboard whose entire surface was a touchpad). Apple sat on the company's IP assets and used them for nothing but suing people. ...and people still wonder why I avoid any Apple product.
Anyway, the touch screen tech eventually made it into the iPhone. Or so they say. But if you compare the FingerWorks tech in the few shipped keyboards with the iPhone's capacitive touchscreen, they have little in common. Apple was just trolling everyone with the patents and killing innovation all around.
But since FingerWorks had been dead since 2005 thanks to Apple, several other companies, with employees who had probably never even heard of FingerWorks, developed this idea: Nokia, as you mention; Sony with the PS Vita; Motorola with the Backflip (which, being one of the first AT&T-exclusive Android phones, suffered from having the worst custom Android ROM that ever saw the light of day). And more recently the Oppo N1 already has the very same implementation idea mentioned in the article, and it is in production. But you don't see anyone rushing to the stores.
I'd rather see companies trying out ideas that might work instead of just sticking to their metaphorical guns.
All of these ideas are pointing to the issues in the use of a touchscreen as input (imprecise, blocks screen during use, etc), yet for some reason they keep getting used (and even taking over regular buttons).
I hope this decision is being tested by these companies' HCI departments, but I worry that marketing has decided that changing the input would be too much of a risk (or cause fragmentation).
http://www.slashgear.com/google-patents-rear-touch-controls-...
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=...
I'd still like to see someone try it.
Based on my limited experience, some potential issues:
Not having the user's fingers obscuring the view of the screen may in fact increase the perceived latency, since they're focused on the screen and don't have the motion of their fingers to distract them.
The latency issue would be twice as bad if you want to render a 'ghost' of your hands on the screen as described in this design concept.
Interacting with onscreen elements is more difficult when using your hands on the rear of a device, even with a 1-1 mapping. I don't really know why this is, but even with a cursor onscreen, I have found it to be true.
Accidental interaction is 5 times worse with a rear touch panel. Apps on the Vita that use it extensively are a huge pain in terms of accidental swipes and touches, especially if you try to lay the device down on a surface for a moment, or set it on your knee to use the front touch panel.
The core problem with virtual buttons/joysticks/gamepads is that you have no physical feedback about where your fingers are, and as a result you lose your 'centering' and your inputs end up being misinterpreted or not landing. Moving your fingers to the rear of the device makes this worse, because you can no longer look at your fingers to figure out where they are.
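The accidental-interaction problem in particular seems like it needs aggressive filtering at the driver level. A minimal sketch of the kind of heuristics that could help, with an entirely hypothetical event shape and invented thresholds:

```python
# Hypothetical sketch of accidental-touch rejection for a rear panel.
# Heuristics a driver might use: ignore the rear panel while the
# device lies face-up on a surface, and ignore brief or light grazes
# that look like grip adjustments. All thresholds are invented.

def accept_rear_touch(duration_ms, pressure, device_face_up,
                      min_ms=60, min_pressure=0.2):
    """Return True only for contacts that look like deliberate input."""
    if device_face_up:           # resting on a table or knee: rear contact is the surface
        return False
    if duration_ms < min_ms:     # too brief: likely a graze while re-gripping
        return False
    if pressure < min_pressure:  # too light: likely an incidental brush
        return False
    return True

print(accept_rear_touch(120, 0.5, device_face_up=False))  # True: deliberate press
print(accept_rear_touch(120, 0.5, device_face_up=True))   # False: device set down
print(accept_rear_touch(20, 0.5, device_face_up=False))   # False: brief graze
```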
DOOGEE DG800 : sub $120 smartphone with dual touch : http://www.pandawill.com/doogee-valencia-dg800-smartphone-cr...
http://conversations.nokia.com/2011/11/10/nokia-gem-what-sor...
The whole thing was a touch screen.
Here's an Engadget article on the Nokia concept with a fair amount of comments:
http://www.engadget.com/2011/11/11/nokia-gem-concept-dazzles...
I've never realized this was even a problem, much less the biggest one.
Maybe in games this is more of an issue, especially where you need to leave your fingers in the same position and the design of the game does a poor job of taking this into account. But those are mostly edge cases.
http://www.patentlyapple.com/patently-apple/2013/10/apple-fi...
Furthermore, the cell phone of the future will be able to borrow any big screen in its vicinity: something like NFC pairing with the large display/computer monitor, plus built-in Apple TV (AirPlay) or Chromecast/Miracast functionality. You will also be able to borrow local keyboards for better input, without the Bluetooth hassle of setting them up. So the cafes and workplaces of the future will have wireless chargers and screens that you can borrow for your mobile device.
So you will carry your device around but borrow larger displays and keyboards. The device will be powerful enough to do your everyday computing. No need to drag a big laptop around if you don't want to.
These devices will also be user-serviceable, like Google/Motorola's Project Ara. It is simply not good for the environment to throw away a whole phone just because the display or battery is bad, or because you want to upgrade the radio components. So in the future, devices will be made to be recyclable, a trend driven by the scarcity of rare earth metals. It will simply not be acceptable to buy and throw away devices without thinking about the recycling of rare metals and the environment.
The future is bright.
Even people who grasp code pretty well can get tripped up loading a stable library that a tutorial needs, thanks to bad advice: say the library was patched for a security vulnerability in a way that broke a number of things the current examples depend on, and the failing code just misbehaves in a number of ways. One remarkable failure mode is the library blaming the examples you are following for violating security standards, without offering any alternative way to do it. That is, unless you find a reference on some obscure blog saying you have to load a specific pre-alpha version (or HEAD^3), because HEAD is broken in so many subtle ways that it would cause you even more pain. This has happened more than once, and I bet not only to me. I know I could have patched my code, but with the deadline, and the fact that I don't know much about cryptography, that would not have been a sane option.
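The sane workaround in that situation is usually to pin the last known-good release instead of tracking HEAD. A minimal sketch of what that looks like in a Python project's requirements file (the package name and version are hypothetical placeholders):

```
# requirements.txt: pin the release the tutorial's examples were
# written against, instead of whatever HEAD currently is.
# Package name and version are hypothetical placeholders.
somecryptolib==1.2.3
```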
With screens bigger than your hands and accidental touch detection being as bad as ever, a simple button to turn off the digitizer would make a lot of people happy...
Perhaps a smart watch could even have its screen on one side of your wrist, and a "touchpad" on the other.
For gaming, this would most likely be uncomfortable: users are used to playing with their thumbs rather than their index fingers.
I'd rather wait for those kinds of panel-less screens.