Android Central

Qeexo, a small company born out of Carnegie Mellon, has been demonstrating FingerSense, a technology that allows smartphones and tablets to tell the difference between touches with a fingertip, finger pad, nail, knuckle, stylus, or stylus eraser.

They've modified a Samsung Galaxy S3 with a special acoustic sensor, while their custom software demonstrates a wide variety of applications, including contextual menus accessed through knuckle-taps, artistic input, and gaming. Qeexo is quietly getting developers hooked up with their Android SDK, and as part of their pitch to OEMs, Qeexo boasts that FingerSense is low-latency, works in real time, and places only a small demand on power.
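For developers wondering what plugging into something like this might look like, here's a minimal sketch of a custom Android View that branches on the kind of touch it receives. To be clear, Qeexo hasn't published the FingerSense API, so the TouchTypeClassifier interface and its touch types below are invented purely for illustration; only the standard framework classes (View, MotionEvent) are real.

    // Hypothetical sketch only: Qeexo has not published the FingerSense API,
    // so TouchTypeClassifier is invented here for illustration. View and
    // MotionEvent are standard Android framework classes.
    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    public class FingerSenseAwareView extends View {

        /** Stand-in for whatever classification a FingerSense-style SDK might expose. */
        public interface TouchTypeClassifier {
            enum TouchType { FINGER_PAD, FINGERTIP, NAIL, KNUCKLE, STYLUS_TIP, STYLUS_ERASER }
            TouchType classify(MotionEvent event);
        }

        private final TouchTypeClassifier classifier;

        public FingerSenseAwareView(Context context, TouchTypeClassifier classifier) {
            super(context);
            this.classifier = classifier;
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                switch (classifier.classify(event)) {
                    case KNUCKLE:
                        // A knuckle-tap opens a contextual menu, as in Qeexo's demo.
                        showContextMenu();
                        return true;
                    case STYLUS_ERASER:
                        // An eraser-end touch could clear ink in a drawing app (app-specific).
                        return true;
                    default:
                        break; // Ordinary finger input falls through unchanged.
                }
            }
            return super.onTouchEvent(event);
        }
    }

The appeal of this pattern is that everything falls through to the normal touch handling unless a special touch type is detected, so ordinary finger taps keep behaving exactly as they do today.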

Personally, I could see this kind of touch behavior becoming very natural, and adding a whole new level of navigational depth to Android beyond the standard set of gestures we've become used to. The only problem is that it hinges on smartphone manufacturers being on board, and even then, it can take a while for them to work it into a final, shipped product. At the same time, Qeexo has to win over developers to support the input with compelling experiences that push manufacturers to adopt FingerSense - a hard sell for busy developers.

More information on FingerSense, including a finished paper on the project, is available here, while developers can learn more about getting involved at Qeexo's site. Developers, any interest? What are the odds that an OEM will pick this up? Could you guys see yourselves using knuckle-taps on a regular basis?

Via: TechNewsDaily


Reader comments

No. I'm not gonna knuckle tap to right click. Design the interface correctly so it doesn't require a right click. It should be obvious to interact with, not hidden behind whatever obscure part of your hand you put on the screen.

ETA: The fact that they're even concerned about right clicks means they're thinking in old school desktop interface ideas and trying to apply them to touch interfaces. So their very premise is broken.

I use Nova and really like long-tapping to uninstall, remove, resize, and get info. A quick knuckle tap would be great. Why limit the ways we can interact with our tech? By the standard you outline, long tap shouldn't exist. Swiping from off screen to reveal menus also wouldn't exist. That is hiding the content, no?

On an up note, I like the way you assume you have an absolute understanding of the UX that everyone wants to use.

Lol, so glad you like it. I don't think it's bad for _additional_ methods to be introduced. But if they're thinking this is gonna be some kind of primary interface, it's never gonna be adopted by mainstream users. Geeks will like it, just like how they like obscure mouse gestures in browsers. But those make no sense either and have never been widely adopted.

The fact that they're using a Galaxy Note 2 should already make it clear that just because this is a niche software design doesn't mean it can't go mainstream.

This is pretty cool. Don't know exactly how useful it would be, but it'd be nice to have the option. I'm hoping someone can come up with a way to turn the screen/phone on without any physical buttons. It'll be nice having 0 moving parts for us to break...

Not a chance. Not intuitive. Knuckles are underutilized during most everyday tasks for good reason: the act does not come naturally. Fingertips are better suited because they are better trained. If you try to change the human condition to suit your technology, you lose. If you try to change technology to suit human nature, you win big.

I think this is really cool. The more methods people throw at the wall in the current rapidly evolving environment, the more sussed out the processes and interactions become. Not saying that this method is the best way to interact, but it may be a clumsy bridge to a better interaction. I loved it when Android devices started having styli because I find just one input source (finger) limiting. I love being able to have multiple types of input for different jobs, rather than having to clumsily change the intent of your touch with a menu or a toggle each time you want to switch. All in all, I'd give it a shot; it looks promising from my point of view.