Dynamic Perspective SDK allows developers to add a new dimension to their UIs

Amazon is using 3D to add a new dimension to interacting with its newly unveiled Fire Phone. The Fire Phone uses all of its sensors to give users a new perspective on interactions. Four front-facing cameras, mounted at the corners of the screen, track the user's face and combine with the phone's motion data to generate the dynamic perspective effect. This allows gentle tilting to flip pages in Kindle books or to zoom in on images, depending on the direction and angle of the tilt.
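
The tilt-to-flip idea can be sketched in a few lines. This is purely illustrative pseudocode of the general technique, not Amazon's SDK: it assumes a stream of gyroscope-derived roll angles in degrees and two made-up thresholds.

```python
# Illustrative sketch (not the actual Fire Phone SDK): detecting a "tilt flip"
# gesture from a stream of roll angles, in degrees. Thresholds are assumptions.
TILT_THRESHOLD_DEG = 20.0   # assumed minimum tilt to count as a deliberate flick
RETURN_THRESHOLD_DEG = 5.0  # phone considered "level" again below this

def detect_tilt_flips(roll_samples):
    """Yield 'next_page' / 'prev_page' events for each tilt-and-return motion."""
    armed = True
    for roll in roll_samples:
        if armed and roll > TILT_THRESHOLD_DEG:
            armed = False
            yield "next_page"
        elif armed and roll < -TILT_THRESHOLD_DEG:
            armed = False
            yield "prev_page"
        elif abs(roll) < RETURN_THRESHOLD_DEG:
            armed = True  # re-arm once the phone returns to level

# A right tilt past the threshold, back to level, then a left tilt:
samples = [0, 10, 25, 12, 2, -8, -22, -4]
print(list(detect_tilt_flips(samples)))  # ['next_page', 'prev_page']
```

The re-arm step is what keeps a single sustained tilt from firing repeatedly, which is presumably why the gesture feels like a deliberate flick rather than a scroll.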

While tilt scrolling is nothing new, with the support of the added hardware Amazon could do it better than rivals like Samsung. Amazon is also leveraging tilting in games, so you don't need on-screen controls to pan around. This could benefit adventure games, first-person shooters, and other 3D titles where you look around the game world by moving the phone in the real world.

Amazon is also adding perspective to how you look at on-screen content in 3D.

The new Amazon Maps app is also laid out in 3D, and moving your hands over the screen moves interface elements out of the way, so you have unobstructed access to your mapping information. This all happens dynamically, on the fly.

To achieve this, Amazon wants to know where the user's head is in relation to the phone's display at all times, leveraging the four cameras to map what it calls dynamic perspective. To cope with darkness, sunglasses, and other changing conditions, the four cameras are paired with infrared lights, so the phone can maintain the dynamic perspective effect even in the dark.

Each of the four cameras has a 120-degree wide-angle lens, so the user's face doesn't disappear from view if the phone isn't held directly in front of it. There's also depth sensing, so the camera knows how big a head should be and can reject false positives, such as faces detected on other objects.
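
A depth-based plausibility check of this kind is simple to sketch. The numbers below are my own assumptions, not Amazon's: given a detection's angular width and its stereo-derived depth, compute its physical width and reject anything outside a typical head-width range.

```python
import math

# Illustrative sketch (assumed numbers, not Amazon's): rejecting face detections
# whose physical size is implausible for a human head, given depth.
TYPICAL_HEAD_WIDTH_M = (0.12, 0.20)  # assumed plausible range of head widths

def physical_width(angular_width_deg, depth_m):
    """Convert a detection's angular width and depth to physical width in meters."""
    return 2.0 * depth_m * math.tan(math.radians(angular_width_deg) / 2.0)

def is_plausible_head(angular_width_deg, depth_m):
    lo, hi = TYPICAL_HEAD_WIDTH_M
    return lo <= physical_width(angular_width_deg, depth_m) <= hi

# A real head ~0.35 m away subtending ~25 degrees is plausible...
print(is_plausible_head(25.0, 0.35))  # True
# ...but a face on a poster 2 m away subtending the same angle is far too big.
print(is_plausible_head(25.0, 2.0))   # False
```

This is why depth matters: a photo of a face can fool a 2D detector, but it can't have both the right angular size and the right distance at the same time.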

Fire Phone stores the tracked head position as X, Y, and Z data, and the Dynamic Perspective SDK exposes that data to developers, who can add it to existing Android titles. The SDK is available today. Hopefully, we'll see more apps that leverage this new style of user interaction.
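
To see what an app might do with that X, Y, Z head data, here's a minimal sketch of the underlying math. The function name and layer model are hypothetical, not the SDK's API: the idea is to shift each UI layer opposite the head's off-axis position, scaled by the layer's virtual depth.

```python
# Illustrative sketch (hypothetical function, not the real Dynamic Perspective
# SDK): turning a tracked head position (x, y, z in meters, relative to the
# screen center) into a parallax offset for one UI layer.
def parallax_offset(head_x, head_y, head_z, layer_depth):
    """Shift a UI layer opposite the head's off-axis position.

    layer_depth is the layer's assumed virtual depth behind the screen
    (0 = on the screen plane; larger values sit further back and shift more).
    """
    if head_z <= 0:
        return (0.0, 0.0)  # head not tracked, or behind the screen plane
    scale = layer_depth / head_z
    return (-head_x * scale, -head_y * scale)

# Head 10 cm left of center, 40 cm from the screen; background layer at depth 0.02 m:
dx, dy = parallax_offset(-0.10, 0.0, 0.40, 0.02)
print(round(dx, 4), round(dy, 4))  # 0.005 0.0
```

Applying different `layer_depth` values to different layers is what produces the impression of depth: background layers slide more than foreground ones as the head moves.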


Reader comments

13 Comments

Sounds like Amazon jumped the gun on Microsoft doing 3D gestures

Posted via Android Central App

Considering Samsung and HTC both already have had devices (for about a year now) capable of the sort of thing Microsoft has just recently announced that they're working on, it's more like Microsoft is late to the party. Again.

It's not exactly the same, as MS will be using more hardware function and not software trickery. This gets more function from hardware, with more cameras and sensors. I've seen the stuff on the S5 and was not impressed by it at all.

Posted via Android Central App

"Microsoft is able to achieve touch-less interaction by using an electric field sensing tech. This allows for 3D finger tracking as well as in-air gestures."

This is exactly how the Samsung and HTC phones do it. They extend the electric field out further from the screen and use more sensitive digitizers in order to detect how far away your finger is. No cameras or "software trickery" are involved. One of the early articles I read about what MS was promising described "swiping your hand above the screen to change pages" and "hovering over an email to get a preview," which are exactly the things that HTC and Samsung phones have been doing for over a year now.

http://www.theverge.com/2014/6/9/5792802/microsoft-3d-touch-real-motion-...

Now, there is mention of detecting orientation based on "how the phone is held," which *is* different from how Samsung currently does its "Smart Rotation." That uses the camera to detect which direction a face is oriented relative to the phone. That could be a better or worse method, depending on the user and how they normally hold their device. Everything else described, such as answering a call by holding the phone to your face, or turning on speakerphone by laying the phone down on a table, already exists in a number of Android phones.

$600+ off contract. Amazon needs to get some perspective of its own.

Posted via Android Central App

I said this on the live cast post, but I find it funny that Bezos made a point of saying "It's not gimmicky!" when talking about Dynamic Perspective. If you feel the need to explicitly say that it's not gimmicky, then you know full well that it *is* gimmicky. And, as I said before, there's nothing particularly wrong with "gimmicky" so long as it's done well enough that people will have fun with it.

Cool... I guess... but how many years did it take to get this done? Really not super impressed on any level with this phone.

Wonder if anyone will make an app that can do this "3D" stuff with any Android phone using the front-facing camera and sensors. Sure, it's not going to be as accurate, but surely a similar effect can be achieved. Those lock screens did look cool.

Posted via Android Central App on nexus 7 (2nd gen)

I've seen live wallpapers that try to get this same general effect using just the phone's accelerometer and gyroscope. Not as accurate, but you still get the same overall effect. It's a neat idea, but it is definitely a gimmick in the overall list of features of a phone.

And (before anyone goes on a rant), as I've said before, there's nothing wrong with "gimmicky." This should definitely be the most accurate way anyone has done this sort of thing so far, and opening up the API means that there will probably be some pretty cool apps that take advantage of the hardware. I just don't see it as a feature that will "attract" people to this device.