Google's Project Soli chip powers Motion Sense on the Pixel 4, and we expect to see it branch out into some other functions and features as more time goes by. With the arrival of the Pixel 4 we've seen a lot of talk about what it can do, but not as much about how it does it.
Soli is actually simpler than you might think. The science behind it can look and feel a bit like magic, but Soli uses proven methods to capture fine motor movement. The biggest hurdle was packing everything into a small, power-friendly package. In a nutshell, Soli uses millimeter-wave radar to detect motion at the micron level (one-millionth of a meter) and pass that data off to software.
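To give a sense of why millimeter-wave radar can resolve such tiny motions, here's a rough back-of-the-envelope sketch. The carrier frequency and the phase-based approach below are assumptions for illustration (Soli operates somewhere in the 57-64GHz band Google was granted; the exact processing pipeline isn't public), but the physics is standard: a target moving by a distance d shifts the round-trip phase of the reflection by 4πd/λ, so displacements far smaller than the wavelength are measurable.

```python
import math

C = 299_792_458             # speed of light, m/s
F_CARRIER = 60e9            # assumed 60 GHz carrier (illustrative)
WAVELENGTH = C / F_CARRIER  # ~5 mm

def displacement_from_phase(delta_phase_rad: float) -> float:
    """Target displacement implied by a round-trip phase change.

    Moving the target by d changes the two-way path length by 2d,
    which shifts the phase by (2 * pi) * (2d / wavelength).
    """
    return delta_phase_rad * WAVELENGTH / (4 * math.pi)

# Even a 1-degree phase change corresponds to only ~7 microns of motion,
# which is how a 5 mm wave can track micron-scale movement.
d = displacement_from_phase(math.radians(1))
```

So although the wave itself is millimeters long, phase comparison is what buys the micron-level sensitivity the article describes.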
You might have heard the term millimeter wave before when people talk about 5G. Millimeter wave describes radio signals with wavelengths of roughly one to ten millimeters: shorter than conventional microwaves, but much longer than infrared light. Soli isn't going to interfere with 5G or meteorology, but Google did need FCC approval to operate it in the 57-64GHz band, which is why Motion Sense is only available in a few countries.
More: Will 5G impact weather forecasting? A UN conference aims to find out
Radar is a system that can detect objects using radio waves. It can detect things like the range (distance), angle, and speed of anything in its path. We're familiar with how radar is used to detect rain, but radar is good at detecting anything solid.
Soli has both a transmitter and a receiver on its chip. The transmitter sends out a modulated radio wave, meaning a "normal" carrier wave is combined with a second signal that encodes extra data. When this wave hits an object, it scatters in many different directions, including right back at the Soli chip's receiver.
Because the original radio wave was modulated with extra data, three things can be measured by comparing the reflection against the transmitted signal: the delay, the frequency shift (also known as the Doppler shift, caused by motion), and the amplitude attenuation (how much of the transmitted energy made it back). Together, these measurements tell Soli about what it is "seeing."
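The two headline measurements, delay and Doppler shift, map directly to distance and speed with textbook radar formulas. This is a minimal sketch, assuming a 60GHz carrier for illustration (the exact frequency Soli uses within its approved band is not public):

```python
C = 299_792_458   # speed of light, m/s
F_CARRIER = 60e9  # assumed carrier frequency, Hz (illustrative)

def range_from_delay(delay_s: float) -> float:
    """Distance to the target from the round-trip delay.

    The wave travels out and back, so divide the total path by two.
    """
    return C * delay_s / 2

def velocity_from_doppler(doppler_shift_hz: float) -> float:
    """Radial speed of the target from the Doppler shift.

    A target moving at v shifts the reflected frequency by 2*v*f/c.
    """
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A reflection arriving 2 nanoseconds late comes from ~30 cm away:
r = range_from_delay(2e-9)
# A 400 Hz Doppler shift at 60 GHz means the target moves at ~1 m/s:
v = velocity_from_doppler(400)
```

Attenuation is harder to reduce to one formula, since it depends on the target's size, shape, and material, which is exactly why it carries so much extra information.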
The data collected can tell a lot about what is in the path of the radio wave. By measuring these reflected waves, Soli can determine the distance, speed, size, shape, smoothness, and more. It can even determine what an object is made of and the exact angle at which it is oriented.
This data is a gold mine when it comes to wireless, touchless input. While Soli gathers data about the shape and size of any object it sees, the more important measurements are motion, range, and velocity. Since Soli can measure differences as small as a micron, it's precise enough to accurately detect and track even the tiniest motions of a hand.
This data is then handed over to software. As long as the data maintains a consistent pattern, it can be used the same way touch-screen input is. Software can use it to control something like a music player, or to trigger a more complex feature like Face Unlock.
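A toy example of that last step: turning a raw stream of motion measurements into a discrete event, much like touch software turns a stream of coordinates into a "tap" or a "flick." This is purely an illustrative sketch, not Google's actual gesture pipeline or any public Soli API; the velocity samples and threshold below are assumptions.

```python
from typing import Optional, Sequence

def detect_swipe(velocities: Sequence[float],
                 threshold: float = 0.5) -> Optional[str]:
    """Classify a burst of radial-velocity samples (m/s) as a gesture.

    Positive velocity = hand moving away from the sensor,
    negative = moving toward it. A consistent average beyond the
    threshold counts as a swipe; anything else is ignored as noise.
    """
    if not velocities:
        return None
    mean_v = sum(velocities) / len(velocities)
    if mean_v > threshold:
        return "away"
    if mean_v < -threshold:
        return "toward"
    return None
```

Real gesture recognition is far more involved (Google has described using machine-learned models over range-Doppler maps), but the principle is the same: consistent patterns in the radar data get mapped to the same kinds of events a touch screen produces.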
Soli is capable of a lot more than we've seen so far. Using it to make Face Unlock quick and seamless is pretty cool, but more importantly, it means a Soli chip is present and available for other things once a phone is unlocked and in use. If Google were to create a public API that lets app developers use its capabilities, Soli could do just about anything our fingers can.
What about radiation? I mean I transport the thing in my pocket - and that's not so far away from my private parts...
Lol I'm really hoping that's a joke XD
Saying "anything our fingers can" is a bit of a stretch... don't think Soli is capable of picking up a pencil, or flicking a co-worker on the nose when they do something stupid... Also doubtful that you could use it for typing either, as that would require a person to learn a completely new way of typing, with zero haptic feed back... and we all know how terrible touch typing was when there basically wasn't any haptics.... until they can make the air feel like a keyboard and give you physical resistance/responses, don't think that'd work very well... this sort of tech would be vastly better suited for VR applications, where it could track your hands and allow for some actual interaction. Though once again, lack of haptics still make that rather unsettling....
He's saying that it can do anything our fingers can when it comes to interacting with software. It's obviously not capable of manipulating physical objects. At least the writer assumed that was obvious but apparently it needed to be spelled out literally for the thick-skulled among us.