Welcome to Project Soli

Poupyrev: My name is Ivan Poupyrev, and I work in the Advanced Technology and Projects group at Google. The hand is the ultimate input device. It's extremely precise, it's extremely fast, and it's very natural for us to use. Capturing the possibilities of the human hand has been one of my passions. How could we take this incredible capability, the finesse of human action and of using our hands, and apply it to the virtual world? We use the radio frequency spectrum, which is radar, to track the human hand. Radars have been used for many different things: to track cars, big objects, satellites, and planes. We're using them to track the micro-motions, the twitches, of the human hand, and then using that to interact with wearables, the Internet of Things, and other computing devices.

Lien: Our team is focused on taking radar hardware and turning it into a gesture sensor. Radar is a technology that transmits a radio wave toward a target; the radar's receiver then intercepts the energy reflected from that target. The reason we're able to interpret so much from this one radar signal is the full gesture recognition pipeline we've built. The various stages of this pipeline are designed to extract specific gesture information from this one radar signal, which we receive at a high frame rate.

Amihood: From these strange, foreign range-Doppler signals, we are actually interpreting human intent.

Karagozler: Radar has some unique properties compared to cameras, for example. It has very high positional accuracy, which means you can sense the tiniest motions.

Schwesig: We arrived at this idea of virtual tools because we recognized that there are certain archetypes of controls, like a volume knob or a physical slider. Imagine a button between your thumb and your index finger. The button's not there, but pressing it is a very clear action, and there's actual physical haptic feedback as you perform that action. The hand can both embody a virtual tool and act on that virtual tool at the same time. So if we can recognize that action, we have an interesting direction for interacting with technology.

Poupyrev: So when we started this project, my team and I looked at the idea and thought, "Are we gonna make it or not? Eh, we don't know." But we had to do it. Because unless you do it, you don't know.

Raja: What I think I'm most proud of about our project is that we have pushed the processing power of the electronics itself further out, to do the sensing for us.

Poupyrev: The radar has a property which no other technology has: it can work through materials. You can embed it into objects. It allows us to track really precise motions. And what is most exciting about it is that you can shrink the entire radar and put it in a tiny chip. That's what makes this approach so promising. It's extremely reliable. There's nothing to break: no moving parts, no lenses, nothing, just a piece of sand on your board.

Schwesig: Now we are at a point where we have the hardware to sense these interactions, and we can put them to work. We can explore how well they work, and how well they might work in products.

Poupyrev: It blows your mind, usually, when you see the things people do. And that's what I'm really looking forward to. I'm really looking forward to releasing this to the development community, and I really want them to be excited and motivated to do something cool with it.
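The core processing step described in the video, turning one radar signal received at a high frame rate into range-Doppler maps from which gestures are read, can be sketched roughly as below. This is a minimal illustration assuming a generic FMCW-style frame layout; the array shapes and the helper name `range_doppler_map` are assumptions for the example, not Soli's actual, unpublished pipeline.

```python
import numpy as np

def range_doppler_map(frame):
    """Compute a range-Doppler magnitude map from one radar frame.

    `frame` is assumed to be a 2-D array of shape
    (num_chirps, samples_per_chirp): rows are successive chirps
    (slow time), columns are ADC samples within a chirp (fast time).
    """
    # Window along fast time to reduce range sidelobes.
    frame = frame * np.hanning(frame.shape[1])[None, :]
    # FFT along fast time resolves range (distance to reflectors).
    range_profile = np.fft.fft(frame, axis=1)
    # FFT along slow time resolves Doppler (radial velocity);
    # fftshift puts zero velocity in the middle row.
    rd = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)
    return np.abs(rd)

# A synthetic moving reflector: constant range bin, steady phase
# advance of 0.25 cycles per chirp (i.e. nonzero radial velocity).
chirps, samples = 32, 64
t = np.arange(samples) / samples
frame = np.array([np.cos(2 * np.pi * (8 * t + 0.25 * c)) for c in range(chirps)])
rd = range_doppler_map(frame)
print(rd.shape)  # (32, 64)
```

A fast-time FFT resolves distance and a slow-time FFT resolves radial velocity; the moving target's energy lands away from the zero-Doppler (middle) row. A gesture recognizer would then extract features from a stream of such maps.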

Author: Kevin Mason

79 thoughts on “Welcome to Project Soli”

  1. This is so great, I can't explain how excited it looks and represents itself in the video.. great-minded people doing wonderful things with technology.. great project…

  2. It looks great and I hope it works out but I cannot think of many great uses for it. There don't seem to be many benefits of doing this other than you don't have to touch the screen so you can use functions when cooking or eating or something. I think other companies such as LG have had very similar things but maybe this will be implemented better.

  3. Just when you thought we couldn't lose any more privacy Google delivers yet again. So basically along with having cameras recording your every move, Google is also looking to capture your body position and a map of the environment you are in now as well. And most of the idiots in America are gonna back this idea thinking that this is great news, because average Americans are borderline retarded. Nothing more than wannabe popular social media zombies who are too lazy to want to actually do anything physical.

  4. People are going to say, again, that Google copied Apple. But we all know that Apple ripped Google when they launched their faceid in 2017 while Google were trying to make it perfecto.

  5. This is amazing! Why haven't I thought about it?! Make it purchasable as a discrete unit, not only as a 'built-in'. Can't wait to get my hand on one of those sensors and integrate with one of my projects. I mean, kind of 'get on'.

  6. You Do NOT have permission to access this Group 🙁


  7. It's going to be interesting putting this in a phone. It seems impractical due to the device always being in your hands. It'll be convenient for any sort of travel and for a quick peek, but time will tell; they'll probably develop very clever uses for it.
    I could see an integration into smart home. Gesturing to your phone to control the tv, Chromecast, lights, locks, routines, the whole shebang.
    Beyond that, it'll open the door for live sign language interpretation, which would be a huge win in itself.

  8. The idea itself is great, but the generation is getting dumber; for them, this kind of control is too complicated.

  9. I love this shit even now.. really excited for the future abilities in Pixel phone with Updates or in Smartwatches. I would love a Google smartwatch!

  10. This is pretty cool! If they can train it to recognize Ameslan, you also wouldn't need a keyboard and people who cannot speak would not need a microphone.

  11. Amazing, I hope they put this on TV. The day will come when we don't have to type or speak or touch anything. Amazing, truly amazing.

  12. Watching after pixel 4 release…
    Just realised (after reading comments) how people demotivate others who are trying something different…
    Thankfully people at google did not read the comments 😂

  13. Soli will only get better with time: more actions, more gestures. Soli needs to be out in the wild for a few months, and software engineers need to analyse the feedback they get. No doubt Soli will get an upgrade next year to have a bigger field of view

  14. Pixel 4 is here, and they decided not to put the full functionality from Project Soli into the Pixel devices. Instead they chose an approach where, despite having all the hardware available to use Soli, they will make it seem like they are improving this tech on future Pixel phones so that they can capitalize on Soli’s functionality for as long as possible. A tech which has been available since 2015 will fully penetrate the market by the year 2025 or so, just because of corporate greed.
    What a great world we live in!

  15. So more RF? Why not use camera or Infrared? You keep bombarding us with more RF 🙁 On the fight of humanity vs cancer, you are on the side of cancer!

  16. I've always thought that phone screens are too small for comfort; texting is really quite annoying. So this could be really interesting. A big part of it would be coming up with innovative signals to use the technology. I guess they would also have to be straightforward, to the point that there's barely any learning curve, so the project can sell well. Very invested in Project Soli's future!

  17. In our university we are working on a similar project, without the Millions of funds, we have achieved similar results. 🙂
