A supercharged update to Google Now is headed to Android M, and it's all about "in the moment" features.

Google Now on Tap

Google is all about showing off its knowledge engines, and for a while Google Now has been a big part of that experience. Alongside updates to Google's machine learning systems, which make Google Now bigger, faster, and smarter, Android M will take things one step further and offer additional "in the moment" features to Google Now. These new and improved features have earned the name Now on Tap, and they represent a huge push into the individual apps and services you use every day.

Google Now on Tap is all about offering specific features while you are inside of an app, and according to Google, parts of it will work without changes from developers. Now on Tap allows you to ask simple, nonspecific questions while inside of an app, and the service will use the information currently displayed within the app to answer the question. On stage at Google I/O, the demo given was to ask for the actual human name of the musical persona Skrillex.

Now on Tap Tomorrowland

As Google Now frequently does already, Now on Tap can scour text and images inside the app you are using for helpful suggestions as well. You can tap on the image of an actor to get more information, or tap on the name of a movie in a message or email and get IMDb profiles and local showtimes. It's a lot of the same stuff you see in Google Now already, but the added in-app context means there are far fewer steps involved in getting to this information.

When developers do plug in to Now on Tap, the contextual street goes both ways. In one on-stage example, Now on Tap helped a restaurant app open immediately to the restaurant you are currently sitting in, and from there you are able to perform voice and text searches with that same context engine. Google Now Director Aparna Chennapragada promises this is just a small example of what is to come through Now on Tap, and that we'll hear more about this feature over the next couple of months.

Keep an eye on our Google I/O liveblog for more!