Saturday, December 12, 2009

Why the Google "Mobile Lab" Test?

Whatever else Google may want to demonstrate with its "mobile lab" test, which apparently has Google employees around the world testing an Android smartphone, the company likely wants to explore and highlight the mobile device's role as an intelligent sensor, one that uses voice input, location and camera data to enrich the "what's around here" side of Google's search experience.

Google first launched "search by voice" about a year ago. "Looking ahead, we dream of combining voice recognition with our language translation infrastructure to provide in-conversation translation," says Google VP Vic Gundotra.

Google also recently introduced "What's Nearby" for Google Maps on Android 1.6 and later devices, available as an update from Android Market. The feature returns a list of the 10 places closest to the user's location, including restaurants, shops and other points of interest. Local product inventory will be added in 2010.
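Conceptually, a lookup like this amounts to ranking points of interest by distance from the user's reported location. The sketch below is not Google's implementation; it is a minimal illustration, assuming a hypothetical in-memory list of places and a plain great-circle (haversine) distance ranking, where a real service would query a spatial index and blend in relevance signals.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in kilometers between two lat/lon points.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # Earth radius is roughly 6371 km

def whats_nearby(user_lat, user_lon, places, limit=10):
    # `places` is a hypothetical list of (name, lat, lon) tuples; a real
    # service would query a spatial index rather than scan a flat list.
    ranked = sorted(
        places,
        key=lambda p: haversine_km(user_lat, user_lon, p[1], p[2]),
    )
    return ranked[:limit]

# Example: the closest points of interest to a downtown location.
pois = [
    ("Coffee shop", 37.7793, -122.4193),
    ("Bookstore", 37.7815, -122.4100),
    ("Museum", 37.7857, -122.4011),
]
print(whats_nearby(37.7800, -122.4150, pois, limit=10))

Distance ordering is only the core of the idea; an actual "What's Nearby" result would also fold in relevance and business data.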

Visual search is also developing, Gundotra says. A picture taken with a Google-equipped device will return relevant search results based on the visual information in the image, including results for landmarks, works of art and products.

"Today you frame and snap a photo to get results, but one day visual search will be as natural as pointing a finger," says Gundotra.

