
Google Mobile App – clever convergence of data, directory categorisation, location and interface

As the regular reader will know, I'm a big believer that the convergence of location-based information, structured data, inferred/contextual relationships and a slick, relevant interface will change our world and start delivering the sort of "future" interactions we were promised in 1960s sci-fi.

Google’s Mobile App is a step closer.

I won't rehash the explanatory video – it's, er, self-explanatory – but the really interesting part for me isn't the voice recognition but rather the emerging "common sense" in the Google results. Note that there's now an interpretive layer that's intercepting calculations, directory-type enquiries (e.g. film listings, nearby restaurants) and informational or evaluative requests.
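To make the idea concrete, here's a minimal sketch (in Python, nothing to do with Google's actual implementation – the cue words and handler names are entirely invented) of what such an interpretive layer looks like in spirit: look at the query, pick a specialised handler, and only fall back to plain text search if nothing else fits.

```python
import re

def route_query(query: str) -> str:
    """Toy illustration of an interpretive routing layer.

    Purely a sketch under invented assumptions -- not Google's code.
    """
    q = query.lower().strip()

    # Arithmetic-looking input goes to a calculator handler
    if re.fullmatch(r"[\d\s\.\+\-\*/\(\)%]+", q):
        return "calculator"

    # Directory-style intents keyed on a few (made-up) cue phrases
    if any(cue in q for cue in ("film listings", "showtimes", "restaurants near", "pizza near")):
        return "local directory"

    # Informational or evaluative questions
    if q.startswith(("what is", "who is", "how tall", "when did")):
        return "question answering"

    # Everything else falls back to ordinary web search
    return "web search"

if __name__ == "__main__":
    for q in ("12 * (4 + 3)", "pizza near King's Cross", "who is the prime minister"):
        print(q, "->", route_query(q))
```

The interesting bit isn't the toy rules, of course – it's that the interception happens before the text index is ever consulted.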

This is a major step forward for something that we tend to think of as a text-indexing service.

I'm a great fan of knowledge systems like TrueKnowledge (which has an inference engine built on structured facts, questions and relationships – wonderful) – but it seems that Google is slowly but surely adding equivalent capabilities by stealth, piece by piece.
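For anyone who hasn't met this style of system: the idea is that answers are inferred by chaining stored facts rather than matching text. A tiny sketch below – the facts and relation names are invented for illustration and bear no relation to TrueKnowledge's real data model or API.

```python
# Invented facts expressing "X is_located_in Y" relationships
FACTS = {
    ("London", "is_located_in"): "England",
    ("England", "is_located_in"): "United Kingdom",
    ("United Kingdom", "is_located_in"): "Europe",
}

def located_in(place: str, region: str) -> bool:
    """Follow the is_located_in chain to answer a containment question."""
    current = place
    while (current, "is_located_in") in FACTS:
        current = FACTS[(current, "is_located_in")]
        if current == region:
            return True
    return False

print(located_in("London", "Europe"))  # True -- inferred across three stored facts
print(located_in("London", "France"))  # False
```

No document ever says "London is in Europe"; the answer falls out of the relationships.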

Let’s start counting the days until this is seen as “just normal”…

UPDATE: been playing this morning at a client's (different voices, male/female, Northern, Welsh, Australian) and we're getting a one-in-five success rate. Still, that it even works 20% of the time is amazing, and I'm sure it'll train me to speak more clearly 😉