Blippar, which demoed its smartphone-based visual search and discovery technology at February's Code/Media, is ready for the rest of the world to give it a try.
The company is updating its iOS and Android apps to recognize millions of different kinds of objects rather than just those embedded with a special code. The result is a quirky, if somewhat addictive, experience that allows users to point their phone and get a word cloud for everything the camera sees.
When Blippar's app is confident it has found a match from one of its commercial partners or from open data sources like Wikipedia, it "blips" up a bubble. From there, you can see more information about the result or explore related items.
It's similar to the Firefly feature Amazon introduced with the Fire phone, but capable of recognizing a wider range of objects, CEO Ambarish Mitra told Recode.
"They made it very purchase focused," he said of Firefly. "It was very Amazon-like."
The move comes as analysts predict a growing number of searches will come through means other than text. In a key slide during last week's presentation at Code Conference, noted analyst Mary Meeker projected half of searches by 2020 will come from either image or voice queries.
One big challenge for Blippar and its 320 employees will be keeping up with the image recognition algorithms being developed at places like Google, Facebook and Microsoft that have access to nearly infinite repositories of photos with which to tune their systems.
The new app is also a big shift away from Blippar's existing business, which focused on identifying goods specially tagged by its business partners.
"We thought for our users who are pointing at hundreds of other things beyond branded content, this was the best way forward," Mitra said. The app will continue to recognize Blippar partners' codes as well as standard bar codes and QR codes.
Here's what Blippar's updated app looks like in action, at least under ideal conditions:
This article originally appeared on Recode.net.