2 Jun 2017 · Disruptors: The ideas changing industries

Why Google Lens could be the next step towards people becoming cyborgs

At its annual I/O conference, Google unveiled a host of innovations it's been working on. This year, media hype centred on Google Lens: a technology that integrates Google's search engine with a smartphone's camera. If it works, it'll enable people to use their phones for visual searches – whether getting quick translations of foreign signs or booking tickets to an event just by pointing at the poster. We explore the insights behind the Silicon Valley giant's latest innovation, and what it could mean for the way people use technology to interact with the world.

Author
Alex Rückheim

Google announced a number of new products and updates at the conference, including advancements in AI hardware, Google Assistant and Google Home. The Google Lens technology will leverage Google’s computer vision and AI tech, enabling phone cameras to recognise and understand whatever they're 'looking' at. Google CEO Sundar Pichai explains that users will be able to point their phone’s camera at anything from flowers to restaurants, and it will automatically understand what it’s seeing.

As the smartphone evolves, it’s no longer just used for calls, texts and taking photos. It wakes us up, gets us from A to B and even helps us find love. With 79% of Americans keeping their phone on or near them for 22 hours a day, these technologies have the potential to seamlessly fit into people's lives – if they actually work. As cyborg anthropologist Amber Case aptly notes, well-designed technology “allows you to be a smarter human, not have a smarter device.”

Google wants to make searching smarter

What remains to be seen is whether a visual search function can work seamlessly enough to do this. With developments in AI and machine learning, the idea of exploring the world this way is appealing. Blippar has developed an AR-powered visual search feature that lets people discover a wealth of information through their phone's camera whilst mitigating the discomfort of speaking to a device in public, and Snapchat has introduced a 3D filter that overlays AR elements onto real-life scenes.

“We are all cyborgs now,” says Case. And if Google successfully manages to transform the camera on people’s phones from a passive receptor into an active tool, it will not only change the way people interact with the world, but the way they interact with their phones on a personal level – making us more cyborg-like than ever.

Alex Rückheim is a behavioural analyst at Canvas8, which specialises in behavioural insights and consumer research. Having lived in nine countries, he holds a master’s degree in Strategic Marketing and is fascinated by cross-cultural shifts in consumer behaviour. He is also the founder of design-focused site GOODS WE LIKE.