Sensors are increasingly everywhere. Soon, maybe even on our faces.
The recent Snap announcement of Spectacles seemed like a watershed moment for face wearables to me. Finally, I thought: something that isn’t just a slightly refined prototype brought out of a computer science lab. Something that even looks like the cool kids might wear it - at least, some of the time. And at a price point that doesn’t make it the preserve of a Silicon Valley VC.
Whether Spectacles will be the must-have ‘toy’ this Christmas, or remain a curiosity for Californian oddballs, remains to be seen.
More interesting, however, is how this strategic move fits into the continued unbundling of sensors from our mobile phones to everywhere they might become valuable. In a world where trends can shift at light speed, one of the safer bets is that more connected sensors will continue to surround us over time.
Whether they are worn, placed or installed in our everyday environments, the plummeting cost and increasing quality of the sensors around us will radically change what we expect from mobile and, by extension, the way businesses serve their customers.
The nervous systems of 21st Century organisations
At TAB, we’ve already started to experiment with machine vision techniques that transform an image sensor from a humble camera into an intelligent product-detection tool.
These techniques, combined with machine learning applied in the cloud, are just starting to pay dividends in consumer and enterprise applications. From a quick search for photos of your cat on your phone, to tracking the gaze of passers-by outside your retail space, the number of things machines can ‘see’ via a camera is growing each week - just as the price of high-quality light sensors hits the floor.
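To make that concrete, here is a deliberately toy sketch of the idea - not TAB’s actual pipeline. It classifies a camera frame by comparing its colour histogram against stored product ‘signatures’; a production system would use a trained neural network, but the shape of the problem (sensor data in, product label out) is the same. All product names and pixel values are invented for illustration.

```python
# Toy product detection: match a frame's coarse colour histogram to
# the nearest known product signature. Illustrative only - real
# machine vision uses trained convolutional networks.

from collections import Counter

def colour_histogram(pixels, bins=4):
    """Bucket (r, g, b) pixel tuples into a coarse normalised histogram."""
    step = 256 // bins
    counts = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    total = len(pixels)
    return {bucket: n / total for bucket, n in counts.items()}

def match_product(frame_pixels, signatures):
    """Return the product whose stored histogram is closest to the frame's."""
    hist = colour_histogram(frame_pixels)
    def distance(sig):
        keys = set(hist) | set(sig)
        return sum(abs(hist.get(k, 0) - sig.get(k, 0)) for k in keys)
    return min(signatures, key=lambda name: distance(signatures[name]))

# Hypothetical signatures for two products on a shelf.
signatures = {
    "red-label cola": colour_histogram([(210, 30, 40)] * 10),
    "green-label soda": colour_histogram([(45, 185, 55)] * 10),
}

# A mostly-red frame with a little green in the background.
frame = [(210, 30, 40)] * 8 + [(50, 170, 70)] * 2
print(match_product(frame, signatures))  # → red-label cola
```

The interesting part isn’t the matching itself - it’s that the heavy lifting (training, signature updates) happens in the cloud, while the sensor just streams frames.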
But it’s not just cameras. Microphones are rolling out at scale to new places and interfaces, led by the Amazon Echo, with Google and (probably) Apple following as each tech giant tries to place Assistant- and Siri-connected mics within closer shouting distance than Amazon’s Alexa.
Again, the sensor itself is connected, and gains its intelligence from the cloud infrastructure at the core and ongoing data collection at every endpoint. What ‘machine hearing’ might become once this proliferation of microphones is connected to deep learning infrastructure is hard to predict.
Perhaps some of the more exciting sensor possibilities will start to emerge in cases where our own senses aren’t directly analogous to a sensor’s capabilities. LiDAR, for example, uses lasers to sense depth - it’s how automated vehicles ‘see’. But applications well beyond this aren’t hard to imagine, especially as the price plummets.
We’re moving towards a world of companies with eyes in the sky that can see every detail of a competitor's operations. Or ears in a customer’s kitchen, listening to what they need help with today. Even a sensor-eye-view of everything going on across your retail environments over the course of a day.
Or if you’re Snap, seeing what’s in front of your customers’ eyes. Seeing what they can see. Imagine.
Most businesses today are moving towards cloud architecture and intelligence with gusto, buoyed by cost savings and by new intelligent, scalable applications built on data at the core. The next stage is where Snap, Amazon and Google are now pointing - and that means connecting the brain of the cloud to the emergent nervous system of sensors.
Delivering tomorrow’s improvements overnight
The business potential of this throng of connected sensors around us is hard to overstate.
Understanding and optimising how your physical capital can work in unison is already commonplace in container shipping and parts of agriculture, and will likely spread to any business that has physical operations.
New, differentiated products and services are already emerging in the ‘smart home’ space, and will almost certainly spread to retail experiences and beyond. Alexa, Siri and chums are showing the potential for connected sensors to become ‘invisible’ interfaces, transforming how companies interact with customers in the process.
One interesting lens for uncovering future opportunities is to think about what might be possible retrospectively - looking back to better understand what happened yesterday, and what we might want to change to unlock new opportunities.
Imagine the potential value of analysing a single day’s LiDAR data from a busy retail store. It might track products on shelves over time, as well as customers within the store; it might reveal group sizes, linger-times at particular displays - or whether particular products were picked up and replaced, or put in a basket.
Overnight analysis of this could be used to rapidly optimise inventory choices, store layouts and marketing.
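As a hypothetical sketch of that overnight step: suppose the sensor emits timestamped sightings of anonymised customers near displays. Totalling linger time per display is then a simple aggregation. The event format, field names and the five-second ‘same visit’ threshold here are all invented for illustration, not taken from any real LiDAR pipeline.

```python
# Hypothetical overnight aggregation: total seconds customers lingered
# at each display, from (timestamp, customer_id, display_id) sightings.

from collections import defaultdict

def linger_times(events, gap=5):
    """Sum linger seconds per display.

    `events` is a list of (timestamp_seconds, customer_id, display_id)
    tuples, assumed sorted by timestamp. Consecutive sightings of the
    same customer at the same display within `gap` seconds count as
    one continuous visit.
    """
    totals = defaultdict(float)
    last_seen = {}  # (customer, display) -> last sighting timestamp
    for ts, customer, display in events:
        key = (customer, display)
        if key in last_seen and ts - last_seen[key] <= gap:
            totals[display] += ts - last_seen[key]
        last_seen[key] = ts
    return dict(totals)

# One shopper pauses at the window display; both browse a shelf.
day = [
    (0, "c1", "window"), (2, "c1", "window"), (4, "c1", "window"),
    (10, "c2", "shelf-3"), (12, "c2", "shelf-3"),
    (30, "c1", "shelf-3"), (31, "c1", "shelf-3"),
]
print(linger_times(day))  # {'window': 4.0, 'shelf-3': 3.0}
```

Run over a full day’s data, the same aggregation - grouped by hour, by display, by group size - is exactly the raw material for the inventory and layout decisions described above.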
It’s clear that as this process continues and accelerates, where you place your sensors (or the sensors you are granted access to) could hold as much business value as real estate, distribution networks or intellectual property did in the 20th Century.
Sensors are poised to uncover new ways to make organisations smarter, better and more adaptable in serving their customers. In that way, every company will be a camera company. And a microphone company, and a LiDAR company...
And what is just a ‘toy’ for Snap right now may prove a critical strategic stake in tomorrow’s sensor-coated world.