TechWatch editor Emily McDaid meets with Sensum’s Ben Bland to discuss how the start-up is translating human emotion into quantifiable data for the auto industry.
“We don’t make a wearable,” said Ben Bland, CTO of Sensum. “We provide physiological and emotional data insights that can be captured with a wide array of sensor technology.
“We can use data from heart-rate monitors, technology that monitors palms, facial expression recognition and voice analysis.”
“The skin on your hands has an evolutionary hangover. Our hands get clammy when we’re excited. We’re unaware of it but sensors can pick up tiny changes through skin conductance,” Bland said.
This is the skin conductance response: when a stimulus triggers emotional arousal, the skin momentarily becomes a better conductor of electricity.
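In very rough terms, picking these arousal events out of a skin conductance (EDA) signal means spotting brief rises above the slow-moving baseline. The sketch below is a simplified, hypothetical illustration of that idea, not Sensum's implementation; the trace values and thresholds are invented.

```python
# Illustrative sketch: flag skin conductance responses (SCRs) as
# brief rises above a smoothed baseline. Not Sensum's engine;
# signal values and thresholds are hypothetical.

def moving_average(signal, window=7):
    """Crude smoothing to estimate the slow-moving baseline."""
    half = window // 2
    return [
        sum(signal[max(0, i - half):i + half + 1])
        / len(signal[max(0, i - half):i + half + 1])
        for i in range(len(signal))
    ]

def detect_scrs(eda, threshold=0.05):
    """Return sample indices where conductance rises sharply above
    baseline - a rough proxy for the arousal events sensors pick up."""
    baseline = moving_average(eda)
    return [i for i, (x, b) in enumerate(zip(eda, baseline)) if x - b > threshold]

# Synthetic microsiemens trace: flat, then a brief arousal spike
eda = [2.0] * 10 + [2.0, 2.3, 2.6, 2.4, 2.1] + [2.0] * 10
print(detect_scrs(eda))  # → [11, 12, 13]
```

Real pipelines work on noisy, continuously sampled data and separate the fast (phasic) response from the slow (tonic) level, but the principle is the same: the signal of interest is the short-lived deviation, not the absolute value.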
Facial expression recognition works by building a map of key points on a human face; algorithms track large numbers of pixels around those points, signalling when the person is smiling, laughing and so on.
Now, Sensum’s market focus is on the automotive industry. “We were blown away by the sector’s response,” said Bland. “We’ve spoken to almost every top-tier auto manufacturer, and they get it.”
With autonomous cars, it’s an accepted principle that the technology needs to understand the humans using it, both the driver and the passengers.
“Audi, among others, has already built an empathic car,” said Bland, when I asked him how far along the automakers are. “They’re all thinking about this but some are further along than others.”
It sounds like Sensum has a captive audience and is poised to hit the ground running after building its technology for several years. Bland said the technology can be applied to cars in three ways: safety, comfort and entertainment.
“We’ve been perfecting our emotional processing engine since 2011,” he said. “We’re world leaders at measuring emotions in the wild – anywhere outside the lab, where people live and breathe.”
How large is Sensum now?
“We have 15 full-time staff, currently.”
Are you already at deal stage?
Bland said: “Yeah, we’ve been securing deals with car brands and tier-one suppliers.”
Sensum has helped build the Ford Buzz car that flashes 200,000 external LED lights to correspond with the driver’s mood. Bland explained: “It’s a cool way to show how a vehicle responds to the driver’s emotions in real time.”
Ford showed that driving a sports car ranks with life’s most exciting moments. I wouldn’t know, since I drive a Renault Megane.
Bland stressed: “The Ford Buzz car is just showing what’s possible.”
What other customers can you describe?
“We worked with Red Bull Media House on a project with extreme-sports athletes, measuring their real-time emotional state to create visualisations in video and VR content.
“We’ve also worked with Unilever, providing a new research tool that responds to consumers’ biometric state in the moment of product usage.”
Are there applications outside the vehicle? Will you ever sell a B2C product directly to consumers?
Bland said: “Not in the immediate future. We’re building a universal emotional processing engine. Thinking ahead, there are uses in the home, such as playlists that come on automatically based on your mood, or mood lighting – it’s about optimising services to someone’s current state. It could even detect a heart attack that’s about to happen, but that kind of medical use would be some way off for Sensum.”
Switching gears, Bland commented on the future of transport.
“The incentives to change are dramatic and obvious in the auto industry. Self-driving cars are vastly safer. If we let AI take over, we could get road deaths down to zero,” he said. “We take for granted how complex our cars are already, so we can expect to see what seems ‘futuristic’ come in very quickly.
“Our customer base is very global, but I think Europe can be a hub for advanced mobility.”
By Emily McDaid, editor, TechWatch
A version of this article originally appeared on TechWatch