
The Poet of Senses: Smell

At this year’s Tech Open Air, I attended a Design Thinking workshop by edenspiekermann in cooperation with NanoScent on rethinking scent recognition.

In small groups we asked ourselves: "How can scent recognition contribute to both environmental and personal health within the city?"

NanoScent’s CEO Oren Gavriely gave us an introduction to their technology. They are using sensors and machine learning algorithms to detect Volatile Organic Compounds (VOCs) in the environment. Their technology aims to prevent environmental disasters in various industries.
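NanoScent's actual sensors and models are proprietary, but the core idea of matching a sensor "fingerprint" against known compound classes can be illustrated with a minimal nearest-centroid sketch. All class names and numbers below are invented for illustration; a production system would use real calibration data and far more sophisticated machine learning.

```python
import math

# Toy "fingerprints": hypothetical mean sensor-array responses per
# compound class (all values invented for illustration).
CENTROIDS = {
    "clean_air":    [0.1, 0.2, 0.1, 0.1],
    "solvent_leak": [0.9, 0.7, 0.2, 0.4],
    "fuel_vapor":   [0.3, 0.8, 0.9, 0.6],
}

def classify(reading):
    """Return the compound class whose centroid is closest (Euclidean)."""
    def dist(centroid):
        return math.sqrt(sum((r - c) ** 2 for r, c in zip(reading, centroid)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

# A reading close to the "solvent_leak" fingerprint is classified as such.
print(classify([0.85, 0.75, 0.25, 0.35]))  # -> solvent_leak
```

An alert pipeline could then trigger a warning whenever consecutive readings fall into a hazardous class, which is roughly the disaster-prevention use case Gavriely described.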

NanoScent CEO Oren Gavriely at Tech Open Air 2019

Inspired by this insightful workshop, I wanted to dig deeper into the "poet of senses": smell!

Early Fail

I came across an interesting article by The Hustle on an earlier attempt to monetize the intersection of technology and smell. Around the turn of the millennium, a company called Digiscents raised $20m to bring scents to the digital world through their prototype iSmell. They employed sensory psychologists and "scentographers" to create unique scents by combining different oils, which were heated and fanned out toward the user.


“You can break down any smell — rotten bananas, fresh sardines — into 100 or so key fragrances,” he tells us. “From those materials, you can recreate the smell of almost anything in the natural world.”

– Founder of Digiscents via The Hustle

Ultimately, the endeavor failed due to complicated testing procedures, technical faults, and a lack of demand. The fall of Digiscents deterred investors and discouraged others from attempting a rebirth of digital scent technologies.

Research on Scents and Smell

In 2019, we can see companies like NanoScent bringing new concepts to market. Beyond that, I found a couple of interesting research projects.

One example is the Sussex Computer Human Interaction Lab, which conducts extensive research on multi-sensory experiences in HCI.

Their goal is to fully understand how different senses can be stimulated when people interact with technology, taking into account the relationships between the senses and potential human limitations when information from different senses needs to be integrated. With that knowledge, richer experiences can be created for users. Being deeply connected to our emotions and memories, smell is an especially interesting candidate when it comes to enhancing human-technology interaction. The rise of immersive VR applications may further drive this development.

In-car scent interaction

One application field of scent interaction is the automotive industry. Researchers have already found that scents can be used to influence drivers' physical states and behaviors. In various experiments, effects of scents on alertness, wakefulness, mood, attentiveness, and even braking performance have been demonstrated.

Peppermint, for example, is associated with faster reaction times and can help drivers focus on the driving task and feel less frustrated while driving.

Last year, scientists at the Sussex Computer Human Interaction Lab were the first to apply olfactory notifications to convey driving-relevant information.

They tested a visual-only feedback modality against a combination of visual and olfactory-based information. A lavender scent, for instance, was used to convey the message "slow down". Their driving simulator experiment demonstrated that feedback based on scents is perceived as less distracting and more helpful than visual notifications.

Not only did the experiment participants like the scent system, the positive effects were also reflected in their driving behavior: drivers made fewer mistakes in the olfactory condition compared to visual-only feedback.

Considering these promising findings, I can't wait to see olfactory notifications arrive in real cars soon.

Source: Dmitrenko, D., Maggioni, E., & Obrist, M. (2018). I Smell Trouble: Using Multiple Scents to Convey Driving-Relevant Information. 234–238. doi:10.1145/3242969.3243015

Impressions: Rethinking Scent Recognition Design Thinking Workshop at TOA 2019