On Season 4 of Theory and Practice, Alex Wiltschko and I explore newly emerging human-like artificial intelligence and robots — and how we can learn as much about ourselves, as humans, as we do about the machines we use. The series delves into many aspects of AI, from needed guardrails and empathic communication to robotic surgery and a future where computers make decisions.
In episode 5, we explore how machine learning helped to create a map of odors and how that technology will train computers to smell. To learn more, I visited Alex in his lab at Osmo, where technologists, chemists, and master perfumers are digitizing our sense of smell.
Alex has been interested in the sense of smell from a young age. He trained as an olfactory neuroscientist at the University of Michigan and Harvard, and most recently worked at Google AI, where he set up the Digital Olfaction Team, before joining GV as an entrepreneur in residence. Alex announced the formation of Osmo earlier this year, and today, his team published landmark digital olfaction research in Science.
During my lab tour, Alex showed me various machines for labeling smells and analyzing the different components of scents. As we learn in the episode, smell is quite complex compared to other senses such as hearing or sight. This complexity arises from the sheer number of receptor types involved in olfaction, creating a “portfolio sensing effect.” There is also a nonlinear relationship between a molecule’s structure and what it actually smells like: a tiny change in chemical structure can shift a scent from rose to rotten egg.
Today, machine learning and artificial intelligence have made it possible to map scent structure using graph neural networks, opening up a range of possibilities. These include healthcare applications that could help diagnose diseases such as Parkinson’s and cancer.
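To give a feel for the core idea, here is a minimal, hypothetical sketch of how a graph neural network treats a molecule: atoms become nodes with feature vectors, bonds become edges, and each node repeatedly aggregates information from its neighbors before the whole graph is pooled into a single embedding. This is an illustrative toy only; it is not Osmo's actual model, and all names, features, and values below are invented for the example.

```python
# Illustrative sketch of graph-neural-network message passing on a
# toy "molecule" (NOT Osmo's actual model; all values are made up).

def message_pass(features, edges):
    """One aggregation round: each node's new feature vector is the
    sum of its own vector and its bonded neighbors' vectors."""
    updated = []
    for i, feat in enumerate(features):
        agg = list(feat)
        for a, b in edges:
            if a == i:
                agg = [x + y for x, y in zip(agg, features[b])]
            elif b == i:
                agg = [x + y for x, y in zip(agg, features[a])]
        updated.append(agg)
    return updated

def graph_embedding(features):
    """Pool all node vectors into one graph-level vector by summing,
    giving a fixed-size representation of the whole molecule."""
    return [sum(col) for col in zip(*features)]

# Toy molecule: three atoms in a chain, each with a 2-d feature vector.
atom_features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
bonds = [(0, 1), (1, 2)]

updated = message_pass(atom_features, bonds)
embedding = graph_embedding(updated)
```

In a real model, the aggregation would be learned (weight matrices and nonlinearities rather than plain sums), and the final embedding would feed a classifier that predicts odor descriptors such as "floral" or "sulfurous".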