Scent as Signal: How Aurat Aroma is Using Smell to Understand the Brain

When people talk about neurotechnology, they usually think in terms of brain waves, interfaces, and wearables. But there’s another powerful — and often overlooked — gateway into the brain: smell. Aurat Aroma sits at that intersection. Co-founded by Lekha Challappa, Aurat is an olfactory intelligence company building ScentAI software, grounded in computational neuroscience, to predict how people will perceive and respond to scent chemistry. Over time, the same underlying olfaction layer can support adjacent applications, like intranasal drug formulation and threat sensing.

In this conversation, Lekha shares the personal story that drew her to olfaction, and why scent is one of the most powerful and overlooked pathways into the brain. She explains how digitized scent could reshape the flavor & fragrance industry, and where she sees olfactory neurotech — and the wider neurotechnology industry — heading next.

By Carter Sciences

Lekha Challappa. Image sourced from Louisville Inno (American City Business Journals), July 31, 2024.

You’ve previously spoken about how scent and memory are deeply personal for you. When did olfaction first become something you thought about seriously — not just as a sense, but as a gateway into the brain?

It’s a different answer than most people expect, but the real turning point was when my sister passed away very suddenly. When my family and I were thinking about memories of her, we realized how many of those moments were tied to her perfumes. A certain scent would take me straight back to a very specific moment — childhood scenes, or even something as recent as an outfit she wore last year: I could see it. There was a whole visual memory playing in my mind just from the smell.

That’s when it really clicked for me: olfaction, memory, emotion, recall — they’re deeply connected. And it didn’t feel abstract anymore. It felt like evidence. It made me confident there are real patterns — signals you can follow — that lead to measurable, neurologic outcomes.

I suspect a lot of people can relate to that. Music does that for me — it gives me goosebumps, takes me back somewhere instantly. Even taste can do the same. Does everyone experience sensory memory in that way, or is it more individual?

It’s both. The underlying mechanism is shared by most people, but the specific triggers and strength of the effect are highly individual. Most brains can form strong odor-memory links because olfactory input is tightly coupled to emotion and episodic memory systems. But people differ in their sensory biology and the experiences that get ‘bound’ to certain smells over time.


A: Hippocampal activation during odor-based recognition. B: Brain regions functionally connected during odor-based recognition. Figure reproduced from Saive, Royet & Plailly (2014).

Where it becomes quantifiable is when you treat those differences as variables: olfactory variables, individual biology, and psychosocial covariates. When you collect those covariates alongside standardized response measures, such as recognition, recall vividness, valence/arousal ratings, reaction time, and memory consistency across repeated trials, you can model patterns at the population level and then personalize from there. So sensory memory is real for most people, but its shape is individual, and that individuality is something you can measure, stratify, and predict once you treat smell, context, and genetics as a combined data signal.
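
To make that concrete, here is a minimal, purely illustrative sketch in Python: synthetic covariates stand in for the olfactory, biological, and psychosocial variables described above, and a ridge regression stands in for whatever models Aurat actually uses. All variable names and data are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical covariates per trial: olfactory variables (intensity,
# familiarity), individual biology (a receptor-genotype score), and a
# psychosocial covariate. All synthetic stand-ins.
n = 500
X = np.column_stack([
    rng.normal(size=n),  # odorant intensity
    rng.normal(size=n),  # odorant familiarity
    rng.normal(size=n),  # receptor-genotype score (stand-in)
    rng.normal(size=n),  # psychosocial covariate
])

# A standardized response measure, e.g. z-scored recall vividness,
# simulated here as a noisy linear combination of the covariates.
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(scale=0.5, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Population-level model; true personalization would add per-subject terms
# (random effects), which this sketch deliberately omits.
model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```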

Let’s get into the core of what you’re building. In simple terms, what do you mean by ‘olfactory intelligence’, and how is it different from how smell has traditionally been used?

For a long time, smell has been used mostly as a therapeutic or environmental tool — “this scent is relaxing”, “that scent is energizing”. Those effects are usually described qualitatively, and rarely handled as neurologic data you can measure, model, and predict.

When I talk about olfactory intelligence, what I mean is treating smell as a structured input to the brain — a source of neural information that directly maps to perception, emotion, and memory. Olfaction is neurologically distinct because it encodes scent as a high-dimensional receptor pattern, then routes that pattern rapidly into limbic circuitry. That makes it one of the most information-dense and underutilized ways to understand how someone’s brain is likely to interpret and react to a stimulus.

So instead of asking only how a smell made a person feel, we ask: What is the person’s underlying olfactory profile, and what does it predict about how they will perceive and respond — consistently, at scale, and across contexts? That’s where the ‘intelligence’ comes in — we’re building the computational layer that can translate scent chemistry into predicted human response.
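
As a hedged illustration of that computational layer, the sketch below maps made-up molecular descriptors to panel-style perceptual ratings with a multi-output random forest. Every feature and target here is synthetic; it shows the shape of the problem, not Aurat’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Hypothetical molecular descriptors per formulation (e.g. molecular
# weight, logP, vapor pressure, functional-group counts). All synthetic.
n_molecules, n_descriptors = 300, 8
chemistry = rng.normal(size=(n_molecules, n_descriptors))

# Panel-style perceptual targets (intensity, pleasantness, familiarity),
# simulated as noisy linear functions of the descriptors.
W = rng.normal(size=(n_descriptors, 3))
perception = chemistry @ W + rng.normal(scale=0.3, size=(n_molecules, 3))

# scikit-learn's random forest handles multi-output targets natively.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(chemistry, perception)

# Predict the perceptual profile of an unseen formulation.
new_formulation = rng.normal(size=(1, n_descriptors))
intensity, pleasantness, familiarity = model.predict(new_formulation)[0]
print(f"intensity={intensity:.2f}, pleasantness={pleasantness:.2f}, familiarity={familiarity:.2f}")
```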

Commercially, it means we can deliver value now by helping the flavor and fragrance ecosystem predict human perception and reduce repetitive testing, while building the same core olfaction layer that can later support next phases like targeted intranasal delivery, diagnostics, and sensing applications.


Aurat Aroma’s Digital Olfaction Platform aims to unlock the cognitive power of olfaction.

You’ve also talked about digitizing scent responses and working with ‘electronic noses’. Where does that fit in, and in which areas do you think olfactory intelligence will prove its value fastest?

There are already some excellent companies digitizing odor — taking airborne chemistry and turning it into a readable ‘scent ID’. That’s a huge part of the future for us. Today, we work primarily with pre-analyzed scent chemistry to predict human perception and response. But pairing our platform with electronic-nose data is where this gets really powerful. It’s how you move from modeling scent in a lab to translating real environments and real-time chemistry into the same predictive layer.

You can think of electronic noses as capturing the chemical signal, and Aurat as the layer that converts that signal into human meaning — how a person is likely to process it emotionally, cognitively, and behaviorally. We’re essentially the ‘scent brain API’ — taking a scent ID, whether it comes from a formulation file or an e-nose, and mapping it to predicted human response.
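
Taken literally, a ‘scent brain API’ could expose an interface like the hypothetical sketch below. ScentID, PredictedResponse, and predict_response are all invented names for illustration, and the scoring is a placeholder for learned models.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScentID:
    """A digitized scent, from a formulation file or an e-nose reading."""
    source: str       # "formulation" or "e_nose"
    components: dict  # odorant name -> relative concentration

@dataclass
class PredictedResponse:
    intensity: float  # 0 to 1
    valence: float    # -1 (aversive) to +1 (pleasant)
    arousal: float    # 0 to 1

def predict_response(scent: ScentID, profile: Optional[dict] = None) -> PredictedResponse:
    """Map a scent ID to a predicted human response.

    `profile` would carry an individual's olfactory profile; None falls
    back to a population-level prior. The heuristic below is a placeholder
    for learned chemistry-to-perception models.
    """
    total = sum(scent.components.values()) or 1.0
    intensity = min(1.0, total / 10.0)
    return PredictedResponse(intensity=intensity, valence=0.0, arousal=0.5 * intensity)

# Example call with a made-up e-nose reading.
print(predict_response(ScentID("e_nose", {"linalool": 2.5, "limonene": 1.0})))
```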

In terms of where the value proves out fastest, a couple of domains stand out:

  • Flavour and fragrance / consumer packaged goods
    This is probably the biggest near-term wedge for us. People don’t really think about it, but fragrance chemicals are literally everywhere — in hotels, airports, airplanes, skincare, personal care, cleaning products. And when you zoom out, you realize how huge that space is, with massive players like IFF, Givaudan, P&G, and Unilever. A lot of decision-making still relies on human panels that are costly, slow, and hard to scale. Digitizing response lets brands predict performance earlier, reduce repetitive testing, and iterate faster.

  • Healthcare, threat, and chemical detection
    As e-nose capabilities improve, you can detect more invisible particles — biohazards, explosives, allergens, disease markers, and environmental toxins. Our layer sits on top to answer the next question — not just what’s present, but what it means for the people in that context — experience, impact, and level of risk. That’s the bridge we’re building: going from detecting that something is in the air to understanding how humans are likely to experience it, and what action it implies (see the illustrative sketch after this list). And the tighter the connection between e-nose sensing and our predictive models, the closer we get to a world where environments themselves become legible through olfactory intelligence.
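
As one made-up illustration of that detection-to-meaning bridge, a contextual interpretation layer over e-nose output might look like this. The analytes, contexts, and thresholds are invented for the sketch, not real safety limits.

```python
from enum import Enum

class Risk(Enum):
    INFORMATIONAL = "informational"
    ELEVATED = "elevated"
    CRITICAL = "critical"

# Invented (analyte, context) -> threshold (ppm) pairs. A real system
# would use learned, context-conditioned models, not a static table.
THRESHOLDS = {
    ("ammonia", "school"): 25.0,
    ("ammonia", "industrial"): 50.0,
}

def interpret(analyte: str, ppm: float, context: str) -> Risk:
    """Turn an e-nose detection into a human-meaningful risk readout."""
    limit = THRESHOLDS.get((analyte, context))
    if limit is None:
        return Risk.INFORMATIONAL  # detected, but no contextual model yet
    if ppm >= limit:
        return Risk.CRITICAL
    if ppm >= 0.5 * limit:
        return Risk.ELEVATED
    return Risk.INFORMATIONAL

print(interpret("ammonia", 30.0, "school"))  # Risk.CRITICAL
```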

A lot of people still think of smell as too subjective to measure. Is that a misconception? What patterns do you actually see when you look at receptors, chemistry, and behavior?

It’s definitely a misconception, and it usually comes from how people frame smell. They jump straight to preference, social context, culture, and experience — but scent sits on top of something much more structured.

Underneath the story layer, olfaction is biological and neurologic. People have different receptor genetics, so the same scent can literally be encoded differently at the sensory level — what you detect, how intense it feels, and which notes dominate. But the ‘scent fingerprint’ isn’t just a list of receptors; it’s how those receptor-driven signals get represented and learned by the brain over time, how they map to valence, familiarity, memory binding, and behavioral tendency.


Relationships between odour molecules and olfactory system receptors. Figure reproduced from Trimmer et al. (2019).

That’s where computational neuroscience-enabled machine learning is so important. We treat a scent as an input that produces a patterned neural code — starting at receptor activation and moving up into perceptual features like intensity, pleasantness, familiarity, and emotional salience. Then we layer in the individual variables that shape processing — biology, context, and experience — and learn the stable structure in how those signals combine. The output is a fingerprint that reflects someone’s full olfactory processing profile, not just what they say they ‘like’.
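
Structurally, that fingerprint idea can be sketched as stacking the signal layers and projecting them into a compact profile vector. In the sketch below, a fixed random projection stands in for a learned encoder, and all dimensions and inputs are placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder layers of one person's olfactory processing profile.
receptor_activation = rng.random(400)  # roughly 400 functional human olfactory receptor types
perceptual_features = rng.random(5)    # intensity, pleasantness, familiarity, ...
individual_vars = rng.random(10)       # biology, context, experience covariates

def olfactory_fingerprint(receptors, percepts, covariates, dim=32):
    """Project stacked signal layers into a fixed-length profile vector.

    A fixed random projection stands in for a learned encoder; the point
    is the structure (stacked layers -> compact fingerprint), not the weights.
    """
    signal = np.concatenate([receptors, percepts, covariates])
    projection = rng.normal(size=(dim, signal.size)) / np.sqrt(signal.size)
    return projection @ signal

fingerprint = olfactory_fingerprint(receptor_activation, perceptual_features, individual_vars)
print(fingerprint.shape)  # (32,)
```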

Right now, most of our digital life is visual and auditory — screens and headphones. Do you see a future where scent is part of how we experience digital products and content?

I really do. We’ve spent the last decade turning vision and audio into mature commercial and neural data layers. Computer vision can interpret images at scale, and audio ML can model speech, emotion, and environment. But olfaction, even though it’s one of the most direct routes into emotion and memory, has been largely skipped in the digital stack.

Part of it is hardware. I’m genuinely surprised we don’t already have a simple micro-deployment device, something like a small plug-in diffuser that can deliver scents tuned to the tone and target audience of an experience, the way we already tune visuals and sound. It’s a natural extension of what we know about the hippocampus and olfactory memory.

Right now, you mostly see scent tech show up where the need is immediate: defense, hazard detection, and environmental risk. The next step is bringing it into everyday consumer-facing experiences as interfaces get more immersive: AR glasses, wearables, and mixed reality. Longer term, I’d love to see an ‘olfactory BCI’ concept emerge that not only deploys scent, but can read the sensory state of the nose and environment so that olfaction becomes a true bidirectional layer in neurotechnology, the way vision and audio already are.

Zooming out, where do you see this specific slice of neurotechnology — scent and olfaction — going next? And more broadly, where would you like to see the whole neurotech field move?

For olfaction specifically, I see three main arcs:

  1. Flavor and fragrance chemicals, and consumer environments
    This is our first focus. Doing the foundational work here — understanding olfactory chemistry and modeling human response at scale — is a huge opportunity for prediction and personalization, as well as for making the products people live with every day objectively better.

  2. Targeted intranasal drug delivery and neurodegenerative conditions
    Because we’re building a deeper understanding of olfactory processing, emotional memory, olfactory storage, and precursor signals, there’s real potential to support neurodegenerative and central nervous system-related efforts — even if the gains are incremental.

  3. Robotics, sensing, and invisible particles
    As environmental sensing expands, olfactory intelligence becomes critical for interpreting invisible particles in the air like allergens, toxins, pathogens, and threats. Detection is one piece, but translating what’s present into readouts that are explainable and actionable for humans is where this becomes truly valuable.

For neurotechnology more broadly, I think it’s going to become a much bigger part of commercial AI — not just something living in a silo. I’m an AI/ML engineer, and so much of what we call ‘AI’ today is just building on top of hype cycles. But if you go back, all of this came from human neural networks, neural probabilities, and brain-inspired computation. I think we’re going to see a much tighter merge between neural understanding and AI, which will push the quality of technical products forward.

Also published on Medium via NeuroTechX

Looking to Build a World-Class Neurotech Team?

Carter Sciences delivers personalized talent strategies backed by 20 years of international headhunting experience. We support growth-stage startups with tailored solutions and cost-effective fee structures, so you can scale without impacting your runway.

Specialisms include: Neurotechnology, Neuromodulation/Stimulation, Brain-Computer Interfaces, Wearable Devices, Neurosurgical Technology, and Private Equity/Venture Capital.

Contact Carter Sciences