W3C CSS Odor Module Released

The web has always been a visual medium, but with the addition of sound and video it now has two human senses locked up. With the development of specifications and techniques around vibration, the internet you “feel” is getting closer, too. That leaves only a couple of senses to cover.

Ever since the late 1990s, companies have proposed different methods for bringing the sense of smell to the web. Success there is a necessary lead-in to transmitting taste (since so much of taste is based on smell). As of this week, that day is even closer.

Previous attempts (DigiScent, RealAroma) failed partly because they tried to encode core odor indicators in markup, making for bulky pages with strings of adjectives to describe certain smells. They lacked more subtle indicators of smell, relying instead on keywords (such as rose, which can vary greatly depending on many biological factors). That variance in odor also contributed to the other key reason those proposals failed: hardware makers struggled to produce consistent smells. These are the reasons ATML (Aroma Text Markup Language) and OML (Olfactory Markup Language) never took off.

The W3C is now trying its hand at bringing smell to the web. Just as CSS separates presentation details like color from markup, it will also be the driving force behind smell, in the form of the W3C CSS Odor Module.

Just as colors are defined in CSS3 as RGB values with transparency (RGBA), smells are defined in Odor CSS by the base taste flavors (to promote accessibility for those with reduced or damaged olfactory function) along with intensity. The base taste flavors should look familiar to most of us: sweet, bitter, sour, salty, and umami.

Future plans for taste will include texture, but that will rely on haptic technology that doesn’t exist yet.

The CSS itself will look familiar too, since the syntax is designed to lean on developers’ experience with color, using percentages for the base “flavors” and a decimal value for intensity (similar to transparency):


#rose {
  odor: s(78%,7%,3%,0%,23%,0.4);
  /* sweet, bitter, sour, salty, umami, intensity */
}
#steak {
  odor: s(5%,18%,8%,27%,89%,0.8);
  /* sweet, bitter, sour, salty, umami, intensity */
}

Browser makers are already throwing support behind the specification, probably because they only need to put together an API; producing the smells will be up to hardware makers. If you want to experiment, Opera and Firefox are already working support into their plans, and WebKit nightlies have support for it as well. A prefixed version of the above would look like this:


#rose {
  -moz-odor: s(78%,7%,3%,0%,23%,0.4);
  -webkit-odor: s(78%,7%,3%,0%,23%,0.4);
  -o-odor: s(78%,7%,3%,0%,23%,0.4);
  odor: s(78%,7%,3%,0%,23%,0.4);
  /* sweet, bitter, sour, salty, umami, intensity */
}
#steak {
  -moz-odor: s(5%,18%,8%,27%,89%,0.8);
  -webkit-odor: s(5%,18%,8%,27%,89%,0.8);
  -o-odor: s(5%,18%,8%,27%,89%,0.8);
  odor: s(5%,18%,8%,27%,89%,0.8);
  /* sweet, bitter, sour, salty, umami, intensity */
}
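
If you want to avoid shipping odor declarations to engines that don’t understand them, a feature query could gate the rules. This is a minimal sketch assuming the unprefixed syntax above; the @supports at-rule itself is standard CSS, though whether any engine would actually report support for a draft property is another question:


@supports (odor: s(0%,0%,0%,0%,0%,0.0)) {
  #rose {
    /* Only applied where the engine claims to understand the odor property. */
    odor: s(78%,7%,3%,0%,23%,0.4);
    /* sweet, bitter, sour, salty, umami, intensity */
  }
}

Engines without support simply skip the block, so the page degrades to an odor-free experience.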

Now all we need is some sort of hardware to reproduce the different intensities that make up an odor; otherwise experimentation might make for some awful results. I doubt a Smell of Books-style solution will fit for this, since I don’t have enough hands to spray five different flavors, nor the skill to spray at the right intensity.

Update, April 1, 2013

It is my understanding from the folks at Opera who are working on all-things-web-for-your-TV that the CSS Odor Module will play a big part in the technology behind this story from New Scientist: Smell-o-vision screens let you really smell the coffee (March 29, 2013).

When Matsukura, the inventor of the display, says the next stage is to incorporate a printer-style cartridge that allows smells to be changed easily, what he’s referencing is the mapping of odors as outlined in the specification.

An example of this kind of technology in action (read more):

Update, April 1, 2014

News broke recently of a study suggesting humans can smell over a trillion different odors (Expert nose: we can sniff out over a trillion smells), well beyond prior estimates. Given this news, the W3C is considering revising the CSS Odor Module to account for the new findings.

My guess is that any browser-prefixed odor styles will continue to work for some time until the W3C can sort out some new syntax.

The changes may also affect apps that rely on web standards, such as this one by Scentee, targeted at cash-poor college students who can only afford rice.

Update, April 1, 2015

Related technologies that will hopefully benefit from the standard:

Update, April 1, 2017

A Paris Ad Agency Made a Virtual-Reality Nose Mask That Emits a Fart Smell because of the South Park video game. Which makes me not want to play it.

Update, April 1, 2022

In an unfortunate further acknowledgement of the mess that is Facebook’s Metaverse, NPR ran a story about a company trying to make it smell: Vermont tech firm believes to experience the metaverse, you have to smell it too.

I embedded the audio:

Sadly, there is no indication it will use a standardized approach for mapping those odors, and no indication it will contribute to the Odor Module. Ten years on, it appears the draft specification has entered permanent limbo. Kind of like XForms 2.0.

Update, April 1, 2023

In the Scientific American article AI Predicts What Chemicals Will Smell like to a Human, the author discusses a different mapping for odors than the one I discuss above.

The model may also open the door to new technology that records or produces specific scents on demand. Wiltschko describes his team’s work as a step toward “a complete map” of human odor perception. The final version would be comparable to the “color space” defined by the International Commission on Illumination, which maps out visible colors. Unlike the new olfactory map, however, color space does not rely on words, notes Asifa Majid, a professor of cognitive science at the University of Oxford, who was not involved in the studies. […]

[…]

One major challenge is the identification of primary scents. To create the olfactory equivalent of digital imagery, in which smells (like sights) are recorded and efficiently re-created, researchers need to identify a set of odor molecules that will reliably produce a gamut of smells when mixed—just as red, green and blue generate every hue on a screen.

“An Odor Map” scatter plot with unlabeled axes. Among the hundreds of points, some discontinuous amorphous shapes are drawn: a tall narrow purple blob labeled ‘musk’ in the upper left quadrant, a wider and shorter green ‘Lily’ in the center left, a large bulbous red hook in the bottom left labeled ‘Grape’, and an orange squashed U in the center right as ‘Cabbage’.
The proposed odor map.

The idea of leaning on the color model for mapping is there, even if the specific attributes are still being sorted out. Of course, this means very little for moving a standardized odor mapping forward in a meaningful way.
