Sensory UX (aka Sensorial UX) is a new branch among the many UX disciplines.
One peculiarity of Sensory UX is that it has always existed, even in ancient times, and many experiments and inventions have used this paradigm.
However, as a clearly defined and regulated UX field, it is very new.
There are many possible explanations for this, since UX history shows that the concept was taken into account even before User Experience got its name.
One of the most likely explanations is that UX as a science or practice was divided into two large areas: physical product design on the one hand, and Human-Computer Interaction (HCI) on the other.
As a result, the founding current of what we know today as UX moved away from the original discipline (Ergonomics and Human Factors), which considered user experience at deeper levels, in order to develop these newer currents. That is logical, given that they dominate the economy and frame our daily lives.
Despite this, research shows that this “abandonment” is being replaced by a revision of concepts that leads us back to these sensory experiences. And so begins what we call Sensory UX.
What is Sensory UX? Is it Sensory or Sensorial UX?
Sensory UX is one of the branches of the User Experience disciplines that considers multiform experiences that involve more than one sense, provided that these senses interact and enrich the experience.
To put it as a definition:
Sensory / Sensorial UX is a discipline focused on the study, research and creation of experiences that involve the different senses in interaction with physical, virtual or experiential environments.
The key is the interaction, and the multiple senses engaged within that interaction.
A simple Sensory UX example
Take a look at the photo above. If I see a brown object and touch it, I involve two senses. If touch adds nothing to the experience, then we are not in the presence of Sensory UX.
However, if touch informs me of additional characteristics of the object that enrich my experience, then we are in the presence of Sensory UX.
For example: the brown object (a lens case) has a nice textured surface that makes it difficult for it to slip from my hands. By touching one of its sides, I can feel (or see, if I am sighted) the hinges, which tells me which side the case opens on.
When I open it, I feel that its interior is velvety, with the brand embossed on one of its covers. This tells me the brand is sophisticated and takes special care of my lenses. Opening and closing the case also makes two very distinctive sounds; the second one (heard when I close the case) is loud and confirms that the case is completely closed.
This is an extremely basic example that involves three senses: vision, touch, and hearing, in an object as simple as an eyeglass case, with no technology beyond a mechanical closing mechanism.
And yet this very simple approach raises the perceived value, usability, and user experience of the product. All without complicated mechanisms or grand conceptualizations.
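The criterion behind this example (at least two senses must each add information that enriches the experience) can be sketched as a quick check. This is purely a hypothetical model: the touchpoint inventory and the helper function are illustrative, not an established tool.

```python
# A hypothetical inventory of the eyeglass case's sensory touchpoints.
touchpoints = [
    {"sense": "sight",   "informs": "color, brand embossing, hinge position"},
    {"sense": "touch",   "informs": "grippy texture, hinge side, velvety interior"},
    {"sense": "hearing", "informs": "distinct open/close sounds; loud click confirms closure"},
]

def is_sensory_ux(touchpoints) -> bool:
    """Per the definition above: at least two senses must each
    add information that enriches the experience."""
    enriching = {t["sense"] for t in touchpoints if t["informs"]}
    return len(enriching) >= 2

print(is_sensory_ux(touchpoints))  # True
```

If touch added nothing (an empty `informs` entry), the set of enriching senses would shrink and the check would fail, matching the rule stated above.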
Is it Sensory or Sensorial UX?
Since this discipline is still taking shape, there are no complete or definitive definitions yet, not even for its name.
In any case, Sensory and Sensorial mean the same thing (Sensorial is less commonly used, but it is a synonym of Sensory). The difference is that Sensorial is a word that exists in several languages, whereas Sensory exists only in English.
Since in our UX agency we firmly believe in the precepts of Universal UX, it seems to us that Sensorial is more inclusive and comprehensive than a word that exists in only one language.
Another useful aspect of the distinction is that it differentiates two ways of understanding the paradigms in which Sensory/Sensorial UX is evolving.
Sensory UX is usually seen as something more closely related to the 5 senses, with a strong focus on wearables.
Sensorial UX, on the other hand, is more inclusive: it deals with the 9 senses now accepted as valid (we will delve into that below) and is not limited to wearables, or even to digital experiences. On the contrary, it includes all of these, plus experiential environments (psychological states, moods, aspirations, circumstantial states, etc.).
How many senses are there?
As we all know and have learned since we were children, there are 5 basic senses: sight, hearing, taste, smell, and touch.
However, the word “basic” in the previous sentence is key.
First, let’s try a very simple example: suppose we are wearing a wool shirt. Next, we put on a silk shirt. Obviously we will notice a very important sensory difference. But what sense intervenes in that sensation?
Perhaps we could say “the sense of touch”, or even “a tactile experience” in a broad sense.
Of course, that doesn’t quite explain these sensations. In the classical view, the nerve endings of the sense of touch are concentrated in the fingers and toes. But the main sensation occurred across our body, not in our fingers.
In addition, other information beyond the merely tactile intervened in this experience. We could grant, for example, that the experience was tactile based on the texture of the fabric.
But there is additional information: one cloth was warmer, another cooler. One was heavier, another lighter. One was more comfortable, another more uncomfortable.
As we can see, the sensations we feel when wearing the two shirts are not fully explained by the sense of touch.
Does this mean that there are other senses beyond the 5 basic senses?
Yes.
In the example above, the sense involved is called Somatosensation or Tactition. It is a general or “multiple” sense that includes several types of senses, since it combines touch and interoception.
Technically, what we feel is information transmitted by small mechanoreceptors located in our epidermis. These mechanoreceptors, called Merkel cells, inform different parts of our brain about the sensation produced by an object or event; in this case, wearing the two shirts.
But so as not to bore you, let’s leave aside the technical aspects related to medicine and deep psychology and continue in a more simplified way (if you want to dig deeper, check the end of this article, which includes in-depth content and a bibliography).
List of all senses in Sensory UX
As we have seen, the 5 basic senses are not enough to explain our sensoriality; other senses also intervene.
How many senses are there? It’s hard to say. It depends on the theoretical framework we use, and even on the discipline from which we analyze them.
But since UX is a set of sciences and disciplines in which Psychology is a fundamental protagonist (especially cognitive and behavioral psychology, something you can see in our UX Team), we will follow the most common definitions within the theoretical framework of Psychology, Neurology, and Neurobiology.
List of the 9 senses according to psychology and neurobiology
Based on the premises mentioned above, we can recognize 9 human senses (some animals have additional senses). First, the 5 basic senses:
- Sight
- Hearing
- Touch
- Taste
- Smell
To these, 4 more senses are added:
- Interoception or Sense of Internal state of the body
- Proprioception or Kinaesthesia
- Nociception or Sense of Pain
- Equilibrioception or Sense of Balance
With these we have the 9 fundamental senses. At least 2 more can be added, depending on the theoretical framework: Thermoception (the sense of temperature) and the Alert Sense.
However, for the purposes of Sensorial UX these last two are not representative, since both can be considered part of the Interoception system.
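The taxonomy above can be written down as a small data structure, which is handy when auditing which senses a product engages. A minimal sketch; the class name and value labels are mine, not part of any standard.

```python
from enum import Enum

class Sense(Enum):
    # The 5 basic senses
    SIGHT = "sight"
    HEARING = "hearing"
    TOUCH = "touch"
    TASTE = "taste"
    SMELL = "smell"
    # The 4 additional senses recognized in this framework
    INTEROCEPTION = "internal state of the body"
    PROPRIOCEPTION = "kinaesthesia"
    NOCICEPTION = "pain"
    EQUILIBRIOCEPTION = "balance"

print(len(Sense))  # 9
```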
Explaining the 4 additional senses
As I said before, I am not going to delve too deeply into the medical or psychological aspects of each sense; I will just mention them and explain what they are and what they are for (a future article will cover concrete examples of how to create experiences based on Sensory UX).
Interoception or Sense of Internal state of the body
Interoception is the brain’s mechanism for interpreting signals transmitted from the body, allowing an analysis of its physiological state. This analysis or representation can be conscious or unconscious (what we colloquially call a sensation or feeling).
This sense is perhaps the most important of all, as it determines how we feel, how we “experience” ourselves, and how we live. Literally. Interoception is the “sense of life” for any living organism, and its malfunction often indicates diseases and pathological conditions of various kinds, as well as neurological and psychological syndromes ranging from anxiety and obsessive-compulsive disorder (OCD) to autism and post-traumatic stress disorder (PTSD).
Because of its importance, Interoception is the basis of sensory integration in many projects, both physical and virtual.
Proprioception or Kinaesthesia
Proprioception (aka Kinaesthesia or Kinesthesia) is the sense that makes us aware of our body position and the way we move.
This sense is often called the 6th sense (sorry, Hollywood fans!), and it is related to muscles, tendons, and joints. Simply put: how we move, where we are, what we need to do in order to reach something, the strength and speed we need to perform an activity, and so on.
As you may imagine, Proprioception is the definitive form of sensory integration in physical product development, and the sense that guides the Ergonomics and Human Factors discipline, as well as many developments in Product Design and Industrial Design.
Nociception or Sense of Pain
As the name implies, this is the Sense of Pain.
Nociception causes our body to respond to stimuli with a variety of physiological and behavioral responses. These responses are subjective: a feeling or perception that we identify as pain or discomfort.
Nociception can also generate autonomic responses even when we are not aware of the sensation. Thus, we can feel symptoms such as sweating, dizziness, nausea, or tachycardia, or simple sensations of discomfort, without feeling pain or being able to identify what is happening to us.
A clear example of these autonomic responses is when we say “I feel weak” or “I have no strength”, or when someone tells us “you look pale”.
Equilibrioception or Sense of Balance
Equilibrioception is the sensory mechanism that prevents living beings from falling when standing or moving. More than a single sense in itself, it is a set of sensory systems that work together: the visual system (vision), the vestibular system (inner ear), and proprioception.
The balance system works with the visual and skeletal systems (that is, kinesthesia or proprioception) to maintain orientation and balance. Visual signals about the body’s position in relation to its environment are sent to the brain, processed, and compared with information from the vestibular and skeletal systems.
Although at first it seems difficult to see how to implement sensory integration using the sense of balance, we will soon see that it is much more common than it seems. In fact, it is very likely that you have already used a device today that integrates Equilibrioception with other senses: a smartphone that rotates its screen when you tilt it.
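A phone “knows” which way is up by reading the direction of gravity from its accelerometer, a digital analogue of our vestibular system. The sketch below is a simplified, hypothetical version of that auto-rotation logic; the function name and the angle thresholds are illustrative, not any vendor’s actual implementation.

```python
import math

def screen_orientation(ax: float, ay: float) -> str:
    """Classify device orientation from the gravity components
    (m/s^2) reported by an accelerometer along the screen's
    x and y axes, as a phone does when it auto-rotates."""
    # Angle of the gravity vector in the screen plane.
    angle = math.degrees(math.atan2(ax, ay))
    if -45 <= angle <= 45:
        return "portrait"
    elif 45 < angle <= 135:
        return "landscape-left"
    elif -135 <= angle < -45:
        return "landscape-right"
    return "portrait-upside-down"

# Device held upright: gravity pulls along the screen's +y axis.
print(screen_orientation(0.0, 9.81))   # portrait
# Device tilted on its side: gravity along the +x axis.
print(screen_orientation(9.81, 0.0))   # landscape-left
```

The point is not the trigonometry but the integration: the device combines this “balance” reading with visual output, so the interface stays upright for the user just as our vestibular and visual systems cooperate.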
How do we use Sensory Integration in Products or Services?
Sensory integration is the mechanism by which we “integrate” the senses into the product: the planning of the physical or virtual product, the service, or any other outcome we target when we implement a Sensory UX strategy.
Remember that Sensorial UX goes far beyond the physical, or wearables. Rather, it delves into the realms of deep psychology, cognitivism, and behaviorism. This allows us to generate physical experiences (that is, products), but also services, or multiform experiences. In that sense, Sensorial UX is the overcoming of UX as we know it (even though the UX we know was always prepared for Sensorial UX, what a paradox).
Through sensory integration, we can create rich experiences that help more people. Almost no UX studios work on problems related to psychodiagnostics or neurological pathologies; Sensory UX is the answer to that failure of our industry.
But Sensorial UX is not only for “people with problems”; it is mainly for “normal people”. Note the quotation marks, since there is hardly a person without any problem, or who could be described as “normal” by everyone without distinction.
When we talk about “normal people” (I really like the term autistic people use to describe this typology: “ableist”) we are talking about people who apparently do not have great physical, psychological, socio-cultural or any other type of disability.
And these people, who simply do not have major psychomotor problems, are the ones who can make the greatest use of sensoriality. Through these tools, industry and commerce can generate greater benefits with planned sensory integration.
Two of the industries blooming with multi-sensory integration strategies are Virtual Reality (VR) and Augmented Reality (AR). These are quite obvious, of course. But as we saw in the first example of the eyeglass case, we don’t need complex technology. Not even a device!
An example you may have already seen (or smelled!): there are clothing stores that play a certain type of music, create a certain visual environment, and have an automatic device at the entrance that emits a light cloud of perfume. These are strategies designed by means of Sensory UX (although some stores simply imitate those who implemented these strategies first).
But let’s stop with this example for now: a future article will present concrete examples and even tutorials on Sensorial UX development strategies.
Additional Bibliography for Sensorial UX and Sensory Related Subjects
- Interoceptive Awareness Skills for Emotion Regulation: Theory and Approach of Mindful Awareness in Body-Oriented Therapy (MABT) by Cynthia J. Price and Carole Hooven (2018)
- Brain region learns to anticipate risk, provides early warnings, suggests new study in Science by Gerry Everding (2005)
- Nociceptors by D. Purves (2001)
- Beauty is truth: Multi-sensory input and the challenge of designing aesthetically pleasing digital resources by Claire Warwick (2017)
- An Exploration of Sensory Design: How Sensory Interaction Affects Perceptual Experience in an Immersive Artwork by K. Harnpinijsak (2019)
- Sensory incongruity and surprise in product design by G. Ludden (2008)
- Multisensory inclusive design with sensory substitution by T. Lloyd-Esenkaya, V. Lloyd-Esenkaya, E. O’Neill, and M. J. Proulx (2020)
- UX Sensorial: un nuevo enfoque para mejorar la experiencia del usuario (“Sensorial UX: a new approach to improving the user experience”) by F. Devin, Academia.edu (2019)