Design your first Quantum UX project in 7 easy steps

Added on August 1, 2021 - Category: Essays, Theory, UX Theory
Build your first Quantum UX project cover image

How to design a Quantum UX project?

After the introductory article on Quantum UX, I received some feedback (fortunately mostly positive) and a number of inquiries.

These inquiries mostly revolved around the apparent difficulty of designing and implementing a project with Quantum UX.

I hope that this article will serve to explain to you in a much simpler way how to develop a project using the QUX framework and how to apply the basic concepts of XMI / XCI, XEl and reactivity.

Although many of the concepts seem very abstract and may require additional effort (which is logical with a novel dialectical conception), QUX is actually very simple, and therein lies part of its beauty.

Most of the blame for failing to convey this simplicity lies with me, as I tried to condense as many concepts as possible into the smallest possible space. My most sincere apologies.

However, this article will probably show that QUX is actually simplicity itself.

Multiform design and multidimensional design

Although I initially thought that the concept of reactivity of the XEl would be the biggest difficulty (since XEl are really the most complex components of QUX), the questions mainly revolved around 3 axes. One of them is the dual concept of multiform design / multidimensional design.

As you may have read in previous articles, these concepts are central to the design of a Quantum UX project.

And it’s easy to see that something that has multiple shapes and / or multiple dimensions is an almost impossible abstraction to grasp. And technically, it is.

Now, are they as complex as they sound?

Yes and no.

Yes, because the first case is about creating user experiences that have no boundaries or defined shapes and that transcend the different dimensions of user experience based on Quantum UX.

But let’s focus on the simple part: QUX is as complex as it is simple at the same time (and therefore the QUX concept itself is multidimensional and multi-form).

Although the experiences we generate can be very difficult to comprehend, they are in fact self-generated. What we perceive and measure is the result, and that result is itself measured automatically so that the experience can regenerate.

Quantum UX Project: Google Analytics Example
Check this out: Google Analytics allows us to extract information to design Quantum UX projects just like that

So it doesn’t matter at all whether we perceive a QUX user experience one way or another, because if the design is correct, it would be very difficult for 2, 3, 4 or 100 people to perceive the same experience.

On the contrary, all users will perceive something different, unique and personalized across different dimensions of the experience.

For example, when we define a dimension as the “volume” of an event (e.g. “number of interactions”), we are implicitly measuring and defining our experiences in the context of another experience (the volume of a concert or a piece).

But again, no matter how complex it may seem to be, multiform design is not complex at all. It just needs to be developed from the ground up with respect to QUX: The XEl should allow us to generate any user experience that can be conceived and defined within any given context while respecting some rules that will be defined later on.

And this basically means that you have to create rules describing what you want to do with your media and how those things should behave together within existing contexts (if they are not already part of one), and then try them out.

This concept is very simple but not easy at all, because it requires great creativity and knowledge about how actors behave inside different contexts. However, you can simply apply the principles described in my previous articles as a starting point. And if you feel they’re not clear enough (again, my fault!), maybe make use of my Recommended Reading list for some inspiration! 😉

Let’s look at two examples taken from previous projects I implemented using QUX: a case study on integrating an application into the operating system (one that is often considered an OS in itself), and an interactive menu for a computer-aided design application.

In both cases I wanted the same thing: the interactions between elements should follow certain rules that result in specific user experiences, regardless of whether those elements are located on separate computers or inside one operating system, application, web page, etc.

Nevertheless I chose different ways of implementing these rules according to the context in which they were applied.

But let me stress once again: this approach works only if you really want it. If you choose something else instead, there will probably be nothing wrong with your decision… but there might also be nothing right about it either!

So really think hard about what you want before starting down this road!

Of course, dual concepts such as multiform design/multidimensional design imply more complexity than creating standard interfaces for generic software applications in tools like Figma, Photoshop or Sketch. These systems contain many more potential states than conventional ones, so they leave less room for the usual kind of creative experimentation and require much more effort.

Regarding interactivity, reactivity enables design processes in which each individual element can react differently depending on its position relative to other, similar parts within a given environment. However, these changes must always respect certain pre-defined rules.

Let’s consider Photoshop: we can imagine operations like renaming or saving features, where every feature has its own unique ID that holds information about its name, type, location, etc. Whenever anyone renames, saves or deletes a feature with a special name, the corresponding IDs can change automatically while preserving all the data related to those features.
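One hedged way to sketch that idea in code (the registry, its methods and its fields are my own invention for illustration, not a real Photoshop API): metadata is keyed by a stable ID, so a rename changes only the name field while everything stored under the ID is preserved.

```python
import uuid

class FeatureRegistry:
    """Toy registry where features keep a stable ID across renames.

    Hypothetical illustration only -- not a real Photoshop API.
    """

    def __init__(self):
        self._features = {}  # id -> {"name": ..., "type": ..., "location": ...}

    def create(self, name, type_, location):
        feature_id = uuid.uuid4().hex
        self._features[feature_id] = {"name": name, "type": type_, "location": location}
        return feature_id

    def rename(self, feature_id, new_name):
        # Only the name changes; all data keyed to the ID is preserved.
        self._features[feature_id]["name"] = new_name

    def get(self, feature_id):
        return self._features[feature_id]

registry = FeatureRegistry()
fid = registry.create("Layer 1", "layer", (0, 0))
registry.rename(fid, "Background")
```

After the rename, `registry.get(fid)` still returns the original type and location, which is the behavior the Photoshop analogy describes.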

When designing Quantum UX UIs we must work under similar conditions, which allows us not only to implement cutting-edge technology but also to develop logical preconditions, based on human logic, for self-generated creative purposes.

In addition, most users expect GUIs with multiple layers, so they don’t necessarily know whether their input has been processed correctly due to contextual interference.

In short: multiform design gives designers freedom by reducing unnecessary constraints and opening up new possibilities, thus increasing creativity exponentially.

How to apply Universal UX in a QUX user experience

This was the second concept that caused the most problems. And in fact, in its purest state, it is the simplest of all. Furthermore, the tools to create this are available for free, and you won’t need deep technical knowledge.

One way to define multiform experiences based on both Cultural UX and Universal UX is, for example, to use Google Analytics data (which we can also query with BigQuery), or Google Optimize, which allows us to create both more “conservative” UX designs and more innovative ones if we wish.

In other words: We can make it complex (for example, the work of anthropologists, psychologists and language experts would give us incredible richness). But we can also automate it completely and say:

if A then yellow, if B then blue, if A has parts of B then green. 

It really is as simple as that.
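The rule above can be written out as a tiny function. This is only a minimal sketch: the trait labels “A” and “B” come from the article’s own example, and the idea that they could be populated from analytics segments is my assumption.

```python
def experience_color(user_traits):
    """Map a user's trait set to an experience variant.

    Implements the article's rule: if A then yellow, if B then blue,
    if A has parts of B then green.
    """
    has_a = "A" in user_traits
    has_b = "B" in user_traits
    if has_a and has_b:
        return "green"   # overlapping experience
    if has_a:
        return "yellow"
    if has_b:
        return "blue"
    return "default"     # fallback when neither segment matches

# The trait sets could, for instance, come from Google Analytics segments.
print(experience_color({"A"}))       # yellow
print(experience_color({"A", "B"}))  # green
```

The whole “automation” amounts to a lookup like this; the richness comes from how the segments themselves are defined.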

Quantum UX project: overlapping experiences model
Quantum UX project: A user has an experience A; another has an experience we call B. A third one will take elements from both experiences and create a 3rd one. And so on

Using Sensorial UX in Quantum UX

Finally, this is the concept that has generated the most queries. In this particular case, the difficulty is more real since sensoriality is more difficult to control.

However, designing sensory user experiences is easy. The hard part is measuring them.

But how can we measure them?

As mentioned in previous articles, actors (XCI or XMI) are cross-dimensional.

In the case of sensoriality, the experiences will usually be in the quantum dimension due to its high degree of subjectivity. And as we have already seen, the quantum dimension can be easily measured by the effects it has on other entities or dimensions.

Since Quantum UX is a metaphor taken from quantum physics, it’s common for us to use examples from physics in QUX. A well-known one is black holes: they were only a theory for decades, until astronomers discovered distortions in time and space created by invisible objects that had to be supermassive. In other words: a black hole. An element we can’t see, but whose effects we can perceive.

The same thing happens here. Sensory UX (we prefer “Sensorial UX”) is difficult to measure, because even the same user has different perceptions at different times.

Let’s try a simple example: suppose we have a shop and we turn on the air conditioner. We then see that while some customers are comfortable, others begin to rub their arms to warm themselves, or we see them shiver. They won’t need to tell us anything, and we won’t need a thermometer to know their body temperature. Simple observation of the effects will give us insightful information.
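This “measure by effects” idea can be sketched in a few lines. Everything here is an assumption for illustration: the signal names, the threshold, and the idea of recording one set of observed behaviours per customer.

```python
# Behaviours we interpret as signs of discomfort (hypothetical labels).
DISCOMFORT_SIGNALS = {"rubbing_arms", "shivering"}

def comfort_score(observations):
    """Return the fraction of observed customers showing no discomfort signal."""
    if not observations:
        return 1.0
    comfortable = sum(
        1 for signals in observations if not (signals & DISCOMFORT_SIGNALS)
    )
    return comfortable / len(observations)

# One set of observed behaviours per customer in the shop.
shop = [{"browsing"}, {"rubbing_arms"}, {"shivering", "browsing"}, {"smiling"}]
score = comfort_score(shop)
should_raise_temperature = score < 0.75  # act on the effect, not the cause
```

We never read the temperature directly; we only count observable effects and react to them, exactly as in the shop example.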

In short, even the most complex is at the same time extremely simple.

Designing a Quantum UX Project for the First Time

Now that I’ve explained the seemingly “fuzzy” concepts involved in implementing QUX projects, let’s move on to a practical example.

First, we will see how to organize a QUX project from its conception, and then create multiform and multidimensional experiences.

Sound interesting? Of course it is!

Let’s look at the following picture:

Simple Quantum UX project diagram (click to enlarge)

Step 1: Defining our Quantum UX project

First thing first. What are we going to do?

One of the most interesting things about Quantum UX (and the main difference from all other UX approaches) is that we don’t necessarily need an idea to start with.

We may have an idea for a product that fulfills a need. In this case, we’ll research that specific need and its scenarios. Pretty much like any other UX research process, except that in QUX we use a lot of existing data.

However, we can also just say: “hey, let’s research business opportunities”. Then use one of the many solutions available to do so, find one or more niches you think you can develop, and go back to the paragraph above. (Just in case you think this is crazy: it’s what SEOs and marketers do every single day, researching new business opportunities in order to develop them.)

With the above information, we’re ready to start our project. For example’s sake, let’s say we discovered data showing a demand for cupcakes with Bluetooth (hopefully you won’t do this!).

Image of a QUX project hand drawn diagram
We can start a QUX project with just a pen and a sheet of paper!

Step 2: Define the dimensions of the project

As we saw in the XMI article, QUX projects work in dimensions.

Although it is possible that some of these dimensions will be created automatically and therefore unexpectedly, it is preferable to have some control over them.

In other words, these dimensions or planes are the “canvas” on which entities “draw” their experiences. It is possible to let the entities roam freely in the dimension, but it is much better if we can guide them to the goal we need.

We already know that there are dimensions that are easier than others (ergo, more plannable). But even without knowing what the result of the entities’ action on the dimensions will be, we can plan for an event to occur in a more complex dimension, such as psychological or sensory.

Since our product is a cupcake with Bluetooth, we determine that our project will use the sensory dimension, the virtual dimension, and the physical dimension a priori.
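One minimal way to make those a-priori dimensions explicit is to declare them as data. This structure is my own sketch, not a prescribed QUX format; the field names are invented for illustration.

```python
# Hypothetical project definition: the three dimensions chosen a priori.
project = {
    "name": "Bluetooth cupcake",
    "dimensions": {
        "sensory": {"senses": ["sight", "taste", "touch"]},
        "physical": {"artifacts": ["cupcake", "packaging", "wrapper"]},
        "virtual": {"channels": ["bluetooth", "spotify"]},
    },
}

def planned_dimensions(p):
    """List the dimensions we planned for, so unplanned ones stand out later."""
    return sorted(p["dimensions"])

print(planned_dimensions(project))  # ['physical', 'sensory', 'virtual']
```

Keeping the planned dimensions explicit makes it easy to notice when entities later create dimensions we did not anticipate.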

Developing the sensory dimension of the user experience

Even if the madness of our example is obvious, we know that at least 3 senses will intervene in the experience at the sensory level. There will probably be more, but for a person without some kind of disability or psychophysical or neurological impairment, we can count on 3: sight, taste and touch.

What can we create to improve the experience of the senses?

Well, we can make sure that the cupcake has a very eye-catching decoration and that it comes in different colors and shapes (sense of sight).

Then we can create unique and unrepeatable flavors, with exotic combinations (sense of taste/smell).

Finally, we can make one type of cupcake really fluffy, while another is firmer and more consistent, and has wrappers with different textures (sense of touch).

Designing the physical dimension

For the physical dimension, we have already defined that we will make a tangible product (the cupcake), plus some packaging and special wrappers.

Virtual dimension design

As we are very innovative, we created a special flour that contains Bluetooth nano-transmitters: when the user eats a certain cupcake, the song that most evokes that cupcake starts playing from her Spotify list on her phone.

The complete UX design

We can see how the 3 dimensions interact and intersect with each other, creating new dimensions or extensions of the given dimensions (e.g., the inclusion of other senses, such as hearing, interoception, etc.).

Of course, we should remember that this is a fictional project that I’m just using to creatively illustrate the possibilities. If we wanted to use something more concrete, we could simply create a UI design made up of different elements that are able to respond to the data (we’ll see how to do this in another article, but this is what Amazon does, for example).

Step 3: Define the elements of the User Experience or XEl

The elements of the User Experience or XEl are physical or virtual elements that are able to be charged with energy by the data they receive, and at the same time return some or all of that energy to other XEls, changing the dimension they are in.

Continuing with our crazy example, we said that when we take a bite of the cupcake, depending on the cupcake chosen (taste, consistency, aroma, packaging) the “magic Bluetooth” sends a signal to our mobile phone that triggers the song on our list that most matches said cupcake.

The cupcake is an XEl that transmits information, the “magic Bluetooth” is another XEl that is powered by biting into the cupcake, the connection to our Spotify list is another XEl, and the selected song is another XEl. All of these elements send and receive information in one or more directions.
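The chain described above can be sketched as a toy pipeline. All class and method names here are hypothetical, and “energy” is modeled simply as a data payload each XEl enriches and forwards.

```python
class XEl:
    """Toy experience element: receives data and forwards it downstream."""

    def __init__(self, name, transform):
        self.name = name
        self.transform = transform  # how this XEl changes the data it receives
        self.downstream = []
        self.log = []               # everything this XEl has emitted

    def connect(self, other):
        self.downstream.append(other)
        return other  # returning the target lets us chain connect() calls

    def receive(self, data):
        data = self.transform(data)
        self.log.append(data)
        for xel in self.downstream:
            xel.receive(data)

# Cupcake -> "magic Bluetooth" -> Spotify list -> selected song
cupcake = XEl("cupcake", lambda d: {**d, "flavor": "mango-chili"})
bluetooth = XEl("bluetooth", lambda d: {**d, "signal": True})
spotify = XEl("spotify", lambda d: {**d, "song": "matched-song"})

cupcake.connect(bluetooth).connect(spotify)
cupcake.receive({"event": "bite"})
```

A bite event enters at the cupcake and arrives at the Spotify XEl enriched with everything the intermediate elements added, which is the one-or-more-directions information flow the paragraph describes.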

But there are other XEl that are (or can be) charged with energy. Let’s say the user experience is over. When we design a Quantum UX project, the consumption of the product is an important part of the process, but it is also an XEl itself.

What will happen next? It could be that users share their experience on social networks. Or we can measure whether they buy the product again, since we’ve already identified them. We can measure whether they buy the same product in different flavors, or always choose the same one. We can measure whether the interaction with the song that plays is positive, or whether the user wants an element of randomness and stops the song to see what other song is suggested.

It is important to know that all the information generated is measurable, and we must be ready to receive it, process it and generate new experiences (another of the major differences with the classic UX approaches).
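A sketch of collecting those follow-up signals as events. The event names and the in-memory list are invented for illustration; in practice the events might come from an analytics backend.

```python
from collections import Counter

# Hypothetical post-experience events streamed from the product.
events = [
    {"user": "u1", "action": "purchase", "flavor": "mango-chili"},
    {"user": "u1", "action": "song_skipped"},
    {"user": "u1", "action": "purchase", "flavor": "mango-chili"},
    {"user": "u2", "action": "shared_on_social"},
]

def repeat_buyers(evts):
    """Users who bought more than once (we can identify them already)."""
    purchases = Counter(e["user"] for e in evts if e["action"] == "purchase")
    return {user for user, n in purchases.items() if n > 1}

def wants_randomness(evts, user):
    """Did this user skip a suggested song, hinting they want variety?"""
    return any(
        e["action"] == "song_skipped" and e["user"] == user for e in evts
    )
```

Each derived measure (repeat purchases, skip behaviour, sharing) then feeds back into generating the next experience.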

Step 4: Measure Data and Train the XEl

We’ve already seen how the data is measured and sent to the cloud. Now we need to train our XEls so they can learn from it and generate new experiences for us. This is where Machine Learning comes in.

Machine learning (ML) is a field of computer science that gives computers the ability to learn without being explicitly programmed.

In other words, ML allows computers to automatically improve their performance on a given task by analyzing large amounts of data without the need for human intervention.
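As a deliberately tiny illustration of the idea (a nearest-neighbour learner written from scratch, not a production ML pipeline; the features and genres are invented), a model can “learn” which music goes with which user from past choices:

```python
def predict_genre(history, new_user, k=1):
    """Predict a preferred genre via k-nearest neighbours over feature tuples."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(history, key=lambda row: dist(row[0], new_user))[:k]
    genres = [genre for _, genre in nearest]
    return max(set(genres), key=genres.count)  # majority vote among neighbours

# Hypothetical training data: (age, preferred sweetness 0-10) -> chosen genre.
history = [
    ((62, 3), "classical"),
    ((58, 4), "classical"),
    ((21, 9), "rock"),
    ((24, 8), "rock"),
]
print(predict_genre(history, (60, 3)))  # classical
```

The same shape of model, trained on real interaction data instead of four toy rows, is what would let the cupcake match a song to a bite.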

AI big data nodes representation in Quantum UX

This means that our cupcake can “learn” what kind of music goes with each bite, or whether there are different types of users who prefer different songs based on their profile or tastes (e.g., some people like classical music while others prefer rock). The possibilities are endless!

But let’s not get too far ahead of ourselves… We still have one more step to go before we can start designing our Quantum UX projects!

Step 5: Create an environment for your XEl

Once you know how your product works as an element charged with energy, you need to create an environment where it can interact with other elements and change its dimensionality depending on the information it receives from them (or vice versa).

Let’s go back to our example: when I bite into my cupcake, depending on what song Spotify suggests at that moment, I may want a different cupcake the next time I buy one. But if Spotify recommends something completely random when I eat my favorite, I may not buy it again because it’s no longer to my taste.

So we have two choices: either we create one system where all these interactions happen within the same platform/environment/app, or we create multiple environments where each interaction happens separately but in parallel. The second option is far more complex, since it would require developing multiple micro-apps and combining them later using AI techniques such as clustering.

As always with any type of UX Design (UXD), this depends entirely on the problem you’re trying to solve and the goal you’re trying to achieve with your project.

Step 6: Design and Develop Your Quantum UX Product or Service

Now that you know all the elements of your User Experience, it’s time to design and develop your Quantum UX product or service based on what you learned in steps 1 through 5.

If you’re working on a Quantum UX experience, now the fun begins! You can start designing your product and its interactions with other elements.

For example, if we want to design an app for our cupcake that lets us choose what song plays when we bite into the cupcake, some of the things we need to consider are:

  • What happens when I press play?
  • Does my cupcake start playing music immediately or does it wait until I bite into it?
  • If so, how long should it wait before it starts? What happens if I don’t like the song Spotify is suggesting at that moment?
  • Can I change songs manually, or do I have to wait for another suggestion from Spotify based on my profile/preferences/etc.?
  • How many songs can my cupcake play at once (or simultaneously)?
  • Should there be a limit to the number of songs per day/week/month/year?
  • Is there a way to know what song will play next without having to ask someone else (e.g. Siri) or look something up online (e.g. Google)?
  • What happens if no one bites into my cupcake after X amount of time has passed since the last interaction with another element (e.g., a user)?
  • Do all these options only apply when eating in public places like restaurants and cafes, or also at home when watching TV shows/movies/etc.?

The list could go on and on, but you get the point. There are many things to consider when developing an app within a Quantum UX paradigm, but we also have the help of AI to assist us, so it’s a virtuous cycle.
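Many of those questions reduce to explicit, testable rules. A sketch of how a few of them could be encoded as configuration plus a small decision function; every parameter name here is an assumption for illustration, not a real API:

```python
from dataclasses import dataclass

@dataclass
class CupcakeAppConfig:
    play_on_bite: bool = True         # wait for a bite, or play on press?
    bite_delay_seconds: float = 2.0   # how long to wait after the bite
    allow_manual_change: bool = True  # can the user override the suggestion?
    daily_song_limit: int = 20        # cap on suggestions per day

def should_play(config, event, songs_played_today):
    """Decide whether to start a song for a given interaction event."""
    if songs_played_today >= config.daily_song_limit:
        return False  # answers the per-day limit question
    if event == "bite":
        return config.play_on_bite
    if event == "press_play":
        # Manual play works if we don't wait for bites, or overrides are allowed.
        return not config.play_on_bite or config.allow_manual_change
    return False

config = CupcakeAppConfig()
print(should_play(config, "bite", songs_played_today=3))  # True
```

Writing the rules down like this turns the open questions in the list into concrete defaults that user testing can then confirm or overturn.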


Step 7: Launch your Quantum UX project!


And there you have it! You can now start designing Quantum UX projects. Of course, there are many more steps (e.g. defining the business model), but they depend on the type of project you’re working on.


Conclusion

Quantum UX is a new approach to designing products and services that will change the way we interact with technology. It’s not just about creating experiences, it’s about measuring them and training our XEl to learn from them.

This means that in the future, Quantum UX projects may no longer be designed by humans, but by machines!

The possibilities are endless: imagine a world where you no longer have to choose between two options when making a purchase (e.g. black or white), but instead your product adapts to what you need at the moment. For example: if you need a pair of shoes for running, it would recommend a certain type of shoe; if you need another pair for a weekend trip with friends, it would suggest a different one. But everything will happen automatically, predicting behaviors based on Big Data.

Or imagine an app that knows how much time each user spends watching videos on YouTube each day, and suggests other content based on their preferences – all without requiring them to log in to their account! (wait… sounds familiar?)

For this reason, I strongly believe Quantum UX is the next step in User Experience. And while there are still many challenges ahead, I’m sure we’ll see some amazing results from this field sooner rather than later – results that seem like science fiction today!
