MPathy

MPathy is a speculative concept that augments digital messaging with AI-interpreted emotion visualisations based on physiological signals. Through desk and user research, we explored misunderstandings in digital communication and designed a system that allows users to manage, preview, and selectively share emotions. The result is a conceptual prototype that rethinks emotional presence in chat interfaces, opening up new possibilities for trust-sensitive, affective interaction design.

Problem

Digital communication lacks emotional bandwidth. While fast and convenient, text often strips away tone, nuance, and intent, leading to misunderstandings that wouldn't occur face-to-face. Emotions aren't absent; they're embedded, encoded implicitly in timing, punctuation, and phrasing. Emojis attempt to compensate, but they're abstract, ambiguous, and often misinterpreted culturally or contextually. In emotionally charged situations, this blind spot creates friction. MPathy addresses that gap by making emotions visible, but not prescriptive.

My Role

MPathy was a highly collaborative team project in which all of us contributed across research, concept development, and interface ideation. Early on, I found myself drawn to user research and systems thinking and began shaping how we explored emotional models and interaction scenarios. While our roles weren’t sharply divided, I helped translate theoretical questions into designable elements and supported the team in grounding the concept in both user needs and communication logic.

Methodology

We combined desk research on affective computing and emotion theory with semi-structured interviews to understand how emotion gets lost in digital text. While literature has highlighted the layered nature of affect, the interviews revealed how users struggle with emotional ambiguity in everyday chats. Both methods converged on the same tension: emotion is always present, but rarely made explicit. This became the conceptual anchor. Based on these insights, we built experience maps, explored visual metaphors, and iterated early prototypes through user-driven feedback.

Initial question

How does the absence of emotional cues in digital communication disrupt the balance between content and relational meaning, and lead to misunderstanding?

Method

Desk research on Watzlawick’s communication model. Focus: How missing emotion distorts the relational layer in text-based interaction.

Insight

Digital messages lack relational signals. Without them, users misread tone, overfocus on content, and fill gaps with assumptions.

Decision

Emotional markers were added to make the relational layer visible. They reduce misalignment and support more accurate interpretation.

Initial question

How can basic emotion theory be operationalised in system logic, and what models allow for flexible yet structured emotional communication?

Method

Desk research on emotion classification models. Compared basic sets and dimensional approaches to identify usable structures for emotional encoding.

Insight

Emotion theories disagree on dimensions, but all reduce affect to structured types. Systems need flexible mapping without oversimplifying expression.

Decision

A decision wheel encodes emotion via eight core types and adjustable intensity. Users select which emotion is shared and how strongly for each individual contact.
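To make this logic concrete, the wheel's data model could be sketched roughly as follows. This is an illustrative sketch only: the project does not name its eight types, so Plutchik-style labels are assumed here, and all class and field names are invented.

```python
from dataclasses import dataclass, field

# Eight core types, e.g. Plutchik's wheel (an assumption; the project
# does not specify which basic-emotion model it uses).
CORE_EMOTIONS = (
    "joy", "trust", "fear", "surprise",
    "sadness", "disgust", "anger", "anticipation",
)

@dataclass
class EmotionSelection:
    """One position on the decision wheel: a core type plus intensity."""
    emotion: str       # one of CORE_EMOTIONS
    intensity: float   # 0.0 (barely shared) .. 1.0 (full strength)

    def __post_init__(self):
        if self.emotion not in CORE_EMOTIONS:
            raise ValueError(f"unknown emotion: {self.emotion}")
        self.intensity = max(0.0, min(1.0, self.intensity))

@dataclass
class ContactSettings:
    """Per-contact sharing: which emotion is sent, and how strongly."""
    selections: dict = field(default_factory=dict)  # contact_id -> EmotionSelection

    def share_with(self, contact_id: str, emotion: str, intensity: float):
        self.selections[contact_id] = EmotionSelection(emotion, intensity)

    def outgoing(self, contact_id: str):
        # None means nothing is shared with this contact.
        return self.selections.get(contact_id)
```

The key design point the sketch captures is that the selection is stored per contact, so the same underlying state can be shared at different intensities with different people.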

Initial question

How can physiological signals be leveraged to continuously track users' emotional states, and what implications does this have for adaptive system design?

Method

Exploratory desk research across affective computing, biometrics, and behavioural psychology. Focused on how physiological signals like heart rate, skin conductance, or facial tension can indicate emotional states over time.

Insight

Emotions are evident in both behaviour and measurable physiological changes. Signals such as heart rate variability or skin conductance provide a foundation for emotion tracking, especially when interpreted in context and over time.

Decision

MPathy tracks multimodal physiological input (e.g., heart rate, GSR). These signals are interpreted via neural models and semantically linked to affective states for adaptive use over time.

Solution

Main Idea

MPathy reduces misunderstandings and promotes interpersonal empathy in digital communication by processing AI-interpreted sensor data into emotion visualisations and sending these, linked to the text messages, to the chat partners.

Structure of the emotion visualisations

Emotions are rarely singular. They overlap, combine, and fluctuate, which makes assigning them to predefined categories inherently reductive. Different people perceive emotions differently, and few emotional states can be captured by a single word.

That’s why MPathy avoids naming emotions. Instead, we designed an abstract visual system that reflects intensity, layering, and change without forcing interpretation. The aim is not to define what is felt, but to make felt experience perceptible. Especially when paired with written messages, these visuals reduce ambiguity and make emotional tone more tangible than text alone.

Translation from measured value to visualisation

We deliberately chose not to map measured data to predefined emotion labels. Translating physiological signals into language, and then into visual form, would have created an unnecessary layer of interpretation and potential distortion. Instead, the system directly modulates parameters of an abstract visualisation based on sensor input.
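As an illustration of such a label-free mapping, raw readings could be normalised and fed straight into abstract visual parameters. This is a sketch under assumptions: the parameter names, signal ranges, and weightings are all invented, since the project leaves the concrete pipeline open.

```python
def normalise(value, lo, hi):
    """Clamp a raw reading into [0, 1] relative to an expected range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def visual_parameters(heart_rate_bpm, gsr_microsiemens, hrv_ms):
    """Map physiological input directly onto abstract visual parameters.

    No emotion label is ever assigned: arousal-related signals drive
    intensity and motion, while heart-rate variability drives layering.
    All ranges and weights here are illustrative assumptions.
    """
    arousal = normalise(heart_rate_bpm, 50, 120)
    conductance = normalise(gsr_microsiemens, 1, 20)
    variability = normalise(hrv_ms, 20, 100)
    return {
        "intensity": 0.6 * arousal + 0.4 * conductance,  # overall strength
        "motion_speed": arousal,                          # rate of change
        "layering": 1.0 - variability,                    # low HRV -> denser layers
    }
```

Because every output stays a continuous value in [0, 1], the visualisation can blend and fluctuate smoothly rather than snapping between discrete emotion categories.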

Emotions are complex, dynamic, and rarely singular in nature. Rather than naming them, MPathy aims to make their presence and intensity perceptible. The visualisation remains open to interpretation, but becomes meaningful in combination with written text. This avoids reduction while supporting clearer interpersonal understanding.

Recognising, sending, and receiving emotions


MPathy detects emotional signals and displays a preview before sending, making the emotional tone visible without relying on self-selected emojis. Incoming emotions appear briefly as an overlay and can be revisited at any time.

Trust-based contact classification

Users can assign contacts to trust categories such as “work”, “family”, or “friends”. Each category defines which emotional signals are shared, ensuring appropriate transparency in every context.
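Such a trust policy could be sketched as a simple filter over the shared visual channels. The channel names and the per-category policies below are assumptions for illustration; the concept only states that categories differ in what they expose.

```python
# Which visual channels each trust category exposes (illustrative policy;
# the project does not specify the actual per-category rules).
TRUST_POLICIES = {
    "work":    {"intensity"},                              # minimal sharing
    "friends": {"intensity", "motion_speed"},
    "family":  {"intensity", "motion_speed", "layering"},  # full sharing
}

def filter_for_contact(parameters, category):
    """Drop visual parameters the contact's trust level should not see."""
    allowed = TRUST_POLICIES.get(category, set())  # unknown category -> share nothing
    return {name: value for name, value in parameters.items() if name in allowed}
```

Defaulting an unknown category to an empty set makes the filter fail closed: if a contact is unclassified, nothing emotional is shared.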

Pause emotions

Emotional sharing can be paused for a single message or a defined period. This supports situational control and protects emotional privacy when needed.
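The two pause modes described above, skipping a single message or pausing for a defined period, could be modelled like this. A minimal sketch with invented names; the project does not specify the mechanism.

```python
import time

class EmotionSharing:
    """Tracks whether emotional sharing is currently paused.

    Supports two modes: skip the next message only, or pause
    for a fixed period. (Illustrative sketch; names are assumptions.)
    """
    def __init__(self):
        self._skip_next = False
        self._paused_until = 0.0

    def pause_next_message(self):
        self._skip_next = True

    def pause_for(self, seconds, now=None):
        self._paused_until = (now if now is not None else time.time()) + seconds

    def should_attach_emotion(self, now=None):
        if self._skip_next:
            self._skip_next = False  # one-shot: consumed by this message
            return False
        return (now if now is not None else time.time()) >= self._paused_until
```

Passing `now` explicitly keeps the logic testable; in the real system the current time would simply be read from the clock.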

Outcome

MPathy emerged as a speculative exploration, designed with a 10-year horizon in mind. But with recent advances in affective computing and real-time emotion tracking, parts of the system now feel technically within reach. In hindsight, the project’s true strength lies not in technological foresight but in how it frames emotional bandwidth, agency, and trust as designable layers of human-machine interaction. While the current version remained conceptual, it set a strong foundation for further prototyping, validation, and ethical debate.

Learnings

For me, it was a first encounter with emotion-aware interaction design and taught me how conceptual UX thinking can frame complex human questions as actionable design spaces. While the current version remained exploratory, it formed a basis for future iterations and sparked key questions about trust, expressiveness, and agency in AI-mediated communication. It also fostered a lasting interest in human-AI interaction and deepened my passion for UX research.