
Sensory Maze

Sensory Maze is an innovative, AI-driven desktop application with a virtual expression therapy interface, designed to deliver an engaging and immersive journey through the realm of sensory perception.

 

It is a multi-level game that utilizes a camera sensor to analyze users' facial expressions and determine if they accurately display specific emotions.

Platform: Desktop App

Tools used: Figma, Photoshop

Based on: Face API.js

Duration: 2023 - 2024


Challenge

The pandemic poses diverse challenges for individuals, especially vulnerable groups such as behaviorally challenged children. With face-to-face therapy unavailable, people with autism struggle with emotion recognition and expression. Expressive therapy, though effective, requires costly and difficult-to-arrange in-person interaction, a barrier the pandemic has only worsened.

Hypothesis

We believe that by using Sensory Maze at home, players can develop their emotional expression and strengthen their cognitive skills without expensive in-person therapy sessions.

Goals

In recent years, advancements in AI/ML technologies, particularly the ability to run machine learning models in the web browser to process webcam video, have opened up promising avenues for remote monitoring and therapy. The goal is to create a novel ML-based expression therapy web interface: users play a game on the computer in which they are asked to mimic the expression of a virtual therapist. The webcam feed capturing the expression is processed by an AI model, which provides audio/visual feedback to the user.
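As a hedged sketch of this pipeline (the /models path, the element ID, and the polling interval below are illustrative assumptions, not the project's actual values), the browser-side detection loop with face-api.js could look like this:

// Load the face-api.js models and start the webcam (sketch; paths and IDs are assumptions).
const video = document.getElementById('webcam');

async function startExpressionTracking() {
  // Tiny face detector + expression classifier, served from a hypothetical /models folder
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  // Stream the user's webcam into the video element
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;

  // Poll the current frame a few times per second for expression probabilities
  setInterval(async () => {
    const detection = await faceapi
      .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
      .withFaceExpressions();
    if (detection) {
      // detection.expressions maps each emotion (happy, sad, angry, surprised, ...)
      // to a probability; the game reacts to the most likely one.
      console.log(detection.expressions);
    }
  }, 300);
}

startExpressionTracking();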

My Design Process 

1

User Research 


Problem Analysis

User Interview

Target Persona

Application Flow


2

Design & Coding

Wireframing

Interactive Prototype

UI Design

Hi-Fidelity Prototype

Branding

Logo Creation

Animation


3

User Testing 

Code Change

Test With Users

Gather Feedback

Documentation


User Research

User Interview 

User research helps identify and understand the needs and pain points of the target users.

 

I conducted in-depth interviews with ten users who have issues with emotion regulation. Participants were asked a series of questions aimed at understanding their challenges, coping strategies, triggers, and effective techniques for managing emotions.

Example Questions


I asked them a few questions, such as:


  • Can you describe a recent situation or scenario where you found it challenging to regulate your emotions?

  • What strategies or techniques do you currently use to manage and regulate your emotions in difficult situations?

  • Can you tell me about the specific triggers or factors that often lead to emotional difficulties for you?

  • Can you share any personal insights or techniques that you have discovered over time that have been particularly effective in managing your emotions?

Outcome

Responses were analyzed to identify common themes and patterns, providing valuable insights into the diverse ways individuals navigate emotional difficulties.

90%

Most users reported difficulty producing the right facial expressions to convey their emotions.

70%

People said that gamifying their learning experience is helpful; they also noted that repeating expressions aids emotion memorization.

80%

People agreed that software could help them learn to express their emotions, and that nothing like it currently exists on the market.

User Proto Persona

I created a proto persona of the target user group. It is based on assumptions and initial research rather than extensive data, and it provides a basic understanding of potential users: their needs, goals, behaviors, and pain points.

[Image: user proto persona]

Application Flow

User flow/application flow diagrams provide a visual representation of how users interact with an application or website. By mapping out the steps users take to accomplish tasks or achieve goals, I have a clear understanding of the user experience from start to finish.

[Image: Sensory Maze application flow diagram]

Design & Coding

Wireframe

I started sketching different ways to solve the problem and used Balsamiq to wireframe the intended user journey.

[Image: Balsamiq wireframes]

Design of the Game

Sensory Maze is an innovative and immersive video game that combines emotion expression and emotion recognition. The game is designed to challenge players' senses, encouraging them to navigate a complex series of facial emotions and pick out the correct ones. On starting the game, the user is prompted to choose one of three difficulty levels.


[Image: Level 1 screen]

Level 1: Test Your Emotional Intelligence

A user mimics the facial expression of the virtual therapist shown on the left. If the expression matches that of the therapist, positive feedback is provided; otherwise, the user is asked to repeat the expression.
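A minimal sketch of this matching step, assuming the face-api.js expression map from the earlier snippet (the 0.7 confidence threshold and the playFeedback helper are hypothetical placeholders, not the project's actual values):

// Compare detected expression probabilities against the therapist's target emotion.
// THRESHOLD and playFeedback() are illustrative assumptions.
const THRESHOLD = 0.7;

function checkExpression(targetEmotion, expressions) {
  // Pick the most probable emotion from the face-api.js map (emotion -> probability)
  const [bestEmotion, bestScore] = Object.entries(expressions)
    .reduce((a, b) => (a[1] >= b[1] ? a : b));
  const matched = bestEmotion === targetEmotion && bestScore >= THRESHOLD;

  if (matched) {
    playFeedback('positive');   // e.g. cheerful chime + visual reward
  } else {
    playFeedback('try-again');  // ask the user to repeat the expression
  }
  return matched;
}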


[Image: Level 2 screen]

Level 2: Let's See If You Can Recognize Emotion

In the second level, the subject is shown a video of an expression and asked to identify which of the four expressions it is.


[Image: Level 3 screen]

Level 3: Contextualize Your Emotions


In the third level, an image representing a situation that elicits one of the four emotions is shown, and the user is asked to identify and express the emotion that fits the situation.

Final Product

A prototype was built in Figma to specify the interactions and prepare for task-based usability testing. The final interface is coded in HTML, CSS, and JavaScript.

User Testing 

Conduct User Study

I conducted a user study to validate the significance of the virtual expression therapy interface.

Significance of the User Study

The study aims to empower and support autistic individuals in their emotional expression and recognition journey, offering them an accessible, enjoyable, and context-aware interface to enhance their skills and overall well-being.

Research Questions

1. What factors impact the design of a multimodal (visual/audio) interactive interface that teaches subjects to express and recognize emotions?
2. How does an interface employing real-time facial expression classification via webcam and faceapi.js help autistic individuals, especially children, in expressing and recognizing emotions?

Methodology

For user testing, a 2 × 2 × 4 experimental design was used. The independent variables were light level, presence or absence of eyeglasses, and emotion type (four emotions); the dependent variable was response time.
Our methodology blends technology, gamification, user-centric design, and user experience research to realize an innovative and holistic solution for remote therapy, enhancing emotional expression recognition skills and overall well-being for autistic individuals.

Preliminary User Study and Results

We recruited 16 subjects, with ages ranging from 18 to 55; one participant left after the Lux experiment. We conducted three separate one-way ANOVA tests, sketched in code after the three test descriptions below:


1. Lux Test: 


Participants underwent two tests in controlled lighting conditions: one under 50 lux with the light directed away, and one between 50 and 150 lux. H0: Light above 50 lux has no impact on response times. H1: Light above 50 lux decreases response times.


[Image: glasses/no-glasses test]

2. Glasses/No Glasses Test:


Participants made facial expressions (Happy, Sad, Angry, Surprised) five times each while looking at the screen. Lighting for this section was fixed at 250 lux. H0: There is no significant difference in facial expression recognition time between wearing glasses and not wearing glasses. H1: There is a significant difference in recognition time between wearing glasses and not wearing glasses.


[Image: emotions test]

3. Emotions Test:


Building on the results of the first two tests, we investigated whether the Happy emotion was the easiest to identify among the emotions tested.
H0: There is no difference in the ease of recognizing the Happy emotion compared to other emotions.
H1: The Happy emotion is easier to recognize than other emotions.
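As noted above, here is a minimal sketch of the one-way ANOVA computation applied to each factor (the response-time arrays are placeholder data, not the study's results; in practice the resulting F is compared against the critical value for the test's degrees of freedom):

// One-way ANOVA over response-time groups: F = (SSB / dfB) / (SSW / dfW)
function oneWayAnova(groups) {
  const k = groups.length;                            // number of conditions
  const all = groups.flat();
  const n = all.length;                               // total observations
  const grandMean = all.reduce((s, x) => s + x, 0) / n;

  let ssBetween = 0;
  let ssWithin = 0;
  for (const g of groups) {
    const mean = g.reduce((s, x) => s + x, 0) / g.length;
    ssBetween += g.length * (mean - grandMean) ** 2;  // between-group variation
    for (const x of g) ssWithin += (x - mean) ** 2;   // within-group variation
  }

  const dfBetween = k - 1;
  const dfWithin = n - k;
  const F = (ssBetween / dfBetween) / (ssWithin / dfWithin);
  return { F, dfBetween, dfWithin };
}

// Placeholder response times (seconds) for two lighting conditions -- not real study data.
console.log(oneWayAnova([[2.8, 3.1, 2.5, 3.4], [1.9, 2.2, 2.0, 1.7]]));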

Conclusion: All three null hypotheses were rejected, yielding significant findings: wearing glasses affected facial expression recognition times, and illumination above 50 lux reduced response times. Additionally, the evidence favored the notion that the 'Happy' emotion is notably easier to recognize than the others, highlighting its distinctiveness in facial expressions.

Future Work & Enhancement

This is a work in progress. Based on the results of usability testing, I am working on enhancing the user interface and the types of interaction.
