Designing a haptic-based art generation tool for affect expression and emotion regulation
Haptic Art-cade setup with the User Interface and Haply device
January - April 2021 (4 months)
Arduino, Processing, Adobe Premiere Pro, Haply, Miro
Affinity Diagramming, Literature Review, User-Centered Design, Prototyping, Usability Testing, Survey, Brainstorming
As a course project for CPSC 543 (CanHap501), we developed Haptic Art-cade, an interactive haptic art-generation tool for affect expression and potential emotion regulation.
Over 4 months, we iteratively designed and prototyped the haptic tool, guided by intermediate usability and user tests. Each team member participated equally in all processes, from the design to the evaluation of the tool. In addition, I led the presentation and video creation for the final deliverable of the project.
So, what was the problem?
Our course project centred on the Haply device: we had to come up with an application that explored the device's haptic capabilities.
Our team decided to focus on the potential of haptics in anxiety and stress management through vibrotactile and force feedback. We were driven partly by our own experiences with anxiety and stress management, particularly the soothing, stress-relieving effect of touch, and partly by the relevance of such an application in a world where stress and anxiety have become part of daily life.
Haptic feedback has previously been explored for stress and anxiety reduction; our specific aim thus became to explore whether the Haply could deliver this modality in a similarly effective yet enjoyable way.
I see, so who was your target demographic?
Our target users were people looking to self-manage their stress and anxiety. While the tool could potentially be used by anyone, people familiar with haptics, or interested in art or games as a stress-management tool, would be more likely to try it. One technical constraint: since we developed for and focused on the Haply device, target users would need access to one.
So, what was the plan?
We followed the Design Thinking Process of Empathize-Define-Ideate-Prototype-Test.
We began by conducting a preliminary literature review to assess the use of haptic feedback in affect expression and emotion regulation. While haptics has been explored for emotion regulation through force and vibrotactile feedback (e.g., CuddleBit, OTO), it has usually been delivered through wearables or handheld objects.
Outside of haptics, several coping mechanisms for stress and anxiety exist, spanning indoor and outdoor activities. Popular ways to self-manage stress include journaling, exercising, colouring, and games like Bubble Wrap and Paper Toss. Drawing, colouring, and these games help relieve stress by focusing attention on something calming and increasing mindfulness.
However, we did not find any applications that combined these activities with a haptic modality.
We wanted a tool that would be portable, and could be used whenever users felt they needed to relieve their stress or anxiety. A game-based art generation format would also make the emotion regulation process enjoyable, and thus possibly more engaging, for users. We created a rough design space map (shown below) to scope our project and brainstorm the different functionalities it could have.
Given the timeline and scope of this project, we decided to exclude music from this tool, instead focusing on creating a haptic tool that would provide a game-like environment for users, and generate art through gameplay, through a series of configurable environments.
Our goals for the tool were threefold:
Enable affect expression through customizable elements (realized through varying colour palette options and the ability to move between different environments within the application)
Enable users to record their current emotional states for future reflection (realized by optional timestamped snapshots in each environment)
Create an aesthetic experience which used multiple modalities and was both therapeutic and enjoyable (realized through the careful synchronization of visual, audio and haptic modalities, and an artistic gameplay experience)
Ideate, Prototype, Test:
Since we designed and evaluated this tool in two stages, I combine the Ideate, Prototype and Test stages here for ease of understanding. In Stage 1, we designed and evaluated the haptic sensations for the final application; in Stage 2, we designed the user interface and the overall application.
Before ideating on the design of the application, we needed to explore the haptic capabilities of the Haply. The Haply development kit we were given was a two-degree-of-freedom haptic device that could be programmed to deliver varying levels of force feedback. To determine how best to deliver this feedback for our art-generating tool, we defined ‘base haptic sensations’ that could later be built upon to include art-rendering capabilities:
We then conducted a round of evaluations (N=5); our participants were peers and people from our social circles, 3 of whom had previous haptics experience. Through an open-ended survey asking users to try out the interactions and describe how each felt emotionally and physically, we assessed which interactions could be perceived as calming yet also interesting. Interactions 4, 6 and 7 felt the most calming and interesting: ‘breaking through a barrier’, ‘popping a ball’, and ‘pushing against a squishy surface’.
With these results in mind, we moved on to Stage 2, where we then thought of how these haptic interactions could be incorporated in an art-generation game-like format.
We began first by mapping and then affinity diagramming the input and output parameters for the Haply device. This was done to gain a better understanding of the configurable parameters so we could scope which ones we would want to work with when designing the user interface and interactions.
Based on the results of the first user test, and the parameterization mapping, we converged on a set of four environments for the final application, three of them providing active user control of the device, and one passive. We detail each of these environments below, with ‘Guided’ being the passive interaction, while the rest are active.
1. Slingshot: An environment where users can simulate throwing paint at a wall, using the Haply end effector as a sling. The force feedback involves a ‘pull’ as users move the end effector away from the centre, and a ‘spring’ on release, where the end effector springs back to its original position. A colour palette lets the user select a different paint ‘ball’ to throw each time, and a paint splatter forms when the user lets go of the end effector, coupled with a splattering sound for a convincing paintball-explosion rendition.
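The pull-and-release behaviour can be approximated with a Hooke's-law spring anchored at the centre. The sketch below is an illustrative model in plain Java, not our actual Processing code; the class name, constants, and methods are all hypothetical:

```java
// Illustrative slingshot force model (hypothetical constants, not the
// project's code): while the user drags the end effector away from the
// anchor, render a spring force pulling it back toward the centre.
class SlingshotForce {
    static final double K = 150.0; // spring stiffness, N/m (illustrative)

    // Force on the end effector at (x, y), with the anchor at the origin.
    // Hooke's law: F = -k * displacement, so the pull grows with distance.
    static double[] pullForce(double x, double y) {
        return new double[] { -K * x, -K * y };
    }

    // On release, the stored spring energy sets a launch speed that could
    // scale the paint splatter: 0.5*k*d^2 = 0.5*m*v^2  =>  v = d*sqrt(k/m).
    static double launchSpeed(double displacement, double mass) {
        return displacement * Math.sqrt(K / mass);
    }
}
```

The key design property is that the resisting force grows linearly with displacement, so a longer pull both feels heavier and produces a bigger splatter on release.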
2. Pop: An environment comprising several colourful bubbles of varying sizes that can be popped by applying force. The user can press against the bubbles with the end effector to pop them, replicating a colourful bubble-wrap popping experience. Each pop creates a paint splatter as well, accompanied by sound effects.
We felt both Slingshot and Pop provided game-like environments that would be fun for users to play in, and could thus help with emotion regulation. Like Bubble Wrap and Paper Toss, these environments help calm the user, but additionally include artistic and haptic elements that further add to the experience.
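A minimal way to express the popping mechanic is a per-bubble force threshold that fires exactly once. The plain-Java sketch below is our own illustration (class, fields, and the size-to-threshold scaling are assumptions, not the project's implementation):

```java
// Illustrative bubble-pop mechanic (hypothetical names and scaling):
// each bubble pops once the force pressed against it exceeds a
// per-bubble threshold, and a popped bubble never fires again.
class BubblePop {
    final double radius;        // bubble radius (pixels)
    final double popThreshold;  // force needed to pop (N)
    boolean popped = false;

    BubblePop(double radius) {
        this.radius = radius;
        this.popThreshold = 0.5 + 0.05 * radius; // bigger bubbles resist more
    }

    // Called each haptic frame with the force the user exerts on the bubble;
    // returns true exactly once, on the frame the bubble pops.
    boolean applyForce(double force) {
        if (!popped && force >= popThreshold) {
            popped = true;
            return true; // trigger the paint splatter and pop sound here
        }
        return false;
    }
}
```

Returning true only on the popping frame keeps the splatter graphic and sound effect from re-triggering while the user keeps pressing.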
3. Squish: An environment where users can move the end effector to squish it against a bouncy surface, a feeling similar to squishing a stress ball or slime. On exerting sufficient force, users can break through the ‘barrier’ of the bouncy surface, producing the sensation of the end effector being engulfed in viscous gel. The end effector leaves a trail of colour as it moves.
We felt this environment would simulate a stress ball, enhanced with paint and haptics, which could help with emotion regulation while also creating artifacts for later reflection.
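One way to model the squish-then-breakthrough feel is a piecewise force law: spring-like resistance up to a breakthrough depth, then viscous drag only once the surface gives way. The sketch below is a hypothetical illustration with made-up constants, not the project's actual force rendering:

```java
// Illustrative squish force model (hypothetical constants): the surface
// pushes back like a stiff spring until penetration exceeds a
// breakthrough depth, after which only viscous damping remains,
// giving the feel of moving through gel.
class SquishSurface {
    static final double K_SURFACE = 400.0;   // surface stiffness (N/m)
    static final double BREAK_DEPTH = 0.01;  // penetration where it gives way (m)
    static final double B_VISCOUS = 2.0;     // damping inside the gel (N*s/m)

    // Resistive force given penetration depth (m) and end-effector speed (m/s).
    static double resistiveForce(double depth, double speed) {
        if (depth <= 0) return 0.0;                        // not touching
        if (depth < BREAK_DEPTH) return K_SURFACE * depth; // springy surface
        return B_VISCOUS * speed;                          // broke through: drag only
    }
}
```

The sudden drop from spring force to pure damping at the breakthrough depth is what produces the distinct "giving way" sensation.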
4. Guided: This passive environment guides the user through a fixed route. Users can hold on to the end effector as they are guided across the device surface to create patterns. The movement creates colourful trails on the screen as the end effector moves.
This environment requires no decisions from the user beyond the colour palette choice, which we felt might suit users who do not want to actively play a game but prefer a more guided regulation mechanism.
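The guided route can be sketched as a proportional pull toward a target that advances along a fixed list of waypoints. As with the other sketches, the names and constants below are illustrative assumptions, not the project's implementation:

```java
// Illustrative guided-path controller (hypothetical names/constants):
// the device pulls the end effector toward the current waypoint and
// advances to the next one once the user gets close enough.
class GuidedPath {
    static final double K_GUIDE = 80.0;     // guidance stiffness (N/m)
    static final double REACH_DIST = 0.005; // advance when within 5 mm

    final double[][] waypoints; // fixed route as {x, y} pairs
    int target = 0;

    GuidedPath(double[][] waypoints) {
        this.waypoints = waypoints;
    }

    // Proportional force pulling the end effector at (x, y) toward the
    // current waypoint; steps to the next waypoint when close enough.
    double[] guidanceForce(double x, double y) {
        double dx = waypoints[target][0] - x, dy = waypoints[target][1] - y;
        if (Math.hypot(dx, dy) < REACH_DIST && target < waypoints.length - 1) {
            target++;
            dx = waypoints[target][0] - x;
            dy = waypoints[target][1] - y;
        }
        return new double[] { K_GUIDE * dx, K_GUIDE * dy };
    }
}
```

Because the force is proportional to the distance from the target, the guidance feels gentle near each waypoint and firmer if the user drifts away from the route.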
We show the final application and the results of the final user test in the next section.
Oh…sounds good, how did that turn out?
The final application consists of a user interface controlled by and accompanying the Haply device, with four haptic environments. For each environment, users are provided with a colour palette to choose from, and multiple palettes encompass a variety of shades and hues. Each environment uses a different ‘base sensation’ for haptic feedback, drawn from Interactions 4, 6 and 7. Users can navigate between environments from a central menu, and save a timestamped snapshot of the art created each time for future reflection.
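The timestamped snapshots just need a filename scheme that encodes the environment and the moment of capture, so saved art can be matched back to the emotional moment it records. A small sketch, where the naming pattern is our own illustration rather than the tool's actual format:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Illustrative snapshot-naming scheme (hypothetical format): encode the
// environment name and the capture time in the saved file's name.
class Snapshot {
    static String filename(String environment, LocalDateTime t) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");
        return environment + "_" + t.format(fmt) + ".png";
    }
}
```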
This video further shows the tool in use:
We evaluated the final application with 7 participants, through a survey. The survey consisted of Likert-scale questions asking participants how calming and enjoyable each environment felt, as well as how the interactions felt physically and emotionally. All four scenes were thought to be calming as well as enjoyable to interact with, with participants saying “It felt a bit soothing, especially when looking simultaneously at the visuals”, “feel great, I love colors, and splashing color was a cool experience” and “lot of fun, enjoyed repeating the game three times”. While we were unable to explore the use of this tool long-term for the purpose of emotional regulation, our preliminary user tests show the potential of such an application.
Nice! Who else knows?
This project was completed as a course project and not pursued further. We did create a report as documentation, available here: https://drive.google.com/drive/u/0/folders/15Sb83IyhujOzLq8gvq7ArZvX7XA10hI9, as well as the demo video shown above.
Cool! Did anything get in the way?
Our main challenges with this project were:
Device-dependent user experience: Each Haply comprises multiple pieces of hardware and was assembled by each researcher individually, so each device had minute differences that led to slightly varied user experiences. During internal testing within our 3-person research team, we identified cases where Haply behaviour varied too much across our three devices, and modified the haptic interactions in those cases to provide a near-uniform experience. Replicating the exact same user experience across several hardware devices is always complicated, but we found rigorous internal testing and communication to be an effective mitigation strategy.
Remote testing and evaluation: Given this project was completed amidst COVID-19 restrictions and across institutions in Canada, the research team was not co-located at any point. This brought about the challenge of not being able to experience and troubleshoot hardware issues any team member faced, as well as having to resort to online strategies for evaluation. We tried to overcome this by having Zoom calls for troubleshooting, using Discord for constant team communication, and using surveys and recruiting from our social bubbles for evaluations.
Limited evaluations: Given the time and scope of the course project, we were unable to evaluate the long-term impact of Haptic Art-cade on emotional regulation. Our preliminary user tests did show effectiveness and interest in the tool, but long-term larger evaluations would show more accurate measures of engagement and effectiveness.
Hmm, interesting. What’s been your biggest takeaway?
This was an incredibly fun and challenging project to take on for someone coming from a largely software-oriented background with minimal haptics knowledge. While my haptics knowledge certainly improved, there was another key takeaway from this experience as well:
The importance of other modalities, in precise synchronization, in addition to haptics: There was a massive difference in user perception of the haptic sensations when comparing the ‘base sensations’ to the final haptic interactions in the final application. While the ‘base sensations’ had no audio or visual element, the final application included carefully synchronized audio and visual elements complementing the haptic interactions, which greatly improved the ‘feel’ of the interactions. Users reported that the audio and visual elements made the experience enjoyable, with improved results in the second user study on the enjoyability and calming effect of the interactions. There is a fine line between using multiple modalities to complement each other and overloading the design with them, but as we discovered, careful design and synchronization can walk that line.
So, what’s gonna happen next?
This project ended here, but showed the potential of haptics in emotional regulation, through a medium of art and gaming. While we may or may not pursue this further some day, we would love to see other researchers progress in this avenue!
For now, this was it!