2015 Fall Interaction Design
Professor Axel Roesler
Duration: 10 Weeks
Team: Maureen McLennon, Dillon Baker, Phillip Carpenter, Kasturi Dani
SENSE is a pair of glasses designed for higher-functioning autistic individuals. Their purpose is to help the wearer better understand another person's emotions through their facial expressions. People with certain forms of autism have a harder time reading facial expressions and interpreting the emotions of others, which makes it difficult to adapt to and participate in common social situations. The glasses are intended as a therapy tool for autistic people learning to develop an emotional awareness of others, empowering them in social situations.
The glasses use facial and vocal recognition software to analyze the emotional state of people in direct conversation with the user of the glasses.
The glasses display subtle, customizable cues to help the wearer recognize and learn to identify the emotional states of others. A built-in camera records the facial expressions of whoever the wearer is interacting with, and through facial recognition software, the glasses automatically display one of four colors (blue, green, yellow, or red) to indicate the kind and intensity of emotion detected, helping the wearer understand what the other person is experiencing.
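The cue logic described above could be sketched roughly as follows. This is a minimal illustration only: the emotion labels, intensity scale, threshold, and function names are hypothetical, not part of the actual project.

```python
# Hypothetical sketch of the emotion-to-color cue mapping described above.
# Labels, intensity scale, and the 0.3 threshold are illustrative only.

DEFAULT_CUES = {
    "calm": "blue",
    "happy": "green",
    "uneasy": "yellow",
    "upset": "red",
}

def cue_for(emotion, intensity, cues=DEFAULT_CUES):
    """Return a color cue for a detected emotion, or None when the
    intensity is too weak, so the display stays subtle."""
    if intensity < 0.3:  # suppress weak signals to avoid overstimulation
        return None
    return cues.get(emotion)
```

Because the mapping is just a dictionary, a teacher could swap in a custom one, including a symbol-based mapping for colorblind wearers.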
Used daily in short 30-minute sessions, ideally with the help of a special-needs teacher, the glasses are meant to be part of a repetitive learning routine for an autistic student. With the teacher's help, the emotional display can be customized to communicate with the wearer as efficiently as possible without overstimulating them with unnecessary information. Wearers with synesthesia, who may already associate colors with other contexts, can choose which colors map to which emotions, and colorblind wearers can substitute symbols for the color cues.
The wearer takes the glasses out into a social environment. Using the camera lens and microphone together with the recognition software, the glasses track the emotions of those in direct conversation with the wearer. This emotional data is recorded to internal flash memory; POV video and audio can also be recorded alongside it if desired.
After a session, a special-needs teacher can download the data to a tablet via USB or Wi-Fi. Using the companion app designed for the glasses, the session can be graphed and reviewed by both teacher and student to highlight the emotional context of each interaction. The data can be viewed with or without the session's recorded video and audio.
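The session-review step above could work on a log like the one sketched below, which totals the time spent in each detected emotion into the kind of per-emotion breakdown the app could graph. The record format and field names are illustrative assumptions, not the project's actual data model.

```python
# Hypothetical session-review sketch. The logged record format
# ("emotion", "start", "end" in seconds) is an assumption for illustration.

from collections import defaultdict

def summarize_session(records):
    """Total the seconds spent in each detected emotion, producing a
    per-emotion breakdown suitable for graphing in the review app."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["emotion"]] += rec["end"] - rec["start"]
    return dict(totals)

# Example log from one short interaction:
session = [
    {"emotion": "happy", "start": 0.0, "end": 12.5},
    {"emotion": "uneasy", "start": 12.5, "end": 20.0},
    {"emotion": "happy", "start": 20.0, "end": 25.0},
]
```

A teacher and student could then step through the largest segments alongside the recorded video to discuss what was happening at those moments.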
The app can also stream, over Wi-Fi, a real-time view of what the student is experiencing, giving the teacher better context for the review afterward.
I took on many roles throughout the project, ranging from UI design to video production. I assisted Kasturi Dani in designing the UI and in seamlessly incorporating the footage-review concept into the interface. For the video, the whole team collaborated on a storyboard of the device, how it could potentially work, and how the video could tell that story. I also worked with Maureen McLennon on the video's script and recorded practice audio for the narration.