
Multisensory Thermostat for Independent Living

Overview

CANDI is an open-source, multisensory thermostat that I designed and built as part of MIT’s 2.00 Design course. Working in a team of four, we were paired with a real user and tasked with designing a system that would help them become more independent in daily life. Our user, Candelaria (“Candi”), is visually impaired and described persistent difficulty interacting with standard home appliances, particularly her thermostat, whose small visual displays made temperature control frustrating and inaccessible.

Rather than redesigning the thermostat’s internal electronics, we focused on augmenting the existing interface to make temperature information legible through touch, sound, and physical interaction, enabling independent use without reliance on vision.

User Research

I played a central role in user research and system design, conducting interviews to understand how Candi interacted with her home environment and where breakdowns occurred. One key insight was that her challenge was not a lack of understanding, but a lack of sensory access to feedback. This reframed the problem from “adding features” to translating information across sensory modalities.

Our design goal became clear:

Build an interface that communicates state through physical and auditory cues, allowing users to confidently act without visual confirmation.

System Design & Mechanical Assembly

The final system consists of a physical casing that fits over an existing circular thermostat dial and translates its position into audible feedback:

  • Clear acrylic cylindrical housing (the "case") that mechanically couples to the thermostat dial

  • Color-coded strips read by a color sensor to determine dial position

  • Rotational mapping from physical motion → sensed state → spoken output

  • USB-powered speaker that announces the current temperature aloud

I contributed to both mechanical design and integration, working on housing geometry, tolerances, and assembly to ensure smooth rotation, durability, and ease of installation. The design intentionally avoided fine motor gestures, relying instead on large, confident movements.
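The motion → sensed state → spoken output chain above can be sketched in a few lines of Python. The zone indices, temperatures, and phrasing below are hypothetical placeholders for illustration, not our actual calibration values:

```python
# Illustrative sketch of the dial-position -> temperature -> speech pipeline.
# Zone numbers and temperatures are made up; the real mapping came from
# how the color strips were laid out around the dial.

ZONE_TEMPS = {0: 60, 1: 65, 2: 70, 3: 75, 4: 80}  # color zone -> deg F

def temperature_for_zone(zone: int) -> int:
    """Map a sensed color zone to its temperature setting."""
    if zone not in ZONE_TEMPS:
        raise ValueError(f"unknown zone {zone}")
    return ZONE_TEMPS[zone]

def announcement(zone: int) -> str:
    """Build the phrase the speaker reads aloud for a zone."""
    return f"Temperature set to {temperature_for_zone(zone)} degrees"

print(announcement(2))  # -> "Temperature set to 70 degrees"
```

Keeping the zone → temperature table as explicit data made it easy to re-map the strips if the casing was installed on a different thermostat.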

Software & Sensors

In addition to hardware, I worked on the software logic that mapped sensor readings to temperature states and triggered audio output. This required reasoning about signal thresholds, calibration, and error handling to ensure the system behaved predictably in everyday use.

MEET CANDI
  • Our visually impaired user

  • Very active (she likes to dance, paint, and spend time with friends)

  • Faces several obstacles throughout her daily life


Candi's website: We created a website to share our idea with the family and friends of the visually impaired, so that they can make their loved ones' lives just a little bit easier.

Final Presentation

Lessons Learned


CANDI was formative in shaping how I think about technology:

  • Interfaces don’t need screens to be expressive

  • Sensory feedback can reduce cognitive load and increase confidence

  • Designing for accessibility often produces better systems for everyone

  • Human-centered systems succeed when they respect how people actually perceive and act

This project sparked my long-term interest in multisensory interfaces, cognitive augmentation, and human–machine collaboration, and it continues to inform how I approach the design of adaptive systems today.

Thanks for taking a look at my work! Feel free to contact me at amasini@alum.mit.edu if you would like to chat more.
