The Sonos gesture interface

One of my first projects at Sonos was the design and implementation of a new on-device user interface to complement our next-generation speaker (the new Play:5). There are a lot of unique challenges in working on a hardware interface, especially one that needs to blend into a speaker and adapt to future designs that haven't been conceived yet. I was the lead designer on the project from the early interaction-design proposals through countless rounds of software and hardware prototyping, user testing, sound design, icon design, software tutorials, light pipes, and LEDs, as well as working with our engineers on the custom sensor designs and associated firmware.



The redesign of the control surface was triggered by a completely new industrial design language, one that called for cell-phone-level tolerances and forms that feel more like quality furniture than the latest consumer electronics. Our team wanted to deliberately downplay the existence of a control surface.

When we started the project, we also wholeheartedly believed that being able to start music, and later adjust it, quickly would always make the product more satisfying to use. Research showed that while the app was obviously more powerful, it often took users 12 seconds to complete tasks like adjusting the volume.


After creating video mock-ups and paper prototypes, we quickly realized that we were spending far too much time and energy trying to get everyone on the team to imagine the same concepts and alternatives. We needed something fully interactive that had the performance of a final product but allowed us to iterate quickly.

Sonos is obviously a music product, so I created what I think is a unique prototyping platform with Max MSP, an iPad, and a 3D printed mount. Max MSP is best known for interactive installations, but after writing some custom patches we turned it into an incredible tool for prototyping hardware. With it we were able to build, test, and iterate on concepts, often in the same day. It's hard to overstate how much this freed the team from repetitive debates.

When we couldn't push the concepts any further in software, we hired Mindtribe to build us a hardware prototype with the same embedded sensors we later used in the final product. These sensors fed into the Max prototyping platform, which allowed us to leverage our previous work. Not only did we fit many more prototyping iterations into the same amount of time, it also saved us months of slow and expensive firmware adjustments.



Anytime you make a hardware interface, it ends up being a huge investment and risk for the company. We tested concepts on a weekly basis and ran two major research sessions in Boston with our team there. We learned a lot about users' expectations, the tutorials, and the feedback.