High Tech’s Higher Purpose
November 14, 2022

Sensors that read your emotions at work? They may one day make the office a happier place

Whether you know it or not, your face has probably been scanned and identified hundreds of times by facial recognition technology. It’s used in everything from airport security checkpoints to photo-sharing apps. A 2016 Georgetown University study found that roughly half of all American adults have an image of their face stored in at least one facial recognition database used by law enforcement. Beyond the obvious ethical and privacy concerns, the relatively low accuracy of facial biometrics, particularly for Black and brown people, makes it one of the most controversial technologies in use today. It’s even at the center of major lawsuits. But could there be a more positive, even healthy, use for it?

Architects and designers are optimistic. Today, some are experimenting with a version of facial recognition software that may help them design workplaces that enhance employee well-being and increase the value and performance of real estate. Commonly referred to as emotion-sensing technology or automated affect recognition (“affect” being the physical manifestation of an emotion), the software scans for facial expressions and micro-movements of face and eye muscles. It then uses those scans to infer whether a person is feeling bored, stressed, surprised, happy, sad, worried, or angry, among other emotions. In theory, designers could use this data to adjust the workplace in ways that make employees happier, healthier, and more productive, and, by extension, owners could make their real estate more desirable. One day, perhaps.
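
For the technically curious, a pipeline like this typically has two stages: find a face in a frame, then score the cropped face against a set of emotion labels. The sketch below is illustrative only. The face detector is OpenCV’s standard Haar cascade, which is real, but the `score_emotions` classifier is a hypothetical stand-in for a trained model, not any vendor’s actual product.

```python
# Minimal sketch of an affect-recognition pipeline: detect faces,
# crop them, and score each crop against a set of emotion labels.
# `score_emotions` is a hypothetical placeholder, not a real model.
import cv2
import numpy as np

EMOTIONS = ["bored", "stressed", "surprised", "happy", "sad", "worried", "angry"]

# OpenCV ships this pretrained frontal-face detector.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def score_emotions(face_pixels: np.ndarray) -> dict:
    """Hypothetical model: returns one probability per emotion label.
    A real system would run a network trained on labeled expressions."""
    logits = np.random.rand(len(EMOTIONS))  # placeholder scores
    probs = logits / logits.sum()
    return dict(zip(EMOTIONS, probs))

def analyze_frame(path: str) -> list:
    frame = cv2.imread(path)
    if frame is None:  # file missing or unreadable
        return []
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = gray[y:y + h, x:x + w]  # crop the detected face
        results.append(score_emotions(face))
    return results

if __name__ == "__main__":
    for scores in analyze_frame("office_snapshot.jpg"):
        print(max(scores, key=scores.get), scores)
```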

In 2021, a team of Perkins&Will designers in London piloted emotion-sensing technology in their own workplace. They wanted to better understand how their colleagues were using various spaces within the office, and how those spaces were affecting their emotional state. Designers were prepared to respond to the sensors’ data by making adjustments to the layout and design of their workplace—changing where desks are positioned, for example, to allow for better views or more exposure to natural light, or introducing new color schemes, temperature controls, or lighting options—if it meant improving people’s well-being.

But as the designers found out, the software accurately decoded individual emotions only half of the time, at best. Research suggests that’s because, even if the technology can correctly identify facial expressions, any number of variables—from cultural norms to gender to personal circumstances—can affect the way people communicate how they’re feeling. Additionally, people express emotions in myriad, sometimes overlapping ways: A scowl may very well indicate anger, but it might also be an unconscious activation of facial muscles in a moment of deep contemplation. A genuinely angry person might not scowl at all.
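
To make that accuracy figure concrete, here is a toy illustration, with entirely made-up data, of how such a number might be computed: compare the software’s predicted labels against what occupants self-report, then tally the matches and the mismatches.

```python
# Toy illustration of measuring affect-recognition accuracy against
# self-reported emotions. All data here is invented for the example.
from collections import Counter

predicted     = ["happy", "angry",   "sad", "happy",   "stressed", "bored"]
self_reported = ["happy", "focused", "sad", "worried", "stressed", "bored"]

matches = sum(p == s for p, s in zip(predicted, self_reported))
print(f"overall accuracy: {matches / len(predicted):.0%}")  # 67% here

# Mismatch tallies show where look-alike expressions get confused,
# e.g. a contemplative scowl read as "angry".
errors = Counter((p, s) for p, s in zip(predicted, self_reported) if p != s)
for (pred, actual), n in errors.items():
    print(f"predicted {pred!r} when person reported {actual!r}: {n}x")
```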

The idea that artificial intelligence can one day empower employers with real-time information about their employees’ happiness in the workplace is a fascinating one. Imagine being able to reduce employee stress, improve moods, and enhance performance through data-informed changes to an office’s layout or interior design—without ever having to manually collect the data.

The technology may not be sophisticated enough just yet, and the litigation surrounding its use may slow broad adoption. But with major companies like Delta and MasterCard (not to mention the U.S. government) continuing to invest in facial recognition software development, it’s not likely to fade into obsolescence. With the right legal and ethical safeguards in place, it may well do real good in the future, especially in the workplace. Faced with that prospect, it’s no wonder architects and designers are paying close attention.
