Beyond Ray-Ban Meta: Smart glasses could be the future of health monitoring.

      “Current wearables like Fitbit give us continuous daily data on our activity levels, and we have similar data for sleep. The next wave of wearable health tracking will focus on emotional and dietary health. We see a significant opportunity to offer many benefits to people through these advancements.” That was how CEO Steen Strand introduced Emteq Labs, a company taking smart eyewear in a very different direction. But how do smart glasses determine your emotions and eating habits? I found out, and I came away impressed by the technology’s accuracy as well as its potential.

      Introducing Emteq Labs

      “Health applications fueled the popularity of the Apple Watch,” Strand noted during our Zoom discussion, “and we aim to push smart eyewear beyond what can currently be done on smartphones.” Rather than developing a clone of Ray-Ban Meta, the creators of the Sense smart glasses asked, “What can these glasses achieve that no other product can?” Strand was joined on our Zoom call by Emteq Labs’ founder, Dr. Charles Nduka, who shared insights into the company's origins.

      “I’m a facial reconstructive surgeon,” Nduka explained, “and my journey began with the challenge of understanding how people express themselves between clinical visits, which led to the development of smart eyewear that tracks expressions. What started as a research and development project gained remarkable momentum, prompting us to explore essential aspects of daily life: mental health, metabolic disease, and dietary function.”

      Unlike smart glasses such as the Ray-Ban Meta and the Solos AirGo Vision that feature outward-facing cameras, Emteq Labs’ platform focuses internally, analyzing facial movements with sensors to provide insights typically found only in a laboratory. The two main areas highlighted were mood and eating behaviors.

      Understanding Mood

      “We know that individuals experiencing depression display lower amplitude movements, adopt a head-down posture, smile less frequently, and make more negative expressions,” Nduka explained. He noted that there is only one muscle responsible for raising the eyebrows and one muscle group for moving the cheeks. To “detect” these movements, Emteq Labs’ Sense smart glasses utilize lensless sensors that measure the motion and texture of the skin and muscles.

      The demo I witnessed was strikingly accurate. The platform mirrored Strand’s facial expressions in real time while he wore the Emteq Labs smart glasses, all without the use of a camera. Anyone familiar with smartwatches or smart rings, like the Apple Watch or the Oura Ring, has likely encountered stress measurements. Strand highlighted why facial indicators are far more reliable for assessing stress than wrist or finger sensors:

      “Not only do you miss feedback, but the data can also be misleading,” Strand noted. “It indicates stress due to an elevated heart rate, but I might simply be exercising. Without visualizing the face, it’s challenging to determine the cause of someone’s stress. Additional insights are needed to understand emotional states that can’t be captured from fingers, wrists, or phones. These methods yield a vast amount of data but lack quality. We believe our smart glasses can achieve both.”

      Emteq Labs’ software processes the data in a more comprehensive and meaningful way than, say, the Oura Ring, which merely reports that stress levels were elevated during the day without explaining why. Emteq’s software tracks when you were stressed or upset based on your facial expressions, building a timeline of your day. Historical data reveals mood fluctuations and how long positive or negative feelings lasted. It’s intriguing data with clear implications for mental health diagnosis and treatment.
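      To make the idea of a mood timeline concrete, here is a minimal sketch of how per-expression events could be rolled up into an hourly summary of the kind described above. The event fields, expression labels, and valence weights are illustrative assumptions on my part, not Emteq Labs’ actual data model.

```python
# Hypothetical sketch: rolling per-expression events into a daily mood timeline.
# Labels, fields, and valence weights are assumptions for illustration only.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ExpressionEvent:
    timestamp: datetime   # when the expression was detected
    label: str            # e.g. "smile", "frown", "brow_raise"
    intensity: float      # 0.0-1.0 amplitude of the facial movement

# Assumed valence weights: positive expressions push the hourly score up, negative ones down.
VALENCE = {"smile": 1.0, "brow_raise": 0.3, "frown": -1.0, "jaw_clench": -0.5}

def hourly_mood_timeline(events: list[ExpressionEvent]) -> dict[int, float]:
    """Average expression valence per hour of the day, weighted by intensity."""
    totals: dict[int, float] = defaultdict(float)
    counts: dict[int, int] = defaultdict(int)
    for e in events:
        hour = e.timestamp.hour
        totals[hour] += VALENCE.get(e.label, 0.0) * e.intensity
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in sorted(totals)}

# A result like {9: 0.4, 13: -0.2, 18: 0.6} would show when in the day mood dipped
# and how long negative stretches lasted, the kind of history described above.
```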

      You Are What You Eat

      The sensors in Emteq Labs’ smart glasses also offer insights into eating habits, potentially aiding individuals in changing their diets or losing weight.

      “The temple sensors record jaw movements in real time,” Nduka explained. “We can assess chewing rate, a metric that simply wasn’t attainable before, and that data alone can indicate calorie intake. If you consume calorie-dense foods rapidly, you’ll need to expend more energy to burn them off.”
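      As a rough illustration of chewing rate as a metric, the sketch below counts chew cycles in a jaw-motion signal and converts them to a per-minute figure. The sampling rate, threshold, and crossing-counting approach are assumptions made for illustration, not Emteq Labs’ actual processing pipeline.

```python
# Hypothetical sketch: estimating chewing rate from a jaw-motion signal.
# The threshold and sampling rate below are illustrative assumptions.

def chews_per_minute(jaw_signal: list[float],
                     sample_rate_hz: float,
                     threshold: float = 0.5) -> float:
    """Count upward threshold crossings (roughly one per chew) and scale to a per-minute rate."""
    chews = sum(1 for prev, curr in zip(jaw_signal, jaw_signal[1:])
                if prev < threshold <= curr)
    duration_min = len(jaw_signal) / sample_rate_hz / 60.0
    return chews / duration_min if duration_min > 0 else 0.0

# Example: 30 seconds of samples at 50 Hz containing 40 crossings works out to
# about 80 chews per minute, the kind of figure that could then be compared
# against research on eating speed.
```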

      Strand elaborated on the significance of this information.

      “It’s a small piece of data,” he explained, “but from it, we can determine when you ate, how long you took, whether you had snacks or three full meals. We know if you ate quickly or slowly and how many bites you took. Extensive research links these factors to overall metabolic health.”
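      To show how much can be derived from that small piece of data, here is a minimal sketch of turning a stream of detected chew events into the meal-level metrics Strand lists: when you ate, how long you took, and roughly how many bites. The grouping threshold and chews-per-bite figure are assumptions for illustration, not Emteq Labs’ published method.

```python
# Hypothetical sketch: turning individual chew timestamps into meal-level metrics
# (when you ate, how long it took, roughly how many bites). The gap threshold and
# chews-per-bite ratio are illustrative guesses.
from datetime import datetime, timedelta

def summarize_meals(chew_times: list[datetime],
                    meal_gap: timedelta = timedelta(minutes=20),
                    chews_per_bite: int = 15) -> list[dict]:
    """Group chew events separated by less than `meal_gap` into meals."""
    meals: list[dict] = []
    if not chew_times:
        return meals
    start = prev = chew_times[0]
    count = 1
    for t in chew_times[1:]:
        if t - prev > meal_gap:  # a long pause means a new meal begins
            meals.append({"start": start, "duration": prev - start,
                          "bites": max(1, count // chews_per_bite)})
            start, count = t, 0
        count += 1
        prev = t
    meals.append({"start": start, "duration": prev - start,
                  "bites": max(1, count // chews_per_bite)})
    return meals
```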

      This kind of capability helps individuals identify dietary missteps and provides actionable insights for changing their habits. The platform can distinguish with 95% accuracy whether you are eating, talking, laughing, or singing, allowing it to tell chewing food apart from conversing during dinner. The depth and detail of the data are remarkable. The temple sensors provide data on both the X and Y axes and can even reveal how the type of food affects chewing patterns as a meal progresses. Nduka explained the implications of this data:

      “Dietitians understand that if they can encourage people to eat more slowly, it prevents the stomach
