According to a patent filed by Microsoft, the company may be developing mood-sensing technology for the HoloLens holographic headset. Specifically, the device could be programmed to interpret a person’s disposition using eye tracking, gesture recognition, audio analysis, posture analysis, expression recognition and biometric analysis.
Hypothetically, this would let a HoloLens wearer quickly read the people standing in front of them, essentially a cheat sheet for emotions. As an example, Microsoft includes an illustration of someone giving a presentation while using the HoloLens to read members of the audience; the presenter can then direct their attention to anyone identified as bored. That example could translate easily into a classroom, prompting the teacher to engage with children who are becoming distracted or bored.
Specifically, the patent explains: “The device can interpret changes in user posture, user gestures, audible input levels such as murmurs and/or other factors to determine the emotional engagement of the audience. For example, slouched postures or wandering audience gaze might indicate a lack of interest or attention.”
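To make that idea concrete, here is a minimal sketch of how signals like posture, gaze and murmuring could be combined into a rough engagement estimate. This is purely illustrative: the signal names, weights and threshold below are invented for this example and are not taken from Microsoft’s patent or any HoloLens software.

```python
from dataclasses import dataclass

@dataclass
class AudienceSignals:
    """Per-person signals a headset might extract (all fields hypothetical)."""
    slouch_score: float      # 0.0 (upright) to 1.0 (fully slouched)
    gaze_on_speaker: float   # fraction of time the person's gaze is on the presenter
    murmur_level: float      # relative level of side chatter, 0.0 to 1.0

def engagement_score(s: AudienceSignals) -> float:
    """Combine the signals into a rough 0-1 engagement estimate.

    The weights are arbitrary placeholders, not values from the patent.
    """
    score = (
        0.4 * (1.0 - s.slouch_score)    # upright posture suggests attention
        + 0.4 * s.gaze_on_speaker       # wandering gaze suggests lack of interest
        + 0.2 * (1.0 - s.murmur_level)  # murmuring suggests disengagement
    )
    return max(0.0, min(1.0, score))

def flag_disengaged(audience: dict[str, AudienceSignals], threshold: float = 0.5) -> list[str]:
    """Return the audience members a presenter might want to re-engage."""
    return [name for name, signals in audience.items()
            if engagement_score(signals) < threshold]

if __name__ == "__main__":
    audience = {
        "front-row": AudienceSignals(slouch_score=0.1, gaze_on_speaker=0.9, murmur_level=0.0),
        "back-row": AudienceSignals(slouch_score=0.8, gaze_on_speaker=0.2, murmur_level=0.6),
    }
    print(flag_disengaged(audience))  # ['back-row']
```

A real system would presumably rely on trained models rather than hand-tuned weights, but the basic flow, turning raw sensor signals into a per-person engagement cue shown to the presenter, matches what the patent describes.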
Perhaps more outlandishly, the patent also states: “Other social situations wherein the technology may be useful include romantic situations involving a one-on-one relationship between individuals.”
Of course, such a romance would have to have progressed to the point where both parties are comfortable interacting with a HoloLens strapped to their heads, or perhaps with the technology embedded in a pair of glasses similar to Google Glass.
At this point, it’s difficult to predict whether the details included in this patent will actually end up in the final version of the HoloLens. Microsoft has been very guarded about the device, allowing only a select few to experience the technology in private demos. However, Microsoft will likely demo the technology at the Build conference this week, so new details may emerge about the holographic headset.