At Build 2024, Microsoft announced a partnership with Meta to extend Windows apps into 3D space with the help of a Quest VR headset. For work that involves physical objects, spatial awareness of the components matters, and that's something flat virtual screens can't really provide.

Microsoft and Meta have been working together on VR solutions for quite some time, and Microsoft Office apps are now available in the Quest Store. You can open Word, Excel, and PowerPoint on a Meta Quest 3 and get work done on virtual screens. However, these flat panels don't really take advantage of what's possible in virtual reality.

Microsoft’s latest plans sound much more interesting. The Windows Developer Blog promises to “make Windows a first-class experience on Quest devices.” Windows Volumetric apps can extend into 3D space, allowing users to manipulate virtual objects with their hands.

Microsoft will provide a volumetric API for this interaction. That means developers will need to create new apps or add these features to existing apps before you can make use of the new concept. In the meantime, you can still connect your Quest 3 to a Windows PC to view multiple virtual windows with apps like Immersed and Virtual Desktop.

The mention of Meta Quest was brief, as Microsoft had dozens of exciting AI and Windows topics to cover, including several big Copilot AI enhancements, real-time video translation for Edge, and the impressive new Copilot+ PCs.

Microsoft and Meta have worked together before and have been planning deeper integration since Meta Connect 2022, when Meta announced the Quest Pro, a work-focused VR headset. Some of the plans mentioned in 2022 didn't arrive until late 2023, so it could take months to see any progress on Windows Volumetric apps.
