
Samsung’s new Galaxy phones lay the groundwork for headsets and glasses to come

Samsung is working with Google on a mixed reality headset similar to Apple’s Vision Pro, running Android XR, and last year we got to see a demonstration of that technology. Samsung’s winter Unpacked event, focused on phones, revealed more, including a Google-Samsung AI partnership that may be the missing link that brings it all together. This AI-infused experience is expected to arrive on a new VR/AR headset in 2025, but also on the Galaxy S25 and the glasses that connect to it.

Samsung

A seeing AI that works in real time

Samsung briefly addressed upcoming VR/AR headsets and glasses at its latest Unpacked event, but we largely knew about those already. Still, Samsung’s demonstration of real-time AI that can see things on your phone or through cameras is exactly the trend we were expecting to arrive in 2025.

Project Moohan (meaning “Infinity” in Korean) is a VR headset with passthrough cameras that blend the virtual and the real, much like the Vision Pro or Meta’s Quest 3. The design resembles Meta’s discontinued Quest Pro, but the specs are much better. The headset has hand and eye tracking, runs Android apps on the Android XR operating system that will be fully revealed later this year, and has Google Gemini AI woven throughout.

Google’s Project Astra technology, which allows for real-time help on phones, headsets and glasses, is debuting on Samsung’s Galaxy S25 phones. It’s already been on my face: to use Live AI, I had to switch it into a live mode, after which it could see and hear everything I was doing, with a pause mode to temporarily stop the live assistance. It should also work while watching YouTube videos, just like the Android XR demonstration, and according to the Samsung and Google executives working on Android XR, it could even provide live help while playing games.

Gemini’s on-the-fly visual recognition skills might start feeling the same between glasses and phones.

Samsung

Better battery life and processing… for glasses?

Samsung and Google have also confirmed they’re working on smart glasses, also using Gemini AI, to compete with Meta’s Ray-Bans and a wave of other emerging eyewear. AR glasses may also be in the works. Smart glasses such as Meta’s Ray-Bans already rely on a connected phone, and live AI, which leans on smartphones to constantly assist glasses, may become a more common feature. The better processing and graphics and, most importantly, the improved battery life and cooling sounded to me like ways to make these phones better pocket computers for eventual glasses.

Samsung and Google are going to lean on personal data clouds to drive smarter AI assistants on both glasses and phones.

Samsung

A personal data set that these AI gadgets will need

Samsung also announced an obscure-sounding Personal Data Engine that Google and Samsung’s AI will take advantage of, bucketing personal data into a place where AI could develop richer conclusions and connections across all the things that are part of your life.

How that plays out, how it’s secured, or where its limits are was left extremely unclear. But it sounds like a repository of personal data that Samsung and Google’s AI can train on and use with connected products, including watches, rings and glasses.

Camera-enabled AI wearables are only as good as the data that can assist them, which is why so many of these devices feel clunky and weird to use right now, including Meta’s Ray-Bans in their AI modes. These AI devices usually can’t know things that your apps already handle better. Will I trust this process with Google, Samsung or anyone else? How will these phones and future glasses make the relationship between AI, our data and us clearer and more manageable? It feels like we’re watching one shoe drop here, with others likely to follow at Google’s I/O developer conference, which should discuss Android XR and Gemini’s advances in far more depth.

Samsung is making Project Moohan its first headset, with glasses to follow after that. Google and Samsung will likely share more at the developer-focused Google I/O conference in May or June, and full details may be revealed at the Samsung Unpacked event expected later this summer. By then we may learn more about why the Galaxy S25 phones, a bit boring at first glance, might be laying the groundwork for what comes next. That should become clearer toward the end of this year… or even later.


