Start building for glasses, new devices for Android XR and more in The Android Show | XR Edition

Posted by Matthew McCullough – VP of Product Management, Android Developer



Today, during The Android Show | XR Edition, we shared a look at the expanding Android XR platform, which is fundamentally evolving to bring a unified developer experience to the entire XR ecosystem. The latest announcements, from Developer Preview 3 to exciting new form factors, are designed to give you the tools and platform you need to create the next generation of XR experiences. Let’s dive into the details!

A spectrum of new devices ready for your apps

The Android XR platform is quickly expanding, providing more users and more opportunities for your apps. This growth is anchored by several new form factors that expand the possibilities for XR experiences.


A major focus is on lightweight, all-day wearables. At I/O, we announced we are working with Samsung and our partners Gentle Monster and Warby Parker to design stylish, lightweight AI glasses and Display AI glasses that you can wear comfortably all day. The integration of Gemini on glasses is set to unlock helpful, intelligent experiences like live translation and searching what you see.

And partners like Uber are already exploring how AI glasses can streamline the rider experience by providing simple, contextual directions and trip status right in the user’s view.


The ecosystem is simultaneously broadening its scope to include wired XR glasses, exemplified by Project Aura from XREAL. This device blends the immersive experiences typically found in headsets with portability and real-world presence. Project Aura is scheduled for launch next year.

New tools unlock development for all form factors

If you are developing for Android, you are already developing for Android XR. The release of Android XR SDK Developer Preview 3 brings increased stability for headset APIs and, most significantly, opens up development for AI Glasses. 


You can now build augmented experiences for AI glasses using new libraries like Jetpack Compose Glimmer, a UI toolkit for transparent displays, and Jetpack Projected, which lets you extend your Android mobile app directly to glasses. Furthermore, the SDK now includes powerful ARCore for Jetpack XR updates, such as Geospatial capabilities for wayfinding.
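Geospatial wayfinding ultimately builds on great-circle math between latitude/longitude pairs. As a plain-Kotlin illustration of that underlying math (not an SDK API), the haversine distance between two points:

```kotlin
import kotlin.math.*

// Great-circle distance between two lat/lng points (haversine formula) —
// the kind of math a wayfinding feature builds on. Plain Kotlin sketch,
// not part of ARCore for Jetpack XR.
fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}
```

One degree of longitude at the equator comes out to roughly 111 km, a handy sanity check when testing this kind of code.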

For immersive experiences on headsets and wired XR glasses like Project Aura from XREAL, this release also provides new APIs for detecting a device’s field-of-view, helping your adaptive apps adjust their UI.
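The adaptive idea can be sketched in plain Kotlin; the enum names and the 70° threshold below are illustrative assumptions for this sketch, not part of the Android XR SDK:

```kotlin
// Hypothetical layout modes for an adaptive XR app. Wired glasses typically
// have a narrower field of view than headsets, so an app might collapse
// side panels below some threshold. Threshold and names are assumptions.
enum class XrLayout { COMPACT, PANORAMIC }

fun chooseLayout(horizontalFovDegrees: Float): XrLayout =
    if (horizontalFovDegrees < 70f) XrLayout.COMPACT else XrLayout.PANORAMIC
```

In a real app, the field-of-view value would come from the new detection APIs and feed a decision like this to pick the UI variant.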

Check out our post on the Android XR Developer Preview 3 to learn more about all the latest updates. 

Expanding your reach with new engine ecosystems

The Android XR platform is built on the OpenXR standard, enabling integration with the tools you already use so you can build with your preferred engine.

Developers can use Unreal Engine’s native Android and OpenXR capabilities today to build for Android XR, leveraging the existing VR Template for immersive experiences. To provide additional, optimized extensions for the Android XR platform, a Google vendor plugin, including support for hand tracking, hand mesh, and more, will be released early next year.

Godot now includes Android XR support, leveraging its focus on OpenXR to enable development for devices like Samsung Galaxy XR. The new Godot OpenXR vendor plugin v4.2.2 stable allows developers to port their existing projects to the platform. 

Watch The Android Show | XR Edition

Thank you for tuning into The Android Show | XR Edition. Start building differentiated experiences today using the Developer Preview 3 SDK and test your apps with the XR Emulator in Android Studio. Your feedback is crucial as we continue to build this platform together. Head over to developer.android.com/xr to learn more and share your feedback.


How Calm Reimagined Mindfulness for Android XR

Posted by Stevan Silva, Sr. Product Manager, Android XR

Calm is a leading mental health and wellness company with over 180 million downloads. As a company dedicated to helping users sleep better, stress less, and live more mindfully, its extensive library has made Calm a trusted source for well-being content on Android. When the team started developing for Android XR, their core engineers built their first functional XR orbiter menus on day one and a core experience in just two weeks, demonstrating that building for XR can be an extension of existing Android development work, not something that has to be started from scratch.


With the introduction of the Android XR platform, the Calm team saw an opportunity to not just optimize their existing Android app, but to truly create the next generation of immersive experiences.


We sat down with Kristen Coke, Lead Product Manager, and Jamie Martini, Sr. Manager of Engineering at Calm, to dive into their journey building for Android XR and learn how other developers can follow their lead.

Q: What was the vision for the Calm experience on Android XR, and how does it advance your mission?

A (Kristen Coke, Lead Product Manager): Our mission is to support everyone on every step of their mental health journey. XR allows us to expand how people engage with our mindfulness content, creating an experience that wasn’t just transportive but transformative.

If I had to describe it in one sentence, Calm on Android XR reimagines mindfulness for the world around you, turning any room into a fully immersive, multisensory meditation experience.

We wanted to create a version of Calm that couldn’t exist anywhere else, a serene and emotionally intelligent sanctuary that users don’t just want to visit, but will return to again and again.

Q: For developers who might think building for XR is a massive undertaking, what was your initial approach to bringing your existing Android app over?

A (Jamie Martini, Sr. Manager of Engineering): Our main goal was to adapt our Android app for XR and honestly, the process felt easy and seamless.

We already use Jetpack Compose extensively for our mobile app, so expanding that expertise into XR was the natural choice. It felt like extending our Android development, not starting from scratch. We were able to reuse a lot of our existing codebase, including our backend, media playback, and other core components, which dramatically cut down on the initial work.

The Android XR design guides provided valuable context throughout the process, helping both our design and development teams shape Calm’s mobile-first UX into something natural and intuitive for a spatial experience.

Q: You noted the process felt seamless. How quickly was your team able to start building and iterating on the core XR experience?

A (Jamie Martini, Sr. Manager of Engineering): We were productive right away, building our first orbiter menus on day one and a core XR Calm experience in about two weeks. The ability to apply our existing Android and Jetpack experience directly to a spatial environment gave us a massive head start, making the time-to-first-feature incredibly fast.

Q: Could you tell us about what you built to translate the Calm experience into this new spatial environment?

A (Jamie Martini, Sr. Manager of Engineering): We wanted to take full advantage of the immersive canvas to rethink how users engage with our content.

Two of the key features we evolved were the Immersive Breathe Bubble and the Immersive Scene Experiences.

The Breathe Bubble is our beloved breathwork experience, but brought into 3D. It’s a softly pulsing orb that anchors users to their breath with full environmental immersion.

And with our Immersive Scene Experiences, users can choose from a curated selection of ambient environments designed to gently wrap around them and fade into their physical environment. This was a fantastic way to take a proven 2D concept (the mobile app’s customizable background scenes) and transform it for the spatial environment. 

We didn’t build new experiences from scratch; we simply evolved core, proven features to take advantage of the immersive canvas.


Q: What were the keys to building a visually compelling experience that feels native to the Android XR platform?


A (Kristen Coke, Lead Product Manager): Building for a human-scale, spatial environment required us to update our creative workflow.


We started with concept art to establish our direction, which we then translated into 3D models using a human-scale reference to ensure natural proportions and comfort for the user.


Then, we consistently tested the assets directly in a headset to fine-tune scale, lighting, and atmosphere. For developers who may not have a physical device, the Android XR emulator is a helpful alternative for testing and debugging.


We quickly realized that in a multisensory environment, restraint was incredibly powerful. We let the existing content (the narration, the audio) amplify the environment, rather than letting the novelty of the 3D space distract from the mindfulness core.


Q: How would you describe the learning curve for other developers interested in building for XR? Do you have any advice?


A (Jamie Martini, Sr. Manager of Engineering): This project was the first step into immersive platforms for our Android engineering team, and we were pleasantly surprised. The APIs were very easy to learn and use and felt consistent with other Jetpack libraries.


My advice to other developers? Begin by integrating the Jetpack XR APIs into your existing Android app and reusing as much of your existing code as possible. That is the quickest way to get a functional prototype.


A (Kristen Coke, Lead Product Manager): Think as big as possible. Android XR gave us a whole new world to build our app within. Teams should ask themselves: What is the biggest, boldest version of your experience that you could possibly build? This is your opportunity to finally put into action what you’ve always wanted to do, because now, you have the platform that can make it real.


Building the next generation of spatial experiences


The work the Calm team has done showcases how building on the Android XR platform can be a natural extension of your existing Android expertise. By leveraging the Jetpack XR SDKs, Calm quickly evolved their core mobile features into a stunning spatial experience.


If you’re ready to get started, you can find all the resources you need at developer.android.com/xr. Head over there to download the latest SDK, explore our documentation, and start building today.


Peacock built adaptively on Android to deliver great experiences across screens

Posted by Sa-ryong Kang and Miguel Montemayor – Developer Relations Engineers

Peacock is NBCUniversal’s streaming service app available in the US, offering culture-defining entertainment including live sports, exclusive original content, TV shows, and blockbuster movies. The app continues to evolve, becoming more than just a platform to watch content, but a hub of entertainment.

Today’s users are consuming entertainment on an increasingly wider array of device sizes and types, and in particular are moving towards mobile devices. Peacock has adopted Jetpack Compose to help with its journey in adapting to more screens and meeting users where they are.


Adapting to more flexible form factors

The Peacock development team is focused on bringing the best experience to users, no matter what device they’re using or when they want to consume content. With an emerging trend from app users to watch more on mobile devices and large screens like foldables, the Peacock app needs to be able to adapt to different screen sizes. As more devices are introduced, the team needed to explore new solutions that make the most out of each unique display permutation.

The goal was to have the Peacock app adapt to these new displays while continually offering high-quality entertainment without interruptions, like the stream reloading or visual errors. Thinking ahead, they also wanted to build a solution that was ready for Android XR, as the entertainment landscape shifts towards more immersive experiences.

“Thinking adaptively isn’t just about supporting tablets or large screens, it’s about future-proofing your app. Investing in adaptability helps you meet users’ expectations of having seamless experiences across all their devices and sets you up for what’s next.” – Diego Valente, Head of Mobile, Peacock & Global Streaming

Building a future-proof experience with Jetpack Compose

In order to build a scalable solution that would help the Peacock app continue to evolve, the app was migrated to Jetpack Compose, Android’s toolkit for building scalable UI. One of the essential tools they used was the WindowSizeClass API, which helps developers create and test UI layouts for different size ranges. This API then allows the app to seamlessly switch between pre-set layouts as it reaches established viewport breakpoints for different window sizes.

The API was used in conjunction with Kotlin Coroutines and Flows to keep the UI state responsive as the window size changed. To test their work and fine tune edge case devices, Peacock used the Android Studio emulator to simulate a wide range of Android-based devices.
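The breakpoint logic that WindowSizeClass encapsulates can be sketched in plain Kotlin. The 600dp and 840dp width breakpoints below match the documented Material size classes, though the real androidx.window API computes this for you from the current window metrics:

```kotlin
// Plain-Kotlin sketch of the width breakpoints behind WindowSizeClass.
// In production, use androidx.window's WindowSizeClass rather than
// hand-rolling this.
enum class WidthClass { COMPACT, MEDIUM, EXPANDED }

fun widthClassOf(widthDp: Float): WidthClass = when {
    widthDp < 600f -> WidthClass.COMPACT   // most phones in portrait
    widthDp < 840f -> WidthClass.MEDIUM    // unfolded foldables, small tablets
    else -> WidthClass.EXPANDED            // large tablets, desktop windows
}
```

Mapping each class to a pre-set layout is what lets the UI switch seamlessly as a foldable opens or a freeform window is resized.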

Jetpack Compose allowed the team to build adaptively, so now the Peacock app responds to a wide variety of screens while offering a seamless experience to Android users. “The app feels more native, more fluid, and more intuitive across all form factors,” said Diego Valente, Head of Mobile, Peacock and Global Streaming. “That means users can start watching on a smaller screen and continue instantly on a larger one when they unfold the device—no reloads, no friction. It just works.”

Preparing for immersive entertainment experiences

In building adaptive apps on Android, John Jelley, Senior Vice President, Product & UX, Peacock and Global Streaming, says Peacock has also laid the groundwork to quickly adapt to the Android XR platform: “Android XR builds on the same large screen principles, our investment here naturally extends to those emerging experiences with less developmental work.”

The team is excited about the prospect of features unlocked by Android XR, like Multiview for sports and TV, which enables users to watch multiple games or camera angles at once. By tailoring spatial windows to the user’s environment, the app could offer new ways for users to interact with contextual metadata like sports stats or actor information—all without ever interrupting their experience.

Build adaptive apps

Learn how to unlock your app’s full potential on phones, tablets, foldables, and beyond.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.

Updates to the Android XR SDK: Introducing Developer Preview 2

Posted by Matthew McCullough – VP of Product Management, Android Developer

Since launching the Android XR SDK Developer Preview alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all of the excitement we’ve been hearing from the broader Android community. Whether it’s through coding live-streams or local Google Developer Group talks, it’s been an outstanding experience participating in the community to build the future of XR together, and we’re just getting started.

Today we’re excited to share an update to the Android XR SDK: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools and open standards created for XR.

At Google I/O, we have two technical sessions related to Android XR. The first, “Building differentiated apps for Android XR with 3D content,” covers many features present in Jetpack SceneCore and ARCore for Jetpack XR. The second, “The future is now, with Compose and AI on Android XR,” covers creating XR-differentiated UI and our vision for the intersection of XR with cutting-edge AI capabilities.

Android XR sessions at Google I/O 2025

Building differentiated apps for Android XR with 3D content and The future is now, with Compose and AI on Android XR

What’s new in Developer Preview 2

Since the release of Developer Preview 1, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.

With the Jetpack XR SDK, you can now play back 180° and 360° videos, which can be stereoscopic by encoding with the MV-HEVC specification or by encoding view-frames adjacently. The MV-HEVC standard is optimized and designed for stereoscopic video, allowing your app to efficiently play back immersive videos at great quality. Apps built with Jetpack Compose for XR can use the SpatialExternalSurface composable to render media, including stereoscopic videos.
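As a rough illustration of the "view-frames adjacently" option: a side-by-side stereo frame splits into two per-eye crops, whereas MV-HEVC instead carries each eye as a separate coded layer. The names below are assumptions for this sketch, not Jetpack XR types:

```kotlin
// Splitting an adjacently encoded (side-by-side) stereo frame into
// per-eye crop rectangles. Illustrative only; a media pipeline would do
// this at the surface/shader level.
data class CropRect(val x: Int, val y: Int, val w: Int, val h: Int)

fun sideBySideEyeRects(frameW: Int, frameH: Int): Pair<CropRect, CropRect> {
    val half = frameW / 2
    val leftEye = CropRect(0, 0, half, frameH)      // left half of the frame
    val rightEye = CropRect(half, 0, half, frameH)  // right half of the frame
    return leftEye to rightEye
}
```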

Using Jetpack Compose for XR, you can now also define layouts that adapt to different XR display configurations. For example, use a SubspaceModifier to specify the size of a Subspace as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it’s positioned in.

Material Design for XR now supports more component overrides for TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping your large-screen enabled apps that use Material Design effortlessly adapt to the new world of XR.

An app adapts to XR using Material Design for XR with the new component overrides

In ARCore for Jetpack XR, you can now track hands after requesting the appropriate permissions. Hands are a collection of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps:

Hands bring a natural input method to your Android XR experience.
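A gesture detector over those joints can be as simple as a distance test. In this plain-Kotlin sketch, Vec3 and the 2 cm threshold are assumptions for illustration, not the ARCore for Jetpack XR API surface:

```kotlin
// Illustrative pinch detection over tracked hand-joint positions
// (26 posed joints per hand in ARCore for Jetpack XR). A pinch here is
// simply the thumb tip and index tip coming within ~2 cm of each other.
data class Vec3(val x: Float, val y: Float, val z: Float)

fun dist(a: Vec3, b: Vec3): Float {
    val dx = a.x - b.x; val dy = a.y - b.y; val dz = a.z - b.z
    return kotlin.math.sqrt(dx * dx + dy * dy + dz * dz)
}

fun isPinching(thumbTip: Vec3, indexTip: Vec3, thresholdMeters: Float = 0.02f): Boolean =
    dist(thumbTip, indexTip) < thresholdMeters
```

Real gesture recognition would smooth positions over several frames and check joint-pose confidence, but the core signal is this kind of joint-to-joint geometry.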

For more guidance on developing apps for Android XR, check out our Android XR Fundamentals codelab, the updates to our Hello Android XR sample project, and a new version of JetStream with Android XR support.

The Android XR Emulator has also received updates to stability, support for AMD GPUs, and is now fully integrated within the Android Studio UI.

The Android XR Emulator is now integrated in Android Studio

Developers using Unity have already successfully created and ported existing games and apps to Android XR. Today, you can upgrade to the Pre-Release version 2 of the Unity OpenXR: Android XR package! This update adds many performance improvements such as support for Dynamic Refresh Rate, which optimizes your app’s performance and power consumption. Shaders made with Shader Graph now support SpaceWarp, making it easier to use SpaceWarp to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.

Check out Unity’s improved Mixed Reality template for Android XR, which now includes support for occlusion and persistent anchors.

We recently launched Android XR Samples for Unity, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.

Google’s open-source Unity samples demonstrate platform features and show how they’re implemented

Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini’s capabilities, including multimodal input and output, and bi-directional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on the Firebase blog or go straight to the Gemini API using Vertex AI in Firebase SDK documentation to get started.

Continuing to build the future together

Our commitment to open standards continues with the glTF Interactivity specification, developed in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.

Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.

XREAL’s Project Aura

The Google Play Store is also getting ready for Android XR. It will list supported 2D Android apps on the Android XR Play Store when it launches later this year. If you are working on an Android XR differentiated app, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store.

And we know many of you are excited for the future of Android XR on glasses. We are shaping the developer experience now and will share more details on how you can participate later this year.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, try out our samples and codelabs.

We welcome your feedback, suggestions, and ideas as you’re helping shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.
