Building a safer Android and Google Play, together
Thu, 11 Dec 2025 22:26:00 +0000


Posted by Matthew Forsythe, Director, Product Management, App & Ecosystem Trust, and Ron Aquino, Sr. Director, Trust and Safety, Chrome, Android and Play




Earlier this year, we reiterated our commitment to keeping Android and Google Play safe for everyone and maintaining a thriving environment where users can trust the apps they download and your business can flourish. We’ve heard your feedback clearly, from excited conversations at Play events around the world to the honest concerns on social media. You want simpler ways to make sure your apps are compliant and pass review, and need strong protections for your business so you can focus on growth and innovation. We are proud of the steps we’ve taken together this year, but know this is ongoing work in a complex, ever-changing market.


Here are key actions we’ve taken this year to simplify your development journey and strengthen protection.

Simpler ways to build safer apps from the start

This year, we focused on making improvements to the app publishing experience by reducing friction points, from the moment you write code to submitting your app for review.

  • Policy guidance right where you code: We rolled out Play Policy Insights to all developers using Android Studio. This feature provides real-time, in-context guidance and policy warnings as you code, helping you proactively identify and resolve potential issues before you even submit your app for review.

  • Pre-review checks to help prevent app review surprises: Last year, we launched pre-review checks in Play Console so you can identify issues early, like incomplete policy declarations or crashes, and avoid rejections. This year, we expanded these checks for privacy policy links, login credential requirements, data deletion request links, inaccuracies in your Data safety form, and more.

Stronger protection for your business and users

We are committed to providing you with powerful ways to protect your apps and users from abuse. Beyond existing tools, programs, and the performance and security enhancements that come with every Android release, we’ve also launched:

  • Advanced abuse and fraud protection: We made the Play Integrity API faster and more resilient, and introduced new features like Play remediation prompts and device recall in beta. Device recall is a powerful new tool that lets you store and recall limited data associated with a device, even if the device is reset, helping protect your business model from repeat bad actors.

  • Tools to keep kids safe

    • We continued to invest in protecting children across Google products, including Google Play. New Play policy helps keep our youngest users safe globally by requiring apps with dating and gambling features to use Play Console tools to prevent minors from accessing them. Our enhanced Restrict Minor Access feature now blocks users we determine to be minors from searching for, downloading, or making purchases in apps they shouldn’t have access to.

    • We’ve also been providing tools to developers to help meet significant new age verification regulatory requirements in applicable US states.

  • More ways to stop malware from snooping on your app: Android 16 provides a new, powerful defense in a single line of code: accessibilityDataSensitive. This flag lets you explicitly mark views in your app as containing sensitive data, blocking malicious apps from seeing or interacting with them. If you already use setFilterTouchesWhenObscured(true) to protect your app from tapjacking, your views are automatically treated as sensitive data for accessibility, giving you an instant additional layer of defense with no extra work.
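To make the device recall concept above concrete, here is a minimal sketch in plain Kotlin. This is a toy in-memory model, not the Play Integrity API: in the real service the recalled bits are held by Google Play and survive app reinstalls and even device resets, and every name below is hypothetical.

```kotlin
// Toy model of "device recall": remember a small per-device signal so a
// repeat bad actor is recognized in a later session. In the real Play
// Integrity API this state persists server-side, not in an in-memory map;
// RecallStore and its methods are illustrative names only.
class RecallStore {
    private val abuseBit = mutableMapOf<String, Boolean>()

    // Record that this device was involved in abuse (e.g. refund fraud).
    fun markAbuse(deviceId: String) {
        abuseBit[deviceId] = true
    }

    // Check the recalled bit in a later session, even after a reinstall.
    fun seenAbuseBefore(deviceId: String): Boolean =
        abuseBit[deviceId] ?: false
}

fun main() {
    val store = RecallStore()
    store.markAbuse("device-123")
    println(store.seenAbuseBefore("device-123")) // prints "true"
    println(store.seenAbuseBefore("device-456")) // prints "false"
}
```

The point of the design is that the lookup key outlives your app's local storage, so a "fresh install" no longer looks like a fresh user.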

Smoother policy compliance experience

We’re listening to your concerns and proactively working to make the experience of Play policy compliance and Android security requirements more transparent, predictable, and accessible for all developers. You asked for clarity, fairness, and speed, and here is what we launched:

  • More support when you need it: Beyond the webinars and resources that we share, you told us you needed more direct policy help to understand requirements and get answers. Next week, we’ll add a direct way for you to reach our team about policy questions in your Play Console. You’ll be able to find this new, integrated support experience directly within your Play Console via the “Help” section. We also expanded the Google Play Developer Help Community to more languages, like Indonesian, Japanese, Korean, and Portuguese. 

  • Clearer documentation: You asked for policy that’s easier to understand. To help you quickly grasp essential requirements, we’ve introduced a new Key Considerations section across several policies (like Permissions and Target API Level) and included concise “Do’s & Don’ts” and easier-to-read summaries.

  • More transparent appeals process: We introduced a 180-day appeal window for account terminations. This allows us to prioritize and make decisions faster for developers who file appeals.

  • Android developer verification design changes: To support a diverse range of users and developers, we’re taking action on your feedback. 

    • First, we’re creating a dedicated free account type to support students and hobbyists who want to build apps just for a small group, like family and friends. This means that you can share your creations with a limited number of devices without needing to go through the full developer verification process.

    • We’re also building a flow for experienced users to be able to install unverified apps. This is being carefully designed to balance providing choice with prioritizing security, including clear warnings so users fully understand the risks before choosing to bypass standard safety checks. 

The improvements we made this year are only the beginning. Your feedback helps drive our roadmap, and it will continue to inform future refinements to our policies, tools, and experiences, ensuring Android and Google Play remain the safest and most trusted place for you to innovate and grow your business.


Thank you for being our partner in building the future of Android.


Enhancing Android security: Stop malware from snooping on your app data
Thu, 11 Dec 2025 17:00:00 +0000

Posted by Bennet Manuel, Product Management, Android App Safety and Rob Clifford, Developer Relations




Security is foundational to Android. We partner with you to keep the platform safe and protect user data by offering powerful security tools and features, like Credential Manager and FLAG_SECURE. Every Android release brings performance and security enhancements, and with Android 16, you can take simple, significant steps to strengthen your app’s defenses. Check out our video or continue reading to learn more about our enhanced protections for accessibility APIs.

Protect your app from snooping with a single line of code

We’ve seen that bad actors sometimes try to exploit accessibility API features to read sensitive information, like passwords and financial details, directly from the screen and manipulate a user’s device by injecting touches. To combat this, Android 16 provides a new, powerful defense in a single line of code: accessibilityDataSensitive.

The accessibilityDataSensitive flag allows you to explicitly mark a view or composable as containing sensitive data. When you set this flag on a view, you block potentially malicious apps from accessing that view’s data or performing interactions on it. Here is how it works: any app requesting accessibility permission that hasn’t explicitly declared itself as a legitimate accessibility tool (isAccessibilityTool=true) is denied access to that view.
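As a rough mental model of that gating rule, the decision reduces to a simple predicate. This is plain Kotlin, not the framework's actual implementation, and the names are illustrative, not Android APIs:

```kotlin
// Toy model of the Android 16 gating described above: a service that holds
// the accessibility permission may access an accessibilityDataSensitive
// view only if it also declares itself an accessibility tool.
// RequestingService and canAccessSensitiveView are illustrative names.
data class RequestingService(
    val hasAccessibilityPermission: Boolean,
    val declaresAccessibilityTool: Boolean, // i.e. sets isAccessibilityTool=true
)

fun canAccessSensitiveView(service: RequestingService): Boolean =
    service.hasAccessibilityPermission && service.declaresAccessibilityTool

fun main() {
    val screenReader = RequestingService(hasAccessibilityPermission = true, declaresAccessibilityTool = true)
    val spyware = RequestingService(hasAccessibilityPermission = true, declaresAccessibilityTool = false)
    println(canAccessSensitiveView(screenReader)) // prints "true"
    println(canAccessSensitiveView(spyware))      // prints "false"
}
```

In other words, holding the accessibility permission alone is no longer enough; the requester must also have publicly declared itself as an accessibility tool, which is exactly the declaration Play reviews and Play Protect enforces.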

This simple but effective change helps to prevent malware from stealing information and performing unauthorized actions, all without impacting users’ experience of legitimate accessibility tools. Note: If an app is not an accessibility tool but requests accessibility permissions and sets isAccessibilityTool=true, Play will reject it and Google Play Protect will block it on user devices. 

Automatic, enhanced security for setFilterTouchesWhenObscured protection

We’ve already integrated this new accessibilityDataSensitive security functionality with the existing setFilterTouchesWhenObscured method. 

If you already use setFilterTouchesWhenObscured(true) to protect your app from tapjacking, your views are automatically treated as sensitive data for accessibility. By enhancing the setFilterTouchesWhenObscured method with accessibilityDataSensitive protections, we’re instantly giving everyone an additional layer of defense with no extra work.

Getting started

We recommend that you use setFilterTouchesWhenObscured, or alternatively the accessibilityDataSensitive flag, on any screen that contains sensitive information, including login pages, payment flows, and any view displaying personal or financial data.

For Jetpack Compose

setFilterTouchesWhenObscured

val composeView = LocalView.current
DisposableEffect(Unit) {
    composeView.filterTouchesWhenObscured = true
    onDispose { composeView.filterTouchesWhenObscured = false }
}

accessibilityDataSensitive

Use the semantics modifier to apply the sensitiveData property to a composable.

BasicText(
    text = "Your password",
    modifier = Modifier.semantics {
        sensitiveData = true
    }
)

For View-based apps

In your XML layout, add the relevant attribute to the sensitive view.

setFilterTouchesWhenObscured

<TextView android:filterTouchesWhenObscured="true" />

accessibilityDataSensitive

<TextView android:accessibilityDataSensitive="true" />

Alternatively, you can set the property programmatically in Java or Kotlin:

setFilterTouchesWhenObscured

Kotlin:

myView.filterTouchesWhenObscured = true

Java:

myView.setFilterTouchesWhenObscured(true);

accessibilityDataSensitive

Kotlin:

myView.isAccessibilityDataSensitive = true

Java:

myView.setAccessibilityDataSensitive(true);

Read more about the accessibilityDataSensitive and setFilterTouchesWhenObscured flags in the Tapjacking guide.

Partnering with developers to keep users safe

We worked with developers early to ensure this feature meets real-world needs and integrates smoothly into your workflow.

 “We’ve always prioritized protecting our customers’ sensitive financial data, which required us to build our own protection layer against accessibility-based malware. Revolut strongly supports the introduction of this new, official Android API, as it allows us to gradually move away from our custom code in favor of a robust, single-line platform defense.”

– Vladimir Kozhevnikov, Android Engineer at Revolut

You can play a crucial role in protecting your users from malicious accessibility-based attacks by adopting these features, and we encourage all developers to integrate them into their apps to help keep users safe.

Together, we can build a more secure and trustworthy experience for everyone.

#WeArePlay: How Miksapix Interactive is bringing ancient Sámi Mythology to gamers worldwide
Wed, 10 Dec 2025 12:01:55 +0000


Posted by Robbie McLachlan, Developer Marketing



In our latest #WeArePlay film, which celebrates the people behind apps and games on Google Play, we meet Mikkel – the founder and CEO of Miksapix Interactive. Mikkel is on a mission to share the rich stories and culture of the Sámi people through gaming. Discover how he is building a powerful platform for cultural preservation using a superheroine.

You went from a career in broadcasting to becoming a founder in the games industry. What inspired that leap?

I’ve had an interest in games for a long time and always found the medium interesting. While I was working for a broadcast corporation in Karasjok, I was thinking, “Why aren’t there any Sámi games or games with Sámi content?” Sámi culture is quite rich in lore and mythology. I wanted to bring that to a global stage. That’s how Miksapix Interactive was born.


Your game, Raanaa – The Shaman Girl, is deeply rooted in Sámi culture. What is the significance of telling these specific stories?

Because these are our stories to tell! Our mission is to tell them to a global audience to create awareness about Sámi identity and culture. Most people in the world don’t know about the Sámi and the Sámi cultures and lore. With our languages at risk, I hope to use storytelling as a way to inspire Sámi children to value their language, celebrate their identity, and take pride in their cultural heritage. Sámi mythology is rich with powerful matriarchs and goddesses, which inspired us to create a superheroine. Through her journey of self-discovery and empowerment, Raanaa finds her true strength — a story we hope will inspire hope and resilience in pre-teens and teens around the world. Through games like Raanaa – The Shaman Girl, we get to convey our stories in new formats.

How did growing up with rich storytelling affect your games?

I was raised in a reindeer herding family, which means we spent a lot of time in nature and out in the fields with the reindeer. Storytelling was a big part of family life. We would eat supper in the Lavvu tent, sitting around a bonfire with relatives and parents telling stories. With Miksapix Interactive, I am taking my love for storytelling and bringing it to the world, drawing on my first-hand experience of Sámi culture.

How has Google Play helped you achieve global reach from your base in the Arctic?

For us, Google Play was a no-brainer. It was the easiest option to just release it on Google Play, no hassle. We have more downloads from Google Play than anywhere else, and it has definitely helped us reach markets like Brazil, India, the US, and beyond. The positive Play Store reviews motivated and inspired us during the development of Raanaa. We use Google products like Google Sheets for collaboration when we do a localization or translation.

What is next for Miksapix Interactive?

Now, our sights are set on growth. We are very focused on the Raanaa IP. For the mobile game, we are looking into localizing it to different Sámi languages. In Norway, we have six Sámi languages, so we are now going to translate it to Lule Sámi and Southern Sámi. We’re planning to have these new Sámi languages available this year.

Discover other inspiring app and game founders featured in #WeArePlay.

Start building for glasses, new devices for Android XR and more in The Android Show | XR Edition
Wed, 10 Dec 2025 12:01:27 +0000


Posted by Matthew McCullough – VP of Product Management, Android Developer



Today, during The Android Show | XR Edition, we shared a look at the expanding Android XR platform, which is fundamentally evolving to bring a unified developer experience to the entire XR ecosystem. The latest announcements, from Developer Preview 3 to exciting new form factors, are designed to give you the tools and platform you need to create the next generation of XR experiences. Let’s dive into the details!

A spectrum of new devices ready for your apps

The Android XR platform is quickly expanding, providing more users and more opportunities for your apps. This growth is anchored by several new form factors that expand the possibilities for XR experiences.


A major focus is on lightweight, all-day wearables. At I/O, we announced we are working with Samsung and our partners Gentle Monster and Warby Parker to design stylish, lightweight AI glasses and Display AI glasses that you can wear comfortably all day. The integration of Gemini on glasses is set to unlock helpful, intelligent experiences like live translation and searching what you see.

And partners like Uber are already exploring how AI Glasses can streamline the rider experience by providing simple, contextual directions and trip status right in the user’s view.


The ecosystem is simultaneously broadening its scope to include wired XR glasses, exemplified by Project Aura from XREAL. This device blends the immersive experiences typically found in headsets with portability and real-world presence. Project Aura is scheduled for launch next year.

New tools unlock development for all form factors

If you are developing for Android, you are already developing for Android XR. The release of Android XR SDK Developer Preview 3 brings increased stability for headset APIs and, most significantly, opens up development for AI Glasses. 


You can now build augmented experiences for AI glasses using new libraries like Jetpack Compose Glimmer, a UI toolkit for transparent displays, and Jetpack Projected, which lets you extend your Android mobile app directly to glasses. Furthermore, the SDK now includes powerful ARCore for Jetpack XR updates, such as Geospatial capabilities for wayfinding.

For immersive experiences on headsets and wired XR glasses like Project Aura from XREAL, this release also provides new APIs for detecting a device’s field-of-view, helping your adaptive apps adjust their UI.

Check out our post on the Android XR Developer Preview 3 to learn more about all the latest updates. 

Expanding your reach with new engine ecosystems

The Android XR platform is built on the OpenXR standard, enabling integration with the tools you already use so you can build with your preferred engine.

Developers can use Unreal Engine’s native Android and OpenXR capabilities today to build for Android XR, leveraging the existing VR Template for immersive experiences. To provide additional, optimized extensions for the Android XR platform, a Google vendor plugin, including support for hand tracking, hand mesh, and more, will be released early next year.

Godot now includes Android XR support, leveraging its focus on OpenXR to enable development for devices like Samsung Galaxy XR. The new Godot OpenXR vendor plugin v4.2.2 stable allows developers to port their existing projects to the platform. 

Watch The Android Show | XR Edition

Thank you for tuning into The Android Show | XR Edition. Start building differentiated experiences today using the Developer Preview 3 SDK and test your apps with the XR Emulator in Android Studio. Your feedback is crucial as we continue to build this platform together. Head over to developer.android.com/xr to learn more and share your feedback.


Build for AI Glasses with the Android XR SDK Developer Preview 3 and unlock new features for immersive experiences
Mon, 08 Dec 2025 18:00:00 +0000


Posted by Matthew McCullough – VP of Product Management, Android Developer

In October, Samsung launched Galaxy XR – the first device powered by Android XR. And it’s been amazing seeing what some of you have been building! Here’s what some of our developers have been saying about their journey into Android XR.

Android XR gave us a whole new world to build our app within. Teams should ask themselves: What is the biggest, boldest version of your experience that you could possibly build? This is your opportunity to finally put into action what you’ve always wanted to do, because now, you have the platform that can make it real.

You’ve also seen us share a first look at other upcoming devices that work with Android XR like Project Aura from XREAL and stylish glasses from Gentle Monster and Warby Parker.

To support the expanding selection of XR devices, we are announcing Android XR SDK Developer Preview 3!


With Android XR SDK Developer Preview 3, on top of building immersive experiences for devices such as Galaxy XR, you can also now build augmented experiences for upcoming AI Glasses with Android XR.

New tools and libraries for augmented experiences

With Developer Preview 3, we are unlocking the tools and libraries you need to build intelligent, hands-free augmented experiences for AI Glasses. AI Glasses are lightweight and portable for all-day wear. You can extend your existing mobile app to take advantage of the built-in speakers, camera, and microphone to provide new, thoughtful, and helpful user interactions. With the addition of a small display on display AI Glasses, you can privately present information to users. AI Glasses are perfect for experiences that help enhance a user’s focus and presence in the real world.

To power augmented experiences on AI Glasses, we are introducing two new, purpose-built libraries to the Jetpack XR SDK:

  • Jetpack Projected – built to bridge mobile devices and AI Glasses with features that allow you to access sensors, speakers, and displays on glasses

  • Jetpack Compose Glimmer – new design language and UI components for crafting and styling your augmented experiences on display AI Glasses


Jetpack Compose Glimmer is a demonstration of design best practices for beautiful, optical see-through augmented experiences. With UI components optimized for the input modality and styling requirements of display AI Glasses, Jetpack Compose Glimmer is designed for clarity, legibility, and minimal distraction.

To help visualize and test your Jetpack Compose Glimmer UI we are introducing the AI Glasses emulator in Android Studio. The new AI Glasses emulator can simulate glasses-specific interactions such as touchpad and voice input. 

Beyond the new Jetpack Projected and Jetpack Compose Glimmer libraries, we are also expanding ARCore for Jetpack XR to support AI Glasses. We are starting off with motion tracking and geospatial capabilities for augmented experiences – the exact features that enable you to create helpful navigation experiences perfect for all-day-wear devices like AI Glasses.


Expanding support for immersive experiences

We continue to invest in the libraries and tooling that power immersive experiences for XR Headsets like Samsung Galaxy XR and wired XR Glasses like the upcoming Project Aura from XREAL. We’ve been listening to your feedback and have added several highly-requested features to the Jetpack XR SDK since developer preview 2.


Jetpack SceneCore now features dynamic glTF model loading via URIs and improved materials support for creating new PBR materials at runtime. Additionally, the SurfaceEntity component has been enhanced with full Widevine Digital Rights Management (DRM) support and new shapes, allowing it to render 360-degree and 180-degree videos in spheres and hemispheres.

In Jetpack Compose for XR, you’ll find new features like the UserSubspace component for follow behavior, ensuring content remains in the user’s view regardless of where they look. Additionally, you can now use spatial animations for smooth transitions like sliding or fading. And to support an expanding ecosystem of immersive devices with diverse display capabilities, you can now specify layout sizes as fractions of the user’s comfortable field of view.

In Material Design for XR, new components automatically adapt spatially via overrides. These include dialogs that elevate spatially, and navigation bars, which pop out into an Orbiter. Additionally, there is a new SpaceToggleButton component for easily transitioning to and from full space.

And in ARCore for Jetpack XR, new perception capabilities have been added, including face tracking with 68 blendshape values, unlocking a world of facial gestures. You can also use eye tracking to power virtual avatars, and depth maps to enable more-realistic interactions with a user’s environment.

For devices like Project Aura from XREAL, we are introducing the XR Glasses emulator in Android Studio. This essential tool is designed to give you accurate content visualization, matching real device specifications for field of view (FoV), resolution, and DPI to accelerate your development.


If you build immersive experiences with Unity, we’re also expanding your perception capabilities in the Android XR SDK for Unity. In addition to lots of bug fixes and other improvements, we are expanding tracking capabilities to include: QR and ArUco codes, planar images, and body tracking (experimental). We are also introducing a much-requested feature: scene meshing. It enables you to have much deeper interactions with your user’s environment – your digital content can now bounce off of walls and climb up couches!

And that’s just the tip of the iceberg! Be sure to check out our immersive experiences page for more information.

Get Started Today!

The Android XR SDK Developer Preview 3 is available today! Download the latest Android Studio Canary (Otter 3, Canary 4 or later), upgrade to the latest emulator version (36.4.3 Canary or later), and then visit developer.android.com/xr to get started with the latest libraries and samples you need to build for the growing selection of Android XR devices. We’re building Android XR together with you! Don’t forget to share your feedback, suggestions, and ideas with our team as you progress on your journey in Android XR.


Android Studio Otter 2 Feature Drop is stable!
Thu, 04 Dec 2025 18:37:00 +0000


Posted by Sandhya Mohan – Product Manager, Trevor Johns – Developer Relations Engineer

The Android Studio Otter 2 Feature Drop is here to supercharge your productivity.

This final stable release for ‘25 powers up Agent Mode, equipping it with the new Android Knowledge Base for improved accuracy, and giving you the option to try out the new Gemini 3 model. You’ll also be able to take advantage of new settings such as the ability to keep your personalized IDE environment consistent across all of your machines. We’ve also incorporated all of the latest stability and performance improvements from the IntelliJ IDEA 2025.2 platform, including Kotlin compiler and terminal improvements, making this a significant enhancement for your development workflow.

Updates to Agent Mode

Access to Gemini 3


We recently introduced the ability to use our latest model, Gemini 3 Pro Preview, within Android Studio. This is our best model for coding and agentic capabilities. It’ll give you superior performance in Agent Mode and advanced problem-solving capabilities so you can focus on what you do best: creating high quality apps for your users.

We are beginning to roll out limited Gemini 3 access (with a 1-million-token context window) to developers who are using the no-cost default model. For higher usage rate limits and longer sessions with Agent Mode, you can add a paid Gemini API key or use a Gemini Code Assist Enterprise plan. Learn more about how to get started with Gemini 3.



Enhance Agent Mode with Android knowledge


While the training of large language models provides deep knowledge that is excellent for common tasks—like creating Compose UIs—training concludes on a fixed date, resulting in gaps for new libraries and updated best practices. They are also less effective with niche APIs because the necessary training data is scarce. To fix this, Android Studio’s Agent Mode is now equipped with the Android Knowledge Base, a new feature designed to significantly improve accuracy and reduce hallucinations by grounding responses with authoritative documentation. This means that instead of just relying on its training data, the agent can actively consult fresh documentation from official sources like the Android developer docs, Firebase, Google Developers, and Kotlin docs before it answers you.

The information in the Android Knowledge Base is stored in Android Studio and its content is automatically updated in the background on a periodic basis, so this feature is available regardless of which LLM you’re using for AI assistance.

Gemini searching documentation before it answers you
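The grounding step described above can be illustrated with a toy retrieval sketch in plain Kotlin. This is purely hypothetical and not how Android Studio's Knowledge Base is implemented: it just picks the documentation snippet sharing the most words with the query, which is the rough shape of retrieval-augmented answering.

```kotlin
// Toy keyword-overlap retrieval over a two-entry "knowledge base",
// sketching the idea of grounding an answer in documentation rather than
// relying on model memory alone. Entirely illustrative.
val knowledgeBase = mapOf(
    "compose" to "Use Jetpack Compose for declarative UI.",
    "permissions" to "Declare permissions in the manifest and request them at runtime.",
)

fun retrieve(query: String): String {
    val queryWords = query.lowercase().split(Regex("\\W+")).toSet()
    // Score each snippet by how many of its words appear in the query,
    // and return the best-scoring one.
    return knowledgeBase.values.maxByOrNull { doc ->
        doc.lowercase().split(Regex("\\W+")).count { it in queryWords }
    } ?: ""
}

fun main() {
    println(retrieve("How do I request permissions at runtime?"))
    // prints "Declare permissions in the manifest and request them at runtime."
}
```

A real system retrieves from continuously refreshed documentation and feeds the retrieved text to the model as context before it answers, which is why grounding works regardless of which LLM is doing the answering.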

This feature will be invoked automatically when Agent Mode detects a need for additional context, and you’ll see additional explanatory text. However, if you’d like Agent Mode to reference documentation more frequently, you can include a line such as “Refer to Android documentation for guidance” in your Rules configuration.

Requested settings updates

Backup and Sync

Backup and Sync is a new way to keep your personalized Android Studio environment consistent across all your installations. You can now back up your settings—including your preferred keymaps, Code Editor settings, system settings, and more—to cloud storage using your Google Account, giving you a seamless experience wherever you code. We also support Backup and Sync using JetBrains accounts for developers using both IntelliJ and Android Studio installs simultaneously.

Backup and Sync

Getting started is simple. Just sign into your Google Account by clicking the avatar in the top-right corner of the IDE, or navigate to Settings > Backup and Sync. Once you authorize Android Studio to access your account’s storage, you have full control over which categories of app data you want to sync. If you’re syncing for the first time on a new machine, Android Studio will give you the option to either download your existing remote settings or upload your current local settings to the cloud. Of course, if you change your mind, you can easily disable Backup and Sync at any time from the settings menu. This feature has been available since the first Android Studio Otter release.


Communications from Android Studio

You can now opt in to receive communications directly from the Android Studio team. This enables you to get emails and notifications about important product updates, new features, and new libraries as soon as they’re available.

You’ll see this option when you sign in, and you can change your preference at any time by going to Settings > Tools > Google Accounts > Communications.

Your option to receive emails and notifications


IntelliJ Merge Updates

This release incorporates all stability and quality improvements from the IntelliJ IDEA 2025.2 platform. Notable highlights include:

  • Kotlin K2 Mode: Following its rapid adoption after being enabled by default, the K2 Kotlin mode is now more stable and performant. This version improves Kotlin code analysis stability, adds new inspections, and enhances the reliability of Kotlin script execution.

  • Terminal Performance: The integrated terminal is significantly faster, with major improvements in rendering. For Bash and Zsh, this update also introduces minor visual refinements without compromising or altering core shell behavior.

Get started

Ready to dive in and accelerate your development? Download the Android Studio Otter 2 Feature Drop and start exploring these powerful new features today! As always, your feedback is crucial to us. Check known issues, report bugs, and be part of our vibrant community on LinkedIn, Medium, YouTube, or X. Let’s build the future of Android apps together!

The post Android Studio Otter 2 Feature Drop is stable! appeared first on InShot Pro.

What’s new in the Jetpack Compose December ’25 release https://theinshotproapk.com/whats-new-in-the-jetpack-compose-december-25-release/ Wed, 03 Dec 2025 20:34:00 +0000 https://theinshotproapk.com/whats-new-in-the-jetpack-compose-december-25-release/ Posted by Nick Butcher, Jetpack Compose Product Manager Today, the Jetpack Compose December ‘25 release is stable. This contains version ...


Posted by Nick Butcher, Jetpack Compose Product Manager




Today, the Jetpack Compose December ‘25 release is stable. This contains version 1.10 of the core Compose modules and version 1.4 of Material 3 (see the full BOM mapping), adding new features and major performance improvements.


To use today’s release, upgrade your Compose BOM version to 2025.12.00:


implementation(platform("androidx.compose:compose-bom:2025.12.00"))

Performance improvements

We know that the runtime performance of your app is hugely important to you and your users, so performance has been a major priority for the Compose team. This release brings a number of improvements—and you get them all by just upgrading to the latest version. Our internal scroll benchmarks show that Compose now matches the performance you would see if using Views:


Scroll performance benchmark comparing Views and Jetpack Compose across different versions of Compose


Pausable composition in lazy prefetch

Pausable composition in lazy prefetch is now enabled by default. This is a fundamental change to how the Compose runtime schedules work, designed to significantly reduce jank during heavy UI workloads.


Previously, once a composition started, it had to run to completion. If a composition was complex, this could block the main thread for longer than a single frame, causing the UI to freeze. With pausable composition, the runtime can now “pause” its work if it’s running out of time and resume the work in the next frame. This is particularly effective when used with lazy layout prefetch to prepare frames ahead of time. The Lazy layout CacheWindow APIs introduced in Compose 1.9 are a great way to prefetch more content and benefit from pausable composition to produce much smoother UI performance.
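The CacheWindow approach looks roughly like this (a sketch, not a drop-in: LazyLayoutCacheWindow and its ahead/behind parameters are assumed from the experimental Compose 1.9 foundation API, and FeedCard is a placeholder composable):

```kotlin
// Sketch: keep extra items composed around the viewport so pausable
// composition can spread their preparation across frames.
// API shapes here are assumptions; check the current foundation reference.
@OptIn(ExperimentalFoundationApi::class)
@Composable
fun Feed(data: List<String>) {
    // Prefetch ~200.dp of content ahead of the viewport, retain ~100.dp behind.
    val state = rememberLazyListState(
        cacheWindow = LazyLayoutCacheWindow(ahead = 200.dp, behind = 100.dp)
    )
    LazyColumn(state = state) {
        items(data) { item -> FeedCard(item) }
    }
}
```

With a larger window, prefetched items are composed pausably across idle frame time rather than all at once on the frame they enter the viewport.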


Pausable composition combined with Lazy prefetch help reduce jank


We’ve also optimized performance elsewhere, with improvements to Modifier.onPlaced, Modifier.onVisibilityChanged, and other modifier implementations. We’ll continue to invest in improving the performance of Compose.


New features

Retain

Compose offers a number of APIs to hold and manage state across different lifecycles; for example, remember persists state across compositions, and rememberSaveable/rememberSerializable persist state across activity or process recreation. retain is a new API that sits between these, enabling you to persist values across configuration changes without serialization, but not across process death. Because retain does not serialize your state, you can persist objects like lambda expressions, flows, and large objects like bitmaps, which cannot be easily serialized. For example, you may use retain to manage a media player (such as ExoPlayer) to ensure that media playback doesn’t get interrupted by a configuration change.


@Composable
fun MediaPlayer() {
    val applicationContext = LocalContext.current.applicationContext
    val exoPlayer = retain { ExoPlayer.Builder(applicationContext).apply { … }.build() }
    …
}


We want to extend our thanks to the AndroidDev community (especially the Circuit team), who have influenced and contributed to the design of this feature.


Material 1.4

Version 1.4.0 of the material3 library adds a number of new components and enhancements:


Horizontal centered hero carousel


Note that Material 3 Expressive APIs continue to be developed in the alpha releases of the material3 library. To learn more, see this recent talk:

 


New animation features

We continue to expand on our animation APIs, including updates for customizing shared element animations.


Dynamic shared elements

By default, sharedElement() and sharedBounds() animations attempt to animate layout changes whenever a matching key is found in the target state. However, you may want to disable this animation dynamically based on certain conditions, such as the direction of navigation or the current UI state.


To control whether the shared element transition occurs, you can now customize the SharedContentConfig passed to rememberSharedContentState(). The isEnabled property determines if the shared element is active.

SharedTransitionLayout {
    val transition = updateTransition(currentState)
    transition.AnimatedContent { targetState ->
        // Create the configuration that depends on state changing.
        fun animationConfig(): SharedTransitionScope.SharedContentConfig {
            return object : SharedTransitionScope.SharedContentConfig {
                override val SharedTransitionScope.SharedContentState.isEnabled: Boolean
                    get() =
                        // determine whether to perform a shared element transition
            }
        }
    }
}


See the documentation for more.


Modifier.skipToLookaheadPosition()

A new modifier, Modifier.skipToLookaheadPosition(), has been added in this release; it keeps a composable at its final (lookahead) position while a shared element animation plays. This enables transitions like a “reveal” animation, as seen in the Androidify sample with the progressive reveal of the camera. See the video tip here for more information:



Initial velocity in shared element transitions

This release adds a new shared element transition API, prepareTransitionWithInitialVelocity, which lets you pass an initial velocity (e.g. from a gesture) to a shared element transition:


Modifier.fillMaxSize()
    .draggable2D(
        rememberDraggable2DState { offset += it },
        onDragStopped = { velocity ->
            // Set up the initial velocity for the upcoming shared element
            // transition.
            sharedContentStateForDraggableCat
                ?.prepareTransitionWithInitialVelocity(velocity)
            showDetails = false
        },
    )

A shared element transition that starts with an initial velocity from a gesture

Veiled transitions

EnterTransition and ExitTransition define how an AnimatedVisibility/AnimatedContent composable appears or disappears. A new experimental veil option allows you to specify a color to veil or scrim content; e.g., fading in/out a semi-opaque black layer over content:


Veiled animated content – note the semi-opaque veil (or scrim) over the grid content during the animation


AnimatedContent(
    targetState = page,
    modifier = Modifier.fillMaxSize().weight(1f),
    transitionSpec = {
        if (targetState > initialState) {
            (slideInHorizontally { it } togetherWith
                slideOutHorizontally { -it / 2 } + veilOut(targetColor = veilColor))
        } else {
            slideInHorizontally { -it / 2 } +
                unveilIn(initialColor = veilColor) togetherWith slideOutHorizontally { it }
        }
    },
) { targetPage ->
    …
}

Upcoming changes


Deprecation of Modifier.onFirstVisible

Compose 1.9 introduced Modifier.onVisibilityChanged and Modifier.onFirstVisible. After reviewing your feedback, it became apparent that the contract of Modifier.onFirstVisible (firing only once, when an item first becomes visible) was not possible to honor deterministically. For example, a Lazy layout may dispose of items that scroll out of the viewport, and then compose them again if they scroll back into view. In this circumstance, the onFirstVisible callback would fire again, as it is a newly composed item. Similar behavior would also occur when navigating back to a previously visited screen containing onFirstVisible. As such, we have decided to deprecate this modifier in the next Compose release (1.11) and recommend migrating to onVisibilityChanged. See the documentation for more information.
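If you rely on once-only semantics, that bookkeeping can live in your own code after migrating to onVisibilityChanged. A minimal sketch (this tracker class is illustrative, not part of the Compose API):

```kotlin
// Sketch: "fire once per key" semantics layered on top of visibility
// callbacks. Hoist one tracker above the lazy layout so it survives
// items being disposed and recomposed.
class FirstVisibleTracker {
    private val seen = mutableSetOf<Any>()

    /** Returns true only the first time [key] is reported visible. */
    fun onVisibility(key: Any, visible: Boolean): Boolean = visible && seen.add(key)
}
```

Inside an item you would then wire it up with something like `Modifier.onVisibilityChanged { visible -> if (tracker.onVisibility(key, visible)) logImpression(key) }`, where `logImpression` is a placeholder for your own callback.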


Coroutine dispatch in tests

We plan to change coroutine dispatch in tests to reduce test flakiness and catch more issues. Currently, tests use the UnconfinedTestDispatcher, which differs from production behavior; e.g., effects may run immediately rather than being enqueued. In a future release, we plan to introduce a new API that uses StandardTestDispatcher by default to match production behavior. You can try the new behavior now in 1.10:


@get:Rule // also createAndroidComposeRule, createEmptyComposeRule

val rule = createComposeRule(effectContext = StandardTestDispatcher())


Using the StandardTestDispatcher will queue tasks, so you must use synchronization mechanisms like composeTestRule.waitForIdle() or composeTestRule.runOnIdle(). If your test uses runTest, you must ensure that runTest and your Compose rule share the same StandardTestDispatcher instance for synchronization.


// 1. Create a SINGLE dispatcher instance
val testDispatcher = StandardTestDispatcher()

// 2. Pass it to your Compose rule
@get:Rule
val composeRule = createComposeRule(effectContext = testDispatcher)

// 3. Pass the *SAME INSTANCE* to runTest
@Test
fun myTest() = runTest(testDispatcher) {
    composeRule.setContent { /* … */ }
}


Tools

Great APIs deserve great tools, and Android Studio has a number of recent additions for Compose developers:


  • Transform UI: Iterate on your designs by right clicking on the @Preview, selecting Transform UI, and then describing the change in natural language.

  • Generate @Preview: Right-click on a composable and select Gemini > Generate [Composable name] Preview.




To see these tools in action, watch this recent demonstration:


Happy Composing

We continue to invest in Jetpack Compose to provide you with the APIs and tools you need to create beautiful, rich UIs. We value your input, so please share your feedback on these changes or what you’d like to see next in our issue tracker.


The post What’s new in the Jetpack Compose December ’25 release appeared first on InShot Pro.

Android 16 QPR2 is Released https://theinshotproapk.com/android-16-qpr2-is-released/ Tue, 02 Dec 2025 19:00:00 +0000 https://theinshotproapk.com/android-16-qpr2-is-released/ Posted by Matthew McCullough, VP of Product Management, Android Developer Faster Innovation with Android’s first Minor SDK Release Today we’re ...


Posted by Matthew McCullough, VP of Product Management, Android Developer




Faster Innovation with Android’s first Minor SDK Release

Today we’re releasing Android 16 QPR2, bringing a host of enhancements to user experience, developer productivity, and media capabilities. It marks a significant milestone in the evolution of the Android platform as the first release to utilize a minor SDK version.

A Milestone for Platform Evolution: The Minor SDK Release

Minor SDK releases allow us to deliver APIs and features more rapidly outside of the major yearly platform release cadence, ensuring that the platform and your apps can innovate faster with new functionality. Unlike major releases that may include behavior changes impacting app compatibility, the changes in QPR2 are largely additive, minimizing the need for regression testing. Behavior changes in QPR2 are largely focused on security or accessibility, such as SMS OTP protection or support for the expanded dark theme.

To support this, we have introduced new fields to the Build class as of Android 16, allowing your app to check for these new APIs using SDK_INT_FULL and VERSION_CODES_FULL.

if ((Build.VERSION.SDK_INT >= Build.VERSION_CODES.BAKLAVA) &&
    (Build.VERSION.SDK_INT_FULL >= Build.VERSION_CODES_FULL.BAKLAVA_1)
) {
    // Call new APIs from the Android 16 QPR2 release
}

Enhanced User Experience and Customization

QPR2 improves Android’s personalization and accessibility, giving users more control over how their devices look and feel.

Expanded Dark Theme

To create a more consistent user experience for users who have low vision, photosensitivity, or simply those who prefer a dark system-wide appearance, QPR2 introduced an expanded option under dark theme.

The old Fitbit app showing the impact of expanded dark theme; the new Fitbit app directly supports a dark theme

When the expanded dark theme setting is enabled by a user, the system uses your app’s isLightTheme theme attribute to determine whether to apply inversion. If your app inherits from one of the standard DayNight themes, this is done automatically for you. If it does not, make sure to declare isLightTheme=”false” in your dark theme to ensure your app is not inadvertently inverted. Standard Android Views, Composables, and WebViews will be inverted, while custom rendering engines like Flutter will not.

This is largely intended as an accessibility feature. We strongly recommend implementing a native dark theme, which gives you full control over your app’s appearance; you can protect your brand’s identity, ensure text is readable, and prevent visual glitches from happening when your UI is automatically inverted, guaranteeing a polished, reliable experience for your users.
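For a Views-based app whose dark theme does not inherit from a DayNight parent, the declaration might look like this (a minimal sketch; the theme names are placeholders):

```xml
<!-- res/values-night/themes.xml -->
<!-- Only needed when the dark theme does NOT inherit a DayNight parent. -->
<style name="Theme.MyApp" parent="Theme.MyApp.Base">
    <item name="android:isLightTheme">false</item>
</style>
```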

Custom Icon Shapes & Auto-Theming

In QPR2, users can select specific shapes for their app icons, which apply to all icons and folder previews. Additionally, if your app does not provide a dedicated themed icon, the system can now automatically generate one by applying a color filtering algorithm to your existing launcher icon.

Custom Icon Shapes

Test Icon Shape & Color in Android Studio

Automatic system icon color filtering

Interactive Chooser Sessions

The sharing experience is now more dynamic. Apps can keep the UI interactive even when the system sharesheet is open, allowing for real-time content updates within the Chooser.

Boosting Your Productivity and App Performance

We are introducing tools and updates designed to streamline your workflow and improve app performance.

Linux Development Environment with GUI Applications

The Linux development environment feature has been expanded to support running Linux GUI applications directly within the terminal environment.

Wilber, the GIMP mascot, designed by Aryeom Han, is licensed under CC BY-SA 4.0. The screenshot of the GIMP interface is used with courtesy.

Generational Garbage Collection

The Android Runtime (ART) now includes a Generational Concurrent Mark-Compact (CMC) Garbage Collector. This focuses collection on newly allocated objects, resulting in reduced CPU usage and improved battery efficiency.

Widget Engagement Metrics

You can now query user interaction events—such as clicks, scrolls, and impressions—to better understand how users engage with your widgets.

16KB Page Size Readiness

To help prepare for future architecture requirements, we have added early warning dialogs for debuggable apps that are not 16KB page-aligned.

Media, Connectivity, and Health

QPR2 brings robust updates to media standards and device connectivity.

IAMF and Audio Sharing

We have added software decoding support for Immersive Audio Model and Formats (IAMF), an open-source spatial audio format. Additionally, Personal Audio Sharing for Bluetooth LE Audio is now integrated directly into the system Output Switcher.

Health Connect Updates

Health Connect now automatically tracks steps using the device’s sensors. If your app has the READ_STEPS permission, this data will be available from the “android” package. Not only does this simplify the code needed to do step tracking, it’s also more power efficient. It also can now track weight, set index, and Rate of Perceived Exertion (RPE) in exercise segments.
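Reading those system-recorded steps might look roughly like this (a hedged sketch against the androidx.health.connect client; class and parameter names are assumptions to verify against the Health Connect reference):

```kotlin
// Sketch: query steps written by the platform ("android") package.
// Requires the READ_STEPS permission; API shapes are assumptions.
suspend fun readSystemSteps(
    client: HealthConnectClient,
    start: Instant,
    end: Instant,
): List<StepsRecord> {
    val response = client.readRecords(
        ReadRecordsRequest(
            recordType = StepsRecord::class,
            timeRangeFilter = TimeRangeFilter.between(start, end),
            // Filter to records recorded by the system itself.
            dataOriginFilter = setOf(DataOrigin("android")),
        )
    )
    return response.records
}
```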

Smoother Migrations

A new third-party Data Transfer API enables more reliable data migration between Android and iOS devices.

Strengthening Privacy and Security

Security remains a top priority with new features designed to protect user data and device integrity.

Developer Verification

We introduced APIs to support developer verification during app installation along with new ADB commands to simulate verification outcomes. As a developer, you are free to install apps without verification by using ADB, so you can continue to test apps that are not intended or not yet ready to distribute to the wider consumer population.

SMS OTP Protection

The delivery of messages containing an SMS Retriever hash will be delayed for most apps by three hours to help prevent OTP hijacking. The RECEIVE_SMS broadcast will be withheld and SMS provider database queries will be filtered. The SMS will be available to these apps after the three-hour delay.

Secure Lock Device

A new system-level security state, Secure Lock Device, is being introduced. When enabled (e.g., remotely via “Find My Device”), the device locks immediately and requires the primary PIN, pattern, or password to unlock, heightening security. When active, notifications and quick affordances on the lock screen will be hidden, and biometric unlock may be temporarily disabled.

Get Started

If you’re not in the Beta or Canary programs, your Pixel device should get the Android 16 QPR2 release shortly. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on the Android 16 QPR2 Beta and have not yet installed the Android 16 QPR3 beta, you can opt out of the program and you will then be offered the release version of Android 16 QPR2 over the air.
For the best development experience with Android 16 QPR2, we recommend that you use the latest Canary build of Android Studio Otter.
Thank you again to everyone who participated in our Android beta program. We’re looking forward to seeing how your apps take advantage of the updates in Android 16 QPR2.

For complete information on Android 16 QPR2 please visit the Android 16 developer site.

The post Android 16 QPR2 is Released appeared first on InShot Pro.

Explore AI on Android with Our Sample Catalog App https://theinshotproapk.com/explore-ai-on-android-with-our-sample-catalog-app/ Tue, 02 Dec 2025 17:00:00 +0000 https://theinshotproapk.com/explore-ai-on-android-with-our-sample-catalog-app/ Posted by Thomas Ezan and Ivy Knight As the AI landscape continues to expand, we often hear that developers aren’t ...

Posted by Thomas Ezan and Ivy Knight

As the AI landscape continues to expand, we often hear that developers aren’t always sure where to start and which API or SDK is best for their use case.

So we wanted to provide you with examples of AI-enabled features using both on-device and Cloud models and inspire you to create delightful experiences for your users.

We are thrilled to announce the launch of the redesigned Android AI Sample Catalog, a dedicated application designed to inspire and educate Android developers to build the next generation of AI-powered Android apps.

Discover what’s possible with Google AI

The Android AI Sample Catalog is designed as a one-stop destination to explore the capabilities of Google AI APIs and SDKs. Inside, you’ll find a collection of samples demonstrating a wide range of AI use cases that you can test yourself. We really designed this catalog to give you a hands-on feel for what you can build and help you find the right solution and capability for your needs.

Here are some of the samples you can find in the catalog:

Image generation with Imagen

Uses Imagen to generate images of landscapes, objects and people in various artistic styles.

On-device summarization with Gemini Nano

Lets you summarize text on-device using Gemini Nano via the GenAI Summarization API.

Chat with Nano Banana

A chatbot app using the Gemini 3 Pro Image model (a.k.a. “Nano Banana Pro”) letting you edit images via a conversation with the model.

On-device image description with Gemini Nano

Lets you generate image descriptions using Gemini Nano via the GenAI Image Description API.

Other samples include: image editing via Imagen mask-editing capabilities, a to-do list app controlled via the voice using the Gemini Live API, on-device rewrite assistance powered by Gemini Nano, and more!

The samples using cloud inference are built using the Firebase AI Logic SDK, and the ML Kit GenAI API is used for the samples running on-device inference. We plan to continue creating new samples and updating the existing ones as new capabilities are added to the models and SDKs.
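For orientation, a cloud-inference call through the Firebase AI Logic SDK has roughly this shape (a sketch: the model id and prompt are placeholders, and the builder names should be checked against the Firebase documentation):

```kotlin
// Sketch: minimal text generation via Firebase AI Logic (cloud inference).
// Assumes a configured Firebase project; names are best-effort, not canonical.
suspend fun generate(prompt: String): String? {
    val model = Firebase.ai(backend = GenerativeBackend.googleAI())
        .generativeModel("gemini-2.5-flash") // placeholder model id
    return model.generateContent(prompt).text
}
```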

Fully open source and ready to copy

We believe the best way to learn is by doing. That’s why the AI Sample Catalog is not only fully open source, it’s also been architected so that the code relevant to the AI features is self-contained and easy to copy and paste, letting you quickly experiment with these code samples in your own project.

When you’re exploring a sample in the app and want to see how it’s built, you can simply click the <> SOURCE button to jump directly to the code on GitHub.

To help you get started quickly, each sample includes a README file that highlights the APIs used, along with key code snippets.



Note: To run the samples using the Firebase AI Logic SDK, you’ll need to set up a Firebase AI project. Also, the samples using ML Kit Gen AI APIs powered by Gemini Nano are only supported on certain devices.

We also put extra thought into the app’s user interface to make your learning experience more engaging and intuitive. We’ve refreshed the app with a bold new brand that infuses the Android look with an expressive AI design language. Most notably, the app now features a vibrant, textured backdrop for the new Material 3 expressive components, giving you a modern and enjoyable environment to explore the samples and dive into the code. The systematic illustrations, inspired by generated image composition, further enhance this polished, expressive experience.


Check out the Android AI Sample Catalog today, test the features, and dive into the code on GitHub to start bringing your own AI-powered ideas to life!

The post Explore AI on Android with Our Sample Catalog App appeared first on InShot Pro.

Learn about our newest Jetpack Navigation library with the Nav3 Spotlight Week https://theinshotproapk.com/learn-about-our-newest-jetpack-navigation-library-with-the-nav3-spotlight-week/ Mon, 01 Dec 2025 17:00:00 +0000 https://theinshotproapk.com/learn-about-our-newest-jetpack-navigation-library-with-the-nav3-spotlight-week/ Posted by Don Turner – Developer Relations Engineer Jetpack Navigation 3 is now stable, and using it can help you ...


Posted by Don Turner – Developer Relations Engineer


Jetpack Navigation 3 is now stable, and using it can help you reduce tech debt, provide better separation of concerns, speed up feature development time, and support new form factors. We’re dedicating a whole week to providing content to help you learn about Nav3, and start integrating it into your app. 

You’ll learn about the library in detail, how to modularize your navigation code, and lots of code recipes for common use cases. At the end of the week, tune into the “Ask Me Anything” session so you can have the experts answer anything you like about Nav3. Here’s the full schedule: 
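As a taste of the APIs the week covers, a minimal Nav3 setup looks roughly like this (a sketch: the Home/Detail keys and screen composables are placeholders, and the exact NavDisplay parameter shapes should be checked against the Nav3 reference):

```kotlin
// Sketch: you own the back stack as plain state; NavDisplay renders it
// via an entryProvider. Keys and screens here are placeholders.
data object Home
data class Detail(val id: String)

@Composable
fun App() {
    val backStack = remember { mutableStateListOf<Any>(Home) }
    NavDisplay(
        backStack = backStack,
        onBack = { backStack.removeLastOrNull() },
        entryProvider = entryProvider {
            entry<Home> { HomeScreen(onItemClick = { id -> backStack.add(Detail(id)) }) }
            entry<Detail> { key -> DetailScreen(id = key.id) }
        },
    )
}
```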

Monday: API Overview

Dec 1st, 2025 

Learn the most important Nav3 APIs including NavDisplay, NavEntry, and entryProvider with a coding walkthrough video.

Tuesday: Animations

Dec 2nd, 2025

Make your screen transitions look beautiful! Learn how to set custom animations for all screens in your app, and how to override transitions for individual screens that need different behavior. 

Wednesday: Deep links

Dec 3rd, 2025


Deep links support has been one of the most requested features from developers. You’ll learn how to create deep links with a variety of different code recipes. 


Thursday: Modularization

Dec 4th, 2025


Learn how to modularize your navigation code. Avoid circular dependencies by separating navigation keys into their own modules, and learn how to use dependency injection and extension functions to move content into feature modules.  

Friday: Ask Me Anything

Dec 5th, 2025

Do you have burning questions? We have a panel of experts waiting to provide answers live at 9am PST on Friday. Ask your questions using the #AskAndroid tag on Bluesky, LinkedIn, and X.

The post Learn about our newest Jetpack Navigation library with the Nav3 Spotlight Week appeared first on InShot Pro.
