64-bit app compatibility for Google TV and Android TV
Thu, 21 Aug 2025

Posted by Fahad Durrani, Product Management, Google TV

Google TV and Android TV will require 64-bit app compatibility to support upcoming 64-bit TV devices starting August 2026.

Following other Android form factors, Google TV and Android TV devices will soon support 64-bit app compatibility. 64-bit apps will offer improved performance, shorter start times, and new viewing experiences on upcoming 64-bit Google TV and Android TV devices.

Starting August 1st, 2026, apps that include 32-bit native code will also need to provide a 64-bit version. We’re not making any changes to 32-bit support, and Google Play will continue to deliver apps to 32-bit devices. You should continue to provide 32-bit binaries alongside 64-bit binaries by using ABI splits in App Bundles.
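In a module’s build.gradle.kts, the App Bundle setup that ships both ABIs side by side might look like the following sketch (enableSplit = true is already the Android Gradle Plugin default for App Bundles, and the ABI list here is illustrative):

```kotlin
android {
    // Play generates per-ABI splits from a single App Bundle;
    // enableSplit = true is the default, shown explicitly for clarity.
    bundle {
        abi {
            enableSplit = true
        }
    }
    defaultConfig {
        ndk {
            // Ship 32-bit and 64-bit ARM binaries side by side.
            abiFilters += listOf("armeabi-v7a", "arm64-v8a")
        }
    }
}
```

With this in place, Google Play delivers only the matching ABI to each device, so 32-bit devices keep receiving 32-bit binaries.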

How to transition

This requirement only impacts apps that utilize native code. You can check if your app has native code (.so files) with the APK Analyzer. For ARM devices, you can find native libraries in lib/armeabi-v7a (32-bit) or lib/arm64-v8a (64-bit).
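As a complement to the APK Analyzer, you can script the same check: an APK is a ZIP archive, so its native libraries can be listed with plain JVM APIs. A minimal Kotlin sketch (the function names are ours, not part of any Android tooling):

```kotlin
import java.util.zip.ZipFile

// Lists native libraries in an APK, grouped by ABI directory
// (e.g. "armeabi-v7a" for 32-bit ARM, "arm64-v8a" for 64-bit ARM).
fun nativeLibsByAbi(apkPath: String): Map<String, List<String>> =
    ZipFile(apkPath).use { zip ->
        zip.entries().toList()
            .filter { it.name.startsWith("lib/") && it.name.endsWith(".so") }
            .groupBy(
                { it.name.split("/")[1] },          // ABI directory
                { it.name.substringAfterLast("/") } // library file name
            )
    }

// True if the APK ships 64-bit ARM native code.
fun has64BitArm(apkPath: String): Boolean =
    "arm64-v8a" in nativeLibsByAbi(apkPath)
```

An empty result means the app contains no native code and is unaffected by the requirement; a result with only 32-bit ABIs means a 64-bit build is still needed.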

For detailed guidance on transitioning to 64-bit, see Support 64-bit architectures.

How to test

    • The Google TV emulator image for macOS devices with Apple Silicon is configured for a 64-bit userspace and may be used for app testing and verification.
    • The Nvidia Shield (models P2571 and P2897) has both 32-bit and 64-bit userspace compatibility and may be used for testing on physical hardware. If your app contains 64-bit libraries, they will be used automatically.
    • 64-bit TV apps may be sideloaded to Pixel (7 or newer) phones after constraining the view window to TV resolution and DPI:

          adb shell wm size 1080x1920
          adb shell wm density 231  # tvdpi
          adb install <package.apk>

Next steps

Prepare your TV apps to comply with 64-bit requirements by August 1st, 2026:

  1. Use the APK Analyzer to check if your app has native code.
  2. Update your native code to support 64-bit and 16 KB memory page size.
  3. Test and verify that your changes work as intended.
  4. Submit your app update to Google Play.

What’s new in the Jetpack Compose August ’25 release
Wed, 13 Aug 2025

Posted by Meghan Mehta – Developer Relations Engineer and Nick Butcher – Product Manager

Today, the Jetpack Compose August ‘25 release is stable. This release contains version 1.9 of core compose modules (see the full BOM mapping), introducing new APIs for rendering shadows, 2D scrolling, rich styling of text transformations, improved list performance, and more!

To use today’s release, upgrade your Compose BOM version to 2025.08.00:

implementation(platform("androidx.compose:compose-bom:2025.08.00"))

Shadows

We’re happy to introduce two highly requested modifiers: Modifier.dropShadow() and Modifier.innerShadow(), which let you render box-shadow effects. (The existing Modifier.shadow() renders elevation-based shadows using a lighting model.)

Modifier.dropShadow()

The dropShadow() modifier draws a shadow behind your content. You can add it to your composable chain and specify the radius, color, and spread. Remember, content that should appear on top of the shadow (like a background) should be drawn after the dropShadow() modifier.

@Composable
@Preview(showBackground = true)
fun SimpleDropShadowUsage() {
    val pinkColor = Color(0xFFe91e63)
    val purpleColor = Color(0xFF9c27b0)
    Box(Modifier.fillMaxSize()) {
        Box(
            Modifier
                .size(200.dp)
                .align(Alignment.Center)
                .dropShadow(
                    RoundedCornerShape(20.dp),
                    dropShadow = DropShadow(
                        15.dp,
                        color = pinkColor,
                        spread = 10.dp,
                        alpha = 0.5f
                    )
                )
                .background(
                    purpleColor,
                    shape = RoundedCornerShape(20.dp)
                )
        )
    }
}

Figure 1. Drop shadow drawn all around shape

Modifier.innerShadow()

The Modifier.innerShadow() draws shadows on the inset of the provided shape:

@Composable
@Preview(showBackground = true)
fun SimpleInnerShadowUsage() {
    val pinkColor = Color(0xFFe91e63)
    val purpleColor = Color(0xFF9c27b0)
    Box(Modifier.fillMaxSize()) {
        Box(
            Modifier
                .size(200.dp)
                .align(Alignment.Center)
                .background(
                    purpleColor,
                    shape = RoundedCornerShape(20.dp)
                )
                .innerShadow(
                    RoundedCornerShape(20.dp),
                    innerShadow = InnerShadow(
                        15.dp,
                        color = Color.Black,
                        spread = 10.dp,
                        alpha = 0.5f
                    )
                )
        )
    }
}

Figure 2. Modifier.innerShadow() applied to a shape

The order for inner shadows is very important. The inner shadow draws on top of the content, so for the example above, we needed to move the inner shadow modifier after the background modifier. We’d need to do something similar when using it on top of something like an Image. In this example, we’ve placed a separate Box to render the shadow in the layer above the image:

@Composable
@Preview(showBackground = true)
fun PhotoInnerShadowExample() {
    Box(Modifier.fillMaxSize()) {
        val shape = RoundedCornerShape(20.dp)
        Box(
            Modifier
                .size(200.dp)
                .align(Alignment.Center)
        ) {
            Image(
                painter = painterResource(id = R.drawable.cape_town),
                contentDescription = "Image with Inner Shadow",
                contentScale = ContentScale.Crop,
                modifier = Modifier.fillMaxSize()
                    .clip(shape)
            )
            Box(
                modifier = Modifier.fillMaxSize()
                    .innerShadow(
                        shape,
                        innerShadow = InnerShadow(15.dp,
                            spread = 15.dp)
                    )
            )
        }
    }
}

Figure 3. Inner shadow on top of an image

New Visibility modifiers

Compose UI 1.8 introduced onLayoutRectChanged, a new performant way to track the location of elements on screen. We’re building on top of this API to support common use cases by introducing onVisibilityChanged and onFirstVisible. These APIs accept optional parameters for the minimum fraction of the item that must be visible and the minimum time it must remain visible before your callback is invoked.

Use onVisibilityChanged for UI changes or side effects that should happen based on visibility, like automatically playing and pausing videos or starting an animation:

LazyColumn {
  items(feedData) { video ->
    VideoRow(
        video,
        Modifier.onVisibilityChanged(minDurationMs = 500, minFractionVisible = 1f) { visible ->
            if (visible) video.play() else video.pause()
        },
    )
  }
}

Use onFirstVisible for use cases where you wish to react to an element first becoming visible on screen, for example to log impressions:

LazyColumn {
    items(100) {
        Box(
            Modifier
                // Log impressions when item has been visible for 500ms
                .onFirstVisible(minDurationMs = 500) { /* log impression */ }
                .clip(RoundedCornerShape(16.dp))
                .drawBehind { drawRect(backgroundColor) }
                .fillMaxWidth()
                .height(100.dp)
        )
    }
}

Rich styling in OutputTransformation

BasicTextField now supports applying styles like color and font weight from within an OutputTransformation.

The new TextFieldBuffer.addStyle() methods let you apply a SpanStyle or ParagraphStyle to change the appearance of text, without changing the underlying TextFieldState. This is useful for visually formatting input, like phone numbers or credit cards. This method can only be called inside an OutputTransformation.

// Format a phone number and color the punctuation
val phoneTransformation = OutputTransformation {
    // 1234567890 -> (123) 456-7890
    if (length == 10) {
        insert(0, "(")
        insert(4, ") ")
        insert(9, "-")

        // Color the added punctuation
        val gray = Color(0xFF666666)
        addStyle(SpanStyle(color = gray), 0, 1)
        addStyle(SpanStyle(color = gray), 4, 5)
        addStyle(SpanStyle(color = gray), 9, 10)
    }
}

BasicTextField(
    state = myTextFieldState,
    outputTransformation = phoneTransformation
)

LazyLayout

The building blocks of LazyLayout are all now stable! Check out LazyLayoutMeasurePolicy, LazyLayoutItemProvider, and LazyLayoutPrefetchState to build your own Lazy components.

Prefetch Improvements

There are now significant scroll performance improvements in Lazy List and Lazy Grid with the introduction of new prefetch behavior. You can now define a LazyLayoutCacheWindow to prefetch more content. By default, only one item is composed ahead of time in the direction of scrolling, and after something scrolls off screen it is discarded. You can now customize how many items are prefetched ahead and retained behind the viewport, expressed as either a fraction of the viewport or a dp size. When you opt into using LazyLayoutCacheWindow, items begin prefetching in the ahead area straight away.

The configuration entry point for this is on LazyListState, which takes in the cache window size:

@OptIn(ExperimentalFoundationApi::class)
@Composable
private fun LazyColumnCacheWindowDemo() {
    // Prefetch items 150.dp ahead and retain items 100.dp behind the visible viewport
    val dpCacheWindow = LazyLayoutCacheWindow(ahead = 150.dp, behind = 100.dp)
    // Alternatively, prefetch/retain items as a fraction of the list size
    // val fractionCacheWindow = LazyLayoutCacheWindow(aheadFraction = 1f, behindFraction = 0.5f)
    val state = rememberLazyListState(cacheWindow = dpCacheWindow)
    LazyColumn(state = state) {
        items(1000) { Text(text = "$it", fontSize = 80.sp) }
    }
}

Note: Prefetch composes more items than are currently visible, and the new cache window API will likely increase prefetching. This means that an item’s LaunchedEffects and DisposableEffects may run earlier, so do not use them as a signal for visibility (e.g. for impression tracking). Instead, we recommend using the new onFirstVisible and onVisibilityChanged APIs. Even if you’re not manually customizing LazyLayoutCacheWindow now, avoid using composition effects as a signal of content visibility, as this new prefetch mechanism will be enabled by default in a future release.

Scroll

2D Scroll APIs

Following the release of Draggable2D, Scrollable2D is now available, bringing two-dimensional scrolling to Compose. While the existing Scrollable modifier handles single-orientation scrolling, Scrollable2D enables both scrolling and flinging in 2D. This allows you to create more complex layouts that move in all directions, such as spreadsheets or image viewers. Nested scrolling is also supported, accommodating 2D scenarios.

val offset = remember { mutableStateOf(Offset.Zero) }
Box(
    Modifier.size(150.dp)
        .scrollable2D(
            state =
                rememberScrollable2DState { delta ->
                    offset.value = offset.value + delta // update the state
                    delta // indicate that we consumed all the pixels available
                }
        )
        .background(Color.LightGray),
    contentAlignment = Alignment.Center,
) {
    Text(
        "X=${offset.value.x.roundToInt()} Y=${offset.value.y.roundToInt()}",
        style = TextStyle(fontSize = 32.sp),
    )
}

Scroll Interop Improvements

There are bug fixes and new features to improve scroll and nested scroll interop with Views, including the following:

    • Fixed the dispatching of incorrect velocities during fling animations between Compose and Views.
    • Compose now correctly invokes the View’s nested scroll callbacks in the appropriate order.

Improve crash analysis by adding source info to stack traces

We have heard from you that it can be hard to debug Compose crashes when your own code does not appear in the stack trace. To address this, we’re introducing a new opt-in API that provides richer crash location details, including composable names and source locations, enabling you to:

    • Efficiently identify and resolve crash sources.
    • More easily isolate crashes for reproducible samples.
    • Investigate crashes that previously only showed internal stack frames.

Note that we do not recommend using this API in release builds due to the performance impact of collecting this extra information; it also does not work in minified APKs.

To enable this feature, add the line below to the application entry point. Ideally, this configuration should be performed before any compositions are created to ensure that the stack trace information is collected:

class App : Application() {
    override fun onCreate() {
        super.onCreate()
        // Enable only for debug flavor to avoid perf regressions in release
        Composer.setDiagnosticStackTraceEnabled(BuildConfig.DEBUG)
    }
}

New annotations and Lint checks

We are introducing a new runtime-annotation library that exposes annotations used by the compiler and tooling (such as lint checks). This allows non-Compose modules to use these annotations without a dependency on the Compose runtime library. The @Stable, @Immutable, and @StableMarker annotations have moved to runtime-annotation, allowing you to annotate classes and functions that do not depend on Compose.

Additionally, we have added two new annotations and corresponding lint checks:

    • @RememberInComposition: marks constructors, functions, and property getters to indicate that they must not be called directly inside composition without being remembered. Errors are raised by a corresponding lint check.
    • @FrequentlyChangingValue: marks functions and property getters to indicate that they should not be called directly inside composition, as this may cause frequent recompositions (for example, scroll position values and animating values). Warnings are provided by a corresponding lint check.

Additional updates

Get started

We appreciate all bug reports and feature requests submitted to our issue tracker. Your feedback allows us to build the APIs you need in your apps. Happy composing!

Engage users on Google TV with excellent TV apps
Mon, 02 Jun 2025

Posted by Shobana Radhakrishnan – Senior Director of Engineering, Google TV, and Paul Lammertsma – Developer Relations Engineer, Android

Over the past year, Google TV and Android TV achieved over 270 million monthly active devices, establishing one of the largest smart TV OS footprints. Building on this momentum, we are excited to share new platform features and developer tools designed to help you increase app engagement with our expanding user base.

Google TV with Gemini capabilities

Earlier this year, we announced that we’ll bring Gemini capabilities to Google TV, so users can speak more naturally and conversationally to find what to watch and get answers to complex questions.

A user pulls up Gemini on a TV asking for kid-friendly movie recommendations similar to Jurassic Park. Gemini responds with several movie recommendations

After each movie or show search, our new voice assistant will suggest relevant content from your apps, significantly increasing the discoverability of your content.

A user pulls up Gemini on a TV asking for help explaining the solar system to a first grader. Gemini responds with YouTube videos to help explain the solar system

Plus, users can easily ask questions about topics they’re curious about and receive insightful answers with supporting videos.

We’re so excited to bring this helpful and delightful experience to users this fall.

Video Discovery API

Today, we’ve also opened partner enrollment for our Video Discovery API.

Video Discovery optimizes Resumption, Entitlements, and Recommendations across all Google TV form factors to enhance the end-user experience and boost app engagement.

    • Resumption: Partners can now easily display a user’s paused video within the ‘Continue Watching’ row from the home screen. This row is a prime location that drives 60% of all user interactions on Google TV.
    • Entitlements: Video Discovery streamlines entitlement management, which matches app content to user eligibility. Users appreciate this because they can enjoy personalized recommendations without needing to manually update all their subscription details. This allows partners to connect with users across multiple discovery points on Google TV.
    • Recommendations: Video Discovery even highlights personalized content recommendations based on content that users watched inside apps.

Partners can begin incorporating the Video Discovery API today, starting with resumption and entitlement integrations. Check out g.co/tv/vda to learn more.

Jetpack Compose for TV

Last year, we launched Compose for TV 1.0 beta, which lets you build beautiful, adaptive UIs across Android, including Android TV OS.

Now, Compose for TV 1.0 is stable, and it expands on the core and Material Compose libraries. In our internal mobile benchmarking sample, the latest release of Compose improved app startup by roughly 20% compared with the March 2024 release. Because Compose for TV builds upon these libraries, apps built with Compose for TV should see similar improvements in startup time.

New to building with Compose, and not sure where to start? Our updated Jetcaster audio streaming app sample demonstrates how to use Compose across form factors. It includes a dedicated module for playing podcasts on TV by combining separate view models with shared business logic.

Focus Management Codelab

We understand that focus management can be challenging at times. That’s why we’ve published a codelab that reviews how to set initial focus, prepare for unexpected focus traversal, and efficiently restore focus.

Memory Optimization Guide

We’ve released a comprehensive guide on memory optimization, including memory targets for low RAM devices as well. Combined with Android Studio’s powerful memory profiler, this helps you understand when your app exceeds those limits and why.

In-App Ratings and Reviews

Ratings and reviews entry point for the JetStream sample app on TV

App ratings and reviews are essential for developers, offering quantitative and qualitative feedback on user experiences. Now, we’re extending the In-App Ratings and Reviews API to TV to allow developers to prompt users for ratings and reviews directly from Google TV. Check out our recent blog post detailing how to easily integrate the In-App Ratings and Reviews API.

Android 16 for TV

We’re excited to announce the upcoming release of Android 16 for TV. Developers can begin using the latest Emulator today. With Android 16, TV developers can access several great features:

    • Platform support for the Eclipsa Audio codec enables creators to use the IAMF spatial audio format. For ExoPlayer support that includes previous platform versions, see ExoPlayer’s IAMF decoder module.
    • There are various improvements to media playback speed, consistency and efficiency, as well as HDMI-CEC reliability and performance optimizations for 64-bit kernels.
    • Additional APIs and user experiences from Android 16 are also available. We invite you to explore the complete list from the Android 16 for TV release notes.

What’s next

We’re incredibly excited to see how these announcements will optimize your development journey, and look forward to seeing the fantastic apps you’ll launch on the platform!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.

Announcing Jetpack Navigation 3
Mon, 02 Jun 2025

Posted by Don Turner – Developer Relations Engineer

Navigating between screens in your app should be simple, shouldn’t it? However, building a robust, scalable, and delightful navigation experience can be a challenge. For years, the Jetpack Navigation library has been a key tool for developers, but as the Android UI landscape has evolved, particularly with the rise of Jetpack Compose, we recognized the need for a new approach.

Today, we’re excited to introduce Jetpack Navigation 3, a new navigation library built from the ground up specifically for Compose. For brevity, we’ll just call it Nav3 from now on. This library embraces the declarative programming model and Compose state as fundamental building blocks.

Why a new navigation library?

The original Jetpack Navigation library (sometimes referred to as Nav2 as it’s on major version 2) was initially announced back in 2018, before AndroidX and before Compose. While it served its original goals well, we heard from you that it had several limitations when working with modern Compose patterns.

One key limitation was that the back stack state could only be observed indirectly. This meant there could be two sources of truth, potentially leading to an inconsistent application state. Also, Nav2’s NavHost was designed to display only a single destination – the topmost one on the back stack – filling the available space. This made it difficult to implement adaptive layouts that display multiple panes of content simultaneously, such as a list-detail layout on large screens.

Figure 1. Changing from single pane to multi-pane layouts can create navigational challenges

Founding principles

Nav3 is built upon principles designed to provide greater flexibility and developer control:

    • You own the back stack: You, the developer, not the library, own and control the back stack. It’s a simple list which is backed by Compose state. Specifically, Nav3 expects your back stack to be SnapshotStateList<T> where T can be any type you choose. You can navigate by adding or removing items (Ts), and state changes are observed and reflected by Nav3’s UI.
    • Get out of your way: We heard that you don’t like a navigation library to be a black box with inaccessible internal components and state. Nav3 is designed to be open and extensible, providing you with building blocks and helpful defaults. If you want custom navigation behavior you can drop down to lower layers and create your own components and customizations.
    • Pick your building blocks: Instead of embedding all behavior within the library, Nav3 offers smaller components that you can combine to create more complex functionality. We’ve also provided a “recipes book” that shows how to combine components to solve common navigation challenges.

Figure 2. The Nav3 display observes changes to the developer-owned back stack.

Key features

    • Adaptive layouts: A flexible layout API (named Scenes) allows you to render multiple destinations in the same layout (for example, a list-detail layout on large screen devices). This makes it easy to switch between single and multi-pane layouts.
    • Modularity: The API design allows navigation code to be split across multiple modules. This improves build times and allows clear separation of responsibilities between feature modules.

Figure 3. Custom animations and predictive back are easy to implement, and easy to override for individual destinations.

Basic code example

To give you an idea of how Nav3 works, here’s a short code sample.

// Define the routes in your app and any arguments.
data object Home
data class Product(val id: String)

// Create a back stack, specifying the route the app should start with.
val backStack = remember { mutableStateListOf<Any>(Home) }

// A NavDisplay displays your back stack. Whenever the back stack changes, the display updates.
NavDisplay(
    backStack = backStack,

    // Specify what should happen when the user goes back
    onBack = { backStack.removeLastOrNull() },

    // An entry provider converts a route into a NavEntry which contains the content for that route.
    entryProvider = { route ->
        when (route) {
            is Home -> NavEntry(route) {
                Column {
                    Text("Welcome to Nav3")
                    Button(onClick = {
                        // To navigate to a new route, just add that route to the back stack
                        backStack.add(Product("123"))
                    }) {
                        Text("Click to navigate")
                    }
                }
            }
            is Product -> NavEntry(route) {
                Text("Product ${route.id}")
            }
            else -> NavEntry(Unit) { Text("Unknown route: $route") }
        }
    }
)

Get started and provide feedback

To get started, check out the developer documentation, plus the recipes repository which provides examples for:

    • common navigation UI, such as a navigation rail or bar
    • conditional navigation, such as a login flow
    • custom layouts using Scenes

We plan to provide code recipes, documentation and blogs for more complex use cases in future.

Nav3 is currently in alpha, which means that the API is liable to change based on feedback. If you have any issues, or would like to provide feedback, please file an issue.

Nav3 offers a flexible and powerful foundation for building modern navigation in your Compose applications. We’re really excited to see what you build with it.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.

In-App Ratings and Reviews for TV
Sun, 01 Jun 2025

Posted by Paul Lammertsma – Developer Relations Engineer

Ratings and reviews are essential for developers, offering quantitative and qualitative feedback on user experiences. In 2022, we enhanced the granularity of this feedback by segmenting these insights by countries and form factors.

Now, we’re extending the In-App Ratings and Reviews API to TV to allow developers to prompt users for ratings and reviews directly from Google TV.

Ratings and reviews on Google TV

Ratings and reviews entry point for the JetStream sample app on TV

Users can now see rating averages, browse reviews, and leave their own review directly from an app’s store listing on Google TV.

Ratings and written reviews input screen on TV

Users can interact with in-app ratings and reviews on their TVs by doing the following:

    • Select ratings using the remote control D-pad.
    • Provide optional written reviews using Gboard’s on-screen voice input, or by easily typing from their phone.
    • Send mobile notifications to themselves to complete their TV app review directly on their phone.

User instructions for submitting TV app ratings and reviews on mobile

Additionally, users can leave reviews for other form factors directly from their phone by simply selecting the device chip when submitting an app rating or writing a review.

We’ve already seen a considerable lift in app ratings on TV since bringing these changes to Google TV, and now, we’re making it possible for developers to trigger a ratings prompt as well.

Before we look at the integration, let’s first carefully consider the best time to request a review prompt. First, identify optimal moments within your app to request user feedback, ensuring prompts appear only when the UI is idle to prevent interruption of ongoing content.

In-App Review API

Integrating the Google Play In-App Review API is the same as on mobile, and it takes only a couple of method calls:

val manager = ReviewManagerFactory.create(context)
manager.requestReviewFlow().addOnCompleteListener { task ->
    if (task.isSuccessful) {
        // We got the ReviewInfo object
        val reviewInfo = task.result
        manager.launchReviewFlow(activity, reviewInfo)
    } else {
        // There was some problem, log or handle the error code
        @ReviewErrorCode val reviewErrorCode =
            (task.getException() as ReviewException).errorCode
    }
}

First, invoke requestReviewFlow() to obtain a ReviewInfo object which is used to launch the review flow. You must include an addOnCompleteListener() not just to obtain the ReviewInfo object, but also to monitor for any problems triggering this flow, such as the unavailability of Google Play on the device. Note that ReviewInfo does not offer any insights on whether or not a prompt appeared or which action the user took if a prompt did appear.

The challenge is to identify when to trigger launchReviewFlow(). Track user actions—identifying successful journeys and points where users encounter issues—so you can be confident they had a delightful experience in your app.
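One way to put this tracking into practice is a small gate object that counts successful journeys and enforces a cooldown between prompts. This is a hypothetical helper of our own (ReviewPromptGate is not part of the Play libraries, which additionally apply their own throttling), sketched in plain Kotlin:

```kotlin
// Hypothetical helper: decides when to *attempt* a review prompt.
// Not part of the Play In-App Review API; Google Play still throttles
// how often the dialog actually appears.
class ReviewPromptGate(
    private val minSuccessfulJourneys: Int = 3,
    private val cooldownMs: Long = 30L * 24 * 60 * 60 * 1000, // ~30 days
    private val now: () -> Long = System::currentTimeMillis
) {
    private var successes = 0
    private var lastPromptAt: Long? = null

    // Call after a successful journey, e.g. finishing a season.
    fun recordSuccess() { successes++ }

    // Reset on poor experiences (buffering, playback errors) so the
    // prompt never follows a bad session.
    fun recordFailure() { successes = 0 }

    fun shouldPrompt(): Boolean {
        val cooledDown = lastPromptAt?.let { now() - it >= cooldownMs } ?: true
        return successes >= minSuccessfulJourneys && cooledDown
    }

    // Call after launchReviewFlow() has been invoked.
    fun markPrompted() {
        lastPromptAt = now()
        successes = 0
    }
}
```

With a gate like this, you would call shouldPrompt() at an idle moment in the UI and only then run the requestReviewFlow()/launchReviewFlow() sequence shown above.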

For launchReviewFlow(), you may optionally include an addOnCompleteListener() as well, so that your app’s flow resumes once the returned task completes.

Note that due to throttling of how often users are presented with this prompt, there are no guarantees that the ratings dialog will appear when requesting to start this flow. For best practices, check this guide on when to request an in-app review.

Get started with In-App Reviews on Google TV

You can get a head start today by following these steps:

    1. Identify successful journeys for users, like finishing a movie or TV show season.
    2. Identify poor experiences that should be avoided, like buffering or playback errors.
    3. Integrate the Google Play In-App Review API to trigger review requests at optimal moments within the user journey.
    4. Test your integration by following the testing guide.
    5. Publish your app and continuously monitor your ratings by device type in the Play Console.

We’re confident this integration enables you to elevate your Google TV app ratings and empowers your users to share valuable feedback.

Resources

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.
