Fully Optimized: Wrapping up Performance Spotlight Week (Fri, 21 Nov 2025)


Posted by Ben Weiss, Senior Developer Relations Engineer and Sara Hamilton, Product Manager




We spent the past week diving deep into best practices and guidance that help make Android apps faster, smaller, and more stable. From the foundational powers of the R8 optimizer and Profile Guided Optimization, to performance improvements with Jetpack Compose, to a new guide on leveling up your app’s performance, we’ve covered the low-effort, high-impact tools you need to build a performant app.

This post serves as your index and roadmap to revisit these resources whenever you need to optimize. Here are the five key takeaways from our journey together.

Use the R8 optimizer to speed up your app

The single most impactful, low-effort change you can make is fully enabling the R8 optimizer. It doesn’t just reduce app size; it performs deep, whole-program optimizations to fundamentally rewrite your code for efficiency. Revisit your Keep Rules and put R8 back on your engineering task list.
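As a sketch of what enabling this looks like in a Kotlin DSL build script (the file names shown are the Android Gradle plugin defaults; adjust them to your project):

```kotlin
// build.gradle.kts (app module) — minimal sketch of a release build with R8 enabled
android {
    buildTypes {
        release {
            isMinifyEnabled = true      // turn on R8 shrinking and optimization
            isShrinkResources = true    // also strip unused resources
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```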


Our newly updated and expanded documentation on the R8 optimizer is here to help.


Reddit observed a 40% faster cold startup and 30% fewer ANR errors after enabling R8 full mode.

You can read the full case study on our blog.


Engineers at Disney+ invest in app performance and are optimizing the app’s user experience. Sometimes even seemingly small changes can make a huge impact. While inspecting their R8 configuration, the team found that the -dontoptimize flag was being used. After enabling optimizations by removing this flag, the Disney+ team saw significant improvements in their app’s performance.

So next time someone asks you what you could do to improve app performance, just link them to this post.


Read more in our Day 1 blog: Use R8 to shrink, optimize, and fast-track your app

Guiding you to better performance


Baseline Profiles effectively remove the need for Just in Time compilation, improving startup speed, scrolling, animation and overall rendering performance. Startup Profiles make app startup even more lightweight by bringing an intelligent ordering to your app’s classes.dex files.


And to learn more about just how important Baseline Profiles are for app performance, read Meta’s engineering blog where they shared how Baseline Profiles improved various critical performance metrics by up to 40% across their apps.


We continue to make Jetpack Compose more performant for you in Jetpack Compose 1.10. Features like pausable composition and the customizable cache window are crucial for maintaining zero scroll jank when dealing with complex list items. Take a look at the latest episode of #TheAndroidShow where we explain this in more detail.


Read more in Wednesday’s blog: Deeper Performance Considerations

Measuring performance can be as easy as 1, 2, 3


You can’t manage what you don’t measure. Our Performance Leveling Guide breaks down your measurement journey into five steps, starting with easily available data and building up to advanced local tooling.

Starting at level 1, we’ll teach you how to use readily available data from Android Vitals, which provides you with field data on ANRs, crashes, and excessive battery usage.


We’ll also teach you how to level up. For example, we’ll demonstrate how to reach level 3 with local performance testing using Jetpack Macrobenchmark and the new UiAutomator 2.4 API to accurately measure and verify any change in your app’s performance.


Read more in Thursday’s blog

Debugging performance just got an upgrade


Advanced optimization shouldn’t mean unreadable crash reports. New features are designed to help you confidently debug R8 and background work:

Automatic Logcat Retrace

Starting in Android Studio Narwhal, stack traces can automatically be de-obfuscated in the Logcat window. This way you can immediately see and debug any crashes in a production-ready build.

Narrow Keep Rules

On Tuesday we demystified the Keep Rules needed to fix runtime crashes, emphasizing writing specific, member-level rules over overly-broad wildcards. And because it’s an important topic, we made you a video as well.

And with the new lint check for wide Keep Rules, the Android Studio Otter 3 Feature Drop has you covered here as well.
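As an illustrative sketch of what "narrow" means in practice (the class and member names here are hypothetical, not from any real app):

```
# Overly broad: keeps every class and member in the package, blocking optimization
# -keep class com.example.network.** { *; }

# Narrow, member-level rule: keep only what reflection actually needs
-keepclassmembers class com.example.network.ApiResponse {
    <init>();
    java.lang.String payload;
}
```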

We also released new guidance on testing and troubleshooting your R8 configuration to help you get the configuration right with confidence.


Read more in Tuesday’s blog: Configure and troubleshoot R8 Keep Rules

Background Work

We shared guidance on debugging common scenarios you may encounter when scheduling tasks with WorkManager.

Background Task Inspector gives you a visual representation and graph view of WorkManager tasks, helping debug why scheduled work is delayed or failed. And our refreshed Background Work documentation landing page highlights task-specific APIs that are optimized for particular use cases, helping you achieve more reliable execution.


Read more in Wednesday’s blog: Background work performance considerations

Performance optimization is an ongoing journey

If you successfully took our challenge to enable R8 full mode this week, your next step is to integrate performance into your product roadmap using the App Performance Score. This standardized framework helps you find the highest leverage action items for continuous improvement.

We capped off the week with the #AskAndroid Live Q&A session, where engineers answered your toughest questions on R8, Profile Guided Optimizations, and more. If you missed it, look for the replay!


Thank you for joining us! Now, get building and keep that momentum going.


Deeper Performance Considerations (Fri, 21 Nov 2025)

Posted by Ben Weiss – Senior Developer Relations Engineer, Breana Tate – Developer Relations Engineer, Jossi Wolf – Software Engineer on Compose

Compose yourselves and let us guide you through more background on performance.

Welcome to day 3 of Performance Spotlight Week. Today we’re continuing to share details and guidance on important areas of app performance. We’re covering Profile Guided Optimization, Jetpack Compose performance improvements and considerations on working behind the scenes. Let’s dive right in.

Profile Guided Optimization

Baseline Profiles and Startup Profiles are foundational to improving an Android app’s startup and runtime performance. They are part of a group of performance optimizations called Profile Guided Optimization.

When an app is packaged, the d8 dexer takes classes and methods and populates your app’s classes.dex files. When a user opens the app, these dex files are loaded one after the other until the app can start. By providing a Startup Profile you let d8 know which classes and methods to pack into the first classes.dex files. This structure allows the app to load fewer files, which in turn improves startup speed.

Baseline Profiles effectively move the Just in Time (JIT) compilation steps away from user devices and onto developer machines. The generated Ahead Of Time (AOT) compiled code has proven to reduce startup time and rendering issues alike.

Trello and Baseline Profiles

We asked engineers on the Trello app how Baseline Profiles affected their app’s performance. After applying Baseline Profiles to their main user journey, Trello saw a significant 25% reduction in app startup time.

Baseline Profiles at Meta

Engineers at Meta recently published an article on how they are accelerating their Android apps with Baseline Profiles. Across Meta’s apps the teams have seen various critical metrics improve by up to 40% after applying Baseline Profiles.

Technical improvements like these help you improve user satisfaction and business success as well. Sharing this with your product owners, CTOs and decision makers can also help speed up your app’s performance.

Get started with Baseline Profiles

To generate either a Baseline or Startup Profile, you write a macrobenchmark test that exercises the app. During the test, profile data is collected which will be used during app compilation. The tests are written using the new UiAutomator API, which we’ll cover tomorrow.

Writing a benchmark like this is straightforward, and you can see the full sample on GitHub.

@Test
fun profileGenerator() {
    rule.collect(
        packageName = TARGET_PACKAGE,
        maxIterations = 15,
        stableIterations = 3,
        includeInStartupProfile = true
    ) {
        uiAutomator {
            startApp(TARGET_PACKAGE)
        }
    }
}

Considerations

Start by writing a macrobenchmark test that generates a Baseline Profile and a Startup Profile for the path most traveled by your users. This means the main entry point that your users take into your app, which usually is after they have logged in. Then, for Baseline Profiles only, continue to write more test cases to capture a more complete picture. You do not need to cover everything with a Baseline Profile. Stick to the most used paths and measure performance in the field. More on that in tomorrow’s post.

Get started with Profile Guided Optimization

To learn how Baseline Profiles work under the hood, watch this video from the Android Developers Summit:

And check out the Android Build Time episode on Profile Guided Optimization for another in-depth look:

We also have extensive guidance on Baseline Profiles and Startup Profiles available for further reading.
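As a rough sketch of how the pieces fit together in a build, this is approximately what wiring the Baseline Profile Gradle plugin into an app module looks like (the module name is illustrative; see the official guidance for the complete setup, including the benchmark module itself):

```
// app/build.gradle.kts — sketch, not a complete configuration
plugins {
    id("com.android.application")
    id("androidx.baselineprofile")
}

dependencies {
    // Pulls the generated profile from the profile-generator module at build time
    baselineProfile(project(":baselineprofile"))
}
```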

Jetpack Compose performance improvements

The UI framework for Android has seen the performance investment of the engineering team pay off. From version 1.9 of Jetpack Compose, scroll jank has dropped to 0.2% during an internal long scrolling benchmark test.

These improvements were made possible by several features packed into the most recent releases.

Customizable cache window

By default, lazy layouts only compose one item ahead of time in the direction of scrolling, and after something scrolls off screen it is discarded. You can now customize the amount of items to retain through a fraction of the viewport or a dp size. This helps your app perform more work upfront and, with pausable composition enabled, use the available time in between frames more efficiently.

To start using customizable cache windows, instantiate a LazyLayoutCacheWindow and pass it to your lazy list or lazy grid. Measure your app’s performance using different cache window sizes, for example 50% of the viewport. The optimal value will depend on your content’s structure and item size.

val dpCacheWindow = LazyLayoutCacheWindow(ahead = 150.dp, behind = 100.dp)
val state = rememberLazyListState(cacheWindow = dpCacheWindow)
LazyColumn(state = state) {
    // column contents
}

Pausable composition

This feature allows compositions to be paused, and their work split up over several frames. The APIs landed in 1.9 and it is now used by default in 1.10 in lazy layout prefetch. You should see the most benefit with complex items with longer composition times.

More Compose performance optimizations

In versions 1.9 and 1.10 of Compose the team also made several optimizations that are a bit less obvious.

Several APIs that use coroutines under the hood have been improved. For example, when using Draggable and Clickable, developers should see faster reaction times and improved allocation counts.

Optimizations in layout rectangle tracking have improved the performance of Modifiers like onVisibilityChanged() and onLayoutRectChanged(). This speeds up the layout phase, even when not explicitly using these APIs.

Another performance improvement is using cached values when observing positions via onPlaced().

Prefetch text in the background

Starting with version 1.9, Compose adds the ability to prefetch text on a background thread. This enables you to pre-warm caches for faster text layout, which is relevant for app rendering performance. During layout, text has to be passed into the Android framework, where a word cache is populated. By default this runs on the UI thread. Offloading prefetching and populating the word cache onto a background thread can speed up layout, especially for longer texts. To prefetch on a background thread, you can pass a custom executor to any composable that uses BasicText under the hood by providing a LocalBackgroundTextMeasurementExecutor through a CompositionLocalProvider, like so:

val defaultTextMeasurementExecutor = Executors.newSingleThreadExecutor()

CompositionLocalProvider(
    LocalBackgroundTextMeasurementExecutor provides defaultTextMeasurementExecutor
) {
    BasicText("Some text that should be measured on a background thread!")
}

Depending on the text, this can provide a performance boost to your text rendering. To make sure that it improves your app’s rendering performance, benchmark and compare the results.

Background work performance considerations

Background work is an essential part of many apps. You may be using libraries like WorkManager or JobScheduler to perform tasks like:

  • Periodically uploading analytical events
  • Syncing data between a backend service and a database
  • Processing media (e.g. resizing or compressing images)

A key challenge while executing these tasks is balancing performance and power efficiency. WorkManager allows you to achieve this balance. It’s designed to be power-efficient, and allows work to be deferred to an optimal execution window influenced by a number of factors, including constraints you specify or constraints imposed by the system.
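For example, a deferrable periodic sync gated on constraints might look like this sketch (SyncWorker and context are assumed to exist in your app; names are illustrative):

```kotlin
// Sketch: defer work until the device is charging and on an unmetered network
val request = PeriodicWorkRequestBuilder<SyncWorker>(6, TimeUnit.HOURS)
    .setConstraints(
        Constraints.Builder()
            .setRequiredNetworkType(NetworkType.UNMETERED)
            .setRequiresCharging(true)
            .build()
    )
    .build()

WorkManager.getInstance(context).enqueueUniquePeriodicWork(
    "periodic-sync",                  // unique name for this work
    ExistingPeriodicWorkPolicy.KEEP,  // don't reschedule if already enqueued
    request
)
```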

WorkManager is not a one-size-fits-all solution, though. Android also has a number of power-optimized APIs that are designed specifically with certain common Core User Journeys (CUJs) in mind.

Reference the Background Work landing page for a list of just a few of these, including updating a widget and getting location in the background.

Local debugging tools for Background Work: common scenarios

To debug background work and understand why a task may have been delayed or failed, you need visibility into how the system has scheduled your tasks.

To help with this, WorkManager has several related tools to help you debug locally and optimize performance (some of these work for JobScheduler as well)! Here are some common scenarios you might encounter when using WorkManager, and an explanation of tools you can use to debug them.

Debugging why scheduled work is not executing

Scheduled work being delayed or not executing at all can be due to a number of factors, including specified constraints not being met or constraints having been imposed by the system.

The first step in investigating why scheduled work is not running is to confirm the work was successfully scheduled. After confirming the scheduling status, determine whether there are any unmet constraints or preconditions preventing the work from executing.

There are several tools for debugging this scenario.

Background Task Inspector

The Background Task Inspector is a powerful tool integrated directly into Android Studio. It provides a visual representation of all WorkManager tasks and their associated states (Running, Enqueued, Failed, Succeeded).

To debug why scheduled work is not executing with the Background Task Inspector, consult the listed Work status(es). An ‘Enqueued’ status indicates your Work was scheduled, but is still waiting to run.

Benefits: Aside from providing an easy way to view all tasks, this tool is especially useful if you have chained work. The Background Task Inspector offers a graph view that can visualize whether a previous task failing may have impacted the execution of the following task.

Background Task Inspector list view

Background Task Inspector graph view

adb shell dumpsys jobscheduler

This command returns a list of all active JobScheduler jobs (which includes WorkManager Workers) along with specified constraints and system-imposed constraints. It also returns job history.

Use this if you want a different way to view your scheduled work and associated constraints. For versions earlier than WorkManager 2.10.0, adb shell dumpsys jobscheduler will return a list of Workers with this name:

[package name]/androidx.work.impl.background.systemjob.SystemJobService

If your app has multiple workers, updating to WorkManager 2.10.0 will allow you to see Worker names and easily distinguish between workers:

#WorkerName#@[package name]/androidx.work.impl.background.systemjob.SystemJobService

Benefits: This command is useful for understanding if there were any system-imposed constraints, which you cannot determine with the Background Task Inspector. For example, this will return your app’s standby bucket, which can affect the window in which scheduled work completes.

Enable debug logging

You can enable custom logging to see verbose WorkManager logs, which will have a WM- prefix attached.

Benefits: This allows you to gain visibility into when work is scheduled, when constraints are fulfilled, and lifecycle events, and you can consult these logs while developing your app.
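As a sketch of how this is typically switched on, an Application class can provide a custom WorkManager configuration (this assumes on-demand initialization, i.e. the default initializer has been removed from the manifest as described in the WorkManager docs):

```kotlin
// Sketch: opt in to verbose WorkManager logging via a custom configuration
class MyApplication : Application(), Configuration.Provider {
    override val workManagerConfiguration: Configuration
        get() = Configuration.Builder()
            .setMinimumLoggingLevel(Log.DEBUG) // emit WM- tagged debug logs
            .build()
}
```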

WorkInfo.StopReason

If you notice unpredictable performance with a specific worker, you can programmatically observe the reason your worker was stopped on the previous run attempt with WorkInfo.getStopReason.

It’s a good practice to configure your app to observe WorkInfo using getWorkInfoByIdFlow to identify whether your work is being affected by background restrictions, constraints, frequent timeouts, or even stopped by the user.

Benefits: You can use WorkInfo.StopReason to collect field data about your workers’ performance.
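A minimal sketch of that observation pattern, assuming WorkManager 2.9 or later (the tag and logging destination are illustrative):

```kotlin
// Sketch: log why a worker was last stopped, using the WorkInfo flow API
suspend fun reportStopReason(context: Context, workId: UUID) {
    WorkManager.getInstance(context)
        .getWorkInfoByIdFlow(workId)
        .collect { info ->
            if (info != null && info.stopReason != WorkInfo.STOP_REASON_NOT_STOPPED) {
                // Forward this to your analytics pipeline to gather field data
                Log.w("WorkDebug", "Worker stopped, reason=${info.stopReason}")
            }
        }
}
```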

Debugging WorkManager-attributed high wake lock duration flagged by Android vitals

Android vitals features an excessive partial wake locks metric, which highlights wake locks contributing to battery drain. You may be surprised to know that WorkManager acquires wake locks to execute tasks, and if the wake locks exceed the threshold set by Google Play, this can have an impact on your app’s visibility. How can you debug why there is so much wake lock duration attributed to your work? You can use the following tools.

Android vitals dashboard

First confirm in the Android vitals excessive wake lock dashboard that the high wake lock duration is from WorkManager and not an alarm or other wake lock. You can use the Identify wake locks created by other APIs documentation to understand which wake locks are held due to WorkManager.

Perfetto

Perfetto is a tool for analyzing system traces. When using it for debugging WorkManager specifically, you can view the “Device State” section to see when your work started, how long it ran, and how it contributes to power consumption.

Under the “Device State: Jobs” track, you can see any workers that have been executed and their associated wake locks.

Device State section in Perfetto, showing CleanupWorker and BlurWorker execution.

Resources

Consult the Debug WorkManager page for an overview of the available debugging methods for other scenarios you might encounter.

And to try some of these methods hands-on and learn more about debugging WorkManager, check out the Advanced WorkManager and Testing codelab.

Next steps

Today we moved beyond code shrinking and explored how the Android Runtime and Jetpack Compose actually render your app. Whether it’s pre-compiling critical paths with Baseline Profiles or smoothing out scroll states with the new Compose 1.9 and 1.10 features, these tools focus on the feel of your app. And we dove deep into best practices on debugging background work.

Ask Android

On Friday we’re hosting a live AMA on performance. Ask your questions now using #AskAndroid and get them answered by the experts.



The
challenge

We
challenged you on Monday to enable R8. Today, we are asking you to
generate
one Baseline Profile

for your app.

With
Android
Studio Otter
,
the Baseline Profile Generator module wizard makes this easier than ever. Pick your most
critical user
journey—even if it’s just your app startup and login—and generate a profile.

Once
you have it, run a Macrobenchmark to compare
CompilationMode.None
vs.
CompilationMode.Partial.

Share
your startup time improvements on social media using
#optimizationEnabled.
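That comparison can be sketched as a Macrobenchmark test with two compilation modes (TARGET_PACKAGE is a placeholder for your applicationId; iteration counts are illustrative):

```kotlin
// Sketch: compare cold startup with and without the Baseline Profile applied
@RunWith(AndroidJUnit4::class)
class StartupBenchmark {
    @get:Rule
    val rule = MacrobenchmarkRule()

    @Test fun startupNoProfile() = startup(CompilationMode.None())
    @Test fun startupWithProfile() = startup(CompilationMode.Partial())

    private fun startup(mode: CompilationMode) = rule.measureRepeated(
        packageName = TARGET_PACKAGE,
        metrics = listOf(StartupTimingMetric()),
        compilationMode = mode,
        startupMode = StartupMode.COLD,
        iterations = 10,
    ) {
        pressHome()
        startActivityAndWait()
    }
}
```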

Tune
in tomorrow

You
have shrunk your app with R8 and optimized your runtime with Profile Guided Optimization. But
how do you
prove
these wins to your stakeholders? And how do you catch regressions before they hit
production?

Join
us tomorrow for
Day
4: The Performance Leveling Guide
,
where we will map out exactly how to measure your success, from field data in Play Vitals to
deep local
tracing with Perfetto.
