Posted by Ben Weiss – Senior Developer Relations Engineer,
Breana Tate – Developer Relations Engineer,
Jossi Wolf – Software Engineer on Compose

Compose yourselves and let us guide you through more background on performance. Welcome to day 3 of Performance Spotlight Week. Today we're continuing to share details and guidance on important areas of app performance. We're covering Profile Guided Optimization, Jetpack Compose performance improvements, and performance considerations for work that happens behind the scenes. Let's dive right in.
Profile Guided Optimization
Baseline Profiles and Startup Profiles are foundational to improving an Android app's startup and runtime performance. They are part of a group of performance optimizations called Profile Guided Optimization.
When an app is packaged, the d8 dexer takes classes and methods and populates your app's classes.dex files. When a user opens the app, these dex files are loaded, one after the other, until the app can start. By providing a Startup Profile you let d8 know which classes and methods to pack into the first classes.dex files. This structure allows the app to load fewer files, which in turn improves startup speed.
Baseline Profiles effectively move the Just in Time (JIT) compilation steps away from user devices and onto developer machines. The generated Ahead Of Time (AOT) compiled code has proven to reduce startup time and rendering issues alike.
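To give a sense of what these profiles contain: a generated profile is a plain-text list of class and method rules that the compiler consumes. The snippet below is purely illustrative (the class and method names are hypothetical); the flags mark a method as hot (H), used during startup (S) or used post-startup (P), and a line containing only a class descriptor is a class rule.
HSPLcom/example/app/MainActivity;-><init>()V
HSPLcom/example/app/MainActivity;->onCreate(Landroid/os/Bundle;)V
Lcom/example/app/feed/FeedViewModel;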
Trello and Baseline Profiles
We asked engineers on the Trello app how Baseline Profiles affected their app's performance. After applying Baseline Profiles to their main user journey, Trello saw a significant 25% reduction in app startup time.
Baseline Profiles at Meta
Also, engineers at Meta recently published an article on how they are accelerating their Android apps with Baseline Profiles.
Technical improvements like these help you improve user satisfaction and business success as well. Sharing results like these with your product owners, CTOs and decision makers can also help you make the case for investing in your app's performance.
Get started with Baseline Profiles
To generate either a Baseline Profile or a Startup Profile, you write a macrobenchmark test that exercises the app. During the test, profile data is collected, which is then used during app compilation. The tests are written using the new UiAutomator API, which we'll cover tomorrow.
Writing a benchmark like this is straightforward, and you can see the full sample on GitHub.
@Test
fun profileGenerator() {
    rule.collect(
        packageName = TARGET_PACKAGE,
        maxIterations = 15,
        stableIterations = 3,
        includeInStartupProfile = true
    ) {
        uiAutomator {
            startApp(TARGET_PACKAGE)
        }
    }
}
Considerations
Start by writing a macrobenchmark test that generates a Baseline Profile and a Startup Profile for the path most traveled by your users. This means the main entry point your users take into your app, which usually is the screen they land on after logging in.
Then continue to write more test cases to capture a more complete picture, but only for Baseline Profiles. You do not need to cover everything with a Baseline Profile. Stick to the most used paths and measure performance in the field. More on that in tomorrow's post.
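If you want to see what an additional, Baseline-Profile-only generator could look like, here is a minimal sketch. It reuses the rule and TARGET_PACKAGE from the sample above; the test name and the journey you drive inside the block are illustrative.
@Test
fun browseJourneyGenerator() {
    rule.collect(
        packageName = TARGET_PACKAGE,
        maxIterations = 15,
        stableIterations = 3,
        // Keep this journey out of the Startup Profile; it only adds Baseline Profile rules.
        includeInStartupProfile = false
    ) {
        uiAutomator {
            startApp(TARGET_PACKAGE)
            // Drive the journey here, for example scroll a feed or open a detail screen.
        }
    }
}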
Get started with Profile Guided Optimization
To learn how Baseline Profiles work under the hood, watch this video from the Android Developers Summit:
And check out the Android Build Time episode on Profile Guided Optimization for another in-depth look:
We also have extensive guidance on Baseline Profiles and Startup Profiles available for further reading.
Jetpack Compose performance improvements
The UI framework for Android has seen the engineering team's performance investment pay off. As of version 1.9 of Jetpack Compose, scroll jank has dropped to 0.2% in an internal long-scrolling benchmark test.
Customizable cache window
By default, lazy layouts only compose one item ahead of time in the direction of scrolling, and after an item scrolls off screen it is discarded. You can now customize the number of items to retain, either as a fraction of the viewport or as a dp size. This lets your app perform more work upfront and, with pausable composition enabled, use the time available between frames more efficiently.
To start using customizable cache windows, instantiate a LazyLayoutCacheWindow and pass it to your lazy list or lazy grid. Measure your app's performance using different cache window sizes, for example 50% of the viewport. The optimal value will depend on your content's structure and item size.
val dpCacheWindow = LazyLayoutCacheWindow(ahead = 150.dp, behind = 100.dp)
val state = rememberLazyListState(cacheWindow = dpCacheWindow)
LazyColumn(state = state) {
    // column contents
}
Pausable composition
This feature allows compositions to be paused and their work split up over several frames. The APIs landed in 1.9, and pausable composition is now used by default for lazy layout prefetch in 1.10. You should see the most benefit with complex items that have longer composition times.
More Compose performance optimizations
In versions 1.9 and 1.10 of Compose, the team also made several optimizations that are a bit less obvious.
Several APIs that use coroutines under the hood have been improved. For example, when using Draggable and Clickable, developers should see faster reaction times and improved allocation counts.
Optimizations in layout rectangle tracking have improved the performance of Modifiers like onVisibilityChanged() and onLayoutRectChanged(). This speeds up the layout phase, even when not explicitly using these APIs. Another performance improvement is using cached values when observing positions via onPlaced().
Prefetch text in the background
Starting with version 1.9, Compose adds the ability to prefetch text on a background thread. This enables you to pre-warm caches for faster text layout, which is relevant for app rendering performance. During layout, text has to be passed into the Android framework, where a word cache is populated. By default this runs on the UI thread. Offloading prefetching and populating the word cache onto a background thread can speed up layout, especially for longer texts. To prefetch on a background thread, pass a custom executor to any composable that uses BasicText under the hood by providing a LocalBackgroundTextMeasurementExecutor to a CompositionLocalProvider, like so:
val defaultTextMeasurementExecutor = Executors.newSingleThreadExecutor()
CompositionLocalProvider(
    LocalBackgroundTextMeasurementExecutor provides defaultTextMeasurementExecutor
) {
    BasicText("Some text that should be measured on a background thread!")
}
Depending on the text, this can provide a performance boost to your text rendering. To make sure that it improves your app's rendering performance, benchmark and compare the results.
Background work performance considerations
Background work is an essential part of many apps. You may be using libraries like WorkManager or JobScheduler to perform tasks like:
- Periodically uploading analytics events
- Syncing data between a backend service and a database
- Processing media (e.g. resizing or compressing images)
A key challenge while executing these tasks is balancing performance and power efficiency. WorkManager helps you achieve this balance. It is designed to be power-efficient and to defer work to an optimal execution window that is influenced by a number of factors, including constraints you specify or constraints imposed by the system.
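As a minimal sketch of what specifying such constraints looks like, assuming a hypothetical SyncWorker defined elsewhere in your app:
import android.content.Context
import androidx.work.Constraints
import androidx.work.ExistingPeriodicWorkPolicy
import androidx.work.NetworkType
import androidx.work.PeriodicWorkRequestBuilder
import androidx.work.WorkManager
import java.util.concurrent.TimeUnit

fun scheduleSync(context: Context) {
    // Only run while charging and on an unmetered network; the system defers
    // execution until these constraints are met.
    val constraints = Constraints.Builder()
        .setRequiredNetworkType(NetworkType.UNMETERED)
        .setRequiresCharging(true)
        .build()

    // SyncWorker is a hypothetical worker class; replace it with your own.
    val syncRequest = PeriodicWorkRequestBuilder<SyncWorker>(6, TimeUnit.HOURS)
        .setConstraints(constraints)
        .build()

    // KEEP avoids rescheduling if the unique work is already enqueued.
    WorkManager.getInstance(context).enqueueUniquePeriodicWork(
        "sync",
        ExistingPeriodicWorkPolicy.KEEP,
        syncRequest
    )
}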
WorkManager is not a one-size-fits-all solution, though. Android also has a number of power-optimized APIs that are designed specifically with certain common Core User Journeys (CUJs) in mind.
Reference the Background Work landing page for a list of just a few of these, including updating a widget and getting location in the background.
Local Debugging tools for Background Work: Common Scenarios
To debug Background Work and understand why a task may have been delayed or failed, you need visibility into how the system has scheduled your tasks.
To help with this, WorkManager has several related tools to help you debug locally and optimize performance (some of these work for JobScheduler as well)! Here are some common scenarios you might encounter when using WorkManager, and an explanation of the tools you can use to debug them.
Debugging why scheduled work is not executing
Scheduled work being delayed or not executing at all can be due to a number of factors, including specified constraints not being met or constraints having been imposed by the system.
The first step in investigating why scheduled work is not running is to confirm that the work was successfully scheduled. After confirming the scheduling status, determine whether there are any unmet constraints or preconditions preventing the work from executing.
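You can also confirm the scheduling status programmatically. A minimal sketch, assuming the work was enqueued as unique work under the hypothetical name "sync" used in the earlier sketch:
import android.content.Context
import androidx.work.WorkInfo
import androidx.work.WorkManager

fun isSyncScheduled(context: Context): Boolean {
    // getWorkInfosForUniqueWork returns a ListenableFuture; call get() off the main thread.
    val infos = WorkManager.getInstance(context)
        .getWorkInfosForUniqueWork("sync")
        .get()
    return infos.any { it.state == WorkInfo.State.ENQUEUED || it.state == WorkInfo.State.RUNNING }
}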
There are several tools for debugging this scenario.
Background Task Inspector
The Background Task Inspector is a powerful tool integrated directly into Android Studio. It provides a visual representation of all WorkManager tasks and their associated states (Running, Enqueued, Failed, Succeeded).
To debug why scheduled work is not executing with the Background Task Inspector, consult the listed work status(es). An 'Enqueued' status indicates your work was scheduled but is still waiting to run.
Benefits: Aside from providing an easy way to view all tasks, this tool is especially useful if you have chained work. The Background Task Inspector offers a graph view that can visualize whether a previous task's failure has impacted the execution of a following task.
Background Task Inspector list view
Background Task Inspector graph view
adb shell dumpsys jobscheduler
This command returns a list of all active JobScheduler jobs (which includes WorkManager Workers) along with specified constraints and system-imposed constraints. It also returns job history.
Use this if you want a different way to view your scheduled work and associated constraints. For versions earlier than WorkManager 2.10.0, adb shell dumpsys jobscheduler will return a list of Workers with this name:
[package name]/androidx.work.impl.background.systemjob.SystemJobService
If your app has multiple workers, updating to WorkManager 2.10.0 will allow you to see Worker names and easily distinguish between workers:
#WorkerName#@[package name]/androidx.work.impl.background.systemjob.SystemJobService
Benefits: This command is useful for understanding whether there were any system-imposed constraints, which you cannot determine with the Background Task Inspector. For example, it will return your app's standby bucket, which can affect the window in which scheduled work completes.
Enable Debug logging
You can enable custom logging to see verbose WorkManager logs, which are tagged with the WM- prefix.
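A minimal sketch of enabling this, assuming WorkManager 2.9.0 or later (where Configuration.Provider exposes a property) and that you have disabled the default initializer so your custom configuration is used:
import android.app.Application
import android.util.Log
import androidx.work.Configuration

class MyApplication : Application(), Configuration.Provider {
    // Raises WorkManager's minimum log level so the verbose WM-prefixed logs appear in Logcat.
    override val workManagerConfiguration: Configuration
        get() = Configuration.Builder()
            .setMinimumLoggingLevel(Log.DEBUG)
            .build()
}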
Benefits: This gives you visibility into when work is scheduled, when constraints are fulfilled, and into lifecycle events, and you can consult these logs while developing your app.
WorkInfo.StopReason
If you notice unpredictable performance with a specific worker, you can programmatically observe the reason your worker was stopped on the previous run attempt with WorkInfo.getStopReason.
It's good practice to configure your app to observe WorkInfo using getWorkInfoByIdFlow to identify whether your work is being affected by background restrictions, constraints, frequent timeouts, or even being stopped by the user.
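A minimal sketch of observing this, assuming workId is the UUID of the request you enqueued and the log tag is illustrative:
import android.content.Context
import android.util.Log
import androidx.work.WorkInfo
import androidx.work.WorkManager
import java.util.UUID

suspend fun observeStopReason(context: Context, workId: UUID) {
    WorkManager.getInstance(context)
        .getWorkInfoByIdFlow(workId)
        .collect { info ->
            val stopReason = info?.stopReason ?: return@collect
            if (stopReason != WorkInfo.STOP_REASON_NOT_STOPPED) {
                // Record this in your field telemetry to spot recurring restrictions or timeouts.
                Log.d("WorkDebug", "Worker $workId stopped, reason=$stopReason")
            }
        }
}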
Benefits: You can use WorkInfo.StopReason to collect field data about your workers' performance.
Debugging WorkManager-attributed high wake lock duration flagged by Android vitals
Android vitals features an excessive partial wake locks metric, which highlights wake locks contributing to battery drain. You may be surprised to know that WorkManager acquires wake locks to execute tasks, and if those wake locks exceed the threshold set by Google Play, this can impact your app's visibility. How can you debug why so much wake lock duration is attributed to your work? You can use the following tools.
Android vitals dashboard
First, confirm in the Android vitals excessive wake lock dashboard that the high wake lock duration is from WorkManager and not from an alarm or other wake lock. You can use the Identify wake locks created by other APIs documentation to understand which wake locks are held due to WorkManager.
Perfetto
Perfetto is a tool for analyzing system traces. When using it to debug WorkManager specifically, you can view the "Device State" section to see when your work started, how long it ran, and how it contributes to power consumption.
Under the "Device State: Jobs" track, you can see any workers that have been executed and their associated wake locks.
Device State section in Perfetto, showing CleanupWorker and BlurWorker execution.
Resources
Consult the Debug WorkManager page for an overview of the available debugging methods for other scenarios you might encounter.
And to try some of these methods hands-on and learn more about debugging WorkManager, check out the Advanced WorkManager and Testing codelab.
Next steps
Today we moved beyond code shrinking and explored how the Android Runtime and Jetpack Compose actually render your app. Whether it's pre-compiling critical paths with Baseline Profiles or smoothing out scrolling with the new Compose 1.9 and 1.10 features, these tools focus on the feel of your app. And we dove deep into best practices for debugging background work.
Ask Android
On Friday we're hosting a live AMA on performance. Ask your questions now using #AskAndroid and get them answered by the experts.
The challenge
We challenged you on Monday to enable R8. Today, we are asking you to generate one Baseline Profile for your app.
With Android Studio Otter, the Baseline Profile Generator module wizard makes this easier than ever. Pick your most critical user journey, even if it's just your app startup and login, and generate a profile.
Once you have it, run a Macrobenchmark to compare CompilationMode.None vs. CompilationMode.Partial.
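A minimal sketch of such a comparison, assuming the usual MacrobenchmarkRule setup and the same TARGET_PACKAGE constant as in the generator above:
import androidx.benchmark.macro.CompilationMode
import androidx.benchmark.macro.StartupMode
import androidx.benchmark.macro.StartupTimingMetric
import androidx.benchmark.macro.junit4.MacrobenchmarkRule
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Rule
import org.junit.Test
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class StartupBenchmark {

    @get:Rule
    val rule = MacrobenchmarkRule()

    @Test
    fun startupNone() = startup(CompilationMode.None())

    @Test
    fun startupPartial() = startup(CompilationMode.Partial())

    private fun startup(compilationMode: CompilationMode) {
        rule.measureRepeated(
            packageName = TARGET_PACKAGE,
            metrics = listOf(StartupTimingMetric()),
            compilationMode = compilationMode,
            startupMode = StartupMode.COLD,
            iterations = 10
        ) {
            // Cold-starts the default activity and waits for the first rendered frame.
            startActivityAndWait()
        }
    }
}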
Share
your startup time improvements on social media using #optimizationEnabled.
Tune in tomorrow
You have shrunk your app with R8 and optimized your runtime with Profile Guided Optimization. But how do you prove these wins to your stakeholders? And how do you catch regressions before they hit production?
Join us tomorrow for Day 4: The Performance Leveling Guide, where we will map out exactly how to measure your success, from field data in Play Vitals to deep local tracing with Perfetto.