Measuring performance has a few major approaches:
- Profilers and performance tooling: detailed info with some overhead
- Timestamp measurement: precise values with no details
- Performance tests: a work in progress
If you need more assistance with android performance testing, please reach out to us on the #perf-android-frontend channel on Matrix.
Profilers and performance tooling
Important note: this approach adds overhead and yields less accurate time values. It is useful when you are looking for red flags in your code and trying to diagnose performance issues. If you need accurate measurements, see timestamp measurement or performance tests.
There are several tools at your disposal. Each has a specific use case, detailed below.
- The Firefox Profiler / Gecko Profiler (documentation) is the preferred profiler. It lets us profile with lower overhead on non-debuggable builds (including APKs from the Play Store), capture native code, and share profiles right in the browser. The UX is just so much better. :) You should try another profiler if you:
- Need access to Java threads other than the main thread (issue )
- Are profiling application start up and some information is missing (bug 1659103)
- The Debug API (documentation) is most useful when you:
- Need access to Java threads other than the main thread
- Are profiling application start up and want to capture it early (when the application can start executing code)
- The Android Profiler (documentation) is most useful when you want the benefits of the `Debug` API but cannot instrument the code. Note that the results in the Android Profiler can be very misleading: what you see in profiles may not accurately represent what users experience in release builds. This is due to the `debuggable=true` requirement, which causes different methods to increase in runtime inconsistently (further explanation). For example, in the Android Studio Profiler, we've seen UI layout take proportionally longer, relative to business logic, than it does in production builds.
- Simpleperf (official documentation; follow the Fenix-specific steps found in this README) is most useful when you want:
- The benefits of the Debug API
- To capture the start up trace even earlier (at process start)
- To capture native traces
It appears to have low overhead. However, it's difficult to set up, and it can only profile start up on rooted phones.
- Nanoscope (documentation) is another tool that can be useful for profiling non-debuggable applications. We suggest using it only when profiling start up has failed with the previous options.
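To illustrate what instrumenting code for the `Debug` API looks like, here is a minimal sketch. The `MethodTracer` wrapper and its injected hooks are hypothetical (they exist so the snippet compiles off-device); the underlying calls `Debug.startMethodTracing(name)` and `Debug.stopMethodTracing()` are the real `android.os.Debug` API:

```kotlin
// Hypothetical helper: wraps a block of code in method tracing so the
// resulting .trace file can be pulled from the device and inspected.
// On Android the start/stop hooks would be android.os.Debug's
// startMethodTracing(name) and stopMethodTracing(); they are injected
// here so the wrapper itself stays plain Kotlin.
class MethodTracer(
    private val start: (String) -> Unit,
    private val stop: () -> Unit,
) {
    fun <T> trace(name: String, block: () -> T): T {
        start(name) // e.g. Debug.startMethodTracing(name)
        try {
            return block()
        } finally {
            stop() // e.g. Debug.stopMethodTracing()
        }
    }
}
```

On a device you would construct it as `MethodTracer(Debug::startMethodTracing, { Debug.stopMethodTracing() })`, wrap the code path you care about, and pull the resulting `.trace` file from the device to open in your tooling.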
Timestamp measurement

In order to have a precise understanding of the impact of your changes on performance, you need to measure what the user experiences with little overhead. In practice, this is done by instrumenting your code with timestamps and logging them to Logcat. Measuring your changes can go wrong in many ways, so please reach out to us at #perf-android-frontend on Matrix so that we can explain what to look out for.
If you decide to jump in anyway, you’ll need to:
- Use `SystemClock.elapsedRealtime` to measure timestamps
- Use a production build variant: see the guide on build variants in the Fenix README
- If you're measuring UI, you may need to measure when frames are drawn, not when lifecycle methods (e.g. `onResume`) are called
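The steps above can be sketched as follows. The `ElapsedTimer` class is hypothetical and the clock is injected so the snippet compiles off-device; on Android you would pass `SystemClock::elapsedRealtime` (a real `android.os.SystemClock` method) and log with `android.util.Log`:

```kotlin
// Minimal sketch of timestamp instrumentation. `now` is injected so this
// runs off-device; on Android, pass SystemClock::elapsedRealtime and send
// the result to Logcat with android.util.Log.
class ElapsedTimer(private val now: () -> Long) {
    private var startMs = 0L

    fun start() {
        startMs = now()
    }

    // Returns the elapsed milliseconds since start().
    fun stopAndLog(tag: String): Long {
        val elapsedMs = now() - startMs
        // On Android: Log.i(tag, "elapsed: $elapsedMs ms"), then read it
        // back via `adb logcat`.
        println("$tag elapsed: $elapsedMs ms")
        return elapsedMs
    }
}
```

If you're measuring UI, trigger `stopAndLog` from a frame callback (e.g. `View.doOnPreDraw` from AndroidX Core KTX) rather than from a lifecycle method like `onResume`, so the measurement ends when the frame is actually drawn.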
Performance tests

These aren't ready for broader use yet; please contact us with questions!