Performance/Projects/ClientBenchmark

THIS PAGE IS A WORKING DRAFT
The page may be difficult to navigate, and some information on its subject might be incomplete and/or evolving rapidly.
If you have any questions or ideas, please add them as a new topic on the discussion page.

Experiment Name: about:benchmark (better name suggestions welcome!)
Experiment champion: Alexander Limi

The user experience of Firefox shows incredible variance, often because extensions/add-ons slow down the browser. We see it all the time: a user updates, some of their (often badly coded or badly maintained) extensions get disabled, and suddenly Firefox is fast again.

We want to give people (or at least the "tinkerers" and extension developers) a simple way to test whether their installed extensions affect speed and, if they do, by how much, and to offer to identify the badly behaving add-on(s).

1. Why should we do this experiment?

Because extensions are killing performance for certain users, and those users will switch to other browsers because they think it's an inherent problem with Firefox, not realizing the effect extensions have.

Down the line, we want these data to inform the listings on AMO, and possibly even warn you in Firefox when an extension has been identified as slowing down your browser.


2. What is the expected outcome?

The goal is to do the simplest thing that can possibly work to start identifying these add-ons. If we can create an extension that can do the following, it will be a great start:

about:benchmark

  • Gives you a page that says "Here, you can benchmark your browser speed with and without add-ons to figure out whether it is slower than it needs to be"
  • If you click the "Start benchmark" button, the browser will store the current session, restart the browser with a new session (with current add-ons enabled) and run a standardized benchmark suite consisting of a decent cross-section of the current web. Items that should be looked at are:
    • Startup speed of the browser
    • Rendering time for each of the pages (if they're not local, subtract network lag or download in the background to disk before starting the test)
    • Time some common and simple operations like opening a new tab, opening a new window, and closing them; I'm sure there are other useful metrics too
  • After the first test run, restart browser again, this time disabling all extensions (and clearing the cache :)
  • Run the same test suite, compare performance, and show something like "Your browser is 24% slower with add-ons enabled. Do you want us to try to figure out which add-on is causing this?"
  • Offer buttons with the possibilities to:
    • Resume your previous browser session.
    • Narrow down the misbehaving add-on(s) by either enabling one add-on at a time and running the tests, or doing a binary search (you're the scientists, I'll let you figure out the optimal way to do this ;) — also, warn that this will take a long time if you have a lot of add-ons.
    • Run the benchmark again.
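The comparison step above amounts to a simple relative-slowdown calculation. Here is a minimal sketch of it; the function names (`totalTime`, `relativeSlowdown`) and the sample numbers are made up for illustration, not part of any existing Firefox API:

```javascript
// A "suite result" here is just a map of test name -> elapsed ms.
function totalTime(results) {
  return Object.values(results).reduce((sum, ms) => sum + ms, 0);
}

// Relative slowdown of `withAddons` vs. `baseline`:
// 0.24 means "24% slower with add-ons enabled".
function relativeSlowdown(withAddons, baseline) {
  const base = totalTime(baseline);
  return (totalTime(withAddons) - base) / base;
}

// Illustrative numbers only: startup, page render, and tab-open times in ms.
const baseline   = { startup: 1000, render: 2000, newTab: 100 };
const withAddons = { startup: 1400, render: 2300, newTab: 144 };

const slowdown = relativeSlowdown(withAddons, baseline);
console.log(`Your browser is ${Math.round(slowdown * 100)}% slower with add-ons enabled.`);
```

Summing the timings is the crudest possible aggregate; per-test deltas would probably be more useful for pinpointing what got slower.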

Extra brownie points for allowing the user to upload/report the offending add-on to our servers, so we can take a look.
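The binary-search idea mentioned above could look something like the following sketch. It assumes a single culprit add-on and an assumed `benchmark` callback that takes a list of enabled add-ons and returns a total time in ms (in practice each call would mean a full restart-and-benchmark cycle, which is why it's slow):

```javascript
// Bisect the add-on list: if enabling one half reproduces the slowdown,
// the culprit is in that half; otherwise it's in the other half.
// Isolates one slow add-on in O(log n) benchmark runs instead of n.
function findSlowAddon(addons, benchmark, baselineMs, toleranceMs = 50) {
  let candidates = addons.slice();
  while (candidates.length > 1) {
    const half = candidates.slice(0, Math.ceil(candidates.length / 2));
    if (benchmark(half) > baselineMs + toleranceMs) {
      candidates = half;
    } else {
      candidates = candidates.slice(half.length);
    }
  }
  return candidates[0];
}

// Simulated benchmark: "slowbar" adds 500 ms to a 1000 ms baseline.
const fakeBenchmark = (enabled) =>
  1000 + (enabled.includes('slowbar') ? 500 : 0);
console.log(findSlowAddon(['a', 'b', 'slowbar', 'd'], fakeBenchmark, 1000)); // → 'slowbar'
```

If several add-ons each contribute to the slowdown, plain bisection breaks down, which is another reason the one-at-a-time mode is worth offering alongside it.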

3. Which resources (people/money/etc) are needed?

This shouldn't require a lot of design resources, and I assume it's not too hard to do this kind of thing either in Jetpack or the existing add-on infrastructure. The main challenge is designing a good set of tests, and making sure the numbers are independent of network conditions, etc.

Also, I don't know if we have the necessary hooks to time these things via an add-on.

It could probably be contracted out, but I'm not sure I would trust results from people who are less familiar with the code base. ;)


4. Are there any major risks?

It's important to understand that this won't solve the performance issues with add-ons — and that there are more interesting ways to do it that are also a lot more resource-hungry and time-consuming.

We have discussed hosting a standard benchmark on AMO that would warn authors/end-users when an add-on significantly reduces FF performance as part of the upload process.

However:

  • A lot of the add-ons that cause slowdowns aren't even on AMO (Norton Toolbar, Google Toolbar)
  • Having a standalone tool that people can run to evaluate their work is useful for extension authors
  • This way, it's easier for us to approach people and say "Hey, I ran the benchmark tool with your add-on, and noticed that it slows down Firefox. Anything I can do to help?"
  • Starting a project like this requires a lot of infrastructure, machines for running tests, etc. We can get most of the way there if we have a client-side add-on that we can selectively apply when we suspect something is wrong.

The other risk is of course that it might be hard and/or time-consuming to come up with a good test suite.

5. Does it align with a theme?

Making sure Firefox is as fast as possible for as many people as possible. Also, interesting data all around, even if only as ammo for Jetpack. ;)

6. Links to prior work in this space:

None that I know of.