Labs/Test Pilot/WeeklyMeetings/2009-09-17


TP v0.2.5 - Test result

Focus

  • Get the data analysis out to the community
  • Criteria:
    • Users would be able to see the test results on the website
    • Users would feel encouraged to discuss the test results and be able to do so easily (on the website? or wiki? or Google discussion group?)
    • Show aggregated results, not the raw data
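A minimal sketch of that last criterion, assuming raw measurements are bucketed into a histogram on the client so only aggregated counts, never individual values, are ever published; the function name and bin width here are illustrative, not part of the actual Test Pilot code:

```python
from collections import Counter

def aggregate_load_times(load_times_ms, bin_width_ms=250):
    """Bucket raw page-load times into a histogram so only
    aggregated counts (never the raw values) leave the client."""
    histogram = Counter((t // bin_width_ms) * bin_width_ms
                        for t in load_times_ms)
    return dict(sorted(histogram.items()))

# e.g. aggregate_load_times([120, 180, 400, 2600])
# -> {0: 2, 250: 1, 2500: 1}
```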


Do


Don't do

  • Any mechanism that lets users interact with the raw data.

(In the future (TP 0.4?), we may provide a query API so people can analyze the data for their own interests.)


TP periodical test

Goal:

To monitor/evaluate Firefox performance and learn about basic user browsing behaviors.


Duration:

5 days per month (covering both weekdays and weekends; the 5 days could also be picked at random?)

(We need to estimate the data volume and prepare the server accordingly.)
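A back-of-envelope sketch for that estimate; the user count and per-pilot payload size below are placeholder assumptions to be refined, not measured figures:

```python
def estimate_upload_volume(active_users, payload_bytes_per_day, days=5):
    """Rough server sizing: total data uploaded for one monthly
    test run, in GiB. All inputs are assumptions to be refined."""
    total_bytes = active_users * payload_bytes_per_day * days
    return total_bytes / 2**30  # convert bytes to GiB

# e.g. 100,000 pilots x 50 KiB/day x 5 days ~ 23.8 GiB
gib = estimate_upload_volume(100_000, 50 * 1024)
```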


Potential metrics:

Browser configuration (one-time metadata which can go along with any other data upload):

  1. OS version
  2. Firefox version
  3. Firefox language/locale (e.g. Firefox en-US)
  4. Number of add-ons
  5. Which add-ons (with their respective version numbers) are active (could be used to correlate with memory data to find add-ons that use a lot of memory?)
  6. Number of toolbar and status bar items (min/max/avg)
  7. Search bar (enabled/disabled/which search engine)
  8. Other toolbars installed
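The one-time metadata record above could be modeled roughly as follows; the field names and example values are illustrative, not Test Pilot's actual schema:

```python
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class BrowserConfig:
    """One-time metadata record attached to every data upload.
    Field names are illustrative, not the real Test Pilot schema."""
    os_version: str
    firefox_version: str
    locale: str                                      # e.g. "en-US"
    addons: List[str] = field(default_factory=list)  # "name/version" pairs

    @property
    def addon_count(self) -> int:
        return len(self.addons)

cfg = BrowserConfig("Windows NT 6.1", "3.5.3", "en-US", ["Test Pilot/0.3"])
payload = asdict(cfg)  # plain dict, ready to serialize with any metric upload
```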

Product performance (collected as a series of values over time):

  1. time taken to launch browser
  2. time taken to open new tab
  3. average time to draw a page
  4. amount of memory available (incl. OS)
  5. amount of memory used (average and max should be stored)
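Item 5's "average and max should be stored" reduction could look like this sketch, assuming memory readings arrive as a simple list of samples:

```python
def summarize_memory(samples_mb):
    """Reduce a time series of memory readings (MB) to the average
    and maximum, the two summary values slated for storage."""
    if not samples_mb:
        return None
    return {"avg": sum(samples_mb) / len(samples_mb),
            "max": max(samples_mb)}

# summarize_memory([180, 210, 340, 250]) -> {'avg': 245.0, 'max': 340}
```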


Browsing behaviors (interacts with product performance metrics):

  1. how long Firefox runs per session
  2. how often Firefox gets restarted (user-forced? add-on update?)
  3. number of open tabs/windows
  4. how many pages are visited / opened during a session
  5. preferred tab add-ons (e.g. tab view on top or on a side, as in Tab-Kit)
  6. Frequency and "average time open" of various non-browser windows: JS Error Console, Preferences, Library, Add-ons manager, etc.
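Items 1 and 2 could be derived client-side from a log of browser start/stop events; this sketch assumes a hypothetical (event, epoch-seconds) tuple format, not an actual Test Pilot data structure:

```python
def session_stats(events):
    """Derive session count and durations (minutes) from a log of
    ("start"|"stop", epoch_seconds) events; format is illustrative."""
    durations, start = [], None
    for kind, ts in events:
        if kind == "start":
            start = ts
        elif kind == "stop" and start is not None:
            durations.append((ts - start) / 60)
            start = None  # reset until the next start event
    return {"sessions": len(durations), "minutes": durations}

# session_stats([("start", 0), ("stop", 600), ("start", 900), ("stop", 1500)])
# -> {'sessions': 2, 'minutes': [10.0, 10.0]}
```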


Browser UI (some subset of these could be piggybacked on other tests):

Examples:

  1. Width of the address bar (in relation to the toolbar and its space when shared with other items.)
  2. Tool bar icon sizes (small or large)
  3. Common button order (default navigation buttons)
  4. Whether status bar / bookmark bar / etc are shown or hidden

TP Q4: V0.3 plan

Theme

  • Improving overall user experience for pilots

Goal

  • improve overall user experience with 2 tests: a periodical test (see above) and <TBD>

To do

  1. improve submission experience
  2. improve notification experience
  3. makeover for TP blog
  4. no restart for new studies (Jetpack)
  5. AMO experience

Don't do

  1. Query-API