Release Management/Rapid Betas/Test Plan


[DRAFT] Rapid Betas Test Plan

The following is a strategy for QA to execute effectively on Rapid Betas. It should serve as a guideline for outlining release test plans, and as a statement of what QA will do to ensure that quality and time to market are not impacted by Rapid Betas.

Once complete, we should escalate this to Release Management for consideration.


QA has the following existing resources for testing:

  • 2 Drivers: Juan and Anthony
  • 1 Stability: Marcia
  • 2 Testers: Jason and Tracy
  • 8 Softvision: 5 Desktop, 3 Automation

Rapid Beta Test Lifecycle

The lifecycle of a release is typically broken up into 30 business days (6 weeks of 5 business days each). The lifecycle adopts some aspects of an agile methodology, though not all, in order to keep the process lean. The lifecycle is summarized below:

  • Day 0
    • Automation and Smoketests used to qualify the merge
    • Feature sign-offs begin
    • Planning Discussion
      • Past-Release QA Analysis and Reflection
      • Define pain point areas that deserve exploratory testing for the upcoming week
  • Day 1
    • Feature sign-offs complete
  • Day 2-28
    • Weekly standup
      • Define pain point areas that deserve exploratory testing for the upcoming week
      • Review bug metrics tracker for the release
    • Daily automation triage, smoketesting, exploratory testing, and bug triage
    • Weekly triage meeting to go over bugs
  • Day 29
    • Go-no-Go meeting
    • Prepare test plan draft for the next release
      • Includes bug metrics tracker for fixed, qawanted, and unconfirmed bugs
  • Day 30
    • Ship current release
    • Day 0 for the next release

Effort Allocation

The following table summarizes the percentage of effort allocated to each major beta testing area for Firefox releases:

Note that the estimates below do not include time to investigate, debug, and otherwise test issues that come up.

Task          Participant                          Effort Percentage
Automation    Rotated per week within Desktop QA   13.8%
Smoketests    One Softvision (rotated)             27.7%
Exploratory   Four remaining Softvision            27.7%
FIXED Bugs    Everyone                             13.8%
QAWANTED      ashughes, juanb                      13.8%
UNCONFIRMED   All Mozilla team                     2.77%



Automation

Mozmill automation is kicked off automatically via Mozmill-CI and triaged on a daily basis:

  • Functional: run against latest version, en-US, all platforms (~30 minutes)
  • Updates: run against 5 previous versions, en-US + 1 locale, all platforms (~60 minutes)
  • Endurance: run against latest version, en-US + 2 locales, all platforms (~30 minutes)
  • Default to Compatible: run against latest version, all locales, all platforms (~240 minutes)
  • Total estimated runtime end-to-end: 7 hours


Triaging the results, and manually jumpstarting the automation when it fails to start, can be done by a single person.

  • Functional: ~5 minutes to review results (up to 30 minutes if there are bugs to file)
  • Updates: ~5 minutes to review results (up to 30 minutes if there are bugs to file)
  • Endurance: ~10 minutes to review results (up to 30 minutes if there are bugs to file); this takes longer because memory usage must be evaluated across test runs
  • Default to Compatible: ~30 minutes to review results (up to an hour if there are bugs to file)


Smoketests

Manual smoketests run on a daily basis. Stagger the OS coverage to make it less time consuming, for example:

  • Monday: Windows 7 32-bit and 64-bit
  • Tuesday: Windows XP and Vista
  • Wednesday: Mac OSX 10.5, 10.6, and 10.7
  • Thursday: Linux 32-bit and 64-bit
  • Friday: Windows 8 and Mac OSX 10.8

This can be done by one Softvision person each day in a couple of hours, but that person should be rotated (i.e. not the same person every time).
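Expressed as data, the staggered schedule above might look like the following minimal sketch; the tester names and the round-robin rotation are illustrative assumptions, not assignments from this plan:

```python
from itertools import cycle

# Staggered OS coverage from the plan; one day's platforms per entry.
SCHEDULE = {
    "Monday":    ["Windows 7 32-bit", "Windows 7 64-bit"],
    "Tuesday":   ["Windows XP", "Windows Vista"],
    "Wednesday": ["Mac OSX 10.5", "Mac OSX 10.6", "Mac OSX 10.7"],
    "Thursday":  ["Linux 32-bit", "Linux 64-bit"],
    "Friday":    ["Windows 8", "Mac OSX 10.8"],
}

# Hypothetical round-robin rotation so the same person is not
# assigned every time (placeholder names).
testers = cycle(["tester1", "tester2", "tester3", "tester4", "tester5"])
assignments = {day: next(testers) for day in SCHEDULE}
```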


Feature Sign-offs

Feature Owners ensure all scoped features have been tested for delivered user stories and for integration into Firefox. It is the sole responsibility of a particular feature's owner to decide the sign-off criteria and to ensure all of those criteria are met. This should be performed immediately following the merge into Beta and will likely take at least one full day.

Exploratory Tests

Run exploratory tests daily on areas prone to regressions. We need to decide what these areas are in advance, based on historical data and a forward-looking assessment. I would suggest picking a different area each day, for example:

  • Add-ons prone to causing us problems
  • Plug-ins (current and pre-release versions)
  • Top-sites
  • Feedback Summary indications
  • Post-mortem outcomes

This can be done in parallel to the Smoketests. Everyone who is not doing Smoketests that day, working together, should be able to complete this in a couple of hours.

Bug Triage


  2. FIXED (security)
  3. FIXED (feature)
  4. MAJOR



FIXED Bugs

Taking ownership means that you are committing to ensuring the bug fix is verified before we ship.

Until we have the QA Contact field, take ownership by adding [qa+:ircnick] to the whiteboard (ircnick is the nickname you use on IRC).
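As a sketch, the whiteboard convention above can be handled with a couple of small helpers; these function names are hypothetical, not an existing tool:

```python
import re

# Matches the [qa+:ircnick] ownership tag described above.
_QA_TAG = re.compile(r"\[qa\+:([^\]]+)\]")

def take_ownership(whiteboard, ircnick):
    """Append a [qa+:ircnick] tag unless the bug already has an owner."""
    if _QA_TAG.search(whiteboard):
        return whiteboard  # already owned; leave the existing tag alone
    return f"{whiteboard} [qa+:{ircnick}]".strip()

def owner_of(whiteboard):
    """Return the ircnick of the current QA owner, or None."""
    m = _QA_TAG.search(whiteboard)
    return m.group(1) if m else None
```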


To meet these targets, we will do the following:

  • Daily: verify the fixes which landed since the previous build
  • Weekly: have two meetings a week, one stand-up and one triage, to ensure we stay on track

To meet these targets we propose getting the bug reporter involved in the verification process. The following is a sample workflow of a verification under this model:

  • Work from a query of all RESOLVED FIXED, status-firefoxN:fixed, and tracking-firefoxN:+ bugs
  • Take ownership of verifying a bug:
    • add [qa+:ircnick] to the whiteboard (where ircnick is the nickname you use on IRC)
    • comment in the bug, asking the reporter to confirm whether the bug is resolved (ask them to test each version whose status flag is set to fixed)
  • If they do not respond within a week, do the verification yourself:
    • leave [qa+:ircnick] in the whiteboard (do not change it to [qa!] as was done in the past)
    • set the status flag to verified for all versions tested


QA needs to verify as many of the landed fixes as possible before we can sign off on a particular Firefox version. Based on historical data, we believe we can and should achieve an overall verification rate of 80% (~100 bugs), with priority given to security, blocker, and critical bugs.

The following table summarizes our verification metric targets per release:

Severity         Target  Firefox 12   Firefox 11    Firefox 10
Blocker          100%    100%: 1/1    100%: 0/0     100%: 2/2
Critical         100%    85%: 35/41   98%: 44/45    79%: 26/33
Major            >=80%   60%: 3/5     100%: 10/10   88%: 7/8
Normal or below  >=60%   75%: 51/68   99%: 78/79    79%: 38/48
Overall          >=80%   78%: 90/115  99%: 132/134  80%: 73/91

Security Bugs
sg:critical      100%    84%: 21/25   100%: 33/33   76%: 16/21
sg:high          >=90%   100%: 0/0    100%: 1/1     67%: 2/3
sg:moderate      >=80%   50%: 3/6     100%: 4/4     50%: 1/2
Overall          >=80%   77%: 24/31   100%: 38/38   73%: 19/26
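As a quick arithmetic check, each Overall row is just the verified and total counts summed across the severity rows. For example, for the Firefox 12 column:

```python
# Per-severity (verified, total) pairs from the Firefox 12 column above.
rows = {
    "blocker":  (1, 1),
    "critical": (35, 41),
    "major":    (3, 5),
    "normal":   (51, 68),
}
verified = sum(v for v, _ in rows.values())  # 90
total = sum(t for _, t in rows.values())     # 115
rate = round(100 * verified / total)         # 78, matching "78%: 90/115"
```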



QAWANTED Bugs

Taking ownership means that you are committing to doing everything you can to find steps to reproduce, produce a minimized test case, acquire necessary hardware/software, or find a regression range. Ultimately, the job of QA on these bugs is to find the reproducible case and the offending changeset so that Engineering can fix the issue before we ship.

Until we have the QA Contact field, take ownership by adding [qa+:ircnick] to the whiteboard (ircnick is the nickname you use on IRC).


To meet our targets, we will do the following:

  • Daily: carve out some time to debug what is in your ownership bucket
  • Weekly: update status and discuss next steps in a weekly triage meeting
  • Tues/Thurs: Release Leads check the query for any new bugs, get an update on existing bugs, and identify QA blockers for discussion in the Release Management meetings

The following is a proposed workflow for QAWANTED bugs:

  • Release Management escalates a bug by adding the QAWANTED keyword and CCing QA Leads
  • QA Lead ensures there is enough information to debug and assigns ownership of the bug to someone on the team
  • QA Lead checks in every Tuesday and Thursday to get an updated status on the bug
    • if QA has exhausted all known debugging methods, the bug is added to the next channel meeting, where Release Management will advise on other leads QA can follow, escalate to Engineering, or decide to drop QAWANTED if we've hit a dead end
    • if QA is successful in debugging, QAWANTED is removed and the bug moves to Engineering for a fix
  • When the bug is RESOLVED it is put into our FIXED Bugs verification bucket


QA is committed to looking at every single QAWANTED bug (~10 per release). This is not a commitment to see every bug fixed; it is a commitment to debug every QAWANTED bug to the extent of our abilities.

Bugs      Target  Firefox 12   Firefox 11   Firefox 10
FIXED     >=80%   77%: 10/13   85%: 11/13   20%: 1/5
DEAD-END  <=20%   23%: 3/13    15%: 2/13    80%: 4/5
Overall   100%    100%: 13/13  100%: 13/13  100%: 5/5


UNCONFIRMED Bugs

We should make a point of looking over the UNCONFIRMED bugs filed against the current milestone, ideally on a weekly basis. A few ideas to this end:

  • Weekly triage meeting
  • Daily "moderator" (like Contributor list)
  • Community project
  • Testdays


Taking ownership of an UNCONFIRMED bug means that you are committed to doing everything in your ability to either confirm or debunk the bug.

Until we have the QA Contact field, take ownership by adding [qa+:ircnick] to the whiteboard (ircnick is the nickname you use on IRC).


To meet our targets we will commit to doing the following:

  • Holding a weekly triage meeting to go through as many of the bugs as possible
  • Engaging with people through the contributor mailing list
  • Organizing one test day per release to assist with these bugs

The following is a sample workflow for an UNCONFIRMED bug:

  • Assign to yourself by adding [qa+:ircnick] to the whiteboard (where ircnick is your nickname on IRC)
  • If the bug appears to be in the wrong component, move it to the correct component
  • Attempt to reproduce the bug given the information provided
  • If the bug does not reproduce, ask the reporter for more information, for example:
    • Running with a new profile
    • Disabling add-ons
    • Trying different Firefox versions
    • Trying other browsers
    • Asking for test URLs or a minimized test case
  • If you reproduce the bug, set the status to NEW and try to:
    • find a minimized test case (add testcase keyword if found)
    • find a regression range (add regression keyword if found)
  • Outcomes
    • NEW: the bug is reproducible
    • INCOMPLETE: not reproducible and no feedback from the reporter (give them a week)
    • INVALID: designed behaviour or a problem external to Firefox
    • WONTFIX: a bug in Firefox which will not be fixed
    • WORKSFORME: the bug no longer reproduces and no patch was applied to fix it
    • FIXED: the bug no longer reproduces and a patch was applied to fix it
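The outcome list above can be sketched as a decision function; the boolean parameters are illustrative assumptions for this sketch, not actual Bugzilla fields:

```python
# Hypothetical decision tree for the triage outcomes described above.
def triage_outcome(reproduces, by_design_or_external=False,
                   will_not_fix=False, patch_landed=False,
                   reporter_responded=False):
    if by_design_or_external:
        return "INVALID"
    if will_not_fix:
        return "WONTFIX"
    if reproduces:
        return "NEW"
    if patch_landed:
        return "FIXED"       # no longer reproduces; a fix landed
    if reporter_responded:
        return "WORKSFORME"  # no longer reproduces; no fix landed
    return "INCOMPLETE"      # no repro, no feedback (give them a week)
```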


QA should endeavour to have eyes on all UNCONFIRMED bugs. However, due to resource constraints, this just isn't possible. Historically, the combined efforts of QA and Engineering have burned the list down to 35% UNCONFIRMED. QA will commit to improving (i.e. lowering) that number to 30% (or, equivalently, increasing the CONFIRMED/RESOLVED rate to 70%). This will be accomplished through weekly triage meetings, community test days, and engagement via the contributor mailing list.

Bugs                Target  Firefox 12       Firefox 11       Firefox 10
UNCONFIRMED         <=30%   31%: 310/1005    39%: 411/1049    33%: 442/1359
CONFIRMED/RESOLVED  >=70%   69%: 695/1005    61%: 638/1049    67%: 917/1359
Overall             100%    100%: 1005/1005  100%: 1049/1049  100%: 1359/1359


Community Involvement

Provide ways the community can help out. These should be called out explicitly in your test plan so that community members know what to do, how to start, who to ask, and where to report issues.


  • Unconfirmed bug triage
  • Dogfooding builds and filing new bugs
  • Verifying fixed bugs
  • Dogfooding features and filing new bugs
  • Running MozTrap tests
  • Using Aaron's Topsites Addon for Desktop crowdsourced testing

Questions, Concerns, and Blockers

Add feedback to this etherpad; all points will be answered inline.

Discussion Points

  • [DONE] There will be cross-over points with different teams (e.g. Core bugs) -- do they need to be called out / planned for in the Strategy?
    • ANSWER: There will surely be cross-over areas but we cannot include other teams' resources as assumed resources for our own team
  • [DONE] Rapid releases do not provide us the time we need for a dedicated planning phase.
    • ANSWER: We will solve this by having meetings, to be implemented immediately as they benefit us with or without rapid betas:
      • post-mortem to evaluate the previous release and plan for the next release
      • weekly stand-up meeting to evaluate the previous week and plan the next week
      • weekly triage meeting to ensure traction of bug queries
  • Figuring out the point-based system for tracking progress and estimating work
  • Bug triage targets: ideal, acceptable, necessary
  • Discussion about adjusting testing process to be more Agile: wiki
  • More clearly defining ownership of bug triage (QA Contact field will help here)
    • Bug 684088 - Return QA contact field to QA