QA/Automation/Projects/Mozmill Shared Modules/On Demand Testing

Overview

Lead: Geo Mealer
Co-workers: Henrik Skupin, Al Billings
Dates: Ongoing; Phase III to be completed end of Q3 2011
Status: Streamline how update testing is performed on releases
Repository Location: http://hg.mozilla.org/qa/mozmill-automation/file/default/ondemand
Tracking Bug / Bug List: bug 657081, bug 628659
Documentation: Pulse, Command line docs

Project details

Overview/Goal

Update testing of Firefox releases is currently an involved task with many manual steps. In the past we have had to decline opportunities to parallelize or delegate the work because of the difficulty of explaining how to set up the tests, run them, and analyze the results.

Our initial goal is to streamline the process into a semi-automatic solution that is easy to document and does not require much manual hand-holding from the tester.

The Process Before

  1. Build an update test plan. Updates are performed from a given source to the current release build. A test plan chooses source builds using the following attributes:
    • Platform (win32, mac64, etc.)
    • Version (3.6, 3.6.12, etc.)
    • Locale (en-US, es, etc.)
  2. Download each source build manually from FTP and stage it in a directory hierarchy on a disk share.
  3. Manually log into platform machines/VMs that can see the above disk share, and (from command line) execute the update testing scripts. This is done once for each platform.
  4. Look on each machine for the generated test logs that result from the update testing script.

The Process after Phase I

Phase I concentrates on removing most of the fully manual steps in favor of a semi-automatic solution.

  1. Build an update test plan as above.
  2. From the test plan, create a configuration file listing the desired builds, platforms, and locales (a sketch of such a plan follows this list).
  3. Run a single script that reads the configuration file, automatically downloads the desired builds, and stages them in the disk share.
  4. Run a single script that uses Pulse to kick off a process on each target platform machine, which executes the existing update testing script.
  5. Look in brasstacks (our automation result reporting framework) for the update test results as they land.
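
As a rough illustration of step 2, the sketch below expresses a hypothetical test plan in Python and expands it into the full matrix of source builds. The plan dictionary, its keys, and expand_plan are illustrative names only; the actual configuration file format is whatever the staging script in the ondemand directory defines.

  # Hypothetical sketch only; the structure and field names are illustrative,
  # not the staging script's actual configuration format.
  from itertools import product

  # A test plan: which source builds to update from.
  plan = {
      "platforms": ["win32", "mac64", "linux"],
      "versions": ["3.6", "3.6.12"],
      "locales": ["en-US", "es", "de"],
  }

  def expand_plan(plan):
      """Expand the plan into one entry per (platform, version, locale) build."""
      return [
          {"platform": p, "version": v, "locale": l}
          for p, v, l in product(plan["platforms"], plan["versions"], plan["locales"])
      ]

  for build in expand_plan(plan):
      print("%(platform)s %(version)s %(locale)s" % build)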

The Staging Script

The staging script takes as input a configuration file containing a list of all desired platforms, builds, and locales.

The script expands the configuration file's lists into a comprehensive list of desired builds and downloads each one into the staging area.
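
A minimal sketch of that download step, assuming a particular release-archive URL pattern and staging layout, might look like the following; the base URL, file name, and directory hierarchy are assumptions for illustration, not the staging script's actual behavior.

  # Illustrative sketch; the URL pattern, file name, and staging layout
  # are assumptions rather than the staging script's real implementation.
  import os
  import urllib.request

  ARCHIVE = "https://ftp.mozilla.org/pub/firefox/releases"  # assumed base URL

  def stage_build(staging_root, version, platform, locale, filename):
      """Download one source build and place it in the staging hierarchy."""
      url = "%s/%s/%s/%s/%s" % (ARCHIVE, version, platform, locale, filename)
      target_dir = os.path.join(staging_root, version, platform, locale)
      os.makedirs(target_dir, exist_ok=True)
      urllib.request.urlretrieve(url, os.path.join(target_dir, filename))

  # Example: stage a single 3.6.12 win32 en-US build (file name is platform-specific).
  stage_build("/mnt/staging", "3.6.12", "win32", "en-US", "Firefox Setup 3.6.12.exe")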

The Execution Script

The execution system consists of two parts, a test pusher and a test listener.

The test pusher defines the branch, update channel, and optionally the platform to be tested. The test listener runs on each desired platform; when a push for all platforms or for the selected platform is received, it launches the existing update test scripts.

The existing update test scripts consume the directory tree created by the staging script to perform the desired total test run.
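
The sketch below illustrates the pusher/listener split. The publish() callable stands in for the real Pulse transport, and the message fields, the local platform constant, and the script invocation are assumptions rather than the actual message schema.

  # Illustrative sketch of the pusher/listener protocol; publish() stands in
  # for the Pulse transport, and the field names, LOCAL_PLATFORM value, and
  # script invocation below are assumptions.
  import json
  import subprocess

  LOCAL_PLATFORM = "win32"  # hypothetical: the platform this listener machine serves

  def push_testrun(publish, branch, channel, platform=None):
      """Pusher side: announce a test run; platform=None targets all platforms."""
      publish(json.dumps({"branch": branch, "channel": channel, "platform": platform}))

  def handle_message(raw_message):
      """Listener side: launch the existing update tests if the push targets us."""
      message = json.loads(raw_message)
      if message["platform"] in (None, LOCAL_PLATFORM):
          # Placeholder invocation of the existing update test script.
          subprocess.call(["python", "testrun_update.py", "--channel", message["channel"]])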

The Process after Phase II

Phase II concentrates on improving the Phase I process in two ways:

  1. We've introduced a cluster concept so that multiple test clusters can run without interfering with each other (see the sketch after this list).
  2. The system now works with functional tests as well. In fact, this project page is basically obsolete and needs to be replaced with "On Demand Release Testing," as the system can now handle all aspects of release testing.
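
One way the cluster concept could isolate runs is for each pushed message to carry a cluster name that listeners compare against their own configuration; the field name and value below are assumptions, not the actual message schema.

  # Illustrative only; the "cluster" field and the value below are assumptions.
  LOCAL_CLUSTER = "release-qa-1"  # hypothetical cluster this listener belongs to

  def should_handle(message):
      """Ignore pushes that belong to a different test cluster."""
      return message.get("cluster") == LOCAL_CLUSTER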

We've also introduced a new component, a heartbeat emitter. This is a temporary workaround: we found that Pulse listeners that receive only sporadic traffic can have their sockets closed out from under them. The heartbeat emitter is simply a traffic generator that keeps those connections active.
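
Such an emitter can be as simple as a loop that publishes a small keepalive message at a fixed interval; the interval and payload below are assumptions, and publish() again stands in for the Pulse transport.

  # Minimal traffic generator sketch; interval and payload are illustrative.
  import json
  import time

  def emit_heartbeats(publish, interval_seconds=60):
      """Publish a keepalive message periodically so idle listener sockets stay open."""
      while True:
          publish(json.dumps({"type": "heartbeat", "time": time.time()}))
          time.sleep(interval_seconds)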

The Process after Phase III (In Progress)

Phase III will improve the Phase II process in one primary way:

  1. The system will be kicked off from a web front end rather than via the command line.

The primary gain will be simplifying the process for those who aren't used to ssh and command-line tools. Our goal is for a non-automation person to be able to ramp up on running the tests within minutes.