QA/Async Drawing Test Plan

From MozillaWiki

Revision as of 18:20, 6 April 2017

Overview

Purpose

A quality assurance plan to ensure that Flash content on Windows with asyncDrawing enabled is ready for public release.

Quality Criteria

Risk area | Requirement | Status
Flash videos and apps | No significant regression in site correctness, video performance, or app/gaming performance | At risk
General performance | Overall performance of Firefox with asyncDrawing enabled for Adobe Flash Player should not be notably worse than with asyncDrawing disabled | TBD

Testing summary

Scope of Testing

In Scope

The scope of our testing is the async drawing functionality and performance of the most popular sites and games with the latest Adobe Flash Player.

  • Integration: verify integration with current browser functionality and UI;
  • Functionality: verify basic and advanced functionality against the existing requirements;
  • Performance: reference observed and collected performance data, where applicable.
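For testers setting up an environment, async plugin drawing was gated behind a preference. A minimal user.js fragment follows; the pref name is an assumption and should be verified in about:config on the build under test:

```js
// Force-enable async plugin drawing for testing.
// Assumption: the gating pref is dom.ipc.plugins.asyncdrawing.enabled;
// confirm in about:config on the build under test before relying on it.
user_pref("dom.ipc.plugins.asyncdrawing.enabled", true);
```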

Out of Scope

We will not be testing obscure web sites, nor testing in conjunction with popular add-ons.

Requirements for testing

Environments

Testing will be performed on the following OSes:

  • Windows 10 (64bit and 32bit)
  • Windows 7

Quality Assurance Strategy

Test Items

Flash Video and Apps

Criteria | Description | Metric | asyncDrawing disabled | asyncDrawing enabled | Criteria Met? | QA Owner
Manual testing | Test cases passed | - | unknown | 83% of test cases PASS cross-platform | Bugs on file (2017-03-06) | StephanG

General Performance

Acceptable regression ranges, if any, need to be determined.

Criteria | Description | Metric | asyncDrawing disabled | asyncDrawing enabled | Criteria Met? | QA Owner
CPU usage (observed) | Peak/average | % CPU | %peak / %average | %peak / %average | TBD (date status updated) | StephanG ?
Memory usage (observed) | Peak/average | % memory | %peak / %average | %peak / %average | TBD (date status updated) | StephanG ?
Telemetry - overall crash rate of Adobe Flash Player | - | crashes per 1000 usage hours | # crashes | # crashes | TBD (date status updated) | tracy
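The metrics in the performance table are simple aggregates. A minimal sketch of how the observed peak/average figures and the telemetry crash rate could be computed (function names are illustrative, not from the plan):

```python
def summarize_samples(samples):
    """Collapse sampled CPU or memory readings (in %) into the
    peak/average pair reported in the performance table."""
    return {"peak": max(samples), "average": sum(samples) / len(samples)}

def crash_rate_per_1000_hours(crashes, usage_hours):
    """Telemetry metric: Adobe Flash Player crashes per 1000 usage hours."""
    return crashes / usage_hours * 1000.0

# Example: four sampled CPU readings and a hypothetical crash count.
cpu = summarize_samples([12.0, 35.5, 22.5, 30.0])
print(cpu)                                    # peak and average CPU %
print(crash_rate_per_1000_hours(5, 10_000))   # crashes per 1000 usage hours
```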

Builds

Test Execution Schedule

The following table identifies the anticipated testing period available for test execution.

Project phase | Start Date | End Date
Start project | December 2016 | -
QA - Test plan creation | 2017-01-20 | 2017-02-07
QA - Test cases/Env preparation | 2017-01-20 | 2017-01-27
QA - Nightly Testing | Dec. 2016 | Mar. 2017
QA - Beta Testing | - | -
Release Date | - | -

Testing Tools

The following tools are used for testing:

Process | Tool
Test plan creation | Mozilla wiki
Test case creation | TestRail
Test case execution | TestRail
Bug management | Bugzilla

Status

Overview

  • Track the dates and build number where the feature was released to Nightly
  • Track the dates and build number where the feature was merged to Aurora
  • Track the dates and build number where the feature was merged to Release/Beta

References

Early testing tracked here

Testcases

Available on TestRail or in Google Doc format

Overview

  • Summary of testing scenarios

Test suite

  • Full Test suite - Test Rail - (google doc)
    • We should make sure the full test suite includes bugs that have whiteboard STR from the list below

Bug Work

  • Bugzilla Meta Bug
  • Bugzilla logged bugs -

Main List

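The Main List section was backed by a saved Bugzilla search that failed to render on this revision. Its parameters, recorded in the page history (f1=blocked, o1=equals, v1=1229961, plus a status list), can be reconstructed as a REST query URL. A minimal sketch; the helper name is illustrative, and the endpoint is the standard Bugzilla REST bug-search API:

```python
from urllib.parse import urlencode

BUGZILLA_REST = "https://bugzilla.mozilla.org/rest/bug"

def blocked_bugs_query(meta_bug, statuses):
    """Build a Bugzilla REST search URL for bugs that block `meta_bug`
    and are in one of the given statuses."""
    params = [("f1", "blocked"), ("o1", "equals"), ("v1", str(meta_bug))]
    params += [("status", s) for s in statuses]  # repeated key = any-of match
    return BUGZILLA_REST + "?" + urlencode(params)

url = blocked_bugs_query(1229961, ["NEW", "REOPENED", "UNCONFIRMED", "RESOLVED"])
print(url)
```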


Sign off

Criteria

Check list

  • All criteria under each section of the Quality Assurance Strategy should be green.
  • All test cases should be executed.
  • All blocker and critical bugs must be fixed and verified, or have an agreed-upon timeline for a fix (as determined by Engineering/RelMan/QA).

Results

Aurora testing

  • TBD on TestRail

Merge to Aurora Sign-off
List of OSes that will be covered by testing

  • Link for the tests run - TBD
    • Full Test suite - TBD

Checklist

Exit Criteria | Status | Notes/Details
Testing Prerequisites (specs, use cases) | Done |
Testing Infrastructure setup | No |
Test Plan Creation | Done |
Test Cases Creation | Done |
Full Functional Tests Execution | Done |
Automation Coverage | N/A |
Performance Testing | TBD |
All Defects Logged | Done |
Critical/Blockers Fixed and Verified | In progress |
Daily Status Report (email/etherpad statuses/gdoc with results) | TBD |
Metrics/Telemetry | TBD |
QA Signoff - Nightly Release | Email to be sent |
QA Beta - Full Testing | - |
QA Signoff - Beta Release | Email to be sent |

Ownership

Product contact:

Engineering contact:
Jim Mathies

QA contact:
Stefan Georgiev
Tracy Walker (IRC: tracy)