QA/Feature Testing
Purpose of this document
What this document is
This document outlines the major milestones associated with the quality management of Firefox features. These milestones are centered around manual testing, evaluation and reporting.
Note: The process described below applies only to Firefox features shipped on the train model.
What this document is not
The information available in this document does not apply to:
- QA teams conducting automated testing
- QA teams working on off-the-train features
Note: The process described below does not directly result in a final Go/NoGo decision for a Firefox feature.
Process overview
Phase 1: Kickoff and Test Plan sign off
- QA requests a feature walkthrough from the engineering team responsible for the feature
- Description: the walkthrough can be an actual demo held by the engineering team, a discussion driven by a specific agenda, or an email thread covering a specific list of topics/questions essential for sketching a Test Plan
- Goal: for QA to understand the scope of the feature and see the big picture
- QA drafts a Test Plan based on the feature walkthrough provided by engineering
- When: immediately after the feature walkthrough
- Exceptions: this might be impossible if the feature is received late and/or test execution is more urgent than Test Plan creation
- Best practice:
- while not mandatory, it’s usually best to consult with the Team Lead if specific items from the Test Plan are unclear
- once a draft of the Test Plan is ready, QA peers and/or the Team Lead start an internal review
- References: https://goo.gl/HdgWft (Test Plan template)
- QA sends the draft Test Plan to engineering for review and formal sign off
- When: as soon as the draft Test Plan has been green-lit internally, by QA peers and/or Team Lead
Phase 2: Test preparation and scheduling
- If the draft Test Plan has been signed off by the engineering team, QA starts creating high-level test cases based on the Test Objectives described in the Test Plan
- When: immediately after the Test Plan has been formally signed off
- Exceptions: Test Suite drafting could also start while the Test Plan is being reviewed, if QA is confident about the Test Objectives proposed in the Test Plan
- QA sends the high-level Test Cases or the draft Test Suite to Engineering for review
- When: as soon as the high-level test cases have been green-lit internally, by QA peers or Team Lead
- Best practice: it’s usually best to send the high-level test cases as a Google Spreadsheet, because not all engineering teams have access to TestRail, and a spreadsheet is a better environment for adding notes and comments per test case or step
- QA starts drafting a Test Suite based on the high-level test cases green-lit by Engineering
- When: as soon as the high-level test cases have been green-lit by Engineering
- QA schedules a testing session with specific activities based on the available time frame
- When: based on priority, time constraints and available bandwidth
Phase 3: Mid-Nightly test execution and sign off
- QA sends out a preliminary status report and schedule update before the mid-Nightly feature sign off
- Goal: For QA to discuss the status of the feature with the Engineering team, prior to the formal sign off.
- When: 1 week before the mid-Nightly sign off
- Exceptions: this is often not possible if the feature is received late or if testing starts late
- QA formally signs off the feature mid-Nightly
- When: by end of week #3 (6-week or 7-week release cycle) or by end of week #4 (8-week cycle)
- Best practice: while not mandatory, it’s usually best to consult with the Team Lead if the status or arguments used in the feature sign off report are unclear or might be subject to interpretation
- References: https://goo.gl/eXYzxx (sign off template)
Phase 4: Pre-Beta test execution and sign off
- QA sends out a preliminary status report and schedule update 1 week before the pre-Beta sign-off
- Goal: For QA to discuss the status of the feature with the Engineering team, prior to the formal sign off.
- When: 1 week before the pre-Beta sign off
- QA formally signs off the feature pre-Beta
- When: by end of week #5 (6-week cycle), by end of week #6 (7-week cycle) or by end of week #7 (8-week cycle) -- or 1 week before merge day
- References: https://goo.gl/eXYzxx (sign off template)
Phase 5: Pre-Release test execution and sign off
- (not always applicable) Engineering QA hands over the feature to Release QA and informs the engineering team about it
- When: in the 1st week of the release cycle
- Description: this step is optional, as not all features are handed over
- References: https://goo.gl/ehzXnK (feature handover template)
- QA sends out a preliminary status report and schedule update 1 week before the pre-Release feature sign off
- Goal: For QA to discuss the status of the feature with the Engineering team, prior to the formal sign off.
- When: 1 week before the pre-Release sign off
- QA formally signs off the feature pre-Release
- When: by end of week #4 (6-week cycle), by end of week #5 (7-week cycle) or by end of week #6 (8-week cycle) -- or 2 weeks before merge day
- References: https://goo.gl/eXYzxx (sign off template)
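The milestone weeks scattered across Phases 3–5 can be consolidated into a small lookup. A minimal Python sketch, derived only from the week numbers stated above (the phase labels are informal, not official terminology):

```python
def signoff_weeks(cycle_weeks):
    """Sign-off deadlines (end of week #) per release-cycle length.

    Derived from the phase descriptions above: mid-Nightly sign off is
    week 3 for 6/7-week cycles and week 4 for 8-week cycles; pre-Beta is
    one week before merge day; pre-Release is two weeks before merge day.
    """
    if cycle_weeks not in (6, 7, 8):
        raise ValueError("the train model described here uses 6-, 7- or 8-week cycles")
    return {
        "mid-Nightly": 4 if cycle_weeks == 8 else 3,
        "pre-Beta": cycle_weeks - 1,
        "pre-Release": cycle_weeks - 2,
    }
```

For a 6-week cycle this yields mid-Nightly at week 3, pre-Beta at week 5 and pre-Release at week 4, matching the phases above.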
Time table
6-week release cycle
7-week release cycle
8-week release cycle
Best practices
- Each feature’s QA owner should have a peer assigned to help. Larger, more complex features can justify more than one QA peer.
- Internal Test Plan reviews and updates should occur periodically. Feature Test Plans should be updated at least once a week to keep them relevant.
- Feature status updates should be provided periodically in the QA status documents associated with each Firefox version.
- Weekly checks should be made for bugs reported in the wild. Since there are so many environment variations (due to different software and hardware pairings), some bugs might only be uncovered by users who have very specific environment setups.
- A continuous monitoring process should be in place for new bug fixes. This can be easily done by setting up Bugzilla queries, or something similar.
- In the case of highly complex features, a meta bug should be created to track all the issues reported by QA. Having a separate meta bug for QA-reported issues makes tracking, referencing and reporting more efficient.
- Highly severe bugs (critical, blockers) affecting a feature should be flagged using the qablocker keyword. Using this keyword in addition to setting needinfo? flags for the right people is the most efficient way of raising major concerns.
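The last two practices above (continuous monitoring of fixes, and surfacing qablocker bugs) can both be served by saved Bugzilla REST queries. A minimal Python sketch, assuming the public bugzilla.mozilla.org `/rest/bug` endpoint; the product and component names are placeholders to be replaced with the feature's actual Bugzilla classification:

```python
from urllib.parse import urlencode

BUGZILLA_REST = "https://bugzilla.mozilla.org/rest/bug"

def fixed_since_url(product, component, since):
    """Query URL for bugs marked FIXED in a component since an ISO date.

    Polling this periodically covers the continuous-monitoring practice.
    """
    params = {
        "product": product,            # placeholder, e.g. "Firefox"
        "component": component,        # placeholder component name
        "resolution": "FIXED",
        "last_change_time": since,     # ISO date, e.g. "2024-01-01"
        "include_fields": "id,summary,status",
    }
    return BUGZILLA_REST + "?" + urlencode(params)

def qablocker_url(product):
    """Query URL for bugs carrying the qablocker keyword in a product."""
    params = {
        "product": product,
        "keywords": "qablocker",
        "keywords_type": "allwords",
        "include_fields": "id,summary,severity,status",
    }
    return BUGZILLA_REST + "?" + urlencode(params)
```

Fetching either URL returns a JSON object with a `bugs` list; wiring the queries into a weekly reminder or cron job keeps the checks from being forgotten.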