QA/Execution/Web Testing/Goals/2016/Q2


Web QA Q2 2016 Goals

Discussion Etherpad here:

Team Goals and Themes

Previous Themes

  1. Move our test-automation code into each project's main development repository to enable faster feedback and visibility for developers
  2. Prototype the future with new tools, skills, and processes

New Themes

  1. Improve consistency and stability of automated tests
  2. Reduce dependency on buildmaster role

Individual Goals


  1. Work with devs and ops to review and improve AMO's release process
  2. Work on the WebExtensions release for 48 to ensure cross-platform coverage
    1. Check whether there are opportunities to engage with Chrome extensions before the 48 release; this needs to be discussed with Amy and Dan before confirming.


  1. Get a Dockerized OWASP ZAP (CLI) instance up and running in Web QA's Jenkins against a staged instance of one of our key sites (either AMO or MDN), triggered by a cron job and/or on demand
    • Document the goals, the process, and the progress (i.e. blog post[s]) to help increase awareness
    1. DONE:
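As a rough illustration of the ZAP goal above, the Jenkins job could be little more than a shell step using the official ZAP Docker image. This is a sketch, not the team's actual configuration: the staging URL is a placeholder, and only the image name and the `zap-baseline.py` entry point come from the ZAP Docker distribution.

```shell
#!/bin/sh
# Sketch of a Jenkins job step (cron-scheduled or on-demand) that runs
# OWASP ZAP's baseline scan from Docker against a staged site.
TARGET="https://addons.allizom.org"   # placeholder staging URL

docker pull owasp/zap2docker-stable

# zap-baseline.py spiders the target and reports passive-scan alerts;
# -r writes an HTML report into the mounted work directory, which Jenkins
# can then archive as a build artifact.
docker run --rm -v "$(pwd)":/zap/wrk:rw -t owasp/zap2docker-stable \
    zap-baseline.py -t "$TARGET" -r zap-report.html
```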


  1. Set up UI functional tests for the add-ons website to run against pull requests
    • The UI functional tests are currently run against deployed instances of the add-ons website. This deliverable means the UI functional tests will run whenever a contributor submits a patch for consideration, against an instance of the application that includes the change. This will shorten the feedback loop for failures and help prevent regressions from being introduced.
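A pull-request build for the deliverable above might look roughly like the following CI step: serve the patched application locally, then point the existing UI suite at it. This assumes the suite uses pytest with the pytest-base-url plugin; the commands, paths, and port are placeholders, not the project's actual setup.

```shell
#!/bin/sh
# Hypothetical CI step for a pull-request build.
# Start the application from the contributor's branch in the background.
python manage.py runserver 0.0.0.0:8000 &
APP_PID=$!

# Run the existing UI functional tests against the local instance
# (--base-url is provided by the pytest-base-url plugin).
pytest tests/ui --base-url http://localhost:8000

# Tear the application down when the tests finish.
kill $APP_PID
```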


  1. Ramp up and own Shavar deployments
  2. Learn Docker and run containers locally; assist with Docker-izing One and Done


  1. Dockerize One and Done
  2. MDN - Create a community oriented test plan
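The One and Done Docker-ization goal above could start with something like the following build-and-run commands. This is a sketch under stated assumptions: the image name, port, and settings module are hypothetical, assuming only that One and Done is a Django application with a Dockerfile at the repository root.

```shell
#!/bin/sh
# Hypothetical local workflow for a Docker-ized One and Done.
# Build an image from the repository's Dockerfile.
docker build -t oneanddone .

# Run it, mapping the Django dev-server port to the host and passing
# settings through the environment; remove the container on exit.
docker run --rm -p 8000:8000 \
    -e DJANGO_SETTINGS_MODULE=oneanddone.settings \
    oneanddone
```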

Open-ended Questions for Q2

  1. How can we increase community contribution?
    1. One and Done good-first-bug import
    2. Bugs Ahoy
    3. Creating Outreachy tasks for evaluating candidates
    4. Clean up the backlog on the test-automation bugs dashboard (and GitHub issues)
  2. How can we improve GitHub issue and review discovery?
    1. Gaia has some history with this; it might help to talk with them
    2. Use Bugzilla as much as possible, for history and discoverability
    3. Our dashboard could track Bugzilla components as well as GitHub issues, to combine the two
    4. Justin Potts had a redesign that might help
    5. TestRail test plans could generate both manual and automated tests