
2014 Yearly Goals

  • Support platform and products to ensure what we are shipping is high quality.
  • Increase scalability by attaining a level of community participation where we have, on average, 80 active contributors working on QA projects every week.
  • Every top-level project in every area will establish a comprehensive, integrated set of quality metrics and a known level of quality that can be referenced outside of QA.
  • Drive changes to our processes for increased efficiency and impact, as measured by both developer satisfaction (with finding issues early and often) and end-user quality perception and satisfaction.

Q2 Breakdown

Supporting Events

| Primary Team | Event | Action | When | Grade |
| --- | --- | --- | --- | --- |
| Desktop/Mobile/Svcs | Firefox 29 | Ship Australis, FxA-powered Sync | April 29 | Services: A-, Desktop: B, Apps on Android: A |
| Desktop/Mobile/Svcs | Firefox Sync (30, 31) | Addition of token server, verifier, migration, fixes, Sync 1.5, etc. | June 10 | A |
| Svcs/Desktop | Loop (32) | Early engagement, planning, automation, and testing | June 10 | A |
| FxOS/Svcs/Web | Firefox Accounts on FxOS 1.5 (Where's My Fox, Marketplace) | Early engagement, planning, testing, and automation | June 10 | C |
| FxOS | Ship 1.3 | See the release through IOT testing and get it out the door | April 28 | B |
| FxOS | Ship 1.3t | Get 1.3t through IOT testing | April 28 | B |
| FxOS | Stability and support for 1.4 | Drive stability for 1.4 as it moves toward full IOT testing | June 10 | A |
| FxOS | Support teams on 2.0 (1.5) | Engage with 2.0 teams, drive stability on master, early testing | Ongoing throughout the quarter; branch on June 9 | B- |
| Web | Socorro to continuous deployment | Establish a continuous deployment mechanism for Socorro (crash-stats.m.c) | June 30 | A |

Internal Goals (By Project)

Desktop Firefox

  • [DONE] Improve Geolocation Testing by ensuring that if geolocation code fails to work the failure will be noticed [juanb]
  • [DROPPED] Improve regression testing for gaming technologies by unhiding webGL automated tests, ensuring automated emscripten tests are run daily [kamil]
  • [DONE] Create necessary test materials and infrastructure for Loop Product to ensure testing can be shared with partners [ashughes]
  • [DONE] Complete and publish metrics from bugs, regressions, root cause analysis, stability, and uplifts [lizzard,kairo]

Desktop Automation

  • [MISSED] Ensure Mozmill systems are controlled by puppet for all updates on Mac and Linux systems
    • RelOps was unable to spend much time on us, given their own goals to finish. What remains is the coverage for Java and Flash, which is blocked on bug 1032133.
  • [DONE] Set up TPS (sync automation) continuous integration
  • [DONE] Automate specific tests as identified by Desktop team as features needing automation (Aim to automate 90% of requests)
  • [DONE] Provide automation training to larger community through 4 automation training events
  • [MISSED] Replace one component of Mozmill with the changes necessary for it to use Marionette as its engine in the future
    • The patch is ready but was blocked on final updates and the upgrade of wptserve on PyPI, so it will land a couple of days late

Android Firefox

  • [DONE] Deliver Feature Testing and Branch Health report (starting with Fx30) [SV/Kevin]
  • [DONE] Engage contributors and mentor 2-3 one and done projects [Aaron]
  • [DROPPED] Scale and rebalance resources and tasks for release sign-off and test planning [Kevin]

Web QA

  • [DONE] Get Flame devices running in on-device automation within 5% of the failure rate of the Buri phones
  • [MISSED] Complete and publish metrics on project health started in Q1
    • We assumed Tableau would be available this quarter (carried over from last), but given the Metrics team's resourcing and priorities, we are doing this in-house
      • We should be able to determine what we can and cannot use, metrics-wise, from what we already measure in Bugzilla
    • Partial metrics from Bugzilla bugs
  • [DONE] Expand marketplace payment testing by improving automated regression testing
  • [DONE] Streamline FxOS smoketest running and reporting by unifying manual and automated smoketests
    • The HTML reporting, the separate smoketest builds, and the Tinderbox builds set us up to implement what's left: the process piece

Firefox OS

  • [DONE] Streamline FxOS smoketest running and reporting by unifying manual and automated smoketests. [jason/marcia]
    • Plan established with contractors, Flame automation up, starting up on June 5th
  • [DONE] Build a proof-of-concept dashboard to display smoketest information [johan]
  • [MISSED] Create a plan to engage community testing now that there are contributors armed with devices (tablets and Flames) [marcia; falls under the community goal as well]
  • [MISSED] Complete and publish project metrics identified in Q1 [geo]
  • [DONE] Optimize team support so that we can take advantage of our time zones and team locations [tony]
    • [DONE] Distribute mainstream support to the US (Flame) and partner device support to TW (Dolphin, Tarako)
    • [DONE] Arrange contract test houses on both sides to collaborate
    • [MISSED] setup flame automation across TW and US for daily support
      • TW had a delay getting Flames in stock and will not have time this quarter to set up an automation rig like the US QA lab; TW will revisit this in Q3
  • [DONE] Support efforts to create automated regression testing for graphics on Fx OS [no-jun]
  • [DONE] Unify acceptance criteria with other teams (UX, Product, Dev) within sprints and report on project health at various milestones (1.5 timeframe) [jason/tony]
    • See this page for documented acceptance criteria for QA

Services

  • [DONE] Ship Firefox Accounts and Sync 1.5 exceeding quality goals [Team]
  • [DONE] Publish project health metrics created in Q1 - metrics [edwong]
  • [DONE] Ensure FxA has regression tests on FxOS by removing 2 barriers to developers (rpappa wrote sample tests to give developers both a framework and a template) [rpappa]
  • [DONE] Work with Desktop to ensure the Loop server system is thoroughly covered by test plans [pdehaan/jbonacci]
  • [DONE] Establish a Services QA 101 tiger team of contributors and lead them in ensuring that Tokenserver, FindMyDevice, Sync 1.5, SimplePush, etc. have adequate test coverage for the release level of each of those systems [kthiessen]
    • [DONE] Talk to 6 contributors about Services QA projects they're interested in.
    • [DONE] Talk to developers on the above project teams to find out where they need help.
    • [DONE] Have a conversation with a community leader to help co-ordinate the tiger team.
  • [DONE] OAuth sign in for Marketplace and FMD cross browser automation [kthiessen/team]

Platform Quality

  • [DONE] Hire tech lead for platform Quality team
  • [DONE] Create "Sunshine"/"Happy Path" test automation system in Mountain View for webRTC
  • [DONE] Analyze intermittent failures in webRTC and attempt 5 plausible solutions, attempting to reduce occurrence rates of all the webRTC intermittent tests by 10%
  • [MISSED] Increase quality level of webRTC code by expanding covered scenarios being tested (non-virtual systems, TURN, STUN, NATs, integrating with Loop partners, etc)
    • We outlined the strategy for this and are just now starting to build it out; the Sunny Day environment took longer than expected.
  • [DONE] Continue expanding prioritized backlog with an eye toward other platform related projects

Community

  • Firefox OS
    • [MISSED] Investigate using Mulet as a possible task in One and Done [marcia]
      • Never had time to start this project.
    • [DONE] Identify a set of QA contributors who should be the first to get the Flame and/or Flatfish devices [marcia]
      • Flame devices are not done; this will carry over.
    • [DONE] Create Getting Started documents on QMO in the Firefox OS section [marcia]
      • Tony created a great first step for Firefox OS, so I am considering this goal complete. We will continue refining.
    • [DONE] Conduct one test event this quarter for either the Flame or the Flatfish device [marcia]
  • Tools
    • [DONE] Provide historical and ongoing data on QA community contributions to the Baloo project, and define what reports we need back from them [lizzard] (bug 990667, bug 1017201)
    • [DONE] Generate a requirements document for version 2 of One and Done [lizzard/rbillings]
  • Recognition
    • [DONE] Create a Badges working group [rbillings]
    • [DONE] Produce one specific piece of QA swag to deploy to contributors [marcia]
    • [MISSED] Design a QA contributor swag pack [marcia] -> Stretch Goal
      • This was a stretch goal I did not have time to complete.
    • [DONE] Keyword/flag created in Bugzilla/Github to track nominations for swag [kthiessen]
  • Events
    • [DONE] Define QA Participation in first MozCamp by participating in Content Design session [marcia]
    • [MISSED] Reinvigorate Test Days [aaronMT/ashughes]
      • [MISSED] Instead of "reinvigorating" test days ourselves, focus on providing a roadmap and a set of curricula to help other community leads start doing the "reinvigorating" throughout Q3/Q4
        • Significant progress has been made, but roadmap definition will overlap into early Q3, with the intent to kick-start fixing our lowest-hanging fruit by the end of Q3 and build on that in Q4.
    • [MISSED] Develop an Events Strategy [marcia] -> Stretch Goal
      • This was a stretch goal I did not have time to complete.
  • Education/Resources
    • [DONE] Start a "QA 101" working group in consultation with the CBT Education Working Group, with emphasis on creating curriculum for teaching Mozillian contributors generally applicable QA skills [kthiessen]
  • Contribution Paths