QA/Goals/2014q4


Milestones this Quarter

This is the list of milestones this quarter. The table below is used to judge how our efforts shaped each axis of quality in the final release. We'll fill this in after the quarter completes.

| Milestone | Date | Delight | Dependability | Security & Privacy | Performance | Overall Score | Rationale |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Firefox Ten Year | Nov 9 | B | B | A | A | B | Fast turnaround release - tough to get done in time, with lots of web production work as well. Very lucky this worked out well given 33. |
| Firefox 33 | Oct 13 | F | C | A | A | C | Black screen issues; OMTC issues obscured other issues. |
| Firefox 34 | Nov 25 | B | B- | A | A | B- | Search could have gone better - we caught issues but had no time to fix them before it went live. Kamil did an amazing job getting a pre-look at it and filing many of the issues. Secrecy of the search release really impacted our ability to be effective. Loop uplifts and coordination issues with the server side proved difficult and fraught with issues, but it worked out in the end. Lots of confusion in the user community about the slow roll-out of the Loop feature - we need better documentation going forward. |
| PDX Work Week | Dec 1-6 | n/a | n/a | n/a | n/a | n/a | n/a |
| Marketplace shift to FxAccounts | Oct 31 | A | A | A | A | A | Moved from Persona to FxAccounts. Went quite smoothly even though there was a lot of churn. The automation we had in place helped a lot as well. |
| Marketplace ship on Tarako device in new location | Oct - Nov | A | A | A | A | A | Shipped without a ton of work on our side, which surprised us. We thought this would be bigger than it turned out to be. |
| QMO Update | Nov 20? | A | A | A | A | A | Went really well. Docs on MDN were a lot of work, but the effort was led well. |
| MozFest | Oct 24-26 | B | A | A | A | A | From a QA standpoint, we gave out 1000 phones and sent people back into their communities. It would have been nice to have had sessions around that to teach people to contribute effectively with the phones. |
| Firefox OS 2.1 | Nov 30 | A | A | A | A | A | Best-quality release of Firefox OS to date (per partner feedback). |
| Loop Ship | Nov 25 | B | B | A | A | B | The largest issues were mostly out of our control - miscommunication between the various groups. We were all hearing different things and spent a considerable amount of time tracking down the source of truth and adjusting strategies appropriately. The volume of uplifts was high but we dealt with it; having a separate branch helped quite a bit. |
| E10S on by default in Nightly | Nov 9 | A | A | A | A | A | A little difficulty for some Nightly users - being able to flip it off easily in the pref pane was a huge win. The pain we encountered was good and made the e10s system more resilient going forward. |

General Quality

  • [DONE] Create a centralized site of all dashboard/project information from all quality teams (opt in, per team) so that it is easier to understand the impact and effect of on-going quality efforts
    • [MISSED] [ctalbert] Add churn metrics to the dashboard for platform codebase
      • NOTE: We have the values for churn, but the metrics team wasn't able to complete the work of combining them with Bugzilla data to build a true model of when things are going off the rails. Churn values are here
    • [MISSED] [mschifer] Include SUMO metrics for support requests over time
    • [DONE] [bsilverberg] Identify and produce metrics around the "drop-off rate" of One and Done tasks for tracking
    • [DONE] [ctalbert] Invite services QA to add in relevant, high level dashboards for cross-QA projects (like Loop, FxAccounts etc)
  • [DONE] Deploy a simpler, streamlined version of QMO
    • [DONE] [pragmatic] Consolidate all QA Docs onto MDN and/or Moz Wiki as appropriate
    • [DONE] [aaronmt] Use our social media presences for digital outreach to create a bit of buzz around the QMO deployment measured by an increase in number of first time contributors
  • [DONE] [aaronmt/ashughes] Testdays: Based on feedback gathered in Q3, prioritize and resolve some of the barriers to participation, with a goal of developing contributors who stay engaged outside of these events (e.g. active in Bugzilla/Moztrap/One and Done day-to-day).
    • Hypothesis: if we provide more frequent bite-sized mentorship opportunities through testdays we'll develop more established relationships in other contribution channels
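The churn metrics mentioned above can be approximated from version-control history alone. A minimal sketch (an assumption on our part, not the metrics team's actual pipeline) that sums lines added plus deleted per file from `git log --numstat` output, using hypothetical file paths:

```python
from collections import Counter

def churn_from_numstat(numstat_output: str) -> Counter:
    """Sum lines added + deleted per file from `git log --numstat` output.

    Each numstat line looks like "<added>\t<deleted>\t<path>"; binary
    files report "-" for both counts and are skipped here.
    """
    churn = Counter()
    for line in numstat_output.splitlines():
        parts = line.split("\t")
        if len(parts) != 3:
            continue  # commit headers, blank lines, etc.
        added, deleted, path = parts
        if added == "-" or deleted == "-":
            continue  # binary file
        churn[path] += int(added) + int(deleted)
    return churn

# Example with hypothetical paths:
sample = (
    "10\t2\tdom/base/nsDocument.cpp\n"
    "-\t-\timage.png\n"
    "3\t1\tdom/base/nsDocument.cpp\n"
)
print(churn_from_numstat(sample).most_common(1))  # highest-churn file first
```

Combining a per-file churn ranking like this with Bugzilla regression data is the part of the work that remained unfinished.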

Firefox

  • [DONE] [mschifer] Work on doing fewer verifications since they don't seem to be effective, and use the time saved for exploratory testing.
    • Hypothesis: We will still produce releases of just as high quality, but will be able to find more interesting bugs sooner in the development cycle.
    • Hypothesis: We will be able to use our fledgling churn metrics to guide where and when to do verifications for non-security bugs (all security bugs continue to be verified if possible).
  • [DONE] [lizzard] Clarify, standardize, and improve how we do QA for Firefox through the entire release process. Have a fairly standard wiki template for each Firefox channel that we can use for tracking and for our daily workflow. We can better support the community in helping us (and each other) if we have a good, transparent window into how QA works across each "train".
  • [DONE] [mschifer] Develop a system to measure development pace for SV Automation (similar to FxTeam) so we can establish a historical baseline and begin to measure an increased pace of automated test development.

Web QA

  • [MISSED] [mbrandt and shared with MDN dev] Develop and stand up a JavaScript-based testing infrastructure (proof-of-concept) which allows Web development and Web QA to be maximally effective by sharing key points of test infrastructure, visibility, and process/workflow:
    • Carried this over to Q1 2015, and will post the week of January 4th
    • [DONE] [MDN] identify, document, and prioritize current testing challenges and needs
    • [DONE] [mbrandt] create the plan (incl. decisions, next steps/action items, owners, dependencies, agreed-upon process/workflow changes, etc.) to best-attempt to address those needs
    • [DONE] [mbrandt] execute against the plan -- according to decided owners -- noting new dependencies, shifts in direction, limitations, etc.
    • [MISSED] [mbrandt] at the end of the quarter, do a writeup (blog post?) covering the progress and any next steps for Q1 2015
  • [DONE] [rbillings] Review current commitments in terms of projects and transition several to community and/or developers until the set of sites is both sustainable and our time spent on them is impactful
  • [DONE] Finalize Marketplace support requirements for Firefox OS

Platform QA

  • [MISSED] [sydpolk/nils] Complete WebRTC connection establishment and connection quality tests by deploying them to the automation created in Q3 and have them reporting to Treeherder
    • NOTE: The system reports pass/fail but without logs, because reserving storage space for the logs falls to us and was not in the original scope of this requirement. We will return to this and finish it in Q1, after 36 ships.
  • [DONE] [sydpolk/nils] Start and deliver at least one sprint on a new (non-WebRTC) project - project TBD, likely one of: (gfx hardware discovery, MSE)

Community

  • [DONE] [ctalbert] Create the long term 2015 plan for community that addresses how we move our contributors from one-time contributors to active and core contributors
  • [DONE] [marcia] Participate in the Mozilla Spaces Bug Squash November 1-2 (London and Paris offices) with the explicit goal of getting more QA ideas and process instilled in Firefox OS contributors
  • [DONE] [marcia] Participate in the Mozilla Festival October 24-26 with the goal of sharing QA best practices around Firefox OS, and give the community the setup they need to make an impact on the project.
  • [DONE] [bsilverberg] One and Done: Build a metrics dashboard that exposes data on the most completed & most abandoned tasks, identifying contributors who are making an impact
  • [MISSED] [ctalbert, aaronmt, rbillings, ashughes, marcia] Use Community Calls, new testdays, community buddies, etc. as a bridge between One and Done and the larger community, in order to help contributors cross the gap between "doing something in One and Done" and being an active contributor.
    • Hypothesis: if we execute well on these initiatives we'll see a decline in the drop-off rate between one-task contributors and multiple-task contributors, which we can use to judge whether these efforts are successful.
    • [ashughes] I think this is at risk partly because we defined a KPI without establishing a baseline. Specifically for testdays, we have done well to integrate One and Done into our workflow, but I really don't know if it's had a measurable impact.
  • [DONE] [pragmatic] QMO Re-design: Complete the migration of Docs off QMO to MDN QA Zone. Roll the new design on to the live site.
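The drop-off hypothesis above implies a concrete metric. A minimal sketch (hypothetical data, not the actual One and Done schema) of the rate of contributors who stop after a single task:

```python
from collections import Counter

def drop_off_rate(completions):
    """Fraction of contributors who completed exactly one task.

    `completions` is a list of (contributor, task) pairs; a decline in
    this rate over time would support the hypothesis above.
    """
    tasks_per_contributor = Counter(user for user, _task in completions)
    one_timers = sum(1 for n in tasks_per_contributor.values() if n == 1)
    return one_timers / len(tasks_per_contributor)

# Hypothetical completion log: ana returns for a second task; ben and cal do not.
log = [("ana", "t1"), ("ben", "t1"), ("ana", "t2"), ("cal", "t3")]
print(drop_off_rate(log))
```

As noted in the [ashughes] comment, a baseline value of this rate would need to be established before a decline could be measured.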