QA/Team Dynamics/Platform QA
Scheduling is largely driven by the Firefox release schedule.
Uplifts? There was time pressure to release MSE/EME as soon as possible -- ideally, working before Nightly.
Treeherder -- same priority as other Tier 2 tests -- long-running; sheriffs notify on failure but don't pull the rip cord.
Focus on increasing Visibility
Development of automation and tests: was on GitHub, not Bugzilla. New: "Firefox media tests" component in Bugzilla; moving in-tree.
Syd works with both the WebRTC and Media teams -- Maja works with the Media team.
Contact point: Nils
WebRTC work is tracked in Bugzilla; WebRTC infra is on GitHub. Big advantage: Nils used to be in QA (so he knows a lot about testing), and he is very responsive.
Contact points: (Anthony Jones/Chris Pearce)
The team works with Softvision for manual testing -- NZ to Romania?
Team is based in NZ/Oz -- communication/timezone concerns, but not a big problem.
Communication: finding out what tests would be useful is a challenge. Devs are not always aware of what tests are running; file bugs in Bugzilla to get their attention.
Twice in the past 4 months -- big Netflix regressions.
Memory tracking -- also valuable.
The process of test development made Maja notice broken things.
Bugs that only show up in automation -- reporting is time-consuming since you can't reproduce locally; time is spent on automation infrastructure to collect more info about the failure (e.g. re-running with a debug build). Also quite rewarding -- these are bugs that are not reported in the field.
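The "collect more info about the failure" idea above can be sketched as a small retry harness: on a failure that can't be reproduced locally, the automation itself re-runs the test and keeps the diagnostics from every attempt. This is a minimal, hypothetical Python sketch -- the function and field names are invented for illustration, and a real harness would re-launch against a debug build and capture logs rather than just record tracebacks.

```python
import traceback


def run_with_diagnostics(test_fn, max_retries=1):
    """Run a test callable; on failure, re-run it and keep diagnostics.

    Hypothetical sketch of the workflow described above: each failed
    attempt's traceback is collected so the automation report carries
    more context than a bare "FAILED" (in practice the re-run might use
    a debug build and attach its logs).
    """
    failures = []
    for attempt in range(1 + max_retries):
        try:
            test_fn()
            return {"passed": True, "attempts": attempt + 1, "failures": failures}
        except Exception:
            # Record full context for this attempt before retrying.
            failures.append(traceback.format_exc())
    return {"passed": False, "attempts": 1 + max_retries, "failures": failures}
```

For an intermittent failure, the result dict would show `passed: True` with one recorded traceback from the failed first attempt -- exactly the kind of artifact that makes an automation-only bug reportable.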
Bugs found manually -- generally fixed before automation can be written.
Reusability and visibility -- team expertise.
Treeherder, multi-platform: we are running tests that no one else runs.
Staging/automation in VMs -- we can duplicate other teams' prod environments in VMs if there are no adequate staging environments.
blind spot -- iOS Fennec
Maja -- there seem to be overlapping automation and QA efforts. Duplication of effort? Do QA teams take full advantage of existing automation?
Syd -- is the complexity of the automation a big barrier to community contribution?
Code Review !!!
Code review is a great way to keep people informed of what other segments of the team/org are working on.
Challenge: We need reviews from people who are already overworked.
JS, Python, Marionette.
Compile lists of books/videos/learning tools and distribute them.
Everybody learns in different ways -- ensure variety of learning styles represented.
Keeping documentation/resources up-to-date (quarterly review?).
Writing/updating documentation as you learn -- your experience can help the next new hire or new community member.
The following applies to any kind of new Firefox contributor (employee, community member, whatever). Provide the same core training for all new contributors, regardless of role: overview of the development and release process, main tools, main test suites, main challenges, internal terminology, etc. I'm thinking a [multi-day] "bootcamp" and/or a 'welcome manual' (that is actually kept up-to-date).
A safe space/occasion to ask questions like 'what's a sheriff?' or 'what is uplifting?'.
The point is to make people productive faster and to provide exposure to the different roles/teams they might interact with, so that collaboration is easier (e.g. team dynamics for devs + QA). (More reasons: https://air.mozilla.org/may-brantina-onboarding-and-the-cost-of-team-debt-with-kate-heddleston/)
Right now, people learn all these things informally over time, and we have many sources of good info, but it's very inefficient.