QA/Platform/Graphics
Revision as of 22:14, 2 July 2015
Consult this page for more information.
Understanding the Problem Space
The first order of business for my transition to the Graphics team is to understand the problem space, so that I can identify the team's immediate needs and make the best impact I can in the shortest amount of time.
- What are the key problems/challenges facing the Graphics team in terms of quality?
  - discrepancy in environments between testers and release users
  - discoverability of bugs pre-release
  - ?...
- Where can QA add value/support to the Graphics team?
  - improving pre-release discoverability of bugs
  - closing the gap between tester and release systems
  - helping with bug triage, particularly with bugs hiding in general components
  - representation in CrashKill
  - improving code coverage and/or identifying gaps in code coverage
  - identifying ways to improve participation in the graphics team (events, projects, One & Done, etc.)
  - documentation of tools, testing processes, etc.
  - building out the lab in Toronto
  - continuing to drive Betabreakers testing every 6 weeks
  - verifying bug fixes (what does this look like?)
  - profiling areas of risk (e.g. troublesome configs)
  - conducting root cause analysis for regressions
  - understanding problems outside of our control (e.g. driver resets)
  - feature testing and upcoming priorities (e10s, Windows 10, El Capitan, Android, B2G, etc.)
- What does QA need to know to be effective?
  - key components of an actionable graphics bug
  - fundamentals/technologies that should be learned
  - how to distinguish a graphics crash from a non-graphics crash that merely has a graphics signature
  - meetings, mailing lists, Bugzilla components to watch, blogs, IRC channels to join, etc.
  - who each member of the team is (incl. contributors) and what they do
  - where does graphics code reside in the tree?
  - what role does Unified Telemetry play in graphics quality?
  - what are the prefs to enable/disable different functionalities? (see the sketch after this list)
  - we need a database of known-troublesome hardware/driver configurations to inform testing, hardware acquisitions, and blocklisting
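For the prefs question above, here is a minimal sketch of graphics-related prefs that are commonly toggled (via about:config or a user.js file) when isolating graphics behaviour. Pref names and defaults change between releases, so treat this as an assumed starting point to verify against the tree rather than a definitive list.

    # Assumed graphics-related prefs; verify names and defaults against the tree.
    GRAPHICS_PREFS = {
        "layers.acceleration.disabled": True,        # turn hardware acceleration off
        "layers.acceleration.force-enabled": False,  # or force it on despite the blocklist
        "gfx.direct2d.disabled": True,               # Windows: disable Direct2D content rendering
        "webgl.disabled": True,                      # disable WebGL
        "webgl.force-enabled": False,                # or force WebGL on despite the blocklist
        "layers.async-pan-zoom.enabled": False,      # APZ (assumed pref name)
        "browser.tabs.remote.autostart": False,      # e10s on/off
    }

    def to_user_js(prefs):
        """Print the prefs as user.js lines that can be dropped into a test profile."""
        for name, value in sorted(prefs.items()):
            print('user_pref("%s", %s);' % (name, str(value).lower()))

    if __name__ == "__main__":
        to_user_js(GRAPHICS_PREFS)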
Sanity Checking
- Desktop
- Boot 2 Gecko (No-Jun Park)
- Android
Stability
How do we identify a graphics crash? (a rough heuristic is sketched after this list)
- by signature: gfx, layers, D2D, D3D, ?...
- by topmost filename: gfx, ?...
- ?...
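As a starting point for the heuristic referenced above, here is a minimal sketch that flags a crash report as graphics-related by signature prefix or topmost source file. The hint lists are illustrative assumptions and would need to be maintained against real signatures; a graphics-looking signature alone does not prove a graphics crash.

    # Rough heuristic for flagging a crash report as graphics-related.
    # The hint lists are illustrative assumptions, not an exhaustive set.
    SIGNATURE_HINTS = ("gfx", "mozilla::gfx", "mozilla::layers", "d2d", "d3d", "opengl")
    GFX_SOURCE_DIRS = ("gfx/",)  # graphics code lives under gfx/ in mozilla-central

    def looks_like_graphics_crash(signature, topmost_filename=""):
        """Return True if the signature or the topmost source file points at
        graphics code. Callers should still check the full stack and correlations,
        since non-graphics crashes can carry graphics signatures."""
        sig = (signature or "").lower()
        if any(hint in sig for hint in SIGNATURE_HINTS):
            return True
        return any(topmost_filename.startswith(d) for d in GFX_SOURCE_DIRS)

    # Example:
    #   looks_like_graphics_crash("mozilla::layers::CompositorParent::Composite",
    #                             "gfx/layers/ipc/CompositorParent.cpp")  # -> True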
How do we prioritize graphics crashes? (see the ranking sketch after this list)
- Overall topcrashes in release > beta > aurora > nightly
- Gfx crashes in release > beta > aurora > nightly
- Explosive crashes in release > beta > aurora > nightly
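A minimal sketch of the ordering above: rank crash buckets first by channel (release ahead of beta, aurora, and nightly) and then by volume. The record fields here are assumptions for illustration.

    # Channel weighting implied by "release > beta > aurora > nightly".
    CHANNEL_RANK = {"release": 0, "beta": 1, "aurora": 2, "nightly": 3}

    def prioritize(crash_buckets):
        """Sort crash buckets so release-channel, high-volume crashes come first.
        Each bucket is assumed to be a dict with 'channel' and 'count' keys."""
        return sorted(crash_buckets,
                      key=lambda c: (CHANNEL_RANK.get(c["channel"], len(CHANNEL_RANK)), -c["count"]))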
What tools do we have at our disposal to investigate crashes? (a sample Socorro query follows this list)
- Bughunter for investigating crashes correlated to a URL
- KaiRo's reports for identifying crashes that are new or escalating quickly
- Socorro for getting detailed information about crash reports
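To make the Socorro point concrete, here is a sketch of pulling crashes whose signature mentions "layers" from Socorro's SuperSearch API. The endpoint, field names, and operators are assumptions to double-check against the current Socorro documentation.

    # Sketch of a Socorro SuperSearch query for crashes with "layers" in the signature.
    # Endpoint and parameter names are assumptions; verify against the Socorro docs.
    import json
    import urllib.parse
    import urllib.request

    def fetch_layers_crashes(channel="release", limit=50):
        params = urllib.parse.urlencode({
            "product": "Firefox",
            "release_channel": channel,
            "signature": "~layers",      # "~" is the "contains" operator in SuperSearch
            "_results_number": limit,
        })
        url = "https://crash-stats.mozilla.org/api/SuperSearch/?" + params
        with urllib.request.urlopen(url) as response:
            return json.load(response)

    # Example: hits = fetch_layers_crashes()["hits"]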
What information is needed to make a crash actionable by developers?
- Correlations to particular hardware, driver, add-on, 3rd-party software, or library
- ?...
Features
- Gecko 39: OOM driver issues
- Gecko 40: OMTC on all platforms
- Gecko 41: WebGL 2, E10S M3
- Gecko 42: Desktop Tiling, Desktop APZ, Desktop Silk
Participation
- Sanity checking via One & Done
- Meetups to connect testers/users with devs
- Testdays to teach people about graphics testing
- Documentation and translation of documentation
- Engaging in community spaces (Discourse, Reddit, Facebook, Twitter, etc.)
Betabreakers
Testing:
- [DONE] Firefox 38: MSE stress test
- [DONE] Firefox 39: beta sanity check
- [DONE] Firefox 40: WebGL with e10s
- Firefox 41: exploratory testing of Windows 10 in Aurora, gfx-noted bugs (e.g. [1], [2], [3], [4])
- Firefox 42: to be determined
- Firefox 43: to be determined
- Firefox 44: to be determined
Risks:
- Betabreakers will not have Windows 10 deployed to their machines until after it is officially released; however, they can deploy the Windows 10 Preview to select machines upon request. We need to develop a set of requirements for Windows 10 testing, particularly machine specifications, for any upcoming testrun that targets Windows 10.
- We need to select hardware for testing based on data from past testruns and known-troublesome hardware (a sketch of a possible record format follows this list)
- We need to identify gaps in test coverage and investigate whether Betabreakers can fill these gaps for us
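The known-troublesome configuration database mentioned earlier could start very small. Below is a sketch, with hypothetical field names, of what one record might capture so that testrun hardware selection, acquisitions, and blocklisting decisions can all draw on the same data.

    # Hypothetical record format for a known-troublesome hardware/driver database.
    # Field names are illustrative assumptions, not an existing schema.
    from collections import namedtuple

    GfxConfig = namedtuple("GfxConfig", [
        "gpu_vendor",      # GPU vendor
        "gpu_device",      # GPU model or device id
        "driver_version",  # driver version string
        "os_version",      # operating system and version
        "symptom",         # what goes wrong (crash signature, artifact, device reset, ...)
        "bug",             # tracking bug number, if known
        "status",          # e.g. "blocklisted", "needs-testing", "fixed"
    ])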