QA/Execution/Web Testing/Socorro/Test Plan

From MozillaWiki

Socorro test plan

Socorro roadmap:

Staging server:

Release checklist:

Code Repository / Commits

Supported browser(s):

  • Firefox 3.6+ (Tier 1)
  • IE 8 (major functionality/layout only)
  • Chrome (major functionality/layout only)
  • Safari (major functionality/layout only)

Litmus Tests:

Manual Tests:

  • Ensure that the various top-level application views work (Firefox, Thunderbird, etc.)
  • As well as the versions listed beneath each:
    • Check that the versions are accurate/up-to-date/complete (ask!)
    • Check they're sorted correctly
    • Load each "Report" type:
      • Overview
      • Crashes per User
      • Nightly Builds
      • Top Changers
      • Top Crashers
      • Top Crashers by URL
      • Top Crashers by Domain
      • Top Crashers by Topsite
  • Browse around and ensure there are no JS errors trapped in the Error Console
  • Ensure that searching for crashes by ID -and- signature works
  • Ensure that the "More Reports" dropdown loads data for each option
  • Advanced Search (contains: "flash"):
  • Load a crash report, and:
    • "Details" tab: Ensure that the "Related Bugs" block shows up, and the links work
    • "Modules" tab: raw list of DLLs, their versions, debug identifiers, and filenames
    • "Raw dump" tab: just what it says, ensure there is data there
    • "Extensions" tab:
      • If there is an extension listed, click on it; it should go to its AMO details page
      • It lists the extension's GUID (email address or hex), version, and, if available, the "current?" version
    • "Comments" tab:
      • Comments show up (crashes don't always have comments; just ensure some do)
      • Their timestamps are links to the original report, which should have the same top frame
    • "Correlations" tab: click on the "Load" buttons, and ensure that data loads
    • Try the "Next" and "Previous" buttons
  • Make sure to also test:
    • Refresh, ensure that the server time updates
    • What else can be tested here, on staging? (Socorro devs, please fill in)
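The "check they're sorted correctly" step for the version lists above can be scripted. A minimal sketch follows; the parsing rules are an assumption about Mozilla-style version strings ("3.6.13", "4.0b9"), so verify them against real staging data:

```python
import re

def version_key(version):
    """Turn a version string into a list of tuples that sorts naturally:
    numeric chunks compare numerically, and a beta like 4.0b9 sorts
    before its final release 4.0."""
    parts = []
    for chunk in re.findall(r"\d+|[A-Za-z]+", version):
        parts.append((1, int(chunk)) if chunk.isdigit() else (0, chunk))
    # Sentinel: "~" sorts after any letter chunk, but its (0, ...) tag
    # still loses to numeric (1, ...) chunks, so 4.0 > 4.0b9 yet 4.0 < 4.0.1.
    parts.append((0, "~"))
    return parts

def is_sorted_newest_first(versions):
    """True if the scraped version list is in descending (newest-first) order."""
    keys = [version_key(v) for v in versions]
    return keys == sorted(keys, reverse=True)
```

For example, `is_sorted_newest_first(["4.0", "4.0b9", "3.6.13"])` is True, since the beta correctly falls between the final 4.0 and the older 3.6.13.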

IRC notes:

  lars: the dailyUrl cron job creates a csv file at a configurable location. That csv file in staging is at /home/processor and is named in the form YYYYMMDD-crashdata.csv.gz
  [09:41am] stephend|mtg: ah
  [09:41am] lars: cpu_info is column 12 (0 based)
  [09:42am] lars: I've just noticed that the column name is degenerate, though that is not likely a problem
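A quick way to spot-check that dump: the filename pattern and 0-based cpu_info column index come from the notes above, but the comma delimiter and the function name are assumptions (adjust the delimiter if the dump turns out to be tab-separated):

```python
import csv
import gzip

def column_values(path, column=12, delimiter=","):
    """Yield one column's values (header first) from a gzipped CSV crash
    dump such as /home/processor/YYYYMMDD-crashdata.csv.gz."""
    with gzip.open(path, "rt", newline="") as handle:
        reader = csv.reader(handle, delimiter=delimiter)
        header = next(reader)          # first row holds the column names
        yield header[column]
        for row in reader:
            if len(row) > column:      # skip short/truncated rows
                yield row[column]
```

Printing the first few values (`list(column_values(path))[:5]`) is usually enough to confirm cpu_info looks sane before comparing against what the staging UI shows.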

How to verify that the right data is showing up on staging:

Automated Tests (via Selenium):

  • What can be automated from the above list?
    • Can we pull product versions from product-details, store them in an array, and compare what's on staging against that?
    • Values of important <select> / <option> widgets
      • For each resulting page, verify the most important elements, and that there is *some* data
        • Also verify the column headings: that they have data and, if a column has a sort indicator, that it sorts correctly
    • Under Top Crashers / Top Changers, ensure three versions are present, and have 20 entries (15 right now on staging; which is right? Does it vary?)
    • View All link
    • Search
      • Ensure form elements are present/correct
      • Search by crash ID: find an incident via the DOM, get its ID, then execute a search for that ID
      • How do we search for stacktraces (data-dependent)?
        • Are there common/long-standing ones that staging always has available?
  • Next steps:
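The product-details comparison above could take roughly this shape: scrape the version <option> values from staging, fetch the published list from product-details, then diff the two sets. The function and key names here are illustrative, not an existing API:

```python
def diff_versions(published, on_staging):
    """Return versions missing from staging, and versions staging shows
    that product-details doesn't list."""
    published_set, staging_set = set(published), set(on_staging)
    return {
        "missing": sorted(published_set - staging_set),
        "unexpected": sorted(staging_set - published_set),
    }
```

An empty list under both keys means staging matches product-details; anything else is worth raising with the Socorro devs (per the "ask!" note in the manual tests).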

Gotchas / Notes

  • If certain versions / views have no data, we may have to re-enable them manually on staging via the admin interface after the data in Postgres gets flushed