Game Benchmark Automation


What?

The Browser Benchmark Automation project will run automated performance tests on Firefox's four release channels (Nightly, Aurora, Beta, and Release) and Chrome's three release channels (Dev, Beta, and Release), possibly including Canary as well, on platforms representative of typical Firefox users. Our first benchmark will run padenot's webaudio benchmark in Firefox Nightly and Chrome Canary on Windows 7. Later, other benchmarks may be selected from the list of benchmarks used by Tom's Hardware Guide. This project sometimes goes by the name "game benchmark automation" or "browser benchmark automation".
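
As a rough illustration of the browser/channel matrix described above, here is a minimal sketch of a driver script, assuming Python, placeholder install paths, and a locally served copy of the benchmark; none of these names come from the actual harness. A real harness would also need a way to collect scores from the page (for example, by posting results back to a local server), which this sketch omits.

  # Minimal sketch of a driver for the browser/channel matrix.
  # Paths and URL below are placeholders, not the project's real setup.
  import subprocess
  import time

  # Hypothetical Windows 7 install paths for the first two targets.
  CHANNELS = {
      ("firefox", "nightly"): r"C:\Program Files\Nightly\firefox.exe",
      ("chrome", "canary"):
          r"C:\Users\test\AppData\Local\Google\Chrome SxS\Application\chrome.exe",
  }

  BENCHMARK_URL = "http://localhost:8000/webaudio-benchmark/"  # placeholder

  def run_benchmark(binary, url, timeout=300):
      """Launch the browser on the benchmark page and wait for it to exit.

      A real harness would collect the benchmark score from the page;
      this sketch only times the browser session.
      """
      start = time.time()
      proc = subprocess.Popen([binary, url])
      try:
          proc.wait(timeout=timeout)
      except subprocess.TimeoutExpired:
          proc.kill()
      return time.time() - start

  if __name__ == "__main__":
      for (browser, channel), binary in CHANNELS.items():
          elapsed = run_benchmark(binary, BENCHMARK_URL)
          print(f"{browser} {channel}: session lasted {elapsed:.1f}s")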

Why?

Mozilla wants to track Firefox's performance improvements and regressions and to compare Firefox's performance against other browsers, such as Chrome and IE.

Who?

  • Dan Minor <dminor>, A-Team engineer developing the benchmark automation framework
  • Chris Peterson <cpeterson>, Engineering program manager
  • Kyle Lahnakoski <klahnakoski>, Statistics and Visualization Engineer (for benchmark reports)
  • Aaron Train <atrain>, QA Engineer
  • Kamil Jozwiak <kjozwiak>, QA Engineer
  • Alan Kligman <akligman>, Platform Engineer who designed the original automation framework

When?

Alan started this project in 2013 Q4 to run game-related benchmarks. Around the same time, Chris was organizing a SpiderMonkey team project to profile and optimize the benchmarks used by Tom's Hardware Guide's Web Browser Grand Prix. These two efforts merged in 2014 Q1. Development picked up speed in 2014 Q2 when Dan from the A-Team joined the effort.

Where?

The benchmark test machines are located in Mozilla's Toronto office, where Aaron and Kamil (or Alan) can fix in person any machine issues that Dan is unable to resolve remotely.

Bugs

No bugs on file (0 total; 0 open, 0 resolved, 0 verified).