Update:Remora Load Testing

« Back to Update:Remora

= Goals =
* Tests should be distributed, run from multiple nodes
* Tests should return req/s capability
* Tests should return metrics on HTTP error codes and/or success/failure rates
* Tests should run both with and without caching, and at varying cache-hit rates in between
* Tests should be configurable to have different levels of concurrency

= Grinder =
We concluded that [http://grinder.sourceforge.net/ Grinder] fulfills these requirements and excels in ways ab (ApacheBench) and httperf can't or don't.

= TODO =
* Analyze traffic and load
* Come up with potential test cases
* Record test cases for deployment in Grinder
* Do progressive load tests until melting point


= The Guessing Game =
The Grinder results will not be completely accurate.  That is the nature of load testing, but we can use peak/off-peak load numbers to understand how the results may be skewed by other applications and by the generally higher stress on shared resources.

We discussed gathering cumulative NS/app/db stats to get a better grasp of what our load-test numbers mean, and to gain some perspective on the margin of error.

mrz is going to give us some numbers based on [https://nagios.mozilla.org/graphs/mpt/Systems/ cumulative Cacti results].

= How to Create a Test Case =

= Test Cases =

= Where to Check-in Test Cases =
 
= Test Variance =
* By concurrent requests (see the configuration sketch after this list)
** 100
** 1000
** 2000
* By multiple input vars (GET)
** All unique vars, driving the cache-hit rate to zero
** A mixture of unique vars, possibly 50% duplicated
** Only one URI across all requests
* Otherwise differentiated by request URI
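
Concurrency maps onto <tt>grinder.properties</tt>: the total number of simulated users is processes × threads per process.  The property names below are standard Grinder settings; the script name and the numbers are placeholders for the 100-user case.

<pre>
# grinder.properties sketch -- 100 concurrent users (10 processes x 10 threads);
# scale processes/threads up for the 1000- and 2000-user runs.
grinder.script = remora_tests.py
grinder.processes = 10
grinder.threads = 10
grinder.runs = 0               # 0 = run until the Console stops the agents
</pre>

The GET-variable axis can be handled in the script itself.  A minimal Jython sketch, assuming a search URL that takes a query parameter (the repeated value and the 50% ratio are placeholders):

<pre>
# Choose a query value so that roughly half the requests repeat a cached
# value and the other half are unique (forcing cache misses).
import random

def queryValue(duplicateRatio=0.5):
    if random.random() < duplicateRatio:
        return "firefox"                        # repeated -> likely cache hit
    return "u%d" % random.randint(0, 1000000)   # unique -> cache miss
</pre>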
 
= Pages We Want To Test =
* Main page
* Search page
* Category listing
* Add-on main page
* Services
** Update check
** Blocklist
** PFS (Plugin Finder Service)
* RSS / Feeds
* Vanilla
** Add-on-specific discussion page
** Top page
* Others?  Mark?  Shaver?
 
= Grinder Installation =
 
Grinder is a distributed Java application scriptable in Jython.  A central GUI application known as the Console marshals a herd of Agents running on other machines.  The Agents spawn processes which, in turn, run tests in threads.  The tests, written in Jython, repeatedly hit some resource.  The Agents collect response statistics from the processes and report back to the Console, which aggregates and summarizes the data for presentation in its GUI.  The Console can also save the aggregated results as a CSV file.
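
For orientation, a minimal test script looks roughly like the following.  Each worker process runs the script; every thread gets its own <tt>TestRunner</tt> instance, and each run invokes <tt>__call__</tt>.  The target URL here is a placeholder.

<pre>
# Minimal Grinder 3 script sketch (Jython).  The wrapped request reports
# its timing and success/failure back to the Console as Test 1.
from net.grinder.script.Grinder import grinder
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

mainPage = Test(1, "Main page").wrap(HTTPRequest())

class TestRunner:
    def __call__(self):
        result = mainPage.GET("http://localhost/")   # placeholder URL
        grinder.logger.output("Status: %s" % result.statusCode)
</pre>
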
== Setting up Grinder ==
Setting up the application is the same for both the Agents and the Console.  We will probably create a custom tar file for distribution.  Once untarred, it should only be necessary to edit an environment setup file to adjust paths.
 
(TO-DO enumerate actual steps)
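
Until the steps are written up, a hypothetical sequence based on the description above (the archive and script names are placeholders, not the real distribution's):

<pre>
# Hypothetical setup -- file names are placeholders.
tar xzf grinder-dist.tar.gz
cd grinder-dist
$EDITOR setGrinderEnv.sh       # adjust JAVA_HOME / GRINDER_HOME paths
. ./setGrinderEnv.sh           # puts grinder.jar on the CLASSPATH
</pre>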
 
== Running the tests ==
 
Tests are run manually by starting the Agents and then the Console.
 
Agents and the Console are started at the command line, as sketched below.
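
A minimal sketch, assuming <tt>grinder.jar</tt> is already on the CLASSPATH and that the properties file is the one the environment setup script points at; these are the invocations from the Grinder 3 documentation:

<pre>
# On each load-generating node: start an agent, reading grinder.properties
java net.grinder.Grinder grinder.properties

# On the coordinating machine: start the Console GUI
java net.grinder.Console
</pre>

The agents connect to the Console (the host and port come from <tt>grinder.consoleHost</tt> / <tt>grinder.consolePort</tt> in <tt>grinder.properties</tt>) and wait for the signal to start their worker processes.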
