Update:Remora Load Testing



= TODO =
* Analyze traffic and load
* Come up with potential test cases
* Record test cases for deployment in Grinder
* Do progressive load tests until melting point


= How to Create a Test Case =
# <strike>find out how to get and parse results from Grinder</strike>
# document how to set up test nodes, install Grinder on them, and connect them to the aggregating Console box
# write load tests appropriate for the "Pages We Want To Test" section below
# set up a load test strategy based on the "Test Variance" section below
# gather results and export them from the Grinder Console
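The "connect to aggregate box" step boils down to a properties file on each test node. A starting point, assuming Grinder 3's standard grinder.properties mechanism (the script name, host name, and counts here are placeholders, not agreed values):

```
# grinder.properties -- read by each Agent at startup
grinder.script = remora_tests.py          # hypothetical Jython test script
grinder.processes = 2                     # worker processes per agent
grinder.threads = 50                      # threads per process
grinder.runs = 0                          # 0 = run until stopped from the Console
grinder.consoleHost = console.example.org # the aggregating Console box
grinder.consolePort = 6372                # Grinder's default console port
```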


= Test Cases =
We concluded that [http://grinder.sourceforge.net/ Grinder] fulfills the requirements listed under Goals, and excels in ways ApacheBench (ab) and httperf can't or don't.


= The Guessing Game =
The Grinder results will not be completely accurate; that is the nature of load testing. But there are things we can do with peak/off-peak load numbers to understand how the load test results could be skewed, and to compensate for the external effect of other apps and the overall higher stress on shared resources.
 
We discussed gathering some cumulative NS/app/db stats to get a better handle on what our load test numbers mean, and to gain some perspective on the margin of error.
 
mrz is going to give us some numbers based on [https://nagios.mozilla.org/graphs/mpt/Systems/ cumulative Cacti results].
 
= Test Variance =
* By concurrent requests
** 100
** 1000
** 2000
* By multiple input vars (GET)
** All unique vars, causing cache-hit to be zero
** Mixture of unique vars, possibly 50% duplicated
** Only one URI across all requests
* Otherwise differentiated by request URI
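The cache-hit variance above can be generated mechanically. A minimal sketch in plain Python (the real tests would be Jython scripts run inside Grinder; the path and query names are placeholders):

```python
import random

def build_uris(base, n, cache_hit_rate):
    """Build n request URIs against a base path, where roughly
    cache_hit_rate of the requests reuse an earlier query string
    (a repeated URI should be served from cache)."""
    uris = []
    for i in range(n):
        if uris and random.random() < cache_hit_rate:
            uris.append(random.choice(uris))        # duplicate -> cache hit
        else:
            uris.append("%s?q=term%d" % (base, i))  # unique -> cache miss
    return uris

# The three scenarios from the list above:
all_unique = build_uris("/en-US/search/", 1000, 0.0)    # cache-hit is zero
half_dupes = build_uris("/en-US/search/", 1000, 0.5)    # ~50% duplicated
single_uri = ["/en-US/search/?q=web+developer"] * 1000  # one URI across all requests
```

Each list can then be fed to the test threads, so the only variable changing between runs is the cache-hit rate.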
 
= Pages We Want To Test =
We should be able to configure two base URIs:
* https://remora.stage.mozilla.org/ (for public and forum pages)
* https://remora-services.stage.mozilla.org/ (for services)
 
Test URIs, used against test db (https://remora.stage.mozilla.com/):
/
/en-US/search
/en-US/search/?q=web+developer
/en-US/addons/browse/type:1
/en-US/addons/browse/type:4
/en-US/addons/browse/type:1/cat:all
/en-US/addons/display/60
/en-US/reviews/display/60
/en-US/addons/rss/newest
 
 
Update service test URI (https://remora-services.stage.mozilla.org/):
/update.php?reqVersion=1&id={c45c406e-ab73-11d8-be73-000a95be3b12}&version=1.0.2&maxAppVersion=2.0.0.*&status=userEnabled&appID={ec8030f7-c20a-464f-9b0e-13a3a9e97384}&appVersion=2.0.0.2pre&appOS=Darwin&appABI=x86-gcc3
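For scripting, that query string is easier to maintain as a parameter dictionary and encode per request. A sketch in plain Python (parameter values copied verbatim from the URI above; the function name is ours):

```python
from urllib.parse import urlencode, parse_qs, urlparse

# Parameter values taken from the update.php test URI above.
params = {
    "reqVersion": "1",
    "id": "{c45c406e-ab73-11d8-be73-000a95be3b12}",
    "version": "1.0.2",
    "maxAppVersion": "2.0.0.*",
    "status": "userEnabled",
    "appID": "{ec8030f7-c20a-464f-9b0e-13a3a9e97384}",
    "appVersion": "2.0.0.2pre",
    "appOS": "Darwin",
    "appABI": "x86-gcc3",
}

def update_uri(base="https://remora-services.stage.mozilla.org"):
    """Build an update-check URI; individual params can be varied per request."""
    return "%s/update.php?%s" % (base, urlencode(params))

uri = update_uri()
```

Varying fields such as appVersion or appOS between requests would let the same test exercise different update-check code paths.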
 
Pages we want to be able to test:
* <strike>Main page</strike>
* <strike>Search page</strike>
* <strike>Category listing</strike>
* <strike>Add-on main page</strike>
* Services
** Update check
** Blocklist
** PFS
* <strike>RSS / Feeds</strike>
* Vanilla
** Addon-specific discussion page
** Top page
 
= Grinder Installation =
 
Grinder is a distributed Java application scriptable in Jython.  A central GUI application known as the Console marshals a herd of Agents running on other machines.  The Agents spawn processes which, in turn, run tests in threads.  The tests, written in Jython, are intended to hit some resource repeatedly.  The Agents collect response statistics from the processes and report back to the Console.  The Console aggregates and summarizes the data for presentation within its GUI.  The Console also can save the aggregated results as a CSV file.
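As a concrete illustration of such a test script, here is a minimal sketch in Grinder 3's Jython style. It runs only inside a Grinder agent process, which supplies the net.grinder packages; the test numbers and the choice of URLs are ours, not an agreed plan:

```python
# Grinder 3 Jython test script -- executed by each worker thread.
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Wrapping an HTTPRequest in a numbered Test lets the Console
# aggregate timing and error statistics per test.
front_page = Test(1, "Front page").wrap(HTTPRequest())
search = Test(2, "Search").wrap(HTTPRequest())

class TestRunner:
    """Instantiated once per thread; __call__ is invoked once per run."""
    def __call__(self):
        front_page.GET("https://remora.stage.mozilla.org/")
        search.GET("https://remora.stage.mozilla.org/en-US/search/?q=web+developer")
```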
== Setting up Grinder ==
Setting up the application is the same for both the Agents and the Console.  We will probably create a custom tar file for a distribution.  Once untarred, it will only be necessary to edit an environment setup file to adjust paths.
 
(TODO: enumerate the actual steps)
 
== Running the tests ==
 
Tests are run manually by starting the Agents and then the Console.
 
Agents are started at a command line by typing:
 
...
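The command left unfinished above is, per the standard Grinder 3 documentation, roughly the following (the install path is a placeholder for wherever the tarball is untarred):

```
# On each test node -- start an Agent, pointing it at a properties file:
java -cp /opt/grinder/lib/grinder.jar net.grinder.Grinder grinder.properties

# On the aggregating box -- start the Console GUI:
java -cp /opt/grinder/lib/grinder.jar net.grinder.Console
```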

''Latest revision as of 02:08, 6 March 2008''


= Goals =
* Tests should be distributed, run from multiple nodes
* Tests should return req/s capability
* Tests should return metrics on HTTP error codes and/or success/failure rates
* Tests should test with cache and without -- and at varying cache-hit rates in between
* Tests should be configurable to have different levels of concurrency

