Update:Remora Load Testing
Revision as of 22:56, 4 October 2006
Goals
- Tests should be distributed and run from multiple nodes
- Tests should report req/s capacity
- Tests should report metrics on HTTP error codes and/or success/failure rates
- Tests should run with the cache, without it, and at varying cache-hit rates in between
- Tests should be configurable for different levels of concurrency
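As a sketch of the metrics side of these goals, the helper below aggregates per-request samples into req/s, status-code counts, and a success rate. The function name and the (status_code, elapsed_seconds) record shape are illustrative assumptions, not part of any agreed tooling:

```python
from collections import Counter

def summarize(results, wall_seconds):
    """Aggregate (status_code, elapsed_s) samples into the metrics the
    goals above call for: req/s, status-code counts, success rate.
    (Hypothetical helper -- names and record shape are assumptions.)"""
    codes = Counter(code for code, _ in results)
    successes = sum(n for code, n in codes.items() if 200 <= code < 400)
    total = len(results)
    return {
        "req_per_s": total / wall_seconds,
        "status_counts": dict(codes),
        "success_rate": successes / total if total else 0.0,
    }

# Example: three requests observed over a 2-second window.
stats = summarize([(200, 0.05), (200, 0.07), (500, 0.30)], wall_seconds=2.0)
```

Whatever tool produces the raw samples, reducing them to this shape is what lets runs at different concurrency levels be compared directly.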
Grinder
We concluded that Grinder fulfills these requirements, and excels in ways AB and httperf can't or don't.
The Guessing Game
The Grinder results will not be completely accurate; that is the nature of load testing. But with peak/off-peak load numbers we can estimate how far the results are skewed by external effects -- other apps and higher overall stress on shared resources.
We discussed gathering some cumulative NS/app/db stats to get a better handle on what our load-test numbers mean, and to gain some perspective on the margin of error.
mrz is going to give us some numbers based on cumulative Cacti results.
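One simple way to use numbers like the Cacti aggregates mrz will provide is to discount measured throughput by the headroom the shared boxes actually had during the run. A minimal sketch -- the linear-scaling assumption and all the figures are illustrative, not agreed methodology:

```python
def adjust_for_background_load(measured_rps, cpu_during_test, cpu_peak):
    """Crudely discount a load-test result for external load on shared
    hardware, assuming throughput scales linearly with leftover CPU.
    (Hypothetical helper; the linearity assumption is a simplification.)"""
    headroom_test = 1.0 - cpu_during_test
    headroom_peak = 1.0 - cpu_peak
    return measured_rps * (headroom_peak / headroom_test)

# e.g. the test ran with 40% background CPU; peak hours see 70%.
adjusted = adjust_for_background_load(300.0, cpu_during_test=0.40, cpu_peak=0.70)
```

Even a rough correction like this makes the margin of error explicit instead of implicit.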
Test Variance
- By concurrent requests
  - 100
  - 1000
  - 2000
- By multiple input vars (GET)
  - All unique vars, causing the cache-hit rate to be zero
  - A mixture of unique vars, possibly 50% duplicated
  - Only one URI across all requests
  - Otherwise differentiated by request URI
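The cache-hit variants above come down to how the GET vars are generated. A small sketch that builds a request list with a chosen duplication rate -- the base URL and the `v` parameter are placeholders, not real AMO endpoints:

```python
import random

def build_urls(n, duplicate_fraction, base="http://example.test/page"):
    """Build n GET URLs where roughly `duplicate_fraction` of them reuse
    one shared query value (cache hits) and the rest are unique (misses).
    `base` and the `v` parameter are made-up placeholders."""
    urls = []
    for i in range(n):
        if random.random() < duplicate_fraction:
            value = "shared"          # repeated var -> cacheable
        else:
            value = "unique-%d" % i   # unique var -> cache miss
        urls.append("%s?v=%s" % (base, value))
    return urls

random.seed(0)
urls = build_urls(1000, 0.5)  # roughly the "50% duplicated" middle case
```

Setting `duplicate_fraction` to 0.0 or 1.0 gives the all-unique and single-URI extremes in the list above.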
Pages We Want To Test
- Main page
- Search page
- Category listing
- Add-on main page
- Services
  - Update check
  - Blocklist
  - PFS
- RSS / Feeds
- Vanilla
  - Addon-specific discussion page
  - Top page
- Others? Mark? Shaver?