Foundation/Metrics/Optimizely Process
Adamlofting (talk | contribs)
Revision as of 15:59, 11 February 2014
This is our process for running front-end tests on Mozilla Foundation websites and tools. The process is new, so please give feedback on it and help us improve these guides and the process itself.
Golden rules for A/B testing
- Testing is continuous, it is never finished
- Be brave: You can always be testing something else, so focus on potential impact
- Traffic is opportunity. At any given time, we should have a test running on each of our sites and tools
- There are no fails. Every test is a learning opportunity (especially if it makes something worse)
- It's a team sport. Testing is only as good as the number of people who see the results
How to get something tested
- If you have quick ideas, dump them here: Webmaker test ideas Etherpad
- If you have something specific you definitely want 'actioned', scroll down and follow the guide on how to file the bug and fill out the report template
What tests are we running?
And what did we learn from previous tests?
- Webmaker Testing Hub
- Other team links to go here
How do we choose?
Choosing which test to run next is a judgement call combining the following:
- How long a test will take to run
- How difficult a test is to setup
- The potential impact of making the change
A formula to calculate this would be artificial as the impact of a test is hard to predict. Just keep these three things in mind and continually look to have the biggest impact. And when it's hard to decide, remember that testing anything is better than testing nothing.
Our process to run a test
- Open a Bugzilla ticket under the component the test is being run in to assign the test and track its progress
- Add [optimizely] to the ticket's whiteboard
- Estimate time to complete the test and record this in the ticket: Test duration calculator
- Make a copy of this Google doc template and fill out the content: Mofo A/B testing report template
- Add the URL of your test report into your ticket
- Add your test into the Webmaker testing hub
- Set up your test in Optimizely, or ask someone to do this
- Add the preview URL from Optimizely into the ticket for review:
- Technical review (make sure we haven't broken other things on the page)
- Content review, with design and copy people as appropriate
- Put the test live in Optimizely
- Update the status to live on the Webmaker Testing Hub
- People affected by testing should watch the wiki page for updates
- Announce new test to:
- webmaker on IRC
- webmaker@lists.mozilla.org
- Let the test run until results are statistically significant (Optimizely will do the calculations for you)
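Two steps above lean on standard statistics: estimating how long a test needs to run, and deciding when a result is statistically significant. Optimizely and the linked duration calculator handle this for you; the sketch below is only an illustration of the underlying math (a two-proportion z-test and a textbook sample-size approximation), not the exact formulas either tool uses.

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of "no difference"
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

def sample_size_per_variant(base_rate, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variant to detect a given relative lift
    (defaults: 95% confidence, 80% power)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# e.g. 200/5000 conversions on control vs 260/5000 on the variant
z, p = ab_significance(200, 5000, 260, 5000)
print(p < 0.05)  # True: the difference is significant at the 95% level

# e.g. visitors needed per variant to detect a 20% lift on a 4% base rate
print(sample_size_per_variant(0.04, 0.20))
```

Dividing the sample size by expected daily traffic gives a rough test duration in days. In practice, trust Optimizely's own significance numbers over a hand calculation like this.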
Concluding a test
- Some tests don't produce significant results. If this happens, don't be afraid to close the test. There is always something else we can be testing that might have more impact.
- Add the results and Optimizely screenshots into your test write-up
- Move your test into the 'Completed Tests' section in the Webmaker testing hub
- Announce the write-up to:
- webmaker on IRC
- webmaker@lists.mozilla.org
- Close the ticket
- Share this on the next weekly team call and get peer review of your conclusions
Weekly updates
In the weekly cross-team calls, include a short segment on testing:
- Announce any tests closed in that week and link to the write up for peer-review
- Remind people to log new test ideas, and discuss whether any planned tests need priority