CloudServices/SoftRelease
Latest revision as of 16:59, 12 December 2014

Overview

Firefox Mobile and Firefox Desktop both follow a release cycle that makes it hard to ship experimental features, try out small changes on a subset of users, or ramp up a new feature gradually to avoid huge peaks on our infrastructure.

The SoftRelease service offers a way to ramp up or bucket-test a new feature shipped in Firefox Mobile or Desktop.

Example use cases:

  • ramping up Firefox Hello by making it accessible to 10% of the user base and growing it to 100% once we are confident that the server infrastructure works well.
  • activating a new feature for specific regions in Firefox.
  • making small UI variations, like what the Mozilla Foundation did for its "End of the Year" campaign; see https://fundraising.mozilla.org/testing-testing-and-more-testing/

General Principle

When Firefox Mobile or Desktop starts, it sends a request to the SoftRelease service to ask whether a feature should be activated.

The proposed API is a single HTTP endpoint that contains the name of the feature:

GET https://features.services.mozilla.com/<feature_name>

{'enabled': true}

The server analyzes the client IP and specific headers, such as the User-Agent, and returns a JSON mapping containing the answer.

When the enabled key is sent back, the client gets a YES/NO answer and acts upon it. For example, for Firefox Hello, the decision to display the Hello button or not could be done by this call:

GET https://features.services.mozilla.com/hello
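As a sketch of the client side, the check could look like the following (the function name and the "missing key means off" default are hypothetical; only the `enabled` key comes from the proposal):

```python
import json

def should_show_hello(response_body):
    """Decide whether to display the Hello button from the raw JSON body
    returned by GET https://features.services.mozilla.com/hello."""
    data = json.loads(response_body)
    # Treat a missing or false 'enabled' key as "feature off".
    return bool(data.get("enabled", False))

print(should_show_hello('{"enabled": true}'))   # True
print(should_show_hello('{"enabled": false}'))  # False
```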

In other cases, the feature is activated but we want to ship different versions of it. It is then preferable to let the client decide what to do, given a list of option values sent back by the server.

For example, in a campaign page, the UI has a button with two properties that may vary amongst users: its color and its text. The client can call the server to get back the values to use:

GET https://features.services.mozilla.com/campaign-2015

{'color': '#ff0000',
 'value': 'Click Here'}
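One way the client could merge these server-chosen values over its built-in defaults — a minimal sketch; `apply_options` and the default values are assumptions, not part of the proposal:

```python
import json

def apply_options(response_body, defaults):
    """Overlay the option values chosen by the server on top of the
    client's built-in defaults, keeping any defaults the server omitted."""
    options = dict(defaults)
    options.update(json.loads(response_body))
    return options

button = apply_options('{"color": "#ff0000", "value": "Click Here"}',
                       {"color": "#0000ff", "value": "Donate"})
print(button["color"])  # #ff0000
print(button["value"])  # Click Here
```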

When the client wants to get several features at once, it can batch its requests by calling the root endpoint:

GET https://features.services.mozilla.com?features=campaign-2015,hello

{
 'hello': {'enabled': true}, 
 'campaign-2015': {'color': '#ff0000', 'value': 'Click Here'}
}
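A hypothetical client-side helper for splitting the batched answer back into per-feature settings (treating a feature the server omitted as disabled is an assumption, not specified by the proposal):

```python
import json

def parse_batch(response_body, requested):
    """Split a batched SoftRelease response into per-feature settings,
    assuming a feature the server omitted should stay disabled."""
    data = json.loads(response_body)
    return {name: data.get(name, {"enabled": False}) for name in requested}

body = ('{"hello": {"enabled": true},'
        ' "campaign-2015": {"color": "#ff0000", "value": "Click Here"}}')
settings = parse_batch(body, ["hello", "campaign-2015"])
print(settings["hello"]["enabled"])        # True
print(settings["campaign-2015"]["color"])  # #ff0000
```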

Dashboard

To manage the responses, the service provides a dashboard where admins can:

  • add or remove a feature name
  • list the different possible responses that can be returned to a user
  • configure a policy for the server to decide which response to return

We provide three policies:

  • Weighted: defines a percentage for each possible response.
  • Geolocation: associates regions to possible responses.
  • User-Agent: associates User-Agents to possible responses.

Policies can be combined.
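How the Weighted policy could pick a response server-side — a minimal sketch assuming a mapping of response names to percentages (the names and data format here are illustrative, not the service's actual configuration):

```python
import random

def pick_weighted(responses, rng=random):
    """Pick one response name according to its percentage weight.
    'responses' maps a response name to its weight, e.g. {'on': 10, 'off': 90}."""
    names = list(responses)
    weights = [responses[name] for name in names]
    return rng.choices(names, weights=weights, k=1)[0]

# A 10% ramp-up: roughly one request in ten gets the feature.
print(pick_weighted({"on": 100}))  # on  (a 100% weight is deterministic)
print(pick_weighted({"on": 10, "off": 90}) in ("on", "off"))  # True
```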

Examples

Example 1

We want to try out a new donation campaign UI for French users. For them, we want to set the button color to green; for all other users, blue.

Steps:

  1. We add a new "campaign-button-color" feature via the dashboard.
  2. We add the two possible responses:
     Green => {'color': 'green'}
     Blue => {'color': 'blue'}
  3. We associate each response with a Geolocation policy:
     France: Green
     default: Blue
  
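The steps above amount to a lookup from the country derived from the client IP to a response — a hypothetical sketch; the dashboard's real configuration format is not specified here:

```python
def resolve_geo(country, policy, default):
    """Return the response associated with the client's country,
    falling back to the default response."""
    return policy.get(country, default)

policy = {"FR": {"color": "green"}}   # France -> Green
default = {"color": "blue"}           # everyone else -> Blue

print(resolve_geo("FR", policy, default))  # {'color': 'green'}
print(resolve_geo("DE", policy, default))  # {'color': 'blue'}
```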

Metrics

Collecting metrics during A/B testing is important for following and understanding the impact of the different versions of a feature.

The proposed service does not provide any server-side metrics, but returns a unique id for each combination returned for a given feature.

For example:

GET https://features.services.mozilla.com/<feature_name>

{'enabled': true, 'id': '4fa1d44e-2f9d-4cd3-a660-85e892c0ace9'}

The id is guaranteed to stay unique and consistent and can be used by the application to track the different combinations.
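Client-side, the application could attach that id to the events it reports, so each metric can be attributed to the combination that was served — a hypothetical sketch; the event format is not part of the proposal:

```python
import json

def record_combination(response_body, events):
    """Remember which combination id the server handed out, so later
    metric events can be attributed to it."""
    data = json.loads(response_body)
    events.append({"combination": data["id"], "event": "feature_served"})
    return data["id"]

events = []
cid = record_combination(
    '{"enabled": true, "id": "4fa1d44e-2f9d-4cd3-a660-85e892c0ace9"}', events)
print(cid)          # 4fa1d44e-2f9d-4cd3-a660-85e892c0ace9
print(len(events))  # 1
```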

Related Works

There are two related works at Mozilla:

  • The Campaign Manager - https://wiki.mozilla.org/CloudServices/Roadmaps/Campaign-Manager
  • Abatar - https://github.com/dannycoates/abatar

The Campaign Manager focuses on targeted messaging for Android users, but its principle is quite similar to the current proposal: the client asks the server for messages to display, and that message may vary from one client to another.

Abatar is a client-side only A/B testing framework: all the possible combinations are stored in the client and the decision is made with the execution context.

Abatar plans to add a server side at some point, which will serve JavaScript files containing the combinations, but the decision will still be made by the client.

For practical reasons, a project using Abatar will want to own the release and publishing process of all its JavaScript files, so delegating this to a service like the one described in this document would add too much complexity.