Identity/Persona-Analytics

== Overview ==
{{LastUpdated}}
The Persona team uses metrics to make better decisions and build stronger products.


== Internal (Private) Metrics ==
We have two dashboards that support the #signin team (LDAP required):
* [https://kpi-dashboard.personatest.org/ KPI dashboard] (Maintained by Identity team)
* [https://metrics.mozilla.com/pentaho/content/pentaho-cdf-dd/Render?solution=metrics2&path=identity&file=identity.wcdf Identity dashboard] (Maintained by Metrics team)
==== Privacy ====
Like all of us at Mozilla, we take user privacy seriously. We only collect the data that we use for decision making, and we take care to remove personally identifying information from that data before it reaches these dashboards. While we aim to be open in our decision making, our metrics dashboards that make use of production data are currently private.


== Public Metrics ==
 
* Exploratory project -- based on github search results: http://www.arewepopularyet.com/
* Persona related metrics collected by other sites: [[/ThirdPartyMetrics | Third party metrics]]


== Stakeholders ==
* Project Management: Are we growing? Are we succeeding at our goals? Do we have RPs in the pipeline? Are we converting them?
* UX: Do users proceed successfully through the dialog? Does UX A vs UX B improve our bounce rate?
* Developers/QA: What's breaking? Are the dialog response times acceptable? How frequent is a particular bug/issue?


Note: "Is the service up and functioning properly?" is not a question answered by these dashboards -- that is handled by monitoring, which is a separate function.
Note: "Is the service up and functioning properly?" is not a question answered by these dashboards -- that is handled by monitoring, which is a separate function.
Line 32: Line 26:


== Projects ==
A list of all Persona Metrics projects we'd like to do someday:


{| class="fullwidth-table"
! Task !! Description !! Crew !! Target
|-
|Two Metrics That Matter - KPI Dashboard
|Two metrics that matter: one for RP adoption and one for user adoption. These metrics should be accessible in a usable (intelligibly badass) dashboard.
|John Gruen (UX and dashboard coding), Ryan Feeley (UX), Katie Parlante (dashboard coding, lead), Shane Tomlinson (browserid data source)
|Postponed
|-
|RP A/B Testing
|Co-ordinating A/B testing hosted by RPs. Tests placement of initial Persona buttons/links/introductory text.
|Ryan Feeley, John Gruen, Dan Callahan, Katie Parlante
|Postponed
|-
|Dialog A/B Testing
|Test different options on the first screen, optimizing to minimize bounce rate. Create our own infrastructure to run tests, as we don't want to use 3rd party options in our dialog.
|Shane Tomlinson (browserid infrastructure), Katie Parlante (results display)
|Unscheduled
|-
|Move KPI infrastructure to Ops team
|Host KPI servers in VPC, instead of on ephemeral instances maintained by dev.
|Gene Wood, Katie Parlante
|Postponed
|-
|Self service access to data
|Use RabbitMQ or Heka as transport, route to an Elasticsearch backend, and use Kibana (or similar) as the frontend (a rough sketch of the indexing step follows the table).
|
|Unscheduled -- prototype might be good freaky project
|-
|System architecture cleanup
|Replace HTTPS (transport) with Heka/RabbitMQ and CouchDB (data store) with Elasticsearch.
|
|Unscheduled
|-
|Dashboard Improvements
|Performance visualizations, email verification stats, etc.
|
|Unscheduled
|}
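
The "Self service access to data" and "System architecture cleanup" rows share the same last hop: event documents end up in Elasticsearch so that Kibana (or a similar frontend) can explore them. The snippet below is only a minimal sketch of that last hop, not existing browserid or kpiggybank code; the Elasticsearch URL, index name, and event fields are illustrative placeholders.

<pre>
# Minimal sketch: bulk-index JSON KPI events into Elasticsearch so that
# Kibana (or a similar frontend) can explore them. ES_URL, the index name,
# and the sample event fields are illustrative placeholders, not real
# Persona event names.
import json
import urllib.request

ES_URL = "http://localhost:9200/_bulk"  # assumed local Elasticsearch

def bulk_index(events, index="persona-kpi"):
    """Send a list of event dicts to Elasticsearch via the bulk API."""
    lines = []
    for event in events:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(event))
    body = ("\n".join(lines) + "\n").encode("utf-8")
    req = urllib.request.Request(
        ES_URL, data=body,
        headers={"Content-Type": "application/x-ndjson"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    sample_events = [
        {"event": "dialog_shown", "timestamp": "2013-11-22T20:32:00Z"},
        {"event": "user_signed_in", "timestamp": "2013-11-22T20:33:10Z"},
    ]
    print(bulk_index(sample_events))
</pre>

Whichever transport is chosen (RabbitMQ, Heka, or a plain script), only the producer side changes; the indexing step and the Kibana dashboards on top of it stay the same.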

== KPI Dashboard ==
* Privacy Reviews:
** review kickoff for RP/email verification features: https://bugzilla.mozilla.org/show_bug.cgi?id=909980
* UX Redesign (TMTM): https://github.com/johngruen/kpi_des
=== User Stories ===
 
Top Priorities:
* As a UX designer, I want to know if new users are making it through the dialog and successfully signing in/up. [ new user views: funnel, etc. ]
* As a Product Manager, I want to know how well we're doing at acquiring new users and RPs. [ cohort/funnel -- see the sketch below ]
* As a UX designer, I want to make a change to the UX and know if it improves the user's experience. [ A/B testing -- see the bucketing sketch at the end of this section ]
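
A minimal sketch of the funnel question in the first two stories: given an ordered list of events per session, count how many sessions reached each step of the sign-in dialog in order. The step names are illustrative placeholders, not the actual KPI event names.

<pre>
# Minimal funnel sketch: count how many sessions reached each step, in order.
# The step names are illustrative placeholders, not real Persona event names.
FUNNEL_STEPS = ["dialog_shown", "email_entered", "password_entered", "signed_in"]

def funnel_counts(sessions):
    """Return a dict mapping each funnel step to the number of sessions that reached it."""
    counts = {step: 0 for step in FUNNEL_STEPS}
    for events in sessions:
        position = 0
        for event in events:
            if position == len(FUNNEL_STEPS):
                break
            if event == FUNNEL_STEPS[position]:
                counts[FUNNEL_STEPS[position]] += 1
                position += 1
    return counts

if __name__ == "__main__":
    sample_sessions = [
        ["dialog_shown", "email_entered", "password_entered", "signed_in"],
        ["dialog_shown", "email_entered"],
        ["dialog_shown"],
    ]
    for step, count in funnel_counts(sample_sessions).items():
        print(step, count)
</pre>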
 
Also on the horizon:
* As a developer, I want to know the rate at which users run into error screens.
* As a developer, I want to know if users experience performance problems with the service.
* As a product manager, developer or UX designer, I want the dashboard to be reliable and updated regularly. [move to full production setup, etc.]
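
The A/B-testing story above (and the "Dialog A/B Testing" project in the table) comes down to deterministically splitting users into variants and then comparing completion and bounce rates per variant. Below is a minimal sketch of one common bucketing approach, hashing a stable identifier together with an experiment name; the function and experiment names are hypothetical, not existing browserid code.

<pre>
# Minimal sketch of deterministic A/B bucketing: hash a stable identifier
# together with an experiment name so the same user always sees the same
# variant. The names here are hypothetical, not existing browserid code.
import hashlib

def assign_variant(user_id, experiment, variants=("A", "B")):
    """Deterministically map a user to one of the variants."""
    digest = hashlib.sha256(("%s:%s" % (experiment, user_id)).encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

if __name__ == "__main__":
    # The same identifier always gets the same variant for a given experiment,
    # so results can be aggregated per variant without storing assignments.
    print(assign_variant("session-1234", "dialog-first-screen"))
    print(assign_variant("session-1234", "dialog-first-screen"))
</pre>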


=== Code & Issue tracking ===


=== Documentation ===
* https://www.lucidchart.com/documents/view/42df-90fc-5240a53c-8ea1-5c2f0a0089aa (Diagram)
* https://wiki.mozilla.org/Identity/BrowserID/KPI_Dashboard (a bit out of date right now, but the overall picture is right)


== Identity Dashboard ==
Open bugs:
*[https://bugzilla.mozilla.org/show_bug.cgi?id=935709 Identity Dashboard missing data starting 10/29]
*[https://bugzilla.mozilla.org/show_bug.cgi?id=885423 Screen out testing RPs]
== People/Project contacts ==
* Katie Parlante is driving this project and maintaining the KPI dashboard
* Shane Tomlinson is the main point of contact for KPI work in the browserid codebase
* Ryan Feeley and Hannah Quay-de la Vallee are currently the primary "clients" of the KPI dashboard -- the goal is to support UX
* The metrics team maintains the "Identity" dashboard
