Identity/Firefox Accounts/Minimum Viable Metrics

Last updated: 2014/09/02

Minimum set of metrics required before going live with real accounts. The assumption here is that the first accounts will be for Marketplace and WMF, on FxOS and Web. That may change given the dates for FxOS 1.4, but it's a reasonable place to start.

Bugs & Status Summary


Proposal - required

  •  % completion rate for FTU sign up flow
  •  % completion rate for FxA creation (regardless of origin)
  • basic FxA event metrics (e.g. new accounts per day, email verifications per day, etc.)
  • basic fraud detection related data collection
  • self service access to all high level "metrics" data from Kibana (or similar, if need be)
  • Usable "Dashboard" for the key metrics (either customized in Kibana or some other solution)

Proposal - might be required

  • Performance data collected and accessible in "self serve" mode
  • Fraud detection data routed somewhere sensible (e.g. CEF logging routed to Arcsight)

Proposal - design for and log data for

  •  % completion rate for all flows
  • drop off rates for all flows
  • segmentation of flows by: device (FxOS, Android, Desktop, Web, etc.), service where flow originated (FTU, Marketplace, FxOS Settings, etc.), locale
  • "meta" flows that may span devices (e.g. create account --> email verification --> successful use of a service at some later point in time)

Probably not in MVM

  • data routed from the Persona verifier.

Constraints/Assumptions for MVM

  • No telemetry from the device/client, no separate API call just for metrics
  • No collaboration with other metrics efforts (like Marketplace)
  • Metrics may request additional info (optional?) be sent with existing API calls to get the segmentations above (though we may be able to avoid this)
  • Use existing logging infrastructure (Heka + ElasticSearch + Kibana)

Milestones and Roadmap

While this plan is still valid, we veered off this roadmap after FxOS slipped and Sync took priority.

Milestones for MVM

  • 0. Events, flows, segmentations proposed; GitHub and Bugzilla issues logged (November 2013)
  • 1. End to End Proof of concept -- self service access to some basic metrics + FTU flow data. (December 2013)
  • 2. Nail down all events to be logged for flows, basic metrics & fraud detection.
  • 3. All data logged and routed to db.
  • 4. Key dashboard customizations completed.
  • 5. Performance and Fraud detection work completed.
  • 6. "Bonus" customizations completed.

Roadmap Post MVM (some tasks for these items might be slotted earlier)

  • Verifier counts (data routed from Persona verifier)
  • A/B testing for Web/Desktop flows
  • Use metrics on staging to observe community test coverage
  • Flows for other devices and services as they come on line
  • Cohort analysis (define a "cohort" by account creation date, or device/service origin, and compare activities based on cohort)
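The cohort idea above can be sketched in a few lines. This is an illustrative Python snippet, not part of the FxA codebase; the account records and field names are hypothetical.

```python
# Bucket accounts into cohorts keyed by creation month (YYYY-MM), so that
# later activity can be compared across cohorts. The "created" field is an
# assumed ISO date string on each account record.
from collections import defaultdict

def cohorts(accounts):
    """Group account ids by the YYYY-MM prefix of their creation date."""
    buckets = defaultdict(list)
    for acct in accounts:
        buckets[acct["created"][:7]].append(acct["id"])
    return dict(buckets)

accounts = [
    {"id": 1, "created": "2013-12-01"},
    {"id": 2, "created": "2013-12-15"},
    {"id": 3, "created": "2014-01-03"},
]
by_cohort = cohorts(accounts)  # e.g. {"2013-12": [1, 2], "2014-01": [3]}
```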

Basic & Flow Event Logging Design

We can model the info we want for MVM by collecting counts of events, where each event has "facets" that allow us to segment it in different ways. Flows are just one more segment. Each event is logged with a timestamp. (The timestamp can be fuzzed to 10-minute intervals -- really the smallest level of granularity we'll need is by day.)
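As a rough sketch of that model, the following Python snippet builds a faceted event record with a fuzzed timestamp. The function names and facet keys are illustrative assumptions, not the actual FxA logging API.

```python
# Sketch of faceted event logging with 10-minute timestamp fuzzing.
import json
import time

FUZZ_SECONDS = 600  # 10-minute buckets; daily granularity is all we need

def fuzz_timestamp(ts):
    """Round a Unix timestamp down to the nearest 10-minute interval."""
    return int(ts) - (int(ts) % FUZZ_SECONDS)

def make_event(name, flow=None, service=None, device=None, locale=None, ts=None):
    """Build one event record carrying its segmentation facets."""
    return {
        "event": name,
        "time": fuzz_timestamp(ts if ts is not None else time.time()),
        "flow": flow,        # e.g. "Create Account"
        "service": service,  # e.g. "FTU", "Marketplace"
        "device": device,    # e.g. "FxOS 1.4"
        "locale": locale,
    }

evt = make_event("account created", flow="Create Account",
                 service="FTU", device="FxOS 1.4", locale="en-US",
                 ts=1386000123)
print(json.dumps(evt))
```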

To get a flow drop-off visualization, we display the counts that made it to each "event" of the flow for that day.
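A minimal sketch of that computation, assuming the per-event counts for a day have already been aggregated out of the log store; the step names below are placeholders for whatever events a given flow logs.

```python
# Compute per-step completion for one day of a flow, relative to the first
# step, from a {event_name: count} mapping.
FLOW_STEPS = [
    "account creation started",
    "account created",
    "email verification attempted",
    "email verified",
]

def drop_off(counts):
    """Return (step, fraction-of-starters) pairs for each step of the flow."""
    start = counts.get(FLOW_STEPS[0], 0)
    if start == 0:
        return []
    return [(step, counts.get(step, 0) / start) for step in FLOW_STEPS]

day = {
    "account creation started": 1000,
    "account created": 800,
    "email verification attempted": 750,
    "email verified": 600,
}
# drop_off(day) yields 1.0, 0.8, 0.75, 0.6 for the four steps
```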

Basic high level events

For tracking general adoption and use of service

  • account creation attempted/started
  • account created
  • device attachment attempted/started
  • device attached
  • email verifications attempted (successful and unsuccessful)
  • certificate signed
  • assertion verified (observed by the verifier, not auth server)
  • password changed (more than one unique event here?)
  • account destroyed

Segment by (facets)

  • time event is logged (fuzzed to the nearest 10 minutes before landing in ElasticSearch)
  • flow
    • "Create Account"
    • "Attach Device"
    • "Forgot Password"
    • "Change Password"
    • "SSO Create Account" (distinct because doesn't complete email verification loop)
    • "Force Authentication"
  • context/origin/service
    • "FTU"
    • "Settings"
    • "Marketplace"
  • device
    • "FxOS + version"
    • "Android + version"
    • browser user agent
    • desktop
  • locale
  • time since flow started
  • flow start date (used to bucket events in flows that span days -- nice to have)
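To show how the facets above support segmentation, here is a small stand-in for a Kibana/ElasticSearch terms aggregation, grouping event counts by one facet. The sample events and facet values are illustrative.

```python
# Count events grouped by a single facet value -- the in-memory equivalent of
# a terms aggregation over the event log.
from collections import Counter

def segment(events, facet):
    """Return {facet_value: count} over a list of event records."""
    return Counter(e.get(facet, "unknown") for e in events)

events = [
    {"event": "account created", "device": "FxOS 1.4", "locale": "en-US"},
    {"event": "account created", "device": "Android 4.4", "locale": "en-US"},
    {"event": "account created", "device": "FxOS 1.4", "locale": "fr"},
]
by_device = segment(events, "device")  # {"FxOS 1.4": 2, "Android 4.4": 1}
```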

Nice to have/future

  • region (?) (QA/Mozilla Operations Center wants this segmentation on perf numbers, for example)
  • account creation date (allows fancy cohort analysis)


  • "Create Account"
  • "Attach Device"
  • "Forgot Password"
  • "Change Password"
  • "SSO Create Account" (distinct because doesn't complete email verification loop)
  • "Force Authentication"

Fraud Detection Logging

Initial conversation with warner and ckarlof.

helpful links:

Performance Metrics

TBD, requirements mentioned:

  • As a QA/MOC I need the exact number of failed (error) sign-ins per day so that I can be confident that our deployment is successful.
  • As a QA/MOC I need the number of long load times (over threshold) by region so that we know our performance is providing a good UX.
  • As a QA/MOC I need the number of user timeouts by region so that we know our performance is providing a good UX.