Privacy/Reviews/Telemetry/Default Browser Status

''Briefly state the problem to be solved by collecting this type of data: a question we have and how the answer can make our users' web experience better. The product or engineering contact should fill this out.''


What differentiates new users who quit using Firefox because they didn't like it from those who use it regularly?
The AMO ping (updates to add-ons) and the blocklist ping (updates to the browser, including security updates) are each sent once a day.
For users for whom Firefox is not the default browser, how often do they use Firefox? How many days will it take until they see security updates?


On what measurements do successful features differ from features that don't get traction?
Is browser uptime for 'non-default' users long enough for Firefox to send the AMO/blocklist ping? If it isn't (for some percentage of users), the ping won't go through and those users won't see the security update.
 
And what metrics differentiate users who adopt quickly from those who don't?
 
 
Knowing whether or not Firefox is the default browser (e.g. if it is not, the user possibly doesn't use Firefox much) would help provide
answers to these questions. This measurement by itself is not enough, but it provides context in the form of metadata.
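As a back-of-the-envelope illustration of the ping question above (the model here is an assumption for illustration, not part of the review): if a non-default user launches Firefox on a fraction ''p'' of days, and the daily AMO/blocklist ping can only be sent on a day the browser actually runs, then the wait until an update reaches that user behaves roughly like a geometric random variable with mean 1/''p''.

```python
# Illustrative model only (an assumption, not how Telemetry itself works):
# treat each day as an independent trial in which the user runs Firefox
# with probability p, so the wait for the daily blocklist ping to fire
# is geometric with expectation 1/p.

def expected_days_until_update(daily_use_probability: float) -> float:
    """Expected number of days until the daily ping is sent, assuming
    independent daily usage with the given probability."""
    if not 0 < daily_use_probability <= 1:
        raise ValueError("daily_use_probability must be in (0, 1]")
    return 1 / daily_use_probability

# A user who opens Firefox every other day waits two days on average:
print(expected_days_until_update(0.5))  # → 2.0
# A once-a-week user waits about a week on average:
print(expected_days_until_update(1 / 7))
```

Under this toy model, even a modest drop in daily usage stretches the window during which a non-default user runs an unpatched browser, which is why the usage frequency of non-default users matters for the questions above.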


== Measurement to Collect ==