In brainstorming, it quickly became clear that there are many metrics we could usefully track to help us get and keep people engaged in our development process. In the interest of starting small and iterating quickly, I propose the following:
We care about contributions. The hope is to notice when an individual's contribution numbers go up or down significantly, so that we can ask contributors specific questions: what motivated the change, and, if they're contributing less, what it would take for us to interest them in contributing more.
Patches landed per person per month
I want to start by tracking the number of patches landed in Mercurial per person per month, in large part because landed patches are an extremely leveraged way to contribute.
Action: Review once per month to see if anyone has had a particularly long or precipitous drop in contribution. If so, contact them both to demonstrate that we care, and to understand more about their motivations & situation and see if there are things we can do to help reengage them.
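As a rough sketch of how this count could be produced, the function below tallies patches per (author, month) from lines in an "email|date" format, such as one could generate from a local clone with `hg log --template "{author|email}|{date|shortdate}\n"`. The function name and input format here are assumptions for illustration, not an existing tool.

```python
from collections import Counter

def patches_per_person_per_month(log_lines):
    """Count landed patches per (author, month).

    Each line is expected to look like "author@example.com|2011-03-15",
    e.g. as produced by:
        hg log --template "{author|email}|{date|shortdate}\n"
    run against a local Mercurial clone (format is an assumption).
    """
    counts = Counter()
    for line in log_lines:
        line = line.strip()
        if not line:
            continue
        author, date = line.split("|")
        month = date[:7]  # keep "YYYY-MM"
        counts[(author, month)] += 1
    return counts
```

A monthly review could then compare each author's count against the previous month to flag precipitous drops.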
Over time, I expect a more general "contributions per person per month" metric would make sense, including some or most of:
- addon link with ui-review request (design)
- bugzilla change (qa)
- comment on bug assigned to self (dev)
- getsat response (support)
- checkin-needed push (dev)
- mailing list post
- forum post
- wiki post
A separate list of possibilities specifically for devs, from Standard8:
- Patches submitted (perhaps noting re-submissions) and patches with review requested
- Patches reviewed
- Comments submitted
- New bugs submitted
New contributors per month
This is very strategic, because growing the community allows us to do more.
Action: review once a month to check for issues and understand our trajectory. Various things we do will attempt to push this metric upwards. Also, start reaching out to new contributors to ask what their pain points were, and what would motivate them to contribute more.
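One simple way to derive this metric from the same contribution data: treat the month of a person's first recorded contribution as the month they became a contributor. The sketch below assumes an iterable of (author, date) pairs; the function name and input shape are illustrative, not an existing tool.

```python
from collections import defaultdict

def new_contributors_per_month(contributions):
    """contributions: iterable of (author, "YYYY-MM-DD") pairs.

    Returns {"YYYY-MM": count of authors whose first recorded
    contribution falls in that month}. Input need not be sorted.
    """
    first_month = {}
    for author, date in contributions:
        month = date[:7]
        # Keep the earliest month seen for each author.
        if author not in first_month or month < first_month[author]:
            first_month[author] = month
    counts = defaultdict(int)
    for month in first_month.values():
        counts[month] += 1
    return dict(counts)
```

Note that this only sees contributors as "new" relative to however far back the data goes, so the first month or two of results will overcount.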
Reviews (for both MailNews Core and Thunderbird products)
If someone goes to the trouble of writing a patch and submitting it for review, and then the review doesn't happen in a timely manner for whatever reason, that can be extremely demotivating. As a result, I think it's worth focusing some energy here. The most interesting metric, I suspect, is the median time for r+ / r- to be granted (along with the standard deviation).
Action: review regularly, notice issues, try initiatives to drive downwards over time.
It could be interesting to facet and/or correlate these numbers by patch size, requestor, requestee, and Bugzilla component as well.
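Assuming the review-flag request and grant dates can be pulled from Bugzilla separately, the latency statistics themselves are straightforward to compute. The sketch below takes (requested, granted) date pairs, one per granted r+ or r-; the function name and input format are assumptions for illustration.

```python
from datetime import datetime
from statistics import median, stdev

def review_latency_days(review_pairs):
    """review_pairs: iterable of (requested, granted) "YYYY-MM-DD" pairs,
    one per review flag (r+ or r-) actually granted.

    Returns (median_days, stdev_days); stdev is 0.0 with fewer than
    two data points, since a sample stdev needs at least two.
    """
    fmt = "%Y-%m-%d"
    delays = [
        (datetime.strptime(granted, fmt) - datetime.strptime(requested, fmt)).days
        for requested, granted in review_pairs
    ]
    spread = stdev(delays) if len(delays) > 1 else 0.0
    return median(delays), spread
```

Faceting by patch size, requestor, requestee, or component would then just mean grouping the pairs before calling this.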