Contribute/Audit

From MozillaWiki

To help inform efforts to grow Mozilla and improve the contributor experience, the Contributor Engagement team worked with David Eaves in late 2011 to assess the experience of volunteers at Mozilla.

Purpose

The goal of this audit is to assess Mozilla's community engagement capacity – particularly in light of a renewed effort to radically grow the community's size. As such, this audit seeks to propose ways to assess the size, participation, retention and efficacy of volunteer contributors, as well as to identify places where high transaction costs could be reduced, across the various projects at Mozilla.

High level findings

Role of the Community is Unclear to Many Staff

Presently, a majority of staff believe volunteer contributions are more important to Mozilla in general than in their own area of work, suggesting they don’t know if and when – or perhaps how – to engage volunteers. There was a time when the community was essential to the core mission of Mozilla; at that time, the browser could literally not be coded without volunteers. This is simply no longer true for a growing number of groups at Mozilla. Consequently, there is confusion about when and how to engage community volunteers – especially when community engagement conflicts with other goals (such as speed).

Positive Community Experiences Foster Stronger Community Engagement

In short, community engagement matters. Among staff, there is a strong correlation between positive experiences in the community and a range of other good outcomes (likelihood of on-ramping, valuing the community, etc.). The opposite is true for those with negative experiences. Meanwhile, among volunteers, the impact is real: those with recent negative experiences are significantly more likely to report leaving the contributor community.

Lack of Data Impedes Audit, Planning and Management

Conducting an audit of Mozilla’s community engagement capacity is almost impossible since at almost every level, volunteer contributions at Mozilla are not measured. This lack of data has several impacts, of which two major ones stand out:

  • Management and staff cannot be evaluated on volunteer engagement.
  • Groups often have little information to help create engagement strategies.

In addition, the lack of data made it impossible to conduct a quantifiable audit of participation at Mozilla; as a result, this report relies on qualitative analysis, which is less reliable and cannot be updated easily and regularly.

Lack of Community Engagement Vision Hurts Mozilla

There are virtually no corporate-wide systems or tools to support community engagement. As a result, staff and volunteers are unsure how to engage one another. For example:

  • A majority of volunteers don’t believe or are unsure if their contributions have real impact.
  • Many volunteers are unsure of how decisions at Mozilla are made.

High level recommendations

A corporate strategy for volunteers

The Steering Committee should lay out broad goals and expectations around volunteers at Mozilla, including:

  • Guidelines for when to engage volunteers and when not to;
  • Ensuring decision-making processes are clear to Mozilla volunteers and staff;
  • Broad goals around community growth and size (this has been done).

Push the Strategy Down to the Operational Level

Require Directors to create annual and quarterly goals for community growth, retention, participation and effectiveness. Plans should outline:

  • How volunteers’ participation will integrate into a group’s workflow and help achieve strategic goals;
  • A strategy for managing and coordinating volunteer contributions;
  • Tools and skills volunteers and staff need to support working together;
  • Recruitment strategy.

Most importantly, Directors should be made accountable for the size and participation rate of their volunteer community.

Measure Engagement

To assess engagement capacity, Mozilla needs some basic data about engagement. Each Mozilla group should develop a plan to measure participation and effectiveness. Such a plan should address the following points:

  • A map of activities volunteers can engage in;
  • How it will measure the number of contributors involved in these tasks;
  • How it will measure the average response time a contributor waits for any decision from Mozilla;
  • Its processes and documentation for on-ramping;
  • A system for determining when a contributor stops participating, and a process for asking why;
  • A system for thanking/acknowledging contributors.

The underlying assumption for these measures is that architecting for and measuring participation will allow Mozilla to do more. If this is not the case, then we should rethink our branding around volunteerism and engagement.

Presentation video and slides

Survey data

The audit collected data from both paid and unpaid contributors. The results of those surveys are available for review and further analysis.

  • All hands community survey data
  • Volunteer Survey data

Feedback

Please provide any thoughts, comments, analysis or suggestions about this on the Mozillians list.

Related blog posts

Next Steps

Plans for addressing the recommendations made here are covered in the etherpad at: https://yummy.etherpad.mozilla.org/audit-nextsteps