Foundation:Planning:Education

2,310 bytes removed, 14:58, 9 January 2009
Re-arranged sections, split out checkpoints into separate section, moved evaluation to a separate page
Our plan is to track and participate in these projects where we can add value, in order to take advantage of general resources that can be leveraged in a Mozilla-specific context, make contacts and alliances that might be useful to Mozilla, and promote the general topic of open source and education. We have already agreed to sit on the advisory committee of FOSS Education at Oxford.
 
== Checkpoints ==
 
It is important to get early results on the progress of initiatives and avoid expending time and resources on unproductive activities. We therefore plan to have quarterly checkpoints throughout 2009 at which progress will be assessed and decisions made about the future course of individual experiments.
 
The current list of checkpoints is given below; note that these should be read in conjunction with the planned activities for each quarter as listed in the roadmap section. Checkpoint activities should be completed by the dates given.
 
=== March 30, 2009 (end of Q1) ===
 
* Do we still have confidence in proposed activities now that we have plans and budgets in place? If 'no' for any activity, kill it.
* Do we have agreement on plans and institutional commitments from Seneca and URJC? If 'no', ask why and potentially reconsider partnership.
* Do we have people, content and web infrastructure ready to roll for at least one Mozilla Community Course? If 'no', is the problem one of resources or enthusiasm for the project? If enthusiasm, drop this idea.
* Are we able to gather compelling and useful content for EMO? If 'no', what is missing? Where do we need to look? Is the idea viable?
* Do we have shared versions of basic Seneca tools for use by others, especially prototype student project pool web site and updated Real World Mozilla course?
 
=== June 30, 2009 (end of Q2) ===
 
* Is there interest in using the materials and infrastructure that Seneca is sharing? Are Seneca and Mozilla happy with what has been produced?
** In particular, do we see early evidence that students, professors and mentors are planning to use these resources in the 2009/2010 academic year? If 'no', is this an issue of promotion or interest?
* Do we have the materials and people in place to have a successful first course at URJC? Are students registered?
* Did people participate in the first Mozilla community courses? Did the participating Mozilla projects get useful outcomes? If 'no', do we need to improve our approach or kill this idea?
* Are we starting to see traffic and use on the EMO site? If 'no', does this indicate a lack of interest or a need for improvements?
 
=== September 30, 2009 (end of Q3) ===
 
* Do we see actual courses or individual student projects starting up with Seneca resources? If 'yes', do we have the capacity to handle them? If 'no', assess why and adjust future plans accordingly.
* Is the pace of Mozilla community courses meeting the needs of the potential audience? What do we need more of? Less of?
* How useful is the new EMO functionality? If it is not useful, can it be fixed?
 
=== December 31, 2009 (end of Q4) ===
 
* Did the URJC Mozilla Technology course generate good learning outcomes for participants? Was URJC happy with outcomes? Did we produce re-usable course materials? If 'yes' on most, consider second phase. If 'no', question whether worthwhile to do again.
* Do any major tweaks need to be made to the Mozilla community courses for 2009? Should we bring all instructor duties in-house (as part-time or full-time staff)?
* Is EMO important enough to evolve into a "first-class object" (e.g., comparable to SUMO, QMO, AMO, MDC) during 2010? If 'yes', this needs to be fully addressed in the 2010 budget.
* What is our overall assessment of Mozilla Education activities in 2009? What are the answers to our core design and thesis questions? In 2010, should we expand our experiments, evolve them, or kill the program?
 
== Roadmap ==
 
''[More detailed roadmaps are included on specific pages related to the pilot program activities outlined above.]''
 
The following roadmap outlines proposed activities in each quarter of 2009. At the end of each quarter the results of the activities will be assessed according to the checkpoints above.
 
=== Q1 2009 ===
 
* Present plan to the Mozilla Foundation board, revise plan as necessary. Finalize commitments for funding and staff time. (January)
* Finalize a detailed task list and timeline for Seneca activities. (January)
* Generate a list of course topics for Mozilla community courses, recruit instructors and mentors for Mozilla community courses, begin course design. (January and February)
* Hold an [[Events/EduCamp@FOSDEM2009|EduCamp]] meeting in association with [http://www.fosdem.org/2009/ FOSDEM] in early February to discuss Mozilla and other open source education initiatives in Europe. (February)
* Get a definitive go/no-go decision on URJC participation and plans for a Mozilla Technology course, and begin content development. (January)
* Deploy an initial EMO prototype as a page on mozilla.org or WikiMo with a basic set of links and resources. (February and March)
* Develop shared versions of core Seneca resources, especially student project pool web site and updated Real World Mozilla course. (January to March)
* Make contacts and have discussions with the OSS Watch FOSS Education project, the institutions involved in the "Integrating FOSS into undergraduate curriculum" activities, and others involved in general "teaching open source" activities. (throughout)
 
=== Q2 2009 ===
 
* Do an academic year-end review of the Seneca program, including
** progress on generating lists of student projects from Bugzilla
** state of participation in #seneca, Seneca wiki, etc., by non-Seneca students and faculty
** state of development of packaged course material for "Real World Mozilla", etc.
* Develop content and recruit students for Mozilla Technology course, making sure it's ready to go live in Q3.
* Deliver at least one Mozilla community course, ideally more.
* Evaluate the usefulness of the EMO prototype (e.g., based on traffic, content, comparison with related sites) and plan how it might evolve. In particular: should it remain a simple portal or take on other functions?
* Make decisions on where and how we might work together with others involved in "teaching open source" activities and commit to a set of plans.
 
=== Q3 2009 ===
 
* Work on Seneca initiatives in preparation for the new academic year.
* Mozilla Technology course begins.
* Hold three Mozilla community courses (one per month).
* Work on the next phase of EMO.
 
=== Q4 2009 ===
 
* Evaluate the achieved scope of the Seneca program in the 2008-2009 academic year thus far vs. what was accomplished in the 2007-2008 academic year.
* Evaluate the success of the Mozilla Technology course and plan follow-on projects.
* Evaluate the success of Mozilla community courses, including the popularity of particular topics and whether the basic model of academic instructor plus mentor is working OK.
 
=== Beyond ===
 
''[What 2010 activities can we reasonably anticipate at this time?]''

== Resources and financial setting ==

''[Please add your own name if you are interested in participating in this effort.]''

== Related information ==

* Some thoughts on [[Foundation:Planning:Education:Evaluation|evaluation methodologies]] for the larger questions of measuring learning outcomes and contributions to the Mozilla community.