|-
| 3.1
| '''X% increase in investments for AI + bias grantees.'''
'''Motivation:''' <br />
We are already investing in a number of projects related to AI and bias: the AJL CRASH project; Common Voice; Creative Media Award grantees/projects; and ideas surfaced at MozFest. In 2021, we plan to kickstart our increased focus on this topic by providing additional funding, amplification, and support to these projects.
|-
| 3.2
| '''X# people participate (share stories, donate data, etc.) in projects on mitigating bias in AI as a result of Mozilla promotion.'''
'''Motivation:''' <br />
Last year we observed that bias is a topic that gets the public to pay attention to trustworthy AI issues. In 2021, we want to see if we can go further by getting the public to engage in projects that concretely advance a trustworthy AI agenda.
|-
| 3.3
| '''Pipeline of additional projects Mozilla can support to mitigate bias in AI established.'''
'''Motivation:''' <br />
The previous KRs focus on projects we already know about. We know much more is happening in this space. Over the coming year, we will build a pipeline of additional funding, engagement, and philanthropic advocacy opportunities related to AI bias which we can use to drive our work in 2022+.