SecurityEngineering/MeetingNotes/05-16-13

Standing Agenda

  • Q2 Goals Recap ( https://intranet.mozilla.org/2013Q2Goals#Security_Engineering )
  • Review roadmap priorities to ensure they accurately reflect active projects and Mozilla's priorities
  • Suggest additions or changes to roadmaps
  • Detailed discussion of features or outstanding issues as time permits
  • Additional Items
  • Upcoming events, OOO/travel, etc.

Last week: https://wiki.mozilla.org/SecurityEngineering/MeetingNotes/05-09-13

Q2 Goals

  • [ON TRACK] land the application reputation scanning tool bug 662819 (mmc)
  • [DONE] Turn Mixed Content Blocking on in Aurora (tanvi)
  • [ON TRACK] land classic cert validation replacement, off by default (bsmith)
    • Builds on all platforms, but some issues with revocation.
  • [ON TRACK] land OCSP stapling support and tests (keeler)
  • [RISK] Revamp the MDN documentation of CSP and Mixed Content Blocker (imelven + tanvi)
  • [ON TRACK] Develop & socialize a plan (document containing steps, timeline, implementation & test plan) for getting sandboxing onto desktop Firefox, probably Linux (imelven)
  • [ON TRACK] Deploy pilot cookie study and publish results. (ddahl)

Agenda

Q2 Goals

CSP and HTTP auth (bsmith)

Application Reputation (bsmith)

  • http://online.wsj.com/article/PR-CO-20130513-911455.html?mod=googlenews_wsj
    • From TFA, switching to the Safe Browsing API v2 is a 73% benefit but application reputation is only a 10% benefit. So, why are we working on application reputation now and (AFAICT) not switching to API v2?
      • Because v2 is already implemented
    • Are these useful or meaningful numbers?
    • How can we measure the effectiveness of these protections ourselves, without waiting for NSS Labs and others to do it for us?
    • How important is this, really? In theory it gets us a 7-fold increase in coverage.
  • Why aren't we copying MSIE instead? First, if we are happy with an 85% block rate, apparently we could get that without sending the URLs of downloads to Google, by doing whatever Microsoft was doing before.
    • I think that v2 is implemented, and that doesn't involve sending URLs to Google (see the sketch below).

That would be a better privacy/security tradeoff than getting the same results by sending Google every download URL. Then, by adding application reputation we would presumably get to 99.9% protection. What am I missing here?
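
For reference, the privacy distinction above comes down to where the lookup happens. Below is a minimal sketch of a Safe Browsing v2-style local check, in Python; the prefix data and helper names are hypothetical, and a real client implements the full canonicalization spec rather than this stub.

  import hashlib

  # Hypothetical local prefix set; a real client syncs 4-byte SHA-256
  # prefixes from the Safe Browsing list servers.
  LOCAL_PREFIXES = {bytes.fromhex("deadbeef")}

  def canonicalize(url):
      # Stub: real clients apply the full Safe Browsing canonicalization
      # rules (lowercase host, strip fragments, resolve escapes, etc.).
      return url.strip().lower()

  def locally_flagged(url):
      digest = hashlib.sha256(canonicalize(url).encode()).digest()
      return digest[:4] in LOCAL_PREFIXES

  # Only on a local prefix hit does the client ask the server for the
  # full hashes behind that prefix, so the URLs of ordinary downloads
  # never leave the machine.
  if locally_flagged("http://example.com/setup.exe"):
      print("prefix hit -- confirm with a full-hash request")

Application reputation, by contrast, involves sending download metadata to Google for a server-side verdict, which is exactly the tradeoff being questioned above.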

"Of the three browsers using Google’s Safe Browsing API, Chrome is the only one to also utilize Google’s malicious download technology; this technology attempts to block malicious downloads from sites that are not blocked by URL reputation. Figure 10 shows the block performance of the URL blocking component and the additional download block component used by Google’s Chrome and Internet Explorer. The URL blocking performance of the three Safe Browsing technology browsers was consistent at about 10%. Google’s malicious download protection proved to be approximately seven times more effective than URL blocking alone, increasing overall blocking performance by 73.2% when compared to URL blocking alone. The malicious download technology accounts for the majority of the blocking performance of Google Chrome. "

  • What is the difference between what MSIE does and what Chromium does that accounts for the ~15% difference in performance between the two? Presumably the law of diminishing returns applies, so MSIE's performance seems *really* impressive: that last 15% has got to be the hardest to capture. (Rough arithmetic below.)
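
To sanity-check the figures above, here is back-of-the-envelope arithmetic using only the numbers quoted in this section; it reads the report's "73.2%" as percentage points added, which is the reading consistent with the "seven times" claim.

  url_blocking  = 0.10    # Safe Browsing URL blocking alone (~10%)
  download_gain = 0.732   # added by Chrome's download protection
  msie_claimed  = 0.999   # the 99.9% figure cited above

  chrome_total = url_blocking + download_gain   # ~0.83 total for Chrome
  sevenfold    = download_gain / url_blocking   # ~7.3, the "7-fold" claim
  gap          = msie_claimed - chrome_total    # ~0.17, the "~15%" gap

  print(round(chrome_total, 3), round(sevenfold, 1), round(gap, 3))
  # 0.832 7.3 0.167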

Plugins in Firefox for Android

3rd Party Cookies and Telemetry (bsmith)

and I'm having trouble understanding why Brendan (or I) would care about those numbers, except to satisfy curiosity. In particular, the main question seems to be how many pages/sites will break in a way that users care about, and I don't see how the telemetry we are about to collect answers that question. Instead, it seems like we need something like what Matt Wobensmith did (is doing, hopefully) for testing the impact of the mixed content blocker, and/or what the mobile team did for testing FxAndroid and B2G compatibility with the internet. Regardless, it seems like we need a control group to measure "with third-party cookies blocked" vs. "without third-party cookies blocked" impact; a sketch of that idea follows below. (This is why I said above that, if telemetry is actually important, then we would need it several weeks ahead of enabling blocking in -beta/-release.)
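
A minimal sketch of that control-group idea, in Python; the cohort and probe names are hypothetical, not actual Firefox telemetry identifiers.

  import hashlib

  # Deterministically split clients into a blocking cohort and a control
  # cohort so breakage rates can be compared between the two.
  def cohort(client_id, experiment="third-party-cookie-study"):
      h = hashlib.sha256((client_id + experiment).encode()).digest()
      return "block-3rd-party" if h[0] % 2 == 0 else "control"

  def record_breakage(client_id, site, symptom):
      # In a real study this would be a telemetry ping keyed by cohort,
      # making "blocked" vs. "unblocked" breakage directly comparable.
      print(cohort(client_id), site, symptom)

  record_breakage("client-123", "example.com", "login loop")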