Making Firefox More Robust
Some of the comments on the Firefox/Feature_Brainstorming page have touched on the fact that Firefox lets pages interfere with each other in bad ways. This page looks at how to improve Firefox's robustness by putting pages from different domains in different processes.
Current Robustness Problems
The brainstorming page covers a lot of these problems, but I'll mention them here for completeness.
- Memory Management - Sites can allocate as much memory as they want, until the entire browser becomes unresponsive. It can get to the point that the user has to kill Firefox, and all open pages go with it. The brainstorming page lists several variations on this.
- Failure Isolation - If one page crashes, all pages crash with it. This could be the result of a bug in Gecko or in any plugin, such as Java applets, Flash, or PDF viewers.
Ways to fix these problems
Multiple threads could keep one misbehaving page from blocking the others, but threads share an address space, so a crash on one page would still take down all the rest.
Instead, OS processes offer enough isolation to solve these problems, and they correspond to the idea of having different "web applications," just like desktop applications. Each process is scheduled by the OS, has its own address space, and can't interfere with other running processes.
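This isolation is easy to see in practice. The sketch below (a standalone Python demo, not Firefox code) kills a child process with a signal to simulate a native crash in a renderer; the parent process carries on untouched, which is exactly the failure isolation described above.

```python
import multiprocessing
import os
import signal

# Use fork so this stays a single self-contained script (POSIX-only sketch).
ctx = multiprocessing.get_context("fork")

def rendering_work():
    """Stand-in for a page renderer hitting a fatal native bug."""
    os.kill(os.getpid(), signal.SIGKILL)  # simulate a hard crash

child = ctx.Process(target=rendering_work)
child.start()
child.join()
# The child was killed, but this process is unaffected.
print("child exit code:", child.exitcode)  # negative: killed by a signal
print("parent still alive")
```

With threads instead of processes, the equivalent fault would have brought down the whole program.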
There are two challenges for using multiple processes in Firefox. The first one is picking the right level of granularity, because some web pages can communicate with each other via the DOM. Putting these pages in separate processes would break this feature.
The second challenge is that Firefox profile data cannot currently be shared between processes. I won't get into that here; just look at bug #135137.
Where to Introduce Processes
We can look at different approaches to decide the best way to introduce multiple processes, without breaking existing pages. No current browser gets this right.
- Process per Group of Windows - Internet Explorer and Konqueror start a new process each time you start the program from its icon. However, they use the same process for pages in different tabs, or when the user chooses "New Window." There's no visual indication of which windows belong to which process, and a crash still takes down all pages in a given group of windows.
- Process per Domain - Instead of separating sub-domains like store.company.com and company.com, we can put all pages from *.company.com in the same process. This makes a domain responsible for all the pages it delivers, while preventing those pages from interfering with pages from other domains. This approach doesn't break existing pages, and it could support a management feature like "kill all pages from somebadsite.com".
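The grouping rule above boils down to mapping each URL to a process key. Here is a hypothetical sketch of that mapping; note that correctly finding the registered domain requires the public-suffix list, so this naive version just keeps the last two host labels.

```python
from urllib.parse import urlparse

def process_key(url):
    """Return the key that selects a page's rendering process.

    Naive registered-domain heuristic: keep the last two host labels.
    (A real implementation would consult the public-suffix list.)
    """
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

# store.company.com and company.com share a process...
assert process_key("http://store.company.com/cart") == "company.com"
assert process_key("http://company.com/") == "company.com"
# ...but othersite.com gets its own, so a crash there is contained.
assert process_key("http://othersite.com/") == "othersite.com"
```

"Kill all pages from somebadsite.com" then becomes "kill the process whose key is somebadsite.com".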
Note that Mozilla-based browsers support signed scripts, which let a script ask the user for permission to talk to pages from different domains. (In the diagram above, this would add an arrow between D.com and E.com.) However, I haven't found any popular sites using this feature, and I'm not sure it's worth letting pages from different domains interfere with each other. Correct me if I'm wrong.
Other browsers like Opera support HTML 5 message passing, which allows two pages from different domains to pass messages using a "postMessage" function call. This would also add an arrow between D.com and E.com above. Message passing is much closer to inter-process communication than to shared memory, though, so it could still be implemented even if the pages live in different processes.
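To see why postMessage-style communication survives a move to multiple processes, consider this sketch (an assumed design, not any browser's actual implementation): a message is copied across a channel between two page processes, with no shared memory involved.

```python
import multiprocessing

# fork keeps this a single self-contained script (POSIX-only sketch).
ctx = multiprocessing.get_context("fork")

def page_e(conn):
    """Page from E.com: its 'onmessage handler' receives and replies."""
    msg = conn.recv()
    conn.send("E.com got: " + msg)

d_end, e_end = ctx.Pipe()
proc = ctx.Process(target=page_e, args=(e_end,))
proc.start()
d_end.send("hello from D.com")   # analogous to window.postMessage(...)
reply = d_end.recv()
proc.join()
print(reply)
```

Because only serialized messages cross the boundary, the same API works whether the two pages share a process or not.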
Introducing more processes will inevitably add memory overhead, but not as much as one might expect. Shared libraries cut down on the actual amount of memory needed for each extra process. Also, this approach greatly improves the responsiveness of each page when one bad page is doing something expensive.
While many people have requested making Firefox faster on the feature brainstorming page, making it more robust could actually set it apart from other browsers. It's worth taking this into account.
I've already started building a prototype of a web browser that uses a process per domain. However, because Firefox profiles cannot be shared across processes, I'm modifying Konqueror instead of Firefox.
To do this, I'm using XParts to embed a KHTMLPart from one process in a Konqueror window in a different process, which is working pretty well so far. I'm currently working on the process management code, to map each window/tab/frame to the correct process, and to decide when a process can be safely killed (i.e., when all pages from a domain leave the browser history).
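The process-management bookkeeping can be sketched as simple reference counting per domain (hypothetical names and structure; my actual Konqueror prototype differs in detail): track how many history entries each domain has, and flag its process as killable when the count reaches zero.

```python
class ProcessManager:
    """Hypothetical sketch: one renderer process per domain, killed
    once no page from that domain remains in the browser history."""

    def __init__(self):
        self.refcounts = {}   # domain -> number of live history entries

    def page_entered_history(self, domain):
        # (A real manager would spawn the renderer on the first reference.)
        self.refcounts[domain] = self.refcounts.get(domain, 0) + 1

    def page_left_history(self, domain):
        """Return True when the domain's process can be safely killed."""
        self.refcounts[domain] -= 1
        if self.refcounts[domain] == 0:
            del self.refcounts[domain]
            return True
        return False

mgr = ProcessManager()
mgr.page_entered_history("company.com")
mgr.page_entered_history("company.com")   # second tab, same domain
assert mgr.page_left_history("company.com") is False  # one page still open
assert mgr.page_left_history("company.com") is True   # now safe to kill
```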
If we want to fix these robustness problems for Firefox 3, though, we first need to fix bug #135137 (to share profile data across processes).
Feel free to discuss this below, or on the Talk page for this page.
Charlie Reis, University of Washington
creis [at] u [dot] washington [dot] edu