From MozillaWiki

Kinto Integration in Firefox

Key features

  • Diff-based data synchronization
  • Data integrity/signing
  • Offline persistence
  • Admin panel UI

Use Cases

  • Certificates blocklist (OneCRL) (contact: Mark Goodwin — mgoodwin)
  • Addons/Plugins/Gfx Blocklisting (contact: Mathieu Leplatre — leplatrem)
  • storage.sync API (WebExtensions) (contact: Tarek Ziadé — tarek)
  • Fennec assets catalog (contact: Sebastian Kaspari — sebastian)
  • Password manager recipes (contact: Matthew Noorenberghe — MattN)

Feel free to come and discuss on #storage :)


Leveraging the Kinto HTTP client in Gecko looks like this:

const { KintoHttpClient } = Cu.import("resource://services-common/kinto-http-client.js");

const client = new KintoHttpClient("");

// e.g. fetch the server metadata (client methods return Promises).
client.fetchServerInfo()
  .then(result => ...);

The Kinto offline-first client is used like this:

const { loadKinto } = Cu.import("resource://services-common/kinto-offline-client.js");

const Kinto = loadKinto();

const db = new Kinto({
  adapter: Kinto.adapters.FirefoxAdapter,
  remote: "",
  bucket: "a-bucket"
});

const collection = db.collection("a-collection");

// The following runs inside a Task.spawn generator, hence the yields.
try {
  // Fetch changes from server.
  yield collection.sync();
  // Read local collection of records.
  const records = yield collection.list();
} finally {
  yield collection.db.close();
}

Currently, the instance of Kinto used by Firefox clients is hosted at


The goal is to replace the current system, based on a single XML file downloaded every day, with several Kinto collections.



Currently the blocklist system relies on a big XML file that is downloaded every day. It contains block entries for certificates to be revoked, addons and plugins to be disabled, and gfx environments that cause problems or crashes. Everything is managed via the Addons server.
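The mapping from the entry kinds bundled in that XML file to the new per-kind Kinto collections can be sketched as follows (the `blocklists` bucket and collection names reflect the Firefox blocklist work; treat them as indicative):

```javascript
// Sketch: each kind of entry in the legacy XML file maps to its own Kinto
// collection. The keys mirror the legacy XML element names (certItems,
// emItems, ...); bucket/collection names follow the Firefox blocklist setup.
const BLOCKLIST_COLLECTIONS = {
  certItems:   { bucket: "blocklists", collection: "certificates" },
  emItems:     { bucket: "blocklists", collection: "addons" },
  pluginItems: { bucket: "blocklists", collection: "plugins" },
  gfxItems:    { bucket: "blocklists", collection: "gfx" },
};

// Build the Kinto records endpoint path for one kind of entry.
function collectionURL(kind) {
  const { bucket, collection } = BLOCKLIST_COLLECTIONS[kind];
  return `/buckets/${bucket}/collections/${collection}/records`;
}
// collectionURL("emItems") → "/buckets/blocklists/collections/addons/records"
```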

Firefox (and derivatives like Thunderbird, SeaMonkey, ...) downloads it from a URL that contains client information (e.g. …).
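That client-information URL can be pictured as a template expansion, where placeholder tokens are substituted before the download. The template and parameter names below are assumptions for illustration, not the real pref value:

```javascript
// Illustrative sketch: expand %TOKEN% placeholders in a blocklist URL
// template with client information. Template and params are hypothetical.
function expandBlocklistURL(template, params) {
  return template.replace(/%([A-Z_]+)%/g, (match, key) =>
    key in params ? String(params[key]) : match);
}

const url = expandBlocklistURL(
  "https://blocklist.example.com/blocklist/3/%APP_ID%/%APP_VERSION%/",
  { APP_ID: "firefox", APP_VERSION: "48.0" }
);
// → "https://blocklist.example.com/blocklist/3/firefox/48.0/"
```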

Using the same XPCOM notification callback, the new mechanism will synchronize the local copy of each collection with the remote server. If changes are available, the local copy will be updated and its content signature verified.
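The decision of which local copies need re-synchronizing can be sketched as a timestamp comparison. This is an illustrative sketch, not the actual Gecko code; the field names are assumptions:

```javascript
// Sketch: given the local timestamp of each collection and the change
// entries reported by the server, return the collections whose local copy
// is stale and must be synchronized. Field names are hypothetical.
function collectionsToUpdate(localTimestamps, serverChanges) {
  return serverChanges
    .filter(change => change.last_modified > (localTimestamps[change.collection] || 0))
    .map(change => change.collection);
}

// "certificates" is up to date; "addons" has newer data on the server.
const stale = collectionsToUpdate(
  { certificates: 2000, addons: 1000 },
  [
    { collection: "certificates", last_modified: 2000 },
    { collection: "addons", last_modified: 1500 },
  ]
);
// → ["addons"]
```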

  • In phase 1, it will be a no-op: both mechanisms will run in parallel, but only the legacy one will be used. We'll keep both mainly because the current download of the XML file is used to count daily active users. Once we are satisfied with the statistics, we'll move to phase 2.
  • In phase 2, we'll change the source of truth for block entries on the server side. The data from the Addons server won't be used anymore; the server will produce the same XML file, but from the data stored in the new service.
  • In phase 3, the blocking mechanism will rely on the data managed via JSON, and the old XML client will be decommissioned.

Fennec assets catalog

The goal is to remove the static assets (fonts, hyphenation dicts, etc.) from the distribution package and download them asynchronously using an online Kinto catalog.

  • The bucket is fennec
  • The collection is catalog
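Consuming the catalog amounts to reading the records of that collection and downloading their attachments. The record shape below (an "attachment" field with a "location", as produced by the Kinto attachments plugin) is indicative, and the sample data is made up:

```javascript
// Sketch: given a Kinto records response ({ data: [...] }) for the
// fennec/catalog collection, list the attachment locations to download.
// Sample records are hypothetical.
function attachmentLocations(response) {
  return response.data
    .filter(record => record.attachment)
    .map(record => record.attachment.location);
}

const sample = {
  data: [
    { id: "1", kind: "font", attachment: { location: "fonts/CharisSIL.ttf" } },
    { id: "2", kind: "hyphenation" }, // no attachment yet
  ],
};
const locations = attachmentLocations(sample);
// → ["fonts/CharisSIL.ttf"]
```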



Upgrade client libraries

Two client libraries are embedded in Firefox:

  • Kinto/kinto-http.js: for direct interactions with the Kinto HTTP API
  • Kinto/kinto.js: for offline persistence in internal SQLite

Generate bundles

The Kinto client libraries are developed independently on GitHub:

  • kinto-http is the HTTP client for the Kinto REST API;
  • kinto.js is the offline-first client for Kinto.

With the help of Babel and Browserify, a bundle is generated for Firefox with as little transpilation as possible (e.g. CommonJS require, ES7 decorators).


From the kinto.js repo, generate the moz-kinto-offline-client.js file:

$ npm run dist-fx

And overwrite it in the Firefox code base:

$ cp dist/moz-kinto-offline-client.js ../mozilla-central/services/common/kinto-offline-client.js


From the kinto-http.js repo, generate the moz-kinto-http-client.js file:

$ npm run dist-fx

And overwrite it in the Firefox code base:

$ cp dist/moz-kinto-http-client.js ../mozilla-central/services/common/kinto-http-client.js

Run the tests

First, follow the instructions to build Firefox.

In order to speed up the build and be able to run tests properly, create a **mozconfig** file at the root of the repository:

ac_add_options --enable-debug
ac_add_options --disable-optimize
ac_add_options --disable-crashreporter
ac_add_options --with-ccache=/usr/bin/ccache

For JavaScript updates only, have a look at Artifact Builds, which trade bandwidth for compilation time.

$ ./mach build faster
$ ./mach xpcshell-test services/common/tests/unit/test_kinto.js
$ ./mach xpcshell-test services/common/tests/unit/test_storage_adapter.js

Or both at once:

$ ./mach xpcshell-test services/common/tests/unit/test_kinto.js services/common/tests/unit/test_storage_adapter.js

There are also tests relying on Kinto in services/common/tests/unit/test_blocklist_* .

Gfx blocklist tests

The Gfx test suite requires debug mode to be enabled. Add this to the mozconfig file, rebuild, and run the tests:

ac_add_options --enable-debug

Debug content signature

You can get tests (or Firefox) to give you more information on what the content signature verifier is doing by setting the NSPR_LOG_MODULES environment variable. For example:

export NSPR_LOG_MODULES=ContentSignatureVerifier:5,CSTrustDomain:5

TDD mode

Using inotify, we will detect a file change in the dist/ folder and run a series of commands to execute the tests automatically.

First, install inotify-tools:

sudo apt-get install inotify-tools

Then start an infinite loop with inotifywait:

while true; do
    # Wait for a change
    inotifywait -q -e create,modify,delete -r ~/Code/Mozilla/kinto.js/dist
    # Execute these commands
    cp ~/Code/Mozilla/kinto.js/dist/moz-kinto-offline-client.js services/common/kinto-offline-client.js
    ./mach xpcshell-test services/common/tests/unit/test_storage_adapter.js
    ./mach xpcshell-test services/common/tests/unit/test_kinto.js
done

Source: Antoine Cezar

Submit patch

> Patches are contributed to kinto.js and kinto-http.js, which are first released on NPM.

DO NOT land files that are not tagged officially on upstream repositories.

Become a contributor

Configure the SSH key for hg (in ~/.ssh/config):

  IdentityFile ~/.ssh/contrib_moz

Run integration tests: «Try»


With Mercurial, push a patch to MozReview (see below) and trigger a Try build from the UI (*Automation > Trigger a Try build*).

Or, with git, use a gecko-dev fork from GitHub and install the moz-git-tools:

git push-to-try -t --rev master..HEAD ~/hg/mozilla-central/ -b do -p linux,linux64,macosx64,win32,win64 -u xpcshell,mochitests -t none

Submit for review


# Keep a bookmark of your branch to address review.
# (equivalent of git branches)
hg bookmark bug/XXXXX

# Commit with link to Bugzilla
hg commit -m "Bug XXXXX - Upgrade <lib> to X.Y.Z"

# Submit to MozReview
hg push review

# Go back to «master»
hg update -C central


To adjust previously submitted patch:

# Go to bookmark
hg update -C bug/XXXXX

# Address comments

# Amend commit
hg commit --amend

# Inspect history
hg log -f -G --rev bug/XXXXX | less

# Update review
hg push review

When a patch has several commits:

# Inspect current branch
hg log -f -G --rev bug/XXXXX | less

# Squash/Reword commits since rev c59308877f9a
hg histedit c59308877f9a

To rebase revision c59308877f9a and descendants on last «master»:

hg pull 

hg rebase -d central -s c59308877f9a