Distributing a Firefox extension

How we're internally distributing a browser extension with S3 and automation


📗 Context

One of my team's projects at OVHcloud involves building a browser extension for Firefox, our browser of choice. Since the extension will contain private data, we don't want to distribute it through Firefox's public store; but we still want a distribution channel with automatic updates. The extension will be used internally by dozens of people, and we don't want to distribute new versions manually, nor put that load on the helpdesk.

I was therefore in the market for a distribution channel that met the following constraints:

  • Serving files over HTTPS
  • No authentication – the browser doesn't authenticate when checking extension updates
  • Only accessible internally

🪣 S3 to the rescue

I contacted our Developer Platform team. They didn't have anything specifically made for this, but they pointed me to our Public Cloud Object Storage product. It's a perfect fit: it serves files over HTTPS and can be configured to not require authentication. IP restrictions aren't available to customers yet, but we can implement them internally for specific needs.
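Creating the bucket itself is a one-time operation. As a rough sketch, assuming awscli is already configured for the OVHcloud endpoint (as shown in the pipeline below) and $S3_BUCKET_NAME holds your bucket name:

# one-time setup: create the bucket on the configured endpoint
aws s3 mb s3://$S3_BUCKET_NAME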

Now that I had my S3 bucket, I needed to find a way to automate release distribution. I didn't want to manually download, edit and upload the update manifest for each release, so I warmed up my bash-fu and crafted a script using awscli, jq and some sed sprinkled over our CDS pipeline.

πŸ—οΈ Build script

test & lint → build → sign → publish

First, we use web-ext to build and lint the extension. We also use it to submit the extension for signing by addons.mozilla.org (AMO): modern Firefox releases require extensions to be signed before they can be installed. We use the unlisted channel, though, so the extension doesn't appear in AMO's public listings.

# install npm dependencies
npm install -g web-ext
npm install

# test & lint
npm test
web-ext lint --self-hosted

# build
web-ext build --overwrite-dest

# sign
web-ext sign --channel=unlisted --api-key=${JWT_ISSUER} \
  --api-secret=${JWT_SECRET}
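
# note: web-ext sign writes the signed .xpi to ./web-ext-artifacts/ by default;
# the publish steps below assume it's available in the working directory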

After that, we prepare the S3 configuration and download our existing update manifest (I uploaded the initial version manually). We edit the JSON with jq and save the result to a second file, so we can print the diff during the pipeline and expose both files as build artifacts for troubleshooting.

# publish
## install & configure awscli
pip3 install awscli awscli-plugin-endpoint
aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
aws configure set plugins.endpoint awscli_plugin_endpoint
aws configure set region sbg
aws configure set s3.endpoint_url $S3_ENDPOINT
aws configure set s3.signature_version s3v4
aws configure set s3api.endpoint_url $S3_ENDPOINT

## download current updates manifest
aws s3 cp s3://$S3_BUCKET_NAME/updates.json ./

## edit updates manifest
# the signed file is named like my-extension-1.2.3.xpi: grab it and extract the version
filename=$(ls *.xpi)
version=$(echo "$filename" | sed -E 's/[a-z0-9-]+-(.*)\.xpi/\1/g')
echo "filename=$filename"
echo "version=$version"
# append the new release to the add-on's update list
jq --arg id "$EXTENSION_ID" --arg version "$version" \
   --arg link "$S3_BUCKET_URL/$filename" \
   '.addons[$id].updates += [{"version": $version, "update_link": $link}]' \
   updates.json > updates2.json

## show diff
diff updates* || true
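
For reference, the update manifest follows the format Mozilla documents for self-hosted add-ons. After a couple of releases, ours looks roughly like this (the extension ID and URLs here are placeholders):

{
  "addons": {
    "my-extension@example.com": {
      "updates": [
        {
          "version": "1.0.0",
          "update_link": "https://bucket-url.example.com/my-extension-1.0.0.xpi"
        },
        {
          "version": "1.1.0",
          "update_link": "https://bucket-url.example.com/my-extension-1.1.0.xpi"
        }
      ]
    }
  }
}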

Finally, we upload the signed extension and the new manifest, and make them public.

## push updated manifest and signed extension
aws s3 cp ./updates2.json s3://$S3_BUCKET_NAME/updates.json
aws s3 cp $filename s3://$S3_BUCKET_NAME

## make the files publicly accessible
aws s3api put-object-acl --acl public-read \
  --bucket $S3_BUCKET_NAME --key updates.json
aws s3api put-object-acl --acl public-read \
  --bucket $S3_BUCKET_NAME --key $filename
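
As a final sanity check (a sketch, assuming $S3_BUCKET_URL points at the bucket's public URL), we can confirm the manifest is served over HTTPS without any credentials:

## verify public, unauthenticated access over HTTPS
curl -fsSI "$S3_BUCKET_URL/updates.json" | head -n 1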

And voilà! We've got ourselves a private distribution channel for our extension, with support for automatic updates. The final touch is adding the URL of the update manifest to our extension's manifest:

{
  "browser_specific_settings": {
    "gecko": {
      "update_url": "https://bucket-url.example.com/updates.json"
    }
  }
}
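
Firefox polls this URL periodically for new versions; you can also trigger a check manually from about:addons (gear menu → "Check for Updates"), which is handy for testing.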

🧑🏻‍🏫 Conclusion & learnings

It was fun! You should have seen the satisfaction on my face at 18:55, alone in the open space, watching a test version get automatically downloaded by my browser 😁

I learned a lot about S3: I was familiar with the concept, but I had no hands-on experience prior to this. I'm still no expert, but I'm now a bit more confident in what I'm doing.