How to Publish Web Components to NPM

Publishing JavaScript to npm is a controversial affair. What should, in my opinion, be a simple process is fraught with a large number of correlated choices that depend not just on technical factors, but also on social factors like what users are used to and how web-centric their point of view is.

Web components, being built on top of JavaScript, bring their own set of choices on top of this.

What follows is my personal checklist for publishing web components to npm. This checklist attempts to maximize compatibility, standards compliance, flexibility, and usefulness to your users.

These are my personal opinions, yes, but they are hard-won, toiled over, and hopefully well-reasoned opinions, so I (a bit obviously) think they're really the right way to go.

Justin's Checklist for Publishing Web Components to NPM™

  1. Publish standard ES2017
  2. Publish standard JavaScript modules
  3. Do not use .mjs file extensions
  4. Only publish a single build
  5. Important package.json fields:
    a. Set "type" to "module"
    b. Set "main" to the main entry point module
    c. Set "module" to the same file as "main"
    d. Include polyfills in devDependencies, not dependencies
  6. Do not bundle
  7. Do not minify
  8. Always self-define elements
  9. Export element classes
  10. Do not import polyfills into modules
  11. Import dependencies with "bare" or "named" import specifiers
  12. Always include file extensions in import specifiers
  13. Publish a custom-elements.json file documenting your elements
  14. Include good TypeScript typings

Now let's get into the why for each recommendation.

Publish standard ES2017

Modern JavaScript is smaller, faster, and more capable than the same code transpiled to an old version of JS like ES5, and the vast majority of users have browsers that support it.

So it's preferable to send the most modern JavaScript to users that you can. But you can't send modern JS to browsers if you don't have it to begin with. Consumers of your packages can compile the code down to a lower language level if they need to, but they can't un-compile your code to a newer language version.

Another way to frame this is that only applications know their exact browser support levels. Some applications need to support very old browsers. Some, maybe running in Electron, only need to support the latest Chrome. Reusable libraries can't know what the browser requirements are, and should publish modern JS to give the most flexibility to applications.

But how modern? This is the tricky part. I choose ES2017 because it's very widely supported across Chrome, Safari, Firefox, and Edge. This means that for most browsers you don't need to compile at all.
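If you author in TypeScript, for example, hitting that target is just a matter of compiler settings. A minimal sketch of the relevant tsconfig.json options (adjust to your own setup):

{
  "compilerOptions": {
    "target": "es2017",
    "module": "esnext"
  }
}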

I once tried to come up with a convention for which language version to publish, something like "ES[Y-1]", meaning that in 2019 you'd publish ES2018. But Edge has slipped a bit and doesn't support object spread/rest, so this didn't hold. I think the situation will be a little clearer when the Chromium-based Edge ships, hopefully soon.

What do users who need to support IE11 do? They need to compile dependencies inside node_modules/. This should be a more common practice, because it solves a lot of problems with JS distribution. If it's slow with some tools, my opinion is that those tools need to cache intermediate build results better, because you should only have to recompile dependencies when they change.
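One way to do that, for example, is a webpack rule that runs babel-loader over node_modules/ too, instead of excluding it. A rough sketch, assuming @babel/preset-env (tune the targets to your app):

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        // note: no `exclude: /node_modules/`, so dependencies get compiled too
        use: {
          loader: 'babel-loader',
          options: {
            presets: [['@babel/preset-env', {targets: {ie: '11'}}]]
          }
        }
      }
    ]
  }
};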

Publish standard JavaScript modules

All modern browsers and tool-chains support standard JavaScript modules, and browsers don't natively support any other module format. This means that if you publish modules, they can load uncompiled.

This is especially nice for development, where the fewer transforms you apply to your code, the better the debugging experience. Native modules are also great to use during development without bundling. A static file server that properly sends 304 response codes for unmodified files will take maximum advantage of the browser's cache and only send the files that have changed.

Most applications are not serving native modules to browsers yet, but the application's tool chain can certainly handle modules as input and transform and bundle them to whatever format the app is using.

Do not use .mjs file extensions

The .mjs file extension is as useless for browsers as it is controversial. There's really no benefit at all from using it. Browsers only care about the mime type of files, not the file extension, so .mjs does nothing there. And tools look to package.json to determine if a package contains modules, so there's no benefit there.

There are downsides to .mjs though: not every tool in the world understands it, and some static file servers may not send the right mime type header for it, meaning the file won't load as a module in browsers.

The best thing is to just avoid it altogether and always write and publish modules, and always use the .js extension.

Only publish a single build

It's pretty common on npm to publish multiple builds. This is a bad and outdated practice that can lead to bloat in application bundles. The reason is that multiple libraries may share a common dependency, but if they import different builds, the bundles will end up with slightly different duplicates of the dependency.

In the case of web components, this is especially dangerous, since we really need there to be a single definition of a component. Multiple versions here is a bigger headache than just bloat.

And again, if the application consumes standard JS modules in its build pipeline, it can transform that to whatever single format it needs. There is really no need to publish multiple builds ever.

I really recommend holding this line against the inevitable issues and PRs that will ask you to add an ES5 UMD build.

Important package.json fields

Set "type" to "module"

The "type" field is the standard way to indicate that the JavaScript files in an npm package are modules. Tools like CDNs and bundlers can use it to parse files with the module parse goal.

With this field you do not need to use the .mjs extension, even in Node.

Set "main" to the main entry point module

This is just standard practice, with the slight difference that most packages today publish some kind of non-module build and point "main" to that. It'll be unusual to some to point this field to a module, but it's semantically correct, it goes along with "type": "module", and npm requires a "main" entry.

Set "module" to the same file as "main"

"type": "module" is the most correct way to specify that a package contains modules, but tools have supported the "module" field for a while, and we'll have to use this for a while yet before all tools support the "type" field.

Include polyfills in devDependencies, not dependencies

Polyfills are an application concern, so the application should depend directly on them. Packages may need to depend on polyfills for tests and demos, so if they're needed, they should only go in "devDependencies".
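Putting these fields together, the relevant parts of a package.json might look like this (file name and version illustrative):

{
  "name": "my-element",
  "type": "module",
  "main": "my-element.js",
  "module": "my-element.js",
  "devDependencies": {
    "@webcomponents/webcomponentsjs": "^2.2.0"
  }
}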

Do not bundle

This one has a little bit of wiggle room depending on the structure of your package.

The most important advice here is to not bundle dependencies. This keeps you from causing bloat by duplicating dependencies into multiple package bundles.

As long as you don't bundle dependencies, you may decide to bundle your package in order to hide implementation modules. If you do, make sure you preserve all the valid entry points of your package as separate entry point files into a set of bundles with shared chunks.

That is, if you support importing 'my-library/element-a.js' and 'my-library/element-b.js', don't bundle them together. The browser's module loader acts as a natural tree-shaker, in that it only loads the modules it needs. It's important, then, to keep modules relatively small and single-purpose. Let consumers import just the parts of your package that they need, and bundlers will have an easier time creating small bundles too. Don't let per-package bundling lead to bloat for applications.
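If you do choose to bundle this way, a code-splitting Rollup config is one option. A rough sketch (entry file names illustrative):

export default {
  // each public entry point stays its own entry module
  input: ['src/element-a.js', 'src/element-b.js'],
  output: {
    dir: '.',
    format: 'esm',
    // shared implementation code becomes shared chunks, not duplicates
    chunkFileNames: 'chunks/[name]-[hash].js'
  },
  // keep dependencies external so they aren't inlined into your bundles
  external: (id) => !id.startsWith('.') && !id.startsWith('/')
};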

I find it simpler to just not bundle libraries at all. The application's build pipeline will take care of it as it sees fit.

Do not minify

Just like the recommendations to not publish builds less than ES2017 and to not bundle, minification should be an application concern. It's much easier to debug non-minified code, and minifiers get better over time, so don't bake this into your published files.

Always self-define elements

The module that declares the web component class should always include a call to customElements.define() to define the element.

This is a contentious point, but currently there is no other practical choice. Some people would like to allow consumers of components to choose the tag name, so they advocate leaving out the customElements.define() call, or putting it in a separate module, but this doesn't actually work out well.

Custom elements currently require that there be a strict one-to-one relationship between tag names and classes. This means that if a custom element doesn't self-define, it leaves open the possibility that multiple consumers try to register the element and only the first will succeed.

If consumers really want to choose a different tag name, they can create a trivial subclass of the element and register that:

import {SomeElement} from 'some-element';

customElements.define('my-some-element', class extends SomeElement {});

This will all change when we finally get scoped custom element registries. Then consumers will be able to register elements into a scope that they fully control and choose whatever names they want.

Export element classes

In order to support the trivial subclassing pattern described above and subclassing in general, you should export the element class:

export class MyElement extends LitElement {
  // ...
}
customElements.define('my-element', MyElement);

Do not import polyfills into modules

This is important general advice, but it's worth repeating for web components.

To go along with the "only the application knows" mantra, only the application knows what polyfills are necessary for its target environments. Most users don't need the web components polyfills, so a well-constructed application won't serve them to those users, or will use the webcomponentsjs polyfill loader to load them dynamically only when necessary.

If your library directly imports polyfills, it makes it pretty tricky for applications to pull them out when they don't need them.

Import dependencies with "bare" or "named" import specifiers

It's common practice in Node and on npm to import external dependencies by package name.

Node natively supports this in CommonJS modules with require() and node module resolution:

const otherLib = require('other-lib');

And developers these days are pretty used to writing standard JS module syntax that uses node module resolution:

import * as otherLib from 'other-lib';

These are called "bare import specifiers" or "named imports", and browsers don't actually support them! At least not yet. Browsers only support importing by URL, so every import must be a full URL (http://...) or a relative URL starting with /, ./, or ../.

What's happening to make bare specifiers work in browsers is that tools like Rollup and Webpack are performing node module resolution at build time and transforming these paths while they're building the whole application.

So if we want to publish standard modules to npm so that they can load without compilation, and we have dependencies that we need to import, what do we do?

We could try to use relative paths to point to dependencies, like this:

import * as otherLib from '../other-lib/index.js';

But this requires that we know exactly where other-lib is located on the file system relative to the importing module. The problem is that npm can install packages in a number of different locations, so we can't possibly know the right path to use. When your package is the top-level package, your dependencies will likely be at ./node_modules/, but when your package is itself installed as a dependency, they might be siblings of your package at ../. Servers and build systems could move files around too.

An even bigger problem is that most tools don't understand relative paths that reach outside of package boundaries. VS Code and TypeScript get mightily confused, in my experience.

So right now the best option is to import dependencies by name and let tools rewrite the import specifiers before they reach the browser. This works really well in practice. Since basically all tools support named specifiers, compatibility is good there, and a number of tools have popped up that only rewrite import specifiers and leave individual modules otherwise untouched to be loaded natively by browsers.
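For example, a published module can import a dependency by name (package name used here just as an illustration):

import {html} from 'lit-html';

and a dev server or CDN rewrites that, before it reaches the browser, to something like:

import {html} from '../node_modules/lit-html/lit-html.js';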

We pioneered this on my team with the Polymer CLI, which rewrites specifiers on the fly in its polymer serve command, and other tools do it too, like es-dev-server and the unpkg.com CDN with its ?module query parameter.

In the near future, browsers will support "import maps" that will tell them how to translate names into URLs, so they too will support named imports. Yay!

Always include file extensions in import specifiers

Classic Node module resolution doesn't require file extensions because it searches the file system, trying several extensions when one isn't given. When you import some-package/foo, Node will import some-package/foo.js if it exists. This kind of searching isn't feasible over a network, so browsers won't do it.
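So inside your published modules, always spell out the extension. For example (paths illustrative):

// relative imports within your own package: include the .js extension
import {helper} from './lib/helper.js';

// imports of dependency sub-modules: same thing
import forEach from 'lodash-es/forEach.js';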

Import maps will allow mapping names to URLs, but they only have two types of mappings: exact and prefix. Here's an example for lodash:

{
  "imports": {
    "lodash": "/node_modules/lodash-es/lodash.js",
    "lodash/": "/node_modules/lodash-es/"
  }
}

This means that we can easily map a bare specifier like lodash, or a prefix + full file path, like lodash/forEach.js, etc., but to support extensionless imports like lodash/forEach, we'd have to map every one to the full path, like:

{
  "imports": {
    "lodash": "/node_modules/lodash-es/lodash.js",
    "lodash/": "/node_modules/lodash-es/",
    "lodash/forEach": "/node_modules/lodash-es/forEach.js"
  }
}

We would have to do this for every extensionless import in the entire app. lodash-es has 341 modules that could be imported. Creating an entry for each one imported without an extension would make for a bloated import map, so it's much better to just use extensions in imports and keep a single prefix entry in the import map.

Publish a custom-elements.json file documenting your elements

Web components tools, like IDE plugins and catalogs, are starting to converge on a common format for describing the custom elements published in an npm package.

The custom-elements.json file describes the tag names, attributes, properties, events, etc., that an element supports. With this information IDEs can provide auto-completion, hover-over docs, etc; linters can check that you're using defined properties; type checkers can ensure property bindings are of the correct type; documentation viewers can display the information for human consumption; and catalogs like Storybook can generate "knobs" for components automatically.
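The exact shape of the format is still settling, so treat this as an illustrative sketch of the kind of data it carries rather than a spec; check web-component-analyzer's output for the authoritative shape:

{
  "version": "experimental",
  "tags": [
    {
      "name": "my-element",
      "description": "A short description of the element.",
      "attributes": [
        {"name": "label", "type": "string"}
      ],
      "events": [
        {"name": "change"}
      ]
    }
  ]
}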

Rune Mehlsen, an amazing developer who maintains the lit-plugin VS Code plugin, also maintains web-component-analyzer, which outputs this format.

Including this file will greatly enhance the developer experience of your users.

Include good TypeScript typings

I'm a huge fan of TypeScript, and one of my favorite things about its type system is how it's able to type APIs that use string keys.

One example is the document.createElement() method. Its return type depends on the value of the string parameter passed to it, so document.createElement('div') returns an HTMLDivElement and document.createElement('img') returns an HTMLImageElement. This lets code like this type-check:

document.createElement('div').src = './image.jpg'; // error!
document.createElement('img').src = './image.jpg'; // fine :)

The best part about this is that it's based on a mapping from tag name to class that you can extend, called HTMLElementTagNameMap. To add your element to it, use TypeScript's interface augmentation.

A full custom element definition in TypeScript should look something like this:

export class MyElement extends LitElement {
  // ...
}

customElements.define('my-element', MyElement);

declare global {
  interface HTMLElementTagNameMap {
    "my-element": MyElement,
  }
}
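With that augmentation in place, TypeScript infers your element's type wherever the tag name is used:

const el = document.createElement('my-element');   // typed as MyElement
const el2 = document.querySelector('my-element');  // typed as MyElement | null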

Of course, you can add this interface augmentation to your hand-written typings too if you're not writing your element in TypeScript. The benefits to your TypeScript users will be huge.

Disagree? Comments and Questions?

I know some of these recommendations are controversial, and the finer points can be open to interpretation and debate, but I firmly believe that this is the best way to publish web components in late 2019.

I don't have comments enabled on this blog for good reasons, but you can find me on Twitter at @justinfagnani. I can't promise I'll have the time or inclination to debate these recommendations, but I try to answer questions. Of course, if you seriously disagree you can publish your own recommendations too! 😎
