
Web Components and SEO

Search continues to be the top driver of traffic to the large majority of websites. While Facebook and other social media platforms are growing, good visibility in search engine results pages (SERPs) remains crucial.

So how do you make sure you stay on top of the results in a fast-moving web development scene that is adopting standards like Web Components?

Search Engine Optimisation (SEO) has grown into an industry in its own right. While parts of it can be classified as somewhat shady snake oil business, a lot of technical SEO is worthwhile and key to returning relevant results to search engine users.

At the heart of SEO lies the HTML markup that is read (crawled) by search engine robots. The collected data is then processed into a searchable index and ranked. More recently, semantic web features have arrived in the form of JSON-LD and other metadata formats.

But JSON-LD and other metadata have only been supplementary to the core content markup done with HTML5. For two decades the HTML markup has stayed largely the same, and the backend generating it has been irrelevant.
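For illustration, a minimal JSON-LD snippet describing an article could look like this sketch; it uses the schema.org Article vocabulary, and the field values here are just placeholders taken from this post:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Article",
      "headline": "Web Components and SEO",
      "datePublished": "2016-11-22",
      "author": { "@type": "Person", "name": "Jorgé" }
    }
    </script>

A crawler can read this snippet as-is, without rendering the page, which is exactly why it works well as supplementary metadata.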

Web Components are an unforeseen challenge for SEO

For the crawlers (robots) it has made no difference whether the markup was generated by Java, PHP or even JavaScript. In fact, the first wave of component-based UI rendering libraries such as React are often server-side rendered to provide optimal HTML.

Server-side rendering of Angular 2, React or other front-end view libraries and frameworks is great for both SEO and user experience: the initial view is output on the server for a speedy first load and optimal search engine visibility. This works because the end result is everyday, humdrum HTML.
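As a rough sketch of the idea with React, renderToString from react-dom/server produces the markup on the server before any client JavaScript runs (the App component and the Express setup here are assumptions, not from the original post):

    // Node.js server sketch: render a React component to plain HTML.
    import express from 'express';
    import React from 'react';
    import { renderToString } from 'react-dom/server';
    import App from './App'; // hypothetical root component

    const app = express();

    app.get('/', (req, res) => {
      // The crawler receives ready-made HTML, no client-side JS required.
      const html = renderToString(React.createElement(App));
      res.send(`<!DOCTYPE html><html><body><div id="root">${html}</div></body></html>`);
    });

    app.listen(3000);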

Native Web Components, however, are different. They operate natively in the browser, and if the crawler cannot render a custom element it will not index the contained content at all. This challenge is not widely explored today, as it depends on support for core browser platform features.

The core of how browsers work with HTML remains the same as always: the browser ships with a fixed set of tags and features. Over the years the supported tags have evolved, but with Web Components developers can write components that behave as if they were part of the platform. This is a key difference.
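A minimal custom element sketch (the my-greeting tag name is made up) shows why this matters for crawlers: the visible content only exists once the JavaScript actually executes.

    <my-greeting name="world"></my-greeting>

    <script>
      // Custom Elements v1: define a tag that behaves like a built-in one.
      class MyGreeting extends HTMLElement {
        connectedCallback() {
          // Without JavaScript execution this element stays empty,
          // so a non-rendering crawler indexes nothing here.
          this.textContent = `Hello, ${this.getAttribute('name')}!`;
        }
      }
      customElements.define('my-greeting', MyGreeting);
    </script>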

Contemporary web browsers like Microsoft Edge, Google Chrome and Mozilla Firefox are all evergreen, meaning they are continuously updated with the latest features. This is great for feature adoption: instead of years of adoption time and a fragmented versioning landscape, new features like Web Components land in browsers fairly quickly.

Keep Web Component polyfills around for search robots

It will take years for web crawlers to support the Web Components specifications, and many crawlers will never gain support. Luckily, prominent search engine robots like Google's Googlebot are able to execute JavaScript. This is where polyfills come into play.

Browser polyfills dynamically fill gaps in support using techniques like JavaScript. According to discussions at the 2016 Polymer Summit (Polymer being Google's Web Components library, which ships with polyfills), the same approach can be applied to SEO and Web Components:

Polyfills are traditionally a feature targeted at browsers, but since search engine robots execute JavaScript, polyfills can double as an SEO technique, enabling the best possible search engine visibility for a contemporary site using Web Components.

So as you build websites and progressive web applications (PWAs) with Web Components, make sure to use polyfill libraries like Polymer or Skate.js. Primarily they allow browser users (both on desktop and mobile) to use your site, but they also enable search robots to view your content, as in the sketch below.
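In practice this means loading the polyfills before any of your component code. A sketch using the webcomponentsjs polyfill bundle of that era follows; the file paths are assumptions that depend on how your assets are served:

    <!-- Load the Web Components polyfills first, so browsers and
         JavaScript-executing crawlers without native support still
         get working custom elements. Paths depend on your setup. -->
    <script src="/assets/webcomponentsjs/webcomponents-lite.js"></script>
    <script src="/assets/elements/my-greeting.js"></script>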

Google also suggests that indexing Web Components can be done by embedding the content of the web components into JSON-LD metadata snippets on the page:

This data is embedded into your webpage as a JSON-LD snippet, which means it's available to be consumed by the Custom Element to display to a human visitor and for Googlebot to retrieve for Google indexing.

— Easier website development with Web Components and JSON-LD, Google
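A sketch of that pattern, assuming a made-up my-recipe element and placeholder content: the JSON-LD snippet carries the data, the element renders it for humans, and Googlebot can retrieve the snippet directly without rendering anything.

    <my-recipe>
      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "Recipe",
        "name": "Banana bread",
        "description": "A moist loaf for overripe bananas."
      }
      </script>
    </my-recipe>

    <script>
      class MyRecipe extends HTMLElement {
        connectedCallback() {
          // Read the embedded JSON-LD and render it for human visitors;
          // crawlers can consume the snippet itself as structured data.
          const data = JSON.parse(
            this.querySelector('script[type="application/ld+json"]').textContent
          );
          this.insertAdjacentHTML('beforeend',
            `<h2>${data.name}</h2><p>${data.description}</p>`);
        }
      }
      customElements.define('my-recipe', MyRecipe);
    </script>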

While the above method is reasonable, it requires specific effort and is easy to miss. This is why it is not recommended to deprecate Web Component polyfills lightheartedly, even once browser support is complete. Keep them around as an SEO technique for the best possible search engine visibility.

Written by Jorgé on Tuesday November 22, 2016

Permalink - Tags: javascript, seo
