Tips and Tricks to Improve Your JavaScript's SEO

JavaScript usage can have a surprising impact on SEO. But how? As you probably know, basic web crawlers are pretty dumb. They just read through the text files associated with your domain, following links and trawling through your code. But they can't really "see" the website: only the code that comprises it. As a result, super dense code that doesn't help your SEO—like extensive JavaScript—can water down the impact of your SEO efforts. But with most sites relying heavily on JavaScript to produce a modern user interface and even serve content, what can you do to improve?

1. Use as little JavaScript as possible

This might be a bit of a cop-out trick, but it should be your first step. Look to see if you can replace animations with HTML5 and CSS3 equivalents. The control and flexibility available are on par with JavaScript. The major downside is that some older browsers don't support these features, and polyfills aren't always available, so make sure to check compatibility before approaching the task. CSS doesn't work exactly the same way as jQuery, of course, so you'll need to familiarize yourself with the new code. But if you use CSS3 to replace animations on this project, you can use it on every other project thereafter, and your code will offer your clients improved SEO options.
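For example, a fade-in that might once have called for jQuery's fadeIn() can be handled entirely in CSS. This is just an illustrative sketch; the .fade-in class name is invented for the example:

```css
/* Fade an element in over half a second, no JavaScript required.
   Apply the (illustrative) .fade-in class to any element. */
.fade-in {
  animation: fade-in 0.5s ease-in both;
}

@keyframes fade-in {
  from { opacity: 0; }
  to   { opacity: 1; }
}
```

As noted above, check browser support for @keyframes before shipping this to an audience stuck on older browsers.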

JavaScript is also slowly falling out of favor with visitors. As web users have come to associate scripting languages with ad servers and tracking technologies, more and more have begun to selectively block JavaScript with extensions like uBlock and NoScript. This can lead to even "good" JavaScript being blocked as the baby is thrown out with the bathwater. While you would hope visitors don't block your good stuff, you should try to avoid a position in which they can block content at all. They might not understand what to do or why your page is breaking; they'll just blame it on you. Not your fault, you might say, and that's accurate. But if we can avoid a potential problem, we should try.

2. Externalize your JavaScript

You probably already know that using a <script> tag to call your JavaScript is better than embedding it in the document. It's more readable, easier to maintain, and simpler to code. But it can also improve your SEO score. When you call JavaScript with a <script> tag, search engine page crawlers will simply skip over it, allowing your page's search-engine-optimized text to stand out. Using a small amount of in-page JavaScript won't hurt your SEO in any appreciable way, of course. But if you're using enough JavaScript to be interested in reading this post, you'll probably want to call it externally.
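The change itself is small. A sketch, with the file name app.js purely as an example:

```html
<!-- Before: JavaScript embedded directly in the page -->
<script>
  // ...hundreds of lines of application code mixed into the markup...
</script>

<!-- After: the same code moved to an external file -->
<script src="app.js" defer></script>
```

The defer attribute is an optional bonus: it tells the browser to download the script in parallel and run it only after the document has been parsed, so the markup the crawler cares about loads first.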

3. What about JavaScript-generated content?

For a long time, SEO pros have recommended a "plain text" approach, advocating for as much text as possible within the body of the HTML itself. Styling and JavaScript should be sequestered in their own documents, according to these recommendations. However, this advice is no longer accurate.

As more sites have begun making JavaScript a major part of their content generation scheme, major search engines have caught up. Google, according to Search Engine Land's study JavaScript and SEO, fully crawls, indexes, and ranks content generated with JavaScript as if it were HTML. This means that dynamically generated content, links, and redirects are all properly managed by the search engine, with the final rendered DOM indexed correctly. So don't be afraid to make JavaScript a core component of your content generation scheme.
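As a small illustration, here's one hedged sketch of JavaScript-generated content: a helper that builds article links at runtime. The function name and data are invented for the example; in practice the titles would likely come from an API:

```javascript
// Build a list of article links as an HTML string.
// Keeping this a pure function means it runs in the browser
// or on the server (e.g. for server-side rendering).
function articleLinks(titles) {
  return titles
    .map(title => {
      const slug = title.toLowerCase().replace(/\s+/g, '-');
      return `<a href="/${slug}">${title}</a>`;
    })
    .join('\n');
}

// In a browser you would inject the result into the page, e.g.:
//   document.getElementById('articles').innerHTML =
//     articleLinks(['First post', 'Second post']);
console.log(articleLinks(['First post', 'Second post']));
```

Links inserted this way end up in the rendered DOM, which, as the study above found, Google indexes just like static HTML.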

Part of the reason this confusion was spawned was a misunderstanding of the roles of Google's crawling engine (Googlebot) and Google's indexing and rendering engine (Caffeine). The crawler doesn't actually render content; it's just responsible for finding content on the web to look at. This led to a widespread belief that Googlebot couldn't handle JavaScript. That's true, but also irrelevant: the crawler doesn't need to handle JavaScript, since it just finds the content. The indexing and rendering engine, Caffeine, is the part of the machine that renders pages, and Caffeine is fully capable of rendering JavaScript pages and handling the DOM, which is what we care about.

4. Manage the crawl budget

While Google is fully capable of rendering your page's JavaScript, doing so takes a comparatively large amount of time. Google has something called a "crawl budget": a measure of the effort it's willing to expend on a single site. It will make a best effort to index your pages, but after a while it moves on to the next site, whether or not yours is "done." If we want our sites indexed properly, we should help Google out wherever we can. That means using JavaScript where necessary, but limiting its use.

When your website is based on JavaScript or uses it extensively, page crawling and indexing become inefficient and slow. The site map must be constantly regenerated as new links are discovered, potentially leading to link miscategorization or an incorrect balance of URL importance. Good SEO is all about being efficient, so it's important to minimize inefficiency wherever possible.
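One way to spend the crawl budget wisely is to hand Google your important URLs up front in a static XML sitemap, so it doesn't have to discover them by rendering JavaScript. A minimal sketch, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List your key pages explicitly (example.com URLs are placeholders) -->
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/important-page</loc>
  </url>
</urlset>
```

Reference the sitemap from robots.txt or submit it through Google Search Console so the crawler finds it immediately instead of spending budget hunting for links.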


SEO can be a large and baffling topic, and it might be hard to advise clients if you're not an expert yourself. You can consult websites like Search Engine Land to learn more, or bring in an SEO consultant if you need expert help.


Author: Alex Fox
