
JavaScript and SEO: why it’s so important that each page can get indexed by Googlebot


Technical SEO

 

If you use JavaScript on your website, it’s important that each page or blog post can still be indexed.
If a page can’t be indexed, meaning it can’t be read by Googlebot, then there’s no chance that it will appear in Google’s results.

You can of course still use JavaScript, yet it’s important to remember that if your website is extremely slow, this could negatively impact your business’s SEO.
So, whether you use JavaScript or not, it’s important to use tools such as GTmetrix to check the speed of your website, and to try to improve how fast it is.
If the website is too slow, it will increase the bounce rate, which is widely considered bad for a business’s SEO.
That’s because Google wants your business to offer a good user experience (UX), and one way of doing this is to make sure your website is fast.
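As a very rough sketch (and no substitute for a full speed-testing tool such as GTmetrix), you could time how long your server takes to return a page; the URL below is just a placeholder:

```ts
// Rough response-time check for a single URL (assumes Node 18+, which provides a global fetch).
// This only measures how long the HTML takes to arrive, not full page load or rendering time.
async function timeRequest(url: string): Promise<void> {
  const start = Date.now();
  const response = await fetch(url);
  await response.text(); // wait for the whole body to download
  const elapsedMs = Date.now() - start;
  console.log(`${url} responded with status ${response.status} in ${elapsedMs} ms`);
}

timeRequest("https://www.example.com/").catch(console.error);
```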


So, what exactly is JavaScript?

In short, JavaScript is a programming language used to add interactive and dynamic features to web pages, such as menus, forms and content that loads without refreshing the page.

 

It’s important to understand how Googlebot crawls and indexes a page

It’s thought that Googlebot processes JavaScript in three distinct phases: “crawling”, “rendering”, and then “indexing”.

Firstly, Googlebot sends a request to the company’s server using what is called a “mobile user agent”, so that it can obtain the HTML for that website.
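To see what that looks like in practice, here’s a small, hedged sketch that requests a page with a user-agent string similar to Googlebot’s smartphone crawler (the exact string Google sends can vary, so treat it as illustrative) and prints the raw HTML received before any JavaScript runs:

```ts
// Fetch the raw HTML a crawler receives before any JavaScript executes (assumes Node 18+).
// The user-agent below mimics Googlebot's smartphone crawler; the exact string
// Google uses can change over time, so it is illustrative only.
const GOOGLEBOT_SMARTPHONE_UA =
  "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/120.0.0.0 Mobile Safari/537.36 " +
  "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function fetchAsGooglebot(url: string): Promise<string> {
  const response = await fetch(url, {
    headers: { "User-Agent": GOOGLEBOT_SMARTPHONE_UA },
  });
  return response.text(); // the pre-render HTML, before any client-side JavaScript runs
}

fetchAsGooglebot("https://www.example.com/")
  .then((html) => console.log(html.slice(0, 500))) // print just the start of the HTML
  .catch(console.error);
```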

It’s important to remember that Google, as well as other major search engines, only allocates a certain amount of crawl budget to each website. So, do remove any duplicated pages that waste your company’s crawl budget.

More well-known brands and websites, such as the BBC, will have a higher crawl budget than a blog that doesn’t receive many organic visitors. So it might be the case that, until your company’s website gets nearer page 1 of Google’s results, it will have a much smaller crawl budget. Once the website ranks higher on Google and receives more organic visitors, it’s likely to be crawled much more often, perhaps once a day, while websites such as CNN might be crawled every few hours.

This stands to reason, because well-known brands and companies are likely to update their websites much more often, so there is more content marketing that needs indexing. These websites also receive far more organic visitors from Google, so it’s important that new content, such as “breaking news” stories, is indexed as quickly as possible so it can appear in Google’s results. That’s why Google will crawl and index a website like the BBC’s much more often, looking for new pages.

However, some believe that because of this crawl budget, it’s the HTML of the website that gets crawled first by Googlebot, which then defers processing the page’s JavaScript until a little later by placing it into a render queue.

It’s therefore widely thought that it does take longer for Google to index pages that rely heavily on JavaScript, although the delay is often quite short.
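To illustrate why, here is a hypothetical client-rendered page: the initial HTML that Googlebot crawls contains only an empty container, and the real content appears only after this script runs during the rendering phase (the "/api/article/123" endpoint is an assumption, used purely for illustration):

```ts
// Hypothetical client-side rendering: the HTML served to the crawler contains
// only <div id="article"></div>, and the visible content is injected by this
// script. Until Googlebot's rendering phase runs the JavaScript, the page
// effectively looks empty, which is why indexing can be delayed.
async function renderArticle(): Promise<void> {
  const container = document.getElementById("article");
  if (!container) return;

  // Assumed endpoint, used here purely for illustration.
  const response = await fetch("/api/article/123");
  const article: { title: string; body: string } = await response.json();

  container.innerHTML = `<h1>${article.title}</h1><p>${article.body}</p>`;
}

renderArticle().catch(console.error);
```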


 

Server-side rendering

SSR, or server-side rendering, happens when the complete rendering process takes place directly on the hosting company’s server.
Once fully rendered, the finished HTML web page or blog post is sent directly to the browser. A lot of SEO consultants consider this a very good choice, as it can help to reduce load times and prevent layout shifts.
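As a minimal sketch of the idea (assuming Node 18+ and using only the built-in http module, with placeholder content), a server-rendered page sends the finished HTML in one response, so the crawler sees the content without running any JavaScript:

```ts
import { createServer } from "node:http";

// Minimal server-side rendering sketch: the complete HTML, content included,
// is built on the server and returned in a single response, so a browser or
// crawler receives the finished page without executing any client-side JavaScript.
function renderPage(title: string, body: string): string {
  return `<!doctype html>
<html>
  <head><title>${title}</title></head>
  <body><h1>${title}</h1><p>${body}</p></body>
</html>`;
}

createServer((req, res) => {
  const html = renderPage(
    "JavaScript and SEO",
    "This content is already present in the HTML that Googlebot fetches."
  );
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(html);
}).listen(3000, () => console.log("SSR sketch running at http://localhost:3000"));
```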

There is also client-side rendering; however, this is generally considered by some SEO consultants to be slower than server-side rendering.

Many who work on improving a business’s technical SEO will know that it’s important to make the website as fast as you possibly can, so you may prefer to have server-side rendering.


 

So, what type of rendering is considered best for SEO?

If you were to ask us, we would say to opt for server-side rendering.

Make sure that all of your content marketing can be indexed by Googlebot

We would make sure that each page or blog post is indexed, or at least can be indexed, and individually check every single page. Sure, this is time-consuming, but we can help to highlight any indexation issues that might be occurring.

So, sure, there could be an indexation problem with a JavaScript page on your website; however, you have to make sure that everything is getting indexed, because something as simple as a “noindex” tag left on a page could be stopping it from being indexed by Googlebot.
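As a rough sketch (not a full HTML parser), you could check a page for the two common ways it can be marked noindex, an X-Robots-Tag response header or a robots meta tag, along these lines:

```ts
// Quick check for the two common "noindex" signals on a URL (assumes Node 18+):
// an X-Robots-Tag response header, or a <meta name="robots"> tag in the HTML.
// The regex below is a rough sketch rather than a proper HTML parser.
async function checkNoindex(url: string): Promise<void> {
  const response = await fetch(url);
  const headerValue = response.headers.get("x-robots-tag") ?? "";
  const html = await response.text();

  const headerNoindex = headerValue.toLowerCase().includes("noindex");
  const metaNoindex =
    /<meta[^>]+name=["']robots["'][^>]*content=["'][^"']*noindex/i.test(html);

  console.log(url);
  console.log(`  X-Robots-Tag noindex: ${headerNoindex}`);
  console.log(`  <meta name="robots"> noindex: ${metaNoindex}`);
}

checkNoindex("https://www.example.com/blog-post/").catch(console.error);
```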

There are also a lot of free tools to help you see whether a page or blog post has been indexed, such as Google Search Console: its URL Inspection tool makes it an absolute piece of cake to check whether a page has been indexed.
You can also use Screaming Frog to check whether your pages can be indexed.


 

 

Robots.txt

It’s also important to know that rules can sometimes be written within the robots.txt file that prevent search crawlers from crawling a page or blog post, which in turn normally stops it from being indexed.

So, if your website can’t get indexed on Google, and none of the pages are appearing in the SERPs, we recommend checking whether there’s anything written in the robots.txt file that may be disallowing the site from being crawled.
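As a simple sketch, you could fetch your robots.txt and print any Disallow rules; an overly broad rule such as "Disallow: /" under "User-agent: *" blocks compliant crawlers from the entire site:

```ts
// Fetch a site's robots.txt and list any Disallow rules (assumes Node 18+).
// For example, a file containing:
//   User-agent: *
//   Disallow: /
// tells all compliant crawlers to stay away from every page on the site.
async function listDisallowRules(origin: string): Promise<void> {
  const response = await fetch(new URL("/robots.txt", origin));
  if (!response.ok) {
    console.log(`No robots.txt found at ${origin} (status ${response.status})`);
    return;
  }
  const text = await response.text();
  const disallows = text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.toLowerCase().startsWith("disallow:"));

  console.log(disallows.length > 0 ? disallows.join("\n") : "No Disallow rules found.");
}

listDisallowRules("https://www.example.com").catch(console.error);
```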

 

 

 

 
