React JS SEO Guide – Getting Started With React Server Side Rendering

With over 90 percent of web traffic coming from first-page results on search engines, it comes as no surprise that being SEO friendly is absolutely crucial for a business website’s survival. The prevailing issue with JavaScript-based platforms such as Angular or React is that search engines still struggle to reliably index their content.

Yes, Google has been taking some steps toward rectifying that, but at a much slower pace than developers would like. Meanwhile, web developers who want the dynamic features that JavaScript, and React JS in particular, provides are left questioning whether the shift is worth making given this major drawback.

React JS SEO Guide – How to Make React SEO Friendly

The strongest argument for using React and other JS-based frameworks is the Single Page Application (SPA). With technology progressing as fast as it has, people have grown used to receiving information instantly.

This means that people no longer have the patience to wait for a new page to load and render with each click, and bounce rates are significantly higher on sites that do not use SPAs. With the SPA, reloading an entire page instead of generating only the content that changes has become redundant and needlessly hurts a site’s performance.

However, as mentioned before, it is not without its drawbacks.

Lack of Dynamic Tags

Search engine crawlers do not observe the complete page load cycle. Because an SPA lets the user click and change content dynamically within a single page, the page never refreshes, and neither do its metadata tags. Since crawlers cannot detect activity that happens inside a page unless it leads through a link to another URL, the SPA gets indexed as an effectively empty page.
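
To make this concrete, here is a rough sketch (the file names and markup are illustrative, not from any particular project) of what a crawler actually downloads from a client-rendered React SPA: an HTML shell with an empty root element, which only gets filled in after the JavaScript bundle runs in a browser.

<!-- index.html: the source a crawler sees – no content, one static title -->
<!DOCTYPE html>
<html>
  <head>
    <title>My Store</title>
  </head>
  <body>
    <div id="root"></div>            <!-- empty until the bundle executes -->
    <script src="/bundle.js"></script>
  </body>
</html>

// index.js: content (and document.title) only appears after this runs client-side
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

ReactDOM.render(<App />, document.getElementById('root'));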

Normally, to combat this problem, programmers would simply create separate pages so that crawlers could pick up each piece of content individually. However, this only underlines the need for SPAs: developing additional pages increases expenses and still does not guarantee a higher ranking, which is imperative for businesses given that 75 percent of traffic goes to the top 5 search results on the first page. Visibility in the lower-ranked results is almost as bad as complete invisibility.

Search Engines Do Not Run JavaScript 

React and other JS-based applications are useless unless JavaScript is actually executed. The issue this raises is that most search engine crawlers do not execute JavaScript, and hence cannot index the content it generates. Google released a statement that, so long as JS-based websites allow access to Googlebot, there should be no issue in rendering and indexing the page.

Though this statement seems like a beacon of hope for those considering JavaScript applications, given that Google is the leading search engine in the world, one should not forget that there are indeed other search engines. Bing, for example, still holds a considerable share of the market, roughly one third in the US.

So far, Google is the only search engine that has made real progress in indexing JavaScript, and even its system offers no guarantees. By relying purely on Google to render your React pages, you forfeit a significant share of traffic that you could keep by solving the rendering problem yourself.

Prerendering

If search engines cannot be trusted to deliver JS rendering solutions in the near future, it makes sense to simply prerender your web pages yourself. This way, you know exactly what issues your page presents and how to rectify them. Search bots work by scanning the page’s source code for indexable content, and since JS-generated content does not appear in that initial source, the bots cannot find what they are looking for.

To make this content visible, an HTML snapshot of the rendered page must be served in place of the empty shell. Rather than building that yourself, it may be easier to use one of the prerendering tools readily available on the net. One such tool is Prerender.

Prerender is a web application that renders your JavaScript pages, saves the resulting static HTML, and serves those snapshots to crawlers such as the ones used by search engines. It is Google approved, and the AJAX crawling specification it follows is in fact Google’s own. Prerender is also open source and available on GitHub, so anyone can run their own personalized server. On top of all that, if a crawler requests a page that is not yet cached, Prerender renders it on the fly and caches it afterwards, so the cache stays complete.
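
If you already run an Express server in front of your SPA, the open-source prerender-node middleware is one common way to hook this up. The sketch below is only illustrative and assumes a Prerender server (self-hosted or the hosted service) is reachable at the URL shown.

const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Requests from known crawler user agents are forwarded to the Prerender
// server, which replies with the cached static HTML snapshot of the page.
app.use(prerender.set('prerenderServiceUrl', 'http://localhost:3000/'));

// Regular visitors still receive the normal client-rendered SPA.
app.use(express.static('build'));

app.listen(8080);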

React Server Side Rendering 

This is perhaps the best solution for making your React page SEO friendly. Server Side Rendering (SSR) is also called Isomorphic JavaScript, since isomorphism here means moving seamlessly between server and client. In other words, isomorphic JS, otherwise known as ‘Universal JavaScript’, refers to code that can run on either the server or the client. If a client has disabled JavaScript, the server can still send the final, fully rendered content built from its own data. This allows for wider compatibility across browsers and crawlers, a smoother browsing experience for users, and better code maintainability.
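
In practice, “running on either side” means the same component file is imported by two entry points: the server turns it into an HTML string, while the browser attaches React to that already-rendered markup instead of rebuilding it. A minimal sketch, assuming React 16-style APIs and hypothetical file names (a matching server entry point appears later in this article):

// App.js – one component shared by the server and the browser
import React from 'react';

export default function App() {
  return <p>This markup can be produced on the server or in the browser.</p>;
}

// client.js – browser entry point: hydrate() reuses the server-sent HTML
// rather than re-rendering it from scratch.
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

ReactDOM.hydrate(<App />, document.getElementById('root'));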

Imagine waiting virtually no time at all for your page to load because its content arrives already rendered. The trade-off is that SSR can put more pressure on the server than simple prerendering would.

React JS can be used alongside most front-end libraries, and is most often written in JSX, a syntax that mixes JavaScript with XML-like markup.
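
For illustration, a tiny (hypothetical) component written in JSX looks like markup embedded in JavaScript; a build tool such as Babel compiles the tags into ordinary React.createElement calls.

import React from 'react';

// JSX mixes XML-like tags with JavaScript expressions in curly braces.
function Greeting({ name }) {
  return <h1>Hello, {name}!</h1>;
}

// After compilation, the tag above becomes roughly:
// React.createElement('h1', null, 'Hello, ', name, '!');

export default Greeting;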

Setting up SSR first requires a server; a common choice is the Express framework, which you can use to render and serve your React-based pages. Instructions on how to use Express are detailed on its website.
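
As a rough starting point, a minimal Express server might look like the sketch below, which renders a hypothetical App component to a string and embeds it in the HTML response. It assumes the file is transpiled (for JSX) and skips routing, data fetching, and bundling concerns; it is not a production setup.

// server.js – renders the React tree to HTML before it ever reaches the browser
import express from 'express';
import React from 'react';
import ReactDOMServer from 'react-dom/server';
import App from './App';

const app = express();

// Serve the client bundle so the browser can hydrate the markup afterwards.
app.use(express.static('public'));

app.get('*', (req, res) => {
  const markup = ReactDOMServer.renderToString(<App />);
  res.send(`<!DOCTYPE html>
<html>
  <head><title>React SSR example</title></head>
  <body>
    <div id="root">${markup}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000, () => console.log('SSR server listening on port 3000'));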