Websites built as SPAs are more effective at retaining visitors for longer periods of time than those that aren’t. Elements of the page can be explored quickly without loading a new page, notably decreasing bounce rates and leading to higher conversion. Beyond that, it is significantly less work for developers to write the code for one page than to write separate code for several pages.
Angular JS SEO Tutorial – UPDATED 2018
Angular JS’s setback with SEO hasn’t hindered its adoption, however: the framework is still in use on over 900 leading websites today, with some developers even calling it the new standard. This shows that there are indeed ways to overcome the SEO problem.
Until Google is fully capable of crawling through JS or CSS files, it is much safer to understand exactly how search engines work and how to adjust your webpage to better fit their standards.
Angular JS SEO Friendly URLs
To check that all of your links are working and avoid losing pages from the index, switch to HTML5-mode routing (clean URLs without the hashbang), then generate an XML sitemap and submit it to Google’s Search Console or Bing Webmaster Tools to see whether all the important links are showing up on the search engine.
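As a sketch of the sitemap step, the snippet below joins a list of routes into a minimal sitemap XML string. The `baseUrl` and `routes` values are hypothetical placeholders standing in for your own site’s clean HTML5-mode URLs, not anything Angular JS generates for you.

```javascript
// Minimal sitemap builder: turns clean HTML5-mode routes into sitemap XML.
// `baseUrl` and `routes` are illustrative assumptions for your own site.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map((route) => `  <url><loc>${baseUrl}${route}</loc></url>`)
    .join('\n');
  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls +
    '\n</urlset>'
  );
}

const xml = buildSitemap('https://example.com', ['/', '/about', '/products']);
console.log(xml);
```

Save the output as `sitemap.xml` at your site root and submit that URL in Search Console or Bing Webmaster Tools.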
Angular JS SEO Indexing
First and foremost, Googlebot and other search engine crawlers normally decide what to index on a page by looking at its source code, and Angular JS does not normally display its content within that source code. Though Google has made significant improvements in JS indexation, it is vastly ahead of competing search engines in this respect, and Google alone does not account for all the traffic a site receives.
The answer, then, to making your Angular JS based site SEO friendly is pre-rendering. In other words, it is essential to create an HTML snapshot of your content within the source code. By doing so, you allow search bots to index content that is otherwise hidden from them behind single-page interactions.
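One period-typical way to serve such snapshots was Google’s (since deprecated) AJAX crawling scheme, where a crawler requests `?_escaped_fragment_=` in place of the hashbang URL and the server answers with the saved HTML. The sketch below shows that routing decision only; `snapshotDir` and the URLs are illustrative assumptions, not part of any library.

```javascript
// Sketch of the (now-deprecated) "_escaped_fragment_" scheme: when a crawler
// requests /app?_escaped_fragment_=/products, serve a saved HTML snapshot
// instead of the JS application. `snapshotDir` is a hypothetical location.
function snapshotPathFor(requestUrl, snapshotDir) {
  const url = new URL(requestUrl, 'https://example.com');
  const fragment = url.searchParams.get('_escaped_fragment_');
  if (fragment === null) {
    return null; // normal visitor: serve the Angular JS app as usual
  }
  const route = fragment === '' ? '/index' : fragment;
  return `${snapshotDir}${route}.html`;
}
```

Your web server would check this for every request and stream the snapshot file back when the function returns a path.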
Do not trust search engines to handle all the rendering. Pre-rendering your own site will prove a stronger guarantee than any improvements search engines will make for a long time. Hiring professional consultants who are well versed in optimizing websites for search engines may be advisable as well. If you opt to do it yourself, there are some resources available for checking how your web pages render.
Prerendering can be achieved with the help of services such as prerender.io, which loads your JS pages in a headless browser, saves the resulting HTML, and serves it to crawlers, or with a headless browser such as Phantom JS, which lets you generate the same snapshots yourself.
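The core decision such a service makes is simple: known crawler user-agents get the pre-rendered HTML, everyone else gets the live app. Below is a minimal sketch of that check; the user-agent list is an illustrative assumption, not prerender.io’s actual list or API.

```javascript
// Sketch of how a pre-render proxy decides whether to serve a snapshot.
// The bot list here is an illustrative assumption, not prerender.io's own.
const BOT_AGENTS = ['googlebot', 'bingbot', 'yandexbot', 'duckduckbot'];

function shouldServeSnapshot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_AGENTS.some((bot) => ua.includes(bot));
}
```

A real service adds caching, re-crawling, and fallbacks on top of this, which is why most sites use the hosted option rather than rolling their own.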
Other advanced tools are available. Browseo is a web app that views your page purely as HTML, much as a search engine does. It not only helps its users check how the page renders, but provides additional information such as the number of words on the page, the headings, the links available, tags, keywords, and other elements that are readable for indexing. If you are unsure which areas are relevant for SEO, Browseo highlights them for you. No download is necessary, and if you’re seriously considering using this tool but remain unsure how to go about it, the site provides a blog detailing how it works.
Another way to check how well your website renders is to use Google Search Console’s ‘Fetch as Google’ feature. By running a “fetch”, you can test how well Google can crawl your webpage and how many of your page resources are unavailable to Googlebot. It is basically a crawl simulation.
To check the rendering for other search engines, you may check the most recent cached version of your page on either Bing or Google. This will show you exactly what came up in the bots’ last crawl through your page.
In summary, to make sure you don’t lose potential visitors to SEO problems, simply prerender your site beforehand with the help of the tools above so that it is one hundred percent readable by search engines, and remember to check your links. Search engines like Google may be evolving to better handle Angular JS based sites, but it makes no sense to lose revenue while you wait!