Let’s get down to the real mojo. In the earlier article we discussed the basics of the Google spider and established how important crawling is for your website’s ranking.
Are you thinking, “Hey, all that is cool, but just tell me how I can make my website spider-friendly and improve its ranking”?
Let’s cover the details of how to make your website Google-friendly.
How to Make a Google-Friendly Website Structure
The flat vs. deep website structure debate has always been a common SEO discussion. Most experts agree that a flatter structure not only promotes easier navigation and a better user experience but also boosts rankings.
True, but won’t that reduce the number of links pointing to a given sub-page?
Before we delve into this, let’s take a quick look at what flat and deep website structures are. A flat website is more horizontal: most of its important sections sit at the top level, with subsections directly underneath each. Because the bulk of its content is concentrated near the top, the architecture stays shallow.
A deep structure is more nested: you have to click from one link to the next to reach a page.
Without getting into the technicalities, what you need to know is that a flatter website structure is a good option from a crawlability standpoint as well.
First, it helps the Google spider find relevant pages easily and index them accordingly. If your most important page sits at the top and is easily accessible, Google gives it priority, which improves its authority and, in turn, its rankings. And since content in a flat structure is concentrated near the top, with the top-level tabs linking out to subtopics, the spider can crawl it all easily.
But note that too many internal links can adversely affect the spider’s crawling patterns: the bot may crawl and index only a select few pages rather than all of them, which could mean your important web pages don’t get crawled.
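To make the flat-vs.-deep idea concrete, here is a minimal sketch that counts URL path segments as a rough proxy for click depth, i.e., how far a page sits from the homepage. The URLs are hypothetical examples, not from any real site.

```python
def click_depth(url_path: str) -> int:
    """Count path segments -- a rough proxy for how many clicks
    it takes a spider (or user) to reach a page from the homepage."""
    return len([seg for seg in url_path.strip("/").split("/") if seg])

# Hypothetical examples of a flat vs. a deeply nested page:
flat_page = "/services/seo-audit"
deep_page = "/services/marketing/online/seo/audit"

print(click_depth(flat_page))  # 2 -- reachable near the top
print(click_depth(deep_page))  # 5 -- buried five levels down
```

The fewer segments a page has, the sooner the spider reaches it, which is the core argument for a flatter structure.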
Create a blueprint for your website
For both existing and new websites, it makes sense to create a hierarchical blueprint to give you a visual idea of what your website would ultimately look like.
You can do this manually, use a tool like Visio, or do it straight in MS Excel. Move the tabs around, check that you don’t have too many internal links, and keep the sections balanced: the number of subsections under one section should not be more than twice the number under another.
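If you prefer something more programmatic than Visio or Excel, the blueprint can be sketched as a nested dictionary and checked against the balance rule above (no section having more than twice as many subsections as another). The section names here are hypothetical placeholders.

```python
# Hypothetical site blueprint: sections mapped to their subsections.
blueprint = {
    "Services": ["SEO Audit", "Content", "Link Building"],
    "Blog": ["Crawling Basics", "Site Structure"],
}

def is_balanced(sections: dict) -> bool:
    """Check the rule of thumb: no section should have more than
    twice as many subsections as any other section."""
    counts = [len(subs) for subs in sections.values()]
    return max(counts) <= 2 * min(counts)

print(is_balanced(blueprint))  # True: 3 vs. 2 subsections is within 2x
```

This is just a quick sanity check on the hierarchy, not a substitute for actually sketching the structure out.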
Superior Website design and Link structure
Besides the architecture, your overall design and link structure also affect crawlability. A well-structured site also helps Google generate sitelinks, the extra links that appear beneath your listing on SERPs.
Google itself admits that its spider doesn’t automatically discover every website and crawl every page on it.
Some pages may get discovered anyway, but mostly a good website structure is what makes it easy for bots to crawl and index your site, besides making it easy to navigate. Easy navigation should be the top priority, with a link structure that directs the spider to all your pages instead of a select few. For this, you need a sitemap.
As the name suggests, a sitemap is your way of telling search engines about your website’s structure and how its content is organized. For content-heavy websites, a sitemap is a definite must.
In fact, our suggestion is that every website – irrespective of the amount of content – must add a sitemap. Also, submit it through Google Search Console.
From Google’s Webmaster blog:
Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.
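A sitemap is just an XML file listing your URLs. As a minimal sketch, here is how one could be generated with Python’s standard library; the URLs are hypothetical placeholders you would swap for your own pages.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string following the
    sitemaps.org protocol (urlset > url > loc)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages on your site:
sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/services/",
])
print(sitemap)
```

Once the file is saved at your site root (typically `/sitemap.xml`), you can submit it through Google Search Console as mentioned above.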
More Quality Content
In the first part, we mentioned that Google’s bots work from a cached copy of your pages. So every time you update your content, it not only helps you get more views and traffic but also improves crawl frequency, as bots crawling your site look for changes.
If there is a change – which there is in terms of new content – bots crawling your site will make a note and then come back again.
So with new, quality content, you are inviting bots to crawl your site more often.
Quality, informative content that gets readers to visit and revisit your website is what will ultimately pull up your rankings, even if Google might not say so outright.
Additional Tip – Another way to make your website spider-friendly is cleaner code. Clean code gives the spider a direct route to take instead of forcing it to work that out itself. Get an SEO audit to see whether your website is spider-friendly. If it isn’t, no need to sweat it; just follow these tips and you’ll have Google eating out of your hand.