October 23, 2016 Chris Barnhill

What it Takes to Index a New Site

So your brand new website is live.

You’ve spent the time crafting the perfect copy, sweating every pixel in the design, and debating endlessly over what will provide your users with the best possible experience. Now it’s time to get it in front of people. You type the name of your company into Google, but nothing comes up. That’s because Google needs to index your site before it can appear in search results. For that to happen, its crawler, known as “Googlebot,” needs to find your website and report back to Google on what it contains. There are a few steps you can take to expedite this process and make sure your website is reaching the leads you are looking for.

Inbound Links

Once your site is launched, do what you can to encourage known web properties to include links to your site on their pages. As Googlebot crawls already known sites in search of changes, it will make note of new unknown links and set them aside for future crawling. The more quality inbound links, the better, but pay special attention to the word “quality.” Google will punish sites it feels are utilizing less-than-genuine tactics to improve their position on search rankings.

Robots.txt

When Googlebot makes it to your website, it will look for this file to see which pages it may crawl and which it may not. For this reason, your robots.txt file must live in your website’s top-level directory. Another important feature of the robots.txt file is that it can point Googlebot to your sitemap, another crucial tool in making sure your site is indexed. What’s the deal with this sitemap, you say?
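As a hypothetical example, a minimal robots.txt for a new site might look something like this (the Disallow path and sitemap URL are placeholders, not recommendations for your site):

```
# Apply these rules to all crawlers, including Googlebot
User-agent: *
# Keep crawlers out of a private area (placeholder path)
Disallow: /admin/

# Point crawlers to the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

The blank-line separation matters: each `User-agent` group is its own record, while the `Sitemap` line applies site-wide regardless of which record it sits near.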

Sitemap

A sitemap is an XML file that lists each page on your website and tells crawlers which pages they are welcome to visit. Think of it as a map of your website that Googlebot can easily read and understand. It also lets you assign each URL a priority relative to the other URLs on your site (your homepage might be a 1.0, while your privacy policy is a 0.2). It is important to keep this file updated with the current structure and content of your website, but don’t worry, there are many tools that can do this for you. The Yoast plugin for WordPress, for example, will automatically create and update your sitemap as you build and make changes to your website’s structure.
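To make that concrete, here is a minimal sketch of what such a file might contain, following the sitemaps.org XML format (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest relative priority -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-10-23</lastmod>
    <priority>1.0</priority>
  </url>
  <!-- Privacy policy: low relative priority -->
  <url>
    <loc>https://www.example.com/privacy-policy</loc>
    <priority>0.2</priority>
  </url>
</urlset>
```

Note that `priority` is only a hint about which of your own pages matter most relative to each other; it does not affect how you rank against other sites.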

Google Search Console

Now that you have these tools in place, you can sit back and wait for Googlebot to breeze on through and send your website’s data back to the mothership. Or you can take a more proactive step and submit it yourself through the Google Search Console (formerly Google Webmaster Tools). From your Search Console dashboard, you can see which parts of your site have been crawled, some basic search analytics, and which (if any) of your URLs have been submitted via your sitemap. You also have the option to submit your sitemap for crawling for the first time, which can help expedite the indexing process. This doesn’t guarantee that your site will be crawled immediately, but it does make Google aware of the site in case Googlebot hadn’t already come across it by way of inbound links.
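Before submitting, it can be worth sanity-checking that your robots.txt rules actually allow the pages you want indexed. Here is a minimal sketch using Python’s standard-library `urllib.robotparser`; the rules and URLs are placeholder examples, not from a real site:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt rules for a hypothetical site
rules = """User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Pages you expect to be indexed should be fetchable...
print(parser.can_fetch("Googlebot", "https://www.example.com/about"))  # True
# ...while anything under a Disallow rule should not be.
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/settings"))  # False
```

Because no group names Googlebot specifically, it falls back to the `User-agent: *` record, so the `/admin/` block applies to it as well.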

From here you can start making use of your web analytics to gain insight into how your site is reaching your desired audience. There are lots of other SEO factors that will determine how and where your site will rank in various keyword searches, but these tips will at least get you on the map.