Using Google Webmaster Tools
You will need a Google account (Gmail, YouTube) to access the tools. Simply head over to http://www.google.com/webmasters/tools and log in. From the Search Console, you can see a list of your current websites. Google calls these properties. If you've not added any yet, the list will be empty.
Click the large "Add a property" button to get started. From here, just enter the URL and click Add.
The next step is to verify that you are the owner. There are several ways to do this: if you already use Google Analytics, you can simply link the property to your Analytics property. You can also upload a file to the root of the domain, add a meta tag to the header of the homepage, or add a record to your DNS. Uploading the provided file is usually a good option.
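As an illustration of the meta tag method, the tag goes inside the `<head>` of your homepage and looks something like the following. The token shown here is a placeholder; Google generates a unique value for your property during verification.

```html
<!-- Placed inside <head> on the homepage only.
     The content value below is a placeholder; use the token
     Google provides on the verification page. -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```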
Once you click verify, Google Webmaster Tools will go away and verify your domain. Once that is complete you can access the full Search Console and find out what is going on with your search engine optimisation.
The first thing you will see is a list of messages from Google about your site's performance. If your site has any problems, for example broken links, invalid sitemaps or connectivity errors, they will show here. Depending on your settings, Google may also email you a copy of these messages. There is also an overview of your site's performance, which includes sections covering crawl errors, search analytics, and sitemaps. Each of these summary boxes links to the corresponding full report for more detailed information.
If you're adding a brand new website, there may not be any information available yet. It sometimes takes a few days or a few weeks for data to show.
Let's have a look at each report in turn, what information it gives us and why it's important.
The Crawl section of the Search Console shows detailed information about any problems Googlebot encountered whilst crawling your site, such as broken links, server connectivity issues, and server errors.
The most important report in this section is the "Not Found" errors report. These are also known as 404 errors. This report will show you the broken links on your site so you can fix them, either by repairing the link or by redirecting the broken URL to another page.
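One common fix is a permanent (301) redirect from the dead URL to a working replacement. As a sketch, assuming an Apache server and purely illustrative paths, an `.htaccess` rule might look like this:

```apache
# .htaccess in the site root (paths are illustrative examples).
# Permanently redirect the dead page to its replacement so both
# visitors and Googlebot land on a working URL.
Redirect 301 /old-page.html /new-page.html
```

How you set up redirects depends on your server; the equivalent on nginx or IIS uses different configuration, but the principle is the same.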
You can also see any server errors, such as server-side code errors or configuration errors on the server. These should also be fixed as a high priority, as they can often disclose internal implementation details, which in turn makes the site easier to attack.
Clicking on any of the broken links will show the detail for the error: how often the error occurs, when it first occurred, and which pages and websites link to that URL. You can also fetch the page as Googlebot and see what the search engine sees. This can be helpful in diagnosing problematic 404 errors where the page does exist.
Once you have resolved an issue, you can mark it as fixed by clicking the tick box followed by "Mark as Fixed". Google will then go away and re-crawl those pages. Check back in a few days and it should no longer appear in the list.
Another report in the Crawl menu is Crawl Stats. From here you can see how many times Google tries to index your pages per day, how much bandwidth it uses, and how long it spends downloading content. Google is normally pretty good at keeping the crawl rate low, but sometimes it can really hammer a site, especially if the server isn't very powerful. In these cases you can throttle the crawl rate by clicking the gear icon in the top right and selecting "Site Settings". From there you can choose to set the crawl rate manually.
Also within the Crawl menu is a tool for testing your robots.txt file. This is especially useful if a particular page or section is not getting indexed by Google. The page shows the contents of the robots.txt file as Google sees it, reports any errors in the file, and provides an option to test a URL. Simply enter a URL in the box provided and click Test. The tool will then fetch the page as Googlebot and report whether or not it is blocked by robots.txt.
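As a reminder of the format being tested, a minimal robots.txt served from the site root might look like this (the `/private/` path and domain are purely illustrative):

```text
# robots.txt, served from e.g. https://example.com/robots.txt
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers, including Googlebot, and `Disallow` blocks crawling of anything under the given path. An accidental `Disallow: /` here is a common reason a site vanishes from the index.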
The final section of note in the Crawl menu is the Sitemaps page. This report gives an overview of your website's sitemap performance, detailing the number of unique URLs submitted in the sitemap and the number of URLs currently in Google's index. Any errors in the sitemap are also listed here. If you don't have a sitemap, you can submit one, either to test it or for inclusion.
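For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org 0.9 protocol.
     URLs and dates below are illustrative placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>       <!-- page URL (required) -->
    <lastmod>2017-01-01</lastmod>         <!-- last modified (optional) -->
  </url>
</urlset>
```

Each additional page gets its own `<url>` entry; only `<loc>` is required.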
Now that your site is being indexed and has no crawl errors, we can take a look at how people are finding it. The Search Analytics report in the Search Traffic section is key here, as it shows information about the keywords used to find your site, something that cannot be accessed from any other source. It lists keywords, how many times your site appeared for each keyword (impressions), and how many clicks your link received. It also shows the link's average Search Engine Results Page (SERP), that is, the results page on which the link appears.
From this report, you can find out a few bits of crucial information.
- If your link ranks high (position around 1) but doesn't get many clicks, you may need to look at your meta description or title to entice people to click your link over a competitor's.
- If your link ranks high and the impressions are high (>100), but it is not getting clicks, you also need to optimise the meta description and title.
- If your link ranks low (position > 3), you need to optimise the page for that keyword.
The report will also show a list of websites linking to your site. This can be useful to know, as it shows the sites bringing in traffic that you may want to link back to. You can also see your most linked content and how it is linked.
The next section to look at in the Search Console is the Search Appearance. This section will give you a few tools which you can use to see how Google views your site and how the data is displayed in the search pages.
If your site uses structured data (microdata, rich cards), you can see how Google interprets it and whether any errors are preventing it from showing correctly in the results. You can also use the Data Highlighter to help Google understand the data on your site by tagging bits of relevant content.
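As an example of the kind of markup these reports check, a simple schema.org Article marked up with microdata might look like the following (the names and dates are illustrative):

```html
<!-- schema.org Article marked up with microdata.
     Author, headline, and date values are illustrative. -->
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Using Google Webmaster Tools</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2017-01-01">1 January 2017</time>
</article>
```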
Finally, there is a report listing any HTML improvements Google thinks would make your pages better. It highlights issues such as duplicate content, duplicate descriptions or titles, content that is too short, and any other areas it deems beneficial to resolve.
The final area of interest is the Security Issues tab. This shows whether Google has detected any malicious content on your websites, something which could lead to Google blacklisting your domain.
Typically these come from your site being hacked and malware being uploaded to it. If anything is detected here it should be addressed as a high priority. Once the issue(s) have been fixed, you need to mark them as resolved. Due to the severity of security issues, each item is manually verified as fixed by a human at Google. Because of this, it may take a bit of time to get relisted.