Eight Tips to Make the Best Use of Google Webmaster Tools

Posted by Walther Ipsen on April 21st, 2021

Google Webmaster Tools is a set of tools that lets webmasters administer and monitor their sites. The tools are free, and all you need is a Google account. Once you log into the webmaster console you can add all your sites and control how the Google bot interacts with them. This is one of the best resources Google makes available, and if you are not using it you are missing a great opportunity. For those of you who are already using it, here are a few tips to get more out of Google Webmaster Tools.

Submit Your Sitemap
A sitemap is a basic XML document that lists all the URLs on your site. It is a very useful document, and Google will use it as a reference when crawling your pages. Submitting one should be the first thing you do after verifying your site. If you are not comfortable writing XML by hand, you can use one of the free online tools to generate a sitemap. A sitemap is particularly useful if your site has a complex navigational structure. (A minimal sample sitemap appears at the end of this article.)

Address Canonical Issues
In the webmaster console you can set your URL preference to either the www or non-www format. This prevents many problems arising from canonical issues and duplicated content. (More on addressing canonical problems in a later article.)

Check Crawling Stats
In the webmaster console you can access your crawling stats. Look at how much time the Google bot has spent on your site and the average amount of data downloaded (the lower, the better). You can also get details about broken links and other HTTP errors. There is also a feature called Fetch as Googlebot, which displays a page the way the Google bot sees it. This is helpful for identifying whether your content is actually visible to the spider.

Create a Robots.txt File
A robots.txt file is a simple text file that gives the spider directions on how to crawl your website. If you do not want Google to crawl certain pages, you can block them using the robots.txt file. If you want the bot to spider all pages, a robots.txt file is not strictly required, although I prefer to have one for the sake of completeness. (A short example robots.txt follows the sitemap sample at the end of this article.)

Links to Your Website
The webmaster console also shows the total number of links pointing to your site. Do not be surprised if the number looks very low: this is not real-time data, and it should only be treated as a ballpark figure. It is still useful nevertheless.

Analyse Keywords
Using the data obtained by crawling your site, Google displays a set of keywords relevant to your pages. This is a useful feature and will help you analyse and correct keyword density and placement.

Analyse Search Terms
The webmaster console provides another powerful feature called search queries: a summary of all the search queries leading to your website. This is very useful because it lets you focus on the keywords that are actually bringing traffic to your site.

Check Site Speed
The speed with which your site loads is an essential factor in drawing visitors; a site that loads slowly is likely to be skipped by impatient viewers. It is therefore prudent to test your site speed using Google Webmaster Tools. Based on the results you can take remedial actions such as compressing pages or serving them from a cache. (An example of enabling compression on an Apache server is included at the end of this article.)
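
Sample sitemap
For reference, here is a minimal sketch of a sitemap in the standard sitemaps.org format. The URLs and dates are placeholders; replace them with your own pages.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; only <loc> is required -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2021-04-21</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/about.html</loc>
      </url>
    </urlset>

Save it as sitemap.xml in the root of your site and submit its URL in the Sitemaps section of the webmaster console.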
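
Sample robots.txt
Here is a minimal example robots.txt. The /admin/ path is a hypothetical section you might not want crawled; adjust the rules to your own site, and omit the Disallow lines entirely if everything should be crawled.

    # Applies to all crawlers
    User-agent: *
    # Block a hypothetical private area from crawling
    Disallow: /admin/

    # Point crawlers at the sitemap
    Sitemap: http://www.example.com/sitemap.xml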
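
Sample compression setting
As one example of the remedial actions mentioned under site speed, here is a sketch of enabling gzip compression, assuming an Apache server with mod_deflate available; other servers have their own equivalents.

    <IfModule mod_deflate.c>
      # Compress common text-based responses before sending them
      AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/javascript
    </IfModule>

These directives can go in the site's .htaccess file or the main Apache configuration.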
