Eight Tips to Make the Best Use of Google Webmaster Tools

Posted by Walther Ipsen on April 21st, 2021

Google Webmaster Tools is a set of tools that enables webmasters to administer and monitor their sites online. The tools are free of charge, and all you need is a Google account. Once you log in to the webmaster console you can add all of your sites and control how the Google bot interacts with them. It is one of the most effective resources Google offers, and if you are not using it you are missing an excellent opportunity. For those of you already using it, here are a few tips to get more out of Google Webmaster Tools.

Submit Your Sitemap

A sitemap is a basic XML document that lists all of the URLs on your site. It is a very helpful document, and Google will use it as a reference when crawling your pages. Submitting one should therefore be the first thing you do after verifying your website. If you are not comfortable writing XML by hand, you can use one of the free online tools to generate a sitemap. A sitemap is especially useful if your site has a complex navigational structure.

Address Canonical Issues

In the webmaster console you can set your URL preference to either the www or the non-www format. This prevents many problems arising from canonical issues and duplicate content. (More on addressing canonical problems in a later article.)

Check Crawling Stats

On the webmaster console you can access your crawling stats. Look at the amount of time the Google bot has spent on your site and the average amount of data downloaded per page (lower is better). You can also get information about broken links and other HTTP errors. There is an additional feature called Fetch as Googlebot, which displays a page exactly as the Google bot sees it. This is helpful for checking whether your content is clearly visible to the spider.

Create a Robots.txt File

A robots.txt file is a simple text file that gives the spider directions on how to crawl your website. If you don't want Google to crawl certain pages, you can block those pages using the robots.txt file. If you want the bot to spider all pages, there is no need for a robots.txt file at all, although I prefer to have one for the sake of completeness.

Links to Your Website

The webmaster console also shows the total number of links pointing to your site. Do not be surprised if the number looks very low: this is not real-time data and should only be treated as a ballpark figure. It is still useful nevertheless.

Analyse Keywords

Based on the data obtained by crawling your site, Google displays a list of keywords relevant to your pages. This is a useful feature that will help you analyse and correct keyword density and placement.

Analyse Search Terms

The Google webmaster console provides another powerful feature called search queries. This is a summary of all the search queries that led visitors to your website. It is very useful because it lets you focus on the keywords that are actually bringing you traffic.

Check Site Speed

The speed with which your website loads is an important factor when it comes to drawing visitors. A site that loads slowly is likely to be abandoned by impatient viewers. It is therefore prudent to test your site speed using Google Webmaster Tools. Based on the results, you can take remedial action such as compressing pages or serving them from a cache.
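To make the sitemap tip above concrete, here is a minimal sketch of a sitemap following the sitemaps.org protocol. The example.com URLs and dates are placeholders; only the `<loc>` element is required for each URL, while `<lastmod>` and `<changefreq>` are optional hints.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2021-04-21</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Save the file as sitemap.xml in your site's root directory, then submit its URL from the Sitemaps section of the webmaster console.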
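As an illustration of the robots.txt format described above, here is a sketch that blocks a hypothetical /private/ directory while allowing everything else, and also points crawlers at the sitemap (the example.com URL and the /private/ path are placeholders):

```text
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of this directory
Disallow: /private/

# Optional: tell crawlers where the sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```

The file must be placed at the root of your domain (http://www.example.com/robots.txt), since that is the only location crawlers check.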


