Google Search Console and Optimization of Websites

Google Webmaster Tools was renamed and repackaged as Google Search Console in 2015. Google Search Console is a free platform that helps webmasters get their web pages indexed and optimize how their sites appear on Google's search results pages as part of search engine optimization. It allows you to perform meaningful analysis and make changes that affect how Google crawls, indexes, and understands your site content.

Here are the main features of Google Search Console:


  • Search Analytics

Search Analytics is one of the most popular features of Google Search Console. It helps you analyze your website's performance on Google's search results pages. Search Analytics provides search metrics for the website, including clicks, impressions, rankings, and click-through rates, and it allows you to group and filter the data by the following categories: Pages, Queries, Devices, Search Type (not for apps), Search Appearance (not always shown), and Date.
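The same data can also be pulled programmatically through the Search Console API's `searchanalytics.query` method. As a rough sketch, a request body grouping the metrics by the categories above might look like this (the dates, dimensions, and row limit shown are placeholder values for illustration):

```json
{
  "startDate": "2024-01-01",
  "endDate": "2024-01-31",
  "dimensions": ["query", "page", "device"],
  "rowLimit": 25
}
```

Each dimension in the request corresponds to one of the grouping categories listed above, and the response rows carry the clicks, impressions, CTR, and position for each group.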

  • HTML Improvements

This report flags potential issues found when Google crawls and indexes the website, such as problems with meta titles and meta descriptions, and non-indexable content. Review this report regularly to identify changes that can improve your rankings in Google search results pages while providing a better experience for your readers.

  • Crawl Errors

The Crawl Errors report helps you solve various crawling problems, grouped into two main kinds: site errors and URL errors. All the errors Googlebot encounters while crawling your website's pages are shown clearly, and these errors can prevent your pages from appearing in search results.
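The split between site errors and URL errors can be illustrated with a small sketch. This is not how Search Console itself classifies errors internally, just a rough illustration of the distinction, keyed off HTTP status codes (with `None` standing in for a DNS or connectivity failure):

```python
from typing import Optional

def classify_crawl_error(status: Optional[int]) -> str:
    """Illustrative grouping of crawl problems, loosely mirroring how
    Search Console splits them into site-level and URL-level errors.
    `None` represents a request that never got a response (DNS, timeout)."""
    if status is None:
        return "site error (DNS or connectivity failure)"
    if status >= 500:
        return "site error (server error)"
    if status == 404:
        return "URL error (not found)"
    if status in (401, 403):
        return "URL error (access denied)"
    return "no crawl error" if status < 400 else "URL error (other)"

print(classify_crawl_error(None))  # site error (DNS or connectivity failure)
print(classify_crawl_error(404))   # URL error (not found)
```

Site errors affect the whole site (Google cannot reach the server at all), while URL errors are specific to individual pages.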

  • Fetch as Google

Fetch as Google is a tool that lets you test how Google crawls a website: Googlebot fetches the page, shows how it renders, and reports whether any page resources (such as images or scripts) are blocked to Googlebot. This is useful for debugging crawl issues on the website, including changes in the content, title tag, etc. The tool helps you communicate with the search engine bots and find out whether a page can be indexed, and it indicates when the site is not being crawled or is blocked by coding errors or robots.txt.
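One of the checks this kind of fetch surfaces is whether a page's own markup forbids indexing via a robots meta tag. As a small offline sketch (not Google's actual implementation), the check can be approximated with the standard-library HTML parser:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags,
    which tell crawlers whether a page may be indexed."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            content = a.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_indexable(html: str) -> bool:
    """A page is treated as indexable unless a robots meta tag says noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives

print(is_indexable('<meta name="robots" content="noindex, nofollow">'))  # False
print(is_indexable('<meta name="robots" content="index, follow">'))      # True
```

Fetch as Google goes much further (it also renders JavaScript and checks blocked resources), but the noindex check above is one concrete signal it reports.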

  • Sitemaps & Robots.txt Tester

An XML sitemap helps search engines (Google, Yahoo, Bing, etc.) understand the website better while their robots crawl it. The Sitemaps section lets you submit your sitemap and test that it can be crawled. A sitemap is not strictly required for indexing, but it helps Google discover and crawl your pages, especially on large or new sites. Robots.txt is a text file that instructs search engine bots what to crawl and what not to crawl, and the robots.txt Tester shows which URLs are blocked or disallowed by robots.txt, as shown in the figure.
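You can reproduce the robots.txt Tester's core question, "is this URL blocked for a given crawler?", with Python's standard-library `urllib.robotparser`. The robots.txt content and URLs below are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# A small robots.txt, inline for illustration (the paths are made up).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# can_fetch answers the same question as the robots.txt Tester:
# may this crawler fetch this URL under the rules above?
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

Note that robots.txt only controls crawling, not indexing: a disallowed URL can still appear in search results if other pages link to it, which is why the noindex meta tag exists as a separate mechanism.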

Other features provided by Google Search Console include Rich Cards, Structured Data, Accelerated Mobile Pages (AMP), Index Status, and sitemap submission.

