This tool was known as Google Webmaster Tools for many years; Google rebranded it as Google Search Console in 2015.
Adding a Sitemap:
Sitemaps are files that provide search engines and web crawlers with valuable information about how our site is built and the type of content available on it. Sitemaps can include metadata with details about our site, such as data about image and video content and how often our site is updated.
By submitting our sitemap to Google Search Console, we make Google's job easier by ensuring that it has the information it needs to do its job.
First, we have to create a sitemap for our site, something similar to https://classicpolos.com/sitemap.xml. Then we can submit it, and resubmit it from time to time, under the Index tab – Sitemaps.
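To give a rough idea of what such a file contains, here is a minimal sketch that generates a basic sitemap.xml with Python's standard library; the page URLs and change frequencies below are placeholders, not real entries from the site.

```python
# A minimal sketch of generating a basic sitemap.xml with the Python
# standard library. The URLs and change frequencies are placeholders.
import xml.etree.ElementTree as ET

pages = [
    ("https://classicpolos.com/", "daily"),
    ("https://classicpolos.com/about", "monthly"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, changefreq in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq

# Writes the file that would then be submitted under Index – Sitemaps.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```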
Checking a robots.txt file:
Having a website doesn't necessarily mean we want all of its pages or directories indexed by search engines. If there are things on our site we would like to keep out of search engines, we can achieve this with a robots.txt file. A robots.txt file placed in the root of our site tells search engine robots (web crawlers) what we do and do not want indexed, using rules known as the Robots Exclusion Standard.
Under the Crawl option, we can choose robots.txt Tester. The robots.txt Tester tool lets us examine our robots.txt file, make changes to it, and alerts us to any errors it detects. We can also choose from a range of Google's robots or crawlers and enter a URL we wish to allow or disallow, then run a test to see whether that URL is accessible to the chosen crawler.
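For a quick offline check in the same spirit, Python's standard library can answer the same allow/disallow question against a live robots.txt file. This is only a sketch with an example site URL and user agent, not the Search Console tool itself.

```python
# A quick local check of a robots.txt file, similar in spirit to the
# robots.txt Tester: ask whether a given crawler may fetch a given URL.
# The site URL, user agent, and path below are just examples.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://classicpolos.com/robots.txt")
parser.read()

# Prints True if Googlebot is allowed to crawl this URL, False otherwise.
print(parser.can_fetch("Googlebot", "https://classicpolos.com/checkout/"))
```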
Fetch as Google and submit to index:
If we have made changes to our website, the quickest way to get the updates indexed by Google is to submit them manually. This allows changes to things such as on-page content, title tags, meta tags, and alt text to appear in search results as soon as possible.
To do this, under the Crawl option we can choose Fetch as Google, or use the URL Inspection tool. Once we enter the page we need to get indexed, the Fetch and Render button starts the process. The time this takes can vary depending on the number and size of the pages being fetched or inspected.
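Fetch as Google itself is only available in the web interface, but as a sketch, the index status of a single page can also be checked programmatically, assuming the newer Search Console URL Inspection API and an OAuth 2.0 access token obtained separately; the token, property URL, and page URL below are placeholders.

```python
# A sketch of checking a URL's index status, assuming the Search Console
# URL Inspection API and an OAuth 2.0 access token obtained separately.
# All values below are placeholders.
import requests

ACCESS_TOKEN = "ya29.example-token"            # placeholder, obtain via OAuth
SITE_URL = "https://classicpolos.com/"         # property as registered in Search Console
PAGE_URL = "https://classicpolos.com/new-page" # page whose index status we want

response = requests.post(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL},
)
print(response.json())
```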
Site errors in Google Search Console:
Google Search Console can promptly notify us of any errors it finds on our site. If we want to check our site for internal errors, we can do so from the Crawl Errors page, which reveals any site or URL errors found by Google's bots while indexing our pages.
Any URL errors found are displayed at the bottom; we can click an error for a description and further details.
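Alongside the report, a quick manual spot check is sometimes useful. The sketch below simply requests a few of our own URLs and flags anything that does not return HTTP 200; it is only a local check with placeholder URLs, not the Crawl Errors report itself.

```python
# Not the Crawl Errors report itself, just a quick local spot check: request
# a few of our own URLs and flag anything that does not return HTTP 200.
# The URL list is a placeholder.
import requests

urls = [
    "https://classicpolos.com/",
    "https://classicpolos.com/old-page",
]

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"{url} returned {status}")
```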
Other things Google Search Console can be used for include:
Identifying our highest-traffic pages, monitoring our CTR over time, and monitoring our impressions.
Rankings are very significant, and Search Console allows us to identify ranking increases and decreases. We can compare our site's search performance across desktop, mobile, and tablet, monitor how many total backlinks our site has, and track other things concerning SEO. This is a great tool for improving the site's SEO: with these reports, we can easily check for errors, rectify them, and find keyword ideas and links.
Performance tab
In the Performance tab, we can see which pages and which keywords our website ranks for in Google. We can see data for up to 16 months, available from the time we set up the account.
By monitoring the Performance tab regularly, we can immediately see which keywords or pages need more attention and optimization. The Performance tab lists queries, pages, countries, and devices, and the search appearance option gives us the opportunity to check how our results are doing in search. These sections can be sorted by the number of clicks, impressions, CTR, and average position.
The number of clicks tells us how often people clicked on our website in Google's search results. Impressions tell us how often our website or a specific page was shown in the search results. The CTR (click-through rate) is the percentage of people who saw our website in the search results and also clicked through to it; higher rankings generally lead to higher click-through rates. The average position shows the average ranking of a specific keyword or page over the time period we picked.
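The same per-query clicks, impressions, CTR, and average position figures can also be pulled programmatically. The sketch below assumes the google-api-python-client package and OAuth credentials set up separately; the access token, property URL, and dates are placeholders.

```python
# A sketch of pulling performance data via the Search Console API, assuming
# the google-api-python-client package and OAuth credentials obtained
# separately. The token, property URL, and dates are placeholders.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

credentials = Credentials(token="ya29.example-token")  # placeholder OAuth token
service = build("searchconsole", "v1", credentials=credentials)

report = service.searchanalytics().query(
    siteUrl="https://classicpolos.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 10,
    },
).execute()

for row in report.get("rows", []):
    # Each row carries clicks, impressions, CTR, and average position per query.
    print(row["keys"][0], row["clicks"], row["impressions"], row["ctr"], row["position"])
```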
URL Inspection
The URL Inspection tool helps us examine specific URLs. It retrieves the page from Google's index and compares it with the page as it currently lives on our site to see if there are differences. On this page, we can also find more technical information, such as when and how Google crawled the URL and how it appeared when it was crawled. It also reports any errors and gives information about the data found on the URL.
Speed
The speed report is a valuable addition. It gives a good sense of how fast our site loads on mobile and desktop, and it also tells us which pages have issues that keep them from loading quickly. The data comes from real users.
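For a programmatic look at similar real-user speed data for a single page, one option is the PageSpeed Insights API, which draws on the same Chrome UX field data. This is only a sketch; the page URL is a placeholder, and an API key may be needed for heavier use.

```python
# A sketch of checking real-user speed data for one page with the PageSpeed
# Insights API. The page URL is a placeholder; an API key may be needed
# for heavier use.
import requests

response = requests.get(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    params={"url": "https://classicpolos.com/", "strategy": "mobile"},
    timeout=60,
)
data = response.json()

# Field data from real users, when available, sits under "loadingExperience".
print(data.get("loadingExperience", {}).get("overall_category"))
```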
Links
Within the links to our site part, we can find how many links from other websites point to our website. Besides that, we can see which websites link to us, how many links each of those websites contains to our site, and which anchor texts are used most when linking to our website. This is helpful information because links are still very important for SEO.
Within the internal links section, we can check which pages of our website are linked to most from other places on our site. This list is worthwhile to examine regularly, because we want our most important pages and posts to receive the most internal links.
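Search Console builds this internal link count across the whole site for us, but as a rough illustration of what is being counted, the sketch below tallies internal links on a single page using only the standard library; the domain and page are placeholders.

```python
# A rough sketch of counting internal links on a single page using only the
# standard library; the domain and page are placeholders. Search Console's
# internal links report does this across the whole site for us.
from html.parser import HTMLParser
from urllib.request import urlopen

DOMAIN = "classicpolos.com"

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.internal = 0

    def handle_starttag(self, tag, attrs):
        # Count <a> tags whose href points to the same domain or a relative path.
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.startswith("/") or DOMAIN in href:
                self.internal += 1

counter = LinkCounter()
counter.feed(urlopen(f"https://{DOMAIN}/").read().decode("utf-8", "ignore"))
print(f"Internal links found on the homepage: {counter.internal}")
```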
Mobile usability
The mobile usability tab within this section shows us usability problems with our mobile website or with specific mobile pages. Since mobile traffic is growing all over the world, it is advisable to check this tab regularly. If our mobile site isn't user-friendly, many visitors will leave quickly.