Author: James Clark
First published: May 2, 2022
If you want your website to succeed on Google, you need to understand how visitors make their way to your content. That means you need to learn:
How Google “sees” your content
The keywords people search for on Google that lead them to your web pages
This information is the foundation for successful SEO, whether you’re working on an enterprise-level website or a personal blog. And, there’s one resource you can use to learn both: Google Search Console (GSC).
The data that Search Console gives you access to (along with some complementary web analytics) can help you create well-informed strategies, capitalize on emerging trends, fix technical issues, and so much more—making it the quintessential tool for SEO.
In this guide, we’ll look at:
What is Google Search Console?
Search Console is a free tool from Google that, in the search engine’s own words, enables you to “monitor, maintain, and troubleshoot your site’s presence in Google Search results.” In a nutshell, it helps you understand how Google sees your site and fix issues it may have found.
Third-party SEO tools have developed ranking scores (such as Moz’s “domain authority,” for example) to estimate how Google sees particular sites. Google Search Console (GSC), on the other hand, gives you direct access to information that Google has about your site.
What does Google Search Console do?
GSC is a reporting platform, but it’s capable of much more than that. It also allows for two-way communication with Google: you can use it to tell the search engine about your site and request that it take particular actions (whether Google acts on those requests is a different matter).
The reporting side of GSC
The action side of GSC
GSC will give you information on:
You can use GSC to:
We'll look at each of these reports and capabilities (and more) in detail as part of this guide.
Who uses Search Console and why?
GSC is available to anyone who owns or manages a website and completes the verification process (more about this in the next section). Once you verify your site, you can then invite other users to that property.
Using Search Console is entirely optional and you don’t need to use it for your site to appear in organic (that is, non-paid) search results. That said, if organic traffic is at all important to your business model, then it would make sense for you to use Search Console.
On a practical level, GSC can tell you if there are any problems that might be holding back your performance in Google’s organic search. When it discovers specific issues, it will flag these by email or through alerts in Search Console itself:
And even if GSC discovers no issues, it can still help you refine your content strategy and grow your organic traffic.
This means GSC is relevant to everyone from a small business owner with a single site looking to get leads from organic traffic, through to large agencies managing a number of sites on behalf of clients.
How is GSC different from GA4?
Google Search Console and Google Analytics 4 (GA4) are both free tools from Google that give you invaluable insight into your site’s performance and help you make website/business decisions. It’s no surprise that people might confuse the two—especially as it is possible to link the tools together and see Search Console-powered reports directly within GA4. We’ll look at that in our pro tips section later.
That said, these tools are very different both in terms of the data they provide and the questions that data can help answer.
Put simply, GSC is an SEO tool that tells you how Google sees your site—it’s focused on organic search performance. On the other hand, GA4 is an analytics platform that tells you what people do on your site—it’s focused on user activity.
GSC focuses on organic search performance, and can help you answer questions such as:
However, there is some crossover—both tools can give you insight into the amount of traffic you receive from Google organic search. But, be careful when making direct comparisons as the tools use different metrics: the number of search clicks (GSC) almost certainly won’t equal the number of new users (GA4) acquired from organic search. A user may click more than once, or perhaps the click is registered in Search Console but your analytics tracking code doesn’t fire, for example.
Also, remember that Google isn’t the only search engine or the only source of organic search traffic. Many other search engines have their own equivalent of Search Console—for example, Bing offers Bing Webmaster Tools. But, as Google accounts for over 90% of global search engine market share, it probably makes sense to start with Search Console.
How to get started with GSC
In order to use Search Console, you must first verify your site to prove that you own or manage it. This is to prevent other people from accessing business-sensitive data about your website, and potentially even making changes that will affect its presence in Google search.
Google provides a number of methods to complete verification. The method you choose will depend on your technical skills and the level of access you have to your website.
You will also need a free Google account—if you have a Gmail account, then you probably have one of these already.
To begin the process:
01. Go to https://search.google.com/search-console.
02. Click on the blue Start now button.
03. Log in to your Google account (or create an account) when prompted.
04. You’ll now see the “Welcome to Google Search Console” splash screen.
(If you’re already logged in to your Google account, you’ll skip the second and third steps.)
Google Search Console site verification methods
On the “Welcome to Google Search Console” screen, you first choose the property type you want to verify: Domain or URL prefix. This determines the verification method(s) available to you.
Out of these two property types, Domain is more powerful as it will show you how Google sees your URLs across your different subdomains and protocols (HTTP or HTTPS). For example, if you verify the domain example.com, you’ll be able to access Search Console data for http://example.com, https://example.com, and https://subdomain.example.com pages.
On the other hand, URL prefix limits you to a single domain and protocol. If you verify the https://example.com prefix, you won’t be able to access data on URLs beginning with http://example.com or https://subdomain.example.com. However, there’s nothing stopping you from verifying multiple URL prefix properties. (Pro tip: This could also be a useful way to manage access if you have colleagues or partners who are working on a specific subdomain.)
The downside of the more powerful Domain approach is that it only gives you one verification method: DNS verification. This involves adding a record to your DNS configuration. Depending on your setup, you may need to do this through your domain name provider such as GoDaddy, or your website builder such as Wix.
In many organizations, the person responsible for SEO may not be the one responsible for (or even have access to) DNS configuration—in which case, the URL prefix approach could be easier.
Here, you have a choice of verification methods (in addition to the DNS approach):
Uploading an HTML file to your website (Google’s recommended approach)
Adding a meta tag to your website’s homepage
Using your Google Analytics account
Using your Google Tag Manager account
Perhaps you aren’t sure how to upload an HTML file or add a meta tag (a small piece of information about your site, almost like a label). If so, it’s worth checking whether your platform, CMS, or site builder offers any tools to make this process easier. For example, Wix has a site verification manager where you can paste in the meta tag and it will automatically be added to your site.
Note: In addition to verifying your site, Wix’s GSC integration also enables instant homepage indexation and automatically submits your sitemap to Google:
Whatever verification method you choose, remember that you will need to leave it in place even after verification. For example, if you verify with a meta tag, removing that tag at a later date will cause you to lose access to that property in Search Console. For that reason, it’s sensible to verify your site with more than one method (if you can).
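If you’re troubleshooting a lost verification, one quick check is whether the meta tag is still present in your homepage’s HTML. Here’s a minimal Python sketch of that check—the token value `abc123` is a made-up placeholder, and your real token comes from the verification screen in Search Console:

```python
import re

def has_verification_tag(html: str, token: str) -> bool:
    """Return True if the page contains a google-site-verification
    meta tag whose content matches the given token."""
    # Match <meta name="google-site-verification" content="TOKEN">,
    # tolerating extra attributes and single or double quotes.
    pattern = (
        r'<meta\s+[^>]*name=["\']google-site-verification["\']'
        r'[^>]*content=["\']' + re.escape(token) + r'["\']'
    )
    return re.search(pattern, html, re.IGNORECASE) is not None

homepage = '<head><meta name="google-site-verification" content="abc123" /></head>'
print(has_verification_tag(homepage, "abc123"))  # True
print(has_verification_tag(homepage, "wrong"))   # False
```

A check like this is easy to run against a saved copy of your homepage before you assume the problem lies with Google.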
Search Console metrics definitions
Once you’ve verified your site, it can take some time (perhaps a day or two) for data to become available in Search Console. So, if you can’t see anything useful straight away, that’s nothing to worry about.
And, if your site itself is new, there may be no performance data at all to start with—but, this will rectify itself over time as you create more content for Google to crawl and the search engine learns more about your site.
While you’re waiting, it’s a good idea to familiarize yourself with the most common metrics you’ll see in GSC.
Total impressions: The number of times a page from your site has appeared in Google’s organic (non-paid) search results.
Total clicks: The number of clicks through to your page from a Google search results page.
Average CTR: Your average click-through rate, calculated as (clicks ÷ impressions) × 100 (e.g., if you get one click from 100 impressions, your CTR is 1%).
Average position: Your average position in Google organic search results, with “1” indicating the first or top result. The lower the number, the better.
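To make the CTR calculation above concrete, here it is as a small Python function (the example numbers are arbitrary):

```python
def average_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: (clicks / impressions) * 100."""
    if impressions == 0:
        return 0.0  # avoid division by zero for pages with no impressions yet
    return clicks / impressions * 100

print(average_ctr(1, 100))   # 1.0  -> one click from 100 impressions is 1%
print(average_ctr(30, 600))  # 5.0
```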
Search Console will also tell you about the status of individual pages (URLs) on your site. There are three main steps your pages will go through:
01. Discovered: Google is aware of your page, perhaps through a sitemap or via a link from another page.
02. Crawled: Google’s bot has accessed the page and attempted to read its content.
03. Indexed: Google has added the page to its index, which means it may choose to display it as part of a search result.
It may be that certain URLs are discovered but not crawled, or crawled but not indexed. One of GSC’s main uses is to identify when this is happening and why.
Data freshness in Search Console
Before we dive into any specific Search Console reports, let’s talk for a moment about data freshness: how up-to-date the data in GSC is.
For all the GSC reports we’ll look at, data is typically available after two to three days. So if you visit GSC on a Wednesday, the freshest (most recent) data available to you may well be Monday’s.
And that freshest data is also likely to be provisional and subject to a small amount of change. You can tell if this is the case by hovering over a data point in a graph. You may see a message saying, “Fresh data - usually replaced with final data within a few days.”
GSC features overview
When it comes to finding your way around GSC, familiarity with other Google tools definitely helps. Search Console, like GA4 or Google Ad Manager, consists of a number of reports all accessible through a vertical menu along the left-hand side of the screen. The key GSC reports are grouped into four sections:
Some reports only become available once your site meets certain conditions. So don’t be alarmed if your GSC doesn’t have exactly the same reports as someone else’s, or if the reports you do have are in a slightly different order.
At the top of the main menu is a drop-down showing your currently selected property. If you have more than one property in Search Console, you can use this dropdown to switch between them. There’s also an option to “Add property” if you want to go through the verification process for another domain or URL prefix.
In this section I’ll go over the main reports and tools within Search Console, explaining what data they contain and how they can help you as an SEO or website manager.
Main dashboard overview
When you open Search Console, you’ll land on the Overview report. This gives you top-level metrics for GSC’s four key reporting sections (Performance, Indexing, Experience, and Enhancements).
The Performance part of the Overview, for example, shows you the number of clicks from organic search—while Indexing shows you the number of pages indexed (and just as importantly, not indexed).
In addition to the headline figures, each part of the Overview contains one or more graphs showing how your site has performed over the previous three months. The idea here is that you can quickly identify any unexpected performance changes and then click through to the relevant reporting section in Search Console to investigate further.
Note: Wix site owners who have verified their sites can view GSC data directly within the SEO Dashboard.
One Search Console quirk is that perhaps the single most important section is Performance, yet you may not see this section in your GSC menu at all. Two of the reports in this section, namely Discover and Google News, only appear “if your property has reached a minimum number of impressions” in those Google services. And if you don’t have access to either of these reports, GSC moves (and renames!) the third report, the one that covers your performance in Search.
This means you might find the Performance report for Search in the menu immediately after the Overview (and simply called “Performance”); alternatively, you might find it in a dedicated Performance section where it will be called “Search Results.”
Search Results report
The Performance report for Search (no matter what it is called in your GSC) gives you the data you need to understand how your content is performing on Google and how you might optimize it.
It shows a table of all the search queries your website is ranking for—in other words, the keywords and phrases that users type into Google for which your site appears in the search results.
By default, two of the four key metrics (total clicks and total impressions) are selected—but you can click on the other two (average CTR and average position) to add them to both the table and graph on this page.
The tabs immediately above the table let you swap from a list of your top search queries to a list of your top pages (with the same metrics—clicks, impressions, and so on).
But what if you want to see the search queries for one particular page?
01. Click on the PAGES tab.
02. Click on the page you are interested in.
03. Now click back to the QUERIES tab.
When you do this, GSC will add a filter to the report restricting results to just the page you selected. If you want to remove or edit this filter, look for it at the very top of the report (above the chart).
In this same section, there’s also a filter to change the search type—so if you want to know how your site is performing in Google Images or Google News, this is the place to go.
Note: Wix site owners who have verified their site with Search Console can view their Google search performance over time, top search queries on Google, and top pages in Google search results directly from their Wix Dashboard.
Comparing time periods in GSC
By default, the Search Results report (and indeed the other main reports in GSC) only shows data from the last three months—the same as the Overview. However, you can change this using the date range filter, or use the Compare option to compare one period against another.
This can be very useful if your business is seasonal. For example, a retailer selling Christmas trees would expect relevant search phrases to have high volumes in Q4, but low volumes in Q1. You can see this for yourself using a tool such as Google Trends, which shows relative interest in different search terms over time:
For this sort of business, it makes little sense to compare Q4 to Q1. Instead, it would be much more accurate to compare performance in Q4 this year to Q4 the previous year.
Be careful though, because Search Console only retains data for the past 16 months. This means it isn’t possible to compare the most recent full year against the previous one in any of GSC’s reports (as this would require at least 24 months of data).
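A rough way to reason about that retention limit is to check whether the start of your older comparison window still falls inside the last 16 months. The sketch below approximates 16 months as 480 days (an assumption on our part, not Google’s exact cutoff):

```python
from datetime import date

# GSC keeps roughly 16 months of data; we approximate with 30-day months.
RETENTION_DAYS = 16 * 30

def comparison_possible(start: date, today: date) -> bool:
    """Rough check: does a comparison window starting on `start`
    still fall inside Search Console's ~16-month retention period?"""
    return (today - start).days <= RETENTION_DAYS

today = date(2023, 1, 15)
# Q4 2022 vs Q4 2021: the older window starts on 2021-10-01.
print(comparison_possible(date(2021, 10, 1), today))  # True
# Full-year 2022 vs 2021 needs data back to 2021-01-01.
print(comparison_possible(date(2021, 1, 1), today))   # False
```

This is why the quarter-on-quarter comparison works but the full year-on-year one doesn’t.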
Reviewing your search data
If you want to review your search data systematically, you can export it to Google Sheets or download it as an Excel or zipped CSV file. You can do this by clicking on the EXPORT button to the top-right of the report and selecting your preferred option. This will give you up to a thousand rows of spreadsheet data, which you can sort and filter exactly as you wish, and potentially integrate into your keyword research plan or an SEO report.
For example, you might want to look at pages that have an average position of around 11 or 12 for an important, relevant search query. That average position means you are probably appearing towards the top of the “second page” of Google search results (though Google is rolling out scrolling search results pages, starting in the US).
Tweaking your content here could bump you up a few positions and see you appearing in the top 10 results, which may significantly increase the impressions and clicks you receive for that query.
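If you’ve exported your query data as a CSV, filtering for those “page two” queries takes only a few lines of Python. The rows below are made up, and the column names (“Top queries,” “Position”) follow the shape of a typical GSC queries export—check your own file’s headers, as they can vary by export format and language:

```python
import csv
import io

# Hypothetical rows in the shape of a GSC "Queries" export.
export = """Top queries,Clicks,Impressions,CTR,Position
buy christmas trees,40,5000,0.8%,11.2
christmas tree delivery,120,3000,4%,3.1
artificial christmas tree,15,2500,0.6%,12.8
"""

def second_page_queries(csv_text: str, lo: float = 10.0, hi: float = 15.0):
    """Queries whose average position suggests they sit on 'page two'."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["Top queries"] for row in reader
            if lo <= float(row["Position"]) <= hi]

print(second_page_queries(export))
# ['buy christmas trees', 'artificial christmas tree']
```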
Alongside the Search Results report, you may also have access to the Discover report. This follows exactly the same format but is entirely focused on your performance in Google Discover—a personalized content feed available in the Google mobile app.
Traffic from Google Discover is notoriously “spiky”—it’s not uncommon for publishers to suddenly receive thousands of visits to a single article appearing in Discover. This often appears in analytics tools as “direct” traffic and can be hard to identify. The Discover report in GSC is the place to go to see whether your sudden traffic surge has come from Discover.
The third and final report you might see in the Performance section relates to Google News—though it doesn’t cover all news traffic. Instead, it focuses solely on traffic from news.google.com and the Google News app on Android and iOS.
If you want to see traffic from the “News” tab in Google Search (which is what people often mean when they talk about “news” traffic), you’ll find this in the main Performance report for Search by adding the filter Search Type = News.
Interestingly, you can still access the Google Discover and News reports in GSC even if they don’t appear in your menu. Go to:
Then, choose a property from the dropdown if prompted to do so.
Let’s say you have concerns about one particular page on your site: has it been discovered, crawled, and indexed? You could search for it in the Indexing reports, but a much better approach is to use GSC’s URL Inspection tool.
To do this, paste your URL into the search box at the top of the page. (There’s also a menu item for URL inspection, but it just highlights the corresponding search box.) Once Google has retrieved the data about your page from its index, click on the Page indexing section to expand it and see all the details.
Perhaps you’ve spotted some URLs in GSC reports that you feel shouldn’t be there. Just pop one into the URL Inspection tool and you will be able to see how Google discovered the page (usually through an XML sitemap or a link from another page). That should be enough information for you to tackle the problem or at least investigate further.
Another key piece of information is the date on which Google last crawled the page. If the page is new or recently updated, and you are keen to get your changes reflected in Google search results as soon as possible, use the option up at the top to REQUEST INDEXING:
Bear in mind, though, that this is just a request. Google tries hard to manage expectations in its search documentation:
“Requesting a crawl does not guarantee that inclusion in search results will happen instantly or even at all. Our systems prioritize the fast inclusion of high quality, useful content.” — Google
Finally, the TEST LIVE URL option enables you to see the page as Google sees it (either as HTML or as a live screengrab). This can be useful for identifying any parts of the page that Google can’t (or won’t) crawl.
As part of the live test, you can also see if Google was unable (or decided not) to load any page resources. In the following example, Google hasn’t loaded the analytics or advertising code for the site. Neither of these would be needed by Googlebot, so there are no concerns here:
URL Inspection Mini Case Study
Recently I had an issue where Google Search Console’s Page Indexing report was reporting some mysterious partial URLs that weren’t actual pages on my site.
For example, for the page:
Google was reporting both that URL and the partial URL:
So, where was Google picking those partial URLs up from? I used the URL Inspection tool to inspect one, and it turned out that Google had discovered it via a referral from the article itself (rather than, say, an XML sitemap):
Inspecting the source of the referring article, I discovered the culprit was some ad code I’d added to my site that included part of the URL for targeting purposes. I was then able to tweak this so it didn’t look like a relative URL, dissuading Google from “discovering” it in future. But if it hadn’t been for the URL Inspection tool, it would have been much more difficult to work out where Google was picking this “URL” up from in the first place.
For the full case study, read my article “Why is Google Search Console detecting partial URLs for my WordPress site?”
Page indexing report
While the URL Inspection tool is useful for troubleshooting problems on individual pages, it won’t tell you about issues that affect groups of pages or even your entire site. That’s where the Page indexing report (under Indexing > Pages) comes in. (It used to be called the “Index Coverage” report, an appropriate name as it tells you how much coverage your site has in Google’s index.)
The top graph shows the number of pages on your site that are (and aren’t) indexed, much as we’ve already seen on the Overview. But underneath that is a breakdown of the reasons why pages haven’t been indexed—and the number of pages affected by each reason:
Under the “Source” column, you’ll see that some of the reasons will be marked “Google systems,” indicating that they relate to Google’s behavior; while others will be marked “Website,” indicating that “you should fix the issue if it makes sense to do so.” (Hover over the question mark icon at the top of the “Source” column to see Google’s explanation of this.)
That disclaimer, and the fact that Google uses the word “reason” rather than “problem,” both indicate that—in many cases—it’s fine for URLs not to be indexed.
For example, the following “reasons” aren’t usually cause for concern:
Page with redirect
This might relate to redirects you’ve added (or that your platform or site builder has added for you automatically; for example, when you delete a page or change its URL).
It’s nothing to worry about, although you may want to check that the pages you are redirecting to are indexed correctly.
Discovered—currently not indexed
Google has found your page but hasn’t crawled it yet.
Unless you have a huge site (such as an established eCommerce site) with thousands of uncrawled pages, this isn’t anything to worry about either.
Excluded by “noindex” tag
The noindex tag is a small tag added to a page to tell Google (and other search engines) not to index it.
This isn’t a technical problem, but do check to make sure you haven’t mistakenly noindexed a page you want to appear in search results. You might, for example, have added the tag when you were working on a new page (and it wasn’t ready for Google), but then forgotten to remove it.
And, here are a few “reasons” that may require some attention/action on your part:
Not found (404)
An error indicating that the page couldn’t be found on the server.
If you deleted that page from your website, you should put a redirect in place. If you haven’t deleted it, there might be a technical issue with your site.
Duplicate without user-selected canonical
Google sees this page as a duplicate of another page, and is showing the other page rather than this one in search results.
You can delete one of the duplicate pages or add a canonical tag to indicate to search engines which one is the main version.
Redirect error
This can happen when you have a long chain of redirects, or perhaps two pages that both redirect to each other, creating a “redirect loop.” To fix this, you’ll need to either break that loop or reduce the length of the chain. For example, you might have page A redirecting to B, B to C, and C to D. You could remove all of these redirects and instead redirect pages A, B, and C directly to page D.
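If your site’s redirects live in a config file or rule set, you can catch chains and loops before Google does. Here’s a minimal sketch that walks a hypothetical redirect map (the `/a`, `/x` paths are placeholders):

```python
def resolve_redirect(start: str, redirects: dict, max_hops: int = 10) -> str:
    """Follow a redirect map from `start` and return the final URL,
    raising ValueError on a loop or an overly long chain."""
    seen = {start}
    url = start
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:
            raise ValueError(f"Redirect loop detected at {url}")
        if hops > max_hops:
            raise ValueError("Redirect chain too long")
        seen.add(url)
    return url

# A chain: A -> B -> C -> D. Better to point A, B, and C straight at D.
chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(resolve_redirect("/a", chain))  # /d

# A loop: X -> Y -> X.
loop = {"/x": "/y", "/y": "/x"}
try:
    resolve_redirect("/x", loop)
except ValueError as e:
    print(e)  # Redirect loop detected at /x
```

Running a check like this over every redirect rule makes it easy to flatten chains (point everything straight at the final URL) before they show up as errors in GSC.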
Bookmark Search Console Help for a more comprehensive list of non-indexing reasons.
Even when the Page Indexing report does indicate problems with the site itself, bear in mind that this report isn’t “real time.” It lists issues that Google found when it tried to crawl your content—not necessarily errors that affect your site right now.
To put it another way, the report could well include problems that have already been resolved by the time you came to look at it. As a result, it’s really worth digging into each issue to see whether it still needs to be addressed.
Click into any of the listed reasons for non-indexing to see:
A graph of the number of pages affected over the past 90 days
Some examples of the affected pages
You might immediately see a pattern: perhaps those 404s all relate to blog posts, or to tag pages. Here, for example, all of the affected pages are “feed” pages for specific categories:
Once you know which section of your site is affected and when the problem started (or at least was detected by GSC), it will be much easier to investigate, see whether it’s still an issue, identify the cause, and then fix it.
If you’re sure an issue has been resolved, the next step would be to ask Google to VALIDATE FIX.
Also in the Indexing section of GSC is the Sitemaps report. Actually, this is both a tool and a report, because it lets you submit an XML sitemap and then monitor its status.
But what is an XML sitemap? It’s a document that lists your website’s pages (or at least the ones you want search engines to crawl and hopefully index). Without a sitemap, search engines will still attempt to find your pages by following internal and external links (links on your site and on other sites)—but a sitemap can help them with that discovery process.
XML sitemaps shouldn’t be confused with HTML sitemaps, which are mainly intended for users of your site rather than search engines.
So how do you get an XML sitemap? It’s likely that your website platform or builder will be able to generate one for you, either as part of its core functionality or through a plugin or extension. Once you have it, enter its URL into GSC’s sitemap tool and click “Submit.”
You can submit more than one sitemap if, for example, you are using domain verification and each subdomain has its own sitemap. The report will list:
All submitted sitemaps
When Google last read each sitemap
The number of pages Google discovered
Whether Google encountered any problems
If you find that Google is not discovering a lot of your URLs, a first step would be to check that there are no issues with your sitemap.
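For the curious, an XML sitemap is a simple format—simple enough to generate by hand. Here’s a sketch that builds a minimal sitemap in the sitemaps.org format (the example.com URLs are placeholders; in practice, your platform or plugin generates this for you):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls) -> str:
    """Build a minimal XML sitemap in the sitemaps.org 0.9 format."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # the page's full URL
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/blog/",
])
print(sitemap)
```

The real format also supports optional per-URL fields such as `lastmod`, which can help search engines prioritize recently updated pages.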
Sitemaps mini case study
I noticed that Google was slow to discover (and then crawl and index) my latest blog posts, and suspected a problem with my XML sitemap. So, I visited the Sitemaps report in Google Search Console and saw that it was reporting one error:
Clicking on the error gave me this detail: “Sitemap is HTML. Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead.”
What could be causing this? I tried to view the sitemap using the OPEN SITEMAP link in GSC, only to be redirected to the homepage—my sitemap no longer existed, which was why GSC was hitting an HTML webpage instead.
My next step was to check the health of the software generating my sitemap, an SEO plugin added to the open-source platform my website was built on. In the back office of my site, I could see that the plugin was disabled—I’d done that myself during routine maintenance a week or two prior and had forgotten to re-enable it.
So I re-enabled the plugin, causing the XML sitemap to reappear. But as GSC had encountered problems reading the sitemap, it might not try again for a number of days. And there’s no option in GSC to nudge it to do so. The solution? Remove the sitemap from GSC entirely (via Sitemap details > More options) and re-submit it.
Once I’d done this and refreshed my browser, I could immediately see a success message—Google had read the fixed and resubmitted sitemap within seconds:
The Experience reports within GSC provide a summary of your site’s user experience as Google sees it. This isn’t just about user experience, though—it’s also key to good SEO:
“Google evaluates page experience metrics for individual URLs on your site and will use them as a ranking signal for a URL in Google Search results.” — Google
As Google doesn’t always explicitly say what is and isn’t a ranking factor, this is unusually clear guidance and not something you should ignore.
We’ll now dive into two of the individual Experience reports—Page Experience and Core Web Vitals.
The Page Experience report shows what proportion of your URLs are “good”—that is, offer a good user experience based on Core Web Vitals and mobile usability (the two more detailed reports in the Experience section). If a page passes both Core Web Vitals and mobile usability, and is served using HTTPS (rather than HTTP), Google considers it “good.”
Like the other GSC reports, the Page Experience report gives you a 90-day view so you can easily see whether your recent improvements have been picked up by Google or spot any concerning changes. In a sense, it’s an overview report, but specifically for page experience.
Core Web Vitals
Core Web Vitals are a set of factors that Google believes are particularly important to user experience. They measure “real-world user experience for loading performance, interactivity, and visual stability of the page” and break down as follows:
Loading performance is measured with a metric called Largest Contentful Paint (LCP). Simply put, this is how long it takes for the page’s main content to load.
Interactivity is currently measured with First Input Delay (FID). This is how long the user has to wait between first interacting with a page (clicking a button, for example) and the browser responding to that interaction. In March 2024, Google will replace FID with Interaction to Next Paint (INP).
Visual stability is the most interesting factor (although not necessarily more important to your SEO). Have you ever been on a web page and you go to click on something, when suddenly something loads on the page? Then, the whole page shifts around and you end up clicking on entirely the wrong thing (usually an ad)? That’s what Google means by visual stability, and it’s measured with a metric called Cumulative Layout Shift (CLS). The less shift, the better.
The Core Web Vitals report in GSC shows the number of URLs that are “good,” “need improvement,” or “poor” according to these three metrics. Are some of your URLs failing to hit the “good” rating? Click on the OPEN REPORT link alongside either the Mobile or Desktop section to see the reasons why. For example, this site has an issue with LCP on every page:
That suggests the problem isn’t with individual pieces of content, but with how the site (as a whole) loads. It may be worth looking at the page templates, or considering caching to improve site performance. For more ideas, check out our webinar on how to optimize your site for Core Web Vitals.
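For reference, the “good”/“needs improvement”/“poor” bands map to specific thresholds Google has published (as of this writing): LCP in seconds, FID and INP in milliseconds, and CLS as a unitless score. Here they are as a small lookup, so you can rate your own lab or field measurements:

```python
# Google's published thresholds, as of this writing:
# (value <= first) is "good"; (value > second) is "poor".
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "INP": (200, 500),    # milliseconds (replacing FID in March 2024)
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Rate a Core Web Vitals measurement against Google's bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))  # good
print(rate("FID", 150))  # needs improvement
print(rate("CLS", 0.3))  # poor
```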
Mobile Usability and HTTPS
You may or may not see two further reports in the Experience section:
Mobile Usability — This shows you how many (and which) pages on your site are not mobile-friendly. Google is due to retire this report in December 2023, so if you want to evaluate your mobile usability from that point forward, you can use Google Lighthouse, among other tools and resources.
HTTPS report — This shows you how many of your indexed URLs are HTTPS and how many are HTTP. The HTTPS protocol is better both for your users’ security and for your site’s SEO (Google has confirmed it is a ranking factor), so if you have HTTP pages showing up here, it’s definitely worth diving into this report to find out why. It could be something as simple as an invalid certificate, or perhaps you’re missing redirects from HTTP to HTTPS.
The HTTPS report is relatively new and has not yet rolled out to all properties, so don’t worry if you can’t see it.
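If the HTTPS report does flag HTTP pages, one quick thing to check is whether each HTTP URL permanently redirects to its HTTPS equivalent. Here’s a hedged sketch of that check as a standalone function (the URLs are invented examples; in practice you’d feed it the status code and Location header from a real response):

```python
from typing import Optional
from urllib.parse import urlparse

def redirects_to_https(status_code: int, location: Optional[str], original_url: str) -> bool:
    """Return True if the response is a permanent redirect to the
    HTTPS version of the same host."""
    # 301 and 308 are the permanent-redirect status codes.
    if status_code not in (301, 308):
        return False
    if not location:
        return False
    original = urlparse(original_url)
    target = urlparse(location)
    return target.scheme == "https" and target.netloc == original.netloc

# A correct HTTP -> HTTPS redirect:
print(redirects_to_https(301, "https://example.com/page", "http://example.com/page"))  # True
```

A temporary (302) redirect or a page that serves content over plain HTTP would fail this check, which is exactly the kind of issue the HTTPS report is surfacing.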
Enhancements
The last major section in GSC is Enhancements, which refers to features on your site that use structured data (e.g., breadcrumbs and videos).
But, what is structured data? Google defines it as “a standardized format for providing information about a page and classifying the page content.”
For example, structured data for videos can contain information about the video’s:
Contents (through a description or even a transcript)
This is all information that Google wouldn’t otherwise be able to get by crawling a page with an embedded video on it. So, you can see how structured data can help Google and other search engines understand a page in much greater detail.
This, in turn, can make you eligible to be shown in enhanced search results (also known as “rich results”).
Sticking with our video example, Google might add a “LIVE” badge to a video to show that it was live-streamed, or choose to highlight “key moments” in your video based on the structured data you provided:
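To see what structured data actually looks like, here’s a hedged sketch of video markup as JSON-LD. The property names are standard schema.org VideoObject fields; the values are invented for illustration:

```python
import json

# Hypothetical example values; the keys are real schema.org
# VideoObject properties.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to verify a site in Search Console",
    "description": "A step-by-step walkthrough of GSC verification.",
    "thumbnailUrl": "https://example.com/thumbnail.jpg",
    "uploadDate": "2022-05-02",
    "duration": "PT4M30S",  # ISO 8601 duration: 4 minutes 30 seconds
}

# On a live page, this JSON would sit inside a
# <script type="application/ld+json"> tag in the page's HTML.
print(json.dumps(video_markup, indent=2))
```

None of this is visible to visitors, but it gives Google a machine-readable summary of the video it would otherwise struggle to extract from the page.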
Structured data may sound fairly technical, but there’s good news: Your platform or site builder—or even well-written third-party plugins—may add it to your pages automatically. And, GSC’s Enhancements reports are the place to see whether this is happening. You’ll have access to a different report for each enhancement that Google has detected on your site. (On the other hand, if you don’t have any enhancements, you won’t see the section in GSC at all.)
Note: Wix implements default structured data markup on blog posts, product pages, bookings pages, forum posts, and event pages.
GSC will also flag up any errors it has found in your structured data implementation. For example, this site was missing the “description” field from its video structured data:
You can see this now affects zero items, suggesting that the problem was fixed (and that Google noticed the fix).
If your structured data does need a fix, this may be something you can tackle yourself—or it may be something you need to raise with a plugin developer via a support request, for example.
Google also provides a Rich Results Test tool that lets you inspect any page to see all the structured data it found and whether it’s eligible for rich results. Unlike Google Search Console, you can use this to inspect any page—whether it belongs to a site you manage or not.
Links report
GSC’s Links report doesn’t get any coverage in the Overview and doesn’t have its own section in the menu—it just sits awkwardly between Legacy tools and reports and Settings, so it’s easy to miss. But that doesn’t mean it isn’t useful. It shows you:
External links — Your pages with the most links from other sites (otherwise known as “backlinks” or “inbound links”).
Internal links — Your pages with the most links from other pages on your site (i.e., the same domain). If a page doesn’t have any internal links, it won’t be listed here—even if it is indexed by Google.
Top linking sites — The sites that are linking to you the most.
Top linking text — The most commonly used anchor text in backlinks to your site.
For each of these sections, click on MORE to see a full list of pages, link metrics, export options, and search filters.
These reports are limited in that they don’t show historic data or changes over time: you can’t tell whether you are gaining new links, or when Google first detected a link.
Remember, links aren’t all equally valuable. A link from an authoritative site, such as www.bbc.co.uk, will help you much more from an SEO standpoint than one from a smaller or less trustworthy site. But, GSC won’t give you any insight into how it values your various external links. To do that you’ll need to use a different tool that estimates how authoritative different sites are, like Link Explorer from Moz, for example.
Crawl Stats report
Even more hidden than the Links report is the Crawl Stats report (which is only available in root-level properties)—this one isn’t listed on the left-hand navigation menu at all. To get to it:
01. Click on Settings in the main menu.
02. Scroll down to the Crawling section.
03. Click on OPEN REPORT.
The Crawl Stats report shows Google’s crawling history on your website: the number of crawl requests it has made, the average response time from your server, the server responses it has received, and so on.
Google has hidden this report because it’s aimed at advanced users with larger sites. The Search Console Help site says: “If you have a site with fewer than a thousand pages, you should not need to use this report or worry about this level of crawling detail.”
The reason for this is that only owners of large sites really need to think about their crawl budget—the number of pages that Google will crawl on their site on any given day. If they have more pages than their crawl budget allows for, it could take Google a long time to detect any changes (i.e., SEO improvements may not generate results as quickly).
The Crawl Stats report is designed to help site owners identify whether crawl budget is a concern—and if so, take steps to optimize it. For example, there might be an issue with crawl health: if Googlebot encounters lots of server errors or slow response times from a site, it will crawl that site less frequently.
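If you want a rough, do-it-yourself view of crawl activity beyond what the report shows, you can count Googlebot requests per day in your web server’s access log. Here’s a hedged sketch with invented log lines in the common Apache/Nginx combined format (note that user-agent strings can be spoofed, so a rigorous check would also verify the requester via reverse DNS):

```python
import re
from collections import Counter

# Invented sample log lines for illustration.
sample_log = """\
66.249.66.1 - - [02/May/2022:08:15:03 +0000] "GET /blog/post-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [02/May/2022:08:16:10 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
66.249.66.1 - - [03/May/2022:09:01:44 +0000] "GET /blog/post-2 HTTP/1.1" 500 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Count requests that identify as Googlebot, grouped by date.
googlebot_per_day = Counter()
for line in sample_log.splitlines():
    if "Googlebot" in line:
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            googlebot_per_day[match.group(1)] += 1

print(dict(googlebot_per_day))
# {'02/May/2022': 1, '03/May/2022': 1}
```

A sudden drop in daily Googlebot requests, or a rise in 5xx responses to Googlebot (like the 500 on the second crawl above), is the kind of crawl-health signal the Crawl Stats report is built to surface.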
Pro tips: Getting the most from GSC
We’ve seen that Search Console provides detailed information about your site’s performance in organic search—but that’s only one strand of your marketing activity. To make the most of your GSC data, it helps to see it in a broader context. Here are three ways you can do this.
How to connect Search Console with Google Analytics 4
You may be familiar with connecting GSC to Universal Analytics (the previous version of Google Analytics), but the process for GA4 is slightly different: not only do you have to link the two tools as usual, you also have to “publish” the Search Console reports in GA4 in order to see and use them.
To connect your GSC property to your GA4 web data stream, follow the steps outlined in Analytics Help.
Then, publish the Search Console reports in GA4:
01. Click on Reports.
02. Click on Library.
03. In the “Collections” section, find the collection called Search Console.
04. Click on the menu icon (three dots) for this collection.
05. Select Publish.
After a few seconds, a new section called Search Console will appear in your GA4 reports menu. If you expand this, you will see two new reports—Queries and Google Organic Search Traffic.
The good news is that these reports will show data straight away (going back to when you verified the site in Search Console or created your GA4 web data stream, whichever happened more recently).
Here’s what the two new reports give you:
Queries — This shows your organic search queries, along with number of clicks, impressions, CTR, and average search position—the four key metrics we saw in the GSC Search Results report. The report lets you wrangle the data in slightly different ways than GSC: for example, you can produce a graph of organic search clicks over time, broken down by device category (desktop, mobile, or tablet)—whereas GSC will just give you the headline figures for that in the form of a table.
Google Organic Search Traffic — This report shows your landing pages along with key GSC metrics and some GA4-specific metrics (e.g., average engagement time, event count, and number of conversions). Having this data side by side is highly insightful: after all, what is the benefit of a particular page performing well in search if it isn’t generating any engagement or conversions?
Using Search Console data in Looker Studio
GA4 is great for showing your organic search data alongside other website data, but what if you want to take an even more holistic approach and bring in other data sources, such as business revenue or marketing spend?
You could use Looker Studio (formerly known as Google Data Studio), Google’s free data visualization tool. Looker Studio lets you pull in data from various sources using “connectors.”
There are free connectors for pretty much every Google product, including Google Analytics, Google Ads—and Google Search Console. You can even pull in your non-website data via Google Sheets. To get started, follow the instructions on Looker Studio Help.
Once you connect your data, you can put together your report by selecting different visualizations (line charts, heat maps, data tables, and more). Or, if you don’t want to build everything from scratch, Google has a Looker Studio Template Gallery containing a predefined Search Console report.
Using the Search Console APIs
For more advanced users, GSC offers an API (a way for other pieces of software to request data from, and send data to, Search Console). Strictly speaking, it offers four APIs, each with a different function:
Sites (for managing properties in a GSC account)
Search Analytics (for querying traffic data)
Sitemaps (for submitting and managing sitemaps)
URL Inspection (for checking the index status of individual URLs)
So instead of logging into GSC and inspecting a particular URL, for example, you could write a piece of code that uses the URL Inspection API to do it for you—perhaps on a set schedule.
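As an illustration, here’s a hedged sketch of a Search Analytics query using Google’s google-api-python-client library. Authentication (the `creds` object) has to be set up separately and is omitted, so the live API call is shown only as a comment; the `build_query` helper is our own name, not part of the API:

```python
def build_query(start_date: str, end_date: str, dimensions: list, row_limit: int = 100) -> dict:
    """Build the request body for a Search Analytics API query.
    Dates are strings in YYYY-MM-DD format."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": dimensions,  # e.g., "query", "page", "device"
        "rowLimit": row_limit,
    }

body = build_query("2022-04-01", "2022-04-30", ["query", "device"])

# With OAuth credentials in place, the call would look roughly like:
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# response = service.searchanalytics().query(
#     siteUrl="https://www.example.com/", body=body
# ).execute()
# for row in response.get("rows", []):
#     print(row["keys"], row["clicks"], row["impressions"])

print(body["rowLimit"])  # 100
```

Scheduling a script like this (daily, say) is how teams archive their own long-term GSC data, since the interface only keeps 16 months of history.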
But, you don’t have to be super technical to benefit from Search Console’s API. Other tools can (with your permission) use the API to provide you with Search Console data about your site.
Note: Wix’s Site Inspection tool enables site owners to monitor the status of their pages in Google’s index from within their Wix dashboard. This data comes directly from Google via the GSC URL Inspection API.
Google Search Console is your roadmap to better SEO
I’ve shown you that GSC is an SEO tool unlike any other. You don’t need it for your content to rank in search and for your site to get organic traffic from Google. But if you don’t use it, you are (to an extent) driving without a map: there’s no better way to get an overview of how Google sees your site and what issues might be holding you back.
Even if you don’t use Search Console for its reporting capabilities, it’s still hugely valuable as a way to communicate with Google. Need to submit a sitemap or request a re-index? GSC is your go-to.
What’s more, GSC is entirely free and available to anyone who owns or manages a website. If you’re just beginning your SEO journey, Search Console is a great place to start; if you’re an SEO professional with years of experience, getting familiar with its interface and capabilities can dramatically increase your efficiency.
James Clark is a web analyst from London, with a background in the publishing sector. When he isn't helping businesses with their analytics, he's usually writing how-to guides over on his website Technically Product. Twitter | LinkedIn