Search Results
- Everything you need to know about your robots.txt file
Author: Maddy Osman A robots.txt file is a tool you can use to tell search engines exactly which pages you want (or don't want) crawled. Editing your robots.txt file is an advanced skill. Before you make any changes, you need to understand what the file is and how to use it. Keep reading to learn how a robots.txt file works and how you can use it to take greater control over your SEO efforts. What is the robots.txt file? A robots.txt file is a document that specifies which of your site pages and files can and can’t be requested by web crawlers. A web crawler (also sometimes referred to as a spider or bot) is a computer program that identifies web pages online. Crawlers work by scanning the text and following all the links on a web page. After a crawler finds a new web page, it sorts and stores the information it discovers. In the case of a search engine crawler, this information forms part of the search index . Each search engine uses its own crawlers to discover and catalog pages on the internet. Web crawlers for search engines like Google and Bing are some of the most frequently discussed bots, but other services like backlink checking tools and social media platforms also use crawlers to understand content around the web and may visit your site regularly. You can see how often and how many bots are visiting your site in your bot log reports . When a crawler visits your site, the first thing it does is download the robots.txt file, if there is one. Crawlers use the instructions in the robots.txt file to determine whether or not they can continue crawling certain pages and how they should crawl the site overall. You can use your robots.txt file to optimize your crawl or (if you are an experienced site manager) even block specific bots from crawling your site at all. Your robots.txt file also tells search engine crawlers which page links and files it can request on your website. The file contains instructions that “allow” or “disallow” particular requests. The “allow” command tells crawlers that they can follow the links on your pages, while the “disallow” command tells crawlers they cannot follow those links. You can use the “disallow” command to prevent search engines from crawling (following the links on) certain sections of your website. To make your sitemap available to Google , include your sitemap in your robots.txt file. A sitemap declaration is included within the Wix robots.txt for indexable sites, but self-built websites should ensure that a correct sitemap is available. Including a sitemap here can support consistent crawling and indexing. Without a robots.txt file, crawlers will proceed to crawl all the pages on your website. For small websites (under 500 URLs), this is unlikely to change how or how often your site is crawled and indexed. But, as you add more content and functionality to your site, your robots.txt file takes on more importance. How a robots.txt file can be used for SEO For your website to appear in search results, search engines need to crawl your pages. The job of the robots.txt file is to help search engine crawlers focus on pages that you would like to be visible in search results. In some cases, using a robots.txt file can benefit your SEO by telling search engine crawlers how to crawl your website. Here are some of the ways you can use a robots.txt file to improve your SEO: 01. Exclude private pages from search Sometimes, you’ll have pages that don’t need to appear in search results. 
These pages could include test versions of your website (also known as staging sites) or login pages. Telling crawlers to skip these private pages helps maximize your crawl budget (the number of pages a search engine will crawl and index in a given time period) and makes sure that search engines only crawl the pages you want to appear in search results. Note: Search engines don't crawl content blocked by a robots.txt file, but they may still discover and index disallowed URLs if they're linked to from other places on the internet, including your other pages. This means that the URL could potentially appear in search results. For ways to prevent your content from appearing in search results, see the alternatives to robots.txt section below. 02. Prevent resource file indexation Creating a website can require uploading resource files such as images, videos, and PDFs. Since you may not want these files to be crawled and indexed, you can use a robots.txt file to limit crawl traffic to your resource files. Additionally, your robots.txt file can stop these files from appearing in Google searches. This helps ensure that both search engines and your site users are directed to only your most relevant content. 03. Manage website traffic You can use a robots.txt file to do more than keep website pages and files private. It can also be used to set rules for crawlers that prevent your website from being overloaded with requests. Specifically, you can specify a crawl delay in your robots.txt file. A crawl delay tells search engines how long to wait before restarting the crawl process. For example, if you set a crawl delay of 60 seconds, then instead of crawlers overloading your website with a flood of requests, the requests come in at one-minute intervals. This helps to prevent possible errors when loading your website. 04. Declare your sitemap As specified in Google's documentation, you should include a line in your robots.txt file that specifies the location of your sitemap. This helps Googlebot (and other bots) find your sitemap quickly and efficiently. If this line is not present, the sitemap may not be crawled regularly, which can cause delays and inconsistencies in how your site is indexed, making it harder to rank. Alternatives to robots.txt crawl directives If you want to prevent Google from indexing individual pages or files, other methods, like robots meta tags and the X-Robots-Tag, may be a better option than doing so via your robots.txt file. Robots meta tags Adding or modifying robots meta tags can send specific directives to bots for certain pages. For example, to prevent search engine crawlers from indexing a single page, you can add a "noindex" meta tag into the <head> of that page. Depending on your site's configuration, there are a few ways to update your site's robots meta tags. Use robots meta tag tool presets to make bulk updates If you have SEO tools on your site or in your CMS, you may be able to apply robots meta tags to single pages, sections of your site, or the site overall by adjusting your settings. For instance, if you are a Wix user, you can edit your SEO Settings to customize the robots meta tags for page types in bulk. Wix offers eight robots meta tag presets that are recognized by multiple search engines, but additional tags may also be relevant for your website. For example, if you would like to manage crawlability with regard to Bing or Yandex, you may wish to manually include tags specific to their bots.
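To illustrate, here is a minimal sketch of what manually added robots meta tags could look like in a page's <head> (the bot-specific tag shown is an example only; check each search engine's documentation for the user-agent names it recognizes):

<head>
  <!-- Applies to any crawler that honors robots meta tags -->
  <meta name="robots" content="noindex, follow">
  <!-- Example of a bot-specific tag aimed only at Bing's crawler -->
  <meta name="bingbot" content="noindex">
</head>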
Or, to noindex the entire site (remove it from search results), you can update the Set site preferences panel. Other SEO tools may have similar presets, so it is worth checking the documentation to confirm what is available. Use robots meta tag tools to make updates to single pages To change the settings for a single page, you can add a robots meta tag into the <head> of the relevant page. If you have a Wix website, you can add or update custom meta tags in the Advanced SEO tab as you edit a single page. X-Robots-Tag on single pages Alternatively, robots meta tags can be inserted as an HTML snippet in the <head> section of your page. You can specify a noindex directive, which tells crawlers not to include the page in search results, or add other tags as required. You might want to use a robots meta tag on pages such as: "Thank you" pages Internal search results pages PPC landing pages The X-Robots-Tag prevents search engines from indexing your resource files. Unlike a robots.txt file, the robots meta tag and X-Robots-Tag work better for single pages. If you want to manage crawl traffic for entire website sections, it's better to use a robots.txt file. Wix site owners can toggle indexing settings by page by navigating from the Wix Editor to Pages (in the left-hand navigation). On any page where you'd like to make changes, click Show More and SEO Basics. Turn the Let search engines index this page toggle on or off depending on your preferred indexing status. The robots.txt file on Wix Your Wix website includes a robots.txt file that you can view and customize. Take care—a mistake in the file could remove your entire website from search results. If you're not comfortable working with advanced website features, it's best to get help from an SEO professional. You can view your website's robots.txt file by going to the robots.txt editor in the SEO Tools page. You can click the Reset to Default button within the Robots.txt Editor to restore the original file (if needed). If you simply want to view your robots.txt file, you can do so by navigating to mydomain.com/robots.txt (where "mydomain.com" is your site's domain). Understanding your robots.txt file on Wix If you're going to make changes to your robots.txt file, you should first familiarize yourself with how Google handles the file. In addition, you need to be familiar with some basic terminology in order to create instructions for crawlers. Crawler: A computer program that identifies web pages online, usually for the purpose of indexing. User-agent: A way to specify certain crawlers. For example, you can differentiate between crawlers for Google Search and Google Images. Directives: The guidelines you can give the crawler (or specified user-agent) in a robots.txt file. Directives include "allow" and "disallow." Robots.txt syntax Syntax is the set of rules used for reading and writing a computer program or file. Robots.txt uses code that begins by specifying the user-agent, followed by a set of directives. Take a look at the following code: User-agent: Googlebot-News Disallow: /admin Disallow: /login In the above example, the specified user-agent is "Googlebot-News" and there are two "disallow" directives. The code tells the Google News crawler that it can't crawl the website's admin or login pages. As mentioned above, user-agents can refer to more than just different Googlebots. They can be used to set rules for different search engines. Here's a list of user-agents for popular search engines.
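To see how these pieces fit together, here is a hypothetical robots.txt file that combines user-agent groups, their directives, and a sitemap declaration (the user-agents, paths, and domain are placeholders; adapt them to your own site):

# Rules for Google's news crawler
User-agent: Googlebot-News
Disallow: /admin
Disallow: /login

# Rules for Bing's crawler
User-agent: Bingbot
Disallow: /private/
Allow: /private/annual-report.html

# Location of the sitemap
Sitemap: https://www.mydomain.com/sitemap.xml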
If you want your rules to apply to all search engine crawlers, you can use the code User-agent: * followed by your directives. Besides “allow” and “disallow,” there’s another common directive to understand: crawl-delay. Crawl-delay specifies how many seconds a crawler should wait before crawling each link. For example, User-agent: msnbot Crawl-delay: 10 states that crawlers for MSN search must wait 10 seconds before crawling each new link. This code group only specifies a crawl delay—it doesn’t block MSN from any website sections. Note: You can include multiple directives in one code grouping. Making changes to your robots.txt file on Wix Before you make any changes to your robots.txt file, read Google’s guidelines for robots.txt files . To edit your robots.txt file, go to SEO Tools under Marketing & SEO in your site's dashboard. Then, follow these steps to edit your robots.txt file: Click Robots.txt Editor Click View File Use the This is your current file text box to add directives to your robots.txt file Click Save Changes Click Save and Reset Verifying changes to your robots.txt file You can use Google’s URL Inspection Tool to verify your changes. Type in a webpage address that you have disallowed to check if it’s being crawled and indexed by Google. For example, if you disallow the “/admin” section of your website, you can type “mywebsite.com/admin” into the URL Inspection Tool to see if Google is crawling it. If your robots.txt file was submitted correctly, you should receive a message that your URL is not on Google. Remember, you have to verify your domain with Google Search Console before using the URL Inspection Tool. You can use Wix SEO Setup Checklist to connect your website to Google and verify your domain . Then, any time you submit changes to your robots.txt file, re-submit your sitemap as well. When it comes to SEO, measure twice, cut once There are several ways you can use your robots.txt file to strengthen your SEO. There are also a multitude of reasons why you might want to make some of your pages inaccessible to crawlers. Now that you’re familiar with a variety of use cases, ensure that you’re selecting the right option for your purposes—in some instances, using robots meta tags may be more appropriate than disallowing via your robots.txt file. Ask yourself, are you simply looking to noindex a page? Or, are you trying to prevent resource file indexation, for example? Carefully consider what you’re trying to achieve before implementing any changes and record your changes so that you can reference them later. Maddy Osman - Founder, the blogsmith Maddy Osman is the bestselling author of Writing for Humans and Robots: The New Rules of Content Style , and one of Semrush and BuzzSumo's Top 100 Content Marketers. She's also a digital native with a decade-long devotion to creating engaging content and the founder of The Blogsmith content agency. Twitter | Linkedin
- Structured data for eCommerce category pages: Help Google understand your products and brand
Author: Lorcan Fearon Since the integration of AI into the search results, I have spent a lot of time optimizing my clients JSON-LD structured data. In particular, I’m focusing a lot on future-proofing product category pages for eCommerce websites and adapting them to the new era of search we are operating in. There are a number of reasons to do this, most of which I’ll cover in more detail in the later sections of this post, but here are the three main reasons to explore this tactic: Structured data helps search engines to not just understand your product category pages, but understand them deeply . A deeper understanding of your pages makes it easier for search engines to trust and rank that page for more relevant searches. Higher rankings help you compete on valuable SERPs and return value on your pages. I started to test this out when I was working with a client that sold underwear and loungewear. Underwear was their main product, but loungewear really drove up the AOV of their orders, so I created a custom CollectionPage schema markup for their existing pajamas category, which was sitting just off the first page of Google’s results for some great non-branded search terms. As no on-page changes were needed, the client was happy for us to try this out and we saw a great uplift in clicks and impressions, which had a knock-on effect on orders. Let’s get into the workflow so you can adapt this for your (or your clients’) category pages. In this guide, you will learn about: Why you should use CollectionPage schema Entity SEO & structured data in the future of organic search Where to craft and validate your structured data Why your JSON-LD code should be structured in a graph How to create your own CollectionPage schemas (with templates) How to implement this for your websites Why should you use structured data on your collection pages? I call upon JSON-LD when I am working on a website that is already doing all the right things on-page. The product category pages have all the right ingredients—they’re just edged out of those all-important top rankings by competitors. I sometimes describe this as an ‘extra-mile’ optimization (i.e., doing the thing that most people haven’t thought or bothered to do). Maybe it’s because they perform well via huge brand value and recognition or they only have the bandwidth/knowledge to prioritize SEO basics . In any case, this is a tactic that I don’t often see websites implement. Product schema is often there, as well as ‘out-of-the-box’ schema markup from plugins, but considering search engines’ preference for product category pages for so many of the non-branded search terms we value in SEO, it surprises me how little this schema type gets used. This tactic is something you can apply without changing anything on the page (in most cases), and it can help you close the gap on your competitors. If the product you sell could be confused with another or it has multiple names (e.g., flip-flops, sandals, thongs), then this is a way to ensure you are being as clear as possible for search engines, which is crucial if you want your category page to show up for those searches. However, this structured data implementation method is probably not a good idea if your category pages have a terrible user experience, they’re unusable on mobile, or loads slower than paint dries. There are often a number of more pressing issues product category pages can suffer from that you should consider addressing first. This really is going that extra mile for your online store. 
I describe this as leaving no stone unturned in the pursuit of the top ranking positions. Entity SEO and its importance for the future of search Also referred to as ‘ semantic SEO ,’ entity SEO is the practice of helping search engines to find deeper meaning in the keywords we use so that they better understand the concepts that these words refer to. If this is the first time you’re encountering entity or semantic SEO, here are some great resources that will not only help you better understand the theory behind my structured data approach, but also aid you in keyword research and content creation as well: What is semantic SEO? 10 ways to improve semantic SEO with disambiguation Why entities matter for SEO (Podcast) To demonstrate how a search engine understands entities, let’s take the first sentence of one of my childhood heroes’ Wikipedia entry: We can see (via Google’s Natural Language API ) that search engines can identify concepts and entities quite well here. This is important for the future of search because, as experienced SEOs know, Google doesn’t seem too interested in carrying on with the traditional ‘blue links’ search listings. Google doesn’t want to be a search engine as much as it wants to be an ‘ answer machine .’ In order for Google Search to reach its final form, as it were, it needs to be able to infer, understand, and relate concepts, entities, and things as easily and successfully as a human can. Whether you like it or not, this is the direction Google Search, and almost all search engines, are pursuing. This is why I have been drawn to this tactic. I see this as not only a good SEO tactic for today, but an even better one for tomorrow. By focusing on entities and their relationships now, you’re preparing your eCommerce store for an impending future where search engines don't just read your content, but truly comprehend it. Where to craft and validate your structured data Before we continue, I am going to assume from this point onwards that you can craft your own structured data, or at least know how to edit it, as I will provide my own template below. And if you don’t know how, don’t worry! Again, you couldn’t be in a better place. Crystal Carter created this comprehensive guide to structured data that will get you up to speed. A great website I use to play around with structured data is the JSON-LD Playground , mainly because it validates your code as you edit, whereas other validators require you to ‘submit’ your URL/snippet with a few seconds waiting. It may not sound like much, but when you are trying to find out where just one “]” should go, it does get quite tedious so real-time feedback can end up saving you a lot of time if you’re working at scale. Why your JSON-LD structured data needs to be a graph There’s just one more important stop to make before we get into the example: This is on the idea that your schema markup should be a ‘graph.’ This concept is best explained in this article from Joost de Valk , which changed the way I approach crafting JSON-LD structured data. The article covers this in greater detail than I’m able to here, but to try and summarize it neatly: A common schema markup practice in SEO is to create different snippets (e.g., a HowTo, a Product schema, WebPage and so on) but to have these as separate blocks . 
For example, you may have a snippet of schema markup that looks like this: { "@context": "https://schema.org/", "@type": "Product", "name": "Example Product", "image": [ "https://example.com/photos/1x1/photo.jpg" ], "description": "A description of the product.", "sku": "0446310786", "mpn": "925872", "brand": { "@type": "Brand", "name": "Studio Hawks" }, "review": { "@type": "Review", "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" }, "author": { "@type": "Person", "name": "Stu D. Hawk" } }, "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.4", "reviewCount": "89" }, "offers": { "@type": "Offer", "url": "https://example.com/product", "priceCurrency": "GBP", "price": "119.99", "priceValidUntil": "2023-11-05", "itemCondition": "https://schema.org/NewCondition", "availability": "https://schema.org/InStock", "seller": { "@type": "Organization", "name": "Example.com" } } } And an Organization schema markup snippet that looks like this: { "@context": "https://schema.org/", "@type": "Organization", "name": "Example Corporation", "url": "https://www.example.com", "logo": "https://www.example.com/logo.png", "contactPoint": { "@type": "ContactPoint", "telephone": "(+44) 20 38877388", "contactType": "customer service", "areaServed": "UK", "availableLanguage": ["English", "Spanish"] }, "sameAs": [ "https://www.facebook.com/example", "https://www.twitter.com/example", "https://www.instagram.com/example" ] } What Joost de Valk is describing in the article is that schema markup should “always be one inter-connected graph.” This means that all your different schema properties should relate to one another and this relation should be indicated within the code it is housed in. { "@context": "https://schema.org/", "@graph": [ { "@type": "WebSite", "name": "Example Store", "url": "https://www.example.com", "publisher": { "@id": "https://www.example.com#organization" } }, { "@type": "Organization", "name": "Example Corporation", "url": "https://www.example.com", "logo": "https://www.example.com/logo.png", "contactPoint": { "@type": "ContactPoint", "telephone": "(+44) 20 38877388", "contactType": "customer service", "areaServed": "UK", "availableLanguage": ["English", "Spanish"] }, "sameAs": [ "https://www.facebook.com/example", "https://www.twitter.com/example", "https://www.instagram.com/example" ], "@id": "https://www.example.com#organization" }, { "@type": "WebPage", "url": "https://www.example.com/product", "name": "Product Page for Example Product", "description": "This is a detailed product page for Example Product available at Example Corporation.", "publisher": { "@id": "https://www.example.com#organization" }, "mainEntity": { "@id": "https://www.example.com/product#product" } }, { "@type": "Product", "name": "Example Product", "image": [ "https://example.com/photos/1x1/photo.jpg" ], "description": "A description of the product.", "sku": "0446310786", "mpn": "925872", "brand": { "@type": "Brand", "name": "Studio Hawks" }, "review": { "@type": "Review", "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" }, "author": { "@type": "Person", "name": "Stu D. 
Hawk" } }, "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.4", "reviewCount": "89" }, "offers": { "@type": "Offer", "url": "https://example.com/product", "priceCurrency": "GBP", "price": "119.99", "priceValidUntil": "2023-11-05", "itemCondition": "https://schema.org/NewCondition", "availability": "https://schema.org/InStock", "seller": { "@id": "https://www.example.com#organization" } }, "@id": "https://www.example.com/product#product" } ] } For our purposes (and the purposes of this article), our CollectionPage is the mainEntityOf our WebPage, which belongs on our WebSite that is owned by our Organization. This is demonstrated in the next section. Connecting all of these entities together and organizing them neatly within your JSON-LD schema markup means you are going that extra mile to ensure that your category pages have the highest chances of being completely understood by search engines. It does add a layer of complexity to your structured data, but hey, if you are going that extra mile, you might as well really go that extra mile, right? CollectionPage schema snippet example Below is an example template of the code snippet I currently use. I have tried out some different approaches, and there may be different quirks or considerations depending on which platform your website is built on. Some CMSes may require you to download a plugin to upload custom structured data, which could upload it in the wrong place, or multiple times. It may not always be immediately obvious how to upload your own custom structured data on a CMS . Feel free to take this CollectionPage schema and adapt it for your online store: If you were to use this on a real page, below is how it could look. Let’s see how this snippet would shape up if we applied it to the Rolex men’s watches collection page: The Wikidata URLs referenced in this example point to the entries for Rolex (the brand), and to watches (the thing). Of course, for this particular page we could have a lot of confidence that search engines are well aware of Rolex and what their main product is. But, when you don’t have the brand value of a retailer like Rolex, directly connecting your product category to the Wikidata entries that are associated with it is an effective way to remove any doubts about what the page is for. I operate on a deep mistrust of search engines’ abilities and on a daily mantra of ‘make the search engines’ job easy’ when optimizing websites. By directly linking to entries from one of the databases search engines use to understand the world (i.e., Wikidata), you have gone that extra mile to ensure there is no room for doubt when search engines interpret your page. How to implement your custom CollectionPage schema markup Now that you’ve crafted your custom CollectionPage snippets and are ready to test them, you need to upload them to the page. For Wix website owners, I will walk you through the workflow step-by-step, but bear in mind that this process is roughly similar for other website platforms as well. There are two ways you can add structured data to your Wix web pages —from the SEO settings panel within the Wix Editor for the individual page or from the Edit by Page settings for the page type (Blog, Product, etc). To add structured data to an individual page using the Wix Editor: Go to your editor . Click Pages & Menu on the left-hand side of the editor. Click the More Actions icon (three dots) next to the relevant page. Click SEO basics . Click the Advanced SEO tab. Click Structured Data Markup . 
Click +Add New Markup . Add your new markup under Write your markup in JSON-LD format . Click Apply . You can also add or manage your structured data for all pages of a certain type by accessing the Edit by Page settings and selecting the desired page type: Go to the Wix Dashboard for your desired website. On the left-hand menu, go to Site & Mobile App > Website & SEO > SEO . Click on SEO Settings in the Tools and settings section towards the bottom of the page. Select the desired page type. From there, head over to the Customize defaults tab to customize the default structured data for that page type. If you are struggling to find a way to upload your own custom schema code to your website, you can also use Google Tag Manager to implement this . It’s a slightly more cumbersome process, but for stores selling high-competition product categories, every advantage matters. Structured data for eCommerce category pages: When every advantage matters Personally, as someone who has never been a whiz with code, I love playing around with JSON-LD schema markup and think there is so much we can be doing with it as SEOs for our websites. While these optimizations may feel optional right now, the state of the web continues to advance and, one day, it’ll be a best practice that you may not be able to forego. To that end, let’s recap what you’ve learned so that you can walk away with a clearer picture of how structured data helps you go that extra mile for your online store: Custom structured data for collection pages is a tactic that aims to prepare your eCommerce website for the future of organic search. Organic search and search engines are moving towards not just knowing the terms we search for, but understanding thoroughly as well as their relationships to other entities and concepts. Structured data is one of the elements that powers this understanding. This tactic is a supplemental optimization, meaning that this is for closing the gap on competitors for a page that already does all the right things (e.g., good UX, relevant content, a good product, E-E-A-T ). This is not a ‘hack’ or a silver bullet for your website. Consider Joost de Valk’s proposition that all schema markup should “always be one inter-connected graph.” You can use my template, and hopefully make it a lot better(!), to get started on testing these snippets yourself. Lorcan Fearon - Senior SEO Specialist & Operations Lead, StudioHawk UK From London via Nottingham & Kent, Lorcan Fearon is a senior search marketing specialist and operations manager for the UK team of Australia's largest SEO agency. Linkedin
- Analyze your SEO competitors with the SE Ranking app on Wix
Author: Mordy Oberstein Get started by: Creating a website → Competitor analysis is one of the most foundational SEO tasks I can think of. After all, your competition influences how hard you’ll need to work to show up at the top of search results. That is why when SE Ranking approached us to build an app for the Wix App Market, we were all-in on the idea of creating a tool that would help you understand the competitive landscape on Google—without having to leave Wix. Your competitors on the Google search engine results page (SERP) might not be who you think they are. As a matter of fact, your organic search competitors are often different from your competitors in real life (or on platforms like social media or YouTube). With that, let’s explore how the SE Ranking app helps you survey the competition on Google’s search results so that you can make better SEO and content decisions. Table of contents: How to get started with the SE Ranking app in Wix SE Ranking tool overview Competitor overview data International SEO competitor insights Exploring competitor keyword performance Discovering additional SERP competitors Paid performance competitor insights Performing keyword research with the SE Ranking tool How to get started with the SE Ranking app in Wix You can access the SE Ranking app within the Wix App Market , found in your Wix dashboard. The app is categorized under Marketing > SEO . Alternatively, you can search for it by name via the app market’s search field (and if you’re reading this post, you can access the SE Ranking app directly from here ). Once you install the app (installation is done on a per-site basis, not at the account level) you’ll see an entry point to it in the site’s dashboard. The SE Ranking app in Wix uses a “freemium” model. For each site you connect, the app grants you 10 lifetime data requests for each of the two tools within the app (i.e., competitive research and keyword research ). SE Ranking SEO competitor analysis tool overview As mentioned above, the SE Ranking app within Wix contains both a competitive research tool and a keyword research tool. Let’s start by first looking at the competitive analysis tool. To get started, enter a URL into the tool’s search field. Here, you have the option to analyze the entire domain, an individual URL, etc. You can also select regions to analyze the URL according to. Running an analysis presents you with a wealth of data, trends, and insights. I suggest you explore the tool firsthand, as there are too many data points and filtering options to include here. Competitor overview data Analyzing a website with the SE Ranking app presents you with an overview that includes: The estimated organic traffic to the site The number of ranking keywords the site has The estimated amount of paid traffic to the site The number of backlinks the site has Note: Before we go any further, I want to note that the data any third-party SEO tool presents is an estimation. Remember, the figures you see are not an exact reflection of the site’s performance, but rather a best estimate. Personally, I look at third-party data for the trends or to get an idea of the general orientation of the data landscape. The data in this section helps you understand the general state of the site’s performance. Beneath the initial overview, you can reference trends data in order to qualify the aggregated data shown. This way, you can understand whether your competitor’s traffic, keyword rankings, etc. are on an upward or downward trend. 
International SEO competitor insights Scrolling past the overview section, you can dive into a competitor’s organic performance according to regions or specific markets. This data gives you insight into the site’s per-market organic search performance, according to the site’s estimated: Organic traffic market share Amount of organic traffic Total number of ranking keywords You can also toggle to see the same data for the site’s paid efforts on the Google SERP. In a nutshell, you’ll have a bit of data to better understand where your competitor is both focusing and succeeding across the globe. To that end, click on the “Compare” button at the bottom of the “Traffic distribution by country” table to select the specific markets you want to analyze more deeply. To illustrate the table of data shown above, the SE Ranking app gives you an organic traffic trends chart. This can provide you with an immediate understanding of which markets your competitors are gaining or losing momentum in. Seeing a competitor slowly losing organic traffic in a specific market might be worth exploring, as it could mean there is potential for you to break into that market. The app also shows you a similar trends chart that highlights the number of ranking keywords your competitor has in a specific market over time. There is also a chart for the organic traffic cost over time per market. This means you’ll have a sense of how much you would have to spend in paid ads to receive the same amount of traffic your competitor pulls in organically. Exploring competitor keyword performance The SE Ranking app within Wix can help you understand what specific keywords drive your competitor’s organic traffic. The initial table gives you insight into which of your competitor’s keywords are improving (or declining) as well as what new keywords they have captured (and which ones they’ve lost). This data is contextualized with metrics such as: Monthly search volumes Ranking position Keyword competition score Click the “View detailed report” button to dive into your competitor’s ranking keywords. You can access more keywords and focus your competitor research by using any of the available filtering options. Pro tip: One thing you may want to do is set a filter to remove any keywords that contain the brand’s name. This way you’ll have a better sense of how the competitor is ranking for non-branded keywords. Another way to filter these keywords is by SERP feature . The app tells you the total number of keywords that your competitor ranks for that brings up a given SERP feature on the Google result page. You can use these to see top-level trends around keyword intent . For example, if there is an unusually large number of keywords that bring up a local pack, you know your competition is most likely targeting local markets. Select a particular SERP feature to bring up keywords for which the SERP contains that feature. For example, when selecting “Video” (for Google’s video box) I was served with the following keywords: In such a case, you might try to overtake your competitors by creating content that ranks for those keyword(s) and by creating videos that could be shown in the video box on the SERP. Discovering additional SERP competitors One of the most significant things the app enables you to do is discover additional organic competitors. If this is your goal, you can enter your own domain into the tool’s initial search field and pull data for your own site. 
You would then see who your organic competitors are in the "Organic competitors" section. The initial competitor research (as shown in the example below) focuses on keyword competition via: The total number of keywords each competitor ranks for within the top 100 positions on the Google results page The number of keywords unique to the domain being analyzed, unique to the competitor being viewed, and the number of keywords both domains rank for (see the hover-over in the image below) The data on the number of "common" keywords (i.e., the number of keywords where the domain being analyzed and the competitor being viewed both rank) is another reason why you might want to analyze your own domain. If you analyze your own domain, you can discover who your competitors are and see which competitors are ranking for the very keywords you also rank for. Strong keyword overlap might mean that the competitor in question is potentially attracting visitors away from your site. Click on "View detailed report" within the organic competitors report section to get a more detailed look at your competitors' performance. Here, you can reference a trends chart to help you identify which of your competitors are falling off the SERP and which are gaining momentum and acquiring more traffic from Google over time. You can toggle between metrics and see trends for: Total traffic Number of ranking keywords Traffic cost Backlinks The table that follows the trends graph shows you your top 10 competitors on the SERP, as well as: The number of keywords both you and your competitors rank for The number of keywords your competitors rank for, but your domain doesn't Total number of keywords (along with traffic and traffic cost) You can use filters here, as well, to refine your results and to discover additional opportunities. Selecting a keyword data point for a specific competitor will show you data similar to the "Organic competitor" comparison section of the app. Here, you'll see visuals that help you compare your keyword and traffic performance to that of your competitors (you can analyze two competitors simultaneously in this report). The table that accompanies the visuals indicates the specific keywords that are unique to your competitor, that both your domain and the competition rank for, etc. The table includes metrics such as ranking position, keyword difficulty, and search volume, so that you can decide which keywords you want to focus on. For example, if you see a competitor is ranking above you for a keyword that is relatively easy to rank for and has a high search volume, you may want to prioritize it. While there are a plethora of other report sections to explore in the SE Ranking app, the last bit of organic data I'd like to highlight is the "Top pages in organic search" table. Here, you can see which of your competitors' pages matter most to them (or which of your pages is the most important to your organic efforts). The data shows which pages receive the most traffic (both in raw numbers and as a percentage of overall traffic). Click the "View detailed report" button to dive deeper into the performance of your competitor's pages—or your own pages, for that matter. Say, for example, you know that both you and your competitors are targeting local event keywords. You could filter the results to show your competitors' pages that contain the word "event" in the URL. That would be a quick way to see which of your competitor's event pages are the most valuable to their success.
Paid performance competitor insights The SE Ranking app within Wix also helps take a pulse on how your competitors are behaving on the paid SERP—specifically, how they are leveraging Google Ads to supplement their organic success and compete with you. For example, the “Paid keywords”table gives you information on which keywords your competitor is targeting with Google Ads. Moreover, you’ll be able to see which of these keywords have seen a performance improvement (or decline) as well as what new keywords they are buying. To better contextualize this data, the SE Ranking app also tells you where your competitor’s ads are appearing. Are they showing at the top of the SERP (potentially posing a bigger challenge to your rankings) or at the bottom (where they will be seen by fewer users)? Like with the organic research shown earlier, you can use the SE Ranking app to discover who you are competing against when buying Google Ads. Here, too, you’ll see information related to which competitors you overlap with. You can also use this table to dive into more detail about paid keyword overlap with your competitors. In addition, the app shows you which pages are connected to your competitor’s ads and how much of the paid traffic share they receive. This can be very helpful when you’re creating landing pages for your own ad campaigns, as you can look to see what works for your competitors. Lastly, the SE Ranking app can tell you which keywords are triggering your competitor’s ads on the SERP, with metrics on how many ads they are running for the keyword, CPC, and beyond. Performing keyword research with the SE Ranking tool in Wix You can also use the SE Ranking keyword research tool built into the app to qualify and explore a focus keyword discovered when analyzing your competitors. The overview section of the report gives you a top-level summary of the keyword’s organic opportunity for a given country, and includes information on the keyword’s: Ranking difficulty Search volume and search volume trends Regional volume CPC User intent The following section helps you expand on the seed keyword initially searched for by showing you similar keywords, related keywords, and questions that utilize the seed keyword. You can dive deeper into the initial results and see an expanded list for each keyword type. Here, you can filter your results and see expanded metrics, such as the SERP features that appear on the page for a given keyword. Pro tip: What’s interesting here is that if you click on a given keyword, you can see the search volume trends over time. This is a great way to spot any potential seasonality related to the keyword. You can also click on “Organic results”to see the top-ranking URLs for the keyword, along with which URLs have seen a ranking increase or decrease. Lastly, you can see which domains are running a Google Ads campaign that targets a similar keyword. Don’t jump to conclusions about who your competitors are Who are your competitors? It depends. Your brick-and-mortar competitors may not be who you’re competing against on the Google SERP. So, it’s best not to jump to conclusions about who they are (or aren’t). Instead, dive into the data and see who is actually giving you a run for your money in the search results. It’s easy to chase after every potential competitor and ranking opportunity, but no one has unlimited resources and being strategic in what you prioritize can make all the difference. 
The data within the SE Ranking app helps you understand who to focus on, what to focus on, and when to focus on it. Mordy Oberstein - Head of SEO Branding, Wix Mordy is the Head of SEO Branding at Wix. Concurrently he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
- How to optimize for INP
Author: Yossi Fest In March 2024, Google officially replaced First Input Delay (FID) with Interaction to Next Paint (INP) as one of the three Core Web Vitals (CWV) metrics. This change wasn’t just a tweak—it was a fundamental shift in how Google evaluates user experience. The question you’re probably asking is why did Google make this change? Comprehensive evaluation: A web page might respond quickly to the first click, but then lag on subsequent interactions. INP considers all interactions, providing a more accurate representation of the user experience. Challenges and opportunities: Whilst INP is harder to optimize for than FID, it offers an opportunity (or for SEOs, a challenge) to significantly improve the overall responsiveness of your site. Sites that pass INP evaluations are likely to offer a superior user experience, which can lead to better engagement and general SEO performance. However, because of its complex nature, coupled with Google not explicitly providing information on the specific problematic interactions (unlike LCP and CLS, the other two CWV metrics), you as an SEO professional or site owner are left to do the work of identifying the INP issues, analyzing performance traces, and addressing the issues accordingly. In this guide, I’ll cover the nitty gritty of what goes into a web interaction, how to spot INP issues, and strategies to decrease your INP timings, ensuring you’re fully equipped to tackle this element of user experience and SEO. Table of contents: What is INP? Understanding interactions: The foundation of INP How to view your website’s interactions 3 steps to identify problematic interactions 01. Start with the Core Web Vitals report in Google Search Console 02. Crawl your site with Screaming Frog & the PageSpeed Insights API 03. Identify poor interactions using the Performance tab in Chrome Dev Tools 6 methods to optimize your INP 01. Audit your third-party scripts 02. Conduct a visual change audit 03. Audit your first-party scripts 04. Prioritize HTML hyperlinks over JS buttons 05. Optimize debounce time 06. Yield to the main thread How to measure the success of your INP optimizations What is INP? Interaction to Next Paint is the metric that Google uses to measure a web page’s responsiveness to user interactions. Google defines ‘interactions’ as: Clicking Mousedown Tapping (on devices with touchscreens) Key strokes INP does not include scrolling, as this is not an interaction with the page itself. INP is measured in milliseconds, and Google categorizes performance as follows: Good — INP is equal to (or less than) 200ms Needs Improvement — INP greater than 200ms and less than (or equal to) 500ms Poor — INP greater than 500ms Understanding interactions: The foundation of INP To optimize INP effectively, it’s essential to first understand what an interaction is and what factors influence its timing. An interaction is comprised of the following three components: Input delay — This is the initial delay between a user’s interaction and the browser registering the interaction (and subsequently being able to respond to it). While several factors can affect input delay, it’s usually JavaScript (JS) running in the background and the browser not being ready to attend to the interaction that adds unnecessary delay. Processing duration — After the input is recognized, the browser must process the action. This could involve complex calculations, DOM manipulation, or data fetching. 
Presentation delay — This is the time it takes for the browser to update the display after processing the interaction. Presentation delay is usually influenced by rendering performance and how efficiently the browser can paint new frames. As you proceed with your optimizations, remember that it's not just about knowing what INP is, but rather about understanding how each interaction on your website impacts performance. Equipped with this knowledge, you can approach performance optimization more strategically, identifying where the delays are, and what's causing them. How to view your website's interactions Now that I've caught you up on interactions and INP, it's time to view and analyze your website's interactions in detail. While there are several tools and techniques available to help you do this, I'm going to focus on using Chrome DevTools because you can get all the data you need from it, and it doesn't require third-party tools or a setup process: Right-click on the desired web page and click Inspect. This opens Chrome DevTools. Navigate to the "Performance" tab and hit the record button icon in the top left, or click the "record" button on the right side panel (or "record and reload" to load the page fresh). Interact with the page like your users would. Click on buttons and enter text into fields. Click on the record button again to stop the recording. The trace will then automatically load into the tab. Look for the row named "Interactions." In this row, you will see the interactions you performed on the page as well as detailed information about everything that went into each interaction. You'll also notice the row called "Main" (underneath the "Interactions" row). This is the main thread of the browser, where you can see all processing activity. Essentially, performance traces are detailed logs of everything that happens during the loading and interaction phases of a web page. By analyzing these traces, you can identify which interactions cause the highest INP scores and why. 3 steps to identify problematic interactions Finding the exact interactions that contribute to your page's poor INP score can be time-consuming, but with the right approach and tools, you can effectively isolate and address the issues. Follow the three steps below to find problematic interactions on your pages: Start with the Core Web Vitals report in Google Search Console Crawl your site with Screaming Frog and the PageSpeed Insights API Identify poor interactions using the Performance tab in Chrome DevTools 01. Start with the Core Web Vitals report in Google Search Console The Core Web Vitals report in Google Search Console (GSC) is a great place to begin because it provides insights into which pages on your site are experiencing INP issues, and whether these problems are more prevalent on desktop or mobile. I use the INP section of the Core Web Vitals report for: URL group analysis — Drill down into the URL groups that Google identifies as problematic. This can help you focus your optimization efforts on the pages that need it the most. Trend analysis — Use the daily bar graph (showing the number of URLs, as shown in the example above) to see how your INP issues are trending over the last three months. Is the INP issue on the rise? Is it decreasing? Have there been any site updates that correlate to changes in trends?
02. Crawl your site with Screaming Frog & the PageSpeed Insights API Instead of manually entering all your URLs into PageSpeed Insights to see each page's INP score, crawl your site with Screaming Frog using the PageSpeed Insights API. When used in tandem, these tools allow you to pull INP data along with other Core Web Vitals. This gives you a comprehensive view of how your entire site is performing. Enable the PageSpeed Insights API, then select only the following CrUX metrics: CrUX Interaction to Next Paint (ms) CrUX Interaction to Next Paint Category Next, run a crawl of your site. You will then end up with a list of pages, their respective INP scores, and the CWV performance category that they fall into (as shown below). Once you have your list, prioritize the URLs to fix by the traffic they receive and/or importance (by other business metrics you can define, like revenue, for example). By crawling your site with Screaming Frog, you can also identify patterns and common issues that might affect your INP scores, which can be particularly useful for large sites with many pages and page types. 03. Identify poor interactions in the Performance tab in Chrome DevTools Once you have prioritized the URLs you want to improve, you need to identify the problematic interactions on those pages. Up until October 2024, you'd have needed to use the Web Vitals Chrome extension, which provides real-time feedback on your site's Core Web Vitals, including INP. However, the latest version of Chrome now includes detailed LCP, CLS, and INP data directly inside the Performance tab. Here's how to use it effectively: Accessing the Performance tab — Open an incognito window in Chrome, then open Chrome DevTools by right-clicking and choosing "Inspect." Then navigate to the Performance tab. Simulating network conditions — To match the conditions of poor internet connections and devices that some of your users may access your site with, especially on mobile, I recommend throttling down CPU processing power to 20x. This can be done at the top of the Performance tab under "CPU" settings, or under the recording settings in the right pane of the tab. This also greatly helps with viewing the interaction in DevTools. Once you've throttled down, refresh the page. Perform interactions — Click around your site as a typical user would. Focus on the elements that you know users interact with the most, and work your way through all buttons, links, text input fields, etc. You'll notice that after each interaction, the interaction type, interaction target, and latency are logged in real time. Identify problematic elements — Hover over the interaction target to highlight it on the page, or click on it to see it in the Elements tab. You'll notice that the INP time in the INP tile is the longest of your interactions—exactly how Google calculates your page's INP time. 6 methods to optimize your INP After you've identified the problematic interactions, it's time to move onto the optimization phase, which can be the most difficult. Here are six methods I recommend for optimizing your interaction timings: Audit your third-party scripts Conduct a visual change audit Audit your first-party scripts Prioritize HTML hyperlinks over JS buttons Optimize debounce time Yield to the main thread 01. Audit your third-party scripts Third-party scripts are external pieces of code that your web page executes (typically for functionalities like analytics, social integrations, and/or tracking).
While these scripts are super useful, they can also increase loading and page interactivity times. The more JS scripts your page needs to load, the longer your interactions will take until they finish fully loading. In addition, many third-party scripts (e.g., TikTok, Facebook, Google Analytics) may fire when they don't need to. Start by identifying redundant scripts. Obtain a list of all third-party scripts on your site and determine if any of these scripts are redundant or unnecessary. You can obtain this list directly from the Network tab in Chrome DevTools, from a Lighthouse audit under the "Reduce the impact of third-party code" item in the Diagnostics section, or from most good web performance tools that provide content and script breakdowns. Defer non-critical scripts. For scripts that are essential but not critical for the initial page load, use the async or defer attributes so that they only load after the main content loads. This ensures that your non-critical scripts don't block critical main thread time during early interactions. Identify any scripts (or tags in Google Tag Manager) that fire on each click (or multiple times on a page when they don't need to). Tags should only fire where they're essential or when a user performs a specific action. Many websites manage their scripts through Google Tag Manager. This can make managing scripts much easier, as there are native options to select exactly how you want your scripts to fire and what triggers them to do so. 02. Conduct a visual change audit Visual changes (like animations and effects that occur after a user interaction) can significantly impact INP. Animations and effects, such as blurring, sliding, fading, etc., often trigger extra rendering steps. Consider minimizing any post-interaction visual changes that don't enhance your users' experience to help improve INP timings. For example, let's say a user clicks on the main menu on a page, and that makes the page fade slightly (or puts a blur effect on the rest of the page). This action requires additional rendering processing, which may be taking up valuable space on the main thread. By reducing (or simplifying) these visual changes, you can help reduce INP timings and promote a more responsive user experience, as the browser can focus more efficiently on processing the core interaction (rather than handling visual changes that may be unnecessary), and spend less time on: Repainting Layout recalculation Compositing layers 03. Audit your first-party scripts Unlike third-party scripts (where your business has no control over the contents of the scripts), first-party scripts are JS files that you own. You can perform the following actions to optimize script content and delivery: Bundling and minifying — Ensure that your JS files are bundled and minified. Bundling reduces the number of HTTP requests needed to fetch scripts, and minifying removes unnecessary characters (and spacing), reducing file size. Both of these practices can improve load times and responsiveness to interactions, directly impacting INP. Lazy loading first-party scripts — Just like the more common practice of lazy loading images, lazy loading JS files ensures that scripts only load as the user gets to them (i.e., scrolls down to an area that contains elements that require the JS). As an example, chat widgets and social media plugins usually aren't required until the user has been on the page for a while first. This practice ensures that, once again, the main thread can remain free to handle interactions that may come its way.
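As a rough sketch of the lazy loading idea described above (the selector and script URL are hypothetical; adapt them to whichever widget or module you are deferring):

// Load a non-critical script only when the element that needs it
// approaches the viewport, instead of during the initial page load.
const chatContainer = document.querySelector('#chat-widget');

if (chatContainer && 'IntersectionObserver' in window) {
  const observer = new IntersectionObserver((entries, obs) => {
    if (entries.some((entry) => entry.isIntersecting)) {
      const script = document.createElement('script');
      script.src = 'https://example.com/chat-widget.js'; // hypothetical script URL
      document.body.appendChild(script);
      obs.disconnect(); // the script only needs to be injected once
    }
  });
  observer.observe(chatContainer);
}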
This practice ensures that, once again, the main thread can remain free to handle interactions that may come its way. 04. Prioritize HTML hyperlinks over JS buttons JavaScript buttons can introduce significant delays in processing time because they require the browser to execute scripts to handle interactions. Compare this to regular HTML <a> tag links, which are natively processed by the browser and are generally much faster. Audit all internal links (including hidden links behind tabs, menu links, etc.) to see if they are coded as HTML hyperlinks with the standard <a> tag, or whether they are implemented as JS buttons. If you find JS buttons, you should convert these to HTML hyperlinks where possible. This is especially important for navigation menus and prominent internal links where speed is crucial. 05. Optimize debounce time Debouncing is a technique used to limit the rate at which a function executes, which is particularly useful with search bars or any other text input fields where results may auto-populate. Without proper debouncing, each keystroke could trigger a server request, which in turn may overwhelm the main thread and increase input delay and processing times. Search bars usually have a debounce time of 300–400ms. I recommend increasing this to 1000ms in order to reduce the number of server requests. The higher the debounce time, the fewer server requests there will be. It’s important to consider that users may experience a small amount of sluggishness due to slower responsiveness. It’s a balancing act—if you feel that the detriment caused by slowing down your auto-populated results may impact UX, then test out a lower debounce time. 06. Yield to the main thread Yielding to the main thread is a performance optimization that can help reduce the main thread’s blocking time during JS execution. It involves breaking down long JS tasks into smaller, more manageable ones, allowing your code to ‘yield’ control back to the main thread so the browser can handle interactions from the user. Here are two ways to employ this tactic: Use requestIdleCallback to allow non-urgent execution to happen during idle periods (as the name suggests), so that critical tasks can be handled on the main thread first. Apply setTimeout for JS that cannot afford to wait for an idle main thread. This is used as a mechanism to manually split long-running JavaScript tasks into smaller chunks. The basic idea is to break a large task into smaller pieces and use setTimeout with a short delay (usually 0 milliseconds) to schedule the execution of the next piece of work after the current task completes. This allows the browser to handle other tasks (like user interactions or rendering updates) between the chunks, which helps keep the UI responsive. (A brief sketch of both techniques appears after the next paragraph.) How to measure the success of your INP optimizations Now that you’ve hopefully implemented some of the techniques above, it’s crucial to perform an impact analysis to evaluate what worked, what didn’t, and to see if your INP times are moving in the right direction. Just like how we used various tools and techniques to assess what pages needed INP optimization, you need to do the same post-optimization. 
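Here is the sketch promised above: a minimal, illustrative take on debouncing an input handler (method 05) and yielding long work back to the main thread with setTimeout (method 06). The element ID, the 1000ms delay, and the helper functions are assumptions for demonstration only, not part of any specific site or library.

```javascript
// Method 05: debounce — run the handler only after the user pauses typing.
function debounce(fn, delayMs) {
  let timerId;
  return (...args) => {
    clearTimeout(timerId);
    timerId = setTimeout(() => fn(...args), delayMs);
  };
}

// Placeholder for your own request logic (an assumption for this sketch).
function fetchSuggestions(query) {
  console.log('Would fetch suggestions for:', query);
}

// Hypothetical search box: one request ~1s after the last keystroke,
// instead of one request per keystroke.
const onSearchInput = debounce((event) => fetchSuggestions(event.target.value), 1000);
document.querySelector('#search')?.addEventListener('input', onSearchInput);

// Method 06: yield to the main thread — split a long task into chunks and
// schedule each chunk with setTimeout(..., 0) so interactions can run in between.
function yieldToMain() {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

async function processAllItems(items) {
  for (const item of items) {
    // Placeholder for one small piece of the long-running work.
    console.log('Processing', item);
    await yieldToMain(); // let the browser handle clicks and rendering updates
  }
}
```

At the time of writing, newer Chromium-based browsers also expose a scheduler.yield() API for the same purpose, but the setTimeout pattern above remains the widely supported fallback.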
To measure the success of your INP optimizations: Immediately compare pre- and post-optimization performance Monitor INP trends over time Collect and review real user monitoring (RUM) data Immediately compare pre- and post-optimization performance Similar to how you identified poor interactions on a page, you can once again use the trusty Web Vitals extension to obtain interaction timings. Use the extension’s simulated throttle environment and compare timings from before and after optimizations. Monitor INP trends over time Use GSC’s Core Web Vitals report to monitor how your INP metric changes over time. Since this report is based on CrUX data, it provides a reliable measure of real-world user experience. It’s critical to remember that Google’s CWV scores are based on CrUX data collected and aggregated over the previous 28-day period, meaning that you cannot check your page’s ‘new’ INP immediately after optimization. Collect and review real user monitoring (RUM) data Real user monitoring (RUM) provides insights based on actual user interactions with your site, giving you an accurate, on-demand (as opposed to waiting the 28 days) overview of performance compared to lab tools. RUM involves collecting performance data directly from users as they interact with your site. The data collected provides timings for a wide range of metrics, such as TBT and, of course, all CWV metrics. Many solutions also provide detailed insight into how your site performs across different geos, devices, and browsers. Consider using Google’s free CrUX Vis tool, based on the Chrome User Experience Report dataset. Other common solution providers include Rumvision and Debugbear. Interaction to Next Paint: Maximize your responsiveness for better user engagement and search performance Whilst INP may not be the easiest to optimize for, it’s also an opportunity to perform a more holistic user experience audit. After all, INP is just a CWV metric that Google uses to assess general page experience—which is key for your users. Not only will you optimize for the Page Experience algorithm, you are also working toward ensuring that your site visitors have a smooth, responsive interaction that keeps them engaged and satisfied, maximizing the chances of driving meaningful business results. Remember that optimizing for INP is an ongoing, iterative process that requires continuous (or at least periodic) monitoring. By staying vigilant and keeping an eye on your CWV reports in Search Console, you can rest assured that you remain competitive in an increasingly user-centric web landscape. Keep testing, keep optimizing, and you’ll see the results of increased user engagement and search performance. Yossi Fest - Technical SEO Specialist at Wix Yossi Fest is a technical SEO specialist at Wix, where he's passionate about championing technical optimizations for better search visibility. Before joining Wix, he worked as an SEO lead at digital marketing agencies, driving organic growth for enterprise clients. Linkedin
- How to do keyword research with Wix’s Semrush integration
Author: Mordy Oberstein Available in: English , Português , 日本語 , Deutsch , and Français Get started by: Creating a website → Having a strong focus topic is becoming increasingly important for driving organic traffic from Google to your website. What’s more, establishing your focus topic helps to both refine your site’s overall identity and your target audience. One way to approach this journey towards topic and audience refinement is to choose the right focus keywords for your website. This is why we’re proud to partner with SEO toolset provider Semrush to give Wix users the data they need to better establish the focus of their websites. To make the most of these new capabilities, this article will walk you through: An overview of the Semrush integration within Wix Connecting your Wix account to the Semrush integration Getting started with Wix’s Semrush integration How to use the Semrush keyword research integration effectively Using your focus keywords in the SEO Setup Checklist An overview of the Semrush integration within Wix For those unfamiliar with the platform, Semrush is a leading provider of SEO-related data. In short, the platform helps you better understand your digital presence as well as that of the competition. The partnership we’ve established with Semrush focuses on the keyword research data it provides and integrates that data into the initial SEO Setup Checklist (formerly known as the SEO Wiz) found inside the Wix dashboard. We’ll explore the available data in more detail shortly, but first, here’s a quick look at the information you can access via the integration: As we’ll soon see, this data can enable you to refine the core topics (referred to in the SEO Setup Checklist as “keywords”) that are used as part of the foundational SEO setup for your Wix website. By refining the core topics (again, more commonly referred to as “keywords”) you give your site a better chance to rank on the search engine results page (SERP), target qualified traffic, and ultimately bring in more revenue. Once discovered, these core topics (or “keywords”) can then be used to complete the SEO Setup Checklist, as optimizing things like the SEO title tags for your foundational pages ( homepage , about page, etc.) will require you to use one of your core keywords (when done via the Wix SEO Setup Checklist ; this is not a requirement if you are optimizing via the Wix Editor or other parts of the Wix dashboard). Simply put, the workflow when using the Semrush integration in conjunction with the Wix SEO Setup Checklist looks like this: 01. Refine your core keywords using the Semrush integration 02. Select your core keywords 03. Implement your core keywords as laid out in the SEO Setup Checklist In the next section, we’ll take a look at how, exactly, to get started with this process. Before we move on, though, you should know that you do not need to pay to access the Semrush dataset—Wix site owners have limited access for free (more on this later). Connecting your Wix account to the Semrush integration When utilizing Semrush integration data as part of your Wix SEO Setup Checklist (and beyond) the first thing you need to do is connect your Wix account to Semrush. To do this, you need to access the SEO Setup Checklist via the Get Found on Google option in the Wix dashboard (within the Marketing & SEO section of your left-hand navigation panel). As you proceed through the checklist, you’ll be guided to an option to utilize the integration to choose your core keywords. 
If you have already started this process, then simply click the edit button within the SEO Setup Checklist to modify your current keyword selection. In either case, to access Semrush data, you’ll be prompted to establish a connection between Semrush and Wix. Once the connection has been established, you’ll be able to search through Semrush’s database for applicable keywords. There are two things to be aware of here: 01. You can add a maximum of five keywords to your SEO checklist. 02. If you are using the free version of Semrush, you will have the ability to run 10 keyword searches in a 24-hour period. If you already pay for Semrush, your access is dictated by your Semrush subscription. Getting started with Wix’s Semrush integration Now that your Wix site is connected to Semrush, how do you go about using the tool? Let’s first explore the data that is available to you so that we can get into a few actual use cases. For starters, the Semrush data works according to geolocation. That means you will get access to focus keyword ideas and data that are specific to a particular country. This is very important because the results can, in many cases, differ drastically depending on the country you select. With a country selected, you’re ready to get started searching for keywords to utilize within your SEO Setup Checklist (and beyond). To get started, search for a term that is closely associated with the core product, service, or topic that your site focuses on. For example, if my site sells ceiling fans, I might start by searching for the term ceiling fans (we’ll soon see why we need to refine this term in order to choose an effective focus keyword): As you can see above, Semrush returns a slew of data. Let’s quickly look at what we have here (which will, in turn, show us why we need to dig a bit deeper before selecting a focus keyword): Volume — This gives us an estimate of how often the keyword is searched for on Google each month. Trend — This shows the changes to the monthly search volume over time. Difficulty to rank — A composite metric that estimates how hard it would be to rank for the keyword given the competitive landscape on the Google SERP. Search intent — The intent associated with why a user would search for that particular keyword. Search intents can include informational, navigational, commercial, and transactional queries (read our post on keyword and user intent to learn more). In this example, the first keyword option suggested by Semrush, ceiling fans, is searched for around 110K times per month and labeled as hard to rank for, with a trend that might indicate some seasonality (i.e., the dip in the Trend could reflect fewer searches depending on the time of year, which makes sense, as who needs a fan when it’s cold out?). So, while trying to get your site to rank for ceiling fans is alluring because so many people search for the keyword each month, it is, all things considered, not the best place for you to start, as it is extremely difficult to rank for. If you search the keyword on Google (which you should always do when planning your content), you will see all sorts of eCommerce juggernauts, from Home Depot to Amazon to Wayfair, ranking for the keyword. As it stands now, the average site would be highly unlikely to rank for a keyword such as this and, at minimum, this would take a very long time to achieve (and, all other things being equal, only after a gargantuan amount of effort). 
If this is the case, how then do we use the Semrush data integration to choose the right focus topics/keywords? How to effectively use Wix’s Semrush keyword research integration The idea when choosing a focus keyword is that it should be exactly that—focused. Targeting an extremely broad (and therefore competitive) keyword with thousands upon thousands of searches a month is not usually very focused. The sites that rank for these kinds of keywords (keywords like ceiling fans ) have been operating in their respective industries for a long time and are leaders within the space. If that’s you, then great—you can certainly optimize your page’s title tag, headers, body content, etc. to target a broad keyword like ceiling fans . (For the record, an optimized page is not one that is written for search engines—your content should be written for users first .) However, for most sites, this kind of keyword is probably out of reach and might be something worth revisiting as the domain gets stronger. Dig deeper into keywords and topics with Semrush In the meantime, you can use the Semrush integration to dig deeper and find the right focus topics (again, I say topics over keywords because it’s not about a word per se, but how your site targets a topic overall). In our case, we need to ask ourselves if the site has a specific focus or point of differentiation from other sites selling ceiling fans. Perhaps the site focuses on designer ceiling fans or a specific type of ceiling fan, for example. In such a case, I might research the keyword designer ceiling fans : While the search volume is not anywhere near the 110K seen for the head term ceiling fan , it is a far more attainable keyword, with a search volume of 720 and a “medium” difficulty to rank. What’s more, it’s a far more targeted keyword that speaks directly to the target audience. Creating a home page that focuses on this segment of “ceiling fan” is more likely to produce qualified leads and not just rope in users from Google who are bound to be uninterested in the product itself, as most folks searching the term ceiling fans are likely looking for the typical product, not a designer edition of it. Let’s say, however, that you’re not exactly sure of the unique angle you should take. Alternatively, you can scroll through the initial results Semrush offers to see if there is a more targeted phrase that speaks to your business. For argument's sake, let’s assume your site mainly sells outdoor ceiling fans. While not an “easy” keyword to rank for, the term outdoor ceiling fan is far more attainable (and speaks to the business itself) while presenting a very nice search volume of 14,800 searches on Google per month: Still, a quick search on Google shows the same authority juggernauts dominating the SERP: It’s hard to compete for the term outdoor ceiling fans when a huge retailer like Lowes and the manufacturer itself (in this case, Hunter) are dominating the rankings. So, what now? Dig deeper. In this case, I would see if there is something unique about the outdoor fans this business sells, or if they are trying to sell these fans to a particular audience. Perhaps the site is focused on commercial customers. If so, Semrush already offers a topical suggestion that fits with industrial outdoor ceiling fans : Yes, roughly 500 people search for this term per month, not 15K. 
However, it is far more attainable to rank for and far closer to what our fictitious ceiling fan website actually offers (meaning that the site is more likely to produce sales, not just pull in traffic, when ranking for this keyword, all other things being equal). A quick Google search shows that, while big players like Amazon and Home Depot are still ranking, the SERP is also peppered with more niche sites: This tells us that ranking for the keyword, while not exactly easy, is feasible. Thus, targeting the term industrial outdoor ceiling fans with content that speaks to the topic on your home page, in your headers, in your title tag, etc., makes good sense, and it is a keyword to add to your list within the SEO Setup Checklist. There is one more quick point to make here: While you can use the Semrush integration to choose core topics/keywords to utilize in your Wix SEO Setup Checklist, you don’t have to use it that way. Rather, you can use the Semrush integration simply to find good topics to write about, either via a product page or even a blog post. For example, when searching for outdoor ceiling fans, Semrush came back with a suggestion to target the term 60-inch outdoor ceiling fan. While I wouldn’t make such a specific product the main focus of my site, seeing that 390 people search for the product each month, that the term is transactional (meaning the intent of the searcher is to buy something), and that it’s an “easy” keyword to rank for, I would definitely create a specific product page if I offered such a product. I might even write a blog post about 60-inch outdoor ceiling fans! The point is, you can use the Semrush data either to help you complete the Wix SEO Setup Checklist or just to find ideas about what to write about and target! Finding the right informational keywords and content focus Let’s run through another example, this time focusing not on eCommerce content but on informational content (like a blog). This time, let’s imagine our site is a gardening blog. We might be tempted to run a keyword search for the term gardening through Semrush. If we do, this is what we’d get back: The results include a lot of very high search volume suggestions that are either extremely hard to rank for or are just irrelevant to a gardening blog (or both). Again, let’s focus our search: Does our blog talk about the history of gardening? Does it offer tips about gardening? Assuming the latter, let’s run a search for gardening tips through the tool: Now, the keyword gardening tips is most likely going to be very competitive and hard to rank for, as it’s a pretty broad term that includes every variety of gardening tips. Instead, I would focus on one segment of tips. Perhaps the blog has a focus on beginner content. In that case, the suggestion (shown above) for gardening tips for beginners would be a logical place to start, as it still brings in a lot of traffic from search each month with a volume of 1K, but at the same time is not considered to be “hard” to rank for. Again, it’s not just about topics and keywords to use as part of the Wix SEO Setup Checklist. Use the Semrush data to see what other topics your site could discuss. 
For example, if we now search for what became our focus keyword in gardening tips for beginners , we get some nice ideas for a few blog posts: While they may not be the main focus of the site, a post on flower gardening tips for beginners and another on vegetable gardening tips for beginners could be a nice addition to this fictional site, and would reinforce the general focus around beginner gardening tips. Alternatively, you could simply enter another aspect of the overall topic into the tool and find other content ideas. In the screenshot below, I searched for gardening soil and got back the kernel of what might be a nice post on the difference between gardening soil and potting soil: Again, the point is not to limit yourself to only using the Semrush integration for the pages associated with your SEO Setup Checklist . Use the tool to its full capacity by finding new topics to write about on your site. And you can do all of this without ever leaving Wix. Of course, you can use the full Semrush toolset as well and can upgrade your Semrush account directly within the Wix dashboard: Coming full circle: Using your focus keywords in the SEO Setup Checklist Before we wrap things up, I’d like to come full circle and explain what to do with the focus keywords you end up selecting via the Semrush integration. For this, let’s use the term we decided on above: industrial outdoor ceiling fans . For starters, if the site solely sells this sort of fan, we might create a title tag for the homepage along the lines of: Industrial outdoor ceiling fans by Name of Business You don’t need to go to the Wix Editor to do this, you can add it directly in the SEO Setup Checklist : Along with that, I would be sure to include content on the homepage that broadcasts that this site focuses on selling industrial outdoor ceiling fans. To that end, I would either include an H1 or H2 header that discusses that the site sells that type of product, with at least a short paragraph further explaining what exactly it is that the site offers in this regard (for more on optimizing your homepage, watch our webinar on homepage SEO ). It’s also possible that industrial outdoor ceiling fans are just one type of product that I offer. Perhaps my site sells all sorts of industrial fans. I might then have a dedicated landing page or collection page honing in on industrial outdoor ceiling fans. In this case, I would optimize those landing or collection pages by doing the same—writing a title tag along with headers and body content that aligned with my products. Lastly, I want to reemphasize that good content focuses on the user, not on having certain phrases in certain places. If you set your sights on creating well-structured content that is focused and speaks to a target audience, you’ll likely create content that follows SEO best practices as a natural result. It’s all about quality content that makes the user’s experience as seamless and purposeful as possible. If you create content that aligns with that credo, you can’t go wrong. Mordy Oberstein - Head of SEO Branding, Wix Mordy is the Head of SEO Branding at Wix. Concurrently he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
- Microsoft Clarity: Get more from your SEO by improving UX and conversions
Author: Celeste Gonzalez Get started by: Creating a website → SEOs and website owners typically have the same goals: to improve rankings, traffic, and conversions. We do this by creating great content and optimizing website performance. But, there are other things you can consider to further improve your site performance—both for visitors and your own business. Microsoft Clarity, while not a dedicated SEO platform, is an important user experience analytics tool to have in your toolset because it is SEO adjacent. This means that—although it won’t directly help you improve rankings—it will help you better understand your users, view your website through their eyes, find out how they are interacting with your content and features, and make the most of the traffic you’re already bringing in. (After all, the best SEO in the world isn’t very helpful if users simply leave after landing on your website.) And, with the recent launch of Wix’s Microsoft Clarity integration, these insights are more accessible than ever. Let’s take a look at what you need to know to start improving your website with Microsoft Clarity. Table of contents: What is Microsoft Clarity? How to add Microsoft Clarity to your website Microsoft Clarity and Google Analytics 4 User experience metrics within Microsoft Clarity Recordings Heat maps How to use Microsoft Clarity to improve UX and conversions Improve user experience Improve conversions Wix’s Microsoft Clarity integration What is Microsoft Clarity? Microsoft Clarity is a free user behavior analytics tool that allows you to see how visitors interact with your website (or your client’s website). Some of Clarity’s most compelling benefits include: No traffic limits Free of charge Does not slow down your site Easily integrates with websites on Wix and other CMSs Access to real-time data Heat maps Session recordings GDPR- and CCPA-ready Compatible with Google Analytics 4 (GA4) and other tools How to add Microsoft Clarity to your website To start gathering data with Microsoft Clarity, you must install a tracking code into your site either manually, through a third-party platform, or by sharing the code with a developer to install. It’s as simple as copying and pasting the code into the <head> of your site, or using Google Tag Manager to install the code! Once the code is installed properly, you’ll gather real-time data immediately. To install Microsoft Clarity on your Wix website, refer to the section on Wix’s Microsoft Clarity integration. Microsoft Clarity and Google Analytics 4 While you can obtain some common user data from both Clarity and GA4, these tools are actually better as complements of one another—not replacements. SEOs and digital marketers can connect the two via Clarity’s GA4 integration. This allows you to see session playback links within your GA4 dashboard. User experience metrics within Microsoft Clarity Clarity’s dashboard shows some engagement and UX metrics that digital marketers may already be familiar with (via other tools like GA4), as well as metrics that are unique to Microsoft Clarity. This includes: Sessions Pages per session Scroll depth Active time spent Users overview Insights overview JavaScript errors, etc. In addition, your session recordings and heat maps have their own dashboards that you can parse through as well. 
Metric Description Example use case Sessions Clarity’s dashboard shows you total sessions (over the designated time period), but also includes more details like: Sessions with new/returning users Live users Unique users Pages per session These metrics exclude bot traffic. You can see how different audiences (e.g., new vs. returning users, users in the US vs. another country, etc.) use the site and decide if there are ways to better optimize page layout. Scroll depth This shows you how far users scroll down the page. Look at your average scroll depth to see if users are seeing all the important information on the page. If users aren’t scrolling to that information, you now know to place it higher on the page. This can also help you determine where to place calls-to-action . Active time spent The amount of time a visitor was using your site (scrolling, reading, etc.). This does not include time where your site was hidden behind tabs or windows. If it’s an informational page, you likely want to keep visitors on there for a good amount of time. However, if you see that they leave the page quickly, then the page probably isn’t answering their questions. There could be an opportunity to update the content or organize it more effectively. Dead clicks Unique to Microsoft Clarity, this metric tells you how often users think something on your website is clickable, but actually isn’t. A high percentage of dead clicks can indicate that your users are confused, perhaps due to your usage of colors, an unintuitive page layout, etc. Use dead clicks to see what you should make interactive or clickable on a page. Or, you can use it to understand how to change up your website’s design layout so users stop clicking where they shouldn’t. Rage clicks This is when a user rapidly clicks or taps on a small area of your web page. Similar to dead clicks, rage clicks indicate that users expect your website to respond when they click on that area. In this case, though, the succession of clicks also signals user frustration. This can mean that an element that definitely should be clickable, like a button, is broken and should be addressed ASAP. Excessive scrolling Excessive scrolling is exactly what it sounds like: when a user scrolls up and down a page more than is expected. This is generally indicative of a user not being able to find what they are looking for easily. This could mean that the information on the page is not organized in the most logical way. Add a table of contents to the page so users can easily find what they are looking for rather than scroll forever to find it. Quick backs This is the ratio of users that go to a page and then immediately navigate back to the previous page. This lets you know that users didn’t find the next page useful. A quick back can mean a problem with the navigation, internal linking structure, or that the new page is just not helpful or relevant to the user. At the top of the Clarity dashboard, you have the option to get more granular by segmenting your data. You can filter by time frame, device, browser, user action, product, and more. You can even create custom filters by using custom tags to identify specific users and their behavior on your site. Recordings Clarity’s recordings offer a way to watch how users interact with the pages and elements on your site. You can even see recordings for specific metrics, like recordings that show dead clicks or quick backs. 
In addition to the actual recording, Clarity shows you: The page the user entered the site on The user’s exit page The number of pages visited during that session Session duration Number of clicks Device Operating system Country You can use this data to optimize a variety of scenarios. Here are some examples: Metric Description Example use case Entry and exit pages See which page a user began their journey on your site with and where they ended up. When looking at these pages, ask yourself, are users flowing through your site the way you intended them to? If a user starts on a service page, you likely want them to end up on the contact page. Is this happening? If not, maybe there aren’t enough CTAs or they aren’t well placed. Number of pages visited See how many pages a user visited in a single session. Does this number make sense based on the session you watched? Was the user bouncing around from page to page searching for an answer until they finally found it? You can use this metric to help determine if things are missing from the user’s entry page or other pages they visited on your site. Device Segment behavioral data based on whether a user visited the site from their mobile device, PC, or tablet. Perhaps you’ve noticed that desktop users convert better than mobile users. It could be that the pop-up looks different on mobile than it does on desktop, which makes it more difficult to fill out. Note: Clarity only stores recordings for 30 days unless you designate a recording as a “favorite,” in which case you’ll have access to it for 13 months. Heat maps Heat maps allow you to see how users interact with your page, where they scroll and stop, etc. You can use them to help identify user trends and patterns by looking at what areas of a page have the highest and lowest amounts of engagement. Clarity divides heat maps into four categories: Click maps Scroll maps Area maps Conversion maps Use heat map data to determine where to place elements on a page for better visibility, engagement, and conversions. How to use Microsoft Clarity to improve UX and conversions Clarity provides you with a variety of different metrics (and you can even combine it with your Google Analytics data, as mentioned above). This data offers you insight into how users interact with your site, so it only makes sense to use it to improve their overall experience, which can then improve your conversions. Improve user experience Let’s dive into a few different examples of how you can use Clarity data to improve a user’s experience with your website. Zara is an international clothing brand with a reputation for its frustrating website. In addition to monitoring social media for user complaints about its site layout, the brand could also use Clarity to pinpoint the cause of its users’ frustrations. When shopping online for clothes, users are used to swiping left to see the next image of the product they’re interested in. If you do this on Zara’s site, instead of seeing the next photo, you swipe to the next product in whatever category you are looking at. This is likely leading to lots of quick backs to get back to the product the user was originally viewing. By resolving this issue on mobile, Zara can help its very annoyed customers, which should ultimately lead to more customer transactions. This was just one example use case. 
Here are some other ways you can improve your user experience, according to other Clarity metrics and features: Metric/issue Tip Excessive scrolling on a particular page A table of contents may be necessary to outline the information on the page, so the user can get to it quickly and easily. Surplus of rage clicks An element on your web page probably looks clickable (e.g., an icon, an underlined word that looks like a link, an image, etc.). You should either make it clickable or change up the layout so people stop thinking it is. Heat map shows lots of clicks on pieces of text within your content Users might be highlighting that information because they thought it was important. This could be a good place to add a link to supporting content, or to even highlight it for users by making the text larger to call more attention to it. Scroll heat map shows that users are not scrolling far enough to see the key information This is a sign that you should change up the content’s layout and move your key information further up the page. High quick back rate on a particular page The page likely doesn’t have the information users actually want. Depending on the context, you can: Link to a different page instead Change the anchor text Update/refresh the content Improve conversions As mentioned before, improving your site’s overall user experience will, as a natural outcome, likely improve your conversions, too. For example, in a case study by Microsoft, users were confused about how to interact with the setup form. There was a blank in the statement above the input box (as shown in the image below), and users thought they should click on the blank to fill it out, rather than clicking on the box underneath the statement. After fixing this (and two other UX issues), the company in the case study, a prenuptial agreement planning platform, saw a 32% increase in revenue compared to the previous month. The company’s goal was to help users navigate the site more easily, and by doing so, they were able to cash in. You should also use the data to explicitly call attention to forms and other CTAs to increase conversions. In a case study by the agency I work for, RicketyRoo, we looked at the homepage to see where we could urge more users to contact our client. After reviewing the heat maps and click data, we noticed that there was a button sending users to watch a video when it should have sent them to the contact page. The homepage had few conversions before, so after fixing the button and moving it to a highly visible place where users were already clicking, we were able to greatly improve form submissions. Here are some other ways you can use Clarity data to improve your conversion rates: Metric/issue Tip Incomplete form submissions Check to see if your forms and/or pop-ups are broken. Verify that the form can be successfully filled out and closed. Users are not scrolling to important information on the page Serve that information to users above the fold (or somewhere that at least 50% of users scroll to) and evaluate the impact. For example, instead of a giant list of products, include a search function above the fold. Increasing number of quick backs It could be that your anchor text is confusing to users and leading them to a page that does not include the information they are looking for. JavaScript errors These could come from third-party plugins that are causing issues for your users or something as simple as a missing parenthesis in your code. 
Pay attention to JavaScript click errors in particular, as these refer to errors that occur after a user clicks. Wix’s Microsoft Clarity integration Wix and Microsoft have partnered to give you access to Clarity from directly within the Wix platform. To set up the integration, log into Wix, then: Locate Microsoft Clarity in the Wix App Market . Add Clarity to your website. Sign into Clarity from Wix to either create or connect an existing Clarity project. The Clarity experience within Wix allows you to create or link projects and view data to implement the UX and conversion improvements mentioned above. Pair these insights with Google Search Console data in your Wix Analytics to optimize your website to capture users from search results and guide them all the way through to conversion. Getting website visitors is only half the battle As I said in the introduction, flawless SEO won’t matter if visitors bounce from your website because they couldn’t see or do what they came for. Auditing your user experience enables you to go through your customer journey the way your audience does, which in turn allows you to uncover the oversights that may be chipping away at your leads and revenue. As with any SEO testing methodology, make note of the changes you implement and how they impact your website performance. This way, you can develop best practices for your specific niche and audience, enabling you to further optimize the customer experience throughout your marketing funnel. Celeste Gonzalez - Director of RooLabs at RicketyRoo Celeste Gonzalez leads RooLabs, RicketyRoo's SEO testing division, where she drives innovative strategies and engages with the SEO community. She is passionate about pushing SEO boundaries and sharing insights on both successes and challenges in the industry. Twitter | Linkedin
- How to use real-time analytics
Updated: October 28, 2024 Author: James Clark In the world of web analytics , it’s easy to dismiss real-time (or live) analytics as a vanity exercise—after all, aren’t trends over time more important? Absolutely, and real-time data can even ensure that your trends data is more reliable by helping you troubleshoot, monitor marketing activity, and make better informed decisions on the fly. In this article, I’ll show you what you should (and just as importantly, shouldn’t) use real-time reporting for. Next, we’ll dive into Google Analytics 4 and explore a couple of more advanced techniques, before considering the benefits of a dedicated real-time analytics tool and alternative sources of real-time data. Table of contents: What is real-time analytics? How to use real-time reporting for better campaign results Debugging or troubleshooting Monitoring marketing activity Making decisions in real-time What real-time analytics can’t tell you Real-time analytics in GA4 Comparisons in the GA4 Realtime overview Audiences in the GA4 Realtime overview Does GA4 update in real-time? Dedicated real-time analytics tools Other sources of real-time data Real-time analytics on Wix What is real-time analytics? Real-time analytics/reporting refers to a collection of data that reflects the most recent activities and actions of users on your site. This can include the number of visitors, the pages they visited, traffic sources, events triggered, and so on. Many analytics tools, including both Google Analytics 4 and Adobe Analytics, offer some kind of ‘real-time’ reporting. Some marketers may treat this kind of report as a vanity exercise: it’s nice to know that there are five people on my site at the moment, but how exactly does that help me make business decisions? Site owners will often move on to other reports where they can access historic data and start to understand trends over time. But, real-time reporting is incredibly useful once you understand how to apply it. There are two main styles of real-time reporting depending on the analytics tool you use (and some tools offer both): Event stream: An event or activity stream lists the events that have happened most recently on your site (in reverse chronological order). This almost always includes page views, but could also include button clicks, form submissions, purchases, or any other event you are tracking with the tool. Mixpanel and Piwik Pro are examples of tools that offer this kind of reporting. (GA4 also has an event stream, but as part of its DebugView rather than the real-time reports.) As an example, here’s an event stream in Mixpanel: Overview report: This type of report shows aggregated information about recent activity on your site. For example, GA4’s Realtime overview shows you how many users visited your site over the past 30 minutes (and, since May 2024, five minutes— GA4 is still evolving !). It also displays the number of pageviews per page and a count of each event: No matter what tool you use, it will take time for that tool to collect and process data. This is often referred to as ‘latency’—the higher the latency, the lower the data freshness. Even so, the advantage of real-time reporting is that it includes the freshest data the tool can offer you. In short, real-time reporting is not so much ‘real-time’ as it is ‘very recent activity.’ But that’s still hugely valuable, as you’ll soon see. 
Note: Wix site owners can access their Real-time Analytics (which includes an overview section as well as an activity stream) by going to Analytics > Real-time in their Wix dashboard. How to use real-time reporting for better campaign results While you wouldn't necessarily use real-time analytics to report on the success of your marketing campaigns, it plays a vital role in ensuring they run smoothly. Real-time data supports you in troubleshooting your tracking code, checking that your campaigns have deployed as planned, and making quick-fire marketing decisions. Let's look at all three of these use cases. Debugging or troubleshooting The most common use for real-time reporting is debugging or troubleshooting. Piwik Pro even calls its real-time event stream the “ tracker debugger ” in recognition of this. Real-time reporting (or whatever your tool labels it as) gives you the freshest data, making it very useful for checking whether tracking codes are working and that the tool is capturing data at all. After all, why wait a day for data to appear in the standard reports when you can check a real-time report after just a minute or two? Another troubleshooting strength of real-time reporting is that it not only shows you traffic on your site, but also lists the events that have taken place. On GA4’s Realtime overview, the “Event count by Event name” card is a great example of this. It lists page_view events of course, but also session_starts, scrolls, and any custom events you may be sending. Clicking on the name of an event then displays the event parameters that were captured. For the page_view event, that includes page_title, medium (for example, “organic”), and source (for example, “Google”): This level of detail makes the Realtime overview useful for ad hoc troubleshooting on low-traffic sites. For more complex debugging, it would be better to use the dedicated DebugView with its own events stream, as this can be used to only show events from your own device: Monitoring marketing activity Real-time reporting is also useful for checking that marketing campaigns have deployed as planned, and for monitoring the impact of those campaigns in near real-time. Let’s say you’ve scheduled a product email to go out to 10,000 people at midday—that will generate a spike in traffic (and hopefully sales) that you can see in your real-time report. If you don’t see those things, you may want to double check your email platform. In addition, real-time reporting lets you see the impact of a trending social media post or blog post almost immediately. (Of course, the challenge here is knowing when something is going to be trending so you can monitor the analytics.) But, it’s not just digital marketing that you can monitor with real-time reporting. Some traditional marketing campaigns (such as radio advertising) could cause a spike in activity on your site as well. And, if your CEO wants to know immediately how that expensive ad campaign performed, you’re far better providing some initial insight from real-time reporting than saying you have to wait until tomorrow to get data from the standard reports. Making decisions in real-time Real-time reporting is particularly useful when it comes to helping you make decisions about things happening live on your site, such as: Broadcasts/livestreams Flash sales A/B tests Here the emphasis isn’t so much on passively monitoring activity, but on using data to make decisions in real-time. Let’s say you’re planning to run an important webinar scheduled for 11AM. 
Should you start exactly on the hour, or should you wait until a couple of minutes past? Depending on your setup, your webinar tool may tell you the number of people that have signed in, but it won’t tell you the number of people on your site who are still in the process of doing so. This is where real-time analytics can fill the gaps and help you build a picture of activity on your site in order to get your timing spot-on. This is possible not just because real-time data is fresher, but because it’s also more granular—it lets you look at smaller time periods. The smallest time dimension available in GA4 outside of real-time reporting is hourly, and even then you have to customize a report or build an Exploration (like the one shown below) to take advantage of it. This makes it unsuitable for making real-time decisions. (Not to mention that Explorations can’t look at the current day’s data.) To give another example, let’s say you hold a virtual event with a number of short presentations by different speakers. Granular data would let you identify the individual presentations—or even the parts of presentations—that were less engaging and were causing your audience to drop from the event. Daily or hourly data would be much less useful here. What real-time analytics can’t tell you Real-time analytics gives you the freshest data, often covering a specific window of time (the most recent 30 minutes, for example). This makes real-time analytics the wrong choice for any sort of trend analysis . If you want to know whether sales have been increasing over the past year, turn to your standard reports. For the same reason, analytics tools won’t let you compare your real-time analytics data to a previous period. If you’re interested in year-over-year, month-over-month, or even day-to-day comparisons, again you should turn to your standard reports. The date picker on GA4’s standard reports has options for comparing against the previous period, the same period last year, and a custom period of your choice: You may also find that many of the dimensions and metrics you’re familiar with from your standard reports aren’t available to you in your real-time reports, especially session-based metrics ( such as bounce rate ). We’ll look at what that means for GA4 in particular in the next section. Finally, be aware that real-time reporting is unlikely to be entirely accurate. Most tools are set up using client-side tracking, where data is sent from the user’s web browser to the analytics service. But some users will block your tracking script with their browser or ad blocker settings—which means your tool will under-report the number of users on site. Your approach to consent will also affect the accuracy of real-time reporting. For example, if you’re using basic consent mode with GA4, then your Google tag will be blocked until the user grants consent via your consent banner . This, again, can lead to under-reporting. These are considerations with all analytics, not just real-time reporting. Real-time analytics in GA4 Now that you understand what real-time analytics is (and what it isn’t), let’s dive deeper into the real-time functionality in GA4. Once you’ve selected your Property in GA4 , you’ll find the real-time reports near the top of the Reports menu (just below Reports snapshot). There are two real-time reports: Realtime overview and Realtime pages. 
Note: If you’re a long-standing Google Analytics user, you may remember that Universal Analytics (the previous version of GA) had a whole suite of real-time reports. In addition to the overview report, there were individual reports focusing on user locations, events, conversions, and more. With GA4, Google has consolidated all of this information into two reports. But, as the Realtime pages report launched more recently (in October 2024), Google may well continue to expand GA4’s real-time reporting capabilities. Let’s look at the Realtime overview first. As with GA4’s standard reports, this consists of a number of “cards,” each of them summarizing one or two dimensions and metrics: for example, “Views by Page title and screen name” or “Event count by Event name.” And as is usual for GA4, some of the cards allow you to switch between dimensions and metrics using a drop-down: “Active users by First user source” can be changed to “Active users by First user medium” or “Active users by First user campaign,” among other options: One difference, though, is that the “customize report” option is missing. That means that, unlike the standard reports, you can’t rearrange, add, or remove any cards on the Realtime overview. The second real-time report, Realtime pages, is the easiest way to see which pages on your site users recently viewed. The large table lists the pages by page path (the end part of the URL, after the domain name) and gives the number of active users and views for each one over the past 30 minutes: Although this is similar to the “Views by Page title and screen name” card in the Realtime overview, having a dedicated report makes the information easier to understand at a glance. I could see it being used in a newsroom or up on a screen in a busy marketing department. Now let’s look at a couple of more advanced techniques we can use in GA4’s real-time reporting: Comparisons and Audiences. Both of these techniques work in the Real-time overview but not the Real-time pages report. (And if you’re new to GA4, you might want to check out our guides to getting started with GA4 and key events and conversions in GA4 first.) Comparisons in the GA4 Realtime overview Many of the dimensions and metrics you may be familiar with from the standard reports are absent from GA4’s Realtime overview. For example, none of the cards include the “browser” dimension, so there’s no way to see a full breakdown of your users’ browsers in real time. However, you can use the Comparison feature to get at least a little insight into this. Let’s say you wanted to know how many of your real-time users are using Chrome: 01. Go to the Realtime overview. 02. Click the Add comparison + (beneath the search bar) to open the “Apply a comparison” panel. 03. Click + Create new . 04. In the Dimension dropdown, choose Browser . 05. In the Match Type dropdown, choose exactly matches . 06. In the Value dropdown, tick Chrome . Your panel should look like this: Finally, click Apply . Now, the real-time overview will show you two versions of each card, the original (left) and a new version with the comparison applied (right). Here you can see that the site has had four active users in total over the past 30 minutes but only three of those were using Chrome: This approach works with other dimensions, too. 
But depending on your choice, you may find that some of the cards display, “Real-time data is not supported for this comparison.” For example, if you base a comparison around “screen resolution,” then the Event count and Key events comparisons will not be available. This is a limitation of GA4’s reporting. Audiences in the GA4 Realtime overview One of the cards in GA4’s Realtime overview lets you break down Active users (or New users) by Audience. Probably the most common use for GA4 Audiences is as a targeting option in Google Ads—but if you don't run any paid advertising, you may not be familiar with the Audiences feature. So what are Audiences, how do you build them, and how do they relate to real-time reporting? Audiences are groups of your users that meet particular conditions like “Browser = Chrome” or “have made a purchase.” So, they are similar to comparisons in some ways, but more powerful because they can also consist of users that performed a particular event. To build an audience: 01. With your property selected, go to Admin > Audiences. 02. Click on New Audience. 03. You could use one of GA4’s “reference” audiences (such as “Purchasers”), but for now let’s Create a custom audience. 04. Click on Add new condition and add a condition based around either a dimension (e.g., Browser) or an event (e.g., Click). 05. If you choose a dimension, click on Add filter to finish writing the condition—for example, Browser exactly matches (=) Chrome. 06. Optionally add more conditions to either include or exclude other groups from your audience. Once you’ve added your condition(s), the Summary in the bottom-right of the audience builder will give you an estimated audience size (based on the last 30 days' activity on your site): Although the Summary might suggest otherwise, audiences always start with zero members—they aren’t retroactive. For example, if you create an audience of “Purchasers” at midday on February 1, only users making purchases from that moment onwards are added to the audience. The Summary is only showing you how big your Audience might be by now if you had created it 30 days ago. This means there’s no point creating an Audience and then immediately hoping to use it in the Realtime overview. If you want to see how many users who completed a “sign_up” event are currently on your site, you need to have created that audience long enough ago for it to be meaningful. (Users can remain in an Audience for up to 540 days depending on the “membership duration” setting you chose when building the Audience.) If you do plan ahead, the combination of Audiences and real-time reporting can be incredibly powerful. Imagine you’re running a live event on your site designed to target a particular subset of users: previous purchasers from France, for example. Now, you’ll be able to tell at a glance whether you’re attracting the right audience or whether your messaging has appealed more to a different group. Does GA4 update in real time? You’ve already seen that one of the advantages of real-time reporting is the freshness of its data. So how fresh is ‘fresh’ when it comes to GA4? Focusing on the standard reports first, Google gives a processing time of 12 hours for daily data—or longer for the biggest sites. And this is the “typical” processing time, by no means guaranteed. GA4 is different from the old Universal Analytics, which had a stated processing latency of “24-48 hours,” but would often make data available within an hour or two. 
With GA4, 12 hours often really does mean 12 hours. To put that in context, if you wake up one morning and log in to GA4 to check the previous day’s figures, don’t be alarmed if it looks like traffic on your site has slumped. It may be that you aren’t seeing the complete picture for that day yet. So, when using the standard reports, it is safest to leave at least one full day before checking the data—in other words, don’t go checking Wednesday’s figures until Friday at the earliest. After all, you wouldn’t want to risk making business decisions on incomplete and potentially misleading data. And if you’re using GA4 to track activity on an app rather than a website, you may want to wait even longer. As the Analytics Help site says : “When a user’s device goes offline (for example, a user loses their internet connection while browsing your mobile app), Google Analytics stores event data on their device and then sends the data once their device is back online. Google Analytics ignores events that arrive more than 72 hours after the events are triggered.” Compared to the standard reporting, the real-time reports have an amazingly quick typical processing time of “less than one minute.” This is the case for both free and paid (360) GA4 properties, although this processing time is not guaranteed by the 360 SLA. Nevertheless, it offers by far the freshest data available in GA4. But be careful how you interpret that data: The real-time reports will tell you the number of users to have visited within a five-minute and a 30-minute window, but not whether those users are still on the site. So, if you have “100 users in the last 5 minutes,” it may be that all 100 are still on the site or that all 100 have left. The reality, of course, is probably somewhere in between. Dedicated real-time analytics tools If you absolutely need to know the number of users on your site at any given moment (rather than within, say, a 30-minute window), consider using a dedicated real-time analytics tool such as GoSquared or Realtime.li . These are designed to provide exactly that information, as this section of the Realtime.li dashboard demonstrates: GoSquared’s approach is unusual in that it uses a technology called “pinging” to check that visitors are still on your site. This means that if they leave, they will be removed from the live visitors count in around 30 seconds. On the other hand, if they are sitting on your site doing something passive (such as watching a long-form video), they will still be counted towards the live visitors total. Generally speaking, other analytics tools would stop counting these passive visitors after a set period of time. On the downside, a dedicated live analytics tool won’t offer you the same level of historic reporting as a more general platform might. This means you’ll likely end up running two tools on your site at the same time—for example, Google Analytics 4 for tracking trends over time and GoSquared for real-time reporting. Whichever tools you use, don’t expect them to give you identical results—every tool defines its metrics differently, even if they sound similar (sessions, visits, and so on). So, be consistent about when you use one tool and when you use the other. Other sources of real-time data I’ve shown you how to use real-time data both in all-purpose analytics tools such as Google Analytics 4 and dedicated real-time analytics tools such as GoSquared. But, I’d like to leave you with the thought that analytics tools aren’t your only source of useful real-time data. 
In particular, if you’re live streaming video content, it’s likely that your platform will be able to provide some great insights. YouTube, for example, can tell you the number of “concurrent viewers” (i.e., the number of viewers watching your stream simultaneously) as well as the “peak concurrent” (i.e., the highest figure you have achieved during the stream): Facebook Live, Vimeo, and IBM Player all have similar metrics. In addition, your platform may give you details on specific interactions, such as “Likes” or “chat rate” (the number of messages sent in live chat per minute). As with real-time analytics data from your website, you can use data from your streaming platform for troubleshooting or for identifying the most and least engaging parts of the livestream. Also, look in your website back office to see what data is available to you there. Most site builders and platforms now offer some level of built-in analytics , and this can be expanded through the use of third-party apps or plugins. Real-time analytics on Wix Wix site owners can access their real-time analytics by going to Analytics > Real-time in their Wix dashboard. You can also view a list of your Recent visitors over the last 24 hours, as well as a breakdown of each action they took during that session. Refer to the Live activity panel (shown above) to see which actions were recently taken on your site, including: Viewing a store product Entering the checkout flow Viewing a blog post Adding an item to a cart Becoming a new contact Booking and/or scheduling a service Completing an order Learn more about all of Wix Analytics’ real-time reporting capabilities. Real-time analytics: The right data for the right purpose Real-time analytics (while not exactly ‘real time’ ) is a source of genuinely useful data. It can answer questions that your regular analytics reports simply can’t. But it’s also a specialized tool, designed to do one thing and do it well . That means it supplements (rather than replaces) any analytics you’re already using. To extend the metaphor, it’s an extra tool in your toolbox. That said, there are many different sources of real-time data, and each will tell you something slightly different in a slightly different way. So ask yourself: do you need real-time data for troubleshooting, for monitoring marketing campaigns, for live events, or for something else entirely? The answer to that question will help you identify the right source of real-time data for you, and ensure you are using it for genuine business purposes rather than an ego boost each time you log in. In other words, it will help you keep it real . James Clark - Web Analyst James Clark is a web analyst from London, with a background in the publishing sector. When he isn't helping businesses with their analytics, he's usually writing how-to guides over on his website Technically Product . Twitter | Linkedin
- How to Use Wix SEO Settings
Speaker: Crystal Carter | 14 min In this video, you’ll learn how to use Wix SEO Settings and Edit by Page feature. Learn how to edit meta descriptions , title tags , and other SEO meta tags for your pages either individually or in bulk by page type. Designed to empower users with efficient and scalable tools for optimizing their Wix websites, these features are built into the Wix CMS and do not require a plugin. Read More
- Intro to technical SEO: A guide to improving crawling and indexing for better rankings
Author: Aleyda Solis The highest quality content on the web won’t get any search traffic if technical configurations aren’t correctly optimized for effective crawling and indexing. On the other hand, stellar technical SEO can help guide search engines (and users) to your most important pages, enabling you to bring in more traffic and revenue. In this article, I’ll guide you through the key concepts, configurations, and criteria necessary to fully leverage technical SEO for your website. Let’s begin. Table of contents: Technical SEO: What it is and why it’s important Crawlability, indexability, and rendering: Fundamental technical SEO concepts Technical SEO configurations to understand and optimize HTTP status URL structure Website links XML sitemaps Robots.txt Meta robots tags Canonicalization JavaScript usage HTTPS usage Mobile friendliness Structured data Core Web Vitals Hreflang annotations Technical SEO: What it is and why it’s important Technical SEO is the practice of optimizing your website configurations to influence its crawlability, rendering, and indexability so that search engines can effectively access and rank your content. This is why technical SEO is considered essential and one of the main pillars of the SEO process. It’s referred to as ‘technical’ because it doesn’t pertain to optimizing on-page content , but rather optimizing the technical configurations (e.g., HTTP status, internal linking, meta robots tags, canonicalization, XML sitemaps) with the goal of ensuring that search engines can access your content. It’s crucial to understand that while you don’t need to be a web developer or know how to code to handle technical SEO, you do need to grasp the basics of how websites are constructed. This includes understanding HTML and how other web technologies, like HTTP and JavaScript, function. This knowledge helps you evaluate and confirm that your website is optimized effectively for search. Overlooking technical SEO can lead to your pages not appearing in search results, ultimately resulting in lost opportunities for rankings, traffic, and the revenue that comes with it. The fundamental technical SEO concepts: Crawlability, indexability, and rendering Search engines, like Google, begin the process of providing results to users by accessing website pages (whether they’re text, images, or videos)—this is known as crawling . Once they’ve accessed and downloaded this content, they analyze it and store it in their database—this is known as indexing . These are key phases of the search process and you can influence them through the technical setup of your website. Let's take a closer look at each of these phases to understand how they function, and why and how you’d want to optimize them. Crawlability: Search engines discover your website pages through a process called ‘crawling’. They use ‘crawlers’ (also known as ‘spiders’ or ‘bots’) that browse the web by following links between pages. Search engines can also find pages through other means, like XML sitemaps or direct submissions through tools like Google Search Console . Some search engines (including Microsoft Bing, Yandex, Seznam.cz , and Naver) use the IndexNow protocol (which Wix supports) to speed up discovery when you create or update content. Popular search engines have their own crawlers with specific names. For instance, Google’s crawler is called ‘ Googlebot ’.Websites can control which search engines access their content through a file called robots.txt , which sets rules for crawling. 
To ensure search engines can find and access important pages while preventing them from accessing unwanted ones, it’s crucial to optimize your technical configurations accordingly. Indexability: After a search engine crawls a webpage, it analyzes its content to understand what it’s about. This process, known as indexing, involves evaluating the text-based content as well as any images or videos. In addition to HTML pages, search engines can often index content from text-based files, like PDFs or XMLs.However, not every crawled page will get indexed. This depends on factors like the originality and quality of the content, certain HTML configurations like meta robots and canonical annotations, and reliance on JavaScript for key design and content rendering, which can make indexing difficult.During indexing, search engines check if a page is a duplicate of others with similar content and select the most representative one (referred to as the ‘canonical page’) to display in search results. Therefore, it’s crucial that you correctly configure and optimize these different elements to ensure effective page indexing. Rendering: If your website utilizes client-side JavaScript, search engines need to perform an additional step called ‘rendering’ to index your content.Client-side JavaScript rendering involves using JavaScript to create HTML content dynamically in the browser. Unlike server-side rendering, where HTML is generated on the server and sent to the browser, client-side rendering starts with a basic HTML file from the server and uses JavaScript to fill in the rest.Because of this, search engines have to execute the JavaScript before they can see the content. While search engines like Google and Bing can render JavaScript to index the page, it requires more resources and time, and you might encounter limitations when relying on client-side rendering on a large scale. That’s why, when using JavaScript, it’s best to opt for server-side rendering to make indexing easier. Technical SEO configurations to understand and optimize Now that you understand the considerations that technical SEO seeks to optimize, let’s look at the different configurations that influence your technical SEO and how to optimize them to maximize your organic search visibility. I’ll cover: HTTP status URL structure Website links XML sitemaps Robots.txt Meta robots tag Canonicalization JavaScript usage HTTPS usage Mobile friendliness Structured data Core Web Vitals Hreflang annotations HTTP status HTTP status codes are numerical responses from your web server when a browser or search engine requests a page. These codes indicate whether the request was successful or an issue occurred. Here are key HTTP status codes and their implications for SEO: 2xx (success): 200 OK — Page successfully found and available for indexing assessment. 3xx (redirection): 301 moved permanently — This indicates a permanent move to another URL; it transfers the SEO value of the former URL to the final destination. That’s why SEOs use 301 redirects when performing a website migration , changing a URL, or when removing a page that used to attract rankings, traffic, and backlinks. 302 found — This indicates a temporary move and doesn’t transfer the former URL’s SEO value to the target page. 4xx (client errors): 404 not found — This indicates that the page was not found. A high number of 404 errors can impact your site’s crawl budget (i.e., the amount of time and resources a search engine dedicates to crawling your website). 
410 gone — This indicates an intentional and permanent removal. This can be useful for de-indexing a page if it doesn’t have any rankings, traffic, or links. 5xx (server errors): 500 internal server error — This indicates the server failed to fulfill a request. This can be harmful to your SEO if not resolved. 503 service unavailable — This code indicates that a page is temporarily unavailable and can be used for website maintenance without impacting your SEO. You can use this status code to tell search engines to come back later. Soft 404 errors: These occur when a page returns a 200 OK status, but lacks content or shows an error message, suggesting that it doesn’t exist anymore or providing a poor user experience. For permanent content relocation, use a 301 redirect. For removed content, redirect to the parent category if the page had value, or use a 410 status if it didn’t. URL structure A well-designed URL structure is important for both search engines and users to understand the content of your webpages. Here are some widely accepted best practices for URL structure: Keep URLs simple, short, lowercase, and descriptive, using meaningful words instead of IDs. Use hyphens to separate words. Avoid underscores, spaces, or concatenation. Avoid generating multiple URLs for the same content, such as through session IDs or excessive parameters. Maintain a logical folder structure without going too deep to prevent overly long and complex URLs. Consistently use trailing slashes or non-trailing slashes to avoid duplicate content issues, and use 301 redirects to enforce canonical URLs. Good URL structure example Poor URL structure example yoursitename.com/smartphones/iphone/ yoursitename.com/id-23-p?id=2 Website links Links are crucial for search engines to discover new pages and for users to navigate your site. To optimize your website’s links, implement the best practices below. Include navigation links: Utilize main menus, footer links, and editorially placed links within your content to enhance crawlability and browsing experience. Use HTML tags: Use the HTML tag for links to ensure crawlability and avoid JavaScript-based links. Create descriptive anchor text : Use descriptive, relevant anchor text that accurately describes the linked page, incorporating targeted keywords when possible. Avoid generic terms like ‘click here’ or ‘read more’. Link to canonical URLs: Directly link to canonical, indexable URLs. Avoid linking to pages that redirect or trigger errors. Link to absolute URLs: Use full URLs instead of relative URLs to prevent issues. Structure and prioritize your linking strategy: Follow a logical, hierarchical structure for internal linking , prioritizing high-value pages. Cross-link between similar pages to aid both users and search engines. Avoid nofollow for internal and trusted external links: Generally, internal links should be followed by default. Reserve the rel="nofollow" attribute for when you don’t want to pass link equity. XML sitemaps XML sitemaps are files (in XML format) that tell search engines about the essential, indexable files of your website, such as pages, videos, or images, and their relationships. They aid search engines in efficiently crawling and indexing this content. While not mandatory, XML sitemaps are recommended for highly dynamic or large websites with thousands of URLs (or more). They complement internal links, helping search engines discover URLs within a site. There are various types of XML sitemaps, including general, video, image, and news sitemaps. 
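To make the format concrete, here is a minimal sketch of what a sitemap containing a single URL might look like (the URL and date below are placeholder values, not taken from this article):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yoursitename.com/smartphones/iphone/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>

A real sitemap generated by your platform will usually list many URLs, and optional tags such as <lastmod> may or may not be present.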
Most web platforms automatically generate and update XML sitemaps when you add or remove pages. Considerations for creating XML sitemaps include: adhering to size limits (50MB uncompressed or 50,000 URLs), UTF-8 encoding, placing them at the root of the site, and using absolute URLs within the sitemap. (A sitemap containing only one URL is sketched above.) Robots.txt The robots.txt file, located at a website's root, controls which pages search engines can access and how quickly they can crawl them. Use it to prevent website overload, but don't rely on it to keep pages out of Google's index. The file must be UTF-8 encoded, respond with a 200 HTTP status code, and be named "robots.txt". Your robots.txt file consists of groups of rules, each starting with a user-agent directive specifying the crawler. Allowed rules include: User-agent — Specifies which crawlers should follow your rules. Disallow — Blocks access to a directory or page using relative routes. Allow — Overrides a disallow rule to allow crawling of a specified directory or page. Sitemap — Optionally, you can include the location of your XML sitemap. A sample robots.txt file built from these directives is sketched a little further down. Meta robots tags Meta robots tags are placed in a page's HTML head or HTTP header to provide search engines with instructions on that particular page's indexing and link crawlability. For example, a meta robots tag of <meta name="robots" content="noindex"> includes the "noindex" directive, telling search engines not to index the page. Both the name and content attributes are case-sensitive. Allowed directives include: "noindex" — This prevents page indexing. "index" — This allows page indexing (it is also the default, if not otherwise specified). "follow" — Allows search engines to follow links on the page. "nofollow" — This prevents search engines from following links on the page. "noimageindex" — This prevents indexing of images on the page. You can combine these directives in a single meta tag (separated by commas) or place them in separate meta tags. Canonicalization Canonicalization refers to selecting the main version of a page when multiple versions or URLs exist, therefore preventing duplicate content issues. Duplicate content can result from URL protocol variations (HTTP and HTTPS), site functions (URLs with parameters resulting from filtering categories), and so on. Search engines choose the canonical version based on signals like HTTPS usage, redirects, XML sitemap inclusion, and the rel="canonical" annotation. Practical methods to specify the canonical URL include: 301 redirects — You can simply direct users and crawlers to the canonical URL. rel="canonical" annotations — Specify the canonical URL within the page's HTML <head>. XML sitemap inclusion — This signals the preferred URL to search engines. 301 redirects are ideal when only one URL should be accessible, while rel="canonical" annotations and XML sitemap inclusion are better when duplicate versions need to remain accessible. Canonical annotations are typically placed within the HTML <head> or HTTP headers, pointing to the absolute URL of the canonical page. For example, a page's <head> might contain a tag like <link rel="canonical" href="https://www.example.com/page/" /> pointing at its preferred URL. For non-HTML files like PDFs, you can implement canonical tags through the HTTP header. JavaScript usage JavaScript can enhance website interactivity, but some sites also use it for client-side rendering (where the browser executes JavaScript to dynamically generate page HTML). This adds an extra step for search engines to index content, requiring more time and resources, which can result in limitations at scale.
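As promised in the robots.txt section above, here is a sketch of a small robots.txt file built from those directives (the paths and sitemap URL are placeholders rather than recommendations):

User-agent: *
Disallow: /admin/
Allow: /admin/public-info/

User-agent: Googlebot
Disallow: /internal-search/

Sitemap: https://www.yoursitename.com/sitemap.xml

The first group applies to all crawlers, the second group applies only to Googlebot, and the optional Sitemap line points crawlers to your sitemap's location.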
Because of these client-side rendering limitations, server-side rendering is recommended instead. Some web platforms, like Wix, use server-side rendering to deliver both JavaScript and SEO tags in the most efficient way possible. If you can't avoid client-side rendering, follow these best practices: Ensure links are crawlable by using the HTML <a> element with an href attribute. Each page should have its own URL, avoiding fragments to load different pages. Make the resources needed for rendering crawlable. Maintain consistency between raw HTML and rendered JS configurations, like meta robots or canonical tags. Avoid lazy loading above-the-fold content for faster rendering. Use search engine tools, like Google's URL Inspection tool, to verify how pages are rendered. HTTPS usage HTTPS (Hypertext Transfer Protocol Secure) is crucial for sites handling sensitive information as it encrypts data exchanged between users and your website. Search engines, like Google, use HTTPS as a ranking signal, prioritizing secure connections in search results for better user experience. To ensure security, all pages and resources (images, CSS, JS) should be served via HTTPS. Migrating to HTTPS involves: SSL/TLS certificate — Purchase and install this on your web server. Server configuration — Configure the server to use the certificate. Redirects — 301 redirect all HTTP URLs to their HTTPS equivalents. For a smooth transition: 301 redirect — Ensure all URLs permanently redirect to HTTPS. Update internal links — Update internal links to HTTPS. External resources — Check external resources (e.g., CDNs) for HTTPS support. Mixed-content warnings — Resolve any mixed content (i.e., when secure HTTPS pages load resources over an insecure HTTP protocol), ensuring all content is loaded via HTTPS to avoid browser warnings. Mobile friendliness Search engines, like Google, prioritize mobile-friendly websites, using mobile crawlers to primarily index mobile content for ranking (as opposed to desktop content). To provide a positive mobile experience, ensure that your site has a well-configured mobile version that fits mobile devices of various screen sizes correctly. These are the three main configurations for mobile-friendly sites: Responsive design: The same HTML code on the same URL, displaying content differently based on screen size via CSS. This is the method that Google recommends because it's the easiest to implement and maintain. Dynamic serving: The same URL, but serving different HTML based on user-agent. Separate URLs: Different HTML for each device on separate URLs. Regardless of the configuration, ensure mobile and desktop versions have equivalent crawlability, indexability, and content configurations (titles, meta descriptions, meta robots tags, main content, internal links, structured data, etc.). Allow search engines to crawl resources used in both versions (images, CSS, JavaScript). Avoid lazy-loading for primary content and ensure that all content visible in the viewport is automatically loaded. Optimizing these elements will help search engines effectively access and index the mobile version of your site, improving its visibility and ranking. Structured data Structured data helps search engines understand and classify a page's content, leading to enhanced search listings known as 'rich results'. Popular structured data types for generating rich results include: breadcrumb, logo, event, FAQ, how-to, image metadata, product, Q&A, recipe, reviews, software, and video.
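Structured data is typically added as a small block of code in the page's HTML. As an illustrative sketch (the recipe name, author, and timings are made-up placeholder values), markup for a recipe page might look like this in JSON-LD, the first of the formats described next:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT1H",
  "recipeIngredient": ["3 ripe bananas", "250g plain flour", "2 eggs"]
}
</script>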
You can implement structured data in three main formats: JSON-LD — Recommended for ease of implementation and maintenance at scale, JSON-LD uses JavaScript notation embedded in HTML. Microdata — This format uses HTML tag attributes to nest structured data within HTML content. RDFa — This format is an HTML5 extension supporting linked data using HTML tag attributes. Google's Rich Results Test tool validates structured data and provides previews in Google Search. (A minimal JSON-LD example for a recipe page is sketched above.) Core Web Vitals Core Web Vitals (CWV) measure user experience for loading, interactivity, and the visual stability of a page. Google considers them in its ranking systems. The three main CWV metrics are: Largest Contentful Paint (LCP): This measures loading performance by considering the render time of the largest visible image or text block. Interaction to Next Paint (INP): This metric observes the latency of all click, tap, and keyboard interactions that occur throughout the lifespan of a user's visit to a page. Cumulative Layout Shift (CLS): This measures visual stability by assessing unexpected layout shifts during a page's lifespan. Google Search Console provides insights into Core Web Vitals performance, which is crucial for site audits. You can improve Core Web Vitals by: Removing unused JavaScript — Avoid loading unnecessary internal or external JavaScript. Using next-gen image formats — Optimize images using lightweight formats like WebP for smaller file sizes without quality loss. Caching static assets — Store assets like images, CSS, and JavaScript in the browser cache to reduce loading time. Eliminating render-blocking resources — Asynchronously load external JavaScript to allow the browser to continue parsing HTML. Sizing images appropriately — Specify image dimensions to allocate space on the screen, reducing layout shifts. Hreflang annotations Hreflang annotations are useful for indicating the language and regional targeting of a page and its alternate versions to search engines like Google. There are three main methods for implementing hreflang: HTML — Add hreflang tags to the page's HTML <head> section using <link rel="alternate" hreflang="..."> elements. HTTP header — Implement hreflang via the HTTP header for non-HTML files, like PDFs. XML sitemap — Include hreflang annotations in an XML sitemap. Below are some best practices for implementing hreflang annotations: Use them only for indexable pages with multiple language or country versions. Tag only the canonical versions of URLs meant to be ranked. Always self-refer and specify the language (and optionally the country) of the current page, along with its alternates. You can specify only the language, but you can't specify only the country. When you specify a country, you need to specify the language as well. If you specify both the language and country, the language value should always come first, separated from the country by a dash (-). Note that Google does not rely solely on hreflang annotations to identify page targets; it also considers other signals, like ccTLDs, local language, links from local sites, and local currency. Technical SEO is a team effort Building your website on a foundation of technical SEO best practices helps you get the most traffic from the content you're creating anyway.
Oftentimes, however, you’re not the one responsible for actually implementing technical SEO recommendations, which could mean that those suggestions don’t get implemented in a timely manner, hampering your search visibility as well as your career growth. To get your recommendations across the finish line, you need to: Set the foundations for partnership with devs and product stakeholders Strengthen communication for better implementation and outcomes Prioritize your recommendations Validate technical SEO execution To learn more about how to do just that, read my other article on how to get technical SEO recommendations implemented . Aleyda Solis - SEO Consultant and Founder at Orainti Aleyda Solis is an SEO speaker, author, and the founder of Orainti , a boutique SEO consultancy advising top brands worldwide. She shares the latest SEO news and resources in her SEOFOMO newsletter, SEO tips in the Crawling Mondays video series, and a free SEO Learning Roadmap called LearningSEO.io. Twitter | Linkedin
- The noindex tag: What it is, why you need it, and when to use it for better SEO
Author: Vinnie Wong Google may be a search giant, but it still has its limits. Serving Google too many irrelevant or low-quality pages can hurt your site's crawlability and indexation, eventually resulting in lower rankings, traffic, and revenue. But what if you have a handful of pages that need to stay live without appearing in search results (e.g., gated content, internal search results, checkout pages)? Enter the noindex tag—your resource for telling search engines to keep a page off the search results, while still making it available for the users that need it. By strategically applying noindex tags, you can streamline your site's structure, prioritize your most valuable content, and maximize the time Google spends crawling your website. In this article, I'll dive into the world of noindex tags and explore how they can help you take control of your website's SEO. Table of contents: What is a noindex tag? Why the noindex tag is important for SEO Noindex vs. Robots.txt Noindex vs. Nofollow How to noindex a page How to noindex pages on Wix & Wix Studio How to check if a particular page is noindexed Best practices: How to use the noindex tag correctly What is a noindex tag? When implemented correctly, a 'noindex tag' is a piece of code that instructs search engines not to include a particular webpage in their indexes, preventing the page from showing up in search results. This tag is part of a larger family of meta directives known as 'robots meta tags,' which provide search engine crawlers with important instructions about how to interact with a website's content. The noindex tag takes the following format when placed within the <head> section of a page's HTML: <meta name="robots" content="noindex"> Alternatively, the noindex tag can target a specific search engine's crawler (such as Google) by replacing "robots" with the crawler's name, as shown below: <meta name="googlebot" content="noindex"> The 'index' instruction is the default for search engines (allowing your pages to show up in search results), while the noindex tag explicitly tells crawlers not to add the page to their indexes. It's crucial to understand that the noindex directive operates on a page-level basis—it only applies to the specific URL on which you implement it. Why the noindex tag is important for SEO It may seem counterintuitive to exclude pages from search engine indexes, but there are crucial scenarios where preventing certain pages from appearing in search results is beneficial for your website's overall SEO health and user experience. Below are four ways to use the noindex tag to support your business's online success: 01. Avoid duplicate content issues 02. Optimize crawl budget 03. Maintain content quality and relevance 04. Control access and visibility 01. Avoid duplicate content issues When search engines encounter multiple pages with identical (or very similar) content, they may have difficulty determining which version is most relevant to rank in search results and show to users. This can lead to several problems: The 'wrong' version may rank instead of the original or preferred page. Link equity and ranking signals can be diluted across the duplicate versions of the content. Websites may face algorithmic penalties for perceived manipulative duplicate content. Strategically applying the noindex tag to duplicate pages (such as printer-friendly versions) signals to search engines which version should get indexed and ranked. This consolidates ranking signals and helps ensure that the original, high-quality page is what gets shown to users in search results. 02.
Optimize crawl budget As huge as Google is, the search engine giant has confirmed that there are just too many pages to crawl . To maximize its time and budget resources, Google limits how long it will crawl any one site—this is what SEOs often refer to as ‘crawl budget.’ For larger websites that have over 10,000 pages, crawl budget optimization will strengthen your site’s SEO. The noindex tag allows SEOs and site owners to manage their crawl budget by instructing search engine bots not to waste time indexing low-value or non-public pages, such as: Internal search result pages Filter or sorting pages for eCommerce websites User-specific content (private profiles or account pages) Auto-generated pages with minimal unique content By keeping these pages out of the index, search engines can focus on discovering and ranking the site’s most important, user-facing content. 03. Maintain content quality and relevance Over time, content will naturally become outdated (or less relevant). Deleting this content outright isn’t always the best solution, so the noindex tag allows you to keep the content on your site while preventing it from appearing in search results and potentially harming your overall content quality signals. This is useful for: Older blog posts or news articles that are no longer timely Product pages for discontinued or out-of-stock items Thin or low-quality pages that don’t meet current standards Noindexing this content helps ensure users find your most valuable, relevant content when searching related keywords. 04. Control access and visibility Many websites create content intended for a specific audience or requiring special access, such as: Members-only content Staging or development pages Paid resources or course materials Conversion funnel pages (e.g., ‘thank you’ pages) The noindex tag provides a simple way to shield these pages from search engine discovery, maintaining control over who can find and access your content. Noindex vs. Robots.txt While both the noindex tag and the robots.txt file provide instructions to search engine crawlers, they serve different purposes: Robots.txt controls crawling, specifying which parts of the site search engine bots are allowed to crawl and which are off-limits. The noindex tag allows bots to crawl the page but prevents them from indexing it. Here’s a simple robots.txt example: User-agent: * Disallow: /private/ This instructs all search engine bots not to crawl any pages within the “/private/” directory of your website. The key distinction is that robots.txt prevents search engine bots from accessing and crawling certain pages altogether, but it doesn’t directly impact whether a page can appear in search results. In contrast, the noindex tag allows bots to crawl the page but prevents indexing, keeping the page out of search results. It’s a subtle difference that has important implications: Difference in impact Robots.txt Noindex Crawling vs. Indexing Pages disallowed won’t be crawled at all. Search engines won’t see the content. Pages will be crawled, but their content won’t be indexed. Search engines can still analyze the page and follow links. Link equity flow Links on blocked pages won’t be followed or pass link equity (PageRank). Noindexing a page over the long term will eventually result in Google removing the page from its index completely, thus no longer following the links (i.e., noindex, nofollow ). Control level Operates at the directory or site-wide level. Can disallow entire sections, but not individual pages. 
Allows control of indexation on a page-by-page basis, providing more granular control. In practice, robots.txt and noindex are often used together. For example, you might use robots.txt to prevent crawling of sensitive pages and apply the noindex tag on specific pages that shouldn't appear in search results (e.g., 'thank you' pages or faceted navigation). Noindex vs. Nofollow Noindex and nofollow are two distinct meta directives with specific purposes, often used together but serving different functions. The nofollow directive (which applies the nofollow link attribute to all the links on that page) is a meta tag that instructs search engine crawlers not to follow any outbound links on the page, acting as a 'stop sign' for link equity flow. Here's what it looks like in the <head> section: <meta name="robots" content="nofollow"> The meta robots nofollow directive is beneficial in a few common scenarios: To tell Google that you don't endorse a link: You might use a nofollow link if you're linking to a website that you don't necessarily recommend or trust. By using a nofollow link, you're telling Google that you don't want to pass any of your page's ranking power to the linked page. In user-generated content to avoid link spam: If your website allows users to add to your content, such as comments or forum posts, you might want to use nofollow links for any links that users add. This can help to prevent spammers from adding links to their own websites in your content. Crawl prioritization: On large sites, you can use nofollow on certain pages (or page types) to manage crawl budget and direct search engine bots to your most important content. By reducing the number of links bots have to follow, you streamline the crawling and indexing process. You can use noindex and nofollow together on a single page: <meta name="robots" content="noindex, nofollow"> This instructs search engines not to index the page or follow its links (common for pages like login screens or 'thank you' pages that are necessary but shouldn't be discoverable through search or pass authority). However, using noindex and nofollow together incorrectly can have unintended consequences: Accidentally noindexing and nofollowing important pages could prevent indexing and cut off link equity flow to other key pages. Noindexing and nofollowing large site sections can hinder search engines from discovering and ranking your most valuable content. Use noindex on pages that shouldn't appear in search results and nofollow only when necessary to control link equity flow. If unsure, err on the side of indexation and allow links to be followed to help search engines understand and rank your site effectively. How to noindex a page Adding the noindex tag to a page is relatively simple and requires access to your site's HTML code. There are two primary methods: 01. Meta robots tag The most common way to noindex a page is to add a meta robots tag to the <head> section of the page's HTML: <meta name="robots" content="noindex"> This instructs all search engine bots not to index the page. To target a specific bot, replace "robots" with the bot's name: <meta name="googlebot" content="noindex"> You can combine noindex with other directives, like "nofollow," by separating them with a comma: <meta name="robots" content="noindex, nofollow"> The process for adding this tag depends on your content management system or web platform: Wix: Use the settings in the Wix editor, the SEO settings panel, or the Edit by Page section. I'll cover these options in more detail later. Other platforms: For closed content management systems, you can typically noindex a page within that page's settings. For open source platforms, you may need to install a plugin; refer to your specific platform's documentation.
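If it helps to see where the tag sits, here is a stripped-down sketch of a page's <head> with a noindex directive in place (the title and description are placeholder values):

<head>
  <title>Thank You for Your Order</title>
  <meta name="description" content="Order confirmation page.">
  <meta name="robots" content="noindex">
</head>

Everything else about the page stays the same; this one tag is what changes how search engines treat it.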
After you add the noindex tag, save your changes and publish or update the page. The tag will take effect the next time a search engine bot crawls the page. Before you start putting this tactic to work, it is absolutely crucial that you: Avoid using both a robots.txt disallow instruction and the noindex tag on the same page. The noindex tag is a page-specific directive, while the robots.txt file is a broader, crawl-level instruction that applies to whichever crawlers and bots you specify. If robots.txt blocks crawlers from reaching the page, they will never see the noindex tag, so the page can still end up in search results (more on this in the best practices section below). 02. X-Robots-Tag HTTP header An alternative method for specifying noindex is using the X-Robots-Tag HTTP header, which you can add to your server's HTTP response for a particular page (or group of pages): X-Robots-Tag: noindex This method is great for non-HTML files (PDFs, images, videos) and situations where you can't directly access a page's HTML code, but can configure your server's response headers. Implementing the X-Robots-Tag header requires modifying your server configuration files (e.g., .htaccess for Apache servers or nginx.conf for NGINX; a rough NGINX sketch appears a little further down). The process depends on your server setup, but here's an example for Apache: <Files "example.pdf"> Header set X-Robots-Tag "noindex" </Files> This code snippet instructs Apache to add the noindex X-Robots-Tag header to the HTTP response for the file "example.pdf." Note that using HTTP headers requires more technical knowledge and server access compared to adding meta tags to HTML. If you're not comfortable modifying server configurations, stick with the meta tag method. Regardless of the implementation method, search engines will recognize the noindex directive and exclude the page from their indexes. How to noindex pages on Wix & Wix Studio Whether you're on Wix or Wix Studio, it's easy to add noindex tags to your pages through the built-in SEO settings. Here's how: Open the Wix editor: Log in to your Wix account and open the editor for the site you want to modify. Access your Page Settings: In the editor, choose the page you want to noindex from the Pages & Menu options on the left-hand panel. Click on the 'more actions' icon (three dots), then click SEO basics. Apply the noindex: At the bottom of the SEO basics tab, toggle the switch for "Let search engines index this page (include in search results)" to the off position. This adds a noindex meta tag to the page. The noindex tag is now included in the page's HTML code, and search engines won't index it the next time they crawl your site. To view your noindexed pages on Wix, use the Site Inspection tool: Access your Site Inspection dashboard: From your Wix dashboard, go to Site & Mobile App > Website & SEO > SEO. Under Tools and settings, click on Site Inspection. Check your page status in Google's index: In the Site Inspection report, open the filtering options. In the Index Status drop-down filter, select Excluded to filter for pages that are not indexed. Look for the status "Excluded by 'noindex' tag" to indicate pages that are noindexed. To apply noindex tags to all pages of a certain type (e.g., all blog posts in a category, all product pages, etc.), use the Edit by Page feature in the Wix dashboard: In your Wix dashboard, go to Site & Mobile App > Website & SEO > SEO. Under Tools and settings, select SEO Settings. From there, choose your desired category of pages and go to the Edit by page tab.
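One more note on the X-Robots-Tag method covered earlier: if your server runs NGINX rather than Apache, a roughly equivalent rule in nginx.conf might look like the sketch below (the PDF file pattern is an assumption; adapt it to the files you actually want to exclude):

location ~* \.pdf$ {
  add_header X-Robots-Tag "noindex";
}

As with the Apache example, test the change and check your server documentation before rolling it out.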
How to check if a particular page is noindexed There are a few ways to check if a specific page is noindexed, including: Checking the page's HTML code Google Search Console's URL Inspection Tool Browser extensions Crawling tools Check the page's HTML code To check for a noindex tag in a page's HTML: Open the page in your web browser. Right-click anywhere on the page and select "View Page Source" (or use Ctrl+U on Windows or Option+Command+U on Mac). In the new tab showing the page's HTML code, use your browser's search function (Ctrl+F or Command+F) to search for "noindex". If the page has a noindex tag, you should see a line like <meta name="robots" content="noindex"> in the code. Note that this method only checks for the presence of the noindex tag and doesn't confirm if search engines have respected the tag and excluded the page from their indexes. You can also use this method to check for the tag on any webpage that you can access—not just the ones on your site. Use the URL Inspection Tool in Google Search Console To check the index status of one of your own webpages using Google Search Console: Log in to your Google Search Console account and select the property for your website. In the left-hand menu, click on "URL Inspection." Enter the URL of the page you want to check in the search bar and press enter. Google will display information about the page, including its indexing status. If the page is noindexed, you'll see a message like "URL is not on Google" or "Excluded by 'noindex' tag." This tool provides a definitive answer on whether Google has excluded the page from its index based on the noindex tag, but it only works for pages on sites you have verified ownership of in Search Console. Use a browser extension Browser extensions, like Meta SEO Inspector for Chrome, can quickly check a page's robots meta tags, including noindex. Install the Meta SEO Inspector extension from the Chrome Web Store. Open the page you want to check in Chrome. Click on the Meta SEO Inspector icon in your browser toolbar (it looks like a magnifying glass). The extension will display a summary of the page's meta tags, including any robots directives like noindex or nofollow. Keep in mind that extensions can be handy for spot-checking individual pages, but aren't as definitive as Google Search Console, as they only look at the page's HTML and not its actual indexing status. Crawl the website Website crawling tools like Screaming Frog or DeepCrawl can check the status of multiple pages simultaneously, providing a comprehensive overview of your site's indexation. To find noindexed pages using Screaming Frog: Enter your site's URL in the tool and click "Start." After the crawl is finished, click on the "Directives" tab in the bottom window. Click on the "Filter" dropdown and select "Noindex." The tool will display a list of all the pages on your site with a noindex tag. Best practices: How to use the noindex tag correctly Implementing noindex tags incorrectly can lead to unintended consequences, such as important pages being excluded from search results or search engines misinterpreting your site's structure. While it's easy to implement a noindex tag, it's also easy to do it wrong. Here are some tips to ensure your noindex tags lead to SEO improvements, not errors. Don't block noindexed pages with robots.txt Include self-referential canonical tags on noindexed pages Regularly monitor site indexation Don't block noindexed pages with robots.txt If there's ever been a case for less is more, it applies to robots.txt and noindex tags.
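To make the problem concrete, here is the kind of conflicting setup this section warns against (the /thank-you/ path is a hypothetical example); why it backfires is explained below. In robots.txt:

User-agent: *
Disallow: /thank-you/

And, at the same time, in the <head> of the /thank-you/ page:

<meta name="robots" content="noindex">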
Specifically, the noindex tag only works if search engines can actually crawl the page. If you use the robots.txt file to disallow search engines from a page entirely, they won't be able to see and respect the noindex tag. This can lead to a situation where you think a page is excluded from the index, but it actually still shows up in search results. Instead of using robots.txt to block noindexed pages, prioritize the noindex tag itself. Allow search engines to crawl the page so they can see the noindex tag and understand that it shouldn't be included in their indexes. Include self-referential canonical tags on noindexed pages When you noindex a page, you can also include a self-referential canonical tag. This means adding a canonical tag that points to the page itself as the canonical (or 'preferred') version. It might seem counterintuitive to do this on a page that's being excluded from search results, but it can actually help search engines better understand your site's structure. Here's an example of what a self-referential canonical tag looks like (if our example page's URL was https://www.example.com/noindexed-page): <link rel="canonical" href="https://www.example.com/noindexed-page"> Including this tag on your noindexed pages helps avoid potential confusion if the page is accessible through multiple URLs (such as with parameters or tracking codes). Without the self-referential canonical, search engines might choose one of these alternate URLs as the canonical by default, which could lead to unexpected indexing behavior. By specifying the page's own URL as the canonical, you're reinforcing the noindex signal and telling search engines that this specific URL is the authoritative version, even though it's intentionally excluded from the index. Regularly monitor site indexation Even if you're careful about implementing noindex tags correctly, mistakes can happen. A noindex tag might be accidentally removed during a site update, or a valuable page might get noindexed unintentionally. To catch these issues early, make a habit of conducting regular site audits. Tools like Google Search Console are invaluable for this purpose. In the Page Indexing report, you can see a list of all the pages on your site that Google has crawled and whether they're indexed or excluded (and why). If you notice any important pages that are unexpectedly noindexed, or any noindexed pages that suddenly show up in search results, you can take action quickly to resolve the issue before it has a significant impact on your search traffic and business. Stay on the pulse with your noindex tags The noindex tag is just one piece of the SEO puzzle, but it's a crucial one. By strategically using noindex tags in conjunction with other technical SEO tactics like canonicalization, structured data, and smart internal linking, you can create a website that's not only search engine-friendly but also laser-focused on delivering value to your target audience. Just remember that, as with most SEO tactics, using noindex tags isn't a 'set-it-and-forget-it' task. Have a system to monitor your pages, whether it's through regular site audits or checking your Wix dashboard, and you'll stay on track to prioritize your site's most important content. Vinnie Wong - Founder and Chief Strategist at Content Cartography Vinnie is a content expert with over 5 years of SEO and content marketing experience. He's worked with Ahrefs, Empire Flippers, and is committed to crafting exceptional content and educating others on the symbiotic relationship between content creation and effective link building.
Twitter | Linkedin
- SEO A/B testing: Experiment for superior title tags and meta descriptions
Author: Jandira Neto Traditionally, SEOs have relied on advice and recommendations from Google in hopes that it would improve their organic traffic. Even for brands and agencies that rigorously follow these on-page SEO guidelines, the uncertainty can be tough to navigate, and sometimes you don’t even know where to start. With SEO A/B testing, you can take the guesswork out of website changes and make informed, data-backed decisions. By designing an experimentation program, you can run small tests that insulate you from risk while identifying valuable opportunities to optimize your site. In this article, I will show you how to build a testing strategy that enables you to run A/B tests on your meta tags so that you can enhance your competitiveness in the search results and bring in more organic traffic. Table of contents: A/B testing: The fundamentals What is A/B testing for SEO? How A/B testing works in SEO The benefits of A/B testing for SEO A/B testing strategies for title tags and meta descriptions Experimentation programs Identifying title tag test opportunities Identifying meta description test opportunities Crafting variations of title tags and meta descriptions Conducting A/B tests and gathering data Case studies and real-world examples 3 success stories from A/B testing title tags 3 success stories from A/B testing meta descriptions How to A/B test and optimize your meta tags on Wix A/B testing: The fundamentals For those newer to A/B testing, here’s some crucial context that will guide you throughout the process and help you better explain it to teammates and stakeholders. What is A/B testing for SEO? In the context of SEO, A/B testing is a methodology in which you compare the impact a site change has on two statistically similar web pages. A/B testing is not new to the world of marketing. American advertiser and author Claude Hopkins pioneered this method by conducting the first documented A/B test, in which he looked at the rate of return to measure the impact of his experiment with two distinct promotional coupons. Fast-forward to the modern day: A/B testing has expanded to all types of digital marketing (including SEO), giving rise to a variety of SEO testing tools, including SearchPilot (the one I work for), SEOTesting, and more. How A/B testing works in SEO To begin A/B testing (or just about any SEO testing ), you should first split your web pages into one of two subsets (also referred to colloquially as “buckets”): One subset of pages are the control pages —you will not make any changes to these pages so that they can serve as a baseline for comparison. The other subset of pages are the variant pages —you will test either an off-page or on-page SEO change to the pages in this group. This way, you can see how these changes affect organic traffic in a controlled, repeatable way. What’s the difference between user testing and A/B testing? The main difference between user testing and A/B testing is that the A/B testing tests Googlebot . User testing, on the other hand, uses cookies to test the behavior of your real-life site visitors. In A/B testing, you cannot show Google different versions of the same page (as that would constitute cloaking). To avoid this, SEOs split pages, not users. Users will see the same page every time. The benefits of A/B testing for SEO More and more website owners and digital marketers are adding A/B testing to their SEO strategy, and here’s why: A/B testing helps you make data-driven decisions that benefit your business/website. 
Long gone are the days of relying on assumptions or half-baked competitor analysis to anchor your marketing strategy. Instead, you can run a small scale A/B test to assess the ROI on a series of optimizations (enabling you to take on a fraction of the risk compared to implementing sweeping changes all at once without testing). If it goes well, you can bring a stronger case to your stakeholders to get their support for your recommendations. Another thing you might really appreciate is the agility and creativity that comes with A/B testing: You can test on almost any part of your site—from title tags and meta descriptions to schema markup and URL structures. Once you pinpoint an area of your website that you would like to optimize, you can run an A/B testing experimentation program. This is an iterative process that can help your site stay competitive and potentially improve your organic traffic. A/B testing strategies for title tags and meta descriptions Meta tags help Google understand what your website is all about. Google uses this important information to help determine what it displays in the search results. To that end, Google has said that “it’s important to use high-quality title text on your web pages.” Likewise, it’s also important to use relevant, high-quality copy in your meta description : “Google will sometimes use the [meta description] tag from a page to generate a snippet in search results, if we think it gives users a more accurate description than would be possible purely from the on-page content.” — Google, “ Control your snippets in search results ” Meta tags are a great element of your site to test on, but how do you strategize it? Where can you start? As an SEO consultant, I have tested on a wide variety of website sections with a wide range of customers, so I am no stranger to test ideation and building SEO testing strategies. When discussing strategies with customers, they normally come to me with their SEO problems and I, in turn, seek to prove the value of SEO by supporting them with a successful experimentation program. Now let’s take a look at how you can build an A/B testing experimentation program for your own website. Experimentation programs As we dive deeper into SEO A/B testing strategies, a pivotal type of strategy emerges—experimentation programs. This strategic approach plays an important role in running SEO A/B tests that will extract learnings about what works and doesn't work for your website. Let's explore what an experimentation program entails and how you can build one. What is an SEO experimentation program? An experimentation program is a structured approach to testing a range of SEO hypotheses in order to: Learn more about how your site performs under different strategies Improve key performance metrics Make more informed, data-driven decisions As opposed to testing to “just see what happens,” you are creating a personalized testing strategy to solve your SEO problems. Your experimentation program will uncover useful insights into the organic performance of your site and the ROI of the SEO tools you are currently using. By the end of the program, you should be able to look back at a portfolio of tests and see how much time and money you saved by eliminating risks associated with negative changes. A drop in your organic traffic from negative changes could massively damage your sales/conversions. You will also be able to see how much time and money you saved your engineering team by only deploying winning changes. 
How to create hypotheses for A/B tests The best testing strategies are cohesive and strive for one solid goal. Even though SEO is ever-changing, the idea behind an experimentation program is that the testing strategy should be cohesive—not purely reactive. You can create hypotheses based on existing strategies, known problem areas, or research in your industry. In that hypothesis, you should include: What changes you plan on making What pages you will change The projected impact on organic traffic Note: CRO testing measures users and their behaviors. This is a separate testing methodology and set of considerations, but you should consider conducting CRO testing if your conversion rate has decreased despite otherwise solid website performance. Going back to our example of finding the most well optimized title tag, a good example hypothesis for this testing strategy might be: “We want to find the most well optimized title tag (as the title tags on our site are low quality). We will test changing the title tag to feature more keywords on a subset of product pages, therefore improving our ranking for new keywords.” This hypothesis is not just for one test, but for as many as you like. You can iterate until you land on the best one. How to build a reliable SEO experimentation program The entire point of building an SEO testing program is so that you can obtain repeatable results. You can achieve a reliable experimentation program in four steps: Ensure your hypothesis is aligned with your high-level goals. If your digital marketing goals are to increase brand awareness via search rankings, for example, then your hypothesis might involve optimizing your title tags. Come up with a handful of variations to achieve your goal. Create a couple of title tag variations (based on keyword research in your industry) that are likely to improve your organic traffic and, in turn, your existing search rankings. Start SEO A/B testing, record observations, and analyze your data. Run your A/B tests for one to two weeks and observe the behavior of your variant page. Has organic traffic increased, decreased or stayed the same? Look at how your pages appear in the search results and take note of any visibility changes. After you draw conclusions from your test, come up with some new ideas to iterate and improve. You can create a fresh variation of your test(s) (based on your findings) and repeat the process until you identify the best method for optimizing. Identifying title tag test opportunities Review your current title tags and page performance to identify whether there’s potential for improvement: Does it align with Google’s recommendations? Is it competitive? And, think about how it could be better for Googlebot (more on this in the example below). So for example, this VR arcade business has a title tag (“Page 1”) that gives no information about the content of the page and Google will have trouble understanding it. There is an opportunity here to make the title tag more descriptive and reflective of the page’s content. It’s worth running an experimentation program and finding ways to optimize your title tag if it is: Generic (like the example above) Vague Contains keyword stuffing Does not reflect user search intent There are a few automated tools that can help you spot title tags that are not performing well, like WebFX's SEO grader . This tool gives your title tag a score to gauge its effectiveness, providing a baseline for A/B testing. 
However, I personally prefer manually inspecting title tags, comparing them to competitors’, and drawing on industry knowledge. There’s always room for improvement and optimization. By staying informed about your industry, you can determine if your title tag isn’t performing at its best.

Identifying meta description test opportunities

The same idea (discussed in the section above about title tags) also applies to meta descriptions. Again, you’re looking for meta descriptions that are generic, vague, stuffed with keywords, and/or don’t reflect search intent. The example VR arcade business’s meta description (above: “Come read the Game Catalogue”) lacks specificity and fails to communicate the content of the page.

Crafting variations of title tags and meta descriptions

Now that you have identified the opportunities, you can craft variations for the experimentation program using keyword research, competitor analysis, and a user-first mindset. Instead of these title tags, let’s test the following variations:

- “Page 1” → “Explore Virtual Realities at AI Arcade | Top VR Gaming in the UK | Multiple Locations for Endless Fun!”
- “Locations” → “Our AI Arcade Locations | Find Nearby Venues for Immersive VR Experiences”
- “The Wrestling Game” → “Step into the Ring: Immersive VR Wrestling Experience at AI Arcade”

And instead of these meta descriptions, let’s test:

- “Welcome to AI Arcade, Best VR, Affordable Games, Top Quality” → “Welcome to AI Arcade: Immerse Yourself in Cutting-Edge VR Experiences at Multiple UK Locations. Affordable Games, Unparalleled Quality, and Endless Fun Await”
- “Come read the Game Catalog” → “At AI Arcade, We Have A Diverse Catalog for Thrilling Adventures and Immersive Experiences. Choose Your Next Virtual Journey Now!”
- “We Offer A Gladiator Wrestling Game In Our Five Game Options” → “Dive Into Thrilling Virtual Reality Combat As You Challenge Friends In A Gladiator-Style Arena. Experience Intense VR Wrestling Right In Your City!”

Conducting A/B tests and gathering data

Similar to traditional SEO best practices, you will implement the change and monitor performance before deploying the new changes to your whole site. This will help ensure the integrity of your A/B tests.

Analyzing and interpreting test results

While testing, you should monitor performance to see how the change you made impacts your organic traffic. You can track these performance changes in Google Analytics 4, Google Search Console, your CMS’s built-in analytics, or various other third-party tools. The example above is from the SearchPilot platform. You can see that the test ran for a number of days and resulted in a 5.2% uplift in organic traffic. The customer won 797 extra sessions by making this SEO change. The SearchPilot platform takes into account many external factors that might affect the organic traffic of your website, such as algorithm updates and seasonal changes. Although you can use any of the SEO tools I mentioned (among many others), not all of them have features specifically to support A/B testing and sophisticated analyses. If your tools don’t offer those features, you can start off with before-and-after testing, as in the sketch below.
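To make the before-and-after approach concrete, here is a minimal sketch of the comparison, assuming you have exported daily organic-session counts (for example, from Google Search Console or GA4) for the weeks before and after the change. The figures and variable names are invented purely for illustration.

```python
# Minimal sketch of a before-and-after comparison of organic sessions.
# Assumes daily organic-session counts exported from your analytics tool;
# the numbers below are invented purely to illustrate the calculation.
from statistics import mean

sessions_before = [410, 395, 402, 388, 420, 415, 399]  # week before the change
sessions_after  = [430, 445, 418, 452, 440, 437, 449]  # week after the change

baseline = mean(sessions_before)
observed = mean(sessions_after)
uplift_pct = (observed - baseline) / baseline * 100

print(f"Baseline avg daily sessions: {baseline:.1f}")
print(f"Post-change avg daily sessions: {observed:.1f}")
print(f"Uplift: {uplift_pct:+.1f}%")
# Note: unlike a true A/B test, this comparison has no control group,
# so it cannot separate your change from seasonality or algorithm updates.
```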
Bear in mind that this method does not factor in algorithm updates or seasonal changes, but at a bare minimum you will be able to keep records of:

- Baseline performance (your control group/web page)
- The exact change you made and the date you made it
- The date the test ended and the impact(s) on performance

Case studies and real-world examples

A/B testing can revolutionize website performance and prove the value of SEO (where you otherwise couldn’t via standard site changes). Let’s go through some successful examples that my team and I (at SearchPilot) have tested that you could try emulating on your site.

3 success stories from A/B testing title tags

01. Adding “The Best” to the title tag

Optimizing your title tags can impact the overall organic traffic your website brings in by influencing click-through rates and introducing new keyword rankings. Once you see a change in your organic traffic, you can identify what influenced the change by reviewing search results for your industry. Here’s an example that transcends industries: The hypothesis for this test was that adding “The Best” to the beginning of title tags could help produce better click-through rates and generate a positive impact on organic traffic. We ran the test and saw a 10% uplift in organic traffic. By deploying the change, the site could see an extra 11,000 organic sessions per month. Google respected the change and showed the new title tags in search results (remember, Google may opt to rewrite your titles in search results if it thinks they’re not relevant for the user). This change had an amazing impact and is easy to implement on just about any site in nearly any industry.

02. Adding a question to the title tag

An informational query is a search in which the user is looking for an answer to a question. Their intent is to know something (hence these types of searches are sometimes called “know queries”). If your site is content-heavy, you should optimize it to align with informational queries so that you can establish your site’s expertise in the industry. This, in turn, can help improve your organic traffic. In conducting this test, we hypothesized that by increasing the occurrences of the targeted keyword and structuring the title tags as questions, we could enhance the page’s relevance and better align it with user search intent. The aim was to boost existing rankings and improve the organic click-through rate. The change was small but mighty. We saw a 5% uplift in organic sessions. Google was able to answer users’ “know queries” because the question users were searching for was in the title tag.

03. Appending brand name and locale to the end of the title tag

Small, agile changes can improve your organic traffic. The hypothesis behind the test was that adding the brand name and location to the end of the title tag would help improve organic traffic by increasing the pages’ relevance, visibility, and trust. But I did fear that the title tag would become too lengthy and Google would cut off the brand name and locale in the search results. This could have a negative effect on organic traffic: when Google truncates title tags in search results, the text can end up not being visible to the user. Thanks to our customers’ clear and concise title tags, there were no cases where the search results got cut off, and the results showed that this small change brought in an impressive 9% organic traffic uplift. This result backs up my idea that a small but agile change made the pages more noticeable.
By including the location, we gave Google the right information to prioritize the site for users in that locale. Also, adding the brand name didn't just make the site more relevant to local users; it also sent out strong signals of authority because the brand is well-known.

3 success stories from A/B testing meta descriptions

01. Adding third-party ratings to your meta descriptions

If you have good user reviews, you can make them work for your website. The hypothesis for this test was that adding third-party reviews to the meta description could help enhance E-E-A-T signals for improved rankings and/or better click-through rates. The meta description now helps instill an element of brand trustworthiness and entices users, potentially leading to an uptick in organic traffic and click-through rates. Google respected the new meta description and showed it in the search results. Although the test was not statistically significant, it resulted in a positive correlation. This means that there were small uplifts in organic traffic that were not strong enough for the test to be positive at a 95% confidence level, but the customer classified it as a positive test and rolled it out on their site. Testing this out on your site could be the beginning of an iterative process.

02. Removing the meta description altogether

Sometimes your meta tags can do you more harm than good (i.e., when they’re irrelevant or poorly optimized). An easy way to test this is by removing them altogether. The hypothesis behind this test was that the site’s web pages featured low-quality and generic meta descriptions. So, we tested removing the meta descriptions entirely to allow Google to select a snippet from the page’s text that it deemed useful. The test was positive at an 80% confidence level, meaning there is a 20% chance that, if this change were rolled out across the site, it wouldn’t have the same impact. This might be the easiest of all the examples to implement. You can learn what Google considers important on your page, then test the variations of copy it pulled for its snippets as your meta descriptions in an experimentation program.

03. Adding price to your meta descriptions

To remain competitive in search results, you need to test different approaches. Most search listing snippets contain standard text, so how about adding some numbers to your meta description to make yourself look different and stand out? In this test, we experimented with adding the lowest price deal into the title tag. Prior to this, the title tags did not feature any pricing details. The hypothesis was that by adding the price, we could help boost click-through rates by attracting users who are seeking the best deals and improve search results performance by including relevant information. The test ended with a huge 12% increase in organic traffic. Test adding prices to your title tag, whether it’s the lowest price available or an average for the products you sell. This user-first approach could help you improve your organic search traffic from customers who search for prices.

How to A/B test and optimize your meta tags on Wix

Wix website owners can see an overview of all their title tags, meta descriptions, page/product names, and URLs for a given page type (e.g., blog posts, products, events) in the Edit by Page section of the Wix dashboard (SEO > SEO Settings > [desired page type] > Edit by Page), as shown below.
From here, you can make changes to any of your title tags and meta descriptions (without having to open each page individually) by clicking on the three dots to the right of the desired page, as shown below. You can also generate title tag and meta description suggestions based on your page content using our AI meta tag creator (accessible via each individual page or the Edit by Page section). You can then use the AI meta tag creator to refine those suggestions for your brand/audience, expanding the ways you can test your meta tags for the best performance.

SEO A/B testing is your roadmap to iterative performance improvement

SEO A/B testing helps you answer a simple question: “What should I optimize next?” Although some creativity is involved with your optimizations, testing largely takes the guesswork out of your strategy, which keeps you on track to iterative performance improvements. Once you’ve gained some experience with this process, you’ll be able to work smarter and faster by testing small (to manage risk) and applying those changes at scale, enabling you to get the most out of the SEO you’re already doing for your business (or your clients’ businesses).

Jandira Neto - SEO Testing Consultant at SearchPilot
Jandira is a technical SEO A/B testing consultant. She works to prove the value of SEO for the world’s biggest websites, delivering profitable, attributable results. She also enjoys staying on top of SEO industry news and providing SEO advice to small minority-owned businesses. LinkedIn
- Foster an education-first culture at your agency for better authority, business, and retention
Author: Christine Zirnheld

Generative AI, third-party cookie deprecation (eventually), Google algorithm updates, broadening match types—digital marketing moves fast. To stay ahead and satisfy clients, your team must embrace ongoing learning and new skills. Without a culture of continuous education, your agency risks falling behind. In this article, I’ll delve into the training methods digital marketing agencies can explore to foster perpetual learning among employees, highlighting its crucial role in agency success, including:

- What it means to have an education-first culture at a marketing agency
- The business benefits of continuous education for agencies
- How to get the most out of employee training
- How to make learning a regular part of your agency’s week

What does an “education-first” culture look like for marketing agencies?

Your employees don’t need an MBA to grow their skills and become better marketers. In fact, at Cypress North, the agency where I work, we no longer require a bachelor’s degree from potential hires. We’ve found that the best education is what we learn from each other and real-life client experiences. However, that doesn’t mean you should work on client projects for 40 hours a week. When we only focus on client work, we miss out on opportunities to diversify our skills and learn strategies for new verticals. So, what does continuous learning look like at an agency? It requires breaking your teams out of siloed client groups to come together, collaborate, and share learnings. Our agency has various types of “training” that occur every day. These can look like hands-on working sessions or more formal training seminars. We’ll dive into the complete list of our weekly trainings later in this article.

The benefits: How education improves your agency as a business and as an employer

Education is an investment in the future. It may seem like a sacrifice to take hours away from client work right now, but over the long haul, an education-first culture at your agency means:

- Lower cost and more flexible teams
- Long-term client success
- Employee retention, growth, and fulfillment
- Training content that promotes agency authority

Lower cost and more flexible teams

As your agency grows, you’ll need to decide whether to hire more experienced or greener digital marketers to join your team. Professionals with longer resumes bring valuable experience, but you’ll also pay more upfront for senior-level employees. And you won’t know whether they’ll work with the values and strategies your agency requires until they’re already part of the team. Hiring less-experienced applicants requires a smaller initial investment than hiring seasoned marketers. Another benefit is that you can design the perfect team for your agency, focusing on the skills and knowledge most helpful to your clients. However, because these greener marketers have less experience, they’ll have less hands-on knowledge to apply to client accounts. To enjoy the benefits of growing a team from the ground up, investing in continuous education and training is a must.

Long-term client success

Because our days are focused on clients, finding time for hands-on learning is often difficult. While carving out time for continuous learning can mean taking time away from client work, it leads to better performance in the long run. As an agency, your greatest competitive advantage is the ability to see what performs across multiple accounts. This valuable knowledge should be shared across client teams to improve performance across your agency.
By siloing your learnings, your team may bill more hours, but miss critical insights and lessons that lead to the ultimate goal: performance and growth.

Employee retention, growth, and fulfillment

When good employees don’t see opportunities for growth at your agency, they leave. By providing continued education, employees can expand their knowledge and diversify their skill set. Exposing your teams to new challenges, tools, and strategies outside their dedicated client work shows an investment in your employees’ careers while elevating their capabilities for your agency.

Training content that promotes agency authority

It may feel like the best way to stay ahead of other agencies is to guard your expertise. At Cypress North, we’ve found the opposite to be true. If a topic is challenging our team, chances are other marketers are challenged by it as well. Whenever someone on our team brushes up on a topic to lead a training, there is an opportunity to create educational content to promote our brand. Examples include:

- Blog posts
- Video tutorials
- Webinars
- Downloadable resources, checklists, or planning templates
- Conference presentations
- Podcasts
- Social posts
- E-books

Content like this establishes your team as industry thought leaders, which helps you attract clients as well as top talent to join your team. Our agency’s podcast, Marketing O’Clock, is an excellent example of this content strategy. We release weekly digital marketing news episodes and more evergreen content like tutorials and roundtable discussions. This is a winning strategy for both Cypress North and our customers:

- Our team benefits from forcing ourselves to stay up-to-date on the latest news and updates. Because we’re on the cutting edge of industry trends, we’re better positioned to exceed KPIs and keep clients happy.
- Potential clients find our content online and see Cypress North as a leader in the industry.
- As established thought leaders, we attract high-performing digital marketers to join our team, which brings even greater value and performance to our clients.

How to get the most out of employee trainings

Training and education can be a significant time investment. Below are some strategies that our team employs to ensure that learning sessions are time well spent.

Put learning on the calendar

Sometimes, simply showing up is the most challenging part of training your team. As an agency, we are bogged down with weekly tasks, reports, and client meetings. To instill a culture of continuous learning, you must hold your teams accountable by scheduling regular training sessions, webinars, and working sessions. At Cypress North, we put learning on the calendar every day. Below is a training schedule for a typical week at our agency, including learning opportunities that we will discuss in depth later in this post.

- Monday: Marketing team standup
- Tuesday: Account maintenance PPC training
- Wednesday: News podcast recording
- Thursday: Hands-on SEO training & PPC optimization training
- Friday: Digital Marketing University

We add these placeholder meetings to our calendar every week, but these meetings can all have varying topics (depending on what projects we’re working on or if there is a trending topic that we need to cover). We do our best to ensure the team knows what will be covered during each event as early as possible, so they can plan their day accordingly.

Who should attend?

Before scheduling meetings, decide who on your team needs to attend training.
At our agency, we invite every marketing team member to attend every learning opportunity. While these meetings are only required for coordinator- and associate-level employees, more senior members are encouraged to join if they have time. As mentioned earlier, learning isn’t only for entry-level digital marketers. Even a veteran marketer with years of experience can benefit from learning about a new feature, product, strategy, or tool. Plus, because we aim to make training interactive, the rest of the team can benefit from hearing their point of view, too.

Utilize your entire marketing team

Cypress North is a mid-sized agency; we don’t have employees dedicated solely to training staff. We divide training responsibilities across team members to prevent training from becoming a burden, and we don’t only use senior employees. The primary benefit of this is that nobody has to spend excessive time prepping for or leading training, but there are many additional perks to this approach.

Exposure to diverse perspectives and approaches: An unfortunate side effect of agency growth is that teams become more siloed, focusing specifically on their clients. There are members of the marketing team at Cypress North whom I have never had the pleasure of working with, even though we’ve all been here for years. Mixing up the leaders of training sessions allows us to get varying perspectives on challenges and enables our team members to hear from everyone, not just those on their own client team. Beyond continuous learning, breaking out of our client teams helps us get to know each other better. This leads to better camaraderie among the team and benefits our agency culture. Sometimes, the perspectives and knowledge of greener team members are the most valuable of all. As a leader on client accounts, I often find myself caught up in reports, deliverables, and client relationships. Less-experienced marketers often spend more time working with Google Search Console and discovering new SEO or AI tools to help our clients. When we have the opportunity to combine these fresh ideas with years of client experience, the confluence of skills can drive incredible results.

Leadership training for junior staff: When a coordinator or associate leads a training exercise, they aren’t just passing on what they learned; they’re also growing their own leadership skills. Learning sessions are an excellent opportunity for greener employees to grow more confident in public speaking and other management skills.

Limit distractions

When I attend a virtual learning session, I’m often distracted by emails, reports, and Slack messages. Remote training is tough because it’s hard to focus on learning when you have your day-to-day tasks on the screen in front of you. Most of my team works in-office, so we have the option to hold in-person meetings. However, we have two offices in different cities. For the larger training sessions (where we try to get the whole team together), we cannot get everyone in the same room. So, we use a hybrid approach. Here is an example of how our hybrid training works: whoever hosts the meeting will sit at their computer (in their office) and share their screen. The rest of the team gathers in a separate conference room and pulls the meeting up on a big screen. This approach allows whoever is in charge that day to share slides or a site they are troubleshooting on the big screen, while the rest of the team engages in the meeting.
The team members in the conference room don’t get distracted by their day-to-day tasks and feel more comfortable speaking up and sharing insights because they are attending the meeting in person. And anyone working from home can still participate in the meeting, or we can record the sessions for new employees to watch later.

Make training interactive

To get the most out of training, it’s vital that your team is engaged and excited about the topic. Through trial and error, we’ve identified some strategies to help us achieve this goal.

Use real client accounts: Two of our most valuable training sessions are hands-on account maintenance training and client-specific working sessions. In these trainings, the session leader gets the team together to work through a particular challenge or project. We encourage our team to bring their own client challenges to these sessions to benefit from other team members’ perspectives, including upper management and coworkers who do not work on that client account. This keeps our team more engaged because the knowledge they acquire will impact their actual client performance. It also allows us to feed two birds with one seed, training the team while executing deliverables for our client.

Ask for feedback: Instead of just delivering a lecture to the group, we involve the whole team to keep everyone as tuned-in as possible. Before showing the group how to approach a problem, the session leader asks everyone what they would do. This strategy keeps everyone engaged and lets us get varying perspectives on the same problem.

Record everything

Record every marketing training session and upload it to a shared folder. This adds another step to the process, but your future self will be grateful when you have a library of training videos and educational materials for new hires. Recording meetings also allows anyone who missed the training to go back and review. Plus, if there are any questions on the topic in the future, your team can easily reference the video.

Make learning a scheduled part of your agency’s week

The different types of “training” that your team can engage in each week include (but aren’t limited to):

- Weekly marketing team standups
- Working sessions
- Industry certifications
- Marketing 101 trainings
- Book clubs
- Industry news

Weekly marketing team standups

At Cypress North, we gather the entire marketing team for a 30-minute call at the beginning of every week. This allows us to:

- Update the team on company matters
- Check in on workload for the week ahead
- Share learnings from the previous week

Every team member concisely shares something that they learned. This doesn’t have to be something “new” or groundbreaking. Chances are, if a tool, feature, or strategy is new to one person on the team, it will be new to someone else. This is a quick exercise, but it challenges us to continue to grow our skills every week and pass our discoveries on to the rest of the team. What seems small to one person could be a game-changer for another client’s performance.

Working sessions

I’ve found that hands-on learning is the best way to grow my skills and confidence. Regularly scheduled, client-specific “working sessions” allow the entire client team to collaborate, strategize, and learn. One team member shares their screen and everyone puts their heads together to work on a specific project or challenge for that client. When we’re forced to set time aside to work together, it allows us to learn about new tools, strategies, and approaches to a problem.
Industry certifications

From Google Analytics to HubSpot, to Google Ads and more—there are plenty of digital marketing certifications for your agency to pursue. I’ve found that hands-on learning is a better approach to growth, but some potential clients value these platform certifications when choosing an agency partner. We have employees take these certifications when they’re new to the team, before they have a full client workload.

Marketing 101 trainings

When teams are siloed into client accounts, they miss out on learning the strategies and tactics that don’t impact their clients. At Cypress North, we schedule “101” trainings every week. One team member leads the call and covers a digital marketing topic. This could mean showcasing how to use a tool, troubleshooting Search Console reports, diving deep into Google Sheets tricks, or covering best practices. These trainings give our greener employees opportunities to learn skills even if they don’t apply to their current client accounts.

Book clubs

Optional marketing book clubs are a great opportunity to challenge your team to think critically and learn. Content doesn’t have to be specific to agency disciplines (SEO, PPC, etc.); books about branding, sales, or leadership can also help with professional or agency growth.

Industry news

Staying up-to-date (and ahead of clients) on relevant industry developments is just as important as growing hands-on skills. Encourage your staff to share breaking news via Slack, weekly emails, or at weekly marketing meetings.

Embrace continuous learning for agency success

Prioritizing education enables your agency to stay ahead of industry trends and adapt quickly to changes, ensuring that your strategies remain effective and your clients are satisfied. Additionally, investing in the education and growth of your employees—regardless of their experience level—not only improves client outcomes, but also enhances employee satisfaction and retention. Ultimately, by fostering a culture of continuous learning, your marketing agency can achieve greater authority, drive better business outcomes, attract better candidates, and stay ahead of the competition.

Christine Zirnheld - Senior Digital Marketing Manager at Cypress North
Christine Zirnheld is a senior digital marketing manager at Cypress North, specializing in PPC. As a host of the Marketing O'Clock podcast, she covers breaking PPC & SEO news stories with lots of sass. Twitter | LinkedIn