
  • Microsoft Clarity: Get more from your SEO by improving UX and conversions

Author: Celeste Gonzalez

    SEOs and website owners typically have the same goals: to improve rankings, traffic, and conversions. We do this by creating great content and optimizing website performance. But there are other things you can consider to further improve your site's performance, both for visitors and for your own business.

    Microsoft Clarity, while not a dedicated SEO platform, is an important user experience analytics tool to have in your toolset because it is SEO adjacent. Although it won't directly help you improve rankings, it will help you better understand your users, view your website through their eyes, find out how they are interacting with your content and features, and make the most of the traffic you're already bringing in. (After all, the best SEO in the world isn't very helpful if users simply leave after landing on your website.) And with the recent launch of Wix's Microsoft Clarity integration, these insights are more accessible than ever. Let's take a look at what you need to know to start improving your website with Microsoft Clarity.

    Table of contents:
      • What is Microsoft Clarity?
      • How to add Microsoft Clarity to your website
      • Microsoft Clarity and Google Analytics 4
      • User experience metrics within Microsoft Clarity
      • Recordings
      • Heat maps
      • How to use Microsoft Clarity to improve UX and conversions
      • Improve user experience
      • Improve conversions
      • Wix's Microsoft Clarity integration

    What is Microsoft Clarity?

    Microsoft Clarity is a free user behavior analytics tool that allows you to see how visitors interact with your website (or your client's website).
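As a preview of the setup described below, Clarity's tracking code is a small script placed in your pages' <head>. The sketch below is illustrative only: copy the exact snippet Clarity generates for your project, and note that YOUR_PROJECT_ID is a placeholder.

```html
<head>
  <!-- Microsoft Clarity tracking snippet (illustrative; use the exact code
       from your Clarity project — YOUR_PROJECT_ID is a placeholder) -->
  <script type="text/javascript">
    (function (c, l, a, r, i, t, y) {
      c[a] = c[a] || function () { (c[a].q = c[a].q || []).push(arguments); };
      t = l.createElement(r); t.async = 1;
      t.src = "https://www.clarity.ms/tag/" + i;
      y = l.getElementsByTagName(r)[0]; y.parentNode.insertBefore(t, y);
    })(window, document, "clarity", "script", "YOUR_PROJECT_ID");
  </script>
</head>
```

The snippet loads Clarity's script asynchronously, so it doesn't block page rendering, which is why Clarity doesn't slow down your site.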
Some of Clarity's most compelling benefits include:
      • No traffic limits
      • Free of charge
      • Does not slow down your site
      • Easily integrates with websites on Wix and other CMSs
      • Access to real-time data
      • Heat maps
      • Session recordings
      • GDPR- and CCPA-ready
      • Compatible with Google Analytics 4 (GA4) and other tools

    How to add Microsoft Clarity to your website

    To start gathering data with Microsoft Clarity, you must install a tracking code on your site, either manually, through a third-party platform, or by sharing the code with a developer to install. It's as simple as copying and pasting the code into the <head> of your site, or using Google Tag Manager to install it. Once the code is installed properly, you'll gather real-time data immediately. To install Microsoft Clarity on your Wix website, refer to the section on Wix's Microsoft Clarity integration.

    Microsoft Clarity and Google Analytics 4

    While you can obtain some common user data from both Clarity and GA4, these tools are actually better as complements of one another, not replacements. SEOs and digital marketers can connect the two via Clarity's GA4 integration, which allows you to see session playback links within your GA4 dashboard.

    User experience metrics within Microsoft Clarity

    Clarity's dashboard shows some engagement and UX metrics that digital marketers may already be familiar with (via other tools like GA4), as well as metrics that are unique to Microsoft Clarity. These include sessions, pages per session, scroll depth, active time spent, users overview, insights overview, JavaScript errors, and more. In addition, your session recordings and heat maps have their own dashboards that you can parse through as well.

    Metric: Sessions
    Description: Clarity's dashboard shows you total sessions (over the designated time period), but also includes more details like sessions with new/returning users, live users, unique users, and pages per session. These metrics exclude bot traffic.
    Example use case: You can see how different audiences (e.g., new vs. returning users, users in the US vs. another country) use the site and decide if there are ways to better optimize page layout.

    Metric: Scroll depth
    Description: This shows you how far users scroll down the page.
    Example use case: Look at your average scroll depth to see if users are seeing all the important information on the page. If users aren't scrolling to that information, you now know to place it higher on the page. This can also help you determine where to place calls-to-action.

    Metric: Active time spent
    Description: The amount of time a visitor was actively using your site (scrolling, reading, etc.). This does not include time where your site was hidden behind tabs or windows.
    Example use case: If it's an informational page, you likely want to keep visitors on there for a good amount of time. However, if you see that they leave the page quickly, then the page probably isn't answering their questions. There could be an opportunity to update the content or organize it more effectively.

    Metric: Dead clicks
    Description: Unique to Microsoft Clarity, this metric tells you how often users think something on your website is clickable, but actually isn't.
    Example use case: A high percentage of dead clicks can indicate that your users are confused, perhaps due to your usage of colors, an unintuitive page layout, etc. Use dead clicks to see what you should make interactive or clickable on a page. Or, you can use it to understand how to change your website's design layout so users stop clicking where they shouldn't.

    Metric: Rage clicks
    Description: This is when a user rapidly clicks or taps on a small area of your web page. Similar to dead clicks, rage clicks indicate that users expect your website to respond when they click on that area. In this case, though, the succession of clicks also signals user frustration.
    Example use case: This can mean that an element that definitely should be clickable, like a button, is broken and should be addressed ASAP.
Metric: Excessive scrolling
    Description: Excessive scrolling is exactly what it sounds like: when a user scrolls up and down a page more than is expected. This is generally indicative of a user not being able to find what they are looking for easily.
    Example use case: This could mean that the information on the page is not organized in the most logical way. Add a table of contents to the page so users can easily find what they are looking for rather than scrolling forever to find it.

    Metric: Quick backs
    Description: This is the ratio of users that go to a page and then immediately navigate back to the previous page. It lets you know that users didn't find the next page useful.
    Example use case: A quick back can mean a problem with the navigation or internal linking structure, or that the new page is just not helpful or relevant to the user.

    At the top of the Clarity dashboard, you have the option to get more granular by segmenting your data. You can filter by time frame, device, browser, user action, product, and more. You can even create custom filters by using custom tags to identify specific users and their behavior on your site.

    Recordings

    Clarity's recordings offer a way to watch how users interact with the pages and elements on your site. You can even see recordings for specific metrics, like recordings that show dead clicks or quick backs. In addition to the actual recording, Clarity shows you:
      • The page the user entered the site on
      • The user's exit page
      • The number of pages visited during that session
      • Session duration
      • Number of clicks
      • Device
      • Operating system
      • Country

    You can use this data to optimize a variety of scenarios. Here are some examples:

    Metric: Entry and exit pages
    Description: See which page a user began their journey on your site with and where they ended up.
    Example use case: When looking at these pages, ask yourself: are users flowing through your site the way you intended them to? If a user starts on a service page, you likely want them to end up on the contact page. Is this happening? If not, maybe there aren't enough CTAs or they aren't well placed.

    Metric: Number of pages visited
    Description: See how many pages a user visited in a single session.
    Example use case: Does this number make sense based on the session you watched? Was the user bouncing around from page to page searching for an answer until they finally found it? You can use this metric to help determine if things are missing from the user's entry page or other pages they visited on your site.

    Metric: Device
    Description: Segment behavioral data based on whether a user visited the site from their mobile device, PC, or tablet.
    Example use case: Perhaps you've noticed that desktop users convert better than mobile users. It could be that a pop-up form looks different on mobile than it does on desktop, which makes it more difficult to fill out.

    Note: Clarity only stores recordings for 30 days unless you designate a recording as a "favorite," in which case you'll have access to it for 13 months.

    Heat maps

    Heat maps allow you to see how users interact with your page, where they scroll and stop, etc. You can use them to help identify user trends and patterns by looking at what areas of a page have the highest and lowest amounts of engagement. Clarity divides heat maps into four categories:
      • Click maps
      • Scroll maps
      • Area maps
      • Conversion maps

    Use heat map data to determine where to place elements on a page for better visibility, engagement, and conversions.

    How to use Microsoft Clarity to improve UX and conversions

    Clarity provides you with a variety of different metrics (and you can even combine it with your Google Analytics data, as mentioned above). This data offers you insight into how users interact with your site, so it only makes sense to use it to improve their overall experience, which can then improve your conversions.

    Improve user experience

    Let's dive into a few different examples of how you can use Clarity data to improve a user's experience with your website. Zara is an international clothing brand with a reputation for its frustrating website.
In addition to monitoring social media for user complaints about its site layout, the brand could also use Clarity to pinpoint the cause of its users' frustrations. When shopping online for clothes, users are used to swiping left to see the next image of the product they're interested in. If you do this on Zara's site, instead of seeing the next photo, you swipe to the next product in whatever category you are looking at. This is likely leading to lots of quick backs as users return to the product they were originally viewing. By resolving this issue on mobile, Zara can help its very annoyed customers, which should ultimately lead to more customer transactions.

    This was just one example use case. Here are some other ways you can improve your user experience, according to other Clarity metrics and features:

    Metric/issue: Excessive scrolling on a particular page
    Tip: A table of contents may be necessary to outline the information on the page, so the user can get to it quickly and easily.

    Metric/issue: Surplus of rage clicks
    Tip: An element on your web page probably looks clickable (e.g., an icon, an underlined word that looks like a link, an image, etc.). You should either make it clickable or change the layout so people stop thinking it is.

    Metric/issue: Heat map shows lots of clicks on pieces of text within your content
    Tip: Users might be highlighting that information because they thought it was important. This could be a good place to add a link to supporting content, or to highlight it for users by making the text larger to call more attention to it.

    Metric/issue: Scroll heat map shows that users are not scrolling far enough to see the key information
    Tip: This is a sign that you should change the content's layout and move your key information further up the page.

    Metric/issue: High quick back rate on a particular page
    Tip: The page likely doesn't have the information users actually want. Depending on the context, you can link to a different page instead, change the anchor text, or update/refresh the content.

    Improve conversions

    As mentioned before, improving your site's overall user experience will, as a natural outcome, likely improve your conversions, too. For example, in a case study by Microsoft, users were confused about how to interact with a setup form. There was a blank in the statement above the input box, and users thought they should click on the blank to fill it out, rather than clicking on the box underneath the statement. After fixing this (and two other UX issues), the company in the case study, a prenuptial agreement planning platform, saw a 32% increase in revenue compared to the previous month. The company's goal was to help users navigate the site more easily, and by doing so, they were able to cash in.

    You can also use the data to explicitly call attention to forms and other CTAs to increase conversions. In a case study by the agency I work for, RicketyRoo, we looked at a client's homepage to see where we could urge more users to contact them. After reviewing the heat maps and click data, we noticed that there was a button sending users to watch a video when it should have sent them to the contact page. The homepage had few conversions before, so after fixing the button and moving it to a highly visible place where users were already clicking, we were able to greatly improve form submissions.

    Here are some other ways you can use Clarity data to improve your conversion rates:

    Metric/issue: Incomplete form submissions
    Tip: Check to see if your forms and/or pop-ups are broken. Verify that the form can be successfully filled out and closed.

    Metric/issue: Users are not scrolling to important information on the page
    Tip: Serve that information to users above the fold (or somewhere at least 50% of users have scrolled past) and evaluate the impact. For example, instead of a giant list of products, include a search function above the fold.

    Metric/issue: Increasing number of quick backs
    Tip: It could be that your anchor text is confusing to users and leading them to a page that does not include the information they are looking for.

    Metric/issue: JavaScript errors
    Tip: These could come from third-party plugins that are causing issues for your users, or something as simple as a missing parenthesis in your code. Pay attention to JavaScript click errors in particular, as these refer to errors that occur after a user clicks.

    Wix's Microsoft Clarity integration

    Wix and Microsoft have partnered to give you access to Clarity from directly within the Wix platform. To set up the integration, log into Wix, then:
    01. Locate Microsoft Clarity in the Wix App Market.
    02. Add Clarity to your website.
    03. Sign into Clarity from Wix to either create a new Clarity project or connect an existing one.

    The Clarity experience within Wix allows you to create or link projects and view data to implement the UX and conversion improvements mentioned above. Pair these insights with Google Search Console data in your Wix Analytics to optimize your website to capture users from search results and guide them all the way through to conversion.

    Getting website visitors is only half the battle

    As I said in the introduction, flawless SEO won't matter if visitors bounce from your website because they couldn't see or do what they came for. Auditing your user experience enables you to go through your customer journey the way your audience does, which in turn allows you to uncover the oversights that may be chipping away at your leads and revenue. As with any SEO testing methodology, make note of the changes you implement and how they impact your website performance. This way, you can develop best practices for your specific niche and audience, enabling you to further optimize the customer experience throughout your marketing funnel.
Celeste Gonzalez - Director of RooLabs at RicketyRoo

    Celeste Gonzalez leads RooLabs, RicketyRoo's SEO testing division, where she drives innovative strategies and engages with the SEO community. She is passionate about pushing SEO boundaries and sharing insights on both successes and challenges in the industry. Twitter | LinkedIn

  • How to use real-time analytics

Updated: October 28, 2024
    Author: James Clark

    In the world of web analytics, it's easy to dismiss real-time (or live) analytics as a vanity exercise. After all, aren't trends over time more important? Absolutely, and real-time data can even ensure that your trends data is more reliable by helping you troubleshoot, monitor marketing activity, and make better informed decisions on the fly. In this article, I'll show you what you should (and just as importantly, shouldn't) use real-time reporting for. Next, we'll dive into Google Analytics 4 and explore a couple of more advanced techniques, before considering the benefits of a dedicated real-time analytics tool and alternative sources of real-time data.

    Table of contents:
      • What is real-time analytics?
      • How to use real-time reporting for better campaign results
      • Debugging or troubleshooting
      • Monitoring marketing activity
      • Making decisions in real-time
      • What real-time analytics can't tell you
      • Real-time analytics in GA4
      • Comparisons in the GA4 Realtime overview
      • Audiences in the GA4 Realtime overview
      • Does GA4 update in real-time?
      • Dedicated real-time analytics tools
      • Other sources of real-time data
      • Real-time analytics on Wix

    What is real-time analytics?

    Real-time analytics/reporting refers to a collection of data that reflects the most recent activities and actions of users on your site. This can include the number of visitors, the pages they visited, traffic sources, events triggered, and so on. Many analytics tools, including both Google Analytics 4 and Adobe Analytics, offer some kind of "real-time" reporting. Some marketers may treat this kind of report as a vanity exercise: it's nice to know that there are five people on my site at the moment, but how exactly does that help me make business decisions? Site owners will often move on to other reports where they can access historic data and start to understand trends over time. But real-time reporting is incredibly useful once you understand how to apply it.
There are two main styles of real-time reporting depending on the analytics tool you use (and some tools offer both):
      • Event stream: An event or activity stream lists the events that have happened most recently on your site (in reverse chronological order). This almost always includes page views, but could also include button clicks, form submissions, purchases, or any other event you are tracking with the tool. Mixpanel and Piwik Pro are examples of tools that offer this kind of reporting. (GA4 also has an event stream, but as part of its DebugView rather than the real-time reports.)
      • Overview report: This type of report shows aggregated information about recent activity on your site. For example, GA4's Realtime overview shows you how many users visited your site over the past 30 minutes (and, since May 2024, five minutes—GA4 is still evolving!). It also displays the number of pageviews per page and a count of each event.

    No matter what tool you use, it will take time for that tool to collect and process data. This is often referred to as "latency"—the higher the latency, the lower the data freshness. Even so, the advantage of real-time reporting is that it includes the freshest data the tool can offer you. In short, real-time reporting is not so much "real-time" as it is "very recent activity." But that's still hugely valuable, as you'll soon see.

    Note: Wix site owners can access their Real-time Analytics (which includes an overview section as well as an activity stream) by going to Analytics > Real-time in their Wix dashboard.

    How to use real-time reporting for better campaign results

    While you wouldn't necessarily use real-time analytics to report on the success of your marketing campaigns, it plays a vital role in ensuring they run smoothly. Real-time data supports you in troubleshooting your tracking code, checking that your campaigns have deployed as planned, and making quick-fire marketing decisions.
Let's look at all three of these use cases.

    Debugging or troubleshooting

    The most common use for real-time reporting is debugging or troubleshooting. Piwik Pro even calls its real-time event stream the "tracker debugger" in recognition of this. Real-time reporting (or whatever your tool labels it as) gives you the freshest data, making it very useful for checking whether tracking codes are working and that the tool is capturing data at all. After all, why wait a day for data to appear in the standard reports when you can check a real-time report after just a minute or two?

    Another troubleshooting strength of real-time reporting is that it not only shows you traffic on your site, but also lists the events that have taken place. On GA4's Realtime overview, the "Event count by Event name" card is a great example of this. It lists page_view events of course, but also session_starts, scrolls, and any custom events you may be sending. Clicking on the name of an event then displays the event parameters that were captured. For the page_view event, that includes page_title, medium (for example, "organic"), and source (for example, "Google").

    This level of detail makes the Realtime overview useful for ad hoc troubleshooting on low-traffic sites. For more complex debugging, it would be better to use the dedicated DebugView with its own event stream, as this can be used to only show events from your own device.

    Monitoring marketing activity

    Real-time reporting is also useful for checking that marketing campaigns have deployed as planned, and for monitoring the impact of those campaigns in near real-time. Let's say you've scheduled a product email to go out to 10,000 people at midday—that will generate a spike in traffic (and hopefully sales) that you can see in your real-time report. If you don't see those things, you may want to double-check your email platform.
In addition, real-time reporting lets you see the impact of a trending social media post or blog post almost immediately. (Of course, the challenge here is knowing when something is going to be trending so you can monitor the analytics.) But it's not just digital marketing that you can monitor with real-time reporting. Some traditional marketing campaigns (such as radio advertising) could cause a spike in activity on your site as well. And if your CEO wants to know immediately how that expensive ad campaign performed, you're far better off providing some initial insight from real-time reporting than saying you have to wait until tomorrow to get data from the standard reports.

    Making decisions in real-time

    Real-time reporting is particularly useful when it comes to helping you make decisions about things happening live on your site, such as:
      • Broadcasts/livestreams
      • Flash sales
      • A/B tests

    Here the emphasis isn't so much on passively monitoring activity, but on using data to make decisions in real-time. Let's say you're planning to run an important webinar scheduled for 11AM. Should you start exactly on the hour, or should you wait until a couple of minutes past? Depending on your setup, your webinar tool may tell you the number of people that have signed in, but it won't tell you the number of people on your site who are still in the process of doing so. This is where real-time analytics can fill the gaps and help you build a picture of activity on your site in order to get your timing spot-on.

    This is possible not just because real-time data is fresher, but because it's also more granular: it lets you look at smaller time periods. The smallest time dimension available in GA4 outside of real-time reporting is hourly, and even then you have to customize a report or build an Exploration to take advantage of it. This makes it unsuitable for making real-time decisions. (Not to mention that Explorations can't look at the current day's data.)
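To make the granularity point concrete, here is a minimal sketch (the event names and data are hypothetical, not pulled from any analytics API) of bucketing timestamped page_view events into one-minute windows, the kind of view a real-time report gives you and an hourly report cannot:

```javascript
// Hypothetical event data: timestamps (ms) of page_view events around a webinar start.
// Bucketing them per minute reveals patterns an hourly total would hide.
const events = [
  { name: "page_view", ts: Date.UTC(2024, 0, 1, 10, 58, 10) },
  { name: "page_view", ts: Date.UTC(2024, 0, 1, 10, 58, 40) },
  { name: "page_view", ts: Date.UTC(2024, 0, 1, 10, 59, 5) },
  { name: "page_view", ts: Date.UTC(2024, 0, 1, 11, 0, 2) },
];

// Count events per one-minute bucket (key = ISO string of the minute's start)
function countPerMinute(evts) {
  const buckets = {};
  for (const e of evts) {
    const minuteStart = new Date(Math.floor(e.ts / 60000) * 60000).toISOString();
    buckets[minuteStart] = (buckets[minuteStart] || 0) + 1;
  }
  return buckets;
}

const perMinute = countPerMinute(events);
// e.g. two views in the 10:58 bucket, one each in 10:59 and 11:00
```

All four events fall into the same hourly bucket or two, but the per-minute view shows exactly when visitors arrive, which is what lets you time a webinar start or spot mid-event drop-off.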
To give another example, let's say you hold a virtual event with a number of short presentations by different speakers. Granular data would let you identify the individual presentations, or even the parts of presentations, that were less engaging and were causing your audience to drop from the event. Daily or hourly data would be much less useful here.

    What real-time analytics can't tell you

    Real-time analytics gives you the freshest data, often covering a specific window of time (the most recent 30 minutes, for example). This makes real-time analytics the wrong choice for any sort of trend analysis. If you want to know whether sales have been increasing over the past year, turn to your standard reports. For the same reason, analytics tools won't let you compare your real-time analytics data to a previous period. If you're interested in year-over-year, month-over-month, or even day-to-day comparisons, again you should turn to your standard reports. (The date picker on GA4's standard reports has options for comparing against the previous period, the same period last year, and a custom period of your choice.)

    You may also find that many of the dimensions and metrics you're familiar with from your standard reports aren't available to you in your real-time reports, especially session-based metrics (such as bounce rate). We'll look at what that means for GA4 in particular in the next section.

    Finally, be aware that real-time reporting is unlikely to be entirely accurate. Most tools are set up using client-side tracking, where data is sent from the user's web browser to the analytics service. But some users will block your tracking script with their browser or ad blocker settings, which means your tool will under-report the number of users on site. Your approach to consent will also affect the accuracy of real-time reporting.
For example, if you're using basic consent mode with GA4, then your Google tag will be blocked until the user grants consent via your consent banner. This, again, can lead to under-reporting. These are considerations with all analytics, not just real-time reporting.

    Real-time analytics in GA4

    Now that you understand what real-time analytics is (and what it isn't), let's dive deeper into the real-time functionality in GA4. Once you've selected your Property in GA4, you'll find the real-time reports near the top of the Reports menu (just below Reports snapshot). There are two real-time reports: Realtime overview and Realtime pages.

    Note: If you're a long-standing Google Analytics user, you may remember that Universal Analytics (the previous version of GA) had a whole suite of real-time reports. In addition to the overview report, there were individual reports focusing on user locations, events, conversions, and more. With GA4, Google has consolidated all of this information into two reports. But, as the Realtime pages report launched more recently (in October 2024), Google may well continue to expand GA4's real-time reporting capabilities.

    Let's look at the Realtime overview first. As with GA4's standard reports, this consists of a number of "cards," each of them summarizing one or two dimensions and metrics: for example, "Views by Page title and screen name" or "Event count by Event name." And as is usual for GA4, some of the cards allow you to switch between dimensions and metrics using a drop-down: "Active users by First user source" can be changed to "Active users by First user medium" or "Active users by First user campaign," among other options. One difference, though, is that the "customize report" option is missing. That means that, unlike the standard reports, you can't rearrange, add, or remove any cards on the Realtime overview.

    The second real-time report, Realtime pages, is the easiest way to see which pages on your site users recently viewed.
The large table lists the pages by page path (the end part of the URL, after the domain name) and gives the number of active users and views for each one over the past 30 minutes. Although this is similar to the "Views by Page title and screen name" card in the Realtime overview, having a dedicated report makes the information easier to understand at a glance. I could see it being used in a newsroom or up on a screen in a busy marketing department.

    Now let's look at a couple of more advanced techniques we can use in GA4's real-time reporting: Comparisons and Audiences. Both of these techniques work in the Realtime overview but not the Realtime pages report. (And if you're new to GA4, you might want to check out our guides to getting started with GA4 and key events and conversions in GA4 first.)

    Comparisons in the GA4 Realtime overview

    Many of the dimensions and metrics you may be familiar with from the standard reports are absent from GA4's Realtime overview. For example, none of the cards include the "browser" dimension, so there's no way to see a full breakdown of your users' browsers in real time. However, you can use the Comparison feature to get at least a little insight into this. Let's say you wanted to know how many of your real-time users are using Chrome:
    01. Go to the Realtime overview.
    02. Click Add comparison + (beneath the search bar) to open the "Apply a comparison" panel.
    03. Click + Create new.
    04. In the Dimension dropdown, choose Browser.
    05. In the Match Type dropdown, choose exactly matches.
    06. In the Value dropdown, tick Chrome.
    07. Finally, click Apply.

    Now, the Realtime overview will show you two versions of each card: the original (left) and a new version with the comparison applied (right). In our example, the site had four active users in total over the past 30 minutes, but only three of those were using Chrome. This approach works with other dimensions, too.
But depending on your choice, you may find that some of the cards display, "Real-time data is not supported for this comparison." For example, if you base a comparison around "screen resolution," then the Event count and Key events comparisons will not be available. This is a limitation of GA4's reporting.

    Audiences in the GA4 Realtime overview

    One of the cards in GA4's Realtime overview lets you break down Active users (or New users) by Audience. Probably the most common use for GA4 Audiences is as a targeting option in Google Ads, but if you don't run any paid advertising, you may not be familiar with the Audiences feature. So what are Audiences, how do you build them, and how do they relate to real-time reporting? Audiences are groups of your users that meet particular conditions, like "Browser = Chrome" or "have made a purchase." So they are similar to comparisons in some ways, but more powerful because they can also consist of users that performed a particular event.

    To build an audience:
    01. With your property selected, go to Admin > Audiences.
    02. Click on New Audience.
    03. You could use one of GA4's "reference" audiences (such as "Purchasers"), but for now let's Create a custom audience.
    04. Click on Add new condition and add a condition based around either a dimension (e.g., Browser) or an event (e.g., Click).
    05. If you choose a dimension, click on Add filter to finish writing the condition—for example, Browser exactly matches (=) Chrome.
    06. Optionally, add more conditions to either include or exclude other groups from your audience.

    Once you've added your condition(s), the Summary in the bottom-right of the audience builder will give you an estimated audience size (based on the last 30 days' activity on your site). Although the Summary might suggest otherwise, audiences always start with zero members—they aren't retroactive.
For example, if you create an audience of "Purchasers" at midday on February 1, only users making purchases from that moment onwards are added to the audience. The Summary is only showing you how big your Audience might be by now if you had created it 30 days ago. This means there's no point creating an Audience and then immediately hoping to use it in the Realtime overview. If you want to see how many users who completed a "sign_up" event are currently on your site, you need to have created that audience long enough ago to make it meaningful. (Users can remain in an Audience for up to 540 days, depending on the "membership duration" setting you chose when building the Audience.)

    If you do plan ahead, the combination of Audiences and real-time reporting can be incredibly powerful. Imagine you're running a live event on your site designed to target a particular subset of users: previous purchasers from France, for example. Now you'll be able to tell at a glance whether you're attracting the right audience or whether your messaging has appealed more to a different group.

    Does GA4 update in real time?

    You've already seen that one of the advantages of real-time reporting is the freshness of its data. So how fresh is "fresh" when it comes to GA4? Focusing on the standard reports first, Google gives a processing time of 12 hours for daily data, or longer for the biggest sites. And this is the "typical" processing time, by no means guaranteed. GA4 is different from the old Universal Analytics, which had a stated processing latency of "24-48 hours" but would often make data available within an hour or two. With GA4, 12 hours often really does mean 12 hours. To put that in context: if you wake up one morning and log in to GA4 to check the previous day's figures, don't be alarmed if it looks like traffic on your site has slumped. It may be that you aren't seeing the complete picture for that day yet.
So, when using the standard reports, it is safest to leave at least one full day before checking the data—in other words, don't go checking Wednesday's figures until Friday at the earliest. After all, you wouldn't want to risk making business decisions on incomplete and potentially misleading data. And if you're using GA4 to track activity on an app rather than a website, you may want to wait even longer. As the Analytics Help site says: "When a user's device goes offline (for example, a user loses their internet connection while browsing your mobile app), Google Analytics stores event data on their device and then sends the data once their device is back online. Google Analytics ignores events that arrive more than 72 hours after the events are triggered." Compared to the standard reporting, the real-time reports have an amazingly quick typical processing time of "less than one minute." This is the case for both free and paid (360) GA4 properties, although this processing time is not guaranteed by the 360 SLA. Nevertheless, it offers by far the freshest data available in GA4. But be careful how you interpret that data: the real-time reports will tell you the number of users to have visited within a five-minute and a 30-minute window, but not whether those users are still on the site. So, if you have "100 users in the last 5 minutes," it may be that all 100 are still on the site or that all 100 have left. The reality, of course, is probably somewhere in between.

Dedicated real-time analytics tools

If you absolutely need to know the number of users on your site at any given moment (rather than within, say, a 30-minute window), consider using a dedicated real-time analytics tool such as GoSquared or Realtime.li. These are designed to provide exactly that information, as this section of the Realtime.li dashboard demonstrates. GoSquared's approach is unusual in that it uses a technology called "pinging" to check that visitors are still on your site.
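In outline, ping-based liveness tracking can be sketched like this. To be clear, this is a hypothetical illustration, not GoSquared's actual code: the function name, the `/ping` endpoint, and the 30-second window are assumptions based on the behavior described here.

```javascript
// Hypothetical sketch of ping-based liveness tracking. Not GoSquared's
// actual implementation; names and the 30-second window are assumptions.
const LIVE_WINDOW_MS = 30 * 1000; // drop a visitor ~30s after their last ping

// lastPingByVisitor: Map of visitorId -> timestamp (ms) of that visitor's last ping
function countLiveVisitors(lastPingByVisitor, now) {
  let live = 0;
  for (const lastPing of lastPingByVisitor.values()) {
    // A visitor counts as "live" while their most recent ping is fresh,
    // even if they are passively watching a long video on the page.
    if (now - lastPing <= LIVE_WINDOW_MS) live += 1;
  }
  return live;
}

// In the browser, each open tab would keep its timestamp fresh by pinging
// a collector endpoint every few seconds, for example:
// setInterval(() => navigator.sendBeacon("/ping", visitorId), 10000);
```

Once the pings stop, the visitor simply ages out of the count within the window, which matches the roughly 30-second drop-off described here.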
This means that if they leave, they will be removed from the live visitors count in around 30 seconds. On the other hand, if they are sitting on your site doing something passive (such as watching a long-form video), they will still be counted towards the live visitors total. Generally speaking, other analytics tools would stop counting these passive visitors after a set period of time. On the downside, a dedicated live analytics tool won't offer you the same level of historic reporting as a more general platform might. This means you'll likely end up running two tools on your site at the same time—for example, Google Analytics 4 for tracking trends over time and GoSquared for real-time reporting. Whichever tools you use, don't expect them to give you identical results—every tool defines its metrics differently, even if they sound similar (sessions, visits, and so on). So, be consistent about when you use one tool and when you use the other.

Other sources of real-time data

I've shown you how to use real-time data both in all-purpose analytics tools such as Google Analytics 4 and in dedicated real-time analytics tools such as GoSquared. But I'd like to leave you with the thought that analytics tools aren't your only source of useful real-time data. In particular, if you're live streaming video content, it's likely that your platform will be able to provide some great insights. YouTube, for example, can tell you the number of "concurrent viewers" (i.e., the number of viewers watching your stream simultaneously) as well as the "peak concurrent" (i.e., the highest figure you have achieved during the stream). Facebook Live, Vimeo, and IBM Player all have similar metrics. In addition, your platform may give you details on specific interactions, such as "Likes" or "chat rate" (the number of messages sent in live chat per minute).
As with real-time analytics data from your website, you can use data from your streaming platform for troubleshooting or for identifying the most and least engaging parts of the livestream. Also, look in your website back office to see what data is available to you there. Most site builders and platforms now offer some level of built-in analytics, and this can be expanded through the use of third-party apps or plugins.

Real-time analytics on Wix

Wix site owners can access their real-time analytics by going to Analytics > Real-time in their Wix dashboard. You can also view a list of your Recent visitors over the last 24 hours, as well as a breakdown of each action they took during that session. Refer to the Live activity panel (shown above) to see which actions were recently taken on your site, including:

Viewing a store product
Entering the checkout flow
Viewing a blog post
Adding an item to a cart
Becoming a new contact
Booking and/or scheduling a service
Completing an order

Learn more about all of Wix Analytics' real-time reporting capabilities.

Real-time analytics: The right data for the right purpose

Real-time analytics (while not exactly 'real time') is a source of genuinely useful data. It can answer questions that your regular analytics reports simply can't. But it's also a specialized tool, designed to do one thing and do it well. That means it supplements (rather than replaces) any analytics you're already using. To extend the metaphor, it's an extra tool in your toolbox. That said, there are many different sources of real-time data, and each will tell you something slightly different in a slightly different way. So ask yourself: do you need real-time data for troubleshooting, for monitoring marketing campaigns, for live events, or for something else entirely?
The answer to that question will help you identify the right source of real-time data for you, and ensure you are using it for genuine business purposes rather than an ego boost each time you log in. In other words, it will help you keep it real.

James Clark - Web Analyst

James Clark is a web analyst from London, with a background in the publishing sector. When he isn't helping businesses with their analytics, he's usually writing how-to guides over on his website Technically Product. Twitter | Linkedin

  • How to Use Wix SEO Settings

    Speaker: Crystal Carter | 14 min In this video, you'll learn how to use Wix SEO Settings and the Edit by Page feature. Learn how to edit meta descriptions, title tags, and other SEO meta tags for your pages either individually or in bulk by page type. Designed to empower users with efficient and scalable tools for optimizing their Wix websites, these features are built into the Wix CMS and do not require a plugin. Read More

  • Intro to technical SEO: A guide to improving crawling and indexing for better rankings

    Author: Aleyda Solis The highest quality content on the web won't get any search traffic if technical configurations aren't correctly optimized for effective crawling and indexing. On the other hand, stellar technical SEO can help guide search engines (and users) to your most important pages, enabling you to bring in more traffic and revenue. In this article, I'll guide you through the key concepts, configurations, and criteria necessary to fully leverage technical SEO for your website. Let's begin.

Table of contents:

Technical SEO: What it is and why it's important
Crawlability, indexability, and rendering: Fundamental technical SEO concepts
Technical SEO configurations to understand and optimize (HTTP status, URL structure, website links, XML sitemaps, robots.txt, meta robots tags, canonicalization, JavaScript usage, HTTPS usage, mobile friendliness, structured data, Core Web Vitals, and hreflang annotations)

Technical SEO: What it is and why it's important

Technical SEO is the practice of optimizing your website configurations to influence its crawlability, rendering, and indexability so that search engines can effectively access and rank your content. This is why technical SEO is considered essential and one of the main pillars of the SEO process. It's referred to as 'technical' because it doesn't pertain to optimizing on-page content, but rather optimizing the technical configurations (e.g., HTTP status, internal linking, meta robots tags, canonicalization, XML sitemaps) with the goal of ensuring that search engines can access your content. It's crucial to understand that while you don't need to be a web developer or know how to code to handle technical SEO, you do need to grasp the basics of how websites are constructed. This includes understanding HTML and how other web technologies, like HTTP and JavaScript, function. This knowledge helps you evaluate and confirm that your website is optimized effectively for search.
Overlooking technical SEO can lead to your pages not appearing in search results, ultimately resulting in lost opportunities for rankings, traffic, and the revenue that comes with it.

The fundamental technical SEO concepts: Crawlability, indexability, and rendering

Search engines, like Google, begin the process of providing results to users by accessing website pages (whether they're text, images, or videos)—this is known as crawling. Once they've accessed and downloaded this content, they analyze it and store it in their database—this is known as indexing. These are key phases of the search process, and you can influence them through the technical setup of your website. Let's take a closer look at each of these phases to understand how they function, and why and how you'd want to optimize them.

Crawlability: Search engines discover your website pages through a process called 'crawling'. They use 'crawlers' (also known as 'spiders' or 'bots') that browse the web by following links between pages. Search engines can also find pages through other means, like XML sitemaps or direct submissions through tools like Google Search Console. Some search engines (including Microsoft Bing, Yandex, Seznam.cz, and Naver) use the IndexNow protocol (which Wix supports) to speed up discovery when you create or update content. Popular search engines have their own crawlers with specific names. For instance, Google's crawler is called 'Googlebot'. Websites can control which search engines access their content through a file called robots.txt, which sets rules for crawling. To ensure search engines can find and access important pages while preventing them from accessing unwanted ones, it's crucial to optimize your technical configurations accordingly.

Indexability: After a search engine crawls a webpage, it analyzes its content to understand what it's about. This process, known as indexing, involves evaluating the text-based content as well as any images or videos.
In addition to HTML pages, search engines can often index content from text-based files, like PDFs or XML files. However, not every crawled page will get indexed. This depends on factors like the originality and quality of the content, certain HTML configurations like meta robots and canonical annotations, and reliance on JavaScript for key design and content rendering, which can make indexing difficult. During indexing, search engines check if a page is a duplicate of others with similar content and select the most representative one (referred to as the 'canonical page') to display in search results. Therefore, it's crucial that you correctly configure and optimize these different elements to ensure effective page indexing.

Rendering: If your website utilizes client-side JavaScript, search engines need to perform an additional step called 'rendering' to index your content. Client-side JavaScript rendering involves using JavaScript to create HTML content dynamically in the browser. Unlike server-side rendering, where HTML is generated on the server and sent to the browser, client-side rendering starts with a basic HTML file from the server and uses JavaScript to fill in the rest. Because of this, search engines have to execute the JavaScript before they can see the content. While search engines like Google and Bing can render JavaScript to index the page, it requires more resources and time, and you might encounter limitations when relying on client-side rendering on a large scale. That's why, when using JavaScript, it's best to opt for server-side rendering to make indexing easier.

Technical SEO configurations to understand and optimize

Now that you understand the considerations that technical SEO seeks to optimize, let's look at the different configurations that influence your technical SEO and how to optimize them to maximize your organic search visibility.
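To make the client-side rendering caveat concrete, here is a minimal, invented example of the raw HTML a crawler initially downloads from a client-side-rendered page ("/app.js" is a placeholder for the site's JavaScript bundle). Until the script runs, there is no indexable content:

```html
<!-- Raw HTML served to the crawler: no visible content yet -->
<html>
  <head><title>Products</title></head>
  <body>
    <div id="root"></div>           <!-- filled in later by JavaScript -->
    <script src="/app.js"></script> <!-- builds the page in the browser -->
  </body>
</html>
```

With server-side rendering, by contrast, the product listings themselves would already be present inside the body when the crawler fetches the page.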
I'll cover:

HTTP status
URL structure
Website links
XML sitemaps
Robots.txt
Meta robots tags
Canonicalization
JavaScript usage
HTTPS usage
Mobile friendliness
Structured data
Core Web Vitals
Hreflang annotations

HTTP status

HTTP status codes are numerical responses from your web server when a browser or search engine requests a page. These codes indicate whether the request was successful or an issue occurred. Here are key HTTP status codes and their implications for SEO:

2xx (success):
200 OK — Page successfully found and available for indexing assessment.

3xx (redirection):
301 moved permanently — This indicates a permanent move to another URL; it transfers the SEO value of the former URL to the final destination. That's why SEOs use 301 redirects when performing a website migration, changing a URL, or removing a page that used to attract rankings, traffic, and backlinks.
302 found — This indicates a temporary move and doesn't transfer the former URL's SEO value to the target page.

4xx (client errors):
404 not found — This indicates that the page was not found. A high number of 404 errors can impact your site's crawl budget (i.e., the amount of time and resources a search engine dedicates to crawling your website).
410 gone — This indicates an intentional and permanent removal. This can be useful for de-indexing a page if it doesn't have any rankings, traffic, or links.

5xx (server errors):
500 internal server error — This indicates the server failed to fulfill a request. This can be harmful to your SEO if not resolved.
503 service unavailable — This code indicates that a page is temporarily unavailable and can be used for website maintenance without impacting your SEO. You can use this status code to tell search engines to come back later.

Soft 404 errors: These occur when a page returns a 200 OK status, but lacks content or shows an error message, suggesting that it doesn't exist anymore or providing a poor user experience.
For permanent content relocation, use a 301 redirect. For removed content, redirect to the parent category if the page had value, or use a 410 status if it didn't.

URL structure

A well-designed URL structure is important for both search engines and users to understand the content of your webpages. Here are some widely accepted best practices for URL structure:

Keep URLs simple, short, lowercase, and descriptive, using meaningful words instead of IDs.
Use hyphens to separate words. Avoid underscores, spaces, or concatenation.
Avoid generating multiple URLs for the same content, such as through session IDs or excessive parameters.
Maintain a logical folder structure without going too deep to prevent overly long and complex URLs.
Consistently use trailing slashes or non-trailing slashes to avoid duplicate content issues, and use 301 redirects to enforce canonical URLs.

Good URL structure example: yoursitename.com/smartphones/iphone/
Poor URL structure example: yoursitename.com/id-23-p?id=2

Website links

Links are crucial for search engines to discover new pages and for users to navigate your site. To optimize your website's links, implement the best practices below.

Include navigation links: Utilize main menus, footer links, and editorially placed links within your content to enhance crawlability and the browsing experience.
Use HTML tags: Use the HTML <a> tag for links to ensure crawlability and avoid JavaScript-based links.
Create descriptive anchor text: Use descriptive, relevant anchor text that accurately describes the linked page, incorporating targeted keywords when possible. Avoid generic terms like 'click here' or 'read more'.
Link to canonical URLs: Directly link to canonical, indexable URLs. Avoid linking to pages that redirect or trigger errors.
Link to absolute URLs: Use full URLs instead of relative URLs to prevent issues.
Structure and prioritize your linking strategy: Follow a logical, hierarchical structure for internal linking, prioritizing high-value pages. Cross-link between similar pages to aid both users and search engines.
Avoid nofollow for internal and trusted external links: Generally, internal links should be followed by default. Reserve the rel="nofollow" attribute for when you don't want to pass link equity.

XML sitemaps

XML sitemaps are files (in XML format) that tell search engines about the essential, indexable files of your website, such as pages, videos, or images, and their relationships. They aid search engines in efficiently crawling and indexing this content. While not mandatory, XML sitemaps are recommended for highly dynamic or large websites with thousands of URLs (or more). They complement internal links, helping search engines discover URLs within a site. There are various types of XML sitemaps, including general, video, image, and news sitemaps. Most web platforms automatically generate and update XML sitemaps when you add or remove pages.

Considerations for creating XML sitemaps include:

Adhering to size limits (50MB uncompressed or 50,000 URLs)
Using UTF-8 encoding
Placing them at the root of the site
Referencing URLs as absolute URLs

A minimal XML sitemap contains a single <urlset> element wrapping one <url> entry, whose <loc> tag holds the page's absolute URL.

Robots.txt

The robots.txt file, located at a website's root, controls which pages search engines can access and, for some search engines, how quickly they crawl them. Use it to prevent website overload, but don't rely on it to keep pages out of Google's index. The file must be UTF-8 encoded, respond with a 200 HTTP status code, and be named "robots.txt". Your robots.txt file consists of groups of rules, each starting with a user-agent directive specifying the crawler. Allowed rules include:

User-agent — Specifies which crawlers should follow your rules.
Disallow — Blocks access to a directory or page using relative routes.
Allow — Overrides a disallow rule to allow crawling of a specified directory or page.
Sitemap — Optionally, you can include the location of your XML sitemap.

For example, a robots.txt file might block every crawler from a single directory (User-agent: * followed by Disallow: /private/) while pointing crawlers to the sitemap with a Sitemap: line.

Meta robots tags

Meta robots tags are placed in a page's HTML head or HTTP header to provide search engines with instructions on that particular page's indexing and link crawlability. A meta robots tag with the "noindex" directive, for instance, tells search engines not to index the page. The name and content attributes are not case-sensitive. Allowed directives include:

"noindex" — This prevents page indexing.
"index" — This allows page indexing (it is also the default, if not otherwise specified).
"follow" — Allows search engines to follow links on the page.
"nofollow" — This prevents search engines from following links on the page.
"noimageindex" — This prevents indexing of images on the page.

You can combine these directives in a single meta tag (separated by commas) or place them in separate meta tags.

Canonicalization

Canonicalization refers to selecting the main version of a page when multiple versions or URLs exist, therefore preventing duplicate content issues. Duplicate content can result from URL protocol variations (HTTP and HTTPS), site functions (URLs with parameters resulting from filtering categories), and so on. Search engines choose the canonical version based on signals like HTTPS usage, redirects, XML sitemap inclusion, and rel="canonical" annotations. Practical methods to specify the canonical URL include:

301 redirects — You can simply direct users and crawlers to the canonical URL.
rel="canonical" annotations — Specify the canonical URL within the page's HTML.
XML sitemap inclusion — This signals the preferred URL to search engines.

301 redirects are ideal when only one URL should be accessible, while rel="canonical" annotations and XML sitemap inclusion are better when duplicate versions need to remain accessible.
Canonical annotations are typically placed within the HTML head or HTTP headers, pointing to the absolute URL of the canonical page. In HTML, this takes the form of a link element with rel="canonical" and an href of the preferred URL. For non-HTML files like PDFs, you can implement canonical tags through the HTTP header.

JavaScript usage

JavaScript can enhance website interactivity, but some sites also use it for client-side rendering (where the browser executes JavaScript to dynamically generate page HTML). This adds an extra step for search engines to index content, requiring more time and resources, which can result in limitations at scale. That's why server-side rendering is recommended instead. Some web platforms, like Wix, use server-side rendering to deliver both JavaScript and SEO tags in the most efficient way possible. If you can't avoid client-side rendering, follow these best practices:

Ensure links are crawlable using the HTML <a> element with an href attribute.
Give each page its own URL, avoiding fragments to load different pages.
Make the resources needed for rendering crawlable.
Maintain consistency between raw HTML and rendered JavaScript configurations, like meta robots or canonical tags.
Avoid lazy loading above-the-fold content for faster rendering.
Use search engine tools, like Google's URL Inspection tool, to verify how pages are rendered.

HTTPS usage

HTTPS (Hypertext Transfer Protocol Secure) is crucial for sites handling sensitive information as it encrypts data exchanged between users and your website. Search engines, like Google, use HTTPS as a ranking signal, prioritizing secure connections in search results for better user experience. To ensure security, all pages and resources (images, CSS, JS) should be served via HTTPS. Migrating to HTTPS involves:

SSL/TLS certificate — Purchase and install this on your web server.
Server configuration — Configure the server to use the certificate.
Redirects — 301 redirect all HTTP URLs to their HTTPS equivalents.
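The redirect step is commonly handled at the web server itself. Here is a hedged sketch assuming an nginx server and the placeholder domain example.com (other servers such as Apache have equivalents; adapt for your own stack):

```nginx
# Catch all plain-HTTP requests on port 80 and permanently
# redirect each one to its exact HTTPS equivalent.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}
```

Because $host and $request_uri are echoed back, every old HTTP URL maps one-to-one onto its HTTPS counterpart, preserving the SEO value discussed in the 301 section earlier.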
For a smooth transition:

301 redirect — Ensure all URLs permanently redirect to HTTPS.
Update internal links — Update internal links to HTTPS.
External resources — Check external resources (e.g., CDNs) for HTTPS support.
Mixed-content warnings — Resolve any mixed content (i.e., when secure HTTPS pages load resources over an insecure HTTP protocol), ensuring all content is loaded via HTTPS to avoid browser warnings.

Mobile friendliness

Search engines, like Google, prioritize mobile-friendly websites, using mobile crawlers to primarily index mobile content for ranking (as opposed to desktop content). To provide a positive mobile experience, ensure that your site has a well-configured mobile version that fits mobile devices of various screen sizes correctly. These are the three main configurations for mobile-friendly sites:

Responsive design — The same HTML code on the same URL, displaying content differently based on screen size via CSS. This is the method that Google recommends because it's the easiest to implement and maintain.
Dynamic serving — The same URL, but serving different HTML based on user-agent.
Separate URLs — Different HTML for each device on separate URLs.

Regardless of the configuration, ensure mobile and desktop versions have equivalent crawlability, indexability, and content configurations (titles, meta descriptions, meta robots tags, main content, internal links, structured data, etc.). Allow search engines to crawl resources used in both versions (images, CSS, JavaScript). Avoid lazy-loading for primary content and ensure that all content visible in the viewport is automatically loaded. Optimizing these elements will help search engines effectively access and index the mobile version of your site, improving its visibility and ranking.

Structured data

Structured data helps search engines understand and classify a page's content, leading to enhanced search listings known as 'rich results'.
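For example, recipe markup (one popular structured data type) can be embedded as JSON-LD, a script of JavaScript notation placed in the page HTML. This is a minimal sketch using schema.org's Recipe type; all values are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Simple Banana Bread",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT15M",
  "cookTime": "PT60M",
  "recipeIngredient": ["3 ripe bananas", "250g flour", "2 eggs"]
}
</script>
```

Search engines read this block directly, without it affecting what visitors see on the page.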
Popular structured data types for generating rich results include: breadcrumb, logo, event, FAQ, how-to, image metadata, product, Q&A, recipe, reviews, software, and video. You can implement structured data in three main formats:

JSON-LD — Recommended for ease of implementation and maintenance at scale, JSON-LD uses JavaScript notation embedded in HTML.
Microdata — This format uses HTML tag attributes to nest structured data within HTML content.
RDFa — This format is an HTML5 extension supporting linked data using HTML tag attributes.

Google's Rich Results Test tool validates structured data and provides previews in Google Search. JSON-LD for a recipe page, for instance, wraps the recipe's name, ingredients, and timings in a script tag of type application/ld+json.

Core Web Vitals

Core Web Vitals (CWV) measure user experience for loading, interactivity, and the visual stability of a page. Google considers them in its ranking systems. The three main CWV metrics are:

Largest Contentful Paint (LCP) — This measures loading performance by considering the render time of the largest visible image or text block.
Interaction to Next Paint (INP) — This metric observes the latency of all click, tap, and keyboard interactions that occur throughout the lifespan of a user's visit to a page.
Cumulative Layout Shift (CLS) — This measures visual stability by assessing unexpected layout shifts during a page's lifespan.

Google Search Console provides insights into Core Web Vitals performance, which is crucial for site audits. You can improve Core Web Vitals by:

Removing unused JavaScript — Avoid loading unnecessary internal or external JavaScript.
Using next-gen image formats — Optimize images using lightweight formats like WebP for smaller file sizes without quality loss.
Caching static assets — Store assets like images, CSS, and JavaScript in the browser cache to reduce loading time.
Eliminating render-blocking resources — Asynchronously load external JavaScript to allow the browser to continue parsing HTML.
Sizing images appropriately — Specify image dimensions to allocate space on the screen, reducing layout shifts.

Hreflang annotations

Hreflang annotations are useful for indicating the language and regional targeting of a page and its alternate versions to search engines like Google. There are three main methods for implementing hreflang:

HTML — Add hreflang link elements to the page's HTML <head> section.
HTTP header — Implement hreflang via the HTTP header for non-HTML files, like PDFs.
XML sitemap — Include hreflang annotations in an XML sitemap.

Below are some best practices for implementing hreflang annotations:

Use them only for indexable pages with multiple language or country versions.
Tag only the canonical versions of URLs meant to be ranked.
Always self-refer and specify the language (and optionally the country) of the current page, along with its alternates.
You can specify only the language, but you can't specify only the country. When you specify a country, you need to specify the language as well. If you specify both, the language value should always come first, separated by a dash (-) and followed by the country (e.g., en-us).

Note that Google does not rely solely on hreflang annotations to identify page targeting; it also considers other signals, like ccTLDs, local language, links from local sites, and local currency.

Technical SEO is a team effort

Building your website on a foundation of technical SEO best practices helps you get the most traffic from the content you're creating anyway. Oftentimes, however, you're not the one responsible for actually implementing technical SEO recommendations, which could mean that those suggestions don't get implemented in a timely manner, hampering your search visibility as well as your career growth.
To get your recommendations across the finish line, you need to:

Set the foundations for partnership with devs and product stakeholders
Strengthen communication for better implementation and outcomes
Prioritize your recommendations
Validate technical SEO execution

To learn more about how to do just that, read my other article on how to get technical SEO recommendations implemented.

Aleyda Solis - SEO Consultant and Founder at Orainti

Aleyda Solis is an SEO speaker, author, and the founder of Orainti, a boutique SEO consultancy advising top brands worldwide. She shares the latest SEO news and resources in her SEOFOMO newsletter, SEO tips in the Crawling Mondays video series, and a free SEO Learning Roadmap called LearningSEO.io. Twitter | Linkedin

  • The noindex tag: What it is, why you need it, and when to use it for better SEO

    Author: Vinnie Wong Google may be a search giant, but it still has its limits. Serving Google too many irrelevant or low-quality pages can hurt your site's crawlability and indexation, eventually resulting in lower rankings, traffic, and revenue. But what if you have a handful of pages that need to stay live without appearing in search results (e.g., gated content, internal search results, checkout pages)? Enter the noindex tag—your resource for telling search engines to keep a page out of the search results, while still making it available for the users that need it. By strategically applying noindex tags, you can streamline your site's structure, prioritize your most valuable content, and maximize the time Google spends crawling your website. In this article, I'll dive into the world of noindex tags and explore how they can help you take control of your website's SEO.

Table of contents:

What is a noindex tag?
Why the noindex tag is important for SEO
Noindex vs. Robots.txt
Noindex vs. Nofollow
How to noindex a page
How to noindex pages on Wix & Wix Studio
How to check if a particular page is noindexed
Best practices: How to use the noindex tag correctly

What is a noindex tag?

When implemented correctly, a 'noindex tag' is a piece of code that instructs search engines not to include a particular webpage in their indexes, preventing the page from showing up in search results. This tag is part of a larger family of meta directives known as 'robots meta tags,' which provide search engine crawlers with important instructions about how to interact with a website's content.
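For instance, the robots meta tag that carries a noindex instruction sits in a page's <head>. The standard form, and a variant aimed only at Google's crawler, looks like this:

```html
<!-- Applies to all crawlers -->
<meta name="robots" content="noindex">

<!-- Applies only to Google's crawler -->
<meta name="googlebot" content="noindex">
```

This is standard robots meta tag syntax; only the choice of crawler name varies by search engine.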
The noindex tag is placed within the <head> section of a page's HTML as a meta tag with a name of "robots" and a content value of "noindex". Alternatively, the noindex tag can target a specific search engine's crawler (such as Google) by replacing "robots" with the crawler's name, such as "googlebot". The 'index' instruction is the default for search engines (allowing your pages to show up in search results), while the noindex tag explicitly tells crawlers not to add the page to their indexes. It's crucial to understand that the noindex directive operates on a page-level basis—it only applies to the specific URL on which you implement it.

Why the noindex tag is important for SEO

It may seem counterintuitive to exclude pages from search engine indexes, but there are crucial scenarios where preventing certain pages from appearing in search results is beneficial for your website's overall SEO health and user experience. Below are four ways to use the noindex tag to support your business's online success:

01. Avoid duplicate content issues
02. Optimize crawl budget
03. Maintain content quality and relevance
04. Control access and visibility

01. Avoid duplicate content issues

When search engines encounter multiple pages with identical (or very similar) content, they may have difficulty determining which version is most relevant to rank in search results and show to users. This can lead to several problems:

The 'wrong' version may rank instead of the original or preferred page.
Link equity and ranking signals can dilute across the duplicate versions of the content.
Websites may face algorithmic penalties for perceived manipulative duplicate content.

Strategically applying the noindex tag to duplicate pages (such as printer-friendly versions) signals to search engines which version should get indexed and ranked. This consolidates ranking signals and helps ensure that the original, high-quality page is what gets shown to users in search results.

02.
Optimize crawl budget As huge as Google is, the search engine giant has confirmed that there are just too many pages to crawl . To maximize its time and budget resources, Google limits how long it will crawl any one site—this is what SEOs often refer to as ‘crawl budget.’ For larger websites that have over 10,000 pages, crawl budget optimization will strengthen your site’s SEO.  The noindex tag allows SEOs and site owners to manage their crawl budget by instructing search engine bots not to waste time indexing low-value or non-public pages, such as: Internal search result pages Filter or sorting pages for eCommerce websites User-specific content (private profiles or account pages) Auto-generated pages with minimal unique content By keeping these pages out of the index, search engines can focus on discovering and ranking the site’s most important, user-facing content. 03. Maintain content quality and relevance Over time, content will naturally become outdated (or less relevant). Deleting this content outright isn’t always the best solution, so the noindex tag allows you to keep the content on your site while preventing it from appearing in search results and potentially harming your overall content quality signals.  This is useful for: Older blog posts or news articles that are no longer timely Product pages for discontinued or out-of-stock items Thin or low-quality pages that don’t meet current standards Noindexing this content helps ensure users find your most valuable, relevant content when searching related keywords. 04. Control access and visibility Many websites create content intended for a specific audience or requiring special access, such as: Members-only content Staging or development pages Paid resources or course materials Conversion funnel pages (e.g., ‘thank you’ pages) The noindex tag provides a simple way to shield these pages from search engine discovery, maintaining control over who can find and access your content. Noindex vs. 
Robots.txt While both the noindex tag and the robots.txt file  provide instructions to search engine crawlers, they serve different purposes: Robots.txt controls crawling, specifying which parts of the site search engine bots are allowed to crawl and which are off-limits. The noindex tag allows bots to crawl the page but prevents them from indexing it. Here’s a simple robots.txt example: User-agent: * Disallow: /private/ This instructs all search engine bots not to crawl any pages within the “/private/” directory of your website. The key distinction is that robots.txt prevents search engine bots from accessing and crawling certain pages altogether, but it doesn’t directly impact whether a page can appear in search results. In contrast, the noindex tag allows bots to crawl the page but prevents indexing, keeping the page out of search results. It’s a subtle difference that has important implications. Crawling vs. indexing: With robots.txt, disallowed pages won’t be crawled at all, so search engines never see their content. With noindex, pages will be crawled but not indexed, and search engines can still analyze the page and follow its links. Link equity flow: Links on robots.txt-blocked pages won’t be followed or pass link equity (PageRank). Noindexing a page over the long term will eventually result in Google removing the page from its index completely and no longer following its links (effectively treating it as noindex, nofollow). Control level: Robots.txt typically operates at the directory or site-wide level, disallowing entire sections with a single rule, while the noindex tag controls indexation on a page-by-page basis, providing more granular control. In practice, robots.txt and noindex are often used together. For example, you might use robots.txt to prevent crawling of sensitive pages and apply the noindex tag on specific pages that shouldn’t appear in search results (e.g., ‘thank you’ pages or faceted navigation). Noindex vs. 
Nofollow Noindex and nofollow are two distinct meta directives with specific purposes, often used together but serving different functions. The nofollow directive (which applies the nofollow link attribute  to all the links on that page) is a meta tag that instructs search engine crawlers not to follow any outbound links on the page, acting as a ‘stop sign’ for link equity flow. Here’s what it looks like in the <head> section: <meta name="robots" content="nofollow">. The meta robots nofollow directive is beneficial in three common scenarios: To tell Google that you don’t endorse a link:  You might use a nofollow link if you’re linking to a website that you don’t necessarily recommend or trust. By using a nofollow link, you’re telling Google that you don’t want to pass any of your page’s ranking power to the linked page. In user-generated content to avoid link spam:  If your website allows users to add to your content, such as comments or forum posts, you might want to use nofollow links for any links that users add. This can help to prevent spammers from adding links to their own websites in your content. For crawl prioritization:  On large sites, you can use nofollow on certain pages (or page types) to manage crawl budget and direct search engine bots to your most important content. By reducing the number of links bots have to follow, you streamline the crawling and indexing process. You can use noindex and nofollow together on a single page: <meta name="robots" content="noindex, nofollow">. This instructs search engines not to index the page or follow its links (common for pages like login screens or ‘thank you’ pages that are necessary but shouldn’t be discoverable through search or pass authority). However, using noindex and nofollow together incorrectly can have unintended consequences: Accidentally noindexing and nofollowing important pages could prevent indexing and cut off link equity flow to other key pages. Noindexing and nofollowing large site sections can hinder search engines from discovering and ranking your most valuable content. 
Use noindex on pages that shouldn’t appear in search results and nofollow only when necessary to control link equity flow. If unsure, err on the side of indexation and allow links to be followed to help search engines understand and rank your site effectively. How to noindex a page Adding the noindex tag to a page is relatively simple and requires access to your site’s HTML code. There are two primary methods: 01. Meta robots tag The most common way to noindex a page is adding a meta robots tag to the <head> section of the page’s HTML: <meta name="robots" content="noindex">. This instructs all search engine bots not to index the page. To target a specific bot, replace “robots” with the bot’s name (e.g., <meta name="googlebot" content="noindex">). You can combine noindex with other directives, like "nofollow," by separating them with a comma: <meta name="robots" content="noindex, nofollow">. The process for adding this tag depends on your content management system or web platform: Wix: Use the settings in the Wix editor, the SEO settings panel, or the Edit by Page section . I’ll cover these options in more detail later. Other platforms: For closed content management systems, you can typically noindex a page within that page’s settings. For open source platforms, you may need to install a plugin. Refer to your specific platform’s documentation. After you add the noindex tag, save your changes and publish or update the page. The tag will take effect the next time a search engine bot crawls the page. Before you start putting this tactic to work, it is absolutely crucial that you: Avoid using both robots.txt disallow instructions and the noindex tag on the same page. If robots.txt blocks crawlers from the page, they will never see the noindex tag, so the page can still end up indexed (for example, via links from other sites). The noindex tag is a page-specific directive, while the robots.txt file is a broader instruction that controls crawl access for all crawlers and bots. 02. 
X-Robots-Tag HTTP header An alternative method for specifying noindex is using the X-Robots-Tag HTTP header, which you can add to your server’s HTTP response for a particular page (or group of pages): X-Robots-Tag: noindex This method is great for non-HTML files (PDFs, images, videos) and situations where you can’t directly access a page’s HTML code, but can configure your server’s response headers. Implementing the X-Robots-Tag header requires modifying your server configuration files (e.g., .htaccess for Apache servers or nginx.conf for NGINX). The process depends on your server setup, but here’s an example for Apache: <Files "example.pdf"> Header set X-Robots-Tag "noindex" </Files> This code snippet instructs Apache to add the noindex X-Robots-Tag header to the HTTP response for the file "example.pdf." (On NGINX, the equivalent would be an add_header X-Robots-Tag "noindex"; directive inside the relevant location block.) Note that using HTTP headers requires more technical knowledge and server access compared to adding meta tags to HTML. If you’re not comfortable modifying server configurations, stick with the meta tag method. Regardless of the implementation method, search engines will recognize the noindex directive and exclude the page from their indexes. How to noindex pages on Wix & Wix Studio  Whether you’re on Wix or Wix Studio, it’s easy to add noindex tags to your pages through the built-in SEO settings. Here’s how: Open the Wix editor: Log in to your Wix account and open the editor for the site you want to modify. Access your Page Settings : In the editor, choose the page you want to noindex from the Pages & Menu  options on the left-hand panel. Click on the ‘more actions’ (three dots), then click SEO basics . Apply the noindex: At the bottom of the SEO basics  tab, toggle the switch for “Let search engines index this page (include in search results)” to the off position. This adds a noindex meta tag to the page. The noindex tag is now included in the page’s HTML code, and search engines won’t index it the next time they crawl your site. 
To view your noindexed pages  on Wix, use the Site Inspection tool : Access your Site Inspection  dashboard: From your Wix dashboard, go to Site & Mobile App > Website & SEO > SEO . Under Tools and settings , click on Site Inspection . Check your page status in Google’s index: In the Site Inspection report, open the filtering options. In the Index Status  drop-down filter, select Excluded  to filter for pages that are not indexed. Look for the status “Excluded by ‘noindex’ tag” to indicate pages that are noindexed. To apply noindex tags to all pages of a certain type  (e.g., all blog posts in a category, all product pages, etc.), use the Edit by Page  feature in the Wix dashboard: In your Wix dashboard, go to Site & Mobile App > Website & SEO > SEO . Under Tools and settings , select SEO Settings . From there, choose your desired category of pages and go to the Edit by page  tab (shown below). How to check if a particular page is noindexed There are a few ways to check if a specific page is noindexed, including: Checking the page’s HTML code Google Search Console’s URL Inspection Tool Browser extensions Crawling tools Check the page’s HTML Code To check for a noindex tag in a page’s HTML: Open the page in your web browser. Right-click anywhere on the page and select “View Page Source” (or use Ctrl+U on Windows or Option+Command+U on Mac). In the new tab showing the page’s HTML code, use your browser’s search function (Ctrl+F or Command+F) to search for "noindex". If the page has a noindex tag, you should see a line like <meta name="robots" content="noindex"> in the code. Note that this method only checks for the presence of the noindex tag and doesn’t confirm whether search engines have actually excluded the page from their indexes. You can also use this method on any webpage that you can access—not just the ones on your site. 
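If you want to run this view-source check across many pages, the same logic can be scripted. Below is a minimal, standard-library-only Python sketch (the has_noindex helper is an illustrative name of my own, not a function from any tool mentioned here):

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Scans <meta> tags for a robots-style noindex directive."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        name = (attr_map.get("name") or "").lower()
        content = (attr_map.get("content") or "").lower()
        # Matches both the generic and the Google-specific robots meta tag
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True


def has_noindex(html: str) -> bool:
    """Return True if the page's HTML contains a noindex meta directive."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex
```

Like the manual check, this only confirms the tag’s presence; it doesn’t tell you whether search engines have actually dropped the page from their indexes.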
Use the URL Inspection Tool in Google Search Console To check the index status of one of your own webpages using Google Search Console : Log in to your Google Search Console account and select the property for your website. In the left-hand menu, click on “URL Inspection.” Enter the URL of the page you want to check in the search bar and press enter. Google will display information about the page, including its indexing status. If the page is noindexed, you’ll see a message like “URL is not on Google” or “Excluded by ‘noindex’ tag.” This tool provides a definitive answer on whether Google has excluded the page from its index based on the noindex tag, but it only works for pages on sites you have verified ownership of in Search Console. Use a browser extension Browser extensions, like Meta SEO Inspector for Chrome , can quickly check a page’s robots meta tags, including noindex. Install the Meta SEO Inspector extension from the Chrome Web Store. Open the page you want to check in Chrome. Click on the Meta SEO Inspector icon in your browser toolbar (it looks like a magnifying glass). The extension will display a summary of the page’s meta tags, including any robots directives like noindex or nofollow. Keep in mind that extensions can be handy for spot-checking individual pages, but aren’t as definitive as Google Search Console, as they only look at the page’s HTML and not its actual indexing status. Crawl the website Website crawling tools like Screaming Frog or DeepCrawl  can check the status of multiple pages simultaneously, providing a comprehensive overview of your site’s indexation. To find noindexed pages using Screaming Frog: Enter your site’s URL in the tool and click “Start.” After the crawl is finished, click on the “Directives” tab in the bottom window. Click on the “Filter” dropdown and select “Noindex.” The tool will display a list of all the pages on your site with a noindex tag. 
Best practices: How to use the noindex tag correctly Implementing noindex tags incorrectly can lead to unintended consequences, such as important pages being excluded from search results or search engines misinterpreting your site’s structure.  While it’s easy to implement a noindex tag, it’s also easy to do it wrong. Here are some tips to ensure your noindex tags lead to SEO improvements, not errors. Don’t block noindexed pages with robots.txt Include self-referential canonical tags on noindexed pages Regularly monitor site indexation Don’t block noindexed pages with robots.txt If there’s ever been a case for less is more, it applies to robots.txt and noindex tags. Specifically, the noindex tag only works if search engines can actually crawl the page.  If you use the robots.txt file to disallow search engines from a page entirely, they won’t be able to see and respect the noindex tag.  This can lead to a situation where you think a page is excluded from the index, but it actually still shows up in search results. Instead of using robots.txt to block noindexed pages, prioritize the noindex tag itself. Allow search engines to crawl the page so they can see the noindex tag and understand that it shouldn’t be included in their indexes. Include self-referential canonical tags on noindexed pages When you noindex a page, you can also include a self-referential canonical tag . This means adding a canonical tag that points to the page itself as the canonical (or ‘preferred’) version. It might seem counterintuitive to do this on a page that’s being excluded from search results, but it can actually help search engines better understand your site’s structure. Here’s an example of what a self-referential canonical tag looks like (if our example page’s URL was https://www.example.com/noindexed-page): <link rel="canonical" href="https://www.example.com/noindexed-page" />. Including this tag on your noindexed pages helps avoid potential confusion if the page is accessible through multiple URLs (such as with parameters or tracking codes). 
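Putting the two directives together, the <head> of the example page above would contain something like this (URL taken from the example; a sketch, not a complete head section):

```html
<head>
  <!-- Keep the page out of search engine indexes -->
  <meta name="robots" content="noindex">
  <!-- Self-referential canonical: this URL is the authoritative version -->
  <link rel="canonical" href="https://www.example.com/noindexed-page" />
</head>
```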
Without the self-referential canonical, search engines might choose one of these alternate URLs as the canonical by default, which could lead to unexpected indexing behavior. By specifying the page’s own URL as the canonical, you’re reinforcing the noindex signal and telling search engines that this specific URL is the authoritative version, even though it’s intentionally excluded from the index. Regularly monitor site indexation Even if you’re careful about implementing noindex tags correctly, mistakes can happen. A noindex tag might be accidentally removed during a site update, or a valuable page might get noindexed unintentionally. To catch these issues early, make a habit of conducting regular site audits . Tools like Google Search Console are invaluable for this purpose. In the Page Indexing report , you can see a list of all the pages on your site that Google has crawled and whether they’re indexed or excluded (and why). If you notice any important pages that are unexpectedly noindexed, or any noindexed pages that suddenly show up in search results, you can take action quickly to resolve the issue before it has a significant impact on your search traffic and business. Stay on the pulse with your noindex tags The noindex tag is just one piece of the SEO puzzle, but it’s a crucial one. By strategically using noindex tags in conjunction with other technical SEO tactics like canonicalization, structured data , and smart internal linking , you can create a website that’s not only search engine-friendly but also laser-focused on delivering value to your target audience. Just remember that, as with any SEO tactic, using noindex tags isn’t a ‘set-it-and-forget-it’ task. Have a system to monitor your pages, whether it’s through regular site audits or checking your Wix dashboard, and you’ll be on track to prioritize your site’s most important content. 
Vinnie Wong - Founder and Chief Strategist at Content Cartography   Vinnie is a content expert with over 5 years of SEO and content marketing experience. He's worked with Ahrefs, Empire Flippers, and is committed to crafting exceptional content and educating others on the symbiotic relationship between content creation and effective link building. Twitter  | Linkedin

  • SEO A/B testing: Experiment for superior title tags and meta descriptions

    Author: Jandira Neto Traditionally, SEOs have relied on advice and recommendations from Google in hopes that it would improve their organic traffic. Even for brands and agencies that rigorously follow these on-page SEO guidelines, the uncertainty can be tough to navigate, and sometimes you don’t even know where to start. With SEO A/B testing, you can take the guesswork out of website changes and make informed, data-backed decisions. By designing an experimentation program, you can run small tests that insulate you from risk while identifying valuable opportunities to optimize your site.  In this article, I will show you how to build a testing strategy that enables you to run A/B tests on your meta tags so that you can enhance your competitiveness in the search results and bring in more organic traffic. Table of contents: A/B testing: The fundamentals What is A/B testing for SEO? How A/B testing works in SEO The benefits of A/B testing for SEO A/B testing strategies for title tags and meta descriptions Experimentation programs Identifying title tag test opportunities Identifying meta description test opportunities Crafting variations of title tags and meta descriptions Conducting A/B tests and gathering data Case studies and real-world examples 3 success stories from A/B testing title tags 3 success stories from A/B testing meta descriptions How to A/B test and optimize your meta tags on Wix A/B testing: The fundamentals For those newer to A/B testing, here’s some crucial context that will guide you throughout the process and help you better explain it to teammates and stakeholders. What is A/B testing for SEO?  In the context of SEO, A/B testing is a methodology in which you compare the impact a site change has on two statistically similar web pages.  A/B testing is not new to the world of marketing. 
American advertiser and author Claude Hopkins  pioneered this method by conducting the first documented A/B test, in which he looked at the rate of return to measure the impact of his experiment with two distinct promotional coupons.  Fast-forward to the modern day: A/B testing has expanded to all types of digital marketing (including SEO), giving rise to a variety of SEO testing tools, including SearchPilot  (the one I work for), SEOTesting, and more.  How A/B testing works in SEO  To begin A/B testing (or just about any SEO testing ), you should first split your web pages into one of two subsets (also referred to colloquially as “buckets”):  One subset of pages serves as the control pages —you will not make any changes to these pages so that they can serve as a baseline for comparison. The other subset serves as the variant pages —you will test either an off-page or on-page SEO change to the pages in this group.  This way, you can see how these changes affect organic traffic in a controlled, repeatable way.  What’s the difference between user testing and A/B testing?  The main difference between user testing and A/B testing is that SEO A/B testing tests how Googlebot  responds to your changes.  User testing, on the other hand, uses cookies  to test the behavior of your real-life site visitors.  In A/B testing, you cannot show Google different versions of the same page (as that would constitute cloaking). To avoid this, SEOs split pages, not users. Users will see the same page every time.  The benefits of A/B testing for SEO More and more website owners and digital marketers are adding A/B testing to their SEO strategy, and here’s why: A/B testing helps you make data-driven decisions that benefit your business/website. Long gone are the days of relying on assumptions or half-baked competitor analysis  to anchor your marketing strategy.  
Instead, you can run a small-scale A/B test to assess the ROI  on a series of optimizations (enabling you to take on a fraction of the risk compared to implementing sweeping changes all at once without testing). If it goes well, you can bring a stronger case to your stakeholders to get their support for your recommendations.  Another thing you might really appreciate is the agility and creativity that comes with A/B testing: You can test on almost any part of your site—from title tags and meta descriptions to schema markup and URL structures.  Once you pinpoint an area of your website that you would like to optimize, you can run an A/B testing experimentation program. This is an iterative process that can help your site stay competitive and potentially improve your organic traffic. A/B testing strategies for title tags and meta descriptions Meta tags help Google understand what your website is all about. Google uses this important information to help determine what it displays in the search results.  To that end, Google  has said that “it’s important to use high-quality title text  on your web pages.” Likewise, it’s also important to use relevant, high-quality copy in your meta description : “Google will sometimes use the [meta description] tag from a page to generate a snippet in search results, if we think it gives users a more accurate description than would be possible purely from the on-page content.” — Google, “ Control your snippets in search results ” Meta tags are a great element of your site to test on, but how do you strategize your testing? Where can you start? As an SEO consultant, I have tested on a wide variety of website sections with a wide range of customers, so I am no stranger to test ideation and building SEO testing strategies. When we discuss strategies, customers normally come to me with their SEO problems, and I, in turn, seek to prove the value of SEO by supporting them with a successful experimentation program. 
Now let’s take a look at how you can build an A/B testing experimentation program for your own website. Experimentation programs As we dive deeper into SEO A/B testing strategies, a pivotal type of strategy emerges—experimentation programs. This strategic approach plays an important role in running SEO A/B tests that will extract learnings about what works and doesn't work for your website.  Let's explore what an experimentation program entails and how you can build one. What is an SEO experimentation program? An experimentation program is a structured approach to testing a range of SEO hypotheses in order to: Learn more about how your site performs under different strategies Improve key performance metrics Make more informed, data-driven decisions As opposed to testing to “just see what happens,” you are creating a personalized testing strategy to solve your SEO problems.  Your experimentation program will uncover useful insights into the organic performance of your site and the ROI of the SEO tools you are currently using. By the end of the program, you should be able to look back at a portfolio of tests and see how much time and money you saved by eliminating risks associated with negative changes. A drop in your organic traffic from negative changes could massively damage your sales/conversions. You will also be able to see how much time and money you saved your engineering team by only deploying winning changes. How to create hypotheses for A/B tests The best testing strategies are cohesive and strive for one solid goal. Even though SEO is ever-changing, the idea behind an experimentation program is that the testing strategy should be cohesive—not purely reactive. You can create hypotheses based on existing strategies, known problem areas, or research in your industry. In that hypothesis, you should include: What changes you plan on making What pages you will change The projected impact on organic traffic  Note: CRO testing measures users and their behaviors. 
This is a separate testing methodology and set of considerations, but you should consider conducting CRO testing if your conversion rate has decreased despite otherwise solid website performance. Going back to our example of finding the most well-optimized title tag, a good example hypothesis for this testing strategy might be: “We want to find the most well-optimized title tag (as the title tags on our site are low quality). We will test changing the title tag to feature more keywords on a subset of product pages, therefore improving our ranking for new keywords.” This hypothesis is not just for one test, but for as many as you like. You can iterate until you land on the best one. How to build a reliable SEO experimentation program The entire point of building an SEO testing program is so that you can obtain repeatable results. You can achieve a reliable experimentation program in four steps: Ensure your hypothesis is aligned with your high-level goals.  If your digital marketing goals are to increase brand awareness via search rankings, for example, then your hypothesis might involve optimizing your title tags. Come up with a handful of variations to achieve your goal. Create a couple of title tag variations (based on keyword research in your industry) that are likely to improve your organic traffic and, in turn, your existing search rankings.  Start SEO A/B testing, record observations, and analyze your data.  Run your A/B tests for one to two weeks and observe the behavior of your variant page. Has organic traffic increased, decreased, or stayed the same? Look at how your pages appear in the search results and take note of any visibility changes. After you draw conclusions from your test, come up with some new ideas to iterate and improve.  You can create a fresh variation of your test(s) (based on your findings) and repeat the process until you identify the best method for optimizing. 
Identifying title tag test opportunities Review your current title tags and page performance to identify whether there’s potential for improvement:  Does it align with Google’s recommendations?  Is it competitive?  And, think about how it could be better for Googlebot (more on this in the example below). So for example, this VR arcade business has a title tag (“Page 1”) that gives no information about the content of the page, so Google will have trouble understanding it. There is an opportunity here to make the title tag more descriptive and reflective of the page’s content. It’s worth running an experimentation program and finding ways to optimize your title tag if it is:  Generic (like the example above) Vague Stuffed with keywords Misaligned with user search intent There are a few automated tools that can help you spot title tags that are not performing well, like WebFX's SEO grader . This tool gives your title tag a score to gauge its effectiveness, providing a baseline for A/B testing. However, I personally prefer manually inspecting title tags, comparing them to competitors and drawing on industry knowledge. There’s always room for improvement and optimization. By staying informed about your industry, you can determine if your title tag isn’t performing at its best. Identifying meta description test opportunities The same idea (discussed in the section above about title tags) also applies to meta descriptions.  Again, you’re looking for meta descriptions that are generic, vague, stuffed with keywords, and/or don’t reflect search intent. The example VR arcade business’s meta description (above: “Come read the Game Catalogue”) lacks specificity and fails to communicate the content of the page. Crafting variations of title tags and meta descriptions Now that you have identified the opportunities, you can craft variations for the experimentation program using keyword research , competitor analysis , and a user-first mindset . 
Instead of these title tags, let’s test these variations:
“Page 1” → “Explore Virtual Realities at AI Arcade | Top VR Gaming in the UK | Multiple Locations for Endless Fun!”
“Locations” → “Our AI Arcade Locations | Find Nearby Venues for Immersive VR Experiences”
“The Wrestling Game” → “Step into the Ring: Immersive VR Wrestling Experience at AI Arcade”
Instead of these meta descriptions, let’s test these variations:
“Welcome to AI Arcade, Best VR, Affordable Games, Top Quality” → “Welcome to AI Arcade: Immerse Yourself in Cutting-Edge VR Experiences at Multiple UK Locations. Affordable Games, Unparalleled Quality, and Endless Fun Await”
“Come read the Game Catalog” → “At AI Arcade, We Have A Diverse Catalog for Thrilling Adventures and Immersive Experiences. Choose Your Next Virtual Journey Now!”
“We Offer A Gladiator Wrestling Game In Our Five Game Options” → “Dive Into Thrilling Virtual Reality Combat As You Challenge Friends In A Gladiator-Style Arena. Experience Intense VR Wrestling Right In Your City!”
Conducting A/B tests and gathering data Similar to traditional SEO best practices, you will implement the change and monitor performance before deploying the new changes to your whole site. This will help ensure the integrity of your A/B tests. Analyzing and interpreting test results While testing, you should monitor performance to see how the change you made impacts your organic traffic. You can track these performance changes in Google Analytics 4 , Google Search Console , your CMS’s built-in analytics, or various other third-party tools.  The example above is from the SearchPilot platform. You can see that the test ran for a number of days and resulted in a 5.2% uplift in organic traffic. The customer won 797 extra sessions by making this SEO change. The SearchPilot platform takes into account many external factors that might affect the organic traffic of your website, such as algorithm updates  and seasonal changes.  
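As a sanity check on numbers like these, the uplift arithmetic is easy to reproduce yourself. Here’s a quick Python sketch (the baseline of 15,327 sessions is my own back-calculation from the 5.2% / 797-session figures above, not a number reported by the platform):

```python
def percent_uplift(control_sessions: float, variant_sessions: float) -> float:
    """Percent change in organic sessions for the variant vs. the control."""
    return (variant_sessions - control_sessions) / control_sessions * 100


# Back-calculated illustration: 797 extra sessions on a ~15,327-session
# baseline works out to roughly a 5.2% uplift.
control = 15_327
variant = control + 797
print(f"{percent_uplift(control, variant):.1f}% uplift")  # prints "5.2% uplift"
```

Unlike a dedicated platform, this naive calculation doesn’t control for seasonality or algorithm updates, which is exactly why the before-and-after method is only a starting point.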
Although you can use any of the SEO tools I mentioned (among many others), not all of them have features specifically to support A/B testing and sophisticated analyses. If your tools don’t offer those features, you can start off with before-and-after testing. Bear in mind that this method does not factor in algorithm updates or seasonal changes, but at a bare minimum you will be able to keep records of: Baseline performance (your control group/web page) The exact change you made and the date you made it The date the test ended and the impact(s) on performance Case studies and real-world examples A/B testing can revolutionize website performance and prove the value of SEO in ways that standard, untested site changes can’t. Let’s go through some successful examples that my team and I (at SearchPilot) have tested that you could try emulating on your site. 3 success stories from A/B testing title tags 01. Adding “The Best” to the title tag Optimizing your title tags can impact the overall organic traffic your website brings in by influencing click-through rates and introducing new keyword rankings.  Once you see a change in your organic traffic, you can identify what influenced the change by reviewing search results for your industry. Here’s an example that transcends industries: The hypothesis for this test was that adding “The Best” to the beginning of title tags could help produce better click-through rates and generate a positive impact on organic traffic. We ran the test and saw a 10% uplift in organic traffic. By deploying the change, the site could see an extra 11,000 organic sessions per month. Google respected the change and showed the new title tags in search results (remember, Google may opt to rewrite your titles in search results  if it thinks they’re not relevant for the user). This change had an amazing impact and is easy to implement on just about any site in nearly any industry. 02. 
Adding a question to the title tag An informational query is a search in which the user is looking for an answer to a question. Their intent is to know something (hence these types of searches are sometimes called “know queries”). If your site is content heavy, you should optimize it to align with informational queries so that you can establish your site’s expertise in the industry. This, in turn, can help improve your organic traffic. In conducting this test, we hypothesized that by increasing the occurrences of the targeted keyword and structuring queries as questions, we could enhance the page’s relevance and better align it with user search intent. The aim was to boost existing rankings and improve the organic click-through rate. The change was small but mighty. We saw a 5% uplift in organic sessions. Google was able to answer users’ “know queries” because the question users were searching was in the title tag. 03. Appending brand name and locale to the end of the title tag Small, agile changes can improve your organic traffic. The hypothesis behind the test was that adding the brand name and location to the end of the title tag would help improve organic traffic by increasing the pages’ relevance, visibility, and trust. But I did fear that the title tag would become too lengthy and that Google would cut off the brand name and locale in the search results, which could have a negative effect on organic traffic. When Google truncates title tags in search results, the text can end up not being visible to the user. Thanks to our customer’s clear and concise title tags, there were no cases where the search results got cut off, and the results showed that this small change brought in an impressive 9% organic traffic uplift. This result backs up my idea that a small but agile change made the pages more noticeable. By including the location, we gave Google the information it needed to prioritize our site for users in that locale.
Also, adding the brand name didn't just make the site more relevant to local users; it also sent out strong signals of authority because the brand is well-known. 3 success stories from A/B testing meta descriptions 01. Adding third-party ratings to your meta descriptions If you have good user reviews, you can make them work for your website. The hypothesis for this test was that adding third-party reviews to the meta description could help enhance E-E-A-T signals for improved rankings and/or better click-through rates. The new meta description helps instill an element of brand trustworthiness and entices users, potentially leading to an uptick in organic traffic and click-through rates. Google respected the new meta description and showed it in the search results. Although the test was not statistically significant, it resulted in a positive correlation. This means there were small uplifts in organic traffic that were not strong enough to reach the 95% confidence threshold, but the customer classified it as a positive test and rolled it out on their site. Testing this out on your site could be the beginning of an iterative process. 02. Removing the meta description altogether Sometimes your meta tags can do you more harm than good (i.e., when they’re irrelevant or poorly optimized). An easy way to test this is by removing them altogether. The hypothesis behind this test was that the site’s web pages featured low-quality and generic meta descriptions. So, we tested removing the meta descriptions entirely to allow Google to select a snippet from the page’s text that it deemed useful. The test was positive at an 80% confidence level, meaning there is a 20% chance that, if this change were rolled out across the site, it wouldn’t have the same impact. This might be the easiest of all the examples to implement.
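To make “confidence level” concrete: testing tools estimate the probability that an observed difference is real rather than noise. A minimal, simplified illustration is a one-sided two-proportion z-test on click counts (the numbers below are invented, and real testing platforms use more robust models than this):

```python
import math

def z_test_confidence(control_hits: int, control_n: int,
                      variant_hits: int, variant_n: int) -> float:
    """One-sided two-proportion z-test: returns the confidence (0..1)
    that the variant's rate is genuinely higher than the control's."""
    p1 = control_hits / control_n
    p2 = variant_hits / variant_n
    pooled = (control_hits + variant_hits) / (control_n + variant_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = (p2 - p1) / se
    # Standard normal CDF, built from the error function (stdlib only)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical clicks out of impressions, control vs. variant pages
confidence = z_test_confidence(980, 20_000, 1_060, 20_000)
print(f"Confidence the change helped: {confidence:.0%}")
```

A result of 0.80 would correspond to the “80% confidence level” described above: the change looks positive, but there is still a 20% chance the apparent lift is noise.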
You can learn what Google classifies as important on your page and then test each variation of the verbiage it populated as your meta description in an experimentation program. 03. Adding price to your meta descriptions To remain competitive in search results, you need to test different approaches. Most search listing snippets contain standard text; how about adding some numbers to your meta description to make yourself look different and stand out? In this test, we experimented with adding the lowest price deal into the title tag. Prior to this, the title tags did not feature any pricing details. The hypothesis was that by adding the price, we could help boost click-through rates by attracting users who are seeking the best deals and improve search results performance by including relevant information. The test ended with a huge 12% increase in organic traffic. Test adding prices to your title tag, whether it’s the lowest price available or an average for the products you sell. This user-first approach could help you improve your organic search traffic from customers who search for prices. How to A/B test and optimize your meta tags on Wix Wix website owners can see an overview of all their title tags, meta descriptions, page/product names, and URLs for a given page type (i.e., blog posts, products, events, etc.) in the Edit by Page section of the Wix dashboard (SEO > SEO Settings > [desired page type] > Edit by Page), as shown below. From here, you can make changes to any of your title tags and meta descriptions (without having to open each page individually) by clicking on the three dots to the right of the desired page, as shown below. You can also generate title tag and meta description suggestions based on your page content using our AI meta tag creator (accessible via each individual page or the Edit by Page section).
You can also use the AI meta tag creator to refine suggestions  for your brand/audience, expanding ways you can test your meta tags for the best performance. SEO A/B testing is your roadmap to iterative performance improvement SEO A/B testing helps you answer a simple question: “What should I optimize next?” Although some creativity is involved with your optimizations, testing largely takes the guesswork out of your strategy, which keeps you on track to iterative performance improvements. Once you’ve gained some experience with this process, you’ll be able to work smarter and faster by testing small (to manage risk) and applying those changes at scale, enabling you to get the most out of the SEO you’re already doing for your business (or your clients’ businesses). Jandira Neto - SEO Testing Consultant at SearchPilot Jandira is a technical SEO A/B testing consultant. She works to prove the value of SEO for the world’s biggest websites, delivering profitable, attributable results. She also enjoys staying on top of SEO industry news and providing SEO advice to small minority businesses. Linkedin

  • Foster an education-first culture at your agency for better authority, business, and retention

    Author: Christine Zirnheld Generative AI, third-party cookie deprecation (eventually), Google algorithm updates, broadening match types—digital marketing moves fast. To stay ahead and satisfy clients, your team must embrace ongoing learning and new skills.  Without a culture of continuous education, your agency risks falling behind. In this article, I’ll delve into the training methods digital marketing agencies can explore to foster perpetual learning among employees, highlighting its crucial role in agency success, including: What it means to have an education-first culture at a marketing agency The business benefits of continuous education for agencies How to get the most out of employee training How to make learning a regular part of your agency’s week What does an “education-first” culture look like for marketing agencies? Your employees don’t need an MBA to grow their skills and become better marketers. In fact, at Cypress North, the agency where I work, we no longer require a bachelor’s degree from potential hires. We’ve found that the best education is what we learn from each other and real-life client experiences.  However, that doesn’t mean you should work on client projects for 40 hours a week. When we only focus on client work, we miss out on opportunities to diversify our skills and learn strategies for new verticals.  So, what does continuous learning look like at an agency? It requires breaking your teams out of siloed client groups to come together, collaborate, and share learnings. Our agency has various types of “training” that occur every day. These can look like hands-on working sessions or more formal training seminars. We’ll dive into the complete list of our weekly trainings later in this article.  The benefits: How education improves your agency as a business and as an employer Education is an investment in the future. 
It may seem like a sacrifice to take hours away from client work right now, but over the long haul, an education-first culture at your agency means: Lower cost and more flexible teams Long-term client success Employee retention, growth, and fulfillment Training content that promotes agency authority Lower cost and more flexible teams As your agency grows, you’ll need to decide whether to hire more experienced or greener digital marketers to join your team. Professionals with longer resumes bring valuable experience, but you’ll also pay more upfront for senior-level employees. And, you won’t know if they’ll work with the same values and strategies as your agency requires until they’re already part of the team.  Hiring less-experienced applicants requires a smaller initial investment than seasoned marketers. Another benefit is that you can design the perfect team for your agency, focusing on the skills and knowledge most helpful to your clients.  However, because these greener marketers have less experience, they’ll have less hands-on knowledge to apply to client accounts.  To enjoy the benefits of growing a team from the ground up, investing in continuous education and training is a must. Long-term client success Because our days are focused on clients, finding time for hands-on learning is often difficult. While carving out time for continuous learning can  mean taking time away from client work, it leads to better performance in the long run.  As an agency, your greatest competitive advantage is the ability to see what performs across multiple accounts. This valuable knowledge should be shared across client teams to improve performance across your agency. By siloing your learnings, your team may bill more hours, but miss critical insights and lessons that lead to the ultimate goal: performance and growth. Employee retention, growth, and fulfillment When good employees don’t see opportunities for growth at your agency, they leave. 
By providing continued education, employees can expand their knowledge and diversify their skill set.  Exposing your teams to new challenges, tools, and strategies outside their dedicated client work shows an investment in your employees’ careers while elevating their capabilities for your agency.  Training content that promotes agency authority It may feel like the best way to stay ahead of other agencies is to guard your expertise. At Cypress North, we’ve found the opposite to be true. If a topic is challenging our team, chances are other marketers are challenged by it as well. Whenever someone on our team brushes up on a topic to lead a training, there is an opportunity to create educational content to promote our brand.  Examples include: Blog posts Video tutorials Webinars Downloadable resources , checklists, or planning templates Conference presentations  Podcasts   Social posts  E-books  Content like this establishes your team as industry thought leaders, which helps you attract clients as well as top talent to join your team. Our agency’s podcast, Marketing O’Clock , is an excellent example of this content strategy. We release weekly digital marketing news episodes and more evergreen content like tutorials and roundtable discussions. This is a winning strategy for both Cypress North and our customers:  Our team benefits from forcing ourselves to stay up-to-date on the latest news and updates. Because we’re on the cutting edge of industry trends, we’re better positioned to exceed KPIs and keep clients happy. Potential clients find our content online and see Cypress North as a leader in the industry. As established thought leaders, we attract high-performing digital marketers to join our team, which brings even greater value and performance to our clients. How to get the most out of employee trainings  Training and education can be a significant time investment. 
Below are some strategies that our team employs to ensure that learning sessions are time well spent. Put learning on the calendar Sometimes, simply showing up is the most challenging part of training your team. As an agency, we are bogged down with weekly tasks, reports, and client meetings. To instill a culture of continuous learning, you must hold your teams accountable by scheduling regular training sessions, webinars, and working sessions. At Cypress North, we put learning on the calendar every day. Below is a training schedule for a typical week at our agency, including learning opportunities that we will discuss in depth later in this post.

- Monday: Marketing team standup
- Tuesday: Account maintenance PPC training
- Wednesday: News podcast recording
- Thursday: Hands-on SEO training & PPC optimization training
- Friday: Digital Marketing University

We add these placeholder meetings to our calendar every week, but these meetings can all have varying topics (depending on what projects we’re working on or if there is a trending topic that we need to cover). We do our best to ensure the team knows what will be covered during each event as early as possible, so they can plan their day accordingly. Who should attend? Before scheduling meetings, decide who on your team needs to attend training. At our agency, we invite every marketing team member to attend every learning opportunity. While these meetings are only required for coordinator- and associate-level employees, more senior members are encouraged to join if they have time. As mentioned earlier, learning isn’t only for entry-level digital marketers. Even a veteran marketer with years of experience can benefit from learning about a new feature, product, strategy, or tool. Plus, because we aim to make training interactive, the rest of the team can benefit from hearing their point of view, too.
Utilize your entire marketing team Cypress North is a mid-sized agency; we don’t have employees dedicated solely to training staff. We divide training responsibilities across team members to prevent training from becoming a burden, and we don’t only use senior employees. The primary benefit to this is that nobody has to spend excessive time prepping for or leading training, but there are many additional perks to this approach. Exposure to diverse perspectives and approaches: An unfortunate side effect of agency growth is that teams become more siloed, focusing specifically on their clients. There are members of the marketing team at Cypress North that I have never had the pleasure of working with even though we’ve both been here for years. Mixing up the leaders of training sessions allows us to get varying perspectives on challenges and enables our team members to hear from everyone, not just those on their same client team. Beyond continuous learning, breaking out of our client teams helps us get to know each other better. This leads to better camaraderie among the team and benefits our agency culture. Sometimes, the perspectives and knowledge of greener team members are the most valuable of all. As a leader on client accounts, I often find myself caught up in reports, deliverables, and client relationships. Less-experienced marketers often spend more time working with Google Search Console and discovering new SEO or AI tools to help our clients. When we have the opportunity to combine these fresh ideas with years of client experience, the confluence of skills can drive incredible results. Leadership training for junior staff: When a coordinator or associate leads a training exercise, they aren’t just passing on what they learned, they’re also growing their own leadership skills. Learning sessions are an excellent opportunity for greener employees to grow more confident in public speaking and other management skills.
Limit distractions When I attend a virtual learning session, I’m often distracted by emails, reports, and Slack messages. Remote training is tough because it’s hard to focus on learning when you have your day-to-day tasks on the screen in front of you. Most of my team works in-office, so we have the option to hold in-person meetings. However, we have two offices in different cities. For the larger training sessions (where we try to get the whole team together), we cannot get everyone in the same room. So, we utilize a hybrid approach. Here is an example of how our hybrid training works: whoever hosts the meeting will sit at their computer (in their office) and share their screen. The rest of the team gathers in a separate conference room and pulls the meeting up on a big screen. This approach allows whoever is in charge that day to share slides or a site they are troubleshooting on the big screen, while the rest of the team engages in the meeting. The team members in the conference room don’t get distracted by their day-to-day tasks and feel more comfortable speaking up and sharing insights because they are attending the meeting in-person. And, anyone working from home can still participate in the meeting, or we can record the sessions for new employees to watch later. Make training interactive To get the most out of training, it’s vital that your team is engaged and excited about the topic. Through trial and error, we’ve identified some strategies to help us achieve this goal. Use real client accounts: Two of our most valuable training sessions are hands-on account maintenance training and client-specific working sessions. In these trainings, the session leader gets the team together to work through a particular challenge or project. We encourage our team to bring their own client challenges to these sessions to benefit from other team members’ perspectives, including upper management and coworkers that do not work on that client account.
This keeps our team more engaged because the knowledge they acquire will impact their actual client performance. It also allows us to feed two birds with one seed, training the team while executing deliverables for our client.   Ask for feedback: Instead of just delivering a lecture to the group, we involve the whole team to keep everyone as tuned-in as possible. Before showing the group how to approach a problem, the session leader asks everyone what they would do. This strategy keeps everyone engaged and lets us get varying perspectives on the same problem. Record everything  Record every marketing training session and upload it to a shared folder. This adds another step to the process, but your future self will be grateful when you have a library of training videos and educational materials for new hires. Recording meetings also allows anyone who missed the training to go back and review. Plus, if there are any questions on the topic in the future, your team can easily reference the video. Make learning a scheduled part of your agency’s week The different types of “training” that your team can engage in each week include (but aren’t limited to): Weekly marketing team standups Working sessions Industry certifications Marketing 101 trainings Book clubs Industry news Weekly marketing team standups  At Cypress North, we gather the entire marketing team for a 30-minute call at the beginning of every week. This allows us to:  Update the team on company matters Check in on workload for the week ahead Share learnings from the previous week Every team member concisely shares something that they learned. This doesn’t have to be something “new” or groundbreaking. Chances are, if a tool, feature, or strategy is new to one person on the team, it will be new to someone else.  This is a quick exercise, but it challenges us to continue to grow our skills every week and pass our discoveries on to the rest of the team.  
What seems small to one person could be a game-changer for another client’s performance. Working sessions  I’ve found that hands-on learning is the best way to grow my skills and confidence. Regularly scheduled, client-specific “working sessions” allow the entire client team to collaborate, strategize, and learn. One team member shares their screen and everyone puts their heads together to work on a specific project or challenge for that client. When we’re forced to set time aside to work together, it allows us to learn about new tools, strategies, and approaches to a problem.  Industry certifications From Google Analytics  to HubSpot, to Google Ads and more—there are plenty of digital marketing certifications for your agency to pursue. I’ve found that hands-on learning is a better approach to growth, but some potential clients value these platform certifications when choosing an agency partner.  We have employees take these certifications when they’re new to the team before they have a full client workload.  Marketing 101 trainings When teams are siloed into client accounts, they miss out on learning the strategies and tactics that don’t impact their clients. At Cypress North, we schedule “101” trainings every week.  One team member leads the call and covers a digital marketing topic. This could mean showcasing how to use a tool, troubleshooting Search Console reports , deep dives into Google Sheets tricks, or covering best practices. These trainings give our greener employees opportunities to learn skills even if they don’t apply to their current client accounts.  Book clubs Optional marketing book clubs are a great opportunity to challenge your team to think critically and learn. Content doesn’t have to be specific to agency disciplines (SEO, PPC, etc.); books about branding, sales, or leadership can also help with professional or agency growth.  
Industry news Staying up-to-date (and ahead of clients) on relevant industry developments is just as important as growing hands-on skills. Encourage your staff to share breaking news via Slack, weekly emails, or at weekly marketing meetings. Embrace continuous learning for agency success Prioritizing education enables your agency to stay ahead of industry trends and adapt quickly to changes, ensuring that your strategies remain effective and your clients are satisfied. Additionally, investing in the education and growth of your employees—regardless of their experience level—not only improves client outcomes, but also enhances employee satisfaction and retention. Ultimately, by fostering a culture of continuous learning, your marketing agency can achieve greater authority, drive better business outcomes, attract better candidates, and stay ahead of the competition. Christine Zirnheld - Senior Digital Marketing Manager at Cypress North Christine Zirnheld is a senior digital marketing manager at Cypress North, specializing in PPC. As a host of the Marketing O'Clock podcast, she covers breaking PPC & SEO news stories with lots of sass. Twitter  | Linkedin

  • Meta Ads for eCommerce: Tactics to increase sales

    Author: Akvile DeFazio   For eCommerce brands, the quest to attract customers and drive sales is an enduring pursuit. In this competitive environment, the ability to reach out to vast audiences, introduce them to your brand, and drive sales can make or break your online store. If those are the goals you want to achieve, then do not overlook Meta’s ecosystem (most notably on Facebook and Instagram).   Let’s dive into the various options that Meta Ads offers for eCommerce brands looking to increase sales as well as the tactics you’ll need to increase your bottom line. Table of contents: Why Meta Ads is a mainstay for eCommerce brands Getting started: The Meta Ads Pixel & Conversion API How to set up Meta Ads tracking How to set up Meta Ads tracking on Wix Create and connect your product catalog Establish baseline performance with dynamic product ads Sales-driven campaign types Advantage+ shopping campaigns Advantage+ shopping campaigns vs. Manual campaigns Maximizing sales volume and sales value Utilizing AI for targeting with Advantage+ audiences Meta Ads best practices Ad copy Images Video Ways to increase profitability for your Meta Ads eCommerce campaigns Why Meta Ads is a mainstay for eCommerce brands Meta Ads is an advertising platform that serves ads across several platforms under the Meta umbrella, including Facebook, Instagram, WhatsApp, and other partner sites and apps in the Meta Audience Network . As one of the oldest social media advertising platforms, Meta has significant global usage that benefits brands looking to drive sales and scale their digital marketing.  It is comparatively more efficient (read: cheaper) than other advertising platforms, leverages advanced targeting capabilities powered by AI, and excels in accomplishing various business goals (i.e., driving website traffic, increasing brand awareness, acquiring leads for your newsletter, and bringing in sales). 
Getting started: The Meta Ads Pixel & Conversion API To use Meta Ads’ full capabilities for your eCommerce brand, it’s vital to implement the Meta Ads Pixel on your website and the Conversions API (CAPI) on your server. While you can run campaigns without it, doing so helps you track and understand which campaign, ad set, and ads your sales are coming from and will allow you to optimize for sales. Without at least one of these, your campaigns will struggle to optimize for sales. The Meta Ads Pixel and CAPI also aid Meta in more effectively optimizing towards your sales-driven goals and can help create custom audiences for you to target, including retargeting people that visited your website or abandoned shopping carts. How to set up Meta Ads tracking If you're a Wix website owner, the process is straightforward—you can read more about it in the section below. If your online store is on a different CMS, refer to Meta's instructions on integrating its Pixel and CAPI. How to set up and implement the Meta Pixel How to set up and implement Conversions API How to set up Meta Ads tracking on Wix With Wix, you can seamlessly accomplish both of these tracking set up tasks via the Marketing Integrations  section of the Wix dashboard (shown below, accessible by navigating to your dashboard Settings > Marketing Integrations ). After you connect the Meta Pixel and CAPI to your Wix website , you’ll be able to track events like product views, add to carts, newsletter signups, and purchases. Next, you can optimize for those events, which tells Meta what goal is most important for you in a given campaign (e.g., purchases). Create and connect your product catalog Now that you’ve got your Meta Ads Pixel and CAPI configured for your eCommerce store, you can upload your product catalog for the platform to show your products to users.  
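As a brief aside on what the Conversions API actually receives: each server-side event is a small JSON payload POSTed to Meta's Graph API events endpoint for your pixel. The sketch below builds one hypothetical Purchase event. Field names follow Meta's documented CAPI event schema, but the email, URL, and values are invented, and a production integration would also send an event_id for deduplication against the browser Pixel:

```python
import hashlib
import json
import time

def hash_pii(value: str) -> str:
    """Meta requires customer PII such as emails to be normalized
    (trimmed, lowercased) and SHA-256 hashed before sending."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def build_purchase_event(email: str, value: float,
                         currency: str, url: str) -> dict:
    """Assemble a single Purchase event in the shape CAPI expects
    (simplified; real integrations send more user_data fields)."""
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "action_source": "website",
        "event_source_url": url,
        "user_data": {"em": [hash_pii(email)]},
        "custom_data": {"currency": currency, "value": value},
    }

# Hypothetical order data, not a real customer
event = build_purchase_event("shopper@example.com", 49.99, "USD",
                             "https://example.com/checkout/thank-you")
payload = {"data": [event]}
print(json.dumps(payload, indent=2))
```

On a Wix site the platform assembles and sends these events for you; the sketch is only meant to demystify what "implementing CAPI on your server" involves elsewhere.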
There are several options you can use to upload your products: Data feed (i.e., a spreadsheet or XML file) A partner platform The Meta Pixel Meta’s catalog batch API (better for stores with hundreds of thousands of products) Manually Again, Wix site owners can use the built-in integration to keep their Facebook Catalog in sync with their website inventory (or you can create your catalog in Meta’s Commerce Manager). Your product catalog should include all the essential details of the products you sell, such as the product description, materials, dimensions, pricing, etc. Establish baseline performance with dynamic product ads You can then promote your products using highly effective dynamic product ads (DPAs). Meta can serve DPAs showcasing products from your store that it “thinks” have the highest likelihood of purchase from the audience(s) you are targeting. When deciding which products to promote in your DPAs, consider setting up these product sets first to determine benchmarks: Bestsellers New arrivals On sale All products If you have seasonal product sets or product themes (e.g., sneakers, boots, sandals), create those product sets and test those out as well. I often see that bestsellers work very well to increase sales volume, as new customers are often drawn to popular items. Your best-selling products could be a great entry point to the brand, enticing people to shop. New arrivals also tend to perform well, as many consumers want to be one of the first to own something new. If you have ten or fewer products, running DPAs in a catalog campaign may not yield sales at desirable costs, as there is not enough variety for Meta’s systems to rotate through and optimize for. For eCommerce brands with limited product offerings, consider launching other ad types that don’t require a catalog until you have more products.
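A product catalog data feed like the one described above is often just a CSV with one row per product. The sketch below writes the columns Meta documents as required for physical goods (the product data is invented; check Meta's current feed specification before building a real feed):

```python
import csv
import io

# Columns Meta's catalog data feed requires for physical goods
# (real feeds can add many optional columns, e.g. sale_price, gtin)
FIELDS = ["id", "title", "description", "availability", "condition",
          "price", "link", "image_link", "brand"]

products = [  # illustrative row, not a real product
    {
        "id": "SKU-001",
        "title": "Trail Running Sneakers",
        "description": "Lightweight trail sneakers with a grippy outsole.",
        "availability": "in stock",
        "condition": "new",
        "price": "89.99 USD",  # Meta expects "<amount> <ISO currency>"
        "link": "https://example.com/products/sku-001",
        "image_link": "https://example.com/images/sku-001.jpg",
        "brand": "ExampleBrand",
    },
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(products)
feed_csv = buffer.getvalue()
print(feed_csv)
```

Wix's built-in catalog sync makes this unnecessary for Wix stores, but seeing the shape of the feed clarifies what the "data feed" upload option expects.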
Sales-driven campaign types There are currently six campaign objectives to select from:  Awareness Traffic Engagement Leads App promotion Sales Since we’re discussing Meta Ads to increase sales, you’ll use the Sales campaign objective for Conversions or Catalog Sales, then optimize for purchases at the ad set level. Optimizations selected in the ad set level can also be set for other events, such as “add to cart” (but since the primary goal here is to acquire more sales, you’ll optimize for purchases).  Next, let’s discuss Advantage+ Shopping Campaigns and whether a more automated approach is right for you.   Advantage+ shopping campaigns Advantage+ shopping campaigns (ASC) are a sales-driven campaign type designed for eCommerce brands. They leverage Meta’s AI technology to dynamically deliver ads to people that are most likely to make a purchase.  Advantage+ shopping campaigns vs. Manual campaigns While you are still able to set up campaigns via a more manual process, Meta is making a big push to use ASC so it’s possible that manual sales campaigns may disappear in the future (as they may on other platforms, like Google).  ASC uses a more streamlined approach compared to manually configured campaigns and their respective ad sets (where you specify budget, placements, optimizations, and other targeting criteria). This campaign type uses Meta’s technology to more effectively display your ads to the people that are most likely to make purchases (compared to a more human-selected targeting strategy).  Manual campaigns have historically worked better as advertisers tested their own selection of audiences based on interests, behaviors, and other detailed targeting combinations—until recently. While some of these detailed targeting options remain, Meta has removed many of them over the years due to changes in privacy legislation and technology.  
Manual targeting can still work well, but it is less effective than it once was—on the other hand, AI-powered targeting continues to prove its potency as it learns and evolves. I still recommend that digital marketers invest some time to get familiar with manual targeting, as this can help you better understand how automated targeting systems work and is valuable for determining what combination of automation and manual controls is best for your online store. Maximizing sales volume and sales value “ASC achieves an average of 17% improvement in cost per acquisition,” according to Meta. That’s a pretty flashy statistic, and I, too, have seen positive results since the launch of ASC: more sales at a lower cost and at a higher return on ad spend (ROAS), increasing profitability. Let’s learn how you can reach that level of campaign performance. With Advantage+ shopping campaigns, you can tell Meta to optimize ads for the maximum number of conversions or sales, and (as of the end of 2023) there is a new option to optimize for the maximum value of conversions, enabling you to provide an optional ROAS goal. You can find more details about this setting and how to use it at the end of this article. (Note: The ROAS option is only available when selecting your website as the conversion location. If you choose to maximize the number of conversions, you can alternatively choose for people to make purchases via your website and shop [via Facebook or Instagram], just your website, or website and app.) When setting up your first ASC, you can certainly test either optimizing for maximum conversions or sales, but you may be more successful maximizing for the number of conversions so that you can establish a baseline of data.
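Since the choice between those two optimization modes comes down to cost per acquisition versus return on ad spend, it helps to pin down both formulas. A quick sketch with hypothetical figures:

```python
def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per unit of currency spent."""
    return revenue / ad_spend

def cpa(ad_spend: float, conversions: int) -> float:
    """Cost per acquisition: average spend per conversion."""
    return ad_spend / conversions

# Illustrative month of campaign data (not from a real account)
spend, revenue, purchases = 2_000.00, 9_000.00, 180

print(f"ROAS: {roas(revenue, spend):.1f}x")   # 4.5x
print(f"CPA: ${cpa(spend, purchases):.2f}")   # $11.11
```

Maximizing the number of conversions drives CPA down; maximizing conversion value with a ROAS goal asks Meta to keep the first ratio above your target, even if that means fewer total purchases.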
Once you begin to drive more sales, see what is performing well (and what is not), and what kind of ROAS you can achieve, test out the other option to optimize for the maximum value of conversions with a ROAS goal. This goal should be slightly higher than what you saw in your initial campaign, so that the system can work towards the improved (yet likely attainable) new goal. In this secondary test, use your top-performing ads in the campaign.

If you have already run ads before, add your existing top-performing ads into your ASC, as they will likely give you a better chance for success right out of the gate (as opposed to launching ads that have not yet been tested).

Utilizing AI for targeting with Advantage+ audiences

There are a multitude of ways to reach your customers, including targeting by age, location, gender, interests, behaviors, and other demographics, as well as custom audiences that you can set up.

More recently, though, Meta introduced Advantage+ audiences, a newer option that utilizes artificial intelligence to find your campaign audience. I’ve been using Advantage+ audiences and have begun seeing more success than with some of the other aforementioned targeting options. With Advantage+ audiences, you can add an audience suggestion to give the system some guidance as to who you want to target before it expands to find more people who are likely to accomplish your campaign goal (e.g., maximizing for sales).

Meta works best when it has data to work with. You feed it data and, in turn, it brings you more results.
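Since ROAS goals come up repeatedly here, it may help to sketch the underlying math. The numbers below are hypothetical illustrations, not benchmarks:

```python
# Sanity-check math for the two metrics discussed above: CPA and ROAS.
# All figures are invented for illustration.

def cpa(ad_spend: float, purchases: int) -> float:
    """Cost per acquisition: average spend per purchase."""
    return ad_spend / purchases

def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per dollar spent."""
    return revenue / ad_spend

# Suppose a baseline campaign spent $2,000, drove 100 purchases,
# and generated $7,000 in revenue:
baseline_cpa = cpa(2000, 100)      # 20.0 (dollars per purchase)
baseline_roas = roas(7000, 2000)   # 3.5 (i.e., 3.5x)

# Per the advice above, the next ROAS goal should sit slightly
# higher than the observed baseline (here, ~10% above it):
next_roas_goal = round(baseline_roas * 1.10, 1)  # 3.9
```

Under these assumptions, a campaign with a 3.5x baseline ROAS would get a next goal of roughly 3.9x: improved, but likely still attainable.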
After testing a variety of suggested audiences, you may want to begin with these two suggested in-platform custom audiences, as they have been effective for many eCommerce accounts and may be a great starting point for yours as well:

Facebook – post/ad engagers – over the last 90 days (or up to the last 365 days)
Instagram – post/ad engagers – over the last 90 days (or up to the last 365 days)

The longer lookback window may give Meta more data to better optimize, especially if you are newer to advertising on Meta or have a lower budget.

To create these audiences for use in your campaigns, go to Meta’s Audience area to create a new audience. Under “Meta Sources,” select “Instagram account” first, and then come back to create the second one under “Facebook page” to cover both platforms.

On the next screen, ensure your account or page is correct under “Source,” then click the drop-down menu to select the “Event” you want, such as “Accounts Center accounts who engaged with any post or ad.” Then, select your lookback time under “Retention.” Name your audience and add a description so that other people working in your account (i.e., staff or other vendors like contractors and agencies) have some context about your custom audiences. Hit “Create audience” and you can now apply your Advantage+ audience to your ad sets.

Meta Ads best practices

Next, I’ll help you increase your campaign efficiency with some ad creative best practices, as well as a few tactics to test that can also help you increase your profitability (not just sales volume).

Ad copy

You have several places to compose complementary ad copy: the headline, the main text, and the description line. Be concise with your ad copy. Speak to the value of the product, its unique benefits, and the solution it offers to a particular problem, and add a call to action (CTA), such as “Shop Now” (in addition to selecting the same “Shop Now” CTA button that Meta appends to your ad).
If you offer free shipping, discounts, or promotions, mention that as well. Craft compelling copy for the text and headline, though you may want to forego description text, as it doesn’t display on most ad placements. Many brands add “Free Shipping” copy there; instead, consider adding it to the headline, as it’s bold and can be seen across more placements.

Images

When it comes to images, you want to shine a spotlight on your product and eliminate any potential distractions for the audience. Aim for bright, bold colors so they stand out against the black or white platform backgrounds. Showcase your product so consumers aren’t wondering what you are advertising: If you’re selling shoes, make sure your models or the creators you are working with don’t have distracting backgrounds or other apparel that could detract from your product and reduce the chances of a purchase.

Video

With video creatives, put your best foot forward in the first three seconds of your video:

Showcase your product.
Add overlay text.
Include an enticing hook (e.g., by asking a question or sharing something unique about your product, as that will keep people interested and watching more of your video).
Add captions so everyone watching can understand what is being said (many people keep their devices on mute, and you don’t want to alienate anyone who is hard of hearing).

If you don’t have video content, there are many free and paid tools that can take your images and turn them into videos by adding some subtle motion, overlay text, transitions, and effects. It’s worth trying out Meta’s free tool (in Ads Manager) that can turn your static images into videos via free templates and effects—this is an excellent way to repurpose existing content and take advantage of automation to create fresh creative on a budget.

Whether it’s video or images, it’s best practice to create two variations so that your ads will show optimally across most of the ad placements.
Create a 1:1 (or 4:5) aspect ratio image/video, as well as a 9:16 version to fit Stories and Reels placements. Refer to Meta Ads’ specs for Facebook and Instagram images and videos.

Ways to increase profitability for your Meta Ads eCommerce campaigns

As you begin to run Meta Ads and understand what is working, this is a good time to test out some other tactics to increase profitability (in addition to increasing sales volume). Here are a few tactics to experiment with:

Promote higher-margin products in ads, either by manually configuring ads or by setting up dedicated catalog product sets for more profitable items in dynamic product ads.
Test Advantage+ catalog ads in an ASC to give Meta more control over which products to serve in your dynamic product ads, so you can drive more sales and allow Meta to better optimize and potentially lift your ROAS.
Try the ad set-level ROAS goal setting, where you can gradually increase your ROAS goal and maximize the value of conversions. Set this up in the ad set with the following selections:

Leave no eCommerce campaign unoptimized

By using the above strategies (from setting up a detailed product catalog and deploying dynamic product ads to testing Advantage+ Shopping Campaigns and utilizing a more automated, data-driven approach), experimenting with various optimization settings, and serving ads to Advantage+ audiences, you can fine-tune your advertising efforts to focus on what truly resonates with (and converts) your audience. As Meta continues to evolve and present new features, embrace testing and crafting compelling ads, as that will solidify the foundation for long-term success in driving eCommerce sales and increasing your profits.

Akvile DeFazio - President at AKvertise

Akvile DeFazio is the president of AKvertise, an award-winning social media advertising agency.
With 16 years of experience, she works with eCommerce, lead gen, event, and entertainment clients to reach their goals through future-forward strategies. Twitter  | Linkedin

  • Anatomy of the SERP: A complete guide

Updated: June 7, 2024
Author: Mordy Oberstein

Google’s search engine results page is a complex and multi-layered ecosystem. What Google shows on the SERP for any given keyword can either significantly improve your chances of bringing traffic to your site or jeopardize those efforts, making this basic information critical for every site owner. Here’s a look at what the Google SERP has to offer and how it can impact your site’s organic traffic.

Table of contents:
What is the SERP?
The organic text results
Not all organic results are created equal
Example of an organic result with multiple elements
SERP features and paid results
01. Paid SERP features
02. Exploration features
03. Features that present organic opportunity
04. Features that don’t present organic opportunity
05. Local features and knowledge panels
The mobile SERP

What is the SERP?

To the average person, Google’s search engine results page (AKA the “SERP”) may not seem that complicated. What’s so hard to understand? The SERP is what appears on the screen when someone enters a query and Google returns a list of options to choose from, right? Yes, but not exactly. Google’s SERP can be broadly divided into three categories:

The organic results listings
SERP features
Google Ads

For many years, Google’s “rankings” referred almost exclusively to the order of the organic listings. Today, Google’s SERP has become more complex, including a mix of listings, features, and ads throughout the search experience.

The organic text results

Let’s start with the most fundamental element on the SERP—organic results. Organic results are the list of websites we’re all used to seeing Google display when we search for something. They’re called ‘organic’ because the sites Google displays don’t pay to appear on the SERP. They appear because Google, for a host of reasons, thinks these are the best results to show a user for a specific keyword. Organic results are typically easy to identify.
They include the page’s URL along with a clickable title that sits above a description of what can be found on the page. That’s not to say this is how organic results have always looked on the SERP. Past versions of Google’s organic results included the page’s URL showing in green and the title resting above the URL. This means that you can expect the appearance of the organic results to evolve in the future as well. In fact, organic results look a bit different on mobile. The most notable distinction is that mobile results contain a favicon. (The mobile and desktop versions of the SERP differ in many ways, and we’ll get to that later on in this article.)

Not all organic results are created equal

It all sounds pretty simple and straightforward, but it’s not. That’s because Google employs what is known as rich results. Rich results can include all sorts of additional information and even visuals. Consequently, a rich result can be far more noticeable, and therefore more clickable, than your “standard” organic result. Take this result from Edmunds.com, for example. It is visually distinct from standard search listings and provides users with a preview of the tutorial.

A rich result like the one above takes up a lot more real estate on the SERP. That means it can be significantly more noticeable than your “average” organic result. The more noticeable it is, the more clicks it might get (in theory). The additional content and the amount of space it occupies on the SERP is only one advantage that a rich result presents. At times, rich results present visual elements that make them stand out from other results on the page.

Example of an organic result with multiple elements

Let’s take the result below as an example. It contains review stars as well as an image thumbnail. Imagine this was the only result on the page with review stars and an image thumbnail: wouldn’t it immediately stand out?
What if your site was the only one without these elements? How much harder might it be for your page to get noticed and attract visitors? As is the case with standard organic results, rich results also look different on mobile than on desktop. How can you turn your organic result into a rich result? The short answer is by using structured data.

SERP features and paid results

Believe it or not, we haven’t even scratched the surface yet. Along with organic results, Google displays what is commonly referred to as “SERP features” on the results page. This is where our story gets a bit complicated. There is an almost countless number of SERP features that Google employs. Sometimes these features can take on various forms or include any number of secondary elements. In fact, there are often SERP features within other SERP features.

So, what is a SERP feature? Google describes these as “visual elements” that are “the building blocks of the Google Search results page that a user can perceive or interact with.” When SEO specialists discuss SERP features, they are often referring to any element that is not an organic listing but that offers the user content or leads the user to new content.

Sounds a little confusing, doesn’t it? Have a look at the image below. Do you see all of the elements placed within the red boxes? Those are all SERP features. As you can see, the results page can contain a heap of SERP features. There were also one or two things within the organic results that could technically have counted as SERP features.

Not only are there a significant number of SERP features that can appear on any given results page, but there are also far-reaching implications as to why they’re there. Instead of rattling off the dozens of SERP features, let’s instead try to categorize them. In doing so, we’ll get a better understanding of the various types of features as well as what they mean for your site.
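As noted above, structured data is what powers rich results. As a minimal, hypothetical sketch (the product name, URL, and rating values are invented for illustration), schema.org Product markup that could make a page eligible for review stars and an image thumbnail looks like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "image": "https://www.example.com/images/trail-shoe.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
```

Adding markup like this (usually as a JSON-LD script in the page’s HTML) makes a page eligible for rich results, but Google ultimately decides whether to display them.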
The categories we explore are not part of any official breakdown. Also, as you will see, there are some features that don’t fit well into these categories. Still, categorizing the features Google shows will help us quickly get a sense of the complexity they bring. Here are five categories of SERP features you may commonly come across in the search results:

01. Paid SERP features
02. Exploration features
03. Features that present organic opportunity
04. Features that don’t present organic opportunity
05. Local features and knowledge panels

01. Paid SERP features

Paid elements on the SERP are the exact opposite of organic results, i.e., these are results that are present because businesses have paid to advertise for those searches. Despite this, they may very much look like organic results. At times, the only thing that makes a paid result distinct from an organic result is the word ‘Sponsored.’ What you see above is an example of what is known as a Google Search Ad. Like organic results, these ads come in various shapes and sizes. There are various elements that are, at times, added to Search Ads to make them more visible.

Search Ads that appear on mobile may look different. These differences include elements that allow users to call the business directly from the ad, the ability for the user to send the business a text message, the insertion of image thumbnails, and more.

It’s important to note that, while Google used to show ads to the right-hand side of the organic results, as of 2016, this is no longer the case. Search Ads now appear either above or below the organic results. That means the first thing a user often sees is not your site (even if it is ranking 1st organically), but an ad. This is why it is important to keep tabs on the ads that appear on the same SERP as your organic results.

There is another ad format to be aware of: Product Listing Ads (PLAs).
PLAs are often a scrollable carousel of products that presents an image of the product, reviews, and even information related to shipping or sales. Unlike Search Ads, it’s possible for PLAs to appear to the right-hand side of the organic results on desktop. It’s entirely possible for all of the above-the-fold content on the SERP to be sponsored, with additional sponsored content to the right-hand side of the results. Such is the case in the image above. Certainly, it can be hard to compete on a SERP like this. There are a whole host of places where sponsored content may appear on the SERP. At times, the presence of sponsored content may be more or less obvious. In either case, it is important to understand the competitive landscape paid results might present to your site.

02. Exploration features

Let’s move away from the impact of paid SERP features and discuss navigational ‘Exploration features.’ This group of features helps users navigate to additional content or even the content they initially intended to find.

Disambiguation box

Imagine you searched for the term ‘rangers.’ Did you mean the Texas baseball team? The NY hockey team? The Army Rangers? It can be hard for Google to know. So, Google offers a Disambiguation Box. Clicking on an item inside of the box takes you to a new SERP about that topic. What we have here is a SERP feature that helps users navigate to either additional content or the correct content they originally wanted.

Related Searches

The ‘Related Searches’ feature is a set of additional search terms that are related to the one the user originally searched for. Google often displays these at the bottom of the SERP to help users navigate to additional or more refined information.

People Also Search For

The ‘People Also Search For’ feature can also help users get closer to the information they’re looking for. This feature can be shown as a standalone element or as part of other SERP features.
Refine this search

More recently, Google has expanded its repertoire of navigational SERP features. This includes features that enable users to either broaden or refine their initial search queries. To that end, there are a wide variety of carousels and filters that enable users to explore related topics, products, and the like.

Search filters

Many of the navigational features Google employs are not standalone features. Google often utilizes a set of filters above its Image Packs and Video Boxes (and at times even as an independent set of filters shown at the top of the SERP).

03. Features that present organic opportunity

While certain SERP features are pay-to-play and others merely whisk the user away to a brand new SERP, some features can drive a serious amount of traffic to your site (or other properties). Of course, with great opportunity also comes great competition. So, what exactly are SERP features that offer you organic opportunity? Well, they’re SERP features that showcase your page’s URL or link to your other properties, such as your social media accounts. The best example of this would be the Featured Snippet.

Featured snippets

A featured snippet is a box that contains a snippet of content from a website along with the URL for that page. This box is placed at the very top of the SERP (although it has been known to appear beneath ads) and takes up a large amount of space on the SERP. In short, a featured snippet is extremely visible and often very clickable (i.e., it can bring a significant amount of traffic to your site). Featured snippets can come in a wide variety of formats. There is the list featured snippet shown above, along with featured snippets that utilize table data and those that present a short paragraph of content. There are even featured snippets that present YouTube videos. Various elements can be added to featured snippets. There are featured snippets that include a carousel of images, ones that include a set of filters, and more.
There are all sorts of other SERP features that direct users to your site or one of your other properties. Take the People Also Ask feature, which is basically a cousin of the featured snippet.

People Also Ask

These generally appear as a series of four cards (each reflecting a question) that can be expanded to reveal a short snippet answering each question. Like the featured snippet, the content snippets here also contain a URL to the page the content was pulled from. Fun fact: As you expand a People Also Ask card, more question cards automatically load beneath it.

Image results

Since we’re talking about various media, one thing to consider is the appearance of images on the SERP. Google, in various ways, presents users with a series of images when the query’s intent calls for visual media. When clicked, these images can bring you to Google’s Image SERP, where your URL might be displayed. This means you can get site traffic from the placement of images on the SERP.

Organic opportunities for your other properties

There are also a host of SERP features that can drive traffic to your other properties (aside from your website). This highlights the importance of having a well-rounded content strategy. Google often shows video content within a standalone SERP feature. The Video Box presents a series of videos that overwhelmingly come from YouTube. Being featured here can be a nice way to direct users to your YouTube channel and can keep you relevant in the event you don’t rank organically. Social media also comes into focus. For example, if you Tweet often enough, a carousel of your recent Tweets may appear. This helps you control your own narrative when users search for branded keywords. Again, there are too many features to list here. The main takeaway is that there are opportunities for your URLs inside of SERP features.
Sometimes these opportunities might apply to specific types of sites (such as Google’s news carousel), while at other times any site may have an open opportunity to garner more site traffic.

04. Features that don’t present organic opportunity

The SERP, as you’ve seen thus far, is complicated. It’s also a bit controversial. Google has a series of SERP features that don’t present any page’s URL. These features also don’t lead to a social profile or even YouTube. Rather, these features directly answer the user’s question. For that reason, they are often referred to as Direct Answers or Answer Boxes. Here’s the situation: if Google answers the user’s question, why would that same user visit any of the sites among the organic results? Direct Answers can, and often do, limit the amount of traffic sites pull in. The matter becomes more complex when you consider the variety of Direct Answer Boxes Google has in its arsenal. There are Answer Boxes that present:

Weather forecasts
Sports scores and schedules
Word definitions
Translations
Flight information
Conversions (currency, units of measurement, etc.)
Stock prices and trends
Nutrition information

This is not to say you won’t get any traffic if your page ranks on the same SERP as an Answer Box. It just means that your traffic potential might be diminished. It really all depends on the user, the keyword, and what Google presents as a Direct Answer. The most important thing to know is whether you are potentially competing with an Answer Box so that you can research the impact and adjust accordingly.

05. Local features and knowledge panels

There are some SERP features that don’t really fit into the categories we’ve described above. Some features don’t have a URL per se but lead users directly to your Google business panel. Some features contain URLs, just not to your site. The two main features to discuss here are Local Packs and Knowledge Panels.

Local packs

‘Near me’ queries are one of the most common types of searches.
This is where a user might search for things like “best pizza near me” or “florist near me.” These kinds of queries almost always bring up GBP listings for local businesses, which generally appear towards the top of the SERP. It’s called the Local Pack, and it gives the user a direct pathway to a local business. Notice that there’s a lot of information packed into each business’s listing. This information comes from properly setting up a Google Business Profile. Without doing this, your business may not appear in a Local Pack. If your business does not appear in the Local Pack, there’s a good chance that most users will never see it, even if it ranks well within the organic results. This makes GBP optimization one of the most important elements of local SEO. Clicking on the listing brings the user to the Local Finder (shown below) and automatically brings up a full business panel for the listing (which includes a link to the business’s website when applicable). Here, the user can see a fuller set of local listings (the Local Finder is also accessed by clicking “View all” at the bottom of the Local Pack). Actually, the business panel you see above is the perfect segue to our next topic: Knowledge Panels.

Knowledge panels

Google has a way of understanding the relationships between things and topics in order to present users with a fuller set of information and connect them with other relevant material. Moreover, Google has a way of knowing that some words aren’t just words, but are also “things,” or semantic “entities.” It’s how Google, for example, knows Wix is not just a website but an entire product and corporation. This is called the Knowledge Graph. The most visual representation of Google’s ability to understand words as “things” and to understand the connection between related “things” is the Knowledge Panel.
The knowledge panel is a collection of all sorts of information related to anything from household names, like celebrities and politicians, to companies to sports teams to products and far beyond.

Local knowledge panel

In fact, one of the more common forms of knowledge panels looks a lot like the business panel we saw above. It’s called the local knowledge panel. On desktop, knowledge panels appear to the right-hand side of the organic results. This means that they do not impact which results do or don’t appear above the fold.

Entity knowledge panel

Your typical entity knowledge panel may contain a link to a webpage. However, that webpage is usually Wikipedia, as the site is a major source of Google’s entity-based information. On mobile, Google places the Knowledge Panel above the organic results (as a rule), thereby pushing the results significantly further down the SERP (and generally out of initial view). It’s worth noting that the Knowledge Panel can and does act like a Direct Answer Box in many ways. Look back at the above example for the movie A League of Their Own: there is a good deal of information that the user gets without ever having to click on a website. For instance, the user can see who was in the cast of the film, the ratings the movie received, etc. All this without even clicking on the other tabs within the Knowledge Panel. The bottom line is that the Knowledge Panel is an important part of what users see when they search for your brand (at least, it should be). It’s also a huge source of information that often replaces the need to visit an actual website.

The mobile SERP

We’ve already taken a peek at the mobile SERP throughout this post. That said, it’s worth mentioning that the mobile SERP is different from the SERP on desktop. The reason for this ranges from the amount of space available on a mobile device to user intent being potentially different on mobile than on desktop.
It’s possible that your ranking on desktop may not exactly align with your mobile rankings for the same keyword(s). Not only that, due to the format of the mobile SERP, what might appear above the fold on desktop might not rank above the fold on mobile. When it comes to the SERP, different features have different capabilities, content, and elements on mobile than they do on desktop. The mobile SERP even contains some SERP features that do not appear on desktop at all (at the time of this writing). For all of these reasons, it’s extremely important to pay attention to both the desktop and mobile SERPs independently of each other. That means monitoring your site’s organic performance across both devices.

The SERP’s constant evolution

Google’s SERP is constantly evolving. Each year, there are hundreds of tests and upgrades Google makes to the look of the SERP and to the features found on it. Some of these changes can be quite significant and can impact your site’s organic performance. That means it always pays to keep an eye on the SERP and how it’s evolving. This can involve anything from comparing your site’s organic performance across devices to monitoring your rankings on desktop vs. mobile to simply paying a visit to the desktop and mobile SERPs every now and then.

Mordy Oberstein - Head of SEO Branding, Wix

Mordy is the Head of SEO Branding at Wix. Concurrently, he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin

  • How to select the perfect GBP category for your business

Author: Krystal Taing

As a local business, there’s no question that Google Maps and Search generate the largest volume of traffic and potential new customers compared to other search engines. Managing and optimizing a Google Business Profile (GBP) is one way to ensure you are present and discoverable by searchers looking for products and services like yours. However, it’s not enough to just add your business to Google—it’s important to understand how to best represent your business and offerings to searchers. For local businesses that are just getting started or looking to reevaluate their presence, this guide will help you focus on a single, but highly impactful, element of your Google Business Profile: your business category.

Table of contents:
The basics of selecting your primary GBP category
Choosing a category that ranks
Are more categories better?
How your business category influences GBP functionality
Quick steps to get started with your business category on Google

The basics of selecting your primary GBP category

When completing your Google Business Profile, the first field you are asked to complete is your business name. The second field is your category. It’s no mistake that Google includes this at such an early stage of optimizing your profile. It’s a required field, and you cannot move on to finish setting up your business until it has been designated. What Google does not inform you of at this stage is how your category can affect your business’s visibility and how searchers engage with you. Let’s begin with an introduction to the basics of categories on GBP. Your Google category should best represent and describe what your business is as a whole. Your category displays prominently on Search and Maps as a helpful indicator to customers. On an expanded business profile, the category is displayed right beneath the name and reviews (as shown below).
Choosing a category that ranks

While selecting a category may seem fairly mundane, Google has nearly 4,000 category options (which may vary by region) to choose from. It also adds and removes categories monthly. The category you choose can impact your business in a number of ways, including rankings, availability of fields and features, as well as other requirements for verification. So, what should you consider when choosing your category? It’s important to know that your category on Google does affect how you rank for searches. This means that Google uses the category you designate to better understand your business and to help it determine how and when it should return your listing as a relevant result for certain searches. You can improve how and when Google displays your profile by being intentional when selecting your category.

The good news is that you don’t get just one category—businesses can select one primary category and up to nine additional categories. If you find that there is more than one category that describes your overall business, you should add the most specific category as the primary category and any others as additional categories. While not as influential as the primary category, additional categories can still impact your rank and visibility in the search results. In most cases, Google will display your primary category to searchers. However, if Google determines that displaying one of your additional categories is more relevant, it may dynamically display this instead. The general assumption here is that displaying a category that is more specific and relevant to the query tells the user why Google is showing them the result. An example can be seen in the search result below.
When searching Day Spa San Diego, the first result shows the category “Day Spa.” However, upon clicking into the listing on Maps or looking at the business profile, the primary category is “Massage Therapist.” This means that Google has dynamically displayed one of the additional categories to better serve the search query. While the primary category does hold the most influence from a ranking perspective, Google still uses the additional categories to better understand your business. Are more categories better? For businesses that are uncertain about which category may be the most relevant, choose the one that you and your customers would most often use to describe your business. For example, if you are a local gym that offers a pool, sauna, and tanning bed, you should choose the category that most represents the main features of your business. Your business category can set customer expectations Choosing the wrong category could be detrimental for a number of reasons. In this scenario, if the gym decided to set its main category to “Sauna,” users searching for a sauna who see this listing and decide to check it out could be disappointed to find that the gym requires a membership. The best category would be “Gym,” and the business should utilize attributes to describe the features and services that the gym offers. The business is not a pool or a sauna—rather, it is a gym that also has a pool and a sauna. Seasonality may influence your primary business category If your business changes its primary services or focus throughout the year, you can change your primary category to reflect this. These types of changes typically involve core business services that are impacted by seasonality, such as air conditioning servicing in the summer and heating servicing in the winter. Because the primary category has more impact on your visibility, it can be strategic to align it with the main parts of your business if those vary by season. 
Your business segments may influence your categories There are scenarios where the additional categories can describe the overall business when the business is broken out into large segments that serve different needs. In the example of a department store, adding categories such as “Furniture Store,” “Baby Store,” etc. helps explain the types of products and services available. As long as the categories are relevant to your overall business, you can’t choose too few or too many. How your business category influences GBP functionality Google determines which functionalities and attributes your business gets access to based on your primary category. For example, a business that categorizes itself as a “Mortgage lender” would get access to the attribute that allows it to publish an “appointment URL.” Alternatively, a business that has the primary category of “Clothing store” would get access to the attribute that allows it to publish an “online inventory search URL.” Hotels have an entirely separate section to complete their property attributes and room amenities. Other variables dictated by your primary category that you should keep in mind include:
- Food menus, which can be published on profiles within the restaurant vertical
- Service menus, which are available to service-based businesses such as plumbers and healthcare providers. (Note: Each service can be tied to a separate category.)
- Booking features for reservations and appointments, which are available to restaurants and service industry businesses.
- “General Update” or “Event” post types, which are not available to hotels.
- Limited visibility for reviews and star ratings, which applies to businesses with educational categories. These only display at the bottom of the profile.
- Stricter verification requirements, which can affect businesses that share categories where spam is more prevalent. 
This includes businesses such as garage door repair and locksmiths, which are also known to trigger reverification for very small data edits.
- The inability to hide your business’s physical address. There are some categories for which Google does not allow this, so if attempting to do this results in an error or you are missing this feature, it could be tied to your category.
- Mandatory business hours. Some businesses, such as property leasing companies, are required to publish primary hours on their listing in order to have their profiles published on Google.
Quick steps to get started with your business category on Google Once you understand all of the impacts of your category on your Google Business Profile, how can you get started? You can begin your process by building a potential list of category options.
01. Create a list of category considerations. You can leverage this tool to search all available Google Business Profile categories in your region. If you can’t find an ideal category, you can always send feedback to Google support with the background of your business, which it could potentially add later on.
02. Review which categories your competitors have set on Google to see whether they might also be appropriate for your business.
03. Add categories that may not be reflected in your business name but are core to your products or services. This will give you a better chance of showing up for related queries, especially if your competitors have keywords in their legal business name.
04. Use keyword tools, such as Google Trends or Semrush, to measure search volume for your potential categories.
05. Don’t forget to monitor for new and changing categories every few months—Google may just add more relevant categories for your business.
Krystal Taing - Global Director of Pre-sales Solutions, Uberall Krystal Taing is the Global Director of Pre-sales Solutions at Uberall. 
She is a Google Business Profile Platinum Product Expert and faculty member at LocalU. She helps brands manage hybrid customer experiences. Twitter | Linkedin

  • Topical authority 101: When it’s important and who needs it for better SEO

    Author: Maeva Cifuentes If you were searching for personal finance advice online, you’d discover content from (or quoting) Warren Buffett, who’s written shareholder letters, given countless interviews, and published articles about investment strategies and market insights. But if you wanted advice on how to deal with your dog’s fleas, you’d more likely take advice from the blog of a local veterinarian. Investing: Warren Buffett is widely known as one of the most successful investors of all time, with his own holding company outperforming the S&P 500 over the long term. Plus, he has a wide body of publications and public appearances speaking on investment. Dog care: The local vet likely has over eight years of education on animal health and clinical studies, as well as practical experience in the field. And if they publish a lot about it, caring pet owners will know and follow the local vet. Even if they both published content about one another’s respective topics, Google is more likely to prioritize investment content from Warren Buffett and pet care content from the local vet. This is topical authority. Not only do people prefer to consume content from someone who is an authority on a topic, but Google also prefers to serve that same content to them. Building topical authority can help you rank higher and faster, but it’s not the right choice for all businesses. In this article, I’ll help you navigate whether it’s right for your brand (or clients’ brands) and how you can measure and build your own topical authority to succeed in search. Table of contents: What is topical authority in SEO? When is topical authority important? Is topical authority a measured Google ranking factor? How SEOs measure topical authority Topical share of voice Number of mentions from other relevant sources on the topic How to build topical authority and choose your topics What is topical authority in SEO? Topical authority is the extent to which a website is an expert on a given topic. 
If you have high authority in a topic, all your pages on that topic are likely to rank higher than websites that have less authority regarding that topic (all other considerations being equal). And, the quality of your website’s creators and contributors is just as important as the quality of the content on your website. Google’s search quality evaluator guidelines even say, “If the website is not the primary creator of the MC (Main Content), it’s important to research the reputation of the content creator as well.” The guidelines use the word ‘creator’ 146 times, often interchangeably with the website. They also warn quality raters that reputation research is required at all steps. So, authoritative, experienced authors are a key part of the topical authority equation. In many of Google’s communications to SEOs, it tells us that credibility is what we should prioritize. Its E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) framework provides SEOs with a slew of guidelines on how to make content more credible. In the aforementioned search quality guidelines, Google defines ‘authoritativeness’ as “the extent to which the content creator or the website is known as a go-to source for the topic.” What the examples in the intro have in common is experience and prolificness. For SEO practitioners, that means you can build authority by consistently publishing expert content on a specific topic, both on your own website and on other reputable websites and content platforms. TL;DR: If you want to rank well for a specific topic, you need to publish a lot of pages about that topic for Google to take you seriously. When is topical authority important? Not all businesses want to commit to building topical authority—and not all of them have to. Oftentimes, businesses pursue topical authority because they operate in highly competitive verticals, so every advantage matters. 
Brands and SEOs working with clients in these sectors need to know that it can take a heavy resource investment (that’s also tricky to attribute) to generate topical authority that lifts your brand’s rankings in search results. This is because building topical authority requires you to publish a lot of content around a topic, without it necessarily being content that immediately converts. So, for resource-sensitive businesses, you might struggle to make sense of spending $1,000 on a blog post about ‘The History of Beer Brewing’ when you could spend it on a ‘Best Beer Brewing Kit’ landing page. There are some reasons why you might not need to publish about every possible angle around your topic:
- Your audience doesn’t need to be educated on the topic to buy (for example, in impulse purchases like fast fashion).
- The product or service has a short sales cycle, making long-form content less necessary (e.g., concert tickets).
- The topic is extremely niche and competition is very low (in this case, you want to check whether investing in SEO in the first place is necessary for your business).
- SEO is just a nice-to-have and not a full-on revenue and brand strategy. In this case, you probably don’t need to invest so much of your resources across the customer funnel (illustrated in the example graphic above).
- You aren’t struggling to rank your ‘money pages’ in the top three positions and you can make money through search without people needing to know you as an expert.
However, there are scenarios where you have very little chance of succeeding through organic search if you don’t have topical authority:
- You are competing in a highly competitive industry with well-established players, like many SaaS verticals.
- You provide a B2B product or service, where you’re selling to a buying committee rather than a consumer, with a more complex decision-making process.
- Your ‘money keywords’ are highly competitive and you can’t seem to get in the top three positions no matter what you do. 
Long story short, you don’t need to publish about the [history of t-shirts] and [best fabric for a t-shirt] if you can rank [womens white t-shirt] and get people to buy directly from that page. Save yourself the money and effort. But, if you want any chance of ranking in the top three for [affiliate marketing], for example, you can count on it being a challenging journey of publishing high-quality content fervently around all potential angles of affiliate marketing for years to come. Is topical authority a measured Google ranking factor? Nobody is sure whether Google actually has a measure for authority or not. In the above-mentioned search quality evaluator guidelines, Google only mentions authoritativeness 10 times. On the other hand, it mentions trust 177 times. So, while you can get a pretty clear idea of what trust means to Google and how it can tell if you’re trustworthy, there’s no clear indication of whether it measures topical authority, and if so, how. In May 2023, Google published a blog post  about a topic authority system:  “Publishers looking for success with topic authority should do exactly what their publications would normally do: provide great coverage about the areas and topics they know well. Such work should naturally align with what our topic authority system measures and with our general guidance about creating helpful, people-first content.”   — Jen Granito, Group Product Manager, Search at Google However, this blog post seems to be specific to SEO for news  websites and newsy queries, rather than normal businesses or publishers trying to use SEO to grow. There are many other instances where Google has specifically and openly stated that it doesn’t have an authority score embedded in the algorithm. 
For example, in this video, John Mueller, senior search analyst at Google, said, “From my point of view, we don’t have anything like a website authority score.” Yet, the May 2024 Google Search internal documentation leaks revealed that there was indeed a site authority feature. That said, nothing in the leak showed weights or confirmed that these features were actually included in the current algorithm. It’s impossible to tell whether they are used at all and, if so, how. So, we don’t know whether Google’s algorithm can actually gauge authority, or whether it directly uses it when ranking pages. However, according to a study by Graphite, websites with higher topical authority (measured via a proprietary topical authority score—more on that below) gained traffic 57% faster than websites with low topical authority, and high topical authority increased the percentage of pages that got their first click within three weeks of publishing. I’ve seen this firsthand with Hotjar, a former client. Initially, the client had thousands of pages targeting UX designers. When we published content for this audience, it would often rank in the top two pages within 1–2 days. Later, the client wanted to target product managers, so we began publishing content on product management. Since there was no existing content for this audience, it took a couple of weeks for the new content to start being indexed and ranking. Despite the huge brand and one of the strongest possible domain ratings (Ahrefs’ proprietary metric), this content still took a couple of weeks for Google to index and rank after publishing—a testament to the power of topical authority. How SEOs measure topical authority Can you quantitatively measure topical authority? While there is no official measurement of topical authority (as far as Google wants to share), there are ways you can attempt to measure yours. 
If I were to create a framework for measuring topical authority, I’d look at two things:
- Topical share of voice
- Number of mentions from other relevant sources on the topic
Topical share of voice I define topical share of voice as your visibility across all keywords/subtopics of a given broader topic compared to your competitors. Let’s say you want to build topical authority around the topic of home brewing beer. In Ahrefs (which I’ll use for this example because it provides share of voice), you can see that there are about 238 clusters related to home beer brewing with over 30,000 in monthly search volume. If you want to build authority on this topic, you could start by creating content to build out that cluster, tracking your share of voice across all the keywords in the cluster. The more keywords from that cluster you rank for in the top three SERP positions, the better. To monitor the visibility of a set of keywords, you can add them to Ahrefs Rank Tracker:
1. Set up a project.
2. Click on “+Add keywords.”
3. Add the keywords from your topic cluster.
4. Click on “Add keywords.”
5. Navigate to your Overview report to review the tracked keywords.
Share of voice (SOV) and market share are strongly correlated. Studies show that for every 10% growth in market share, advertising brands have a corresponding 6% growth in share of voice. This means that, to hit your market share goals, you should aim for a share of voice that is slightly higher than your target market share. For example, if your goal is a 3% market share, you should aim for around a 5% share of voice. In Ahrefs’ keyword tracker tool, you can compare your share of voice for your specific keywords against your main competitors. There isn’t much data available on benchmarking your SOV percentage. What a good SOV percentage is depends on many factors: the industry, your competitors, your keyword strategy, local vs. global focus, etc. all play a role in what percentage can be considered a ‘strong’ share. 
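To make the share-of-voice concept more concrete, here is a minimal sketch of how a volume- and position-weighted SOV figure could be computed for a tracked keyword set. The CTR table and the keyword data below are hypothetical, and tools like Ahrefs use their own click-estimation models, so treat this as an illustration of the idea rather than any tool's actual formula:

```python
# Hypothetical click-through rates by SERP position (illustrative only;
# real tools estimate clicks with their own proprietary curves).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def share_of_voice(keywords):
    """Estimate SOV for a topic cluster.

    keywords: list of dicts with 'volume' (monthly searches) and
    'position' (your current rank for that keyword, or None if unranked).
    """
    total_demand = sum(k["volume"] for k in keywords)
    your_clicks = sum(
        k["volume"] * CTR_BY_POSITION.get(k["position"], 0.0)
        for k in keywords
        if k["position"] is not None
    )
    return your_clicks / total_demand if total_demand else 0.0

# A made-up 'home brewing' cluster: ranking in the top three for two of
# the three keywords captures a share of the cluster's total demand.
cluster = [
    {"volume": 12000, "position": 2},    # e.g., "home brewing kit"
    {"volume": 8000, "position": None},  # e.g., "how to brew beer at home"
    {"volume": 4000, "position": 1},     # e.g., "beer brewing starter set"
]
print(f"{share_of_voice(cluster):.1%}")  # → 12.2%
```

Computing the same figure over time, and for competitors' positions across the same keyword set, gives you the kind of SOV comparison described above.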
A good rule of thumb is to align your SOV goals with your market share goals. What’s a ‘good’ market share for a company of your size in your industry? A similar rate is probably good for share of voice. Number of mentions from other relevant sources on the topic If nobody has ever wanted to quote you or hear your opinion about a topic, are you really an expert on it? If you want to build a business, would you rather take advice from a successful serial entrepreneur whom you’ve heard about from reputable sources, or a random person who told you they ‘know how to run a business’? If you didn’t personally know either of them, you’d probably trust the entrepreneur who’s been more abundantly quoted and celebrated in the press, rather than take chances on the unknown. That’s essentially what Google does. In effect, it says, “If other websites that talk about brewing beer are citing this person, they must think they’re an expert on that topic. So I also think they’re an expert on that topic.” You can keep a Google Alert on to get notified of publications mentioning your brand, or use social listening tools like Hootsuite or Brandwatch. How to build topical authority and choose your topics The idea behind topical authority is that if you want to rank easily for a given topic, you want to publish a lot of helpful content about that topic. If you have a gardening website with a large library of content around permaculture, perennial flowers, and soil types, does that mean you need to start from scratch to rank anything about tomatoes? As a rule of thumb, if you can only come up with one or two article ideas around a topic, it’s not enough diversification for you to consider it a topic on its own. In that case, you can broaden the scope under which that topic might fall and write more about that overall cluster (i.e., zoom out of the topic a bit). It’s not an exact science. 
You could probably come up with hundreds of topics around tomatoes specifically, covering things like:
- Tomato growing kit
- How long does a tomato take to grow?
- Growing tomatoes indoors
- Tomato plants stopped growing
- Tomato seedlings stopped growing
- How to grow tomatoes
- Best soil for growing tomatoes
With each new page you publish about tomatoes, you’d add internal links to all the other tomato-related pages. This creates a strong semantic network on your website, enhancing its relevance and authority on the topic of tomatoes. As you grow this cluster, it would become increasingly easy to rank future content about tomatoes. Then, you could connect overlapping topics and build out other clusters as a method of expanding your topical authority to new, related areas. An article about the [best soil for growing tomatoes] could be linked to ones about the [best soil for growing zucchini] and [best soil for growing lettuce], and then suddenly you have a ‘best soil’ content cluster. That said, you probably couldn’t create a whole topic cluster about [rare herbs] in backyard gardens because it would be too niche and would probably fall under the topic of [herb gardening] instead. By strategically building out topic clusters and interlinking related content, you create a robust network that signals to search engines your comprehensive coverage and expertise on a subject. This approach not only enhances your website's topical authority but also improves its chances of ranking well for various related keywords. More topical authority, more traffic, more revenue At the end of the day, whether topical authority is an actual ranking factor or not, it will help your website. If you work with experts to publish super helpful content on a topic, audiences will respect your voice around that topic more. If you publish a lot about it, you’re more likely to be found in Google Search for that topic. 
If you publish a lot about a topic, you’ll grow more traffic around the topic you want to be known for. And, if you publish a lot about a topic and support all those pages with internal links, you’ll be able to rank higher for all the keywords in that topic. The rising tide lifts all boats. Maeva Cifuentes - CEO & Founder, Flying Cat Maeva is the founder and CEO of Flying Cat Marketing , an SEO and content agency driving growth with a holistic, revenue-based SEO approach for B2B SaaS companies in HR tech, martech, and salestech. Maeva is also a fractional CMO, marketing advisor, and certified confidence coach. Linkedin

  • What are Google algorithm updates?

    Author: Mordy Oberstein Every year, Google updates its search results thousands upon thousands of times. While the majority of these updates are small adjustments to Google’s algorithm, they can have big implications for you, your site, and your potential revenue.  Understanding Google’s various types of algorithm updates and their purpose helps you create better content, recover from rankings changes associated with algorithm updates, and “future-proof” your website. If your business or brand relies on ranking above competitors in the search results (and most do), then here’s everything you need to know about Google algorithm updates. Table of contents: What Google algorithm updates are Why Google algorithm updates take place How often does Google implement algorithm updates? The types of Google algorithm updates Core algorithm updates Targeted algorithm updates Google best practice updates Unconfirmed Google updates How to handle confirmed Google updates Future-proofing your website against Google updates What is a Google algorithm update? When Google introduces new and better technology and considerations into its search algorithm, these are called “Google algorithm updates.” Google makes these updates so that it can better understand page quality and relevance (or a domain overall, as many of Google’s quality assessments look at the quality of the entire site—not just a single page).  While we often think of an algorithm update as reevaluating the weight of certain factors on a SERP  (search engine results page), this is an oversimplification. Many of Google’s algorithm changes incorporate technological advancements, specifically in machine learning.  To that end, some experts speculate that many of Google’s updates are not changes to the algorithm in the strictest sense, but machine learning recalibrating and testing. These changes are perhaps behind a good number of unconfirmed Google algorithm updates. 
Why Google algorithm updates take place To satisfy users, Google needs to serve the best results possible, considering many factors, including user expectations and technological advancements. So, the search engine often updates or “tweaks” its algorithm to change what the SERP shows. In the early days of SEO, Google would release updates to keep people from manipulating the algorithm. For example, the Penguin update targeted spammy link practices, and the Panda update protected against thin content. While Google still releases updates targeting spam, the company has recently placed more emphasis on surfacing the highest quality content for users. How often does Google implement algorithm updates? Google implements algorithm tests and changes daily. Though many of them are small, the company’s own documentation suggests that there were over 4,000 updates in 2021. Core updates (specifically) tend to occur four to five times a year. Historically, Google would carry out major updates one at a time. However, since 2022, large-scale updates like the Product Reviews update and core updates have rolled out concurrently or in quick succession. While this is a recent trend, it’s important to note that we don’t know if it will continue. Nevertheless, when trying to understand how your site has been impacted by an algorithm update, rapid changes like this can make it difficult to pinpoint the particular cause of a visibility surge or drop. What kinds of algorithm updates does Google make? Google’s algorithm updates fall under the following categories:
- Core updates
- Targeted updates
- Best practice updates
- Unconfirmed updates
Let’s explore each of these update types further. Core algorithm updates “Core updates” (or “broad core algorithm updates”) are when Google implements wide-ranging changes to how its algorithm works. Rather than slight modifications to ancillary aspects, these updates signal a broad change in how Google’s algorithm ranks pages and sites. 
These updates are important because, rather than affecting how a single page may rank for a keyword, they can impact domain-wide visibility. Think of the “core algorithm” as a stew, where each spice and ingredient works in relative harmony with the others. An update to the core algorithm might mean a change in how those various ingredients factor into each other and the role they play in the overall stew—among other things (such as advancements that enable the elements within the core algorithm to function at a higher level). While Google has long released broad core algorithm updates, Google Search Liaison Danny Sullivan began officially announcing  core updates  in March 2018.  These updates have tremendously impacted how search marketers think about content. Perhaps the most notable of these updates was known as the Medic update (AKA the August 2018 core update), as it disproportionately impacted Your Money Your Life (YMYL)  sites, including finance, health, and other sites that could significantly harm users if they present inaccurate information. In many ways, the Medic update served as the prototype for subsequent core updates. It showed a clear qualitative leap in Google’s ability to understand and profile content. Those significantly impacted by the update included sites with a thin content experience and those that put marketing aims above substantial content.  For example, if a user searched [how to eat better], pages that use heavy marketing language or those that showed bias towards their own product or service would likely rank lower after this update. On the other hand, Google’s algorithm rewarded authoritative, expert, and unbiased articles on the same topic. Since then, Google’s core updates have shown an increased ability to understand what quality content looks and sounds like. 
This includes cases of Google demoting pages that, instead of offering informational content, took a marketing tone, as well as instances of rewarding pages that offer a highly targeted content experience that is of clear value to users. Targeted algorithm updates Along with core updates, Google also carries out updates that target specific types of content. These align with ranking systems and include:
- Spam updates
- Link spam updates
- Reviews updates
These updates can cause ranking changes for some website types and not others. It’s important to note that while each of these updates may focus on a certain content type, Google sometimes releases the updates concurrently with core updates, which can make it difficult to tell if the impact was due to one ranking system or another. For instance, the March 2024 spam update was released at the same time as the March 2024 core update. Some sites could be affected by one or both updates, so it may be difficult to isolate the impact and identify solutions. Google best practice updates In rare instances, Google will announce a new algorithm update ahead of time. Examples of this include the:
- Mobile-first update
- HTTPS update
- Page Experience update, which introduced performance metrics (known as Core Web Vitals)
These updates are similar in that they seek to reward sites making adjustments based on Google’s newest best practice recommendations for website management and user satisfaction. When these kinds of updates occur, Google’s teams often create new tools and documentation to support SEOs and developers adapting to the changes. In the case of the Mobile-first update, for example, Google introduced a mobile-friendly testing tool, while Core Web Vitals ushered in a suite of UX tools in Lighthouse. Unconfirmed Google updates Google makes thousands of changes to its algorithm every year, yet only officially announces a fraction of these updates. 
Instead of relying purely on confirmed updates, search marketers use “SEO weather tools” to track significant algorithm changes. These tools indicate the level of rank volatility by visualizing ranking movement (as shown below). Google’s broad core updates have been the most commonly confirmed update type, but other confirmed updates include the Spam updates that worked to reduce the prevalence of websites that violated Google’s spam policies in search results. In general, confirmed updates result in far more rank volatility than unofficial updates. How to handle confirmed Google updates An official core update (or other confirmed update) is a bit different from the run-of-the-mill unconfirmed update. In some cases, ranking gains and losses can be more long-term, with fewer reversals. If you believe that your site was affected by an algorithm update, it is important to assess the impact of Google algorithm updates thoroughly before you take steps towards recovering from core update ranking changes. You need to understand how Google views your site as well as the topic that the keyword(s) represent. While Google may see you as an authority on one topic, it may think another falls just outside your site’s identity (in which case you would want to show Google that the topic is pertinent to your site by creating highly detailed and nuanced content around it that connects to the other aspects of your website/business). There are a variety of reasons why Google would no longer look at your content the same way. It could be that the intent of your pages around a given topic is not aligned with how Google sees user needs there. Your job is to figure out where this is happening and to analyze the SERP so that you can see what Google is after and then do that, but better and with differentiation. Future-proofing your website against Google updates Rank volatility is just a natural consequence of competing on the Google SERP. No site is without rank volatility. 
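For a sense of what those rank volatility readings represent, here is a minimal sketch that approximates volatility as the average absolute day-over-day position change across a set of tracked keywords. The keyword data is made up, and real weather tools use their own proprietary formulas over huge keyword panels, so this is only an illustration of the concept:

```python
def daily_volatility(yesterday, today):
    """Average absolute position change per keyword between two days.

    Both arguments map keyword -> SERP position (1 = top result).
    Only keywords tracked on both days are compared.
    """
    shared = yesterday.keys() & today.keys()
    if not shared:
        return 0.0
    return sum(abs(today[k] - yesterday[k]) for k in shared) / len(shared)

# Made-up example: a spike in this number across many tracked sites is
# the kind of movement that suggests an (often unconfirmed) update.
yesterday = {"best beer kit": 3, "home brewing": 7, "beer recipes": 12}
today = {"best beer kit": 5, "home brewing": 6, "beer recipes": 18}
print(daily_volatility(yesterday, today))  # → 3.0
```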
Every site sees some of its rankings come and some of its rankings go. Expecting that your rankings will always be at the top of the SERP is like expecting that bad things will never happen to you. So much is out of your control—especially in competitive spaces where many pages are vying for top rankings. However, the most basic and important thing you can do is create really good content, which is what most of Google’s own advice on core updates discusses. Remember—like everything in SEO, what good content looks like depends on your vertical. The tone, structure, and feel of a basic outline on heart diseases, for example, will look different from a thesis on quantum physics, which will be different from a post about a local baseball game. So long as you keep your audience and their needs in mind—and present them with a first-class user experience—you’ll be able to insulate your website from Google updates as much as any business possibly can. Mordy Oberstein - Head of SEO Branding, Wix Mordy is the Head of SEO Branding at Wix. Concurrently, he serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
