- Google’s shift from authority to content diversity on the SERP
Author: Mordy Oberstein

For a while now, I’ve considered Google to be a search engine with a strong bent on becoming what I think of as an “authority engine”—sure, it wants to provide results that you can visit to find the information you need, but at the same time, there’s a clear desire on Google’s part to become the go-to authority. Google wants to be the resource that people rely on not only to direct them to the information they’re seeking, but also to be the authority that actually provides that information. If you search for things like “what is the weather in nyc” or “yankee game score,” for example, you can find the answers without ever clicking on a single result. Google is not just a search engine—it’s an answer engine. To me, that’s ultimately why it’s the “authority engine.” But I think that’s been changing recently and will change further.

In this article, we’ll go over:
Google’s transformation from result provider to knowledge authority
Why Google became a knowledge authority
How Google leveraged featured snippets for authority
Google’s push to go beyond the authority dynamic
What more content diversity says about Google and the web itself

Google: From search engine to information provider to authority

Fundamentally, a search engine is a facilitator. As a pure construct, a search engine does not provide you with any information; it merely leads you to sources that do. Beginning around 2013, Google changed that paradigm:
In 2013, we saw the launch of a carousel presenting a listing of local establishments.
In 2014, Google introduced the featured snippet, a box that appears at the top of the results presenting a snippet of information found on a web page, along with a link to that page.
Then, in 2016, Google presented us with the direct answer, which does as it says—provides you with the direct answer to your question:

In the years that followed, Google only increased its investment in SERP features and the information they contain. Featured snippets started to take on multiple forms, including video featured snippets and a snippet that basically functioned as a direct answer. As time went on, Google’s Knowledge Graph vastly improved and all sorts of information became directly available in knowledge panels. Now, it’s to the point where these SERPs resemble what SEO veteran Dan Shure calls “micro-sites.” The point is, we have been living in an era where Google has turned the SERP and, subsequently, its own identity into something other than that of a facilitator. Google has become a very powerful provider of information. Google is no longer exclusively a search engine, and it hasn’t been for a long time. It has become a knowledge provider and, in being a provider of information, it has become a knowledge authority.

Why Google became a knowledge authority

Many search marketers will tell you that Google started providing information directly on the SERP so as to keep users within its ecosystem. The prevailing theory is that Google wants to keep users on its properties, moving them from one set of results to the next, because it affords more opportunity for ad placements, which generate revenue for Google. Let’s run through an example: Say I run a query for “yankee stadium.” I might (and did) get this: Notice the ads that dominate the left side of the results page.
Now, if I scroll down a bit, I would see the “People also search for” feature at the bottom of the knowledge panel: I might then click on the option to scope out another New York stadium, Citi Field, only to move to a new SERP with new ads: The more information on the page, the more I stay on Google, the more opportunity to move me to another set of results, and the greater the chance I will eventually click on an ad. That’s the theory. Sure, this sort of construct will lead to more ad placement and more revenue, all other things being equal. What bothers me about this theory is that myopically engineering its user experience to drive engagement with search ads is not entirely in Google’s character. I’ve always found Google to play the long game. Take Google Search Console, for example: Instead of creating a data vacuum (one that would surely be filled by third-party tools), Google offers free data that has become the primary tool of SEOs, thus allowing them to create better sites and content. That’s a long-term, “big thinking” play. Offering more entry points to more SERPs for the sake of more ad placement is not a long-term, “big thinking” strategy and, to me, isn’t how Google typically operates.

So, why double down on providing information right on the SERP if not to keep users within its own ecosystem to generate more ad revenue? What’s the long-term play here? It’s authority. Google providing users with information directly makes it the authority and not the sites it would have otherwise facilitated. Providing others with knowledge—moving them from helplessness towards empowerment—creates an extremely potent relationship. By sending you to other sites, all Google was doing was facilitating that powerful dynamic between you and whatever website you landed on. “Why shouldn’t we get in on that?” decision makers at Google might have thought. And they did. Google started to provide a slew of information, creating the association that it is directly the knowledge provider and the authority. That’s a very important association to create for a search engine. The entire idea of a search engine is that users trust it to provide a path to the most substantial information. What fosters that sense more than actually providing that sought-after information?

In the business context, the logic was likely that users would be more inclined to return if they felt they could come to one place with expedited access to the information they were seeking, from a platform they trust as the purveyor of that information. Google decided to reinforce a far deeper and far more powerful latent association amongst its user base (i.e., as an authoritative knowledge provider) because doing so fosters a unique bond. This bond, in turn, creates and subsequently reinforces the latent notion that Google is where we should go for information. Google’s association with information (and not just information retrieval) urges users to seek out the platform for all of their information needs. The play here seems to be that the more users think of Google as the go-to source for information, the more they will return to the platform and the more ads Google can serve them. Google is a for-profit company. However, the move to show more information on the SERP itself (which may downgrade the urgency for clicks to websites) is not primarily about the immediate return on investment.
It’s about Google creating a certain identity for itself so that, long-term, users will view the platform a certain way—all of which leads to increased use, which ultimately leads to more ad clicks. Google’s featured snippets have been a very important part of this construct. However, they are quickly moving away from being a part of the “pure authority paradigm,” and perhaps that says a lot about the state of the world’s most popular search engine.

How Google leveraged featured snippets for authority

Google stores information about entities in its Knowledge Graph. This enables it to offer information without any connection to a URL. However, most of the information out there in the ether that is the web exists on web pages. This presents a bit of a problem for a platform looking to become the source of information. The solution? Featured snippets. Featured snippets enable Google to directly answer users’ queries while not actually owning the content. While still somewhat controversial, in many ways, it’s a win-win. Sites get their URL prominently displayed at the top of Google’s results which, in many cases, could positively impact clickthrough rate. In turn, Google gets to position itself as a knowledge authority by presenting a snippet of information on the SERP. How exactly does Google use the featured snippet to position itself as a knowledge authority if the content within it belongs to a specific website? For starters, the content belonging to a website and the content being perceived as belonging to a website are two different things. When content is displayed within the featured snippet, while the URL that hosts the content is present, it’s not exactly prominent. The content itself almost seems to exist separate from the URL, at least initially. Moreover, Google employs methods to directly answer the user’s question. One such method is bolding the most relevant text within the snippet: There is also a featured snippet format that is essentially a direct answer with an accenting snippet of content: For the record, I’m not saying Google is doing anything nefarious. Again, I think what you see here generally works for both Google and site owners. Moreover, the formats shown above give users what they want: quick and immediate access to information.

But featured snippets show Google moving beyond the authority dynamic

It all sounds perfect: Google features your content prominently so that your sites earn more traffic. The search engine gets to position itself as the market leader, bringing in more searches and more potential ad revenue, and you get more clicks (in theory). It was all going so well until some folks spotted a few tests to the format of featured snippets. Google runs hundreds (if not thousands) of tests on what it shows in the SERP and how it shows it. What makes these limited tests noteworthy? The answer is diversification within the featured snippet. Here’s the first test of featured snippets that got me thinking that things may be changing: Gone is the big bold lettering telling you the answer before you get to the snippet of content. Instead, there is a header that says “From the web.” Explicitly telling users that what they are about to read comes from sources across the web stands in sharp contrast to Google positioning itself as the author by using the featured snippet to directly answer the query.
Moreover, if you start reading the actual snippet of content, not only do you see multiple URLs further diluting the featured snippets’ focus on authority, but the content itself addresses the query from different angles. Each section on the snippet (with its corresponding URL) is a unique review of a product. The content is not cohesive. It doesn’t all come together to form one flowing paragraph that represents the one true answer. This same concept is reflected in the second test of featured snippets that was discovered around the same time: In fact, in this instance, Google is explicitly sharing the “authority wealth” with a header below the main snippet that reads “Other sites say.” Coincidentally (or not), less than a month later Google was seen displaying a new feature termed the “More about” feature. Here again, Google presents a diverse set of content snippets attached to URLs. Seeing this live (not merely as a test to the SERP) made me think the sands have significantly shifted. This is interesting because the query that brings up the carousel (shown above) would be prime material for a featured snippet that rattles off a list of things you need to do in order to apply for a loan, much the way it does for most other queries of this nature, as shown below. Clearly, something has changed. For the record, it’s not as though the ability to focus on content and URL diversity within the featured snippet is a new development—Bing has been using this very model for its version of featured snippets for years. Furthermore, Google itself has been using this very model with its “Sources Across the Web” feature for a few years now. So, it’s not that Google couldn’t prioritize content diversity over content authority. Rather, it’s that it chose not to. This isn’t necessarily a good or bad thing—each construct has its own positives and negatives. What a shift towards more content diversity says about Google and the web itself Practically speaking, Google moving towards a more diverse showing of content and URLs within featured snippets could mean more potential traffic for sites. That is, at least, if you were not already the sole URL being shown within a specific snippet. More broadly, I think this shift represents how the web itself is maturing. Every time another CEO goes before Congress to discuss data privacy, more people become more skeptical about what’s out there in the digital sphere. Semrush data indicates that Google searches related to data privacy are up over 100% since 2017. This is an important part of the maturation of the web. Relying on a tech provider such as Google for the one true answer stands in contradistinction to this maturation process. User skepticism can be, and as it currently stands, is, integral for a healthier web. While full-on authority may have been what garnered trust in the past, it’s my belief that Google realizes that there needs to be a stronger element of transparency in the mix. Again, this speaks to how we as online content consumers are “wising up” and pushing for a safer, more mature web. Google’s departure from positioning itself as an authority by presenting users with the “one true answer” speaks to how it, as the leader in search, sees the state of the web and the state of those who use the web. It’s a marked shift in what it means to be a healthy operator and leader within the digital space. Moreover, there’s increased pressure on Google to get it right. 
Recently, there have been an increasing number of major publishers questioning Google’s ability to serve quality results (take this article from the New Yorker as just one example). Showing a more diverse set of content within its answers helps to portray Google as providing a better and more accurate content experience. Whereas in the past, Google may have been better served by providing “the” answer, today’s user is more receptive to having a more holistic and well-rounded set of content (and is fundamentally better served by it). For the record, I think Google is ahead of the curve. Since about 2018, it’s released a set of algorithm changes (referred to in the SEO industry as “core updates”) that I believe have injected new abilities to discern good content from bad. Meaning, Google has long been aware that user expectations around content are shifting and that it needs to move quickly to meet those expectations. What’s happened in the more recent past, at least as I see it, is that people have become rapidly aware that the content out there on the web needs to improve (again, something Google has realized in a substantial way since 2018). At this juncture, the awareness of the user base around the lack of quality content has outpaced Google’s ability to sift such content out of the results. Simply put, we’re far more aware of the lack of quality content on the web and are looking to Google to handle the problem without considering how far Google has come in this regard and without fully appreciating that much of the fault is on content creators, not just search engines. Google’s recently announced Helpful Content update echoes this sentiment, recommending that content creators evaluate whether “the content is primarily to attract people from search engines, rather than made for humans.” In any regard, Google providing a more well-rounded set of answers creates a sense of topical transparency and therefore quality. Expect more content diversity on the SERP in the future Is Google going to kill off the featured snippet as we’ve known it? No, I don’t think so. Having one snippet of information can be quite useful both for how the search engine wants to position itself and to users looking for information (especially factual information). Sometimes you do just want a quick answer. But, there will be an increase in instances of multi-sourced and multi-perspective features on the SERP. The Google results will, inevitably, contain an increasing number of features that give users entry points to multiple sources of information around a given topic. Doing so helps optics. It also speaks to how Google’s algorithm around topics functions. Most importantly, doing so is simply good for people in search of information. *Disclaimer: Mordy Oberstein is not associated with Google and Google was not involved in the writing of this piece. Mordy Oberstein - Head of SEO Branding, Wix Mordy is the Head of SEO Branding at Wix. Concurrently he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
- How stable are Pinterest rankings and traffic? [Study]
Author: Mordy Oberstein

Pinterest is an organic powerhouse. Each month, the millions of keywords it ranks for bring in over a billion site visits from Google. It’s no surprise that, for many, leveraging Pinterest to bring visitors to the images they're hosting on the social media platform is vital. This is why, more often than not, whenever a large Google algorithm update rolls out, some of the analysis that gets done will inevitably mention Pinterest and its organic market share. But, how much of a force is Pinterest really? While the domain is clearly a juggernaut, what does that mean for individual users hosting content on the platform? More specifically, what I want to know is this: how consistent are the rankings (and by extension, the organic traffic) of a specific Pinterest asset?

The problem: Pinterest URL swapping on the SERP

Before diving into the data, let me explain the problem: As mentioned, Pinterest garners a lot of traffic from Google. The issue is that, unless you’re Pinterest, you don’t really care about that per se. What you, as a creator on Pinterest, care about is how much traffic Google can drive to the specific assets that you host on Pinterest. At first glance, this doesn’t even seem to be a question. Pinterest pulls in an incredible amount of traffic from Google Search as, for many types of queries, the SERP is littered with Pinterest URLs. The problem, however, is this: What you’re looking at above is Google essentially swapping out different Pinterest URLs within the same ranking position vicinity. When I saw this, it made me wonder, how stable is a ranking Pinterest URL? How often is Google swapping out one Pinterest URL for another? Because when I started to dive in, what you see above seemed to be a pattern. That is, Google seems to give Pinterest a ranking slot on the SERP and oscillates between showing various Pinterest URLs within that slot. So, I’ll ask the question again: how potent is Pinterest in terms of bringing in traffic via search to your specific assets if it seems that Google is relatively quick to swap out various Pinterest URLs?

Pinterest URLs & Google ranking: Methodology and limitations

The Semrush data team analyzed 1,487 keywords on desktop and another 1,425 keywords on mobile in order to see how often Google is swapping out Pinterest URLs on the SERP. Only keywords that displayed a Pinterest URL with an average rank of 10 or better were considered. The team then analyzed how many times one of these URLs for the given keywords was being swapped for another Pinterest URL. What, however, is the definition of a URL swap in this instance? If a specific Pinterest URL was ranking #7 for a keyword and then moved to rank #10, while a new Pinterest URL began ranking at position #3, is that a swap? What if a Pinterest URL was ranking #8 and then no longer ranked top 10 at all, only to have another Pinterest URL begin to rank at position #10—is this a swap? For the purposes of this study, anytime a Pinterest URL stopped ranking among the top 10 results on the SERP and another Pinterest URL started ranking top 10, it was considered to be a swap. Now, based on the patterns I’ve seen and as shown in the images above, generally speaking, Google gives a certain slot—or in some instances, slots—to Pinterest. The URLs that Google then swaps fall within a certain range of ranking positions. Thus, it makes sense to consider one Pinterest URL as being swapped for another, even if they are not at the same exact ranking position.
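To make that working definition concrete, here is a minimal sketch of how swaps and unique URLs could be counted from daily top-10 ranking snapshots. The data format, the sample rankings, and the function name are hypothetical illustrations of the counting rule described above, not the actual pipeline the Semrush team used.

```python
def count_pinterest_swaps(daily_top10, domain="pinterest.com"):
    """Count URL swaps for one keyword across daily top-10 snapshots.

    daily_top10: list of daily snapshots, each a list of (position, url)
    tuples for the ranking results tracked that day. Per the definition
    above, a "swap" is counted whenever a URL from `domain` drops out of
    the top 10 on a day when a different URL from that domain enters it,
    regardless of the exact positions involved.
    """
    swaps = 0
    seen_urls = set()
    prev = None
    for snapshot in daily_top10:
        current = {url for _, url in snapshot if domain in url}
        seen_urls |= current
        if prev is not None:
            dropped = prev - current
            entered = current - prev
            if dropped and entered:
                swaps += min(len(dropped), len(entered))
        prev = current
    return swaps, len(seen_urls)

# Hypothetical usage: three days of rankings for one keyword.
day1 = [(3, "https://www.pinterest.com/pin/111"), (7, "https://example.com/a")]
day2 = [(4, "https://www.pinterest.com/pin/222"), (7, "https://example.com/a")]
day3 = [(4, "https://www.pinterest.com/pin/222"), (9, "https://www.pinterest.com/pin/333")]
swaps, unique_urls = count_pinterest_swaps([day1, day2, day3])
print(swaps, unique_urls)  # 1 swap (pin 111 -> pin 222), 3 unique Pinterest URLs
```

Counting it this way deliberately ignores exact positions, which mirrors the study’s decision to treat any drop-out/entry pair as a swap.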
However, as noted above, this study includes any instance of swapping even if the swap represents a discrepancy in ranking position “range.” This is simply a limitation to note. Also, approximately 1,400 keywords per device is not a small sample. At the same time, it is not as if a million URLs were analyzed. This, too, is something to consider. Similarly, the data collection window covered 30 days. These days were chosen because, as a continuum, they reflected days of average volatility (so as to increase the accuracy of the data), but, all in all, a larger window could, in theory, yield different results. With that, let’s get to the data itself.

How consistent are Pinterest URL rankings on the Google SERP?

Just 43% of the keywords studied presented the same Pinterest URL on the desktop SERP over the entire course of the 30-day data period. Meaning, for the other 57% of keywords, Google did not use the same Pinterest URL on the SERP over the course of the month. On mobile, that share jumps up to a full 60%. Pinterest URL diversity on the SERP is the norm, which means you should, as a rule, expect your ranking Pinterest URLs to be replaced on the SERP at some point. In other words, volatility is the rule rather than the exception when it comes to specific Pinterest URLs ranking on the SERP (again, Pinterest as a domain is very consistent, but we’re concerned with specific creators here, not the platform). The question is, how volatile are specific Pinterest URLs on the Google SERP? To phrase it another way: How many unique Pinterest URLs is Google utilizing over the course of a month? Is your Pinterest pin or board and its URL sharing the SERP with just one other Pinterest URL? What’s the organic market share like for specific Pinterest URLs on the SERP? According to the data, Google swaps Pinterest URLs an average of six times per month and utilizes three unique Pinterest URLs when doing so. In other words, you can expect to share the SERP with two other Pinterest URLs (other than your own) each month. What’s more, you can also expect your URL to be swapped an average of two times per month. For creators relying on organic traffic from their Pinterest uploads, that’s not exactly a picture of stability and stands in sharp contradistinction to our a priori understanding of Pinterest from a domain perspective.

When Google swaps Pinterest URLs: Patterns and observations

Big data is great and the insight it affords can indeed be illuminating. Still, I typically find that there’s a level of nuance that can only be surfaced by looking at specific instances. With that in mind, let’s dive into some of the patterns I noticed while analyzing specific cases of Google swapping Pinterest URLs on the SERP.

Simultaneous consistency and volatility among Pinterest URLs on the Google SERP

While the data does show Google has a propensity to swap the Pinterest URLs it ranks on the SERP, this volatility does at times coincide with stability. Specifically, there is a pattern where Google will show one Pinterest URL consistently on the SERP within a position range for the entire course of a 30-day period (perhaps longer, but I only looked at a 30-day period). At the same time, Google may also rank a secondary Pinterest URL at a slightly lower ranking position. This is exactly the pattern seen in the example below: The URL represented by the purple line consistently ranked on the SERP over the entire 30-day period analyzed.
Below it, represented by the yellow, pink, and orange lines, was a secondary Pinterest slot on the SERP where Google oscillated between three different Pinterest URLs (or no secondary Pinterest URL at all, depending on the day). Practically speaking, it is entirely possible to experience significant volatility while tracking one of your Pinterest URLs, while another Pinterest URL sees relative stability for the same keyword on the SERP. In terms of real numbers across the dataset we tracked, 50% of the time Google showed two Pinterest URLs on the SERP simultaneously. There is overlap, and a good amount of it: While there are days when Google truly swaps one Pinterest URL with another, there are also days when Google might show both URLs only to remove one of them a day or two after that. Search intent when Google ranks Pinterest pins and boards It is possible that, even though your Pinterest URL for your particular pin is being swapped, the Pinterest URL that replaces yours also contains your pin. This is because Google doesn’t merely swap a Pinterest URL to a specific pin with another URL to a different pin. Rather, Google sometimes replaces a URL to a specific pin with a collection of pins (a Pinterest board). For example, take the keyword mens ring ruby which (as shown earlier on in this article) reflected multiple instances of Google swapping Pinterest URLs. In one case, Google swaps a link from this specific pin: To a collection of pins, as seen here: It is possible that the specific pin shown previously can appear in the collection above. However, even if that were to be the case, a link to your specific pin is obviously of greater value. Take the instance below, for example: The dominant Pinterest URL is to a board (you can tell by the URL structure, just for the record). There’s a secondary URL it tests out (reflected in the orange line), which is considerably less consistent than the board shown in purple. The same can be seen in the rankings for the keyword combat workout: Yes, Google does experiment with an alternative Pinterest URL, but both reflect boards, not specific pins. The same thing goes for the keyword wooden family tree but in the inverse, Google experiments with multiple Pinterest URLs on this SERP; all of them pins, none of them boards: For whatever reason, it seems Google sees the intent of the keyword as either being relevant for a specific Pinterest pin or the opposite, that the user would be better served with a Pinterest board. The types of keywords predisposed to more Pinterest URL swapping Some keywords are subject to Google swapping two Pinterest URLs just once on the SERP each month, while some see Google swapping five or six URLs back and forth, perhaps ten times over the same period. Why? Why do some keywords see so much “Pinterest URL swapping” while others don’t? It’s hard to determine a definitive reason here. In fact, it’s impossible to say unless Google itself released a statement as to why. However, there are some patterns within the dataset that may possibly explain why some keywords lend themselves to more Pinterest URL swapping than others. While I’m not privy to the exact thinking around what about each keyword lends itself to one intent over the other, it is interesting to see how specific Google is here. The most notable trend, although it does not account for all instances, is that the more obscure the “item” represented in the keyword, the fewer swaps. 
For example, the following keywords saw either one or two Pinterest URL swaps: Dollar tree decorations Puppet makeup Manor lake australian labradoodles Laundry room storage Screaming needle I would imagine that the more obscure the reference, the less content with which to conduct the URL swapping. Conversely, the keywords below saw between 10-15 swaps: World map watch Silver bengal cat Vintage windbreaker jacket Brick paint colors Old lady costume Again, the more mainstream the item is, the more Pinterest content at Google’s disposal with which to execute the swaps (all other things being equal). Is this 100% why certain keywords exhibit less stability with their ranking Pinterest URLs? No, there are many instances within the dataset that contradict my point above. However, again, there does seem (at least to me) to be a pattern where more obscure sorts of keywords tend to exhibit less Pinterest URL swapping. Pinterest URL consistency inside SERP features Pinterest URLs can be a real factor inside of Google’s various SERP features. Similar to the analysis above, the Semrush team pulled some data related to Pinterest URL consistency within two prominent SERP features: featured snippets and image packs. Pinterest URL consistency: Featured snippets Believe it or not, Pinterest URLs are used in featured snippets. In the US alone, the domain has earned featured snippets for 9,400 keywords. Within the smaller dataset we analyzed for this study, there were no featured snippets that contained a Pinterest URL for the entire 30-day period. (Again, that is a number to take with a grain of salt as the dataset here is somewhat limited in that it reflects about 1,500 keywords and not all of them will generate a featured snippet). Still, when Pinterest URLs were used within the featured snippet, the swapping continued. When Google displayed a Pinterest URL within a featured snippet at least once over the 30-day period, the search engine utilized (on average) four other URLs over the same period (for a total of five different URLs seen within the featured snippet on average over the data period). However, not every URL Google swapped in these instances was from Pinterest. Of the five URLs Google used within these featured snippets over the 30-day period, 56% of them were Pinterest URLs. So, while Google tends to give Pinterest a ranking slot (or two) on the SERP and oscillates between various Pinterest URLs in these slots, this is not the case for featured snippets—at least not to the same extent. With featured snippets, Google is not locked in to giving Pinterest (as a domain) the slot and merely swapping various Pinterest URLs. Rather, Google only replaces one Pinterest URL with another Pinterest URL just over half of the time. For the record, on average, it would appear that each of the five URLs gets about two “spots” in the featured snippet, as we noticed that Google swapped the URLs 12 times over the 30-day data period. That is, the same five URLs (just over half of which were Pinterest URLs) constituted a total of 12 different URL swaps over the 30-day data period. Meaning, Google used a URL in the featured snippets, swapped it with another, and then reused the already displayed URL again at some point (as a rule). Pinterest URL consistency: Image packs As is to be expected, one of the more prominent places that Pinterest URLs can appear is within Google’s image pack. Accordingly, the Semrush team also pulled data on how often Google was swapping Pinterest URLs inside the image pack. 
To start, the average image pack includes links to 13 URLs on desktop and 10 on mobile. Of those URLs, only 13% of them come from Pinterest on desktop and just 9% on mobile. Google seems to have swapped these Pinterest URLs 13 times on desktop and 15 times on mobile over the course of the 30 days. Note: This doesn’t indicate whether Google is swapping Pinterest URLs within the SERP feature more often than it does for URLs from other domains. Why so much swapping? Why isn’t there a more consistent showing on the SERP for Pinterest URLs? Clearly, I cannot offer a definitive answer—I am not Google. All I can do is offer my best theory. To me, this is all about the nature of images and intent. If you recall, Google, on average, executes six Pinterest URL swaps for keywords that display a Pinterest URL among the top 10 results. That number more than doubles when you look at the image pack, where Google executes 13 swaps (again, this is the number of total swaps, not unique URLs used for swapping). Moreover, while I don’t have specific numbers, the Semrush team did mention that image pack URLs are often moved around in terms of position and even entirely removed from the SERP feature. To me, this tells a lot of the story. Google sees images as being “dynamic.” Whatever the reason, Google tends to not treat images in a static way on the SERP when possible. Personally, I think this is because there are so many varieties and variations to the images that reflect a given product or topic, etc. Having a limited and fixed set of images to reflect the topic or product doesn’t align with the very nature of visual representation, which is often nuanced and extremely varied. Think about the images Google shows for “Batman”—if it went with the same five images for all eternity, that would not reflect the diverse way the topic can be visually represented. This comes into play on the main SERP as Google has limited space to show images (as opposed to Image Search per se). From a search intent point of view, Pinterest URLs are present to serve as access to images. It’s a way to provide users with an entry point to see an image that aligns with the search query they entered. If we think about a ranking Pinterest URL as an image on the SERP (instead of as an organic result), then you can make sense of why there is so much volatility: Google is treating the URLs within the organic results much the way it treats images in an image pack. This might be why we generally don’t see the same pattern with Amazon. Google is not showing one specific Amazon product URL one day and a different one the next. Google simply shows a URL to a set of Amazon results—not so with Pinterest. In the chart below, while Amazon ranks with one consistent URL, Google swaps a variety of URLs to specific Pinterest pins over the course of the month: Why? I think it’s because Google treats Pinterest URLs like an image. And, images need diversity, not stale, static, and therefore generic representation. Pinterest rankings need qualification Tracking rank sounds easy, but it’s not. Doing it in a way that makes good sense and that doesn’t end up being a bit of a vanity metric can be hard. All the more so when trying to define the organic performance of your Pinterest pins on the SERP. Seeing that your pins or boards rank well at a given moment, based on what we’ve seen above, is not enough. 
In these cases, you simply can’t sit back and assume traffic is coming in because at a specific moment in time your Pinterest URLs rank well (not that you really ever should have such a mindset). As we’ve seen, there is an unusual amount of volatility with Pinterest URLs on the SERP. Taking that into account when assessing performance, reporting, and certainly when predicting future performance is highly recommended. Mordy Oberstein - Head of SEO Branding, Wix Mordy is the Head of SEO Branding at Wix. Concurrently he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
- What AI content generators mean for the future of search
Author: Mordy Oberstein

How the web—and search engines in particular—handle the proliferation of AI-written content (as sparked by OpenAI’s breakthroughs with ChatGPT) is the central question facing SEOs and content creators at the moment. Foretelling how AI-written content will shape the future of search and SEO is multi-faceted in that it includes both how search engines algorithmically handle that content and how search engines themselves will incorporate AI chat functionality into their ecosystems. Both of these issues are complex in their own way and, at this point, no one has all the answers. The best I can do is apply my SEO outlook—with respect to how search engines have evolved—to the situation at hand in order to describe what I see as Google’s inevitable reaction to AI-written content and AI chat technology.

Table of contents:
The problem AI-written content poses for the web
Google’s inevitable response to AI-written content
The potential emphasis on domain-level metrics
Concerns for SMBs and ranking in the era of AI-written content
How Google can combat the over-proliferation of AI content
AI chat technology on the SERP

The problem AI-written content poses for the web

In my opinion, the place to start this exploration is not within the scope of SEO. Rather, I think we should examine the (perhaps obvious) problem AI-written content presents to the web, as web content is the clay that search engines use to mold their result pages. Before anything else, it’s vital to understand just how powerful AI content generators are—and I don’t mean the power and the proficiency of the technology per se. Rather, the power AI content generators have to solve a very serious pain point for most of the web: content is hard. If you come from a content background like myself, it’s sometimes difficult to appreciate just how hard it is to create “high-quality content.” Content creation is really an art form. In my humble opinion, creating strong content relies on the ability both to connect to the recesses of your persona(s) and to deliver that connection in a scaffolded, methodical, and engaging manner, all at the same time. Content requires profundity and the unique ability to distribute that profundity into digestible chunks. At the same time, content is the lifeblood of the web. Without content, there is no such thing as a website. What a predicament for the average site owner: In a way, when a search engine like Google asks a site owner to “create good content,” it’s an unreasonable request. Being able to create a substantial series of content to fill a website is a unique skill. Just like the average person probably can’t change a car transmission, they also can’t create professional-level content. We get fooled into thinking this is a possibility because everyone can read and write. Being able to write and being able to create content are not one and the same. A site owner who first starts dipping their toes into content creation will quickly realize just how tough and time consuming it is. For the record, I’m not saying that the average site owner can’t create enough content to put forth a substantial website. What I am saying is that there is a ceiling here. In addition, the amount of time it takes to create content can be prohibitive for many site owners. So even if a site owner is a fabulous writer, what are the chances that they’ll have the time to create it? Content creation is a time-consuming process even for the best of us.
(Parenthetically, and with obvious bias, this is one of the great advantages of a platform like Wix: it frees up time to focus on content creation, which is why I believe the platform represents the future of the web in a certain regard.)

And now we arrive at an inflection point: AI content generators seemingly solve both of these pain points. They certainly save time, thereby making the content creation process more accessible, and they ostensibly spin up pretty decent content to boot. The temptation is real. To the unsuspecting person, AI content generators open up a huge world of possibilities and cost savings. In other words, the pain and struggle of content creation are so significant and the possible solution AI content generators present is so strong that, inevitably, the web will become filled to capacity with AI-written content. The problem, of course, is that AI-written content is not a panacea and is, in many cases, a gateway drug to the proliferation of thin, inaccurate, and unhelpful content. One could argue the web is already overfilled with such content. That’s why Google has done everything from releasing its “new” set of core updates that began in 2018 to the Product Review Updates in 2021 to the more recent Helpful Content Update. However, with the newfound capability and accessibility of AI content generators, there is going to be a proliferation of this sort of content unlike anything the internet has ever seen. It will be such a proliferation that Google will have to respond, because if it doesn’t, it faces criticism from discerning users for the irrelevance of its results. The question is, how will Google solve this inevitability?

Google’s inevitable response to AI-written content

Let me propose a wild scenario: What if every web page whose content answers the question what are snow tires? was created by AI? What if we went really wild with this hypothetical and said all of the content AI content generators spun up to answer this question was more or less the same? (Now that I put this into writing, the latter doesn’t seem that wild.) In such a scenario, if someone searched Google for what are snow tires?, how would Google know which page to rank if all of the content out there was of nearly identical quality? If all snippet-level content is equal, then what will rank at the top? In a world that may very well be driven by AI-written content, this scenario (while hyperbolic) isn’t that far-fetched. How much value does human experience lend to snippet-level topics that have been answered across the web a thousand times over? What new insights are you going to add to a static topic like what are snow tires? that haven’t already been offered before? Snippet-level content has the potential to be a great fit for AI content generators, assuming the technology allows for topical accuracy. So flash forward five years, when all of this content will (in theory) be written by AI—how does Google decide what to rank for the query what are snow tires? (or whatever snippet-level query) when all of the snippets are relatively the same?

AI-written content and the search engine’s emphasis on domain-level metrics

The problem I laid out above, to borrow an SEO cliche, is “old, not new.” The truth is, AI-written content amplifies the quality conundrum that already faces the web. There is a proliferation of mediocre content on the web today.
The web is, unfortunately, full of fluff content that is more concerned with ranking, traffic, or whatever acquisitional metric, than with helping its target audience. Google has the same problem with this content as it does with AI-written content. A less-than-stellar site owner can spin up a lot of snippet-level content without exerting a ton of effort, as again, the nature of this content isn’t exactly prolific. It’s for this reason that “quality” has long been a domain-level metric for Google. (For the record, it’s a long-standing myth among SEOs that Google doesn’t rank sites and instead only ranks pages). Meaning, if the pages on a site for snippet-level content are of “adequate” quality but the other pages that target deeper content needs are not, the performance of those snippet-level pages would be negatively impacted (all other things being equal). This concept culminated with the advent of the Helpful Content Update, which according to Google’s own documentation: “ …introduces a new site-wide signal that we consider among many other signals for ranking web pages. Our systems automatically identify content that seems to have little value, low-added value or is otherwise not particularly helpful to those doing searches.” This issue really comes into focus within niche topics or once you begin to move past surface-level understanding. To me, this is why Google explicitly focuses on sites that don’t offer a level of nuance and depth in their content. When explaining what sites should avoid (within the context of the Helpful Content Update) the search engine asks content creators: “Did you decide to enter some niche topic area without any real expertise, but instead mainly because you thought you'd get search traffic?” Simply put, ranking snippet-level content (that doesn’t really vary from one site to the next) is contextualized by how the site handles content that should be very differentiated. The performance of snippet-level content that is easily churned out doesn’t exist in a vacuum, but is dependent on how well you handle more niche topics and subject matters that require more nuance and expertise. In other words, ranking is a semantic proposition. What you do on the site, as a whole, impacts page-level performance. And it’s not just a matter of “quality” in the sense that the pages don’t present a negative user experience (i.e., intrusive interstitials or filled to the brim with ads). Quality is far more holistic than that and far more semantic than that. Quality, with regard to Google’s algorithms, very much overlaps with relevance. Google doesn’t consider it to be a quality experience if the user is bombarded with all sorts of topics that are not interrelated when navigating through a website (and rightly so). Is it really a quality site or a quality experience if the user encounters a lack of topical cohesion across the site? A website should have an “identity” and it should be very clear to the user what the function of the site is and what sort of information they might expect to find on it. Don’t take my word for it, here’s what Google again advises site owners to consider when looking to avoid being negatively impacted by the Helpful Content Update: “Does your site have a primary purpose or focus? Are you producing lots of content on different topics in hopes that some of it might perform well in search results?” Having a strong content focus (and the quality of that content itself) sends signals to Google about the entire domain. 
This is the answer to the AI-written content conundrum: Domain-level metrics help Google differentiate sites for ranking when the content at the page level is generic. Google will inevitably double down on these domain-level signals as it already has with the Helpful Content Update. To apply this to our original construct (where all of the snippet-level content answering what is a snow tire? is written by AI and is therefore relatively indistinguishable), ranking this content will involve looking not just at the content on the page, but also at how the site deals with the topic across the board. If two pages have basically the same AI-written content about what a snow tire is, Google will be forced to look at the strength of the domain itself with regard to snow tires. Which site has a prolific set of content around snow tires? Which has in-depth, niche knowledge? Which site goes down the snow tire rabbit hole and which site has a heap of snippetable AI-written content? Parsing out the quality of the domain is how a search engine will function in spaces where AI-written content has legitimately made generic all of the content that answers a top-level query—which makes a great deal of sense. Don’t look at the query what is a snow tire? simply as someone looking to get an answer to a specific question. Rather, zoom out. This person is looking for information about snow tires as a topic. Which site then makes the most sense for them to visit: a site that has a few pages of generic information about snow tires or a site that is dedicated to diving into the wonderful world of tires? Domain-level metrics also make sense without the problem AI-written content poses. All AI-written content does is make this approach an absolute necessity for search engines and push them to place the construct at the forefront of how they operate. In an era of the web that will be saturated with content that lacks a certain degree of differentiation (i.e., AI-written content), what you do on the other pages of your site will increasingly be as important as what you do on the page you want to rank. Google will, in my opinion, differentiate that which can’t be differentiated by redoubling its focus on domain quality.

My concerns for SMBs and ranking in the era of AI-written content

What worries me the most about the above-mentioned ranking construct (i.e., one that is heavily weighted on the strength of the domain overall) is that it might leave smaller sites in a bit of a pickle. A site competing for a snippet-level query with AI-written content (similar to all of the other AI-written content around that topic) will be able to rely on the strength of its other content to rank here, according to what I’m proposing. A large site with an efficient content team that has created all sorts of topical clusters related to the overarching topic should thrive, all other considerations being equal. However, a smaller site (typically run by a smaller business) does not have those resources. So while they may have strong content, they may lack quantity. In a world where semantics rule the ranking day, such sites would (in theory) be at a disadvantage as they simply would not be able to create the strong semantic signals needed to differentiate the pages that target snippet-level queries. One could argue that, given the current ecosystem, this is already a problem. While there might be something to that, if Google increases its emphasis on domain-level metrics, the problem will only increase exponentially—in theory.
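To picture what a domain-level “topical focus” signal could look like in principle, here is a rough sketch that scores how tightly a site’s pages cluster around one topic using off-the-shelf text embeddings. The model choice, the sample pages, and the scoring approach are illustrative assumptions on my part; nothing here is documented Google behavior.

```python
import numpy as np
from sentence_transformers import SentenceTransformer  # assumption: any text-embedding model would do

def topical_cohesion(page_texts):
    """Score how tightly a site's pages cluster around one topic.

    Embeds each page, computes the site's centroid, and returns the mean
    cosine similarity of pages to that centroid. Higher = more focused site.
    Purely illustrative; not a signal Google has published.
    """
    model = SentenceTransformer("all-MiniLM-L6-v2")
    embs = model.encode(page_texts, normalize_embeddings=True)
    centroid = embs.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    return float(np.mean(embs @ centroid))

# Hypothetical comparison: a tire-focused site vs. a scattershot one.
tire_site = [
    "What are snow tires and when do you need them?",
    "Studded vs. studless snow tires compared",
    "How to store all-season tires over winter",
]
mixed_site = [
    "What are snow tires and when do you need them?",
    "Best air fryer recipes for beginners",
    "How to refinance a mortgage",
]
print(topical_cohesion(tire_site) > topical_cohesion(mixed_site))  # expected: True
```

The intuition matches the argument above: two pages with near-identical snippet-level content can still sit inside very different sites, and a site-level focus score is one plausible way to tell them apart.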
Experience and language structure: How Google can combat the over-proliferation of AI content

What about cases where content is not predominantly generated by AI? Even if most snippet-level content is eventually spun up by AI content generators, that still leaves many content categories that are probably not best-served by this method of content creation. How would Google go about handling AI-written content where the content demands more nuance and depth, and is a bit more “long tail” in nature? Again, I don’t think search engines will be able to ignore this problem as AI content generators “solve” an extreme pain point that will inevitably lead to mass (and most likely improper) usage. Google will have to figure out a way to “differentiate” AI-written content if it expects to keep user satisfaction at acceptable levels. The truth is, we may have already been given the answer. In December 2022, Search Engine Journal’s Roger Montti reported on a research paper that points to Google being able to use machine learning to determine if content was written by AI (sort of like turning the machines on the machines). Of course, we don’t know for sure how (or even if) Google deploys this technology, but it does point to a logical construct: language structure can be analyzed to determine if an author is likely human or not. This is fundamentally the basis of a plethora of tools on the market that analyze chunks of text to determine the likelihood that they were constructed by AI (Glenn Gabe has a nice list of the tools you can use to determine AI authorship). The language structures humans tend to use contrast sharply with the language structures employed by AI content generators. The schism between the two language structures is so deep that a number of companies make a living analyzing the difference and letting you know what content has and hasn’t been written by humans. This is precisely what a tool called GLTR did with a section of this very article below: Notice all of the words in purple and in red—these indicate the content was written by a human, which it was (me). Now compare that with something ChatGPT spun up about how Google will rank AI-written content: There’s a far lower ratio of red and purple wording, indicating that this was written by AI. The importance of language structure cannot be overemphasized when differentiating human-written content from AI-written content. Should you use AI to spin up a generic piece of “fluff” content, you are far more likely to create something that seems like it was not written by a human. Going forward, I see Google differentiating low-quality content and AI-written content (which runs the risk of being low quality) by examining language, as that is exactly what machine learning algorithms do: profile language structures. Profiling language constructs is very much within Google's wheelhouse and is a potentially effective way to assess human authorship and quality overall. What’s more, this perfectly aligns with both the extra “E” (for experience) in E-E-A-T and Google’s guidance around Product Review Updates, both of which focus on first-hand experience (as in, that which AI cannot provide). How can Google know if you have actual experience with something? One way is language structure.
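As an aside, here is a minimal sketch of the token-predictability idea behind GLTR-style tools: for each word in a passage, check whether it is one the model itself would have ranked highly. The model choice (GPT-2 via Hugging Face transformers), the top-k cutoff, and the sample sentence are my own assumptions for illustration; this shows the general class of technique, not Google’s actual detection method.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def share_of_top_k_tokens(text, k=10):
    """Fraction of tokens that the model ranks in its own top-k predictions.

    AI-generated text tends to stick to highly predictable (top-ranked)
    tokens; human writing strays from them more often, so a lower share
    loosely suggests a human author. Illustrative heuristic only.
    """
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    hits = 0
    for i in range(1, ids.shape[1]):
        # Predictions for token i come from the model's output at position i-1.
        top_k = torch.topk(logits[0, i - 1], k).indices
        hits += int(ids[0, i] in top_k)
    return hits / (ids.shape[1] - 1)

sample = "Great on carpet, but not for pet hair on carpet, oddly enough."
print(round(share_of_top_k_tokens(sample), 2))  # lower share = less predictable, more "human" phrasing
```

The example that follows shows the same intuition in plain language, without the model.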
Imagine you were reading reviews on the best vacuum cleaners on the market and, in describing these “best” vacuum cleaners, one page wrote, “Is great on carpet,” while another page wrote, “great on carpet but not for pet hair on carpet.” Which of these two pages was probably written by someone who actually used the darn thing? It’s obvious. It’s obvious to us, and it’s also not far-fetched to think that Google can parse the language structure of these two sentences to realize that the modification of the original statement (as in “but not for pet hair on carpet”) represents a more complex language structure, one more closely associated with text based on actual first-hand experience. Aside from domain-level metrics, language structure analysis, to me, will play a vital role in Google determining if AI wrote the content and if the content is generally of sound quality.

The integration of AI chat technology into the SERP

Let’s talk a bit now about the other elephant in the room: the integration of AI chat technology into search engine results. Clearly, search engines will integrate AI chat experiences into their result pages. I say this because, from Bing to You.com, they already have. Google (at the time of writing this) has indicated that AI chat will be a part of its ecosystem with the announcement of Bard. The question is, what will these systems look like as they mature and how will they impact the ecosystem? More succinctly, will AI chat on search engines be the end of all organic traffic? Understandably, there’s been a lot of concern around the impact of AI chat experiences on organic clicks. If the search engine answers the query within an AI chat experience, what need will there be for clicks? There’s a lot to chew on here. Firstly, for top-level queries that have an immediate and clear answer, the ecosystem already prevents clicks with Direct Answers (as shown below). Is this Google “stealing” clicks? Personally, I don’t subscribe to this view. While I do think there are things that Google can improve on to better the organic ecosystem, I don’t think abolishing Direct Answers is one of them (also, every ecosystem has areas for improvement, so don’t take my statements here as being overly critical in that way). I think the web has evolved to the point where users want to consume information in the most expeditious manner possible and Direct Answers fill that need. To the extent that AI chat features within search prevent clicks, we need to consider this dynamic as well. Is the chat stealing clicks or simply aligning with the user’s desire to not have to click in the first place? If it’s the latter, our problem as SEOs is with people, not search engines. However, because of how frequently users might engage with AI chat features, including organic links for the sake of citation is critical to the health of the web—both in terms of traffic incentives and in terms of topical accuracy. It’s vital that users know the source of the information presented by the AI chat feature so that they can verify its accuracy. I’ve seen many instances where these chat tools present out-of-date information. It’s really not that hard to find at this point, so including citations is key (what would be even better is if the providers pushed their tools to be more accurate, but hopefully that will come with time as we are still in the infancy of this technology’s application).
Take this result from Neeva’s AI chat feature as an example: The result implies that the Yankees have a good defensive shortstop (the player who stands between second and third base). This was true…at the start of the 2022 season, as indicated in the first citation within the chat’s response: Fast forward to the end of the season and there were many concerns about one particular player’s defensive abilities: At least with citations, a user might be clued into the potential issues with the results (again, the better path would be for AI chat tools to evolve). The point is that citations are very important for the health of the web both because they contextualize the answer and because they enable a site to receive traffic. This is even a point that Bing acknowledged in its blog post outlining how its AI chat experience functions: “Prometheus is also able to integrate citations into sentences in the Chat answer so that users can easily click to access those sources and verify the information. Sending traffic to these sources is important for a healthy web ecosystem and remains one of our top Bing goals.” I’m actually very happy to hear Bing say that. I think a lot of the organic market share has to do with how the search engines decide to present their chat results. In Bing’s case, the AI chat feature sits to the right of the organic results (on desktop)—not above them. My eye initially sees the organic results, then the summary from the AI chat feature. You.com, for example, makes you move from the initial organic results to a specific tab and then places organic results to the right of the chat box. Search engines will need to be responsible in how they present their AI-produced content so as to maintain a healthy and functioning web. And again, a lot of that does not come from the functionality per se, but from how these search engines go about accenting the AI content with organic opportunities. As AI chat ecosystems evolve, more opportunities for clicks to sites might exist. Personally, I don’t think the novelty of these tools is in their function as a direct answer. For that, we already have Direct Answers and featured snippets. The novelty, to me at least, is in their ability to refine queries. Look at the conversation I had with You.com’s chat feature about pizza in NYC: Here, the lack of URLs within the chat was a major limitation to my overall user experience. I think the example above (i.e., query refinement) is where users will find chat tools more useful and presenting relevant URLs will be critical. To be fair, there are organic results to the side, but I would have much preferred (and even expected) the chat to offer me two or three curated URLs for local pizza places that fit my criteria. Parenthetically, you can see how this format might wreak havoc on rank tracking (should URLs be placed at each stage of the chat). How valuable is ranking at the top position when there is a URL within a citation provided by the AI chat experience? Will rank trackers see those initial citations? Possibly, but as you refine the query with additional chat prompts as I did above, they certainly won’t be able to! AI chat integrated into the SERP could put a far greater emphasis on data sources like Search Console (where you can see the total impressions), and may make rank tracking within third-party tools less reliable than it currently is. So, does the integration of AI chat into the SERP mean the end of organic traffic? Probably not.
It would appear that search engines generally understand the need to incentivize content creation by pushing organic traffic and offering context to the user via citations and beyond. To again use Bing as an example, there seems to be plenty of opportunity to take notice of the organic results on the SERP below: My read on Bing is that it is using the chat functionality to accent search. I see the Bing SERP, just for example, as trying to use the chat feature to offer a more layered and well-rounded search experience—not to replace it. At a minimum, there are some early and encouraging signs that the search engines understand that they cannot leave organic traffic out of the equation.

AI content generation: A pivotal moment for the web and SEO

Over the course of my time in the SEO industry, I’ve seen all sorts of trends come and go. I’m still waiting for the dominance of voice search to materialize. AI content generators are not a fad. The problems that they “solve” are too attractive and the technology too advanced to ever be put back in the bottle. Search engines, as we’ve already seen, are going to incorporate the technology into their ecosystems. Search engines are also going to have to adapt their algorithms accordingly so as to handle the impending wave of AI-written content. Whatever the outcome of all of this is, I can say one thing with total certainty—I cannot remember a more pivotal moment for the web and for SEO.

Mordy Oberstein - Head of SEO Branding, Wix
Mordy is the Head of SEO Branding at Wix. Concurrently, he also serves as a communications advisor for Semrush. Dedicated to SEO education, Mordy is one of the organizers of SEOchat and a popular industry author and speaker. Twitter | Linkedin
- Live AMA: Understanding Wix's high performance and CWV scores
Have your questions answered in a live AMA with Wix’s Performance Tech Lead, Dan Shappir. Plus, take a deeper look into how Wix prioritizes performance and what this means for you and your clients. Read the Transcript Transcript: Understanding Wix high performance and CWV scores Speakers Brett Haralson, Community Manager, Wix Dan Shappir, Performance Tech Lead, Wix 00:00 Brett: Hey, hello everybody and welcome to this week's Partners live AMA with Dan Shappir. Today we're going to be talking all about Core Web Vitals understanding Wix's high score and performance. And let's just kind of jump into it. This is a 100% live AMA. So everything I'm about to ask, is what you've submitted. Dan, welcome, Dan is the head of Wix performance—Tech Lead here at Wix. Dan, everybody knows you as “Dan the Beast”, welcome. 00:27 Dan: Thank you very much. I'm so excited to be here and engage with the community. Looking forward to the questions, let’s see if they can stump me. 00:35 Brett: Yes, yes. Can you stump Dan? So by the way, the questions, here's just for those of you who are joining us, here's kind of the flow, what you can expect. To sign up to this and register you submitted some questions. I've got them all ready to go. However, I'm going to try to get some in chat. So as we're talking about things, if you want to ask Dan a question, please go ahead and do it. And I'll try to get to it towards the end. And also, there were a lot of questions that were submitted about, you know, what [are] CVW, what is SEO and any tips? We did that webinar with Dan and I and Dikla from Google. It [was] a little while ago, but I added it to the description if you want to go back and refresh yourself. So with that being said, Dan, it's been a while since we talked. And I think the only thing I can do to show where we've come is this right here. It's this graph, Dan. 01:31 Dan: Yeah, it's a great graph. And it's important to note that this is not our graph. This is a graph that's actually hosted on HTTP Archive, which is this open source project sponsored by Google. And the data that's coming in, that's feeding this graph, is data collected by Google with their Chrome user experience report and database, it's the same data that they then use for the SEO ranking boost, that you know, that performance can now give. So this is not Wix data. This is, you can call it objective data about Wix's performance as it's reflected by Google data. 02:16 Brett: So I think it's important to note too, that it's, it's not Wix, so I want to thank you for clarifying that. Wix has come really, really far. And I love this graph. And then I'm gonna jump into the questions. But I want to just spend two more seconds on this. I think it's really important to note that on this graph, if you go back to 2020, Wix is really at the bottom of the pack. Now, if you look at this, Wix is leading the pack. This is incredible Dan, this. What the heck are y'all doing over there? 02:46 Dan: I have really, yeah, I have to say that this has been a company wide effort, you know, this, I’d love to take credit for it. But really, hundreds of people at Wix have been working on this diligently. It's been designated as a top priority, strategic task across the entire Wix organization. And essentially, everybody in Wix R&D, Support, QA, Marketing, everybody has been engaged in pushing Wix’s performance, up and up, you know. 03:17 Brett: So it's funny, because it's all the results. Alright. 
We said, we said, you know, the Partners, that was one of the pain points, and we needed to be fast, we needed to load faster. And I remember saying and sitting down with so many Partners and you, and we're this is our top priority at Wix. This will happen. I remember even executive levels with some of our roundtable saying that it's so great to see this. I'm glad to be where we are. But we still have questions, Dan. Let's see. Let's see. So I'm going into the question bank, and I'm gonna start pulling questions. If you have some, please go ahead and drop them. I'll try to get to them towards the end. So here we go. First up, Jose wants to know, “Do I need to republish my site to benefit from performance improvements?” Now I saw this a lot. Dan, is there something the Partners need to do to see something happen? Are there any backend widgets they need to work on? Or does it just happen? 04:09 Dan: Okay, so let's distinguish between, let's say, modifying your website to get the most bang for the buck. And as you said, in the webinar that we did with Google, we did provide a whole bunch of tips and suggestions of things that you can do to get the most performant website that you can on top of the Wix platform. That being said, in order to just benefit from the improvements that we're making, you don't have to do a thing. One of the great things about Wix is that, you know, we don't break your website, you don't need to update plugins, you don't need to update themes. You don't need to worry about security or scalability. We take care of all these things for you. And the same goes with performance. If somebody built a website on our platform eight years ago, and didn't touch it since then, didn't publish it or anything, it's much faster now than it's ever been. 05:06 Brett: I don't know if anybody, I mean. I really think if you were to look that up in Websters, it would literally be defined as sorcery. I'm serious. That's incredible. That's really incredible. But I do have other questions, too. I think I'll touch on those that actually want to elaborate a little bit on that. But I'll circle back to that. So I'm tossing another one at you. Ray wants to know, “How can partners utilize these scores to help promote Wix to their clients?” And Wellington has a secondary follow-up question to that, “How can you correct the assumption that Wix is slow?” And you know, and Dan, I'll chime in here for just a second. I understand that, you know, as a Partner, your creativity, your business, is building a web presence for a client. And a lot of clients have different conceptions of something, or they may have seen an ad on Facebook or Google or something, and they're interested in this site. So— or this platform, a lot of Partners, I think, battle that—convincing their client to go a certain way, because it's what they love. What would you say about this to help Partners put a feather in their cap? 06:17 Dan: Well, first of all, let's start with the fact that as the graph shows, if we go back two, three years, four years, Wix was slow. You know, we did do a lot of work, we have come a long way, you know, I can give examples of other places where Wix has substantially pushed the envelope forward, like SEO, or with accessibility, where, you know, we knew that we needed to up our game, and we were able to do that. And performance is yet another example of this. 
So if we look at that graph, and compare to, let's say, two or three years ago, if you built a site on Wix, or you built a site on WordPress, then with WordPress, you would have been three times more likely to have a good Core Web Vitals score. Now, with Wix in the US, you're twice as likely to [have] a good Core Web Vitals score than you are with WordPress. So there's a definite improvement and shift here. So if somebody says, you know, “I heard that Wix is slow?” Well, the answer to that is your information is simply outdated. Wix has come a long way forward. And, you know, that's kind of the benefit of using a platform such as ours, that you get all these benefits and improvements over time, like we said before, without you needing to do anything. 07:48 Brett: Who doesn't love that? Dan, your information is outdated. I want a shirt, I'm gonna brand that. Me and the Partners are going to start wearing shirts that quote Dan the Beast, “Your information is outdated.” Wix is the GOAT. I saw that, by the way, that was great, who said that? Oh, gosh, it was great. We need more, we need more GOAT icons. Okay. So let me go to another question. I think you answered both of those. That was great. Thank you for that. Matt. I think this is Matt. This is a great, great question, “How can we view more detailed CWV metrics?” And more importantly, he wants to know, “Is it possible to import, export and share with clients?” I think this is a fantastic question, Dan. 08:28 Dan: Well, the great thing about Core Web Vitals and what Google [has] done is that they've kind of standardized the market around these, these metrics. And as a result of this, you can literally see these metrics in almost any performance measurement tool that you use. So currently, we don't yet show them in the Site Speed dashboard. And you know, you can take my use of the word yet as an indication of things to come. But you can definitely check them out in other sources. So for example, if you're interested in your own website, and if you have enough traffic, then you know, if you just go into the Google Search Console, you will see there is a Core Web Vitals tab in there. And you can actually get information about your Core Web Vitals for your own website within the Google Search Console. They will actually highlight, you know, which pages have good Core Web Vitals, which pages need improvement, and then you can kind of focus on those. So that's one place where you can see this information. Another place where you can see this information is in Google PageSpeed Insights, where you can literally put in any website, your own, your competitors, you know, like CNN, whatever, and if that website has sufficient traffic, you will see the Core Web Vitals information for that website. Now, unfortunately, PSI is kind of confusing, the way that the data is presented. A little bird at Google whispered in our ear that they're looking at revamping their user interface and hopefully making it clearer and more understandable. Because you kind of have the score at the top, which—it doesn't actually have to do with Core Web Vitals, it's actually based on lab data. 10:25 Brett: And I have a question about that, it’s queued up. So let's talk about that in just a second. Because that's interesting. I want to know about that. But Dan, I'm curious about what the Partners do. I know a lot of the SEO-specialized Partners have like a report that they show a lot of their clients to show how they're gaining local organic SEO traffic? 
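(Editor's note: To make the "export and share with clients" idea concrete, here is a minimal sketch, not an official Wix tool, that pulls the same Core Web Vitals field data Dan mentions from the public PageSpeed Insights v5 API. The endpoint and the `loadingExperience` / `lighthouseResult` fields come from the public API as documented; the exact metric key names should be verified against the current response shape, and the example URL is a placeholder.)

```typescript
// Sketch: fetch CrUX field data plus the Lighthouse lab score for a URL
// via the public PageSpeed Insights v5 API (Node 18+, global fetch).
const PSI_ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

interface CwvSummary {
  url: string;
  fieldLcpMs?: number; // Largest Contentful Paint, field data percentile (ms)
  fieldFidMs?: number; // First Input Delay, field data percentile (ms)
  fieldCls?: number;   // Cumulative Layout Shift, field data (the API reports it scaled by 100)
  labScore?: number;   // Lighthouse performance score, 0-100 (lab data)
}

async function fetchCwv(url: string, strategy: 'mobile' | 'desktop' = 'mobile'): Promise<CwvSummary> {
  const params = new URLSearchParams({ url, strategy });
  const res = await fetch(`${PSI_ENDPOINT}?${params}`);
  if (!res.ok) throw new Error(`PSI request failed: ${res.status}`);
  const data = await res.json();

  const metrics = data.loadingExperience?.metrics ?? {};
  return {
    url,
    fieldLcpMs: metrics.LARGEST_CONTENTFUL_PAINT_MS?.percentile,
    fieldFidMs: metrics.FIRST_INPUT_DELAY_MS?.percentile,
    fieldCls: metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile,
    labScore: Math.round((data.lighthouseResult?.categories?.performance?.score ?? 0) * 100),
  };
}

// Example: pull the numbers for one site into a small client report.
fetchCwv('https://www.example.com').then((summary) => console.log(summary));
```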
Are any Partners doing anything with performance, are you sending this to—just drop it in the chat? I'm curious. So I [because] I would Dan, what do you think? 10:53 Dan: Well, for sure. I mean, you know, you can, for example, that graph that we showed at the beginning that we said that it's from HTTP Archive. That graph is available to anybody. We can we, you know, we can share the link to that. And it's a really nice tool that the people that the HTTP Archive have created, because you can filter and compare various CMS platforms or website builders or eCommerce platforms, you can look at different geos. By the way, I highly recommend that you filter it for the particular geography that you're in. So for example, if you're selling in the States, and you want to compare to others, to other platforms, then, you know, filter that graph to the States or, UK or wherever, because that is a better indication of what you can expect. And then you can definitely just show that, you know, I'm going to build a website for you, if I build it with this platform, it's that much more likely to get a good Core Web Vitals score than if, you know, you build it with some other platform. 12:04 Brett: Yeah, and again, here, I think it's a really great opportunity here for Partners to share some of the other sites that they've done and show those scores. And you know, so I think this is a great question. And I think every Partner can handle it a different way. But I think it's a good conversation for us to have as a community, Partner, so whatever you do, I'm curious. Okay. 12:26 Dan: Yeah, just just one more comment on that one of my favorite posts on our Community group in Facebook, was this post where people started posting, you know, screenshots and grabs of their GT metric scores, and you know, boasting how far like, "We had a C, and now it's an A", and it's all green and whatnot. So that, you know, I really enjoyed watching that conversation. It was really great. 12:53 Brett: That's great. It's great. A lot of Partners are actually doing this. And by the way, there are some really good questions that have gone into chat that I've taken note of, so we may actually get to stump the GOAT today. Okay. So let's—I'm gonna keep going. So let's go to another one. Alright. So here's a great one. How do I check current performance and measure impact of site changes? Is there a way to see if I've made some changes? Maybe some changes that we've talked about in the previous webinars, Dan. I make those changes. How do I know if I've measured or if my performance is shifted? That's a good question. 13:31 Dan: So you know, all the tools that we've mentioned are totally relevant to measure your performance at any point in time. One of the great things about the Site Speed dashboard— currently it just shows the Time to Interactive metric, but it definitely shows it over time. So you can see, so you know, there's this nice graph in there that you can see how the changes that you're making, impact your site, or likewise, you can measure different points in time. One of the problems with the Google tools is, you know, it, actually, let me clarify that. If you use the Google Search Console, they use a moving average of, you know, looking at a month back, but it's from today, until a month back. In PageSpeed Insight, they only look at like month segments. So you need to take into account that changes that you make, will not show for example in PageSpeed Insights in the field section for about up to a month. 
So be aware of that when you're trying to measure the changes that you're making. So either use like a lab score to see whether the score is going up or down, you know, we'll talk a little bit about PageSpeed Insights and how to, you know, consider that score in a bit. So I don't want to go too deeply into that right now. But I will say that it's really useful for seeing whether you're improving or regressing, you know, so forget about what the actual score is right now. Just compare it to a score that you had before, see whether it's higher, or whether it's lower. And that's a great way and again, you can actually run it directly from within the Site Speed dashboard, you don't actually have to go to PageSpeed Insight. If you go to the Site Speed dashboard, in your Google, in your sorry, in your Wix dashboard, you can scroll down, and you can see your PSI score for both, Lighthouse score for both desktop and mobile. And you can click Refresh to rerun it again and again. So you can check the impact of changes that you made. Now, what I usually recommend for people to do—so first of all, you know, one of the great features, one of the best features, in my opinion, that we have in Wix, is our Site History. So you can always make changes. And then if you don't like them, well, you can just refer to a previous version. You know, it's useful for performance. But it's also useful, just you know, in general, if you're testing out various changes, and now we also have the, what's it called, the Release Candidates within the Editor that do like, which is an amazing feature, you can run like A B test. Now you can't A B test for performance, at least not yet. But— 16:17 Brett: Is that a not yet, Dan? Is that a not yet? Yeah, 16:20 Dan: We'll see. But, but you can, you can use that mechanism. Or you can even really go old school. And you can either duplicate the page, or duplicate even the entire site. And so for example, you can duplicate the page, make whatever changes you want, then, for example, use PageSpeed Insights to compare the score for this page and compare the score for that page. One more thing that I will say about Google's PageSpeed Insight, it's a known issue with that, that scores within it fluctuate a lot. So if you're looking at the PSI score, I would recommend for you to essentially run it several times, like I don't know, five times. And then take the average score, or something like that, or the median score, something like that. And not just, you know, run it once and assume that whatever you get is the actual, like, absolute score that you have. 17:18 Brett: I hope everybody's taking notes. I'm pretty sure that there are some notepads smoking right now, there's so much heavy writing or typing keyboards burning up. I think that whole segment just needs to be turned into a blog. Everything you just said needs to be a blog right there. 17:32 Dan: Yeah, that's probably gonna happen. That's incredibly good. Yeah, that's probably going to happen as well. 17:37 Brett: Okay, good. Good, because we need that. Alright, let's, I've got another good question. Rhen wants to know, now, this is kind of a double part here. Rhen wants to know, “Why his mobile PSI score is low?” And Ari wants to know, similar, but specifically about Stores. So maybe this is the same, or maybe they're different? I'll let you, I'll let you answer this. 17:58 Dan: So I'll start with the general one, about talking about the mobile PSI score. 
So you know, when you run, when you put in your website, or anybody's website, inside PSI, and you press the Go button, it does two things. Again, as we previously explained, if you have sufficient traffic, it will actually go and retrieve your field data from that Google database. But in addition to that, it actually loads your site on a Google virtual machine somewhere in the cloud, and does a whole bunch of measurements on it. So it effectively does a single session, and just tries to measure the performance of that particular session. Actually it does two sessions, one to measure desktop performance and one to measure mobile performance. In the case of mobile, Google is intentionally simulating a low-end device, the device that they're simulating in PageSpeed Insights is a Moto G4 phone, that's a phone that was released, like, the beginning of 2016. So it's over five years old. And they're using a simulated 3G network. So you know, our experience is that the vast majority of visitors to Wix websites have much better devices and connectivity than that. So it's not surprising. You know, sometimes people ask me, why do I see green Core Web Vitals, but I'm seeing, you know, a relatively low score in PageSpeed Insights, especially for mobile. Well, that's the reason. The reason is that your users, probably your actual users, your actual visitors, probably have much better devices and much better connectivity than what Google is simulating. Now why is Google simulating such a low-end device? Well, because they want to be inclusive, because, you know, we're living in a global economy. They want you to think about potential customers in Africa, or in Southeast Asia or wherever, where they might have, you know, not such powerful devices or slower connectivity than what you might have. And, in fact, they've recently written a blog post. We can share a link to that as well, although it's a bit technical, about why there is a potential significant discrepancy between their mobile lab scores, those simulated scores, and the actual field data. The important thing to note here is that the ranking boost within Google Search is just based on the field data. So the lab data that you're seeing in PageSpeed Insights has zero impact on the Google ranking algorithm. You can use it, you know, as an indication, like, you know, I want to move up the score, so I'm making changes, I can see the score going up, because it will take time until these changes are reflected in the field data. But it's important to remember that this is only used as a tool to give you an indication of what a low-end device might experience when visiting your site. I hope this was clear—kind of a technical explanation. 21:26 Brett: I feel like every time I ask a question, you pull out a book, open it, and start reading. And then we close the book and go to the next one. It's like the library of Dan here. I don't know what's going on. So yeah, it makes perfect sense to me, that makes perfect sense to me. 21:41 Dan: Now, going back to the specific part about Stores. So there are a couple of points I wanted to make here. The first and important point is that, you know, in many ways, a store site, or a blog site, or an event site, or a fitness site or restaurant or whatever. They're all just Wix sites, and most of the changes that we're making are essential infrastructure changes that impact every site, regardless of which Wix features it actually uses. 
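(Editor's note: Since the lab score fluctuates between runs, Dan's earlier advice was to run PageSpeed Insights several times and take the middle value rather than trusting a single run. A minimal, self-contained sketch of that idea against the same public PSI endpoint; the score field path is an assumption to verify against the live response.)

```typescript
// Sketch: run the PSI lab test several times and report the median
// Lighthouse performance score, per the "don't trust a single run" advice.
const ENDPOINT = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function labScore(url: string): Promise<number> {
  const res = await fetch(`${ENDPOINT}?url=${encodeURIComponent(url)}&strategy=mobile`);
  const data = await res.json();
  return Math.round((data.lighthouseResult?.categories?.performance?.score ?? 0) * 100);
}

async function medianLabScore(url: string, runs = 5): Promise<number> {
  const scores: number[] = [];
  for (let i = 0; i < runs; i++) {
    scores.push(await labScore(url)); // sequential on purpose: each call triggers a fresh lab run
  }
  scores.sort((a, b) => a - b);
  const mid = Math.floor(scores.length / 2);
  return scores.length % 2 ? scores[mid] : (scores[mid - 1] + scores[mid]) / 2;
}

medianLabScore('https://www.example.com').then((score) =>
  console.log(`Median mobile lab score over 5 runs: ${score}`),
);
```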
That being said, you know, it's not possible to move the needle equally across the board. So some aspects of Wix might be, let's call it further ahead, in terms of performance than others. But we're not stopping. We're not holding, you know, we'll talk about this later on, we keep on pushing forward. And, our goal is to be, you know, the fastest best option across the board. 22:45 Brett: Okay, we'll close that book. Let's open another one. So Daniel wants to know, “How well does Wix's performance scale with large databases and stores?” So is there like a breaking point where too much affects performance? Is there a sweet spot? 23:04 Dan: So we built Wix to scale, this whole change that we made with the introduction of dynamic pages and collections, and stuff like that was implemented exactly for this purpose. You know, it used to be that if you wanted to have lots of items within your Wix site, you basically just needed to build manually, lots and lots of pages. These days, that's not the way to go. You build a single dynamic page, you bind it to a collection, and off you go. And the great thing about that, is that, you know, the mechanism doesn't really care how many items are in the collection in terms of the performance of that dynamic page. Because these are databases running on fast servers, they're built to scale, there's literally no problem. Every page is wholly independent of the other pages in the site. So the fact that you know, you have one page, which is heavy, and another page, which is lighter, you know, the heavy page does not impact the lighter page. For example, that being said, you know, sometimes you show a lot of content within a single page. So for example, you might have a product catalog, or a blog feed, or gallery, or what have you or a repeater. And in that case, if you decide to display a lot of items within that, let's say catalog, that will result in a bigger page, and that page as a result will be heavier, and that will have an impact on performance. So usually, my recommendation is not to overdo it in terms of items on a page. You know, when reviewing websites, occasionally I see mobile pages that are 30, 50, even 100 screens long. And I, you know, I kind of asked myself, you know, who expects their visitors to scroll through 100 screens on their mobile device to find the item that they're interested in. If that's your approach, you're creating a cognitive overload for your visitors and it's unlikely that they will scroll through that entire page. And that huge page has a performance cost. We are working on mitigating it, we've done some work, we're doing more work to be able to handle bigger pages. But there are no free lunches. The more stuff you put on a page, you know, it will impact your performance. So generally speaking, in the context of you know, having large databases is, you know, you know, go wild to have as many items in your collection as you would like. But make sure not to try to overload your visitor with too many items on a single page. 26:01 Brett: It makes perfect sense to me, Dan, perfect sense. So for those of you who are just joining us, we are live, we're having a live AMA with Dan the man, the GOAT, the legend. And I'm taking questions that you have submitted, but if you have one that you want to ask, please drop it in chat. I'm gonna, I've got a few more to go. And then I'm going to go to some of your live questions. And we're going to keep going. So great question. And thank you, Dan. So let's jump to this. 
And I think you sort of asked this. I mean, I think you sort of answered this, but let's maybe go a little bit more in deep, a little more in-depth, how does adding content or functionality to a page impact and you kind of touched on that which is a perfect prelude to this question. So I'll ask it again, how does adding content to a page impact CWV? 26:50 Dan: Well, first, yeah, so as I said, there are no free lunches. The more stuff that you put on the page, the greater the impact on the page's performance. It's almost impossible to add stuff with zero impact. You know, like I said, we are doing all sorts of optimizations, like, for example, lazy loading images. So for example, [on] a Wix page, you know, you, we initially load low-quality images that, you know, are replaced with the final high-resolution images. The images that are below the fold, or, you know, outside the initial viewport that you need to scroll to get to, we only download them when you start scrolling towards that section. So we don't download them upfront. So in this way, we kind of tried to mitigate the impact of adding more content to the page. But like I said, at the end of the day, the more stuff that you put in, the heavier the page becomes, the bigger the HTML, the bigger, you know, more stuff. Now—so you do need to take that into account. And also, as I said, there's also the concept of perceived performance, or the cognitive overhead, the more stuff that you put on the page, the greater the load is on your visitor to try to figure out what that page is about. So don't just think about the performance in terms of how long it takes for the browser to load and display your content. Try to also think about how long it takes for the visitor to kind of comprehend what you're showing to them and being able to understand what your website is about, you know, what is your primary message that you want to get across. Which brings me to an important point—it's a term that's familiar in marketing, I don't know how many of our listeners are familiar with it, that's a call-to-action or CTA. It basically refers to that message or that action that you would like your visitors to perform. So for example, if it's a store, what obviously what you want for them to do is to make a purchase, if I don't know if, let's say you're a fitness trainer, you may want them to book an appointment or something like that. So, anything that is [conducive] to your CTA, you know has a place on that page. Anything that does not contribute to that CTA should probably be removed. It will improve your performance, it will reduce the cognitive overhead and will likely improve your conversion. And you know, sometimes I look at pages that are all messed up and you know what happens there begin— you know, somebody in the company wants to promote one thing and somebody else wants to promote another thing. So ultimately, they just tried to put everything in there and at the end of the day, that's just a bad idea. And you do need to try to figure out what your website is all about and try to focus on that. Another point that I would like to make is that not all components are created equal. You know, there are obviously some heavier things and some lighter things. So obviously a gallery is heavier than a single image. So when you're putting stuff, especially in the initial viewport, again, what is known also as above the fold, think about the stuff that you're putting in there. 
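(Editor's note: Dan describes lazy loading below-the-fold images as one of the mitigations Wix applies automatically. For readers curious what that pattern looks like in general, here is a generic browser-side sketch of the technique, not Wix's actual implementation; the `data-full-src` attribute and `lazy` class are made-up markup conventions for the example.)

```typescript
// Generic sketch of below-the-fold lazy loading with IntersectionObserver.
// Markup assumption: <img src="placeholder.jpg" data-full-src="hero.jpg" class="lazy">
function initLazyImages(): void {
  const lazyImages = document.querySelectorAll<HTMLImageElement>('img.lazy[data-full-src]');

  const observer = new IntersectionObserver(
    (entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target as HTMLImageElement;
        img.src = img.dataset.fullSrc!; // swap the low-quality placeholder for the real image
        obs.unobserve(img);             // each image only needs to be upgraded once
      }
    },
    { rootMargin: '200px' }, // start loading shortly before the image scrolls into view
  );

  lazyImages.forEach((img) => observer.observe(img));
}

document.addEventListener('DOMContentLoaded', initLazyImages);
```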
For example, I usually recommend for people to make sure that they have some text, at least some text above the fold, not just images, not just galleries, not just videos, but also some text, because that text will appear usually faster, and it will provide meaningful content for the person who's visiting your website. You know, I kind of strayed off from the original question. 30:59 Brett: I like it. I think there's—I hope people are taking notes. I mean, there's just so much knowledge. I kind of like it when you kind of wander off a little. It's still very interesting, but relatable, right? It's related to what we’re talking about. Can we close that book? And can I go to the library and pull another one out? 31:17 Dan: Yeah, for sure. Go for it. You know, okay. 31:20 Brett: So, here's a good one. I'm watching the chat. There's a couple questions. By the way, Patricia, your question I pulled and it's coming next. So hang tight on that one. What exactly is Wix doing for CWV for Wix Stores? Is it separate Dan? Is the performance different for eComm sites versus regular, earlier you said it's all the same. So I'll give you an opportunity to hit this nail on the head. 31:44 Dan: So it's kind of the same, but not exactly the same. So as I said, all Wix sites share the same infrastructure. And the same underlying technology. And the same core—let's call it code, by the way, and it's also true, whether you're using ADI, or the Wix Editor, or Editor X, whatever editor you used to build your website. It's all running on the same infrastructure, and using the same core code to actually render the site. And so as a result, improvements that we are able to make within that infrastructure, and within that core code impacts every Wix website out there. And by the way, I want to give—you know, use this opportunity to give this huge shout out to what is known inside Wix as the Viewer Company. That's the team working on the core code that displays websites, they made a huge improvement in terms of performance, they've effectively rewritten that entire component from scratch. Much of that upward trend that you saw on the graph is a result of their work. It's amazing work that they've done. And, and as I said, that impacts every Wix website of any type, regardless of the functionality that it uses. That being said, obviously, there are also some elements within a Wix Store website that are specific to Stores, like you know, the shopping cart icon, you only get that if you've got the Store functionality. Or you may add chat in a Store that you might not add, for example, in a Blog. And those things also have their own code. And we are working to improve the performance of all of these components. As I said, you know, some are further ahead than others. But obviously Stores [are] really important for us. And, and it's one that we're focusing a lot of effort on, in particular. And as you saw, when you looked at the graph, you know, I'll say it quietly, I think that one of the companies shown in that graph was Shopify. And they are as you can see, they're the one just behind us now. They're also making improvements. They've also upped their game in terms of performance, so everybody's kind of doing it with one exception. 
But, yeah, anyway, but we've managed, at least in the US for—no, well, not at least, but for example, in the US to actually pull ahead of them in terms of the performance of websites built on the platform, or more accurately stated, the percentage of websites built on our platform that get good Core Web Vitals versus the percentage of websites built on their platform that get good Core Web Vitals. 34:54 Brett: Yes, and I think it was either this week or last week, I saw a Partner drop an article that was specifically talking about how another platform is now trying to put together their own team similar to what Wix has done. Because it's evident that the performance that we've—the progress we've made in a year is incredible. And I love, here, and I'm not gonna, this is a question I'm gonna ask you in a minute. But I love how you're saying, you know, we've done well, but we're still there's so much more for us to do. And I love that, I just love that. So I'm gonna go back to the library, Dan. I'm gonna get another one. And then I've got a few that people have asked in the chat that are just outstanding, I want to do a little bit of overdrive and get to see if we can stump the man. Okay, here we go. Here we go. So Patricia wants to know, “Is it better to upload WebP files to make the page faster?” And then Gordon has sort of like a really close question to that. And this question is, “What is the best format to use for fast and clear loading images, specifically ones that are extra large?” 36:02 Dan: Yeah. So images, you know, they're a very interesting topic, because on the one hand, it's really obvious to everybody, you know, we want to have good clear pictures on the website. But then when you kind of start delving into this topic, there are a lot of technicalities in there. And also it turns out that there are a lot of myths. So first of all, I want to point out that in that webinar that we did, I discussed media in particular, so I highly recommend for people who are interested in this topic, to go back and check it out, you know, beyond what we just say in this AMA. Because there are a lot, there's a lot of useful information there about what you can do to get the most out of it. I also want to say that Wix has some of the best media people that I've ever encountered in the industry, working for it, you know, the Wix Media services are amazing, you get out-of-the box functionality that you need to, you know, purchase separately on other platforms. One of the things that we do is that we automatically optimize images for you. For example, when you crop and clip images, we just download the parts that are actually visible on the screen, we don't, so you can upload this huge image that contains, for example, you want to show a portrait of yourself, but you know, your favorite image is the one that you actually took on vacation, and there's a whole bunch of stuff all around. You can upload that huge image, then within Wix, within the Editor, just crop the part that you actually want to show, and you don't have to worry about it, we won't download all that stuff that's outside the cropped area. So that's one example of some of the optimizations that we automatically do for you. Another optimization that we do for you is to automatically use modern and optimized image formats. WebP is another one, we'll discuss maybe more of them when we talk about—if we have time to talk about future plans that we have. But you can upload your images as you know, standard JPEG or PNG formats. 
And we will automatically convert them to WebP for browsers that support it. So we actually recommend that you use the original format, don't convert to WebP yourself. There are some browsers out there that don't properly support WebP. And by uploading the original format, it enables us to use that on those older or less capable browsers, and then do the optimal conversion to WebP for browsers that actually do support that format. So you know, you don't have to worry about WebP, we take care of that for you. And as another advantage, when a newer image, a media image comes along, that's even better than WebP, we will use that—automatically. And again, you won't need to do anything. So just as an example, we talked before about old websites. A person who built their website, six years ago, seven years ago, before you know WebP was even out there. Well, they're now serving WebP automatically from their website, because we do this automatically for each and every Wix website out there. So that's one important note to make. In terms of the format to load, without going too much into details. It's generally preferable to use JPEGs over PNGs where possible. Sometimes you need PNGs because you need transparency, for example, maybe you're creating some sort of a parallax effect or something like that and you need that transparent background. But if you can make do without, then I would generally recommend to use JPEG, they result in smaller files. And they result in smaller WebP files. So JPEGs that are converted into WebP are smaller than PNGs that are converted into WebP. So that is what I generally recommend using. Oh, and do avoid GIFs if you can. GIFs you know, people use animated GIFs. I prefer animated clips, you know, video animated, we just use a looping video or something instead, because animated GIFs are huge. They don't get converted into WebP so it's just this GIF and I've seen websites where a single GIF was like three times bigger than the rest of the website. 40:43 Brett: So Patricia, I hope you got all of that. I hope, I mean, I know you're out there. I'm just curious. How do you feel about that response from Dan, because that was, it blew my mind too. And Sam, awesome. Thanks for the love man. That's, I agree, Dan, and everybody at Wix is doing a really good job. But we don't stop there. And that kind of leads me into my next question. Before I jump into the questions from our Partners that are viewing Dan, and I'm gonna ask for just a moment of overdrive. So one of the Partners actually asked this Simon wants to know, “So what are the next updates for Wix Performance? What's on the horizon?” So we've come a long way. Absolutely. But we're not stopping there Dan. Can you tell us all these—maybe in the future things, these air quotes we're using. What’s next on the agenda? 41:33 Dan: So obviously, you know, putting all the required restrictions and whatever about forward looking statements and whatnot, you know, we have plans, but then, you know, fate intervenes. But that being said, we are definitely not stopping. One thing that I do want to know, if you look at that graphic, and if you can put it up again, you will see that some in some months, we move forward, and then we kind of we kind of—it seems like we stop, and then we move forward again, you know. So I can't promise that we will be able to move forward at the same rate in each and every month. But we have put systems in place that first of all are intended to prevent regression. 
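(Editor's note: Stepping back to the image formats Dan covered just above: he explains that Wix serves WebP only to browsers that advertise support and falls back to the original format elsewhere. As a generic illustration of that kind of negotiation, not Wix's media pipeline, a server can pick the best format a browser accepts from its `Accept` header.)

```typescript
// Generic sketch: choose the most compact image format a browser says it accepts.
// Modern browsers send an Accept header such as "image/avif,image/webp,image/*,*/*".
type ImageFormat = 'avif' | 'webp' | 'jpeg';

function pickImageFormat(acceptHeader: string | undefined): ImageFormat {
  const accept = (acceptHeader ?? '').toLowerCase();
  if (accept.includes('image/avif')) return 'avif'; // smallest, newest, not universally supported
  if (accept.includes('image/webp')) return 'webp'; // widely supported modern format
  return 'jpeg';                                     // safe fallback for older browsers
}

// Example: the same uploaded JPEG would go out as AVIF or WebP to newer browsers,
// and untouched everywhere else.
console.log(pickImageFormat('image/avif,image/webp,image/apng,image/*,*/*;q=0.8')); // "avif"
console.log(pickImageFormat('image/webp,image/*,*/*;q=0.8'));                        // "webp"
console.log(pickImageFormat(undefined));                                              // "jpeg"
```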
So we don't expect to see ourselves ever going backward. And we do intend, and we are continuing to push forward. So overall, you will continue to see that graph keep on going up and up and up and up. For sure. And we do have a lot of stuff on our plate. You know, there are people at Wix, even right now, specifically working on performance related advancements to our platform. So you know, to give an example of something that just got recently rolled out. So it's already out there. But it came out so recently that it's not yet impacted that graph. It's support for HTTP/3. HTTP/3 is like the little one of the latest and greatest web standards, really, really new, not widely used. And it improves the performance of downloading content from the web servers down to the browsers. And we've stopped and we've rolled it out. So we use HTTP/3 where we can and it can deliver content much faster. So that's an example of something that's already been deployed, but is not yet impacting that graph that you showed before. Something else for example, that we're looking at, I mentioned before, that we're looking at support for newer media formats. So you know, WebP is currently the hotness that some websites are using. By the way, I'm sometimes surprised that so many websites aren't yet using WebP because it's really widely supported. But really recently, for example, a new format has come out called AVIF, which is supposed to be something like 20, even sometimes 30% smaller than WebP and we're looking at it. So this is something that we're currently investigating. And if we find that it actually delivers on its promise, and is actually able to reduce the size of the image downloads without adversely impacting quality, then we will automatically enable support for it. And again, you won't have to do anything. Brett: Nobody has to do anything. Yeah, anything. Yeah, it will just—you'll just start getting AVIF. And yeah, another thing that we're looking at is being smarter about how we do this gradual image display. We already have it but we're looking, but currently it's either low-res or high-res and we're looking at making it [a] more gradual kind of build up to the final form. Let me see, I've actually made a list of some of the things. So I'm— 45:08 Brett: I'm gonna jump in while you're doing this, I want to preface this because I'm going to start bringing in questions from the Community that have asked about this. And Rhen had a really good question kind of about that, “With these increases in the scores, do you anticipate future optimizations will be incremental? Or do you think there's going to be things that can make some huge jumps in the future?” And I don't know if that's what you're getting ready to show? Or— 45:30 Dan: Yeah, well, the reality usually is that these things are incremental. You know, if there were obvious ones, that would make this a huge change, then we would just, you know, go for it. We are working on some changes. So you know, the Core Web Vitals, there are three of them. Again, I won't go into too much details, but we are looking at making some significant improvements on, you know, one of them. So you might, you might see an occasional jump. But overall, this is going to be a gradual thing, if for no other reason [than] there are so many different types of Wix websites out there. So for example, there are some websites where the primary content is an image. And there are some websites where the primary content is text. 
And so if we make an improvement in how quickly we are able to download and display an image, that benefits, you know, those sites, but not the ones where the primary content is textual. And that general graph that we showed was across all Wix websites, so you know, we might make a change that would make a particular website suddenly really improve in terms of performance. But if you look at Wix as a whole, I expect more of a gradual improvement, to be honest. Brett: That makes a lot of sense. Dan: Yeah, so I did want to mention a few more things that we're looking at. So you know, we've introduced a Site Speed dashboard, that was definitely a version one. We are looking at ways to make that dashboard better, provide more actionable metrics, and in general be more applicable when you're looking to improve your performance. So expect to see improvements there. Oh, another really cool one. You know, a lot of people when they use PSI, the Google PageSpeed Insights, all the recommendations there are really generic. And a lot of them are not really things that, you know, you as a Wix website owner can actually do anything with. So for example, you might see recommendations such as reduce the amount of JavaScript. Well, you don't really have control over the JavaScript, this is up to us. Well, you know, you can remove functionality from the page, that will likely reduce the amount of JavaScript that you're using, but you know, short of that, you can't really keep your functionality and reduce the JavaScript, that's up to us. Well, guess what, we are working on it. We are working on significantly reducing the amount of JavaScript that we use in order to provide the current functionality by essentially being smarter about, you know, identifying exactly which functionality each page is using, and only downloading what the page actually needs. And this is a work in progress. This is, you know, not something that will likely happen overnight, it will happen gradually. It's something that we will keep improving over time. But going back to Google PageSpeed Insights, we are actually looking to integrate Wix-specific suggestions into Google PageSpeed Insights, so that when you put in a Wix website, it will identify that it's a Wix website, and in that Recommendations and Suggestions section, in addition to the generic ones, you will also get Wix-specific recommendations and suggestions for things that you can improve. I think that's a really cool, cool thing that we are looking to do. 49:21 Brett: And for those of you watching, he's not reading this off of a script. It's incredible to me, Dan, how you just do that. You've probably forgotten more than I'll ever know in my life. Okay. I've got another one. And this kind of touches—you touched on this a little bit, but I love when the Partners are interested, technically. So looking at the graphs, Wix improved significantly. I mean, can you talk about specifically things that you did? And you touched on this a little, you talked about the Viewer. Do you want to add anything? Or just kind of talk about how the Viewer—talk about that again, for this particular question, because I thought this was incredibly interesting. Yeah. 49:58 Dan: So just to clarify, FID is First Input Delay. It relates to the first time that a visitor interacts with the page in any way whatsoever, for example, [they] click on a button or on a menu. Anything other than scroll and zoom. 
Scroll and zoom don't count. Any actual interaction that requires the page to respond. The browser measures the time that it takes for that first interaction and sees how quickly the web page responds. And that's the FID. And ideally, by the way, FID should be under 100 milliseconds, because according to research, that counts as an essentially instantaneous response. And, as you correctly stated, that's one of the main things that we improved. You know, if we look at the graph, we really, like, went from having really poor FID to being right up there, with almost perfect FID. And that has to do with, you know, the Viewer team I shouted out before; that mostly has to do with the work that they've done. We've shifted a lot of the computation that used to take place within the browser off to our own servers, so that instead of having to do a lot of heavy lifting using JavaScript inside the browser, we just now do it on our fast servers. And, you know, we offload this effort off of the visitor's device. And, you know, by offloading this processing off of the device, it frees up the device to more quickly respond to the visitor's interaction. So if you're asking specifically how did that happen? Well, you know, that's kind of a really short explanation of what we did. 52:01 Brett: Thank you. And by the way, I have to say that there was a secondary question that was asked, and I also want to grab this, I think this also is a pretty good one. Wix has comparable performance to Shopify in the US, but not in other places. And this is kind of not to compare with Shopify, but more on the horizon, are there other geographies that you can maybe speak on that Wix is working on, increasing the performance in other geographies? Or is there anything you want to touch on there? 52:33 Dan: So you know, for sure. So first of all, the reality is that some geographies will have better performance than others, if for no other reason [than] mobile networks are better in some places than in others, or that the device that the average person might have in one place would be better, you know, faster, than the devices people might have in other countries. And that's something over which we obviously have no control, although, and here I don't actually want to go into the details, we are looking at ways to even mitigate that. That being said, there are things, definitely things that we can do and that we are doing. So for example, way back when I joined Wix, we effectively had one data center in the US, which would serve the entire world. Now, Wix has many data centers, spread around the globe, which is obviously better for reliability and uptime. But it's also better for performance because you will be served by a data center that's closer to you. Beyond that, we are working with CDNs, you know, Content Delivery Networks, to quickly deliver content: stuff like Fastly, or Akamai, or Google's CDN; there are various CDN providers out there. And you know, one of the cool and unique things that we are doing is that we actually try to optimize the CDN per geography. So a particular CDN might be better in the States, but another CDN might actually be better in India. So we actually try to measure the CDN performance that we are getting in particular geographies. And if we see that one CDN is potentially better than the other one, we will actually automatically switch. 
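(Editor's note: For anyone who wants to see the FID Dan describes for their own visitors rather than waiting on Google's aggregated field data, the open-source `web-vitals` package, a Google library rather than a Wix API, exposes it in a few lines. The `onFID` entry point assumes version 3 of the library; earlier versions call it `getFID`.)

```typescript
// Sketch: report real-user Core Web Vitals with Google's web-vitals library (v3 API assumed).
import { onFID, onLCP, onCLS } from 'web-vitals';

function report(metric: { name: string; value: number }): void {
  // In practice you would beacon this to an analytics endpoint;
  // logging keeps the sketch self-contained.
  console.log(`${metric.name}: ${Math.round(metric.value * 1000) / 1000}`);
}

onFID(report); // First Input Delay: fires after the visitor's first interaction
onLCP(report); // Largest Contentful Paint
onCLS(report); // Cumulative Layout Shift: reported as the page is hidden or unloaded
```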
So yes, we are working hard to improve performance around the globe. I can give as again, a concrete example. performance in Australia, for example, has improved dramatically over the past you know, years and months, because of you know, such changes that we have made in our infrastructure. 54:55 Brett: That's interesting because somebody actually asked that so you know, [are] these performance improvements only in the US? And that's actually what they're talking about the clients that are in the UK, Australia and New Zealand. So I guess what you just said sort of answers that question as well. 55:11 Dan: Well, yes, we improved around the globe. And by the way, you know, we just showed the graph from the HTTP Archive website for the US. But you know, go in there like I said, we should provide the link and select the UK instead, and you will see the same thing. You will see that the graph is, you know, going up and up, and that we are much better than most of our competitors, if not all. 55:41 Brett: And that's a great question. So I can, what other resources Dan, and and this is, a great question here that was a follow-up. Are there other sites and tools? Can I add some of that, if you can give me a few of those links I'll add that to the description. So the Partners can sort of— 55:54 Dan: Yeah, so yeah. So we saw that HTTP Archive site where, you know, if you want to sell Wix as a platform, not a specific site, that's just a great research tool to use. Or you can use Google PageSpeed Insight or GT metrics, if you want to use it to measure the performance of a particular website, even one that's not your own to do comparisons. If you're looking at your own Core Web Vitals data, then Google Search Console, the Core Web Vitals tab in it. And of course, our own Site Speed dashboard that you can use to look at performance data for your website on Wix. 56:37 Brett: So this has been incredible. And Dan, I just have to say, you really are a GOAT, you're the greatest of all time man. And it's incredible, because we asked you a question and you just amazingly explained it and go into so much detail. Like I said, there are keyboards smoking, and pencils and pads on fire from all the notes, we're definitely gonna have to dissect this. This has been absolutely incredible. So there's more to come. Wix isn't done. But I want to thank you. First off for taking the time and just coming in and sitting with us and answering our questions. This is such great content for our Partners, you know, they love this, and I appreciate it. 57:18 Dan: You're very welcome. I enjoyed this a whole lot myself. As you know, I love engaging with the Community. By the way, for example, I'm on Twitter. You can hit me up there. I'm slightly, occasionally on Facebook. Not much. But you know, you can also try to drop a question there. I'm sure, Brett, and you know, you can always contact Brett and our amazing Support team. We've got amazing support. One of the things that we've done in terms of performance is we've trained a lot of our support people to be able to answer support questions related to performance. So it's not just me by not by a long shot. 57:57 Brett: And by the way, by the way, huge shout out to them. They've been in the chat. They've been answering questions. Amazing job. I saw a lot of actual Partners comment how great their interaction with Support was. 100% Agree. Awesome, awesome team efforts all around, right? Dan: Exactly. Brett: Look, you're getting some shout outs. Mike. 
Michael wants everybody to follow you on Twitter, because you tweet about interesting stuff. 58:22 Dan: Yeah, it's Dan Shappir on Twitter. Just so you know. So you know, feel free. 58:27 Brett: Awesome. Dan, thanks a lot. I want to and by the way I do see sometimes comments in the Community and you write dissertations and it just blows people's minds. Okay, so, so awesome. Thanks, y'all. I'll see y'all in the Community. If you're not in the Community, what are you doing? You got to get in there with us. Okay, for sure. So, thanks, Dan. Thanks, Partners, and I'll see you all out there. Have a great day. Bye.
- How to Use Wix Site Inspection
Mordy Oberstein | 11 min



