
Breaking Down Technical SEO



Join experts from Wix and Deepcrawl for this session to understand how to keep your website healthy for search engines. Learn how to improve your organic performance with best practices that address common technical SEO issues.




In this Webinar We Covered:


* The fundamentals of technical SEO


* Recognizing, prioritizing and resolving common issues


* Tools and tips for maintaining a healthy website



About our Speakers


Nati Elimelech, SEO Technical Lead, Wix


With over 15 years of experience in SEO, Nati teaches advanced SEO solutions, large-scale SEO, and handling SEO infrastructure challenges to veteran SEOs and agencies. When he’s not working on making millions of websites SEO-friendly, Nati spends his time with his lovely wife, baby girl, cat, and dog.




Chris Spann, Senior Technical SEO, Deepcrawl


Chris works with some of Deepcrawl’s largest clients. He has a consistent track record of providing key recommendations and deliverables that pave the way for identifying significant opportunities. Chris has seen many SEO issues and their solutions.





About our partner: Deepcrawl


Deepcrawl is a pioneering SaaS platform that empowers the world’s leading brands to harness their revenue potential through technical SEO. Its cloud-based enterprise technologies help brands diagnose and fix performance issues on their websites. Deepcrawl has partnered with Wix to launch a custom-built app that helps users to identify opportunities for SEO. It’s now available on the Wix App Market.



Transcript: Breaking Down Technical SEO Webinar


Speakers


Edu Giansante, Head of Community, Wix

Nati Elimelech, SEO Technical Lead, Wix

Chris Spann, Senior Technical SEO, Deepcrawl



00:03

Edu: Alright. We're going live here. Let's see people are coming in, I can see a lot of new numbers and faces popping up here. This is insane. It's a lot of people. Wow. Welcome, everyone. Welcome to our amazing, amazing webinar we have today on SEO. We're gonna go technically on SEO. And we have two amazing people here with me. I'll give it like 30 seconds before we start, so maybe we can do like an icebreaker here.


I was asking the guys actually, what should I ask—if it's something related to music or to, I don't know, drinks? And they're like, well, we all have young kids here, so we’re not into this. So my icebreaker question for you guys. And in the meanwhile, I want to hear where you guys are joining us from. Let me know here in the chat. By the way, when you chat, you can select the option so everyone can see. Okay. Oh my God, this is going insane. This is crazy. Okay, Nati, Chris question for you guys, because you have young kids. What would be your dream, like, you know, ability that you want your kid to learn in the next five years? That would be like, wow, I love my kids so much more.


01:11

Chris: For me, I think the big thing—so my daughter is just starting to learn to sing. And I'm really looking forward to it because I'm quite musical. But I'm really looking forward to her learning. Although it's almost cuter at the moment, because she kind of can't sing. So in a way, I wish it would stay as it is, but yeah, I'm looking forward to actually being able to play my guitar and sing a song with her. I think that's gonna be great.


01:36

Edu: Oh, man, it's amazing. I would love that. Actually, you know that statistically, your daughter has more chances of becoming a famous singer than she has of becoming a famous YouTuber. There are more YouTubers than musicians out there. So better chances for her to be the next Lady Gaga. Who knows?


01:52

Chris: Well, you know what, this chat is making me feel like a Twitch streamer. So.


01:56

Edu: Oh, yes. How about you, Nati?


02:02

Nati: So Rose, my daughter, is 18 months old now. My wish is a bit more practical. I want her to be able to walk the dog, to walk the Husky. It gets so cold during the night, man, she needs to step up. You know, take a part of the load on and I'm just waiting for her to catch up with that.


02:24

Chris: Yeah, we have two dogs that do not get walked as much as they used to.


02:31

Edu: I'm in the dog stage still. So I’m still walking my dog myself. Amazing. Oh my God. I'm seeing so many people coming from a lot of places. It's incredible. Like, I mean, all over the world. We have India, we have Canada, we have the US. We have a lot of countries in Europe. I’m in Europe as well, myself. We have Israel. Wow, insane guys insane.


And we beat the 1K, this is amazing. Oh my God, we have 1.1K people here. This is next level amazing. So I will respect your time and get started, 'cause I know you guys are here to see and hear about technical SEO, not about, you know, what we're going to do with our kids and our dogs.


So let me share my screen here very quickly, and we're going to get started. So I want to introduce you to these two amazing guys here, Nati and Chris. These guys are like the next level of tech SEO, and we're gonna keep this conversation, you know, very informal, because we want to make sure that you guys feel comfortable about asking questions. Because we have this crazy chat with a lot of questions coming in, if you guys have any questions, anything you want to ask, please use the Q&A icon there. There's a Q&A box. You can click on that. Throw your questions there. We have a ton of people behind the scenes here in the backstage making sure that the questions get answered. And I'll throw some of the questions live to the guys as well. Without further ado, I want to hand the mic to you both, Chris and Nati. Welcome, please, the floor is yours.


04:04

Chris: Cool, thanks very much. I guess if we can jump into the first slide then Edu. Okay, yep. So basically this session today is kind of designed to help especially owners of Wix sites, understand a little bit about technical SEO.


We're going to cover off some of the basics and the fundamentals of technical SEO and what technical SEO is, a lot of common issues that we see. So my job at Deepcrawl is that I’m a member of the professional services team. So I spend a lot of my time auditing and looking at big enterprise level sites and some smaller sites as well.


So we're going to discuss common issues that we see across sites like this. And then also tips and best practices and things like that for maintaining a healthy website. Obviously, Nati is here as the Wix expert. And I'm here, hopefully to talk about, to bring a little bit of experience from how we see things at Deepcrawl as well. So I guess if we can skip on?


Edu: Yeah.


Chris: Nati, would you like to say this?


05:26

Nati: Yeah, sure. So let's talk about what the definition of technical SEO is really about, without getting into too much detail. And I do want to stress that SEO, technical SEO is an entire discipline. Don't, you know, feel out of place, if you don't understand everything right away. It's about a way of thinking, a way of looking at things, rather than remembering everything by heart and remembering code by heart.


But the biggest aspect, the gist of technical SEO, is about playing nice with search engines. You know, search engines, and the bots, crawlers—these are all, basically, software. And technical SEO is mainly about making a website bot-friendly. Taking care of all the special needs that software, that bot, that crawler has, so our website can actually rank in Google. In other words, and in most cases, technical SEO is about getting out of the way. It’s about getting out of the way and helping your content rank if it's good enough, if it satisfies the search intent. It's about not holding back your progress and halting it because of certain issues. That's the way I see it.


06:53

Chris: Yeah. The way I always think about it, technical SEO is making the information retrieval bit of Google's life easier. So half of Google is information retrieval—getting information out of websites. The other half is understanding what that information is, and then deciding who does it best. This is very much focusing on just making sure that Google—and by extension users—can get the information that they want out of your website.


So I guess if we just briefly touch on how Google crawls websites. So effectively, obviously, by crawl we mean when Google visits your website, goes through it and finds as many links as it can on that website to understand as much as it can about that website, as much of what is on your website.


And it does it, nowadays, it does it pretty much like a user does. It uses a specialized version of Chrome, called Chromium, which is designed to be more easily automated. So back in the old days, Nati, as you remember, Google used to just kind of request content, and then whatever code came back, it would try and parse that code and understand what the page was about from that.


Nowadays, it still does do that. But it will also now actually take a visible representation of your page. And it will try and use that to actually understand what a page is about as well. So kind of gone are the days of the little tricks you used to be able to do to hide things from Google or to make Google think things were there that weren't. And nowadays, yeah, as I say, you now have to treat Google a little bit like a real person.


08:36

Nati: Definitely, so I love that you mentioned that. Search engines and their bots [are] about information retrieval. So let's go over what information is. Okay, what do these bots care about, right?


So to make it simple, Google, or the bot, knows about a certain URL on the web, it doesn't matter which website, and then it goes and fetches it, right? It makes a request to that page, to that URL. The server returns what we call a response. And that response contains the headers and the HTML Google cares about. Now, basically, Google extracts, it doesn't use everything, but it extracts what they care about.


So one, of course, is the status code. We're talking about 200s, 301s, 404s, 500 codes. Whenever we mention those, you should know that that's the first thing. Any client, any software that fetches a URL, even your browser, that's the first thing they see. Is this page okay to crawl? Can I continue? So whenever the bot fetches that URL, they get the response, and they also get the HTML, if everything is okay. The response is the interesting bit, because Google takes two things away from that.
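The status-code point above can be sketched in code. This is an illustrative helper, not anything Wix, Deepcrawl, or Google ships; the action labels are made up for the example:

```python
def crawl_action(status: int) -> str:
    """Roughly what a crawler does with each status-code class.
    The labels are illustrative, not an official taxonomy."""
    if 200 <= status < 300:
        return "index"             # OK: parse the HTML, extract links
    if status in (301, 308):
        return "follow-permanent"  # pass ranking signals to the target URL
    if status in (302, 307):
        return "follow-temporary"  # follow, but keep the original URL
    if status == 404:
        return "drop"              # nothing here; may fall out of the index
    if 500 <= status < 600:
        return "retry"             # server error: back off and try again later
    return "other"
```

For example, `crawl_action(404)` returns `"drop"`, which is why the status code is the very first thing any fetching client checks before doing anything else with a page.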


One is the content, or the main content. Now when we're talking about content, it's not necessarily a long-form post, blog post or an article—images are content, videos are content, almost everything that a user sees is content. Almost, right? If we take away the UI and different functionality of a website so Google takes that.


And the other thing Google does, is extract all the links on a page. Why? Because that's how Google and other bots know about the other pages on a website, right? Because whenever I fetch a page, I get all the HTML. The HTML contains the links, and those links will be added to my crawling queue.


Now I know about them, and then I can fetch them. So the most important part, I think that the biggest foundation of technical SEO and about being crawl-friendly, is making sure Google can discover all the web pages, all the URLs, on a website—right?
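The fetch, extract links, add to the crawling queue loop Nati describes can be sketched with the Python standard library. `fetch_html` here is a stand-in for a real HTTP fetch; a production crawler would also respect robots.txt, normalize URLs, and rate-limit:

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, i.e. the links a bot
    pulls out of a page to discover the rest of the site."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def crawl(start_url, fetch_html):
    """Breadth-first discovery: fetch a page, queue every new link found."""
    seen, queue = {start_url}, deque([start_url])
    while queue:
        url = queue.popleft()
        for link in extract_links(fetch_html(url)):
            if link not in seen:   # only queue URLs we haven't discovered yet
                seen.add(link)
                queue.append(link)
    return seen
```

The key property this illustrates: a page that no other page links to never enters the queue, which is exactly why internal linking is the foundation of discoverability.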


And then, what's the next layer, Chris? If I made sure, for example, Google does discover all the URLs on my website. First of all, what do you think I would need to pay attention to that may harm that cause, that may harm that goal, that may detract from Google's ability to figure out what's going on on the website?


11:28

Chris: So I guess the main thing, I guess, actually, if we jump into the next slide, potentially, we can start looking at some of the main issues. The very, very first thing we'd start to look at, and I think the big thing for a lot of sites—sometimes with enterprise sites, you can get really into weird details and really strange issues. But the biggest thing is always making sure that content is available, effectively, to crawlers. But also that content is, wherever possible, as unique as possible.


So one of the first issues that we see, with some websites—and this happens a lot with shop fronts and things like that—is what we call content duplication. Now, a key thing you need to get your head around with technical SEO is that the URL, and what comes back when Google requests that URL, are very separate things, or things that aren't necessarily related. So if Google asks for four different URLs, and gets the exact same content, or almost the same content, back each time, it's going to struggle to understand which URL it should put in the search results, I guess, if that makes sense.


So if you have two or three pages that focus on the same product, or the same destination, say, you begin to create issues where Google doesn't necessarily understand which one is the real one—which is where canonical tags come into play. And a canonical tag is essentially, I believe, Nati—it's just a setting in Wix, which is, effectively, a way of saying to Google and to other search engines: the URL that lives in this tag here is the canonical version, the original, the progenitor. And these work across sites as well. So if you write a great blog and push your content out to be syndicated, the sites that republish it can specify your site within the canonical tag, which means that you will then get the credit back from Google. Nati, I know in our dry run, we had a great example of duplicate content that you brought up, and I'm trying to remember what it is now.
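A canonical tag is just a `<link>` element in the page head, e.g. `<link rel="canonical" href="https://example.com/red-dresses">` (example.com is a placeholder). A minimal stdlib sketch for reading it out of a page's HTML, simplified in that it ignores multi-valued `rel` attributes and canonicals sent via HTTP headers:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Finds the first <link rel="canonical" href="..."> in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if (tag == "link"
                and attributes.get("rel") == "canonical"
                and self.canonical is None):
            self.canonical = attributes.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical  # None if the page declares no canonical
```

Running this across a set of parameterized URLs (UTM tags, sort orders) and checking they all return the same value is one quick way to confirm the duplicates point back at a single original.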


13:52

Nati: Anything that has to do with any parameters to the URL, basically even UTM tracking parameters or sorting parameters. Everything that changes characters, adds characters, changes the URL, but serves the same content, is basically a duplication. But I do have a question. Other than Google not being sure what to show in search results, why is that such an issue? Is it a big issue? Does it become more of an issue for certain websites? What do you think about that?


14:29

Chris: I think for a lot of sites, duplicate content is a—it can be a smaller issue. And nowadays, it's certainly less of an issue than it used to be. Nowadays, Google is quite good. What we used to see back in the old days and where canonical tags came from, is people used to just straight up steal content, right? They would take content from well-ranking websites because the thinking was, oh, if such and such a website ranks really well for this keyword, then it must be their content. So they would steal their content, put it on the page.


These canonical tags help issues like that. As I say, I think nowadays unless you have millions of pages Nati, or I guess a good example is like Nati was saying—if you have faceted navigation on your site, and you have, let's say, a red dresses page, and a "red dresses under £50" page, but all your red dresses are also under £50, you now serve that exact same piece of content across two different URLs. Which, again, says to Google—Google starts to go, I don't necessarily know what to file this under, I guess. Whereas a canonical tag just helps you to say, hey, guess what, it's going here.


So back at you then Nati, detecting duplicate content. I mean, it's kind of easy for me, we have a duplicate content function in Deepcrawl which helps me find this stuff. But I guess, is there an easy way of pulling this information out, if you're not using some—like a big enterprise crawler?


16:08

Nati: So, there are plenty of solutions out there. First of all, any type—most crawlers will get the job done, do exactly what we specified, fetch the HTML, extract all the links, crawl all the links, and then you will be able to see if you have duplication issues.


Another great tool is Google Search Console, which will alert you [in] some instances. But another great, great tool is just using your keyboard, copying a piece of text, and searching it inside of quotes. And then you can find out if your content is duplicated on other websites, or if it's duplicated, but there are other URLs omitted from the search results. So you can use Google search for that. You can use external tools or the app. And you can use Google Search Console, which you should always, always, always be using. It's a tech SEO's best friend, I think.
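Nati's quoted-search trick works page by page; at scale, crawlers do something closer to hashing the normalized text of each page and grouping URLs whose hashes collide. A toy sketch of that idea (real tools like the duplicate content reports mentioned above use fuzzier similarity measures than an exact hash):

```python
import hashlib

def fingerprint(text):
    # Normalize case and whitespace so trivial markup differences
    # don't hide an otherwise identical page.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages maps URL -> extracted page text.
    Returns groups of URLs that serve effectively the same content."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(fingerprint(text), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

This makes the parameter problem concrete: a URL with a UTM tag and the bare URL hash to the same fingerprint, so they land in the same duplicate group.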


But I did notice that canonical isn't really a directive, right? When I say, hey, this is the canonical version—I'm basically giving Google a hint, right? I can't decide that for Google, because Google looks at other signals: other pages pointing to a URL, whether it's been linked to, and other stuff like that. So what would you recommend when deciding on the canonical URL or the original URL? Is there anything you think needs to be done by tech SEOs on a site level? Like, I don't know, editing internal links?


17:54

Chris: I was about to say the biggest thing is internal links, right? If you have two holidays-to-Spain pages, and one is linked to in your main nav and in 10 blog posts, but you set the other one in the canonical tag, Google's probably going to go, well, hey, I don't think this is the original version of this content. This page that's pointed to all over the website, this feels like the actual canonical version.


And this is where you can get into some interesting issues sometimes, where people have produced great content, but they've canonicalized the wrong URL. And then Google ends up using the URL that is poorly linked to and then goes, oh, well, this isn't a great page, and then ranks it poorly. I've just spotted somebody in the chat asking how you set up a canonical tag and Nati I'm gonna throw it to you, I believe they're just on by default in Wix.


18:54

Nati: That's very CMS-dependent, of course. In Wix, all URLs are canonicalized by default, so you don't have the duplication going on. But, had you decided to edit it yourself for any reason, you could use the SEO panel to edit the canonical tag and override the default.


But out-of-the-box, you shouldn't have any duplication issues. Unless you do it yourself, if you copy an article over more than once. Or even if you, for example, have a blog, and you have a couple of posts. And you use all the categories you have and you tag them in all the posts. Basically what you will be creating—and that's your fault, not ours, is a lot of different tag pages or category pages, but with the same content, right? Same posts. So it's down to user error at this point, and you can override our settings, but I think most CMSs out there at this point—the ones that do care about SEO—not only canonicalize their URLs, I hope, but they also give the ability to edit it. And if they don't, they should.


20:12

Chris: Cool. Let's move on then to our next slide, which is 404 errors and broken links. So Nati, I guess, first off, why would you see a 404 error? Where would you see broken links? How and why is this an issue?


20:33

Nati: Okay, so there are, again, a few ways to detect 404 errors. One is using the app, or Deepcrawl for a full offering of tools, or other crawlers to detect this. They just go over the links they find, and if something is broken, you're alerted to it.


The other one is, of course, Google Search Console. And I'll surprise you by also adding Google Analytics if you actually have it configured, because then you can filter pages by the 404 title. You can see all the URLs that have triggered the 404. And that applies to most CMSs, that's a cool trick. You can use Google Analytics if you have it to detect 404s.


So 404s are caused by two types of root causes. One is user error, of course. I've created a page, I've linked to another page on my website, and I didn't link it well, or I've linked to another page on my website but have, later down the road, changed the URL for that page.


Alright, external causes are that someone linked, I have a backlink to my website. But that backlink is broken, because they didn't parse it well, or because I, again, have changed the URL structure or the slug and didn't do anything about it.


So there are internal and external causes to that. I do want to stress—not every 404 error is a big issue you should be taking care of right away. Because as you know, Chris, the larger websites, especially eComm websites, right? They tend to accumulate 404s over time. Sometimes not by their own doing.


So imagine I'm a website, right? I have Google Search Console, and I have done a crawl with an external tool, with a third party tool. And I see a lot of 404s. I don't have all the time in the world, SEO is about doing 200 different things at once, right? So how would you go about prioritizing 404s? And when are they something an SEO should be able to live with?


22:56

Chris: So I would say the biggest, the biggest time to worry about 404 errors, or a page returning a 404, or a link being broken effectively—is if you're in a situation where effectively one of your big pages, one of your big traffic drivers or one of your big conversion pages potentially breaks somehow.


Now this could be, you know, again, down to somebody deciding a URL needs to be changed for whatever reason, or links to that URL suddenly breaking [it]. Obviously, I would always say that internal links within your site should always return a 200 status eventually, whether they go through a redirect or not. I think we'll get onto redirects in a little bit. But yeah, effectively all your—obviously again, as a user, if I'm clicking around a navigation on a website, or something similar, and I suddenly find that a page is broken when I click on a link, it's a nightmare. It's an awful user experience, I don't get anywhere. And crucially, more importantly, I then as a website, I don't get to rank for whatever is on that page, right?


So quite often, what we will see is that we prioritize, essentially, the 404s that we find the most of. Again, 404s are very, very easy to create in bulk. Obviously, if you have a header nav across 200 pages on your website, and you make a single typo in your header, you suddenly break 200 links. So we will always flag things like that. And obviously, quite often, I'm working on sites with millions of URLs, and these can create millions of 404s which can be fixed by, you know, opening up the CMS, pressing Delete and adding the S on the end—or whatever the change is. So I would always recommend starting there: ensure that your header navigation and things like that all work perfectly, then your big content pieces, and go from there.
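Chris's triage rule, worry first about broken URLs that the most internal links point at, is easy to mechanize over crawl data. The `(source, target)` pair shape here is an assumption about what your crawler exports, not a specific tool's format:

```python
from collections import Counter

def prioritize_404s(links, broken_urls):
    """links: iterable of (source_page, target_url) pairs from a crawl.
    broken_urls: set of URLs known to return a 404.
    Returns broken URLs ordered by how many internal links hit them,
    so a single typo repeated in a sitewide header nav floats to the top."""
    counts = Counter(target for _, target in links if target in broken_urls)
    return counts.most_common()
```

A URL broken in a navigation element shared by hundreds of pages will dominate this list, which matches the advice to fix header and big-content links first and live with stray long-tail 404s.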


How to deal with these issues? As I say, quite often, you've got a couple of options there. The best issue and the best way to fix it, and the easiest way to fix it—is to fix the link, right? If it's an internal link. But the big problem sometimes is that you might find that you have a link from— and this happens all the time from a great—you might have a great backlink, right? So you might have got some national press or some international press. But you will find that—but you might find that, despite you sending them the link that you want them to use, they've just got it wrong. Now they're not responding to your emails, etc, etc, etc. So what can you do? So obviously, the perfect thing to do is to go into your CMS, go into Wix or whatever, and change the link. So it's now the right link that points to the right URL. But the other options are redirects, Nati, right? And I'm going to hand it over to you for this, just because obviously, I know you can talk about how to actually fix it within—or how to set up a redirect within Wix.


26:23

Nati: Yep. So we said first of all, you fix the link that you want to redirect, right? And you want to redirect URLs that have had past signals. What I mean by past signals is that there was actual content behind them. That there was something tangible and real behind them, not just a systematic 404, so make sure you do prioritize that. Going about it in Wix, first of all, in many places, and we're extending that over time, whenever you change a URL slug, for example, or sometimes the URL structure that we're working on, we will automatically add a 301 redirect for you. We want to fix the links for you. In many cases we'll add it automatically, and you can disable it yourself.


If you want to add your own manual redirections—301 redirections—you can go to the Redirect Manager under SEO Settings, and you have many options there. You can do it one-by-one, you can do it by groups, you can upload the CSV. And that way you make Google understand—you make Google pass the signals, pass everything basically Google knew about regarding your URL, to the new one.


There is, of course, one major caveat here—that you can’t just redirect anywhere you want. You should always, always, always, always redirect to the same or matching content. Meaning that just redirecting 404s to your homepage isn't going to help much. Just redirecting it to another URL, another page that's important for you, isn't going to help for long. You want to always, always redirect to the best and closest match possible. Content-wise, of course, right? Not URL structure-wise. It doesn't matter what the URL is, it's about the content behind it. So always make sure you redirect to the right URL. And also always make sure that the URL you're redirecting to is the canonical version. And that it returns a 200 OK, the status code we mentioned earlier.
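Two of Nati's caveats, no redirect chains and targets that actually return 200, can be checked mechanically before you upload a redirect CSV. This sketch works over in-memory maps for illustration; a real audit would fetch each URL, and the "closest content match" judgment still needs a human:

```python
def resolve(url, redirects, max_hops=5):
    """Follow a redirect map until it stops redirecting (or we give up)."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

def audit_redirects(redirects, statuses):
    """redirects: old URL -> new URL. statuses: URL -> status code.
    Flags chains (more than one hop) and targets that don't return 200."""
    issues = {}
    for source in redirects:
        final, hops = resolve(source, redirects)
        if hops > 1:
            issues.setdefault(source, []).append("redirect chain")
        if statuses.get(final) != 200:
            issues.setdefault(source, []).append("target not 200")
    return issues
```

So a map like `/old -> /mid -> /new` gets flagged as a chain (point `/old` straight at `/new` instead), and a redirect into a 404 gets flagged as a dead target.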


Chris: Yeah.


28:47

Edu: So I have a question, maybe a dummy question here. But I'm getting like a lot of insights here, which is really, really good. But I have two questions. One is, how do I actually edit the URL?


And the second question related to that is—because you mentioned like a human error where they forgot an “s” and then crashed the whole thing. But is there an easy way for us to troubleshoot and find the source of the 404 issue?


29:12

Chris: So I can take the second bit in terms of finding the source. So again, any web crawler that you might use, will always highlight a 404 error. Search Console also contains a list of 404s that Google has found, which is obviously a great resource. But also I mean, again, a little bit of manual UX I guess, and testing of your site, will help you find anything. Again you know, before you push a new navigation live, make sure all the links in it work, effectively.


Those will always be my choices—whenever you push content live, make sure those links work, go and check yourself. But also you've got, as I say, Search Console, and you've got any web crawler you choose to use. Those would be my first two options. Nati, I'm going to pass fixing the link, or actually editing the link, over to you because again, it feels like a Wix question.


30:24

Nati: So in Wix, there are a couple of places where you can do that. You can do it in the SEO panel, where you can edit the slug for the type of page you're working on, in most cases. And you can do it to a greater extent in our SEO Patterns. You can actually change the entire structure for your blog posts or your product pages. And we're extending that as well to other page types.


So you can do it as a whole—for example, all of your blog posts, if you wanted to start with “banana”, you could end with “someone take my Husky out for a walk please”, you could. And if you just wanted to edit the specific slug for a specific page, blog post, product, whatever—you can do that as well.


31:14

Chris: Yeah. So next, let's jump into something a little bit more, I guess. What am I going to say? Relatable, I suppose, to a standard user. So as we mentioned before, Google is a robot that does go off and just pull back the code that builds out your website. But nowadays, as well, Google does what we call “render” a page. So it will actually generate an image somewhere and it will analyze the image of the “above the fold”, as we call it. So “above the fold” refers to whatever is visible when your website first loads. So effectively, on your desktop, that is the widescreen rectangle of what you're looking at. The “fold” is an old journalism term. It's where the newspaper used to physically fold—the important information was above it, right? And for whatever reason, we've taken that, and we've just carried that through into SEO.


So Google just looks at the initial viewport—the top content of the page. And nowadays, it's fascinating, Google will do things like check out the size of text on a page. So you have your headings, so your H tags, H1,2,3, etc. But Google will look at text that is bolded on a page, text that is large on a page, text that takes up the majority of the initial viewport, and will say hey, this content is important to this page, right?


Again, in the old days—I talk about the old days a lot—people would do things like make their headings really small and hide them in content. This is I mean, as well as being awful for accessibility, like really bad for accessibility. It wasn't great for bots. I've just spotted somebody asking if the text needs to be live text, will it read an image? Always make text live. Google can read images. But if you want that content to be reliably seen by Google, I would always say to make that actual text, right?


But yeah, so back in the old days, Google would be able to—or people would be able to hide stuff effectively by making H2s small and things like that. We can't get away with that nowadays. And so, as well as the page needing to be human-readable and accessible, it also needs to be accessible for bots, right? Nati, like nowadays, we should just try and treat Google like most of the humans that use a website.


34:05

Nati: So definitely, I think we need to lay some grounds for—some basics for—how Google works again, and how we will behave before we delve into that. So for everyone not aware, we have discussed that Google fetches HTML, right?


And Chris did mention that Google runs JavaScript and renders the page. Rendering a page is about basically taking all these building blocks. The building blocks are the HTML and the CSS and the JavaScript. And using the pieces like Legos with instructions, and basically building the page.


Okay. Now, in the past, we all cared about the raw HTML Google sees. Now, tech SEOs should care about what Google actually sees—it's not just HTML. It's how it's constructed, how it's put together. And what Google is trying to do is figure out, okay, what's the most important piece of content on a page? When we understand that, we understand how Google looks at a web page. By the way, there's something you should know about how bots behave, and then I’ll continue about how Google views a web page.


A few things you should be aware of. First of all, Google is mobile-first, right? What does being mobile-first mean? Being mobile-first means that the primary version Google theoretically should crawl is the mobile one. Why? Because most searches, most of our interactions on the web, are done on mobile. So whenever we look at a page, we should all—most of the time, we should look at it and design it according to mobile-first best practices. How would it look when viewed on mobile? And that's where the fold most comes into play. Because think about it, it's prime real estate. And you don't have much space on mobile devices, right? You have the fold. I'm not—we're not saying Google only sees the fold, right? Because when Google fetches a web page and renders it, it actually opens a very, very large or tall viewport, right? Because Google doesn't scroll, Google doesn't click anything. Googlebot doesn't interact with anything. All it does is open a large viewport, and whatever is in that space, in the fold, is considered more important—or maybe gets extra value out of being there.


So whenever you're looking as a tech SEO, at a website, bring it up on mobile. It's best if you actually do it using one of Google's tools, like the website Mobile-Friendly Test, or the Rich Results Test, or fetching it using Inspect URL in Google Search Console, and inspect whatever's in the fold. If you have something pushing down the main content—and when I'm talking about the main content, I'm talking at least about the main heading of a page—then you should look into that. Because what you're signaling to Google and to your users is, look, this pretty image is what I care about. But whenever someone hits a page and goes to a page—the user and the bot—the first thing they should know is, this is what I'm going to be reading about. This is the product I'm going to be viewing. This is the content being served to me. So whenever you're thinking layout, think about how it would be displayed on mobile. I think that's the best advice I can give when it comes to that.


38:00

Chris: And just to lead on from that Nati, a thing that we see a lot and a thing that people don't do enough of. We all have a real habit, right? We do our work on our laptops, usually plugged into, you know, plugged into WiFi, plugged into a router or on our home WiFi. Which is great. And we build our websites, we go yep, that looks amazing. But then what we don't do is—we don't go outside and pick up our phone and actually see what happens, right?


Like—my phone is a very middle range Android phone, it's like a £200 (pound) phone. If your website struggles to work on my phone, while I'm waiting at the bus stop, then I'm not going to use that site. You know what I mean? Especially again, if you run a bus timetable website, and your website doesn't work on my mobile phone at the bus stop, I'm not going to use your site, you know?


So a big thing is to know who your users are. And again, we talked about Google Analytics earlier on. Look at what browser they're using and what devices they're using. And ensure your site works for those users and those browsers. Should we move on then to something a little bit more content based, Nati, and discuss schema markup?


Basically, schema markup powers everything on a Google search result that's not an old-fashioned blue link, right? So again, my favorite is always recipes when it comes to schema markup. If anyone else is like me—when I'm looking for a recipe to make, I do most of my searching with my eyes. But also, again, like I said, I have a two-year-old, or nearly two-year-old. I'm also really interested in things that I can cook in ten minutes. So recipe schema, with review ratings, is a great way of taking your really good recipe and saying, this is what it looks like, it looks amazing. 300 people have said that it's great, 300 people have rated it five stars. And also, it takes 15 minutes and here are the ingredients. And that's all based on schema markup. So schema markup is a way of organizing the content, basically.


So while Google does its best to understand the content that's on a page, schema markup is basically a way of using—it's almost like you're filling in a form, right? For a website. Google says, what are the ingredients? So then it gives you a box to write in the ingredients. And Google goes, thanks very much. What does it look like? And then you give it a picture. And Google goes, thanks very much. We've got the picture, I understand that. What are the ratings? And you can say, well, this person said it was four out of five, this person said it was four and a half out of five. And Google goes, great. Now we know all that information, we can build out what we call Rich Snippets, which are these nice looking snippets, which, in the best situations, can entice people in. So again, looking at the recipe ones there, straight away, now, if I'm looking at that, I'm going to go for the one on the right, because they're the best chewy chocolate chip cookies, and they look like it. So I'm going to click on that one. What other examples? I guess, Nati? I mean, I was gonna say, what's your favorite schema markup? I know not everyone's as weird as me.
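The "filling in a form" analogy maps directly onto JSON-LD structured data: each question Google asks becomes a named schema.org field. A hedged sketch in Python, with made-up values mirroring Chris's cookie example (`Recipe` and `AggregateRating` are real schema.org types, but check Google's structured data documentation for the current required and recommended properties):

```python
import json

# A minimal schema.org Recipe object. Every value below is invented
# for illustration; only the field names follow schema.org.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Chewy Chocolate Chip Cookies",
    "image": "https://example.com/cookies.jpg",
    "totalTime": "PT15M",  # ISO 8601 duration: 15 minutes
    "recipeIngredient": ["flour", "butter", "chocolate chips"],
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "5",
        "ratingCount": "300",
    },
}

# Embedded in the page as a JSON-LD script tag, which is the form
# Google generally recommends for structured data:
json_ld_tag = (
    '<script type="application/ld+json">'
    + json.dumps(recipe)
    + "</script>"
)
```

Dropping that one tag into the page `<head>` is what lets Google assemble the photo, star rating, and cook time into a rich snippet, and the Rich Results Test can validate it.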


41:37

Nati: I don’t think I have a favorite one. But I think people first need to understand that schema markup is just another layer, right? Of your content. Schema markup is about structuring your content in a way that would be best suited for machines. Okay. Why do we do it? Why do we want to do it?


First of all, we do it because it offers enhanced results, aka rich results. But it also helps search engines and software understand what's on a page better, because it's structured. We say, hey, look, this is the price, this is the image, here you see the SKU for the product. And all the data is neatly structured for a non-human. So first of all, I think any structured data that helps other machines understand what's on my website is beautiful.


However, I think a lot of people get drawn—sucked into—the structured data game, the markup game, and try to mark up everything on the website, right? And that's a bit odd. That's like going to your kitchen and putting a label on everything: this is a fork, this is a spoon, this is a knife.

Like Google would know that's a knife and that’s a spoon.


So what I'm saying is, structured data markup is great. But I would advise everyone to stick to whatever produces rich results on Google. And you don't have to remember by heart what does and what doesn't—you can go on Google and search for “Google search gallery”. When you do that, the right URL will come up. And you can see, like, a catalog of all the different rich results that Google offers. So products are of course a must. And for events, you get neat rich results. Recipes, definitely. I have my own recipe website. And rich results have been, like, amazing, because they offer increased, you know, metrics, better performance on search results, they make you stand out. And they make you more helpful to the user.


So I would focus on anything that can bring a rich result. However, in Wix, I wouldn't go overboard, because most of these are already generated by default. So I wouldn't advise Wix users to go overboard. You can do that, you know, in our SEO panel—and in SEO Patterns, you can set it for an entire page type. I just want to stress: structured data, those rich results, are only applicable when you rank high on Google.


Meaning that if you're on the second, third page on Google, don't bother. Don't waste your time. Which brings me to my next point about taking a similar approach in tech SEO and your website health. You're going—when using software, when using Google Search Console, when detecting issues, you should always prioritize, right?


45:08

Chris: Yeah, that's right. So if we jump onto the next slide. We—oh, sorry, I've had the wrong slide there open. So we're getting onto prioritization in a second, Nati. But I guess what you're talking about leads on to auditing websites, which we discussed earlier.


45:31

Chris: So effectively, yep. We have. So we talked about all these different issues.


45:38

Chris: But how do you actually go about finding them? How do you go about fixing them? So obviously, we've mentioned Google Search Console a few times. Google Search Console is, again, whether your site is a tiny, one person operation, or whether you are some of the biggest websites on Earth, Google Search Console is a great source of information for all sorts of problems. I use it for all of our clients, pretty much every single day.


Search Console is a great tool. Third-party auditing tools as well—I mean, obviously, I'm a big fan of Deepcrawl. I was before I worked here. But there are other tools out there; we've mentioned Screaming Frog a couple of times. Screaming Frog is a great, great tool to help you find problems. Screaming Frog operates as closely as possible to Googlebot, effectively. It will go through your site and find as many pages as it can, and then give you a lot of great reporting on any issues it's able to find on the site.
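At its core, a tool like Screaming Frog is a breadth-first crawler: start from one page, collect same-host links, fetch each, and record what comes back. A deliberately naive Python sketch of that loop (no robots.txt handling, no JavaScript rendering, no politeness delays, all of which real auditing tools add):

```python
import re
import urllib.error
import urllib.request
from collections import deque
from urllib.parse import urljoin, urlparse

def extract_internal_links(html: str, base_url: str) -> set:
    """Collect same-host links from raw HTML (naive regex, no rendering)."""
    host = urlparse(base_url).netloc
    links = set()
    for href in re.findall(r'href="([^"#]+)"', html):
        link = urljoin(base_url, href)
        if urlparse(link).netloc == host:
            links.add(link)
    return links

def crawl(start_url: str, max_pages: int = 50) -> dict:
    """Breadth-first crawl of one host, returning {url: HTTP status}."""
    seen, results = {start_url}, {}
    queue = deque([start_url])
    while queue and len(results) < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                results[url] = resp.status
                html = resp.read().decode("utf-8", errors="replace")
        except urllib.error.HTTPError as err:
            results[url] = err.code  # e.g. a 404 -- exactly what audits surface
            continue
        for link in extract_internal_links(html, url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return results
```

The resulting status map is the raw material for the reports these tools produce: every URL that comes back 404 or 5xx, or that no internal link ever reaches, is a finding.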


Once you've done that—and we were talking about prioritization earlier—you need to begin prioritizing those issues, which we'll get onto in a second. A very important thing to do is to schedule these audits and give them some regularity. Now, regular doesn't necessarily mean often. It doesn't necessarily mean, you know, daily, or three times a week, or anything like that. For a lot of sites, weekly or maybe monthly is the kind of cadence you might be looking for.


Again, what we do for a lot of big clients is they will have one big check of the site monthly or bimonthly. And then smaller—sort of what we would call tactical crawls and audits—more weekly. And then also, the last point is, and this is kind of what Nati was talking about before, is not to waste time on SEO myths. Or not to waste time on rabbit holes that don't necessarily lead anywhere, right?


It's very easy to get caught up on weird issues or small issues that don't provide much impact, and ignore a page over here that's not being indexed anymore, or a load of broken links, or something like that. So it's always worth focusing on that stuff. And again, always try and avoid hearsay on Twitter, or wherever, about what the new, definite big ranking factor is. Because again, I'm sure Nati and I have both seen a lot of them over the years that have turned out to be not an awful lot. But someone's done a study, right? A really flawed study, and said that if your page is green, it ranks better. It's very easy to get lost down these rabbit holes. Is there anything else on that? I'm just looking at the time. Or do you want to jump straight into how to prioritize?


49:02

Nati: I think we could jump to how to prioritize. I know that people have been, you know, saying we don't know the tools too well, to actually make the connection—what are you talking about? So people all over the chat, please understand we don't have time to show the tools. We do have a previous webinar that explains a lot of the tools and the basics.


We will, however, when we can, share links to articles explaining how to use the relevant Wix tools and when—so don't worry about that. I'll make sure that happens. Right now, you need to take away a way of thinking, a way of looking at a website through the eyes of a non human—a software bot. And that's what we're trying to impart here.


So, down the line, you'll get all the links explaining how to do all of these things, and additional explanations. And that's that. I think we can go to prioritization.


But before prioritizing—it doesn't matter if that's the Deepcrawl app or anything else. You mentioned a website audit. A website audit is just an inspection, right? Like a car inspection. You bring your car in, and they tell you everything that's wrong, right? So technical SEO is about finding what's gone wrong, and what you expect to be there but isn't. So before prioritizing—how would you go about scheduling these audits? I've decided on the software I'd like to use, okay, I've connected Google Search Console. I've decided to use Deepcrawl or Screaming Frog or whatever other tool I've heard of. Should I do it every day? Should I do it every week? How much time are the people here supposed to invest in technical SEO?


51:18

Chris: I would say that it can vary enormously with the size of the site and how much content you're putting out. I think for the vast majority of sites, a monthly audit and health check is more than enough. Again, most tools offer this—Deepcrawl has automated crawling, and the Deepcrawl app within Wix will crawl your site weekly.


But also, most tools—again, Screaming Frog, you can set schedules within Screaming Frog as well. Although do remember that means the computer with Screaming Frog on it does need to be turned on at the time. I'd maybe recommend scheduling that for a Monday morning while your computer is on, or leaving your computer on for a couple of hours on a Friday evening, something like that, in order to get the data. But yeah, I would say monthly for the majority of sites, but weekly is a great way to make absolutely sure that you find anything basically as soon as it happens, right?


52:28

Nati: Hopefully. Eventually, everything breaks, right?


52:31

Chris: Right. Exactly. Yeah. And sometimes it breaks and you don't know why. So.


52:38

Nati: Okay, so let's discuss a bit about prioritization, because I feel this is the most important aspect of doing SEO work, not just technical SEO. So lay it on me, what's the first thing? There are plenty of issues. I go to Search Console, I see a lot of errors and statuses and notices Google is throwing at me, and Deepcrawl is throwing stuff at me. And it's overwhelming. I don't understand half of it. What should I target first? What should I invest my limited time in resolving?


53:16

Chris: Yep. The biggest thing is always content. Any issues that make content non-indexable, right? So any situation where you have a page on the site that Google is not going to be able to see. That's issue number one. I would always rather Google see a page with issues than not see a page that's otherwise perfect, you know.


So internal linking is always going to be a thing that we're going to look at straight away. If you've got links to big pages that are broken, you're going to want to focus on those. After that—


53:54

Nati: Sorry for stopping you. Okay, what does that mean? That the page is or isn't indexable?


54:00

Chris: No problem. So a page that's not indexable is a page that basically cannot be put into the index by Google. There are tags within a page that you can use to set this. What I would say is, for the vast majority of eComm websites and things like that, you will want the majority of pages indexed. And your indexing-or-not-indexing strategy is a bigger topic, I would say, than a lot of this. But on the whole, yeah—a page that is not findable, or not able to be put into the index by Google, is a page that we would consider to be not indexable. Whereas a page can be indexed and no longer findable by Google, should links be removed or things like that.
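Chris's "tags within a page" can be checked mechanically: a page is generally treated as non-indexable if a robots meta tag, or an `X-Robots-Tag` response header, carries `noindex`. A rough Python sketch of that check only; real indexability also depends on robots.txt, canonicals, status codes, and whether the page is linked at all:

```python
import re

def is_indexable(html: str, x_robots_header: str = "") -> bool:
    """Treat a page as non-indexable if a robots meta tag or an
    X-Robots-Tag header contains 'noindex'. A rough heuristic only."""
    if "noindex" in x_robots_header.lower():
        return False
    for tag in re.findall(r"<meta[^>]+>", html, re.IGNORECASE):
        t = tag.lower()
        # Only the robots meta tag counts; 'noindex' elsewhere is fine.
        if "noindex" in t and re.search(r'name=["\']robots["\']', t):
            return False
    return True
```

Running a check like this over every URL an audit crawl found is one way to surface the "Google can't see this page" problems Chris ranks as issue number one.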


54:48

Nati: So there are a lot of—again, it depends, like everything in SEO. And that's the biggest lesson here, people: everything depends in SEO. Say I don't know anything about indexing, but I do want to go about finding issues. What are the things that usually make Google not want to index something—meaning that I haven't instructed Google not to index anything?


If we think about the index as a big library, then each website is a book, and each page on that website gets a page in a book. And I haven't told Google, look, ignore this chapter, don't index it. But still, Google hasn't indexed it, right? I can see some of my posts aren't being indexed. Google Search Console told me this is not part of the library's offering. What should I look out for, even if I'm not a technical person?


55:54

Chris: I would say again, like you mentioned before, use your keyboard. Google your own website, Google the things that you're trying to rank for, and see if you are there—that's always the biggest indicator. Also, again, if you're not a technical person, you don't want to be looking for tags; I believe there are toggles and things like that within Wix to actually noindex pages.


Well, yeah, that would be the first place I'd start, if, obviously, you're looking at a website that you own, and a website where you have control of the CMS. I'm just looking at the time, Nati; we have a couple of minutes left. So we have a next slide here on how to keep a site healthy.


Yeah. So obviously, we've talked about automated and scheduled crawls before. Obviously, we have apps and crawlers out there. A big thing for me is to always consider SEO with every single change that you make, and also to educate the people who work on your site. So whether that's your colleagues, or if you have a situation where you're a consultant, things like that. Try and teach everyone in the business about SEO, because, again, I've worked in quite big companies, and obviously, the bigger your company gets, the more points of failure there are. In all aspects of SEO, the people are normally the biggest point of failure, right?


Someone might decide one day, hey, I don't know why the URL for this page is this, I'm going to change it. And then not fix the links—or a number of different issues. I don't like the H1 tag, I'm going to change the header of this page—oblivious to the fact that you've spent the last six months building links to that page, and you've worked really hard to make sure that page is super-optimized, blah, blah, blah.


So I would say to always try and teach as many people as you possibly can within your business what SEO is and why it's important—just to think about it with every change you make. And also, like we mentioned right back at the start, the great thing about technical SEO is, you have documentation. It's the one bit of SEO where Google will actually lay it out and say, here's how you do it correctly. And even better, here's a tool that you can just test it with. Use those tools. Use that documentation to build things out to the letter. It's why I stayed in technical SEO and moved away from traditional SEO. I like to be able to do a thing, and then press a button, and have a little pat on the head from somebody that says, yep, you've done it right, well done. So I would always focus on that stuff. What about yourself, Nati? What are your tips for keeping a site healthy and in good shape?


58:46

Nati: Prioritization, of course. I think for most websites, a monthly crawl is great. But I think it's important not just to audit a website, meaning to run a program or a SaaS service or some sort of software. It's also about tracking the number and types of errors over time. You want to see that your website health is trending up, right? You don't want to see an increase in errors; you want to see a decrease in errors.


So one of the things I would constantly monitor is: last month I had this, this, and this issue—have some of them increased? Gotten worse? Or have they improved? If I notice a trend—for example, an increase in 404s—maybe there is a wider issue that I haven't solved at the core yet. So always look at the trends and compare what the services and the tools serve to you.
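The trend-watching Nati describes boils down to diffing two audit snapshots of issue counts. A small Python sketch, with made-up issue names; use whatever categories your crawler actually reports:

```python
def audit_trend(previous: dict, current: dict) -> dict:
    """Compare two {issue_type: count} audit snapshots and return the
    per-issue delta, so regressions stand out at a glance."""
    issues = set(previous) | set(current)
    return {issue: current.get(issue, 0) - previous.get(issue, 0)
            for issue in issues}

# Hypothetical monthly snapshots from any auditing tool:
last_month = {"404s": 12, "missing_titles": 3, "redirect_chains": 5}
this_month = {"404s": 30, "missing_titles": 1, "redirect_chains": 5}

deltas = audit_trend(last_month, this_month)
regressions = {k: v for k, v in deltas.items() if v > 0}
# A jump in 404s like this hints at a wider root cause left unfixed,
# which is exactly the signal to chase first.
```

Tools like the Deepcrawl app and Search Console draw these trend lines for you; the point of the sketch is just that the underlying comparison is simple enough to automate against any tool's export.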


So I know the Deepcrawl app does that. You can compare—there are trend lines there that you can see. I know that Google Search Console, in many reports, also offers trend lines for various things. I would say that the most important tool—and I beg Deepcrawl's forgiveness—the most important tool in a tech SEO's arsenal is Google Search Console. You can connect it, by the way, in a couple of clicks using our Connect to Google feature in the Wix SEO Tools, under Marketing & SEO. Always go there. Always see what's stopping your website from reaching its full potential.


Now, no one is going to teach you to be a tech SEO in an hour. You know what, they're not even going to teach you to be a tech SEO in a month. Experience, time, and a lot of research go into that. The important thing is to check, to encounter issues, to do a Google search, or, you know, ask for advice in supportive communities, on Twitter or in Facebook groups, and then resolve it. What we try to do on our end is make it so you don't have to take care of that at all.


But it's always, always, always important to check it once in a while. Just like you have your car tested and licensed every year so it doesn't break down while you drive. That's the same thing here. You don't want something silly on your website, impacting your business or your clients’ business. And that's what tech SEO is all about. It's what could I or someone else screw up? And how do I fix that? And it's not something you'll get right into right away. You have to keep at it. But the first step is connecting to Google Search Console, diving into the data—wait a couple of days for the data to populate.


Dive into the data. Understand what Google is pushing back at you, meaning if Google is surfacing it, I guess it's important. And one last tip. A lot of people have been noticing around the web—I manage a lot of large SEO communities and a lot of people will be noticing—there's an increase in “crawled” but not “indexed” in Google Search Console. I’m sure you've encountered it on many websites, forums, discussions. So, that's a great example of how sometimes it's not your fault. Remember, Google is a third party—they have bugs, they have preferences, they have limited resources, even if it's Google. It's not always your fault. But when it is, you better take care of that.


1:02:44

Edu: Awesome. Nati, Chris, I want to thank you both for your time. You know, this one hour was insane. A lot of really good insights; we had over 250 questions asked in the Q&A. Plus, I can easily say 200,000 chat comments here. So it was really, really good to see all this movement, and everyone is really, really excited to hear from you both.


As you said, it's not going to be something you learn overnight, it's not going to be something that's easy to just stop and do it and okay, now I know tech SEO. It takes time, it takes you know, energy to go through things.


And again, Deepcrawl, together with Wix SEO and all the features we released recently, helps a lot, helps a ton. So guys, if you have additional questions—I know we couldn't go through them all, because with 250 questions we would be here for like five days and not be able to answer all of them—feel free to ask us. If you are part of any of the communities we own, like the Editor X community, the Wix Partners community, or All Things Wix, join them and keep asking your questions there. If you want to go on Twitter, make sure you tag us using the hashtags TechSEO, WixSEO, SEOWix, Deepcrawl. Go for it, go crazy, ask questions there. I'll try and convince the guys to maybe spend some time there to get the questions answered on Twitter as well.


But above all, I want to be respectful of your time, and we dedicated one hour here. So again, thank you so much, guys. Kermit says hi, as well. Yeah, thank you, everyone. Chris, Nati, thanks for all the time, and thanks to everyone in the backend here who's not showing their faces—it's a lot of people behind the scenes making this happen. So thank you, everyone. Thanks for everything. Have a great day.


1:04:19

Nati: SEO rocks.


1:04:20

Chris: Thanks.

