
Breaking Down Technical SEO



Join experts from Wix and Deepcrawl for this session to understand how to keep your website healthy for search engines. Learn how to improve your organic performance with best practices that address common technical SEO issues.


Read the Transcript



In this Webinar We Covered:


* The fundamentals of technical SEO


* Recognizing, prioritizing and resolving common issues


* Tools and tips for maintaining a healthy website



About our Speakers


Nati Elimelech, SEO Technical Lead, Wix


With over 15 years of experience in SEO, Nati teaches advanced SEO solutions, large-scale SEO, and handling SEO infrastructure challenges to veteran SEOs and agencies. When he’s not working on making millions of websites SEO-friendly, Nati spends his time with his lovely wife, baby girl, cat, and dog.


Twitter | LinkedIn



Chris Spann, Senior Technical SEO, Deepcrawl


Chris works with some of Deepcrawl’s largest clients. He has a consistent track record of delivering key recommendations that pave the way for identifying significant opportunities, and he has seen a wide range of SEO issues and their solutions.


Twitter | LinkedIn




About our partner: Deepcrawl


Deepcrawl is a pioneering SaaS platform that empowers the world’s leading brands to harness their revenue potential through technical SEO. Its cloud-based enterprise technologies help brands diagnose and fix performance issues on their websites. Deepcrawl has partnered with Wix to launch a custom-built app that helps users to identify opportunities for SEO. It’s now available on the Wix App Market.


 

Transcript: Breaking Down Technical SEO Webinar


Speakers


Edu Giansante, Head of Community, Wix

Nati Elimelech, SEO Technical Lead, Wix

Chris Spann, Senior Technical SEO, Deepcrawl



00:03

Edu: Alright. We're going live here. Let's see, people are coming in; I can see a lot of new numbers and faces popping up here. This is insane. It's a lot of people. Wow. Welcome, everyone. Welcome to our amazing, amazing webinar we have today on SEO. We're gonna go technical on SEO. And we have two amazing people here with me. I'll give it like 30 seconds before we start, so maybe we can do like an icebreaker here.


I was asking the guys actually, what should I ask—if it's something related to music or to, I don't know, drinks? And they're like, well, we all have young kids here, so we’re not into this. So my icebreaker question for you guys. And in the meanwhile, I want to hear where you guys are joining us from. Let me know here in the chat. By the way, when you chat, you can select the option so everyone can see. Okay. Oh my God, this is going insane. This is crazy. Okay, Nati, Chris question for you guys, because you have young kids. What would be your dream, like, you know, ability that you want your kid to learn in the next five years? That would be like, wow, I love my kids so much more.


01:11

Chris: For me, I think the big thing, so my daughter is just starting to learn to sing. And I'm really looking forward to it because I'm quite musical. But I'm really looking forward to her learning. So although it's almost cuter at the moment, because she kind of can't sing. So in a way, I wish that would stay as it was but yeah, I'm looking forward to actually being able to play my guitar and sing a song with her. I think that's gonna be great.


01:36

Edu: Oh, man, it's amazing. I would love that. Actually, you know that statistically, your daughter has more chances of becoming a famous singer than she has of becoming a famous YouTuber. There are more YouTubers than musicians out there. So better chances for her to be the next Lady Gaga. Who knows?


01:52

Chris: Well, you know what, this chat is making me feel like a Twitch streamer. So.


01:56

Edu: Oh, yes. How about you, Nati?


02:02

Nati: So Rose, my daughter, is 18 months old now. My wish is a bit more practical. I want her to be able to walk the dog, to walk the Husky. It gets so cold during the night, man, she needs to step up. You know, take a part of the load on and I'm just waiting for her to catch up with that.


02:24

Chris: Yeah, we have two dogs that do not get walked as much as they used to.


02:31

Edu: I'm in the dog stage still. So I’m still walking my dog myself. Amazing. Oh my God. I'm seeing so many people coming from a lot of places. It's incredible. Like, I mean, all over the world. We have India, we have Canada, we have the US. We have a lot of countries in Europe. I’m in Europe as well, myself. We have Israel. Wow, insane guys insane.


And we beat the 1K, this is amazing. Oh my God, we have 1.1K people here. This is next level amazing. So I will respect your time and get started, cause I know you guys are here to see and hear about technical SEO, not about, you know, what we're going to do with our kids and our dogs.


So let me share my screen here very quickly, and we're going to get started. So I want to introduce you to these two amazing guys here, Nati and Chris. These guys are like the next level of tech SEO and we're gonna bring this conversation and, you know, make it very informal, because we want to make sure that you guys feel comfortable about asking questions. Because we have this crazy chat with a lot of questions coming in, if you guys have any questions, anything you want to ask, please use the Q&A icon there. There's a Q&A box. You can click on that. Throw your questions there. We have a ton of people behind the scenes here in the backstage making sure that the questions get answered. And I'll throw some of the questions live to the guys as well. Without further ado, I want to hand the mic to you both, Chris and Nati. Welcome, please, the floor is yours.


04:04

Chris: Cool, thanks very much. I guess if we can jump into the first slide then, Edu. Okay, yep. So basically this session today is kind of designed to help owners of Wix sites, especially, understand a little bit about technical SEO.


We're going to cover off some of the basics and the fundamentals of technical SEO and what technical SEO is, a lot of common issues that we see. So my job at Deepcrawl is that I’m a member of the professional services team. So I spend a lot of my time auditing and looking at big enterprise level sites and some smaller sites as well.


So we're going to discuss common issues that we see across sites like this, and then also tips and best practices and things like that for maintaining a healthy website. Obviously, Nati is here as the Wix expert. And I'm here, hopefully, to bring a little bit of experience from how we see things at Deepcrawl as well. So I guess if we can skip on?


Edu: Yeah.


Chris: Nati, would you like to say this?


05:26

Nati: Yeah, sure. So let's talk about what the definition of technical SEO is really about, without getting into too much detail. And I do want to stress that SEO, technical SEO is an entire discipline. Don't, you know, feel out of place, if you don't understand everything right away. It's about a way of thinking, a way of looking at things, rather than remembering everything by heart and remembering code by heart.


But the biggest aspect, the gist of technical SEO, is about playing nice with search engines. You know, search engines and their bots, crawlers; these are all, basically, software. And technical SEO is mainly about making a website bot-friendly. Taking care of all the special needs that software, that bot, that crawler has, so our website can actually rank in Google. In other words, and in most cases, technical SEO is about getting out of the way. It's about getting out of the way and helping your content rank if it's good enough, if it satisfies the search intent. It's about not holding back your progress or halting it because of certain issues. That's the way I see it.


06:53

Chris: Yeah. The way I always think about it, technical SEO is making the information retrieval bit, right, of Google's life easier. So Google, half of Google is information retrieval—is getting information out of websites. The other half is understanding what that information is, and then deciding who does it best. This is very much focusing on just making sure that Google—and by extension users—can get the information that they want out of your website.


So I guess if we just briefly touch on how Google crawls websites. So effectively, obviously, by crawl we mean when Google visits your website, goes through it and finds as many links as it can on that website to understand as much as it can about that website, as much of what is on your website.


And it does it, nowadays, pretty much like a user does. It uses Chromium, the open-source version of Chrome, which is designed to be more easily automated. So back in the old days, Nati, as you remember, Google used to just kind of request content, and then whatever code came back, it would try and parse that code and understand what the page was about from that.


Nowadays, it still does do that. But it will also now actually take a visible representation of your page. And it will try and use that to actually understand what a page is about as well. So kind of gone are the days of the little tricks you used to be able to do to hide things from Google or to make Google think things were there that weren't. And nowadays, yeah, as I say, you now have to treat Google a little bit like a real person.


08:36

Nati: Definitely, so I love that you mentioned that search engines and their bots are about information retrieval. So let's go over what that information is. Okay, what do these bots care about, right?


So to make it simple, Google, or the bot, knows about a certain URL on the web, it doesn't matter which website, and then it goes and fetches it, right? It makes a request to that page, to that URL. The server returns what we call a response. And that response contains the headers and the HTML Google cares about. Now, basically, Google extracts things from that. It doesn't use everything, but it extracts what it cares about.


So one, of course, is the status code. We're talking about 200s, 301s, 404s, 500 codes. Whenever we mention those, you should know that that's the first thing. Any client, any software that fetches a URL, even your browser, that's the first thing it sees. Is this page okay to crawl? Can I continue? So whenever the bot fetches that URL, it gets the response, and it also gets the HTML. The HTML, if everything is okay. The response is the interesting bit, because Google takes two things away from that.


One is the content, or the main content. Now when we're talking about content, it's not necessarily a long-form post, a blog post or an article—images are content, videos are content, almost everything that a user sees is content. Almost, right? If we take away the UI and the different functionality of a website, the rest is content, and Google takes that.


And the other thing Google does, is extract all the links on a page. Why? Because that's how Google and other bots know about the other pages on a website, right? Because whenever I fetch a page, I get all the HTML. The HTML contains the links, and those links will be added to my crawling queue.


Now I know about them, and then I can fetch them. So the most important part, I think the biggest foundation of technical SEO and of being crawl-friendly, is making sure Google can discover all the web pages, all the URLs, on a website—right?
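For anyone who wants to see that loop in code, here is a minimal sketch of the fetch, status check, and link extraction Nati just described. The start URL is a placeholder, and real crawlers (Googlebot, Deepcrawl, or any desktop tool) layer a lot more on top, but the core loop looks roughly like this:

```python
# Minimal sketch of the fetch -> status check -> link extraction loop.
# "https://example.com/" is a placeholder start URL; the crawl is capped
# so the example stays small.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start_url = "https://example.com/"
queue = deque([start_url])   # URLs the "bot" knows about
seen = {start_url}           # avoid fetching the same URL twice

while queue and len(seen) < 50:
    url = queue.popleft()
    response = requests.get(url, timeout=10)

    # 1. The status code is the first thing any client sees.
    if response.status_code != 200:
        print(f"{url} returned {response.status_code}, skipping")
        continue

    # 2. Parse the HTML that came back in the response body.
    soup = BeautifulSoup(response.text, "html.parser")

    # 3. Extract every link and add new same-site URLs to the crawl queue.
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"])
        if urlparse(link).netloc == urlparse(start_url).netloc and link not in seen:
            seen.add(link)
            queue.append(link)

    print(f"Crawled {url}: {len(seen)} URLs discovered so far")
```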


And then, what's the next layer, Chris? If I made sure, for example, Google does discover all the URLs on my website. First of all, what do you think I would need to pay attention to that may harm that cause, that may harm that goal, that may detract from Google's ability to figure out what's going on on the website?


11:28

Chris: So I guess the main thing, I guess, actually, if we jump into the next slide, potentially, we can start looking at some of the main issues. The very, very first thing we'd start to look at, and I think the big thing for a lot of sites—sometimes with enterprise sites, you can get really into weird details and really strange issues. But the biggest thing is always making sure that content is available, effectively, to crawlers. But also that content is as unique as possible.


So one of the first issues that we see with some websites, and this happens a lot with shop fronts and things like that, is what we call content duplication. Now, a key thing you need to get your head around with technical SEO is that, to Google, the URL and what comes back when it requests that URL are very separate things, or things that aren't necessarily related. So if Google asks for four different URLs and gets the exact same content, or almost the same content, back each time, it's going to struggle to understand which URL it should put in the search results, if that makes sense.


So if you have two or three pages that focus on the same product, or the same destination, say, you begin to create issues where Google doesn't necessarily understand which one is the real one—which is where canonical tags come into play. And a canonical tag (which, I believe, Nati, is just a setting in Wix) is effectively a way of saying to Google and to other search engines: the URL that lives in this tag here is the canonical version, the original, the progenitor. And these work across sites as well. So if you push out content to be syndicated, for instance, if you run a great blog, the sites that republish it can specify your site within the canonical tag, which means that you will then get the credit back from Google. Nati, I know in our dry run, we had a great example of duplicate content that you brought up, and I'm trying to remember what it is now.


13:52

Nati: Anything that has to do with adding parameters to the URL, basically, even UTM tracking parameters or sorting parameters. Everything that changes characters, adds characters, changes the URL, but serves the same content, is basically a duplication. But I do have a question. Other than Google not being sure what to show in search results, why is that such an issue? Is it a big issue? Does it become more of an issue for certain websites? What do you think about that?


14:29

Chris: I think for a lot of sites, duplicate content is a—it can be a smaller issue. And nowadays, it's certainly less of an issue than it used to be. Nowadays, Google is quite good. What we used to see back in the old days and where canonical tags came from, is people used to just straight up steal content, right? They would take content from well-ranking websites because the thinking was, oh, if such and such a website ranks really well for this keyword, then it must be their content. So they would steal their content, put it on the page.


These canonical tags help with issues like that. As I say, I think nowadays, unless you have millions of pages, Nati, or, I guess, a good example is like Nati was saying: if you have faceted navigation on your site, and you have, let's say, a "red dresses" page and a "red dresses under £50" page, but all your red dresses are also under £50, you now serve that exact same piece of content across two different URLs. Which, again, says to Google—Google starts to go, I don't necessarily know what to file this under, I guess. Whereas a canonical tag just helps you to say, hey, guess what, it's going here.
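As a rough illustration of the tag Chris is describing (the URL and markup here are invented, not Wix output), a parameterized or faceted page can point back at the clean URL, and a crawler reads that hint straight out of the page's head:

```python
# Illustrative only: a faceted/parameterised page declaring the "clean" URL
# as its canonical. The HTML is inlined here instead of being fetched.
from bs4 import BeautifulSoup

html = """
<html>
  <head>
    <title>Red dresses under £50</title>
    <link rel="canonical" href="https://example.com/dresses/red/">
  </head>
  <body>...the same product listing as /dresses/red/...</body>
</html>
"""

soup = BeautifulSoup(html, "html.parser")
canonical = soup.find("link", rel="canonical")
print(canonical["href"])  # -> https://example.com/dresses/red/
```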


So back at you then Nati, detecting duplicate content. I mean, it's kind of easy for me, we have a duplicate content function in Deepcrawl which helps me find this stuff. But I guess, is there an easy way of pulling this information out, if you're not using some—like a big enterprise crawler?


16:08

Nati: So, there are plenty of solutions out there. First of all, any type—most crawlers will get the job done, do exactly what we specified, fetch the HTML, extract all the links, crawl all the links, and then you will be able to see if you have duplication issues.


Another great tool is Google Search Console, which will alert you in some instances. But another great, great tool is just using your keyboard, copying a piece of text, and searching for it inside quotes. Then you can find out if your content is duplicated on other websites, or if it's duplicated and there are other URLs omitted from the search results. So you can use Google Search for that. You can use external tools or the app. And you can use Google Search Console, which you should always, always, always be using. It's a tech SEO's best friend, I think.
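To make that concrete, here is a rough sketch of what a crawler's duplicate-content check boils down to: fetch each URL variant, strip the markup, and compare a fingerprint of the visible text. The URLs are placeholders, and real tools use fuzzier matching than an exact hash:

```python
# Rough sketch of a duplicate-content check across URL variants.
# The URLs are placeholders; real crawlers do near-duplicate matching,
# but the idea is the same: same body text, different URLs.
import hashlib

import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/dresses/red/",
    "https://example.com/dresses/red/?utm_source=newsletter",
    "https://example.com/dresses/red/?sort=price-asc",
]

fingerprints = {}
for url in urls:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
    fingerprints[url] = hashlib.sha256(text.encode("utf-8")).hexdigest()

# Group URLs that serve identical visible text.
by_hash = {}
for url, digest in fingerprints.items():
    by_hash.setdefault(digest, []).append(url)

for group in by_hash.values():
    if len(group) > 1:
        print("Duplicate content across:", ", ".join(group))
```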


But I did notice that a canonical isn't really a directive, right? When I say, hey, this is the canonical version—I'm basically giving Google another hint, right? I'm not—I can't decide that for Google, because Google looks at other signals: other pages pointing to a URL, whether it's been linked to, and all other stuff like that. So what would you recommend when deciding on the canonical URL or the original URL? Is there anything you think needs to be done by tech SEOs on a site level? Like, I don't know, editing internal links?


17:54

Chris: I was about to say the biggest thing is internal links, right? If you have a holidays to Spain—if you have two holidays to Spain pages, and one is linked to in your main nav and in 10 blog posts, but you set the other one as the canonical, Google's probably going to go, well, hey, I don't think this is the original version of this content. This page that's pointed to all over the website, this feels like the actual canonical version.


And this is where you can get into some interesting issues sometimes, where people have produced great content, but they've canonicalized the wrong URL. And then Google ends up using the URL that is poorly linked to and goes, oh, well, this isn't a great page, and then ranks it poorly. I've just spotted somebody in the chat asking how you set up a canonical tag, and Nati, I'm gonna throw it to you; I believe they're just on by default in Wix.


18:54

Nati: That's very CMS-dependent, of course. In Wix, all URLs are canonicalized by default, so you don't have that duplication going on. But if you decide to edit it yourself for any reason, you can use the SEO panel to edit the canonical tag and override the default.


But out of the box, you shouldn't have any duplication issues, unless you do it yourself, if you copy an article over more than once. Or if, for example, you have a blog with a couple of posts, and you tag every post with all the categories you have. Basically, what you will be creating (and that's your fault, not ours) is a lot of different tag pages or category pages with the same content, right? Same posts. So it's down to user error at this point, and you can override our settings. But I think most CMSs out there at this point—the ones that do care about SEO—not only canonicalize their URLs, I hope, but also give you the ability to edit it. And if they don't, they should.


20:12

Chris: Cool. Let's move on then to our next slide, which is 404 errors and broken links. So Nati, I guess, first off, why would you see a 404 error? Where would you see broken links? How and why is this an issue?


20:33

Nati: Okay, so there are, again, a few ways to detect 404 errors. One is using the app (Deepcrawl, for, you know, a full offering of tools) or other crawlers to detect this. They just go over the links they find, and if something is broken, you're alerted to it.


The other one is, of course, Google Search Console. And I'll surprise you by also adding Google Analytics, if you actually have it configured, because then you can filter pages by the 404 page title. You can see all the URLs that have triggered a 404. And that applies to most CMSs; that's a cool trick. You can use Google Analytics, if you have it, to detect 404s.


So 404s come from two types of root causes. One is user error, of course: I've created a page, I've linked to another page on my website, and I didn't link it correctly; or I've linked to another page on my website, but later down the road I changed the URL of that page.


Alright, external causes are when someone has linked to me (I have a backlink to my website), but that backlink is broken, because they didn't paste it correctly, or because I, again, changed the URL structure or the slug and didn't do anything about it.


So there are internal and external causes to that. I do want to stress—not every 404 error is a big issue you should be taking care of right away. Because as you know, Chris, the larger websites, especially eComm websites, right? They tend to accumulate 404s over time. Sometimes not by their own doing.


So imagine I'm a website, right? I have Google Search Console, and I have done a crawl with an external tool, with a third party tool. And I see a lot of 404s. I don't have all the time in the world, SEO is about doing 200 different things at once, right? So how would you go about prioritizing 404s? And when are they something an SEO should be able to live with?


22:56

Chris: So I would say the biggest, the biggest time to worry about 404 errors, or a page returning a 404, or a link being broken effectively—is if you're in a situation where effectively one of your big pages, one of your big traffic drivers or one of your big conversion pages potentially breaks somehow.


Now this could be, you know, again, down to somebody deciding a URL needs to be changed for whatever reason, or links to that URL suddenly breaking. Obviously, I would always say that internal links within your site should always return a 200 status eventually, whether they go through a redirect or not. I think we'll get onto redirects in a little bit. But yeah, effectively—obviously again, as a user, if I'm clicking around a navigation on a website, or something similar, and I suddenly find that a page is broken when I click on a link, it's a nightmare. It's an awful user experience; I don't get anywhere. And crucially, more importantly, I then, as a website, don't get to rank for whatever is on that page, right?


So quite often, what we will see is we prioritize, essentially, the 404s that we find a lot of. Again, 404s like that are very, very easy to create: obviously, if you have a header nav across 200 pages on your website and you make a single typo in your header nav, you suddenly break 200 links. So we will always flag things like that. And obviously, quite often, I'm working on sites with millions of URLs, and these can create millions of 404s, which can be fixed by, you know, opening up the CMS, pressing Delete and adding the "s" on the end—or whatever the change is. So I would always recommend starting there: ensure that your header navigation and things like that all work perfectly, then your big content pieces, and go from there.
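As a rough illustration of that prioritization, the sketch below crawls a handful of pages, records every internal link that returns a 404, and sorts the broken URLs by how many pages link to them, so a broken header-nav link that is referenced from every page floats straight to the top. The start URL is a placeholder and the crawl is deliberately capped:

```python
# Sketch: find internal 404s and rank them by the number of referring pages.
# "https://example.com/" is a placeholder; the crawl is kept small on purpose.
from collections import defaultdict
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start_url = "https://example.com/"
site = urlparse(start_url).netloc

to_crawl = [start_url]
crawled = set()
referrers = defaultdict(set)   # broken URL -> pages that link to it

while to_crawl and len(crawled) < 50:   # cap the crawl for the example
    page = to_crawl.pop()
    if page in crawled:
        continue
    crawled.add(page)

    resp = requests.get(page, timeout=10)
    if resp.status_code != 200:
        continue

    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"])
        if urlparse(link).netloc != site:
            continue   # only check internal links
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status == 404:
            referrers[link].add(page)
        elif link not in crawled:
            to_crawl.append(link)

# Most widely linked broken URLs first: these are the ones to fix today.
for url, pages in sorted(referrers.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{url} is broken and linked from {len(pages)} page(s)")
```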


How to deal with these issues? As I say, quite often you've got a couple of options there. The best and easiest way to fix it is to fix the link, right? If it's an internal link. But the big problem sometimes, and this happens all the time, is with a great backlink, right? So you might have got some national press or some international press, but you might find that, despite you sending them the link that you want them to use, they've just got it wrong. Now they're not responding to your emails, etc., etc., etc. So what can you do? Obviously, the perfect thing to do is to go into your CMS, go into Wix or whatever, and change the link so it's now the right link that points to the right URL. But the other option is redirects, Nati, right? And I'm going to hand it over to you for this, just because obviously, I know you can talk about how to actually fix it within, or how to set up a redirect within, Wix.


26:23

Nati: Yep. So we said, first of all, you fix the link if you can, and then you redirect, right? And you want to redirect URLs that have had past signals. What I mean by past signals is that there was actual content behind them, that there was something tangible and real behind them, not just a systematic 404, so make sure you do prioritize that. Going about it in Wix: first of all, in many places (and we're extending that over time), whenever you change a URL slug, for example, or sometimes the URL structure we're working on, we will automatically add a 301 redirect for you. We want to fix the links for you. So in many cases we'll add it automatically, and you can disable it yourself.


If you want to add your own manual redirects—301 redirects—you can go to the Redirect Manager under SEO Settings, and you have many options there. You can do it one by one, you can do it in groups, you can upload a CSV. And that way you make Google understand—you make Google pass the signals, pass basically everything Google knew about your old URL, to the new one.


There is, of course, one major caveat here—that you can't just redirect anywhere you want. You should always, always, always redirect to the same or matching content. Meaning that just redirecting 404s to your homepage isn't going to help much. Just redirecting to another URL, another page that's important to you, isn't going to help for long. You want to always redirect to the best and closest match possible. Content-wise, of course, not URL-structure-wise. It doesn't matter what the URL is; it's about the content behind it. So always make sure you redirect to the right URL. And also make sure that the URL you're redirecting to is the canonical version, and that it returns a 200 OK, the one we mentioned earlier.
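If you want to sanity-check a redirect you have set up, a small script like the sketch below (the URL is a placeholder) will follow the hops and confirm the destination answers with a 200, and flag a chain if the old URL doesn't point straight at the final page:

```python
# Sketch: sanity-check a redirect. Does the old URL reach a destination that
# returns 200, and does it get there in a single hop? The URL is a placeholder.
# You would also confirm the destination is its own canonical, as shown earlier.
import requests

old_url = "https://example.com/old-red-dresses/"

resp = requests.get(old_url, allow_redirects=True, timeout=10)

hops = [r.status_code for r in resp.history]   # e.g. [301] for a clean redirect
print(f"{old_url} -> {resp.url} via {hops}, final status {resp.status_code}")

if resp.status_code != 200:
    print("Warning: the redirect target does not return 200 OK")
if len(hops) > 1:
    print("Warning: redirect chain; point the old URL straight at the final destination")
if not hops:
    print("No redirect in place: the old URL answers directly")
```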


Chris: Yeah.


28:47

Edu: So I have a question, maybe a dummy question here. But I'm getting like a lot of insights here, which is really, really good. But I have two questions. One is, how do I actually edit the URL?


And the second question related to that is—because you mentioned like a human error where they forgot an “s” and then crashed the whole thing. But is there an easy way for us to troubleshoot and find the source of the 404 issue?


29:12

Chris: So I can take the second bit, in terms of finding the source. So again, any web crawler that you might use will always highlight a 404 error. Search Console also contains a list of 404s that Google has found, which is obviously a great resource. But also, a little bit of manual UX testing of your site, I guess, will help you find anything. Again, you know, before you push a new navigation live, make sure all the links in it work.


Those will always be my choices—whenever you push content live, make sure those links work; go and check yourself. But also you've got, as I say, Search Console, and you've got any web crawler you choose to use. Those would be my first two options. Nati, I'm going to pass fixing the link, or actually editing the link, over to you, because again, it feels like a Wix question.


30:24

Nati: So in Wix, there are a couple of places where you can do that. You can do it in the SEO panel, where you can edit the slug for the type of page you're working on, in most cases. And you can do it to a greater extent in our SEO Patterns. You can actually change the entire structure for your blog posts or your product pages. And we're extending that as well to other page types.


So you can do it as a whole—for example, for all of your blog posts: if you wanted them to start with “banana” and end with “someone take my Husky out for a walk please”, you could. And if you just wanted to edit the specific slug for a specific page, blog post, product, whatever—you can do that as well.