What are vector embeddings for SEO? Insights from Mike King
- George Nguyen
- Apr 23
- 4 min read
Updated: Apr 28

Search engines are integrating LLM technology, like AI overviews, to provide more chat-like experiences in search, but the impact of this convergence doesn’t end there—it’s allowing search engines to rely less on keyword matching and more on semantic search to understand both user queries and the document (e.g., web page) under consideration.
In our webinar on SEO for conversational search, Mike King, CEO and founder of iPullRank, explained the technical underpinnings of modern, LLM-powered search engines, emphasizing the crucial role of vector embeddings.
In this article, I’ll explain what vector embeddings are and their significance in SEO, drawing on insights and examples from King’s presentation.
What are vector embeddings in SEO?

Vector embeddings are a method LLMs use to assess the relationships between different pieces of content. They are numerical representations of words, phrases, or documents in a multi-dimensional space. These representations capture the semantic meaning of the text, allowing search engines to understand those relationships in a way that's closer to how a human would.
“Search engines operate on what’s called the vector space model,” King said during the webinar.
“What they're doing is creating vector representations of your query and vector representations of documents—these are effectively coordinates in multi-dimensional space. And so, whichever documents are physically closest in space to the query are considered the most relevant.” — Mike King, Founder & CEO, iPullRank

This technique transforms the traditional view of relevance (which is often a qualitative notion) into a quantitative measure.
This means that relevance (as far as SEOs in 2025 and beyond are concerned) is not just about matching keywords, but about understanding the underlying meaning and context of the text.
How vector embeddings work
“You’re literally taking these documents, breaking the sentences down, turning those into numbers, and then based on those numbers, two things that are saying the same thing will have similar representations in that multi-dimensional space,” King said.

For example, “If you take the vector representation of the word ‘king,’ and you subtract the vector representation for the word ‘man,’ and then you add the vector representation for the word ‘woman,’ the closest match will be the vector representation for the word ‘queen,’” he explained. This demonstrates how mathematical operations on vector embeddings can reveal semantic relationships between words.
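As a toy sketch of that arithmetic: the two-dimensional vectors below are hand-made assumptions (dimension 0 standing in for "royalty," dimension 1 for "gender"), not real learned embeddings, which typically have hundreds of dimensions. Still, the same subtract-and-add logic applies.

```python
import numpy as np

# Hand-crafted 2-D "embeddings": [royalty, gender]. These are illustrative
# assumptions chosen so the analogy arithmetic works out, not model output.
embeddings = {
    "king":  np.array([1.0,  1.0]),   # royal, masculine
    "queen": np.array([1.0, -1.0]),   # royal, feminine
    "man":   np.array([0.0,  1.0]),   # not royal, masculine
    "woman": np.array([0.0, -1.0]),   # not royal, feminine
}

# king - man + woman should land closest to queen.
result = embeddings["king"] - embeddings["man"] + embeddings["woman"]

# Find the word whose vector is nearest (by Euclidean distance) to the result.
closest = min(embeddings, key=lambda w: np.linalg.norm(embeddings[w] - result))
print(closest)  # queen
```

With real embeddings (e.g., word2vec or GloVe vectors), the same operation famously returns "queen" as the nearest neighbor, which is what King's example refers to.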
Measure relevance with cosine similarity
Once vectorized, you can use cosine similarity to measure the distance between vectors in this multi-dimensional space.

“[Cosine similarity] is basically measuring the distance between angles. So if you get a cosine similarity that's close to one, that means it's really similar or highly relevant. If it’s close to zero, that means it’s orthogonal or not related. And if it's close to negative one, that means it’s opposite.” — Mike King, Founder & CEO, iPullRank
This suggests that search engines calculate the relevance of content by determining how close the vector representation of a document is to the vector representation of the user’s query. The closer the vectors, the more relevant the document is considered to be.
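To make the "close to one / close to zero / close to negative one" ranges concrete, here is a minimal sketch of the cosine similarity calculation. The three-dimensional vectors are made-up illustrations; real query and document embeddings are far higher-dimensional.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: ~1 means highly similar,
    ~0 means orthogonal (unrelated), ~-1 means opposite."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up embeddings for illustration only.
query         = np.array([0.9,  0.1,  0.3])
doc_related   = np.array([0.8,  0.2,  0.25])   # points in a similar direction
doc_unrelated = np.array([-0.1, 0.9, -0.2])    # points in a different direction

sim_related = cosine_similarity(query, doc_related)
sim_unrelated = cosine_similarity(query, doc_unrelated)
```

Here `sim_related` comes out close to 1 and `sim_unrelated` close to 0, mirroring how a search engine would score the first document as relevant to the query and the second as unrelated.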
Should you use vector-based techniques for SEO?
You do not need to focus on all possible SEO techniques; just the ones that are right for your particular business, niche, and level of competition. Keep that in mind as you evaluate whether vector-based workflows are right for your brand—for example, if you are a small business that hasn’t even set up Google Search Console yet, you should cover those SEO basics before moving on to advanced strategies.
“Vector embeddings are fundamental to how both modern search engines and conversational search make sense of content,” King said when asked what type of businesses should adopt vector embedding for SEO, explaining, “There is value in any business that wants to appear in them using them.”
“If you're looking to engineer the relevance of your content to perform better, [vector embedding] is a good place to start. That said, if you are a small business, you should focus on building your brand and aligning with your audience expectations first. Dig into advanced techniques like this once you have a solid foundation.” — Mike King, Founder & CEO, iPullRank
During the webinar, I commented that the ‘spirit’ of vector embedding is very similar to journalistic and content creation best practices. So, if I (like many other editors and content marketers) already adhere to those standards, should I pursue vector-based techniques?
While these tactics are not mutually exclusive, King advised that I keep doing what I’ve been doing: “It’s how Google is modeling what it is that you do. What you’re describing is exactly what you should continue doing (e.g., deep research on the topic, adding a lot of information related to that topic).”
![A quote from Mike King that says: “The representation that Google has mathematically is them training on the entirety of the internet and knowing what those relationships look like. So there’s a lot of different principles behind natural language processing that reflect real-world usage of words.
So, what [Google is] doing is they’re just saying like, ‘Okay, well we know how words are used typically across all the books in the world and all the documents on the internet, and so we’ve modeled that. How well does your content match that model?’
That’s not me saying, ‘Hey, you gotta copy word-for-word what someone else said’; you just have to represent those ideas as well as other people have—even if you use different words.”](https://static.wixstatic.com/media/a484d4_0c6c2a6bef8945bcb40f8ba4197e0e2a~mv2.jpg/v1/fill/w_147,h_83,al_c,q_80,usm_0.66_1.00_0.01,blur_2,enc_avif,quality_auto/a484d4_0c6c2a6bef8945bcb40f8ba4197e0e2a~mv2.jpg)
Get started with vectorizing using Screaming Frog
Vectorizing your content can serve a range of SEO use cases, including clustering, classifying, making recommendations, measuring similarity/diversity, detecting anomalies, retrieving information, and generating and translating text.

To get started, check out Vector Embeddings is All You Need: SEO Use Cases for Vectorizing the Web with Screaming Frog, where King provides a tutorial on setting up Screaming Frog for vector analysis (including a custom GPT to help you write your code).
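One of the simplest of those use cases, information retrieval, can be sketched as ranking pages by cosine similarity to a query vector. The URLs and numbers below are hypothetical; in practice the embeddings would come from a model via a tool like Screaming Frog's custom JavaScript crawling.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical pre-computed page embeddings (illustrative values only).
page_embeddings = {
    "/guide-to-vector-embeddings": np.array([0.9, 0.2, 0.1]),
    "/keyword-research-basics":    np.array([0.4, 0.8, 0.1]),
    "/company-history":            np.array([0.1, 0.1, 0.9]),
}
query_embedding = np.array([0.85, 0.3, 0.05])

# Rank pages from most to least relevant to the query.
ranked = sorted(
    page_embeddings,
    key=lambda url: cosine_similarity(page_embeddings[url], query_embedding),
    reverse=True,
)
print(ranked[0])  # /guide-to-vector-embeddings
```

The same scoring loop underpins clustering (group pages whose vectors sit close together) and anomaly detection (flag pages whose vectors sit far from everything else on the site).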
Google has already adopted vector embeddings—so should you
By representing content (text, images, etc.) as numerical vectors, search engines can capture the semantic meaning and relationships between words and documents. As search engines move further towards semantic search and conversational AI, optimizing content for relevance goes beyond simply using keywords. It involves creating content that clearly conveys meaning and context.
“As natural language processing technology has yielded denser embeddings (as compared to the sparse embeddings featured in approaches like TF-IDF), Google has improved its ability to capture and associate information on a passage, page, site, and author level. Google moved on a long time ago, but with the rapid advancements in vector embeddings, we can catch up.” — Mike King, Founder & CEO, iPullRank
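The sparse-versus-dense distinction King draws can be shown with a tiny contrast. The vocabulary size, indices, and values below are made up purely to illustrate the shapes involved: a TF-IDF vector has one slot per vocabulary word and is mostly zeros, while a dense embedding packs meaning into every dimension.

```python
# Sparse TF-IDF-style vector: one slot per vocabulary word, almost all zero.
# (Vocabulary size and indices are illustrative assumptions.)
sparse_tfidf = [0.0] * 10_000
sparse_tfidf[42] = 0.7    # weight for, say, the word "vector"
sparse_tfidf[917] = 0.3   # weight for, say, the word "embedding"

# Dense embedding: few dimensions, every one carrying learned meaning.
# (Real models use hundreds of dimensions; values here are made up.)
dense_embedding = [0.12, -0.58, 0.33, 0.91, -0.07]

nonzero_sparse = sum(1 for x in sparse_tfidf if x != 0.0)
print(nonzero_sparse, len(sparse_tfidf), len(dense_embedding))
```

Because dense vectors encode relationships between words rather than just their presence, they let Google associate meaning at the passage, page, site, and author levels King describes.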
George Nguyen is the Director of SEO Editorial at Wix. He creates content to help users and marketers better understand how search works. He was formerly a search news journalist and is known to speak at the occasional industry event.