
{"id":16463,"date":"2023-04-20T09:28:05","date_gmt":"2023-04-20T13:28:05","guid":{"rendered":"https:\/\/ipullrank.com\/?p=16463"},"modified":"2025-07-31T16:03:27","modified_gmt":"2025-07-31T20:03:27","slug":"content-relevance","status":"publish","type":"post","link":"https:\/\/ipullrank.com\/content-relevance","title":{"rendered":"Relevance is Not a Qualitative Measure for Search Engines"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"16463\" class=\"elementor elementor-16463\" data-elementor-post-type=\"post\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-04af6b6 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"04af6b6\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-81dab01\" data-id=\"81dab01\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-2b703af elementor-widget elementor-widget-text-editor\" data-id=\"2b703af\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Writing my book \u201c<\/span><a href=\"https:\/\/www.amazon.com\/Science-SEO-Decoding-Search-Algorithms\/dp\/1119844835\/\"><span style=\"font-weight: 400;\">The Science of SEO<\/span><\/a><span style=\"font-weight: 400;\">\u201d has been incredibly eye-opening as I\u2019ve dug deep into the minutia of how search engines work.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">Fundamentally, SEO is an abstraction layer wherein we manipulate inputs (site architecture, content, and links) for a search engine to get specific outputs 
(rankings and traffic). At that abstraction layer, we talk about aspects of those inputs as being qualitative (good vs bad) and quantitative (PR 7 vs LCP 2.5), but a search engine is a mathematical environment, so even a qualitative feature must be quantified through a quantitative proxy.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">For example, the Panda update is said to have looked at page quality and the utility of content for users, and we were presented with <\/span><a href=\"https:\/\/developers.google.com\/search\/blog\/2011\/05\/more-guidance-on-building-high-quality\"><span style=\"font-weight: 400;\">qualitative direction<\/span><\/a><span style=\"font-weight: 400;\"> on how to create better content that yields user satisfaction. However, according to the \u201c<\/span><a href=\"https:\/\/patents.google.com\/patent\/US8682892B1\/en\"><span style=\"font-weight: 400;\">Ranking search results<\/span><\/a><span style=\"font-weight: 400;\">\u201d patent filed by Navneet Panda and Vladimir Ofitserov, Panda constructed a \u201cgroup-specific modification factor\u201d as a post-retrieval multiplier to augment the ranking of pages. That score was a ratio of \u201cindependent links\u201d to \u201creference queries\u201d or, in other words, the number of inbound linking root domains pointing to that page, subdirectory, or domain divided by the number of queries that directly referenced the page, subdirectory, or domain. The \u201c<\/span><a href=\"https:\/\/patents.google.com\/patent\/US9031929\"><span style=\"font-weight: 400;\">Site quality score<\/span><\/a><span style=\"font-weight: 400;\">\u201d and \u201c<\/span><a href=\"https:\/\/patents.google.com\/patent\/US9195944B1\/en\"><span style=\"font-weight: 400;\">Scoring site quality<\/span><\/a><span style=\"font-weight: 400;\">\u201d patents further expanded on this idea and introduced \u201cuser selections\u201d (clicks and dwell time) to improve the calculation. 
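To make the idea of a quantitative proxy concrete, here is a toy Python sketch of a Panda-style multiplier. The function name, inputs, and numbers are purely illustrative assumptions; the actual patented computation involves considerably more machinery.

```python
def modification_factor(independent_links: int, reference_queries: int) -> float:
    """Toy sketch of a Panda-style post-retrieval multiplier: a ratio of
    two quantitative signals (linking root domains vs. queries that name
    the site), not a human judgment of quality. Illustrative only."""
    if reference_queries == 0:
        return 0.0  # no navigational demand signal to normalize against
    return independent_links / reference_queries

# e.g. 500 linking root domains, 2,000 queries referencing the site by name
factor = modification_factor(500, 2000)  # 0.25
```

The point of the sketch is simply that nothing in it asks whether the content is "good"; it only counts observable behaviors.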
None of these are qualitative measures of page quality. Rather, they are all measures of quantitative user behaviors that act as proxies for a scalable determination of quality based on user feedback.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">It makes sense that, if a piece of content is high quality, a lot of people will link to it, a lot of people will search for it by name or URL, and a lot of people will spend a lot of time reading it.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">Now, relevance is only one of many measures used in a ranking scoring function; however, it\u2019s perhaps one of the most important query-dependent factors that inform what pages enter the consideration set.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">So, how might a search engine determine relevance?<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8b9f707 elementor-widget elementor-widget-heading\" data-id=\"8b9f707\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Search Engines Operate on the Vector Space Model<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e8bb080 elementor-widget elementor-widget-text-editor\" data-id=\"e8bb080\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">While there is an array of retrieval models out there, every major web search engine you\u2019ve ever used operates on <\/span><a href=\"https:\/\/en.wikipedia.org\/wiki\/Gerard_Salton\"><span style=\"font-weight: 400;\">Gerard Salton\u2019s<\/span><\/a><span style=\"font-weight: 400;\"> vector space model. 
In this model, both the user\u2019s query and the resulting documents are converted into a series of vectors and plotted in high dimensional space. Those document vectors that are the closest to the query vector are considered the most relevant results. <\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-32ab78e elementor-widget elementor-widget-image\" data-id=\"32ab78e\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/vectors-cosine-theta.png\" class=\"attachment-large size-large wp-image-16471\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/vectors-cosine-theta.png 500w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/vectors-cosine-theta-300x300.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/vectors-cosine-theta-150x150.png 150w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/vectors-cosine-theta-80x80.png 80w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-935f98f elementor-widget elementor-widget-text-editor\" data-id=\"935f98f\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Queries and documents being converted to vectors is an indication that we\u2019re talking about mathematical operations. Specifically, we\u2019re talking about Linear Algebra and the measure of relevance is <\/span><a href=\"https:\/\/www.machinelearningplus.com\/nlp\/cosine-similarity\/\"><span style=\"font-weight: 400;\">Cosine Similarity<\/span><\/a><span style=\"font-weight: 400;\">. 
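A minimal plain-Python sketch of that calculation, using toy term-count vectors (the three-word vocabulary here is invented for illustration; a real engine's vectors are nothing this small):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy term-count vectors over the vocabulary ["seo", "content", "relevance"]
query = [1.0, 0.0, 1.0]
doc_a = [2.0, 1.0, 2.0]  # heavy on the query terms
doc_b = [0.0, 3.0, 0.0]  # shares no query terms

cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b)  # True
```

Note that the score depends only on the angle between the vectors, not their lengths, which is why sheer repetition of a term doesn't translate linearly into "more relevant."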
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a240024 elementor-widget elementor-widget-image\" data-id=\"a240024\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"800\" height=\"266\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-1024x341.png\" class=\"attachment-large size-large wp-image-16469\" alt=\"cos-diff-vectors\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-1024x341.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-300x100.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-768x256.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-825x275.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors-945x315.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/cos-diff-vectors.png 1500w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-0d824e3 elementor-widget elementor-widget-text-editor\" data-id=\"0d824e3\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Cosine Similarity measures the cosine of the angle between two vectors, and it ranges from -1 to 1. A result of 1 indicates that the vectors point in the same direction, 0 indicates that the vectors are orthogonal (i.e., have no similarity), and -1 indicates that the vectors are diametrically opposed (i.e. 
have an opposite relationship).\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">This is where a big disconnect lies in how we discuss relevance in the SEO community.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">Often we eyeball pages and declare that we have a more relevant page than our competitors, and are puzzled as to why we are not ranking better. To further confuse things, everyone\u2019s heuristic about this is different. Some people feel that because an expert wrote it, because their page is more visually readable, or because it\u2019s better designed, it\u2019s more relevant. While all of these measures may impact the vectorization of the page, they are all qualitative measures of something other than relevance. Complicating matters, some people believe that because a page is longer, it\u2019s more relevant. This is partially why the question about content is usually \u201chow long should it be?\u201d<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f2bedf4 elementor-widget elementor-widget-heading\" data-id=\"f2bedf4\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Word Vectors<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-eedb59b elementor-widget elementor-widget-text-editor\" data-id=\"eedb59b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">In the vector space model, we are converting words into vectors, which in their simplest form are numerical representations of words in a high-dimensional space. Each dimension corresponds to a unique feature or aspect of the word&#8217;s meaning. 
Word vectors capture the statistical properties of words, such as their frequency and co-occurrence with other words in a corpus (or index).<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d4da60e elementor-widget elementor-widget-image\" data-id=\"d4da60e\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"500\" height=\"500\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/words-as-vectors.png\" class=\"attachment-large size-large wp-image-16470\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/words-as-vectors.png 500w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/words-as-vectors-300x300.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/words-as-vectors-150x150.png 150w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/words-as-vectors-80x80.png 80w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4ca5aaf elementor-widget elementor-widget-text-editor\" data-id=\"4ca5aaf\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">There are several techniques used to generate word vectors, including one-hot encoding, TF-IDF, and neural network-based models. One-hot encoding is a basic approach that assigns a unique vector to each word in a corpus or vocabulary. 
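One-hot encoding is simple enough to sketch in a few lines of Python. The five-word vocabulary is a toy assumption; real vocabularies run to millions of entries:

```python
def one_hot(word: str, vocabulary: list[str]) -> list[int]:
    """Every dimension is 0 except the one matching the word, which is 1."""
    return [1 if word == entry else 0 for entry in vocabulary]

vocab = ["dog", "cat", "run", "jump", "fence"]
one_hot("cat", vocab)  # [0, 1, 0, 0, 0]
```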
Each dimension of the vector is set to 0, except for the dimension that corresponds to the word, which is set to 1.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-567d87e elementor-widget elementor-widget-image\" data-id=\"567d87e\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"266\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-1024x341.png\" class=\"attachment-large size-large wp-image-16477\" alt=\"one hot encoding visual representation\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-1024x341.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-300x100.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-768x256.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-825x275.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding-945x315.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/one-hot-encoding.png 1500w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c79c342 elementor-widget elementor-widget-heading\" data-id=\"c79c342\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Bag of Words and TF-IDF<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e2a4e7e elementor-widget elementor-widget-text-editor\" data-id=\"e2a4e7e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">In a web search engine, the bag of words model represents a page as a collection of individual words or &#8220;tokens&#8221; and their respective frequencies. If some of this language is beginning to sound like things you\u2019re learning about ChatGPT, stay tuned, you\u2019ll see why in a second.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">In this model, we create a &#8220;bag&#8221; of words, where the order and context of the words are ignored. For example, if we have the sentence &#8220;The dog started to run after the cat, but the cat jumped over the fence&#8221;, the bag of words representation would be {dog: 1, start: 1, run: 1, cat: 2, jump: 1, fence:1}\u00a0 one-hot encoded as illustrated below. <\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5ebbea0 elementor-widget elementor-widget-image\" data-id=\"5ebbea0\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-1024x576.png\" class=\"attachment-large size-large wp-image-16468\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-1024x576.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-300x169.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-768x432.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-1536x864.png 1536w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-825x464.png 825w, 
https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation-945x532.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bag-of-words-representation.png 1920w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cf22656 elementor-widget elementor-widget-text-editor\" data-id=\"cf22656\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">In the example, we have three sentences, and each column represents the stems of the words (it\u2019s \u201cstart\u201d, not \u201cstarted\u201d) in the vocabulary. Each cell represents the term frequency of that word in the sentence. When plotting in multidimensional space, each row could represent a vector based on the vocabulary of the document.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">The limitation of the bag of words model is that it treats all words equally, regardless of their importance or relevance to the document. To offset that, the information retrieval world devised term frequency-inverse document frequency (TF-IDF).<\/span><\/p><p><span style=\"font-weight: 400;\">TF-IDF, on the other hand, takes into account the importance of each word in a given document or corpus of documents. It assigns higher weights to words that appear frequently in a particular document or corpus, but less frequently in other documents or corpora.<\/span><\/p><p><span style=\"font-weight: 400;\">While the OGs talked about and used it for a long time, over the past five years, the SEO community (myself included) got very loud about the TF-IDF concept. 
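Both ideas fit in a short plain-Python sketch. This uses the textbook log-based IDF; real systems use many variants (smoothing, sublinear term frequency, and so on), so treat the exact formula as an assumption for illustration:

```python
import math
from collections import Counter

def bag_of_words(tokens: list[str]) -> Counter:
    """Order and context are thrown away; only term frequencies remain."""
    return Counter(tokens)

def tf_idf(term: str, doc: list[str], corpus: list[list[str]]) -> float:
    """Textbook TF-IDF: rewards terms frequent in this document
    but rare across the rest of the corpus."""
    tf = doc.count(term) / len(doc)
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

corpus = [
    ["the", "dog", "start", "run", "cat"],
    ["the", "cat", "jump", "fence"],
    ["the", "dog", "sleep"],
]
# "the" appears in every document, so its weight collapses toward zero,
# while "fence" is distinctive to the second document and scores higher.
```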
Using the idea of co-occurrence, many of us have been optimizing our content to account for those missing co-occurring terms.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">Technically, this isn\u2019t wrong; it\u2019s just rudimentary now that Google has a stronger focus on semantic search rather than lexical search.<\/span><\/p><p><span style=\"font-weight: 400;\">There are many SEO tools out there that compare a given page to those that are ranking in the SERPs and tell you the co-occurring keywords that your page is missing. Yes, this will directly impact what Google is measuring semantically, but the TF-IDF model is not a direct reflection of how they are determining relevance for rankings at this point.\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b43b725 elementor-widget elementor-widget-heading\" data-id=\"b43b725\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Word Embeddings<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-33ab7d8 elementor-widget elementor-widget-text-editor\" data-id=\"33ab7d8\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">The terms &#8220;word embeddings&#8221; and &#8220;word vectors&#8221; are often used interchangeably in the context of NLP. However, there is a subtle difference between the two concepts.<\/span><\/p><p><span style=\"font-weight: 400;\">Word embeddings are a specific type of word vector representation that is designed to capture the semantic and syntactic meaning of words. 
Word embeddings are a series of floating point values generated using neural network-based models, such as Word2Vec, GloVe, and BERT that are trained on large amounts of text data. These models assign each word a unique vector representation that captures its meaning in high-dimensional space. This is the concept that underlies how the GPT family of language models works. In fact, <\/span><a href=\"https:\/\/platform.openai.com\/docs\/guides\/embeddings\"><span style=\"font-weight: 400;\">OpenAI offers embeddings as a service<\/span><\/a><span style=\"font-weight: 400;\"> in their APIs.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ed0ab9a elementor-widget elementor-widget-image\" data-id=\"ed0ab9a\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-1024x576.png\" class=\"attachment-large size-large wp-image-16467\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-1024x576.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-300x169.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-768x432.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-1536x864.png 1536w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-825x464.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings-945x532.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/sparse-vs-dense-embeddings.png 1920w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div 
class=\"elementor-element elementor-element-43e703a elementor-widget elementor-widget-text-editor\" data-id=\"43e703a\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Sparse and dense embeddings are two ways of representing words or sentences as embeddings.<\/span><\/p><p><span style=\"font-weight: 400;\">Sparse embeddings operate like a dictionary where each word is assigned a unique index, and the embedding for a word is a vector of mostly zeros with a few non-zero values at the index corresponding to that word. This type of embedding is very memory-efficient because most values are zero, but it doesn&#8217;t capture the nuances of language very well.<\/span><\/p><p><span style=\"font-weight: 400;\">Dense embeddings operate as a mapping where each word or sentence is represented by a continuous vector of numbers, with each number representing a different aspect of the word or sentence. This type of embedding is more memory-intensive, but it captures the context and relationships within the language.<\/span><\/p><p><span style=\"font-weight: 400;\">A good analogy for sparse and dense embeddings is a photo versus a painting. A photo captures a moment in time with precise detail, but it may not convey the depth and emotion of the subject. You don\u2019t see the emotion in the brush strokes that the painter or the multiple dimensions of the scene. 
A painting, by contrast, is a more abstract representation of the subject, but it can convey more nuanced emotions and details that are not captured in a photo.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e1b7cf0 elementor-widget elementor-widget-image\" data-id=\"e1b7cf0\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"305\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/dense-retrieval-passage.png\" class=\"attachment-large size-large wp-image-16472\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/dense-retrieval-passage.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/dense-retrieval-passage-300x179.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-db113e0 elementor-widget elementor-widget-text-editor\" data-id=\"db113e0\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Dense embeddings unlock a lot of features and functionality in search that were far more difficult under the lexical system. 
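The sparse-versus-dense difference is easy to see with a toy comparison. One-hot (sparse) vectors for two synonyms share no dimensions, so their cosine similarity is zero, while dense vectors can place them close together. The three "meaning" dimensions below are invented purely for illustration:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Sparse: one-hot vectors over the vocabulary ["car", "automobile", "banana"]
sparse_car = [1.0, 0.0, 0.0]
sparse_automobile = [0.0, 1.0, 0.0]

# Dense: invented 3-dimensional "meaning" coordinates, for illustration only
dense_car = [0.90, 0.80, 0.10]
dense_automobile = [0.85, 0.82, 0.12]
dense_banana = [0.05, 0.10, 0.90]

cosine(sparse_car, sparse_automobile)  # 0.0 -- sparse vectors see no relationship
cosine(dense_car, dense_automobile)    # near 1.0 -- dense vectors group synonyms
```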
If you remember when Google mentioned <\/span><a href=\"https:\/\/searchengineland.com\/how-google-indexes-passages-of-a-page-and-what-it-means-for-seos-342215\"><span style=\"font-weight: 400;\">passage indexing<\/span><\/a><span style=\"font-weight: 400;\"> a few years back, this is the idea behind what they call <\/span><a href=\"https:\/\/research.google\/pubs\/pub51472\/\"><span style=\"font-weight: 400;\">aspect embeddings<\/span><\/a><span style=\"font-weight: 400;\">, where the relevance of features of a page can be reviewed and scored when performing different retrieval operations. This is why Google\u2019s featured snippets have gotten so much better, and why they are able to highlight down to the specific sentence.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e2a3f98 elementor-widget elementor-widget-heading\" data-id=\"e2a3f98\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h3 class=\"elementor-heading-title elementor-size-default\">Word2Vec (or the moment Google left SEO behind)<\/h3>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ade77fa elementor-widget elementor-widget-text-editor\" data-id=\"ade77fa\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">It\u2019s worth taking a step back to show how we got here. In January 2013, Tomas Mikolov, Kai Chen, Greg Corrado, and Jeff Dean shared their \u201c<\/span><a href=\"https:\/\/arxiv.org\/abs\/1301.3781\"><span style=\"font-weight: 400;\">Efficient Estimation of Word Representations in Vector Space<\/span><\/a><span style=\"font-weight: 400;\">\u201d, which gave the NLP world the then-novel neural network architecture known as Word2Vec. 
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5837b56 elementor-widget elementor-widget-image\" data-id=\"5837b56\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"502\" height=\"512\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/word-vectors.png\" class=\"attachment-large size-large wp-image-16473\" alt=\"word vectors\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/word-vectors.png 502w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/word-vectors-294x300.png 294w\" sizes=\"(max-width: 502px) 100vw, 502px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cbc2f38 elementor-widget elementor-widget-text-editor\" data-id=\"cbc2f38\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">In that paper, they highlighted that, with the embeddings they generated using Word2Vec, one can do simple mathematical calculations to encode and uncover semantic relationships. Famously, they showcased how subtracting the vector for the word \u201cman\u201d from the vector for the word \u201cking\u201d and adding the vector for the word \u201cwoman\u201d yielded a vector that was closest to the word \u201cqueen.\u201d\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">These mathematical operations held across many word relationships. 
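The arithmetic itself is trivial once you have the vectors. Here is a toy sketch with invented 2-dimensional coordinates (real Word2Vec embeddings have hundreds of dimensions, and these numbers are made up purely to illustrate the mechanics):

```python
import math

# Invented 2-d coordinates: dimension 0 is roughly "royalty",
# dimension 1 is roughly "gender". Illustrative only.
vectors = {
    "king":  [0.9, 0.9],
    "queen": [0.9, 0.1],
    "man":   [0.1, 0.9],
    "woman": [0.1, 0.1],
}

def analogy(a: str, b: str, c: str) -> str:
    """Return the word nearest to vector(a) - vector(b) + vector(c)."""
    target = [x - y + z for x, y, z in zip(vectors[a], vectors[b], vectors[c])]
    return min(vectors, key=lambda w: math.dist(target, vectors[w]))

analogy("king", "man", "woman")  # "queen"
```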
In the examples below, we see this for verb tenses and the relationships between countries and their capitals in high dimensional space.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8cd91f9 elementor-widget elementor-widget-image\" data-id=\"8cd91f9\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"179\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/relationships-of-word-vectors-in-high-dimensional-space.png\" class=\"attachment-large size-large wp-image-16474\" alt=\"relationships of word vectors in high dimensional space\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/relationships-of-word-vectors-in-high-dimensional-space.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/relationships-of-word-vectors-in-high-dimensional-space-300x105.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-334caa0 elementor-widget elementor-widget-text-editor\" data-id=\"334caa0\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">While Word2Vec was a significant leap forward, it had a key shortcoming in its ability to understand context and meaning accurately. One of the primary reasons for this is that Word2Vec uses a technique called a &#8220;skip-gram&#8221; to predict the surrounding words of a target word based on its position in a sentence. 
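The pair-generation step of the skip-gram approach can be sketched as follows (this is only the windowing logic, not the neural network that trains on the pairs; the window size of 2 is an arbitrary choice for illustration):

```python
def skipgram_pairs(tokens: list[str], window: int = 2) -> list[tuple[str, str]]:
    """Generate (target, context) training pairs: each target predicts only
    the words within `window` positions of it, nothing further away."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = ["the", "dog", "ran", "after", "the", "cat"]
pairs = skipgram_pairs(sentence)
# ("dog", "ran") is a training pair, but ("dog", "cat") is not: "cat" falls
# outside the window, so it never informs the representation of "dog".
```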
This technique assumes that the meaning of a word is determined by the words that surround it, without taking into account the broader context of the sentence or document.<\/span><\/p><p><span style=\"font-weight: 400;\">Despite this shortcoming, this is where Google began to move from the lexical model and really unlocked the semantic model thereby leaving the SEO community in the dust. It\u2019s quite shocking that our SEO tools have still been just counting words all this time when such a leap forward has been available open source for 10 years.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">Now let\u2019s talk about BERT.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-76727c9 elementor-widget elementor-widget-heading\" data-id=\"76727c9\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h3 class=\"elementor-heading-title elementor-size-default\">BERT<\/h3>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-69abe66 elementor-widget elementor-widget-text-editor\" data-id=\"69abe66\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Google&#8217;s <\/span><a href=\"https:\/\/arxiv.org\/abs\/1810.04805\"><span style=\"font-weight: 400;\">Bidirectional Encoder Representations from Transformers (BERT)<\/span><\/a><span style=\"font-weight: 400;\"> is a pre-trained NLP model that revolutionized the field. 
BERT is based on the <\/span><a href=\"https:\/\/ai.googleblog.com\/2017\/08\/transformer-novel-neural-network.html\"><span style=\"font-weight: 400;\">Transformer<\/span><\/a><span style=\"font-weight: 400;\"> architecture, a type of neural network that is particularly well-suited for sequence-to-sequence tasks such as machine translation and text summarization.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5856618 elementor-widget elementor-widget-image\" data-id=\"5856618\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"286\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-understanding-context.png\" class=\"attachment-large size-large wp-image-16475\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-understanding-context.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-understanding-context-300x168.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-bbe1a35 elementor-widget elementor-widget-text-editor\" data-id=\"bbe1a35\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">What sets BERT apart from previous NLP models is its ability to generate dense vector representations of words and sentences. Unlike earlier models that used comparatively sparse word vectors, BERT is able to capture the meaning and context of a sentence in a dense, continuous vector space. 
This allows the model to better understand the nuances of language and provide more accurate results for tasks such as search, text classification, sentiment analysis, and question answering.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a73da8d elementor-widget elementor-widget-image\" data-id=\"a73da8d\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"266\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-1024x341.png\" class=\"attachment-large size-large wp-image-16466\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-1024x341.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-300x100.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-768x256.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-825x275.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer-945x315.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/bert-transformer.png 1500w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-37679d9 elementor-widget elementor-widget-text-editor\" data-id=\"37679d9\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">BERT was trained on a massive corpus of text data, including web pages, books, and news articles. During training, the model learned to predict missing words in a sentence, based on the words that came before and after the missing word. 
This task, known as masked language modeling, allowed the model to learn the relationships between words and their context in a sentence.<\/span><\/p><p><span style=\"font-weight: 400;\">BERT also introduced another technique called next sentence prediction, which helps the model to understand the relationship between two consecutive sentences. This is particularly useful for tasks such as question answering and natural language inference.<\/span><\/p><p><span style=\"font-weight: 400;\">Google Search was one of the first applications to benefit from BERT&#8217;s improved performance. In 2019, Google announced that it had implemented BERT in its search algorithms, allowing the search engine to better understand the meaning behind search queries and provide more relevant results. BERT&#8217;s ability to generate dense vector representations of words and sentences allowed Google Search to better understand the nuances of language and provide more accurate results for long-tail queries, which are often more conversational and complex.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-0ccebb8 elementor-widget elementor-widget-heading\" data-id=\"0ccebb8\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h3 class=\"elementor-heading-title elementor-size-default\">The Universal Sentence Encoder<\/h3>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ba62651 elementor-widget elementor-widget-text-editor\" data-id=\"ba62651\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Google&#8217;s <\/span><a href=\"https:\/\/ai.googleblog.com\/2018\/05\/advances-in-semantic-textual-similarity.html\"><span style=\"font-weight: 400;\">Universal Sentence 
Encoder<\/span><\/a><span style=\"font-weight: 400;\"> (<\/span><a href=\"https:\/\/arxiv.org\/abs\/1803.11175\"><span style=\"font-weight: 400;\">paper<\/span><\/a><span style=\"font-weight: 400;\">) is a language model that is capable of encoding a sentence or piece of text into a dense embedding. These embeddings can be used for the same tasks we discussed with BERT.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f2cd665 elementor-widget elementor-widget-image\" data-id=\"f2cd665\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"159\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/embed-words.png\" class=\"attachment-large size-large wp-image-16478\" alt=\"embed words\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/embed-words.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/embed-words-300x93.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-dba380e elementor-widget elementor-widget-text-editor\" data-id=\"dba380e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">In fact, I don&#8217;t know that I need to go into any more technical detail. Just know that it can be used to generate <\/span><a href=\"https:\/\/ai.googleblog.com\/2019\/07\/multilingual-universal-sentence-encoder.html\"><span style=\"font-weight: 400;\">valuable embeddings for semantic search<\/span><\/a><span style=\"font-weight: 400;\"> and it\u2019s open source. 
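The core comparison these embeddings enable is simple. Here is a minimal sketch of cosine similarity in plain Python; the vectors are made up for illustration, since real Universal Sentence Encoder embeddings are 512-dimensional:

```python
import math

# Cosine similarity between two embedding vectors: the cosine of the angle
# between them, ranging from -1 (opposite) through 0 (unrelated) to 1
# (pointing in the same direction).
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Tiny illustrative vectors standing in for real 512-dimensional embeddings.
query_vec = [0.1, 0.3, 0.5]
doc_vec = [0.2, 0.4, 0.4]
print(cosine_similarity(query_vec, doc_vec))
```

The closer the score is to 1, the more semantically similar the two pieces of text are in the model's vector space.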
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-51a74e2 elementor-widget elementor-widget-heading\" data-id=\"51a74e2\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Introducing Orbitwise<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1d40fa0 elementor-widget elementor-widget-text-editor\" data-id=\"1d40fa0\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Knowing that Google is using dense embeddings for so many aspects of its ranking systems, I\u2019ve gotten frustrated with how SEO tools are still trapped in the lexical dark ages. So, we\u2019ve built a free tool called <a href=\"https:\/\/ipullrank.com\/tools\/orbitwise\/\">Orbitwise<\/a> (orbit + bitwise) to calculate relevance using the vector space model and Google\u2019s Universal Sentence Encoder.\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ea6265a elementor-widget elementor-widget-image\" data-id=\"ea6265a\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"305\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-1024x390.png\" class=\"attachment-large size-large wp-image-16482\" alt=\"orbitwise screenshot\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-1024x390.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-300x114.png 300w, 
https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-768x293.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-825x314.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM-945x360.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Screenshot-2023-04-18-at-2.25.55-PM.png 1528w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-0125c3e elementor-widget elementor-widget-text-editor\" data-id=\"0125c3e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">At this point, Google has an array of language models to choose from, so there\u2019s no telling which one is actually in production in Search. However, since we\u2019ll never have exactly what Google has for any SEO use case, we should all agree that SEO tools are about precision, not accuracy. 
That is to say, even if we\u2019re not using the same language model as Google, the relative calculations between pages should be similar.<\/span><\/p><p><span style=\"font-weight: 400;\">Let\u2019s jump into an example to show how it works.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-dcb03ea elementor-widget elementor-widget-heading\" data-id=\"dcb03ea\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h3 class=\"elementor-heading-title elementor-size-default\">The [enterprise seo] example<\/h3>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-26f8139 elementor-widget elementor-widget-text-editor\" data-id=\"26f8139\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Just like any other marketer, I feel our <\/span><a href=\"https:\/\/ipullrank.com\/enterprise-seo\"><span style=\"font-weight: 400;\">enterprise SEO<\/span><\/a><span style=\"font-weight: 400;\"> landing page is more relevant than the pages that rank above it and definitely better than the page that ranks #1. Like you, my position is mostly a feeling that I have after power-skimming their headings rather than actually reading the copy. 
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-816e9ae elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"816e9ae\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-61ba6c8\" data-id=\"61ba6c8\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-b230850 elementor-widget elementor-widget-image\" data-id=\"b230850\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"266\" height=\"512\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/ipr-enterprise-SEO.png\" class=\"attachment-large size-large wp-image-16480\" alt=\"ipr enterprise SEO\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/ipr-enterprise-SEO.png 266w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/ipr-enterprise-SEO-156x300.png 156w\" sizes=\"(max-width: 266px) 100vw, 266px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-50 elementor-top-column elementor-element elementor-element-7f77994\" data-id=\"7f77994\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-e1935ea elementor-widget elementor-widget-image\" data-id=\"e1935ea\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"236\" height=\"512\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/hubspot-enterprise-SEO.png\" class=\"attachment-large size-large wp-image-16479\" alt=\"hubspot enterprise SEO\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/hubspot-enterprise-SEO.png 236w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/hubspot-enterprise-SEO-138x300.png 138w\" sizes=\"(max-width: 236px) 100vw, 236px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-39c1040 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"39c1040\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7ae6c76\" data-id=\"7ae6c76\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-62d887e elementor-widget elementor-widget-text-editor\" data-id=\"62d887e\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">When we think through the lens content versus links, the question is what do we do now to improve our position? Optimize the content or build more links? 
This is exactly the time to fire up Orbitwise.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-b1df3bc elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"b1df3bc\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-493afea\" data-id=\"493afea\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-d1e80cb elementor-widget elementor-widget-image\" data-id=\"d1e80cb\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"280\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/orbitwise-mapping.png\" class=\"attachment-large size-large wp-image-16483\" alt=\"orbitwise mapping\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/orbitwise-mapping.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/orbitwise-mapping-300x164.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e1e6995 elementor-widget elementor-widget-text-editor\" data-id=\"e1e6995\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Orbitwise takes your query and converts it into a 512-dimensional embedding to use for comparison against the embeddings that are generated 
for the documents. The tool then computes the cosine similarity between the query and each document to determine relatedness. We also calculate a similarity percentage, reduce the dimensionality of the embeddings, and place them in 2D space for a simple visualization.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-30a9b1f elementor-widget elementor-widget-image\" data-id=\"30a9b1f\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"512\" height=\"280\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/enterprise-seo-orbitwise-map.png\" class=\"attachment-large size-large wp-image-16484\" alt=\"enterprise seo orbitwise map\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/enterprise-seo-orbitwise-map.png 512w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/enterprise-seo-orbitwise-map-300x164.png 300w\" sizes=\"(max-width: 512px) 100vw, 512px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8c98084 elementor-widget elementor-widget-text-editor\" data-id=\"8c98084\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">The dots are often clustered quite closely, so it\u2019s best to use your mouse wheel to zoom in and inspect the distances, because a small difference means a lot to the calculation.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-05cbb46 elementor-widget elementor-widget-image\" data-id=\"05cbb46\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img 
loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"366\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-1024x469.png\" class=\"attachment-large size-large wp-image-16491\" alt=\"\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-1024x469.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-300x137.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-768x352.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-1536x704.png 1536w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-825x378.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9-945x433.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image9.png 1901w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a77d4f2 elementor-widget elementor-widget-text-editor\" data-id=\"a77d4f2\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Since my page does not rank in the top 10, I need to pull it separately by entering it in the input box below. 
Once I\u2019ve done that, my page is added to the chart and the table.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c25e50f elementor-widget elementor-widget-image\" data-id=\"c25e50f\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"650\" height=\"358\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image6.png\" class=\"attachment-large size-large wp-image-16495\" alt=\"content relevance highlight\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image6.png 650w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image6-300x165.png 300w\" sizes=\"(max-width: 650px) 100vw, 650px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f0dee69 elementor-widget elementor-widget-text-editor\" data-id=\"f0dee69\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Zooming in clarifies that my page has very high relevance to the query, since it&#8217;s overlapping the black dot, whereas the number one ranking page has much lower relevance. 
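As a hypothetical sketch of what a tool like this is doing under the hood (this is not Orbitwise's actual code, and the URLs and vectors here are invented), ranking candidate pages by a cosine-similarity percentage against the query embedding looks like this:

```python
import math

# Illustrative sketch only -- not Orbitwise's actual implementation.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def rank_by_relevance(query_vec, page_vecs):
    # Score each page as a percentage of similarity to the query embedding,
    # then sort highest-relevance first.
    scored = [(url, round(100 * cosine_similarity(query_vec, vec)))
              for url, vec in page_vecs.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Made-up embeddings standing in for real 512-dimensional vectors.
pages = {
    "example.com/page-a": [0.9, 0.1, 0.2],
    "example.com/page-b": [0.1, 0.8, 0.3],
}
print(rank_by_relevance([1.0, 0.0, 0.1], pages))
```

Comparing those percentages across the ranking pages is what tells you whether your gap is relevance or something else.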
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-837c21a elementor-widget elementor-widget-heading\" data-id=\"837c21a\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h4 class=\"elementor-heading-title elementor-size-default\">Interpreting the results\n<\/h4>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b6823d7 elementor-widget elementor-widget-text-editor\" data-id=\"b6823d7\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Now we have our results. In the column on the left, we have the relevance percentage between each URL and the query. Comparing my landing page to the number one result makes it immediately clear that my page\u2019s issue is not relevance, since it scores 64 while our page scores 73.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d37603b elementor-widget elementor-widget-image\" data-id=\"d37603b\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"537\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7.png\" class=\"attachment-large size-large wp-image-16490\" alt=\"top 10 serps content relevance\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7.png 996w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7-300x202.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7-768x516.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7-825x554.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image7-945x635.png 945w\" 
sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-abe86df elementor-widget elementor-widget-text-editor\" data-id=\"abe86df\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Looking at these in the context of link data from Ahrefs helps me definitively know that the problem here isn\u2019t relevance or content optimization, but authority. So, go ahead and link to our enterprise SEO page, whenever y\u2019all are ready. \ud83d\ude01<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d67a999 elementor-widget elementor-widget-image\" data-id=\"d67a999\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"183\" src=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-1024x234.png\" class=\"attachment-large size-large wp-image-16489\" alt=\"link data from Ahrefs\" srcset=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-1024x234.png 1024w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-300x68.png 300w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-768x175.png 768w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-825x188.png 825w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23-945x216.png 945w, https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/image23.png 1258w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fb32de6 elementor-widget elementor-widget-text-editor\" data-id=\"fb32de6\" data-element_type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Surely, I may have assumed it was an authority issue before even looking, but it\u2019s much better to have a definitive answer so I can know how to prioritize our efforts.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e6bd271 elementor-widget elementor-widget-heading\" data-id=\"e6bd271\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Wrapping Up<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b86fa23 elementor-widget elementor-widget-text-editor\" data-id=\"b86fa23\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Learning more about the technical aspects of how Google works makes it very clear that SEO software has some catching up to do. It\u2019s my hope that tools like Orbitwise showcase how SEO tools can be improved by leveraging some of the same open-source technologies that have come out of the Google Research teams.\u00a0<\/span><\/p><p><span style=\"font-weight: 400;\">In the meantime, feel free to <\/span><a href=\"https:\/\/ipullrank.com\/tools\/orbitwise\"><span style=\"font-weight: 400;\">give Orbitwise a spin<\/span><\/a><span style=\"font-weight: 400;\"> as you\u2019re working through your own questions of relevance. 
Let me know how it\u2019s working for you in the comments below.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-9fa4510 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"9fa4510\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-41ee8dc\" data-id=\"41ee8dc\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-58a0aa9 elementor-widget elementor-widget-template\" data-id=\"58a0aa9\" data-element_type=\"widget\" data-widget_type=\"template.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-template\">\n\t\t\t\t\t<div data-elementor-type=\"section\" data-elementor-id=\"17351\" class=\"elementor elementor-17351\" data-elementor-post-type=\"elementor_library\">\n\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-51a09b09 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"51a09b09\" data-element_type=\"section\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-2238df9f\" data-id=\"2238df9f\" data-element_type=\"column\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div 
class=\"elementor-element elementor-element-7fba21b9 elementor-widget elementor-widget-heading\" data-id=\"7fba21b9\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Next Steps<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d80b72f elementor-widget elementor-widget-text-editor\" data-id=\"d80b72f\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-weight: 400;\">Here are three ways iPullRank can help you combine SEO and content to earn visibility for your business and drive revenue:<\/span><\/p><ol><li style=\"font-weight: 400;\" aria-level=\"1\"><b>Schedule a 30-Minute Strategy Session: <\/b><span style=\"font-weight: 400;\">Share your biggest SEO and content challenges so we can put together a custom discovery deck after looking through your digital presence. No one-size-fits-all solutions, only tailored advice to grow your business.<\/span><a href=\"https:\/\/ipullrank.com\/contact\"><span style=\"font-weight: 400;\"> Schedule your consultation session now<\/span><\/a><span style=\"font-weight: 400;\">.<\/span><\/li><li aria-level=\"1\"><strong>Get Our Newsletter:<\/strong> AI is reshaping search. The Rank Report gives you signal through the noise, so your brand doesn\u2019t just keep up, it leads. <a href=\"https:\/\/ipullrank.com\/rank-report\">Subscribe to the Rank Report.<\/a><\/li><li style=\"font-weight: 400;\" aria-level=\"1\"><b>Enhance Your Content&#8217;s Relevance with Relevance Doctor:<\/b><span style=\"font-weight: 400;\"> Not sure if your content is mathematically relevant? 
Use Relevance Doctor to test and improve your content&#8217;s relevance, ensuring it ranks for your targeted keywords.<\/span><a href=\"https:\/\/ipullrank.com\/tools\/relevance-doctor\"><span style=\"font-weight: 400;\"> Test your content relevance today<\/span><\/a><span style=\"font-weight: 400;\">.<\/span><\/li><\/ol><p><span style=\"font-weight: 400;\">Want more? Visit <\/span><a href=\"https:\/\/ipullrank.com\/blog\">our blog<\/a> <span style=\"font-weight: 400;\">for access to past webinars, exclusive guides, and insightful blogs crafted by our team of experts.\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Writing my book \u201cThe Science of SEO\u201d has been incredibly eye-opening as I\u2019ve dug deep into the minutia of how search engines work.\u00a0 Fundamentally, SEO is an abstraction layer wherein we manipulate inputs (site architecture, content, and links) for a search engine to get specific outputs (rankings and traffic).
At that abstraction layer, we talk [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":19464,"comment_status":"open","ping_status":"open","sticky":false,"template":"elementor_theme","format":"standard","meta":{"_acf_changed":false,"content-type":"","footnotes":""},"categories":[1,260,26],"tags":[],"diagnosis-deliverable":[],"class_list":["post-16463","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","category-relevance-engineering","category-seo"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Relevance is Not a Qualitative Measure for Search Engines<\/title>\n<meta name=\"description\" content=\"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or content.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/ipullrank.com\/content-relevance\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Relevance is Not a Qualitative Measure for Search Engines\" \/>\n<meta property=\"og:description\" content=\"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or content.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/ipullrank.com\/content-relevance\" \/>\n<meta property=\"og:site_name\" content=\"iPullRank\" \/>\n<meta property=\"article:published_time\" content=\"2023-04-20T13:28:05+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-07-31T20:03:27+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png\" \/>\n\t<meta property=\"og:image:width\" 
content=\"699\" \/>\n\t<meta property=\"og:image:height\" content=\"400\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Mike King\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@ipullrankagency\" \/>\n<meta name=\"twitter:site\" content=\"@ipullrankagency\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Mike King\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"17 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/ipullrank.com\/content-relevance#article\",\"isPartOf\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance\"},\"author\":{\"name\":\"Mike King\",\"@id\":\"https:\/\/ipullrank.com\/#\/schema\/person\/82831a4b9f4b8be81d5a9bfed4cb9b20\"},\"headline\":\"Relevance is Not a Qualitative Measure for Search Engines\",\"datePublished\":\"2023-04-20T13:28:05+00:00\",\"dateModified\":\"2025-07-31T20:03:27+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance\"},\"wordCount\":2845,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/ipullrank.com\/#organization\"},\"image\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance#primaryimage\"},\"thumbnailUrl\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png\",\"articleSection\":[\"Content\",\"Relevance Engineering\",\"SEO\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/ipullrank.com\/content-relevance#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/ipullrank.com\/content-relevance\",\"url\":\"https:\/\/ipullrank.com\/content-relevance\",\"name\":\"Relevance is Not a Qualitative Measure for Search 
Engines\",\"isPartOf\":{\"@id\":\"https:\/\/ipullrank.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance#primaryimage\"},\"image\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance#primaryimage\"},\"thumbnailUrl\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png\",\"datePublished\":\"2023-04-20T13:28:05+00:00\",\"dateModified\":\"2025-07-31T20:03:27+00:00\",\"description\":\"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or content.\",\"breadcrumb\":{\"@id\":\"https:\/\/ipullrank.com\/content-relevance#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/ipullrank.com\/content-relevance\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ipullrank.com\/content-relevance#primaryimage\",\"url\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png\",\"contentUrl\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png\",\"width\":699,\"height\":400},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/ipullrank.com\/content-relevance#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/ipullrank.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Relevance is Not a Qualitative Measure for Search Engines\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/ipullrank.com\/#website\",\"url\":\"https:\/\/ipullrank.com\/\",\"name\":\"iPullRank\",\"description\":\"Digital Marketing Agency in 
NYC\",\"publisher\":{\"@id\":\"https:\/\/ipullrank.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/ipullrank.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/ipullrank.com\/#organization\",\"name\":\"iPullRank\",\"url\":\"https:\/\/ipullrank.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ipullrank.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2025\/03\/Logo_-_Layers.svg\",\"contentUrl\":\"https:\/\/ipullrank.com\/wp-content\/uploads\/2025\/03\/Logo_-_Layers.svg\",\"width\":177,\"height\":36,\"caption\":\"iPullRank\"},\"image\":{\"@id\":\"https:\/\/ipullrank.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/ipullrankagency\",\"https:\/\/www.linkedin.com\/company\/ipullrank\/\",\"https:\/\/www.youtube.com\/@iPullRankSEO\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/ipullrank.com\/#\/schema\/person\/82831a4b9f4b8be81d5a9bfed4cb9b20\",\"name\":\"Mike King\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/ipullrank.com\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/d57e62b40de6db99771f85cbce3ab1d29071b8cd0d643c8dcf2fc55818e1769f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/d57e62b40de6db99771f85cbce3ab1d29071b8cd0d643c8dcf2fc55818e1769f?s=96&d=mm&r=g\",\"caption\":\"Mike King\"},\"description\":\"Mike King is the Founder and CEO of iPullRank. Deeply technical and highly creative, Mike has helped generate over $4B in revenue for his clients. 
A rapper and recovering big agency guy, Mike's greatest clients are his two daughters: Zora and Glory.\",\"url\":\"https:\/\/ipullrank.com\/author\/ipullrank\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Relevance is Not a Qualitative Measure for Search Engines","description":"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or content.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/ipullrank.com\/content-relevance","og_locale":"en_US","og_type":"article","og_title":"Relevance is Not a Qualitative Measure for Search Engines","og_description":"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or content.","og_url":"https:\/\/ipullrank.com\/content-relevance","og_site_name":"iPullRank","article_published_time":"2023-04-20T13:28:05+00:00","article_modified_time":"2025-07-31T20:03:27+00:00","og_image":[{"width":699,"height":400,"url":"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png","type":"image\/png"}],"author":"Mike King","twitter_card":"summary_large_image","twitter_creator":"@ipullrankagency","twitter_site":"@ipullrankagency","twitter_misc":{"Written by":"Mike King","Est. 
reading time":"17 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/ipullrank.com\/content-relevance#article","isPartOf":{"@id":"https:\/\/ipullrank.com\/content-relevance"},"author":{"name":"Mike King","@id":"https:\/\/ipullrank.com\/#\/schema\/person\/82831a4b9f4b8be81d5a9bfed4cb9b20"},"headline":"Relevance is Not a Qualitative Measure for Search Engines","datePublished":"2023-04-20T13:28:05+00:00","dateModified":"2025-07-31T20:03:27+00:00","mainEntityOfPage":{"@id":"https:\/\/ipullrank.com\/content-relevance"},"wordCount":2845,"commentCount":0,"publisher":{"@id":"https:\/\/ipullrank.com\/#organization"},"image":{"@id":"https:\/\/ipullrank.com\/content-relevance#primaryimage"},"thumbnailUrl":"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png","articleSection":["Content","Relevance Engineering","SEO"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/ipullrank.com\/content-relevance#respond"]}]},{"@type":"WebPage","@id":"https:\/\/ipullrank.com\/content-relevance","url":"https:\/\/ipullrank.com\/content-relevance","name":"Relevance is Not a Qualitative Measure for Search Engines","isPartOf":{"@id":"https:\/\/ipullrank.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/ipullrank.com\/content-relevance#primaryimage"},"image":{"@id":"https:\/\/ipullrank.com\/content-relevance#primaryimage"},"thumbnailUrl":"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png","datePublished":"2023-04-20T13:28:05+00:00","dateModified":"2025-07-31T20:03:27+00:00","description":"Learn how a content relevance score against the top query results can determine whether your SEO strategy should focus on links or 
content.","breadcrumb":{"@id":"https:\/\/ipullrank.com\/content-relevance#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/ipullrank.com\/content-relevance"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/ipullrank.com\/content-relevance#primaryimage","url":"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png","contentUrl":"https:\/\/ipullrank.com\/wp-content\/uploads\/2023\/04\/Frame-1597879964.png","width":699,"height":400},{"@type":"BreadcrumbList","@id":"https:\/\/ipullrank.com\/content-relevance#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/ipullrank.com\/"},{"@type":"ListItem","position":2,"name":"Relevance is Not a Qualitative Measure for Search Engines"}]},{"@type":"WebSite","@id":"https:\/\/ipullrank.com\/#website","url":"https:\/\/ipullrank.com\/","name":"iPullRank","description":"Digital Marketing Agency in NYC","publisher":{"@id":"https:\/\/ipullrank.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/ipullrank.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/ipullrank.com\/#organization","name":"iPullRank","url":"https:\/\/ipullrank.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/ipullrank.com\/#\/schema\/logo\/image\/","url":"https:\/\/ipullrank.com\/wp-content\/uploads\/2025\/03\/Logo_-_Layers.svg","contentUrl":"https:\/\/ipullrank.com\/wp-content\/uploads\/2025\/03\/Logo_-_Layers.svg","width":177,"height":36,"caption":"iPullRank"},"image":{"@id":"https:\/\/ipullrank.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/ipullrankagency","https:\/\/www.linkedin.com\/company\/ipullrank\/","https:\/\/www.youtube.com\/@iPullRankSEO"]},{"@type":"Person","@id":"https:\/\
/ipullrank.com\/#\/schema\/person\/82831a4b9f4b8be81d5a9bfed4cb9b20","name":"Mike King","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/ipullrank.com\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/d57e62b40de6db99771f85cbce3ab1d29071b8cd0d643c8dcf2fc55818e1769f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/d57e62b40de6db99771f85cbce3ab1d29071b8cd0d643c8dcf2fc55818e1769f?s=96&d=mm&r=g","caption":"Mike King"},"description":"Mike King is the Founder and CEO of iPullRank. Deeply technical and highly creative, Mike has helped generate over $4B in revenue for his clients. A rapper and recovering big agency guy, Mike's greatest clients are his two daughters: Zora and Glory.","url":"https:\/\/ipullrank.com\/author\/ipullrank"}]}},"_links":{"self":[{"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/posts\/16463","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/comments?post=16463"}],"version-history":[{"count":0,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/posts\/16463\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/media\/19464"}],"wp:attachment":[{"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/media?parent=16463"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/categories?post=16463"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/tags?post=16463"},{"taxonomy":"diagnosis-deliverable","embeddable":true,"href":"https:\/\/ipullrank.com\/wp-json\/wp\/v2\/diagnosis-deliverable?post=16463"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","
templated":true}]}}