Resources

Glossary of terms

In SEO, you are sure to come up against lots of abbreviations, jargon, and technical terms. As this can be overwhelming for those new to the practice, we have put together a glossary to keep you up to date.

Demystifying SEO Language

Our comprehensive glossary is your key to unravelling the world of SEO terminology, making it accessible and understandable for beginners and experts alike.

10 Blue Links

10 blue links refers to the traditional format in which search engines presented their results: a list of ten organic, text-only links chosen to match the user's query as closely as possible. SERPs have since evolved well beyond those titular links, adding elements such as ads, images and featured snippets. Although it is considered an outdated expression, it is still used informally when simply referring to SERPs.

10x Content

Content that is 10 times better than the top or highest-ranking result for a specific keyword. To achieve 10x content, take the following into consideration:

  • Provide a great UX on any device.
  • Content must be interesting, useful, high quality and, of course, remarkable.
  • Content must be noticeably different from your competitors' content.
  • Content must evoke emotion; readers should feel something towards what they are reading.
  • Content has to solve a problem or a need for readers and, at the same time, provide accurate, comprehensive and unique information or resources.
  • Content must be presented to readers in the most pleasant and attractive style possible.

200 Response

The HTTP 200 OK response code is a success status response indicating that a request has succeeded and been processed correctly by the server. In an SEO context, a 200 OK response code assures us that our site is working properly and that all the linked pages are working as they should.
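As a minimal illustration of the exchange (using example.com as a placeholder), the browser's request and the server's successful response look like this:

  GET /index.html HTTP/1.1
  Host: www.example.com

  HTTP/1.1 200 OK
  Content-Type: text/html; charset=UTF-8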

2XX Status Codes

The 2xx status codes are three-digit response codes that a website's server returns to the browser each time you visit a page on a website. They are part of the HTTP status codes and indicate that the request was received and processed successfully.

When it comes to SEO, these codes are handled differently by search engines, so it is important to understand how they work and learn to use them correctly. Used incorrectly, HTTP status codes can cause major technical SEO issues for your site.

301 Redirect

The 301 status code tells the browser that a web page has been moved permanently to a different page, meaning that its former URL has changed. Any future references to the resource should use one of the URLs included in the response.

When it comes to SEO, the 301 status should be used every time a URL changes permanently, as this redirect passes your content's current link equity to the new URL. However, links resolving through a 301 usually pass slightly less link equity than links resolving directly with a 200. So, as a piece of advice, if you have several links going through a 301 Permanent Redirect, you should fix those links to point directly at the final URL.
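As a sketch (with a placeholder URL), a 301 response simply points the client to the new permanent location:

  HTTP/1.1 301 Moved Permanently
  Location: https://www.example.com/new-url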

302 Redirect

The 302 status code, like the 301, tells the browser that a web page has moved to another URL. The difference between the two is that a 301 indicates a permanent redirect, while a 302 indicates only a temporary one.

When it comes to SEO strategy, you should apply a 302 redirect when you want to move bots and users temporarily from one page to a new page. Mind also that if the page is never going back to the former URL, you should use a 301 permanent status code instead.
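The temporary counterpart looks almost identical on the wire, differing only in the status line (placeholder URL again):

  HTTP/1.1 302 Found
  Location: https://www.example.com/temporary-url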

304 Not Modified

A 304 Not Modified status code is an HTTP status indicating that the website the user is trying to access has not been updated since the last time it was accessed.

In other words, when accessing a web page or URL for the first time, the browser requests it from a web server and saves (or caches) the information so it does not need to download that data every time. When revisiting a web page or URL that has not been modified since the last visit, the web server sends back a 304 Not Modified status code, which tells the browser to use its saved (or cached) version of that web page.
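Sketching the exchange (the URL and date are placeholders), the browser sends a conditional request and the server answers with a bodyless 304 if nothing has changed:

  GET /page.html HTTP/1.1
  Host: www.example.com
  If-Modified-Since: Mon, 01 Jan 2024 00:00:00 GMT

  HTTP/1.1 304 Not Modified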

404 Error

The 404 error is the most common 4xx response code from the server; it tells users that the page they requested could not be found. In other words, users will not be able to access the page, although it may not have been deleted at all. A 404 error can happen for several reasons, though the most common is that a site owner has removed a page that is still linked to internally; it can also happen simply because of a bug in the system.

In any case, even though in some SEO strategies a 404 error may seem a useful and strategic resource, it should be avoided at all costs if possible.

410 Gone

The 410 Gone status code indicates to the user that the target resource has been deleted and that this condition is expected to be permanent. It may look similar to a 404; however, while a 410 indicates that the page has been permanently removed, a 404 just indicates that the page could not be found.

Mind that a 410 status code should only be applied to pages that are being removed permanently and will never come back into use. If you are not entirely sure whether the page will come back, use a 404 status code instead.
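Side by side, the two status lines a server might return for a missing page (illustrative only):

  HTTP/1.1 404 Not Found   (the resource could not be found; it may return)
  HTTP/1.1 410 Gone        (the resource has been removed deliberately and permanently)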

4XX Status Codes

The 4xx status codes, also known as 4xx client error status codes, are responses indicating that a website or page could not be reached, either because it is unavailable or because the request itself contains bad syntax.

When it comes to SEO strategy, 4xx status codes should be taken seriously: apart from giving the user a terrible experience, they will definitely undermine your SEO efforts.

5XX Status Codes

A 5xx error code is a status response indicating that, although the request was valid and correctly sent, the server was unable to process it. When experiencing 5xx errors on your website you should look at your server right away, as this kind of error can cause big issues for your SEO results.

Mind that if you are hosting your own server, you'll need to start debugging to find out why the server is not responding properly. However, if you're using an external hosting provider you may need to reach out to them so they can recheck it.

A Low Volume Search Query

A low-volume search query is a status Google assigns to keywords that receive very little search traffic on its properties.

Whilst using low-volume keywords may be cheaper for your marketing strategy at first, it will affect your strategy's effectiveness in the long term. Bidding on low-volume keywords will probably mean fewer people see your ad or click on it, and it may even affect how many people buy your product or service.

This status somewhat forces online businesses to bid on more competitive keywords, rather than less expensive phrases that, in the end, contribute little to Google's online search traffic.

Accelerated Mobile Pages (AMP)

Accelerated Mobile Pages (known worldwide as AMP) is an open-source HTML framework developed by the AMP Open Source Project to improve the performance of web pages on mobile devices. It was originally created by Google, largely as a competitive response to formats such as Facebook Instant Articles and Apple News.

Since the use of smartphones and tablets has increased considerably, even overtaking desktop PCs, a framework such as AMP was needed. Specifically, AMP uses a stripped-down version of HTML that enables websites to load as quickly as possible on these types of devices.

Three elements are characteristic of the AMP framework: AMP HTML, AMP JavaScript and, last but not least, the AMP CDN.

Because AMP source code is based on well-known web technologies, any browser can read and render AMP pages.
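A minimal AMP document, sketched with placeholder content (the mandatory amp-boilerplate style block is abbreviated here for brevity), looks roughly like this:

  <!doctype html>
  <html amp lang="en">
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
      <link rel="canonical" href="https://www.example.com/article.html">
      <style amp-boilerplate>/* required AMP boilerplate CSS omitted */</style>
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <title>Hello AMP</title>
    </head>
    <body>
      <h1>Hello, AMP</h1>
    </body>
  </html>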

Advanced Search Operators

An advanced search operator (sometimes also known as a search parameter) is a character or string of characters and commands used in a search engine query to narrow a regular text search and reach a more specific result. Whenever a user searches for something, these advanced search operators work as special commands that modify the original query, making the search engine drill deeper and serve better results.

Here are some examples of Advanced Search Operators:

  • intitle: "fleming vs penicillin". Search only in the page's title for a word or phrase. Use exact match (quotes) for phrases.
  • allintitle: fleming vs penicillin. Search the page title for every individual term following "allintitle:". Same as multiple intitle: operators.
  • inurl: porsche announcements inurl:2024. Look for a word or phrase (in quotes) in the document URL. Can be combined with other terms.
  • allinurl: tripadvisor field-keywords belgium. Search the URL for every individual term following "allinurl:". Same as multiple inurl: operators.
  • intext: "google chrome vs mozilla firefox vs safari search engine". Search for a word or phrase (in quotes), but only in the body/document text.
  • allintext: google chrome mozilla firefox safari search engine. Search the body text for every individual term following "allintext:". Same as multiple intext: operators.
  • "porsche announcements" filetype:pdf. Match only a specific file type. Some examples include PDF, DOC, XLS, PPT and TXT.
  • related:theguardian.com. Return sites that are related to a target domain. Only works for larger domains.
  • fleming AROUND(3) penicillin. Return results where the two terms/phrases are within (X) words of each other.

Algorithms

Algorithms are sets of instructions used for solving a problem or accomplishing a task on a computerised device. Algorithms allow a computer to perform its functions automatically, in a short amount of time and using a finite number of steps, whether implemented in hardware or as software-based routines.

The main types of algorithms known in computing are:

  • Numerical algorithms
  • Algebraic algorithms
  • Geometric algorithms
  • Sequential algorithms
  • Operational algorithms
  • Theoretical algorithms

Some of the most well known algorithms in computing are:

  • Sort: Displaying data in a useful and logical manner.
  • Search: Finding key data in sorted data groups.
  • Hashing: Finding key data with an indexed key ID component.
  • Dynamic programming: Transforms large, complex problems into a set of smaller, easy to resolve problems instead.
  • Exponentiation by squaring: Speeds up the calculation of large numerical powers. It is also known as binary exponentiation.
  • String matching and parsing: Finds patterns in large databases by using predetermined terms and restrictions.
  • Primality testing: Establishes prime numbers either in a determined or probabilistic way. It is mostly used in cryptography.

Alt Attributes

An alt attribute is an HTML attribute (or tag) that describes an image in text form. Whenever an image cannot be displayed on a page for any reason, the alt attribute's text is shown instead. Alt attributes are used by search engines to identify image content, as image files cannot generally be read by crawlers. With normal, correct web functionality these texts do not appear at all, since the image is shown without any complications.

When it comes to SEO, an ALT attribute has a very important role both for the user and also for the search engines.

  • User: If these texts do not appear on a web page, it means the page works correctly and the UX is not being affected. Alt attributes also matter greatly to people with visual disabilities, as they allow them to read and understand an image they cannot actually see.
  • Search engine: These attributes serve the search engines with essential data about the image content and, most importantly, about the website content.
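As a simple sketch (the file name is hypothetical), the alt attribute sits directly on the image tag:

  <img src="golden-retriever.jpg" alt="Golden retriever puppy playing in a garden">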

Alt Text

Alt text is a short description added to an image on a website to tell search engines what the image is and what elements appear in it. Alt text is also known as 'alternative text'. This HTML code does not only describe an image's content, but also its context and purpose within a website.

Three main uses are identified in the use of ALT Text:

  • Replace or describe an image in text form whenever a web page does not load properly. In that case, the description text appears in place of the image that cannot be seen.
  • Allow people with visual disabilities to know when they have an image in front of them in a webpage and be able to understand what the image is about.
  • Give data to the search engine so it can index the images in a webpage properly. This is very important in SEO as crawlers read these HTML codes and index the media content on the internet.

Ambiguous Intent

Ambiguous intent is basically an unclear, unspecific search made by the user, which needs further detail before the search engine can return a more concrete answer.

In these kinds of searches, search engines use context to decide the organic positions on the SERPs (Search Engine Results Pages), as they cannot focus on anything specific since the keyword used is too generic.

AMP

AMP is the abbreviation for 'Accelerated Mobile Pages'. AMP is an open-source HTML framework designed by the AMP Open Source Project to improve web page performance on mobile devices. It was originally created by Google, largely as a competitive response to formats such as Facebook Instant Articles and Apple News.

Since the use of smartphones and tablets has increased considerably, even overtaking desktop devices, a framework such as AMP was needed.

Specifically, AMP uses a stripped-down version of HTML that enables websites to load as quickly as possible on mobile devices.

Three elements are characteristic of the AMP framework: 'AMP HTML', 'AMP JavaScript' and, last but not least, the 'AMP CDN'.

The truth is that any browser can read AMP source code, as it is based on well-known scripts.

Amplification

Amplification is a marketing term describing methods used on the internet to reach a larger audience. In the SEO world, amplifying a website's content is one of the main ways to drive traffic to your website, as doing so can be quite difficult when competing with every other website on the net.

An amplification strategy or method applied to a website will result in:

  • An expansion of your audience
  • New business opportunities
  • New leads
  • Increased loyalty from your customers towards your business and brand

Some of the best amplification methods in the digital world are:

  • Influencer marketing
  • Tagging (and also using hashtags)
  • Including CTA strategies in your content and posts
  • Email marketing
  • Retargeting
  • Looking into new niche communities worth adding to your database
  • Using promotional messages and adding social buttons to your content

Anchor Text

Anchor text is the visible and clickable text in a hyperlink when linked to another document or website. It is usually seen on browsers in blue and underlined, although it can be modified through HTML or CSS.

Anchor text is crucial in SEO since search engines use external anchor text as a way to figure out how other people view your page and, at the same time, what the pages are about. Anchor text provides users and search engines with important data about the link's destination. Anchor text is considered SEO-friendly when it is succinct, relevant to the page it links to, not generic, and has a low keyword density.

The Anchor Text types found on the web are: 

  • Exact-match
  • Partial-match
  • Naked link
  • Generic
  • Images
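A hypothetical example (placeholder URL), marked up as a plain hyperlink:

  <a href="https://www.example.com/seo-basics">beginner's guide to SEO</a>

Here the clickable words "beginner's guide to SEO" are the anchor text.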

API

An API (Application Programming Interface) is a software interface that allows two different applications to 'talk' to each other. Technically speaking, when someone uses an application on their mobile phone, such as Facebook, the application connects to the Internet and sends data to a server. The server receives and interprets this data, performs the required actions and sends a response back to the user's phone. The application then interprets that response and presents the information in a readable way on the user's screen.

Article Spinning

Article spinning is an SEO writing technique used to create new content automatically from existing content. The software used for this kind of content creation is known as an article spinner.

The technique is based on rewriting a whole article, or part of it, without changing its subject or topic, but by changing the words used so the text looks different from the original. Article spinning can be manual, when writers recycle and rewrite the text themselves, or automatic, when the writer uses specific software to do it instead.

Although article spinning may seem attractive to content creators, extra care must be taken: thanks to algorithm improvements, search engines can identify spun articles and mark them as spam, which Google treats as a violation of its webmaster guidelines.

Async

Async (asynchronous) is a term that refers to loading a resource without blocking: the browser does not have to wait for that resource to finish downloading before continuing with the rest of the page.

In the early years of the Internet, resources were downloaded sequentially, meaning each resource had to finish downloading completely before the next one could start. Nowadays, when everything on the net depends heavily on SEO strategies and user experience, offering great, fast page-loading speed is highly relevant.

It is known that the more resources a web page contains, the heavier it is to load, which ends up giving the user a bad experience. The async technique is of great help here, as it lets resources download independently, meaning the user can see and use the page faster.
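In HTML this is a single attribute on the script tag; the browser keeps parsing the page while the script downloads in the background (the file name is a placeholder):

  <script async src="https://www.example.com/analytics.js"></script>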

Auto Generated Content

Auto-generated content refers to that content that has been generated programmatically; in other words, content created through coding from an established or pre-defined template. 

In many cases, attention must be paid so as not to be involved in what Google will identify as a ‘black-hat’ tactic, as auto-generated content can sometimes be used to index many pages and manipulate search rankings.

In the SEO world, auto-generated content most of the time gives the user a really bad UX, as the content turns out to be unoriginal, repetitive and unhelpful.

Google helps us with this by providing some examples of what is considered as bad auto-generated content:

  • Content automatically generated from pre-existing content, for example by using automated synonymising techniques.
  • Content automatically generated by stitching together related data from different websites without unifying it or differentiating it from its original sources.
  • Content automatically generated through literal (machine) translation, published without being checked by a human.
  • Content automatically generated by 'scraping' RSS feeds.

Backlinks

Backlinks are links placed strategically in a text to guide the user to our website. The goal of backlinks is to amplify online traffic and improve a website's organic position on the SERPs. The more high-quality backlinks we generate, the better organic ranking our site will get.

There are two types of backlinks:

  • Follow (dofollow): normal links that pass authority and guide bots to the linked sites.
  • Nofollow: these links still take users to the destination, but they tell search engines not to pass authority, as the webmaster does not want to give the linked page further endorsement.

In the SEO world, search engines will certainly rank follow backlinks better than nofollow ones, and Google has so far established more than 170 factors to classify them, two of the most important being:

  • Content quality
  • Number of backlinks received.

A link building strategy is commonly used nowadays because of the difficulty of ranking on search engines with keywords alone. However, special care must be taken when using link building techniques so as not to end up with a 'black hat' tactic, which Google will identify and which will consequently hurt our site, its traffic and even its conversions.
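A nofollow link is declared with the rel attribute (placeholder URL):

  <a href="https://www.example.com" rel="nofollow">example site</a>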
 

Bad Neighbourhood

A bad neighbourhood in marketing refers to the low-quality company a website keeps: the site is associated with, or linked to, other sites considered to be violating search engine rules, or that have even been penalised.

It is important to know that this problem is not related to any IP, subnet or host issue, but to the quality of the links on the site. For a website, being in a bad neighbourhood means that search engines will no longer treat the site as trustworthy: its position in the organic rankings will be endangered and it can even be removed from the search engine.

Bing Webmaster Tools

Bing Webmaster Tools is an online business tool, similar to Google Search Console but for the Bing search engine, that displays registered data about a site's visitors, clicks and user behaviour, so you can understand your content better and optimise your website properly.

Bingbot

Bingbot is the name of Bing's crawler, just as Googlebot refers to Google's crawler; Bingbot is built on Microsoft technology.

This crawler searches the web to find new content and sites to index: it follows the hyperlinks across the Internet, collects all the available data and HTML documents, and records them in its URL directory, the Bing index. After this process is completed, the sorted list is used to serve the search engine's results.

Black Hat SEO

Black hat SEO refers to SEO techniques that break search engine and webmaster rules and attempt to manipulate the engines to get faster results.

A few examples of black hat SEO techniques are:

  • Hidden content
  • Keyword stuffing
  • Paid links
  • Cloaking
  • Using doorway pages
  • Using link farms
  • Using private blog networks (PBNs)

Blockers

Blockers (or ad blockers) are a type of extension that can be installed in a browser to block advertising and pop-ups on any type of website. These extensions allow users to visit a website without being interrupted by banners, pop-ups or any other kind of advertising. This is an advantage for users, but far from it for ad providers. Since companies dedicate a large share of their budgets to paid ads, ad blockers blocking those ads considerably affects those companies' revenue.

Bots

Bots (short for robots) are software programs, often powered by artificial intelligence (AI), that perform automated tasks across the internet as if they were human beings.

Bots are capable of performing a large range of tasks, such as editing text, moderating conversations, answering questions or even sending emails. As this software keeps improving remarkably, many companies use bots to improve their customer satisfaction rates.

The most common Bots used on the Internet are:

  • Crawler bots (or spiders)
  • Bots in Social Media Network
  • ChatBots
  • VoiceBots
  • HackerBots
  • SpamBots

Bounce Rate

The bounce rate tells us what percentage of visits to our website end without the visitor viewing any other page on the same site. The higher the bounce rate, the worse the visitor interaction.
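As a quick worked example (with made-up numbers): if 1,000 sessions land on a page and 600 of them leave without viewing another page, the bounce rate is 600 / 1,000 = 60%.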

A high bounce rate can result because of many common situations, such as:

  • The user does not like the information of the website
  • The user does not like the design of the website
  • The user has come through a link which offered the wrong thing
  • The website takes too long to load

Some good recommendations to reduce the bounce rate are:

  • Create a proper structure for the website’s text
  • Use internal links in the posts
  • Use CTAs (calls to action) in your website
  • Do not show everything on the home page

Branded Keywords

Branded keywords are keywords that include a brand's name or variations of it. Potential clients use them as a shortcut when they are close to conversion, looking for more specific and detailed information about a product or service, so establishing the right tactics for our content marketing strategy is essential.

In addition to increasing traffic and audience of a web, branded keywords are also used as an essential part of the last funnel stage, in which users end up deciding whether to buy or not.

When it comes to SEO, branded keywords help us achieve better segmentation, better understand our audience's needs, identify new opportunities, spot any problems our clients may be experiencing with our products or services, and follow our competitors' strategies.

Breadcrumb Navigation

Breadcrumb navigation refers to a secondary form of navigation inside a website. This kind of structure allows users to go back quickly to the home page, or to another level of the site, using internal links, like following a trail of breadcrumbs. Breadcrumb navigation matters a great deal for user experience (UX), since it makes the website's hierarchy visible and transparent. It is also worth knowing that with breadcrumb navigation, the fewer clicks required, the better the user experience. The most common representation of a breadcrumb trail uses the greater-than sign (>) as a separator, for example 'Home Page > Category > Subcategory'.

When it comes to SEO, breadcrumb trails are very useful because they increase natural internal linking which, in turn, increases the chances of ranking in the first positions of the SERP.
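A minimal sketch of a breadcrumb trail in plain HTML (the paths are placeholders):

  <nav aria-label="breadcrumb">
    <a href="/">Home Page</a> &gt;
    <a href="/category/">Category</a> &gt;
    <span>Subcategory</span>
  </nav>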

Bridge Page

A bridge page is a page created to drive traffic to another site. Nowadays, the Internet is filled with bridge pages. They are also known as banner farms, entry pages, doorway pages and gateway pages.

Bridge pages can often be mistaken for landing pages, but they are quite different. A landing page is simply the single web page marketers send traffic to in order to create more leads, whereas a bridge page is the middle stage of the funnel, placed between the squeeze page and the sales page, like the promoting stage of the selling process.

The main purposes of a bridge page are:

  • Presell the offer before sending buyers to the affiliate product
  • Warm up the traffic
  • Connect with visitors and gain their trust
  • Fill the gap between prospect and product

Some essential elements to include in a bridge page design are:

  • Eye catching title
  • Audiovisual content (a video for example)
  • Great written content
  • Bonuses or discounts
  • Value stack
  • Social proof
  • CTA buttons
  • Countdown timer
  • Banner or Pop-up

Briefing

Broken Links

Broken links are links on a website that no longer work, meaning that clicking on them does not take us to any other page. Broken links can have many causes, such as programming errors, a change in the address of the linked web page, or a temporarily unavailable web page.

Broken links strongly affect a business's organic ranking, as Google's search bots identify them as a bad experience for the users who come across them, and penalise the site accordingly.

The best practice regarding broken links is to recheck all the links on the website every once in a while. If we come across a broken link, it should be examined to identify the problem and fixed as soon as possible.

Browser

A browser is a piece of software, a program or an app that allows the user to see the information and data on a web page. The browser reads the code in which the page is written, usually HTML, and renders it on the screen, allowing the user to interact with the content and even navigate through it.

The best-known browsers currently are Google Chrome, Safari, Microsoft Internet Explorer, Microsoft Edge, Mozilla Firefox and Opera.

Business Directory

A business directory is an online listing of businesses that provides specific data about them, such as name, address, contact information, partners, and their services and products.

Being listed in a business directory will help a business get more traffic thanks to increased awareness and its impact on potential customers.

When it comes to SEO, being listed in a business directory can dramatically help your strategy, as you will be officially associating your business with other trusted brands and companies. Herein lies the importance of staying away from 'bad neighbourhoods' and only relating your business to trustworthy brands, so that search engines identify valuable backlinks and rank the business in the first positions of the SERP.
 

Cached Page

A cached page is a web page that has been saved either by a search engine on its servers or by a user's browser on a computer or mobile device. This happens when a user visits a web page and the browser stores a copy of the page's HTML and images as web documents.

Cached pages are not always ideal, as the digital world and its web pages change frequently: if a web page has changed and the user loads a cached version, the user will be seeing the previous version and will miss all the new updates. There are two ways of fixing this: either 'refresh' the page in the browser a few times or, if that still does not work, clear the browser's cache.

However, an outdated version of a web page is not always negative, basically because a cached version of a web page loads quicker than a fresh copy.

Caching

Caching refers to the process of storing data in fast-access hardware, commonly known as a cache. A cache often stores temporary data as a way to improve the speed of the search engine and the browser.

Some benefits of caching are:

  • Digital performance improvement
  • Reducing the cost of the data storage
  • Reducing the load on the backend
  • Predicting digital performances
  • Eliminating database hotspots
  • Increasing read throughput 
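At the HTTP level, caching is commonly controlled with response headers. A sketch, with illustrative values:

  HTTP/1.1 200 OK
  Cache-Control: max-age=86400
  ETag: "abc123"

Here the browser may reuse its stored copy for a day (86,400 seconds) and can later revalidate it using the ETag, receiving a 304 Not Modified response if nothing has changed.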

Canonical Tag

A canonical tag is an HTML element, inserted as a tag in the source code of a web page, indicating whether its content is original or not. These tags work as a signal to search engines, telling them where the user should be directed when performing a search.

The canonical tag is sometimes seen as similar to the 301 redirect, but the two are technically quite different. A canonical tag indicates to search engines which version of a page holds the original, unique content and should be shown in searches, while a 301 (or other redirect code) tells the browser that a web page has moved permanently to a different URL. With a canonical tag, the user can still reach a well-indexed page at its own address; with a redirect, the user never reaches the original URL at all, since the page no longer exists at that address.
 

Canonical URL

A canonical URL is the URL of the page Google presents to a user after a search as the most original and unique page from a set of duplicate pages on your site. Also known as the canonical link or canonical tag, canonical URLs first appeared in February 2009 out of the need to identify duplicate content and pages in order to select the original ones.

Canonical URLs and tags are essential when it comes to SEO, as they help index pages correctly which, at the same time, results in better handling by Google's algorithm. It is also important to always check that your website's content is unique, both within the site and compared to competitors; otherwise it will be identified as duplicated content, hurting organic rankings. Using canonical URLs helps us avoid these problems.

To implement canonical URLs on a site:

Add the canonical tag to the head of the page, like this (placeholder URL): <link rel="canonical" href="https://www.example.com/original-page/">.

It should not be added to the body; if the tag is used more than once, Google will ignore it entirely.

Canonical URLs must never be 'noindex', 'disallow' or 'nofollow' links: if Google cannot read the code, it will simply ignore it.
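Putting it together, a sketch of how a duplicate page might point at its canonical version (the URLs are hypothetical):

  <!-- On https://www.example.com/product?colour=red (a duplicate view) -->
  <head>
    <link rel="canonical" href="https://www.example.com/product">
  </head>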
 

ccTLD

A ccTLD (country code top-level domain) is a territorial domain extension that belongs to a specific country. For example, as the territorial code for Germany is DE, its territorial domain extension would be ‘.DE’.

ccTLD extensions always consist of a two-letter code, which is also used to catalogue each country under the international standard ISO 3166. Besides these ccTLD extensions, which are subject to the requirements set by each country's domain registry, there is also the possibility of using gTLD extensions (generic top-level domains), which are subject to international regulations.

When it comes to SEO, choosing between ccTLD and gTLD extensions matters a great deal; ccTLDs are known to be the strongest way to tell users and search engines that a site and its content target a specific country or region. As a result, Google reads the site as specifically relevant to the geographic area marked by the ccTLD and shows it on that area's SERPs.

Citations

A citation is an online mention of your business on another site, featuring the business's name, address and phone number; it may even incorporate a direct link to your site.

When it comes to SEO, citations are essential for a website indexing and its rankings, as these references help crawlers identify the relevance of that site and allows it to decide where to rank it among its competitors. Citations are mainly considered to be key factors in local SEO.

There are many opinions about which citation websites are best in the SEO world; nevertheless, SEO experts agree that the main citations to consider for local SEO are:

  • Google My Business
  • Bing Maps
  • Facebook
  • Yelp

Click-through Rate

CTR (click-through rate) is a metric in Internet marketing that measures the number of clicks advertisers obtain on their ads divided by the number of times the ad has been shown (also called impressions).

The resulting formula looks like this: clicks / impressions = CTR. For example, if an ad has received 15 clicks and 100 impressions, the CTR obtained would be 15%.
This kind of metric is mostly used as a tool to measure a digital campaign's impact on our audience. Achieving a high click-through rate is essential for a business's PPC (pay-per-click) success, as it affects both the quality score and how much advertisers pay every time someone clicks on their search ad.

The main platform used to measure CTR is Google Search Console, formerly known as Webmaster Tools, which allows marketers to check their pages' CTR in Google's search results.

CTR has become one of the most important factors affecting SEO rankings. It is important to make our links attractive in order to achieve a higher number of clicks on the SERPs which, as a result, will increase our CTR.
 

Client-side and Server-side Rendering

Client-side rendering refers to websites whose pages are generated in the browser, whereas server-side rendering refers to websites whose pages are generated on the server. To explain the differences:

Client-side rendering:

  • The server sends a JS code to the browser, which interprets that code in order to create the HTML or CSS.
  • The initial load requires more time, and so as to help the load a loader or a skeleton might be added to it.
  • Data is quickly performed after the first load.
  • It can affect SEO negatively, as the initial HTML code is almost empty.
  • It is often used for SPAs (single-page applications), static sites and web applications.

Server-side rendering:

  • The server sends a complete web page (HTML, CSS and JS) to the browser.
  • The initial load is faster.
  • The view of the website is built in the backend by the use of templates.
  • Search engines can track the website for a better SEO.
  • A CSR site can be transformed into an SSR site by using frameworks such as Gatsby, Next.js or Nuxt.
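To make the contrast concrete, here is a sketch of what the server might initially return in each model (file names and markup are illustrative):

  <!-- Client-side rendering: the server returns an almost empty shell -->
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>

  <!-- Server-side rendering: the server returns the complete HTML -->
  <body>
    <div id="root">
      <h1>Product list</h1>
    </div>
  </body>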
     

Cloaking

Cloaking is an SEO technique based on showing different content to web users and to crawling robots such as Googlebot. Though it started as a popular technique among webmasters for its quick results, it became less effective over the years, ending up penalised by Google as black hat SEO.

Cloaking was formerly used as a way to show highly optimised content to bots without granting users the same experience.

Some cloaking techniques often used are:

  • Programming the server to display different content depending on the client that made the page request. For example, when the request comes from Google, the highly optimised version would be shown.
  • Using JavaScript redirects since, until recently, search engines could not read and run this code, so it could be used to present different content to robots and people.
  • Showing different content depending on the IP address from which the visitor is coming.

There are still some SEO experts using cloaking and other black hat SEO techniques, aware that Google might penalise them. If you want to check how Google sees your website, you can use the 'Fetch as Google' function in Google Search Console.
     

Co-citation

Co-citation is an SEO term referring to when our website or company is mentioned or cited by two independent sources that are not directly linked to each other, but only related through their content.

Search engines use co-citation to build a map of content-related sites, deciding which sites are the most relevant and where to rank them on the SERPs.

Co-citation is often related to 'co-occurrence', an SEO term that basically identifies similar keywords across different websites, whereas co-citation refers to a website being mentioned by two other sources that are not necessarily linked.

Co-occurrence

Co-occurrence is an SEO term describing how related ideas and concepts, expressed through keywords, occur across different websites. In other words, when Google's bots crawl a piece of content to identify co-occurrence, they look for similar keywords appearing in close proximity on multiple websites. These matches include keywords that are similar to each other (not identical) and based on the same theme.

Co-occurrence is often related to ‘co-citation’; another SEO term that refers to when a website is being mentioned by two other different sources that are not necessarily linked, whereas co-occurrence identifies similar keywords between different websites.

When it comes to SEO, co-occurrence results in Google presenting a website that does not contain the exact title or description we searched for. How does that happen? By association (and co-occurrence) of ideas and terms, Google finds concordances in the body text of its results thanks to the keywords used, makes associations, and understands that the page should be presented to the user for that precise search.

Commercial Investigation Queries

A commercial investigation query is a query the searcher types into the browser to compare different results and find the option that best suits their desires or expectations. This kind of query usually reflects an intention to purchase. Some examples of commercial investigation queries would be:

  • Best store in London to buy Ethiopian food
  • Apple music vs Spotify
  • Best wine cellar in the city to provide your restaurant
  • Best travel package to travel to Disneyland

When it comes to SEO, these kinds of queries will provide you, as a business owner, with relevant information when defining a segmented target, searching for potential clients or partners, assessing your competitors and so on. Commercial investigation queries will also surface the best keywords which, as a result, will help optimise your website for search engines.

Competition

The term competition in SEO refers to all those websites that will be displayed on the SERPs when the user types the same search terms as you. Those websites will, after all, become your direct online competitors and you all will be competing for the same keywords or search terms.

Computer Generated Content

Content Delivery Network (CDN)

A content delivery network refers to a set of servers installed in different locations of a network containing local copies of certain content (images, documents, videos, websites, and so on), in order to serve that web content over a wide geographic area. The main goal of content delivery networks is to offer multiple PoPs (Points of Presence) outside the origin server.

These data centres keep the online world connected by bringing content to people on the Internet, regardless of their geographic location relative to the website's main server. As an example, you are using a content delivery network every time you go on Amazon or Facebook.

Content delivery networks mean a great deal for SEO strategies, as these networks process users' requests faster, allowing websites to better manage their traffic, which results in a more satisfying user experience.
 

Content Gap Analysis

A content gap analysis studies the current content performance of a website, identifying aspects to improve and establishing the differences with its competitors. It is an essential part of a good content strategy, as it allows a company to identify opportunities and optimise errors.

There are four steps or questions known for a good content gap analysis:

  • Where are we?
  • Where do we want to be?
  • How far are we from our SMART goals?
  • How are we planning to accomplish our SMART goals?

A content gap analysis basically starts with an assessment of a site's current SEO situation. In that analysis, a site should examine its current content, track which keywords it ranks for, identify which content generates better or worse traffic and, last but not least, find the keywords visitors are using to reach the website.

Content Hub

A content hub is a centralised online space or tool that offers curated, organised content about a brand or a specific topic so that users can access it in a single place. In other words, a content hub will help you capture users' attention towards your business thanks to segmented publications based on your audience's interests and preferences.

When it comes to SEO, using a content hub would help your strategy significantly because:

  • It improves the user experience
  • It generates brand authority
  • It increases visibility and traffic
  • It generates engagement with the user
  • It helps generate leads

Content Marketing

Content marketing is a marketing strategy focused on the creation and distribution of relevant content, in order to attract, acquire and draw the attention of a defined target, with the aim of encouraging them to become its customers.
Content marketing was formerly based on written pieces, such as corporate magazines. However, this has currently been moved to a digital space, where we find content marketing in the form of blogs, ebooks, infographics and videos, among others.

What are the benefits for using content marketing?

  • It complements any SEO strategy, as search engines positively value a site with high-quality, updated content.
  • It can help improve the brand's reputation and the branding itself.
  • It can help increase the audience’s loyalty, as websites with resources and content of great quality will offer a better user experience.
  • It can help improve customer service.
  • It improves Public Relations.
  • The content itself feeds the company's paid channels and its social networks.
     

Content Relevance

Content relevance is a criterion used by search engines to qualify a website's content as high or poor quality. As a result, it makes a website rank better or worse on the SERPs.

Content relevance is essentially related to a good content marketing strategy and to a proper SEO strategy because of the following benefits:

  • It improves organic rankings
  • It attracts more traffic
  • It improves SEM strategy (paid marketing)
  • It increases the brand's authority

In order to ensure our content is relevant, we should take care of:

  • Properly understanding the user
  • Working on our website's structure
  • Generating original and unique content
  • Indexing the content properly

Content Syndication

Content syndication refers to giving other websites' authors permission to republish your own content. This practice is usually implemented when the owner of a website sees that the site is not reaching the expected audience, which eventually wastes resources.

Content syndication allows publishers to amplify their audience by sharing their own content on other websites. At the same time, it drives organic traffic to the original website: the syndicating websites tell search engines which site the content comes from, so the engines recognise its quality and reliability and, eventually, rank it better.

It is important to know that syndicated content is not considered copy-pasted content, but authorised, distributed content.

Conversion

A conversion is any action carried out by a user on a website that generates benefit and value for the business. Conversion is one of the most important actions an online business expects to see on its website. That is why companies implement digital marketing strategies that guide users through the five stages of the funnel, keeping their attention as long as possible in order to eventually achieve a higher conversion rate.


The conversion process happens when a key action within the strategy turns the user from a casual visitor into a lead, a fan or a client, among others. Although the conversion stage is highly important in a company's marketing strategy, none of the other four funnel stages should be forgotten, and it is essential to establish the most important goals the business expects to accomplish with that precise strategy.


What are the main factors to consider in order to get a great conversion rate?

  • Establish clear goals.
  • Define the indicators.
  • Measure the conversion rates by using established metrics (KPIs).
  • Optimise the results, preferably with an A/B test; if that does not produce clear conclusions, restructure the website architecture and the whole strategy from the beginning.


However, there are some factors that can negatively affect our conversion rate and that must be taken into consideration. These are:

  • An unattractive or weak offer.
  • Poor content.
  • Unfriendly navigation and an unappealing website design.
  • Bad procedures: forms to be filled in, confusing questions, or too much data requested from the user.
  • Poor usability.
  • Excessive loading and downloading times.
  • Bad indexing.
  • Poor internal search engines.
     

Conversion Rate (CR)

Conversion rate (CR) is the percentage of users who perform a specific action (a conversion) on a website, such as purchasing, downloading, registering or even making a reservation. The CR is obtained by dividing the number of goals achieved by the number of unique users who visited the website.
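As a quick worked example (with made-up numbers): if a site records 50 purchases from 2,000 unique visitors, the conversion rate is 50 / 2,000 = 2.5%.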


The conversion rate can be applied to an entire website or, instead, to each of the products or services offered. Marketers actually recommend the latter, applying the conversion rate to each product or service, as it gives more realistic and clearer results. This KPI allows us to analyse many situations in our e-commerce, such as whether a client makes several purchases in a single visit, or whether a client makes different visits on different days with a single purchase per visit, among others.


Some factors that can affect the improvement of a conversion rate are:

  • Functionality
  • Accessibility
  • Usability
  • Intuitiveness
  • Persuasion 

Cookies

A cookie, in the digital world, refers to a data packet that a browser automatically stores on a user's computer every time that user visits a web page. The main purpose of cookies is to identify a user through their interaction with a web page so as to offer them more appropriate or specific content, based on those browsing habits, the next time they visit.


Cookies allow web pages to identify your computer's IP address so that the next time you enter those pages, they remember you, your last searches and, therefore, your preferences; as simple as knowing which language to present a web page in, depending on the version you last visited. Being able to log in to a web page without having to remember or type your credentials (username and password) is thanks to the cookies allowed on that page: they remember those credentials, so the next time you visit, the page offers to log you in with them, or with different ones in case you are not the person the stored data belongs to.


However, it is always important to be mindful of cookies and online security, as we are essentially entrusting the Internet with highly personal information, such as bank account or health insurance credentials. That said, as far as we know, cookies cannot transmit or execute viruses, nor install malware such as Trojans or spyware. In any case, every time we visit a new web page, or log in somewhere, the system will ask whether or not we accept the use of cookies on that particular page.


Cookies are highly relevant in the digital world because they allow a company to segment its audience and tell one user from another, improve the user experience, collect statistical data, help brands create remarketing (retargeting) campaigns, and provide better knowledge of user behaviour.
 

Core Web Vitals

Core Web Vitals, also known as Main Web Metrics, are metrics used to evaluate the performance of a website depending on its user experience (UX).


To ensure that their websites do not fall behind in rankings, businesses everywhere have been shifting their focus to prioritising user experience; analysing and making the appropriate changes to make sure that each of their indexed pages are as accessible and as user friendly as can be.


The Core Web Vitals consist of three important measurements that score a website visitor’s user experience when loading a specific webpage. These metrics score page speed - how quickly the content on the page loads, responsiveness - how long the browser takes to respond to a user’s input, and visual stability - the stability of the content loading in the browser.

  • Largest Contentful Paint (LCP): LCP is the time it takes to load the content from a user's perspective. (Page Load Time). In other words, it is the time taken from clicking on the respective page link, to how long it takes for all the contents on that specific web page to load up. For the best result, your LCP score should not be more than 2.5 seconds.
  • First Input Delay (FID): FID focuses on user interactivity with the specific web page. Put simply, it is the delay between a user's first interaction with the page and the moment the browser is able to respond to it. For example, the time between clicking an option in the menu tab and the browser starting to process that click. For the best result, the FID score should be at most 100 ms. The less time, the better.
  • Cumulative Layout Shift (CLS): CLS focuses on the stability of the web page from the user's perspective. In other words, the emphasis is on the visual stability of the elements of the web page. For example, focusing on the movement of any video pop-up, audio content or any visual element when the web page loads, thus making the user forget the initial location of the specific element or the visual. For the best result, CLS score should at most be 0.1. Once again, the lower the score, the better.


In addition to the newly added measurements, the Core Web Vitals update also includes the existing signals:  

  • Mobile Friendliness - focuses on the compatibility of the web page to display it efficiently on a mobile phone browser.
  • Safe Browsing (virus-free) - making sure the website is safe to view: no hidden viruses, malware, etc.
  • HTTPS Security - making sure the website is HTTPS secure (which can also be noticed from the URL tab in any browser for the specific web page).
  • No Intrusive Interstitials (fewer pop-ups) - focuses on keeping the content clean: avoiding ads or pop-ups that appear while the user is in the middle of an article and is not expecting them.


Therefore, Core Web Vitals have become an important factor when determining the new overall page experience score, but how do we measure Core Web Vitals?
Core Web Vitals measurement metrics can be seen from the performance of the 3 main ranking factors listed above - Largest Contentful Paint (LCP), First Input Delay (FID) and Cumulative Layout Shift (CLS).


As mentioned above, the significance of the Core Web Vitals has been noted by many businesses who are wishing to improve their rankings. Hence, all the popular tools for measuring the performance metrics have been updated to incorporate the new update. 


Popular tools such as Chrome User Experience Report, PageSpeed Insights and Search Console have all been updated and can now be used to determine the metrics.


Is the optimisation of Core Web Vitals possible?
 

Of course it is. Here are some final tips to further optimise each Core Web Vitals signal.
 

For LCP:

  • Remove any third-party scripts or large page elements - this can be done with the help of Google Tag Manager.
  • Optimise CSS files.
  • Use image sprites and reduce unnecessary code.
  • Optimise image file sizes.
  • Remove web fonts.
  • Reduce JavaScript.

For FID:

  • Remove any third-party scripts, or use browser caching.
  • Reduce JS execution time - minifying or compressing the code will help achieve this.
  • Minimise main thread work.
  • Utilise web workers to do tasks off the main thread.
  • Use small transfer sizes and low request counts.

For CLS:

  • Set up specific attribute dimensions for any media or visuals. Use size attributes on images and video elements.
  • Never insert content above existing content, except in response to a user interaction.
  • Use transform animations instead of animations of properties that force layout changes.
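For instance, the first CLS tip above is a one-line change in the markup (the file name and dimensions are placeholders):

  <img src="hero.jpg" alt="Hero banner" width="1200" height="600">

Reserving the image's space up front means the surrounding content does not jump when the file finishes loading.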

Cornerstone Content

Cornerstone content comprises the pillars of a website's content. In other words, cornerstone content refers to the content (articles, posts and so on) that stands out on a blog or website because it best represents the brand; the content the company is most proud of.


The main factors a company should take into consideration in order to generate content relevant enough to be tagged as cornerstone content are:
 

  • Establish which keywords will help the URL rank, and why.
  • Use related content as a complement.
  • Optimise the web page, or create it if it does not exist yet.
     

Crawl Budget

Crawl budget refers to the time that Google assigns its crawlers to crawl each website. Put another way, crawl budget can be understood as a metric indicating the number of times a website's pages are visited by a search engine's crawlers.

Crawlability

Crawlability refers to how easy it is for a search engine to read and interpret a website - an action known as 'crawling'. Crawlability is closely tied to SEO optimisation and indexing strategies in the digital sector.


Some actions that can help to make a website more crawlable are:

  • Improving a website’s sitemap
  • Using internal linking in a smart way
  • Choosing a good hosting service
  • Optimising meta tags

Some factors that will make a website less crawlable would be:

  • Not having a proper sitemap
  • Having broken links, 404 errors or dead-end web pages.

Crawler

A crawler, also known as a spider, is a small software bot (robot) that reads and analyses a website's code and content, moving from one page to another through its links, in order to categorise and index those pages. Crawlers basically work as data collectors on the Internet: once they have gathered all the information about a website and its business, they assess its content, user experience and other factors to decide where to position each website on the SERPs.

Crawlers can, however, be blocked from your web pages by using the robots.txt file. This forbids crawlers from visiting your pages and collecting data from them, but at the same time it restricts the information these programs can gather to give you better rankings.
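
As a minimal illustration, a hypothetical robots.txt file like the one below blocks every crawler from one directory while leaving the rest of the site open (the directory name is a placeholder):

  # Block all crawlers from /private/ only
  User-agent: *
  Disallow: /private/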

Crawler Directives

Crawler directives are the rules that tell search engine crawlers how to behave on a particular website. These directives can:

  • Tell a search engine whether to follow or not links on that page.
  • Tell a search engine not to use a page in its index after it has crawled it.
  • Tell a search engine to not crawl a page at all.

The most popular crawler directives are 'robots meta directives', also known as 'meta tags'. These give crawlers recommendations on how to crawl or index a specific page.
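
For example, a robots meta tag placed in a page's <head> might look like this minimal sketch:

  <!-- Allow indexing of this page, but tell crawlers not to follow its links -->
  <meta name="robots" content="index, nofollow">

  <!-- Or keep the page out of the index entirely -->
  <meta name="robots" content="noindex">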

Crawling

Website crawling is the journey that a small software bot (a crawler) makes to read and analyse the code and content of a website. The importance of crawling lies in the fact that if the crawler cannot 'read' your website, your website may not rank at all. Crawlers already have a lot of work to do, so optimising your pages and simplifying their job will help your rankings immensely.

This information is found in servers. Crawlers identify the number of websites hosted by each server, and work their way down the list.

They do this by visiting each website and collecting data. This data includes:

  • How many pages the website has
  • The website’s different types of content: Text, video and image content (additional content formats include: CSS, HTML and JavaScript).

Crawlers are smart and meticulous. This data-collecting process is repeated constantly, meaning that the Crawlers can keep track of changes made to a website (for example, adding a webpage or deleting an image).

Critical Rendering Path

Critical Rendering Path (or CRP) is the series of steps a browser performs to convert HTML, CSS and JavaScript into rendered pages that present the user with a message and provide the best possible user experience.

These main steps are:

  • Building the DOM tree (DOM ‘Document Object Model’)
  • Generating the CSS Object Model
  • Generating the Render Tree
  • Setting the layout on the web page architecture
  • Presenting (painting) the page’s pixels on the device screen

The top three factors to keep in mind in order to improve the critical rendering path are (see the sketch after this list):

  • The number of critical bytes
  • The length of the critical path
  • The number of critical elements
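
As a hedged illustration of trimming the critical path (file names are placeholders), the snippet below requests CSS early and keeps JavaScript from blocking the initial render:

  <head>
    <!-- Critical CSS is requested as early as possible -->
    <link rel="stylesheet" href="styles.css">
    <!-- defer keeps this script off the critical rendering path -->
    <script src="app.js" defer></script>
  </head>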
     

CSS

CSS, also known as Cascading Style Sheets, is a stylesheet language used to style elements written in a markup language such as HTML, separating the content from the visual presentation of the site itself. It is, in a sense, a graphic design language that allows you to build pages more precisely and apply a range of styles to them, such as colours, margins, fonts, shapes, etc.

This defined design of a web page gives you greater control over the final result you expect from the website and its pages. A term closely related to CSS is HTML, which forms the bigger picture: HTML is the markup language that forms the basis of a site, whereas CSS emphasises the styling - the aesthetic part.

There are three ways of implementing CSS on a website: internal, external and inline (see the sketch after this list).

  • Internal: placed inside a <style> tag in the page's <head>, internal CSS is loaded every time the page loads, which can increase load time and, as a consequence, reduce the quality of the user experience. Mind also that internal CSS cannot be reused across different pages.
  • External: kept in a separate .css file, which means that you can do all the styling in one file and then apply it to any page you want. It also tends to translate into an improved user experience.
  • Inline: applied directly to individual elements through the style attribute, which means every component of the page has to be styled separately.
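
A minimal sketch of the three approaches (file names and values are placeholders):

  <!-- External: styles live in a separate .css file -->
  <link rel="stylesheet" href="styles.css">

  <!-- Internal: styles placed in the page's <head> inside a <style> tag -->
  <style>
    h1 { color: navy; }
  </style>

  <!-- Inline: styles applied directly to one element via the style attribute -->
  <h1 style="color: navy;">Welcome</h1>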

DA (Domain Authority)

The DA, also known as domain authority, is a score from 1 to 100 that measures a website's weight or authority based on its organic position on the SERPs. In other words, it is a metric used to compare one domain's strength against another's.

When it comes to SEO, monitoring your website's DA helps you check whether the SEO strategies you are currently performing are working properly. It also allows you to see competitors' or other websites' DA, in order to analyse their strategies.

De-indexed

De-indexing refers to when your website is temporarily or permanently removed from search engine rankings. In other words, if your site has been de-indexed, it will never appear on SERPs while that state applies to it.

The main reasons a website might be de-indexed are:

  • Duplicated content
  • Outdated content
  • Gated content
  • Pages with little and poor content, or no content at all

De-indexing can also be applied by search engines due to wrongful practices (Black-Hat SEO), or voluntarily by a site webmaster.

Defective Links

A defective link is a link that has no target or does not lead anywhere. A website containing defective links will see its search engine rankings drastically affected, appearing in lower positions on the SERPs.

Directory Links

Directory links are links that originate in a web directory, which acts as a list or catalogue of different websites. The original purpose of directory links was to list people's or businesses' online entries, such as their homepage address or their contact information.

Although web or link directories are not used as much as they once were, they have managed to survive and even broaden their range of uses, such as covering entire websites or allowing searches within their database.

When it comes to SEO, submitting your website to a link directory can help your strategy, as you gain backlinks in the form of directory links to the site. It may be considered a link building tool.

DNS (Domain Name System)

A DNS, also known as the Domain Name System, is a database-backed technology used to resolve names on networks. In other words, DNS allows us to find the IP address of the machine hosting the domain we want to access.

When it comes to this kind of technology, we need to be aware of the three components required for a DNS system to work properly:

  • DNS client: runs on the user’s computer and makes requests to the DNS server to resolve names.
  • DNS server: resolves the request from the DNS client and sends the response.
  • Authority zones: disseminate information about domain names and subdomains.

Dofollow Link

Dofollow links are conventional links that allow both users and search bots to follow them and convey authority to the destination URL. In other words, if a webmaster links to a site with a dofollow link, both search engine bots and users will be able to follow it.

When it comes to SEO, dofollow links are of significant value for your strategy and for achieving a better PageRank.

Think of it like this, using Google as an example:

The dofollow link tells Googlebot that the backlink is allowed to be tracked. The bot interprets the dofollow as a sign of trust from the linking website towards the destination page. As a result, it is read as a trust signal and has a positive effect on the rankings of the linked landing page.

DOM (Document Object Model)

The Document Object Model (DOM) is a programming interface used to interpret HTML and XML documents. The DOM represents a document as a group of structured nodes and objects that have properties and methods.

It basically connects web pages to programming languages or scripts, such as JavaScript.

As an object model, the DOM serves to identify:

  • The interfaces and objects used to represent and manipulate a document.
  • The semantics of these interfaces and objects, including both behaviour and attributes.
  • The relationship and collaborations among these interfaces and objects.

DOM is highly related to SEO strategies as the rendering times of a website directly affect the positioning of that site, so learning how the structure of the DOM influences a site is essential. Some factors to be taken into consideration would be:

  • Keeping DOM elements lean and applying them in the correct form.
  • Proper use of JavaScript.
  • Minifying HTML, CSS and JavaScript content.
  • Using Content Delivery Networks (CDN).
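
As a small sketch of how the DOM connects a page to a script, the hypothetical snippet below reads and updates a node (the id is a placeholder):

  <p id="greeting">Hello</p>
  <script>
    // The DOM exposes the <p> element as an object with properties and methods
    const node = document.getElementById('greeting');
    node.textContent = 'Hello, world';     // update the node's text
    console.log(node.parentNode.nodeName); // inspect its place in the node tree
  </script>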
     

Domain Name Register

Domain name registration is the act of reserving a name on the Internet for a certain period of time, usually one year. Domain registration is necessary for a web page, an email account or any other web service to give you a distinct, recognised identity. However, it is not always necessary to register a new domain.

When it comes to SEO, the length of your domain name registration is worth taking seriously, as webmasters have often claimed that search engines give preference to domains registered for longer periods. The reasoning is that domains registered for a year or less are often bought for spam websites. So think carefully about your domain name registration and how it will affect your SEO strategy and, as a result, your revenue and traffic.

Domain Popularity

Domain popularity measures how many different domains have created one or more links towards a specific website. To obtain that indicator, we analyse the number of domains that have provided inbound links to the main site.

We might sometimes find that the same domain has created thousands of links to the site. That can happen either as a deliberate strategy or due to technical problems. Here lies the importance of evaluating domain popularity: it allows us to spot such anomalies by giving us a clear view of how far a site's backlink profile really extends.

Domain Rating (DR)

The domain rating is a metric that helps you measure the authority of a domain by analysing the impact generated by its link building strategy. In other words, DR tells us the strength of the links pointing to a specific website.

The truth is that DR is not an official Google indicator, which means it may be calculated differently by different SEO service platforms. The indicator is presented as a numeric value from 0 to 100, with 100 being the highest possible score.

Some of the main pieces of advice for improving a domain's rating, and as a result its authority, are:

  • Generate high quality links
  • Use natural and friendly anchor texts and keywords

Domain Trust

As the name itself suggests, domain trust is basically the level of trust a domain demonstrates. Domain trust can be explained as the interaction of many factors that together demonstrate a website's respectability and credibility.

The main goal of the domain trust is to ensure the improvement of a web page position on the SERPs. The right use of domain trust will positively act as a website’s positioning factor, so it must be considered an important asset in an SEO strategy.

Be careful not to use black-hat techniques to improve your domain trust: they will eventually be detected by search engines and webmasters, with serious consequences.

Doorway Page

Doorway pages are pages built with specific content so that they rank for certain search terms, with the catch that when users enter them, they are redirected to another page containing the content the doorway page's creator actually wants them to see.

These pages are often filled with keywords that help channel traffic to the main website. Although they may seem similar to landing pages, the two have nothing in common. Doorway pages are generally considered an outdated spam tactic that tries to fool users by redirecting them to content other than what they thought they were being directed to; landing pages, by contrast, are legitimate pages within a website, developed for the sole purpose of converting visitors into leads or sales prospects through a particular offer.

Duplicate Content

Duplicate content refers to content that is identical or very similar to content on another website, or to content repeated across different pages of the same website. It is basically word-for-word content from another page.

When it comes to SEO, duplicate content is not considered a good strategy, although it is still often used by websites attempting to manipulate search engines into awarding a higher ranking on the SERPs and, as a result, more traffic.

What search engines fundamentally aim for when their crawlers read web pages is to offer the user an optimal search result, so it is never positive for them to keep finding the same content in different places on the Internet. Therefore, search engines will check every piece of duplicate content, evaluate it and decide which versions to demote.

A good strategy to reduce or get rid of duplicate content is to expand and differentiate it - rewording or rephrasing it, or even merging it onto a single page.

Dwell Time

Dwell time is a metric (KPI) that measures the time from when a user enters a website by clicking on one of the search results until they return to the results page.
This metric is not really considered a ranking factor, but rather a quality signal Google's algorithm uses to tell whether a result satisfies the user's search intent. Indirectly, that can end up helping the website's positioning on SERPs.

Some recommendations to improve dwell times value might be:

  • Generate high quality content
  • Improve loading speed
  • Improve the user experience, or at least pay attention to it
  • Offer interesting and entertaining content

Dynamic URL

A dynamic URL is a page address generated by the server from a database, typically the result of appending URL parameters to a static URL (for example, example.com/products?id=123&sort=price). It is often used in forums, product lists, sessions and interactive websites. Common technologies for creating this type of dynamic website include JavaScript, PHP, Java EE and Microsoft's .NET platform.

When it comes to SEO, when search engine spiders crawl dynamic URLs, the site can appear as a huge sequence of web pages with constantly changing content - a so-called spider trap. Therefore, if what we want is to ensure better website indexation, it may be better to simply use a static URL, whose content does not change unless the HTML code is rewritten.

Editorial Link

An editorial link is a one-way link placed within the body of content, given naturally by other websites to point to a resource. It is a form of organic link building; editorial links are essentially earned backlinks.

Email Outreach

Email outreach is the activity of creating and sending email marketing campaigns to particular individuals to get them to take a positive action. It is often used to promote content, build links and collaborate or partner with businesses.

Engagement

Engagement refers to the act of communicating or interacting within an online community. The ways users interact can range from 'liking' and 'favouriting' to 'commenting'. An online community can be anything from a website to a social media page.

Entry page

An entry page, similar to a landing page, is the first page viewed when a visitor accesses a website or mobile site. This page is important because its content can initiate further engagement.

Evergreen Content

Evergreen content is content that is optimised to remain relevant and beneficial over a long period of time. This is due to the content relating to a topic that is continuously applicable, such as 'how to get rid of a cold'.

External link

An external link (also known as a backlink) is a hyperlink that will direct a user from one page to another target page when clicked. It is a link on one domain that goes to a different domain. For example, a link on a page from blue.com will direct you to a page on purple.com.

Faceted Navigation

Faceted navigation refers to a technique that allows the user to navigate their way through a page, such as to filter and/or sort results through multiple categories on a webpage. In most cases, they are located on the sidebars of eCommerce websites.

Featured Snippets

A featured snippet is a summary of information selected as the most relevant answer to the search query within the search engine results page (SERP). Featured snippets typically run two to three sentences and appear towards the top of the SERP, beneath the advertisements.

Fetch and Render tool

The Fetch and Render tool enables you to see how Google crawls a page on a website and how it visually renders the page on mobile and desktop devices. This Google feature is located in the crawl section of Google Webmaster Tools.

File Compression

File compression is a data process in which the size of a file is reduced by re-encoding the file data to use fewer bits of storage than the original file. This method is very effective as it allows for easier and faster transmission and will save a substantial amount of space long term.

Follow

In the world of digital technology, a follow represents a user choosing to see online content published by a digital media account. Very similar to subscribing, following is a way to support and observe members online.

Frame

A frame is a structure that allows a webpage to be divided into two or more separate parts. Often a frame is used to keep one part of a webpage static while a different part of the page is loaded or scrolled through.

Gated Content

Gated content is online material that requires a form to be filled out in order for someone to access the content. For example, gaining access to an exclusive e-book only after completing a form with your full name and email address.

Gateway Page

A gateway page is a type of spam that redirects a website visitor to another website by inserting results for particular keywords and keyword phrases. These pages are designed to increase traffic to a website that has little or no relevancy to the original website.

Geographic Modifiers

Geographic modifiers, also known as geo-modifiers, refer to location-specific keywords used to assist and define search queries. For example, 'Gym' is not a geo-modifier, but 'Gym in Manchester' is.

Google Ads (AdWords)

Google Ads (AdWords) is a pay-per-click online advertising platform hosted by Google, that allows advertisers to display and promote their businesses through the search engine results page.

Google Alerts

Google Alerts is a free notification service offered by Google that sends emails to users, notifying them with summarised search activity regarding search terms.

Google Algorithm

The Google algorithm is a complex system used to retrieve data from its search index and provide the best possible results for a given search query. For example, if someone searches 'restaurants near me', the results shown will depend on the location or country of the individual.

Google Analytics

Google Analytics is a free analytics platform that allows a user to track and analyse the performance of a website or application. This is possible because Google provides real-time statistics and analysis of user interaction within the website.

Google Autocomplete

When a user types a letter, word or phrase into the search bar, Google will show a number of search suggestions. The feature is believed to save thousands of seconds every day, mostly on mobile devices. As well as being convenient, SEO professionals can use it as a powerful tool in their keyword research, strategy creation and intent exploration.

Google Bombing

Due to Google’s algorithm ranking pages higher for particular keywords, web pages implementing a Google bombing campaign will create large numbers of links with those terms as the anchor text. Almost the exact opposite of SEO (Search Engine Optimisation), the aim of such Google bombing is to make the web pages rank for off-topic - rather than relevant - keywords.

Google Business Profile

With a Google Business Profile, businesses can manage the information that users find when searching for it online. This information, including address, opening hours and website, can be managed for both Search and Maps, while the tool also allows business owners to read and reply to customer reviews on Google.

Google Caffeine

The Google Caffeine update allowed the search engine to crawl data more efficiently and add sites to its index quicker. This made fresher information available across a wider range of websites, although it had a very minor impact on rankings. Previously, sites’ content would be reindexed every couple of weeks, but Caffeine made this occur in seconds. 

Google Keyword Planner

Google Keyword Planner is a free keyword tool that is part of Google Ads. SEOs use this tool to research keywords, view estimates of the search volumes they receive, and predict how much it will cost to target these words. It also allows users to see how searches change over time, allowing SEOs to identify search trends and target keywords at the right time.

Google My Business

Google My Business, now called Google Business Profile, allows you to list your business on Google so people can find it on Google Search and Maps. To claim ownership of this business profile, you need to set up an account, after which you can customise and manage the profile. This is good for local SEO and will help people discover your business in local results.

Google penalty

If Google thinks your website goes against its Webmaster Guidelines, it can institute a Google penalty. This may be because you have used keyword stuffing, hidden text, or another black-hat SEO technique to try to manipulate Google's rankings. Google can issue a penalty for these actions to demote your website's position in search rankings or remove it from listings altogether (de-indexing). This penalty might be for your whole website or just a specific offending URL.

Google Search Console

Google Search Console is a free Google tool that allows you to monitor and improve how your website appears in search results. Google Search Console can help you check that Google has crawled your site, request that updated content is re-indexed, identify links to your website from other sites, view which search queries show your site, and receive notifications for issues on the website. Google Search Console can be used alongside other Google tools such as Google Ads, Google Analytics, and Google Trends.

Google Trends

Google Trends is a free tool from Google that can be used to track and compare searches for different keywords. With this tool, you can see the popularity of different search terms in different countries with a history going back to 2004. SEOs can use Google Trends to identify search trends and compare keyword popularity over time.

Googlebot

Googlebot is the name of Google's web crawler which crawls the internet finding pages and adding them to the Google index. Googlebot follows links and gathers information about the content on each page in order to create a detailed index from which the most relevant pages can be presented as search results.

Guest blogging

Guest blogging, also sometimes called guest posting, is where a person publishes content on a third-party website that promotes and links back to their own site. This is a useful strategy for SEO because it creates backlinks, indicating to Google's algorithm that your page is a trusted and useful authority and therefore should be ranked more highly.

H1 tag

An H1 tag is an HTML title tag which tells search engines what the title of a specific web page is. Other heading tags, H2, H3, H4, H5 and H6, are used for less important headings on a page. An H1 tag helps Google to understand the structure of a page and helps users to understand what the page is about.

Header tags

Header tags indicate the text on a page that should be formatted as a heading. The main heading takes an H1 tag, marking it as the most important; other headings can take H2, H3, H4, H5, and H6 tags. They are important both for user readability and for SEO because they help clarify to search engines what the page is about.
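
A minimal sketch of a heading hierarchy (the headings themselves are placeholders):

  <h1>A Guide to Coffee</h1>   <!-- the single main heading of the page -->
  <h2>Brewing Methods</h2>     <!-- a major section -->
  <h3>Pour Over</h3>           <!-- a subsection within Brewing Methods -->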

Hreflang

Hreflang is an HTML attribute which specifies the language a page is in and which geographical areas it should be shown to. If you have translated your website content into multiple languages, the hreflang tag tells search engines which language to show to people in different areas. These attributes are important to have if you have two slightly different versions of a page for different audiences - for example a British English version and an American English version. Other than a few spellings, these pages will look almost exactly the same, which can damage your page ranking. Fortunately, using an hreflang tag tells Google that these are not duplicates and indicates which one it should show to different people.
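
For example, a page with British and American English versions might carry hreflang annotations like this sketch (URLs are placeholders):

  <link rel="alternate" hreflang="en-gb" href="https://example.com/uk/page" />
  <link rel="alternate" hreflang="en-us" href="https://example.com/us/page" />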

HTML

HTML, or Hypertext Markup Language, is the standard markup language for creating and structuring web pages. For creating web pages, it is often used alongside CSS and JavaScript.

HTTPS

HTTPS means Hypertext Transfer Protocol Secure, and it is used to establish a secure connection between the user and the site being visited. HTTPS is mainly used by websites that handle monetary transactions or transfer users' personal data, which could be highly sensitive.

Image carousels

An image carousel is an interactive feature added to sites. Presented in a slideshow format, it allows multiple images or pieces of content to be displayed in a single space. For example, a page about 'Holidays in Cuba' could host an image carousel showing different tourist attractions to visit within the country.

Image compression

Image compression is a type of data compression applied to digital images to reduce their size for storage or transmission, lowering costs. The two types of image compression are lossy and lossless. Lossless compression allows the file to maintain the same quality as before it was compressed, whereas lossy compression discards some parts of the image.

Image Sitemap

An image sitemap can help Google find and understand image files on your site. You can add images to an existing sitemap or create a separate sitemap for your images alone.
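
As a minimal sketch, an image sitemap entry follows the sitemap protocol's image extension (all URLs are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
    <url>
      <loc>https://example.com/page</loc>
      <image:image>
        <image:loc>https://example.com/images/photo.jpg</image:loc>
      </image:image>
    </url>
  </urlset>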

Inbound link

An inbound link, also known as a backlink, is a link from another website that points to your own website. The more inbound links direct to your site, the more likely your domain is to rank higher within search engines.

Index Coverage Report

Google's index coverage report tells you the indexing status of all the URLs Google is aware of on your website. It has a summary page that categorises your URLs by error, warning, or valid, and indicates if any of your URLs cannot be found.

Indexability

A page's ability to be analysed by a search engine and added to the index is referred to as its indexability. Indexability is similar to crawlability, which refers to how easily a crawler can access a site for indexing, however these are not the same because sites may be easy to crawl but difficult to index. If search engines cannot index your site, they will not be able to show it in search engine results pages.

Interstitial Ad

An interstitial ad is an interactive full-screen ad that covers the entire face of a host app or site. For example, an appealing, customisable game advertisement that takes over the whole mobile screen but offers users an option to exit or skip the ad.

IP address

IP address stands for Internet Protocol address: a numerical label, such as 174.0.3.1, that is assigned to a particular network interface.

JavaScript

JavaScript is a powerful and flexible programming language. It executes in the web browser, allowing us to make interactive webpages with features such as pop-up menus, animations and form validation.

Javascript SEO

JavaScript SEO is a branch of technical SEO that makes JavaScript-heavy websites easy to crawl and index. This helps such websites be considered search-friendly and gives them the potential to rank higher within search engines.

JSON-LD

JSON-LD is a lightweight linked data format that allows publishers to communicate important information to search engines. It makes it easy to read and write structured data on the web using vocabularies like schema.org.
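
A minimal JSON-LD block using the schema.org vocabulary (the name and URL are placeholders):

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Ltd",
    "url": "https://example.com"
  }
  </script>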

Kanban

Kanban is a scheduling and workflow management system that allows you to optimise production and inventory. Presented visually, the system was created to aid planned and scheduled manufacturing. These systems are often used in supermarkets to help identify stock item locations and which items have sold.

Keyword

A keyword is a specific word or phrase that defines what your content is about. For example, 'silver necklace' typed into a search engine is still considered a keyword, even though it is more than one word.

Keyword Cannibalization

Keyword cannibalization occurs when numerous pages on a site compete for the same or similar keyword. For example, the search term 'electronic scooter' is typed into Google and your domain consists of multiple pages relating to 'scooters', 'electronics' and 'electronic scooters'.

Keyword Density

Keyword density, also known as keyword frequency, refers to the number of times a keyword or phrase appears on a webpage compared to the total number of words on the page, expressed as a percentage or ratio. For example, a keyword used 10 times on a 500-word page has a keyword density of 10/500 = 2%.

Keyword Difficulty

Keyword Difficulty is the process of evaluating how difficult it is to rank for a specific keyword in Google's organic search results. A difficulty percentage is given based on how competitive the keyword is and whether it would be beneficial to use or not.

Keyword Explorer

Keyword Explorer is a time-saving keyword research tool that allows you to find profitable keywords. It helps you to define a certain niche and research beneficial search terms from users.

Keyword Proximity

Keyword proximity refers to how closely two or more keywords are placed together in a text. For example, in 'Looking for a driving school? Driving school with Johnny can help you pass in no time', the keywords 'driving' and 'school' sit directly next to each other.

Keyword Ranking

Keyword ranking refers to a page's position in the search engine results for a specific search query. For example, when a search term relating to the content of your pages is entered into Google, your keyword ranking is whichever spot your URL appears in.

Keyword Stemming

Keyword stemming is the tactic of creating variant forms of a keyword or phrase, often by adding prefixes or suffixes or through pluralisation. For example, additional forms of the word 'honest' are 'dishonest' and 'honesty'.

Keyword Stuffing

Keyword stuffing is the practice of saturating a page with SEO keywords in an attempt to rank higher within search results and gain more attraction to the page.

KPI

KPI stands for Key Performance Indicator, and it is a measure of an organisation's performance and success in achieving specific targets over a period of time. KPIs help organisations to make more informed, data-driven decisions based on quantifiable measurements of performance. All departments in a company can have their own KPIs in accordance with their organisational goals.

Lazy loading

Lazy loading is the process of delaying the loading of non-critical resources on a webpage until required in order to speed up load times. In the case of an image, lazy loading will utilise a placeholder until it is needed. This allows webpages to load more quickly and efficiently. 
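
In modern browsers, native lazy loading can be as simple as this sketch (the file name is a placeholder):

  <!-- The browser defers fetching this image until it is about to enter the viewport -->
  <img src="photo.jpg" loading="lazy" width="640" height="360" alt="A lazily loaded photo">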

Link Bait

The goal of link bait is to generate backlinks for a web page through informative, engaging content. To be effective, link bait needs to be topical, unique and visually attractive: the aim is to grab and hold the user's attention by presenting your information well, creating content that is worthy of sharing. This increases the likelihood that the web page will be linked to by other blogs, for example, thus creating backlinks. If the content is not presented well or is difficult to read, the web page is unlikely to generate backlinks.

Link Building

Link building refers to the practice of gaining links from other websites to your own, otherwise known as backlinks. Link building is an excellent way of ranking higher on search engines, as the more links a page has from other high-quality websites, the better. It will inform search engines that this page is of importance and significance, and as a result, it will rank more highly.

Link Equity

Link equity refers to the concept that some links carry more value and authority than others, and that they can pass this authority on through inbound links, ultimately helping a page rank higher on a search engine. When a page receives backlinks from a page that the search engine has deemed high-quality, link equity is passed on. Link equity can be passed by both internal and external links, although external links tend to provide more equity in the long term. It is also important to remember that search engines do not reward quantity, but quality: they reward pages with fewer, higher-quality links over pages flooded with links that carry little to no link equity.

Link Exchange

A link exchange involves websites agreeing a deal whereby they exchange links to each other's websites, helping to boost each other's organic traffic and rankings on Google. Link exchanges can also involve more than two websites. The main point of a link exchange is to drive more organic traffic to a web page, so it is important that exchanged links are relevant: this increases the authority of your website, helping it to rank higher.

Link Explorer

Link Explorer is a tool that analyses the pages that link to your website for Page Authority, linking domains, external links, and other information. It can show you the anchor text, the link target page, and the spam score for all these links.

Link Farm

A link farm is a group of websites set up with the sole intention of linking to each other, regardless of their relevance. This SEO technique is frowned upon, as inbound links are a key measure that search engines use to boost a web page's rankings. Link farms are therefore used to manipulate rankings by using unrelated links to inflate a page's number of inbound links.

Link Juice

Link juice is a colloquial term used commonly in SEO to describe the value that is passed on from one link to another. This is also known as link equity. A web page that is deemed to be of high quality will pass on its link juice to another through backlinks. In turn, the page receiving the backlink will rank higher, as it is being linked to by a high-value web page.

Link Popularity

Search engines, such as Google, are able to determine the quality of a web page by looking at the amount of high-quality inbound links. Link Popularity is an important measurement built into search engine algorithms that can help pages rank higher. The better a web page's link popularity, the more likely it is to rank higher. 

Link Profile

A website can have a multitude of inbound links from different web pages. A link profile is the entire makeup of all these different inbound links.

Link Reclamation

Link reclamation is the process of finding any broken or lost links and fixing or replacing them altogether. Link reclamation helps ensure that you are not losing out on any link equity, and that your rankings are not being hurt. It's important to note that this includes both internal links and links from other websites. 

Link Rot

Link rot is the colloquial term given to a broken link. Link rot can occur when a web page is changed, moved, or removed from the internet entirely. Often, clicking on a link that has decayed or undergone link rot will lead the user to a 404 error page. Link rot happens all over the internet, and it occurs fairly often as pages are altered. 

Local Pack

When a user types a search query with local intent, Google shows a local pack. This is a SERP feature that displays a map of the local area and relevant business listings. If a local pack is shown on a search results page, it's almost always featured in the number one spot. Therefore, it's advantageous to optimise your website to appear in this local pack.

Side note: Google recently updated local packs so they can show a paid result above the organic local search results.

Local SEO

Local SEO helps your website appear higher in the local rankings on search engines like Google. Nearly half of searches are made with local intent, so local SEO is an important strategy. To optimise for local searches, you might claim and manage your Google Business profile and do local link building. 

Long-tail Keyword

Longer and more specific than short-tail keywords, long-tail keywords may be made up of 3-5 words and are a good indication that a person is ready to make a purchase. For example, somebody who searches 'laptop' is probably less likely to be ready to choose and buy a laptop than the person who searches 'Samsung Galaxy Chromebook'. Because these keywords have lower search volumes, your business is more likely to be able to rank highly for them. This means that long-tail keywords are useful to optimise for. 

Manual Action

A Manual Action is a penalty which Google places on your website if your content does not follow its quality guidelines. A Manual Action can make the pages of your site rank lower on Google or be completely omitted from SERPs.

For a Manual Action to be placed on your website, a human reviewer will make a judgement on the integrity of the content on your site based on many different factors. From a content perspective, the main issues for which a Manual Action will have been taken on your site are: 'pure spam', 'thin content' or 'structured data issues'.

'Pure spam' is when a website has resorted to underhand tactics to manipulate the Google quality guidelines; this is sometimes called 'Black Hat SEO'.

'Thin content' is self-explanatory: content that has little-to-no value and negatively impacts the user experience (UX).

'Structured data issues' arise when the content on a website does not marry up with the structured data markup of the website - for example, marking up irrelevant or misleading content.

If a Manual Action is placed on a website, the owner will be alerted via their Google Search Console access and via email. Impacted users will need to take steps to solve the issues raised by the Manual Action and request a review to have the penalties on the website lifted.

Manual Penalty

A Manual Penalty is the specific penalty relating to a piece of content which violates Google's quality and community guidelines when a Manual Action is enforced on a website. A manual penalty is issued by a human reviewer rather than by automation. There are many types of manual penalties including, but not limited to, dangerous content, harassing content, misleading content and hateful content.
 

Meta Description

A meta description is a brief summary of a webpage which is displayed in SERPs.

The meta description should be 100-160 characters long and include the focus keyword of your webpage. For example, if you had a product page for 'electric scooters', it would not be useful for your meta description to state that this page displayed apples.
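
Continuing the electric scooter example, the tag itself might look like this sketch (the copy is illustrative):

  <meta name="description" content="Browse our range of electric scooters, with free next-day delivery and a two-year warranty.">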

Meta Keywords

A meta keyword was an HTML element used to describe the contents of a webpage. This practice is now unsupported by most search engines and largely outdated. Meta keywords were supposed to function in a similar manner to modern meta descriptions.

For example, on a product page displaying 'electric scooters' you would include the meta tag <meta name="keywords" content="electric scooter">.
The practice of using meta keywords was dropped by most search engines because it became a target for keyword stuffing and other Black-Hat SEO practices.

Meta Redirect

A meta redirect, also known as an HTML redirect, is a way to use the meta refresh tag to redirect a user to a different webpage without the use of a 301 redirect. A meta redirect commands the web browser to change the page, whereas a 301 redirect commands the server.
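
Placed in a page's <head>, a meta refresh tag might look like this sketch (the URL is a placeholder):

  <!-- After 5 seconds, the browser loads the new URL -->
  <meta http-equiv="refresh" content="5; url=https://example.com/new-page">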

There are a few reasons why 301 redirects are preferable to a meta redirect. Firstly, Google prefers 301s. Secondly, a meta redirect will not transfer the rankings from one page to another like a 301 will. Thirdly, a meta redirect is not considered best practice for a website as it can create a confusing user experience (UX). Meta redirects are often used by spammers, as it takes longer for search engines to pick up on them.

Meta Tag

A meta tag is an umbrella term for the metadata regarding a webpage. 'Meta', in the most simple terms, means how something defines itself. If a book is self-referential, for example, we might call it meta. Meta tags function in the same way: they are the means by which a website defines itself to a crawler. There are many types of meta tags, including titles, descriptions, robots tags, keywords and so on.

Meta tags tell search engines key information regarding your webpage. Search engines scan the HTML of a webpage for meta tags to discern what a webpage is about. Using meta tags within the HTML of a webpage, you can define what search engine crawlers set the title of the page as, the summary of the page, the key ideas on the page and even command it to index the page or not. 

Meta Title

A meta title is the meta tag which refers to the title of a webpage. Meta titles should be between 59-70 characters in length and include the focus keyword of the webpage. Meta titles are the linked titles which come up on SERPs.

You'll see that this page, for example, has the lovely meta title:

Meta Title Definition - Viaduct Generation

This tells the user and search engine bots all of the information needed to discern what the webpage is about. 

Mobile-first Indexing

Mobile-first indexing is how Google preferentially indexes mobile-friendly versions of webpages in SERPs.

Think of mobile-first indexing like a pineapple upside-down cake: Google brings the good stuff to the top. Because most users start their searches on a mobile device, Google tends to use the mobile-friendly version of a webpage to rank and index a page. Historically, Google would use desktop versions of webpages, but this shifted in 2018 when it became clear that mobile search outweighed desktop. 

Nofollow

Nofollow tags tell search engines to ignore a particular link so that it does not impact their ranking. This is done using the rel="nofollow" attribute. When this is added, it tells the crawler not to follow the link and to disregard it. It does not pass on any PageRank, and as a result, the linked page's search engine rankings are not impacted.
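
Applied to a single link, it looks like this sketch (the URL is a placeholder):

  <a href="https://example.com/some-page" rel="nofollow">An unendorsed link</a>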

NoIndex Tag

NoIndex tags tell crawlers not to index a particular page. This may be because the page is still under development, is private, or shouldn't be indexed because it contains a large database or is the mobile-friendly version of another page. 
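
A minimal sketch of the tag, placed in the page's <head>:

  <!-- Crawlers may visit this page but should not add it to the index -->
  <meta name="robots" content="noindex">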

Off-page SEO

Off-page SEO is a key part of an SEO strategy (alongside on-page SEO and technical SEO) and refers to all the aspects of SEO that take place outside your website. The main aspect of this is link building (getting authoritative sources to link to your website), but it also includes tactics such as content marketing, PR, social media, and brand building. These tactics can increase your pages' relevance, authority, and trust in the eyes of search engines like Google.

On-page SEO

On-page SEO refers to optimising the content of a web page to improve user experience, help search engines understand your website, and rank more highly in search engine results pages. It focuses on optimising content, headings, and metadata around the keywords you want your page to rank highly for. Alongside off-page SEO and technical SEO, it is one of the most important aspects of SEO. On-page SEO is sometimes also referred to as on-site SEO.

Organic Traffic

Organic traffic, also known as free traffic, refers to the visitors who arrive at your website from the organic listings on a search engine results page. The users who click on a search ad (paid listing) to come to your website are referred to as paid traffic. A successful SEO strategy can increase the organic traffic to your website. 

Outbound Link

Outbound links are links on your website that point to other websites. These are also sometimes referred to as external links. Outbound links in your content are useful because they signal your authority and research to your readers, allowing them to follow the links to find your sources and additional information. Relevant outbound links are also useful for SEO because they help search engines better understand the content of your site by adding topic signals. However, irrelevant outbound links can be detrimental, so we recommend avoiding them. 

PA (Page Authority)

PA (Page Authority) is a score out of 100 that indicates where a page is likely to rank on the search engine results page (SERP). The higher the score, the easier it is for a page to rank highly. It is made up of three parts: how recently the page was updated, how trustworthy the page is (including high quality content), and how relevant the links on the page are. To improve page authority, you can update pages more frequently and build quality links.

Side note: page authority is different to domain authority, which measures the authority of the entire domain or subdomain.

Page Content

Page content refers to all the information on a page, including text, images, audio, video, and links.

Page Speed

Page speed refers to how quickly a single page loads. It is different to site speed, which means the average loading speed of a sample of the pages on your website. Page speed is impacted by many factors including the page filesize, how compressed the images are, and the site's server. It's important for SEO because page speed is a key ranking factor. 

Paid Listing

Paid listings, also known as search ads, PPC ads, or sponsored results, are the advertisements that appear on the search engine results page. To display these paid listings, people and businesses bid on keywords. The adverts with the highest bids are shown when somebody searches for those keywords. They tend to appear above or to the right-hand side of the organic results.

People Also Ask

People Also Ask (PAA) provides other searches that are related to your search query. It's a rich snippet that appears on the search engine results page. 43% of search results pages now feature a PAA and answers are given beneath each query with their web page source. PAA boxes can appear in any position on the results page, not just in the first or second rankings. 

Qualified Lead

A qualified lead is a lead who has already indicated that they are interested in a brand and has therefore been identified as somebody who is likely to become a customer. By identifying these qualified leads, the marketing and sales team can focus their efforts on the people most likely to convert into customers. The criteria that define a qualified lead may differ between organisations but can often include actions such as visiting certain pages, interacting with the brand's social media posts, filling in forms, or clicking on calls to action.

A qualified lead may also be referred to as a Marketing Qualified Lead (MQL), which means that an organisation's marketing team believes they are likely to become a customer, or a Sales Qualified Lead (SQL), which indicates that the sales team agree that they will convert into a customer.

Qualified Traffic

Qualified traffic means the traffic from your website visitors who are already interested in your brand, products, or services and are likely to become customers. For traffic to be qualified, the content of your page needs to be relevant to what the searchers are looking for, making them more likely to convert to customers. 

Query

Search queries, also known as web queries, are the strings of words that people type into search engines when they want to find something. There are three categories of search queries: navigational (queries where people are searching for a specific website), informational (where people are looking for information), and transactional (when people are ready to buy something). 

RankBrain

First introduced in 2015, RankBrain is a machine learning algorithm which helps Google better understand search queries and intent so it can show searchers the most relevant results. RankBrain is Google's third most important ranking signal. Instead of trying to match the exact keywords in your search query with websites, RankBrain instead tries to understand what searchers are looking for and show them the results for that. It also measures user satisfaction from the results it shows, learning and tweaking its algorithm to perform better. 

Ranking

Ranking means the position of a page or website on the search engine results page (SERP). The higher the ranking the better because the top results receive the largest volumes of traffic. The exact ranking factors and weightings that search engines like Google use to determine rankings are kept secret, but some of these factors include the number of backlinks, page loading speed, sitemap and internal linking, relevance of keywords and content, click through rate (CTR), bounce rate, and more. 

Ranking Factor

Ranking factors are factors that search engines use to decide how to rank search results. Google uses over 200 ranking factors and nobody outside Google knows the exact list. However, we do know that factors include backlinks, site speed, mobile-friendliness, site security, internal links, and keywords in content and title and header tags. Search engines use these factors and multiple algorithms to determine rankings, weighting different factors in relation to the type of search query to rank the most relevant content highest. 

Ranking Opportunities

Ranking opportunities are opportunities for websites to appear in search results for certain keywords. Your website may already rank moderately on Google for some keywords so these represent opportunities to increase your ranking to reach higher results. Search engine results page rankings are determined by many different ranking factors: Google uses more than 200 factors to decide where to rank your pages.

Reciprocal link

Reciprocal links are where two sites will link to each other, perhaps because they offer similar industry expertise or complementary products or information. While a few reciprocal links can be beneficial to website visitors and SEO because they act as backlinks, Google warns against excessive reciprocal linking. Google says that 'excessive link exchanges' that try to manipulate PageRank violate Webmaster Guidelines. Therefore, it's a good idea to limit reciprocal links to those that will be beneficial to readers and where the sites have real relationships. 

Reconsideration Request

If Google has identified problems in a manual action or security issues notification, you can send a reconsideration request to ask Google to review your site and reinstate it in search results and the Google index. A reconsideration request is a message sent to Google explaining that you have fixed the issues with the website and reassuring them that the issues won't happen again in the future.

Redirection

A redirection is a way of sending users from an old URL to a new one. This may be because the site owner has deleted the page or has temporarily moved it to a new location. An HTTP redirect status code tells search engines to display a new page instead of an old one; users will not even notice that they have been redirected. For permanent redirection, you may use a 301 redirect, while 302 redirects are temporary redirections.

Referral Traffic

Referral traffic refers to the visitors that come to your site from other websites and social media platforms, instead of from a search engine like Google. Referral traffic is good for your business and your SEO because it brings new and relevant visitors to your website, and the backlinks tell search engines that your site is a source of quality and trustworthy content.

The other types of traffic your website might see include direct traffic (where users type in the URL of the website or access the site another way without coming through another website or search engine), organic traffic (where visitors come via a search engine's unpaid results), and paid search traffic (where visitors come from PPC ads).

Regional Keywords

Regional keywords are the keywords that SEOs use when they want to target a small and specific group of people. These keywords will be specific to the local area - for example, they might be the colloquial name for a neighbourhood or area, or a type of slang used in that location. For example, if you're trying to target specific American audiences for your sandwich shop, it's important to research their regional slang terms for sandwich such as sub, hoagie, hero, grinder, or even spuckie!

Because regional keywords tend to have far lower search volumes, it can be hard for marketers to find the right keywords. However, if you get it right, regional keywords can do wonders for your hyperlocal SEO.

Rel-canonical

Using a rel-canonical tag tells search engines that this is the master copy of a page and the version that you want to appear in search engine results pages. This avoids the problems that can arise with duplicate content appearing on multiple URLs. This is also referred to as canonicalisation, and it's important because it helps you control the duplicate content that appears on the site and could potentially damage your rankings.

Even if you haven't consciously duplicated content on your website, a rel-canonical tag can be useful because multiple URLs can point to the same page, causing search engines to see this as duplicate content. For example, people might link to your homepage in different ways, so it's useful to canonicalise the page to avoid issues.
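
Placed in the <head> of each variant, the tag points at the master copy (the URL is a placeholder):

  <link rel="canonical" href="https://example.com/page/" />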

Related Searches

At the bottom of Google's search results page is a set of related searches (typically eight). These are suggestions for other searches created by Google's algorithm. As well as helping searchers find out more, they're useful for SEOs who are looking for additional content ideas to cover in blog posts. These related searches help you understand users' search intent and therefore create pages that are relevant and useful to them.

Rich Snippet

Rich snippets are search engine results that display additional information previewing the website, such as thumbnails, recipe details, or star ratings, depending on the type of web page. Some also come with carousels. These features make results more attractive to users, so they tend to have a higher click-through rate.

Robots.txt

Robots.txt, also known as the robots exclusion standard or robots exclusion protocol, is the standard used to give instructions to bots and tell them which pages of a website they can and cannot crawl. The robots.txt file can be viewed by adding /robots.txt to the end of the homepage URL.
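
Python's standard library ships a robots.txt parser, so a minimal, illustrative check of what a crawler may fetch might look like this (the site and paths are hypothetical):

```python
# A minimal sketch using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # illustrative site
parser.read()

# Ask whether a given bot may crawl a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/blog/"))
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))
```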

Schema Markup

Schema markup, also called structured data, is code added to your website HTML which is used to tell search engines what your content is about. This is helpful for SEO because it means that search engines can display your page for more relevant searches. This machine-readable description of your page can also generate a rich snippet in the search engine results page (SERP). Schema markup is commonly used to add information about people, places, organisations, and events.
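
Schema markup is often added as JSON-LD inside a script tag. As a minimal sketch (the event details are invented for illustration), you could generate such a snippet like this:

```python
# A minimal sketch that builds a JSON-LD structured-data snippet for an
# event, using the schema.org vocabulary. All details are invented.
import json

event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example SEO Conference",
    "startDate": "2025-06-01T09:00",
    "location": {"@type": "Place", "name": "Example Conference Hall"},
}

# The resulting <script> tag would be placed in the page's HTML.
print(f'<script type="application/ld+json">{json.dumps(event)}</script>')
```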

Search Engine Results Pages (SERPs)

The Search Engine Results Page (SERP) is the page that is shown to search engine users after they input a search query. It lists organic search results, search ads, and SERP features such as featured snippets. The organic results on this page are ranked by how relevant the search engine's algorithm considers them to be.

Search Intent

Search intent, also known as audience intent or user intent, is the reason a user has typed a search query into a search engine. There are three types of search intent: navigational (for example, searching for 'Twitter'), informational (for example, 'how do peanuts grow?'), and transactional (for example, 'buy Apple Macbook Pro'). SEOs need to keep search intent in mind when creating website content to ensure that it is relevant and useful to searchers.

Search Visibility

Search visibility, also known as search engine visibility, describes how visible a website's pages are in the search engine results pages. It is typically expressed as a percentage: an estimate of the share of searchers who will see the page after entering a relevant search query. The higher the ranking of a page, the higher its search visibility will be.

Search Volume

Search volume refers to the number of searches for a specific query in a search engine such as Google, usually measured as a monthly average. You can use Google Keyword Planner to estimate the search volumes for different terms. This can help you optimise page content to target keywords with a high search volume.

Seasonal Trends

Search traffic and website traffic are often impacted by recurring events and seasons such as Christmas or the summer. For example, an e-commerce business that sells swimwear may see an increase in searches in the spring and summer, while a toy retailer is likely to want to rank highly in Christmas-related searches. These seasonal trends repeat annually and are an opportunity to target seasonal markets and visitors.

Secondary Keywords

Secondary keywords add context or additional information to the primary search terms. For example, if the primary keyword is 'phone charger', the secondary keywords might be 'braided' or 'portable'. These keywords are unlikely to help you rank higher on the search engine results page, but they help show search engines that your content is relevant to search intent.

SEO Audit

An SEO audit is a full analysis of a site's performance and ability to rank on search engine results pages (SERPs). It will indicate the areas that can be improved to boost SEO, such as issues with technical SEO, website structure, user experience problems, or even content gaps. Performing an SEO audit is usually the first step in creating or updating an SEO strategy. Audits should be carried out regularly in order to quickly resolve issues and achieve SEO success.

SERP Features

When a user enters a search query into Google, the vast majority of results pages include SERP features. These are any elements on the page that are not the organic search results, also known as the 10 blue links. SERP features include featured snippets, local packs, shopping results, knowledge panels, top stories, top ads, and instant answers. These are important for SEO because they have a high click-through rate; for example, around 8% of clicks go to a featured snippet.

Sitemap (HTML)

Sitemaps in HTML are lists of all the important pages of your website which are made to help users navigate your site. Intended to improve user experience, they are different to XML sitemaps, which are designed specifically for search engine crawlers to help them index your pages. While an XML sitemap is necessary for SEO, HTML sitemaps are less important but can still be useful.

Sitemap.xml

An XML (Extensible Markup Language) sitemap is a list of all the important pages on a website. This is useful for telling search engines about the structure of your website and allowing them to crawl and index it more easily. Crawlers use internal links to find their way around a website, but using an XML sitemap can speed up the process of finding pages for them. Webmasters can also include additional metadata about the URLs listed in the sitemap such as when they were last updated.
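
As a rough sketch of the format (the URLs and dates are invented for illustration), a bare-bones XML sitemap can be generated with Python's standard library:

```python
# A minimal sketch that generates a bare-bones XML sitemap; all URLs and
# dates are illustrative.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

for loc, lastmod in [
    ("https://example.com/", "2025-01-15"),
    ("https://example.com/about/", "2025-01-10"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # optional metadata for crawlers

print(ET.tostring(urlset, encoding="unicode"))
```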

Thin Content

Web pages that have little useful content for visitors are deemed thin content by Google and may receive a penalty. For a page to be considered thin content, it might have very little content, duplicate content, or content that is not relevant or useful. This might also refer to doorway or affiliate pages. Thin content increases your website's risk of a high bounce rate and a poor user experience. It also makes it unlikely that your site will receive backlinks.

Thumbnails

Thumbnails are previews of an image that have been downsized. The word has historically been used to refer to very small drawings or images, but in the context of the internet it means a smaller version of an original image that acts as a preview.

Time On Page

Time on page is a metric that measures the average amount of time that users spend on a page before moving on to the next page on your website. This is different to session duration time, which counts the average amount of time that users spend on the site. One key thing to remember with time on page is that it doesn't count people who landed on your site and then clicked back without looking at any of your other pages. Likewise, if a person visits many pages on your site before leaving, the last page they visited before leaving will not be counted in time on page.

Transactional Queries

Transactional queries are search queries that indicate that the searcher is ready to make a transaction or purchase. Unlike informational search queries (where the searcher needs information) and navigational search queries (where they're trying to go to a specific website or page), transactional search queries represent customers at the end of the conversion funnel. Sometimes these search queries have terms such as 'buy' or 'order' in them, and often they use the exact brand and product names of the items that the searcher wants to purchase.

TrustRank

TrustRank is one of the algorithms that helps determine the ranking of pages in the search engine results page. It analyses links to distinguish spam from quality pages using 'trust signals' which help determine whether a website is a trustworthy source.

Side note: TrustRank is the name of algorithms from both Google and Yahoo; although Yahoo patented it first, Google's TrustRank algorithm is the better known of the two.

URL

A URL (Uniform Resource Locator) is the address of a web page. It includes the domain name as well as the address of the specific page on the website. While URLs aren't always easy to read, it's an SEO best practice to create URLs that have clear words that describe the page.

URL Slug

A URL slug is the part of a URL that uses keywords that humans can easily read. Slugs can be generated automatically using the title of the page, or they can be created manually. They make it easier for people to identify the page they are looking for and are also good for search engine optimisation.
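
Automatic slug generation usually lower-cases the title, strips punctuation, and joins words with hyphens. A minimal, illustrative Python sketch:

```python
# A minimal sketch of automatic slug generation from a page title.
import re

def slugify(title: str) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # strip punctuation
    return re.sub(r"[\s-]+", "-", slug).strip("-")  # collapse spaces into hyphens

print(slugify("10 Tips for Better On-Page SEO!"))  # 10-tips-for-better-on-page-seo
```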

User Experience

In SEO, user experience, often shortened to UX, refers to the overall experience of using a website. It covers how difficult or easy the site is to use, the efficiency of the site, and its structure, navigation, and interface. User experience is about providing the best experience to your website visitors.

UTM Code

A UTM code (Urchin Tracking Module) is a snippet of text added to the end of a URL. It is a set of parameters that allows you to track user engagement with your website. To track a particular campaign or ad, you add the UTM parameters to the end of your URL so Google Analytics can track information such as the ROI of different campaigns. They allow you to define five parameters which help you learn more about what is bringing visitors to the site: campaign, source, medium, content, and term.
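
For example (a minimal sketch using Python's standard library; the URL and campaign values are invented), UTM parameters are simply appended to the landing-page URL as a query string:

```python
# A minimal sketch that appends the five UTM parameters to a landing-page
# URL; all values are illustrative.
from urllib.parse import urlencode

base_url = "https://example.com/landing-page"
utm = {
    "utm_campaign": "spring_sale",
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_content": "header_banner",
    "utm_term": "phone charger",  # often used for paid-search keywords
}

print(f"{base_url}?{urlencode(utm)}")
```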

Why 'urchin'?

Urchin was web analytics software designed to analyse website traffic. Google bought the company behind it and the software became Google Analytics, but the name stuck.

Voice Search

Voice-enabled search or voice search uses spoken search queries instead of typing them in the browser. Speech recognition technology is used to interpret a user's query and provide search results. Optimising for voice search is important for local SEO because many local searches are made this way.

Webmaster Guidelines

Webmaster guidelines are the regulations set out by search engines for websites. They are a set of best practices that help you build your site so that it has better indexability and crawlability and appears in Google search. They provide guidance on the design, technical setup, and quality of the website.

Website Structure

Website structure is the way in which the individual pages of a site link to one another. A sound website structure is key: without it, crawlers will not be able to find (and thus index) subpages. Linking the most important pages to the homepage is crucial, as crawlers need to be able to find subpages with ease.

Webspam

Webspam is a Black Hat SEO tactic that seeks to manipulate search engine rankings using techniques such as keyword stuffing, thin content, cloaking, and excessive links. These tactics directly violate Google’s webmaster guidelines because they negatively impact the user experience. In fact, Google has a team dedicated to finding these spammy pages and imposing manual penalties.

White Hat SEO

In old Western films, the hero would wear a white hat whereas the villain would wear a black hat. In the same vein, White Hat SEO means only using ethical practices and following the search engine guidelines. Rather than adopting more spammy approaches, for instance, duplicating material that is exclusively geared for search engines, White Hat SEO strategies focus on giving readers high-quality, relevant information that maximises user experience.

X-Robots-Tag

A robots meta tag is an element of HTML code that is used to control how search engines crawl and index URLs. X-robots-tags are an alternative to robots meta tags: instead of sitting in the page's HTML, an x-robots-tag is sent in the HTTP response headers, which means it can control non-HTML content such as a Microsoft Word document or a PDF.
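
Because the x-robots-tag lives in the HTTP headers, you can inspect it without downloading the file at all. A minimal sketch (assuming Python with the third-party requests library; the PDF URL is hypothetical):

```python
# A minimal sketch that checks a non-HTML resource (here a PDF) for an
# X-Robots-Tag header. A HEAD request fetches headers only.
import requests

response = requests.head("https://example.com/whitepaper.pdf")

# A value such as "noindex, nofollow" tells crawlers not to index the file.
print(response.headers.get("X-Robots-Tag", "no X-Robots-Tag header set"))
```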

YMYL Pages

Your Money or Your Life (YMYL) pages contain content that, if presented in an untruthful or deceptive way, could negatively impact a user's safety, mental wellbeing, or financial stability. These pages often share medical or financial advice, topics that anxious internet users regularly seek out. If found to be spreading harmful misinformation, these pages will be heavily penalised by Google.

Zero Click Searches

Zero click searches occur when the top search results answer a user's query without them having to click onto a third-party website. For instance, the answer to their query may be provided in a brief snippet at the top of the page, meaning the user never needs to leave the search engine results to receive the information they need.
