Foreword: To help teach some of our content writing team about search engine optimisation, I asked them to create the biggest SEO glossary on the internet. They did that, and this is their story…
Some of the terms in this document don’t relate directly to SEO, but are most definitely things that SEOs will encounter as they work with clients and other departments if they work in an agency.
Nobody is perfect, which is why our SEO team are working alongside our Content team to fix any discrepancies or incorrect definitions that may appear.
P.S. If you think there are some definitions that are missing, please contact us and we will do our best.
10 Blue links – This phrase refers to the traditional way in which search engine results were displayed. Once a query had been searched, the search engine would bring up 10 blue links as the results. This method is extremely basic but ultimately laid the groundwork for the way search engine results are presented today.
10 blue links, as a phrase, is generally used today to refer to outdated search engine results pages and a basic layout. SERPs have been upgraded and improved upon a lot since the 10 blue links days, with Google offering the best example of that. Google now offers a myriad of results when you conduct a search, expanding upon a simple list of relevant websites. Typically, a Google search will include elements such as relevant shopping options, a Google Maps result, a Google business page and even image results.
3 Pack – If you’re a business or organisation that relies on local customers to buy your products and services, then you’ll want to know about the 3 Pack. The 3 Pack refers to a type of SEO that focuses on driving local, nearby traffic to your business. It is the listing of three businesses you see in the search results when you search for a keyword that is locally relevant, such as “near me” or “near [location]”.
These searches appear with a map above them that highlights where the businesses in the 3 Pack are. Google interprets your search query and offers up three Google My Business listings that may be most suitable, based on what it is you’re looking for.
For example, if I lived in Manchester and wanted some sushi, I would Google “sushi in Manchester”. In seconds, the search engine brings up a 3 Pack presenting three different sushi restaurants in Manchester that I might be interested in.
301 redirect – Sends users to a different URL to the one they clicked on. Different to a 302 redirect, which is temporary, a 301 redirect is a permanent change. The term ‘301 redirect’ is taken from the HTTP status code for this action.
Commonly, 301 redirects are used when a company has a new website under a different domain name and needs to ensure users can find it whilst being unable to access the old URL. Once a 301 redirect has been placed on a URL, that webpage is no longer accessible, as it will automatically send users to the new page.
Often, when a URL has built up a high value in terms of its backlinks and Google rankings, the owner won’t want to lose that quality by simply removing the page. Instead, a 301 redirect can transfer the value of the original URL to the new URL to which users are being directed.
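To see the mechanics in action, here is a minimal sketch using Python’s standard library: a throwaway local server where a hypothetical /old-page responds with a 301 pointing at /new-page, and the client follows the redirect automatically. The paths and message are illustrative only, not a real site.

```python
import http.server
import threading
import urllib.request

# Hypothetical handler: /old-page issues a permanent (301) redirect to /new-page.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)               # 301 = "Moved Permanently"
            self.send_header("Location", "/new-page")
            self.end_headers()
        elif self.path == "/new-page":
            body = b"Welcome to the new page"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Start the server on an ephemeral port in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 301 automatically, so we land on /new-page.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page") as resp:
    final_url = resp.geturl()
    content = resp.read().decode()

server.shutdown()
print(final_url)   # the URL we actually ended up on
print(content)
```

This is exactly what a browser (or Googlebot) does when it hits a 301: it notes that the move is permanent and carries on to the new location.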
302 Redirects – A 302 Redirect tells search engines that a page, or an entire website, has been moved somewhere else temporarily. This type of redirect is ideal if you want to briefly direct people to a temporary page that they can use, be it to get contact details, business locations, or to purchase products and services, while you work on building a new site or updating the current one.
Crucially, you should only use a 302 Redirect if you fully intend to restore your original website. Another handy use for a 302 is if you want to test a new page and glean customer feedback, without impacting the ranking and general SEO value of the existing page.
The difference between a 302 Redirect and a 301 Redirect is that the latter is the permanent option. You’d only use a 301 if you were closing your website or web page permanently, or taking it down for an extended period, say 12 months or more.
404 error – The error code received when the link you’ve clicked doesn’t exist. Broken links can occur when the webpage no longer exists or it’s been moved to another URL. This can happen if a 301 redirect hasn’t been applied to the old URL or the redirect hasn’t been applied properly.
404 errors are quite common, as sites are moved all the time without the owners of pages linking to them ever being notified. When a user attempts to view the webpage via the broken link, the server will return a 404 error notifying the user that the page no longer exists. Website owners can create custom 404 error pages that tell users what they should do once they receive the message. Just like the 301 redirect, the 404 error got its name from the HTTP status code.
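As an illustration, on an Apache server a custom 404 page can be wired up with a single line in the site’s .htaccess file (the file name here is just an example):

```apacheconf
# Serve our custom error page whenever a request returns a 404
ErrorDocument 404 /custom-404.html
```

Other servers have equivalent settings, but the idea is the same: instead of a bare error, the visitor sees a helpful page that keeps them on your site.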
AEO – AEO stands for Answer Engine Optimisation and is a form of SEO that has gained greater popularity in recent years thanks to the rise in voice searches, and devices such as Alexa, Google Home, and HomePod by Apple. As more and more people use voice-assisted devices, the need for industries and sectors to adapt their marketing and SEO to accommodate it has grown. AEO focuses far more on one singular answer. This is because you’re not viewing a screen; you’re listening to the answer, so there can only be one response, not a list of six or seven.
AEO isn’t going to replace SEO; billions of people are still going to search for things the old-fashioned way (if you can call it that), but the prevalence of AEO is certainly going to increase. It may even match or surpass the number of searches made by typing out queries, especially as voice technology and AI get more sophisticated.
AI – Artificial intelligence is intelligence displayed by machines, as opposed to the natural intelligence that humans and animals demonstrate. AI is a form of intelligence that doesn’t involve emotions or consciousness. The term can also refer to any machine or piece of technology that displays particular problem-solving traits and has been shown to learn as it is fed new information. The goal of AI is to allow machines to receive information and make rational decisions based on the data, rather than what we have now, where machines are just facilitators for our decisions and play no part in the process other than storing and displaying the information we’ve created. Machine learning is an associated term, and refers to the idea of computers learning and adapting to new data on their own.
Agile Content Development – Agile Content Development (ACD) is a methodology that looks to continuously improve and optimise content. Rather than just writing content based on data, publishing it, and seeing how the chips fall, ACD aims to tweak and change content based on requirements and search behaviour.
By continuously improving it, the content has a far greater chance of ranking higher, for longer, because it is being tweaked and kept current. ACD is a customer-centric methodology and must meet demands, queries, and intentions at different times.
Agile Content Development is split into four phases: Discovery, Briefing, Optimisation, and Measurement. By adopting this method, copywriters can enjoy real-time recommendations on keywords and topics that inform their content creations and ensure it is always optimised. ACD removes the guesswork and replaces it with knowledge.
ACD should be something all copywriters and website owners do to avoid work becoming stale, outdated, and ranking for keywords that are no longer relevant or getting the search traffic they once had.
Ahrefs – Ahrefs is a tool used by marketing agencies and businesses for thorough SEO analysis and to monitor backlinks. Ahrefs is made up of a range of different tools that can help people looking to rank for keywords, and monitor the performance of pages that have already been indexed by search engines.
Split into six parts, Ahrefs is one of the most comprehensive SEO analysis tools out there. It is divided up as follows:
- Site Explorer – Helps to analyse backlinks and profile competitor sites.
- Content Explorer – Discover the most popular content in your industry so that you can emulate it, and beat it.
- Keywords Explorer – Find industry-relevant keywords to target, and base your content on.
- Rank Tracker – Track your keyword rankings and create reports.
- Site Audit – Analyse your website and discover SEO issues that need fixing.
- Alerts – Be the first to hear of new backlinks, mentions, and updated keyword rankings for your site.
Alexa Rank – Alexa Rank is a global ranking system that lists millions and millions of websites in order of popularity. The way this system works is that the lower the ranking, the better. Amazon calculates this ranking by examining the average daily unique visitors and the number of page views over the most recent three-month period. Alexa Rank should be thought of in the same way as Google Analytics and is Amazon’s attempt to compete in this market. Ironically, the website that has the best Alexa Ranking – 1 – is Google, which just showcases the breadth and power of this internet behemoth.
It is popular but isn’t without its sceptics. While the ranking system may allow businesses to charge more for advertising and attract better quality guest writers, the data is limited to users that have the programme installed, so websites with extremely high traffic may be ranked poorly despite having great results.
Algorithm – An algorithm is defined as a process or set of rules that are carried out by calculations and similar problem-solving operations. Algorithms are often carried out by computers because they are extremely complex and hard to work through by hand. In SEO terms, an algorithm is the complex system Google uses to determine the ranking and retrieval of the billions of pages in its index every single day. Google’s algorithm is quite mysterious and relatively unknown to people who don’t work there. However, things such as long-form content, ontology and long-tail keywords are favoured by it and are often rewarded with a high ranking.
Alt Tag – Alt tags, otherwise known as alt text or alt attributes are image descriptions written in HTML that inform search engines about the images you are displaying on your web page. This is important because search engine bots aren’t very good at reading actual images, so by specifying alternate text and including a brief but accurate description of the image, you are giving web crawlers a better, clearer and more comprehensive description of your web page.
Often overlooked, alt tags can be optimised with proper keywords and descriptions to improve visibility on Google’s image search while also improving indexing accuracy and improving content relevance.
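In the page’s HTML, the alt attribute sits directly on the img element. A hypothetical example, with a descriptive value rather than a keyword-stuffed one:

```html
<!-- Descriptive alt text helps crawlers (and screen readers) understand the image -->
<img src="/images/red-led-strip.jpg"
     alt="Red LED strip light mounted under a kitchen cabinet">
```

The file path and description above are placeholders; the point is that the alt text should accurately describe the image in plain language.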
Anchor Text – Anchor text is the clickable text of any link, often denoted as blue underlined text. Every time you see a link and click on it, you’re reading and clicking on the anchor text. Anchor text is used to provide information – both to users and to search engines – about what the web page being linked to is about. For example, when we link here to a blog we wrote about anchor text earlier this year, the text you click on to be directed to the blog is the anchor text.
Anchor text is more important than a lot of people give it credit for, because it helps navigability and allows crawlers and users to better understand and move around your website. And if you try to influence this with spammy keyword stuffing tactics, you’ll find yourself penalised for it.
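In HTML terms, the anchor text is simply whatever sits between the opening and closing a tags. A quick illustrative comparison (the URL is a placeholder):

```html
<!-- Vague anchor text: tells users and crawlers nothing about the target -->
<a href="/blog/anchor-text-guide">click here</a>

<!-- Descriptive anchor text: signals what the linked page is about -->
<a href="/blog/anchor-text-guide">our guide to writing anchor text</a>
```

The second link gives both the user and the crawler a clear idea of the destination, without resorting to keyword stuffing.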
Answer The Public – Answer The Public (ATP) is a handy keyword research tool that visualises search engine queries and questions, auto-complete terms, and suggests keywords in something called a “search cloud”. ATP breaks down a search term into six different categories, the 5 Ws (‘who’, ‘what’, ‘when’, ‘where’, and ‘why’) as well as ‘how’, ‘can’, ‘are’, ‘which’, and ‘will’. It creates these in the form of reports that can be saved, stored, and shared by multiple users (this is a feature that is only available on pro accounts).
ATP is perfect for businesses looking to examine search intent and glean insight into what their potential customers are searching for. By using ATP, businesses can plan out content and create documents that directly answer these questions. It is a good place to start, but businesses should be aware that ATP doesn’t come with search volumes. It does, however, help give them greater insight and a better understanding of their target market.
Attribute rel=”nofollow” – “Nofollow” refers to the value of the same name that is found in the rel attribute. The rel attribute provides context about the relationship of the linking page to the link target. The “nofollow” value is used to signal to search engines that they should essentially ignore a link and not pass any authority through it. The concept behind the term is an old one and dates all the way back to 2005, when Google introduced the feature to try to prevent spammy links giving undue authority to sites and blogs. The “nofollow” link attribute allows Google to learn about the context of the link and use that information to make rankings fairer. There are four main reasons why you’d use this attribute. The main one is in cases where you want to link but not be associated with the link target; the others are when you link to widgets, certification badges, and press releases. Note that Google now no longer treats “nofollow” as a directive, and instead takes it as a hint that it shouldn’t put any SEO weight on those links.
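In markup, the value is simply added to the link’s rel attribute. A hypothetical example (the URL and link text are placeholders):

```html
<!-- Hints to search engines that no authority should pass through this link -->
<a href="https://example.com/partner-offer" rel="nofollow">Partner offer</a>
```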
B2B – Standing for ‘Business to business’, B2B is one of the most commonly used terms in the business world. B2B involves one business selling its products and services to another business. For example, a company that designs lighting solutions could sell their products to an electrician who uses their products as part of their service. B2B tends to happen when a business is looking for raw materials. B2B can also loosely refer to the way a website is designed, who a business advertises to, and the type of language that is used.
B2C – Business-to-consumer is very similar to B2B. In this instance, the business sells to consumers rather than to other businesses. B2C refers to the selling of products to consumers directly, without any supply chains or third parties. A great example of a B2C business is Amazon, which sells goods directly to customers. The term became very popular during the dot com boom of the 1990s, when online B2C businesses such as Amazon arrived and became a threat to traditional high street retailers.
BERT – Standing for Bidirectional Encoder Representations from Transformers, BERT is quite simply the biggest update Google has released since RankBrain. Google states that BERT will impact 1 in 10 search queries and is – to put it simply – their neural network-based technique for natural language processing. To put it even more simply, Google is trying to improve the way its machines interpret our searches to provide us with better results. It’s the biggest attempt since the release of RankBrain to take the onus off the user to type the ideal phrase and shift it to the search engine to provide the right results.

As BERT rolls out, more and more users will start noticing that their results are more accurate and that Google is better able to understand the nuances and contexts of search queries. Impacting 10% of all searches may not sound like a big deal, but it is, because, in future, it will only grow. Pretty soon BERT will more than likely affect all searches made on Google and, unfortunately for people in SEO, there is very little they can do about it in terms of optimisation. In a way, that’s a good thing: websites and those looking to rank well can now focus on providing real value to their target audience, instead of overly worrying about keywords.
Backlinks – A link from one page to another, also known as inbound links. Within Google’s algorithm, backlinks essentially count as votes for a page and any web pages with a high number of backlinks often have a high organic ranking. These backlinks notify Google that the content which is being linked is relevant and useful to users. Webpages with few or no backlinks at all will be recognised by Google as irrelevant and will be more difficult for those pages to achieve a high organic ranking.
Not all backlinks are the same, however, which is why it’s important to ensure your website uses quality backlinks otherwise they will count for very little. Poor quality backlinks account for very little and even with a thousand of them, they wouldn’t return the same value as a single high-quality backlink. Sites with good domain authority can provide the most valuable backlinks, as Google will read this as the site passing on its authority to your site.
Banner – This term refers to a form of advertising that a user would usually see in the form of a banner on a separate website to the one that is selling the product. Banner advertising follows you around as you search the internet in an attempt to get you to buy the product you didn’t purchase, or to keep you aware of that particular brand in case you want to purchase something from them in the future. For example, let’s say you go to a candle store, put candles in your basket and either get distracted or decide at the last minute you don’t want to buy them, so you leave the site. There’s a good chance that, on the next couple of websites you visit, you’ll see banner ads displaying the products you were looking at a moment ago. Clever, right? Banner ads have a long and storied history; they first appeared back in 1994 and were the first form of advertising specific to the internet. That history has brought great success: overall, the internet advertising business is worth around $124 billion. These days, banner ads are underpinned by something called programmatic marketing, which allows marketers, harnessing AI, to bid in real-time for ad space in the time it takes for the ad to load.
Black Hat SEO – Black hat SEO is essentially any SEO tactic aimed at improving page ranking that violates Google’s quality guidelines. Characteristically, these tactics revolve around content creation designed specifically to manipulate the search engine algorithms instead of creating rich, audience-focused, quality content.
Now, black hat SEO has changed over the years, and many SEO practices that were commonplace 15-20 years ago might now be considered black hat. Google’s Webmaster Guidelines change, and with them so does what should be considered black hat SEO. However, you can usually identify black hat SEO tactics such as keyword stuffing, hidden text, cloaking and doorway pages in the way that they ignore the user experience in favour of algorithm manipulation.
Blogger – A blogger is someone online who hosts a blog on any given topic. Quite simply, they are a content creator online who has their website or blog on which they upload their thoughts and grow their business.
The reason that bloggers matter to SEO is primarily one of link building. Bloggers grow their business and can make for excellent business partners, so SEOs will often seek relevant and authoritative bloggers for their link building outreach. Rather than working with another business or website, working with bloggers can be more personal and they can be an excellent resource for authoritative and relevant links.
Blogger Outreach – Blogger Outreach is a process that businesses undertake when they want to leverage the influence of bloggers, influencers, and prominent users of social media, to help boost their brand awareness and keyword reach. The process begins by reaching out to a pre-selected group of influencers in a particular industry, one that the company wants to become more prominent in, usually. Often, the blogger or influencer is given access to products and services for free, or for a fee, in exchange for them promoting it on their social media channels, reviewing them, and generally using their influence to market the business on their behalf.
Done correctly, this can be a very cost-effective way of growing your business and your brand because the ‘cost’ of this type of outreach is simply letting someone use the product or service you want to promote. This is far cheaper in comparison to other methods such as PPC or Digital PR.
Bounce Rate – Bounce rate is an important metric that shows you how many visitors came to your website and then left without engaging with your content or visiting another page. Generally speaking, a lower bounce rate is better and shows that your content and user experience is doing something right, encouraging interaction with your website.
Bounce rate will vary from industry to industry and is affected by a variety of factors. But SEOs working on a campaign will always try to improve the bounce rate of a website. While Google claims that bounce rate is not a ranking factor when it comes to Google search, it can indicate and highlight important site or content issues.
Bounce rate could be affected by slow site speed, bad content, high ad density, poor relevancy, and more. It might not be a direct ranking signal, but the bounce rate is something that every good SEO will take into consideration and try to improve.
Branded Keyword – A branded keyword is a specific type of keyword that includes your brand name in it. For example, if you’re looking for headphones and search for ‘Apple headphones’ the branded keyword in there would be ‘Apple’. Keywords that don’t include a company’s name are classed as ‘Non-branded keywords’. For a successful SEO campaign, it’s important to have a mixture of both branded and non-branded keywords. Too many branded keywords and you could be at risk of negative search results if you receive bad PR. On the other hand, too few branded keywords will mean your branding is limited.
Breadcrumb Trail – Breadcrumbs are navigation tools – they are a small text path that lets the user know where they are on your site. It also helps Google to establish the structure of your site too, and breadcrumbs that appear in search results give users an overview of where the webpage is situated on your site. Most breadcrumb trails are usually visible at the top of a webpage. To add breadcrumbs to your CMS, such as WordPress, you can download various plug-ins to do so.
There are several advantages of using breadcrumbs, but the main one is that Google appreciates them, which is always beneficial for SEO. Google sees breadcrumbs – especially those that appear in the search results – as valuable to users, as they do enhance user experience. To keep your site visitors satisfied, and to ensure that they enjoy browsing your site, you should use breadcrumbs so that they always know where they are. Breadcrumbs are also great for lowering bounce rates – if the page that the visitor is on doesn’t provide the solution they are looking for, a breadcrumb trail can direct them to another part of the site. After all, it’s better to redirect them to another part of your site than back to the SERPs.
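Breadcrumbs can also be marked up for search engines using schema.org’s BreadcrumbList structured data, which is how they come to appear in search results. A minimal sketch, where the URLs and page names are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "SEO Glossary" }
  ]
}
</script>
```

The positions mirror the visible trail (Home > Blog > SEO Glossary), giving Google an explicit picture of where the page sits in your site structure.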
CMS – CMS stands for Content Management System and is a dynamic website on which multiple users can manage, control, edit and maintain the content and structure of a website. A CMS is a database that is typically easy to learn and set up, providing greater accessibility for people to create and run their website. There are a lot of different options out there when it comes to choosing content management systems – the most popular example of which is WordPress.
Different versions of a CMS will offer a varying amount of control over the code of the website, affecting your ability to implement thorough technical SEO. This allows people who don’t want to deal with code, and who just want to set up a simple website with a template, to get started quickly, while SEOs can use a different version to more comprehensively optimise the website they’re working on. There are also plenty of SEO plugins that you can install and use to improve your SEO while using a CMS.
CSS – CSS stands for Cascading Style Sheets, and it is a programming language that, used alongside a markup language like HTML, describes to the browser how the web page’s HTML elements should appear to the user. CSS is all about the presentation of a web page – for example, at its most basic, the colour and fonts used. CSS can also be used to ensure that web pages adapt and appear differently when viewed on different devices.
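For instance, a few lines of CSS can set basic presentation and then adapt it for smaller screens with a media query (the selectors and values here are purely illustrative):

```css
/* Base presentation: font and colour for body text */
body {
  font-family: Georgia, serif;
  color: #333333;
}

/* On narrow screens (e.g. phones), increase the font size for readability */
@media (max-width: 600px) {
  body {
    font-size: 1.1rem;
  }
}
```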
Caffeine – Not to be confused with the key ingredient in coffee, Caffeine is Google’s indexing system that crawls the web in search of relevant web pages. These pages are then indexed, assuming they comply with Google’s guidelines, and can then appear as part of Google’s SERP. Previously, Google’s old indexing method would consist of a web-wide crawl every few weeks, leading to a layered approach which often led to outdated results.
The Caffeine approach, however, is a continuous process that can provide the most relevant, up to date results available. This means web pages are being added to the index all the time and therefore ensuring the SERP isn’t producing outdated results. The program is a huge advancement in technology when compared with the old method, and is also being continuously improved, making Google the number one search engine to focus on and optimise your website for.
Canteen Theory – This theory – created by our very own James Welch – relates to his belief that Google’s aim is to try and determine how big a company’s canteen is. The general thought is that the larger the canteen, the larger the company is, and, therefore, the more likely that they are to be trusted. And Google wants the most trustworthy of companies at the top of its listings. This is because trustworthy companies are more likely to give a good customer experience to Google’s visitors – and the more that this happens, the more that visitors will return to Google.
But why a canteen? Imagine the canteen of a large company full of people eating lunch. Each of those people is likely to have at least one social media account, with most having at least two or three. Each of these accounts is somewhere that each employee could share that they work for the company in some way. Maybe it is in their LinkedIn bio, maybe they have tweeted about their job. Maybe they have a link on their Facebook profile. Some of these people may have a personal blog that mentions where they work.
All of these are signals that can be made only by a large company. A small company – which will have a small canteen – cannot make the same signals, because it doesn’t have the same number of people able to make them.
But it is not limited to people within the canteen to make the signals. A larger company is more likely to have signals created by customers of the company – and non-customers, too. For example, a retailer with tens of stores is likely to accrue more tweets, posts, blogs, and news stories about it than a company with just four people in an office.
In Google’s aim to produce the best search results, it has to use factors that are hard for smaller companies to replicate. This fits into James’ mantra that ‘the harder something is to do, the more impact it has on Google’.
Call To Action (CTA) – A call to action (CTA) is a term that refers to a prompt or an invitation for a user to take a specific desired action. These are often phrases that are incorporated into webpage copy, advertising messages or specific buttons that help the user to complete the action e.g. to visit a contact page, to get in touch with someone. A well-written and successful CTA will be clear, easy-to-understand and will result in conversion after prompting the audience to take a specific action. Whilst there can be multiple CTAs on a webpage or within a piece of content, they should not confuse or overwhelm the audience – the next step, and the desired action, should be extremely clear.
CTAs will often include strong action verbs to prompt the reader, such as ‘call’ or ‘buy’, and some may use a sense of urgency within the tone of voice to prompt the reader to take action immediately. This is often done by using specific timeframes, such as ‘buy this now, available for a short time only’. An effective CTA can be a powerful tool in growing your audience and increasing your sales.
Canonical URL – A canonical, or ‘preferred’, URL is the URL that Google believes is most representative from a set of duplicate pages on your website. In other words, it is Google’s preferred version of a webpage. A canonical link element or tag is found in the webpage’s HTML header, to inform search engines if there is a more important version of the webpage. These elements prevent issues with duplicate content, in the context of search engine optimisation. The canonical can even be situated on a different domain to the duplicate.
You should choose a canonical URL from a set of similar pages for numerous reasons. First, it helps to specify which URL you want to appear in the search results. Secondly, it can help to reduce the time that a crawler might spend crawling on duplicate pages. A canonical link will help a crawler to get the most from your site – it can spend time crawling new pages of your website as opposed to crawling similar versions of pages, such as desktop and mobile versions. Other advantages include the management of syndicated content, that it helps search engines to consolidate the information that they have for URLs, and it can make tracking metrics for products and topics much easier, as this is usually more challenging with a variety of URLs.
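The canonical tag itself is a single line in the head of each duplicate page, pointing at the preferred URL (example.com and the path are placeholders):

```html
<head>
  <!-- Tells search engines which version of this page is the preferred one -->
  <link rel="canonical" href="https://example.com/products/led-lights/">
</head>
```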
Clickbait – Clickbait is a term that refers to a piece of text or a picture that is sensationalised and designed to entice people to click. The defining features of clickbait are being over the top and often misleading. Headlines are often dishonest and entice people in to read the content, which doesn’t necessarily reflect the sensationalised headline that made people click in the first place. An example of clickbait could be: “Leading Surgeon Reveals The Worst Food That You Eat Every Day!”. Clickbait is deceptive, but generally not punishable by law; it is, however, a practice that is frowned upon by the online community.
Cloaking – The practice of presenting different content to search engines than to the users who click through to a site. This is considered a breach of Google’s guidelines and will result in the site being penalised once it’s flagged. This can be done by coding a page in a certain way, so that when a search engine crawls the site it only reads HTML, whereas a human user will see images or other content.
The main goal of cloaking is to boost a page’s ranking for certain keywords and when a page is clicked on, send the user to a different place than it would the search engine. Cloaking is considered a black hat SEO practice and something which you should not actively participate in. If your site is found to be cloaking by Google, it will receive large penalties and potentially be de-indexed.
Commercial investigation queries – When a searcher wants to compare one product against another to determine which is best. These types of searches are often conducted with the intent of research or purchase. The reason these different types of query exist is that Google needs to read and understand the different types of searches to provide the most accurate results.
For example, searching for a distributor for your product would count as a commercial investigation query. These types of search queries can provide valuable information in terms of keywords and information about your competitors. These queries are very important during keyword research as information on the searcher rather than the keyword can be far more beneficial in some cases.
Competition – In SEO terms, competition refers to two different things: direct competitors and SEO competitors. The former looks at competitors who sell similar products and services, or operate in the same area. Direct competition could refer to online companies, or bricks and mortar rivals. SEO competitors are businesses and rivals who are competing for the same keywords on page one of Google. For example, a dozen or so businesses could be writing content and optimising their websites for the keyword “LED lights”. These will all be in competition with each other, and the business that satisfies the algorithm most will appear at the top.
Competitor Research – A broad term but one that is very, very important, and a key phase for any business looking to disrupt the market. In SEO terms, Competitor Research involves spending time looking at businesses and their websites, sitemaps, and the way they go about creating content and displaying information about the products and services that are similar to what you’re selling. Beyond that, competitor research also involves examining which keywords (both short- and long-tail) and which questions they rank for. By doing this, you can identify what kind of keywords you need to target to surpass your competitors.
Competitor Research is vital; without it, you’re not making informed decisions, and you’re missing out on learning about potential topics that could help improve the performance of your website from an SEO and keyword perspective. Having an incredibly clear picture of who you’re up against will allow you to find spaces and areas in your industry that haven’t been explored; these opportunities, however small, could be the difference between success and failure.
Content – Content, specifically web content, is the textual, visual and aural content published on a website online. While SEOs and content marketers most often use the term content to refer to written text on a website, content can be anything from videos and images to written blogs and podcasts. Content is essentially any creative element on a website, whether it consists of audio files, embedded video, applications and tools, or text.
YouTube, for example, is a website that consists almost entirely of video content, whereas a website for podcasts consists almost entirely of audio content. The quality of content that a website produces is a key SEO factor.
Web content is so important to SEO because it is a driving factor behind traffic generation. A website with high-quality content that is thoughtful and insightful will generate very different results to low-quality content that uses keyword stuffing techniques and is not valuable to users. The creation of engaging and quality content – like our biography of Elon Musk – and organising this for easy navigation is a key facet of SEO and web design.
It is in the content of a website where keyword optimisation is performed – overall, content marketing is a very powerful SEO tool. While video and audio content is very popular and widely used, search engines still favour text-based content when crawling and indexing a website – which is why many SEOs still focus on producing written content for a website.
Content Delivery Network – A content delivery network (often known simply as a CDN) is a distributed network of servers that deliver content to users. From on-page text and image content to applications and downloadable content, a CDN serves HTML or static resources based on geographic location.
The servers that make up a content delivery network are positioned around geographic groups of users, speeding up content delivery massively. CDNs were first created in the late 1990s as a way of dealing with the rapidly expanding demand for fast and reliable internet across the world.
Contextuality – Contextuality is absolutely crucial for SEO. After all, SEO involves making each webpage on your site as contextual as possible. Contextuality has been an important concept since search engines were created, but over time, algorithms have become better at analysing it. As Google’s algorithms evolve and continue to improve, they now understand webpages, and entire websites, better than they ever have before.
You can improve and maximise the contextuality of a page in many ways. Much of this involves the written content, by including keywords and relevant ontological phrases. However, it doesn’t just mean adding as many words as possible. It’s all about context. If you’re writing about a product, discuss what it is, who it’s for, the benefits of the product and the problem it solves. Make sure that you don’t just write solely about that one product, but other topics that branch off in relation to it. Contextuality isn’t just about words though – using breadcrumbs on your website is another way to improve contextuality for search engine optimisation. It allows the crawler, or bot, to see where it is on a website, thus increasing contextuality.
Conversion Rate – The conversion rate is a calculated percentage of users that have completed a specific targeted goal. A conversion can take different shapes, whether it be a sale, a form fill-in, or anything else. When goal tracking, a conversion is a term used to describe a user meeting your desired goal.
This can be used for ads, website engagements, emails, and anything else – the conversion rate is the percentage that shows exactly how successful your campaign or ad is. Conversion rate is the ratio of visitors who completed the desired goal against the total number of visitors.
Conversion rate is an important metric in analysing campaign and advertising performance, and it can help advertisers to optimise, tweak and improve their strategy.
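As a rough sketch of the calculation described above (the numbers here are purely illustrative):

```python
def conversion_rate(conversions, visitors):
    """Percentage of visitors who completed the desired goal."""
    if visitors == 0:
        return 0.0
    return conversions / visitors * 100

# e.g. 30 goal completions from 1,200 visitors
print(f"{conversion_rate(30, 1200):.1f}%")  # 2.5%
```

The same ratio applies whether the “visitors” are ad clicks, email opens, or sessions on a landing page.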
Cookies – It’s a term we’ve all heard and blindly clicked ‘accept all’ when landing on a new website, but what are cookies? Well, essentially cookies are small pieces of data that identify your computer to a network and help improve your browsing and web experience by tailoring content, mainly ads, to things that are relevant to your recent internet travels. There are two types of Cookies – Magic Cookies and HTTP Cookies – the former is a slightly outdated concept that refers to the transfer of information sent to and from computers and databases. The more common form of Cookies, the HTTP ones, are the type you’ll most likely be most familiar with. They are designed to track, personalise, and save info for each user’s sessions. Let’s say, for example, you go on a website that sells shoes; once you’ve accepted the site’s cookies, had a browse, and left, you’ll more than likely see ads for shoes when you go on other websites, be it to browse content or watch videos. HTTP Cookies were first used by Lou Montulli in 1994, when he adapted the ‘magic cookie’ concept while helping an e-commerce company avoid overloading its servers with user data.
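Under the hood, an HTTP cookie is just a header exchanged between the server and the browser. A simplified exchange (with a made-up session value) might look like this:

```text
Server response on your first visit:
Set-Cookie: session_id=abc123; Path=/; Secure; HttpOnly

Browser request on later visits to the same site:
Cookie: session_id=abc123
```

The browser stores the value and sends it back automatically, which is how the site recognises you between visits.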
Core Algorithm – A core algorithm is essentially an algorithm that is a fundamental part of Google’s ranking functionality. In all honesty, there’s been a lot of confusion amongst SEOs as to what exactly defines a core algorithm, and Google hasn’t been particularly clear on the matter either.
After the Panda update, Andrey Lipattsev, a senior search quality strategist at Google, said that a core algorithm is one that Google essentially no longer has to worry about or work on. In his words, a core algorithm (PageRank being a good example) is past the experimental stage and now functions on its own and will be functioning, unchanged, for the foreseeable future.
Crawlability – Crawlability describes how easily search engine crawlers can access and navigate the pages of your website. Site crawlability can be influenced and improved in several ways, including optimising site speed, optimising images and video, using the proper redirects, efficient internal linking, and creating a proper sitemap. These tactics allow the search engine’s crawlers to more easily navigate your website, improving its ability to find your content and accurately index your website.
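One of the simplest crawlability aids is a robots.txt file that points crawlers at your sitemap and away from pages you don’t want crawled. A minimal example (example.com is a placeholder domain):

```text
User-agent: *
Disallow: /checkout/
Sitemap: https://www.example.com/sitemap.xml
```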
Crawl Depth – This is a term that describes the level to which a search engine will index pages within a given website. Take a look at any half-decent site and you’ll see that they contain main pages and subpages which go deeper and deeper, similar to files on a computer. Crawl depth is calculated by starting at the homepage (which has a depth of 0), then, any page that is linked from there has a depth of 1. This number increases the further from the homepage you go. As a general rule, you want to be able to find the most important pages within three clicks.
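The calculation described above is essentially a breadth-first walk of your internal links. A minimal sketch, using a made-up site structure:

```python
from collections import deque

def crawl_depths(links, homepage):
    """Breadth-first walk from the homepage; depth = clicks from home.
    `links` maps each page to the pages it links to."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Illustrative internal link structure
site = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/blog": ["/blog/post-1"],
}
print(crawl_depths(site, "/"))
```

Pages that never appear within a few hops of the homepage are exactly the ones at risk of being crawled infrequently.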
Crawlers – A crawler is a program that search engines use to crawl the web, including your website. Alternatively known as a bot, spider, web crawler or Googlebot, crawlers allow search engines to scan and analyse websites across the internet to accurately rank and index them. Crawlers will visit your website to collect information on website navigation, performance and content, adding and updating the information they find in the search engine’s index.
Crawling – Crawling is the process by which search engines discover and index your website and web pages. Using a crawler, Google and other search engines use crawling to collect information on the internet’s billions of public web pages to accurately keep its index updated.
During the crawling process, the search engine will analyse the content and code of your website and follow internal and external links to build a picture of your website’s position on the internet, its relationship to other web pages, and the quality of its content. There are also tactics that SEOs will employ to influence search engine crawling through improving crawlability.
Cross-Linking – To cross-link is to link two pages within a website to increase the relevance of both of them. For instance, if you had a service page that talked about the black shoes you sold, you could link that page to a blog that discusses all the types of shoes which are available for women. Cross-linking in this way is an ideal way of showcasing your authority on a topic to Google because, as it crawls a page on your site, it’s constantly being directed towards other relevant content that relates to that page. What better way to showcase authority? If you can prove authority by effectively cross-linking, you’re setting yourself up for SERP success. Good cross-linking is about a couple of things – the web page/content it is linking to and the anchor text. Both of these have to be extremely relevant for you to achieve the authority that’s going to see that page fly up the rankings. You can’t just link to a page that isn’t relevant to the content that you’re linking from. Further, the anchor text should be relevant to the page that it is anchoring. Done right, you can easily see how dozens of cross-links help to increase link juice and overall domain authority.
Crowd Marketing – Crowd Marketing is a fairly new term and, as is often the case in the marketing world, its newness means it gets misdefined by a lot of marketers. In essence, Crowd Marketing goes beyond Influencer Marketing – which is what it most often gets confused with – and incorporates content creation, SEO, and social media marketing, all of which results in verified lead generation. It also differs in that it focuses on targeting an audience within a market, rather than just the masses, and it helps to build the authority of a business within an industry.
There are five types of Crowd Marketing – Classic, Backlink Generation, Content Distribution, Reputation Management, and Crowd Influencer Marketing. Classic Crowd Marketing simply involves publishing high-quality content within your industry. Backlink Generation is about getting attention from other users in your industry and ultimately getting them to link back to your site; this can boost domain authority, which improves rankings drastically. Content Distribution involves publishing vast amounts of content across various channels. Reputation Management involves creating profiles on every relevant platform so that customers can find you quickly. Finally, Crowd Influencer Marketing involves reaching out to macro- and micro-influencers who will push your products or services on your behalf (if they believe in them, of course).
Customer journey – Customer journey is an all-encompassing term that looks at the process by which a customer purchases a business’s product or service. Depending on the sector or industry, the customer journey could take minutes or months. The journey depends on the products and services being sold, the price of them, and the effect they have on the customer. For example, a typical journey for purchasing life insurance may start with someone searching for more information, thinking about the options available, and consulting with family members before deciding on where to purchase this product. This could take a long, long time. Businesses must ensure they plan content and advertisements that keep them at the forefront of a customer’s mind during their decision-making.
Customer Lifetime Value – Often shortened to CLV, Customer Lifetime Value is an important metric that measures the total worth of a customer to a business throughout their relationship. It’s a metric that uses the following formula: customer value x average customer lifespan. CLV is very important because keeping existing customers, and extracting value from them, is much cheaper than trying to acquire new customers. For example, if a customer has repeatedly bought a product over five years, their CLV to the business will be very high, and it will be much cheaper for the business to keep them because it doesn’t have to spend money on advertising and promotions to attract them. It’s been labelled by some as “the most important metric that companies ignore”. To back this up, a recent study indicated that just 34% of people fully understood what the concept of Customer Lifetime Value includes. CLV can help you segment the value of your customers, allow you to focus on long-term company-wide growth, and accurately measure how much you should spend on customer acquisition. It’s a powerful metric to build your strategy on.
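The formula above can be sketched in a few lines (the figures are purely illustrative):

```python
def customer_lifetime_value(avg_order_value, orders_per_year, lifespan_years):
    """CLV = customer value per year x average customer lifespan."""
    customer_value = avg_order_value * orders_per_year
    return customer_value * lifespan_years

# e.g. a £50 average order, 4 orders a year, over a 5-year relationship
print(customer_lifetime_value(50, 4, 5))  # 2500 would be wrong; 50*4*5 = 1000
```

Comparing this figure against your cost of acquiring a customer tells you whether that acquisition spend is sustainable.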
Data – A broad term, data is loosely defined as a collection of facts, figures, and empirical evidence. Data can be collated and used for reference, analysis, or to make decisions. Data has been called the world’s most valuable resource, surpassing even oil. Data is incredibly important for a business: the right data can help it make more informed decisions and use its budget in the most efficient and effective way. An email address is one of the most basic forms of data, while spreadsheets, keyword research, and analytics are more complex forms.
Dead-End Page – This kind of page is one to be avoided. A dead-end page can disrupt user flow and encourage people to leave your website – which you don’t want. These dead-end pages have no internal or external links, and they also don’t have a call to action or real point to them. All pages should be designed and written so that there is an end goal – be that a phone call, email, or to click through to another page.
Deep Linking – Deep linking is the practice of linking to a specific page on your website, or someone else’s. This isn’t just linking to a homepage or service page, it involves linking to a very specific piece of content such as a news article or blog. Deep linking can be tricky to do but, like all things that are difficult in the world of SEO, search engines will reward it if it is done well. An example of this would be if you wrote: “In the past few months we have seen how employees enjoy the benefits of remote working.” and then in that sentence, you linked to a specific news article that discussed the increase in remote working.
De-indexed – This is when Google takes an action on your site to deliberately remove it from the Google index, and it generally carries a negative connotation. This is due to Google crawling your website and finding something which it has deemed in breach of its quality guidelines. To keep your site indexed, and avoid it being de-indexed, you will need to follow these guidelines extremely closely to ensure complete compliance.
Often de-indexing can be attributed to a few factors, and it generally relates to potential spam content. For example, if your website has acquired a large number of backlinks in a very short period, Google will notice this as suspicious behaviour and will penalise you for it. Participating in link farms or spamming comments will also be flagged as manipulative behaviour and result in your site being de-indexed.
To prevent your site from being de-indexed or to get it re-indexed if this has occurred already, you will need to cleanse your site of spam links. This will require you to audit your site and disavow any links which are deemed spam. Once spam links have been removed and you’re happy your site only contains natural links, you can submit a reconsideration request to have Google re-index your site.
Digital PR – Digital Public Relations (PR) is a strategy that involves creating high-quality content to increase brand awareness. This content is then pitched to online publishers who will share the content, citing your brand as the information source. For PR to be successful, strong relationships between writers and publishers are crucial. Most content published for PR purposes is emotional content – it’s content that the audience can resonate with, generating interest in the content and subsequently the brand and any relevant services or products.
The relationship between SEO and digital PR is often overlooked, but it is highly important. Promoting your brand via different online publications will help you to become more credible and trustworthy, both in the eyes of potential clients and customers and from Google’s point of view. Using digital PR tactics and link-building can go hand-in-hand to do this. By linking press releases back to your website, you’ll allow potential clients and customers to click through with ease, whilst Google will recognise your site as a trustworthy and knowledgeable source within your sector.
The takeaway message here is that by getting authoritative, trustworthy domains – such as digital news publications – to link to your site, Google will trust your site, see it as a credible source and this will only work in your favour when it comes to those all-important search rankings. A higher search ranking provides the opportunity to attract more traffic, subsequently resulting in more conversion.
Disavow Links – You would disavow a link if you thought that that link was a threat to your site’s SEO performance. In the same way that good links from sites with authority benefit your site, bad links from spammy sites can significantly damage your reputation in the eyes of Google. By disavowing a link, you’re telling Google that you do not want this counted toward your site. Disavowing is an absolute last resort and should be approached carefully, because disavowing the wrong links can hurt your own SEO performance. Before opting for this, try to manually request that the link be removed.
Disavow Tool – A tool that is used to discount the value of an inbound link. This is used to prevent any penalties for link spam which could harm the ranking of your website. As Google’s ranking algorithm has been improved and adapted, it’s now able to recognise if there are too many links to a particular domain, registering this as spam linking and penalising the website as a result. This is done in an attempt to provide its users with the most relevant sources of information and prevent them from viewing spam content.
For those looking to achieve the optimum ranking for their website through organic SEO, this is an extremely important tool as it can essentially tell Google not to count the link when it crawls your site. The bad link which you may have unknowingly created could lead to an unsavoury site or another site that Google has recognised as not relevant, hurting your ranking as a result of your website being associated with the other.
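The disavow tool takes a plain text file listing individual URLs, or whole domains prefixed with `domain:`, with `#` marking comment lines. For example (the domains here are placeholders):

```text
# Pages we could not get removed manually
http://spam.example.com/bad-links.html
# Disavow every link from this domain
domain:shadyseo.example.com
```

The file is uploaded through Google Search Console, and Google treats the listed links as a strong suggestion to ignore them when assessing your site.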
Display Network – A display network (specifically, the Google display network) is a group of over 2 million websites, apps and videos on which your advertising can appear. Advertising through the Google display network means that your Google ads can be seen on YouTube and through Gmail, as well as appearing on millions of other websites online. Through advertising on the Google Display Network, you can target your ads to appear to particular audiences, locations and in particular contexts. Display network sites reach over 90% of internet users across the world.
Domain Authority – DA, or domain authority, is a search engine ranking score and metric used in part to predict how well a website will rank within its sector, industry or niche. It is used to give businesses and marketers a picture of how authoritative a website is – and it’s no secret that authority is a big part of how Google indexes and ranks websites.
Domain authority is essentially a score for a website’s overall strength, building up over time as more links are acquired and more content is produced. The score can be increased by improving your website’s authority. This can be done in several different ways, but perhaps the best is to earn top quality links and backlinks from high ranking, trusted and reputable sites.
The term domain authority has an awful lot to do with link equity (otherwise known by SEOs as ‘link juice’). Earning strong backlinks that pass a lot of link equity to your website will earn you a higher domain authority – which, in turn, will improve your visibility and ranking in the SERPs.
Domain Name – A domain name is essentially the address of your website online. It is the text that a user enters into the browser’s address bar to get to your website (for example, embryodigital.co.uk).
A domain name is a known ranking factor in Google’s algorithms, and the difference between a good domain name and a poor one can have a significant impact on your SEO and performance. Having a relevant, strong and SEO focused domain name is certainly something you should consider.
Doorway Page – This is a term that isn’t commonly used but refers to a page on a website that is made specifically to rank for particular keywords. These pages act as a doorway to other areas of the site, usually product pages, and are quite unpopular online. They are unpopular because they have often been used for nefarious purposes. Done properly, they should be unique, content-rich pages that provide genuine value without pushing a hard sell. In practice, however, they have often included mass-produced content with multiple versions that barely deviate from each other. Doorway pages clutter up search engines and present a challenge to the Googles and the Bings of the world.
Duplicate Content – In short, duplicate content is when a significant amount of content on one web page matches content that exists elsewhere – either on the same website or a different website entirely. If two substantial pieces of content across different web pages are either identical or closely match one another, Google may deem them to be duplicate content.
The issue arises when a search engine crawler finds and indexes the content in two different places and isn’t able to tell whether the content has been copied, ending in a potential penalisation. Duplicate content is generally considered black hat and has become something that many SEOs fear due to penalisation, but there are also some common misconceptions. The main issue is when Google struggles to decide which version of the content is most relevant to any given search term, potentially impacting search rankings.
You might have duplicate content that needs to appear across many different URLs on your website, but this isn’t actually a problem. Good SEOs will know how to canonicalise content for search engines.
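Canonicalisation is usually done with a rel="canonical" tag in the page’s head, telling search engines which version of the content is the preferred one (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/black-shoes/" />
```

Every duplicate version of the page carries the same tag, so crawlers consolidate ranking signals onto the one preferred URL.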
Dwell Time – The amount of time a user spends on a page once clicking through from the search engine results. The dwell time officially ends when the user leaves the website. Although similar, this is different to bounce rate as this is the rate at which users view and leave a page in a certain period. Dwell time is specifically the amount of time the user spends on a webpage, either reading it or initially deciding whether it’s what they were looking for before leaving.
This stat can prove extremely useful for website owners as it gives them a clear indication of what users think of their site’s first impression. If the dwell time is high, their website is eye-catching, informative and overall useful to the user leading them to stick around. If the dwell time is low, this will be reflected in the bounce rate as users are generally taking one look at your webpage and deciding to leave very quickly after clicking through.
Dynamic URL – This is a specific URL that features content that depends on variable parameters which are provided by the server which delivers it. Characters such as ‘&’, ‘$’, ‘+’, ‘=’, ‘?’, ‘%’ and ‘cgi’ indicate a dynamic URL. Various search engines won’t index dynamic URLs. Google, however, does, as long as the information in that URL is specific to the industry and is packed with content. It’s vital that you have at least one static URL that doesn’t change (the homepage URL is a good example of that).
Editorial Link – These links are very good to have. An editorial link is an organic inbound link that is naturally used by a website with high authority. For example, if a university’s website linked to your site because it had content that they wanted to reference. An editorial link is a great indication that your content is thorough, well-written, and useful. They are also a sign of a strong link profile and are different to acquired links which are usually requested or paid for.
Engagement – The amount of interaction a user has with content. For example, if a user clicks a link or likes a photo on social media, this is classed as engagement. Often, the success of an advertising campaign is measured by engagement. High engagement generally means that the campaign has been a success and poor engagement helps you identify areas for improvement.
Measuring the engagement of your posts and content allows you to get an understanding of whether your target audience thinks it’s relevant or not. Engagement rates may fluctuate as there is no definitive way to guarantee engagement. This is why analytics and data are so important in the world of SEO as it allows you to predict the types of content which will achieve high engagement rates.
Evergreen Content – Evergreen Content is a term used to describe website content that has been written to be relevant for years to come. Evergreen Content can help establish a business as a thought-leader in its industry, and from an SEO POV, it allows search engines to understand the business and over time award them with consistently high rankings because they have put in the work to create content that is thorough, insightful, and in-depth.
An example of Evergreen Content could be an in-depth guide, that would end up being around 4000, 5000, 6000 words or more (there is no strict limit, though depth and quality matter more than raw word count), about a topic in an industry that will remain relevant for years to come. For instance, if you’re a business that operates in the telecoms industry, an 8000 word ‘Guide to VoIP’ is a great example of a piece of Evergreen Content, as VoIP is a topic that isn’t going away anytime soon.
Evergreen Content should be combined with blog content to create a potent SEO strategy that ensures long-term success as well as short-term gains, which can be achieved by creating regular blog posts.
Exact Match Keyword – This is a type of PPC option that only allows your ad to be shown when people search for that specific phrase. This can reduce cost and prevent you from showing up for keywords that you don’t want to be associated with. For example, if you were selling black women’s shoes, you would set up an ad that would only appear when people type in the keyword “black women’s shoes”; it wouldn’t show for similar keywords such as “black shoes for men” or “children’s black shoes”.
External Link – An external link, otherwise known as an outbound link or outgoing link, is a link that leads away from your website. Essentially, if you have a link on one of your web pages that leads to a web page on a different website to your own, that is considered an external – or outbound – link. For example, if Embryo Digital was to link to your website from our own, that would be considered an external link on our side, and an inbound link on your side.
FTP – FTP stands for file transfer protocol, and this is the system used to deliver and transfer computer files between system and server. As an example, if a website has been built without a CMS (content management system), then to publish a web page you will need to use FTP to transfer the web page file from your computer to the server’s file.
Favicon – A favicon is a small icon that can serve as helpful branding for your company. It’s a 16×16 pixel icon that can be found on tabs or drop-down menus. Favicons are tiny, which means they should only contain your company logo or one or two letters. Favicons are becoming an increasingly important part of company branding and act as a handy visual marker for people looking at their tabs list or reading list. Favicons are not exactly SEO critical, but they are just one of many things that form part of your overall web strategy. The key to a good favicon is simplicity, using space wisely, ensuring your brand identity is clearly displayed, and leveraging abbreviation and colour coordination (not easy, right?). The best favicons are the simplest – think YouTube, WhatsApp, and Twitter – so don’t overcomplicate yours when creating one. It could be your logo or the first letter of your company name in your branding.
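A favicon is declared with a link tag in the head of each page; a typical snippet (the file path is illustrative) looks like:

```html
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
```

Many sites also serve a plain /favicon.ico at the site root, which browsers fall back to automatically.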
Featured Snippets – For some search terms Google will display a box with answer summaries above the term’s organic search results – these are known as ‘featured snippets’. Usually used for question-based search queries, the featured snippet box contains a summary answer to the query with a link to the web page from which the answer came.
Often known as ‘position zero’ on the search engine results page, featured snippets are often sought after by SEOs because they provide a stamp of authority. When Google determines that your answer to a search query should be displayed as a featured snippet, it can essentially be seen as Google having deemed your answer to be the ‘best’ on the internet. This might not always be the case, but ‘position zero’ is, nonetheless, a great thing to earn for your website.
For further reading on featured snippets and earning position zero through great content, we wrote a blog at the beginning of 2020 detailing how and why position zero is the SEO aim for the year – and how, while you can’t influence Google’s programming, there’s still plenty you can do to try and earn your spot above organic search results.
Follow Link/Do-follow Link – A follow link (or do-follow link) is a link that passes authority – or ‘link juice’ – from one site to another. All links are ‘follow links’ by default, meaning that they can be simply categorised as any link that has not had the ‘nofollow’ attribute applied to it – there is no specific ‘do-follow’ attribute.
Follow links pass on authority, and they do this because they allow crawlers to follow the link and more accurately place a page’s ranking in the SERPs. Authority is a known ranking signal, so follow links from high authority pages that pass on PageRank are very valuable for SEO performance.
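In the HTML itself, the only difference between the two is the rel attribute (the URL below is a placeholder):

```html
<!-- Follow link (the default): passes authority -->
<a href="https://www.example.com/guide/">Read the guide</a>

<!-- Nofollow link: asks search engines not to pass authority -->
<a href="https://www.example.com/guide/" rel="nofollow">Read the guide</a>
```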
Frase – Frase is an AI content creation tool that helps business owners and content writers create content on any topic in a way that is ontologically relevant, as well as targeting keywords. Users write their content within Frase, after entering a target keyword. Before a user begins writing, the platform scans the internet for that target keyword and then pulls together a report, based on the top 10-20 search results. In this report are the headers, questions, and titles that are used by websites in those search results.
However, where Frase comes into its own is by compiling an exhaustive list of ‘Topics’. These aren’t just standard keywords: they are phrases and words that are regularly used by competitor websites that are also targeting this keyword. By incorporating these ‘Topics’ into your content you can ensure that you’re talking the same language as the websites which are already successful. This deep level of ontology is a trend that is going to become more prevalent as SEO becomes more nuanced and intelligent.
Friendly URL – A ‘friendly URL’ is a Uniform Resource Locator that is easy to read for both humans and search engine crawlers. On large, dynamic websites, URLs are often just a machine-generated string of words, symbols, and numbers. In today’s SEO world, though, it’s important that your URLs can be understood and carry relevant contextual information – it’s all part of building up the picture that your website is worthy of good rankings. So, how does one go about building a ‘friendly URL’? Let’s say you’re a website that sells tiles. A friendly URL would be something such as yourtileshop.com/tiles/wall-tiles/. In this URL, not only do you know who the company is, but you can clearly see that they have a ‘tiles’ page and, within that, a ‘wall tiles’ page. It makes sense that the latter page would sit under the former. This, to Google, is a clear indicator that the website is well thought out and planned in a user-friendly manner, and the URL reflects this.
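Tidy, hierarchical paths like this are usually generated by ‘slugifying’ page titles – lowercasing them and replacing anything that isn’t a letter or number with hyphens. A rough sketch in Python (the tile-shop pages are a hypothetical example):

```python
import re

def slugify(text):
    """Lowercase a page title and replace non-alphanumeric runs with hyphens."""
    text = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return text.strip("-")

# Hypothetical tile shop: build the readable, hierarchical path from page titles
segments = ["Tiles", "Wall Tiles"]
path = "/" + "/".join(slugify(s) for s in segments) + "/"
print(path)  # -> /tiles/wall-tiles/
```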
Geo-dependent request – A vast number of searches have a local angle to them: ‘dry cleaners near me’, ‘restaurants in London’, ‘trains to Birmingham’. Search engines therefore serve different results depending on the searcher’s geography. For instance, if you’re based in London and type ‘sushi restaurants near me’, you’ll get an entirely different set of results than if someone in Manchester typed the exact same term. It may sound obvious, but geo-dependent requests are a huge part of Google’s service, and it all boils down to providing users – you and I – with the most relevant answer to their query. Brands and local businesses can leverage geo-dependent requests by bidding for paid ad positions on search engine results pages that are based on geography. For instance, if you’re a digital marketing agency in Manchester you could bid on keywords such as ‘SEO agencies near me’, knowing that people who search this term in your location will see your business.
GET Parameter – Often referred to as URL parameters or query strings, this term refers to a URL structure that can be used to gather specific data or adjust how a page’s content is viewed. GET parameters come in two forms:
- Active parameters – adjust the visibility of content on a page. The URL can be modified to either filter content or order it in a systematic way (the ‘?’ is always present in a parameter URL and is followed by the command you wish the page contents to follow).
- Passive parameters – a passive GET parameter doesn’t alter the visibility or order of content, but instead enables website hosts to collect user data. This data can be incredibly insightful for evaluating marketing campaigns, as the tracking feature of this kind of URL provides information about how a user landed on the page. UTM parameters are a common example: they work in collaboration with tools such as Google Analytics to gather data about page visits.
It is important to consider, however, that too many GET parameters used across a website’s subpages can have an adverse effect on rankings. An experienced SEO professional would utilise this handy feature in an SEO-friendly way, eliminating unnecessary parameter URLs to mitigate the risk of duplicate content damaging rankings.
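Python’s standard library can split a parameter URL into its component parts, which is handy when auditing a site for excess parameter URLs. A small sketch, using a hypothetical campaign URL on example.com:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical campaign URL: one active parameter (colour) filtering content,
# plus passive UTM parameters recording where the visitor came from.
url = "https://example.com/shoes?colour=black&utm_source=newsletter&utm_medium=email"

params = parse_qs(urlparse(url).query)
print(params["colour"])      # -> ['black']
print(params["utm_source"])  # -> ['newsletter']
```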
Google Adwords – Adwords (rebranded as Google Ads in 2018) is a platform run by Google which allows advertisers to pay to display their ads at the top of relevant search engine results pages. Adwords is the opposite of organic SEO because businesses don’t need to spend time writing keyword-rich content and waiting 3-6 months to watch it rise through the rankings. Adwords works on an auction basis: a user submits their ads and agrees to pay a certain amount per click of their advert. When someone types in a keyword that is relevant to your ad, search engines determine, in milliseconds, which adverts should appear at the top of the results, above the organic listings. The word ‘Ad’ is displayed prominently next to the content.
Google Alerts – Google Alerts is a notification service that you can set up to track activity related to your target keywords and search terms. You can set it up so that whenever there is a change in content indexed by Google for use in the search results, you receive an email notification.
For example, if you wanted to track your company name, a particular service or product, you could set up Google Alerts to notify you when changes are indexed. Google Alerts is commonly used for reputation management and link building because it’s a great tool for outreach and for identifying potential opportunities. It is also often used to monitor the competition and to track any changes they make to their relevant content.
Google launched Google Alerts in 2003. While the service has had its problems and has come under criticism in the past, it has come to be a widely used SEO tool.
Google Algorithm – A Google algorithm (or any search engine algorithm) is a complex computer program with a process and set of rules, which Google uses to retrieve its indexed data and deliver ranked search results.
You’ll often hear people refer to ‘the Google algorithm’ in the singular sense when talking about how Google’s search engine functions. But the truth is that Google is made up of many individual algorithms all working simultaneously. Google uses a combination of many algorithms when it delivers ranked web pages to users via its SERPs (search engine results pages), all based on a large number of different ranking factors.
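While Google’s actual algorithms are proprietary, the idea of many signals being combined into one ranked result can be illustrated with a toy weighted score. Every signal name and weight below is invented purely for illustration:

```python
def toy_rank_score(signals, weights):
    """Combine several (invented) ranking signals into a single score."""
    return sum(weights[name] * value for name, value in signals.items())

# Illustrative only: the real signal names, weights, and maths are Google's secret
weights = {"relevance": 0.5, "authority": 0.3, "page_speed": 0.2}
pages = {
    "page_a": {"relevance": 0.9, "authority": 0.4, "page_speed": 0.8},
    "page_b": {"relevance": 0.6, "authority": 0.9, "page_speed": 0.5},
}

ranked = sorted(pages, key=lambda p: toy_rank_score(pages[p], weights), reverse=True)
print(ranked)  # -> ['page_a', 'page_b']
```

The point of the sketch is only the shape of the process: many per-page signals, combined and sorted, produce the ordered list of results you see on a SERP.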
Google Analytics – Analytics is a service offered by Google that allows users to track and analyse the performance of their website. It was launched in 2005 and has since become the most widely used analytics service on the internet. Users can track session duration, bounce rate, pages per session, and demographic information, and can use that information to create targeted campaigns that improve conversions and leads.
Google Bomb – Google Bombing, or Googlewashing as it’s sometimes called, is the practice of getting a website to rank for terms and phrases that are in no way relevant to the product or service being sold there. Google Bombing is done by pointing large numbers of links at a page using irrelevant anchor text. It would be like writing about car insurance and linking the text to a website that sells LED lights. It is definitely a black hat SEO practice that looks to exploit Google’s algorithm.
Google Bowling – The mass building of unnatural links to a competitor’s website. This is considered a black hat SEO practice and began when Google started penalising sites for attempting to achieve higher rankings through linking on forums and other spam blog sites.
Google is extremely clever when it comes to distinguishing between natural and unnatural links, and at deciding how to deal with the sites which use unnatural links. To create the most relevant experience for its users, Google reduces the ranking of sites associated with spam links, reducing the chance that users will view those sites. While it may seem that building a large number of links is a good idea, carrying it out in this way can actually have a lot of negative consequences.
Google Dance – A largely historical term for the period of dramatic ranking fluctuation that occurred while Google rolled out its large, roughly monthly index updates in the early 2000s. During a ‘dance’, a page’s position could swing wildly from day to day as the update propagated across Google’s data centres. Since Google moved to continuous, incremental indexing, the phrase is sometimes used more loosely to describe the volatility seen during major algorithm updates.
Google Keyword Planner – To help your SEO and PPC campaign get off to a good start, the Keyword Planner could potentially be a handy tool. It is a free-to-use service, found within Google Ads, that can help you generate keyword ideas and inform you of bid estimations, both of which are key aspects of a thorough, and successful, marketing campaign.
The Keyword Planner allows you to search for keywords and groups of ad ideas to determine how they may perform. It can also help you identify areas that you may want to target, that are within your budget so you don’t have to worry about overspending to be successful.
As mentioned above, a key feature of this service is that it is completely free for people to use. The downside to this is that it doesn’t give you the most accurate data, it simply gives you estimations and broad ranges. However, something it does do that other services can’t do is suggest unique keywords that you won’t find elsewhere. This can potentially give you an edge over other businesses who’ve neglected to use the Google Keyword Planner.
Google Maps – A fully virtual mapping of 98% of the places in the world in which people live. Location details, street view and 3D recreations of locations are all available through Google Maps, including a directions function which has led to it being used in place of a sat nav. With live data feedback, Google Maps can provide real-time updates on traffic, road accidents and even the position of speed cameras.
Integrated into other features on Google, such as business pages, Google Maps can provide the location of business premises. This is tied in with other important details the business may want to share, such as contact details and a link to their website. It can also be used in organic and paid adverts on the search engine to define an audience by distance or location.
Google My Business – A tool that allows businesses to manage their online information across different Google features including Search and Maps. Google offers businesses the option to increase their online presence by entering relevant details into their Google My Business page. This may include details such as business contact information, opening times or even location. By entering this information, Google can understand more about the business and produce more relevant results for searchers.
By entering the location of a business, Google will be able to present your business as a result of a location-based search, meaning if someone is searching for a service you offer in the local area, your business will likely be provided as a result. Including as much information as possible about your business not only helps Google, but it also helps the searcher and prompts them to take an action on your site.
Google News – A news feed of collated articles based on the user’s preferences. Google’s algorithm reads your viewing habits and uses them to create a personalised news feed based on topics and news sources that you view regularly. Designed to provide quick hits of up-to-date information, the feature (which can be downloaded as a separate app) works best when embraced, as you’re able to select specific topics of news to follow.
In-depth features allow the user to view the weather in their area, local news or news which has been verified by a fact check. Acting as a hub for all of the news relevant to a specific user, you can also search for topics, locations and sources which you wish to explore. The ‘For You’ page is a collection of articles that Google’s algorithm has read and understood to be relevant to you.
Google Tag Manager – Google Tag Manager is a free application for managing and deploying marketing tags on your website (or on your mobile app) without modifying the code. A very simple example of how GTM works is as follows: information from one source of data (your website) is shared by Google Tag Manager with another data source (Analytics). If you have plenty of tags to handle, GTM is very helpful, because all the code is stored in one location.
Grey Hat SEO – Grey Hat SEO is a risky practice; the term refers to the use of techniques that aren’t clearly defined as ‘bad’ by Google in the material it publishes about SEO.
Grey Hat SEO is a murky area and is a topic of SEO that is contested by many. There are some clever pieces of innovation that businesses could undertake which could boost their site, or see them lose thousands of pounds in lost traffic. Further, what is considered Grey Hat SEO one year, may well be classified as White Hat or Black Hat SEO the following year by Google. There is a great deal of risk involved in Grey Hat SEO and if search engines decide that these tactics violate their terms of service, you could suffer greatly. Therefore, it is best to stick to White Hat SEO practices, have some patience, and be confident that your long-term approach, which follows the rulebook, is best for your business.
Guest Posting – Guest posting is a straightforward concept – you post on another blog or website as a guest. By doing so, you’ll earn greater exposure because of the external backlink to your own blog from this site. To successfully guest post, or blog, on somebody else’s platform, you should establish strong relationships with others, as growing your network can open up opportunities to guest post. By guest posting, you may increase your influence both on your own site, external sites and even on social media platforms too, as it gives you exposure to another, wider audience.
Guest posting also offers potential benefits to the host site, making it beneficial for both the guest blogger and the host. By hosting guest bloggers, the host site keeps generating new and interesting content regularly. This is great for SEO and makes you look like a reputable, reliable source of information. If you’re posting on other blogs, you should therefore offer to let the host post on your site too. That way, you’ll both reap the benefits of guest posting. When guest blogging, it’s important to pay attention to the links that you include in your content. You should ensure that your anchor text is relevant and useful in the context of the URL that you are linking to. As with all SEO strategies that involve linking, your links should be useful, legitimate and trustworthy.
HTML – Standing for Hypertext Markup Language, HTML is the standard code used to create web pages and applications, and it is the code in which search engines read your website. HTML is used to create heading tags and site maps, and HTML source code is the foundation of any web page. Want to read this page’s HTML? Just right-click and select Inspect, and you’ll be shown the full HTML source for this page.
HTML is a core part of web development and is often the first coding language people learn to build a website. It’s also a crucial part of SEO, as the vast majority of technical SEO is done within the HTML source code. When done well, technical SEO uses HTML understanding to keep HTML clean and optimized, making it easier for search engines to crawl and read your website.
HTTP (Hypertext Transfer Protocol) – The web is built on the Hypertext Transfer Protocol (HTTP), the protocol used for loading web pages via hypertext links. HTTP is an application layer protocol for transferring data between networked devices, and it runs on top of the other layers of the network protocol stack. A standard HTTP flow involves a client sending a request to a server, and the server sending back a response message.
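The request and response messages themselves are plain text with a fixed shape: a start line, header lines, a blank line, then an optional body. A minimal sketch in Python that shows both messages and picks the status code out of the response (both messages are hypothetical examples):

```python
# Hypothetical request/response pair showing the shape of HTTP messages:
# a start line, header lines, a blank line, then an optional body.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html>...</html>"
)

# The status line is the first line of the response
version, status_code, reason = response.split("\r\n", 1)[0].split(" ", 2)
print(version, status_code, reason)  # -> HTTP/1.1 200 OK
```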
Head Section – The head section of a web page refers to the part at the top of an HTML document that doesn’t display in the web browser when the page is loaded. This is a sort of behind the scenes part of the web page, containing things such as metadata, links to CSS, and more.
The <head> tag is placed inside the <html> tag, before the <body> tag, and browsers do not render its contents on the page. The metadata that the head section can contain includes the document title, character encoding, author information, styles, scripts, and more. It’s an important part of on-page and technical SEO because it gives an opportunity to include important keywords – in the title and meta description – that describe the page to Google.
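Because the head section holds the title and meta description, it’s a common target for automated SEO audits. A rough sketch using Python’s built-in HTML parser, run against a hypothetical page:

```python
from html.parser import HTMLParser

class HeadReader(HTMLParser):
    """Pulls the <title> text and meta description out of a page's head."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "description":
                self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

reader = HeadReader()
reader.feed(
    "<html><head>"
    "<title>Wall Tiles | Example Shop</title>"
    '<meta name="description" content="Browse our range of wall tiles.">'
    "</head><body>...</body></html>"
)
print(reader.title)        # -> Wall Tiles | Example Shop
print(reader.description)  # -> Browse our range of wall tiles.
```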
Homepage – A homepage is the web page on a website that operates as the primary starting point of that website. It is the default web page that you load when you visit a web address domain name. For example, visiting embryodigital.co.uk will take you to the Embryo Digital home page.
The home page of a website often includes a contents section and a navigation bar with links to the other pages of the website – it is essentially the website’s hub. There isn’t a standard home page layout, but many include navigation tools such as a search bar, plus informational content about the website and business. The home page is located in the root directory of a website.
A homepage should, very quickly, explain what the company is, what it sells, and how people can get in touch with the company to enquire, ask questions or request a quote. The homepage sets the tone for the rest of the website and should be regularly updated to ensure company information is always correct.
Hyperlink – A hyperlink is a clickable link that takes you from one document to another – whether that’s another page on the same website or a page on an entirely different one. It can be in the form of an icon, graphic or piece of text and, once clicked, will take you to the destination it references. Text hyperlinks are usually blue and underlined, and when you hover over one your mouse cursor changes from an arrow to a hand. Hyperlinks are found all over websites but can also be used in PDF documents and other similar pieces of content to allow people to jump to different places quickly and easily. Essentially, hyperlinks allow people to move across the web at super-fast speed.
IP Address – An IP address (or internet protocol address) is a series of numbers that identifies the location of a server – and so a website – on the internet. IP addresses often have a domain name assigned to them, as this is a much easier address for humans to remember than a string of numbers, but the IP address is still the main way in which the internet and your browser locate a website.
IP addresses can be either dedicated, where a website has its own unique address, or shared. Shared IP addresses are used when several websites share an address on a server.
While your internet protocol address is not known to be a ranking factor, it can affect site performance. For example, a dedicated IP address may improve site speed, which, in turn, is a Google ranking signal.
Image Carousels – A feature that can usually show up to five images and allows site creators to display a series of featured images. A similar feature is used on social media ads where ‘tiles’ can be used as a swipe-to-view advert. These are also in use quite regularly on websites as they can display a series of images without covering large sections of the webpage.
These can be extremely helpful when a site creator would like to display images on a webpage whilst still leaving plenty of room for text. However, their effectiveness has been challenged by some, with claims that they deliver low conversion rates and fail to engage customers in the way other content does.
Inbound Link – An inbound link – or a backlink – is a link to a web page from another website. It’s a term used to differentiate between links coming from other websites and the internal linking on your own website. Every link is inbound on one side and outbound on the other – inbound linking refers to those coming into your website. For example, if Embryo Digital were to link to your website, that would be considered an outbound link for us, but an inbound link on your side.
Commonly known as backlinks, inbound link building is a common and powerful SEO strategy that can have a great impact on your search rankings. For further reading, we wrote a blog compiling great SEO posts all about gaining as many good backlinks as possible. Building a strong link network and securing high authority inbound links is an important search engine ranking factor and is a big part of most SEO strategies.
Index Coverage Report – A report of the URLs you own and the status they currently occupy in Google. All of your URLs will be listed and grouped by the status and the reason for that status. For example, if there is an error status, the reason for the issues will explain why the URL has been flagged as an error. The index coverage report is the best way for a site owner to see the collective status of how their site is performing.
Although having 100% coverage may sound like a positive thing, it means that every single page on your site is being indexed. While this may be okay for sites with a small number of pages, it may mean for others that pages such as order confirmations are indexed, which can damage your site’s ranking as they likely won’t be optimised for the relevant keywords.
Indexed Pages – Pages of a website that have been crawled by a search engine and indexed as part of its database. Pages can be indexed upon request, when the website owner asks the search engine to crawl the site, or naturally, as the search engine finds the page through high-quality and relevant links.
By having its pages indexed, a website can improve its domain authority and be officially recognised by a search engine. This increases the chances of your web pages showing up in a Google search and as a result, drives a higher amount of organic traffic to your website. If your website is not indexed, it’s likely due to it being new as Google has to make its way through millions of domains. Alternatively, if the pages still aren’t indexed after a long period, the issue could lie with the structure of your sitemap.
Indexing – Indexing in the context of SEO is the process whereby Google’s bots crawl your new website, page, or blog and add it to the search engine’s index so that it can appear in results for your chosen keywords. The indexing process involves Google understanding what your page is about (which is why linking and ontology are both so important) and rewarding it with a ranking. As you add more content and information to your page, your rank can improve as Google rewards you for providing users with more information and context about a topic than your competitors.
Infographic – Infographics are ways to visually represent data – such as statistics – or any other information or knowledge that needs to be seen and understood clearly and quickly. Several types of infographics can be created for various purposes, such as statistical, geographic, process, or informational infographics. As humans, we are good at visually digesting patterns and trends, and infographics make use of this skill.
Infographics are a great way to improve your SEO. They can be used as an effective digital marketing tool and can play an influential role in increasing brand awareness, especially if the purpose of your infographic is to share information or data about your business, products or services. By increasing brand awareness and generating interest amongst your audience, they can increase your web traffic.
Because infographics are easy to understand and engaging when designed correctly, they are also a great piece of shareable content. By getting your infographic shared and published across different platforms or publications, and linking back to your site, you’ll increase your reach, your exposure and your credibility. Ultimately, when designed and shared effectively, infographics have the potential to be a key marketing tool that, when used alongside other web elements, can increase your ranking in search results.
Information Architecture – Information architecture (IA) in SEO is the site structure and overall hierarchy of a website’s various web pages. The information architecture is a focus on organising and labelling a site’s web pages and content into an efficient, coherent, clear and effective structure.
Information architecture is important because not only does it make your website more navigable and user friendly, but it also enables search engines to more easily understand, index and rank your content by improving crawlability. The goal is to help both users and search bots find the information they need more easily.
Setting up a strong site structure and information architecture requires understanding the hierarchy and importance of different web pages and pieces of content, allowing them to become part of a larger, more coherent picture. The main components of information architecture are:
- Organisation schemes and structures
- Labelling systems
- Search systems
- Navigation systems
Informational Queries – A search conducted purely to gain information on a topic. A user enters a keyword or phrase into a search engine with the aim of learning something, rather than buying or navigating to a specific site. By categorising queries into these different types, search engines can determine which kinds of results to prioritise for the searcher.
Each different word in the query can be interpreted as a different driving point for results. This allows Google or whichever search engine you’re using to provide multiple results which are all relevant. It is also able to prioritise certain pages or sites which it deems the most relevant to you.
Intent/User Intent – When it comes to SEO, the intent is how we refer to what users want when they enter a query into a search engine. It’s the ‘why?’ behind a search term that helps us to understand what users need and, by extension, how best to give it to them.
Search intent, often categorized by SEOs as either informational, navigational, or transactional, gives us an insight into who is searching for what, and why. This, in turn, allows SEOs to adapt their offering to suit the audience, aligning content and optimizing for specific search intent.
Internal Link – An internal link is a link that connects web pages on the same website. If the destination of a link is on the same website on which that link can be found, then it is considered an internal link.
Internal linking is used mainly for navigational purposes because it helps both visitors and crawlers (search bots) to more easily move around and understand your website and its page hierarchy. Internal linking keeps traffic on the same website and improves user experience, while also making your website easier to crawl and index. SEOs employ internal linking strategies to build a proper website structure and hierarchy.
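A crawler or site audit tool decides whether a link is internal by resolving it to an absolute URL and comparing hosts. A simple sketch in Python (example.com is a stand-in domain):

```python
from urllib.parse import urljoin, urlparse

def is_internal(link, page_url):
    """True if a link points at the same host as the page it appears on."""
    absolute = urljoin(page_url, link)  # resolves relative links like '/about'
    return urlparse(absolute).netloc == urlparse(page_url).netloc

page = "https://example.com/blog/"
print(is_internal("/contact", page))                  # -> True
print(is_internal("https://example.com/shop", page))  # -> True
print(is_internal("https://other.org/page", page))    # -> False
```

Note that this naive check treats subdomains such as www.example.com as a different host, which a real audit tool would usually normalise.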
Internet Service Provider (ISP) – An ISP is a company that offers a range of services allowing users to access and use the internet. As such, ISPs are commonly thought of as a gateway to everything that is on the internet. These companies can operate in many forms, such as privately owned companies or non-profit organisations. ISPs provide internet access to businesses and consumers and can also offer other services such as domain registration and web hosting. ISPs have evolved significantly since the internet was founded: access used to be gained through a dial-up connection, before moving to satellite, copper wire, fibre optics and other high-speed broadband technology.
KPI – This stands for key performance indicator and is generally a way of measuring the success of either an activity or employees within a workforce. It’s not limited to this, however, and is commonly used in other areas of business, including websites. Setting a KPI for a website may require the site to achieve a certain amount of click-throughs or sales within a set period. As a result, if this is not achieved, it indicates to the website owner that changes need to be made to improve the effectiveness of the website.
Keyword Analysis – An essential part of any SEO strategy, keyword analysis is the process of searching for and analysing keywords that are relevant to your industry. It’s so important because it gives you the jumping-off point to write content that targets these keywords, gets people to your site and, ultimately, increases sales. Finding relevant keywords that are of good quality, with high search volumes, is essential and is the start of any search marketing campaign. Broadly, there are three types of keywords – short, mid, and long tail – and it’s important to target a mix of these, as they are all searched by different people with different intentions.
Keyword Cannibalisation – This rather striking term refers to a surprisingly common issue: what happens when a website has several pages that target the same keyword. When more than one page ranks for the same keyword, the pages diminish each other’s authority and reduce click-through rates (CTR) and conversion rates because they are competing with each other – hence the ‘cannibalisation’ part of the term.
To avoid this, businesses must create a clear sitemap to ensure that each page targets a specific keyword. A good example: if a company sold shoes and had a page targeting ‘white shoes’, cannibalisation would occur if they then created a new page called ‘More White Shoes’. As both pages target the same term, they would split the click-through rate and conversions the company receives from the traffic driven to them. While ranking twice may sound like a good thing, it isn’t: it is much better to have one page with a 3rd-place ranking than two pages ranking 6th and 7th.
Keyword Density – The term keyword density refers to how often a given search term or keyword appears in the content of a page. It is a metric calculated as a percentage – for example, if you have a 500-word blog and a keyword appears five times then the keyword density would be 1%. Sometimes known as keyword frequency, keyword density is a foundational aspect of SEO.
Keyword density as a metric is interesting because of the way it has changed over the years. Search engines and their algorithms have matured a great deal – ‘keyword stuffing’, for example, used to be a viable SEO tactic in the ‘old days’, whereby some SEOs would force a high keyword density to influence SERP performance. However, some SEO experts believe that search engines now use high keyword density to identify spammy content, which can potentially lead to lower rankings. Some SEOs believe that keyword density hasn’t been a ‘thing’ for a number of years.
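The percentage calculation described above is simple to express in code. A quick sketch in Python that reproduces the 500-word, five-occurrence example:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword as a percentage of the total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = keyword.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100 * hits / len(words) if words else 0.0

# 500 words, 5 of which are the keyword -> 1%, matching the example above
text = " ".join(["filler"] * 495 + ["tiles"] * 5)
print(keyword_density(text, "tiles"))  # -> 1.0
```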
Keyword Difficulty – This is a metric used to define the competition for a certain keyword. When keyword research is carried out, there is software that will return this data and inform the user how many other sites and pieces of content are being optimised for that same keyword. The more sites targeting the keyword, the higher its difficulty rating, which means it’s going to be harder to beat the ranking of those existing sites.
Keyword difficulty is extremely valuable as it can help users determine which keywords they should optimise for. If the keyword someone is looking to target has an extremely high difficulty, they may opt for a similar keyword or phrase with a lower difficulty, making it more likely they’ll rank well than if they had attempted the difficult keyword. Finding the right balance is key, though: there is a reason these keywords have a high difficulty – they have a high search volume, because they are the most relevant and most popular terms in searches.
Keyword Frequency – This refers to the number of times a keyword is mentioned on a webpage or in a piece of content. The more times a keyword is mentioned, the higher the frequency. Frequency is closely linked to keyword density, which expresses the number of mentions as a percentage of the total word count. Getting the frequency right is key to a successful SEO strategy: too low and your page won’t rank for the keyword you’re looking to target, too high and you run the risk of over-optimising your content. Try to use variations and synonyms of the keyword to keep things on track.
Keyword Proximity – Keyword proximity is the term used to describe how close keywords are to one another in a body of text. For example, let’s say you sold black shoes: you’d naturally want to target the keyword ‘black shoes’. The closer these two words are together in the text, the better the keyword proximity – ‘We sell black shoes’ is better than ‘We sell shoes that are black’. Good keyword proximity allows search engines to better understand the context of the page, and it is one of the easiest things people can do to support a successful SEO strategy.
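Proximity can be measured as the number of words separating the two keyword terms. A rough sketch in Python, using the black-shoes example from above:

```python
import re

def min_proximity(text, first, second):
    """Smallest number of words separating two keywords (0 means adjacent)."""
    words = re.findall(r"[a-z']+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w == first]
    pos_b = [i for i, w in enumerate(words) if w == second]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) - 1 for a in pos_a for b in pos_b)

print(min_proximity("We sell black shoes", "black", "shoes"))           # -> 0
print(min_proximity("We sell shoes that are black", "black", "shoes"))  # -> 2
```

The lower the number, the better the proximity: the adjacent phrasing scores 0, while the separated phrasing scores 2.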
Keyword Research – One of the core and fundamental skills of any successful SEO is that of keyword research. Search engine optimisation can’t live without keywords, and keyword research is the process by which SEOs determine what relevant words and phrases users might search for, and which of these are the best to be optimised for.
Keyword research isn’t just about simply identifying what your customers are looking for, it’s about finding out which terms have the most search volume and competition, which keywords will either be easy or difficult to rank for, and researching which keywords are going to drive the most traffic to your site, increase brand exposure, and provide your business with the most profit possible.
Keyword research is one of the foundations of any SEO campaign.
Keyword Research Tools – A website or software which uses an algorithm to determine which are the most popular keywords used on Google. The tool will be able to identify exact match keywords based on how many times they appear on relevant web pages and the density of the keywords. It will also be able to identify long-tail keywords, which include the keyword you suggested, but in a longer format such as a question. Reports based on these keywords will be able to identify how competitive they are and how much value they will have to your ranking position.
These tools are used to identify which keywords a piece of content should be optimised for to achieve the best ranking on Google.
Keyword Stemming – Stemming refers to a search engine’s ability to understand the different forms a particular word can take. Google in particular has used stemming in its algorithm for years. Stemming means you can use the word ‘buy’ in a keyword, then use ‘buying’ and ‘bought’ in other contexts, and search engines will understand that they all relate to the same thing. Stemming is important because it makes your content read more naturally and reduces the need to crowbar in keywords exactly as they are written. You can afford to be more relaxed with this and still receive high rankings.
Keyword Stuffing – You want to avoid this. Keyword stuffing refers to the overuse of a keyword in a way that doesn’t read like natural conversation. Keyword stuffing used to be popular when algorithms rewarded it but, while it is still used by some in an attempt to gain an advantage, it is now largely punished. An example of this would be: “We absolutely love skips and skip hire so if you want to hire a skip in the UK contact us and hire your skip today”. Natural content that flows nicely and includes one mention of your specific keyword will rank much higher than the above example, simply because it is nicer content for the user to read, which is what search engines care about.
Knowledge Graph – Google’s Knowledge Graph is a useful concept that benefits so many of us, yet not many people stop to understand how it actually works. Ever typed a quick query into Google and received a fast, concise and enlarged response? Go on, try it now. Search, for example, ‘Toy Story release date’. What follows is a very specific, short answer that eliminates the need to keep searching for answers within the page results. A knowledge graph is a hub of information derived from different entities and the connections identified between them. This can include tangibles such as locations, organisations and individuals, as well as intangibles such as colours and emotions. The data stored on the web around certain topics helps Google to recognise the most common answers to search queries, allowing the search engine to collate the most relevant information for users. By implementing a strong combination of SEO practices, such as link building and regularly adding substantial, informative, authoritative content, Google will begin to validate your content and connect it to other sources.
Landing Page – In short, a landing page is nothing other than the page on your website on which a visitor arrives after clicking a certain link or ad. It’s called this because when people visit your website, they have to ‘land’ somewhere. Wherever they first land, that’s technically a landing page. But in general, the term has come to mean something more specific.
The term landing page is perhaps more commonly used to refer to a specific standalone web page. This is a page designed with the sole purpose of capturing leads and generating conversions on your website. While Google Analytics uses the term landing page to refer to any page on which a user first visited a site, you’ll almost always hear the term used for specifically made, standalone pages.
Most websites build landing pages to complement specific campaigns, whether it be for a special offer, new product, or anything else. Paid advertising and search results will often send users to specially made landing pages to encourage specific conversions.
Another key feature of a landing page is that, very often, users receive something in exchange for their information. This could be a guide to an industry-related topic or a piece of content that offers insights that can’t be found for ‘free’ on the internet.
Landing pages go beyond standard content: where standard content aims to educate and entice potential customers in, a landing page looks to secure the sale. Clarity, short copy, and easy-to-fill-in forms are all key aspects of a successful landing page.
Link Building – Link building is the process of increasing both the quality and quantity of backlinks – or inbound links – to your website to improve your SERP visibility and search ranking. Link building is a major tactic in most SEO campaigns and, alongside great content and web optimisation, has a great influence on your rankings.
The aim is to get other trusted, relevant and high authority websites to link to your website, sending traffic in your direction and also signalling to Google the relationship and trust between your websites. There are a lot of different techniques SEOs use for link building, some of the most common of which include:
- Producing quality content to organically attract editorial links from relevant and high authority websites.
- Conducting outreach to influencers, bloggers and media outlets to build relationships and attract links to your website.
- Building partnerships with other relevant businesses and organisations within your industry or niche.
- Writing guest posts and generating content to be published on other websites.
- Manually building links by submitting your website to online directories and review websites, or building links between different websites you own.
- Paying for them through sponsored content, paid reviews and more.
No SEO campaign is complete without a properly targeted link building strategy in place – it’s one of the tenets of great SEO.
Link Decay – Also called ‘Link Rot’, ‘Link Death’ or ‘Reference Rot’, Link Decay is a term that describes the process by which a hyperlink no longer points to the original file, content, page, or server it was created to reference.
Link Decay can occur when the resource it links to is changed to a new address or has been made permanently unavailable by the domain owner. Link Decay can damage website authority because the content it is citing is no longer as authoritative as it once was.
This topic is researched and studied by people around the world and, because it touches on the internet’s ability to remember and preserve information, estimates of decay rates differ dramatically. A good rule of thumb, though, is to ensure that the links you do use don’t point to a personal website or more frivolous blog content. Another tip is to use WebCite to permanently archive information.
Link Diversity – Diversity of links is a term that aims to describe how varied the links that feature in your content are. If you get as many diverse links in your content as possible, search engines will reward you with high rankings. Diversity can include the type of content you link to, such as blogs, videos, and articles. It can also include the type of domain, such as ‘.co.uk’, ‘.edu’ and ‘.org’. Be diverse in your choice of anchor text too; this will ensure that links are placed more naturally, while still being relevant to the content they are linked from.
Link Donors – It goes without saying that building domain authority is what boosts a website’s ranking. But, how can domain authority be built effectively? Earning links from other notable websites is one of the best ways to establish credibility. The higher the quality of links you have connecting from another site to your own, the greater the impact on increasing domain authority. Donation links are a particularly straightforward type of backlinking. By researching and identifying sites that happily accept links for donations, you can expect a link to appear shortly after enquiring. However, it is essential that you look at the authority of the donation site to evaluate how impactful the backlink will be for your website. Obtaining link donors can be pricey, so it’s always important to explore all of your options to locate the right sites!
Link Equity – Often more commonly known as ‘link juice’ by SEOs, link equity is a term that describes the way in which a link can pass authority from one page to another. It’s a known search engine ranking signal, which is why it’s an important part of any SEO or link builder’s job.
The value of link equity is determined by a few different factors, including the topical relevance of the link, the linking page’s authority value, HTTP status, and much more. Essentially, if a page with high authority and good SEO links to another website, that will have high link equity.
Link equity is a known ranking factor that Google uses to determine a page’s ranking, but it doesn’t only count for external and backlinks. Internal links can also pass link equity, allowing SEOs to better control the flow and structure of authority through a site’s structure.
Link Exchange – Back in the good ol’ days of the internet – around 1997 to the early 2000s – link exchange emails were at their highest volume. After a real surge of businesses finding themselves with a website, and reading blogs about how to promote themselves in search engines such as Excite, Yahoo, and Google, they would invariably send out emails to other website owners, having learned that links were ‘everything’ when it comes to online marketing success. Essentially, an email would be sent from one website manager to another, asking for a link to and from each other’s sites, so that both could benefit. Over the years, Google pushed out many messages saying that this tactic was not as effective as people thought it to be, and over time these emails dropped to almost nil.
Link Farm – Another remnant from the very experimental and ‘Wild West’-like late 90s-early 2000s were link farms. In many cases, these were single page websites – or single pages of links on an otherwise normal website – that housed anywhere from tens to thousands of links to various websites with zero thought of theming or rationale. Over time, these pages would become more palatable as Google changed algorithms. Theming was an option for many such sites, while others just soldiered on until their impact was negligible or even detrimental. Thankfully they don’t exist anywhere near as much as they used to. However, they do still exist in the form of PBNs (private blog networks) – sites that look a lot more palatable to the human and robot eye – although great work is being done by Google to find these sites and rid them of their positive SEO impact. At the time of writing this entry at the end of 2021, link farms are very few and far between, and PBNs have nowhere near the impact that they used to have.
Link Graphs – A link graph is a visual representation of the surrounding network of a website or specific URL. In other words, this graph is a map that pinpoints every backlink that connects to the central webpage in question. A link graph is used to collect useful data about how the authority of a domain may be increasing or decreasing – as the map demonstrates the quantity and quality of the domains that link to the focal URL. By analysing the relationships within the dataset, a business can accurately audit its website and keep track of the backlinks obtained.
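As a simple sketch of the underlying data structure – the domain names here are made up, and real tools store far richer metadata – a link graph can be modelled as a mapping from each target page to the set of domains that link to it:

```python
from collections import defaultdict

def build_link_graph(backlinks):
    """Build a simple link graph from (source, target) pairs,
    recording the distinct referring domains for each target."""
    graph = defaultdict(set)
    for source, target in backlinks:
        graph[target].add(source)
    # Sort for stable, readable output
    return {target: sorted(sources) for target, sources in graph.items()}

links = [
    ("blog-a.example", "shop.example"),
    ("news-b.example", "shop.example"),
    ("blog-a.example", "shop.example"),  # repeat link, same domain
]
graph = build_link_graph(links)
```

Deduplicating by domain matters here: three backlinks from two domains is a weaker signal than three backlinks from three, which is exactly the quantity-versus-quality distinction a link graph is meant to surface.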
Link Hoarding – When a website solely builds inbound links and neglects outbound links. Google and its algorithm read websites from a neutral perspective, meaning they see higher value in links that are interconnected. While some think that only building inbound links will protect their link value, on the basis that outbound links can leak value away, this isn’t the case.
If your content is referenced as equally as it references other content and websites, Google will read this as a much more relevant source. Link hoarding can lead to Google flagging your site as spam and penalising it as a result. Understanding that link building is a two-way street and ensuring your site isn’t built solely on inbound links is important to maintaining a good ranking on Google.
Link Shingling – This is a term that refers to linking to a page from multiple pages but with different link text each time. This link text needs to be loosely related to the page it is linking to, and it’s a great way of helping Google understand the target page better, because you’re showing that other words are relevant to the page you’re linking to. Getting, for example, nine or so relevant keywords among just four links will help Google understand the site more.
Link Spam – Spamming links involves posting out-of-context links on comment boards, forums, websites and blogs that aren’t relevant to where the link goes. The goal of link spamming is to increase the number of external links that a page has which, in theory, will boost its authority. Link spam is frowned upon in the SEO world because it’s not providing any value and is a bit of a cheap trick. Also, because search engines are smarter than ever, the rewards for this practice aren’t as great as they once were. A long-term approach that is genuine, and looks to provide value will organically boost your authority over time.
Link Velocity – The speed at which your website is being linked to by other sites. This accounts both for other users building these links and for you pointing links back to your own site. Some think that achieving a large number of links in a short period – i.e. a high link velocity – will lead to them being penalised by Google, but this isn’t the case. Google won’t penalise you for a high link velocity in itself, as it judges links on their quality.
That being said, the chances are that if your website is receiving a large number of links in quick succession, they’re unlikely to be natural links, and this will harm the ranking of your website. Low link velocity – i.e. building links over a long period of time – generally constitutes a better quality of link and ensures the links are natural, which is much healthier from an SEO perspective.
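As an illustration only – SEO tools each define and measure this slightly differently – link velocity can be thought of as the number of new links gained per 30-day window:

```python
from datetime import date

def link_velocity(link_dates, window_days=30):
    """Average number of new backlinks gained per `window_days`,
    based on the date each link was first seen."""
    if not link_dates:
        return 0.0
    # Days between the earliest and latest link (at least 1)
    span = (max(link_dates) - min(link_dates)).days or 1
    return round(len(link_dates) * window_days / span, 2)
```

For example, three links first seen over a 90-day stretch works out at one new link per 30-day window: a slow, steady profile of the kind the entry above describes as healthiest.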
Local Search/Local SEO – For many small and medium businesses, targeting the correct audience is paramount for gaining revenue and custom. One of the main targets that businesses tend to tackle is their local area, as this is where the majority of their customers will travel from. More often than not, a person looking for a service within close proximity to themselves will add their location area to the search term. Let’s say, for example, Julie in Manchester wants to find a local hairdresser. Julie will then go on to search for ‘hairdressers in Manchester’. This is where SEO plays a major role in meeting the search criteria with relevant onsite content, helping to guide this user to their webpage for the services they wish to receive. Simple? Yes. Effective? Absolutely! Successful Local SEO marketing should include the following:
- Content containing specific postcodes, cities or towns
- Inclusion of the phrase ‘near me’ within on-site content
- Connecting the site to GPS-based software
Long-form Content – Whilst long-form content can seem self-explanatory, there is a lot of debate around the minimum length of long-form content. An effective piece of long-form content would usually be around 1500-2000 words at a minimum. Whilst longer in length, this type of content should still serve a clear purpose and remain engaging for the target audience. The upper limit to the amount of content on a page depends on numerous factors such as the purpose of the content, the target audience and the topic.
Pages that rank well on Google for certain keywords and ontological phrases are long-form pieces of content that are written well and have plenty of links within them. To write this type of content, you must think critically and conduct thorough research into the subject you are writing about. Long-form content also gives you plenty of room to write a piece that is well-written with keywords and ontological phrases subtly weaved through – keywords should not be jammed into the piece simply to fulfil SEO goals, this tactic will not create long-term success.
The best content for SEO and a good ranking on Google will be well-researched – to show the search engine and the user the knowledge that you have on the subject – and will use best practices when it comes to incorporating keywords and any relevant ontological phrases. Using research tools, such as Frase, Ask The Public and Ahrefs, to investigate the search volume of keywords, high-ranking sites in the relevant SERPs and frequently asked questions to search engines can help to inform your long-form content. Above all, long-form content will establish your brand as authoritative and credible, which will be rewarded in the SERPs and in leads and conversions.
Long-tail Keyword – A keyword that spans more than a single word. When used in written content, long-tail keywords are generally more specific and can be highly valuable if used correctly. They’re used as a more targeted approach to customers, including extra words to make it tailored to a more specific audience. For example, if the keyword is “shoes” and you want to turn this into a long-tail keyword to target a more specific audience, it could become “men’s smart shoes”. Just by adding those extra words, it’s already targeting a much smaller group of people, therefore making the chance of purchase higher.
Long-tail keywords are best used as part of a question or search phrase as the longer nature of the keyword will account for more of the question. They’re also more effective when used in voice search terms.
Major Search Engines – The small group of search engines that handle the vast majority of the world’s search traffic – most notably Google, alongside Bing and Yahoo. Because Google accounts for the overwhelming share of searches, most SEO work is geared towards its guidelines and algorithms, although the same fundamentals apply across all of the major engines.
Manual Action – If a human reviewer working for Google thinks that a web page or website doesn’t adhere to Google’s quality guidelines, then they may invoke a ‘manual action’, which would mean that a web page or a site as a whole will see lower rankings – or even be removed from Google’s index altogether. In many cases, this will happen without any prior warning. Google does not like its index to be manipulated, and a manual action is the most fierce way for them to show this.
If you are unfortunate to have been given a penalty from the result of a manual action, you will most likely find a notification about it – not necessarily how to resolve it – in Google Search Console.
Mega Menu – A mega menu is an expandable menu on a website. These are most commonly displayed as a dropdown menu, showing lower-level site pages to the user. When a mega menu is made effectively, it can improve the contextuality of your website, which is beneficial for SEO. With mega menus, an effective tactic is to move the menu HTML to the bottom of the HTML document. This makes your web pages more contextual – the H1 of every page, along with the first paragraph, sits closer to the top of the HTML document. However, the user will notice no difference, and their user experience will not be impacted.
Whilst having an effective mega menu is important, it should not be your only focus regarding links. You should also ensure that you interlink your pages with hyperlinks, situated within paragraphs in your webpage. This helps Google to understand your site better – with greater contextuality – which is beneficial for SEO.
Meta Description – A meta description is a short snippet of content that is found under the title of a piece of content that is indexed on Google. It aims to summarise the content that is available on that link, to allow the reader to decide whether or not they want to click on that link.
Meta descriptions are a perfect opportunity to entice potential readers to your site, and they give you a chance to incorporate your target keywords into another aspect of your content. This further boosts the SEO performance of the content and gives you a greater chance of achieving a higher rank.
Think of meta descriptions as an advertisement for your content, and write this short piece of copy as such: make it compelling, exciting, and readable. Remember, you only have seconds to attract someone who is scrolling through page one. If you don’t write one, Google and other search engines will automatically pull through a sentence of text from the content, though this is nowhere near as impactful as writing your own.
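Most snippets are truncated somewhere around 155–160 characters, though Google publishes no fixed limit. A small sketch of a length check – the 155-character threshold here is a rule of thumb, not an official value:

```python
def check_meta_description(description: str, limit: int = 155) -> str:
    """Flag meta descriptions likely to be truncated in the SERPs.
    The default limit is a common rule of thumb, not a value
    published by Google."""
    length = len(description)
    if length == 0:
        return "missing: the search engine will generate its own snippet"
    if length > limit:
        return f"too long ({length} chars): likely truncated with '...'"
    return f"ok ({length} chars)"
```

Running this across a site’s pages is a quick way to find descriptions that are absent or will be cut off mid-sentence in the results.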
Meta Tags – Meta tags provide information that can be read by search engines and web crawlers. The information pertains to the webpage and is found in the HTML of the document. Search engines use meta tags to retrieve and understand information about the webpage, which they can then use for numerous things – for example, to determine rankings and to display search results snippets. Whilst some meta tags should always be used, others are optional. Examples of meta tags that don’t always need to be used include social meta tags, robots, language, geo, refresh, and site verification tags. On the other hand, for good SEO practice, you should always include meta content type, title, meta description, and viewport. There are so many different types of meta tags that some are now deemed unnecessary by SEOs far and wide. For example, many say that the following meta tags serve very little purpose and are a waste of space, even in the eyes of Google – author, rating, expiration, copyright, abstract, cache-control, distribution, generator, and resource type.
Metadata – Put simply, metadata is the data that informs search engines what exactly your website is about. Metadata gives descriptive information about a website and its content. Important examples of metadata include title tags, meta descriptions, and robots, but metadata can be found all over a website.
Metadata optimization is an important part of technical SEO and improving the nuts and bolts of your website. Like a lot of SEO, it’s all about making life easier and more clear for Google, making your website more navigable and quicker to crawl, index and understand.
Metrics – A metric is the quantifiable measure of one piece of data. For example, ‘Page views’ is a metric of how many people viewed a particular page on a website. Metrics are very singular, important, and extremely clear. Businesses use metrics to measure success and will set predetermined benchmarks that define said success. There are hundreds of metrics in the world of SEO, and it can be hard to find the most relevant ones to use. And, it’s important to note that your chosen metrics will all depend on your business, products/services, industry, and sector. With that in mind, the five most important metrics could be argued as being the following:
- Organic Traffic
- Click-Through Rates
- Bounce Rate
- Keyword Rankings
- Domain Authority
Microblogging – Microblogging involves writing concise, short content, often for platforms that are specifically designed to publish and share this type of content, such as Twitter and LinkedIn. Links, images and videos can also accompany the text on a microblog, to maximise audience engagement and interaction. The content and the file size of a microblog is typically much smaller than a standard blog, and of course long-form content.
Whilst people often enjoy consuming these shorter snippets of content as opposed to a lengthy blog post, long-form content is still the key to ranking high on Google and generating more web traffic to your blog post, and your site overall. That’s why microblogging usually takes place on platforms that are designed for this type of content, such as Twitter, Facebook, Instagram and LinkedIn. In other words, microblogging should not replace long-form content. Instead, it should act as a different tool and type of content to use within your digital marketing strategy – it can be well implemented into a social media marketing strategy. Microblogging is particularly effective when accompanied by a link to a long-form piece of content that has had its core messages taken and used in a microblog to promote the piece.
Micromarking – The purpose of micromarking is to help search engines quickly find, and understand, the content on your website. Introducing micromarking to your website involves the use of tags and attributes to structure information. Micromarking uses what could be considered a unique language made up of tags such as <div> and <span>, amongst others. Structured data can usually be implemented on a website in one of two formats – JSON-LD, or microdata attributes embedded directly in the HTML. So, how exactly does micromarking affect SEO and the promotion of a website in the search engine results pages? Well ultimately, it can affect website promotion, although this doesn’t happen directly. By using microdata, the site can become more attractive to the search engine, which is then reflected in the SERPs. Having a higher position in the SERPs can increase the click-through rate of the snippet.
Mobile Speed Update – A Mobile Speed Update is an update performed by search engines that factors in mobile page speed when deciding on page rankings. This update punishes pages that are slow to load on a mobile device and promotes websites that perform quickly on a smartphone.
Search engines have a vested interest in providing users with a seamless experience, therefore, it makes sense that they prioritise websites that not only have optimised, well-written content, but pages that load quickly, thus reducing the amount of time that the customer is waiting.
Thankfully, there are plenty of tools out there that can test your site speed so that you can ensure your content will load in good time, and not affect your ranking. A good tip is to test your site on the weakest signal (i.e. 3G), because, by making sure your site runs well on the worst signal, you can be pretty confident that it’ll be as good, if not better, on other signals such as 4G, 5G, and WiFi.
NAP – This stands for name, address and phone number. Using NAP on your web pages is important if you want to rank for locally-based searches. Regularly stating your name, address and phone number on your website will build your presence in the location in which you operate. Typically, your NAP can be added to the footer of your website, which means it will appear on every webpage and boost the number of times it is mentioned.
By ensuring this information is readily available and placed in important places on your website, you’re reinforcing your website’s local identity. Making sure this is kept consistent throughout your site will give you the best authority, which means if you have several business numbers or business names, it’s best practice to use the same ones throughout. Setting up a Google business page and adding your location is vital if you want to rank for searches in your location.
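As a small illustration of the consistency check described above – the business details here are made up – comparing NAP data across listings can be as simple as normalising each field and checking for a single unique value:

```python
def nap_is_consistent(listings):
    """Check that every listing uses the same name, address and
    phone number after basic normalisation (case and spacing)."""
    def normalise(nap):
        return tuple(" ".join(field.lower().split()) for field in nap)
    # If every normalised listing is identical, one entry remains
    return len({normalise(nap) for nap in listings}) == 1

listings = [
    ("Viper Skips", "1 High Street, Leeds", "0113 000 0000"),
    ("viper skips", "1 High  Street, Leeds", "0113 000 0000"),
]
```

Differences in case or spacing are harmless, but a listing with an old phone number or address would fail the check, and that is the kind of inconsistency that weakens a local identity.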
Natural Links – These occur when other websites, blogs and online content link back to your website because it’s relevant to their topic. This assures you that your content is of good quality, as it’s being referenced by other sites, and will help improve your ranking when Google crawls your site. Typically, this is the best and safest form of linking, as it drives traffic directly to your website and any changes to Google’s algorithm will leave your site unaffected.
The more natural links your website gets, the better the ranking will be and the higher your site will appear on Google. Once this happens, more people will see your content and likely link to it on their site, which further improves your site’s ranking without you having to do any link building yourself. When other highly recognised blogs or websites link to your site, this massively improves your ranking.
Navigational Queries – This is a query searched with the intent of finding a specific page or website, unlike an informational query, which is a more general search to find any information relating to a topic. For example, if you search for a brand name, the first result will likely be that brand’s website and the search will be classed as a navigational query, whereas a question would be informational, as you’re looking for an answer.
These types of queries are difficult to rank for, as Google understands exactly what type of search this is and will only bring up relevant results. If you’re searching for a specific brand, Google won’t offer similar results as the first choice, because it knows you’re looking for one specific website. Intercepting the user’s search journey with this type of search is near impossible.
Nesting Level – The depth at which a page sits within a website’s structure, counted from the main page. The nesting level is referred to as a number, such as 1 or 2. As an example, the main page of a website would have a nesting level of 1. Consider this main page to act as a trunk for the rest of the webpages to branch off. These branches might be documents with a nesting level of 2. Then, branches that extend from these branches have a nesting level of 3, and so on. When adjusting and optimising nesting levels, you should always remember that deeper documents should be located a maximum of 2 clicks away from the main page of a website. By putting a resource further away, you are essentially hiding this information – the search engine will deem it too complicated, and this will be reflected in the search results.
Network Science – By understanding the concept of network science, you can also gain a better understanding of how to create a successful website that is search engine optimised. Networks are all around us – there are computer networks, social networks, the Internet and even genetic networks. Network science, as an academic field, is the study of these networks and the connections between their elements.
When you have an understanding of network science, you will begin to understand why websites rank highly, and why others do not. You’ll also know what it takes to make a site rank well and the steps you need to take, as your knowledge of network science will help you in your thought process of why things are the way they are.
Nofollow – Most links on a website are ‘follow’ links – links that search engines are intended to follow when indexing your website. However, in instances where you do not want a search engine bot to follow a particular link, you can choose to add a ‘nofollow’ attribute.
Nofollow links allow users to click through and use the link but prevent crawlers from following and indexing them. If you want to send users to a particular website but, for whatever reason, do not want to provide that website with any SEO buff – or link juice, as it is sometimes known – a ‘nofollow’ attribute will accomplish this.
This prevents the passing of domain authority to another website via a backlink if, for example, it is a paid link. The ‘nofollow’ attribute is a great way of controlling your linking and the way that your website passes authority and link juice onto other websites, without restricting your linking for user experience and value.
Noindex – An HTML tag that requests search engines not to index a page and to remove it from search engine results pages. This can be done when pages need to exist because they are crucial to your website but, due to their nature, could harm your website’s ranking if they were indexed by Google. An example of these pages would be the login page to your website or pages which thank a user for subscribing.
If you don’t have direct access to your server, including this tag in the HTML of your webpage is the best way to ensure Google doesn’t index it. Inserting the following tag into the head of your webpage will prevent most search engines from indexing your page: <meta name="robots" content="noindex">. To prevent only Google from indexing your page, use <meta name="googlebot" content="noindex">.
Off-Page SEO – This is a collective term that describes all the work done to improve a website’s SEO performance away from the website itself. At the heart of off-page SEO is backlinking, which is the process of placing links to your site on other people’s websites. Getting your website URL on other people’s websites will help boost your authority, especially if those websites themselves have high authority. For instance, your off-page SEO would improve if you got a link to your site from a university website or charity website. These types of ‘.org’ and ‘.ac.uk’ domains are valued very highly by search engines.
On-Page SEO – On-page SEO is the opposite of off-page SEO and is a term that characterises everything done on your website’s individual pages to improve the ranking in the SERPs. On-page refers to the content of the page as well as the HTML source code. Ways to improve your on-page SEO include things such as adding keywords to your content, adding in links, and creating a good header structure. On-page SEO aims to improve your domain score, which is measured 1-100 with 100 being the best, and 1 being the worst.
Ontology – Ontology for SEO purposes is an extremely important factor, as it can help to increase your ranking and the position of your site on search engine result pages (SERPs). To rank high on Google, a common process involves incorporating keywords into your copy, so that Google recognises these terms and considers your site to be worthy of a high ranking on the SERPs. However, as with most aspects of digital marketing, things are ever-changing when it comes to the way content is ranked. Now, ontology is more important than ever.
According to the Oxford English Dictionary, ontology is ‘a set of concepts and categories in a subject area or domain that shows their properties and the relations between them’. For your content marketing, this means that you can produce content that shows your knowledge about topics and concepts outside of simply the basics. You can show how these link together and relate to each other, and the main topic of discussion.
To use ontology to your advantage, you need to change the way you think about your keywords. Gathering keywords with a high search volume to include in your content should now be considered a starting point, rather than the only step in the process. These keywords should be a foundation on which you build your content or your knowledge. From this foundation, you should then find related ontological phrases to use within your content too. Using ontological phrases effectively will show your in-depth understanding and knowledge about the subject you are writing about. This will then demonstrate to both the search engine and your audience that you are best placed to offer the products, services or advice.
Open Graph – Open Graph is a protocol that communicates the content of a page to social media platforms via snippets of markup. Taking Facebook as an example – Open Graph allows integration between your site and Facebook, and communicates what content should show up when one of your pages is shared there. Whilst many people argue that Open Graph tags don’t directly affect your on-page SEO, they are still worth using as they can influence the performance of the links used on social media. People are more likely to see and click shared content when the tags are optimised. There are usually three reasons for this – the tags instantly tell users what the content is about, the content is more eye-catching in social media feeds, and the tags also help Facebook understand what your content is about. The latter can help to boost your brand visibility through search.
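As an illustration, Open Graph tags sit as meta elements in the page’s head – the titles and URLs below are placeholders rather than a real page:

```html
<head>
  <!-- Open Graph tags control how this page appears when shared socially -->
  <meta property="og:title" content="Example Article Title" />
  <meta property="og:description" content="A short summary shown when the page is shared." />
  <meta property="og:image" content="https://example.com/preview-image.jpg" />
  <meta property="og:url" content="https://example.com/example-article" />
</head>
```

Facebook reads these tags to build the link preview card shown in the feed.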
Organic Results – Organic search results are those that appear on the SERPs without having been paid for. These natural and organic results are the result of Google indexing and ranking your page based on its content quality and relevancy against any given search term. If, for example, you implement a long term and dedicated SEO campaign for your website, you may eventually organically rank at number one on the SERP for a targeted search term without having to pay for advertising.
In turn, the traffic that visits your website through these organic rankings is known as organic traffic – traffic that has found and visited your website through making a search and finding your website on the SERP. This is essentially free traffic, as opposed to the traffic you gain through paid advertising.
Over Optimised – This is a term that refers to the practice of applying too many SEO techniques to your pages or site. Over-optimisation can take many forms; the most common, however, is overusing a keyword. Back in the day, before Google got wise to this issue, companies could stuff their website with tonnes of keywords and be rewarded for it.
Nowadays, Google is much smarter and severely punishes businesses who ‘keyword stuff’ – not only is it a bad practice, but it also makes the user experience much worse. Another way businesses over-optimise is by pointing all their internal and external links to what would be considered ‘obvious’ navigation pages that are top-level and apparent. Google rewards hard work; therefore, businesses should look to link to pages found deep inside the website, as this shows you’ve worked hard to link to specific pages.
Linking to toxic sites, and trying to rank for keywords that aren’t relevant to your business, will also be punished by Google.
PDF – A Portable Document Format (or PDF as it’s commonly called) is a type of file format that was developed by Adobe in 1993. It is used to capture and send electronic documents in an intended format. A key feature of PDFs is that they display a document in the exact way the user wishes it to appear, no matter what device it is being viewed on. For regular Word documents, a PDF may not be needed. What they are good for is larger documents such as articles, product brochures, and academic papers. PDFs can be interacted with, and users may zoom in on specific parts of them if they wish to.
PPC/Pay Per Click – Pay Per Click (or more commonly PPC) marketing is a form of paid advertising used to show your ads across the search results and Google’s display network. This is a marketing model in which adverts are optimised and paid for, most often bidding on specific keywords and search terms to target relevant traffic to be sent to a specific resource, whether a product page, landing page, or anything else.
PPC is a form of SEM (search engine marketing), often working alongside SEO as a paid and, arguably, more targeted alternative. The cost of the ads varies according to a number of factors, such as relevance, competition, and account history, and campaigns can be set for a number of different criteria. If you want ads to burn through your budget for as many results as possible, you can do that. And if you want your budget to carefully deliver a greater return on ad spend, you can do that too.
Combining SEO and PPC is a powerful marketing tool, providing greater clicks, conversions, and increasing your real estate in the search results. There are a few different platforms through which pay per click marketing can be done, including:
- AdWords – Google’s PPC ad platform and the most commonly used on the web.
- AdCenter – Microsoft’s alternative PPC ad platform.
- Yahoo! Search Marketing – Yahoo!’s PPC ad platform.
Page Authority – Page Authority is a term used to describe a score that was developed by Moz which indicates how well a page will rank once it has been indexed and placed in the search engine results page (SERP). Page authority is based on a 1-100 scale (with 100 being the highest, and one being the lowest). Page authority isn’t linear – the scale is logarithmic, which means that getting from a 70 score to an 80 score is much, much harder than getting from 30 to 40. This is because the processes and work needed to increase the score, when you’re already in the top end, are more complex, take longer, and require specific knowledge.
A great way of boosting your score is by getting your website link on a well-trusted website (such as a Government website or a ‘.org’ domain). This shows that you’re well trusted and deserving of a good score because an established site deems you authoritative.
Panda – Google Panda was initially released in 2011 and was originally known as ‘Farmer’. The algorithm update’s purpose was to lower the rankings of low-quality, ‘thin’ websites and reward sites with high-quality content. Whilst originally rolled out separately from Google’s core algorithm, Panda has since been incorporated into said core algorithm in early 2016. The problems that Panda was designed to address include duplicate content, thin content, sites that lack trustworthiness and authority, content farming (low-quality pages, often aggregated from other sites), low-quality content, content that did not match the search query, and high ad-to-content ratio, amongst plenty of others.
Parsing – Parsing is a form of automation that involves gathering and extracting information from online resources, such as a website. The information/content is in the form of HTML code, and the results are added to a database. Search engines parse web pages in order to build their indexes, and then they display the relevant documents when a search is made. Parsing happens in three phases – the content is retrieved in its original form, the data is then extracted and transformed, and the result is generated.
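The extraction phase can be sketched with Python’s built-in HTML parser – a minimal illustration of pulling links out of retrieved HTML, not how any particular search engine actually works:

```python
from html.parser import HTMLParser

# A minimal link extractor: the "extract" phase of parsing a retrieved page.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of every anchor tag encountered.
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

# A made-up fragment of retrieved HTML.
page = '<p>See our <a href="/about">about</a> and <a href="/contact">contact</a> pages.</p>'

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/about', '/contact']
```

A real crawler would then transform these results and store them in its index database.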
Penalty – A Google penalty is the search engine’s way of punishing websites for errant behaviour. This penalty can take the form of being delisted for a particular keyword or having your ranking drop significantly to the point where audiences can’t find you. A Google penalty can affect any website and can be handed out as a result of well-intentioned efforts to improve a site’s performance. The reasons behind why the penalties are handed out are shrouded in mystery, much like the algorithm. Reasons aside, Google penalties are to be avoided at all costs as they are hard to shake off.
Penguin – Shortly following Google Panda was Google Penguin – an update designed to reduce the rankings of websites using spammy, manipulative tactics, such as those engaging in keyword stuffing and using manipulative link schemes. After its release, Penguin went through ten updates before becoming part of Google’s core algorithm in late 2016. As mentioned, the primary purpose of Penguin was to diminish the presence of websites using keyword stuffing and link schemes. Keyword stuffing involves adding large quantities of keywords to a webpage in an attempt to manipulate your ranking position. As for link schemes, these refer to the purchase, development or acquisition of backlinks from sites that could be deemed irrelevant/unrelated and low-quality. This ultimately paints a false picture of relevance and popularity to try and gain a higher ranking.
People Also Ask Boxes – This is a modern feature that Google uses to provide the most relevant search results for its users. These boxes consist of questions that are relevant to your search query as Google attempts to predict your next move to save you time. The inclusion of this feature allows Google to provide you with more information on a particular topic than simply a list of web pages.
Another factor that makes this feature so popular is the inclusion of a snippet from the accompanying website. Once expanded, the box will provide a relevant snippet of information taken from the website to provide the user with the answer to the question without them having to click through.
The popularity of this feature has increased dramatically since its introduction, giving businesses the chance to rank on page 1, for search terms which they otherwise may have been further down the pecking order.
Personalised Results – This is when standard organic SEO results are overridden in favour of other results which Google has deemed more relevant to the user based on their recent searches. Google is continuously gathering data on us as we use the search engine and can use this to provide us with incredibly accurate results. This feature makes keyword optimisation and rank tracking more difficult and can create uncertainty as to where exactly your site ranks.
For example, if one user was to search for ‘football boots’ and follow that up with ‘socks’, there may be sock brands that ordinarily would rank first, but Google may pick a brand that specifically sells football socks and make that your first result. This isn’t Google changing that result for everyone, though; the hard SEO work the sock brand has done to gain that top ranking won’t be undone – this is simply a personalised result for you.
Piece of Code – Used to build websites, apps and software, code is what tells the website how to operate and look. Everything on the internet essentially boils down to lines and lines of code. In terms of SEO, code written with optimisation in mind can help strengthen the ranking of a website. Writing or rewriting code in this way will allow Google to read and index your content more easily. This is done through the use of key phrases which Google will then read as relevant to the topics included in your website.
In addition to links on your website, this can help give your website a boost in its current ranking. Though links can do a lot of the work in terms of achieving a good ranking from Google, coding which has been SEO optimised will put further emphasis on the relevance of your site.
Pop-up – Pop-ups are windows that, without prompt, appear when you land on a webpage. These pop-ups will often encourage people to sign up for a newsletter to receive a discount or remind them that there are items in their basket. Search engines hate old school pop-ups (those that open in a new window) and will very often ban them and prevent them from appearing on the user’s screen. The new types of pop-ups that appear within a window don’t tend to affect SEO performance but they should be used sparingly as they can annoy users and discourage them from going back to the site.
Private Blog Network – Also known as a PBN, this is a network of websites built solely to pass links, and therefore authority, to a central target site in order to manipulate its rankings. As this involves several sites linking to each other or to one central site, PBNs are similar to link wheels and link pyramids. The sites used are typically built using expired domains that have some previous history. These expired domains are usually re-registered and have some content added featuring links to the target site. You may be able to identify PBNs due to a few different factors. For instance, the sites might all have the same IP, and similar site designs and themes. The site ownership might also be the same, and duplicate content may be present. PBNs are not a great way to build links and authority, and ultimately, you should steer clear. When it comes to link building and gaining greater authority on the web, best practices are always the favoured option!
Qualified Lead – A type of lead that has been deemed ready to be contacted by the sales team. Typically, a lead will come through a website and get in touch. If their enquiry matches the business’s criteria, or the problems they are messaging about can be solved by the products/services sold, they’ll become ‘qualified’. A qualified lead will then often be approached by a member of the sales team who will take time to answer specific questions based on their enquiry, and provide one-on-one time. They do this because they know this qualified lead is interested and more likely to purchase the product than most.
Quality Guidelines – The Google quality guidelines are essentially written guidelines for webmasters and SEO detailing what tactics are forbidden or discouraged. The quality guidelines highlight what actions Google deems to be malicious or attempting to manipulate search results. The Google quality guidelines essentially define what can be considered ‘black hat’ or ‘white hat’ SEO.
Black hat SEO and spammy tactics, for example, are actions that go against the quality guidelines. Meanwhile, those SEOs who abide by the guidelines can be considered white hat. The Google quality guidelines have changed a lot over the years and it’s important to stay up to date – failing to do so could even lead to a manual penalty and a huge step back in your SEO efforts.
Quality Update – A Google Quality Update is an update that is undertaken by the search engine every so often with one goal – to demote poor quality content. These updates can have a serious impact on your website if your SEO isn’t good enough, and your content is old, uninformed, and not following best practices.
However, if your website has recently been injected with fresh, optimised content that is keyword friendly, you may find yourself benefiting greatly from these updates, as Google rewards you with higher rankings. What does ‘quality’ look like to Google? Well, increasingly, ‘quality’ work is determined by how beneficial the content is for the reader. Google is putting the user at the heart of its algorithm, and as a result, so should your content. Yes, SEO should still absolutely feature, but so should well-written content that is genuinely informative, brings value, and is in-depth. By showcasing your knowledge you’re helping the reader and thus, impressing Google.
Query – The term query often refers to a search query, which is simply whichever phrase you’ve entered into the Google search engine. Google will provide a list of results it deems relevant to your search query, and although this seems like a simple process, there is a lot of work that goes on behind the scenes. For example, your search query will contain a keyword or keyword phrase which certain websites will be optimised to rank for, and this is how Google chooses the most relevant results.
This further proves the relevance of effective keyword optimisation as it’s crucial to ranking highly for certain keywords. This also creates a competitive environment as websites are continuously competing with one another to achieve the best ranking. Whilst this competition is happening, Google is benefiting the most as it’s streamlining its results and providing the most relevant websites possible to its users.
ROI – Standing for return on investment, ROI is the term used to describe the amount of money a company receives from their initial investment. In SEO terms, this could relate to the amount of money spent on a new website, a PPC ad campaign, or the investment made to hire SEO content marketing experts. ROI is relative to every business and each company will have different definitions of good and bad return on investment. For instance, a new company may consider a good ROI to be to break even. Whereas a larger company may see that as a terrible ROI because their investment was much higher.
ROMI (Return on Marketing Investment) – Marketing a product can be expensive and can be done through different channels and methods. Return on marketing investment (ROMI) is a metric used to measure the effectiveness of a marketing campaign. As such, it shows how much revenue a campaign generates relative to what was spent on it. ROMI is similar to return on investment (ROI) but is more specific, as it pertains specifically to marketing. For ROMI to be effective, marketers should set measurable metrics for the campaign. Simply put, ROMI is measured by calculating the total revenue generated against the marketing investment, and it should only reflect the direct impact of a marketing campaign. In the context of SEO, ROMI calculates the return on your SEO campaigns – if the organic revenue generated by your SEO campaigns is higher than the cost to run them, then you will have a positive ROMI.
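The calculation itself is straightforward – a small Python sketch expressing ROMI as a percentage return on the marketing spend (the revenue and cost figures are invented for illustration):

```python
# Return on marketing investment, expressed as a percentage.
def romi(revenue: float, marketing_cost: float) -> float:
    """Revenue attributable to the campaign, measured against its cost."""
    return (revenue - marketing_cost) / marketing_cost * 100

# A hypothetical SEO campaign costing £5,000 that drives £15,000 of organic revenue:
print(romi(15000, 5000))  # 200.0 – i.e. a 200% return
```

A negative result means the campaign cost more than the revenue it generated.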
RSS Feed – A live feed of updates relating to a specific source. An RSS feed can be set up to provide results on a specific topic or news source and the feed will deliver updates on new pieces of news and articles which are published. Rather than being a list of new content, an RSS feed is used to notify users that this new content is now available.
Typically, they’re used to notify users when blogs or podcasts have been published and in their most basic form will be text-only. RSS feeds with images and videos are available, but it’s recognised as a text-only feature. RSS feeds often appear as a widget option to insert on web pages and blogs.
Ranking – ‘Ranking’ is the broad term used to describe the position of different web pages on search engines. The higher the ranking, the more likely you are to receive more web traffic, and, in theory, more sales because users don’t want to scroll down too far, preferring to just click on the top result.
All businesses that feature on search engines are aiming for a high ranking. Getting on page one of Google for a search term is a good goal to start with; from there, efforts should be taken to get as high as possible on the first page. Getting on page one is so fundamental because, quite frankly, no one scrolls to page two – it is that stark.
Rankings can be improved by regularly updating and adding fresh content to pages, ensuring site speed and performance are good, and not over-optimising. Regular monitoring is key to ensuring rankings remain stable.
Ranking Signal – A ranking signal, or ranking factor, is the term for any one thing that is believed to contribute to how Google’s complex series of search algorithms will analyse and rank your website, determining its organic search results and rankings. While Google traditionally keeps its cards close to its chest, it has for years claimed that its algorithms rely on hundreds of unique ranking factors to help deliver its users with the highest quality and most relevant search results.
Reciprocal Linking – Reciprocal Linking is a term that refers to when two hyperlinks link to each other. For example, if Embryo had a link to the Manchester Evening News (M.E.N) homepage, it would be reciprocal linking if the M.E.N had a link on their page that linked to Embryo’s homepage. Reciprocal links are usually pre-agreed between two webmasters and are done so because it is seen as mutually beneficial, with both websites achieving a boost in authority and almost helping each other up the rankings ladder.
However, reciprocal links are quite controversial and many SEO experts see it as a bit of a scheme. This is a notion that is held because there used to be instances of webmasters getting together and monopolizing a keyword by reciprocally linking to each other’s sites. This is less common now that Google is far more intelligent. As long as your reciprocal links happen by chance, there is no need to worry about them negatively affecting your rankings.
Regional Keywords – These are keywords that will overwrite a master keyword if relevant for a specific region. This is used in location-based searches when somebody searches for something, often a business, in a specific area. Once they enter the name of a place, if a website has been optimised for that keyword, Google will view that as the most relevant and offer it as a result because Google knows you’re looking for a business in that specific area.
Regions cover a larger area than that of local-based searches and have the potential to achieve much higher traffic by doing so. This is done by using a series of geographically relevant keywords to ensure you rank when these terms are searched. If you search using the term “near me”, Google can use your location to bring up relevant results of places near you as a result of these regional keywords being used.
Relevant Queries – When looking to improve your SEO, it’s important to consider relevant queries – in other words, the search terms your target audience is actually typing into search engines. One of the ways to uncover queries to target is by carrying out keyword research – this will help you to identify how popular these queries are, and how difficult it is to rank for them. Typically, there are considered to be three different types of search queries – navigational, informational and transactional. Navigational refers to a search query that’s used with the purpose of finding a particular website, such as Facebook. Informational queries cover broad topics, where thousands of search results could be deemed relevant. As for transactional queries, these show an intent to complete a transaction, which is usually a purchase. For example, searching for a specific product would be considered a transactional query. Within these three groups are then queries that would be relevant to your target audience.
Response Code – Response codes – also known as HTTP response status codes, or simply status codes – indicate to users whether a specific HTTP request has been successful. A response code is a three-digit code indicating the server’s response to the request. For example, a 301 response code indicates that the page has moved and the user will be redirected, while a 403 response code means that the user is not authorized to visit that web page. There are a lot of different response codes.
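Python’s standard library ships the full set of standard status codes, which makes it easy to look up what a given response code means:

```python
from http import HTTPStatus

# Look up the standard reason phrase for a few common response codes.
for code in (200, 301, 403, 404):
    status = HTTPStatus(code)
    print(code, status.phrase)
# 200 OK
# 301 Moved Permanently
# 403 Forbidden
# 404 Not Found
```

Broadly, 2xx codes indicate success, 3xx redirection, 4xx client errors, and 5xx server errors.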
Responsive Design – This term refers to the idea that web design and development should respond to the user’s behaviour, interactions, and the environment in which they are in (i.e. desktop, mobile, and tablets). A key element of responsive web design (RWD) is the notion that elements of the page would reshuffle and change orientation based on the device that is being used. RWD is very important in this modern world, where people view content on so many different devices because it’s important to offer the most optimised experience for the user. If you offer a poor experience that can’t be viewed on a phone and a laptop, people will become frustrated and leave your site. If you’re an e-commerce company, this could be the difference between making a sale, and not.
Retargeting – A retargeting campaign is a process by which a company will carry out certain advertising campaigns to target people who have recently left a website without purchasing anything. These campaigns can be specific to a certain product category or the site as a whole. Retargeting can take many forms such as email or social media advertisements. Sometimes they will include an offer to financially incentivise them to come back, other times it will just be more of a reminder.
Rich Snippets – Rich Snippets are more in-depth snippets of content that Google displays on the search engine results page (SERP). Rich, in this context, refers to the amount, and type, of information that is available in the snippet – this could involve pictures, reviews, reading time, and even nutritional information and pricing if it is relevant to the search term and web page.
Rich snippets can increase the chances of people clicking through to your content because there is more immediate information available to them, which can increase confidence that this site will have the information they need.
To benefit from Rich Snippets you must add something called structured data to your website. This is the form of code, written in a specific format that can be understood by search engines. Once read, search engines can use this to create Rich Snippets. Using plugins on your website, and reading up on the importance of structured data, can ensure that your website will, over time, be featured as a Rich Snippet.
Robots.txt – Robots.txt, or the Robots Exclusion Protocol, is a text file used by a website to communicate with web robots such as search engine spiders (or crawlers). Robots.txt is a text file accessible at the root of the website that communicates important information to crawlers. For example, using robots.txt allows SEOs to tell search bots how to process each page of the website. You can set certain pages to be ignored by the crawler, ensuring that only the most useful and important content is crawled and indexed.
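Python’s standard library includes a robots.txt parser, which can be used to check how a given file would be interpreted – a small sketch using a made-up set of rules:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt that hides an admin area from all crawlers.
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a crawler obeying the rules may fetch each URL.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

This is also a handy way to sanity-check your own robots.txt before deploying it.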
SERP (Search Engine Results Page) – The SERP, or search engine results page, is the page displayed by Google or another search engine after a user enters a search. The SERP will display usually around ten organic search results, ranked by order of relevance and quality, showing the URL, page title and short description of each result. Your ranking on the SERP will determine your user visibility and greatly affect the amount of organic traffic coming onto your website.
Raising your SERP ranking and visibility is one of the first things any SEO will aim to improve and is a key goal for most SEO campaigns. The SERP is how Google analyses the intent of the user search against its index of web pages and websites in an attempt to deliver the most relevant and useful content. Depending on the kind of search made, the SERP may also include other features such as:
- Featured snippets – also known as ‘position zero’
- AdWords Ads – paid advertising that is shown above and below the organic search results
- Local pack – with a local map
- Related questions and searches
- Shopping results
Search engine optimization is closely tied to improving your ranking on SERPs for the queries and search terms most important to your business.
Sales Funnel – A sales funnel refers to the journey that potential customers go through on the way to making a purchase. There are several steps within a sales funnel, and these are usually the following – awareness (when people first become aware of brand/product/service), interest (does it solve their problem, competitor analysis), decision (they might dig deeper into prices and packages), action (making a purchase). However, the stages might be different depending on the company’s sales model. A well-defined sales funnel helps you to understand how potential customers move through each stage, and where they drop off. With these insights, you can decide on the best way to invest in your marketing channels and activities to create relevant messaging at each stage, which will then ultimately turn potential customers into paying customers.
Sandbox – Sandbox is a filter used by Google that is suspected of preventing new websites from ranking high in the search engine results pages (SERPs). As Google ultimately aims to prioritise good quality, up-to-date content, it’s believed that the Sandbox tool helps the search engine to filter out new or ‘flash-in-the-pan’ websites from those that are more comprehensive and better managed. In essence, Google’s aim is to be a reliable and useful search engine that people can use to quickly find what they’re looking for. Because of this, relevance is of the utmost importance in determining which websites should appear highest in the SERPs. In reality, few people know for certain whether Google Sandbox actually exists, but this filter is suspected of having been added to Google’s algorithms sometime around March 2004.
Satellite Domain/Website – A satellite domain or satellite website is a site that has been set up by a business or webmaster with the sole purpose of boosting the authority and presence of the main domain or website. For instance, if you owned a website that sold products in a competitive market or where a customer’s own research was a key part of the sales funnel you may well create a separate website that was filled with relevant content and topics which could then link back to the main domain. In theory, by doing this, you’re strengthening the main domain by constantly linking back to another site which, in the eyes of Google, is proof of an authoritative website that’s worthy of high rankings. However, while satellite domains were a fruitful tactic a few years back, they are now considered more of a black hat SEO tactic than a grey one. Like any SEO practice, the ones that are genuine, honest, and take time to complete will win out in the long term.
Schema Markup – Also commonly known as ‘rich snippets’, schema markup is a kind of microdata that, when added to a web page, creates an enhanced description of the page. Added to the HTML of a web page, the schema markup improves the way search engines read your page and allows them to include the rich snippets in the search results.
Now, while schema markup doesn’t necessarily directly impact your organic search rankings, it’s still a great thing to implement as part of your SEO campaign. This is because it gives you more space on the SERP, improving what we like to call search engine real estate. It’s also known to improve the click-through rate from your organic rankings.
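As a sketch, here’s what a simple JSON-LD schema block might look like for a hypothetical article page (the headline, name, date, and structure here are purely illustrative – real markup should match the types and properties documented at Schema.org):

```html
<!-- JSON-LD structured data, placed in the page's <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "An Example Article",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-06-01"
}
</script>
```

Search engines that support structured data can read this block and use it to build rich snippets for the page.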
Search Console – Google Search Console (GSC), formerly known as Google Webmaster Tools, is a free service provided by Google for website owners and webmasters. It is a series of free tools and resources for use in website optimisation, performance tracking and more. These tools are invaluable to any SEO, and verifying your website with GSC is generally considered an SEO campaign best practice.
Once you have access to the Google search console, you’ll have access to a wide range of resources. Through the search console, you can submit sitemaps, check for manual penalties on your site, access crawl reports to check the indexing of your web pages, monitor your site speed, and more. Google search console also provides valuable performance tracking information such as your number of impressions on search results, your ranking position for certain keywords and search terms and the number of clicks you are receiving.
Search Engine Results – The search engine results, or simply search results or search engine rankings, are those pages, ads and links that the search engine delivers to a user based on their query. Appearing on the SERP (search engine results page), the search results are a ranked list based on relevancy and quality, matched to suit the needs of your search query. But the search results will also include relevant ads too. Essentially, Google’s number one goal is to deliver the best, most useful and relevant content possible to users – the search results are Google’s attempt to do this.
Search History – When you search for a topic on a web browser, it will document everything you’ve searched for and give you a list of your search terms and visited websites. Essentially it leaves a breadcrumb trail for you to retrace your steps, visit previous websites and keep track of your movements online. Google also reads this and uses it to understand your interests, habits and trends as a user to present you with relevant content. It can also tell whether you’re using a browser on a desktop or an app on a mobile device and even keep track of the ads you click on.
While you can delete your search history locally from your computer, the data will remain on Google’s servers. There are options for users who want to browse the web without their search history readily available to be viewed such as Incognito mode. However, this won’t hide you from Google and your online movements can still be tracked.
Search Robot – Search robots are automated tools used by search engines, such as Google, Bing, and Yahoo!, to build their databases. These robots, also known simply as bots, wanderers, crawlers, and spiders, systematically crawl the web to discover new websites, as well as updates to existing ones, and create a record of the digital spaces they’ve crawled. They do this by following a series of links, scoping out connections between webpages and processing the data, such as content, sitemaps, links, and HTML codes, to create an up-to-date index. As search robots are automated, they process the data they crawl far faster and more accurately than a human could. Their main use is by search engines, which use them to scan the content of the web; however, spammers also exploit bot software to scan for email addresses and personal data. Webmasters can ask crawlers not to visit parts of a site via a robots.txt file, which compliant bots will respect, denying them access to those areas.
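As an illustrative sketch (the paths and sitemap URL below are hypothetical), a robots.txt file placed at a site’s root is the conventional way to tell well-behaved crawlers which areas to skip:

```
# robots.txt — lives at the site root, e.g. https://www.example.com/robots.txt
# Ask all compliant crawlers to avoid one directory
User-agent: *
Disallow: /private/

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is advisory: reputable search robots obey it, but malicious bots are free to ignore it.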
Search Volume – This is the number of times a keyword is searched for by users on a search engine. This can indicate the popularity of a keyword and help users determine which keywords to optimise for. If a keyword has a very low search volume, it is likely not worth ranking for. If a keyword has an extremely high search volume, you will want to rank for it, but the competition will be very high. Opting for keywords that have a good search volume but not too much competition is often the best approach.
Large search volume keywords are often very broad in comparison to low search volume keywords which are likely more focused on a specific topic. It’s often easier to rank for low search volume keywords but there is a chance it will be less relevant to your website due to the specificity.
Seasonal Trends – Refers to the changes in search trends throughout the year due to the season. For example, there may be a rise in searches for chocolate around the last couple of months of the year as people search for Christmas presents. This also allows websites to focus on these key search terms at specific times of the year and drive conversions.
Targeting these keywords at popular times allows businesses to take advantage of these search trends and attempt to climb the SERP rankings as a result. At other times of the year, these specific keywords may be less popular and ranking for them at these times may prove fruitless. This is why it’s important to understand the market and track changes in trends. Certain keywords will reach peak seasonal popularity, which you can find out through various tools such as Google Trends. This allows businesses to do research ahead of time and ride the trends at the right time.
Seed Keywords – A single word that acts as the starting point for a string of keywords. Seed keywords are short and don’t feature any modifiers. Some long-tail keywords may include seed keywords alongside other modifiers. For example, if the seed keyword was ‘boots’, the long-tail keyword may be ‘children’s blue football boots’. This is just one variation of the keywords as they are the starting point and can be grown into something longer, hence the name.
Seed keywords can be very useful as they can help users understand the relevance of certain topics. They can also be used to generate longer keyword phrases that are associated. When used alone these seed keywords can be too broad to achieve any real results but when used in a longer keyword phrase, they hold much more value.
Semantic Core – A semantic core is a cluster of keywords and phrases that encapsulate the types of goods or services that your business sells. It aims to cover the broad scope of phrases that a potential user may input into a search engine to find the answer to something that they need. Determining your semantic core, then, is a key aspect of a corporate marketing strategy, especially when developing your website’s on-page SEO. This cluster of keywords should be used throughout your website so that your website’s authority is optimised within this core field you have specified. An effective semantic core for your company should, therefore, address the needs of a search query your target audience is likely to be asking. When reviewing your website’s semantic core, it should accurately describe what your business does so that the users it funnels through will be appropriate and interested in what your business has to offer.
Site Map – A site map is pretty much what it sounds like – it’s a map of all the pages on your website. It’s important during SEO because it quickly and easily informs Google of your site’s structure and content, making it quicker to crawl and index and improving navigability. There are typically two different types of site map:
- HTML site map – this type of site map is usually organised by topic, hierarchy and structure, allowing users to better understand and navigate your website.
- XML site map – this type of site map is the one that provides web crawlers and search bots with a list of web pages on a given website, giving them everything they need to quickly and accurately index the website.
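For illustration, a minimal XML site map might look like the sketch below (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want crawled -->
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```

The file is usually placed at the site root (e.g. /sitemap.xml) and submitted via Google Search Console.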
Site Structure – Site Structure refers to how you structure your website and helps search engines understand which elements of your site are the most important. A solid, well thought out site structure is so, so important to the success of your website. By structuring things properly, ensuring you build a pyramid shape of pages that begins with the homepage, and then branches out to service pages, which then link to smaller topic pages, you make it as easy as possible for search engines to figure out what your website’s purpose is.
Other things that come under the umbrella of site structure are things such as categories, how blog content is parsed, taxonomies, internal links, navigation, and breadcrumbs. As your site gets bigger, develops, and has more content added to it, it’s important to keep everything organised as this will make it easier for everyone to navigate around your website.
Social Media – The simple definition of social media is that it is websites and programs online that facilitate the creation and sharing of content and online media by individuals. The most widely used and popular examples of social media include Facebook, TikTok, Twitter, YouTube, Instagram, LinkedIn and more. Social media websites allow users to socialise while creating and sharing their content.
Social media marketing is a form of digital marketing that is often performed alongside search engine optimisation. From paid advertising to organic posting, social media marketing is a powerful tool with the ability to tap into a massive and active audience.
Over the years, social media has become more important to SEO, with links from many social media websites now appearing in searches. Securing links within social media sites and encouraging web traffic from social media accounts onto a website are becoming increasingly important SEO tactics.
Social Signals – In the world of search, social signals refer to the likes, shares, comments, and interactions that businesses receive across their social media channels, which search engines may take into account when calculating visibility. Building a relevant, cohesive social media strategy is no longer just ‘a nice thing to have’ but something that can directly impact your SEO. These days, search engines want to give rankings to businesses that are real and active – and what more active signal that your business is open and ready to provide products and solutions is there than a regularly updated set of social media channels? While not yet as valuable a currency as backlinks, social signals – in particular, those that involve the sharing of content and web pages – are certainly something that should be considered an important factor. The importance of social signals is further emphasised by the fact that Google and Twitter have partnered to display tweets from businesses that are relevant to the search term. This kind of feature is very useful for businesses that operate in fast-paced industries and need to leverage new industry news and topics.
Source Code – This is a term most people will know from right-clicking on a web page. On the menu that pops up once you have right-clicked will be an option to ‘view source’. Upon selecting this option, you will be given a view of the code that is used to render the page in your browser – the ‘source code’. In this view, you will see various codes and tags, from <head> to <i> to <p> and many, many more.
Status Codes – See response codes. Status codes is another way to refer to the same thing. HTTP status response codes are the three-digit codes that indicate whether a request made to a web server has been successful.
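The first digit of a status code tells you its broad class (1xx informational, 2xx success, 3xx redirection, 4xx client error, 5xx server error). As an illustrative sketch, that mapping can be expressed in a few lines of Python:

```python
def status_class(code: int) -> str:
    """Map a three-digit HTTP status code to its broad class."""
    classes = {
        1: "Informational",
        2: "Success",
        3: "Redirection",
        4: "Client Error",
        5: "Server Error",
    }
    # Integer division by 100 isolates the first digit
    return classes.get(code // 100, "Unknown")

print(status_class(200))  # Success
print(status_class(301))  # Redirection
print(status_class(404))  # Client Error
```

SEOs most often watch for 301s (permanent redirects), 404s (missing pages) and 5xx errors (server problems) when auditing a site.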
Structured Data – Otherwise known as schema, structured data markup can be added to a website’s HTML to add more contextual information about a web page’s content. This helps the search bots to identify the important and contextual elements, allowing it to more accurately crawl, index and rank a website’s content.
Structured data is a form of microdata (a kind of different website code) that creates the enhanced description often known as rich snippets, allowing this to appear in the search results. This can take the form of FAQs, star ratings, maps, and more.
Structured data markup was standardised in 2011, when Google, Bing and Yahoo! launched Schema.org, a shared vocabulary of structured data to be supported and displayed in the SERPs of these different search engines. Yandex later joined the initiative.
TF-IDF – Short for term frequency-inverse document frequency (and mercifully shortened to TF-IDF), this term is not exclusive to SEO, but it is a phrase that’s becoming increasingly important as search engine algorithms begin to understand the wider context of pieces of content. It’s a technique that search engines – Google, Yahoo, Bing et al – use to measure the importance of a term, word, phrase, or keyword within a blog, web page, or site. From an SEO perspective, TF-IDF helps to go beyond ranking just keywords by looking at the relevant content that surrounds them. In essence, it rewards webmasters who don’t keyword stuff and, instead, create algorithmically wonderful copy that includes keywords and relevant information. The formula works like so: TF = (the number of times the term appears in a document) / (total number of terms in the document). IDF = log_e(total number of documents / number of documents containing the term). Once you have those numbers, you multiply TF by IDF. The end figure will give you a good idea of how heavily you’re using a particular phrase compared to your competitors and anyone else who ranks for that term.
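The formula above can be sketched in a few lines of Python. The mini ‘corpus’ of three documents below is purely illustrative:

```python
import math

def tf(term, document):
    """Term frequency: how often the term appears, relative to document length."""
    words = document.lower().split()
    return words.count(term) / len(words)

def idf(term, corpus):
    """Inverse document frequency: log_e(total docs / docs containing the term).
    Assumes the term appears in at least one document."""
    matches = sum(1 for doc in corpus if term in doc.lower().split())
    return math.log(len(corpus) / matches)

def tf_idf(term, document, corpus):
    return tf(term, document) * idf(term, corpus)

docs = [
    "seo tips for absolute beginners",
    "baking bread at home",
    "home gardening tips",
]
score = tf_idf("seo", docs[0], docs)
```

Here ‘seo’ appears once in a five-word document (TF = 0.2) and in one of three documents (IDF = log 3), so rarer, more distinctive terms score higher than words that appear everywhere.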
Target Keywords – A website or web page’s target keywords are the words and phrases that it is intended to rank for. This is a central part of SEO and digital marketing. Any SEO campaign will identify the target keywords each web page should rank for, and on-page SEO will be performed to optimise a page to specifically rank for these target keywords.
Title Tags – A title tag specifies a web page’s title and is an HTML element. These tags are displayed on search engine results pages – they are the clickable title that you see for each result. The title tag should always be accurate and concise, summarising the purpose of the page and its content. Title tags are important for several reasons, including for sharing on social media networks and for SEO. As such, there are certain things to consider when writing a title tag, for best practice.
For example, you should give every webpage a unique title and avoid keyword stuffing. In other words, don’t add lots of keywords into your title tag for the sake of it. This can create a bad user experience, which Google will recognise and penalise you for in the SERPs. There is no strict character limit on a title tag, but you should keep in mind that titles display differently depending on the device, and Google truncates them based on pixel width rather than character count.
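For reference, the title tag itself is a single HTML element in the page’s <head>; the page and shop name below are hypothetical:

```html
<head>
  <!-- The title tag: shown as the clickable headline in search results
       and in the browser tab -->
  <title>Men's Leather Boots | Example Shoe Shop</title>
</head>
```

A common convention is “Primary Keyword | Brand Name”, keeping the most important words at the front.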
Traffic – Simply, web traffic is the number of people that visit your website over a given period. Web traffic is measured in visits, which are sometimes called ‘sessions’. Traffic is one of the most popular metrics used by businesses because it is such a clear way of displaying how popular your site is, and how effective your wider marketing method is at attracting audiences. When SEO analytics first came about, traffic was seen as the most important metric. However, it’s now much more important to measure traffic alongside other metrics such as click-through rates and bounce rates. This will give you a much more rounded picture of your site’s performance.
Transactional Query – These are searches that are made with a clear and direct intent to buy a product or complete a transaction. The user will use the search engine to search for their desired products and complete the transaction, likely with one of the top results. This allows businesses to anticipate this and compete for the top rankings using these particular queries.
The clear indicator that a query is transactional is if certain keywords are used such as ‘buy’ or ‘order’. The search for specific products or brand names also signals the user’s intent to buy. By picking up on these keywords, the search engine is able to offer relevant results and help aid the user in their search to buy.
Twitter Card – A Twitter Card is a couple of lines of code that allow tweets linking to your content to display a rich ‘card’ in users’ feeds. It’s a handy thing to have because it allows you to go beyond the 280-character limit and create content-rich tweets that stand out as people are aimlessly scrolling through their feeds. Twitter Cards allow your users to view an image, watch a video, download an app, or even visit a landing page. It’s easy to see how Twitter Cards let you create social media content with real intent and encourage people to convert – be that watching a video, buying a product, or signing up for something. And the best thing? The user NEVER has to leave Twitter to experience these, which, as everyone knows, is great for people who don’t like tapping, scrolling, or swiping any more than they need to. The benefits of Twitter Cards are apparent – you can make sure you have a consistent look on posts across all your platforms, as well as attribution that could drive more traffic to your site. You can also create custom titles and descriptions for your photo and URL. Oh! And you create an awesome mobile experience for people!
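Under the hood, a Twitter Card is just a handful of meta tags added to a page’s <head>. A sketch, with illustrative values:

```html
<!-- Twitter Card meta tags; the title, description, and image URL are examples -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="An Example Article">
<meta name="twitter:description" content="A short description shown on the card.">
<meta name="twitter:image" content="https://www.example.com/card-image.jpg">
```

When someone tweets a link to the page, Twitter reads these tags and renders the card automatically.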
URL – URL stands for uniform resource locator. Found at the top of your browser when viewing a web page, the URL is the identifying string of characters that leads your browser to access and display any given web page. The URL for the Embryo Digital SEO page, for example, is: https://www.embryodigital.co.uk/seo-manchester/
Your URL is important because it shows the web page’s location on a network or domain and gives both users and search engines important information about the nature and content of the web page. Your URLs should also be optimised and well thought out – they’re not an afterthought to be overlooked during search engine optimisation.
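As a quick illustration, a URL breaks down into named component parts, which Python’s standard library can pull apart:

```python
from urllib.parse import urlparse

# Split a URL into its components: scheme, domain, path, etc.
url = "https://www.embryodigital.co.uk/seo-manchester/"
parts = urlparse(url)

print(parts.scheme)  # https
print(parts.netloc)  # www.embryodigital.co.uk
print(parts.path)    # /seo-manchester/
```

The path is the part SEOs most often optimise, keeping it short, readable, and descriptive of the page’s content.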
Unnatural Link – An artificial link created to manipulate a page’s ranking. Unnatural links are generally a thinly veiled attempt by scrapers and spammers to piggyback off your website’s value, or to attach your website to a ‘bad’ part of the internet to harm its ranking. Google can identify these links as not being editorially placed – for example, irrelevant anchor text pointing to a page on an unrelated topic.
If a website is flagged for unnatural links, its ranking can drop dramatically, and its owner will be required to send a reconsideration request to Google should a warning be issued. Google made the decision to penalise unnatural links in an attempt to prevent websites from climbing the rankings through spammy links, and to stop irrelevant sources being recommended to users as a result.
User Experience – User Experience (UX) is similar to User Journey but is a broader term, covering all aspects of a customer’s relationship with your website, products and services. While UX does refer to the ease with which people can head to your website and buy your products and services, it is broader because it also covers the experience a customer has after the purchase – whether that’s the quality of a physical product or, for a digital product, how easy the platform is to use.
Memorable UX is about going beyond what the customer thinks they want from a product or buying process and offering a seamless experience that keeps them coming back to your website or product. A good and often cited example of UX is on e-commerce sites. If an e-commerce website, app, or social media ad is well thought out then users will naturally gravitate back to that site to purchase more products because they don’t have to spend time with a website that offers poor UX.
User Journey – User Journey is a term that refers to the experience a person has when they visit your website. A typical user journey will more often than not start with a Google search, or clicking an ad, this will then take the user to the homepage, or a specific service page. From there, the journey should be about educating them, and making them respond to the call to action that’s on the page – that could be calling, emailing, or inputting information.
A well-thought-out user journey makes it easier for the user to complete the goal and lowers the risk of them getting frustrated, and leaving the page to go to another website. Once a journey has been decided on, it’s important to analyse the data to fine-tune it and make it as optimal as possible. Analysing areas where people leave your site, unnecessary interactions and time spent on the site are just three things that need to be looked at to refine the user journey.
Vertical Search – The term vertical search relates to specific, smaller types of search engines that index only content relevant to a particular niche or industry. While general search engines attempt to index all content on the internet, a vertical search engine just indexes work that is relevant to its focus. A good example of a vertical search engine is Yell, which only collates listings and crowd-sourced reviews of restaurants, attractions, shops, and other areas of note. Vertical search engines are beneficial in many ways. Firstly, because the scope is narrowed, these sites can be incredibly precise. Secondly, by being so precise they can include taxonomy and ontology that increases domain authority in the eyes of search engines. For for-profit organisations, attempting to become a vertical search engine is a great, long-term way of becoming an authority in your industry. For instance, if you worked in the fashion industry, your aim would be to create reams and reams of content that answer as many search queries as possible in as much detail as possible. By being so comprehensive on a topic, and touching on every possible topic that’s relevant to what your target audience is searching for, the theory goes that you force Google and other search engines to put you at the top of the rankings because you’ve out content-ed everyone.
Voice search – Voice search is a form of user search that combines the latest in speech recognition technology with regular search engine queries, allowing users to verbally speak questions rather than needing to sit down at a keyboard and type them out. Voice search is becoming increasingly popular, and, as such, it is becoming more of a priority of SEO campaigns.
The given software will interpret the speech, translate it into search queries which it will then submit to one or more search engines to deliver the user with relevant answers. With new tools such as Apple’s Siri and Amazon’s Echo technology being specifically designed to utilise voice search, it is a trend that looks to continue growing and expanding in the future.
SEOs are beginning to adapt, with voice search optimisation becoming a more central skillset. Including short and concise answers in your content, optimising for featured snippets, and even creating specific voice search FAQ pages are great ways to optimise your site for voice search results.
WHOIS – Pronounced ‘who is’, WHOIS is a query-and-response protocol used to look up databases that store data such as IP addresses and domain names. The WHOIS database, which was drafted by the Internet Society, is displayed in a way that humans can read. It is an extremely useful tool that allows people to get in touch with those who own a domain. WHOIS took a bit of a negative turn a few years ago when GDPR was introduced. Essentially, because the WHOIS protocol publishes the names and addresses of those who own an internet domain, it directly conflicted with GDPR, as it didn’t ask for the express consent of those people before their data was shared.
Webmaster Guidelines – Otherwise known as the Google SEO guidelines, or simply the Google guidelines, the webmaster guidelines are the guidelines set out by Google to provide website owners and webmasters with the information and guidance they need to optimise their website for search engine crawling and indexing. Alongside this, the webmaster guidelines also define what the search engine considers spam and the penalties that should be expected when violating the guidelines.
Because of this, the Google webmaster guidelines are often used to draw the line between what can be considered black hat SEO techniques and white hat SEO. The webmaster guidelines are often split between the quality guidelines, which lay out spammy tactics and what is considered good quality SEO, and the general guidelines, which define the actions that can be taken to improve a website’s indexability and crawlability.
Website Navigation – Essentially, website navigation is just how a user moves throughout a website and travels from web page to web page. Navigation is often overlooked but is an incredibly important part of user experience. If navigation is unclear or confusing, you will lose the traffic that could have potentially engaged with your website and ultimately converted.
Your website navigation and structure have a big impact on conversions, bounce rate, sales, and more. If a user cannot find the content they are looking for, they will leave. A clear, strong and hierarchical website navigation structure guides visitors – especially vital for e-commerce websites with a clear sales funnel.
White Hat SEO – Essentially, white hat SEO is the opposite of ‘black hat SEO’, which covers manipulative and harmful SEO practices. Where black hat practices attempt to game or sabotage rankings, white hat SEO practices fall in line with Google’s guidelines on how to correctly optimise a site.
White hat SEO practices are hugely beneficial to websites and, when carried out effectively, can improve the ranking of a website in Google’s SERPs. Most SEO practices you come across will be white hat, as they’re done to genuinely improve a website. If Google detects black hat techniques on your site, it will penalise it – punishments are likely to include a reduced ranking or even de-indexing.
WordPress – WordPress is an open-source content management system (CMS) that is used by millions of people across the world – it is one of the most popular CMS out there thanks to its user-friendly interface and ability to add plugins and add-ons. It was first published back in May of 2003 and is now home to 75 million websites. More than 409 million people view around 23.6 billion pages each month, according to WordPress themselves.
One of the many positive features of WordPress is that it can be used by beginners, who have no experience with development and web design, to create a website, and by experienced developers who are adept at coding. This is because WordPress has an incredible range of features that suit people of all skills and abilities. The open-source nature of WordPress means that it is free and can be used, adapted, and changed based on the webmaster’s needs. Plugins can also be added, such as Yoast SEO, to track SEO performance and enhance it.
Yoast SEO – Yoast SEO is a plugin that can be installed on a WordPress website to enhance SEO performance and ensure content, and web pages, are optimised for a specific keyword. Rather than just guessing, Yoast SEO makes it very easy for your business to meet, and exceed, SEO standards that can help you in your efforts to boost keyword reach.
Yoast SEO is a very user-friendly platform and allows businesses to create titles, meta descriptions, and focus keywords so that every aspect of your content is optimised. Once installed, and once the important information has been filled in, the plugin rates each page on a score from 1-100 (with 100 being the highest score, and 1 the lowest). It bases this score on the targeted keyword and its prevalence across the headers, copy, title tags, alt-text, metadata, and any other areas of your content where keywords are important.
A good score is anything that is 80+, the score can be improved by adding content and refining the technical aspect of content further. Yoast can also help you track the use of keywords (to ensure you’re not over optimising pages) and manage sitemaps to ensure everything is structured correctly and optimised.