This article was updated on June 23, 2022.
Every online buyer’s journey begins with a query entered into, and processed by, a search engine. As marketers, we must understand every step of the customer’s journey to reach and engage them at each phase. Because consumers use more than one channel for inspiration or information, and often complete purchases on yet another channel, SEO is an essential tool throughout this process.
Understanding SEO basics, essential tactics, and how the largest search engine in the world revises and updates its algorithm is a critical first step toward effectively utilizing SEO in your marketing strategies. Here is a crash-course guide to foundational SEO terms, data and must-know algorithm updates to increase your awareness of the greater SEO landscape and to create a foundation to build upon.
SEO Basics: Terms
Foundational SEO terminology, once grasped, eliminates the shroud of difficulty that surrounds SEO. Here are some essential terms and definitions that make SEO easier to understand:
Site speed
Simply put, site speed is how fast a webpage loads once a user clicks its link. Site speed is critical to SEO because search engines like Google prioritize websites that load quickly and efficiently. Additionally, the longer users have to wait for a page and its elements, such as videos and photos, to load, the more likely they are to bounce from it.
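As a rough illustration, the transfer portion of load time can be timed by downloading a page in full. This is a minimal Python sketch, not how Google measures speed; it captures server response and transfer time only, not the browser's rendering of images, scripts and video:

```python
import time
import urllib.request

def measure_load_time(url: str) -> float:
    """Seconds to request a page and download its full HTML body.

    Captures server response plus transfer time only; real user load
    times also include rendering images, scripts and video.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # force the entire body to download
    return time.perf_counter() - start
```

Calling `measure_load_time("https://example.com/")` returns the fetch time in seconds; purpose-built tools such as Google Lighthouse report far richer timings than this sketch.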
Google Lighthouse
Google Lighthouse is an open-source tool that audits any webpage for accessibility, load times, SEO and more. Lighthouse needs only a URL to run a series of audits against the page and produce a detailed report on how well it performed. The tool is invaluable for SEO because its reports pinpoint which aspects of a page are underperforming, and each audit ships with a reference file explaining how to fix a variety of issues. Website audits are especially useful after Google algorithm updates, as they can shed light on why a previously top-performing page is suddenly seeing drops in visibility.
Content linking
Content linking is the process of linking keywords and content to relevant webpages, whether external or within your own website. Several link types exist, including inbound, internal and external links. Linking is a crucial, intricate practice at the core of SEO, and each link type requires a specific balance to rank effectively.
A website with relevant content is ranked higher by crawlers when several reputable sites link to it. Elevating a website’s ranking through inbound links takes a delicate mix of branding, content creation and partnership building. Internal linking requires connecting pages that share a logical, meaningful relationship; search engine crawlers are smart enough to assess the value of those relationships.
User intent
User intent is the range of intentions and actions behind a user’s activity online, whether that’s comparing clothing prices between department stores or fact-checking information via a search engine query. Understanding why a user searched with a specific term is key to the SEO optimization process.
User searches are typically categorized as transactional, navigational or informational. Marketers can use intent to create content that best answers user needs, which makes the content more valuable to users and, in turn, to search engines, which prioritize and index websites with helpful content.
Web indexing
Web indexing (or indexation) is the method by which search engines catalog a website’s contents, namely its keywords, content and metadata, to build a robust vocabulary for onsite or internet searches.
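The core data structure behind indexing can be sketched as a toy inverted index, which maps each word to the pages containing it. This is a deliberately simplified illustration with hypothetical pages; real search indexes also store word positions, metadata and ranking signals:

```python
from collections import defaultdict

def build_index(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each word to the set of page URLs whose text contains it."""
    index: dict[str, set[str]] = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word.strip(".,!?")].add(url)
    return dict(index)

# Two hypothetical pages:
pages = {
    "/shoes": "Running shoes and trail shoes.",
    "/boots": "Winter boots and running socks.",
}
index = build_index(pages)
# A query for "running" can now be answered from index["running"]
# without re-reading any page text.
```

Lookups against the index are what let a search engine answer queries instantly instead of scanning the whole web on demand.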
Web crawlers
Web crawlers are responsible for crawling sites and creating copies of pages for search engines to process against user queries. Crawlers are significantly affected by Google algorithm updates, which ultimately dictate what website qualities, elements and characteristics crawlers should prioritize.
SEO Basics: Google Algorithm Updates
Google published its first algorithm update in 2000 and has since continued iterating improvements to its search engine that have significantly impacted the search industry and the way people search. Google’s algorithm has evolved into a complex set of rules designed to deliver optimized search engine results pages (SERPs).
Many believe tracking Google search algorithm updates is key to performing well on the platform. Here is a history of some of Google’s most important algorithm updates and the impact they have had on the search industry.
Panda (February 2011)
The Panda update introduced a new search filter designed to prevent websites with poor quality content from ranking in Google’s top search results. The update primarily targeted content farms and sites with heavy advertising, while boosting the rankings of reputable news sites and social media sites. Google experienced a drop in revenue from the update as it didn’t spare any of the company’s partners from scrutiny.
Penguin (April 2012)
With the Penguin update, Google cracked down on websites that violated its Webmaster Guidelines by using link schemes to artificially inflate their rankings. Sites that manipulated links pointing to a page (now recognized as a black-hat SEO tactic) or used “doorway pages” were heavily penalized, while sites built on organic, purposeful content received a boost in their rankings post-update.
Hummingbird (September 2013)
The Hummingbird update vastly improved the precision of Google’s search ecosystem for natural language queries. By better gauging intent semantically, it let users easily search for topics and sub-topics using both short and long-tail keywords, and it delivered more accurate results in search elements like the Knowledge Graph.
Hummingbird also laid the foundation for voice search as it combined conversational language and human intent based on location to improve search experience.
Pigeon (July 2014)
U.S. English search results became more accurate and useful with the Pigeon update, which prioritized local search results. Google weighed factors such as location and distance, and local directory sites took greater precedence with this and subsequent updates.
RankBrain (October 2015)
RankBrain is a machine learning-based search engine algorithm update, which dramatically improved Google’s ability to process and provide more relevant search results to users. The intuitive system takes words or phrases it isn’t familiar with and makes an educated guess as to what other words or phrases hold similar meaning to better filter results.
RankBrain records results and adapts them to increase user satisfaction in future queries. Google communicated to the public that RankBrain is one of its three most important search ranking signals.
Fred (March 2017)
Jokingly named the “Fred” update by Gary Illyes, then a webmaster trends analyst at Google, it targeted black-hat SEO tactics built around aggressive monetization. Content sites that were ad-heavy or appeared to push products and services on users rather than answer their queries were hit hard, while websites demonstrating content quality, expertise, positive user reviews and error-free experiences saw their visibility boom.
BERT (October 2019)
BERT began as a technique for natural language processing developed by Google; the company was applying it to search queries in 70 different languages by the end of December 2019.
Google stated the BERT update was designed to improve conversational and natural queries to better aid search’s ability to understand the nuances of language, text and speech to provide more helpful results for users. This update heavily affected mobile voice search as well where users were more likely to use conversational speech compared to desktop queries.
E-A-T (May 2020)
This update arrived in the middle of the COVID-19 pandemic. Searchmetrics Chief SEO Consultant Marcus Pentzeck said the update was important for expertise, authoritativeness and trustworthiness (E-A-T) factors. It also showed that quality counts for more than authority.
CWVs (June 2021)
The Core Web Vitals update focused on page experience. It examined three primary ranking signals: loading performance (Largest Contentful Paint), interactivity (First Input Delay) and visual stability (Cumulative Layout Shift).
SEO Basics: Data Collection
There is a plethora of SEO data and metrics you can pull to assess your website’s performance and inform your marketing strategies, but it can be difficult to decide which will provide the best insights. Here is a selection of key SEO metrics to examine when establishing a solid foundation for your marketing strategies.
SEO visibility
SEO visibility is a metric compiled from search ranking factors to calculate how visible a website is in organic search engine results. Inputs can include the search volume of certain keywords and the ranking positions of a domain’s various URLs. Agencies and software programs examine these metrics to produce a visibility score that compares your website’s online performance against other domains.
SEO visibility scores are an essential metric to review in digital marketing as the collection of metrics provide robust performance insights that can be utilized to boost your website’s ranking in search engines and identify what tactics competitors are using in their marketing strategies.
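In rough terms, a visibility score is a position-weighted sum over the keywords a domain ranks for. The exact curves used by SEO platforms are proprietary; the weights below are illustrative assumptions that loosely mirror observed click-through behavior:

```python
# Assumed position weights; real SEO tools use proprietary curves.
POSITION_WEIGHT = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings: list[tuple[str, int, int]]) -> float:
    """rankings: (keyword, monthly search volume, ranking position).

    Sums search volume times position weight across all rankings;
    positions beyond the weight table contribute nothing.
    """
    return sum(volume * POSITION_WEIGHT.get(position, 0.0)
               for _keyword, volume, position in rankings)

score = visibility_score([
    ("running shoes", 40_000, 3),  # 40,000 * 0.10
    ("trail boots", 5_000, 1),     # 5,000 * 0.30
])
```

Here the score works out to roughly 5,500 weighted visits; tracked over time and compared against competing domains, the same calculation becomes a performance benchmark.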
Traffic index (paid and organic SEO)
Traffic index is calculated much like an SEO visibility score, except that it measures keyword search volume. The index is derived from a site’s estimated monthly organic or paid search traffic across a set of keywords compiled by a marketer or SEO manager. Keywords are one of the primary factors influencing how high search engines rank websites.
Marketers use traffic index information to analyze and compare traffic potential for certain keywords and remove low-performing keywords from sets in a process of elimination. This method allows marketers to better prioritize and identify keywords with the best potential to help their website and other domains rank higher than competitors.
Cost per acquisition (CPA)
A familiar term to most marketers, cost per acquisition is the average cost of online ads needed to get a user to perform a specific action such as downloading content or completing and submitting a form. It provides data on the cost per acquisition, or conversion, which helps with identifying optimization opportunities. CPA is a key component of pay-per-click (PPC) search.
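The calculation itself is simple, shown here as a small hypothetical helper:

```python
def cost_per_acquisition(ad_spend: float, conversions: int) -> float:
    """CPA = total ad spend divided by the conversions it produced."""
    if conversions <= 0:
        raise ValueError("CPA is undefined without conversions")
    return ad_spend / conversions

# e.g. $2,400 of ad spend that produced 60 form submissions:
cpa = cost_per_acquisition(2400.00, 60)  # 40.0 dollars per conversion
```

Lowering CPA means either cutting spend or lifting the conversion rate, which is where the optimization opportunities mentioned above come in.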
PPC is a well-tested method of generating leads although it consumes a significant portion of marketing budgets compared to optimizing for organic search. Although organic search is better for marketing budgets, leveraging PPC platforms, like Google Ads, can help marketers bid on valuable keywords through sponsored ads.
Complete site load time
This is the amount of time it takes for a webpage to completely load all its technical page elements including links, images, video clips and more. How fast a website loads is key to SEO and marketing strategies as users are more likely to bounce from slow loading pages, resulting in lost conversions.
Search engines, like Google, prioritize site loading speed as a key ranking factor and marketers carefully monitor their site speeds to ensure their sites aren’t penalized.
Searchmetrics provides tailored consulting solutions to drive success in online growth for our clients. Our Digital Strategies Group brings together our data science, software and SEO experts to deliver services that amplify your search and content efforts.