
The CMO’s Crash-Course in SEO

Laying a Foundation for CMOs

Every online buyer’s journey begins with a query entered into, and processed by, a search engine. As marketers, we must understand every step of the customer’s journey so we can reach and engage them at each phase. Consumers use more than one channel for inspiration or information, and often complete purchases on yet another channel. SEO is an essential tool for addressing customers at every phase of their journey, across multiple channels.

Understanding foundational SEO terms, essential strategies and tactics, and how the largest search engine in the world revises and updates its algorithm is a critical first step toward effectively utilizing SEO in your marketing strategies. Here is a crash-course guide to foundational SEO terms, data and must-know algorithm updates to increase your awareness of the greater SEO landscape and to create a foundation for you to begin building upon.

Foundational SEO Terms 

Foundational SEO terminology, once grasped, eliminates much of the difficulty that seems to surround SEO. Here are some essential definitions that make SEO easier to understand:

  • Site speed – Simply put, site speed is how fast a webpage loads once a user clicks its link. Site speed is critical to SEO because search engines like Google prioritize websites that load quickly and efficiently. Additionally, the longer users have to wait to access a webpage, the more likely they are to bounce from it if the page and its elements, such as videos and photos, don’t load quickly. Site speed can be improved in several ways. One common method is to compress CSS, HTML or JavaScript files larger than 150 bytes, as these large webpage elements can slow down load times. Also, some file formats are better suited to certain types of images (JPEGs work best for photographs, while PNGs suit images composed of 16 colors or fewer). Browser caching is another excellent method: it stores page resources from a user’s first visit so the page reloads quickly on return visits.
  • Google Lighthouse – An open-source tool that can be run against any webpage to audit accessibility, load times, SEO and more. Lighthouse needs only a URL to run a series of audits against the page, and it produces a detailed report on how well the page performed. Google Lighthouse is an invaluable SEO tool because its audit reports pinpoint which aspects of your pages are underperforming. A reference doc accompanies each audit to explain how to fix a variety of issues. Website audits are especially useful after Google algorithm updates, as they can shed light on why a previously top-performing page is suddenly seeing drops in visibility.
  • Content linking – The process of linking keywords and content to relevant external webpages or to pages within your own website. A variety of link types exist, including inbound, internal and external links. Content linking is a crucial, intricate practice at the core of SEO, and each type of link requires a specific balance to rank effectively. A website with relevant, important content is ranked higher by crawlers when several reputable sites link inbound to it. It’s a delicate mix of branding, excellent content creation and partnership building that can elevate a website’s ranking through inbound links. Internal linking requires the connected pages to have a logical, meaningful relationship, and search engine crawlers are smart enough to determine the value of those relationships.
  • User intent – The range of intentions behind the actions a user takes online, whether comparing clothing prices between department stores or fact-checking information via a search engine query. Understanding why a user searched with a specific term is key to the SEO optimization process. User searches are typically categorized as transactional, navigational or informational. Marketers can use user intent to create content that best answers user needs, which makes the content more valuable to users and, in turn, to search engines, which seek to prioritize and index websites with helpful content.
  • Indexation – Web indexing (or indexation) is how search engines catalog the contents of a website, namely its keywords, content and metadata, so that pages can be retrieved for relevant onsite or internet searches. Web crawlers are responsible for crawling sites and creating copies of pages for search engines to process and index against the queries users enter. Crawlers are significantly affected by changes like Google algorithm updates, which ultimately determine what website qualities, elements and characteristics crawlers should prioritize.
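To make the site-speed recommendations above concrete, here is a minimal sketch (the `audit_speed_headers` helper is hypothetical, not a real product feature) that inspects a page’s HTTP response headers for the two signals discussed: compression of text assets and browser-caching directives.

```python
def audit_speed_headers(headers):
    """Return a list of speed-related issues found in HTTP response headers."""
    issues = []
    # gzip/brotli compression shrinks CSS, HTML and JavaScript payloads
    if headers.get("Content-Encoding") not in ("gzip", "br"):
        issues.append("response is not compressed")
    # Cache-Control/Expires let browsers reuse assets on repeat visits
    if "Cache-Control" not in headers and "Expires" not in headers:
        issues.append("no browser-caching headers set")
    return issues

# Example: a compressed response that never sets caching headers
print(audit_speed_headers({"Content-Encoding": "gzip"}))
# ['no browser-caching headers set']
```

In practice you would pull these headers from a real response (e.g. via `urllib.request`); the check itself stays the same.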
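The indexation process can be illustrated with a toy inverted index: a mapping from each keyword to the set of pages containing it, which is the core data structure search engines build from crawled copies of pages. Real search-engine indexing is vastly more sophisticated; this is only a sketch with made-up page content.

```python
def build_index(pages):
    """pages: dict of url -> page text. Returns keyword -> set of urls."""
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            # each word points back to every page that mentions it
            index.setdefault(word, set()).add(url)
    return index

pages = {
    "/shoes": "buy running shoes",
    "/blog": "how to choose running shoes",
}
index = build_index(pages)
print(sorted(index["running"]))  # ['/blog', '/shoes']
```

A query for "running" can then be answered by a simple lookup, which is why the keywords a crawler extracts from your pages matter so much for which queries you appear in.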

A Brief History of Google Algorithm Updates

Google published its first algorithm update in 2000 and has continued iterating on its search engine ever since, with improvements that have significantly impacted the search industry and the way people search. Google’s algorithm has evolved into a complex set of rules designed to deliver optimized search engine results pages (SERPs).

Tracking Google search algorithm updates is key to performing well on its platform. Here is a history of some of Google’s most important algorithm updates and the impact they’ve had on the search industry.

  • Panda (February 2011): The Panda update introduced a new search filter designed to prevent websites with poor quality content from ranking in Google’s top search results. The update primarily targeted content farms and sites with heavy advertising, while boosting the rankings of reputable news sites and social media sites. Google experienced a drop in revenue from the update as it didn’t spare any of the company’s partners from scrutiny.
  • Penguin (April 2012): With the Penguin update, Google cracked down on websites that violated its Webmaster Guidelines to artificially inflate their rankings through link schemes. Sites that manipulated the links pointing to a page (now known as a black-hat SEO tactic) or used “doorway pages” were heavily penalized, while sites built on organic, purposeful content received a ranking boost post-update.
  • Hummingbird (September 2013): The Hummingbird update vastly improved the precision of Google’s search results for natural language queries. Better semantic understanding of intent made it easy for users to search topics and sub-topics using both short and long-tail keywords, and users received more accurate results in search elements like the Knowledge Graph. Hummingbird also laid the foundation for voice search, combining conversational language with human intent based on location to improve the search experience.
  • Pigeon (July 2014): U.S. English search results became more accurate and useful for users with the Pigeon update, which prioritized local search results. Google utilized factors such as location and distance with local directory sites taking more precedence with this and future updates.
  • RankBrain (October 2015): RankBrain is a machine learning-based search engine algorithm update, which dramatically improved Google’s ability to process and provide more relevant search results to users. The intuitive system takes words or phrases it isn’t familiar with and makes an educated guess as to what other words or phrases hold similar meaning to better filter results. RankBrain records results and adapts them to increase user satisfaction in future queries. Google communicated to the public that RankBrain is one of its three most important search ranking signals.
  • Fred (March 2017): Jokingly named the “Fred” update by Gary Illyes, then a webmaster trends analyst at Google, it targeted black-hat SEO tactics built around aggressive monetization. Content sites that were ad-heavy or appeared to push products and services on users rather than answer their queries were heavily affected. Websites that demonstrated excellent content quality and expertise, maintained steady positive user reviews and delivered error-free experiences saw their visibility boom.
  • BERT (October 2019): Originally a technique for natural language processing developed by Google, the company began applying BERT to its search algorithms in 70 different languages by the end of December 2019. Google stated the BERT update was designed to improve conversational and natural queries to better aid search’s ability to understand the nuances of language, text and speech to provide more helpful results for users. This update heavily affected mobile voice search as well where users were more likely to use conversational speech compared to desktop queries.

SEO Data Collection

There is a plethora of diverse SEO data and metrics you can pull to assess your website’s performance and inform your marketing strategies, but it can be difficult to decide which metrics will provide the best insights. Here is a selection of key SEO metrics to examine when establishing a solid foundation for your marketing strategies.

  • SEO visibility – A collection of metrics compiled from search ranking factors to calculate how visible a website is in organic search engine results. Collected metrics can include the search volume of certain keywords and the ranking positions of a domain’s various URLs. Agencies and software programs can combine these metrics into a visibility score that compares your website’s online performance against other domains. SEO visibility scores are an essential metric in digital marketing, as they provide robust performance insights that can be used to boost your website’s ranking in search engines and to identify which tactics competitors are using in their marketing strategies.
  • Traffic index (paid and organic SEO) – A traffic index is calculated much like an SEO visibility score, except that it measures keyword search volume. The index is derived from a site’s estimated monthly organic or paid search traffic for a set of keywords compiled by a marketer or SEO manager. Keywords are one of the primary factors influencing how high search engines rank websites. Marketers use traffic index data to analyze and compare the traffic potential of certain keywords and to remove low-performing keywords from their sets through a process of elimination. This allows marketers to prioritize the keywords with the best potential to help their website rank higher than competitors’.
  • Cost per acquisition (CPA) – A familiar term to most marketers, cost per acquisition is the average ad spend needed to get a user to perform a specific action, such as downloading content or completing and submitting a form. It tells you what each acquisition, or conversion, costs, which helps with identifying optimization opportunities. CPA is a key component of pay-per-click (PPC) advertising, or paid search. PPC is a dependable way to generate leads, although it consumes a significant portion of marketing budgets compared with optimizing for organic search. While organic search is easier on marketing budgets, leveraging PPC platforms like Google AdWords lets marketers bid on valuable keywords for which Google will then generate sponsored ads.
  • Complete site load time – The amount of time it takes for a webpage to completely load all of its page elements, including links, images, video clips and more. Load time is key to SEO and marketing strategies, as users are more likely to bounce from slow-loading pages, resulting in lost conversions. Search engines like Google treat site speed as a key ranking factor, so marketers carefully monitor load times to ensure their sites aren’t penalized. Tools and software exist to help marketers measure their site’s complete load time. GTmetrix is one useful tool: users can plug in their site’s URL and receive a detailed report on which elements of the page load quickly or slowly. GTmetrix also assigns letter grades (A to F) to page elements, helping marketers target and fix the parts of their sites that are performing poorly and reducing the risk of a search engine penalty.
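The CPA calculation described above is straightforward arithmetic: total ad spend divided by the number of conversions. A quick sketch, using made-up campaign figures, shows how it surfaces the cheaper source of conversions.

```python
def cpa(spend, conversions):
    """Cost per acquisition: total ad spend / number of conversions."""
    return spend / conversions

# Hypothetical campaigns: (total spend in dollars, conversions won)
campaigns = {"brand_terms": (1200.0, 60), "generic_terms": (1800.0, 45)}

for name, (spend, conversions) in campaigns.items():
    print(f"{name}: ${cpa(spend, conversions):.2f} per conversion")
# brand_terms: $20.00 per conversion
# generic_terms: $40.00 per conversion
```

Comparing CPA across campaigns or keyword sets like this is what lets marketers shift budget toward the acquisitions that cost the least.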

The CMO’s Guide to SEO: The Bottom-Line Impact of SEO

The next blog in our five-part series “The CMO’s Guide to SEO: The Bottom-Line Impact of SEO,” will examine why companies see value investing in SEO and why mobile SEO optimization is the next frontier in growing your bottom line.

Searchmetrics provides tailored consulting solutions to drive success in online growth with our clients. Our Digital Strategies Group brings together our data science, software and SEO experts to deliver services that amplify your search and content efforts.
