
Unwrapping the Secrets of SEO: Indexing, Website Quality and Meta-Tags

Search engine optimization is a complex topic. There are numerous recommendations, guides and checklists out there that try to cover and explain all the various different aspects of SEO. For this article, we’ve listened to you so that we can address issues that matter most to you. What should you be aware of regarding the indexing of your pages? How does Google evaluate website quality? And what is the deal with keywords in meta-tags?

If you’d like to explore the SEO potential of your website, then you can request an appointment with our Digital Strategies Group and see how we can help you:

Make an appointment

The topics in this article are based on feedback from participants at a (German) webinar we recently held. I’ve answered the most important questions – with expert support from Marcus Pentzek, who, as a Senior SEO Consultant for the Searchmetrics Digital Strategies Group, is responsible for advising our clients and helping them to improve their search engine performance.

The questions deal with the following areas of SEO:

  • GoogleBot Crawling and Indexing
  • URLs, keywords and meta-tags
  • User signals and search intent
  • Website quality: How can I improve my rankings?
  • Google Analytics

GoogleBot Crawling and Indexing

Having a website that can be crawled is the first prerequisite for being picked up by Google and therefore having a chance of appearing in the search results. You are also able to take decisions regarding which pages are incorporated into Google’s index and which aren’t.

How can I make sure the GoogleBot regularly visits my site?

The GoogleBot visits websites as often as appears necessary:

  • One factor is the size of the website, provided that the large number of URLs on a big website actually offer value. If a website has many pages with low-quality content, then the sheer number of pages alone won’t tempt Google to visit more often.
  • If a website is often linked to from others, giving it a strong PageRank, then it will be visited more frequently.
  • Another factor is how much new content the website publishes. If, for example, there are 20 new (or updated) blog posts every day, then this puts pressure on Google to keep up and crawl the site more often than if there is only one new post per week.

Can I use the XML sitemap to tell the GoogleBot what to read and what not to?

No. That’s what the robots.txt is for. The XML sitemap is primarily for URLs that Google should crawl and (usually) index.
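To illustrate the difference (the domain and paths below are invented placeholders, not taken from this article), the robots.txt tells crawlers which areas they should not fetch, while the XML sitemap lists the URLs you do want crawled and, usually, indexed:

    # robots.txt – excludes areas from crawling
    User-agent: *
    Disallow: /internal-search/

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml – URLs that should be crawled and (usually) indexed -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/blog/indexing-basics/</loc>
        <lastmod>2019-09-01</lastmod>
      </url>
    </urlset>

A URL should normally appear in only one of the two: if a URL is disallowed in the robots.txt, also listing it in the sitemap sends Google contradictory signals.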

Can Google judge how much I’ve influenced the GoogleBot’s indexing? And can this affect how my website is evaluated?

You can only give Google recommendations regarding what it should crawl. You can’t directly influence it per se and, therefore, this won’t impact the evaluation of your website.

Does it make sense having an imprint/legal notice page indexed?

Yes. Any reputable company should be open about and proud of its name and who it is. The imprint won’t normally cannibalize any other pages on the website so it isn’t a problem if it’s indexed.

To what extent do typos affect the indexing or search engine rankings?

Search engines are good at recognizing and understanding spelling errors and should return the same result as for the correctly spelt search term. For me and my website, this means that there is no reason to deliberately include misspelt keywords in my content. In fact, this would create a highly unprofessional impression and could have a negative impact on user signals (like bounce rate).

If Google believes that particular search queries are best served by high-quality content, then typos could even lead to a drop in rankings, with the page supplanted by websites offering higher-quality texts.

How does “good” indexing affect Google’s evaluation of a site?

Indexing has been done well if the only pages indexed are those that are supposed to be found via web search. This affects the evaluation of a website because Google only sees the relevant areas of the site, which means that Google doesn’t have to weigh up relevant and non-relevant content when determining rankings.

Can it harm a website’s rankings if too many pages are indexed?

This is best explained with a little thought experiment: For each domain, the GoogleBot gets a limited budget with which it can crawl the site. If I have more pages than the budget allows for then it will take a lot longer for all my content to be crawled. And if I update some content, then it will also take longer for this new content to be picked up.

There is another important point. If I have a great deal of content, but it isn’t optimized for particular search terms, then vast numbers of pages will likely exist that are all considered low-quality by Google. This can have a negative impact on the whole domain. Furthermore, if I have too many pages, then it can easily happen that I have very similar content on different pages covering the same topics. This is what we call cannibalization, and usually results in none of these pages ranking as I would like.

Why shouldn’t I want a landing page for a social media campaign to be in the index? Does it have a negative impact on the crawl budget?

Good question! I would always advise you to consider the purpose of a landing page. If I have a landing page that only exists to gather the leads from a campaign – this could be a page that contains nothing more than a contact form – then the content on this landing page will be considered low quality, and will be evaluated poorly by Google. If I have lots of pages like this, and I let Google index them, then this can damage the overall evaluation of my website.

If, however, I fill my social media landing page with content and optimize it for different keywords than my SEO landing pages, then it may well make sense to have this page indexed. But be careful: if the landing pages are too similar, then they could end up fighting each other for the same rankings.

This means that there is no one right answer. To be on the safe side (to avoid cannibalization), you can ensure that you only let Google crawl and index the content that is specifically designed for this purpose.

What does Google’s latest announcement about noindex in the robots.txt no longer being supported mean?

Google’s announcement that, from the 1st of September 2019, the noindex rule in the robots.txt will no longer be applied changes how I can use the robots.txt. Firstly, I should be aware that any pages I’ve only excluded from indexing via the robots.txt can now end up in the Google index.

I can, however, still de-index pages in the following ways:

  • noindex in the robots meta-tag still works and is the simplest option
  • using disallow in the robots.txt, I can still prevent bots from crawling certain content on my website (see the examples after this list)
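For illustration, the two options can look like this in practice (the directory path is a hypothetical example):

    <!-- robots meta-tag in the <head> of a page that should not be indexed -->
    <meta name="robots" content="noindex">

    # robots.txt – prevents crawling of a directory (it does not remove URLs that are already in the index)
    User-agent: *
    Disallow: /campaign-drafts/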

Should I noindex pages with “bad content” and how do I determine which content is “bad”?

As a rule, I should only look to de-index content that isn’t relevant for search terms for which I want to be found. Content is “bad” if it doesn’t match a search query. This means that the best method is to assess which content I need for my desired search terms and then see if I can provide it.

Does it make a difference for SEO if I have content as a one-pager or a more complex site structure? Or can a one-pager be indexed by Google without difficulty?

In terms of indexing, a one-pager shouldn’t be a problem. It has the simplest possible site structure: a single home page that can be indexed, with all the link juice transferred to my site via links focused on this one page.

However, if I want to be found for more topics than directly match my main topic, then it becomes more difficult with a one-pager. Consider the example of a website offering therapeutic services. The home page is optimized for all relevant search terms that are directly connected to this area of work. But now I want to reach users who are searching for specific health issues, for which my services can provide a solution. In this case, it would make sense to create new content for new sub-pages that present my knowledge and expertise, and that also serve the relevant search queries. Of course, with more pages, the complexity of the website also increases.

Another point for consideration is that, depending on the user intent behind a search query, Google defines whether shorter or longer texts are more appropriate. Based on this, it could be advisable to create a longer, more holistic one-pager (cf. Pillar Page) on the topic – or it could make more sense to create several smaller pages, each of which has a clearer focus and is optimized for individual keywords. If doing the latter, they obviously have to be organized in a logical website structure.

If this distinction doesn’t seem important for your specific scenario, then one-pagers have the added advantage that, because a large number of terms appear on the page, often some distance apart, they have a better chance of achieving rankings for long-tail keywords made up of several different terms.

URLs, keywords and meta-tags

Search engines are becoming more intelligent, which is making SEO increasingly complex. This leaves many people asking how important old favorites like keywords and meta-tags still are.

Is it pointless using keywords as meta-tags?

When optimizing for Google’s search engine: yes. This was confirmed by Google’s Matt Cutts back in 2009. And in this case: what was true 10 years ago is still true today.

However, for secondary search engines, using meta-tags can still be useful. As a rule of thumb for the keywords meta-tag, you should define 2-3 search terms for each URL that this URL is relevant for. And avoid overlap with other URLs.
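If you do maintain the keywords meta-tag for such secondary search engines, a minimal example (the terms are placeholders) would be:

    <meta name="keywords" content="physiotherapy berlin, back pain treatment">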

How important are factors like page structure e.g. with h1, h2 or h3 headings, meta-tags and alt-tags?

A well-structured page always helps, but the different elements bring different benefits. Headings make the text more digestible, give the content a clear structure and make it easier to address sub-topics.

Meta-tags serve more varied purposes. The meta-title is an important element that shows search engines what the main topic of the URL is and for which search term the content was written. The meta description is primarily important on the search results page, because it is the first text from my page that a user sees – and this is the basis for the user’s decision whether to click or not.

Alt-tags, on the other hand, are very important for Google image search.
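To tie these elements together, here is a simplified page skeleton – all content and file names are invented for illustration:

    <head>
      <title>Back Pain Treatment | Example Therapy Practice</title>
      <meta name="description" content="How our therapists treat chronic back pain: methods, duration and booking.">
    </head>
    <body>
      <h1>Back Pain Treatment</h1>
      <h2>Which therapies help with chronic back pain?</h2>
      <p>…</p>
      <img src="therapy-room.jpg" alt="Therapist treating a patient with back pain">
    </body>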

Speaking URLs: What’s the perfect URL?

This could be the topic of an entire workshop. The short answer is that a URL should reflect the structure of the website, and each element should communicate to the user what they can expect from the webpage.
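As a hypothetical example for the therapy website mentioned earlier, a speaking URL could look like this, with each path element telling the user what to expect:

    https://www.example-practice.com/treatments/back-pain/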

Is it still worthwhile operating special keyword domains that can rank well and link to my page so that this gets an improved PageRank?

Nowadays, domains don’t get any ranking benefit just for including the exact keyword in the domain name. So in that respect, the answer is “no”.

Strengthening your main website’s PageRank using links from these keyword domains can only work if the keyword domains are valuable in their own right. If these are newly registered domains, then they won’t initially have any value – so they won’t have any PageRank to pass on. This can improve if a backlink profile is built up for the keyword domains.

Manipulating backlinks as a ranking factor has become more difficult than it used to be, and building up a backlink profile is less “straightforward”. Today, backlinks are most valuable when they arise naturally (i.e. when they are genuine “recommendations”), when they are connected to the topic and when they have a “story”. A link has to have a reason to exist to be considered natural. With this in mind, it is easier for a high-quality website to collect natural backlinks because it will often rank for relevant search terms and will therefore serve the user intent of a webmaster who is looking for a relevant source.

Instead of focusing on this strategy, I would rather recommend investing time and resources in optimizing your website content (for users), so that it will later become the natural target of new backlinks. If you do try to artificially create links, then you run the risk of receiving a manual Google penalty. Other strategies (e.g. optimized, user-focused content) can lead to much greater long-term success.

User signals and search intent

Search intent describes what the user expects from their search. Do they want to read a long text, receive a quick answer or would they rather watch a video? This is related to the user signals that are created by user actions. How often do users click on a search result, how long do they spend on the page and how often do they click back to the search results? Successful search engine optimization should always consider the user and their behavior.

I have lots of content pages, where the user spends some time reading but leaves without any further interaction. Will this 100% bounce rate affect my rankings negatively or does Google take into account that this is a pure content page?

You’re right. In Google Analytics, this would be a 100% bounce rate. Google does state in its guidelines that a page doesn’t need a low bounce rate, but I would still say that it is rare that you wouldn’t want to keep users on your domain.

The example of Wikipedia shows us that there is more to offer than just the content on the page. Wikipedia gives users numerous paths to follow, where they can access more content and inform themselves on related topics.

Google doesn’t actually use bounce rate as a ranking factor. Instead, Google measures the related metric, “back-to-SERP rate”. Here, Google looks more closely at how long it takes for a page visitor to return to the search engine results page. And this needn’t be a negative signal. If a user has found what they were looking for then they won’t have to go back to their initial search query. But if they don’t find what they were looking for then they will have to either click on an alternative search result or enter a new, more precise search query that may return more suitable results.

If we believe statements made by John Mueller (Google Webmaster Trends Analyst), then user signals aren’t relevant for rankings, and are only used to help train Google’s algorithms.

The webinar didn’t cover navigational and local search intent. Why not?

This was just to keep the example simple in the available time. Of course search intent is not limited to transactional and informational queries. I would also say that within each search intent category, there are further, more precise nuances that can help you adapt your content more closely to the users’ needs.

Website quality: How can I improve my rankings?

There is no quick-and-easy way of getting better rankings. But we do know a fair bit about Google’s ranking algorithm. Of primary importance is the fact that Google evaluates the quality of a website’s content and that this plays an important role in calculating the page’s position in the search results. But how is the quality of websites measured?

Who evaluates my website and decides if it’s good or bad?

Website evaluations are conducted by the search engine’s algorithm. The algorithm – e.g. Google’s – looks at numerous different factors, including content, links, user signals and technical aspects, and uses these to decide which URLs are most relevant for a search query. This is highly complex, but basically a website should be technically well-optimized and offer the most relevant content possible to have the best chance of achieving high rankings.

What can I do to directly influence my website’s ranking factors?

This is an extremely broad question that can’t be answered in just a few sentences. There are more than 200 ranking factors that can have varying degrees of influence. Looking at the individual factors and examining ways of optimizing for them could be the subject of a future webinar. Or of 200.

But one place to start would be the title tag, which is one of the most important onpage ranking factors. Using your main keywords in the page title is one of the most direct ways you can have influence.
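As a small, invented example: if the main keyword for a page were “back pain treatment”, the title tag could look like this:

    <title>Back Pain Treatment – Methods, Costs and What to Expect</title>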

Which search results get a Featured Snippet and how can I influence this?

It’s completely up to the search engines whether a Featured Snippet is shown or not. Just as they determine which content from the relevant website is shown. For detailed background on Featured Snippets, I would recommend reading Malte Landwehr and Izzi Smith’s Q&A on the topic or watching their webinar.

Google Analytics

The more data you have, the better you are able to understand your online performance. So it’s good to connect the data from your search engine optimization platform (like the Searchmetrics Suite) with user data from a tool like Google Analytics.

If I have a strong page, would you recommend using Google Analytics so that Google can see how users are behaving on the site?

It is generally a good idea to use an analytics program, because it gives you insight into your users’ behavior, which you can use to help make decisions. However, Google makes it clear in its T&C – and I would in no way suggest that Google is acting otherwise – that it does not use data from Google Analytics to evaluate websites.

Want to start applying your SEO knowledge in practice? Then arrange your personal Searchmetrics Suite software demo today:

Request a demo