
5 ways to definitely get hit by a Panda algorithm penalty

For the normal world, a Panda is just an animal; for SEO folks, it’s a diagnosis. In this post I will briefly talk about how Panda works, but the nitty gritty is actually about how to get hit by a Panda algorithm penalty. You may ask, “Why would you want that, Kevin?” Well my lad, when you know what you should NOT do, you get a much clearer picture of what you actually SHOULD do. I also strongly believe that by doing everything you can to avoid being hit by Panda, you set your website and company up for sustainable growth and SEO success. So learn how to turn these five factors around and you’ll have a recipe for preventing a Panda penalty.

 

great seo success

Picture Source: http://yourbrandlive.com/assets//images/blog/great_success_brandlive.png

 

Let’s take a brief look at how it works: Panda is a filter that is meant to keep “low quality” sites from ranking. The logic works as follows: each indexed page or group of URLs (a site or folder, which is why a single directory of your site can be affected instead of the entire site) receives a relevance score for a query. Navigational queries are an exception: since their relevance is clear, Google does not modify the search results. Generally, search results are filtered by relevance and displayed for the user in the SERPs. However, before a user receives the results of a query, they are sorted by Google’s criteria (e.g. the famous ranking factors). It’s important to understand that quality is measured continuously, which in turn has an impact on the search results. Think of Panda as a self-assessing system. We’ll talk about the five determining factors that come into play during this self-assessment of relevance and quality in a minute.

 

It’s also important to know that Panda does not only punish, but also rewards. Take a look at this screenshot to understand what I mean:

seo visibility positive example

Notice how the website started to drop at the end of November 2012 and didn’t recover until May 2014? On November 21, 2012, the 22nd Panda refresh took place, and its impact is evident in the site’s decrease in visibility. However, on May 19, 2014, Panda 4.0 was rolled out. The displayed site was able to recover because it avoided the five factors we’re about to talk about.

 

On the other hand, if you don’t follow these five factors, this might happen:

seo visibility negative example

But of course you wouldn’t want to do that ;-).

 

From this video, “How does Google use human raters in web search?”, we know that Google Quality Raters play an important role in determining thresholds used by Panda. The newest insights from a recently leaked version of the Google Quality Rater Guidelines are also in perfect accordance with the three overall determining factors regarding high-quality websites: expertise, authoritativeness and trustworthiness. This is quite an abstract model, so I want to provide a more tangible approach in this article.

 

Introducing the five wonderful ways to get hit by Panda:

 

1. Bad Content

The first thing to do, if you want to get hit by Panda, is to have bad content. I have separated this into five categories:

  • Thin:

Imagine you have a category page with only a few lines of meaningless text and hundreds of links to products. This is what we call thin content. Search engines need “food” aka content, to determine the relevancy of a page for a query.  All they can read is text – little text, little food. If you barely provide any information that’s accessible for a search engine, how are they able to understand what the page is about? The same goes for a user on your site.

  • Spun / automated:

Very often, bigger sites and brands deal with large numbers of pages that need to be filled with content. One easy way is to automate that content, e.g. by writing a boilerplate text that’s the same on each page except for a few variables. Search engines do not like this, and neither do users. We call this spun / automated content. Of course you cannot always write a unique text for each and every page, but it is vital to individualize content where you can. You should aim to provide users with as much tailored content as possible.

  • Aggregated:

Another popular way to fill a large number of pages with content is to aggregate content from other sources. As we wrote in our Panda 4 analysis, aggregator sites of news, coupons, software and price comparisons were heavily punished by the algorithm. While it makes sense to partially display content from other sources, it should only be to enrich your unique content, not replace it.

  • Duplicate:

When content on many sites or pages is too similar, we’re looking at duplicate content. This has been a ranking-killer for years, and we have weapons like the canonical tag and the meta robots noindex directive to fight it, but it still appears. While it’s difficult to rank at all with a lot of duplicate content, Panda will punish sites that have it at a large scale.
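As a quick illustration, the two standard defenses against duplicate content look like this in a page’s head (the URLs here are placeholders):

```html
<!-- Point duplicate variants at the preferred version of the page -->
<link rel="canonical" href="https://www.example.com/spaghetti-recipe/" />

<!-- Or keep a page out of the index entirely while still letting links be followed -->
<meta name="robots" content="noindex, follow" />
```

Use the canonical tag when the duplicate should consolidate its signals into one preferred URL, and noindex when the page simply shouldn’t appear in search results at all.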

  • Irrelevant:

When users do not find an answer to their question on a page, they leave. A page might seem relevant for a query, but falls short with what’s actually promised by the title and content for example. Imagine a page with text about a recipe for spaghetti that’s only about how well it tastes, where spaghetti actually came from and what it looks like. It contains the keyword “Spaghetti recipe” in the title, it comes up often in the content and even semantic topics are covered, but the page doesn’t contain the actual recipe. In this case, we’re speaking about irrelevant content, at least for the query “Spaghetti recipe”. Here, Google would also be able to determine the relevancy by analyzing user signals for this page; which brings me to the second way to get hit by Panda.

 

I would also categorize old content as irrelevant. Imagine a text about a topic that needs to be continuously up to date, such as “iPhone”. Writing about the first and the sixth one is like wearing two different pairs of shoes at the same time.

 

Tip: A good way to determine “bad content” is to look at how many pages of your site are indexed versus how many pages actually rank.
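A minimal sketch of that tip; the numbers are hypothetical, and in practice you would pull indexed-page counts from a site: query or Google Webmaster Tools and ranking-page counts from an SEO tool:

```python
def index_to_ranking_ratio(indexed_pages, ranking_pages):
    """Share of indexed pages that actually rank for at least one query."""
    if indexed_pages == 0:
        return 0.0
    return ranking_pages / indexed_pages

# Hypothetical example: 10,000 indexed URLs, only 1,200 of them ranking anywhere.
ratio = index_to_ranking_ratio(10_000, 1_200)
# A very low ratio suggests lots of "dead weight" (thin or duplicate) content.
print(f"{ratio:.0%} of indexed pages rank")  # prints "12% of indexed pages rank"
```

There is no official cutoff; the point is to watch the trend and investigate when the gap between indexed and ranking pages grows.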

 

2. Bad User Signals

As you might have read in our recently released ranking factor study, Google pays a lot of attention to user signals. We usually focus on the four most significant ones. So if you want Panda to be mad at you, make sure to have the following:

  • Low Click Through Rates

Ranking high does not guarantee that people will click through to your site, and this is exactly what I suggest checking. A low click-through rate tells us that either your snippet doesn’t promise what the user is looking for, or something else in the SERPs has caught their attention.

  • High Bounce Rate

A bounce is counted when the user clicks on your snippet and then quickly returns to the SERPs. This is a tricky one, because for some results the time on site is legitimately meant to be low; imagine someone looking for a sports game score. My assumption is that Google is able to distinguish between queries that warrant a low bounce rate and those where a higher one is natural.

  • Low Time On Site

Users aren’t spending a lot of time on your site in general. This is an indicator of low quality content.

  • Low amount of Returning Visitors

Users are not coming back after visiting your site. Good content is consumed again and again (think of “evergreen content”), but most importantly you want users to come back to your site as much as possible after they have discovered it. This is what determines high quality content in Google’s eyes.

google analytics

 

If you want to make sure these four metrics are in place, check them regularly and compare them to the overall value of your site. For example: you want to know if a bounce rate for a page is high or low. Let’s say it’s 65%. Is this too much or okay? In this case, I’d recommend you identify the average bounce rate for your website and then measure this value against the value of a specific page.
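A sketch of that comparison in code; the 10-point tolerance is my own assumption, not a Google threshold, and the per-page rates are invented analytics data:

```python
def bounce_rate_flags(site_average, pages, tolerance=0.05):
    """Return pages whose bounce rate exceeds the site average by more
    than `tolerance` (an assumed margin, not an official number)."""
    return {url: rate for url, rate in pages.items()
            if rate > site_average + tolerance}

# Hypothetical analytics export: bounce rate per URL.
pages = {"/spaghetti-recipe/": 0.65, "/about-us/": 0.48, "/blog/panda/": 0.81}
site_average = 0.55

flagged = bounce_rate_flags(site_average, pages)
print(flagged)  # → {'/spaghetti-recipe/': 0.65, '/blog/panda/': 0.81}
```

With a 55% site average and a 5-point margin, the 65% page from the example above gets flagged for review, which is exactly the judgment call described in the paragraph.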

 

Tip: Identifying and optimizing / reworking pages with high bounce rates is a crucial task for SEOs. In the third point we’ll talk about how to actually optimize them.

 

3. Bad Usability

Usability and user experience are each their own science, but being an SEO in 2014 means you need to deal with and understand these topics. While you don’t have to implement multivariate user testing for colors (even though I recommend it), you should make sure these following factors are poorly represented on your site if you want to get smacked in the face by Panda:

  • Design / Layout

Having a really old or simply ugly design is an invitation for users to bounce. While “ugly” is very subjective, I mostly mean a design that looks like one used by spammy sites, even when the site itself is not. Continuously redeveloping your site and regularly doing a refresh, or even a re-launch, is a good start for user experience optimization.

  • Navigation

Providing a structured navigation is an absolute must-have in order to optimize the so-called “user orientation scent”. While it’s of course crucial for internal linking optimization, and therefore the flow of PageRank through your site, users must also be able to find all products and content, and use the navigation instinctively.

  • Mobile version (redirects)

I don’t have to tell you how important “mobile” is, as you can read that all over the place. But in terms of user experience, I’d recommend you make sure mobile users are being redirected to the mobile version of the page. It’s important they are not being redirected to either the home page or a 404-error page. Of course, a requirement for this would be to have a mobile version of your site.  As Avinash Kaushik, Google Analytics evangelist,  expressed so nicely: “If you have a mobile version of your site, you’re now ready for 2008”.
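A sketch of the path-preserving mapping such a redirect should implement; `m.example.com` is a placeholder for your actual mobile host:

```python
from urllib.parse import urlsplit, urlunsplit

def mobile_equivalent(desktop_url, mobile_host="m.example.com"):
    """Map a desktop URL to its mobile counterpart, keeping the path and
    query intact -- never dump mobile users on the home page or a 404."""
    parts = urlsplit(desktop_url)
    return urlunsplit((parts.scheme, mobile_host, parts.path,
                       parts.query, parts.fragment))

print(mobile_equivalent("https://www.example.com/products/shoes?page=2"))
# → https://m.example.com/products/shoes?page=2
```

The actual redirect would live in your server config or application layer, but whatever implements it should follow this one-to-one mapping.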

  • 404 errors

Every site will inevitably produce 404 errors, e.g. when a user enters a URL that doesn’t exist or is no longer available (for the latter there are nicer solutions, like redirects). When users run into too many 404 pages, or when those pages are not optimized, they will leave the site. To avoid this, optimize 404 error pages by offering the user a next step, like a way to refine their search or navigate somewhere else on the site.

  • Meta-refreshes

A meta-refresh usually leads the user to another site after a couple of seconds. This is not only annoying, but also confusing. Don’t confuse your users.
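For reference, this is the tag in question (the delay and URL are made up); if you genuinely need to move users to another URL, a server-side 301 redirect is the clean alternative:

```html
<!-- Avoid this: forwards the user to another page after 5 seconds -->
<meta http-equiv="refresh" content="5; url=https://www.example.com/new-page/" />
```

A 301 happens before the page renders, so the user never sees an intermediate page at all, and search engines consolidate the old URL’s signals onto the new one.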

  • Site speed

Site speed is an official ranking factor, because it is an essential component of the user’s experience. Even if you have the best content available, if your site takes minutes to load, you cannot guarantee that users will stay. I could write an entire blog post around optimizing site speed, but for now you just have to know that it is important and should be optimized.

  • Ads (even though primarily targeted by PLA / top heavy, it plays into panda)

I don’t recommend you stop using ads to make money, but I want to make you aware that having too many ads will lead to problems. If you place too many ads above the fold or in the text, Panda (or rather the “top heavy” algorithm) will give you very awkward looks. The reason is simple: it’s annoying to users. Have you ever entered a site and suddenly ten layers of pop-ups try to sell you something before you have even had a chance to read the text on the page? It’s like walking through a store and constantly being stopped and asked “do you want to buy this?”, “why don’t you buy that?”, “buy already!”. How can that be a good user experience?

  • Flash

Flash and other formats that do not work on all common devices should be avoided. Sorry, Flash.

user experience

Of course a good user experience will have a direct impact on user signals. In the end, UX also determines whether a site is trustworthy or not, and in today’s Panda world, trust plays a special role.

 

4. Low trustworthiness

If you want users to stay and actually come back, your site has to be trustworthy. This is more important if you have a site within the financial industry, but it comes with general advantages for everyone. Proving trustworthiness leads to users recognizing your brand – even to a degree at which they’d click on your snippets just because they see your domain under the title.

 

Trustworthiness plays an important role on several touch points and starts early in the customer journey:

  • Snippet
  • First impression (Design & Layout)
  • Content
  • Conversion

 

In 2012, Danny Goodwin wrote an article on Search Engine Watch about the Google Quality Rater Guidelines and how page quality has been added to them. They also contain questions like “Would I trust this site with my credit card?“.

 

There are a couple of knobs you can turn to improve the trustworthiness of your site:

  • Provide contact information and address
  • Integrate trusted shop symbols and security seals
  • Provide privacy policy
  • Display reviews (internal and external)
  • Show testimonials
  • Avoid bad spelling and grammar
  • Serve your site over HTTPS
  • Show your client portfolio
  • Display press mentions and awards
  • Provide an “about us” page with biography / company history
  • Avoid heavily disruptive / interrupting ads

 

If you need inspiration from well-trusted sites, look at top players like Amazon or eBay. The more you optimize your site towards being trustworthy, the higher chance you have for seeing conversions. But don’t get too carried away as you can also optimize too much, which brings us to the last point.

 

5. Over optimization

 

Optimizing “too hard” can definitely lead to a penalty. In most cases this is referred to as a “manual spam penalty”, but it can also play into Panda. I have encountered two main “spam” playgrounds in my career: content and internal linking.

 

Over optimized text for example can easily be detected by simply reading it. Often you can immediately “feel” the difference between a naturally written text and one written for search engines. Of course you have to include the keyword that you optimize for in the text, but the times of keyword density are long over. Don’t stuff the text with keywords, just make sure it occurs a couple of times (naturally) in the text, headings, title, description and URL. It is much more important to cover semantically relevant topics and be comprehensive.
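There is no magic density, but a rough sanity check like this helps confirm a keyword appears a few times without stuffing; the keyword and page copy here are made up:

```python
import re

def keyword_occurrences(text, keyword):
    """Count case-insensitive whole-phrase occurrences of a keyword."""
    return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

# Hypothetical snippet of page copy.
copy = ("Our spaghetti recipe is simple. This spaghetti recipe needs "
        "just five ingredients, and the recipe takes twenty minutes.")

count = keyword_occurrences(copy, "spaghetti recipe")
print(count)  # → 2
```

If that count dwarfs everything else on the page, the text was probably written for search engines rather than for readers.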

 

For internal linking, over optimization would consist of too many internal links and the use of a hard anchor text. As with backlinks, there are thresholds search engines use to determine whether an internal link profile looks natural or optimized. An example of an over optimized internal linking profile would be using the term  “buy cheap iPhone 6” on every page to link to your product page. If you don’t go overboard, you should be fine.
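A sketch of how you might audit that; the anchors are invented crawl data, and the point at which a share becomes “too concentrated” is a judgment call, not a known search engine threshold:

```python
from collections import Counter

def anchor_share(internal_links):
    """Share of internal links using each anchor text. A single hard
    commercial anchor dominating the profile is a red flag."""
    counts = Counter(anchor.lower() for anchor in internal_links)
    total = len(internal_links)
    return {anchor: n / total for anchor, n in counts.items()}

# Hypothetical crawl output: anchor texts of links pointing at one product page.
anchors = ["buy cheap iPhone 6"] * 8 + ["iPhone 6", "our phones"]
shares = anchor_share(anchors)
print(shares["buy cheap iphone 6"])  # → 0.8
```

A profile where 80% of internal links use the same money keyword, as in this example, looks optimized rather than natural; a healthy mix of branded, generic and descriptive anchors does not.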

 

To wrap it up: if you’d like to experience getting hit by Panda and lose approximately 50% of your traffic, and therefore revenue, make sure your site suffers from the five points I mention above. If you’d like to improve your success (and avoid Panda), fulfilling Panda’s requirements for high-quality sites is the way to achieve sustainable rankings in the long term. To do this, get these five areas right: content quality, user signals, user experience, trust and moderate optimization.

 

Kevin Indig


Kevin Indig has been an SEO Consultant for the Searchmetrics Pro Services team. He helps enterprise companies implement critical SEO strategies.

44 thoughts on “5 ways to definitely get hit by a Panda algorithm penalty”


  • Satish Kumar Ithamsetty 2014/09/25 at 2:04 pm

    Dear kevin,

    Google panda update is a terror for thin content blogs. Your article clear and informative. Every point explained in very simple way. Thank you for your sharing.

    Regards
    Satish Kumar Ithamsetty
    http://www.BloggingDen.com

  • Bounce rate isn’t actually that tricky IMHO because as Google has said, they don’t actually track “bounce rates”, but they have patents and are likely using metrics which are effectively just measuring the same thing a bit differently. It’s essentially looking at # of clicks in a user session (on google results page) before clicking on your result, and # of clicks by a user after clicking on your result. So if someone bounces, that’s fine presumably. If they bounce and then continue to click on other search results from that query, that implies your result was not a good fit.

  • I agree. Plus it’s a little tricky to actually find out whether a user was happy with the result or not. Imagine one is looking for simple information like a date and only spends 3 seconds on the actual page. Or imagine a user simply wants to compare two results and then makes it seem like he bounced, but actually found the first result to be good. So I think it’s really about combining different user signals and metrics and that’s what Google seems to do.

  • My domain has not improved.
    More than 10 months without to see any improvement.
    On Mibdoa that there is a problem in the Arabic websites

  • Yes, there are different iteration cycles for different languages / parts of the world

  • Jan-Willem Bobbink 2014/10/01 at 2:43 pm

    Hi Kevin, maybe an idea for a follow up is how to diagnose your website for being hit by Panda?

  • Hi Jan, funnily we are already in the making ;-). But I appreciate your input!

  • I’ve carefully read your post, but found nothing interesting for me, sorry. Panda it is a much bigger problem, than it is seems.

    I’ll give you an example. My site (I don’t like to spam, but you can find it under my name), had very good positions by all the queries I need. it was at the 4’d place at my main query (1 word and more than 360 000 requests in month) and, also, it took top10 in all other queries.

    At the sept 14, i was punished by Panda VEEEEEEEEEEEEERY hard…All of my pages was written manually, some pages had some content from ancient books, but anyway were unique…

    What’s happened. I can see here 2 main factors.

    1. One dude from India copy-pasted ALLLLLLL my site with all the structure, styles, everything…and have opened an internet store with more than 1500 pages…

    You can ask me – where is the problem. I’ll give you an answer: When i copied any paragraph from my site and put it to google search string I’ve found HIS site, not my, my site was in the omitted results!!!

    My domain was registered a year ago, his…may be 1 month before Panda update!

    Google decided, that I stole all the content from this F@@@@@ site…

    2. The second point was that one stup@d girl, copy-pasted ALL my MAIN PAGE with all styles, headers, etc. to her blog.

    Unfortunately she had a popular blog with PR=3 and with authority better than my main page.

    Guess, what google did??????

    He decided that I stole this content again.

    So…conclusion – Google can’t recognize a real source of content!!! He looks only to his metrics and only!!!

    I spent 1 year, 1 year to make this site!!! I’ve translated MANUALLY ancient text on Latin to made main 13 pages of my site…

    What do I have today! I’ve hardly punished by Panda, my results for today are only 3’d page and more at any queries!

    1. I have the most extended content in my niche;
    2. My site fully discovered the main theme
    3. I’ve used TRUE sources of information (where it was appeared for a first time, XIV century)
    4. I had never manipulate with the links, I didn’t by anyone!
    5. I’ve changed my pages many time with another content to improve this panda update!

    Today, I’m a looser with less than 300 visitors a day and poor adsense earnings…

    Panda update, described by your blog must to help small sites (not the big brands) to ranks higher!

    I can see another picture! Poor sites with almost NO CNTENT and big jewelry brands in top 10 + absolutely bad (as we can see in MOZ metrics page) takes 3’d place with page authority 14…

    PANDA is NOT TOO EASY as you’ve described!!!

  • Very nice post, some rumors that new Pandas will come on next week. Let´s wait what happen!

  • Thanks for sharing this great article Kevin, I suspect my site was hit but Panda and I’m desperately trying to improve the quality of my link profile. I found this article jellyfish.net/blog/seo-best-practices-in-manual-penalty-removal/ , which is great but I’d love to know your thoughts about manual penalty removal, Is this something you can write about at all?

  • Hi Elizabeth, thanks for the kudos. Panda actually is less about backlinks, but more about quality and content. I deal with manual penalty removal on a daily basis, but this is another pair of shoes and has nothing to or with panda or penguin. Did you receive a manual penalty message within the Google Webmaster Tools?

  • Ok that good to know, I’m actually just being suspicious as I noticed a massive drop in my website rankings.

  • Sure. Do you have a Google Webmaster Tools account? If you look under “manual penalty” and find no message, you don’t have a manual penalty. But there has been a recent new roll out of Panda (4.1), which might have affected you…

  • I am taking a long breath once i finished reading your article. Simply its awesome with clean instructions.

    Learned a lesson how to write better blog post and i salute your hard work.

    Ummmmmmmmmmmmmmmmaaa…

  • good list – except it is incomplete

    I posted this list and more months ago:

    http://themoralconcept.net/pandalist.html

    🙂

  • I found very helpful for my website as I have just started developing it for Internet marketing and SEO updates.

    Thank you its inspiring for me to write a good blog post in future.

    mahesh gelani

    http://www.Maheshgelani.com

  • Tip: A good way to determine “bad content” is to look at how many pages of your site are indexed versus how many pages actually rank.—-

    You are kidding me right> Just because a page does not rank does not mean that it is of thin content.. there could be thousands of reasons behind that .. probably the page does not have strong back links or probably it is not spamming…

  • Sarah | Houston IT Solutions 2014/10/28 at 4:52 pm

    Panda is a nightmare for thin content pages, and what I’m reading here is the exact opposite. Highly ranked, I read the article in detail and have been on the page for at least 10 minutes, and it doesn’t read as spammy. Bloggers, if you want to see what qualifies as good content, this article itself is a fantastic example.

  • Peter Giammarco 2014/11/12 at 1:17 am

    Great article, I was wondering what I had to look out for after the update. This article did a good job at explaining it. I need to focus on testimonials, reviews, and a privacy policy. Thank You

  • Common sense is need for SEO. Just think. I am providing something useful? Will it help people? Think of the user and not the search engine.

  • Lesson learnt . My stats on one of my sites proved the point that High bounce rates and low time on a site does have an effect. The site in question is one that people just drop onto to get a small piece of information and leave.
    However the initial effect has stablised through october.

  • Hi there, really nice post.
    It would be nice to know also what is good average for returning visitors. Although I am not sure how G calculates that. If I am targeting only on commercial keywords ald trying to capture only hot leads (without email subsritipon and other stuff), my returning visitors will be bad. Mostly they are calling for purchase and then leave. So, it means, I will get penalized. Which would be wrong, wouldn’t it? Regards, Matija, Slovenia

  • Hey Matija, thanks for the kudos! It’s difficult to give a good average relation between returning and new users, because it heavily depends on the website / market / business model. However, I generally strive for a classic 50/50 split. A balance of new and retained clients is usually preferable.
    There are several ways to interpret returning users. In order to understand the metrics better, I’d suggest to segment them, meaning putting them into context by adding more variable into the picture. One example: what is the distribution of the average time spend on your site per returning or new user?
    Hope that helps!

    All the best from Silicon Valley,
    Kevin

  • How to deal with a quotes/ quotations website and how to get rankings…

  • nice article and good stats to back it up.

  • Panda and Penguin are two scary algorithmic updates from Google which ruin down several blogs and websites by lowering their rankings on search result pages. When a blog or website is hit by these updates, its rankings in SERPs drop over night and the hard work of several months and even years may get vanished in minutes.
    So is there any way to rank panda hit site again, how to do fast recovery again?

  • Thankfulness to my father who shared with me on the topic of this webpage, this webpage is in fact awesome.

  • Web traffic is very important for the popularity of your website. Before hiring an SEO agency make sure they’ll send you monthly reports of what they’ve done and the difference it’s made to your rankings for the agreed search terms. Search engines take time to consider your page quality and all the other aspects of your page when you update something.

  • directory listings seo 2016/01/25 at 7:05 am

    Everyone loves what you guys tend to be up too.
    This kind of clever work and reporting! Keep up the great works guys I’ve you guys to blogroll.

  • That is a great tip especially to those new to the blogosphere.

    Brief but very accurate info… Thank you for sharing this one.
    A must read article!

  • That is a great idea for seo tips for ranking factors .

  • valentines day 2016 2016/02/01 at 9:41 pm

    That is very fascinating, You are an excessively skilled
    blogger. I have joined your feed and look ahead to in quest
    of more of your wonderful post. Also, I have shared your website in my social networks

  • mobile search optimization 2016/02/13 at 9:48 pm

    This website was… how do you say it? Relevant!! Finally I have
    found something which helped me. Appreciate it!

  • Hi Kevin, maybe an idea for a follow up is how to diagnose your website for being hit by Panda?

  • Hello

    Well, Google has made it difficult for newbies like us, with these algorithm updates. I am really trying hard to learn to comply with these updates, after one of my blog got hit from last panda update and I guess Penguin is also coming.

    So, reading a lot about these updates, so I can remove the risk of penalty from my blogs.

    Is there any special thing, which I can follow?

