Searchmetrics’ journey to SEO Mecca – PubCon Las Vegas 2011

November 14th, 2011 | Events, General

This year, we brought a big presence to PubCon Las Vegas. And, since we participated in quite a few interesting sessions, we wanted to pass on a bit of what we saw.

But before I begin: PubCon is massive and varied in its scope. With nine parallel tracks, the program is impressively large. The event is perfect for getting a broad overview of what’s happening in the internet marketing scene and for looking at topics that go beyond the realm of hardcore SEO.

Tips from the search engines themselves

Matt Cutts and Amit Singhal held a long keynote session that covered a variety of points. The most interesting from my perspective were:

  • SSL search will continue. By that, I mean that organic search keywords for nearly all logged-in users in the USA will no longer be forwarded to web analytics programs. This is part of Google’s long-term strategy to ensure that more personal searches are performed on Google. The change currently means that keyword information is withheld for up to 20% of US searches, and Google intends to roll it out worldwide.
  • As ‘compensation’, more ranking data will probably be offered in Webmaster Tools – either up to 5,000 keywords per day (instead of the current 1,000) or a longer historical overview. According to Matt Cutts, 96% of all domains can already identify all the keywords for which they receive SEO traffic.
  • Webmaster Tools shows all the links that Google sees, not just those that have a real effect on rankings. You therefore cannot conclude that the handful of helpful blog comments appearing in Webmaster Tools actually count toward your rankings.
  • Google has gone through the whole USA Panda complaint thread in the webmaster help-forum and put 500 domains in a spreadsheet to be checked by a staff member for ‘false positives’. In my opinion, this indicates at least a little hope for some unfairly treated victims since Google appears to be actually checking their complaints. However, if you have little evidence to back up your case that you were poorly treated in this – or perhaps even other – areas, then maybe it’s better to hold back on the complaints. After all, we are talking about a manual check by a Google employee…

Future trends in online marketing

  • ‘Social’ was THE buzzword. You could hardly sit three minutes in a session before someone mentioned Facebook, Google+ or some other integration and optimization possibility. There was nothing particularly new in this area. However, it was interesting to see that TechCrunch tried removing the sharing buttons from its site and lost an average of 20-25% of traffic.
  • Bing says that it can identify fake social interactions and even uses bounces as a signal when users return immediately from a site to the SERP. We also heard that bounces with a longer ‘dwell time’ are rated less negatively – something SEOs have long assumed is true of Google as well. Beyond this, Bing is thinking of displaying competitor links from Yahoo Site Explorer again; according to a few statements, the head of Bing Webmaster Tools needs more requests and tips for this to happen. So contact them!
  • Microdata was also a favorite topic, in the quest for more (and higher-quality) rich snippets, better disambiguation of ambiguous topics and perhaps even direct ranking advantages. For those who are less than gifted in the coding arena, a (pretty simple) generator is available at schema-creator.org

The Panda struggle

Of course, many sessions touched upon Panda, its effects, iterations and the resulting challenges. For example, the following suggestions were made:

  • Content-performance ratios can make sense as a rough evaluation tool. On the one hand, you can use the ratio between indexed pages and pages that receive traffic; on the other, the ratio between indexed pages and pages with deeplinks. From this information you can derive recommended actions, such as blocking pages via robots.txt, configuring parameter handling in Webmaster Tools, etc.
  • One speaker was particularly confident that by removing weak or bad links, you could rid yourself of the ‘Panda Penalty’.
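The ratio idea above is easy to try out yourself. Here is a minimal sketch in Python, assuming you have already exported three URL lists beforehand (the function name and the lists are illustrative): indexed URLs (e.g. from a site: query or a Webmaster Tools export), URLs with organic traffic (from your analytics tool), and URLs that receive deeplinks (from a backlink export):

```python
def content_performance_ratios(indexed_urls, traffic_urls, deeplink_urls):
    """Rough content-performance ratios: the share of indexed pages
    that receive traffic, and the share that receive deeplinks."""
    indexed = set(indexed_urls)
    with_traffic = indexed & set(traffic_urls)
    with_deeplinks = indexed & set(deeplink_urls)
    total = len(indexed)
    return {
        "traffic_ratio": len(with_traffic) / total if total else 0.0,
        "deeplink_ratio": len(with_deeplinks) / total if total else 0.0,
    }

# Toy example: 4 indexed pages, 2 with traffic, 1 with deeplinks
indexed = ["/a", "/b", "/c", "/d"]
traffic = ["/a", "/b"]
deeplinks = ["/a"]
print(content_performance_ratios(indexed, traffic, deeplinks))
# {'traffic_ratio': 0.5, 'deeplink_ratio': 0.25}
```

Low ratios point to large numbers of indexed pages earning neither visits nor links – exactly the candidates for blocking via robots.txt or for parameter handling in Webmaster Tools.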

Tools

  • SEOs are particularly data-driven in the area of keyword research – besides the AdWords Keyword Tool and internal site-search data (and even a few enthusiastic Scrapebox and Mozenda users), ubersuggest (now also for Google News) and soovle were recommended.
  • Excel, or specialized tools like Fookes Notetab, can be used to structure keyword lists.
  • YouTube is already a massive search engine in its own right in the US, so it makes sense to use YouTube keyword tools and YouTube SERP trackers like voot.net.
  • As always, graphic design tools are the best option for data visualization but Excel/Google Docs, Google Chart Wizard and Tableau were also recommended.

P.S.: Who's writing this stuff? My name is Sebastian Weber and I'm an SEO consultant with Searchmetrics. When I'm not writing blog posts, I'm optimizing sites from every corner of the internet, and trying to get to the bottom of phenomena, both in the online and offline world.
