Techmeme PageRank Penalty?

I just happened to glance down at the SearchStatus toolbar in the Firefox status bar, and noticed that Techmeme’s Google Toolbar PageRank had been reduced to 4.

(Screenshot: Techmeme Google Toolbar PageRank)

Techmeme Sells Links

Techmeme has very clear advertising in the form of sponsored posts.

(Screenshot: Techmeme Sponsored Post)

The links they use are redirects:

http://www.techmeme.com/goto/onair.adobe.com/blogs/onair/2008/01/30/businessweek-on-air-applications-at-demo/?sdid=BQUCI

If you check the HTTP status codes:

SEO Consultants Directory Check Server Headers – Single URI Results
Current Date and Time: 2008-01-31T03:10:07-0700
User IP Address: 213.158.xxx.xx

#1 Server Response: http://www.techmeme.com/goto/onair.adobe.com/blogs/onair/2008/01/30/businessweek-on-air-applications-at-demo/?sdid=BQUCI
HTTP Status Code: HTTP/1.1 301 Moved Permanently
Date: Thu, 31 Jan 2008 11:10:06 GMT
Server: Apache/2.2.3 (Red Hat)
Location: http://onair.adobe.com/blogs/onair/2008/01/30/businessweek-on-air-applications-at-demo/?sdid=BQUCI
Content-Length: 388
Connection: close
Content-Type: text/html; charset=iso-8859-1
Redirect Target: http://onair.adobe.com/blogs/onair/2008/01/30/businessweek-on-air-applications-at-demo/?sdid=BQUCI

#2 Server Response: http://onair.adobe.com/blogs/onair/2008/01/30/businessweek-on-air-applications-at-demo/?sdid=BQUCI
HTTP Status Code: HTTP/1.1 200 OK
Date: Thu, 31 Jan 2008 11:09:22 GMT
Server: Apache
X-Powered-By: PHP/5.2.0-8+etch7
X-Pingback: http://onair.adobe.com/blogs/onair/xmlrpc.php
Connection: close
Content-Type: text/html; charset=UTF-8

You discover that it is a 301 redirect used for tracking, and that in theory it could pass “Google Juice” – the link could be counted as a real link by search engines.
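
If you want to run the same check yourself, a few lines of Python will do it. This is just a minimal sketch using the standard http.client module, which does not follow redirects on its own, so the 301 status and the Location header are visible directly (the path is the sponsored link from above):

import http.client

# Fetch the tracking URL without following the redirect; http.client
# returns the raw response, so the 301 and its target are visible.
conn = http.client.HTTPConnection("www.techmeme.com")
conn.request("GET", "/goto/onair.adobe.com/blogs/onair/2008/01/30/"
                    "businessweek-on-air-applications-at-demo/?sdid=BQUCI")
resp = conn.getresponse()
print(resp.status, resp.reason)      # e.g. 301 Moved Permanently
print(resp.getheader("Location"))    # the onair.adobe.com target URL
conn.close()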

However, you then need to look at the Techmeme robots.txt file:

User-Agent: MSIECrawler
Disallow: /

User-agent: *
Disallow: /goto/
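
To see how a well-behaved crawler reads those rules, here is a minimal sketch using Python’s standard urllib.robotparser module (the sample URLs are made up). MSIECrawler is shut out completely, and every other user agent falls under the * group, so the /goto/ redirects are off limits:

from urllib import robotparser

# The Techmeme rules quoted above, fed straight into the parser.
rules = """\
User-Agent: MSIECrawler
Disallow: /

User-agent: *
Disallow: /goto/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "http://www.techmeme.com/goto/anything"))  # False
print(rp.can_fetch("Googlebot", "http://www.techmeme.com/"))               # True
print(rp.can_fetch("MSIECrawler", "http://www.techmeme.com/"))             # False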

Traditional SEO thinking is that robots.txt blocks the passing of Google Juice and PageRank.

That, however, isn’t how Matt Cutts has confirmed it to operate.

As I discussed in my SEO Linking Gotchas post, robots.txt does not prevent a page from accumulating PageRank; it merely stops a page from being crawled most of the time, and even that is fallible (there was further discussion on Sphinn).

Confused? So Am I

I can remember seeing Techmeme as a PR6, maybe even as high as a PR7. Robert Scoble seems to have removed his blogroll link to them, but honestly they don’t need the juice any more; Techmeme has tons of links.

I can’t believe, as some might suggest, that there is less PageRank to go around in the technology sector, a claim that has been made about the SEO sector in the past.

Techmeme hasn’t really changed its internal linking structure. From an SEO perspective it isn’t exactly ideal to my way of thinking, but that again would be highly contested, as many people believe you can’t benefit from controlling internal linking.

So Why A Penalty?

If those redirects were actually links to a static page, they would still accumulate PageRank even if blocked with robots.txt, and could still appear in search results based upon whatever data Google derives from the pages linking to them.

I’m linking again to my SEO Linking Gotchas post, just in case you ignored the first link.

It is possible that for a period of time Techmeme didn’t have a robots.txt file (mistakes happen), though when I last checked, back in October, the robots.txt was exactly the same as it is now.

The page is blocked with robots.txt, so Googlebot shouldn’t crawl it, unless for some reason it did anyway.

It seems to me that relying on robots.txt for paid links isn’t safe.

There are two safe options:

  1. Nofollow all links
  2. Use a meta nofollow on the redirect page (both are sketched below)
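
As a rough illustration of both options together, and not a claim about how Techmeme’s /goto/ handler actually works, here is a minimal Python sketch of a redirect page. The target URL, port, and lookup are invented for the example. Rather than answering with a bare 301, whose body search engines never parse for meta tags, it serves an interstitial page carrying a meta noindex,nofollow, forwards human visitors with a meta refresh, and puts rel="nofollow" on the visible link:

from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical sponsor target; a real /goto/ handler would look this up
# from the tracking data encoded in the request path.
TARGET = ("http://onair.adobe.com/blogs/onair/2008/01/30/"
          "businessweek-on-air-applications-at-demo/?sdid=BQUCI")

class GotoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Option 2: the redirect page itself carries a meta nofollow
        # (plus noindex), and forwards human visitors with a meta refresh.
        # Option 1 is the rel="nofollow" on the visible anchor below.
        body = ('<html><head>'
                '<meta name="robots" content="noindex,nofollow">'
                '<meta http-equiv="refresh" content="0;url=' + TARGET + '">'
                '</head><body>'
                '<a href="' + TARGET + '" rel="nofollow">Continue to the sponsor</a>'
                '</body></html>')
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

HTTPServer(("", 8000), GotoHandler).serve_forever()

Of course, a plain rel="nofollow" on every sponsored anchor achieves option 1 without touching the redirects at all, which is the simpler route.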

The only people who can confirm one way or the other whether Techmeme has some kind of penalty are Google.

If Techmeme has been given a penalty, they are not the only ones to have been penalized unfairly.

Update

It seems that whatever caused this has been fixed by Google or maybe hand edited. I wonder whether we will ever find out the cause.

I am not yet seeing the change on my SearchStatus toolbar, but a quick check on DigPagerank reveals that Techmeme is back to PR7.


Comments

  1. says

    But does it really matter? I have encountered several small business websites that have had a PR drop with absolutely no loss of rankings. One site in particular had a two-point drop while we upped the rankings on several terms.

    • Andy Beard says

      It depends… it comes down to user perception.

      If I were sitting here as a PR6, as the Google Directory says I should be, then maybe:

      1. I would get more subscribers
      2. The perception, to a layman, that when I write an SEO article or try to sell something based upon SEO, I might know more than someone with a PR3
      3. Higher listings in ratings charts using PageRank
      4. Maybe the press would have linked to me
      5. Maybe the first impression of the Wikipedia editor who was deciding whether to delete the entry about me that Igor wrote would have been different, and it wouldn’t have been deleted as quickly
      6. Maybe I would just get more invites to conferences for free, free product samples, or more beer bought for me.

      There is a positive and a negative side to it as well. I get fewer visitors from custom search engines and tools used by people looking to drop comments.
      Now you might not look on that as a bad thing, but anyone interested in SEOing their site, who might be an “internet marketer”, just happens to be a good target audience for me, as long as they are doing it manually.

      For Techmeme, I don’t know if it matters to that extent, but it would if they have now been given other penalties, maybe no longer pass PageRank, or if such things may happen in the future.

      If you have had a penalty for passing PageRank, how does that affect internal linking, subdomains you link to, or sister sites? Techmeme has other properties that probably do not receive as much link juice as Techmeme itself, but would be much easier to monetize.

  2. Gabe Rivera says

    Thanks Andy, very interesting. Techmeme used to have a PR7. I’ve been hearing about this -3 hit for a while.

    Do people see the absurdity here that I do? It’s like all the web’s publishers are now required to internalize a working model of Google’s PR algorithms.

    Funny how one gets the impression from your post that Techmeme’s webmaster acted with consideration of these issues in mind. Nope, I acted obliviously, blissfully so. I didn’t disallow /goto to supply pure inputs to Google’s PR calculations. I did it because dumb crawlers (not Google’s) were hammering that directory, seeking pages that weren’t there. My blissful ignorance has ended of course. I now know I’m expected to do more than just clearly mark the links as Sponsor Posts. I did nothing wrong or deceptive, but that’s no longer good enough.

    Hey Tim Berners-Lee, you once said “hyperlink by enclosing anchor text in an A HREF tag”. Hope you don’t mind, but Google’s amending that with “oh, and make sure to use a nofollow condom on links to entities with which you have commercial relationships, and if your links are redirects, robots.txt disallow the redirect urls…wait, better yet, nofollow those too”. Thanks for the simplicity Tim, but this is what progress looks like.

    Yes, we all know Google is free to set the rules for its search engine. And I agree with that. But I’d like to submit this: Google wants me to learn these arcane rules, apply them, and then prostrate myself for “reconsideration”. Is this a winning long-term strategy for Google?

    • Andy Beard says

      Gabe – SEO bloggers generally think that Google is going about this the wrong way, but it is difficult to shift opinion when most of the tech industry and world opinion generally have a love affair with Google.

      I made sure I stuck a question mark on the end of the title; nothing is 100%, and only Google knows, but it is the only possibility I can think of, though it defies a lot of conventional thinking from multiple SEO experts.

      The good news is that in your case it is quite simple to fix: just submit a reconsideration request from within Webmaster Central, stating that you previously had the links blocked by robots.txt and that you have now used nofollow.

      If you jump back up to PR7 over the next few weeks, it was most likely a penalty, though Google are very unlikely to confirm either way.

  3. Danny Sullivan says

    Yep, heard this last week, and Gabe and I had a little twitter about it:

    http://twitter.com/dannysullivan/statuses/636945522
    http://twitter.com/gaberivera/statuses/637443512

    And yep, the detour through a page blocked in robots.txt should be perfectly safe. It’s not just traditional SEO thinking; it’s exactly what Google says:

    http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=66736

    “Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file”

    So yep again, Gabe looks dinged unfairly. What annoys me most, really, is that the links are clearly labeled as Sponsor Posts. Yes, Google wants Gabe and others to use nofollow or block etc. etc. But I also expect Google to have built up its own intelligence at this point to figure out what’s paid even if the “machine readable” signs are missing. Words in a reverse box saying “Sponsored” ought to be just fine.

    • Andy Beard says

      Sorry it got stuck in moderation, Danny; links to Google.com get flagged because so many comment spammers use links to Google.com to try to build up their karma, or just use Google for redirects.

      I wasn’t aware you had been in contact on Twitter before I made the post.

      As it is, Google have had more links to their webmaster guidelines from this blog than from the people who requested a review.

      I totally agree that Google should be handling all of this in a different way.

      The traditional SEO thinking was in reference to my previous post, which extensively references the Matt Cutts / Eric Enge interview.
      Even if this is just a glitch in Google’s datacenter, it would still be advisable to nofollow those links, because Gabe has tons of them on every single page, and in the best-case scenario the redirects are hanging pages.

      Let’s think of a hypothetical website:

      A 100% horrible Flash-based site with no redeeming features.

      If it was indexable and you linked to it in a paid post, you could get a penalty.

      But what would happen if the whole website was blocked by robots.txt?

      If you link to a page that is blocked by robots.txt, it can still accumulate PageRank and get indexed.

      Could Google give a blogger a penalty for writing a paid post linking through to a site blocked with robots.txt?

      Then there is meant to be a layer of manual review, especially, I would think, for the larger sites.

  4. Patrick Grote says

    One day another search engine will come along with the vaunted “algos” Google once claimed to have. My feeling is that if you were to peer behind the curtain, so to speak, you would see a Google index process with much more wire and spackle than intelligent automation.

    The nofollow tag was a perfect mind twist that everyone readily jumped on to avoid comment spam. Hardly anyone at the time considered the ramifications of making everything nofollow. :)

    When the agent of record for the internet needs to rely on a human tag to help it do the job, well, we’re all in trouble.

  5. says

    I’ve been trying to get my brain wrapped around the toolbar update for some time now… I’ve never sold links, nor has anyone that I know of in my immediate web community. My assumption is the perceived PageRank drop does have something to do with 301 redirects, though… although I haven’t put my finger on it… But some of my other blogger acquaintances have never used a 301… So I’m not sure.

  6. Gabe Rivera says

    Thanks for the update Andy. I was contacted by…someone from Google, and much of what was speculated was confirmed. Techmeme was dinged for those links, but should not have been, given the disallowed redirects. Google evidently noticed the paid links during the week in 2006 when I did NOT have /goto disallowed, i.e. during the week Techmeme did pass PageRank (whatever that means :-) ). Google resolved to rectify.

    I still disagree with Google’s approach, but I should note I was treated exceptionally well yesterday. About as good as anyone could expect. I didn’t even have to submit a reconsideration thingy, whatever that is.

    • Andy Beard says

      Gabe, good to hear it was resolved. Having a story hanging on the front page of Techmeme might have nudged them to take action a little quicker, or maybe it was Danny nudging them about it on Twitter.

      So much for human verification, and possibly a two-stage process over a period of time, just in case of accidental errors.

  7. says

    Great to hear, Gabe — but wow, something from back in 2006 finally shows as kicking in. Wow. Remember, you’ve probably been prevented from passing link love to people for all this time. It’s just that no one publicly could tell that until the penalty.

  8. says

    Hi there, Andy .. I note in the screenshot above you have the PageRank tool and the Alexa tool together; how were you able to achieve this, please?

    Or is this a Firefox add-on?

    My F-Secure program will, funnily enough, not allow me even to download the Alexa Toolbar for IE .. now that is saying something .. lol
