Lowering The Google Red Flag – Sidestep The Cash Hungry Bull

(Photo: Lowering the Flag, modified)

With all the previous discussion of paid reviews and my unwillingness to raise the white flag or bend over, this post is going to come as a bit of a shock.

I am lowering the red flag.

Carry on reading to find out why this isn’t the same as raising a white flag, and is far from surrendering to Google on paid reviews.

Robots.txt

I have spent a long time deciding on a course of action, and have decided that blocking my content using robots.txt is ultimately better for me, and better for the people hiring my services.

It also happens to be worse for Google than the current situation, but that is the beauty of this strategy.

It might be harder to rank, but pages blocked using robots.txt still gather PageRank and can appear in the index, though they would be looked on as dangling pages.

Ultimately, links can always be redirected to a follow-up review which refers to the first, and that follow-up isn’t a paid review.

It is a little naughty: some people will sometimes receive editorial links within reviews and receive a trackback, but I don’t know of any spam plugin that checks robots.txt, plus the links will still be valuable in other search engines.

Google’s Achilles Heel With Paid Reviews

The only domain on which a client is paying for a review is this one. When my content appears on other sites, there is a totally different editorial process, and links can in no way be looked on as paid links.

Content syndication is extensive:

(Diagram: Paid Link Reviews Syndication)

1. Social Bookmarking

Sites such as BloggingZoom encourage more than just a single line of description and a rewritten title on submissions, and deliver not only traffic from their existing user base but also search traffic.

2. Hub Pages

Many content sites allow you to use syndicated content in the form of article feeds, and content is even picked up by larger sites such as Topix.

3. Authorized Syndication

You can arrange for your content to be selectively syndicated on authority sites, such as Andy Beard on WebProNews and even my WordPress SEO reviews published on SearchNewz.

Whilst I haven’t made it clear recently, I publish all my content under the GPL; in fact, I am switching to the GFDL with an invariant clause requiring a live hyperlink back to the original without nofollow. I prefer the GFDL over Creative Commons because of this flexibility (for me) to be highly specific.

In future I am going to be actively encouraging syndication.

4. Unauthorized Syndication

This is technically the same, but as long as people scraping my content are linking back to me, preferably with a followed link, it is great (see the sketch just after this list for one way to check). I am not even worried about some light spinning of the content, as long as they state that the content has been modified and is only based on my original.

5. Indexed Search Results & Aggregators

This covers the likes of Technorati, and feed readers that are indexed – I have no intention of blocking reviews from RSS feeds.

6. Multimedia

I use a lot of pictures and screenshots for my reviews, but this is going to increase. In addition, I will also be creating podcasts and screencasts, which will be widely distributed in their own right.

Hooray for Universal search!
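
On the subject of followed links from scraped copies (point 4 above), here is a minimal sketch of how such backlinks could be checked with the Python standard library. It is illustrative only – the URLs and the class name are hypothetical placeholders, not a tool I actually use:

from html.parser import HTMLParser
from urllib.request import urlopen

# Hypothetical URLs, for illustration only
ORIGINAL = "https://www.example.com/2007/05/bidvertiser-review.html"
SCRAPED_COPY = "https://scraper.example.net/copy-of-my-review"

class BacklinkChecker(HTMLParser):
    """Count links pointing at the original post, split by nofollow status."""

    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rel = attrs.get("rel") or ""
        if href.startswith(ORIGINAL):
            if "nofollow" in rel:
                self.nofollowed += 1
            else:
                self.followed += 1

checker = BacklinkChecker()
checker.feed(urlopen(SCRAPED_COPY).read().decode("utf-8", errors="replace"))
print("followed:", checker.followed, "nofollow:", checker.nofollowed)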

No Nofollow = Editorial Backlinks

Because I am not using nofollow in my reviews, it is most likely that syndicated copies of my reviews will provide backlinks not just for me, but also for my clients. The backlinks are editorial in many cases: someone has chosen to syndicate my content.

Unfortunately, Google use backlinks to attribute content to an original source, but that is a whole lot harder for them if they can’t index the original. It will be interesting to see which of the sites syndicating my work rank highly, and how many of them do.

Linking to Syndicated Content

This is something I haven’t decided on yet, but just like I can link through to my various social profiles, I do have the option to link through to my content on other domains after it has been syndicated.

Worse for Google

My content will still be in the index, filtered through an extra layer of editorial control, but there is going to be a whole lot more of it.

Google have made it clear that they are only worried about the existence of links – not the time it takes to create content, the expertise involved, or whether links within reviews were specified or given in an editorial capacity.

(Photo: Matador, modified)

I honestly don’t like junk reviews written purely for SEO purposes, but Google seem determined to impose the letter of the law rather than the spirit, throwing the baby out with the bathwater. Whilst I will comply with the letter of the law, I can’t see a reason why I shouldn’t sidestep the charging bull.

Nofollow is not the answer to Google’s troubles

Update

There seem to be some misunderstandings, and I need to clear them up.

1. The blocking hasn’t happened yet – it is the next thing on the to-do list.
2. I intend to get more search traffic from Google by taking this action, not less.

Update 2

Robots.txt has now been modified:

User-agent: *
Disallow: /Recommends/
Disallow: /downloads/

User-agent: Googlebot
Disallow: /2007/08/plagiarism-checker-outsourcing.html
Disallow: /2007/07/gather-success-review.html
Disallow: /2007/06/wordpress-seo-masterclass-for-competitive-niches.html
Disallow: /2007/05/bidvertiser-review.html
Disallow: /2007/05/seo-consulting.html
Disallow: /2007/04/ibegin-source-review.html
Disallow: /2007/03/sponsored-reviews-now-live-in-depth-review.html
Disallow: /2007/03/volusion-review-and-suggestions.html
Disallow: /2006/12/search-engine-glossary.html
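
One caveat, assuming standard robots exclusion behaviour: a crawler obeys only the most specific User-agent group that matches it, so Googlebot should follow its own section and ignore the User-agent: * rules above. If /Recommends/ and /downloads/ are also meant to be off-limits to Googlebot, those Disallow lines would need repeating in the Googlebot group. A quick sanity check is possible with Python’s standard urllib.robotparser (the first path is from the listing above; the /Recommends/ path is made up for the demonstration):

from urllib.robotparser import RobotFileParser

# The rules from the listing above, pasted inline for the demonstration
rules = """User-agent: *
Disallow: /Recommends/
Disallow: /downloads/

User-agent: Googlebot
Disallow: /2007/05/bidvertiser-review.html
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Googlebot matches its own group, so the global rules do not apply to it
print(rp.can_fetch("Googlebot", "/2007/05/bidvertiser-review.html"))  # False - blocked
print(rp.can_fetch("Googlebot", "/Recommends/some-product"))          # True - still crawlable
print(rp.can_fetch("OtherBot", "/Recommends/some-product"))           # False - blocked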

The list is quite short, but now that I have a strategy in place, I will be writing a lot more paid reviews.

Whilst this might be looked on as insignificant, some of those pages rank quite well for very useful terms, and are probably worth 2000+ visitors per month.

Update 3

Whilst the changes in robots.txt were quite straightforward, before making any reinclusion or reconsideration request I thought it important to check the robots.txt within the Google webmaster console.

First of all I waited for it to be refreshed by Googlebot, which seems to happen approximately once every 24 hours.

(Screenshot: Googlebot has fetched my new robots.txt file)

There is an option to just copy and paste that refreshed data by hand, but waiting for it to be fetched is conclusive.

Next I entered the URLs which need to be blocked by the robots.txt file, and checked them.

(Screenshot: output from checking that the URLs are blocked according to the robots.txt)

In theory Googlebot will now be blocked from crawling the “offending” pages, and I will be able to ask for reconsideration.
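
For anyone wanting to repeat this check outside the webmaster console, a rough local equivalent can be scripted. This is only a sketch using Python’s standard urllib.robotparser – the domain is a placeholder, and it approximates rather than reproduces Google’s own tool:

from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # placeholder - substitute the blog's real domain
BLOCKED_PATHS = [
    "/2007/08/plagiarism-checker-outsourcing.html",
    "/2007/07/gather-success-review.html",
    "/2006/12/search-engine-glossary.html",
]

rp = RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt, as a crawler would

for path in BLOCKED_PATHS:
    allowed = rp.can_fetch("Googlebot", SITE + path)
    print(path, "crawlable" if allowed else "blocked")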

Photo credits
Lowering the Flag (modified)
Matador (modified)


Comments

  1. says

    I tell you what, Andy, I admire you for standing by your principles. Instead of just complaining about the paid links issue you’ve stepped up and voted with the only mechanism that counts: your own site.

    I believe it’s Google’s right to dictate what types of requirements it wants to hold sites to before they index them, but it is also your right to hold the same truths and withhold your content until they are in accordance with your own standards.

    • says

      It takes guts to do this, and I, too, admire this. I’m glad that I now know a way of blocking posts containing paid reviews. For a total blogging ignorant like me, this post is priceless. I never thought someone could do this. Excellent.

  2. says

    Were there any preparations you made before blocking them? Also, if it’s still in the index, how is that blocking Google? Or is it just its existence that’s indexed, and not the content?

    Also, for now I just see two folders disallowed, and for all crawlers, not just Googlebot.

    Finally, not to be rude but I’m having a hard time with your grammar. Specifically, I don’t understand this:

    “It might be harder to rank, pages blocked using robots.txt still gather PageRank, and can appear in the index, though they would be looked on as dangling pages.

    Ultimately links can always be redirected to a followup review which refers to the first, and that followup isn’t a paid review.

    It is a little naughty, some people will sometimes receive editorial links within reviews and receive a trackback, but I don’t know of any spam plugin that checks robots.txt , plus the links will still be valuable in other search engines.”

    Looking forward to more detail! I’m considering something like user agent: googlebot
    disallow: /

    and want to hear your thoughts. BTW, do you have MSN Messenger? If you do, it’d be great to chat :)

    • says

      Hi Gab

      In my Linking Gotchas article I encouraged people to read the Matt Cutts interview with Eric multiple times.

      It really is important to understand that there is a difference between blocking with robots.txt and using noindex, for instance.

      I haven’t made the change to robots.txt yet, but it will just disallow Googlebot from specific articles.

      I haven’t made any special preparations for this; I don’t really need to, as my content gets splogged by probably 50+ blogs by now, plus other legitimate syndication.

      I wouldn’t block Google totally; I am just doing this to stick to the letter of the law that Google seem to be insisting on… that is their problem.

  3. glengara says

    Good luck mate. You made a big deal about not selling PR, but I saw one of your “reviews” – you were selling anchor text links….

    • says

      There is actually a huge difference between selling anchor text links, and choosing to give SEO friendly links when writing a review in an editorial capacity, whether paid for or not.

      Just read through my blog, and you will find almost every single link uses good anchor text where possible, and I often reword what I write to ensure people get a good link.

      Even the links I just used to Flickr use good anchor text, not just “photo credit” which is what you find on most blogs when they use other people’s pictures.

  4. Doug Heil says

    Yes; good luck with it. I’m with glengara.

    It’s amazing how obsessed people are with google and the paid review thing. Just a thought though; what if you did things just for your visitors only and did reviews for your visitors only, and did anchor text stuff just for your visitors only? You seem to be doing things only for google, and then writing post after post about what you did and why you did it because of google. Just wondering if you think that is a good strategy?

    • says

      Doug

      Descriptive anchor text is helping users.

      I could have just done

      Picture Credits

      Or even

      Credits 1 2

      I have even seen

      Look at what everyone has to say
      1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

      Then there are the sites that talk about others without linking at all

      My company is based in the UK – it is my belief that I am legally obligated to use user-friendly links, and it is just good practice.

      As I am writing about SEO, using junk links wouldn’t be a very good example to my readers, would it?

  5. Doug Heil says

    Well yes; but if it were obvious you do things for your visitors only, people like me would not be writing that it seems to be all about google and how google harms site owners, etc. Even the title of your blog post has the word google in it. It just seems to me that seo’s need to sit back and relax a bit instead of doing most everything because of google. Either doing or not doing. The paid links issue is really not this difficult.

    Andy; I agree with lots of things you write about, etc, and I also disagree with things as well. My point is that if you just relaxed a bit and did things for your visitors only you might be better off. Common sense should also tell you that a paid review is a paid review. It isn’t anything else, so why should that link be given a boost in a search engine anyway? You see; I disagree with the entire premise of paid links being given a boost, no matter how many seo’s wish it to be so.

    • says

      I look on Matt Cutts doing a review of Google Reader as being a paid review, or TechCrunch reviewing a site in which Michael Arrington has equity.

      The only difference is employment status.

      If I do an SEO review of someone’s site, I need to link to it – if I want to share that review with my readers as content, why should I have to block the links?

      There are people in the industry who have written paid reviews quite openly – I know Matt Cutts reads their blogs – yet they haven’t had to suffer their green pixie dust being removed, or other potential penalties.

      There are far worse things that Google should be looking at, such as the WordPress.com tags which make it a linkfarm.

  6. says

    Just on the point of anchor text, UK accessibility legislation requires “companies” to maintain accessible websites. Most accessibility experts agree that an accessible web site is one that reaches WAI Level 1 and attempts to meet Level 2 (or, in old money, Bobby AA and AAA).
    To meet Level 2:

    13.1 Clearly identify the target of each link. [Priority 2]

    Techniques for Web Content Accessibility
    Therefore a UK business would be obliged to provide not only a search-engine-friendly link but, more importantly, a human-friendly link.

    • says

      And there’s the thing, Tim. Make a link human friendly via the anchor text or title and you’ve probably made it search engine friendly too.

      Of course what they define as a company is pretty woolly but that’s for another day ;)

  7. says

    Well this is one that I definitely want to watch! Absolutely hats off to you, Andy, for taking this stance.

    Ultimately links can always be redirected to a followup review which refers to the first, and that followup isn’t a paid review.

    Too right. This is something that I advocated last year as a work around. Ultimately how can the Big G penalise you for that especially if you’ve robots.txt’d them off the original page?

    I look forward to hearing how things work out with this.

  8. says

    Andy, good job.

    I think though, especially reading the comments here and elsewhere, that most people really won’t understand the logistics behind what you are doing, despite the explanation that you gave. Great thinking though. :)

    • says

      Michael, I have seen some of the comments elsewhere.

      I gave some really big hints to some of the things I am going to be playing with, both in this article and some of my previous articles.

      If I have a nofollow in a post, every syndicated copy of my content also gains a nofollow, unless I go the complicated route and use some of Sebastian’s nofollow cloaking.

      If I use nofollow, Google wins by unanimous decision.

      By using this method, whilst I am certainly not happy to be doing it, I am taking a few punches, but win on points.

  9. ny seo says

    at the very least we have to say Andy makes it even more interesting.

    my mouth is watering with anticipation over what you might come up with next…

    i might be obligated to send you royalty checks soon, thanks again

  10. says

    RE: Update #3

    Did you request it yet? What do you think the odds are, realistically, that they will reinclude you right off the bat? In what kind of timeline do you think you will get it, or how long are you willing to give it before you step up the noise level?

    You know they made Donna jump through some hoops when she did hers, right?

    • says

      First off I am waiting for them to all be reported as being blocked after a crawl. I might have to give the Googlebot a nudge with some more links, but we will see tomorrow.

      I know Donna had to jump through hoops, though most of that was missed posts – I kept very close track of mine.
      The same was true for Wendy (Emom) and Yaro.

      I haven’t blocked follow-up posts that may have linked to the same domain.

      I will also be doing something regarding disclosure, and will soon have an advertising page explaining the robots.txt blocking.

      Who knows how long it will take after that.

  11. says

    Andy,

    I like your solution better than my attempt a few months ago to completely ban Googlebot from my site.

    After trying my hand at sponsored reviews I gave up. The reason is simple: you have set the bar for how these reviews should be done. Anyone who is not trying to match the level of your reviews should simply be kicked off PPP and other places.

    It is a shame that some people have gotten away with murder in the past, while someone like you has to resort to these manipulations.

    I have to admit your approach makes more sense than the nofollow crap.

  12. says

    Andy,

    Am I smoking crack, or did I read in one of Matt Cutts’s quotes over the past few months on this issue, I believe in relation to directories, that paid links are not the problem – paid links without editorial review of the links are the problem? If that is the case, then why should GOOG have a problem with what you do, or what Pay Per Post does, or anybody else who claims to have an “editorial” process? Rhetorically asked, of course.

  13. says

    This post has given me a lot to think about when it comes to sponsored or paid posts. I don’t mind reading them on other blogs as long as they aren’t overdoing it, making up stuff that you know isn’t true, or writing about a site or product that they have probably never used. I understand what you are saying in this post and wish you the best of luck.

  14. says

    Andy,

    I just wanted to write to say I agree with what you are doing. I don’t understand some of the challenges you are receiving here… to help visitors understand where a link leads means using appropriate anchor text. That just makes sense, and that’s the reason why search engines consider anchor text in rankings.

    I’m looking forward to seeing how this all washes out. Please keep us updated.

  15. says

    So are you going to end up placing a new Disallow in your .htaccess for every paid post that you do? Sorry, I’m a bit confused on how this works.

  16. says

    A lot to digest here. Not for the first time, I’ve had to print out one of your entries so that I could take it all in. And use my highlighter. It’s refreshing that instead of complaining you’re looking for solutions.

  17. says

    Just wondering… how effective is using robots.txt to set a nofollow on a page, compared to using nofollow meta tags for each page?
    For some of us looking into using robots.txt, we would sure be dealing with a long list of disallow URLs, and updating them looks to be more of a hassle compared to using meta tags.

  18. says

    I have to admit that I like your solution, rather than banning Googlebot.

    SEO friendly links are most often a good thing, making the link more user friendly. People are obsessed with Google and the paid reviews.

    I can see only one weak point here. It is similar to viral marketing. How many people will want to pay for a potential negative review or no-follow links?

  19. says

    So, how would this prevent Google from seeing what’s on your main page?

    I can understand that it will prevent Googlebot from seeing them once they’ve rolled off the main page, but won’t those links still be indexed?

  20. says

    I’d love another update – I noticed most of the pages you blocked have a blank PR toolbar reading.

    You’re still passing internal PageRank to those pages, and I’m really curious to see if a blocked page gathers PageRank over time.

Trackbacks

  1. Andy Beard Blocks Googlebot with Robots.txt

    This past weekend, Andy Beard did what many people thought unthinkable. In his blog post, he says that he’s blocked Google from crawling paid reviews on his site. His reasoning is clear: I have spent a long time deciding on…