
  • Important

    This is the (read only) Archive of Backlinksforum.com up to December 3 2011
    If you want to start a new topic, then you need to go to: www.trafficplanet.com



    Google themselves say they perform manual reviews when it comes to spam


• This topic is locked
    7 replies to this topic

    #1 lovethelink


      Member

    • Members
    • PipPipPip
    • posts 98
    • Joined: 02-September 10
      Reputation: 10

    Posted 04 October 2011 - 12:26 AM

    Here it is:-

    google.com/support/webmasters/bin/answer.py?answer=35769

    (don't wanna link directly there - just feel it's a little dirty linking directly via these forums)

    To quote:-

    Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google's partner sites.


This is right there in Google Webmasters(!), so it wasn't hard to research. I'd always assumed there must be some manual curation when it comes to de-indexing, both to help the algo and to prevent (or at least limit) false positives. And of course, let's not forget, they have to manually deal with reinclusion requests too (or do some think THAT is 100% automated too?!). And yes, I've had one or two sites de-indexed in the past and re-included again through a very manual process involving manual reviews (confirmed by emails sent to me by Google).

So given that, it's hardly a leap to assume there's some manual curation going on with data pushes / Panda updates, both to help the algo next time around and to ensure their targets are pinpointed. Algorithms are human-led after all, and they rely on new data and rules set by humans. I don't see how that's possible without some level of manual review to "show it the way" and flag up similar sites next time.

Now before some people get all angry about this and think it's "either / or", of course I am not suggesting ALL sites that are de-indexed are done so manually. I don't doubt there's a threshold in their algo that draws a line between automatic de-indexing and "send for manual review / algo refinement", and I've no doubt that a lot of "sure fire" spam sites get de-indexed automatically. But to suggest every site de-indexed on a data push is done so automatically is a bit naïve, given the complexity of the web and the fact that Google want to limit their own mistakes to a minimum.
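
    To make the "threshold" idea concrete, here's a toy sketch of that kind of triage. This is purely illustrative - the function name, scores, and thresholds are all invented, and nothing here reflects how Google's actual systems work:

    ```python
    # Toy illustration (NOT Google's actual system): a confidence score
    # splits de-indexing decisions into "automatic" and "manual review".
    # All names and thresholds below are invented for this example.

    def triage(spam_score, auto_threshold=0.95, review_threshold=0.7):
        """Route a site by a hypothetical spam score (0.0 = clean, 1.0 = certain spam)."""
        if spam_score >= auto_threshold:
            return "de-index automatically"   # "sure fire" spam
        if spam_score >= review_threshold:
            return "queue for manual review"  # borderline: a human decides
        return "leave indexed"

    for score in (0.99, 0.8, 0.3):
        print(score, "->", triage(score))
    ```

    The point of the sketch is just that "automatic" and "manual" aren't mutually exclusive: the same pipeline can do both, depending on how confident the algorithm is.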

    I suppose those in denial will say "well Google would say that" when they say they perform manual reviews.

I just think it's healthier to assume Google are telling the truth on this, and create sites as if they can stand up to a manual review.

    Edited by lovethelink, 04 October 2011 - 12:47 AM.


    #2 WillJR


      Member

    • Members
    • PipPipPip
    • posts 42
    • Joined: 20-July 11
      Reputation: 10

    Posted 04 October 2011 - 01:19 AM

calling out GOY on this one?

    #3 thrall


      Senior Member

    • Members
    • posts 170
    • Joined: 10-June 11
      Reputation: 10

    Posted 04 October 2011 - 01:23 AM

    How is anyone calling out GOY? He said from the start it was an algorithm, and the OP agrees an algorithm is involved.

This whole argument is STUPID. First of all, what kind of engineer with a master's or PhD creates an algorithm and doesn't test its output? Of course they are going to manually review what the algorithm gives out to see if it's on track.

Most likely Google created some math to find these HPBL networks, and the engineers look at the results and verify it is operating correctly, just like their other algorithms. There is nothing to argue about.

    #4 lovethelink


      Member

    • Members
    • PipPipPip
    • posts 98
    • Joined: 02-September 10
      Reputation: 10

    Posted 04 October 2011 - 02:58 AM

calling out GOY on this one?


...well, kind of - I just want to be clear on this, and not let people assume SEO is purely "man v machine", which is a dangerous mistake given that Google welcome spam reporting from the public. I think it's actually detrimental to assume every de-index has to be part of an automatic process. That assumes you simply avoid the footprints and fingerprints the algo currently knows, and you're probably OK. I think it's wiser to take Google's word on this, and build sites so they're strong enough to pass a manual review.

    A manual de-index can lead to dozens more automatic de-indexes of sites that share the same attributes as the manually de-indexed site. I believe Google are always on the lookout for these types of signals to help their algorithm. They say as much in their spam report form - who do you think follows up those reports? Humans. Just don't think it's a case of man v machine; it's man v man and machine.
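
    That feedback loop - one manual decision propagating to sites with the same attributes - can be sketched in a few lines. Again, this is a hypothetical toy: the site names, attribute keys, and matching rule are all made up, and this is in no way a description of Google's actual pipeline:

    ```python
    # Hypothetical sketch of the feedback loop described above: a single
    # manual de-index seeds automatic flags on other sites that share the
    # same footprint. All data and the matching rule are invented.

    # Attributes a reviewer noted on the manually de-indexed site
    manually_deindexed = {"theme": "spun-article", "links": "HPBL"}

    # A (fake) slice of the index
    index = [
        {"site": "a.com", "theme": "spun-article", "links": "HPBL"},
        {"site": "b.com", "theme": "original",     "links": "editorial"},
        {"site": "c.com", "theme": "spun-article", "links": "HPBL"},
    ]

    def shares_footprint(site, sample):
        """True if the site matches every attribute of the reviewed sample."""
        return all(site.get(k) == v for k, v in sample.items())

    flagged = [s["site"] for s in index if shares_footprint(s, manually_deindexed)]
    print(flagged)
    ```

    One human review produces a footprint; the machine then does the scaling. That's the "man and machine" point in miniature.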

    Edited by lovethelink, 04 October 2011 - 03:07 AM.


    #5 WillJR


      Member

    • Members
    • PipPipPip
    • posts 42
    • Joined: 20-July 11
      Reputation: 10

    Posted 04 October 2011 - 03:46 AM

How is anyone calling out GOY? He said from the start it was an algorithm, and the OP agrees an algorithm is involved.

This whole argument is STUPID. First of all, what kind of engineer with a master's or PhD creates an algorithm and doesn't test its output? Of course they are going to manually review what the algorithm gives out to see if it's on track.

Most likely Google created some math to find these HPBL networks, and the engineers look at the results and verify it is operating correctly, just like their other algorithms. There is nothing to argue about.


I agree on this being a very stupid argument. I think it's all just out there to scare you. I haven't had much problem. I am with GOY on this one; it's just ridiculous for someone to think that you're personally being manually reviewed when there are billions of keywords and billions of sites.

(Video: "Being a Google Autocompleter" - YouTube)


I just think it's healthier to assume Google are telling the truth on this, and create sites as if they can stand up to a manual review.


    I agree with you on this. The one thing that has always stuck with me is

    "You don't have control over your backlinks BUT you do have control over whats on your site"

That being said, if you're scraping content/using duplicate content and claiming it as your own ("autoblogging") - I can see people getting de-indexed for this

but I can't see someone getting de-indexed for having too many spammy links. If that were the case, I would make a bunch of spammy links to my competitors and get them de-indexed (hence Google knows you don't have control over your backlinks)

    Edited by WillJR, 04 October 2011 - 03:58 AM.


    #6 lovethelink


      Member

    • Members
    • PipPipPip
    • posts 98
    • Joined: 02-September 10
      Reputation: 10

    Posted 04 October 2011 - 05:52 AM

Hey GOY, don't take everything personally, or fall back on insults. It's healthy to revise what you know and not be so stubborn. It's pretty clear that Google use a mix of manual and automated de-indexing, and the "less automated" data push mentioned in your very own Matt Cutts quote strongly implies exactly that mix. I just don't see why you're so upset that Google might involve some manual curation, even if it is to improve their algo.

Here are some more examples of Google "hand jobs":-

    google.com/support/forum/p/Webmasters/thread?tid=68df5636297d3f91&hl=en

    searchengineland.com/google-improves-webmaster-penalty-notifications-for-both-manual-automated-penalties-92904

I'm surprised you are trying to dispute this, to be honest - there's plenty of discussion about manual reviews in Google's own help pages, webmaster forums, etc., and plenty of emails back and forth between Google and website owners regarding manual reviews and de-indexes.

Your argument that a data push has to be 100% automated seems rather arbitrary and based on no evidence.

    I think the bottom line is to accept that manual reviews clearly occur, and any one of our sites COULD be subject to a manual review so we should create our sites as if they can pass such a review.

    Edited by lovethelink, 04 October 2011 - 05:56 AM.


    #7 pirondi


      Senior Member

    • Members
    • posts 413
    • Joined: 15-July 10
      Reputation: 10

    Posted 04 October 2011 - 12:46 PM

    oh god people on forums have soo much free time.

    #8 BrentR


      Senior Member

    • Members
    • posts 115
    • Joined: 05-May 11
      Reputation: 10

    Posted 04 October 2011 - 01:06 PM

    oh god people on forums have soo much free time.


    Thanks for the laugh! :)

    rel="Punctual"

    --

    You have NO IDEA what is in store..:cool:




