I know that because we had the dreaded “Pharma hack” on efolkMusic.org: as far as Google was concerned, there were about 10,000 references to certain popular medications on our website. Visitors to the site weren’t aware of it, and viewing the source code of any page didn’t show the offending keywords, but a search would show 250+ occurrences on a given page, times a few hundred drugs.
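What hit us is a form of “cloaking”: the compromised server returns clean HTML to a human’s browser but keyword-stuffed HTML when the visitor looks like Googlebot, which is exactly why view-source showed nothing while search showed hundreds of hits. Here’s a minimal sketch of how you might test a page for it yourself, assuming you can fetch the same URL with two different User-Agent strings (the function names, keyword list, and threshold are my own, hypothetical choices, not any real tool):

```python
# Sketch: rough cloaking check, the trick behind the "Pharma hack".
# Compare the page a browser gets with the page a crawler gets.

def keyword_counts(html: str, keywords: list[str]) -> dict[str, int]:
    """Count case-insensitive occurrences of each keyword in the HTML."""
    text = html.lower()
    return {kw: text.count(kw.lower()) for kw in keywords}

def looks_cloaked(browser_html: str, crawler_html: str,
                  keywords: list[str], threshold: int = 10) -> bool:
    """Flag the page if the crawler copy contains far more spam
    keywords than the browser copy does."""
    seen_by_browser = keyword_counts(browser_html, keywords)
    seen_by_crawler = keyword_counts(crawler_html, keywords)
    return any(seen_by_crawler[kw] - seen_by_browser[kw] >= threshold
               for kw in keywords)

# To get the two copies, fetch the same URL twice: once with a normal
# browser User-Agent and once claiming to be Googlebot
# ("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"),
# e.g. via urllib.request with a custom User-Agent header, then pass
# the two response bodies to looks_cloaked().
```

Sophisticated hacks cloak by crawler IP range rather than User-Agent, so a clean result from this kind of check isn’t proof you’re clean, but a dirty result is proof you’re dirty.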
I finally hired a security company (probably the perps, but what could I do?) and they had it cleaned in a few hours. We’re still showing some funny stuff in search, but we didn’t get blacklisted. Thing is, Google, with all its servers and billions of dollars, can’t keep up with the exponential growth in indexable links. Until recently, robots were mostly used to GET information. A new breed of bots is working right now ADDING useless content, with nefarious intent.
We had an earlier link issue, which was the first time I paid any attention to how many inbound links we had. That was 3 months ago, and we had about 2 million. Today: 3,038,391. Maybe once Google “catches up” (which is getting harder and harder) with our now malware-free site, that number goes down. But what if it doesn’t? Say it doubles every six months; take that times a gazillion little sites like ours, and you pretty quickly get some serious numbers. I don’t care how many servers they throw at it, they can’t keep up; the system WILL go down. Can you say log jam?
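Back-of-the-envelope on that doubling scenario (the six-month doubling rate is my guess above, not a measurement):

```python
# Hypothetical growth: a link count that doubles every six months.
def doubled(initial: int, six_month_periods: int) -> int:
    return initial * 2 ** six_month_periods

today = 3_038_391                  # today's inbound-link count, quoted above
five_years = doubled(today, 10)    # 10 doublings = 5 years
print(f"{five_years:,}")           # about 3.1 billion links, for ONE small site
```

Multiply that by every little site on the web and you see why I don’t think more servers fix an exponential.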
I’d keep a nice AAA road atlas under the seat of your car if I were you. You might be needing it.