Google Whacking

This week the Santy worm used Google queries to find vulnerable versions of phpBB to attack. Immediately there were calls for Google to block the malicious search. Within seven hours Google complied, and the worm could no longer search for phpBB servers, at least until a new variant was written that changed its user-agent field. If the user-agent field is random, or something common, will Google then block all queries for phpBB?
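To see why filtering on the user-agent is such a flimsy defense, here is a minimal sketch, in Python, of how any client can claim to be an ordinary browser when it queries Google. This is not the worm's code; the query string and browser string are placeholders I made up for illustration.

```python
import urllib.parse
import urllib.request

# Hypothetical query; the real worm searched for phpBB-specific strings.
query = "viewtopic.php"
url = "http://www.google.com/search?q=" + urllib.parse.quote(query)

# The User-Agent header is entirely caller-controlled, so a "worm signature"
# based on it disappears the moment the client lies about who it is.
req = urllib.request.Request(url, headers={
    "User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")
print(html[:200])
```

Once the header is randomized or set to a popular browser string, the only thing left to filter on is the query itself, which is exactly the problem.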
One has to wonder about the interests of information freedom versus the interests of computer security. I wonder how far Google will be forced to go. Will all googlewhacks be banned as well? After all, what legitimate purpose do I have in searching for XLS files with “password” in the file name? Where will it stop? Will I no longer be able to search for +solaris +root +exploit? It seems antithetical to the nature of the internet to try to block every malicious search in a search engine.
Over at news.com, Robert Lemos postulates that rate limiting is a possible solution. But if I have a computer that queries for vulnerable phpBB servers, it talks to Google once. How do you rate limit that? In the vast amount of traffic, how do you notice “abnormal” query tendencies and block them dynamically? Frankly, rate limiting should already be in effect to prevent address harvesting via the Google cache.
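To make the objection concrete, here is a toy per-client rate limiter in Python. It is purely illustrative; Google's real infrastructure is not public, and the window and quota numbers are invented.

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 60.0   # sliding window length (assumed)
MAX_QUERIES = 10        # queries allowed per client per window (assumed)

_hits = defaultdict(list)  # client_ip -> timestamps of recent queries

def allow(client_ip: str) -> bool:
    """Return True if this client is still under its per-window quota."""
    now = time.time()
    recent = [t for t in _hits[client_ip] if now - t < WINDOW_SECONDS]
    recent.append(now)
    _hits[client_ip] = recent
    return len(recent) <= MAX_QUERIES
```

A bulk harvester hammering the cache from one address trips this kind of limit almost immediately, which is why it makes sense against address harvesting. But a worm in which every infected host asks its own single question never pushes any counter past one, so per-client limits do essentially nothing against it.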
Given the security culture, I can't help but wonder how long we must wait before someone demands we shut down the search engines to protect national security. First we had an email virus getting addresses from Yahoo People. Now we have an internet worm gathering victims from Google. Won't someone please think of the children.