Google is at it again - Ian pretty much says everything there is to say about it. Google claims it doesn't want to be "evil." But it is infinitely stupid, as the repeated launch of the Web Damager shows.

What does the Web Accelerator do, and why is it such a stupid piece of software? Quite simply, it follows links - and it does so in advance, before the user does. Speculative web crawling, so to speak, but privately on the user's behalf. That doesn't sound so bad at first, except that servers get bombarded with traffic they might never have seen otherwise - because every link is followed, even the ones the user never clicks. And that multiplied by every user running this thing...

But the traffic is not the real problem. The real problem comes when you consider the context in which this thing runs: on the user's own computer, between the browser and the network. A little proxy of its own, which, to do its work, remembers cookies and the like and then sends requests that look to the target pages as if they came from the user's browser. With the user's security headers. And cookies.
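What such a prefetching proxy does can be sketched in a few lines: for every link on a fetched page, it issues another request, replaying the browser's own cookies and headers. This is illustrative pseudocode in Python, not the Web Accelerator's actual implementation:

```python
import re

def prefetch_all(page_html, session_headers, fetch):
    """Follow every href on the page, replaying the user's headers."""
    for href in re.findall(r'href="([^"]+)"', page_html):
        # The server cannot tell this apart from a real click:
        fetch(href, headers=session_headers)

# A fake fetcher that just records what the proxy would request.
seen = []
def fake_fetch(url, headers):
    seen.append((url, headers.get("Cookie")))

html = '<a href="/inbox/1">Mail 1</a> <a href="/logout">Log out</a>'
prefetch_all(html, {"Cookie": "session=abc123"}, fake_fetch)
print(seen)
```

Every link gets fetched with the user's session cookie - including, in this toy page, the logout link.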

Apart from the fact that I wouldn't particularly like my headers with passwords or session cookies showing up anywhere other than in the browser and on the target server - this approach also lets the Web Accelerator look at areas a central crawler would never see. Areas behind logins, for example. Content management systems where additional links appear after login. Wikis whose edit links show up once someone starts a session. Webmail systems where every mail is represented as a link.

All these systems have one thing in common: a form submission is not always required for state-changing actions. Often a single click on a link is enough. Quickly deleting the current version of a wiki page to get rid of wiki spam - a simple link, visible only to the logged-in user. The mail in the webmail inbox that is automatically marked as read when opened. The publish link in the CMS that puts a page live.
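To see what goes wrong when a prefetcher meets such a link, here is a toy wiki (all names and routes invented for illustration) where "delete current revision" is a plain GET link shown only to logged-in users:

```python
# Hypothetical minimal wiki: revisions stored in memory; the delete
# action is a plain GET link, visible only after login.
pages = {"FrontPage": ["v1: stub", "v2: full article"]}

def handle_get(path, logged_in):
    """Dispatch a GET request and return the links on the page."""
    if path == "/FrontPage":
        links = ["/FrontPage"]
        if logged_in:
            # Convenience link, "only a logged-in user will click this"...
            links.append("/FrontPage?action=delete_current")
        return links
    if path == "/FrontPage?action=delete_current" and logged_in:
        pages["FrontPage"].pop()   # destructive side effect on a GET!
        return []
    return []

# A prefetching proxy follows every link with the user's session:
for link in handle_get("/FrontPage", logged_in=True):
    handle_get(link, logged_in=True)

print(pages["FrontPage"])   # the current revision is gone
```

Nobody clicked anything; merely viewing the page through the prefetching proxy deleted the latest revision.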

Of course, responsible web application programmers try to put destructive actions behind forms (and thus POST requests) so that a simple link can't destroy anything. But this usually only happens in the publicly accessible areas, where the web robots of the various search engines and spam automata would otherwise wreak havoc.
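Gating the same toy action on the HTTP method is enough to defuse it, since a prefetcher only issues GETs. A minimal sketch, continuing the invented wiki example:

```python
# Same hypothetical wiki, but the destructive action now requires POST.
pages = {"FrontPage": ["v1: stub", "v2: full article"]}

def handle(method, path, logged_in):
    if path == "/FrontPage/delete_current":
        if method != "POST":
            return 405        # refuse: side effects only via POST
        if not logged_in:
            return 403
        pages["FrontPage"].pop()
        return 200
    return 200

# A prefetcher following the link gets rejected:
status = handle("GET", "/FrontPage/delete_current", logged_in=True)
print(status, pages["FrontPage"])   # 405, both revisions still intact

# A deliberate form submission still works:
ok = handle("POST", "/FrontPage/delete_current", logged_in=True)
print(ok, pages["FrontPage"])       # 200, current revision removed
```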

But precisely in the areas shielded by a login, nobody normally expects automated clicks - so convenience features get built there, because one can be sure that a link is only clicked consciously and deliberately.

Well, until the Google Web Accelerator came along. From the company that claims to understand the web. Thanks a lot, you assholes.

PS: and unlike the first version, the new version no longer sends a header that would let you recognize the prefetch requests and block them in such critical areas.
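With the first version, a server-side filter was still possible. The marker commonly reported for it is the Mozilla-style `X-moz: prefetch` header - treat that header name as an assumption here - and blocking it would have looked roughly like this:

```python
def is_prefetch(headers):
    # Mozilla-style prefetch marker; the header name is an assumption
    # about what the first version sent, not a verified fact.
    return headers.get("X-moz", "").lower() == "prefetch"

def handle(headers, path):
    if is_prefetch(headers) and path.startswith("/admin"):
        return 403   # refuse prefetches in sensitive, logged-in areas
    return 200

print(handle({"X-moz": "prefetch"}, "/admin/delete"))  # 403
print(handle({}, "/admin/delete"))                     # 200
```

With the header gone, even this crude defense is no longer available.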