When I saw the announcement for the new Google Web Accelerator, my first thought was: why bother? The idea is certainly nothing new, just a specialized proxy setup that caches data and delivers it in a compressed format. Sure, it could make things faster, but what does Google get out of the deal? A ton of info about users’ browsing habits, sure, but I don’t think that is it. I think what they are hoping for is something much more traditional.
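Just to make the "nothing new" point concrete, here is a minimal sketch in Python of that caching-plus-compression idea. This is my own illustration, not how the Accelerator actually works; the `fetch_upstream` callable standing in for the real HTTP fetch is hypothetical.

```python
import gzip

class CachingProxy:
    """Toy version of a caching, compressing proxy: fetch a page once,
    keep a gzip-compressed copy, and serve repeats from the cache."""

    def __init__(self, fetch_upstream):
        # fetch_upstream: hypothetical callable mapping a URL to raw bytes
        self.fetch_upstream = fetch_upstream
        self.cache = {}   # url -> gzip-compressed body
        self.hits = 0

    def get(self, url):
        if url in self.cache:
            self.hits += 1            # served from cache: no upstream trip
        else:
            body = self.fetch_upstream(url)
            self.cache[url] = gzip.compress(body)
        return self.cache[url]        # client decompresses on its end

# Demo with a fake upstream so nothing touches the network
calls = []
def fake_fetch(url):
    calls.append(url)
    return b"<html>hello</html>"

proxy = CachingProxy(fake_fetch)
first = proxy.get("http://example.com/")
second = proxy.get("http://example.com/")
```

The second request never reaches the fake upstream, and the compressed payload decompresses back to the original page, which is all the "acceleration" amounts to.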
I suspect that Google hopes to find more sites to index. Google (and Yahoo! and MSN Search, for that matter) are surely doing a pretty good job with their spiders, but there are probably still plenty of sites that none of them have found. My guess is that every URL requested through the proxy will be checked against their search index.
So what existing technology will Google try to improve on (and use to their own advantage) next?