Adding compression support can be very simple. If your spider is coded in Perl using LWP::UserAgent, a single line of code enables it:

```perl
$ua->default_header('Accept-Encoding' => 'gzip');
```

You then need to make sure that you always use `decoded_content` when dealing with the response object, so that the body is transparently gunzipped for you.
For other languages, all you need to do is add `Accept-Encoding: gzip` to the HTTP requests you send, and then be prepared to deal with a `Content-Encoding: gzip` header in the response.
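The same two-step idea applies in any language: advertise gzip in the request, then gunzip the body only when the server says it compressed it. A minimal sketch in Python (the function name and the simulated request/response below are illustrative, not from any particular crawler):

```python
import gzip

def handle_response(headers, body):
    """Return the response body, gunzipping it if the server compressed it."""
    if headers.get("Content-Encoding", "").lower() == "gzip":
        return gzip.decompress(body)
    return body

# Simulated exchange: the request advertises gzip support...
request_headers = {"Accept-Encoding": "gzip"}

# ...and a cooperating server replies with a compressed body.
page = b"<html><body>hello</body></html>"
response_headers = {"Content-Encoding": "gzip"}
response_body = gzip.compress(page)

assert handle_response(response_headers, response_body) == page
```

Note that the check on `Content-Encoding` matters: a server is free to ignore `Accept-Encoding` and send the body uncompressed, so a spider must handle both cases.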
Happily, some of the large spiders do support compression -- Googlebot and Yahoo! Slurp, to name but two. Since I started prodding crawler implementors, a couple have implemented compression (one within hours), and another reported that it was a bug that it didn't work -- which would be fixed shortly.
Crawlers that account for more than 5% of the total (uncompressed) crawling activity are marked in bold below.
| Crawler (User-Agent) | Last IP used |
|---|---|
| DomainStatsBot/1.0 (https://domainstats.com/pages/our-bot) | 148.251.121.91 |
| masscan/1.0 (https://github.com/robertdavidgraham/masscan) | 45.120.126.66 |
| Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; help@moz.com) | 216.244.66.194 |
| TerraCotta https://github.com/CeramicTeam/CeramicTerracotta | 3.83.76.234 |