Adding compression support can be very simple -- if your spider is coded in Perl using LWP::UserAgent, then a single line of code enables it:

$ua->default_header('Accept-Encoding' => 'gzip');

You then need to make sure that you always use 'decoded_content' (rather than 'content') when reading the body of the response object, so that a gzipped response is transparently decompressed.
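A minimal complete fetch might look like the sketch below (the URL is just a placeholder, not a real endpoint):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;

# Advertise gzip support on every request this agent sends.
$ua->default_header('Accept-Encoding' => 'gzip');

my $response = $ua->get('http://www.example.com/');

if ($response->is_success) {
    # decoded_content() gunzips the body when the server answered with
    # Content-Encoding: gzip; content() would return the raw compressed bytes.
    my $body = $response->decoded_content;
    print length($body), " bytes after decoding\n";
}
else {
    print "Request failed: ", $response->status_line, "\n";
}
```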
For other languages, all you need to do is add an 'Accept-Encoding: gzip' header to the HTTP request that you send, and then be prepared to handle a 'Content-Encoding: gzip' header in the response.
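On the wire, the exchange looks something like this (an illustrative request and abbreviated response headers, not captured from a real server):

```
GET / HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip

<gzip-compressed body>
```

If the server does not support compression, it simply omits the 'Content-Encoding: gzip' header and sends the body uncompressed, so advertising 'Accept-Encoding: gzip' is always safe.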
Happily, some of the large spiders do support compression -- Googlebot and Yahoo! Slurp do (to name but two). Since I started prodding crawler implementors, a couple have implemented compression (one within hours), and another reported that the lack of support was a bug which would be fixed shortly.
Crawlers that account for more than 5% of the total (uncompressed) crawling activity are marked in bold below.
| Crawler | Host | Last IP used |
|---|---|---|
| curl/7.54.0 | | 104.156.155.31 |
| fasthttp | 73.227.75.114:8080 | 47.236.225.130 |
| meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler) | blog.gladstonefamily.net | 57.141.0.11 |
| meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler) | pond.gladstonefamily.net | 57.141.0.24 |
| meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler) | pond1.gladstonefamily.net | 57.141.0.22 |
| meta-externalagent/1.1 (+https://developers.facebook.com/docs/sharing/webmasters/crawler) | www.gladstonefamily.net | 57.141.0.4 |
| Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; help@moz.com) | blog1.gladstonefamily.net | 216.244.66.194 |
| Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.5060.134 Safari/537.36 | | 138.246.253.24 |