Adding compression support can be very simple. If your spider is coded in Perl using LWP::UserAgent, then a single line of code will enable compression support:

```perl
$ua->default_header('Accept-Encoding' => 'gzip');
```

You then need to make sure that you always use `decoded_content` when dealing with the response object, so that the body is transparently decompressed for you.
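As an illustration, here is a minimal, self-contained sketch of a fetch done this way; the URL is a placeholder, not a real target:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;

# Advertise that we can handle gzip-compressed responses.
$ua->default_header('Accept-Encoding' => 'gzip');

# Placeholder URL -- substitute whatever page your spider fetches.
my $response = $ua->get('http://example.com/');

if ($response->is_success) {
    # decoded_content transparently gunzips the body when the server
    # answers with 'Content-Encoding: gzip'; content() would return
    # the raw compressed bytes instead.
    print $response->decoded_content;
}
else {
    die 'Fetch failed: ' . $response->status_line . "\n";
}
```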
For other languages, all you need to do is add an `Accept-Encoding: gzip` header to the HTTP request that you send, and then be prepared to deal with a `Content-Encoding: gzip` header in the response.
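As a sketch of what that looks like on the wire (the hostname, path, and sizes below are illustrative only):

```
GET / HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip
Content-Length: 3271

<gzip-compressed body>
```

If the server chooses not to compress, the response simply arrives without the `Content-Encoding: gzip` header, so your spider must be prepared to handle both cases.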
Happily, some of the large spiders do support compression: Googlebot and Yahoo! Slurp, to name but two. Since I started prodding crawler implementors, a couple have added compression support (one within hours), and another reported that the lack of it was a bug that would be fixed shortly.
Crawlers which do more than 5% of the total (uncompressed) crawling activity are marked in bold below.
| Crawler | Host | Last IP used |
|---|---|---|
| curl/7.54.0 | | 172.105.87.91 |
| Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; [email protected]) | gladstonefamily.net | 216.244.66.194 |
| Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; [email protected]) | pond.gladstonefamily.net | 216.244.66.194 |
| Mozilla/5.0 (compatible; DotBot/1.2; +https://opensiteexplorer.org/dotbot; [email protected]) | pond1.gladstonefamily.net | 216.244.66.194 |
| Mozilla/5.0 (compatible; oBot/2.3.1; http://www.xforce-security.com/crawler/) | gladstonefamily.net | 161.156.29.33 |
| Mozilla/5.0 (compatible; oBot/2.3.1; http://www.xforce-security.com/crawler/) | gladstonefamily.net:8080 | 161.156.29.33 |
| Mozilla/5.0 (compatible; oBot/2.3.1; http://www.xforce-security.com/crawler/) | pond.gladstonefamily.net | 161.156.29.33 |
| Mozilla/5.0 (compatible; oBot/2.3.1; http://www.xforce-security.com/crawler/) | pond1.gladstonefamily.net | 161.156.29.33 |
| Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.146 Safari/537.36 | | 138.246.253.24 |