Hello guys,
I'm wondering if one can rate-limit (reqs/second or /minute) globally,
but also whitelist Google (and other crawlers) by hardcoding their
reverse DNS hosts in the config?
If a reverse DNS/host check is a no-go, can the same be achieved with a
user-agent filter?
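(For what it's worth, a plain user-agent filter is trivially spoofed, which is why Google documents verifying its crawlers via reverse DNS plus a forward-confirming lookup. A minimal sketch of that check in Python, independent of any particular server config; the function name and the injectable lookup parameters are my own for illustration:)

```python
import socket

# Domains Google documents for its crawlers' reverse DNS hosts.
TRUSTED_SUFFIXES = (".googlebot.com", ".google.com")

def is_verified_crawler(ip, suffixes=TRUSTED_SUFFIXES,
                        reverse=socket.gethostbyaddr,
                        forward=socket.gethostbyname):
    """Reverse-DNS lookup, domain suffix check, then forward-confirm
    that the hostname resolves back to the same IP (so a spoofed PTR
    record alone is not enough to pass)."""
    try:
        host = reverse(ip)[0]
    except OSError:
        return False
    if not host.endswith(suffixes):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False
```

Requests passing this check could then bypass the global rate limit, with everything else throttled.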
Any ideas are greatly appreciated.
Thx
Joe
Received on 2011/05/05 16:39
This archive was generated by hypermail 2.2.0 : 2011/05/05 16:45 CEST