For a few days now, Google has been offering a crawl rate section in their Webmaster Tools. The top part of the page shows how many pages and kilobytes Google fetches per day, and how long it takes Google to load a page. None of this data is new, but it is quite interesting to compare it to your own records.
Further down there is an option to set how frequently the Google crawler should fetch the site. The default is "Normal"; "Slower" can be a good idea for sites where crawling puts a large strain on the server, but it only gets interesting with the "Faster" option. This option can only be chosen if Google believes that the website consists of more pages than Google fetches at its normal crawl rate. By selecting it, you allow Google to exceed its crawl-rate limit (hostload). This exception is valid for 90 days, after which it has to be renewed.