I also think this is the result of bots hitting your site. First, check which URLs are being visited.
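A quick way to check is to group your access log by URL, filtered to the bot in question. This is a minimal sketch; the log path and the combined log format (URL in field 7) are assumptions, so adjust both for your server. The sample log lines here are placeholders standing in for your real log:

```shell
LOG=sample_access.log
# Two placeholder lines in combined log format (use your real access log instead):
cat > "$LOG" <<'EOF'
66.249.66.1 - - [01/Jan/2024:00:00:01 +0000] "GET /old-fields/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Jan/2024:00:00:02 +0000] "GET /old-fields/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"
EOF
# Requests by Googlebot, grouped by URL, most frequent first:
grep -i googlebot "$LOG" | awk '{print $7}' | sort | uniq -c | sort -rn | head
```

If a handful of paths dominate the output, those are the candidates to block.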
In my case, Googlebot was indexing pages so aggressively that the CPU on my relatively high-performance server was at 100% all the time.
The pages being indexed were old custom-field paths from the previous script I used, which are now useless (though the change was only made this month).
All I had to do was block those paths in the robots.txt file (see below).
Of course, your paths may be different.
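For reference, a robots.txt sketch of the kind of block I mean; the paths here are placeholders, not my actual ones, so substitute whatever your log analysis turns up:

```
User-agent: *
Disallow: /old-custom-fields/
Disallow: /legacy-script/
```

Put this file at the root of your site (e.g. `/robots.txt`). Well-behaved crawlers will stop requesting anything under the disallowed prefixes.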
You can also slow down how often your site is crawled by adding a Crawl-delay directive (please note that Crawl-delay does not apply to Googlebot).
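A minimal example of the directive, again with a placeholder value; Crawl-delay is a de-facto extension honored by some crawlers (e.g. Bing), not part of the original robots.txt standard, and Googlebot ignores it, so for Google you would adjust the crawl rate in Search Console instead:

```
User-agent: *
Crawl-delay: 10
```

The value is the number of seconds a crawler should wait between requests.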