Too much traffic: bots slowed down our website.

What I did

I increased the server timeout and added some rules to our code that block most known bad bots/crawlers, as well as requests with unknown or empty user agents, from accessing the website. This should cut down the unwanted traffic.
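As a rough illustration of the kind of rules I mean, here is a minimal sketch using Apache's mod_rewrite in an .htaccess file. The specific bot names are placeholders, not our actual blocklist, and this assumes the host runs Apache with mod_rewrite enabled:

```apache
RewriteEngine On

# Block requests whose User-Agent matches known bad bots
# (these names are examples only; build your own list from the access logs)
RewriteCond %{HTTP_USER_AGENT} (MJ12bot|AhrefsBot|SemrushBot|DotBot) [NC,OR]
# Also block requests that send no User-Agent header at all
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F,L]
```

Matched requests get a 403 Forbidden, so they are turned away cheaply instead of being served a full page.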

What happened 

I got this email from our hosts:

"Dear Jan,

We would like to inform you that your account has reached the allowed daily usage of xxxx CPU seconds per account. Please note that once you hit 150% of the allowed daily CPU seconds, your web service will be limited for the calendar day. The web service limit means you may have problems accessing your website."

So I researched the traffic, identified the problems, and then dug further until I found the solutions.