OK, can't you just reduce its frequency then?
IAC, it seems silly to run a robot through the Web interface if it's internal. They should be running the thing as a service or a standalone process in the background instead of tying up the Web application.
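Just to sketch what I mean: the whole "listen for files, extract, load" loop can live in its own process that polls a drop directory, completely outside the Web app. This is only a rough illustration (the directory names, the `process_file` body, and the `max_loops` knob for testing are all hypothetical, not their actual setup):

```python
import os
import time
import zipfile

def process_file(path, dest_dir):
    """Stand-in for the real work: extract the archive, then run the data loads."""
    with zipfile.ZipFile(path) as zf:
        zf.extractall(dest_dir)
    # ...run the record inserts/updates against the extracted files here...

def watch(watch_dir, dest_dir, poll_seconds=30, max_loops=None):
    """Poll a drop directory and process each new file in sequence.

    max_loops is just here so the loop can be bounded; the real service
    would run forever (max_loops=None).
    """
    seen = set()
    loops = 0
    while max_loops is None or loops < max_loops:
        for name in sorted(os.listdir(watch_dir)):
            full = os.path.join(watch_dir, name)
            if full not in seen and os.path.isfile(full):
                process_file(full, dest_dir)
                seen.add(full)
        loops += 1
        time.sleep(poll_seconds)
```

Since it runs on the same box, it still reads the tables locally, so their network-drive corruption worry doesn't apply; it just stops chewing up Web worker threads.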
+++ Rick ---
>>Why don't you just block the robot from code and avoid the data access altogether? Or block certain requests if the robot gets too aggressive.
>>
>>Are we talking about a search engine robot or something else? Search engines aren't supposed to hit more than once every two seconds or so, so they shouldn't kill your scalability.
>
>That robot is one that has been put on the Web server to listen for incoming files in a directory, uncompress them, extract them, and execute tons of data operations against them. So, basically, the 15 to 20 MB files received are being processed in sequence, and that creates numerous new records in tables and several adjustments to existing records.
>
>So, when I heard this week about such a robot doing such a process on the Web server, I recommended offloading that to another box. But they don't want to, because they say this would then create a lot of data problems, such as corruption, if they had to read our tables from a network drive.