>Related to that: read a real-life case study of a firm that built a server-side ASP.NET application: they topped out at 25 users per server, pegging the server at 100% CPU utilization. So much for putting everything on the server.
This always depends on the application and the implementation. Yeah, if you have SQL Server on the same box and you're running massive queries, a single user can top out a Web application... You always have to be careful of the circumstances.
FWIW, I just did some load testing on one application I'm rewriting with .NET, and pure ASP.NET improved performance twofold over the classic ASP/COM code, at lower CPU load. This is not a heavy-duty app, but both applications were designed using business objects (which are not optimized at this point in my .NET framework but were in the VFP code). So I think there are definite benefits you can reap with .NET.
One more thing: multi-threading in user code on a Web server is something few people will want to do. You can, but there's generally little need unless you have a good scenario for it. The main reason is that server apps are transactional (one thing after another), and in general multi-threading does not improve performance, only responsiveness. Inside an ASP.NET page this wouldn't make much sense: it wouldn't help much, and it would make coding a lot more difficult by introducing wait states where you have to wait for threads to return.
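To illustrate the point about wait states, here's a minimal sketch (in Python rather than .NET, since the idea is language-agnostic): a hypothetical request handler that spawns a worker thread and then has to block on it before the response can go out. The thread adds coordination code but can't finish the request any sooner, because the request is transactional and must wait for the result anyway. The `do_work` / handler names are made up for the example.

```python
import threading
import time

def do_work():
    # Simulate the page's single unit of work (e.g., a data query).
    time.sleep(0.05)

def handler_inline():
    """Do the work directly in the request thread."""
    start = time.perf_counter()
    do_work()
    return time.perf_counter() - start  # elapsed seconds

def handler_threaded():
    """Offload the work to a thread -- then sit in a wait state for it."""
    start = time.perf_counter()
    t = threading.Thread(target=do_work)
    t.start()
    t.join()  # the response can't be sent until the thread returns
    return time.perf_counter() - start  # elapsed seconds
```

Both handlers take at least as long as the work itself; the threaded one just adds thread-management overhead and complexity. Threads only pay off when a request has genuinely independent pieces of work to overlap, which is the "good scenario" mentioned above.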
As you say: the best policy is good application design and planning ahead on how to connect to your data. Granularity tends to be the best policy for optimizing code...