Why does Java get used server side?

Arrrgh! Java as a server-side platform. Interpreted p-code, running on a p-code pseudo-processor, just to soak up all that spare CPU and memory you don't really want your server to have. Why? Just why?
I can understand it for portable applications in the world of modern consumer electronics, where Java provides a single viable target in a sea of possibilities. But for servers? How often do you need to port a server application between different environments in any given hour? When you're going to be running the same application on the same hardware for three years, what the heck is wrong with simple libraries that depend directly on the operating system and hardware, and natively compiled code?