Someone sent me an interesting statistic today that fascinated me, mainly because I couldn’t understand it. In this particular person’s experience, someone supporting Windows could support nearly 40 servers, whereas someone supporting another OS (which shall remain nameless) could support fewer than 10. We are talking about two mature operating systems here.
I haven’t got around to challenging the assertion fully, but I couldn’t see it. Surely the operating system support overhead is about the same these days; I certainly wouldn’t have expected a four-fold difference, which is huge. It would mean that a Windows server would only need to carry a quarter of the load of any other operating system to be cost effective. OK, there are some capital cost differences, but they are small compared to the operating costs.
Am I that far out of touch?

Maybe the maintenance effort for the server itself is negligible, and it is the applications that require the effort. Consequently, the reason the operator can manage four times as many Windows servers as OS X (for want of a better name) is that each Windows server handles a quarter as many applications? Just an idea.
It would be OK if that were true, Ian, but unfortunately there was another set of numbers for the application support.