Not to spark a war of words or anything, but for those who are interested in the "Windows will or will not scale" debate, here is some info definitely worth talking about.

I work at a company called Corillian Corporation (as the corporate IT team's manager). Microsoft just released a case study they did with Corillian, and it's worth taking note of. Scott Hanselman, our Chief Architect, posted these facts and figures on his blog earlier today, which illustrate the numbers quite well. They are, frankly, pretty darn amazing numbers:

  • Currently, more than 19 million end users—or about 25 percent of U.S. online banking customers—use Corillian technology when they use their institution's online services for transactions such as checking balances, paying bills, and transferring funds between accounts. (Not bad for a Microsoft-based platform, eh? .NET works.)

  • Voyager 3.1 was able to support 70,000 concurrent users across multiple lines of business.

  • Voyager 3.1 was able to support a sustained throughput rate of more than **1,268 transactions per second**—about 4.5 million successful transactions per hour—and a sustained session creation rate of more than 208 new sessions per second.

  • Voyager 3.1 supported more than 129,000 concurrent sessions across the system at peak load. This includes both active sessions, in which a user is executing transactions, and inactive sessions.

  • Voyager 3.1 supported a ramp-up from 0 to 70,000 users in only 15 minutes—without any adverse impact on performance—demonstrating that Voyager can sustain a large burst of users accessing information in a short time period without overwhelming the system.

  • Voyager 3.1 surpassed its previous benchmark of 30,000 concurrent users by 133 percent, with only a 32-percent increase in overall hardware cost.
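The headline figures above hang together arithmetically, and it's easy to check for yourself. A quick sanity-check sketch (using only the numbers as quoted in the case study):

```python
# Sanity check of the benchmark arithmetic from the case study.

tps = 1268                      # sustained transactions per second
per_hour = tps * 3600           # transactions per hour at that rate
print(f"{per_hour:,} tx/hour")  # roughly the "4.5 million per hour" figure

previous_users = 30_000         # previous benchmark, concurrent users
new_users = 70_000              # new benchmark, concurrent users
increase_pct = (new_users - previous_users) / previous_users * 100
print(f"{increase_pct:.0f}% increase in concurrent users")  # ~133%
```

The 1,268 tx/s figure works out to about 4.56 million transactions per hour, which the case study rounds to 4.5 million, and 30,000 to 70,000 users is indeed a 133 percent increase.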

Another interesting note comes from some questions I asked of a Corillian employee involved in the testing. He told me that the bottleneck that stopped the test from reaching higher numbers was not Windows, nor was it Voyager (our online banking application); it was the *hardware* (which is amazing hardware, by the way). The test simply used up all the hardware resources available in the lab. In other words, both Voyager and Windows Server 2003 had room to spare and would have kept scaling had the hardware allowed. When you consider that the test systems are some of the biggest and best in the industry, that's saying a lot. That just doesn't typically happen.

From the case study, this quote sums it all up:

Hugh Wade, one of the Microsoft engineers who spent time analyzing the Corillian code and recommending some changes to the company, notes that "Voyager was the best-performing non-Microsoft application" he had seen in the lab.

This is pretty amazing stuff, and it says TONS about Windows Server 2003, as well as Corillian Voyager - and about the people involved in developing both products.