They compared Windows Server 2003 and Red Hat Enterprise Server 3 running databases, scripting engines and Web servers (Microsoft's on one system, the open source Apache on the other).
Their criteria included the number of reported vulnerabilities and their severity, as well as the number of patches issued and "days of risk," the period from when a vulnerability is first reported to when a patch is issued.
On average, the Windows setup had just over 30 days of risk versus 71 days for the Red Hat setup, their study found.
What is wrong with this picture?