SUPPOSE you ran a company that created a product, and, as the founder, you made very public promises that this product would be reliable and safe.
Then, a few months later, a problem was found that dramatically compromised both the reliability and safety of the product. On a global scale, your product was no longer considered either reliable or safe.
If you ran this company, what would happen internally? How would you respond to this crisis? Who would be accountable? Would heads roll?
When the Blaster worm went racing across the Internet this week, Microsoft found itself in the grip of this scenario -- again. Just last year, the software giant's co-founder and chief software architect, Bill Gates, announced that "Trustworthy Computing" would be a major initiative. Gates acknowledged what Microsoft's critics had alleged for years: that its products were full of bugs that compromise security, making them risky for use in critical applications.
Despite a push that included 10 weeks off from new development to find and squash bugs, Microsoft continues to be bedeviled by glitches in its code. The Blaster worm is just the latest in a parade of malicious software that preys on Microsoft's vulnerabilities.
...
Has anyone at Microsoft actually been fired or disciplined as a result of the Blaster debacle? Jeff Jones, senior director for Trustworthy Computing Security, said the responsibility for the flaw was narrowed to a team, and "no one was fired." He would not say if anyone was disciplined.
[link|http://www.chron.com/cs/CDA/ssistory.mpl/business/2050991|column]