Edward Felten is going on a bit about monocultures. Since he isn't going to do it, let me tell you the other side of the story. And let me preface this by saying that this is my opinion and doesn't necessarily represent the opinion of my employer.
Felten takes as a given that a monoculture incurs a penalty. A monoculture does incur a risk, certainly. But how big is that risk, and how does it measure up against the real benefits of having a standard configuration within an organization? For there are real, measurable benefits, and compatibility is only one of them. For instance: training. You only need to train your people, and provide training infrastructure, for one set of products. Also support: your helpdesk only needs to know how to support and troubleshoot one set of products. Here's the real kicker, though: the overall investment in security assessment and audit is smaller with a monoculture. If we take as a given that all software has security and reliability bugs (and it does -- go read http://www.linuxsecurity.org if you doubt this), then a monoculture drastically reduces the time and effort it takes to track patches and configuration changes, and to apply them once you discover the need.
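To make that concrete, here's a back-of-envelope sketch in Python. The numbers (advisories per platform per year, hours per advisory) are made up purely for illustration; the point is just that the patch-tracking effort scales with the number of distinct platforms you run.

```python
# Back-of-envelope sketch -- all numbers are hypothetical assumptions, not
# measurements. Each distinct platform brings its own advisory feed, its
# own test pass, and its own rollout procedure.

HOURS_PER_ADVISORY = 4        # assumed: triage + test + deploy per advisory
ADVISORIES_PER_PLATFORM = 30  # assumed: advisories per platform per year

def yearly_patch_effort(num_platforms: int) -> int:
    """Rough yearly staff-hours spent tracking and applying patches."""
    return num_platforms * ADVISORIES_PER_PLATFORM * HOURS_PER_ADVISORY

print(yearly_patch_effort(1))  # monoculture: 120 hours/year
print(yearly_patch_effort(5))  # five platforms: 600 hours/year
```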
There are other fallacies in here as well. The biological monoculture analogy really doesn't hold up. When biological creatures die, they die forever. That's not true for computers. Assuming that people are being responsible and backing up their data, you can't permanently extinguish a population of computers. (If they aren't being responsible and backing up their critical data, then the existence of a monoculture is the least of their worries.) Also, in truth, most biological life forms on the planet are, by a fairly conservative definition, a monoculture. Dawkins tells us that over 90% of DNA is common across all animals; 98% of DNA is common between a human and a chimpanzee. Wouldn't you call two software products with 98% of the same code a monoculture? And we see it in real life: people find security bugs in Linux all the time that are present in multiple distros. There isn't nearly the diversity that we like to believe there is.
And suppose we buy the argument that monocultures are evil; how much diversity do you need? No more than 50% on any one product? 25%? 10%? Try to get a CIO to go along with the notion that it's OK for 50% of their company's machines to go down. Even 10% isn't really safe, because any good hacker who has compromised 10% of the machines on a network can launch a flood of traffic in a DDoS attack that will drown the network and bring the other 90% of the machines to their knees. (And imagine how much work it would be to support ten different database servers in your shop.) Compromising a very small percentage of machines behind a firewall is enough to wreak great havoc.
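For the skeptical, a quick back-of-envelope calculation shows why 10% is plenty. The network size, per-machine flood rate, and uplink capacity below are made-up but plausible assumptions, not measurements of any real network.

```python
# Back-of-envelope sketch (numbers are hypothetical assumptions): even a
# small compromised fraction of an internal network can saturate a shared
# uplink with flood traffic.

MACHINES = 10_000           # assumed network size
COMPROMISED_FRACTION = 0.10
MBPS_PER_MACHINE = 10       # assumed flood rate each compromised box sustains
UPLINK_GBPS = 1.0           # assumed shared uplink capacity

flood_gbps = MACHINES * COMPROMISED_FRACTION * MBPS_PER_MACHINE / 1000
print(f"flood traffic: {flood_gbps:.0f} Gbps vs. {UPLINK_GBPS} Gbps uplink")
# With these assumptions the flood is ~10 Gbps against a 1 Gbps link --
# the other 90% of machines are effectively offline too.
```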
The REAL research agenda lies in other areas. There are preventative measures, ranging from better tools to help developers write better code in the first place (and verify it), to measuring the attack surface of your code, to estimating the security risk of a particular design, component, or configuration (and other areas too numerous to list here). There are defensive measures: quickly and accurately detecting atypical behavior (or behavior typical of a compromised machine) as a sign that a system should be looked at, isolated, or even shut down. And there are simulations, which are a bit of both and help to exercise and tune systems. There are very smart people in the research community looking at these areas, and I am very optimistic that we will continue to make good progress.
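As a toy illustration of the detection idea (mine, not any particular product's), here's a sketch that flags a host whose outbound connection rate spikes far above its own recent baseline. Real detectors are far more sophisticated, but the shape of the problem is the same.

```python
# Minimal sketch, assuming a simple per-host baseline: flag a host whose
# current metric is far outside its own recent history. Illustrative only.

from statistics import mean, stdev

def is_atypical(history: list[float], current: float, threshold: float = 3.0) -> bool:
    """Flag `current` if it is more than `threshold` standard deviations
    above the mean of the recent `history` of the same metric."""
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current > mu
    return (current - mu) / sigma > threshold

# e.g. outbound connections per minute over the last hour, then a spike
baseline = [12, 15, 9, 14, 11, 13, 10, 12]
print(is_atypical(baseline, 240))  # True -> candidate for isolation/review
```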
The monoculture discussion makes for good bluster, particularly if you're a fan of MS-bashing, but it is a red herring and a distraction. Can we please move on?
9:46:20 PM