Out of Control Complexity in Computing

Once complexity is out of control, it takes control. We wish to believe that somehow we can regain control. But the lesson from every other evolving complex system, whether biological, social, economic or political, is that the promise of future control is an ever-receding mirage.

If there is any doubt that complexity is out of control in the computing world (as it has forever been in biology, economics, ecology, etc.), consider the various types of computing specialists now focused on issues that were almost unheard of twenty years ago. IT professionals expend substantial resources detecting and cleaning virus and worm infections, patching operating systems and other software, modifying firewalls, updating virus-detection software, updating spam filters, and so forth. There are computing epidemiologists who seek to identify new viral outbreaks before too much damage occurs. And there are computing pathologists who dissect new viruses and worms such as Stuxnet to see how they work, not to mention the cyber "warriors" at the NSA who create exploits to install surveillance software on the general population's computers and smartphones. Is this not like the world of multicellular life, where small viruses and bacteria and larger predators constantly vie for life's energy and replicative power?

Civil engineers who create steel bridges have a saying that "rust never sleeps." The comparable maxim in computing ought to be that "complexity never sleeps." As computing professionals work to reduce complexity, all too often their efforts actually exacerbate it because the already complex systems are far beyond our comprehension.

Much of the runaway complexity is inherent in what we now ask computing systems to do: principally, to participate in ever more complex networks of machines and ever richer interactions with users, involving many previously unavailable sensors. In addition to the increasingly complicated human-controlled client PCs on the net, the Internet now includes PDAs, cell phones, web cams, bar-code readers, RFID sensors, credit-card readers, and microphones for VoIP telephony and voice interactions with users. Effectors such as digitally controlled pumps and valves, traffic-light controllers, music synthesizers, and all manner of household appliances increasingly have web interfaces. Software in one machine interacts, often in unforeseen ways, with software in many other computers.

And not all of that software is benign. If a hacker hijacks your coffee maker, you may be irritated, but not threatened. If he hacks into your bedroom baby-cam, or into the system that controls the traffic lights between home and work, it goes beyond irritation. If he hacks into the system that controls the pumps and valves at a large oil refinery, chemical plant, or nuclear power plant, it can become a disaster. And if this hacker works for a cyber-crime gang, for al Qaeda, or for an unfriendly nation state, things get rapidly worse. Consider, for example, the July 4, 2009 attack on national resources in the US and South Korea (perhaps by a North Korean cyber-warfare group, or perhaps not). Or the recent US/Israeli Stuxnet virus attack on the centrifuges at the Iranian fuel-enrichment site at Natanz. The recently discovered "Flame" virus, aimed at Iran and other Middle East regimes, is by far the most sophisticated virus yet: it spoofs Microsoft Update certificates and can install key loggers, turn on microphones and cameras, sniff the local network, and much more.

Yet the complexity of the elements of the Internet (PCs, iPhones, routers, server farms, etc.), not to mention the complexity of their interactions, seems to provide an ever-growing smörgåsbord of unanticipated and unprotected opportunities for surprises. Unpatched Windows PCs, especially those running Internet Explorer as the browser, are easy marks for recruitment into botnets that exploit the weaknesses and complexity of the Internet for cyber crime and cyber warfare.
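To make the scale of that smörgåsbord concrete, here is a back-of-the-envelope sketch (my illustration, not from the original text): the number of potential pairwise interaction channels among n networked machines is n(n-1)/2, which grows quadratically, so each added device multiplies the opportunities for unforeseen interactions rather than merely adding one.

```python
def interaction_pairs(n: int) -> int:
    """Number of distinct machine pairs among n networked machines."""
    return n * (n - 1) // 2

if __name__ == "__main__":
    # Even modest networks have far more potential pairings than machines.
    for n in (10, 100, 1_000, 10_000):
        print(f"{n:>6} machines -> {interaction_pairs(n):>12,} potential pairs")
```

On the modern Internet, with billions of endpoints, the count of potential pairings dwarfs anything a human team could audit, which is precisely why surprises keep emerging from the interactions rather than from any single element.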

Truly, the network is the computer these days, and that network is under attack. Researchers at the Internet Storm Center estimate that, on average, an unpatched Windows PC connected to the Internet survives for about 20 minutes before it is compromised by malicious software. Twenty minutes is less than the time it takes to download and install the patches needed to protect the machine. Such attacks are accelerating dramatically: Symantec claims that unique malware variants increased to 403 million by the end of 2011. The capabilities and sophistication of viruses and worms continue to evolve as well. A worm discovered a few years ago tries to install a "sniffer" that uses infected computers to capture login and banking credentials from other computers on the same network; these days that is unremarkable. Researchers have discovered the possibility of more sophisticated viral attacks, via flaws in 802.11 wireless adapter drivers, that can work even when the machine is not connected to any network: merely having the wireless adapter enabled and searching for a network can be enough. So you can take your laptop into a Starbucks and get infected before you sit down with your coffee! "Out of Control" is an understatement.

It is tempting to believe that the only solution is to redouble our efforts to control complexity. True enough, we should continue to construct better engineering solutions to each problem: reduce complexity, build better firewalls, and better structure the interactions among all computers under our control. But we must also understand that such measures are at best stopgaps. As Taher Elgamal points out, “The hard truth of network security is that while many approaches are good, no individual effort makes the network completely safe. Implement enough fixes, and you only succeed at making your network more complex and, hence, more ungovernable, with solutions that wind up acting at cross-purposes.” The same can be said for each of the other specialized tasks in managing complex computing systems.

To successfully improve the security of our computing systems, we will need to modify our systems at an architectural level. To that end, I suggest that we learn from complex biological systems. This website explores four such architectural principles that have evolved in multicellular biological systems.

Read more about the role of complexity in the Evolution of Computing

Last revised 5/30/2014