Evolving Levels of Abstraction in Computing Systems
The evolution of computing systems -- the world of
bits -- is taking a
course similar to that of life, and now the two are merging!
The evolution of computing is much shorter and better known than the
evolution of life. Even
so, the history of computing is already losing track of many of the
emergent stages of development. Moreover, the details of the way each
evolved, especially those due to "accidents of history," are not fully
knowable. Nonetheless, here is a brief sketch of
the stages as I understand them. Others may characterize the
development of computing somewhat differently.
- Modern computing began during WWII to facilitate code breaking
and computation of artillery ballistics -- tasks that required
complicated and tedious arithmetic computations. At first, these tasks were
programmed by hard-wired plug-boards configured by hand to perform
higher-level logic and arithmetic functions.
- Soon, the notion of a unitary machine instruction emerged.
Machine instructions represent coordinated hardwired sequences of gate
changes with predictable results (today most machine instructions are implemented
by microcode rather than being hardwired, but that is another story). Early
programs were carefully crafted sequences of machine instructions that
exploited very clever (and today verboten) kinds of self-modifying code
and other obscure tricks. A program that involved a thousand machine
language instructions was a pretty substantial program and very
difficult to debug!
- As we learned more about programming, we made it more
human-friendly. We created reusable sub-routines or functions, i.e.,
sets of machine instructions with predictable results that can be
treated as a unit. We created more understandable code
abstractions, first assembly languages and then compiled
languages. Compiled COBOL was first available in 1959, to be
followed in 1960 by FORTRAN and soon thereafter, ALGOL. At that stage a
program still ran on the bare metal without operating systems, I/O
drivers, or other abstraction layers. The computer halted when the
program was finished, which was made obvious when all the blinking
lights on the operator’s panel froze (although, given the unreliability
of the early machines, frozen lights all too often simply signaled a
hardware failure or a program bug). It should also be noted that in
those days programs were hand-crafted for a single machine. Each
computer was unique; hence each program was
unique to the machine on which it was intended to run.
- The sociology of computing evolved in parallel. Each of the early
computers attracted a group of computing people -- amateurs of course,
it would be years before computing became a profession -- that explored
what the computer could do and freely shared their techniques and code.
As computers became more alike and people began using computers for
more routine tasks, operating systems emerged, as did outboard I/O
processors, databases, and many other kinds of middleware. The
separated communities of computer people finally had much to share with
one another, which led to the formation of the ACM (the first issue of
Communications of the ACM was published in 1958).
- The emergence of minicomputers and then “microcomputers” (which
evolved into PCs), based on the early microelectronic CPU chips, opened
computing up to hobbyists who completely changed the common notions of
what a computer was for. Instead of just being for record keeping and
arithmetic, computers branched into word processing, games, and
virtually anything that could be made digital. The rapid improvement in
price/performance due to Moore’s Law continued the expansion of
computing into new areas and in unheard-of numbers.
- Early efforts to connect computers to one another
began in the 1960s. Rather than moving rolls of paper tape, boxes of
punch cards, or reels of magnetic tape from machine to machine, it was
more convenient to send data over a wire or a phone line. This line of
evolution eventually led to the ARPAnet in the early ‘70s. The modern
Internet evolved in a series of steps out of the ARPAnet. In 1989-1991,
Tim Berners-Lee and collaborators proposed and built a combination of
HTTP, HTML, and a crude browser that exploded into the World Wide Web.
- The Web, in turn, has spawned all sorts of emergent multicellular
computing constructs such as infectious viruses and
worms, search engines, multi-player Internet games, peer-to-peer
networks, wikis, blogs, social networking sites, folksonomies, Web
Services, Service Oriented Architectures (SOA), mashups, and Web 2.0.
We have also witnessed the growth of cybercrime and now cyberwarfare,
exemplified by Stuxnet and Flame. We've come a long way, baby!
All of these emergent levels, from individual gates (now based upon
minuscule transistors printed on silicon rather than vacuum tubes or
discrete transistors) to collaborating web services, still participate
unseen in our everyday experience of computing. The most visible
examples are smartphones that merge voice, YouTube, streaming sports
events, Google, games, and a large world of other apps -- and you can
even program them yourself if you have a mind to.
What is amazing is that the evolution of the “virtual” world
occurred so rapidly. Computing had become so complex by the ‘80s (a
mere four decades after the first general purpose computer) that
different threads of evolution -- hardware, software, networking, and
cultural/economic -- were operating in parallel, dependent upon each
other and reinforcing each other in completely unexpected ways.
Finally, in the mid-'90s, the Internet/Web and the dot-com boom
supercharged the evolution of computing by driving it into every nook
and cranny of modern culture. We now see amusing consequences such as
adults asking their middle-school kids to help them with some computer
problem or business executives with expensive laptops that are
considerably less powerful than their kid’s three-hundred-dollar game
box. Many of us walk around with one or two computers more powerful
than a 1960 mainframe in our pockets. Today, most of them, such as
iPhones, iPods, and Android devices, are connected to the Internet at all times.
Modern high-tech human societies -- arguably the pinnacle of the
evolutionary path that led step by step from atoms -- are beginning
to be influenced as much by bits as by atoms. The evolution of life and the evolution of
computing are merging, bringing the complexity of both realms together
in completely unforeseeable ways.
Contact: sburbeck at mindspring.com
Last revised 9/3/2012