Ignored reading pile in summer of swimming

Posted on January 14, 2008

So, back at my desk after an incredible summer. Hottest on record, lots of swimming. How much of the summer reading pile got read? None. Not a diddle. Until yesterday, that is, when I looked at the pile I’d ignored all summer. Then I ignored it some more and went for another swim.

But I did read the Communications of the ACM 50th-anniversary issue (it’s now online, and easier to read than the cellophane-wrapped ones I carry around meaning to read on planes). The issue looks back on 50 years of the publication. There’s too much technical detail for me; computing’s roots in cryptography and a continuing focus on the efficiency of algorithms do little to light me up. But then I noticed that most articles finish with a little flourish, a look forward. And when I looked more closely I found some interesting ideas, with some quite far-reaching biomimicry.

Gul Agha writes of “Computing in pervasive cyberspace”. He promotes building software systems composed of concurrent objects, or actors. Actors have several key characteristics: they are distributed, autonomous objects that interact by sending each other messages. (There’s a minimal sketch of the idea after the quotes below.)

Computing has been morphing ever since (the notion of the Turing machine). Initially developed as a flexible calculator for scientific problems, computers have successively become an arbiter and recorder of business transactions, a reasoning engine carrying out symbolic computations, a laboratory for running simulations, and a vehicle for social networking and entertainment. At the same time, their speed has increased more than 10-million-fold, and they have been interconnected through ever-larger bandwidths. Amazingly, our concept of a program as an implementation of a sequential algorithm has remained the same. While the shift to actor-oriented computing is overdue, we need to start thinking of software systems beyond the composition of actors.

Freed from the temporal constraints of hardware, software could be the ultimate cyberorganism—a mind taking a body as needed to fulfill a particular function or mission.
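
To make the actor idea concrete, here is a minimal sketch, assuming nothing beyond Python’s standard library: each actor owns a private mailbox and a thread that drains it one message at a time. The Actor and Greeter names are my own illustration, not anything from Agha’s article.

```python
import queue
import threading

class Actor:
    """A toy actor: an autonomous object reachable only via messages."""

    def __init__(self):
        self._mailbox = queue.Queue()  # private mailbox; never shared
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        # The sole way to interact with an actor: asynchronous,
        # fire-and-forget delivery of a message to its mailbox.
        self._mailbox.put(message)

    def _run(self):
        # Each actor handles its mailbox sequentially, one message
        # at a time, while running concurrently with other actors.
        while True:
            message = self._mailbox.get()
            if message is None:  # poison pill: shut this actor down
                break
            self.receive(message)

    def receive(self, message):
        raise NotImplementedError  # subclasses define behaviour

class Greeter(Actor):
    def receive(self, message):
        print(f"hello, {message}")

greeter = Greeter()
greeter.send("world")
greeter.send(None)
greeter._thread.join()
```

Note there is no shared state and no direct method call between actors; everything goes through the mailbox, which is what lets actors be distributed and autonomous in the way Agha describes.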

Rodney Brooks writes of a need to shift to new software architectures:

The goal would be nonbrittle software modules that plug together and just work, in the remarkable way our own flesh repairs itself when insulted.

Peter Neumann writes about computing risks. While his focus is on security, “particularly with respect to critical requirements such as security, reliability, survivability, evolvability, maintainability, interoperability, and predictable upgradability”, the sentiment could easily be expanded to a wider sphere:

The overwhelming conclusion from this body of material is that the risky problems are as great today as they were when we first set out to expose and eradicate them. Although the prevention mechanisms have improved somewhat, it is evident that we have not been advancing sufficiently rapidly in the development of mass-marketplace systems and custom applications that are sufficiently trustworthy—despite the tangible gains and research advances I noted in the first paragraph of this essay. Worse yet, various factors have outpaced those mechanisms, including increased complexity of systems, increased worldwide dependence on information technology and the ever-growing Internet, increasingly critical applications to which that technology is being entrusted, the general ease with which antisocial acts can be committed, and the ubiquity of potential attackers. Thus, we seem to be falling farther behind as time goes by. In particular, the huge expansion in the scope and pervasiveness of the Internet is creating many challenges for our community.


Jeannette Wing poses “Five deep questions in computing” (the first of which: P = NP?).

Our engineering prowess creates computer, communication, and information systems that enhance everyone’s daily lives and enable us to do astonishing things: instantaneous access to and sharing of information through palm-size devices with friends in social networks of tens of millions of users; dance with remote partners through 3D tele-immersion; and lead alternative lives through avatars that can even defy the laws of physics in virtual worlds. The complexity of these systems delivers the richness of functionality we enjoy today, with time and space performance that spoil us. Their complexity, however, also makes it difficult for even the original system developers to analyze, model, or predict system behavior, let alone anticipate the emergent behavior of multiple interacting systems.

Can we build systems with simple and elegant designs that are easy to understand, modify, and evolve yet still provide the functionality we might take for granted today and dream of for tomorrow? Is there a complexity theory for analyzing our real-world computing systems as there is for the algorithms we invent? Such a theory would need to consider measures of not just time and space but of energy and cost, as well as dependability, security, and usability, most of which elude quantification today. More ambitiously (or crazily), is there a complexity theory that spans both the theory and practice of computing?

I’m working this week, with the Environmental Educators conference in the latter half. Working some of next week, then a trip to Auckland, then the next week off. Maybe that summer reading pile will get some attention.
