Bart Stewart -- Systemantics 0


S Y S T E M A N T I C S 0


BACKGROUND

One of the realizations to emerge from the recognition of the complexity of modern human life is that the universe is full of systems.

Some of them even work.

Most of those which do, however, arose naturally. That is, they are not products of human ingenuity. For all the fecundity of our imagination, we consistently fail to create large-scale systems that work as well as, say, gravity or photosynthesis.

Why is this so? While some may be content to simply shake their heads, roll their eyes, and mutter something along the lines of "humans are just plain weird," I have a theory. It is this: humans are just plain impatient.

We want it now. So we short-circuit the natural process of evolution by designing a system to do what we want. Rather than make infinitesimal changes very gradually across a highly diverse and interconnected environment, we make big changes very rapidly in an extremely narrow field.

The result -- which should be predictable -- is usually failure, sometimes on a massive scale with widespread unforeseen consequences. When we screw up, we tend to screw up big.

And yet... sometimes we get it right. Sometimes we actually manage to create working systems. (Though even these often have unexpected side effects.) What some thinkers now propose is that these successes are more likely when we understand and take into account the fundamental laws of systems-behavior.

First came Ludwig von Bertalanffy (yes, that was his real name), who provided a theoretical basis for understanding system behavior, particularly that of natural systems. This focus, however, prevented Bertalanffy from recognizing the conscious design aspect of human systems that allows some systems to work while others fail miserably (and sometimes disastrously).

The spark that ignited this revolution in practical thinking about systems came from an engineer, whose off-the-cuff insight produced what must be considered the primal systems-law: "If anything can go wrong, it will." With that realization, the field of Systems Analysis was born.

Rigorous research soon followed. Count Alfred Korzybski (yes, that was his real name, too) tried to put the study of systems on a firm theoretical footing with his work on General Semantics. Following him came researchers of a more pragmatic turn of mind (and easier names). C. Northcote Parkinson focused on institutional systems. His _Economist_ essay "Parkinson's Law" and other related work exposed the significance of human action in systems-failures. Laurence J. Peter (_The Peter Principle_) expanded the popular understanding of this effect by cataloging examples in a familiar setting -- the office.

Finally, to give this field the aroma of respectability it so richly deserves, John Gall wrote a book called _Systemantics_, a play on words based on the way "systems display antics." Gall collected and analyzed hundreds of examples of systems-failures, then generalized from them a number of pithy and highly quotable Laws of Systems Behavior. His influence on this field cannot be overstated.

(It should also be pointed out that some of Gall's Horrible Examples of systems gone bad seem less about casting light on a theory than about beating up on certain things he doesn't like, with anything to do with splitting atoms heading the list. Still, his work as a whole survives these occasional excursions into personal score-settling.)

Although _Systemantics_ is written with an air of facetiousness, Gall does make three useful contributions to those of us who would like to make the world work better but who aren't so much idealistic reformers as pragmatic theorists:

This, I think, is a positive approach to dealing with the natural tendency of systems to display antics. We all have to deal with systems. Why not try to learn their fundamental principles of operation (or, more often, non-operation) so that we can improve our chances of success?

I therefore present herewith those Laws of Systems Behavior which seem likely to have the most value for those who want to produce working systems in something less than geologic time.

Note: Some of these Laws of Systems Behavior are from Gall's _Systemantics_, but the text discussing each Law is mine. If you have a gripe about the text associated with a Law, the fault is mine, not Gall's.

Another note: It is useful to point out that, although the observations which follow apply to all systems, they apply in particular to _human_ systems. My point is to distinguish between mechanical systems and systems of human interaction such as organizations or institutions. Mechanical systems tend either to work reasonably well or else not at all. But human systems, because their outputs are subject to perception and interpretation, may actually continue to exist even though they have long since ceased to perform the function for which they were created.

So anything said about the bizarre behavior of systems in general goes double (at least) for human systems.


Background

I. General System Behavior and Morphology
II. Systems and Information
III. System Design Guidelines: Number
IV. System Design Guidelines: Size
V. System Design Guidelines: Longevity
