Sunday, March 22, 2009

Systemantics: A systems view of everything

Systemantics (retitled The Systems Bible in its third edition) is a book by John Gall in which he proposes a number of "laws" of systems failure. The title is a play on words: it echoes semantics and suggests that systems display antics.

It is written in the style of a serious academic work and is often mistakenly cited as one. In spirit it is similar to Murphy's Law and the Peter Principle, both of which are referenced in the book.

Some laws of Systemantics

  • The Primal Scenario or Basic Datum of Experience: Systems in general work poorly or not at all. (Complicated systems seldom exceed five percent efficiency.)
  • The Fundamental Theorem: New systems generate new problems.
  • Laws of Growth: Systems tend to grow, and as they grow, they encroach.
  • The Generalized Uncertainty Principle: Complicated systems produce unexpected outcomes. The total behavior of large systems cannot be predicted.
  • Le Chatelier's Principle: Complex systems tend to oppose their own proper function. (As systems grow in complexity, they tend to oppose their stated function.)
  • Functionary's Falsity: People in systems do not actually do what the system says they are doing.
  • The Fundamental Law of Administrative Workings (F.L.A.W.): Things are what they are reported to be. The real world is what it is reported to be. (That is, the system takes as given that things are as reported, regardless of the true state of affairs.)
  • Systems attract systems-people. (For every human system, there is a type of person adapted to thrive on it or in it.)
  • The bigger the system, the narrower and more specialized the interface with individuals.
  • A complex system cannot be "made" to work. It either works or it doesn't.
  • A simple system, designed from scratch, sometimes works.
  • A complex system that works is invariably found to have evolved from a simple system that works.
  • A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
  • The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
  • The Newtonian Law of Systems Inertia: A system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
  • Systems develop goals of their own the instant they come into being.
  • Intrasystem goals come first.
  • The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
  • The mode of failure of a complex system cannot ordinarily be predicted from its structure.
  • The crucial variables are discovered by accident.
  • The larger the system, the greater the probability of unexpected failure.
  • "Success" or "Function" in any system may be failure in the larger or smaller systems to which the system is connected.
  • The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by failing to fail safe.
  • Complex systems tend to produce complex responses (not solutions) to problems.
  • Great advances are not produced by systems designed to produce great advances.
  • The Vector Theory of Systems: Systems run better when designed to run downhill.
  • Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)
  • As systems grow in size, they tend to lose basic functions.
  • The larger the system, the less the variety in the product.
  • Control of a system is exercised by the element with the greatest variety of behavioral responses.
  • Colossal systems foster colossal errors.
  • Choose your systems with care.

Advanced systems theory

1. Everything is a system.
2. Everything is part of a larger system.
3. The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
4. All systems are infinitely complex.
