The Systems Bible: The Beginner's Guide to Systems Large and Small

316 pages, Paperback

First published January 1, 1977

About the author

John Gall

22 books, 32 followers
John Gall (September 18, 1925 - December 15, 2014) was an American author and retired pediatrician. Gall is known for his 1975 book General systemantics: an essay on how systems work, and especially how they fail..., a critique of systems theory. One of the statements from this book has become known as Gall's law.

Gall started his studies at St. John's College in Annapolis, Maryland. He received further medical training at George Washington University Medical School in Washington, and at Yale College. Eventually, in the early 1960s, he took his pediatric training at the Mayo Clinic in Rochester, Minnesota.

In the 1960s Gall started as a practicing pediatrician in Ann Arbor, Michigan and became part of the faculty of the University of Michigan. In 2001 he retired after more than forty years of private practice. In the first decades of his practice he had also "conducted weekly seminars in Parenting Strategies for parents, prospective parents, medical students, nursing students, and other health care practitioners." Until 2001 he held the position of Clinical Associate Professor of Pediatrics at the University of Michigan. He had been a Fellow of the American Academy of Pediatrics since 1958.

After he retired, Gall and his wife Carol A. Gall moved to Walker, Minnesota, where he continued writing and published seven more titles. He died in December 2014.

Ratings & Reviews

Community Reviews

5 stars: 354 (39%)
4 stars: 301 (33%)
3 stars: 162 (17%)
2 stars: 61 (6%)
1 star: 29 (3%)
Displaying 1 - 30 of 111 reviews
John
293 reviews, 25 followers
December 25, 2016
I think what this book demonstrates is that a certain kind of common sense isn't sense at all, but rather a cynical tyranny of half-truths. It is disingenuous, in that it attempts to borrow the prestige of technical language while also writing in a register of humor, so that any attempt to see past it provokes the guard reflex of not being in on the joke. Another frequently used convention is the use of upper-case words to make conceptual entities seem justified, well-known, and cohesive, while offering no definition or justification of those entities as common phenomena. For example, I could call it Sophistical Trickery to make such a move, but I really don't mean anything more than sophistical trickery, no matter how much the emphasis seems to imply. However, beyond the matter of style, there is what this little book of systems ethics forgot: what it is good to do, given the state of living within systems.

Such is the dogmatism of this volume that I fear this negative review might cause partisans to label me as some kind of "systems thinker" or "change agent", for which the book spares no tar. What I am an apologist for is straightforward speaking, careful scholarship, and thoughtful analysis. The great tragedy of this book is that in spreading the tar around it covers good intuitions with bad argument.

There are good intuitions; I might say that there is a lot of half-truth to be found here, and in particular the last sections, which suggest guidelines, are useful. Here is what is useful to know about this book: don't build systems, but instead solve the problems you encounter directly. I think this is right, but let me say something further: don't be afraid to build frameworks, to build guidelines, to build processes, to nonetheless undertake design and undertake it creatively. You should resolve the matters in front of you as straightforwardly and conclusively as possible, but with all of the preparation, care, and design you think is truly appropriate to the task.

The author notes that everything is a system, and all systems attempt to preserve themselves: therefore, we are systems and our obligation is our own care. We are within a profusion of systems, so it is for us to preserve the practices that demonstrate their capacity to care for us, and to do things not merely because they are there to do, but because they genuinely promote flourishing as we see it.
Sai
97 reviews, 11 followers
January 21, 2018
Some parts systems theory, some parts psychology. The author has a quirky writing style and a consistent dry sense of humor, which I enjoyed, but I can't see it being to everyone's liking.

This book reads as if a shaman were educating you about complex systems. Very pithy, but it can also come across as not rigorous enough.

I wish the author had tackled more post-failure scenarios and how to deal with messes; I would have been happy to add that 5th star. The author briefly touches on system resiliency, but a full-blown take would have made the book more complete IMO.

Worth picking up after finishing Thinking in Systems by Donella Meadows and before reading Antifragile by Taleb
Kaspar
8 reviews
March 25, 2020
A simple and brilliant work that you'll probably misunderstand.

Gall, with wit and concision, advocates an attitude and mindset of deep humility and skepticism when dealing with systems. The problem is that this book really is the Tao of Systems Thinking. To receive its wisdom and recognize its profound depth, to grasp even the need for systems-skepticism, the reader should expect to meditate on these aphorisms for days, months or even years. As an intro to systems thinking, it's not very good or useful at all. It might infuriate the reader or leave them completely indifferent. But if the reader already has substantial experience with systems thinking, Gall will elevate and organize their understanding by validating their intuitions, offering useful heuristics and checking their ego.
Otto Lehto
453 reviews, 171 followers
May 24, 2019
Do not take this book very seriously. It is a quirky little comedy essay about General Systems Theory. There is really nothing to compare it to, so I really have no idea how to rate it...

Although it lacks any kind of scientific rigour or empirical accuracy, it does a pretty good job of explaining the basics of how (complex) systems work. It does so in a surreptitious way by simplifying the science behind systems theory and complexity theory into pithy slogans and anecdotes. (This is bad practice that usually leads to terrible results, but somehow not here.)

The book is pleasantly short to read (it can be finished in an hour or two) and contains enough ideas to give the reader an idea of how to think skeptically and cynically about the powers of prediction and control that systems-builders and systems-reformers always claim to possess. (Spoiler alert: they don't.)

I would not rely on the book if I wanted to find out the best science of systems. It is mostly factually dubious, rhetorically outrageous, and severely biased. But I liked it quite a bit for what it is.
Ushan
801 reviews, 70 followers
December 27, 2010
A cross between Dilbert, Dao De Jing and Charles Perrow's Normal Accidents. Large technological and social systems lose track of their original purpose and turn self-serving; they do not function as designed because their creators forgot Le Chatelier's principle and were unaware of various feedback loops. The process of observing the systems changes them. Passive safety is better than active safety; when used mindlessly, safety devices and procedures themselves become safety hazards.

The examples of systems gone bad are great. An enormous hangar designed to house space rockets and protect them from the elements generates its own weather, so it can rain inside it upon the rockets. When the Fermi I experimental breeder reactor experienced partial meltdown, radioactive sodium was drained and a complicated periscope and pincers were lowered into it; it was found that a foreign object blocked the flow of sodium; the object was later identified as a safety device installed at the very last moment and not documented (Perrow also tells this story; Gall is mistaken in calling it an anti-meltdown device: this would've been too cute). A Peruvian railroad replaced its steam locomotives with diesel ones; they discovered after the fact that diesel locomotives lose most of their power at Andean altitudes, unlike steam ones, but instead of going back to steam, the Peruvians used two 3000hp diesel locomotives where one 1300hp steam locomotive sufficed before. The Nile used to flood annually and fertilize the Egyptian fields; Nasser built the Aswan dam, which stopped the flooding; the dam produces electricity, which is used to make artificial fertilizer (J. R. McNeill also tells this story in his environmental history of the twentieth century). The examples of ignored feedback are also nice. The Green Revolution caused third-worlders to go hungry as before - but at much higher population densities. Widespread application of antibiotics caused antibiotic-resistant germs to emerge. On the other hand, Washington D.C.'s international airport has a better-than-average safety record despite its hazardous features: "It was safe because it was bad. It kept pilots alert".

Every engineer could cite many examples of systems gone bad. So could everybody interested in politics. I wonder if politically Gall is a Reaganite; certainly his book made me think of Reagan's famous remark, "My friends, some years ago, the Federal Government declared war on poverty, and poverty won." My favorite political example is the agricultural policy of the USA and the EU. The United States Federal Government on the one hand, subsidizes farmers and tries to keep food prices and demand for food high, on the other hand, issues food stamps to poor people because food prices are too high, and on the third hand, combats obesity through the National Institute of Health. The EU countries' governments give out large amounts of aid to poor countries, yet impose high tariffs on agricultural imports from them. Like Stanislaw Lem's King Murdas, they are examples of systems so large that their various parts have minds of their own, sometimes contradicting the minds of other parts.
Phil Eaton
91 reviews, 184 followers
March 19, 2019
Douglas Adams writes a book on complexity and failure.

In the top two books I've read in the last five years.
John Fultz
28 reviews, 5 followers
August 2, 2014
What a very odd book. The voice is incredibly serious, yet often with tongue planted firmly in cheek. The style reminds me a bit of The Dilbert Principle, but with less overt humor and more "wink-wink-nudge-nudge, but no, this is really serious".

Also, it's an old book, and it shows through the examples and footnotes. Many of them date to the 60s and the 70s. Although this printing was released in 2012, there's a lot of the previous decades leaking through.

All of that having been said, the principles seem fairly solid to me. The book warns firmly against optimism when dealing with systems, and advises pragmatic approaches which involve compromise. Several chapters go into some detail discussing software as a system, and I can certainly personally verify most of what was written there. And the many anecdotes are entertaining, if humbling.
Eric Franklin
78 reviews, 85 followers
February 6, 2019
Absurd, hilarious, and useful, this is a complete and creative toolkit for understanding and interacting with systems. Replete with humorous examples and rife with overt cynicism, it is a timeless representation of human futility in engaging with our own creations.
Justus
182 reviews, 4 followers
November 21, 2010
Tries unsuccessfully to be flip and isn't very insightful, but it's a quick read with an interesting set of mind-tickling maxims.
42 reviews, 1 follower
June 6, 2016
Funny at times, but I'm not sure there was actually much I could take away from it. I did like the use of very short chapters.
Anna
35 reviews
July 10, 2022
Recommend it for anyone who wants to solve big problems or has to interact with human systems
Prasanna
231 reviews, 10 followers
October 28, 2017
I read the book in one sitting on a Friday when I was taking a break from working on some annoying distributed systems issues. It speaks to the timelessness of systems problems that a book first published in 1975 can have such an impact even today. There have been many systems theory books since this one, and I had just read John Miller's "A Crude Look at the Whole" and expected more of the same treatment. Boy, was I surprised! The book is broken into very small chapters that each discuss a certain axiom of systems thinking with anecdotes. The book feels friendly while also sounding serious and authoritative at times. It validated a lot of my own thoughts about heterogeneity bringing stability to a system and about how complex systems with problems should be tackled. This should be required reading for anyone who wants to develop a systems thinking mindset.
Martin Brochhaus
153 reviews, 163 followers
November 21, 2020
I'm not sure how to rate this. I wanted to learn about Systems Thinking so that I can apply it in life.

What I got from this book were a few good chuckles (decent humour) and an existential crisis (everything seems hopeless and pointless and utterly broken, and we will probably all die in a nuclear winter).

All the axioms and theorems seem to make sense to me, but of course only examples that proved the point were cherry-picked.

Most chapters are a headline, a few paragraphs, one or two new theorems/heuristics. It quickly becomes too much to keep in mind, and the chapters don't seem to properly build on top of each other, so I quickly stopped caring too much and just forced myself to the end.

I suspect this book is aimed at people who are professional "Systems Thinkers" (is that even a profession?) and that it's just an attempt at humour, the way the TV show The Big Bang Theory is humour for pop culture nerds.
55 reviews
January 14, 2023
Good nuggets about how systems work, or don't, at a macro level. Lots of food for thought on systems I encounter at work (the company as a system, the industry as a system) or in our daily lives. My simplified summary: complex systems fail in unexpected ways, and in fact they mostly exhibit failures, so you should just accept that; systems tend to preserve their own existence even when they're bad or not needed; and you need an outsider's view to properly evaluate them, as systems create their own internal rewards to keep you hooked.

The material is quite dry and at some point in the latter half it becomes self-referential. I was relieved when the Appendix section started about 60% into the book and I could move on to the next book.

I was slightly put off by the self-aggrandizing writing style and the unexpected and frequent capitalization of many words, as if they're axioms I'm supposed to memorize. That gives the book the vibe of a cultish/religious text I'm supposed to believe and not scrutinize.
Alexander Yakushev
49 reviews, 35 followers
March 21, 2018
However satirical, this book presents hard questions and no easy answers. It is a very humbling experience that makes you rethink your approach to solving problems (and whether what you do solves them at all).
Leonid
228 reviews, 18 followers
January 18, 2023
Even though the book has some interesting (albeit not super-original) insights, at times its ironic tone harms the message, as I feel the author relies on anecdotes to prove system failures rather than providing better evidence. It also sometimes feels like the author deliberately misrepresents the goals of some systems in order to show their failures.
To those interested in the general "systemantics" topic, I'd much sooner recommend "Thinking in Systems" by Donella H. Meadows.
Bob
45 reviews
January 21, 2020
Meh...

As a systems thinker and fan, I didn't find much to recommend it.

My main takeaway: complex systems do what they're gonna do, and it's not what you want.

Abandoned about two-thirds of the way through, but definitely feeling done with it.
39 reviews
October 8, 2018
Things are not working, that much we agree about. For me, these 'things' are mostly software systems which drive me closer to the edge of insanity the more I work with them, and for the author of this book it's mostly human organizations. The arguments of the book are supposed to apply to all systems, so we can ignore this minor difference. Who or what is to blame for this state of affairs? Or even better, how can we navigate it? The culprit, according to the author, is the systems that we see everywhere, enjoined and entangled, permeating our lives, and pulling us in all directions. The solution proposed by the author is to be suspicious of systems, and to know their tricky ways ("We must learn to live with systems, to control them lest they control us", p. xiii). An extensive definition of a system is not attempted by the author, rather wisely, because this gives the reader the drive to look for systems in their own life. There is one roundabout way of defining systems towards the end of the book which I found rather enlightening, however: "The system represents someone's solution to a problem. It does not solve the problem" (p. 74). More than systems per se, the author targets what he calls systemism, the vacuous belief in the fundamental usefulness of systems, and the book aims to cast it out of the reader through succinct statements of the flaws of systems. The book is written in a tongue-in-cheek style. The author makes claims of systematicity, and orders his argument in axioms that should somehow lead up to a consistent theory, but that's not really the case. This doesn't mean that the book is inconsistent, however. The axioms serve to deconstruct systemism, and show, in the process, how systems not only fail to function as advertised, but also blind us to their failure.

The argument starts off with a statement of the singularity of systems: All systems exhibit system behavior, and when we have a new one, we will have to deal with its problems, just like with all the others. Systems also have a will to survive and grow. Based on work on the size of administrative systems, the author (rather informally) judges the annual growth of systems to be on the order of 5 to 6 percent per year. This despite the fact that systems do not produce the results we expect of them. In fact, we cannot even project what kind of behavior complex systems will exhibit. This is due to what the author calls the Generalized Uncertainty Principle (GUP): "Complex systems exhibit unexpected behavior" (p. 19). The simple and straightforward explanation for the GUP is that no one can understand the real world well enough to predict the outcome of complex processes. This is a fact that has been internalized in parts of the startup community with the principle of experimenting and failing fast to find out what is working, instead of just guessing it. One obvious trick to beat the GUP is taking a simple system that is well understood, and making it bigger. This will not work, however, which I have had the chance to observe personally a number of times, and the author agrees: "A large system, produced by expanding the dimensions of a smaller system, does not behave like the smaller system".

The next axiom, called Le Chatelier's principle, is also something I have had the misfortune of observing: "Systems tend to oppose their own proper functions" (p. 23). That is, systems, through ill-defined proposals for improvement, will install procedures which nominally aim to improve things, but which bind people and resources in what the author calls Administrative Encirclement. The example the author gives is from an academic setting, but every software developer knows the dread of having to attend meetings that are supposed to improve efficiency but accomplish nothing other than keeping people from working.

The result of the GUP and Le Chatelier's principle is that systems do not perform the function a similar system of smaller size would perform. The relationship is in the opposite direction: The system is not defined by the function, but the function by the system: "The function or product is defined by the systems-operations that occur in its performance or manufacture" (p. 36). But if systems are inefficient and producing the wrong thing at the same time, how come they don't self-correct? To do so, a system should have the capacity to perceive the situation, which is not the case. For systems, "The real world is what is reported to the system" (p. 39). I think this is one of the most striking lessons of this book. It is astonishing how distorted the views of people working within a company (especially in higher positions) can become, and this precludes making any significant changes in the company. This is also one of the most significant parallels between human and software systems. A software system, too, is only as good as its fidelity to the real world. Some systems keep on running for a long time as their records of reality and the real world drift apart. The author also has a really nice name for the ratio of the reality that reaches administration to the reality impinging on the system: the Coefficient of Friction.

What about intervening in systems as an outsider? Can one judge their behavior from the outside, without the tinted glasses through which the system sees the world, and improve the system? According to the author, this is possible only if the system already worked at some point. One cannot build a complex system and expect it to work; this is not possible. A working complex system can be achieved only by starting with a small system and growing it. This is another parallel to software development: Large systems that are designed without a single line of code being written, and are then built by separate teams, face huge problems when the time for integration comes. This insight has led to the agile movement, which aims for always-working software that is integrated as frequently as possible. What's more, software teams also face a similar issue. Gathering a large number of developers together and telling them to build a specific thing does not work either. The best approach is to start with a relatively small team, see what works and what doesn't, establish a culture, and grow around them.

How systems deal with errors (or fail to do so) is one of the most relevant parts of the book for modern technological systems. Due to the fact that systems tend to grow and encroach, they will have an infinite number of ways in which they can fail. These ways, and the crucial variables which control failure, can be discovered only once the system is in operation, since the functionality of a complex system cannot be deduced from its parts, but only observed during actual functioning. These points lead to the conclusion that "Any large system is going to be operating most of the time in failure mode" (p. 61). It is therefore crucial to know, and not treat as merely extraordinary, what a system does when it fails. This is not so easy, however, since, as per the coefficient of friction, it is difficult for a system to perceive that it is working in error mode, which leads to the principle that "In complex systems, malfunction and even total nonfunction may not be detectable for long periods, if ever" (p. 55).

If it is so difficult to design systems that work, and keep them working as they grow, how are we supposed to live with them? The obvious step is to avoid introducing new ones, that is, "Do it without a system if you can" (p. 68), because any new system will bring its own problems. If you definitely have to use a system, though, the trick is to design the system with human tendencies rather than against them. In technological systems, this is understood as usability. Another principle is to design systems that are not too tightly coupled in the name of efficiency or correctness. This is stated as "Loose systems last longer and function better" (p. 71).
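To make the loose-coupling point concrete in software terms, here is a minimal sketch of my own (not from the book; the names are invented): the caller depends only on a narrow interface, so a part can be swapped out or degraded without the rest of the system binding up.

```python
# Hypothetical illustration of loose coupling; names are invented, not from the book.
from typing import Protocol


class Notifier(Protocol):
    def send(self, message: str) -> None: ...


class EmailNotifier:
    def send(self, message: str) -> None:
        print(f"email: {message}")


class LogNotifier:
    def send(self, message: str) -> None:
        print(f"log: {message}")


def report_failure(notifier: Notifier, detail: str) -> None:
    # The caller depends only on the narrow Notifier interface, not on any concrete class,
    # so this part of the system can be replaced without touching the caller.
    notifier.send(f"subsystem failure: {detail}")


report_failure(EmailNotifier(), "queue backlog")
report_failure(LogNotifier(), "queue backlog")  # fallback path, same caller code
```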

After reading the book, and writing this review, I have only one question in my mind: Why does this book exist? How can it exist? How can it be that so many mind-blowing insights about technological systems were derived by an MD and recorded in an obscure 80-page book sometime in the seventies? And which other books exist out there that are as good as this one, and are not yet discovered?
Tony
99 reviews
August 1, 2015
I read this book on a Saturday afternoon. Small book, amusing writing, easy to follow.

This book was published in 1975. Don't be surprised if some of the examples, and some of the language, are somewhat dated.

The author attempts to be both amusing and academic in his approach. I find most academic writing to be dry and overly intellectual. While the intellectual aspects of this book annoyed me to some degree (otherwise it would have 5 stars) the humor does shine through.

What are the common characteristics of systems? Keep in mind: machines are systems, as are electronic devices, as are organizations of people, as are computer programs. What things can we say about ALL of these things?

First off: have you ever noticed that systems tend to fail? So much, in fact, that we consider it normal? So much so that we have a common acronym for when they do (SNAFU: Situation Normal, All F***ed Up)? Systems tend to spend more time in "failure mode" than in proper working order. And attempts to remedy the situation, usually by making the system more sophisticated, only increase the probability that it will be in failure mode at any given time.
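A quick back-of-the-envelope calculation of my own (not from the book) shows why adding parts makes this worse: assume, purely for illustration, that each part independently misbehaves 5% of the time.

```python
# Back-of-the-envelope only: if each of n parts independently misbehaves 5% of the time,
# the chance that something in the system is failing grows quickly with n.
for n in (1, 10, 50, 100):
    p_any_failing = 1 - 0.95 ** n
    print(f"{n:3d} part(s) -> something failing {p_any_failing:.0%} of the time")
# 1 -> 5%, 10 -> 40%, 50 -> 92%, 100 -> 99%
```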

As someone who creates systems for a living (I'm a programmer), I tend to think that just a few more tweaks to my shiny new system are all that is needed to get it working properly, reliably. Instead, I need to design systems to be as easy as possible to clean up after when they do fail, and the consequences of that failure need to cause as little pain as possible. Because the only thing you can count on is that it WILL fail, at least part of the time.

Right. I needed a book to tell me this? It's full of things which should be "duh!" And, in retrospect, are. But going in, you will probably find yourself with a lot of "AHA!" moments.

Spend an hour or two with this one. You won't regret it. You'll probably have a couple laughs. And sometimes, we just need to have those things we've been feeling, down in our bones, publicly stated.

Never forget. The hot air balloon was invented by a couple paper makers (Montgolfier brothers). And the airplane was invented by a couple bicycle makers. Not by some organization.
212 reviews, 9 followers
November 1, 2014
Not what I expected, but still very relevant. I expected something very academic and mathematical. The author claimed many times that his principles were "axioms", and that they were pristinely mathematical in nature and all self-evident. This was a rather annoying claim, since the book was not mathematical at all, nor were the axioms necessarily self-evident (though good supporting examples were provided). Despite this, it all still rings perfectly true. A system can be a blessing or a curse, but it is guaranteed to have unexpected behavior. When it does something bad, you'd better hope that your system is flexible, changeable, and somehow objectively monitorable, and that it doesn't completely dominate everything and allow only positive feedback.
The style is kind of a mix of Taoist philosophy, design, a tiny bit of math (really, barely any), common sense, self-improvement, endearing Latin textbook, and more. Lately I'd been thinking about all of the generalizable things I'd learned from programming (and especially my strengthened dislike of large bureaucracies): that things need to be flexible and interchangeable, testable and constantly tested, designed with user experience in mind, tested before deployed, etc. All of that is abstracted out of software design and into the real world with this book, which is really quite phenomenal. Software happens to be just one type of system.
Taylor Pearson
Author, 3 books, 740 followers
January 28, 2019
Complex systems are one of my favorite subjects and The Systems Bible is a great entry in the genre.

Simple systems are a sum of their parts: a bike is just a bunch of parts. If you take a wheel off and replace it with another, no big deal.

In a complex system, the whole is greater than the sum of its parts. If you take the heart out of a horse and then replace it a few hours later, it doesn't start working again like a bike. This does not mean we can't understand complex systems, only that they play by a different rulebook, which The Systems Bible attempts to capture (and largely succeeds) with quotes like:

“A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.”
and

“SYSTEMS TEND TO MALFUNCTION CONSPICUOUSLY JUST AFTER THEIR GREATEST TRIUMPH:
Toynbee explains this effect by pointing out the strong tendency to apply a previously-successful strategy to the new challenge:
THE ARMY IS NOW FULLY PREPARED TO FIGHT THE PREVIOUS WAR
For brevity, we shall in future refer to this Axiom as Fully Prepared for the Past (F.P.F.P.)”
Raziel
Author, 1 book, 1 follower
November 9, 2016
"we humans tend to forget inconvenient facts, and if special notice is not taken of them, they simply fade away out of awareness", p. xx.

"The reader is hereby warned that any such optimism is the reader's own responsibility", p. xxi

"Error is our existential situation and that our successes are destined to be temporary and partial", p. xxv.

Efficiency Expert: Someone who thinks s/he knows what a given System is, or what it should be doing, and who therefore feels s/he is in a position to pass judgment on how well the System is doing it. At best a nuisance, at worst a menace, on certain rare occasions a godsend. (p. 237)

Expert: A person who knows all the facts except how the System fits into the larger scheme of things. (p. 237)

Specialist: One who never makes small mistakes while moving toward the grand fallacy (McLuhan). (p. 240)

Systems-person: For purposes of recognition, a Systems-person is someone who wants you to really believe in or (worse) really participate in their System. (p. 241)

System: A set of parts coordinated to accomplish a set of goals. Now we need only to define "parts", "coordinated", and "goals", not to mention "set". (p. 240)
87 reviews
June 7, 2016
This book was first published in 1975 and has gone through several printings.
It is a serious book that sometimes masquerades its points with humor. The general theory supported in the book is that "Systems in general work poorly or not at all". Two representative corollaries of this theory are "Large systems usually operate in failure mode" and "The system tends to oppose its own proper function." The strength of the book is its examples of real-world system behaviour, ranging from the administrative to the technical world. Human failure, while a part of many systems, is not claimed as the underlying cause; instead it is suggested that the observed difficulties are intrinsic to the system's operation. This theory is hard to validate given our current focus on human error in design and operation. Difficulties notwithstanding, it appears others have attempted to carry this work forward. I look forward to reading further in Systemantics in case hope is found for human intervention.
1 review
May 27, 2008
I would like this book to be required reading for all high school or college students. It would help dispel the unhealthy, widespread blind faith in "systems." To paraphrase the author: A large system (Congress, for example) never does what it says it does. Large systems have their own goals.

"The Systems Bible" is written for the layperson. It is very witty and full of usable wisdom.
Lou Cordero
127 reviews, 1 follower
September 11, 2014
The copy I read is subtitled "How systems work and especially how they fail". A wonderful, easy read that sheds light and humor on the development of complex systems and on the impossibility of solving the problem correctly and completely. I recommend this book to anyone involved in the design of complex systems.
Julissa Dantes-castillo
363 reviews, 26 followers
September 27, 2023
I appreciate the author's enjoyable approach to the book, as it serves as an exemplar of maintaining conciseness and precision in its discussion of the subject matter. It refrains from including unnecessary supplementary examples and remains focused on only those that are truly necessary. Overall, it is a really good book.
1 review
April 5, 2023
I. GENERAL SYSTEM BEHAVIOR AND MORPHOLOGY

1. If anything can go wrong, it will.

2. Systems in general work poorly or not at all.
Another way of saying this: Things don't work very well--in fact, they never did. What is this but the recognition that being human is intimately linked to building systems, which by their nature are always more likely to fail than to succeed?

3. Complicated systems seldom exceed five percent efficiency.
Part of the reason why designed systems (as distinct from evolved systems) perform so poorly is their complexity.

4. In complex systems, malfunction and even total non-function may not be detectable for long periods (if ever).
Large-scale human systems are particularly susceptible to this condition. Because the determination of the quality of the system's output is often highly subjective--in many cases coming from a component of the system itself--the partial or even complete failure of the system can be masked. This is not always deliberate; sometimes it is simply the complex nature of the system itself that prevents observers from determining within any useful margin of error the level at which the system is functioning.

5. A system can fail in an infinite number of ways.

6. Systems tend to grow, and as they grow, they encroach.
The human impulse toward improvement (some might call it "tinkering") means that any working system is regarded as a challenge. "If this system is good now," the thinking goes, "expanding it could only make it better." Another way of describing this behavior: "Systems tend to grow at five per cent per annum." (Examples abound in business and government... and indeed, government itself is an example of how working systems tend to grow.)
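Taking that tongue-in-cheek growth figure at face value, purely as an illustration of my own (not a claim from the book beyond the quoted rate), compound growth at five to six percent per year means a system roughly doubles in size every twelve to fourteen years:

```python
# Back-of-the-envelope only: doubling time implied by the quoted 5-6% annual growth.
import math

for rate in (0.05, 0.06):
    doubling_years = math.log(2) / math.log(1 + rate)
    print(f"{rate:.0%} per year -> doubles in about {doubling_years:.1f} years")
# 5% per year -> doubles in about 14.2 years
# 6% per year -> doubles in about 11.9 years
```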

7. As systems grow in complexity, they tend to oppose their stated function.
Although it may seem counterintuitive, as systems become more complex their outputs actually begin to change to preserve or even intensify the problem they were originally designed to solve.

8. As systems grow in size, they tend to lose basic functions.
A system is created to solve a problem. It exists to perform a basic function. Sometimes it actually works.

9. The larger the system, the less the variety in the product.
As a system grows, it becomes progressively more difficult for the element designed to exercise control to do so. In order for a system's defined control element to function as designed, it requires information about how well the system's outputs are conforming to the control element's goals.

Because it is difficult to measure highly-varied outputs, what happens in reality is that the variation in those outputs is often artificially constrained in order to make them easier to numerically quantify. System success, as a result, is slowly transformed from being about solving a problem to being about whether measured numbers resemble expected numbers.

10. The larger the system, the narrower and more specialized the interfaces between individual elements.
It is a fundamental principle of information theory that the integrity of a message is inversely proportional to the length of the communication channel through which the message travels. In other words, information degrades over distance.

11. Control of a system is exercised by the element with the greatest variety of behavioral responses.
Systems don't operate in a vacuum. They function in an environment, and the important thing to note about an environment is that it can change.

As with biological systems, which must adapt or die, so it is with other kinds of systems. The difficulty for human-created systems, however, is that their activity is meant to be directed by conscious intelligences. Because we tend to use hierarchical command structures, that means the activity of a whole system is supposed to be directed from the top down. But as systems become very large, the behavioral responses of both the highest and lowest elements are often constrained.

The director of an agency, for example, is the most visible to those outside persons concerned with the activities of that agency. So what that director can do is often limited due to time (shareholder meetings, appearances on Nightline, etc.) or political maneuvering (lobbying, etc.). And what the person on the shop floor is permitted to do is usually even more tightly constrained.

12. Loose systems last longer and work better.
Gall, in Systemantics, describes Charles Babbage's experience with his Difference Engine. In the early versions, the internal parts tended to bind up against one another, with the result that the entire system either did not work or broke from being pushed too hard.

13. Complex systems exhibit complex and unexpected behaviors.
Complex systems by definition are composed of many smaller parts. The interaction of these parts--since there are so many of them, and so many connections between them--reaches a point of incomprehensibility very rapidly. Thus the mutual action of the parts is likely to be unpredictable.

14. Colossal systems foster colossal errors.
From burying its rulers under simple stone structures called mastabas, the Egyptians progressed (if that is the word) to complex pyramids. As the pyramids got larger and larger, they became too tall for their bases. The pyramids fell down.

II. SYSTEMS AND INFORMATION

1. To those within a system the outside reality tends to disappear.
System components tend over time to become more and more specialized. As this continues, these elements begin to refuse to recognize any inputs that they consider improperly formatted. This formalization tends to result in the creation of "layers" of information between the elements of a system and the real (external) world.

2. Components of systems do not do what the system says they do.
Components are often referred to as if they performed the function for which the entire system was designed. For example, does someone who works in the aerospace industry build fighter jets? No. She may rivet wing assemblies, or manage a computer network, or perform cost accounting, but she is not a jet fighter builder.

3. The system itself does not do what it says it is doing.
Particularly in larger systems, stated goals are often not the actual goals. In small (that is, working) systems, before and just after their creation, goals usually aren't stated; it's enough that the system helps solve the problem it was created to address.

4. The system takes the credit (for any favorable outcome).
If an individual element of a system somehow causes a desired positive effect, the system as a whole will credit itself with having achieved that effect... even if that effect is only incidentally associated with the stated goal of the system. The reasoning (such as it is) goes something like this: Since the individual element was part of the overall system, and since individual elements are incapable of producing the stated goal of that system, therefore if anything even remotely like the goal occurs, the entire system must have produced it.

5. Perfection of planning is a symptom of system decay.
This is actually a phenomenon of two separate but related systems dysfunctions. First, when the belief that "once our plans are perfect, we cannot fail" infects a group's leadership, that group begins to fail. The trouble with this kind of thinking is that it assumes that the environment surrounding the system is static, that it will never change. Adapting to a changing reality is hard. It takes constant effort. So there's always pressure to take the easy route, to assume (just for the sake of discussion, of course) that the environment won't change between the time when planning starts and when the plan is complete.

6.
The information you have is not the information you want.
The information you want is not the information you need.
The information you need is not available.

III. SYSTEM DESIGN GUIDELINES: NUMBER

1. New systems mean new problems.
Systems are created to solve some problem. The trouble is, the moment a system is created it brings into being an entirely new set of problems related to the general behavior of all systems. Instead of there being just the one original problem, now there is the original problem plus the problems generated by the new system.

2. Systems should not be unnecessarily multiplied.
The lesser-known formulation of Occam's Razor. (The usual form is, "Given a choice between two systems, choose the simpler.")

3.
Do it without a new system if you can.
Do it with an existing system if you can.
Do it with a little system if you can.

4. Escalating the wrong solution will not improve the outcome.
Example: In the world of software development, projects sometimes wind up being delayed. Humans can make mistakes; requirements can change; resources can be denied--there are numerous events that can affect a plan for the worse.

5. If a problem seems unsolvable, consider that you may have a metaproblem.
In other words, it may be that the problem you're trying to solve is itself the result of some other, deeper problem. In such a case, figuring out the real problem may give you the necessary insight to solve the lesser problem, or even make its solution unnecessary.

IV. SYSTEM DESIGN GUIDELINES: SIZE

1. A simple system may or may not work.
Things can fail in an infinite number of ways, but they can work in only a few ways. In other words, the odds are against success.

2. A complex system that works is almost always found to have evolved from a simple system that worked.
Big systems built from scratch almost never work. You have to start over with a simple working system and grow it carefully. This might not work either, but it's your only hope.
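As a toy illustration of my own (not Gall's; the names and the check are invented), "grow it carefully" can look like this in code: each addition is bolted onto a known-working core and kept only if the whole still passes a check.

```python
# Toy sketch of incremental growth: add one feature at a time to a working core,
# and keep the addition only if the whole system still validates.

def validate(system: dict) -> bool:
    # Stand-in check; in practice this would be a real test suite.
    return all(callable(f) for f in system.values())


def grow(core: dict, features: dict) -> dict:
    system = dict(core)
    for name, impl in features.items():
        candidate = {**system, name: impl}
        if validate(candidate):
            system = candidate
        else:
            print(f"rejecting {name}: system no longer works")
    return system


working_core = {"store": lambda x: x}
proposed = {"search": lambda q: q, "report": "not implemented yet"}  # the second is broken
print(sorted(grow(working_core, proposed)))  # prints the rejection, then ['search', 'store']
```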

3. Big systems either work on their own, or they don't. If they don't, you can't make them. Pushing on the system won't help.
Big systems are like any other large mass: they have a lot of inertia. Once a system gets big, it becomes highly resistant to major change. If it's not working, tinkering is unlikely to make it work... but tinkering is all that anyone will be permitted to do to that system, because big systems always have constituencies whose functions depend on the system remaining as it is.

4. A large system produced by expanding the dimensions of a smaller system does not behave like the smaller system. It Kant.
The trouble with the idea of increasing the scope of a small system to make it do more is that--as Immanuel Kant among others pointed out--a sufficient change in degree can constitute a change in kind. To put it another way, enough of a change in the quantity of a thing can mean a change in a quality of that thing.

1. Systems develop goals of their own the instant they come into being.
The moment someone begins using a simple working system, they become part of that system... along with their goals for this new, larger system.

As the system grows further, becoming still more complex as more people use it, more goals are added. Some of these goals will be mutually exclusive; they can't all be fully satisfied. Eventually it becomes impossible to explain why the system as a whole does whatever it does. The system appears to take on goals of its own that are completely independent of whatever purpose the original system's designers intended.

2. The longer a system exists, the more its primary goal becomes self-preservation.
Constituencies again. The more people who are served by a particular system, the greater the resistance to changing that system... and, bizarrely, this is so whether that system works or not.

3. The longer a system exists, the greater its resistance to any fundamental change short of complete destruction.
Resistance to change seems to be a natural feature of all mature systems. At some point, the effort required to change becomes greater than the resources available to make that change. (When the system under consideration is a civilization, the historian Carroll Quigley's terms for this are an "instrument" which becomes an "institution." But these terms could be applied equally well to systems in general.)

When this occurs, the system's final insulation from its environment begins. Soon no reality will get through to the controlling elements at all. The closed positive feedback loop is complete. And the only way the original problem will be solved is by the creation of an entirely new, small, working system.

Hamish
404 reviews, 29 followers
July 5, 2021
I read this book because I wanted to learn more about Gall's Law, the statement that complex systems never work when constructed de novo, and so have to be iteratively built up from smaller systems which do work.

Unfortunately this is not an academic book as I thought, but rather a protracted Dilbert comic.

Despite the disappointment, it does have some good stuff in it.

Great introduction:
All over the world, in great metropolitan centers as well as in the remotest rural backwaters, in sophisticated electronics laboratories, and in dingy clerical offices, people everywhere are struggling with a Problem: Things Aren't Working Very Well.


I've modified this quote to make it punchier:
Whereas before there was only the problem to deal with, there is now - in addition - the solution.


"Systems tend to expand to fill the known universe."

Le Chatelier's principle: When a settled system is disturbed, it will adjust to diminish the change that has been made to it.
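A tiny numeric sketch of my own (not from the book) gives the flavor: disturb a settled value and a proportional corrective term pulls it back toward where it was.

```python
# Illustrative only: a proportional negative-feedback loop pulling a disturbed value
# back toward its settled point, in the spirit of Le Chatelier's principle.
setpoint = 100.0
state = setpoint + 20.0                 # external disturbance
for step in range(5):
    state -= 0.5 * (state - setpoint)   # the system pushes back against the change
    print(f"step {step}: {state:.1f}")
# step 0: 110.0 ... step 4: 100.6 -- the change is progressively diminished
```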

Relatable content:
Because Trillium has clearly stated his Goals and Objectives, it is now possible to deduce with rigorous logic how he should spend his waking and working hours in order to achieve them most efficiently. No more pottering around pursuing spontaneous impulses and temporary enthusiasms!


There's a bunch of proto-Hansonian stuff on page 32 about "kings are not really about ruling" and "academics are not really about research". The generalisation is "the system does not do what it says it is doing".

Gall's Law:
A complex system that works is invariably found to have evolved from a simple system that worked. [Also:] A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.


When you create a system:

New problems are created by its very existence.
Once set up, it won't go away; it grows and encroaches.
It begins to do strange and wonderful things.
It breaks down in ways you never thought possible.
It kicks back, gets in the way, and opposes its own proper function.
Your own perspective becomes distorted by being in the system.
You become anxious and push on it to make it work.
Eventually you come to believe that the misbegotten product it so grudgingly delivers is what you really wanted all the time.
At that point encroachment has become complete.
You have become absorbed. You are now a systems-person.


A systems-sophisticated manager will adopt "catalytic management": don't try to force the system to do things; try to unblock things it is already trying to do.

The work of change agents is made enormously more delicate and uncertain by the fact that the mere presence of a change agent (recognizable as such) has about the same effect on an organization as an efficiency expert seen strolling across the factory floor with stopwatch in hand: it promptly induces bizarre and unpredictable alterations in the behaviour of the system.


The real field which this book is spoofing is general systems theory.

There are lots of quotable lines:

"Solutions usually come from people who see in the problem only an interesting puzzle and whose qualifications would never satisfy a select committee."

"The system behaves as if it had the will to live."

"A complex system cannot be 'made' to work. It either works or it doesn't."

"For every human system, there is a type of person adapted to thrive on it or in it."

"If a system is working, leave it alone."

"A system continues to do its thing regardless of need."

"Compelx systems usually operate in failure mode."

"When a fail-safe system fails, it fails by failing to fail safe."

"Complex systems tend to produce complex responses (not solutions) to problems."
Artur
221 reviews
October 16, 2023
Some people treat this book like it should be a textbook or a rigorous whitepaper. That frame of reference is bound to bring grief and misunderstanding as the field of systemantics is so new and its subject is so complex and so all-encompassing that it will take humanity centuries more to reach the proper understanding of it on the level of rigor we expect from an established field of science, be it chemistry or physics.

Instead, I'd suggest treating this book as a collection of recipes and lore, empirical rules of thumb and critical observations that might or might not hold in the systems the reader encounters on a daily basis. A Hitchhiker's Guide to Systems, if you will. This frame of reference will, with a high degree of probability, bring the reader much more joy and allow them to perceive systems in a new light and not be amazed by their idiosyncrasies.

Now, let's move on to the actual review. I am a software engineer, which means that I tinker with systems for a living (all of us do to some degree, but software engineering being engineering makes you do so with a bit more self-consciousness than other professions). This book amazed me with the apparent truthfulness and laconic nature of its analysis of the behavior systems exhibit in the wild, and with the breadth of examples that really drove the point home. I'll give you just five of the book's axioms to give you a taste of it; if they ring true to your experience, you should definitely read it, as it explores much more in a similar way:

- A LARGE SYSTEM PRODUCED BY EXPANDING THE DIMENSIONS OF THE SMALLER SYSTEM DOES NOT BEHAVE LIKE THE SMALLER SYSTEM
- A SYSTEM IS NO BETTER THAN ITS SENSORY ORGANS
- THE SYSTEM CONTINUES TO DO ITS THING, REGARDLESS OF CIRCUMSTANCES
- GREAT ADVANCES DO NOT COME OUT OF SYSTEMS DESIGNED TO PRODUCE GREAT ADVANCES
- IN ORDER TO REMAIN UNCHANGED THE SYSTEM MUST CHANGE

As you can see, this applies not just to a computer system or an organisational system. There is something similar in all systems that protrudes prominently through the physical realities and pesky details. This something is the subject of Systemantics, and it might be worth your time.
