Thinking in Systems: A Primer

by Donella H. Meadows

Part 1: System Structure and Behavior

Ch 1: The Basics

More Than the Sum of Its Parts
  • A system is an interconnected set of elements that is coherently organized in a way that achieves something. It consists of three kinds of things: elements, interconnections, and a function or purpose.
  • There is an integrity or wholeness about a system and an active set of mechanisms to maintain that integrity.
  • The system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.
Look Beyond the Players to the Rules of the Game
  • The elements of a system are often the easiest parts to notice, because many of them are visible, tangible things. But intangibles can also be elements.
  • Many interconnections in a system are flows of information, or signals that go to decision or action points. Information holds systems together and greatly determines how they operate.
  • The purpose of a system is deduced from its behavior, not from rhetoric or stated goals. One important purpose of almost every system is to ensure its own perpetuation.
  • Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
  • A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements – as long as its interconnections and purposes remain intact.
  • The elements are often the least important in defining the unique characteristics of the system, unless changing an element also changes relationships or purpose.
Bathtubs 101 – Understanding System Behavior over Time
  • Stocks are the elements of a system that you can see, feel, count, or measure. They are a store, quantity, or accumulation of material or information that has built up over time.
  • Stocks change over time through the actions of flows. A stock is therefore a memory of the history of changing flows within the system.
  • Understanding the dynamics of stocks and flows reveals much about a complex system's behavior. In a state of dynamic equilibrium, a stock does not change.
  • We focus more on inflows than outflows, but a stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate, as the sketch after this list shows.
  • A stock takes time to change, even when the flows into or out of it change suddenly. Stocks therefore act as delays, buffers, or shock absorbers in a system.
  • The time lags that come from slowly changing stocks can cause problems, but also provide stability. They allow room to maneuver, to experiment, and to revise policies that aren't working.
  • Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with one another.
  • Most individual and institutional decisions are designed to regulate the levels of stocks. System thinkers see the world as stocks with the mechanisms for regulating their levels by manipulating flows.
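
A minimal sketch of the bathtub idea, in Python (my illustration, not the book's; the numbers are arbitrary):

```python
# A stock is a bathtub: it integrates (inflow - outflow) over time.
def simulate_stock(stock, inflow, outflow, steps=10, dt=1.0):
    for _ in range(steps):
        stock += (inflow - outflow) * dt   # the stock accumulates the net flow
    return stock

# Dynamic equilibrium: inflow equals outflow, so the stock does not change.
print(simulate_stock(stock=50, inflow=5, outflow=5))   # 50.0
# A stock also rises if you decrease its outflow, not only if you raise its inflow.
print(simulate_stock(stock=50, inflow=5, outflow=3))   # 70.0
```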
How the System Runs Itself – Feedback
  • Feedback loops permit behaviors that persist over time. Such a loop is formed when a change in a stock affects the flows into or out of that same stock.
  • Feedback loops can cause stocks to maintain their levels within a range or grow or decline.
Stabilizing Loops – Balancing Feedback
  • Balancing feedback loops are goal-seeking or stability-seeking. They keep a stock at a given value or range of values. They oppose whatever change is imposed on the system.
Runaway Loops – Reinforcing Feedback
  • A reinforcing feedback loop enhances whatever direction of change is imposed on it, thereby creating a virtuous or vicious cycle of healthy growth or runaway destruction.
  • Reinforcing loops are found wherever a system element can reproduce itself or grow as a constant fraction of itself. They lead to exponential growth or to runaway collapse over time (see the sketch below).
  • Ask yourself: "If A causes B, is it possible that B also causes A?" Feedback opens up the idea that a system can cause its own behavior.
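
A tiny sketch contrasting the two loop types (the gains are illustrative, not from the book):

```python
# Reinforcing loop: the flow is a constant fraction of the stock itself.
stock = 100.0
for _ in range(10):
    stock *= 1.10                       # +10% of itself per step
print(round(stock, 1))                  # ~259.4: exponential growth compounds

# Balancing loop: the flow closes part of the gap between the stock and a goal.
stock, goal = 100.0, 50.0
for _ in range(10):
    stock += 0.3 * (goal - stock)       # correct 30% of the discrepancy per step
print(round(stock, 1))                  # ~51.4: converging on the goal
```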

Ch 2: A Brief Visit to the Systems Zoo

One-Stock Systems

A Stock with Two Competing Balancing Loops – a Thermostat

  • A furnace-and-thermostat loop by itself cannot hold a room at its target temperature, because heat leaks to the outside. The leak drains away heat even as the furnace is getting the signal to put it back.
  • The information delivered by a feedback loop can only affect future behavior; it can't deliver a signal fast enough to correct behavior that drove the current feedback.
  • This means that a flow can't react instantly to a flow. It can only react to a change in a stock, and only after a slight delay while the incoming information registers.
  • A stock-maintaining balancing feedback loop must have its goal set to compensate for the draining or inflowing processes that affect the stock. Otherwise the stock will fall short of or overshoot the target (see the sketch below).
  • Every feedback loop has its breakdown point, where other loops pull the stock away from its goal more strongly than it can pull back.
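
A sketch of the two competing balancing loops (the coefficients are illustrative assumptions):

```python
# Room temperature is pulled up toward the thermostat setting by the furnace
# loop and down toward the outside temperature by the leak loop.
def settle(target, outside=10.0, gain=0.3, leak=0.2, hours=48):
    temp = outside
    for _ in range(hours):
        furnace = gain * max(target - temp, 0.0)   # furnace reacts to the gap
        loss = leak * (temp - outside)             # heat leaks out continuously
        temp += furnace - loss
    return round(temp, 1)

print(settle(target=20.0))   # ~16.0: the room settles below the target
print(settle(target=26.7))   # ~20.0: a goal set high enough compensates for the leak
```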

A Stock with One Reinforcing Loop and One Balancing Loop – Population and Industrial Economy

  • Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine its behavior.
  • Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate the behavior.
  • To test the value of a model, ask: Are the driving factors likely to unfold this way? If they did, would the system react this way? What is driving the driving factors?
  • Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.
  • Population has a reinforcing loop of births and a balancing loop of deaths. Capital has a reinforcing loop of investment of output and a balancing loop of depreciation (see the sketch below).
  • A central question of economic development is how to keep the reinforcing loop of capital accumulation growing faster than the reinforcing loop of population growth, so that people grow richer instead of poorer.
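
A sketch of one stock governed by both loop types (the rates are illustrative, not demographic data):

```python
# Births form a reinforcing loop, deaths a balancing loop; whichever is
# stronger dominates, and behavior flips if their relative strengths shift.
def project(pop, birth_rate, death_rate, years=50):
    for _ in range(years):
        pop += birth_rate * pop - death_rate * pop
    return round(pop)

print(project(1000, birth_rate=0.03, death_rate=0.02))  # births dominate: exponential growth
print(project(1000, birth_rate=0.02, death_rate=0.03))  # deaths dominate: exponential decline
```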

A System with Delays – Business Inventory

  • A delay in a balancing feedback loop makes a system likely to oscillate.
  • Watching a stock change while waiting for a delayed flow to respond can lead us to overreact. If the flow then changes more than expected, we overreact again and push the flow too far in the other direction (see the sketch after this list).
  • Delays are strong determinants of behavior, and so we can't begin to understand the dynamic behavior of systems unless we know where and how long the delays are.
  • Economies are extremely complex systems; they are full of balancing feedback loops with delays, and they are inherently oscillatory.
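
A sketch of a balancing loop with a delivery delay (all parameters are illustrative assumptions):

```python
from collections import deque

# A manager orders to close the inventory gap, but orders take 5 days to
# arrive, so the correction overshoots and the system oscillates.
target = 1000.0
inventory = 1000.0
pipeline = deque([100.0] * 5)    # orders already in transit (5-day delay)
sales = 110.0                    # demand has just stepped up 10%

for day in range(1, 41):
    inventory += pipeline.popleft() - sales                        # deliveries in, sales out
    pipeline.append(max(sales + (target - inventory) / 4.0, 0.0))  # order to close the gap
    if day % 5 == 0:
        print(day, round(inventory))   # dips, overshoots the target, rings back down
```
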
Two-Stock Systems

A Renewable Stock Constrained by a Nonrenewable Stock – an Oil Company

  • Any real physical entity is always exchanging things with its environment, and so any physical, growing system is going to eventually run into some sort of constraint.
  • This constraint takes the form of a balancing loop that shifts the dominance of the reinforcing loop driving the growth, either by strengthening the outflow or weakening the inflow.
  • Growth in a constrained environment is called the "limits-to-growth" archetype.
  • The more capital, the higher the extraction rate, the lower the resource stock, the lower the yield per unit of capital, the lower the profit, the lower the investment rate, and so the lower the growth rate of capital.
  • When you're building a capital stock dependent on a nonrenewable resource, the higher and faster you grow, the farther and faster you fall (see the sketch below).
  • The real choice in the management of a nonrenewable resource is whether to get rich very fast or to get less rich but stay that way longer.
  • Raising the price of the resource gives the industry higher profits, so investment goes up, capital stock continues rising, and the more costly resources can be extracted.
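
A sketch of the boom-and-bust structure (the parameters are illustrative assumptions):

```python
# Capital grows by reinvesting extraction profits, but yield per unit of
# capital falls as the nonrenewable resource depletes.
def boom_and_bust(reinvest, years=80):
    resource, capital = 1000.0, 5.0
    peak_year, peak = 0, 0.0
    for year in range(years):
        yield_per_cap = resource / 1000.0                    # scarcity cuts the yield
        extraction = min(capital * yield_per_cap, resource)
        resource -= extraction
        capital += reinvest * extraction - 0.05 * capital    # investment minus depreciation
        if capital > peak:
            peak_year, peak = year, capital
    return peak_year, round(peak), round(capital)            # peak year, peak size, final size

print(boom_and_bust(reinvest=0.10))   # slower growth: a lower, later peak and a gentler fall
print(boom_and_bust(reinvest=0.20))   # grows higher and faster, and falls farther and faster
```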

Renewable Stock Constrained by a Renewable Stock – a Fishing Economy

  • A simplified model is affected by three non-linear relationships: price (scarcer fish cost more), regeneration rate (scarcer fish breed less), and yield per unit of capital (efficiency of fishing).
  • Nonrenewable resources are stock-limited: The entire stock is available at once and can be extracted at any rate, and so the faster you extract, the shorter its lifetime.
  • Renewable resources are flow-limited: The stock supports indefinite extraction at a rate equal to its regeneration rate, or else the resource may be driven below a threshold and become non-renewable.
  • Increasingly, advances in technology and harvest efficiency can drive resource populations to extinction.
  • If the balancing feedback loop is weak, so that capital grows even as the resource dips below its threshold ability to regenerate itself, the resource and the industry both collapse (see the sketch below).
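
A sketch of the fishery structure (logistic regeneration is a simplification of mine; all numbers are illustrative):

```python
# Fish regenerate, so a harvest at or below the regeneration rate can continue
# indefinitely; too much fishing capital overwhelms the balancing loop.
def fishery(boats, years=200):
    fish = 1000.0
    for _ in range(years):
        regrowth = 0.2 * fish * (1.0 - fish / 1000.0)   # regeneration slows near capacity and near zero
        catch = 0.4 * boats * (fish / 1000.0)           # scarcer fish are harder to catch
        fish = max(fish + regrowth - catch, 0.0)
    return round(fish)

print(fishery(boats=100))   # settles at a sustainable equilibrium (~800)
print(fishery(boats=600))   # catch outruns regeneration at every stock level: collapse
```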

Part 2: Systems and Us

Ch 3: Why Systems Work So Well

  • When a system works well, chances are good that you are observing one of three characteristics: resilience, self-organization, or hierarchy.
Resilience
  • Resilience is a measure of a system's ability to survive and persist within a variable environment. The opposite is brittleness or rigidity.
  • Resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation.
  • Resilience can be very hard to see, unless you exceed its limits, overwhelm and damage the balancing loops, and the system structure breaks down.
  • Think of resilience as a plateau upon which the system can play, performing its normal functions in safety.
Self-Organization
  • The capacity of a system to make its own structure more complex is called self-organization.
  • If we weren't so blind to the property of self-organization, we'd do a better job of encouraging rather than destroying the self-organizing capacities of systems of which we're a part.
  • Self-organization produces heterogeneity and unpredictability, yielding whole new structures or ways of doing things. It requires freedom of experimentation and a certain amount of disorder.
  • Out of simple rules of self-organization can grow enormous, diversifying crystals of technology, physical structures, organizations, and cultures.
Hierarchy
  • Complex systems can evolve from simple systems only if there are stable intermediate forms. The resulting complex forms will be naturally hierarchic.
  • Hierarchies not only give a system stability and resilience, but they reduce the amount of information that any part of the system has to keep track of.
  • Hierarchies evolve from the lowest level up, and the original purpose of a hierarchy is always to help its originating subsystems do their jobs better.
  • When a subsystem's goals dominate at the expense of the total system's goals, the resulting behavior is called sub-optimization. But just as damaging is too much central control.
  • To be a highly functional system, hierarchy must balance the welfare, freedoms, and responsibilities of the subsystems and the total system. It must balance autonomy and central control.

Ch 4: Why Systems Surprise Us

  • Everything we think we know about the world is a model. Our models have a strong congruence with the world, but fall far short of representing the real world fully.
Beguiling Events
  • It's engrossing to see the world as a series of events, and constantly surprising, because that way of seeing the world has almost no predictive or explanatory value.
  • We are less likely to be surprised if we can see how events accumulate into dynamic patterns of behavior.
  • System structure, or the stocks and flows and feedback loops, reveals itself as a series of events over time. It's the key to understanding not just what is happening, but why.
  • All systems surprise us because we focus too little on their history, and we are insufficiently skilled at seeing in this history clues to the structures from which behavior and events flow.
Linear Minds in a Nonlinear World
  • We often are not very skilled in understanding the nature of relationships, especially nonlinear ones, where a cause does not produce a proportional effect.
  • Nonlinearities foil the reasonable expectation that if a little of some cure did a little good, then a lot of it will do a lot of good. Similarly for destructive actions and harm.
  • Nonlinearities change the relative strengths of feedback loops, flipping a system from one mode of behavior to another. They are the chief cause of shifting dominance.
Nonexistent Boundaries
  • The greatest complexities of a system arise at its boundaries. Sources and sinks are stocks that we ignore for the purpose of simplifying a system.
  • Disorderly, mixed-up borders are sources of diversity and creativity.
  • We have to invent boundaries for clarity and sanity, and boundaries can produce problems when we forget that we've artificially created them.
  • There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion – the questions we want to ask.
  • We are attached to the boundaries that our minds happen to be accustomed to, but boundaries can and should be reconsidered for each new discussion, problem, or purpose.
Layers of Limits
  • We live in a world where multiple inputs come together to produce multiple outputs, and virtually all of the inputs, and therefore all of the outputs, are limited.
  • At any given time, the input that is the most important to a system is the one that is the most limiting (a short sketch follows this list).
  • Whenever one factor ceases to be limiting, growth occurs, which itself changes the relative scarcity of factors until another becomes limiting.
  • To shift attention from the abundant factors to the next potential limiting factor is to gain real understanding of, and control over, the growth process.
  • Any physical entity with multiple inputs and outputs is surrounded by layers of limits. The choice is not to grow forever, but to decide what limits to live within.
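
The most-limiting-input rule fits in a few lines (the inputs and requirements are made up for illustration):

```python
# Liebig's law of the minimum: output is set by the scarcest input,
# not by the sum of inputs.
def potential_output(inputs):
    # each entry: (amount available, amount required per unit of output)
    return min(amount / need for amount, need in inputs)

crop = [(120.0, 1.0),    # water: enough for 120 units
        (80.0, 2.0),     # nitrogen: enough for only 40 units  <- limiting factor
        (500.0, 1.0)]    # sunlight: enough for 500 units
print(potential_output(crop))   # 40.0: adding water or sunlight changes nothing
crop[1] = (400.0, 2.0)          # relieve the nitrogen limit...
print(potential_output(crop))   # 120.0: ...and water becomes the limiting factor
```
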
Ubiquitous Delays
  • Delays are often sensitive leverage points for policy, if they can be made shorter or longer.
  • Delays determine how fast systems can react, how accurately they hit their targets, and how timely the information passed around is. Overshoots, oscillations, and collapses are all caused by delays.
  • When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve a problem.
Bounded Rationality
  • Bounded rationality means people make reasonable decisions based on the information they have, but they don't have perfect information, especially about distant parts of the system.
  • Instead of finding a long-term optimum, we are blundering "satisficers," discovering within our limited purview a choice we can live with for now, and sticking to it until we are forced to change.
  • We don't make decisions that optimize our individual good, much less the good of the system as a whole. We imperfectly interpret our imperfect information.
  • Seeing how individual decisions are rational within the bounds of the information available does not excuse narrow-minded behavior, but it illuminates why that behavior arises.
  • If the bounded rationality of each actor does not lead to decisions that further the welfare of the system, we must redesign the system to improve the information, incentives, disincentives, goals, stresses, and constraints that affect each actor.

Ch 5: System Traps... and Opportunities

  • We call the system structures that produce common patterns of problematic behavior archetypes.
Policy Resistance – Fixes that Fail
  • Policy resistance arises when balancing loops keep undesirable behaviors in place despite efforts to invent technological or policy fixes.
  • Policy resistance comes from the bounded rationalities of the actors in a system, where each actor has goals that are inconsistent with the goals of others.
  • In such a system, each actor has to put great effort into keeping the system where no one wants it to be, because if any actor lets up, the others will drag the system closer to their own goals.
  • The way out: Find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality.
The Tragedy of the Commons
  • For a system to be subject to tragedy, the resource must not only be limited, but erodable. Beyond some threshold it cannot regenerate itself and is more likely to be destroyed.
  • A commons system also needs users of the resource, who have good reason to increase their use, and who increase it at a rate that is not influenced by the condition of the commons.
  • The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
  • The structure of a commons system makes selfish behavior much more convenient and profitable than behavior that is responsible to the whole community and to the future.
  • The way out: Educate and exhort, privatize the commons (divide it up so that each person reaps the consequences of his or her own actions), or regulate the commons (enforced by policing and penalties).
Drift to Low Performance
  • Drift to low performance happens when a system not only resists policy and remains in a bad state, but keeps getting worse.
  • The actor believes bad news more than good news, and so the actor thinks things are worse than they are. Standards aren't absolute, and so as perceived performance slips, the goal is allowed to slip.
  • The lower the perceived system state, the lower the desired state, the less the discrepancy, the less the corrective action, and so the lower the system state (see the sketch below).
  • The way out: Keep standards absolute, regardless of performance. And make goals sensitive to the best performances of the past, instead of the worst.
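
A sketch of the eroding-goals structure (the weights are illustrative assumptions):

```python
# The goal slips toward (pessimistically) perceived performance, so the state
# and the standard ratchet downward together.
state, goal = 100.0, 100.0
for year in range(20):
    perceived = 0.9 * state              # bad news believed: things look worse than they are
    goal += 0.5 * (perceived - goal)     # the standard drifts toward perception
    state += 0.2 * (goal - state)        # effort only closes the gap to the sliding goal
print(round(state, 1), round(goal, 1))   # both drift steadily downward

# The way out: hold the standard absolute, and the state holds as well.
state, goal = 100.0, 100.0
for year in range(20):
    state += 0.2 * (goal - state)        # goal held fixed
print(round(state, 1), round(goal, 1))   # 100.0 100.0
```
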
Escalation
  • Escalation comes from a reinforcing loop set up by competing actors trying to get ahead of each other.
  • If nothing is done to break the loop, the process usually ends with one or both of the competitors breaking down.
  • The way out: Unilateral disarmament, thereby refusing to compete and interrupting the reinforcing loop. Or negotiate a new system with balancing loops to control the escalation.
Success to the Successful – Competitive Exclusion
  • This is a reinforcing feedback loop where winners in a competition receive, in the reward, a means to compete even more effectively in the future. And so winners go on winning, while losers go on losing.
  • If everything the winner wins is extracted from the losers, then the losers are gradually bankrupted, forced out, or starved (see the sketch below).
  • The "competitive exclusion principle" says given two species competing for the same resource, one will reproduce faster or use the resource more efficiently, and so it reproduces faster and drives the other to extinction.
  • One way out is diversification (or exploiting new resources), but this is not a strategy for the poor when the monopolizing firm can crush or buy up all offshoots.
  • Other ways out: Add a feedback loop to stop any competitor from taking over entirely, or by periodically "leveling the playing field."
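
A sketch of the winner-take-all loop (the shares and gain are illustrative):

```python
# The leader's winnings are extracted from the loser and enlarge the lead,
# so the lead compounds until the loser is forced out.
a, b = 52.0, 48.0                    # nearly equal starting shares of 100
for _ in range(25):
    transfer = 0.1 * (a - b)         # reward is proportional to the current lead
    a = min(a + transfer, 100.0)     # the winner gains exactly what...
    b = max(b - transfer, 0.0)       # ...the loser pays
print(round(a, 1), round(b, 1))      # 100.0 0.0: the winner has taken it all
```
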
Shifting the Burden to the Intervenor – Addiction
  • An intervenor may step in to bring a system to a desirable state, but the original problem will reappear, since no one has fixed the underlying feedback process that is failing to maintain the state of the system.
  • The trap is formed if an intervention undermines the capacity of the system to maintain itself. Increased intervention only weakens the capacity of the original system, creating a vicious cycle.
  • Addiction is finding a quick and dirty solution to the symptom of the problem, which distracts one from the harder and longer-term task of solving the real problem.
  • The way out: Intervene in a way that strengthens the system's ability to shoulder its own burdens, which can be cheaper and easier than taking over and running the system. Then remove yourself.
Rule Beating
  • Rule beating is evasive action to get around the intent of a system's rules – abiding by the letter, and not the spirit, of the law.
  • Rule beating becomes a problem when it leads a system into large distortions, unnatural behaviors that would make no sense at all in the absence of the rules.
  • Rule beating is usually a response of the lower levels in a hierarchy to over-rigid, unworkable, or ill-defined rules from above.
  • The way out: Better explain the rules, or re-design the rules to release creativity not in the direction of beating the rules, but in the direction of achieving their purpose.
Seeking the Wrong Goal
  • If the goal of a system is designed badly, if it doesn't measure what it's supposed to measure, if it doesn't reflect the real welfare of the system, then the system can't produce a desirable result.
  • Confusing effort with results is one of the most common mistakes in designing systems around a wrong goal.
  • You have the problem of wrong goals when something stupid happens "because it's the rule." You have the problem of rule beating when something stupid happens as a way around the rule.
  • The way out: Specify indicators and goals that reflect the real welfare of the system.

Part 3: Creating Change – in Systems and in Our Philosophy

Ch 6: Leverage Points – Places to Intervene in a System

  • Leverage points are frequently not intuitive because as systems become more complex, their behavior becomes more surprising.
  • In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly, letting go and dancing with the system.
12. Numbers – Constants and parameters such as subsidies, taxes, standards
  • Different hands on the faucets may change the rate at which they turn, but they're still the same faucets plumbed into the same system, turned according to the same information and goals and rules.
  • Parameters become leverage points when they go into ranges that kick off one of the higher items on this list.
11. Buffers - The sizes of stabilizing stocks relative to their flows
  • You can often stabilize a system by increasing a buffer. But if a buffer is too big, the system gets inflexible, and it reacts too slowly.
10. Stock-and-Flow Structures – Physical systems and their nodes of intersection
  • After a structure is built, the leverage is in understanding its limitations and bottlenecks, using it with maximum efficiency, and refraining from fluctuations or expansions that strain its capacity.
9. Delays – The lengths of time relative to the rates of system changes
  • Delays that are too short cause over-reaction, oscillations amplified by the jumpiness of the response. Delays that are too long cause damped, sustained, or exploding oscillations, depending on how much too long.
  • Delay length would be a high leverage point, except for the fact that delays are not often easily changeable.
8. Balancing Feedback Loops – The strength of the feedbacks relative to the impacts they are trying to correct
  • The strength of a balancing feedback loop is important relative to the impact it is trying to correct. If the impact increases in strength, so must the feedbacks.
7. Reinforcing Feedback Loops – The strength of the gain of driving loops
  • A system with an unchecked reinforcing loop will ultimately destroy itself.
  • Slowing down the growth of the reinforcing loop is usually a more powerful leverage point than strengthening balancing loops.
6. Information Flows – The structure of who does and does not have access to information
  • Given missing information flows, adding or restoring information is usually much easier and cheaper than rebuilding physical infrastructure.
  • There are so many missing feedback loops because we tend to avoid accountability for our own decisions.
5. Rules – Incentives, punishments, constraints
  • The rules of a system determine its scope, its boundaries, its degrees of freedom.
  • Power over the rules is real power. If you want to understand the deepest malfunctions of systems, pay attention to the rules and who has power over them.
4. Self-Organization – The power to add, change, or evolve system structure
  • Self-organization means changing any aspect of a system lower on this list – adding new balancing or reinforcing loops, or new rules. It is the strongest form of system resilience.
  • It is basically a matter of an evolutionary raw material – a highly variable stock of information – and a means for experimentation, for selecting and testing new patterns.
  • This intervention is unpopular, as encouraging variability and experimentation and diversity means "losing control."
3. Goals – The purpose or function of the system
  • Even people within systems don't often recognize what whole-system goal they are serving.
  • Changing the person at the top of a system can change its goals and therefore radically change its behavior.
2. Paradigms – The mind-set out of which the system (its goals, structures, rules, delays, parameters) arises
  • The shared idea in the minds of society, the great big unstated assumptions, constitute that society's paradigm, or deepest set of beliefs about how the world works.
  • Paradigms are the sources of systems. From them, from shared social agreements about the nature of reality, come system goals and information flows, feedbacks, stocks, flows, and everything else.
  • There's nothing physical or expensive or slow in the process of paradigm change. In a single individual it can happen in a millisecond.
1. Transcending Paradigms
  • Every paradigm, including the ones you hold dear, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension.
  • If you think no paradigm is right, you can choose whatever one will help to achieve your purpose.

Ch 7: Living in a World of Systems

  • Seeing the future exactly and preparing for it perfectly is unrealizable. Making a complex system do just what you want it to do can be achieved only temporarily, at best.
  • For any objective other than the most trivial, we can't optimize. We don't even know what to optimize.
  • But we can listen to what the system tells us, and discover how its properties and our values can work together to create something much better than could ever be produced by our will alone.
  • Living successfully in a world of systems requires our full humanity – our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
Get the Beat of the System
  • Starting with the behavior of a system forces you to focus on facts, not theories, and keeps you from falling too quickly into your own beliefs or misconceptions.
  • Moreover, starting with the history of a system discourages us from defining the problem by the lack of our favorite solution rather than by the system's actual behavior.
Expose Your Mental Models to the Light of Day
  • Everything you know, and everything everyone knows, is only a mental model.
  • Exposing your mental models, making them rigorous, testing them against the evidence, and scuttling them if they are no longer supported is simply practicing the scientific method.
Honor, Respect, and Distribute Information
  • Most of what goes wrong in a system is from biased, late, or missing information. So do not distort, delay, or withhold information.
Use Language with Care and Enrich It with Systems Concepts
  • The language and information systems of an organization are not an objective means of describing an outside reality – they structure the perceptions and actions of its members.
  • Keep language as concrete, meaningful, and truthful as possible. And then enlarge language to make it consistent with our enlarged understanding of systems.
Pay Attention to What Is Important, Not Just What Is Quantifiable
  • Our culture is obsessed with numbers, and has therefore given us the idea that what we can measure is more important than what we can't measure.
  • We have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector.
Make Feedback Policies for Feedback Systems
  • The best policies contain meta-feedback loops – or loops that alter, correct, and expand loops. They design learning into the management process.
Go for the Good of the Whole
  • Don't maximize parts of the system while ignoring the whole. Don't go to great trouble to optimize something that never should be done at all.
Listen to the Wisdom of the System
  • Before you charge in to make things better, pay attention to the value of what's already there. Don't destroy the system's own self-maintenance policies.
Locate Responsibility in the System
  • This means looking for the ways the system creates its own behavior.
  • "Intrinsic responsibility" means that the system is designed to send feedback about the consequences of decision making directly, quickly, and compellingly to decision makers.
Stay Humble – Stay a Learner
  • When you're learning take small steps, constantly monitor, and change course as you find out more about where it's leading.
  • Error-embracing is the condition for learning. It means seeking and using – and sharing – information about what went wrong with what you expected or hoped would go right.
Expand Time Horizons
  • In a systems sense, there is no long-term or short-term distinction. Phenomena at different time scales are nested within each other.
Defy the Discipline
  • Interdisciplinary communication works only if there is a real problem to be solved, and if its representatives are more committed to solving the problem than being academically correct.
Don't Erode the Goal of Goodness
  • The most damaging example of the systems archetype "drift to low performance" is the process by which modern industrial culture has eroded the goal of morality.
  • Don't weigh the bad news more heavily than the good. And keep standards absolute.