November 16, 2019
A stimulus to a system does not change the system or cause it to behave in a particular way. It merely triggers a response. The structure of the system determines how the system will react.
Many notions derived from systems analysis have corollaries in common proverbs.
Systems thinking isn't always the best way to see the world. It merely provides another lens that helps you see things more clearly.
The behavior of a system cannot be known by studying only its elements.
A system cannot exist without a coherent organization and a purpose.
"You think because you understand 'one' therefore you must understand 'two' because one and one makes two. But you forget that you must also understand 'and'." -- Sufi teaching story
The elements of most systems can be broken down endlessly, which does little to help in understanding the workings of the system. Therefore, you should start identifying interconnections as soon as possible.
Interconnections often come in the form of information flows.
Purpose comes from the behavior of the system, not from its rhetoric or stated goals.
Complex systems develop multiple sub-processes that can aid or thwart the overarching purpose. Efficient systems optimize for a single purpose. Inefficient systems develop competing objectives that slow down progress.
Stocks are the tangible elements of a system.
Flows are the mechanisms that fill or drain stocks. The balance between the flows of a system will determine the behavior and level of the stock.
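The stock-and-flow relationship is easy to see in a minimal simulation. This is a hypothetical sketch; the rates and numbers are invented for illustration:

```python
def simulate_stock(initial, inflow, outflow, steps):
    """Track a stock over time given constant inflow and outflow rates."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow  # the net flow determines the stock's behavior
        history.append(stock)
    return history

# Inflow exceeds outflow, so the stock rises steadily.
rising = simulate_stock(initial=50, inflow=5, outflow=3, steps=10)
# Equal inflow and outflow: the stock holds level (a dynamic equilibrium).
steady = simulate_stock(initial=50, inflow=4, outflow=4, steps=10)
```

Lowering the outflow grows the stock just as surely as raising the inflow does.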
All system representations are simplifications of the real world.
A dynamic equilibrium occurs when at least one inflow and at least one outflow is present, but the level of the stock doesn't change.
People quickly realize that increasing the inflow will grow the stock. But they miss that decreasing the outflow, while maintaining the inflow, will also increase the stock.
A change in the flow dynamics could take a long time to tangibly affect the stock.
Stocks act as buffers in the system, allowing temporary decoupling of the inflows and outflows.
Feedback occurs in a system when the level of the stock can affect one of its flows. This process allows the system to self-regulate.
A balancing feedback loop keeps the stock within a range or tries to reach an equilibrium set by a factor external to the flow. An example of reaching an equilibrium is a cup of hot coffee cooling off as it sits in a cold room.
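The cooling coffee can be sketched as a balancing loop in a few lines; the cooling rate here is an arbitrary made-up constant:

```python
def cool(temp, room, rate=0.1, steps=60):
    """Balancing loop: heat flows out in proportion to the gap between
    the coffee and the room, so the flow shrinks as the gap closes."""
    for _ in range(steps):
        temp += rate * (room - temp)
    return temp

final = cool(temp=90.0, room=20.0)  # ends just above room temperature
```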
Self-reinforcing loops occur when the amount of stock enhances the rate of flow. This leads to exponential growth or runaway collapse.
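A reinforcing loop is one line of arithmetic: the stock feeds its own flow. The rates below are invented for illustration:

```python
def reinforce(stock, rate, steps):
    """Reinforcing loop: the flow is proportional to the stock itself."""
    history = [stock]
    for _ in range(steps):
        stock += rate * stock  # bigger stock -> bigger flow -> exponential
        history.append(stock)
    return history

growth = reinforce(100.0, 0.05, 20)     # compounding growth
collapse = reinforce(100.0, -0.05, 20)  # runaway decline
```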
Besides trivial examples, almost every system has a form of a feedback loop.
Complex systems are built up from similar simple patterns. Understanding each pattern helps in understanding a more complicated system.
A two-balancing-loop system occurs when two loops try to drag the system in opposing directions. A good example is a thermostat trying to keep a house at a certain temperature while the cold outside air drains the house of heat.
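A rough sketch of the thermostat's two competing balancing loops; the gain and leak constants are invented for illustration:

```python
def thermostat(temp, target, outside, heat_gain=0.5, leak=0.1, steps=100):
    """Two balancing loops pulling in opposite directions: the furnace
    closes the gap to the target, while heat leaks out in proportion
    to the indoor/outdoor gap."""
    for _ in range(steps):
        furnace = heat_gain * max(0.0, target - temp)  # heating loop
        loss = leak * (temp - outside)                 # heat-leak loop
        temp += furnace - loss
    return temp

indoor = thermostat(temp=10.0, target=20.0, outside=-5.0)
```

Notably, the house settles a few degrees below the target: the heating loop can only react to a gap that already exists.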
Feedback loops have an inherent delay, so they can only affect the future value of the stock. This mechanism prevents a loop in a competing system from reacting quickly enough to reach its desired set point precisely.
A system with a competing reinforcing loop and balancing loop can only toggle between the two. When the reinforcing loop dominates the system, the stock will grow or fall exponentially. When the balancing loop dominates, the stock will grow or fall linearly. Population represents this type of system, with the birth rate as the reinforcing loop and the death rate as the balancing loop.
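The population example reduces to two competing proportional flows; the rates are invented for illustration:

```python
def population(pop, birth_rate, death_rate, years):
    """Births form the reinforcing loop, deaths the balancing loop;
    whichever rate dominates sets the direction of the stock."""
    for _ in range(years):
        pop += birth_rate * pop - death_rate * pop
    return pop

growing = population(1000.0, birth_rate=0.03, death_rate=0.01, years=50)
shrinking = population(1000.0, birth_rate=0.01, death_rate=0.03, years=50)
```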
Test the value of a model by asking:
Most often, the driving forces of a system will have other forces driving them. Even systems with vastly different exteriors will behave similarly if they are driven by the same feedback loops. Population and Industrial Economy provide an example of this phenomenon.
Many real world systems have further external delays in their feedback loops. If these are not managed properly the stock oscillates out of control.
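A sketch of what a delay does to a balancing loop: the controller below acts on a reading several steps old, and the stock oscillates around the target with growing amplitude. All constants are invented for illustration:

```python
from collections import deque

def delayed_thermostat(temp, target, delay, gain=0.5, steps=40):
    """A balancing loop acting on stale information: each correction
    is based on the temperature as it was `delay` steps ago."""
    readings = deque([temp] * delay, maxlen=delay)
    history = [temp]
    for _ in range(steps):
        perceived = readings[0]  # oldest (stale) reading
        readings.append(temp)
        temp += gain * (target - perceived)
        history.append(temp)
    return history

trace = delayed_thermostat(temp=10.0, target=20.0, delay=4)
# The trace overshoots far above and below the target of 20.
```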
Controlling any single delay can have profound effects on the entire system, sometimes in counterintuitive ways.
In a physical exponentially growing system, there will always be one reinforcing loop driving the growth and at least one balancing loop constraining the growth. This ensures that no system grows forever.
In a two-stock system, as the resource becomes more scarce, it becomes harder to obtain and therefore rises in price.
Two-stock systems can depend on two types of resources: non-renewable and renewable.
Systems that depend on a non-renewable resource will eventually exhaust the stock and, with it, the system. The only thing that can be controlled in this type of system is how long the stock will last.
The amount of the resource makes little difference if the growth is exponential.
To control how long the stock lasts, the extraction rate must be closely controlled. The faster the extraction rate grows and the higher it gets, the more capital the stock will produce, but the shorter its life will be.
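The trade-off can be seen by counting the steps until a reserve runs out under flat versus compounding extraction; the numbers are invented for illustration:

```python
def years_until_depleted(reserve, extraction, growth):
    """Steps until a non-renewable stock is exhausted, with an
    extraction rate that compounds by `growth` each step."""
    years = 0
    while reserve > 0:
        reserve -= extraction
        extraction *= 1 + growth
        years += 1
    return years

flat = years_until_depleted(reserve=1000.0, extraction=10.0, growth=0.00)
compounding = years_until_depleted(reserve=1000.0, extraction=10.0, growth=0.07)
```

Flat extraction lasts 100 steps; at 7% growth the same reserve is gone in about a third of the time.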
Systems that depend on renewable resources can provide resources indefinitely, but require a delicate balancing act.
If the system has a strong balancing loop and low technological advances, it can reach a point of balance with the renewable resource.
If the system has a weaker balancing loop, or technological advances allow further resource extraction without a cost increase, then the system will never reach an even balance. Instead, it will oscillate around the ideal balance point.
A renewable resource can also turn non-renewable if the balancing loop is too weak. In this case, the reinforcing regenerating loop of the resource becomes dominated by the balancing loop of extraction and the resource stock starts declining into extinction.
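A logistic regeneration loop pitted against a fixed harvest shows both regimes. The parameters are invented for illustration:

```python
def renewable(stock, capacity, regen, harvest, steps=100):
    """Logistic regeneration versus a fixed harvest. If the harvest
    exceeds the stock's best regeneration rate, the balancing loop of
    extraction dominates and the stock declines toward extinction."""
    for _ in range(steps):
        stock += regen * stock * (1 - stock / capacity) - harvest
        stock = max(stock, 0.0)
    return stock

sustainable = renewable(stock=500.0, capacity=1000.0, regen=0.2, harvest=20.0)
collapsed = renewable(stock=500.0, capacity=1000.0, regen=0.2, harvest=60.0)
```

With these numbers the stock can regenerate at most 50 units per step, so a harvest of 20 settles into balance while a harvest of 60 drives the stock to zero.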
Systems work well because of three characteristics: resilience, self-organization, and hierarchy.
Although there are always limits to resilience, a resilient system can recover from unexpected events or damages.
A system gains resilience by having multiple feedback loops that can restore the system balance through different circumstances.
The system can gain more resilience by having backup loops, loops that can rebuild other feedback loops, or loops that can learn, grow, and evolve.
Resilience requires a level of chaos that diminishes predictability and short-term efficiency.
Because people desire predictability and optimization of a single outcome, they cause systems to lose their resilience.
Self-organizing systems can make their own structure more complex. This leads to systems that can evolve to find new and improved ways of going about their task.
As with fractals, the self-organization of a system produces complex results by following a set of simple rules.
Self-organization is often taken for granted, and because of this it gets sacrificed for the same reasons as resiliency.
Hierarchies naturally form in systems from the bottom up when a higher-level system can help a lower-level one.
If any subsystem goal dominates the goal of the whole system, then the entire system becomes sub-optimized.
We want to think of systems as a series of events, each of which causes the next. We want to focus on the flows and try to link them as causal events.
In reality, however, systems consist of flows, stocks, and loops, all of which affect multiple aspects of a system. Therefore, a single event could have an impact on even seemingly disconnected flows, or the impact of the event could be delayed by a stock.
Behavior-based models work well for predicting near-future outcomes but do a poor job of determining long-term behavior.
We like linear relationships because they are easy to calculate and visualize. We can solve linear equations and draw graphs of the relationships. Unfortunately, most systems behave in nonlinear ways that can't be easily solved or visualized.
We often get surprised by nonlinear systems because they can act in a linear fashion up to a point, and then hit a dynamic shift that completely boggles our expectations.
We see examples of nonlinearity in applying fertilizer to plants, in advertising, and even in the amount of time we dedicate to work. All of these have a positive benefit until, suddenly, they start having a negative effect when applied too liberally.
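A hypothetical quadratic response curve captures that surprise; the coefficients are invented for illustration:

```python
def benefit(amount):
    """More input helps at first, then the relationship flips and
    additional input does outright harm (the curve peaks at 5)."""
    return 10 * amount - amount ** 2

helps = benefit(3) > benefit(1)  # a moderate dose beats a small one
hurts = benefit(12) < 0          # far too much is actively harmful
```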
The world contains no separate systems. Everything is connected together and on a long enough timescale all system flows will impact each other.
Despite the interconnections of everything, we must use boundaries if we wish to understand any scenario. The question becomes where the appropriate place is to draw our boundaries.
Every system has limiting factors that constrain its growth. By addressing the most limiting factor, a system's growth can resume. But undoubtedly, another one will soon arise.
We can only gracefully handle limiting factors if we impose them on the system ourselves. Otherwise, the system will produce one in the environment with potentially disastrous side effects.
Foresight is essential in systems because every system contains many delays. Without foresight, a problem will continue to escalate because its effects are delayed while the behavior that caused it continues.
The actors in the system work with limited information and don't act in fully rational ways. Because of this, blaming or replacing any of the actors will rarely have any effect. The best way to change the outcome of a system lies in modifying the system itself.
Systems contain many traps that, if handled incorrectly, result in disaster for a sector or the entire system. Luckily, these traps also present opportunities if they are handled gracefully.
Policy resistance happens when multiple actors desire opposing system states. If any actor tries to pull the system in a particular direction, the rest will resist with equal force to bring the system back to the imperfect balance point.
Using power to overcome resistance results in catastrophic effects because the retaliation grows more severe the longer an undesirable system state is forced on the other actors.
You can overcome policy resistance by letting go and enduring the temporary losses while allowing everyone else to ease off too. Another option lies in looking for a shared goal that everyone can redirect their efforts towards.
An unregulated commons quickly becomes abused because each of the actors gets all the benefit but spreads the cost across everyone involved.
Keeping the commons beneficial to everyone requires education, privatization, or regulation. Depending on the type of commons and system, one or a combination of strategies needs to be applied.
If a system's performance standards get dragged down to the worst performance, a reinforcing loop starts that drives down overall performance until total collapse.
To overcome the downward performance trend, performance standards need to be absolute, or better yet, raised every time a new best performance occurs.
Escalation happens when two stocks get locked in a loop where they try to surpass each other. This leads to the degradation of every stock and flow that's not involved in the escalation loop.
The best way to protect a system from escalation is to avoid it in the first place. If escalation can't be avoided, however, resolving it requires someone to break the cycle by losing, or through tough rounds of policy negotiation.
Many systems are susceptible to providing the successful with more means to succeed. Like a game of Monopoly, this ends with a single actor controlling the entire niche without any competition remaining.
The rewards of the system need to be carefully designed if multiple actors are to remain in a niche. Otherwise, the system will require continuous diversification and creation of new niches.
Addiction happens in a system when an actor, a policy, or a substance alleviates or masks a negative system effect without helping the system handle the circumstance better on its own.
The best way to handle addiction is to avoid it. Once addiction sets in, the addict can mitigate the negative effects by strengthening the system's capabilities before taking away the intervention. But no matter the steps taken, the addict will face painful withdrawal from the intervention.
Rule beating and wrong goals are two sides of the same coin. In rule beating, the system actors try to get around the rules and cause harm to the overall system. With wrong goals, on the other hand, the actors become overly focused on an inappropriate goal and end up harming the rest of the system.
The way out of rule beating and wrong goals is the same: redesigning the rules or goals to work better within the system.
Systems remain in a constant state of change. If we want them to change in a favorable way, we must use the most effective point of leverage available to us.
The leverage points -- in order of effectiveness:
As the effectiveness of leverage points grows, so does the difficulty of implementing them. Any established system will resist change, especially change as radical as a paradigm shift.
Analyzing systems doesn't turn the whole world into a mechanical construct for us to optimize. Deep exploration exposes the human side that we can't easily change or understand. It brings up the same set of philosophical questions people have struggled with for centuries.
Every system has its own rhythm. Before you try to change or even work within a system, take the time to understand its unique way of operating.
Write down, draw, and explain your mental model to yourself and others. This process will make your understanding concrete and allow you to work with others.
Exposing more information to the right people can fix many underlying system problems.
Especially when dealing with systems, use clear language and add vocabulary to aid with understanding.
You can't quantify many of the most important aspects of a system. Don't allow the ones you can quantify to overshadow resiliency, honesty, and justice.
To greatly improve the efficiency of systems, create direct feedback policies. These enable better operation by giving the right information to the people who need to modify their behavior.
Any system that favors individual interests over the good of the whole is doomed.
Allow the system to show you the best course of action. Don't assume that what worked in one system will work in another.
Information gets hidden when responsibility gets reduced. For example, wars got worse when leaders no longer had to lead their armies in person.
We will never understand a system completely. They are too complex. Therefore, we must embrace constant adjustment and admit the errors we will inevitably make.
The best people to make system decisions are holistic thinkers that care deeply about the good of the whole.
Copyright © Artem Chernyak 2020