https://fs.blog/mental-models/...:
1. Feedback Loops
All complex systems are subject to positive and negative feedback loops whereby A causes B, which in turn influences A (and C), and so on - with higher-order effects frequently resulting from continual movement of the loop. In a homeostatic system, a change in A is often brought back into line by an opposite change in B to maintain the balance of the system, as with the temperature of the human body or the behavior of an organizational culture. Automatic feedback loops maintain a "static" environment unless and until an outside force changes the loop. A "runaway feedback loop" describes a situation in which the output of a reaction becomes its own catalyst (auto-catalysis).
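To make the two loop types concrete, here is a minimal Python sketch (not from the article; the setpoint, gains, and step counts are arbitrary). The negative loop pulls a value back toward its setpoint, while the positive loop feeds its output back into its own growth.

```python
# Toy illustration of negative vs. positive feedback; all numbers are invented.

def negative_feedback(value, setpoint=37.0, gain=0.5, steps=10):
    """Each step pushes the value part of the way back toward the setpoint,
    the way a thermostat-like (homeostatic) loop damps deviations."""
    history = [value]
    for _ in range(steps):
        value = value - gain * (value - setpoint)   # correction opposes the deviation
        history.append(round(value, 3))
    return history

def positive_feedback(value, gain=0.5, steps=10):
    """Each step amplifies the value in proportion to itself: the output
    feeds its own growth, the 'runaway' (auto-catalytic) case."""
    history = [value]
    for _ in range(steps):
        value = value + gain * value                # correction reinforces the deviation
        history.append(round(value, 3))
    return history

print(negative_feedback(40.0))  # converges back toward 37.0
print(positive_feedback(1.0))   # grows without bound
```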
2. Equilibrium
Homeostasis is the process through which systems self-regulate to maintain an equilibrium state that enables them to function in a changing environment. Most of the time, they over- or undershoot it by a little and must keep adjusting. Like a pilot flying a plane, the system is off course more often than on course. Everything within a homeostatic system contributes to keeping it within a range of equilibrium, so it is important to understand the limits of the range.
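A rough sketch, assuming a simple corrective rule with a made-up setpoint and gain, of how such a system keeps over- and undershooting its target while staying within a narrowing range:

```python
# Hypothetical self-regulation that repeatedly over- and undershoots its
# setpoint before settling, like the pilot who is rarely exactly on course.

def self_regulate(value, setpoint=100.0, gain=1.4, steps=12):
    path = []
    for _ in range(steps):
        error = setpoint - value
        value += gain * error       # gain > 1 means each correction overshoots a bit
        path.append(round(value, 2))
    return path

print(self_regulate(80.0))
# Output oscillates above and below 100 with shrinking swings,
# staying within an ever-narrower equilibrium range.
```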
3. Bottlenecks
A bottleneck describes the place at which a flow (of a tangible or an intangible) is stopped, holding it back from continuous movement. As with a clogged artery or a blocked drain, a bottleneck in the production of any good or service can be small yet have a disproportionate impact if it sits on the critical path. However, bottlenecks can also be a source of inspiration, as they force us to reconsider whether there are alternate pathways to success.
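As a hedged illustration of why the slowest link sets the pace, the Python sketch below uses invented stage names and rates; the min operation is the whole point.

```python
# Sketch: the throughput of a serial pipeline is capped by its slowest stage.

stage_rates = {          # units per hour each stage can process (made up)
    "order_intake": 120,
    "assembly": 45,      # <-- the bottleneck
    "quality_check": 90,
    "shipping": 150,
}

bottleneck = min(stage_rates, key=stage_rates.get)
throughput = stage_rates[bottleneck]

print(f"System throughput: {throughput}/hour, limited by '{bottleneck}'")
# Doubling capacity anywhere except 'assembly' changes nothing; relieving the
# bottleneck (or routing around it) is the only improvement that counts.
```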
4. Scale
One of the most important principles of systems is that they are sensitive to scale. Properties (or behaviors) tend to change when you scale them up or down. In studying complex systems, we must always be roughly quantifying - in orders of magnitude, at least - the scale at which we are observing, analyzing, or predicting the system.
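One way to see properties changing with scale is the surface-area-to-volume ratio of a simple shape; the sketch below is illustrative only and not drawn from the article.

```python
# Illustrative only: the surface-area-to-volume ratio of a cube falls by an
# order of magnitude for every tenfold increase in side length.

for side in (0.1, 1, 10, 100):                  # arbitrary lengths, same units
    surface_area, volume = 6 * side ** 2, side ** 3
    print(f"side {side:>5}: surface/volume = {surface_area / volume:g}")
# The "same" object has very different properties at different orders of
# magnitude, which is why the scale of observation always has to be stated.
```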
5. Margin of Safety
Engineers have developed the habit of adding a margin for error into all calculations. In an unknown world, driving a 9,500-pound bus over a bridge built to hold precisely 9,600 pounds is rarely seen as intelligent. Thus, on the whole, few modern bridges ever fail. In practical life outside of physical engineering, we can often profitably give ourselves margins as robust as the bridge system.
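The arithmetic behind the bridge example can be written as a simple safety-factor ratio; the sketch below reuses the article's figures plus one invented "engineered" capacity.

```python
# Margin-of-safety arithmetic for the bridge example; the second capacity
# figure is hypothetical.

def safety_factor(capacity, expected_load):
    """Ratio of what the system can bear to what we expect to put on it."""
    return capacity / expected_load

# The cautionary case from the article: almost no margin at all.
print(round(safety_factor(9_600, 9_500), 3))   # ~1.011, one surprise from failure

# An engineering-style margin: design for several times the expected load.
print(safety_factor(28_500, 9_500))            # 3.0, room for the unknown
```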
6. Churn
Insurance companies and subscription services are well aware of the concept of churn - every year, a certain number of customers are lost and must be replaced. Standing still is the equivalent of losing, as seen in the model called the "Red Queen Effect." Churn is present in many business and human systems: A constant figure is periodically lost and must be replaced before any new figures are added over the top.
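A toy projection, with invented customer counts and churn rates, shows how replacement has to outrun the leak before any growth appears:

```python
# Toy churn model: every period a fixed fraction of customers is lost and
# must be replaced before any growth shows up. All numbers are illustrative.

def project_customers(start, churn_rate, new_per_period, periods):
    customers = start
    for _ in range(periods):
        customers -= customers * churn_rate   # the constant leak
        customers += new_per_period           # replacements plus any growth
    return round(customers)

# Losing 20% of 1,000 customers a year means the first ~200 signups each year
# just refill the leak (the "Red Queen" running to stand still).
print(project_customers(1000, 0.20, 200, 5))   # stays near 1,000: no net growth
print(project_customers(1000, 0.20, 260, 5))   # modest growth only past the churn line
```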
7. Algorithms
While hard to precisely define, an algorithm is generally an automated set of rules or a "blueprint" leading to a series of steps or actions resulting in a desired outcome, often stated in the form of a series of "If → Then" statements. Algorithms are best known for their use in modern computing, but they are a feature of biological life as well. For example, human DNA contains an algorithm for building a human being.
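A deliberately tiny example of the "If → Then" form described above; the thermostat rules and thresholds are made up for illustration.

```python
# A tiny "blueprint" expressed as If -> Then rules: a hypothetical thermostat
# routine, just to show the shape an algorithm takes.

def thermostat_step(temperature_c):
    if temperature_c < 18:       # If too cold -> Then heat
        return "heat on"
    elif temperature_c > 24:     # If too hot -> Then cool
        return "cooling on"
    else:                        # Otherwise -> hold steady
        return "idle"

for reading in (12, 21, 30):
    print(reading, "->", thermostat_step(reading))
# The same inputs always produce the same steps and the same outcome,
# whether the rules run in silicon or are encoded in DNA.
```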
8. Critical mass
A system becomes critical when it is about to jump discretely from one phase to another. The marginal utility of the last unit before the phase change is wildly higher than any unit before it. A frequently cited example is water turning from a liquid to a vapor when heated to a specific temperature. "Critical mass" refers to the mass needed to have the critical event occur, most commonly in a nuclear system.
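A toy version of the water example: each added degree does almost nothing until the final one crosses the threshold (the boiling point is the only real number here; the loop is purely illustrative).

```python
# Heat added one degree at a time changes nothing observable until the
# marginal degree crosses the boiling point.

def phase(temperature_c):
    return "vapor" if temperature_c >= 100 else "liquid"

temperature_c = 97
while phase(temperature_c) == "liquid":
    temperature_c += 1
    print(f"{temperature_c} °C -> {phase(temperature_c)}")
# 98 °C -> liquid
# 99 °C -> liquid
# 100 °C -> vapor  (the last unit does what no earlier unit could)
```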
9. Emergence
Higher-level behavior tends to emerge from the interaction of lower-order components. The result is frequently not linear - not a matter of simple addition - but rather non-linear, or exponential. An important resulting property of emergent behavior is that it cannot be predicted from simply studying the component parts.
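Conway's Game of Life, sketched minimally below, is a standard way to see this: a handful of local rules that say nothing about movement produce a "glider" that travels across the grid. The starting pattern and coordinates are arbitrary choices, not anything from the article.

```python
from collections import Counter

def step(live_cells):
    """One generation of Conway's Game of Life: a live cell survives with
    2 or 3 live neighbours, a dead cell becomes live with exactly 3."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, count in neighbour_counts.items()
        if count == 3 or (count == 2 and cell in live_cells)
    }

# A five-cell "glider": none of its cells, and none of the rules, mentions motion.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))
# After four generations the same shape reappears shifted by (1, 1):
# travel across the grid "emerges" from purely local, static rules.
```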
10. Irreducibility
We find that in most systems there are irreducible quantitative properties, such as complexity, minimums, time, and length. Below the irreducible level, the desired result simply does not occur. One cannot get several women pregnant to reduce the amount of time needed to have one child, and one cannot reduce a successfully built automobile to a single part. These results are, to a defined point, irreducible.
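As a rough numerical sketch (the hours and worker counts are invented), adding capacity only shrinks the divisible part of a job; the irreducible part sets a floor that no amount of added effort removes.

```python
# Extra workers shrink only the divisible work; the irreducible part
# (the nine-month "gestation" floor) is untouched. Numbers are made up.

GESTATION_HOURS = 9 * 30 * 24       # irreducible minimum, roughly nine months

def completion_time(divisible_hours, workers):
    """Divisible work spreads across workers; the irreducible part never does."""
    return GESTATION_HOURS + divisible_hours / workers

for workers in (1, 2, 10, 100):
    print(f"{workers:>3} workers -> {completion_time(80, workers):,.1f} hours")
# The totals barely change: below the irreducible level, the desired result
# simply cannot be bought with more effort.
```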
11. Law of Diminishing Returns
Related to scale, most important real-world results are subject to an eventual decrease of incremental value. A good example would be a poor family: Give them enough money to thrive, and they are no longer poor. But after a certain point, additional money will not improve their lot; there is a clear diminishing return of additional dollars at some roughly quantifiable point. Often, the law of diminishing returns veers into negative territory - i.e., receiving too much money could destroy the poor family.
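A rough numerical sketch of the poor-family example, using an invented "well-being" curve: equal additions of money buy less and less, and past a point the marginal effect turns negative.

```python
import math

def well_being(dollars):
    """Hypothetical curve: rises steeply at first, flattens, and eventually
    declines once the amount is large enough to do harm."""
    return 10 * math.log1p(dollars / 1_000) - dollars / 10_000

previous = well_being(0)
for amount in range(25_000, 175_000, 25_000):
    current = well_being(amount)
    print(f"${amount:>7,}: marginal gain {current - previous:+6.2f}")
    previous = current
# Each additional $25,000 buys less improvement than the one before,
# and past roughly $100,000 the marginal "gain" turns negative.
```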