Even if people did everything in their power to give their children good lives, they could never guarantee it (and even more important, in my view, is that it is guaranteed that their children will harm others). People can’t protect their children from every possible harm even if they always do their best, since many bad things happen to many people independently of their parents’ actions. So obviously, and as thoroughly explained in the former texts in this blog, creating a person is always the wrong decision. In this post I’ll argue that the fact that the decision to breed is always made by creatures who are hardly able to make right and rational decisions makes it even worse.

Procreation involves creating an extremely vulnerable subject of harm; therefore the ones who decide to do so must be perfect decision-makers. But people have proven to be nothing of the sort. People are irrational creatures who usually don’t make the right decisions, and all of their decisions are shaped and influenced by irrational forces. They tend to think that their decisions are made after rationally considering the best possible outcome of the given situation, and that they are in total control of their behavior and perceptions, but the truth is that much of it has very little, and often nothing, to do with the situation at hand, and much more to do with internal factors such as their personalities, habits, temperament, previous perceptions, willpower, and hidden and explicit motives; and with contingent external factors such as how tired, hungry, thirsty or sexually aroused they are, how comfortable their shoes are, the outside temperature, and so on.

While people like to believe that they are rational and logical, they are significantly influenced by many cognitive biases that constantly distort their assessments, positions, beliefs, judgments and decisions. Here are some common examples:

Status-Quo Bias: Generally, people prefer stability, the familiar, and sticking to their routines. Therefore they tend to make decisions which guarantee that things remain more or less the same, even if things could be better, or are currently wrong. Though it makes sense not to fix something that isn’t broken, the problem is that many things are not even seen as broken because people don’t want to bother fixing them.

Egocentric Bias: People tend to recall the past in a self-serving manner, meaning they “remember” their performances as better than they actually were. One of the consequences is that they make decisions based on false self-perceptions.

Confirmation Bias: People tend to favor (and even remember) information that confirms their positions and actions, and disfavor and disregard (or forget) any information that contradicts or threatens their positions and actions. This bias is so common and so important that I address it separately.

Anchoring Bias: When making decisions, people tend to be overly influenced by the first piece of information they hear about the subject. The rest of the information is assimilated in relation to that first piece simply because it came first and was therefore anchored, not because it is more accurate or more important.

Halo Effect: People’s overall impression of someone influences how they view each of that person’s traits, even when there are no causal links or any relevance between the traits. The most common expression of this bias (and its worst effect) is that people find those who are more physically attractive to also be smarter and kinder, and even worse, assume that the less physically attractive are also dumb and evil.

Sunk Cost Fallacy: Also known as Escalation of Commitment. People tend to continue activities even after realizing that these activities are no longer enjoyable or needed, and often despite the fact that completing them would take more effort than has already been invested. In other words, people’s decisions are influenced by their cumulative prior investments, regardless of their current needs and desires, and often despite new evidence proving the decision wrong.

Ego Depletion: Studies show that willpower is an expendable resource which can be depleted by overuse. In times of overabundant temptations and stimulation such as ours, people’s willpower is depleted easily and frequently. Therefore, in many cases they unconsciously make decisions which they would never have made had their willpower not been depleted.

Belief Bias: People evaluate the logic of an argument according to the plausibility of its conclusion. For example, people tend to accept an invalid syllogism such as “all flowers need water; roses need water; therefore roses are flowers” merely because its conclusion is believable.

This is one of the biases which most strongly prove how illogical people are, since the logic of an argument, by definition, must be objective and independent of how plausible or desirable its conclusion is. Otherwise what is the point of logic in argumentation?

Existence Bias: People tend to treat the mere existence of something as evidence of its goodness, and to evaluate an existing state more favorably than its alternatives.

The Optimism Bias: Also referred to as “the illusion of invulnerability”, this is people’s built-in cognitive tendency to underestimate their likelihood of experiencing bad things, and overestimate their likelihood of experiencing good things. For example, people underestimate their chances of suffering from diseases or car accidents, no matter how prone they personally are to them, or how prevalent diseases and car accidents are in general; and they overestimate their happiness potential no matter their specific living conditions.

Availability Heuristic: When evaluating a specific issue, idea, method or decision, people tend to place greater value on information that comes to their mind more quickly. People give greater credence to immediate examples and tend to overestimate the probability of similar things happening in the future.

Priming: People’s decisions and judgments are unconsciously affected by stimuli which in many cases are absolutely irrelevant, such as smells, colors and looks; and these stimuli usually also affect people’s subsequent decisions and judgments, since initial decisions and judgments shape future ones. In other words, when people are exposed to a stimulus, it affects not only their current decisions and judgments but potentially their future ones as well, despite the fact that the original stimulus usually had nothing to do with either of the cases at hand.

The Consistency Effect: A similar and even stronger effect than priming. It basically means that people tend to defend and preserve their positions and behaviors, even if these were adopted randomly or without serious consideration. This effect can be an even stronger case than priming since it usually lasts longer, and since it doesn’t necessarily relate to sensory perceptions but to statements and actions performed by a person. Hence, once a person has said or done something, it is often much harder to convince that person that s/he is wrong, because of the drive to remain consistent (even with random and arbitrary statements and actions). This drive is usually further fortified by another psychological bias, Self-Justification: the infamous tendency of people to justify their behavior no matter how incoherent, baseless, and even uncharacteristic it is.

Substitution Bias: When people are confronted with a complex decision, they often automatically and unconsciously substitute it with a less complex one. They seek an easier, more familiar, related problem and apply its easier, more familiar solution to the more complex problem.

Bandwagon Effect: People tend to believe and do things merely because others believe and do them (they jump on the bandwagon).

Default Effect: Studies show that an option is more likely to be chosen simply because it is (even arbitrarily) set as the default option, regardless of its content, whether it has advantages over the alternatives, or whether it is expected to benefit its choosers.

Groupthink: People’s opinions and decisions are shaped, if not suppressed, by other group members who collectively and unconsciously try to reach an agreement, often at the expense of evaluating alternative positions. This tendency results in an irrational and dysfunctional but common decision-making process.

Fluency Bias: People tend to take more seriously ideas which are processed more fluently and smoothly, often merely because they were presented more masterfully, not because they are more trustworthy or logical.

Mere Exposure Effect: People tend to favor options merely because they are more familiar with them.

Choice-Supportive Bias: Once a decision is made, people tend to over-focus on its benefits and minimize its flaws.

Gambler’s Fallacy: People tend to think that the likelihood of statistically independent events (such as dice rolls or coin flips) is nevertheless affected by past outcomes. For example, people believe that after two successive heads in a series of coin tosses, the next toss is more likely to come up tails; the simulation sketch below illustrates why this belief is false.
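To make the independence point concrete, here is a minimal simulation sketch in Python (purely illustrative; the sample size and seed are arbitrary choices of mine): it estimates how often a fair coin lands heads immediately after two successive heads. If past outcomes really influenced independent events, this frequency would deviate from 0.5.

```python
import random

random.seed(0)  # arbitrary seed, for a reproducible run
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Outcome of every flip that immediately follows two successive heads.
after_two_heads = [flips[i] for i in range(2, len(flips))
                   if flips[i - 1] and flips[i - 2]]

# Prints approximately 0.500: each flip is independent of the ones before it.
print(sum(after_two_heads) / len(after_two_heads))
```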

Restraint Bias: People tend to overestimate their ability to resist temptations.

Expectation Bias: People are biased by their expectations of a situation, which causes them to believe, confirm, and spread information which corresponds with their expectations, and to overlook, discard, or downgrade information which conflicts with their expectations.

Framing Effect: People’s decisions are likely to differ depending on whether the exact same information is presented in one way or another.

Authority Bias: People tend to ascribe more credibility to authority figures and to be more influenced by them, regardless of the content of their statements.

False Uniqueness Bias: People tend to view themselves as more unique and special than they actually are.

Hyperbolic Discounting: Also known as present bias, this is people’s tendency to strongly prefer immediate benefits over future ones, even though their future selves would strongly prefer that those decisions had not been made in the present (see the sketch below).
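One common formalization of this bias in the decision-making literature (not specific to any of the studies cited below) discounts a reward of amount A delayed by D periods to a present value of V = A / (1 + kD), where k measures impatience. The minimal Python sketch below, with arbitrary illustrative amounts and k = 1 per day, shows the characteristic preference reversal: the larger, later reward wins while both options are distant, but the smaller, immediate one wins once it is at hand.

```python
def hyperbolic_value(amount: float, delay_days: float, k: float = 1.0) -> float:
    """Perceived present value of a delayed reward under hyperbolic
    discounting: V = A / (1 + k*D). The k value is an arbitrary illustration."""
    return amount / (1 + k * delay_days)

# Viewed from afar, $110 in 31 days beats $100 in 30 days...
print(hyperbolic_value(110, 31) > hyperbolic_value(100, 30))  # True

# ...but up close, $100 today beats $110 tomorrow: a preference reversal
# that a consistent (exponential) discounter would never exhibit.
print(hyperbolic_value(100, 0) > hyperbolic_value(110, 1))    # True
```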

And it’s probably most fitting to end this partial list of cognitive biases with the Bias Blind Spot: people’s tendency not to recognize the effect of biases on their own judgment. Almost all people are sure that they are less biased than others, absolutely convinced that their beliefs, judgments and decisions are all rational, accurate, and bias-free. Research has shown that people are unable to control the effect of biases on their beliefs, judgments and decisions even after being made aware of them, which further confirms how firmly they are held by the Bias Blind Spot.

No decision people make ever follows an independent, standalone, truly rational examination of the given situation. Every decision is somehow biased, usually by more than one cognitive bias.

It may be worth noting that cognitive biases are not the same as logical fallacies. While it may seem, at least theoretically, that logical fallacies, which are basically errors in logical argumentation, can be fixed by talented, articulate and patient activists, cognitive biases, being deeply rooted, genuine deficiencies or limitations in people’s thought processing, judgment, memory, attention, valuation, and other mental activities, are here to stay, and they constantly distort people’s rational thinking, logic, emotions, beliefs, positions, perceptions, decisions, and actions.

The fact that people are so unaware of these tremendous forces influencing their decisions makes it even harder to convince them to change those decisions, since they are sure that they were made rationally and independently of any external or internal pressure.
If people were rational, then in each case they would logically compare all the options and choose the one with the best expected outcome in terms of benefit, rather than deciding according to the various factors that actually determine their behavior.

People’s thinking and decisions are highly affected by their emotional state. Multiple studies have shown how stress and excitement affect people’s reasoning and actions. One famous example is lottery sales, which skyrocket after events that are considered good, especially unexpected ones. People generally tend to overestimate the chances of something good happening to them and underestimate the chances of something bad happening to them (the Optimism Bias), and this effect is even stronger when they are in a better mood (which obviously doesn’t actually affect the chances that something good will happen to them, or that something bad won’t).

Not only do the emotions people experience while making a decision sometimes have nothing to do with the issue itself, the effects of those emotions often last longer than the emotions themselves. In other words, emotions not only have a strong, irrelevant effect on an immediate decision, they often affect future decisions as well, again regardless of their relevance to the issue itself. That is because in many cases an emotion creates a long-lasting pattern of responses to similar scenarios, which correspondingly affects decision-making in those situations. Or to put it even more bluntly, one initial mistake can start a chain reaction of misguided decisions.

People are sure that they are always in the driver’s seat, at least when it comes to their own decisions and to what happens in their lives. But they never are, even when it comes to “their” decisions and what happens in their own lives, and to an even greater degree when it comes to what happens in others’ lives.
People are merely pawns, constantly influenced by many forces which they cannot comprehend or are not even aware of, let alone able to control.

Given how irrational people are, it is irrational to keep using rational arguments expecting to convince them to stop breeding. What is needed is not rational arguments but actions.

References

Bandura A (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman and Company.

Baron J (2000). Thinking and deciding (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-65030-4.

Barrett LF, Simmons WK (July 2015). Interoceptive predictions in the brain. Nature Reviews Neuroscience. 16 (7): 419–429. doi:10.1038/nrn3950. PMC 4731102. PMID 26016744.

Bishop MA, Trout JD (2004). Epistemology and the Psychology of Human Judgment. New York: Oxford University Press. ISBN 978-0-19-516229-5.

Bornstein RF, Craver-Lemley C (2004). Mere exposure effect. In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp. 215–234. ISBN 978-1-84169-351-4. OCLC 55124398.

Dardenne B, Leyens JP (1995). Confirmation Bias as a Social Skill. Personality and Social Psychology Bulletin. 21 (11): 1229–1239. doi:10.1177/01461672952111011.

De Meza D, Dawson C (January 24, 2018). Wishful Thinking, Prudent Behavior: The Evolutionary Origin of Optimism, Loss Aversion and Disappointment Aversion. SSRN 3108432.

Dwyer CP, Hogan MJ, Stewart I (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity. 12: 43–52.

Enzle ME, Wohl MJA (March 2009). Illusion of control by proxy: Placing one’s fate in the hands of another. British Journal of Social Psychology. 48 (1): 183–200. doi:10.1348/014466607X258696. PMID 18034916.

False Uniqueness Bias (Social Psychology). IResearchNet. 2016-01-13.

Fisk JE (2004). Conjunction fallacy. In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp. 23–42. ISBN 978-1-84169-351-4. OCLC 55124398.

Gilovich T (1993). How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press. ISBN 978-0-02-911706-4.

Gilovich T, Griffin D, Kahneman D (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge University Press. ISBN 978-0-521-79679-8.

Gino F, Sharek Z, Moore DA (2011). Keeping the illusion of control under control: Ceilings, floors, and imperfect calibration. Organizational Behavior and Human Decision Processes. 114 (2): 104–114. doi:10.1016/j.obhdp.2010.10.002.

Hardman D (2009). Judgment and decision making: psychological perspectives. Wiley-Blackwell. ISBN 978-1-4051-2398-3.

Hilbert M (March 2012). Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making. Psychological Bulletin. 138 (2): 211–237. CiteSeerX 10.1.1.432.8763. doi:10.1037/a0025940. PMID 22122235.

Hsee CK, Zhang J (May 2004). Distinction bias: misprediction and mischoice due to joint evaluation. Journal of Personality and Social Psychology. 86 (5): 680–95. CiteSeerX 10.1.1.484.9171. doi:10.1037/0022-3514.86.5.680. PMID 15161394.

Hoorens V (1993). Self-enhancement and Superiority Biases in Social Comparison. European Review of Social Psychology. 4 (1): 113–139. doi:10.1080/14792779343000040.

Investopedia Staff (2006-10-29). Gambler’s Fallacy/Monte Carlo Fallacy. Investopedia. Retrieved 2018-10-10.

Juslin P, Winman A, Olsson H (April 2000). Naive empiricism and dogmatism in confidence research: a critical examination of the hard-easy effect. Psychological Review. 107 (2): 384–396. doi:10.1037/0033-295x.107.2.384. PMID 10789203.

Kokkoris M (2020-01-16). The Dark Side of Self-Control. Harvard Business Review.

Kruger J, Dunning D (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology. 77 (6): 1121–1134.

Kruger J (August 1999). Lake Wobegon be gone! The “below-average effect” and the egocentric nature of comparative ability judgments. Journal of Personality and Social Psychology. 77 (2): 221–232. doi:10.1037/0022-3514.77.2.221. PMID 10474208.

Lichtenstein S, Fischhoff B (1977). Do those who know more also know more about how much they know?. Organizational Behavior and Human Performance. 20 (2): 159–183. doi:10.1016/0030-5073(77)90001-0.

Marks G, Miller N (1987). Ten years of research on the false-consensus effect: An empirical and theoretical review. Psychological Bulletin. 102 (1): 72–90. doi:10.1037/0033-2909.102.1.72.

McKenna FP (1993). It won’t happen to me: Unrealistic optimism or illusion of control?. British Journal of Psychology. 84 (1): 39–50. doi:10.1111/j.2044-8295.1993.tb02461.x.

Merkle EC (February 2009). The disutility of the hard-easy effect in choice confidence. Psychonomic Bulletin & Review. 16 (1): 204–213. doi:10.3758/PBR.16.1.204. PMID 19145033.

Milgram S (Oct 1963). Behavioral Study of obedience. The Journal of Abnormal and Social Psychology. 67 (4): 371–8. doi:10.1037/h0040525. PMID 14049516.

Msetfi RM, Murphy RA, Simpson J (2007). Depressive realism and the effect of intertrial interval on judgements of zero, positive, and negative contingencies. The Quarterly Journal of Experimental Psychology. 60 (3): 461–481. doi:10.1080/17470210601002595. PMID 17366312.

Nickerson RS (1998). Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology. 2 (2): 175–220 [198]. doi:10.1037/1089-2680.2.2.175.

O’Donoghue T, Rabin M (1999). Doing it now or later. American Economic Review. 89 (1): 103–124. doi:10.1257/aer.89.1.103.

Oswald ME, Grosjean S (2004). Confirmation Bias. In Pohl RF (ed.). Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory. Hove, UK: Psychology Press. pp. 79–96. ISBN 978-1-84169-351-4. OCLC 55124398.

Pacini R, Muir F, Epstein S (1998). Depressive realism from the perspective of cognitive-experiential self-theory. Journal of Personality and Social Psychology. 74 (4): 1056–1068. doi:10.1037/0022-3514.74.4.1056. PMID 9569659.

Glimcher PW (2004). Decisions, Uncertainty, and the Brain: The Science of Neuroeconomics. Cambridge, MA: MIT Press. pp. 189–191.

Plous S (1993). The Psychology of Judgment and Decision Making. New York: McGraw-Hill. ISBN 978-0-07-050477-6.

Pohl RF (2017). Cognitive illusions: Intriguing phenomena in thinking, judgment and memory. London and New York: Routledge. ISBN 978-1-138-90341-8.

Pronin E, Kruger J, Savitsky K, Ross L (October 2001). You don’t know me, but I know you: the illusion of asymmetric insight. Journal of Personality and Social Psychology. 81 (4): 639–56. doi:10.1037/0022-3514.81.4.639. PMID 11642351.

Pronin E, Kugler MB (July 2007). Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot. Journal of Experimental Social Psychology. 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011. ISSN 0022-1031.

Schwarz N, Bless H, Strack F, Klumpp G, Rittenauer-Schatka H, Simons A (1991). Ease of Retrieval as Information: Another Look at the Availability Heuristic (PDF). Journal of Personality and Social Psychology. 61 (2): 195–202. doi:10.1037/0022-3514.61.2.195.

Martin S (2012). The Default Effect: How to Leverage Bias and Influence Behavior. Influence at Work.

Thompson SC, Armstrong W, Thomas C (1998). Illusions of Control, Underestimations, and Accuracy: A Control Heuristic Explanation. Psychological Bulletin. 123 (2): 143–161. doi:10.1037/0033-2909.123.2.143. PMID 9522682.

West RF, Toplak ME, Stanovich KE (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology. 100 (4): 930–941.