Decision & Cognitive Bias

Examples of Incidents Attributed to Human Error During Nights or Night Shifts

  • Three Mile Island nuclear reactor
  • Davis–Besse nuclear reactor at Oak Harbor, Ohio
  • Rancho Seco nuclear reactor near Sacramento, California
  • Chernobyl nuclear plant
  • Space shuttle Challenger accident
  • Launch of the space shuttle Columbia
  • Bhopal Union Carbide tragedy
  • Exxon Valdez accident
  • Estonia ferry accident
  • Peak incidence of single-vehicle motor accidents
  • An 18% increase in human-error incidents on the afternoon shift relative to the morning shift
  • A 30% increase in human-error incidents on the night shift relative to the morning shift

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1854972/

 

Decision Making

Daniel Kahneman: http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow

Our brain is weak at remembering lists but good at remembering routes and agents, so to remember a list we should imagine traveling a route related to that list.

Two different ways the brain forms thoughts:

System 1: Fast, automatic, frequent, emotional, stereotypical, subconscious; it works all the time.
System 2: Slow, effortful, infrequent, logical, calculating, conscious; we feel that we are the authors of System 2's thoughts,

but System 1 keeps whispering suggestions to System 2, and most of the time we are guided by the automatic suggestions of System 1.

This works well most of the time, when we are relying on skills.

Relying on intuition to answer complex questions is very risky.

People need protection against their own mistakes and predators.

Most of what we do is just skill, and System 1 is fine for that, but System 1 produces simplified answers when it faces a situation for which it doesn't have a skilled answer.

If such a problem arises in System 1 and we are not aware that System 1 lacks the skill, then System 2 will not be activated to correct it.

People make predictable errors; the best we can do is to recognize the situations in which we are likely to make an error.

Institutions can help keep us from relying on intuition alone.

If we lived in a society that understood the problems of judgement better, we would make better decisions (for example, by using regressive prediction).

People are more intelligent about other people's problems than about their own.

Kahneman https://www.youtube.com/watch?feature=player_embedded&v=i_UVDD7ErJ4

Kahneman suggests: http://www.amazon.ca/Moonwalking-Einstein-Science-Remembering-Everything/dp/0143120530

Intuitive prediction – Defense Technical Information Center

http://www.spiegel.de/international/zeitgeist/interview-with-daniel-kahneman-on-the-pitfalls-of-intuition-and-memory-a-834407.html

http://www.lse.ac.uk/newsAndMedia/videoAndAudio/channels/publicLecturesAndEvents/player.aspx?id=1251

http://integral-options.blogspot.ca/2012/06/debunking-myth-of-intuition-daniel.html

http://examinedexistence.com/daniel-kahnemans-on-why-we-shouldnt-always-trust-our-gut-feelings/

http://www.inc.com/graham-winfrey/daniel-kahneman-on-why-entrepreneurs-shouldnt-trust-their-gut.html

=================================================================================

Factors that affect how people make choices

1. Situational Factors

2. Personality Traits

3. Group Dynamics – that affect group decision-making

Use Divergent (brainstorming)/Convergent thinking

Step 1 (divergent thinking): Brainstorm. The more ideas, the better; build one idea upon another; wacky ideas are okay; don't evaluate ideas.
Step 2: Weed out and cluster, grouping similar ideas.
Step 3 (convergent thinking): Select practical, promising ideas.

We should train ourselves to be ready at any moment, when we are analyzing a problem, to shift from our normal convergent mode to a divergent mode and back again, bringing new ideas back from the divergent phase.

Tell people when you are starting a divergent mode; otherwise they may get upset, or may mistake the brainstormed ideas for tangents or for serious proposals.

Jones, M. D. (1998). The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving (Revised ed.). New York: Crown Business.

Systematic Decision Making Steps:

1. Problem Definition (PD articulation)

-Articulate the Decision Problem in many different ways.
-Start by writing it down – question it, test it, hone it – reword it.
-Use Divergent/Convergent thinking
-Redefine problems as opportunities.

-Definition should be wide (question the constraints & establish a workable scope)
-Get rid of assumptions
-Don't think about solutions during PD articulation
-Understand the stakeholders and the members of the value chain
-Understand your own biases
-Ask others – get a fresh/outside perspective
-Be creative

2. Setting Objectives

Objectives are Decision Criteria used to assess alternatives / course(s) of action

What we are trying to accomplish

What do we really want/need

Objectives are the bases for evaluating alternatives

Objectives are usually stated as something to maximize or minimize

People spend too little time on setting objectives and take a narrow view of what the objectives are

Use Divergent/Convergent thinking

Write down all concerns/hopes for making the decision

Convert concerns into objectives (e.g., maximize XX, minimize XX – a verb and an object)

3. Identifying alternatives

Use divergent thinking and brainstorming. Question the alternatives, asking: are these what we really want?
Is there a better choice?

Use the power of the group

You can never choose an alternative you do not consider
Your decision is never better than your best alternative
Don't box yourself into the status quo or into satisficing

Look at your objectives and ask HOW
Challenge constraints – most of them are mental rather than real barriers
Break free of tradition
Set high aspirations
Get additional perspectives
Never stop looking, even during the other stages.
BUT know when to quit looking

Types of Alternatives:

Process alternatives
Win-win
Information Gathering
Time-buying Alternatives

 

4. Analyze alternatives against objectives using tools

Pros/Cons/Fixes

When evaluating each alternative's merits, Pros-Cons-and-Fixes compensates for negativity bias by forcing us to identify the positives first. Only then are we allowed to indulge joyously in negatives. But the technique goes a step further by examining the negatives and trying to think of actions that could be taken to "fix" them, either converting them into positives or, if that isn't feasible, eliminating them altogether. Those negatives (cons) that can't be "fixed" represent the price one must pay, the burden one must bear, if the thing being evaluated were to be adopted or accepted. (Jones, 1998, p. 53)

-List all Pros and all Cons
-Review and consolidate the Cons, by merging and eliminating
-Neutralize as many Cons as possible by finding fixes for them
-Compare the Pros and ‘unalterable’ cons for all options
-Pick an option

For a decision problem that has only two alternatives, this alone can lead to a decision.
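A minimal bookkeeping sketch of Pros-Cons-and-Fixes in Python; the option, its pros/cons, and the fixes below are made-up placeholders for illustration, not from the source:

# Minimal sketch of Pros-Cons-and-Fixes bookkeeping (after Jones, 1998), with made-up data.
option = "Adopt tool X"                      # hypothetical option being evaluated
pros = ["Saves staff time", "Reduces manual errors"]
cons = ["License cost", "Learning curve", "Vendor lock-in"]

# Neutralize as many cons as possible by finding fixes for them.
fixes = {
    "License cost": "Negotiate a volume discount",
    "Learning curve": "Run a two-week internal training",
    # "Vendor lock-in" has no fix -> it is an 'unalterable' con
}

unalterable_cons = [c for c in cons if c not in fixes]

print(f"{option}: {len(pros)} pros, {len(unalterable_cons)} unalterable cons")
print("Price to pay if adopted:", unalterable_cons)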

 

Simple Ranking

Eliminate clearly inferior alternatives by ranking every alternative on each criterion.

Then eliminate the alternatives that are clearly dominated by at least one other alternative, using pairwise comparison.

If alternative A, in comparison with B, has at least one attribute ranked higher, and none of its other attributes ranked lower than B's, then B is dominated by A and should be eliminated.

If Xn dominates Xm, that is enough to eliminate Xm.

Only alternatives that have no superiority over any other alternative on any attribute get eliminated. (If an alternative is superior to the others on even one attribute, it cannot be eliminated, because that attribute may turn out to be very important.)

             A   B   C
Criterion 1  1   3   3
Criterion 2  2   2   2
Criterion 3  3   1   3
Criterion 4  3   3   1

In the example above nothing can be eliminated.
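A minimal Python sketch of this dominance check, run on the example table above (assuming rank 1 is the best rank on each criterion; names and structure are illustrative):

# Dominance elimination by pairwise comparison on the example table above.
# Assumption: rank 1 is the best rank on each criterion.
rankings = {            # criterion ranks per alternative (columns of the table)
    "A": [1, 2, 3, 3],
    "B": [3, 2, 1, 3],
    "C": [3, 2, 3, 1],
}

def dominates(x, y):
    """x dominates y if x is at least as good on every criterion and strictly better on one."""
    pairs = list(zip(rankings[x], rankings[y]))
    return all(rx <= ry for rx, ry in pairs) and any(rx < ry for rx, ry in pairs)

dominated = {y for y in rankings for x in rankings if x != y and dominates(x, y)}
survivors = [a for a in rankings if a not in dominated]

print("Eliminated:", sorted(dominated))   # [] -> nothing can be eliminated here
print("Remaining:", survivors)            # ['A', 'B', 'C']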

And this may still leave millions or billions of 'undominated pairs' – pairs of alternatives where one has a higher-ranked category for at least one criterion and a lower-ranked category for at least one other criterion than the other alternative – and hence a judgment is required for each such pair to be ranked.
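To see how quickly undominated pairs accumulate, here is a toy Python sketch using the same dominance test as above (the number of criteria and categories are assumptions for illustration, not from the source):

# Toy illustration of how fast undominated pairs grow (assumed sizes).
from itertools import product, combinations

n_criteria, n_categories = 5, 3          # 5 criteria, each with ranks 1 (best) to 3
alternatives = list(product(range(1, n_categories + 1), repeat=n_criteria))  # 3**5 = 243

def dominates(x, y):
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

undominated_pairs = sum(
    1 for x, y in combinations(alternatives, 2)
    if not dominates(x, y) and not dominates(y, x)
)

print(len(alternatives), "alternatives,", undominated_pairs, "undominated pairs")
# Even this tiny setup yields tens of thousands of pairs needing a judgment;
# with more criteria or categories the count quickly reaches millions or billions.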

The PAPRIKA method is then used to resolve the remaining undominated pairs.

Potentially all pairwise rankings of all possible alternatives (PAPRIKA)

http://en.wikipedia.org/wiki/Potentially_all_pairwise_rankings_of_all_possible_alternatives

https://www.1000minds.com/solutions/decision-making-software

==================================================

Fallacy

A fallacy is the use of invalid or otherwise faulty reasoning, or “wrong moves” in the construction of an argument. A fallacious argument may be deceptive by appearing to be better than it really is.

https://en.wikipedia.org/wiki/Fallacy

==================================================

Cognitive biases can lead to systematic deviations from standard rationality; however, "standard rationality" does not exist in real life.

Cognitive biases can lead to systematic deviations from good judgment; however, "goodness" is subjective.

They are often studied in psychology and behavioral economics.

Some are effects of information-processing rules (i.e. mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments. Such effects are called cognitive biases.

Biases in judgment or decision-making can also result from motivation, such as when beliefs are distorted by wishful thinking.

Some biases have a variety of cognitive (“cold”) or motivational (“hot”) explanations. Both effects can be present at the same time.

http://en.wikipedia.org/wiki/List_of_cognitive_biases

 

Cognitive biases are thinking tendencies that may lead to systematic deviations from a standard of rationality or good judgment.

Listed below by name and description, with a possible countermeasure (solution) where one was noted:
Ambiguity effect The tendency to avoid options for which missing information makes the probability seem "unknown".[8] Solution: use the divergent/convergent brainstorming procedure described above (Jones, 1998).
Anchoring or focalism The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information that we acquire on that subject)[9][10]  
Attentional bias The tendency of our perception to be affected by our recurring thoughts.[11]  
Automation bias The tendency to excessively depend on automated systems which can lead to erroneous automated information overriding correct decisions.[12]  
Availability heuristic The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.[13]  
Availability cascade A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).[14]  
Backfire effect When people react to disconfirming evidence by strengthening their beliefs.[15]  
Bandwagon effect The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.[16]  
Base rate fallacy or base rate neglect The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).[17]  
Belief bias An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.[18]  
Bias blind spot The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.[19]  
Cheerleader effect The tendency for people to appear more attractive in a group than in isolation.[20]  
Choice-supportive bias The tendency to remember one’s choices as better than they actually were.[21]  
Clustering illusion The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).[10]  
Confirmation bias The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.[22]  
Congruence bias The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.[10]  
Conjunction fallacy The tendency to assume that specific conditions are more probable than general ones.[23]  
Conservatism or regressive bias A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated.[24][25][26]  
Conservatism (Bayesian) The tendency to revise one’s belief insufficiently when presented with new evidence.[24][27][28]  
Contrast effect The enhancement or reduction of a certain perception’s stimuli when compared with a recently observed, contrasting object.[29]  
Curse of knowledge When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.[30]  
Decoy effect Preferences for either option A or B changes in favor of option B when option C is presented, which is similar to option B but in no way better.  
Denomination effect The tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).[31]  
Distinction bias The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[32]  
Dunning-Kruger effect The tendency for unskilled individuals to overestimate their ability and the tendency for experts to underestimate their ability.[33]  
Duration neglect The neglect of the duration of an episode in determining its value  
Empathy gap The tendency to underestimate the influence or strength of feelings, in either oneself or others.  
Endowment effect The fact that people often demand much more to give up an object than they would be willing to pay to acquire it.[34]  
Essentialism Categorizing people and things according to their essential nature, in spite of variations.[35]  
Evaluability bias The tendency to weight the importance of an attribute in proportion to its ease of evaluation[36]  
Exaggerated expectation Based on the estimates, real-world evidence turns out to be less extreme than our expectations (conditionally inverse of the conservatism bias).[24][37]  
Experimenter’s or expectation bias The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[38]  
Focusing effect The tendency to place too much importance on one aspect of an event.[39]  
Forer effect or Barnum effect The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.  
Framing effect Drawing different conclusions from the same information, depending on how that information is presented.  
Frequency illusion The illusion in which a word, a name or other thing that has recently come to one’s attention suddenly seems to appear with improbable frequency shortly afterwards (not to be confused with the recency illusion or selection bias).[40] Colloquially, this illusion is known as the Baader-Meinhof Phenomenon.[41]  
Functional fixedness Limits a person to using an object only in the way it is traditionally used.  
Gambler’s fallacy The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”  
Hard–easy effect Based on a specific level of task difficulty, the confidence in judgments is too conservative and not extreme enough[24][42][43][44]  
Hindsight bias Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable[45] at the time those events happened.  
Hostile media effect The tendency to see a media report as being biased, owing to one’s own strong partisan views.  
Hot-hand fallacy The “hot-hand fallacy” (also known as the “hot hand phenomenon” or “hot hand”) is the fallacious belief that a person who has experienced success has a greater chance of further success in additional attempts.  
Hyperbolic discounting Discounting is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.[46] Also known as current moment bias, present-bias, and related to Dynamic inconsistency.  
Identifiable victim effect The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.[47]  
IKEA effect The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.  
Illusion of control The tendency to overestimate one’s degree of influence over other external events.[48]  
Illusion of validity Belief that further acquired information generates additional relevant data for predictions, even when it evidently does not.[49]  
Illusory correlation Inaccurately perceiving a relationship between two unrelated events.[50][51]  
Impact bias The tendency to overestimate the length or the intensity of the impact of future feeling states.[52]  
Information bias The tendency to seek information even when it cannot affect action.[53]  
Insensitivity to sample size The tendency to under-expect variation in small samples  
Irrational escalation The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.  
Less-is-better effect The tendency to prefer a smaller set to a larger set judged separately, but not jointly  
Loss aversion “the disutility of giving up an object is greater than the utility associated with acquiring it”.[54] (see also Sunk cost effects and endowment effect).  
Mere exposure effect The tendency to express undue liking for things merely because of familiarity with them.[55]  
Money illusion The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.[56]  
Moral credential effect The tendency of a track record of non-prejudice to increase subsequent prejudice.  
Negativity effect The tendency, when evaluating the causes of the behaviors of a person one dislikes, to attribute their positive behaviors to the environment and their negative behaviors to the person's inherent nature.  
Negativity bias Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.[57] (Wikipedia) As a result, "humans are compulsively negative": "Negative thoughts and reactions—reasons we don't like something—come to mind one after another almost magically. We don't consciously orchestrate their generation. They just happen spontaneously," especially when evaluating the merit of something new or unconventional, where we focus on the negative aspects. Solution: Pros-Cons-and-Fixes compensates for negative thinking by forcing us to identify the positives first. Only then are we allowed to indulge joyously in negatives. But the technique goes a step further by examining the negatives and trying to think of actions that could be taken to "fix" them, either converting them into positives or, if that isn't feasible, eliminating them altogether. Those negatives (cons) that can't be "fixed" represent the price one must pay, the burden one must bear, if the thing being evaluated were to be adopted or accepted. (Jones, 1998, p. 53)  
Neglect of probability The tendency to completely disregard probability when making a decision under uncertainty.[58]  
Normalcy bias The refusal to plan for, or react to, a disaster which has never happened before.  
Not invented here Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to IKEA effect.  
Observer-expectancy effect When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).  
Omission bias The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).[59]  
Optimism bias The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).[60][61]  
Ostrich effect Ignoring an obvious (negative) situation.  
Outcome bias The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.  
Overconfidence effect Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.[24][62][63][64]  
Pareidolia A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.  
Pessimism bias The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.  
Planning fallacy The tendency to underestimate task-completion times.[52]  
Post-purchase rationalization The tendency to persuade oneself through rational argument that a purchase was a good value.  
Pro-innovation bias The tendency to have an excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses.  
Pseudocertainty effect The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.[65]  
Reactance The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice (see also Reverse psychology).  
Reactive devaluation Devaluing proposals only because they purportedly originated with an adversary.  
Recency illusion The illusion that a word or language usage is a recent innovation when it is in fact long-established (see also frequency illusion).  
Restraint bias The tendency to overestimate one’s ability to show restraint in the face of temptation.  
Rhyme as reason effect Rhyming statements are perceived as more truthful. A famous example is the defense's use of the phrase "If the gloves don't fit, then you must acquit" in the O.J. Simpson trial.  
Risk compensation / Peltzman effect The tendency to take greater risks when perceived safety increases.  
Selective perception The tendency for expectations to affect perception.  
Semmelweis reflex The tendency to reject new evidence that contradicts a paradigm.[28]  
Social comparison bias The tendency, when making hiring decisions, to favour potential candidates who don’t compete with one’s own particular strengths.[66]  
Social desirability bias The tendency to over-report socially desirable characteristics or behaviours in one self and under-report socially undesirable characteristics or behaviours.[67]  
Status quo bias The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[68][69]  
Stereotyping Expecting a member of a group to have certain characteristics without having actual information about that individual.  
Subadditivity effect The tendency to judge probability of the whole to be less than the probabilities of the parts.[70]  
Subjective validation Perception that something is true if a subject’s belief demands it to be true. Also assigns perceived connections between coincidences.  
Survivorship bias Concentrating on the people or things that “survived” some process and inadvertently overlooking those that didn’t because of their lack of visibility.  
Time-saving bias Underestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed and overestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed.  
Unit bias The tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular.[71]  
Well travelled road effect Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.  
Zero-risk bias Preference for reducing a small risk to zero over a greater reduction in a larger risk.  
Zero-sum heuristic Intuitively judging a situation to be zero-sum (i.e., that gains and losses are correlated). Derives from the zero-sum game in game theory, where wins and losses sum to zero.[72][73] The frequency with which this bias occurs may be related to the social dominance orientation personality factor.  


=======================================

Daniel Kahneman and Amos Tversky

http://www.princeton.edu/~kahneman/docs/Publications/prospect_theory.pdf

==================================================================

Experiments by Tversky and Kahneman showed that the same people who would choose 1 candy bar now over 2 candy bars tomorrow would choose 2 candy bars 101 days from now over 1 candy bar 100 days from now.

Amir: This means the subjective value of a reward falls off rapidly over short delays from the present but much more slowly later on: one extra day of delay matters a lot right now (1 candy now vs. 2 tomorrow) and matters very little far in the future (day 100 vs. day 101). The discount curve is therefore convex, steep near the present and nearly flat later, roughly like 1/t (hyperbolic discounting), which means immediate gratification is a very strong determinant of choice.
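To make the reversal concrete, a minimal Python sketch (not from the source) using an assumed hyperbolic discount function D(t) = 1/(1 + k*t) with an arbitrary rate k = 2; it reproduces the pattern Tversky and Kahneman describe: 1 now beats 2 tomorrow, but 2 at day 101 beats 1 at day 100.

# Candy-bar preference reversal under hyperbolic discounting (illustrative rate k = 2).
def discounted_value(amount, delay_days, k=2.0):
    """Subjective value of `amount` received after `delay_days`, hyperbolically discounted."""
    return amount / (1.0 + k * delay_days)

# Choice 1: 1 candy bar now vs. 2 candy bars tomorrow
print(discounted_value(1, 0), discounted_value(2, 1))      # 1.0 vs ~0.67  -> take 1 now

# Choice 2: 1 candy bar in 100 days vs. 2 candy bars in 101 days
print(discounted_value(1, 100), discounted_value(2, 101))  # ~0.0050 vs ~0.0098 -> wait for 2

# With exponential discounting (delta**t) the ratio between the two options is the same
# in both choices, so this preference reversal cannot occur.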

Goodwin, P., and Wright, G. Decision Analysis for Management Judgment. 3rd ed. Chichester: John Wiley and Sons, 2004.

 

Additional Resources:

Hammond, J. S., Keeney, R. L., & Raiffa, H. (2002). Smart Choices: A Practical Guide to Making Better Life Decisions. New York: Broadway Books.

Harvard Business Review on Decision Making (2001 paperback ed.). Boston, MA: Harvard Business School Publishing.

Jones, M. D. (1998). The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving. New York: Three Rivers Press.

=============================================

The logic of Risk taking:

G=gain from taking risk

p(G)= probability of gain

L=loss from risk taking

p(L)= probability of loss

W=wealth

G*p(G) - L*p(L) > W

Assume G = 10L and p(G) = 10*p(L). Then:

100*L*p(L) - L*p(L) > W

L*p(L) > W/99

In general, with G = k*L and p(G) = k*p(L):

(k^2 - 1)*L*p(L) > W,  i.e.  L*p(L) > W/(k^2 - 1)   (for k = 10 this gives the W/99 threshold above)

W=1000000

G=1100000

L=
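A minimal Python sketch of the threshold check; W and k follow the figures above, while p(L) and L are illustrative assumptions for this sketch, not values from the source:

# Risk-taking threshold sketch (assumed illustrative numbers for p_loss and L).
W = 1_000_000        # wealth
k = 10               # assume gain is k times the loss and p(gain) is k times p(loss)
p_loss = 0.05        # assumed probability of loss
L = 250_000          # assumed loss if the risk goes bad

G = k * L                 # 2,500,000 gain if it goes well
p_gain = k * p_loss       # 0.5

expected_net = G * p_gain - L * p_loss          # 1,250,000 - 12,500 = 1,237,500
threshold_ok = L * p_loss > W / (k**2 - 1)      # 12,500 > ~10,101

print(expected_net, expected_net > W)           # 1237500.0 True
print(threshold_ok)                             # True -> the risk clears the threshold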