Examples of Incidents Attributed to Human Error During Nights or Night Shifts

  • Three Mile Island nuclear reactor
  • Davis–Besse nuclear reactor at Oak Harbor, Ohio
  • Rancho Seco nuclear reactor near Sacramento, California
  • Chernobyl nuclear plant
  • Space shuttle Challenger accident
  • Launch of the space shuttle Columbia
  • Bhopal Union Carbide tragedy
  • Exxon Valdez accident
  • Estonia ferry accident
  • Peak incidence of single-vehicle motor accidents
  • 18% increase in human error incidents in afternoon shift relative to morning shift
  • 30% increase in human error incidents on night shift relative to morning shift

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1854972/

 

Decision Making

=-=-=-=-=-=-=-=-=-=-

Ladder of Inference

First described by organizational psychologist Chris Argyris and popularized by Peter Senge in The Fifth Discipline, the Ladder of Inference describes
the thinking process that we go through, usually without realizing it, to get from a fact to a decision or action.

https://asana.com/resources/ladder-of-inference

https://www.mindtools.com/aipz4vt/the-ladder-of-inference

https://thesystemsthinker.com/the-ladder-of-inference/

https://synergycommons.net/resources/the-ladder-of-inference/

=-=-=-=-=-=-=-=-=-=-

Daniel Kahneman: http://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow

Our brains are weak at remembering lists but good at remembering routes and agents, so to remember a list we should imagine traveling a route and placing the items of the list along it.

Two different ways the brain forms thoughts:

System 1: fast, automatic, frequent, emotional, stereotypical, subconscious. It works all the time.

System 2: slow, effortful, infrequent, logical, calculating, conscious. We feel that we are the authors of the thoughts of System 2,

but System 1 keeps whispering suggestions to System 2, and most of the time we are guided by those automatic suggestions of System 1.

That is fine most of the time, when we are relying on practiced skills, but relying on intuition to answer complex questions is very risky.

People need protection against their own mistakes and predators.

Most of what we do is just skill, and System 1 is fine for that; but System 1 produces simplified answers when it faces a situation for which it does not have a skilled answer.

If a problem arises in System 1 and we are not aware that System 1 lacks the skill, then System 2 will not be activated to correct it.

People make predictable errors; the only thing we can do is learn to recognize the situations in which we are likely to make an error.

Institutions can help keep us from relying too heavily on intuition.

If we lived in a society that understood the problems of judgement better, we would make better decisions (for example, by using regressive prediction).

People are more intelligent about the problems of other people.

Kahneman https://www.youtube.com/watch?feature=player_embedded&v=i_UVDD7ErJ4

Kahneman suggests: http://www.amazon.ca/Moonwalking-Einstein-Science-Remembering-Everything/dp/0143120530

Intuitive Prediction: Biases and Corrective Procedures (Kahneman & Tversky) – Defense Technical Information Center

http://www.spiegel.de/international/zeitgeist/interview-with-daniel-kahneman-on-the-pitfalls-of-intuition-and-memory-a-834407.html

http://www.lse.ac.uk/newsAndMedia/videoAndAudio/channels/publicLecturesAndEvents/player.aspx?id=1251

http://integral-options.blogspot.ca/2012/06/debunking-myth-of-intuition-daniel.html

http://examinedexistence.com/daniel-kahnemans-on-why-we-shouldnt-always-trust-our-gut-feelings/

http://www.inc.com/graham-winfrey/daniel-kahneman-on-why-entrepreneurs-shouldnt-trust-their-gut.html

=================================================================================

Factors that affect how people make choices

1. Situational Factors

2. Personality Traits

3. Group Dynamics – that affect group decision-making

Use Divergent (brainstorming)/Convergent thinking

Step 1 (divergent thinking): Brainstorm.
  • The more ideas, the better.
  • Build one idea upon another.
  • Wacky ideas are okay.
  • Don’t evaluate ideas.
Step 2: Discard unworkable ideas, then cluster and group similar ones.
Step 3 (convergent thinking): Select practical, promising ideas.

We should train ourselves to be ready at any moment, when analyzing a problem, to shift from our normal convergent mode into a divergent mode and back again, carrying new ideas back from the divergent mode.

Tell people when you are entering a divergent mode; otherwise they may get upset, or may mistake the brainstormed ideas for tangents or for serious proposals.

Jones, M. D. (1998). The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving (Revised ed.). New York: Crown Business.

 

==================================================

Fallacy

A fallacy is the use of invalid or otherwise faulty reasoning, or “wrong moves” in the construction of an argument. A fallacious argument may be deceptive by appearing to be better than it really is.

https://en.wikipedia.org/wiki/Fallacy

==================================================

Cognitive biases can lead to systematic deviations from standard rationality; however, standard rationality does not exist in real life.

Cognitive biases can lead to systematic deviations from good judgment; however, goodness is subjective.

They are often studied in psychology and behavioral economics.

Some are effects of information-processing rules (i.e. mental shortcuts), called heuristics, that the brain uses to produce decisions or judgments. Such effects are called cognitive biases.

Biases in judgment or decision-making can also result from motivation, such as when beliefs are distorted by wishful thinking.

Some biases have a variety of cognitive (“cold”) or motivational (“hot”) explanations. Both effects can be present at the same time.

http://en.wikipedia.org/wiki/List_of_cognitive_biases

 

Cognitive biases are thinking tendencies that may lead to systematic deviations from a standard of rationality or good judgment.

Selected examples from Wikipedia’s list:

Name Description Solution
Ambiguity effect The tendency to avoid options for which missing information makes the probability seem “unknown”.[8] Solution: apply the divergent/convergent brainstorming steps described above (Jones, 1998).
Anchoring or focalism The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information that we acquire on that subject)[9][10]  
Attentional bias The tendency of our perception to be affected by our recurring thoughts.[11]  
Automation bias The tendency to excessively depend on automated systems which can lead to erroneous automated information overriding correct decisions.[12]  
Availability heuristic The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.[13]  
Availability cascade A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).[14]  
Backfire effect When people react to disconfirming evidence by strengthening their beliefs.[15]  
Bandwagon effect The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.[16]  
Base rate fallacy or base rate neglect The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).[17] (A worked example follows this table.)
Belief bias An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.[18]  
Bias blind spot The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.[19]  
Cheerleader effect The tendency for people to appear more attractive in a group than in isolation.[20]  
Choice-supportive bias The tendency to remember one’s choices as better than they actually were.[21]  
Clustering illusion The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).[10]  
Confirmation bias The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.[22]  
Congruence bias The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.[10]  
Conjunction fallacy The tendency to assume that specific conditions are more probable than general ones.[23]  
Conservatism or regressive bias A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated.[24][25][26]
Conservatism (Bayesian) The tendency to revise one’s belief insufficiently when presented with new evidence.[24][27][28]  
Contrast effect The enhancement or reduction of a certain perception’s stimuli when compared with a recently observed, contrasting object.[29]  
Curse of knowledge When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.[30]  
Decoy effect Preferences for either option A or B change in favor of option B when option C is presented, which is similar to option B but in no way better.
Denomination effect The tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).[31]  
Distinction bias The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[32]  
Dunning-Kruger effect The tendency for unskilled individuals to overestimate their ability and the tendency for experts to underestimate their ability.[33]  
Duration neglect The neglect of the duration of an episode in determining its value  
Empathy gap The tendency to underestimate the influence or strength of feelings, in either oneself or others.  
Endowment effect The fact that people often demand much more to give up an object than they would be willing to pay to acquire it.[34]  
Essentialism Categorizing people and things according to their essential nature, in spite of variations.[35]
Evaluability bias The tendency to weight the importance of an attribute in proportion to its ease of evaluation[36]  
Exaggerated expectation Based on the estimates, real-world evidence turns out to be less extreme than our expectations (conditionally inverse of the conservatism bias).[24][37]
Experimenter’s or expectation bias The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[38]  
Focusing effect The tendency to place too much importance on one aspect of an event.[39]  
Forer effect or Barnum effect The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.  
Framing effect Drawing different conclusions from the same information, depending on how that information is presented.  
Frequency illusion The illusion in which a word, a name or other thing that has recently come to one’s attention suddenly seems to appear with improbable frequency shortly afterwards (not to be confused with the recency illusion or selection bias).[40] Colloquially, this illusion is known as the Baader-Meinhof Phenomenon.[41]  
Functional fixedness Limits a person to using an object only in the way it is traditionally used.  
Gambler’s fallacy The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”  
Halo effect If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.) Solutions:
  • Be aware: awareness is the first step towards overcoming errors in judgement.
  • Slow down: deliberately slow down your judgement and any subsequent decisions.
  • Be systematic: engage your analytical reasoning skills by taking a systematic approach.
Hard–easy effect Based on a specific level of task difficulty, the confidence in judgments is too conservative and not extreme enough[24][42][43][44]  
Hindsight bias Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable[45] at the time those events happened.  
Hostile media effect The tendency to see a media report as being biased, owing to one’s own strong partisan views.  
Hot-hand fallacy The “hot-hand fallacy” (also known as the “hot hand phenomenon” or “hot hand”) is the fallacious belief that a person who has experienced success has a greater chance of further success in additional attempts.  
Hyperbolic discounting Discounting is the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time – people make choices today that their future selves would prefer not to have made, despite using the same reasoning.[46] Also known as current moment bias, present-bias, and related to Dynamic inconsistency.  
Identifiable victim effect The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.[47]  
IKEA effect The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.  
Illusion of control The tendency to overestimate one’s degree of influence over other external events.[48]  
Illusion of validity Belief that additional acquired information generates additional relevant data for predictions, even when it evidently does not.[49]
Illusory correlation Inaccurately perceiving a relationship between two unrelated events.[50][51]  
Impact bias The tendency to overestimate the length or the intensity of the impact of future feeling states.[52]  
Information bias The tendency to seek information even when it cannot affect action.[53]  
Insensitivity to sample size The tendency to expect too little variation in small samples.

Irrational escalation or sunk cost fallacy The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.
Less-is-better effect The tendency to prefer a smaller set to a larger set judged separately, but not jointly  
Loss aversion “the disutility of giving up an object is greater than the utility associated with acquiring it”.[54] (see also Sunk cost effects and endowment effect).  
Mere exposure effect The tendency to express undue liking for things merely because of familiarity with them.[55]  
Money illusion The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.[56]  
Moral credential effect The tendency of a track record of non-prejudice to increase subsequent prejudice.  
Negativity effect The tendency, when evaluating the causes of the behaviors of a person one dislikes, to attribute their positive behaviors to the environment and their negative behaviors to the person’s inherent nature.
Negativity bias Humans have a greater recall of unpleasant memories compared with positive memories.[57] As a result, “humans are compulsively negative”: “Negative thoughts and reactions—reasons we don’t like something—come to mind one after another almost magically. We don’t consciously orchestrate their generation. They just happen spontaneously,” especially when evaluating the merit of something new or unconventional. Solution: the Pros-Cons-and-Fixes technique “compensates for negative thinking by forcing us to identify the positives first. Only then are we allowed to indulge joyously in negatives. But the technique goes a step further by examining the negatives and trying to think of actions that could be taken to ‘fix’ them, either converting them into positives or, if that isn’t feasible, eliminating them altogether. Those negatives (cons) that can’t be ‘fixed’ represent the price one must pay, the burden one must bear, if the thing being evaluated were to be adopted or accepted” (Jones, 1998, p. 53).
Neglect of probability The tendency to completely disregard probability when making a decision under uncertainty.[58]  
Normalcy bias The refusal to plan for, or react to, a disaster which has never happened before.  
Not invented here Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to IKEA effect.  
Observer-expectancy effect When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).  
Omission bias The tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).[59]  
Optimism bias The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).[60][61]  
Ostrich effect Ignoring an obvious (negative) situation.  
Outcome bias The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.  
Overconfidence effect Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.[24][62][63][64]  
Pareidolia A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.  
Pessimism bias The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.  
Planning fallacy The tendency to underestimate task-completion times.[52]  
Post-purchase rationalization The tendency to persuade oneself through rational argument that a purchase was a good value.  
Pro-innovation bias The tendency to have an excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses.  
Pseudocertainty effect The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.[65]  
Reactance The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice (see also Reverse psychology).  
Reactive devaluation Devaluing proposals only because they purportedly originated with an adversary.  
Recency illusion The illusion that a word or language usage is a recent innovation when it is in fact long-established (see also frequency illusion).  
Restraint bias The tendency to overestimate one’s ability to show restraint in the face of temptation.  
Rhyme as reason effect Rhyming statements are perceived as more truthful. A famous example is the defense’s use of the phrase “If the gloves don’t fit, then you must acquit” in the O.J. Simpson trial.
Risk compensation / Peltzman effect The tendency to take greater risks when perceived safety increases.  
Selective perception The tendency for expectations to affect perception.  
Semmelweis reflex The tendency to reject new evidence that contradicts a paradigm.[28]  
Social comparison bias The tendency, when making hiring decisions, to favour potential candidates who don’t compete with one’s own particular strengths.[66]  
Social desirability bias The tendency to over-report socially desirable characteristics or behaviours in one self and under-report socially undesirable characteristics or behaviours.[67]  
Status quo bias The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[68][69]  
Stereotyping Expecting a member of a group to have certain characteristics without having actual information about that individual.  
Subadditivity effect The tendency to judge probability of the whole to be less than the probabilities of the parts.[70]  
Subjective validation Perception that something is true if a subject’s belief demands it to be true. Also assigns perceived connections between coincidences.  
Survivorship bias Concentrating on the people or things that “survived” some process and inadvertently overlooking those that didn’t because of their lack of visibility.  
Time-saving bias Underestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed and overestimations of the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed.  
Unit bias The tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular.[71]  
Well travelled road effect Underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.  
Zero-risk bias Preference for reducing a small risk to zero over a greater reduction in a larger risk.  
Zero-sum heuristic Intuitively judging a situation to be zero-sum (i.e., that gains and losses are correlated). Derives from the zero-sum game in game theory, where wins and losses sum to zero.[72][73] The frequency with which this bias occurs may be related to the social dominance orientation personality factor.  
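To make the base rate fallacy row above concrete (as flagged in that row), here is a minimal worked example using Bayes’ rule. All of the numbers (a 1% base rate, a 90% hit rate, a 9% false-positive rate) are illustrative assumptions, not figures from the sources above.

    # Worked example of base rate neglect, using Bayes' rule.
    # All numbers below are illustrative assumptions.

    base_rate = 0.01   # P(condition): the generic information people tend to ignore
    hit_rate = 0.90    # P(positive test | condition)
    false_pos = 0.09   # P(positive test | no condition)

    # Total probability of a positive test, across both groups.
    p_positive = hit_rate * base_rate + false_pos * (1 - base_rate)

    # Bayes' rule: P(condition | positive test).
    p_condition_given_positive = (hit_rate * base_rate) / p_positive

    print(f"P(condition | positive) = {p_condition_given_positive:.1%}")  # ~9.2%

Intuition anchors on the 90% hit rate, but the correct posterior is only about 9%, because the 1% base rate dominates.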

 

50 Types of Common Cognitive Biases

  1. Fundamental Attribution Error: We judge others on their personality or fundamental character, but we judge ourselves on the situation.
  2. Self-Serving Bias: Our failures are situational, but our successes are our responsibility.
  3. In-Group Favoritism: We favor people who are in our in-group as opposed to an out-group.
  4. Bandwagon Effect: Ideas, fads, and beliefs grow as more people adopt them.
  5. Groupthink: Due to a desire for conformity and harmony in the group, we make irrational decisions, often to minimize conflict.
  6. Halo Effect: If you see a person as having a positive trait, that positive impression will spill over into their other traits. (This also works for negative traits.)
  7. Moral Luck: Better moral standing happens due to a positive outcome; worse moral standing happens due to a negative outcome.
  8. False Consensus: We believe more people agree with us than is actually the case.
  9. Curse of Knowledge: Once we know something, we assume everyone else knows it, too.
  10. Spotlight Effect: We overestimate how much people are paying attention to our behavior and appearance.
  11. Availability Heuristic: We rely on immediate examples that come to mind while making judgments.
  12. Defensive Attribution: As a witness who secretly fears being vulnerable to a serious mishap, we will blame the victim less if we relate to the victim.
  13. Just-World Hypothesis: We tend to believe the world is just; therefore, we assume acts of injustice are deserved.
  14. Naïve Realism: We believe that we observe objective reality and that other people are irrational, uninformed, or biased.
  15. Naïve Cynicism: We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do in their intentions/actions.
  16. Forer Effect (aka Barnum Effect): We easily attribute our personalities to vague statements, even if they can apply to a wide range of people.
  17. Dunning-Kruger Effect: The less you know, the more confident you are. The more you know, the less confident you are.
  18. Anchoring: We rely heavily on the first piece of information introduced when making decisions.
  19. Automation Bias: We rely too heavily on automated systems, sometimes letting automated suggestions override decisions that were actually correct.
  20. Google Effect (aka Digital Amnesia): We tend to forget information that’s easily looked up in search engines.
  21. Reactance: We do the opposite of what we’re told, especially when we perceive threats to personal freedoms.
  22. Confirmation Bias: We tend to find and remember information that confirms our perceptions.
  23. Backfire Effect: Disproving evidence sometimes has the unwarranted effect of confirming our beliefs.
  24. Third-Person Effect: We believe that others are more affected by mass media consumption than we ourselves are.
  25. Belief Bias: We judge an argument’s strength not by how strongly it supports the conclusion but how plausible the conclusion is in our own minds.
  26. Availability Cascade: Tied to our need for social acceptance, collective beliefs gain more plausibility through public repetition.
  27. Declinism: We tend to romanticize the past and view the future negatively, believing that societies/institutions are by and large in decline.
  28. Status Quo Bias: We tend to prefer things to stay the same; changes from the baseline are considered to be a loss.
  29. Sunk Cost Fallacy (aka Escalation of Commitment): We invest more in things that have cost us something rather than altering our investments, even if we face negative outcomes.
  30. Gambler’s Fallacy: We think future possibilities are affected by past events.
  31. Zero-Risk Bias: We prefer to reduce small risks to zero, even if we can reduce more risk overall with another option.
  32. Framing Effect: We often draw different conclusions from the same information depending on how it’s presented.
  33. Stereotyping: We adopt generalized beliefs that members of a group will have certain characteristics, despite not having information about the individual.
  34. Outgroup Homogeneity Bias: We perceive out-group members as homogeneous and our own in-groups as more diverse.
  35. Authority Bias: We trust and are more often influenced by the opinions of authority figures.
  36. Placebo Effect: If we believe a treatment will work, it often will have a small physiological effect.
  37. Survivorship Bias: We tend to focus on those things that survived a process and overlook ones that failed.
  38. Tachypsychia: Our perceptions of time shift depending on trauma, drug use, and physical exertion.
  39. Law of Triviality (aka “Bike-Shedding”): We give disproportionate weight to trivial issues, often while avoiding more complex issues.
  40. Zeigarnik Effect: We remember incomplete tasks more than completed ones.
  41. IKEA Effect: We place higher value on things we partially created ourselves.
  42. Ben Franklin Effect: We like doing favors; we are more likely to do another favor for someone if we’ve already done a favor for them than if we had received a favor from that person.
  43. Bystander Effect: The more other people are around, the less likely we are to help a victim.
  44. Suggestibility: We, especially children, sometimes mistake ideas suggested by a questioner for memories.
  45. False Memory: We mistake imagination for real memories.
  46. Cryptomnesia: We mistake real memories for imagination.
  47. Clustering Illusion: We find patterns and “clusters” in random data.
  48. Pessimism Bias: We sometimes overestimate the likelihood of bad outcomes.
  49. Optimism Bias: We sometimes are over-optimistic about good outcomes.
  50. Blind Spot Bias: We don’t think we have bias, and we see it in others more than in ourselves.

 

https://thedecisionlab.com/biases

https://www.titlemax.com/discovery-center/lifestyle/50-cognitive-biases-to-be-aware-of-so-you-can-be-the-very-best-version-of-you/

 

 


=======================================

Daniel Kahneman and Amos Tversky

http://www.princeton.edu/~kahneman/docs/Publications/prospect_theory.pdf

==================================================================

Experiments by Tversky and Kahneman showed that the same people who would choose 1 candy bar now over 2 candy bars tomorrow would choose 2 candy bars 101 days from now over 1 candy bar 100 days from now.

Amir: This means that the value we assign to a reward drops rapidly for delays near the present but much more slowly for delays far in the future: the same one-day delay outweighs the doubled reward when it starts today, and costs almost nothing when it starts 100 days from now. The discount curve is therefore convex, falling roughly like 1/t, which is why immediate gratification is so decisive.
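A minimal sketch of that reversal, assuming a hyperbolic discount function V(t) = A/(1 + k*t); the functional form and the rate k = 1.5 per day are illustrative assumptions, not values from the experiments.

    # Sketch of the candy-bar preference reversal under hyperbolic
    # discounting. V(t) = A / (1 + k*t) and k = 1.5/day are assumptions.

    def present_value(amount, delay_days, k=1.5):
        # Steep discounting near t = 0, nearly flat far in the future.
        return amount / (1 + k * delay_days)

    # Today: 1 candy bar now beats 2 candy bars tomorrow.
    print(present_value(1, 0), present_value(2, 1))      # 1.000 vs 0.800

    # The same one-day gap, pushed 100 days out: now 2 bars win.
    print(present_value(1, 100), present_value(2, 101))  # ~0.0066 vs ~0.0131

An exponential discounter (a constant rate per day) can never reverse like this: the ratio of the two values depends only on the one-day gap, not on when that gap occurs.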

Goodwin, P., and G. Wright. Decision Analysis for Management Judgment. 3rd ed. Chichester: John Wiley and Sons, 2004.

 

Additional Resources:

Hammond, J., Keeney, R., & Raiffa, H. (2002). Smart Choices: A Practical Guide to Making Better Life Decisions. New York: Broadway Books.

Harvard Business Review on Decision Making (2001, paperback ed.). Boston, MA: Harvard Business School Publishing.

Jones, M. D. (1998). The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving. New York: Three Rivers Press.

=============================================

The logic of risk taking:

G=gain from taking risk

p(G)= probability of gain

L=loss from risk taking

p(L)= probability of loss

W=wealth

Take the risk only when the expected net gain exceeds your wealth:

G*p(G) - L*p(L) > W

Assume G = 10L and p(G) = 10*p(L). Then:

100*L*p(L) - L*p(L) > W

L*p(L) > W/99

More generally, with G = k*L and p(G) = k*p(L):

(k^2 - 1)*L*p(L) > W, so L*p(L) > W/(k^2 - 1)

W = 1000000

G = 1100000

L =
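A minimal numeric sketch of this rule. The decision inequality and the G = k*L, p(G) = k*p(L) structure come from the notes above; the specific loss and probability figures below are illustrative assumptions chosen to be consistent with W = 1,000,000, G = 1,100,000, and k = 10.

    # Sketch of the risk-taking rule above. The inequality and the
    # G = k*L, p(G) = k*p(L) structure come from the notes; the loss
    # and probability values below are illustrative assumptions.

    def worth_taking(G, pG, L, pL, W):
        # Decision rule from the notes: expected net gain must exceed wealth.
        return G * pG - L * pL > W

    def loss_threshold(k, W):
        # With G = k*L and p(G) = k*p(L), the rule reduces to
        # L * p(L) > W / (k**2 - 1).
        return W / (k**2 - 1)

    W, k = 1_000_000, 10
    print(loss_threshold(k, W))           # L*p(L) must exceed ~10101.01

    # One concrete combination consistent with k = 10 and G = 1,100,000:
    G, pG = 1_100_000, 0.90
    L, pL = 110_000, 0.09
    # Here L*p(L) = 9,900 < 10,101.01, so the rule narrowly says no:
    print(worth_taking(G, pG, L, pL, W))  # False: 980,100 < 1,000,000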

 

 

 

=========================

Utility functions are ordinal.

Think of an outcome as a set of measurable attributes.

An outcome is the result of your decision or action together with the state of the world.
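A minimal sketch of both points, with actions, states, and payoff numbers that are illustrative assumptions of my own: an outcome is indexed by (action, state), and because utility is ordinal, any strictly increasing rescaling of the utility numbers leaves the choice unchanged.

    # Sketch: outcomes depend on (action, state); ordinal utility means
    # only the ranking matters. Actions, states, and numbers are
    # illustrative assumptions.

    outcomes = {  # (action, state) -> outcome, here a money amount
        ("umbrella", "rain"): 80, ("umbrella", "sun"): 60,
        ("no umbrella", "rain"): 10, ("no umbrella", "sun"): 100,
    }
    actions = {a for a, _ in outcomes}

    def best_action(state, utility):
        return max(actions, key=lambda a: utility(outcomes[(a, state)]))

    u = lambda x: x               # one utility scale
    v = lambda x: (x / 10) ** 3   # a strictly increasing rescaling

    for state in ("rain", "sun"):
        assert best_action(state, u) == best_action(state, v)
        print(state, "->", best_action(state, u))

Under certainty, any increasing transformation of utility preserves the choice; comparing risky prospects by expected utility, however, requires more than an ordinal scale.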

http://faculty.econ.ucdavis.edu/faculty/bonanno/Youtube%20Recorded%20Lectures%20Decision%20Making.html

http://faculty.econ.ucdavis.edu/faculty/bonanno/Youtube%20Recorded%20Lectures%20Uncertainty%20Risk%20and%20Information.html

http://faculty.econ.ucdavis.edu/faculty/bonanno/Youtube%20Recorded%20Lectures%20Game%20Theory.html

 

 

 

conformity

happiness

laziness
exploitation
reaction
conflicting demands

insanity
submission
union/integrity
purposefulness

love/hate

reason

intelligence

rationalization
money
family

https://en.wikipedia.org/wiki/Communitarianism

=-=-=-=-=-=-=-=-=-

This study investigates people’s implicit stereotype of the social group of the rich in terms of competence and warmth.

https://journals.sagepub.com/doi/10.1017/prp.2017.8

The Complementary Stereotypes about the Rich and the Poor: A Study in China

https://www.scirp.org/journal/paperinformation.aspx?paperid=72269

Does Perceiving the Poor as Warm and the Rich as Cold Enhance Perceived Social Justice? The Effects of Activating Compensatory Stereotypes on Justice Perception

https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01361/full

Social Class Competence Stereotypes Are Amplified by Socially Signaled Economic Inequality

https://journals.sagepub.com/doi/10.1177/0146167220916640
