Talk:Hindsight bias: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 15:41, 5 April 2014

WikiProject Psychology (Start-class, Mid-importance)
This article is within the scope of WikiProject Psychology, a collaborative effort to improve the coverage of Psychology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Mid-importance on the project's importance scale.
WikiProject Philosophy: Logic (Start-class, Low-importance)
This article is within the scope of WikiProject Philosophy, a collaborative effort to improve the coverage of content related to philosophy on Wikipedia. If you would like to support the project, please visit the project page, where you can get more details on how you can help, and where you can join the general discussion about philosophy content on Wikipedia.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Low-importance on the project's importance scale.
Associated task forces: Logic

September 2006

This is my first serious attempt at growing the knowledge contained in Wikipedia. I've tried my best to conform to the formatting and content standards of this sphere, but if you see any problems with my work, I would appreciate feedback. Thanks. My edit covered implications and classic studies from the Myers textbook.

(Contributed by User:Irimi)

I have a problem with the section that lists phrases and describes them as "illustrative of this fallacy":

Phrases: The following common phrases are illustrative of this fallacy: "With the wisdom of hindsight"; "Retrospective foresight"; "Hindsight is 20/20"; "Hindsight is a wonderful thing".

I don't think the common use of these phrases is so much an illustration (or example) of this fallacy as something slightly different, which I might almost call a joke (I don't know what to call it).

When someone (or I) says something like "hindsight is 20/20", I'm not trying to say (or imply) that someone has a false recollection that their prediction of the outcome was correct, but instead saying that, now that the result is known, anybody could "predict" it (ex post facto).

At least I don't think so. I haven't changed anything on the page because I'm not 100% sure of myself or of how I might change that section if I were 100% sure.

209.60.102.231 12:56, 1 September 2006 (UTC)[reply]

Classic studies

What exactly does either of the "classic studies" cited have to do with hindsight bias? I haven't removed them because I'm open to the possibility that I'm missing something, but they both seem to be about a totally unrelated subject. 81.86.133.45 23:15, 7 February 2007 (UTC)[reply]

If they seem to be unrelated, they probably are; this is Wikipedia. Classic studies should refer to seminal articles. These should include the first empirical tests of the hindsight bias done by Fischhoff, B. (1975); one should also discuss the two meta-analyses and the single narrative review that have been done on the hindsight bias. —Preceding unsigned comment added by 142.66.58.192 (talk) 20:26, 17 November 2008 (UTC)[reply]

I agree that they seem to be irrelevant to this article. MartinPoulter (talk) 09:25, 18 November 2008 (UTC)[reply]

Can anyone get an Oxford English Dictionary-based citation for the pronunciation comment? In addition, I think the pronunciation guide more properly belongs in the introduction. —Preceding unsigned comment added by 128.194.74.31 (talk) 04:54, 19 February 2010 (UTC)[reply]

Examples

Could we please have some examples on this page? Without them it's quite hard to work out what the article is talking about. 58.165.109.255 (talk) 12:37, 21 September 2008 (UTC)[reply]

How is Pavlov's dog exhibiting hindsight bias...? That's just not what that bias is about. I haven't looked up the citation given, but it doesn't seem initially plausible that such an encyclopedia would make that claim. - 79.92.46.10 (talk) 12:27, 22 April 2011 (UTC)[reply]

Popular Culture

I think the section on popular culture should be dropped. Most of the examples provided therein are either inaccurate (e.g., they refer to overconfidence or simple egoism) or they are colloquial and improperly cited (e.g., weather and sports examples).

I'm not sure if the popular culture section was added to address the 'examples' issue noted above, but for a psychological phenomenon which is related to but distinct from other phenomena, precision is important. I believe it should be relatively easy to pull examples from actual papers (e.g., medical malpractice). 128.237.247.173 (talk) 19:07, 7 June 2011 (UTC)[reply]

I was thinking the same thing as I skimmed through this. Removed. —tktktk 01:01, 8 June 2011 (UTC)[reply]

Elimination

... Researchers attempt to decrease the bias in participants has failed, leading one to think that hindsight bias has an automatic source in cognitive reconstruction. This supports the Causal Model Theory and the use of sense-making to understand event outcomes. ...

I'm confused by the above statement. Can someone help to simplify and clarify it? I would rewrite it as follows, but I'm not sure if that's what is being said. I couldn't access the reference, so I don't know the original statement by Blank, H., & Nestler, S. (2007).

... Attempt by researchers to decrease hindsight bias in participants has failed. This failure led some to think that hindsight bias has an automatic source in cognitive reconstruction. And this supports the Causal Model Theory and the use of sense-making to understand event outcomes. ...

Abelmebratu (talk) 21:53, 14 March 2012 (UTC)[reply]

Wrong article

Hindsight and hindsight bias/knew-it-all-along effect are not the same thing. Hindsight is merely the "perception of the significance and nature of events after they have occurred". It does not necessarily mean you think you predicted it when in fact you didn't. The bias part of the article should be split off, or made just a subsection. Malick78 (talk) 10:14, 8 September 2012 (UTC)[reply]

The article is about the hindsight bias. I've moved it back to where it should be. --JorisvS (talk) 19:14, 10 September 2012 (UTC)[reply]

Editing for class project

My name is Victoria, and over the next few weeks I plan on editing the hindsight bias article through a class at Shenandoah University. I plan on including information from the following two sources: [1] [2] — Preceding unsigned comment added by Vmathews102 (talkcontribs)

Welcome! - David Gerard (talk) 22:56, 22 February 2013 (UTC)[reply]

Bare reference added, not after anything

A new editor just added (in ref tags, not attached to any statement):

Harley, E. M. (2007). Hindsight bias in legal decision making. Social Cognition, 25(1), 48-63.

Does anyone have this to hand, as a possible source for new content? - David Gerard (talk) 22:27, 30 November 2013 (UTC)[reply]

Social Cognition Project

The following edits are being made by Miami University students for a Social Cognition assignment.

1. There is a lack of information about the sense-making process when explaining how surprise affects hindsight bias, and about Pezzo's sense-making model. This model integrates two supported but contradictory ideas: a surprising outcome can either produce a lesser or possibly reversed hindsight bias (where the individual believes that the outcome wasn't a possibility at all), or it can lead to the hindsight bias being even stronger than before. "Initial surprise is necessary to trigger the sense-making process but if the sense making is not successful, surprise should prevail and -- as a consequence -- hindsight bias should be attenuated." This failure of the sense-making process is what creates a reversed hindsight bias. Insight problems are expected to create a stronger hindsight bias because even though the individual is initially surprised, the solution to the insight problem makes sense after it has been seen.

2. My article is already cited in the Wikipedia article, but it can be expanded on. They only spend one line explaining the article and how to eliminate hindsight bias. Research by Arkes, Faust, Guilmette, and Hart shows that hindsight bias can be decreased by having participants think about a reason that the alternate hypothesis or hypotheses could be correct. This does not eliminate hindsight bias, but it significantly decreases it. One explanation for this effect is that the participants start to doubt the correct hypothesis and report that they wouldn't have chosen it as often.

3. My article was not cited in the wiki page, but I feel it could be referenced. In my article, they looked at the magnitude of the bias in relation to memory distortion. They wanted to see whether there was a relationship between the amount of time that they gave the participants to respond and their level of bias. The results of Calvillo (2013) demonstrate that response time when recollecting foresight judgments is related to hindsight bias in a memory design. In both of the experiments participants made judgments, completed unrelated tasks, and were then given the correct answer to some of the previous questions. They were then asked to recall their original judgments. Half of the participants were asked to recall their original judgments quickly, while the other half was given time to process and respond with their original judgments. The hindsight bias index was greater among rapidly responding participants than among delayed responding participants. The wiki page makes no mention of the magnitude of bias being affected by the amount of time given to process their judgments.

Cprazete (talk) 00:22, 11 March 2014 (UTC)[reply]

Definition in lede needs revision

The lede defines hindsight bias as follows:

Hindsight bias . . . is the inclination to see events that have already occurred as being more predictable than they were before they took place.

By that definition, a person exhibiting hindsight bias as to event A would affirm the following statement:

Before event A occurred, it was not as predictable as it is now. Now that it has occurred, it is more predictable.

That of course is nonsense, and is not an expression of hindsight bias. As I understand the concept, a person exhibiting hindsight bias will affirm the following statement after event A has occurred:

I could have predicted that A would happen;

whereas, in fact, before the occurrence of A the person had no basis on which to predict whether or not A would happen.

For example, on a certain day, when Mary has no way of knowing what is about to happen to John, John gets hit by a bus. Upon learning of the event, Mary feels certain that she might have predicted, almost predicted, or did indeed predict, John's accident. She does not believe that John's accident is more predictable now that it's happened than it was before it happened, as the definition currently given suggests.

Somebody familiar with the literature on hindsight bias should revise the lede. J. D. Crutchfield | Talk 23:32, 31 March 2014 (UTC)[reply]

I've taken a crack at it. The problem is not so much familiarity with the topic (you also know what it is) as how to word that accurately and concisely into a definition. I hope it is now better. --JorisvS (talk) 08:03, 1 April 2014 (UTC)[reply]
Better, but still not quite there. With JorisvS's permission, I'll substitute the following:
. . . the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it, prior to its occurrence.
J. D. Crutchfield | Talk 16:08, 1 April 2014 (UTC)[reply]
But it is more of a gradation. For example, if you present people with the solution to a problem, they will overestimate the likelihood they would have solved it. --JorisvS (talk) 18:17, 1 April 2014 (UTC)[reply]
Aha. That's why I wanted somebody familiar with the literature to tackle it! ;0) Would it suffice to change it to "to see the event as having been more or less predictable . . . ."? Or is the difficulty that the bias applies not only to events but to other phenomena, such as the solution to a problem? Arguably the solution to a problem is an event from the perspective of the person it's presented to--i.e., the discovery of the solution. But to solve a problem is not the same thing as to predict that it will be solved, or to predict what the solution will be. Are we sure that we're talking about the same phenomenon? Are "I could have predicted that event," and "I could have solved that problem," really examples of the same bias? Maybe the problem with the definition given in the lede isn't merely logical but also substantive. Surely somewhere in the literature there is a concise and logical definition of hindsight bias! J. D. Crutchfield | Talk 16:37, 3 April 2014 (UTC)[reply]
That's an interesting question. It may not have been definitively answered by psychologists. There could be a common basis for both estimates: one makes an estimate knowing the outcome/solution, which is basically an anchoring effect, and people are well known to adjust their estimates insufficiently. --JorisvS (talk) 18:09, 3 April 2014 (UTC)[reply]
Is this any better?
Hindsight bias is the inclination to regard a known datum as having been more or less knowable or predictable before it actually became known, even though there was little or no objective basis for knowing or predicting it beforehand. For instance, when presented with the solution to a puzzle, Smith says, "I could have solved that." Similarly, upon the occurrence of an event, Jones says, "I had a feeling that was going to happen." In both cases, Smith and Jones may be exhibiting hindsight bias.
That avoids the question I raised earlier, by referring to knowledge or prediction of a datum, rather than just prediction of an event; but it is rather complex and lengthy for a definition. And of course I may not quite understand the concept. J. D. Crutchfield | Talk 15:13, 4 April 2014 (UTC)[reply]
Let's replace "datum" with "fact" (datum = a fact known from direct observation, so "known datum" is a bit a tautology). Let's also add "derivable", because that's better for the puzzle-solution part. And by making the examples also a bit more general we then get:
Hindsight bias is the inclination to regard a known fact as having been more or less knowable, predictable, or derivable before it actually became known, even though there was little or no objective basis for knowing or predicting it beforehand. For example, when presented with the solution to a puzzle, people overestimate the likelihood they would have solved it. Similarly, upon the occurrence of an event people often feel it was more predictable than it really was.
I'm not yet happy with "even though there was little or no objective basis for knowing or predicting it beforehand" because that still neglects the puzzle-solution thing. --JorisvS (talk) 15:25, 4 April 2014 (UTC)[reply]
Hey, that was quick, Joris!
I like that, although I might suggest that derivable is comprised within knowable, and I'm not sure every solution to a problem is derived--but those are minor quibbles. I wasn't aware that datum implied known--I was trying to avoid the implication of done in fact, but I readily concede the point. In the first example, I would insert "may" before "overestimate" (since not everybody exhibits hindsight bias); and in the second I'd put a comma after "event".
As for the objective basis part, if we allow knowing to include deriving, does that take care of puzzle-solving? We talk about knowing the answer to a puzzle or problem, after all. If I say, "I could have solved that problem," what kind of evidence might I point to, to show that I am not exhibiting hindsight bias? For instance, if I were a mathematician and the problem was a simple equation in algebra, I'd have had an objective basis, before I was given the answer, for thinking that I could solve the problem--i.e. that I could come to know the answer--without help. Is that good enough? J. D. Crutchfield | Talk 16:12, 4 April 2014 (UTC)[reply]
Maybe technically 'knowing' may include 'deriving' here (I already thought that was how you intended it), but I think it is best to ignore that, so that it is logical to many more people, even if they are not reading it very carefully and thinking it through (which would be the great majority). I'd add something like "generally", because these are averages, but "may" implies more of a probability. It is actually as much (or more) about knowing how to get to the answer as it is about knowing the answer itself. We'd now get:
Hindsight bias is the inclination to regard a known fact as having been more or less knowable, predictable, or derivable before it actually became known, even though there was little or no objective basis for knowing or predicting it beforehand. For example, when presented with the solution to a puzzle, people generally overestimate the likelihood they would have solved it. Similarly, upon the occurrence of an event, people often feel it was more predictable than it really was.
If you are someone who has a lot of experience in coming to an answer (which is beforehand), then there is no bias involved. The point is the overestimation of one's likelihood of solving; if one is 100% certain to find an answer already, then one cannot overestimate that! --JorisvS (talk) 16:58, 4 April 2014 (UTC)[reply]
To me (and this happens to be an area in which I can claim some expertise), generally implies a much stronger probability than may, which expresses only possibility. To say that X is generally true suggests that X will usually or virtually always be found to be true--that one can generalize by saying "X is true". But to say that X may be true leaves open the broad possibility that, at least in many cases, X will turn out not to be true. The latter more accurately applies to hindsight bias, doesn't it? Can one generalize by saying, "People think they could have solved a problem, once they know the answer"? I don't think so, so I'd keep "may" rather than "generally". J. D. Crutchfield | Talk 17:42, 4 April 2014 (UTC)[reply]
It's more like most people will show hindsight bias to some degree, and some more strongly than others. What about "tend to" instead? --JorisvS (talk) 15:41, 5 April 2014 (UTC)[reply]
  1. ^ Nestler, Steffen; Egloff, Boris; Küfner, Albrecht C. P.; Back, Mitja D. "An integrative lens model approach to bias and accuracy in human inferences: Hindsight effects and knowledge updating in personality judgments." Journal of Personality and Social Psychology. Vol. 103 (4). October 2012. Retrieved on 2013-02-21.
  2. ^ Bernstein, Daniel M.; Wilson, Alexander Maurice; Pernat, Nicole L. M.; Meilleur, Louise R. "Auditory hindsight bias." Psychonomic Bulletin & Review. Vol. 19 (4). August 2012. Retrieved on 2013-02-21.