“The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

Daniel Kahneman
2002 Nobel Prize in Economics

From Scientific American, June 15, 2012