Call this a hypothesis based upon observation. It's harder for smart people to admit they are incorrect about something than it might be for the population as a whole. My rationale works like this: the smarter you are, the more defensive you are of that 'status', if you will, and so you tend to act in ways that reinforce prior decisions, regardless of their actual (quantifiable) correctness. That is, you are more afraid of the consequences of admitting to being wrong than of actually being wrong.
I know I am not the only one to have noticed this. There are pretty good essays out there on this phenomenon.
I've been wrong, sometimes badly so, on quite a number of things. Admitting this doesn't make me dumb, or "less smart". On the contrary, my ability to self-analyze and self-correct allows me to learn, hopefully more quickly than others I compete with.
There is also a possible linkage with the Dunning–Kruger effect, whereby people who may be convinced of their own innate wisdom and gifts are probably, to put it mildly, fooling themselves. That is, they are defending competence which might not really exist in great quantity.
One of the corollaries to this is that smart people may be more gullible. They may be more susceptible to being taken in, as they may be more sure of themselves and of their belief that they rarely, if ever, make incorrect decisions.
Another corollary to this is that they may be prone to manipulation. A manipulator need only reinforce an earlier decision, or frame a subsequent decision to appeal to some aspect of their assumed intelligence.
Worse still, such people may not necessarily act in their own best self-interest, as the cost of admitting "wrongness" may far exceed their willingness to choose a new path based upon new information.
I'm going to muse on this some more over time. The context for this spans a very wide range of things, and I've been noticing these common threads for years. I've caught myself in the past arguing a point more for the sake of arguing than to defend the correctness of the position. This has caused significant introspection, and has made me more cautious and more skeptical. But at the same time, it's also made me better at examining what I need to examine.
I think this is also related to another concept I discussed in the past, where people will try to assign "blame" for a problem to the person or group most likely to be able to do something to solve it, rather than to the actual source of the problem.
Ok, this is probably not at a full-fledged hypothesis stage yet. Working on it. In the background. In my copious spare time (that is a joke).