Imagine you’re having a debate or discussion with someone, and you have a strong difference of opinion. The discussion ends with no agreement reached; both sides hold their positions. Later, perhaps because of further reflection or new information, you change your mind. You privately realize you were wrong and that the other person was right; you actually agree with them now. What happens then?
Previously, I have said:
As soon as you establish, in a social context, that you are advocating or defending a certain position, you become bound to it: being proven wrong as well as changing your opinion signals weakness, something we are evolutionarily programmed to avoid at all costs. That’s why you rarely see someone admitting being wrong or changing their mind in a debate, especially if there’s an audience.
We don’t want to lose face or status, not even in front of ourselves, and this may establish preferences over states of reality. If you have a preference as to how you’d like things to be, you can be pretty sure that your mind will distort things to match that preference.
So, besides being vigilant when detecting said preferences, what else can be done to counter the cognitive biases at work? Two things, which I’ll call conditioning and reversed reinforcement.
Conditioning means exposing yourself to a negative experience in order to develop tolerance and reduce its effects. When the negative effects of being wrong are reduced, that is, when we become accustomed to “losing face” both publicly and privately (i.e., in private introspection), the motivation to avoid that negative stimulus should be correspondingly reduced. This in turn will reduce the strength of the biases that distort reality to avoid the negative experience.
The other technique involves trying to convert a negative experience into a positive one. Instead of shamefully and half-heartedly admitting an error, do it openly. Displaying an open, honest attitude about something that is typically a sign of weakness is a very strong signal of strength. And this can result in positive reinforcement, both publicly and privately, thus countering the negative stimulus that is by default associated with being wrong. As above, a diminished negative effect may reduce the strength of the activated biases.
In summary, openly acknowledging being wrong, instead of fooling yourself to protect your status, may work towards reducing your resistance to changing your mind in the future. And there’s an added bonus: if people observe that you’re honest about your errors, they will automatically assign greater credibility to the positions you hold. So, next time you realize you’re wrong, instead of shutting up about it, come out and say it outright.
 Technically, systematic desensitization.
 And before somebody points it out: it really is very hard to go too far and reverse the bias, since the innate tendency to avoid signaling weakness is very strong.