In 1979 came the accident at the Three Mile Island nuclear reactor. I read a description of how the operators refused to believe their instruments and took steps that made the situation worse. I could not understand why they did not believe the surprising yet plausible readings. Then I remembered something that had happened to me some years before. I recount that personal experience here:
I noticed that the gasoline gauge in my car was reading half full when I was certain the tank was full. I recall wondering how expensive it would be to get the gauge repaired and whether I could get along without it. I thought not. Two more days passed, and when by my reckoning the tank should have been about half empty, the car stopped.
I was indeed out of gas! The gauge had been working perfectly. I had been deluded as to the real amount of gas. Only then did I recollect that the last time I had put gas in the car I had been short of cash and only half filled the tank.
Early Volkswagens lacked a gas gauge, and I had been sure that I could not get along without one. Yet I was entirely unaware of my innate sense of how much gas I had. When that unconscious innate sense conflicted with what I rationally believed was my only source of information, the innate sense completely overrode the gauge reading. I was unaware of the conflict, only aware of the broken gauge. The thought that the gauge might be correct did not enter my mind, even briefly, until the car stopped.
This book introduces the terms ‘committed belief’ and ‘epistemic commitment’ to refer to beliefs held unconsciously, in contrast to beliefs entertained hypothetically. The story might be mildly amusing to others, and at this remove it is mildly amusing to me. Ideologies are essentially committed social beliefs. The believer may be vaguely aware that there are those who do not share the belief, but then there are always others who don’t know the truth.