I’m sure you’re familiar with Bayesian belief. It’s about the relationship between belief and certainty: supposedly you have prior beliefs that get updated once your degree of certainty passes a certain amount, or something like that. Something with probabilities, etc.
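(For the record, the textbook version, as best I can tell, is Bayes’ rule, which updates a prior belief on every new piece of evidence rather than waiting for certainty to cross some threshold:

$$ P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)} $$

where the prior $P(H)$ becomes the posterior $P(H \mid E)$ whenever evidence $E$ comes in.)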
I’ve never really understood it or accepted it. Even now, I begrudgingly accept that it must be true because it’s useful.
The reason I have trouble with it is that, for me, something “is” or “isn’t” so. The “is” or “isn’t” can change, but there’s no room for “belief” as far as I can tell.
I can sometimes quantify an amount of certainty (perhaps that’s a form of belief?), but it’s usually a case of missing certain types of evidence: the gaps keep me from being certain, so I put the thing into a holding bin of “more or less likely.”
Of course, most things really are in that zone. Anything certain can be questioned, all truths are subject to revision, etc.
Yet I’ve never had the problem a lot of people seem to have with “are you sure you’re sure?”: the one where someone can badger you into doubting yourself.
And the reason for that is: ultimately, I doubt everything to an unreasonable degree.
And this, oddly, allows me to be certain about “is and isn’t” things without falling into the “am I really sure that I’m sure?” trap. BECAUSE it’s arbitrary on the deepest level, I can function “as if it’s true” with full certainty.
“Of course this is so!” and I believe it 100% (socially). I can do that because my doubt is equally complete.
That is to say, “are you sure you’re sure?” doubting doesn’t affect me much, because my baseline doubt already runs deeper than most people’s. On top of that layer, I can be certain of whatever I’m certain about.