Confessions of a helpless epistemon

[Content note: Staying in the realm of useless abstractness and away from concrete examples.  This is the longest I’ve ever had a substantial unfinished draft lying around — around four months! — which is a record I hope will never be broken on this blog.  Hopefully the tone and focus haven’t been too badly diluted by this gap.]

I was so pleased when I first encountered the term “epistemic helplessness”, because it described my own mental state so much of the time.

Feeling misunderstood is a hallmark of the emotional experience of the typical teenager, but probably most humans at any stage of life experience this emotion regarding some aspect of who they are.  In my case, I oftentimes think that what routinely keeps other people from relating to me comes down to a state of epistemic helplessness which is fairly dominant in my perception of the outside world.

But before I get into the misunderstandings which I attribute to this uncomfortable epistemological state, I should first try to convey what “epistemic helplessness” means when I apply the description to myself, or at least what it feels like from the inside.

To put it rather vaguely, I feel that I live in a world of uncertainty.  Now I’m pretty sure* that everyone feels that way to some degree.  But the longer I’ve lived, the stronger an impression I’ve gotten that, relative to most of the people around me, I’m quite uncertain of empirical facts.  Not just simple statements about the physical world, but also (and especially!) assessments of complex constructs and situations involving many humans, such as political events.  On any given topic, there are about a hundred other (pretty easily visible) people, clearly far more knowledgeable in that arena than I’ll ever be, who are arguing from multiple sides of it.

And when I say “uncertainty”, I don’t mean it in the sense of being inclined towards assigning probabilities to things and updating based on incoming evidence — what some call Bayesianism — rather than leaning towards believing something is absolutely true or absolutely false.  If anything, my Bayesian mindset makes me feel more epistemically empowered; it certainly doesn’t constitute any mental state that I would call “helplessness”.  No, I mean “uncertainty” as in often having no idea where to begin in assessing a situation, because no matter how much the evidence seems to suggest one outcome (or probabilities assigned to a set of possible outcomes), life is always complicated enough that there’s sure** to be something else I haven’t considered.
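(A quick gloss for readers who haven’t met the term: the updating rule Bayesians have in mind is just Bayes’ theorem,

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)},$$

read as: my credence in a hypothesis H after seeing evidence E is my prior credence in H, rescaled by how much more strongly H predicts E than my other hypotheses collectively do.  Nothing in that formula rescues you when you can’t even enumerate the hypotheses, which is the predicament I’m describing.)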

I suffer from too much imagination, and that clearly lies at the root of this issue.  Part of the definition of maturity is having respect for how complicated the world we live in really is.  But an excess of respect for this complexity can be paralyzing.  It means that no matter how much evidence you see for some involved idea or stance, no matter how many reasonable-sounding arguments are made, all of that justification amounts to almost zero in the vast sea of uncertainty made up of so many vague might-be’s coming from all directions.  To make things worse, there’s a certain laziness ingrained in my intellect where I tend not to be well researched on most topics I find myself discussing — this has something to do with the fact that I’m curious about a lot of things but only seem able to muster active interest in a seemingly arbitrary subset of them.  The result is that I’m often much less knowledgeable than my acquaintances about whatever issue we’re debating.  All in all, sometimes the urge to just throw up one’s hands and give up the search for truth (or the quest to get as close as possible to it) is irresistible.

So when all is said and done, an overload of respect for complexity probably doesn’t lead to the most sophisticated approach to analyzing our universe.  And of course, from the outside, some of the rhetorical behavior that results from it is hard to tell apart from naïveté and a lack of respect for complexity.

* Yes, the turn of phrase “I’m pretty sure” seems to contradict what I just said in the previous sentence, but it so happens that I do have relatively reasonable levels of confidence in my understanding of how other individuals function; see below.
** And yes, the start of this sentence makes me sound awfully uncharacteristically confident as well, but clearly certain meta-level propositions are exempt from my perpetual tendency towards uncertainty; this is definitely apparent below.

I strongly suspect that a lot of those people with whom I’ve found myself discussing some contentious issue or question quietly carry an impression of me as excessively timid and mild in my opinions, or trying to disagree equally with everyone, or even as somewhat “radically centrist”.  True, I’ve already admitted on this blog that I did have tendencies towards “radical centrism” at some earlier times in my life (by the way, I thought I was making up that phrase at the time I wrote that post — add another to the list of terms that already existed at the time I “invented” them).  But while that flaw isn’t entirely unrelated, I am not typically trying to lean towards the most central or even the most neutral position on any current topic of discussion.  Instead, I’m leaning towards no position, because in the world as I see it there is surely a possible counterpoint to every point being put forth.

But the misconception of my thought processes is probably worsened by the fact that I often come off as rather lazy in backing up my claims that there might be other sides to some issue.  This is because my doubts are often based not in concrete speculations but in vague suspicions that “this kind of thing” always has other possible explanations.  So the scene isn’t necessarily one where I’m playing the role of a gadfly poking and prodding with actual concrete suggestions, but one where I’m despondently deflecting other people’s assertions with vague, almost trollish-sounding remarks like “well it depends on who you ask” and “there are multiple sides to every story” and “there are always more obstacles for someone in that position than we might think”.  For instance, I oftentimes have an attitude like this in the face of erupting scandals over the actions or inactions of persons in powerful-looking positions — I have a tendency to reserve my judgment on the basis of “I have a feeling there’s a lot more dry bureaucracy involved for that individual than we know about, having no idea what their day-to-day schedule is actually like”.  And it can be hard to make others understand that, at least in theory, I’m addressing things this way out of a more profound sense of uncertainty than what they seem to routinely experience, rather than out of an excess of charity or just plain (slightly obnoxious) laziness.

A further irony is that at least half the time, my opinions aren’t interpreted as stubbornly noncommittal but rather the opposite: it’s assumed that deep down I must have a strong opinion one way or the other, and since I seem skeptical of the confident views being put in front of me, I must be taking the other side of the debate.  In these cases I suppose that not only does my level of uncertainty come across as overly lazy, but that it comes across as so unbelievably lazy that it can’t be genuine: I’m obviously just being coy about the fact that I positively disagree.  I’ve heard a lot of complaints about having one’s opinion shoehorned by others via a false dichotomy, but I’m often still taken aback at how inconceivable to some people it apparently is that my opinion could ever be one of genuine agnosticism.


A number of people through the course of my life have been concerned about how often and strongly I seem to rely on majority opinion.  I can’t tell you how many times it’s been pointed out to me that I can’t base my tentative conclusions (which aren’t always properly recognized as tentative; see above) on “what other people think”, because Other People is obviously a very fallible source of factual beliefs.  This criticism is of course only a minor variant on the classic “If everyone jumped off a bridge, would you jump off a bridge too?” argument.  In my opinion there’s a pretty obvious rebuttal to the typical bridge-jumping objection, but I prefer to lay out the initial response that usually goes through my head (despite it being less precise and more on the emotional side) whenever I’m warned about the dangers of citing majority opinion.

There is a famous quote from science fiction writer and skeptic Isaac Asimov that I always think of in this context, although the analogy required will be a rather loose one:

“Don’t you believe in flying saucers?” they ask me.  “Don’t you believe in telepathy? – in ancient astronauts? – in the Bermuda triangle? – in life after death?”

No, I reply. No, no, no, no, and again no.

One person recently, goaded into desperation by the litany of unrelieved negation, burst out “Don’t you believe in anything?”

“Yes,” I said. “I believe in evidence. I believe in observation, measurement, and reasoning, confirmed by independent observers. I’ll believe anything, no matter how wild and ridiculous, if there is evidence for it. The wilder and more ridiculous something is, however, the firmer and more solid the evidence will have to be.”

A lot of the time I feel sort of like Mr. Asimov being pestered to believe in a bunch of controversial phenomena.  Only instead of supernatural or paranormal things, I’m being bombarded with claims that sound scientific or somehow common-sense, along with arguments for them that sound perfectly reasonable on first hearing.  “Don’t you believe in my amateur understanding of the nutritional value of ingredient X?  In my sensible-sounding but simplistic argument about the dangers of activity Y to one’s health?  In my elementary mathematical demonstration of why economic policy Z is a good idea?”  Or even occasionally, “Don’t you believe my factual claim based on evidence provided by that specific scientific authority whose paper I can refer you to (in a field where the experts constantly disagree and the consensus changes every 10 years)?”

And my response basically boils down to “No, no, no, no.”  Or more specifically, “There’s always another side we’re not thinking of right now or an aspect we just can’t understand from our position; there’s always another expert who disagrees; etc.”

One can imagine my interlocutor throwing their hands up in exasperation at this point and crying, “Don’t you believe in some kind of claim using some kind of justification?  Or are you just infinitely skeptical about everything?”

And my answer is, “Yes!  I believe in evidence.  But not necessarily everything that you might consider good evidence.  I mean evidence that appeals to both the overly-imaginative and the lazy intellects.  Evidence that makes some assertion look significantly more likely regardless of how many additional wrinkles to the situation we may not be considering.  Evidence that shines out as a truly unambiguous beacon in my very foggy world.”

For me, majority opinion is serious concrete evidence of this kind.  It is not absolute proof.  The majority can be wrong.  My perception of what actually is the majority belief is also prone to error, especially because my view of the majority often doesn’t extend beyond my local bubble.  But a lot of the time, what Other People think is a far stronger indicator that a certain assertion is at least worth seriously considering than one person’s plausible-but-overly-simple-looking explanation of it.

Think of it this way: my concern is that I can’t understand most issues well enough to properly evaluate all sides of them, and that I can’t trust most other individuals to understand them well enough for this either, no matter how confidently they claim to.  But if a large collection of other people, each with their own knowledge of various aspects of the problem, generally takes a particular side?  Now we’re essentially looking at an average over many outcomes, which is clearly more statistically meaningful than the outcome of a single experiment whose degree of uncertainty appears rather high.
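To make that hand-wave slightly more concrete, here’s a toy sketch in Python (every number in it is made up, and it leans on the generous assumption that each person’s error is independent of everyone else’s, exactly the condition a commenter below points out that real crowds often violate):

    import random

    # Toy setup: the "truth" about some question is the number 0.0, and each
    # person's judgment of it is the truth plus independent random error.
    TRUE_VALUE = 0.0
    NOISE = 1.0       # std. dev. of one person's error (made up)
    N_PEOPLE = 100    # size of the "majority" being consulted (made up)
    TRIALS = 10_000   # how many times we repeat the comparison

    def judgment():
        """One person's noisy estimate of the truth."""
        return random.gauss(TRUE_VALUE, NOISE)

    # Typical error of a single person's judgment.
    single_error = sum(abs(judgment() - TRUE_VALUE) for _ in range(TRIALS)) / TRIALS

    # Typical error of the average judgment of N_PEOPLE independent people.
    crowd_error = 0.0
    for _ in range(TRIALS):
        avg = sum(judgment() for _ in range(N_PEOPLE)) / N_PEOPLE
        crowd_error += abs(avg - TRUE_VALUE)
    crowd_error /= TRIALS

    print(f"typical error, one person:    {single_error:.3f}")  # roughly 0.80
    print(f"typical error, crowd average: {crowd_error:.3f}")   # roughly 0.08

With independent errors, the crowd average’s error shrinks like 1/√N, which is all my “average over many outcomes” intuition amounts to; when errors are shared (common biases, people copying each other), that advantage largely evaporates.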


The importance I put on what other people think is complemented, perhaps, by a fair amount of confidence in how other people function.

I can’t fully explain or justify this, but I think it’s inherent both in how I navigate real life and in how I argue in my persuasive writing.  On this blog, for instance, when I’m not framing and justifying the points I make in terms of dry abstract logic, I’m generally appealing to some striking impression of human behavior, in particular to provide explanations for why we humans make the rhetorical mistakes that we make.  My whole thing about free-will explanations versus deterministic ones (on which I intend to expand soon with a series of other posts) was pretty much entirely an appeal to this sensibility, once I got past the pure philosophizing on the problem of free will.

I call this my “social sense”.  It’s largely built on my commitment to empathy and my increasing experience with other people, and therefore my confidence in it only gets stronger the older I get.  I’d say that “social sense” appears to have a pretty good track record with me, but that’s probably largely an illusion arising from an ability to re-frame things in accord with my already-existing convictions on how humans act, rather than from any tested predictive power.

Maybe I can try to explain this confidence in “social sense” as follows.  There’s a common misconception that emotions are inherently irrational, so to be as rational as possible we have to minimize our reliance on emotion, and so on.  I call this a misconception because I believe that emotions can be harnessed to work in tandem with cold logic: ultimately, the feelings we experience are a natural faculty shaped by evolution, one that gives us helpful instincts about the sentient behavior happening around us.  I want to suggest that, in the same vein, even if I’m tremendously fallible in understanding abstract philosophical or scientific models and considering all their possible alternatives, at least the “social sense” I’m equipped with through biology is reasonably likely to point me towards truth concerning the behavior of my fellow humans.

And this at least gives me a small weapon for skepticism about all the factual claims that fly my way: sometimes I can say, “Regardless of how valid your logic sounds right now, this kind of justification is often a particular trap that people fall into because of X, Y, and Z, and therefore I find it suspect.”  I don’t claim that it’s really much of a weapon, but I’ll take what I can get.


I apologize for all this vague and blathering exposition of my experience of epistemological helplessness.  But to be coherent and rigorous would almost certainly open me up to all kinds of plausible-sounding counterarguments, and after all, how much can I ever feel really sure of?

3 thoughts on “Confessions of a helpless epistemon”

  1. I recognize this experience of constantly second-guessing your judgments, mistrusting how something is presented and always feeling like there’s something more that hasn’t been considered.

    But trusting the majority seems odd. True “wisdom of the crowd” mechanisms rely on some very particular conditions to work (like individual independence). We also know that humans have plenty of biases that render our judgments unreliable in so many ways.

    FWIW, I’ve found that maintaining undecidedness is actually more viable than I thought it’d be.

    1. Absolutely, we humans have plenty of biases, and it should be a priority to catch those biases in action whenever possible. I don’t feel epistemologically helpless in the arena of human biases; a lot of my writing here has been an effort to explore such things (and it is one component, though not all, of what I meant by “social sense”). But when we can’t come up with an evident fallacy or bias to explain why a large group of people believe what they believe, then what can we do?

      I probably should have made clearer, by the way, that I’m not necessarily talking about the average beliefs of the human population as a whole or even a particular culture, but often a subculture whose characteristics and general belief system are very compatible with mine. For instance, my social bubble used to be composed of highly intelligent academically-inclined people with strong respect for science and rationality, who almost universally rejected religious dogma, etc., with many of them morally idealistic. The fact that they mostly seemed to be in complete agreement that eating animals is wrong (while the ones who defended meat-eating tended to be the least morally idealistic, and their arguments were inherently defensive) has led me to lean much closer to believing that eating animals is wrong. (Not to mention that the online rationalist community, whose core values I also identify with, seems pretty much unanimous on this, but I didn’t get exposed to that until later.) So having done zero research on what nutrition, environmental science, etc. have to say, I feel that I have “evidence” that eating animals is probably wrong. Yet if I cite the majority opinion of those in my social bubble as “evidence”, I’m likely to get criticized for just “believing what everybody says” (where “everybody” is practically everyone who was once in my life — a relatively small social bubble, but one which clearly shared most of my values).

      At the same time, that same subculture that made up my social bubble also held some other majority beliefs and followed a smattering of social norms that I became skeptical of, precisely because I was able to identify what I believed to be certain biases behind them.

      Anyway, my post should be taken as a “confession” (as the title suggests) describing mental processes of my own which I’m not particularly proud of, and so my majoritarian tendency is not fully endorsed. My defense of it should be read not so much as a whole-hearted endorsement but as something closer to “this is why I often don’t know what else to do”.

  2. I see. I think I meant something more fundamental with the word “bias”: not so much specific biases, but that our whole belief machinery is “broken” in that it is not at all optimized for accuracy, and it’s not so much the content as the whole format of our beliefs that’s wrong. That’s why I’m skeptical, but as a pragmatic last-resort strategy it certainly makes sense.
