[Content note: This is something I’ve been thinking about which feels somewhat clearer in my mind than it comes out in writing. However, I’m already having doubts about how the connection to superweapons works. Mentions of several sensitive issues for examples, included in tags.]
It is a common criticism from those who have known me long enough that I'm too gullible. Sometimes this is meant in the basic sense of believing false things (especially when I was younger), but sometimes also in the sense that I come across as far too readily accepting of whatever broad narrative is pitched to me in defense of a particular view. Enough independent people from different parts of my life have expressed concern about this that it's only logical for me to conclude that the criticism is probably valid on some level. At this point in my life, it's more a matter of in which sense it is valid, what underlies this tendency, and which aspects of it are helping me as opposed to hurting me.
There’s more than one issue at play here, but here I want to focus on one particular type of fallacy which I consider to be a major problem with a lot of the discourse I see, and which I’m trying to guard against when I react to claims in a way that makes me look too credulous. This problem in the world of discourse can be summed up by saying that we’re not welcoming enough to gadflies.
I. Socrates the Gadfly
I am not particularly knowledgeable about ancient Greek philosophers, but I am familiar with Socrates' characterization of himself as a "gadfly of the Athenian people". What he meant, as I understand it, is that his intellectual function in his society was to articulate skepticism and raise nagging doubts in the face of commonly-held assumptions. In other words, he aimed to be what is more commonly called a "devil's advocate". On his account, gadflies create discomfort and are generally annoying, but they should be welcomed anyway. Apparently, in trying to defend himself from the death penalty, he claimed, perhaps arrogantly, that he was the only gadfly in the area, and that it would be unwise to get rid of him, as gadflies are essential to the health of society.
This assertion has been made many times and articulated in many ways since Socrates. It encapsulates a general idea that is seen most prominently in the philosophy of science, as well as within the deeply-held values at the heart of modern democracies, skeptic/rationalist culture, and academic culture in general. In any intellectual pursuit, thinking critically and challenging assumptions is key. I don’t want to write about this very broad notion which has been discussed constantly for centuries. When I say, “We just don’t welcome enough gadflies”, I’m not trying to proclaim a vague platitude like “We don’t think critically enough!” In particular, the use of “gadfly” is not meant as a metaphor for challenging authority, or the exercise of skepticism within the scientific process. (Indeed, I don’t see any sense in claiming things like “We should be more skeptical when doing science”. The scientific mindset, as Carl Sagan put it, consists of “a marriage of skepticism and wonder”, and in fact my comment about gadflies could be construed equally well to mean that we need more wonder (i.e. open-mindedness) when doing science. Skepticism and wonder are arguably two sides of the same coin.)
The gadfly behavior I’m advocating today is a more specific thing, which I find easier to describe in the negative: when considering a particular decision or situation, don’t automatically dismiss any of the relevant possibilities that come to mind, even (especially!) if they make you feel uncomfortable.
Let me clarify what I mean by “relevant possibilities” above by use of an example from an earlier post. Suppose that you arrive at a colleague’s office at an agreed-upon time for a meeting with them to prepare for an upcoming deadline, but they never show up. Now let’s say that one of your major pet peeves with the world is the way most people around you seem to be disorganized, and that this has really been adding to your stress lately as you rely on a lot of other people. To make matters worse, although you know you can probably reschedule for late tomorrow afternoon (both you and that colleague often stay after normal hours), tomorrow is your kid’s birthday and definitely not a day you want to come home late. So naturally, your immediate reaction is to feel really angry.
There are many possible causes for your colleague’s absence, a few of which were discussed in the other post: they might have decided not to bother; they might have simply forgotten; they might have a drug problem (entirely unknown to you) which indirectly resulted in not being able to make it; or they might have gotten into some kind of accident on the way to their office. Chances are that the first two possibilities above are the most obvious explanations and are the first to leap into your mind — unsurprisingly, these ideas do nothing to abate your anger. It is unlikely, especially given the narrative you’ve developed about everyone else being disorganized, that either of the last two possibilities will occur to you quickly if at all. And yet those explanations, while perhaps not particularly likely, are still perfectly plausible. Maybe your colleague has always had the appearance of being totally together, but is actually struggling with some sort of addiction, or perhaps suffering from a mental illness which has not been apparent to you. And people do get into serious accidents and have emergencies from time to time. And so, before acting on your newfound resentment towards your colleague, you should at least consider these possibilities — these are the “relevant possibilities” I referred to above. I’m not saying they should be deemed as likely, but that they should occur to you, and be objectively considered.
This is not a matter of considering every possibility under the sun and weighing them all equally. Under most circumstances, people are far more often careless or forgetful than kept away by some much more serious problem. However, in the long run, I believe it pays off to at least allow the unlikely explanations to enter your consciousness.
(By the way, let's say that your colleague did fail to meet you out of forgetfulness, caused in part by the fact that they never saw the meeting as particularly important. They get that nobody likes waiting around for someone who never shows up, but sincerely don't understand why you would be this upset about it. After all, you can just both stay late tomorrow, as you often do, and deal with everything then without missing any deadlines. It just doesn't occur to them that there might be a particular reason why you don't want to be at work late tomorrow. Maybe they, like you, should make more of a habit of considering more possibilities, especially those which lead to conclusions they don't want to believe.)
These annoying ideas that we should try our best to come up with, particularly the ones which threaten the narratives we’re comfortable believing, are what I call “gadfly speculations”. They are not fun to have around, but it’s bad for our intellectual health not to let a few of them swarm our conscious minds and nip at our deliberations on a regular basis.
I want to be clear before going any further that when I say, "Be welcoming to gadflies", all I'm talking about here is the skill of knowing how to let these speculations fly into one's head in the first place, NOT how to weigh them once they're present! Gadfly speculations are what should happen during a mini brainstorming session. They are funny-looking blobs to be thrown at a wall regardless of whether in the moment they seem likely to stick. They are ideas which may seem quite improbable, but which should occupy a spot on one's mental whiteboard. Later on, of course, they need to be evaluated on their merits. Pretty much everyone understands in principle the idea of coming up with a bunch of ideas and then evaluating them to choose the best (or most probable) one, but I have a feeling that a lot of us don't pay enough attention to gathering a sufficiently varied collection of ideas in the first place.
That is what I’m trying to stress here. In order to weigh possibilities to arrive at the most rational conclusion, we need to reach the first step of being able to see a healthy variety of possibilities on the table in front of us. Why do we so often fail at this? Our intellects tend to be lazy, and we naturally want the first step of any decision-making process to be easier. One obvious way to make it easier is to give ourselves fewer things to choose from.
Now there's nothing deep in arguing that we should be careful to entertain enough gadfly speculations. It's basically a variant of guarding against "lack of imagination" and more or less standard Biases 101 stuff. I just want to call attention to how this very unsurprising human tendency plays into some more interesting rhetorical trends. Or, in the likely event that these connections seem similarly obvious, I'd at least like to get this point of view down in writing so that I can easily refer to it later.
(I’ve always enjoyed the gadfly metaphor. I remember distinctly that back when I was in college and for the first time very interested in starting a blog, I kept trying to think of a name which referred to gadflies. I wouldn’t be surprised if the word “speculation” didn’t show up in some of these names too, since I’ve always seen myself just suggesting things in blog posts rather than trying to meticulously argue anything. But at the time, the only name I could come up with that I was reasonably happy with was “Hawks and Handsaws”, and obviously I managed no better many years later when it came to naming this blog.)
II. The building of superweapons
In several posts, most notably these two (see also this), Scott Alexander (who runs Slate Star Codex) expounds upon a rhetorical phenomenon which he calls “superweapons”. Here is the essential passage from the first linked post:
Suppose you were a Jew in old-timey Eastern Europe. The big news story is about a Jewish man who killed a Christian child. As far as you can tell the story is true. It’s just disappointing that everyone who tells it is describing it as “A Jew killed a Christian kid today”. You don’t want to make a big deal over this, because no one is saying anything objectionable like “And so all Jews are evil”. Besides you’d hate to inject identity politics into this obvious tragedy. It just sort of makes you uncomfortable.
The next day you hear that the local priest is giving a sermon on how the Jews killed Christ. This statement seems historically plausible, and it’s part of the Christian religion, and no one is implying it says anything about the Jews today. You’d hate to be the guy who barges in and tries to tell the Christians what Biblical facts they can and can’t include in their sermons just because they offend you. It would make you an annoying busybody. So again you just get uncomfortable.
The next day you hear people complain about the greedy Jewish bankers who are ruining the world economy. And really a disproportionate number of bankers are Jewish, and bankers really do seem to be the source of a lot of economic problems. It seems kind of pedantic to interrupt every conversation with “But also some bankers are Christian, or Muslim, and even though a disproportionate number of bankers are Jewish that doesn’t mean the Jewish bankers are disproportionately active in ruining the world economy compared to their numbers.” So again you stay uncomfortable.
Then the next day you hear people complain about Israeli atrocities in Palestine, which is of course terribly anachronistic if you’re in old-timey Eastern Europe but let’s roll with it. You understand that the Israelis really do commit some terrible acts. On the other hand, when people start talking about “Jewish atrocities” and “the need to protect Gentiles from Jewish rapacity” and “laws to stop all this horrible stuff the Jews are doing”, you just feel worried, even though you personally are not doing any horrible stuff and maybe they even have good reasons for phrasing it that way.
Then the next day you get in a business dispute with your neighbor. If it’s typical of the sort of thing that happened in this era, you loaned him some money and he doesn’t feel like paying you back. He tells you you’d better just give up, admit he is in the right, and apologize to him – because if the conflict escalated everyone would take his side because he is a Christian and you are a Jew. And everyone knows that Jews victimize Christians and are basically child-murdering Christ-killing economy-ruining atrocity-committing scum.
He has a point – not about the scum, but about that everyone would take his side. Like the Russians in the missile defense example above, you have allowed your opponents to build a superweapon. Only this time it is a conceptual superweapon rather than a physical one. The superweapon is the memeplex in which Jews are always in the wrong. It’s a set of pattern-matching templates, cliches, and applause lights.
The posts linked to above mainly focus on certain trends in the feminist movement, but Alexander uses a number of other examples, and I believe that the concept of "superweapon" can be applied to argumentative tactics regarding a wide variety of issues. When I first read about superweapons from him, I had mixed feelings. On the one hand, I was thrilled that he managed to articulate brilliantly a major issue I'd had with a lot of discourse on a lot of topics. Before reading his essays, the only ways I'd come up with for referring to it required clumsy uses of the word "dogma" — superweapons are, after all, a means of discouraging critical questioning. On the other hand, I was somewhat dissatisfied with relying on a concept handle for a complex rhetorical behavior, along with intuitive appeals to its potential danger. Maybe it's the mathematician in me, but I would prefer to break these ideas apart until they are decomposed into atoms in the world of logical fallacies. Since then, I've seen how effective the approach of rationalists like Alexander and Eliezer Yudkowsky can be: Yudkowsky has an extremely analytical mind and yet manages to convey many of his messages very clearly using invented terminology to stand in for complex ideas. Plus, on attempting to decompose these concept handles into more basic parts, I've realized that it's really hard and I can't get very far. So I'm content to live with them for now.
Still, I think I can begin the process of disassembling superweapons by describing them as being made of gadfly repellents.
I should say, each superweapon is made of a particular cocktail of repellents which wards off large classes of gadfly speculations (while still allowing through a few which are consistent with the narrative the superweapon's engineer is trying to push). Think about it: a superweapon's real power is just its ability to shut down certain lines of argumentation.
For instance, take the example in the quoted passage above about the Jew in old-timey Eastern Europe. The situation is presented as a culture dominated by anti-Semitism gradually constructing a memeplex whereby Jews are always viewed as being at the root of various societal ills: child-killing, bad economy, etc. But the flip side of this positive reinforcement (which is not explicitly mentioned above but is readily apparent in many real-life examples of superweapons) is an intolerance towards any idea that poses a threat to this narrative. And in fact, no matter how well that Eastern European society manages to reinforce those negative stereotypes about Jews, its assembled superweapon will be seriously lacking in power as long as any skeptical gadflies are buzzing around. When the main character of the story is accused of trying to steal money from his Christian neighbor, a spectator might open their mind to the gadfly speculation "Well, I know there's a pattern of Jews being greedy, but I suppose it might be possible that this particular Jew was owed a debt…" The superweapon has to shut this down immediately. In fact, in examples like this one, the superweapon has effectively shut down the thought before it's even properly formed, by hammering an anti-Semitic narrative into everyone's heads so hard that such contrary notions don't occur to anybody. In the unlikely event that someone forms the dangerous thought anyway, I imagine that in the presence of a sufficiently strong superweapon, it would be immediately met with, "Come on, when have you ever heard of a Jew being willing to help one of us Christians? Don't they want to kill our children?"
There are many ways to view the superweapon concept, but I hold that when viewed from one particular angle, superweapons are just anti-gadfly machines. They suppress most gadfly speculations from forming, or they immediately quash the ones that do form. I've been trying to avoid alluding to real-life modern controversial topics, but in case I need to be convincing about the quashing aspect, consider the following commonly-expressed "arguments" used to immediately kill gadfly thoughts (more often implied than said out loud): "More guns = more violence, so how can an open-carry law possibly make anyone safer?", "More sex education = more sex, so how can the availability of birth control possibly reduce unwanted pregnancy rates?", "Drugs cause harm, so how could legalizing them possibly do any good?", "How could anyone possibly lie about being abused?", etc.
III. Gadflies and partial narratives
A couple of posts ago, I explored the question of how to make ethical judgments of what I called "multivariate situations" — that is, scenarios where something happens as the effect of decisions made by two or more independent agents. I suggested (in that post and more vaguely elsewhere) that if Mr. X and Ms. W each act on independent decisions which jointly result in some disaster, then oftentimes, Mr. X's first instinct will be to put all the blame on Ms. W — after all, if she had made a different choice, disaster would have been averted! (Of course, Ms. W is likely to similarly blame Mr. X; the contradiction in these symmetric reactions is by itself an argument against this kneejerk behavior.) I claim now that a key part of the subconscious strategy Mr. X uses to leap to an assumption of Ms. W's guilt is quickly shutting down the part of his mind that starts to consider the idea that he could have done something differently. The most basic shape this takes is the blanket subconscious assumption that other people always have free will while his own actions in this case were determined.
This looks to me as though Mr. X is adept at warding off certain gadfly speculations. “If she’d looked where she was going, we wouldn’t have crashed!” “Hmm well, maybe, to be fair, if I had stuck to the speed limit, the accident might have been avoi–” “NO! Just focus on the fact that if that irresponsible Ms. W hadn’t been driving so inattentively, we wouldn’t have crashed!!”
A recent post on the blog Everything Studies touches on a similar idea. There the author discusses what he calls “partial narratives”, interpretations of a situation which are very one-sided not in the sense that they’re wrong, but in the sense that they’re incredibly partial: in order to arrive at them, one “takes the derivative of a single variable, discards all other terms and dimensions, and recreates a reality based on the integration of this particular derivative.” The main example he considers is Ayn Rand’s portrayal of capitalism in Atlas Shrugged, where Rand pushes one partial narrative about capitalism while ignoring all others.
You have “capitalism is when people can trade freely in voluntary agreements and create wealth through their own work and ingenuity” and “capitalism is when the rich can use wealth to assert power over the poor in order to extract surplus wealth from their labor”. They are both partial truths, like a cylinder is a circle from one angle and a square from another. With partial narratives we square the circle, but it remains difficult to keep them both in your head at once.
In order to push the partial truth that capitalism allows people to "create wealth through their own work and ingenuity", as Rand did in Atlas Shrugged, it is important that no other partial truths regarding capitalism be allowed to take root in the reader's mind. This isn't necessarily accomplished by explicitly dismissing such troublesome speculations as invalid; after all, that would run the risk of introducing us to those "bad" ideas in the first place. (Disclaimer: I haven't read any Rand and don't know exactly what devices she used to express her views there. Maybe she did spend a little time in Atlas Shrugged explicitly trying to rebut the "capitalism is oppressive" narrative. But I believe that explicit rebuttal is avoided an awful lot of the time when partial narratives are pushed.) Possibly the best way to quash ideas that challenge the desired narrative is just to proclaim it as forcefully as possible, so loudly that it drowns out all budding skepticism. "This is a really nice story about how capitalism can lead to great wealth and personal autonomy, but I can also imagine how some poor people might get really screwed over in this syst–" "NO! Capitalism does so much good by giving people the freedom to create wealth through their own work and ingenuity!!"
Swatting away gadflies again.
IV. My overactive inner gadfly
Now what does this have to do with my being too willing to accept any story that’s put in front of me?
Well, some of the behaviors I preach are things that I myself don’t practice enough, and others are things that I probably take too far. Openness to gadfly speculations is an example of the latter.
Whenever I hear a narrative, however obviously unlikely, there is a part of my mind which says, “Well it could be that way.” This goes beyond just accepting a non-negligible possibility of the claims being presented to me; it often involves me coming up with supporting explanations on my own to challenge my instinctive response of “Well obviously that can’t be true.” The result is often that I choose to assume the truth of what I’m told pending further deliberation.
It happens from time to time that an acquaintance, particularly one who has noticed how fun I apparently am to screw around with, tells me some obviously very unlikely personal detail about themselves as a joke. And I oftentimes initially act like I believe it (nodding slowly and saying, "Okay…") or at least don't immediately dismiss what they said as an obvious joke. On one recent such occasion, I said something like "No way, you're just messing with me" about three times before finally politely acting as if I believed what my friend was saying… which of course turned out to be the opposite of the truth. And I think that when I act gullible in this way, it comes across like I'm lacking in critical thinking, like I'll accept whatever is put in front of me without considering how obviously absurd it is. But what's actually going on in my head is almost the opposite: I realize the absurdity of the claim immediately and know right away that the person is most likely joking, but ideas creep in like ominous gadflies, providing half-formed, kinda-sorta plausible explanations for why they just might be telling the truth. And it occurs to me that if those half-formed explanations are actually reality — however minutely low the probability seems at the moment — well then it would be totally rude of me to just dismiss them and automatically disbelieve the person, wouldn't it? They're probably just screwing with me, but I'm not about to take the risk of assuming this when it might turn out that they're serious.
I guess any epistemic behavior I’d like to see more of in the world, even something like open-mindedness, can be harmful if taken to an extreme. I believe it was Bertrand Russell who said that one should keep one’s mind open, but not so wide open that one’s brains fall out. And there is such a thing as having too much imagination.