I haven’t had much focus for reading and writing analytically the last couple of weeks. I wasn’t sure what was going on until I went readwalking for a few minutes on Friday evening.
Much as I’ve loved reading Nassim Nicholas Taleb the last few weeks, I’m rationing his Antifragile. Instead of reading Taleb, then, I read a few pages in Daniel Kahneman’s Thinking, Fast and Slow, which I found via Taleb.
Between Taleb and Kahneman, I’m finding something like peace.
I began my flurry of book-reading about this time last year. At the time, I–then a lifelong U.S. Democrat–was driven to deeper reading by my absolute horror at Democratic officials. I was certain that the badness I was witnessing in articles and soundbites was just the tip of a badness iceberg.
I confirmed my suspicions fairly quickly, and loathed myself for having unquestioningly, for decades, embraced Democrats as the good guys. But something else grew beyond that: a concern that truth didn’t seem to be what most folks online were after. In fact, over and over again, I witnessed people I love and admire actively rejecting the mere possibility that something they didn’t want to be true could be true. That tendency troubled me much more deeply than wrongdoing by a relatively small number of elites.
Why? Because of what it could mean for humankind’s future when large groups of people believe things that aren’t true. I’d seen self-protective denial exercised over and over in my childhood, thanks to growing up amid poverty and predation. I just hadn’t realized that the strategy I saw wives of predators (and jurors) adopt was only one expression of something destructive that runs to the core of American life. Last year was when I began to understand that the denial of reality I saw in childhood was only a fraction of the damaging denial practiced worldwide.
Coming to terms with that was rough. Seeing people around me go through wild contortions to disregard inconvenient facts, I was overwhelmed by the feeling that their denial could never be eroded quickly enough to protect the world’s younger generations from the shortsighted overconfidence of older ones in the United States.
One of my favorite authors, Gavin de Becker, calls denial a “save-now, pay-later” scheme. While he writes about personal safety, this scheme’s applications extend to climate change and the myriad economic factors that have pushed the Earth to the brink of uninhabitability. Basically, older generations have saved a little pain today by passing a lot of pain on to future generations’ tomorrows.
Once I began to accept that there’s nothing I can do to ensure a different future, I felt a lot less frantic. A lot less hostile toward people whose choices to see or not see affect our children’s future. A lot more open to figuring out what exactly leads people to tune out truths in favor of more (temporarily) pleasant fictions.
And then I found Taleb, who helped me see that there’s nothing willful about this unseeing. It’s about heuristics, or mental shortcuts, that worked really great when humans lived out in the wilderness, but which don’t work so well–and can, in fact, harm us–in complex systems. We see the world as much less complex and more linear than it actually is, and these illusions are nurtured–wittingly or unwittingly–by the structures of societal institutions.
Part of how Taleb reached this understanding was through the works of Daniel Kahneman and Amos Tversky, who’ve done amazing work on heuristics–fast thinking versus slow thinking, System 1 versus System 2 thinking, or (as described by the authors of Switch) elephant and rider. I’m reading this Kahneman book to better understand when I’m using faulty thinking, and to see if there’s anything I can do to constructively counter denial in conversation.
It was in this spirit of exploration that I set out to readwalk to Kahneman for 15 minutes on Friday, and exclaimed “aha!” when I read about phenomena like cognitive load, cognitive busyness, and ego depletion. Specifically, I found the seeds of an answer to my analysis exhaustion beginning on page 41:
Baumeister’s group has repeatedly found that an effort of will or self-control is tiring; if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes around. This phenomenon has been named ego depletion.
The most surprising discovery made by Baumeister’s group shows, as he puts it, that the idea of mental energy is more than a mere metaphor. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose. When you are actively involved in difficult cognitive reasoning or engaged in a task that requires self-control, your blood glucose level drops.
Why haven’t I been able to focus on words the last few weeks? Because I’ve been eating more restrictively than I ever have, and it is taking everything I have to do so … even knowing that doing so is important for my health. This has diverted more mental resources than I realized, making me feel the brain drain of analysis where I usually wouldn’t.
(A few weeks ago, I even told my husband, “This may sound strange, but … all this thinking is really exhausting. I mean, like, physically exhausting.” I just somehow managed not to notice how exhausting thinking is until I had to seriously restrict my diet.)
Even with so many resources diverted to simply getting through the day without eating things that hurt me, I have to read a little. This readwalking discovery is a perfect example of why it can be better to read ten slightly foggy minutes at a time than to give up reading altogether.
So–willful ignoring of facts? Not so much. In the end, it comes down to a specific kind of cost/benefit analysis: “In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.”
That’s only a small part of what’s “built deep into our nature,” which includes:
a puzzling limitation of our mind: our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight.
These limitations aren’t willful, or even intentional. They’re part of the built-in calculus that has helped humankind survive until now.
Now, my question is: Can we humans collectively become aware of these limitations fast enough to put the brakes on the worst possible devastations of climate change?
I know enough now to know how little I know, and to avoid–or quickly backtrack from–certain thinking traps (if not, yet, others). I also know better, thanks to folks like Taleb and Kahneman, that there’s only so much yesterday can tell us about tomorrow.
Understanding all this makes tomorrow seem a lot more hopeful, and so: read I will, with gratitude for authors who help make me a better and simultaneously more compassionate thinker.