Suggest to someone that they can’t think for themselves. How do you reckon they’ll react? If there’s any truth to it, their reaction should follow one of the deep channels of habit. They may hiss and bluster, or wave you off. Turn the accusation back to sender. Cite examples of times they stuck to their guns against an outlaw horde, pulled from what they promise is a wealth of other possibilities just beneath recollection. If they’re thoughtful and honest, they may admit to certain occasions when fatigue or passion managed to drive them along the cattle trail—and that it’ll happen again. Happens to all of us. I haven’t lobbed the accusation as often as I’ve thought it, but I suspect it would be hard to find a person who admitted that everything that ever crossed their mind or spurred them to action was an echo of a source long-forgotten. I hope that’s because it isn’t true. I would like to believe that as much as we all take a shortcut now and then, there are moments when our words and deeds issue forth from an authentic self.
The First Thing You Think is a Cliché
Here begins a series of posts about the ability to think for ourselves. There’s a fib in that last line that I’ll expose soon enough. For now, let it stand. Things start when they do because they’re at last ready, so there must be a reason I write these words now. My fingers twitch with the temptation to rail about people in “this day and age.” I see dancing visions of social media and TV, gossip and rumors and nightly news. A polarized political climate, crumbling values, attention-destruction, likes and shares, marketing campaigns that seep into every crevice of our lives and redefine our notions of beauty, success, and what I want for supper.
I’m tempted to go deeper, after the unquestioned cultural assumptions that provide the framework for those trends. What do we want that leaves us thinking about our next job and our pay, instead of why are our modes of work stuck in 40-hour time-wages and certificate-obsession? Why are our vocabularies and practical skills dwindling, instead of what does it mean for us to know? Why don’t we write long letters or talk in person, instead of what do we seek from our relationships? The modern age is ripe for casual criticism.
I’ve heard it said in writing circles that the first thing you think is a cliché. If this age feels thoughtless to me, maybe it’s because it’s the one I know. Every old man sees the apocalypse in his grandkids’ music and social calendars. Something tells me if I took a poll in 1950, 1850, or 150 A.D., a certain element of people would find their peers just as infuriating. This is not an unthinking age. There may be new ways to avoid thinking, and maybe it does ebb and flow to a degree, but there have always been people more eager to repeat what they heard than to question what they know.
The Second Thing You Think is Also a Cliché
If anything’s changed, it’s me. I didn’t come to this essay a day earlier because I wasn’t ready to ask the question until now. Do the people around me think for themselves? I could have filed that charge from the halls of elementary school. Of course they don’t. Not often enough to avoid frustrating me. That’s not the question that drove me to start this series. I already know the answers I want to give. But let me resist the urge to repeat them so quickly. It doesn’t matter whether or not other people think for themselves. Do I?
This essay wouldn’t exist if I did. There’s a paradox in there somewhere: that the first thought of someone trying to think for themselves is a realization that they don’t. It’s been weighing on me for some time now. I’ve enjoyed a few false starts, where I was sure I finally got the hang of it. But aside from a few glorious moments in which I had an insight just barely different from a well-worn pair (deriving new things from two old ones is a useful technique I’ll go over later), I’m pretty sure what’s passed for originality in my case has been the act of repeating things I heard from more and more original sources. If something sounds good, I adopt it, and if few others are saying it, it may feel pretty novel.
But most human thinking is conservative. We’re disinclined to change our minds by much. Change tends to be incremental, and where it seems radical, it’s limited in scope and anchored by a familiar framework in which we orient the new thought to a lot of old ones in a structure that hardly shifts at all. A political opinion may shift, but it’s justified by familiar virtues against the same cultural backdrop and supported by good old-fashioned Aristotelian logic. And the new opinion will likely reconcile very well with a lot of old ones I claim to have held all along. Fresh ideas come within the same old context. It’s as though I were playing Ode to Joy in the same time and key, adding a few grace notes every fourth bar.
It’s difficult enough to be aware of our thoughts and where we may have come by them. Harder still is to understand the social and cultural forces channeling them like mountain tributaries into a slow, widening body. I write this series not to teach a method of striking free from a place of mastery, but because I only just had an inkling of the problem, and I need to talk it out. What follows is at best co-learning, subject to feedback and adjustment from myself and others.
Is It Even Possible?
If I’m right and we rarely if ever think for ourselves, falling back on half-forgotten repetitions and deep unconscious forces, it’s fair to wonder whether anyone can ever really pull it off. The chain of sensations, reactions, and judgments is so fast and subtle that we can never be fully conscious of all the factors that go into a feeling, a thought, an act. Reptilian instinct trumps reason. If I see a green light, at what point from sensing a color, to registering it consciously, to interpreting it and choosing to drive my truck beneath it did the capital “I” take over? For that matter, did I even choose to turn my attention to it to begin with, or was it forced into consideration by habits as old as the species?
Anything I do can be framed as a reaction to or repetition of some external stimulus. The notion that I’m entirely free, unaffected by the world around me, appears dead in the water. There is always some degree of “other” in the thing I call “me.” If we accept the idea that all of our actions are a complex chain of responses to processes that begin at some imperceptible level, we have no choice but to adopt the popular hypothesis that there is no free will, only a human delusion of it. If that’s the case, then I have no choice but to dismiss the notion and continue my investigation. And if it’s wrong, as I happen to believe, let those who think themselves meat robots chirp to one another according to their algorithms.
I think the fallacy comes from a rigid binary: either we are totally free, or we are influenced and entirely bound to move as the pool balls scatter. The popular (very popular) belief that there are two and only two options is one of the things at the heart of a lot of sloppy thinking. At some level beneath conscious awareness, we’re hardwired. Nothing you or I do will untie us from the instincts built into us by the evolution of our species and the selection pressures we faced. Nothing will remove us from the external world, with its many wills supportive and contentious and ignorant of our own. We can’t live in a vacuum of our own imagination, where a raw thought drawn from an infinite ether becomes reality. We’re influenced, I say, but not bound. There’s a narrow band at the tip of the spectrum of which we are conscious. We can’t think without restraint, but we can dance within limits. There are possibilities. They aren’t infinite, but I wager they’re far greater than most of us have explored.
In Defense of Musk Oxen
If I’m going to expand my narrow ability to think for myself, I should start by acknowledging the advantages of not doing so. It’s gotten me and a whole host of others through a few million years of life in some hairy situations. Human beings are social animals who live in groups. If we all had radically different ideas and ways of perceiving the world, it would be difficult to communicate and cooperate. No two people think the same, but we benefit from thinking similarly.
When musk oxen are threatened, usually by a nature photographer or polar explorer, they arrange themselves into a tight defensive square with the most vulnerable members at the center. So strong is this instinct that no kind of fright will scatter them. Even as ox after ox falls to a hunter’s bullet, the survivors will rearrange themselves amidst their dead relatives until the last one stubbornly drops. Not so effective against a rifle, but this tactic evolved to deal with wolves and bears, who are practically helpless against an organized herd.
Thoughts that work keep us alive. Survivors pass on their genes, and their habits. Whatever we’ve got has come to us through a gauntlet of predators, disease, natural disaster, and human competition. The ways we sense and respond to the world work. Or, like the musk ox, they did before rifles and global networks made radical changes to their environment that outpaced their ability to adapt. Whether or not they’re still useful, those deep structures are here to stay. They don’t change in a lifetime, even if we can become aware of them.
They hold us together in defensive formation for the sake of our species. Radical ideas are more likely to lead to dismemberment than fortune. As a group, we prefer to let a few individuals put their necks on the line while we wait to see how it works out. For the most part, we cling to the time-honored because it keeps us alive. Others’ examples are tested. We find reinforcement for those behaviors, and they’re easy to adopt. Life proceeds with less friction when the way is paved. Imitating others signals membership in the group and common values. People are pleased with us, and want to help us. When we operate in the same bubble, the same tautology, communication is easy. We relate to one another and get things done. As we diverge, communication becomes difficult. It’s hard to tell who is on whose side, or what they’re after. Life without common ground can be downright lonely.
Evolution makes a strong case for sticking to the gameplan. It also makes the best case against it. We’re born mammals with two legs and arms, shaped like humans, with a certain range of senses and patterns of social behavior because that’s the formula that got us here. A new mode of thought isn’t going to have much effect on all that. But we aren’t single-celled oceanic organisms, so at some point, something must have changed.
Much like thinking, species shift through variations built on conservative foundations and tested in the real world. If new things weren’t possible, nothing would ever evolve. It’s at that razor’s edge that we shape possibilities that may become the basis for future growth. We need a solid herd, but we also need those capable of sudden adaptation and novel solutions. The same is true of our inner life. Much of our thought is founded on our culture, our language, our family traditions, and our past experience. Those serve as our anchor, and to forget them would literally lead to insanity. But we also have the ability to make conscious adjustments to new experiences, and it’s there that the independent self comes into its own. Without that ability, we can only repeat the past and hope the world never changes around us. If it did, we could only repeat what used to make sense while new external pressures dismembered us.
Benefits of Independent Thinking
In fact, the world does change. Imagine a waitress comes to your table, and you order a Denver omelette with a side of bacon. There are other options on the menu, but there’s nothing bizarre about your action. After a nice breakfast, she reappears with the bill. At the sight of her, you order another Denver omelette with a side of bacon. Fine, maybe it was a really good omelette. Later that afternoon, you see the waitress at the bus stop after her shift. You roll down the window and order the same omelette and bacon. The encounter repeats itself the next day at the grocery store, then again at a local concert, in the line at the polling station, and at last at the court hearing for her restraining order.
Giving your order to a waitress is fine. What changed is the context. It seems absurd that a simple marker—the appearance of a certain woman—set off the same behavior when many other things should have signaled that this is no longer “waitress,” or “the time and place to order breakfast.” It makes just as little sense to see musk oxen continuing to form a smaller and smaller defensive square as their kin fall to rifle shots, refusing to bolt from a field of corpses. While it seems obvious, there are innumerable occasions when we fail to recognize a context has shifted beyond a vague threshold that requires a new approach. We find an action that works, apply it to simple triggers, and repeat it at every opportunity. Have you ever had a debate with someone who seemed to repeat the same points no matter how far the conversation veered from the topic? For that matter, have you ever carried a habit beyond the realm where it made sense? In the privacy of my home, I tend to get a head start on my fly while en route to the bathroom. Saves time, and no one cares. But more than once, I’ve had to stop myself from doing it before I even made it through the door of a public restroom.
I like to use extreme examples to make points, but this fidelity to habit and old contexts extends to all phases of our life in ways that are far from obvious. I’ve discussed why it’s useful. The problem arises when we assume that the way we view the world is the only one possible, and always appropriate.
Our habitual ways of living are based on examples from our friends and family, community, nation, laws, faith, history, language, philosophical underpinnings, and unstated metaphysics. Even if our circumstances remain constant, the chances of all those forces cooperating in a single coherent worldview at all times are slim.
There are many families, many histories, many worldviews. To think for ourselves means only to acknowledge our predilections, take careful account of what’s going on around us, and be willing to alter our ideas when the situation calls for it. Why would we want to move against such cultural inertia? Every innovation and stroke of genius requires different thinking. While many newfangled ideas turn out to be nonsense, or even harmful, the flaw lies not in original thinking but in the failure to recognize a shortcoming and alter course. All that’s tried and true was once new. There’s risk in originality, but the inability to adapt leaves us wandering senile through a world that no longer exists.
When we think with others, we can only think what others are capable of thinking. We share their brilliance and their mistakes. We share the same limitations. Individual experience is reduced to a multiple-choice menu of broad, agreeable narratives. What’s compromised is more than just intellectual integrity.
There’s pain in denying authentic experience. I aim to show that what I’m talking about extends well beyond “thinking.” You might point out that we find common ground in order to relate to one another, and without it, we’re adrift. But I’ll repeat many times that we don’t have to lock ourselves into a binary of either mindless herd life or maddeningly incomprehensible individuality. We can still share many things while we respect and encourage differences. What we ultimately hold in common is the meta-value that independent thinking is good, even if we disagree on the details.
If the notion of an independent thinker conjures up images of argumentative snobs who nitpick everything we say, I won’t blame you. But most disagreement isn’t independent thinking. It’s highly stereotyped thinking of an oppositional nature. When we gather information and think about it, it’s actually very hard to end up in the extremes. Of course it happens, but rarely, and it’s usually a result of ceasing to tend to your own experience, letting limited data substitute for all future experiences and run away with your mind.
The ability to think for yourself allows for innovation, frees us from bitter dogma, makes us more accepting of others’ perspectives, takes the steam out of arguments, maintains sanity, encourages adaptation, and frankly is a hell of a lot more fun.
So Why Can’t We Do It?
I promised to expose a fib I told in the opening paragraphs. I’ve scattered a few hints about the solution, but I admit that this series is not about thinking for yourself. It’s an alluring hook. Who could claim to want otherwise? If the prospect entices you, maybe you’ll forgive the deception, because what I aim to talk about encompasses that, and much more. To plunge headfirst would have risked confusion and dismissal. If what follows has any merit, the stated intention will still bear fruit, but it will only blossom where twisted branches intersect.
Most of the discussions I’ve heard on the topic, including my own introduction, suggest that thinking for yourself is just a matter of taking time to become aware of your thoughts, comparing them to an assortment of facts and opinions from others, and again to reality, then arriving at a considered course of action. If that were the case, we would be able to win arguments with logic, and eliminate our self-sabotaging behavior by reasoning ourselves into sainthood. Yet in my experience of myself and others, even the most rational people are capable of losing their minds over certain issues. Objectivity and the lessons of the past are chucked out the window in favor of old scripts or membership in some consensus. We apply logic to one domain, but not another. We check certain sources like they’re drug smugglers, and wave others past with a glance. At times we press for new frontiers, and at others we anchor in safe harbors.
This series is not about thinking for yourself, because that doesn’t work. When we seem to do it, there’s a lot more going on than just thinking. I can explain convincingly why a certain course of action is good for me, and the alternative miserable. So why can’t I do it? Therein lies part of the problem: we can think with great rigor, but is that all we want? To think a wise thing, then drift off to do something else? I reckon what we’re after isn’t just great thoughts in passing, but the beliefs, relations, and deeds that should follow. Thinking is one part of the equation—a part that can’t be separated from the whole and still hope to live.
In the next part of this introduction, we’ll see how we actually process the world, and we may wonder why thinking is given such esteem, while there are few books on things like feeling for yourself, or acting for yourself.