In Praise of Falling Down; In Fear of Always Winning

They’ve got a name for the winners in the world,
And I want a name when I lose. (Walter Becker and Donald Fagen)

Winning! (Charlie Sheen)

What is it with all this winning and winners? I thought America pulled for underdogs, but that is not true. The Yankees are popular because they win a lot, never mind the outsized payroll that lures top players from other teams. The Cubs were laughed at for years, but no longer, not with that World Series trophy in their clubhouse. Everyone roots for the Cubs. Few want to root for the losing team, of course, but always shifting allegiances to the current winner seems like cheating.

There is a crack, a crack in everything.
That’s how the light gets in. (Leonard Cohen)

It seems natural to admire winners and to want to be like them, but our obsession with winning and winners has become unnatural. We see this in people’s shifting sports loyalties, our strange veneration of the rich and famous whatever their actions or motives, and, of course, our political class. Winning is all there is, and only losers admit to losing, as though there is some magic strength in denying reality or some virtue in running from the truth.

Vulnerability is the leading edge of truth. (Charles Blow)

One need not engage too much brain power to realize that facing error, admitting or even embracing defeat takes a good deal more resilience and temerity than winning or claiming to win all the time. Vulnerability takes guts and wherewithal and leads to learning and increased understanding. Climbing out on a limb can get you to the fruit at the end of the branch. Yes, the branch may break, and you may get hurt. But, you will learn something about yourself and about the limits of tree branches for future ventures. Chopping down the tree, collecting all the fruit, and declaring victory is shortsighted and dishonest. Once all the fruit is gone, there is nothing left. Falling gets you further.

Ever tried. Ever failed. No matter. Try again. Fail again. Fail better. (Samuel Beckett)

I’ve failed over and over and over again in my life, and that is why I succeed. (Michael Jordan)

Failure, like exercise, takes strength, and it builds strength. Failure also takes guts and resilience. Frankly, cowards never fail because they never try. Or, worse still, they make excuses and deny failure. Politicians have long mastered the art of redefining success downward. If your bill fails, you declare victory because you received more support than expected, and this flagrant disingenuousness works. More recently, the political impulse to blame others, redefine victory downward, manufacture the appearance of success, denigrate enemies as losers, and deny the possibility of error altogether has become more entrenched. Read that list aloud, and it sounds like a description of a schoolyard filled with arrogant bullies. We deny the inevitability of failure and the benefits of failing at the expense of future progress. Failure may be hard to face, but it is a special category of weakling who goes to great lengths to avoid that reality.

If I don’t have red, I use blue. (Pablo Picasso)

Failure makes us adaptive and resilient and fosters creativity, as well. Coming up short and facing that fact can stretch our abilities and challenge our sense of limitation. The best artists, scholars, athletes, and, yes, even politicians learn from failure and grow from error. The truly best live the philosophy Miles Davis espoused in his colorful way: “Be wrong strong. Otherwise, sit it the f**k out.”

Being defeated is often a temporary condition. Giving up is what makes it permanent. (Marilyn vos Savant)

The goal of trying is success, certainly, but how do we define and measure success? Is success getting the goods or getting more of the goods than others? Remember your childhood: there was always some kid on the block who would suddenly rush over to a tree or street sign or parked car, touch it, and say “I win!” Maybe your sibling was that kid. Maybe you were that kid. Everyone else would stand around dumbfounded and annoyed because they had no inkling that some spontaneous race had broken out. And no amount of protest or argument could keep this obnoxious kid from strutting around full of self-conceit. How is it different, then, to declare oneself the richest or the best or the most famous when no one else was competing or when the competition was rigged? Yet we adulate people like that every day, even those who inherited their wealth or lucked into success or found fame through infamy. How else could we have people who are famous simply for being famous? They contribute nothing. They achieve nothing. They have few redeeming qualities, but we make them celebrities or, worse still, our elected officials.

Meanwhile, failure–beneficial and productive failure–takes talent and skill and resolve. Brazen winning is simply brazen and empty.

I saw how we are all great in our shortcomings, yea,
greater because of them. (John Ashbery)

In short, no one is perfect. In fact, no one is truly great in the most superlative sense. The limitations, foibles, flaws, and errors of our lives shape our identities at least as much as our achievements and virtues. While many will hide behind this imperfection to excuse away stumbles and misdeeds, we still imagine that there is some paragon out there of morals or discipline or achievement. Of course there isn’t.

If you want to succeed, double your failure rate. (Thomas J. Watson, Sr.)

We all fail. It is a human inevitability. It is through failure that we achieve and recognize success. Pity the dull-witted, cowardly perennial winners. As Wallace Stevens observed, “Death is the mother of beauty.”

“You know sometimes Satan comes as a man of peace”: Alfred Nobel, Bob Dylan, and the Expropriation of the Prize

Originally conceived in response to a call for academic papers

“He’s a great humanitarian, he’s a great philanthropist.” (Bob Dylan, “Man of Peace”)

Alfred Nobel:
Patron of Peace
Deviser of Dynamite
Purveyor of International Plaudits

Bob Dylan:
Prophet-Poet to the Peaceniks
Detonator of Doctrine
Ambivalent Abjurer of Acclamation

Nobel’s prize-giving progeny have bestowed literary laurels on Bob Dylan, a move that–surely unintentionally–has advanced Dylan’s agenda of disruptive subterfuge. You hadn’t noticed? As Bob Dylan himself has said, “I’ve always believed that the first rule of being subversive is not to let anybody know you’re being subversive.” (Theme Time Radio Hour Archive, Episode 45: Trains). Bob Dylan is a subversive, and the Nobel Prize Committee has taken the bait.

The Nobel Prize in Literature purports to honor the greatest living literary artist at the moment, which is, as I have noted before, an absurdity on its face and of little importance beyond the fleeting elevation and enrichment of one lucky scribbler and his or her publisher. In 2016, though, perhaps for the first time, the prize’s recipient has appropriated the prize itself as artistic grist simply by dint of its conferral, and the prize has suddenly become relevant to artistic endeavor.

The prize simultaneously overplays and normalizes Dylan’s work. We can already see this dynamic in Dylan’s audacious non-responsiveness, the bafflement and delight of the public, the media’s bemused hostility and exploitation, and the sundry responses from other writers. Every album Dylan releases, every major interview he gives, and even his paintings and occasional car commercials ignite a similar conflagration of anger, confusion, amusement, and celebration.

Witness the hype over Dylan’s purported penchant for plagiarism. Dylan has, if you have not heard, appropriated the words and phrases of others and incorporated them into his lyrics, his memoirs, and even some interviews. He has long appropriated tunes in the practice of what is known as the “folk music process.” For instance, his earliest songwriting efforts, such as “Song to Woody” and “Blowin’ in the Wind,” lift their tunes from other songs (“1913 Massacre” and “No More Auction Block,” respectively). Fittingly, those source songs often have their own antecedents and progenitors. Similarly, his paintings are sometimes based on photos taken by others and used without the permission or even the awareness of the photographer.

Some have defended Dylan’s mode as conceptual art, an accepted and well-established creative act. Still, perhaps because he is Bob Dylan, pop star, the media have largely condemned him as a cheater, as though there were no difference between the Dylan lines

More frailer than the flowers, these precious hours
That keep us so tightly bound
(“When the Deal Goes Down”)

And the all-but-forgotten “Poet Laureate of the Confederacy’s”

A round of precious hours
Oh! here, where in that summer noon I basked
And strove, with logic frailer than the flowers.
(Henry Timrod, “Rhapsody of a Southern Winter Night”).

Synthesizing something novel from what exists is as creative and artistic an act as coining a work of pure originality. Unless, of course, you are a college student working on a paper. That’s plagiarism!

This proclivity for appropriation is key to grasping Dylan’s methodology. We can call it reflective magpie-ism. I have maintained elsewhere and continue to maintain that Dylan is at his core a satirist–a subverter of cultural assumptions and their consequent expectations. This stance is a feature of all his public acts–artistic or otherwise–and informs his Promethean public personae, which turn the mirror (or is it a lens?) back toward the audience. Jonathan Swift noted that the viewer of satire sees everyone’s face but his own reflected back (“The Battle of the Books”). Dylan understands and counts on that phenomenon. He invites and relies on us to project wants, expectations, and norms onto a mirror, which casts them back on us. In fact, his 2012 concert tour featured elegantly framed mirrors of various sizes on stage facing the audience. Flashes from fans’ cameras presumably were rendered useless–an excellent metaphor for the Dylanesque. Our own desire to see and capture more of Dylan results in a glaring awareness of our self-blindness.

Glaring blindness.

I wear dark glasses to cover my eyes
There are secrets in ’em that I can’t disguise.
(Dylan, “Long and Wasted Years”)

Dylan’s dark glasses (an emblem of blindness!) are presumably not perfectly opaque, at least not from the inside. From the outside, the viewer sees nothing of Dylan’s secrets and only a distorted reflection of the self. Dylan’s use of personae and masks is well documented, but here he suggests that a mere disguise is not adequate to hide the secrets his eyes would betray. But there is a price. The dark glasses must filter his vision. As with sunglasses, colors may pop more readily–a boon to any poet/lyricist–but shadows may be darker still. The dark glasses create a dualist, even Manichaean, perspective on reality, one that lends itself to the binary stance of most satire. Dylan’s satiric outlook, though, is less corrective than disruptive, less about shaming than subverting. Dylan does not want to point a finger (at least, not anymore) as much as rattle the cage, which he has always done. But, as we can see from the quotation above, he is not about to expose himself to the same scrutiny–an unremarkable act of hypocrisy.

Dylan’s critique and cage rattling extend beyond his lyrics. They infuse the personae he adopts and all his public acts or conspicuous inaction, which have amused and vexed observers for decades. The Nobel Committee’s choice set this exercise in motion once again. Witness the great upheaval the conferral of the prize provoked, followed by the indignation at Dylan’s bizarre silence. Then, when Dylan deigned to respond, there was amusement mixed with anger at the ambiguity and inadequacy of his response. Dylan’s refusal to attend the ceremony culminated in a torpid reading by the American ambassador of Dylan’s rather excellent and modest speech and a shaky rendering of a Dylan classic by his old companion Patti Smith. The ceremony was both touching and infuriating, elegant and absurd. The Nobel Committee would have been perfectly justified in giving the prize to another writer just as worthy or worthier, but few would have so instantly and adroitly co-opted the conferral itself. The awarding of the Nobel became a small part of the Dylan mystique–not the rock star Dylan mystique but the mystique of the subversive Dylan–the disrupter. Dylan owned the prize and the process as the world gawked and gasped, and he barely said or did a thing. How explosive a figure he is!

Alfred Nobel, that endower of peace who bequeathed dynamite to the world, has once again unwittingly loosed the devil, this time via his legacy prize committee. Dylan has long sought to overthrow dogma even as he spouts it, to play at once the icon and the iconoclast. The Nobel Prize conferral has become just the latest expedient in his long scheme of artistic subversion.

The Martyrs Call the World

They like to take all this money from sin, build big universities to study in,
Sing “Amazing Grace” all the way to the Swiss Banks.
(Bob Dylan, “Foot of Pride”)

I have long decried the false dichotomy between education in the liberal arts and vocationally based education. Both, when done correctly, should draw enthusiastically from one another and–on balance–deliver similar results even within the framework of specialized content. Instead, so-called pragmatic education evokes images of gainful work while liberal arts education evokes scenes of self-indulgent contemplation. Furthermore, in a culture that rests on capitalist ideals and Christian assumptions, work–in its crudest sense–has come to represent, or simply to be, virtue, no matter the necessity or even the value of that work. The presumption, simply put, is that work has intrinsic value. Inevitably, wealth, which is already facilely associated with work, is seen as standing in a causal relationship with effort. A standard script emerges. It has variations but goes much like this: hard work, a virtue, leads to wealth; therefore, all wealth is the result of and a sign of virtuous behavior. The syllogism behind this script is, of course, fallacious.

This script, in its sectarian extreme, manifests in Puritan thought (a foundational American/Western doctrine) and, more recently, in “Prosperity Theology,” which posits that material goods and luxury are intimations of God’s blessings. We can also find it woven throughout our more secular institutions–financial, athletic, artistic, and even academic. When we see financial or material success, however measured, we assume it is earned, an assumption that serves as a hallmark of meritocracy.

The implications of this script have their detractors, of course, in various walks of life, including those listed above. Perhaps, though, dissent emanates most frequently and deliberately from artists and the arts, and poets are particularly vocal in their disagreement with these assumptions.

For instance, Walt Whitman, in the opening stanza of “Song of Myself,” famously boasts,

I loafe and invite my soul,
I lean and loafe at my ease observing a spear of summer grass.

His is the rallying cry of the pure poet, particularly since the Romantic Era.

William Butler Yeats toys with this notion of the value of work in his narrative lyric “Adam’s Curse.” The poem describes an evening conversation between the poet and two sisters and can stand as an example of Yeats’ narcissism and his sexist sense of entitlement. On the other hand, readers in a more generous mood could read it as Yeats’ send-up of his own inadequacies as a conversationalist among women he seeks to impress. In short, he may just be an awkward young man with blowhard tendencies ineptly trying to impress some pretty women.

The opening sets the scene of a late summer gathering and picks up, in medias res, the discourse, which is almost a monologue, with the pedantically bumptious poet holding forth on his second-favorite subject, poetry.

I said, “A line will take us hours maybe;
Yet if it does not seem a moment’s thought,
Our stitching and unstitching has been naught.”

As he does throughout the narrative, the speaker blithely blathers on. He then, in a fit of remarkable superciliousness, contrasts the rigors of poetic labor with the relative ease of physical toil.

“Better go down upon your marrow-bones
And scrub a kitchen pavement, or break stones
Like an old pauper, in all kinds of weather;
For to articulate sweet sounds together
Is to work harder than all these”

The forcefulness of this near-breathless diatribe suggests the poet’s conviction and his self-righteousness. Poets toil and suffer more than physical laborers. Imagine.

But the next lines are most revealing.

                                             “… and yet,
Be thought an idler by the noisy set
Of bankers, schoolmasters, and clergymen
The martyrs call the world.”

I have long enjoyed the sarcasm of “the martyrs” and the ironic lack of self-consciousness in the phrase. Note, too, that odd list that represents “the world”–the respectable professions–“bankers, schoolmasters, and clergymen.” Why, the poet seems to lament, why do these self-serious posers get to be “the world”? Poets work just as hard, maybe harder. And poets certainly contribute more–or so he suggests.

And here arises the dichotomy. The poets vs. the world. The humanities vs. the sciences. The liberal arts vs. professional education. Is it really thus? Are there no accounting majors who act on stage, no history majors running large technical companies? Of course there are, and there is compelling evidence that they are not anomalies. Ask Wallace Stevens, for one–a lawyer, insurance executive, and poet. Far from being the mere pragmatic laborer, he wrote,

Let be be finale of seem.
The only emperor is the emperor of ice-cream.

It would be tedious to cite the many studies and articles that maintain that a college student’s major does not matter much to future success or even career choice.

Some argue otherwise, but they generally start from the premise that some professional fields pay more than others. Of course that is the case, but these studies also assume that a student’s career field is always the same as the major field, which is flat-out wrong.

And for years, evidence has mounted to demonstrate that employers of all sorts are looking for precisely the kinds of skills and capacities that a liberal arts background, whatever the major, fosters. The best-rounded students have the sort of liberal education that allows them both to broaden their learning and deepen their understanding of a particular area or areas. This type of learning is often described as a “T”–wide at the top and focused vertically. Believe me, I am not going out on a limb or breaking new ground when I describe the value of T-shaped learning.

Yet, we are subjected to politicians and thought leaders who loudly, if not compellingly, contrast philosophers and welders, sociologists and engineers, anthropologists and everyone else. The false dichotomy that the academy itself established–pure learning vs. pragmatic learning–has become the rallying cry of external “reformers” bent on nothing less than the utter upheaval of all that higher education values and represents. For instance, the silly and hyper-academic argument about whether math and the sciences are part of the liberal arts still simmers. Just recently, the topic came up in an audience question during the closing plenary at the venerable Association of American Colleges and Universities annual meeting.

Enough. We do ourselves no favors and threaten injury to our students and progeny by continuing these petty squabbles. Too often, these spats reek of interdisciplinary snobbery and gloating–ignorance in the extreme. Good learning is liberal learning that crosses boundaries and integrates knowledge and its application. Colleges and universities and the educators who lead them pretend otherwise at their own peril.

Why Bob Dylan Deserves the Nobel Prize (And, Perhaps, You Do Too)

The disapproving hubbub over Bob Dylan winning the Nobel Prize has been relentless and pretty predictable. First it was that he won, then that he did not immediately acknowledge winning. Then, when he acknowledged it, he was not humble enough. And then he said he would not make it to the acceptance ceremony, which raised a cacophony of contempt. He claims he has a previous commitment, and up rises the hue and cry. I do not want to make excuses for him, but maybe he promised a grandchild a special day on that date, or maybe he has a surgical procedure scheduled, or maybe that is the one day a year he reserves for bathing in the blood of virgins to maintain his vigor so that he can continue touring at age 75. I have no idea, and neither do you.

Now the outrage is renewed at his sending a speech for someone else to read while Patti Smith plays a Bob Dylan song. In short, Dylan has been universally declared to be horribly, monstrously, inexcusably insufficient in his gratitude. Reports (rumors, really) have maintained that Dylan’s friend and manager, Jeff Rosen, lobbied for Dylan to receive the prize. I do not know if that is true or even how one lobbies for a prize in literature, but, if true, could it mean Dylan himself desired the prize he now eschews? The ingrate!

All in all, the castigation of Dylan’s ostensible ineptitude as a groveler seems odd. As Barack Obama wisely observed when Dylan did not swoon obsequiously at receiving the Presidential Medal of Freedom, “That’s how you want Bob Dylan, right? You don’t want him to be all cheesin’ and grinnin’ with you. You want him to be a little skeptical about the whole enterprise. So that was a real treat.” Just right. If Dylan had fawned over Obama, his knee-jerk detractors would have called him a sellout or a toady. In related news, Dylan recently chose to skip the White House reception for this year’s American winners of the Nobel Prize.

All this hubbub, all the hue and cry, all the castigation is finally pedestrian, which is why I have not previously weighed in on the topic. Dylan, as always, is playing it, playing us, for all he can. Nonetheless, I now have a few observations about the reaction and the prize itself.

One of my favorite responses to Dylan winning the Nobel came from members of PEN America, the venerable organization for writers and writing professionals. Some, mostly not women and not people of color, praised the decision. Others, mostly women and people of color, were against it. Almost all seemed supremely silly if not flat-out supercilious. The reactions in favor ranged from “hey, yeah!” to “it’s a bigger tent now.” The reactions against ranged from “songs ain’t lit” to a unilateral declaration that the award should have gone to a different Bob–i.e., Marley, who died in 1981. (Imagine the Nobel Prize committee freed to offer the prize posthumously. Instead of tracing Dylan’s art to Homer and Sappho, the prize could simply go this year to Homer and next year to Sappho. The Nobel Peace Prize could go one year to Jesus of Nazareth and another year to Caesar Augustus . . . for ushering in the long Pax Romana, of course.)

Now, as to why Bob Dylan deserves the Nobel Prize. The essence of why is captured in the reaction of the PEN members. The whole notion of awarding a singular prize for achievement to an individual–of recognizing one among billions–is inherently absurd. This one individual is the best literary artist in the world since last year and until next year. Whatever inherent value the Nobel has, it does not lie in including this one and thereby excluding all the others. We can see this reality in the committee’s occasional propensity to send messages.

For instance, why did Winston Churchill win the Nobel for Literature in 1953? The official reason is “for his mastery of historical and biographical description as well as for brilliant oratory in defending exalted human values.” I am not aware of any extensive and serious literary studies of his biographical writing, whatever its historical importance. (Before you object, such studies exist for Dylan’s songs.) Churchill’s oratory certainly was soaring and important during the Second World War as it inspired Britain and thereby helped secure Europe, but is it “literary” in the way we usually mean that term? Clearly, the committee meant to honor a man the members admired and toward whom they felt gratitude.

Or consider last year’s prize, which went to a Belarusian journalist, Svetlana Alexievich, who captures oral histories–not the usual novelist, poet, or (if we must) playwright who best fits our conventional view of the literary. Bigger tent, indeed.

I do not want to disparage the literary contributions of Churchill, Alexievich, or any other author. In fact, I am all for an expansive view of the literary. I find the concept of genre generally problematic since the best artists seek to explode such confinements. Hence Bob Dylan and the song as literature. And Dylan is not even the first songwriter to win the prize, by the way.

Dylan is a wildly influential and widely admired songwriter. And, yes, that influence is greatly augmented by the relentless promotional machine of Columbia Records, and, yes, he is ridiculously remunerated for his literary prowess. Let us not kid ourselves, though. The myth that the only worthy artist is an impoverished artist eschewing wealth and fame for the pure pursuit of the ideal was born of the Romantic era and was never true. For every John Keats there was a Lord Byron and his circle. It would be the rare artist who does not dream of and strive for reputation and financial success. Furthermore, when recent Nobels have gone to relatively obscure authors, the hue and cry was about their relative lack of influence. So which is it? Who is the Goldilocks Nobel winner who stands perfectly between success and saintly obscurity? Sully Prudhomme?

Dylan’s lyrics exalt what we think of as literature. This claim is true for any number of artists working in a variety of forms. The Nobel Prize undoes itself because it excludes multitudes of the deserving and even the superior. It excludes you, and it excludes me. Maybe neither of us should be much surprised at our exclusion, but it excludes everyone else as well–except the winners.

Now, if I have to read one more half-baked piece about how someone’s favorite artist deserves the Nobel in Literature far more than that ancient croaker Bob Dylan, I may have to stoop to Twitter to complain!

Full disclosure: As I was completing the preparation of this blog post, Stephen King published an article in Rolling Stone with a similar argument. I would not be so churlish as to accuse him of supernaturally anticipating my essay and producing his own preemptively plagiarized offering, even though such a plot element would fit with the general themes of his oeuvre. Nor do I think that I would be unduly boasting if I pointed out that my essay is far superior in every possible way to his.

Thinking Gray: In Praise of Deliberative Leadership

Steven Sample, in The Contrarian’s Guide to Leadership, contends that effective leaders frequently “think gray.” In other words, they work hard to remain as open-minded as possible by not forming strong opinions when they encounter new concepts, people, or circumstances. Obviously, no one can think gray perfectly since doing so runs counter to our nature. It is a practice, imperfect but necessary, like mindfulness or medicine. Driven by bias and other factors, we want to form opinions, make comparisons, leap to conclusions, and forge judgments whenever we encounter novel challenges. Good leaders, though, sometimes suspend their inclinations so as to remain neutral until such time as they have to form an opinion, take a stance, or make a decision. They do so with intention and discipline. The advantages are substantial.

Thinking gray is difficult and seems anathema to what we typically regard as good leadership. Great leaders are imagined and portrayed as being preternaturally decisive. Proper leaders, we assume, take a stand in the face of doubt and stand by it whatever the odds. They defy convention, opposition, and even fear to hold a position or direction. We admire those whom we regard as such steadfast leaders. We laud their boldness, their resilience, even when we disagree.

And, on occasion, we need such leaders, particularly in times of true crisis. But times of true crisis–and I don’t mean times of great concern or festering paranoia or simple urgency–are fairly infrequent in the quotidian swirl of decisions and choices.

All too often, such leaders end badly because they are reactive and not deliberative. What we praise as steadfast may actually be stubborn. What we applaud as brave may in reality be heedless, reckless, or plain stupid. Having strong convictions and acting on them despite all evidence to the contrary has more in common with the mindset of toddlers than with that of the fully formed adult. Young kids are spontaneous and willful, and while we may enjoy their freedom and antics, we do not imitate them.

The true challenge, except in rare circumstances, is to withhold judgment. Not to react but to consider, to weigh. Such behavior can look like confusion, cowardice, or inanition to those who do not appreciate the virtues of consideration and evaluation. 

The deliberative stance is thinking gray, choosing not to choose the black or the white until such time as a decision must be made, and even then perhaps finding a compromise. After all, what is the virtue in making a decision in haste? If I can wait awhile to decide, why should I rush? It seems foolish to react suddenly without cause and outside of reason. Such reactions are inevitably prone to emotional bias and are potentially informed by partial knowledge and assumptions. Besides, what if something changes in the circumstances or in the context governing the decision? A quick, reactive decision–the decision of a rash decider–may precede the change, which could lead to avoidable error.

Deliberation, reasonably deployed, helps avoid such a situation, but by deliberation I do not mean avoidance or procrastination. A leader should run at challenges and problems but should not leap to conclusions unnecessarily. Nonetheless, quick decisions are frequently desirable or even necessary. Many solutions require only a rapid application of logic or even a flash of insight. Most choices are so minor or simple that we would be silly to delay them. For instance, lengthy deliberation over what to eat for lunch or whether to get a haircut would be a waste of time. These choices, like most, require only a quick and slight exertion of brainpower. But complex or momentous decisions demand the practice of thinking gray.

By extension, the gray thinker will be willing to review, revisit, and revise decisions. Even gray thinkers will make mistakes, but their mistakes will be of a different kind and degree than those of the reactively decisive, the type more commonly associated with admirable leadership. The gray thinker’s mistakes, whatever the consequences, will remain steeped in principle and logic, so gray thinkers own their mistakes. We should not confuse deliberation and self-review with indecisiveness and self-doubt. The gray thinker has the discipline to withhold judgment and the strength to reconsider and change.

Too often we laud the shoot-from-the-hip cowboy making snap judgments, but that type of leader is out of place in our time. Life is not a cheeseball Western with clear white-hat-black-hat delineations offering choices as the bullets fly. Our experiences can seem rapid-fire, but they only seem that way. That is why regularly thinking gray is the mark of the true leader, the one to trust. Only the gray thinker will make the most consistently solid and correct decisions, and only the gray thinker will own mistakes rather than merely admit to them. The practice of gray thinking is less than exciting, but it is honest and true.

Rigor Starts with “Why?”: Rigor, Rigamarole, Rigor Mortis, and the De Rigueur

Educators, like other professionals, often confuse hard work with virtue. I am not arguing against hard work as a good and even necessary thing, but is hard work inherently good? Is it better to work hard or to do well? Is there a distinction between the two? One can work hard in the service of nothing, simply in the service of work for work’s sake. Doing so is not inevitably bad, but what is the point of effort in the absence of meaning–in the absence of achievement, progress, or doing well? Some religious orders engage in repetitive or tedious labor as a virtue, but their service–physically productive or not–has a contemplative and spiritual end. In contrast, empty, rote work is simply that: empty and rote.

In higher education we love to expound on the virtues of rigor, and rigor–I would contend–is, indeed, inherently good. Rigor is the process of setting challenges and striving for improvement, and, yes, it is often difficult and requires hard work. But hard work is not necessarily rigorous, and hard work without rigor can be, in fact, mere busy work or rigamarole. Both may be difficult and sometimes appropriate, but only rigor tends toward meaningful learning.

Consider this example. I have seen departments and individual professors place an extreme emphasis on citation rules in writing. Undergraduate students, even freshmen, are expected to apply APA or MLA style with near perfection and without the aid of apps and electronic reference tools. When questioned, those who advocate this practice claim that these styles are essential to writing in their fields. They will say that the citation styles are the mark of good writing and that mastering them will bolster the students’ success in graduate school and beyond.

Perhaps there is some truth in that claim, but mastering a citation style has no bearing on whether a student has crafted a clear, logical, and convincing argument. Learning MLA style is hard and can be useful, I suppose, but learning to organize thoughts in a persuasive manner requires rigor. It will naturally be easier for some and harder for others, but it will always be rigorous. Rote aspects of writing, such as an overemphasis on citation styles or grammatical and mechanical minutiae, just like rote formulas for writing, will certainly end in dead writing and the onset of cognitive rigor mortis in the reader.

Recently I attended a conference and heard a talk by Carl E. Wieman, a Stanford physicist and Nobel laureate. He was addressing the way science is taught to undergraduates and describing his experience researching approaches that lead to better learning outcomes through what he called “a scientific approach to teaching.” Instead of reenacting the more traditional or de rigueur approach of lecturing on terms and processes to be memorized, he uses what he calls “practice-with-feedback, research-based active learning,” which starts with the introduction of a question to be answered. The students work on the problem together before the instructor engages them in a class discussion. The instructor provides frequent feedback throughout.

His and others’ experiments with this inquiry-and-feedback-based method demonstrated marked improvements in student learning and even attendance when compared to what he calls the “pedagogical bloodletting” that traditional approaches represent. You can hear a podcast of his presentation and view his slides here.

***

A familiar truism holds that “a little knowledge is a dangerous thing,” but this aphorism is a misquotation of Alexander Pope in An Essay on Criticism. Pope actually wrote,

A little learning is a dangerous thing;
Drink deep, or taste not the Pierian spring. (Emphasis mine)

The distinction between “knowledge” and “learning” is instructive. Knowledge is more stuff stuffed into an overstuffed head. Learning is the ability to use that knowledge and new information. Accumulating knowledge is an accomplishment and often demands hard work, but it ends there. Learning, though, applies that knowledge and is thus a real challenge that requires rigor.

Both rigor and rigamarole are responses to questions. Rigor starts with “why?” and continues with “so what?” Rigamarole starts with “what?” and sometimes asks “how?” but not much more. Answering these questions reveals much. “What?” and “how?” are necessary and important questions, but answering them alone does not lead to progress. Answering “why?” inevitably leads to inquiry, analytical thought, and synthesis of–yes–knowledge. We call this outcome “understanding.” The question “so what?” leads to evaluation and even deeper understanding. Rigor starts with “why?” and true learning starts with rigor.