The last five years of Jonathan Swift’s life were notable for their escalating misery. In fact, both “notable” and “misery” sell the situation short. For more than a century after his death in 1745, the marked mental and physical collapse that immediately preceded it ascended into legend. He had long suffered from bouts of dizziness and nausea that frequently debilitated him. Roaring tinnitus kept him awake at night before he went deaf. Soon he began to lose his memory. “I am so stupid and confounded,” he wrote to a cousin in 1740, “I cannot express the mortification I am under in both body and mind. All I can say is, that I am not in torture; but I daily and hourly expect it.” His final moments of lucidity were spent reflecting on and longing for death. Once he lost even that, the stories, always told by others, got grimmer. He refused to eat; his eye “swelled as large as an egg,” and he had to be restrained from trying to claw it out. He was declared insane, spent the remainder of his days incoherent in his rocking chair, and was suspected of having been put on display by servants for paying onlookers.
What precisely caused these conditions, and whether or not some of the stories surrounding them were accurate, is a matter of speculation. It seems fairly plausible that his physical symptoms stemmed from Ménière’s disease, though whether he later suffered a stroke or dementia is less clear. Not that it mattered to his critics, who saw his condition as fateful, even appropriate. Long before psychoanalytic criticism could take hold, noted dislikers of Swift’s writing couldn’t help but relish the irony that the most gifted English-language ironist, whose still-unmatched chemistry of logic and imagination made life harder for pretty much anyone he thought deserved it, had been reduced to, in Samuel Johnson’s words, “a driveller and a show.” His was “a cautionary tale told over and over again” of reveling too deeply in misanthropy and rage, and of the pride and thwarted ambitions that supposedly made him more susceptible to them.
And in fairness, to read any account of Swift’s life, even the ones that aren’t blanket critiques, is to witness a verbal parade of disappointment. The things he most wanted in life—mainly a bishopric in England which would grant him public influence and validate his massive but routinely wounded ego—he never really got. Though he served the Earl of Oxford and Lord Bolingbroke ably as the most effective polemicist for their interests, he was never high on their list of priorities; and truth be told, it’s hard to see someone who wrote “Last week I saw a woman flayed, and you would hardly believe how much it altered her person for the worse” becoming a Spiritual Peer. His deanery at St. Patrick’s was a rather last-ditch consolation, returning to Ireland being the absolute last thing he wanted, and surely the ferocity of his Irish tracts was driven as much by his “exile” from London as by the policies that were coming over from it. “Swift was outside the shrewd discipline of talent,” Carl Van Doren wrote. “He could not sit down and write prose and verse as if they were sufficient ends. … He used them in his tragic role, in his war of ambition, not because he valued them, but because they were the only weapons he had.”
Nearly three centuries later, this aspect of Swift’s example still pays dividends. I gathered as much, anyway, from Emily Esfahani Smith’s 2013 article in The Atlantic—seen by me now, presumably, because it is geared toward incoming and outgoing college students—on the pitfalls of pursuing “ambition” at the expense of “relationships.” “For many [high school students], going away to college will be like crossing the Rubicon,” she wrote. “They will leave their families—their homes—and probably not return for many years, if at all.” This was a matter of concern for Smith, and to illustrate the point, she used Rod Dreher’s example:
That was journalist Rod Dreher’s path. Dreher grew up in the small southern community of Starhill, Louisiana, 35 miles northwest of Baton Rouge. His family goes back five generations there. His father was a part-time farmer and sanitarian; his mother drove a school bus. His younger sister Ruthie loved hunting and fishing, even as a little girl.
But Dreher was different. As a bookish teenager, he was desperate to flee what he considered his intolerant and small-minded town, a place where he was bullied and misunderstood by his own father and sister. He felt more at home in the company of his two eccentric and worldly aunts — great-great aunts, actually — who lived nearby. One was a self-taught palm reader. She looked into his hand one day when he was a boy and told him, “See this line? You’ll travel far in life.” Dreher hoped she was right. When he was 16, he decided to leave home for a Louisiana boarding school with the intention of never looking back.
And it seems that he did not, for the most part, preferring to pursue a career in journalism in cities across the eastern and southern United States. “I was caught up in a culture of ambition,” he admitted. His pursuit of a career, however, erected “invisible walls” between him and his sister Ruthie, an elementary teacher who remained in their hometown. “Ruthie could not understand Dreher’s lifestyle. Why would he want to leave home for a journalism career? Wasn’t Starhill good enough? Did Rod think he was better than all of them?” Dreher was brought back to Starhill upon the diagnosis of Ruthie’s lung cancer, which took her life after 19 months, and his book The Little Way of Ruthie Leming in part chronicles his experience of seeing a community come together to help his sister, going so far as to organize a concert to raise $43,000 to pay her medical expenses. Dreher and his family have since returned to Starhill as permanent residents.
The “culture of ambition” being left behind is one of Machiavellian excess, wherein things like intimacy and friendship are either luxuries or means to more materialistic ends. Ambitious people, though better educated and less obstructed in the pursuit of high income and social status, put their long-term well-being at risk for a life of debased utilitarianism. Dreher leaves behind east coast media friends who, despite their success, are left “otherwise empty and alone.” We are a nation of Daniel Plainviews, at once enriched, enfeebled, and enraged by everything we have strived for; excluding, of course, the less fortunate competition we have laid waste to in our mad pursuit. The rather Manichean framing of the community against the individual is not atypical of Dreher, whose ambitions bore and continue to bear impressive fruit as a journalist and blogger at The American Conservative (a fine magazine that has published me many times). Dreher’s communitarianism is commendable, but the simplifications of his polemical passions do it a disservice, confusing, as culture warriors are wont to do, a phenomenon for an experience.
No one should deny that ambition is a kind of greed, and so often corrosive. When one is blind to everything but advancement, as every striver is from time to time, the shortening of life said to be linked to ambition is clearly felt: a simultaneous pit-in-stomach lethargy and a violent excretion of energy. Modernity has only given this dilemma a surface sheen rather than an internal soothing. “The new work ethic ultimately failed,” Sarah Perry argues:
It proposed that work was at once an act of self-fulfillment and an act of self-denial; but how could this be? It promised value based on individual self-determination, but in practice delivered work in corporations and bureaucracies. Social mobility proved to be limited. Working less and consuming more proved to be more attractive than working for work’s sake. The motivation of individual self-esteem and prestige through a career proved more sturdy than the motivation of the work ethic.
“Bourgeois life was based on the institution of the career,” John Gray wrote in Straw Dogs. “None but the incorrigibly feckless believe in taking the long view. Saving is gambling. Careers and pensions are high-level punts. The few who are seriously rich hedge their bets. The proles—the rest of us—live from day to day.” One would be foolish not to see that pursuing work that combines reasonable pay with some degree of cognitive complexity and/or independence requires, in most cases, instincts for high-court networking and the endurance to do it at the speed and stability of a roller coaster. But to suggest that ambition is limited to that type is to develop a tunnel vision that crowds out other forms of ambition and overlooks what truly drives people to act. Not that their being overlooked is entirely inadvertent.
Reviewing Lena Dunham’s memoir Not That Kind of Girl in Slate, Katy Waldman expresses dismay that its author has provided an approximation of “Lena,” a generation-wide imaginary friend, filled with “just-so” anecdotes not dissimilar from those found in Tiny Furniture and Girls, rather than Lena Dunham, their creator. “I want to read about Dunham’s intense drive to be seen, and yet to be in perfect command of her likeness.” Even if one, like me, is not an admirer of Dunham, to say that her efforts are not the result of belief in an artistic vision, dedication to craft, and no small sliver of business and production savvy, in addition to privilege and luck, is naïve. The reasons she conceals this obvious truth behind an irritating force field of self-deprecation can be many: reticence to divulge insight to would-be usurpers, genuine self-consciousness over waving obvious success more overtly in fans’ faces, or total disinterest. Dunham is not obligated to divulge anything she doesn’t want to, but Waldman had a point. The book may well have been more appealing had it dispensed with the obvious artifice and validated the drive which many in her age group possess but for which they have been roundly criticized.
Fostered for modernity, millennials are beset with a kind of start-stop dynamic, as if there were a collective switch to control us, encouraging us, on the one hand, to be “self-starters” while, on the other hand, telling us to cool our jets just as soon as propulsion has steadied. Out of what kind of concern this is done I can’t say, but it has left a whole swath of people ambivalent, uncertain, and possibly ashamed of nurturing their abilities.
But seeing as how generalization gets us nowhere, I can only meet Dreher’s experience with my own.
It is surprising, all the ways in which one can discover that he or she is an arrogant prick. My coming out was the result of a few factors. I do not know which came first: the discovery of my intelligence or the discovery of the learning disability that kept me largely out of the “mainstream” curriculum from kindergarten well into high school. By no means did any teacher, administrator, or counselor say outright that I was stupid, and I don’t believe they thought that. More likely they thought I was peculiar. I did not take well to classroom education, but scoped out encyclopedias with a singular avarice. I memorized the presidents around age five or six, and created my own storybooks about ghosts in graveyards that I left in the nap area and insisted my classmates read. At best I was interesting enough, but not essential, and may well never be. Some did take notice: a few history teachers, and a guidance counselor who was adamant that I not drop journalism. All the same, I was a “special education” student, with all the stigma that came with it. I was, to put it another way, a spazz rather than a nerd, with a talent wholly unfocused and stuck, moreover, in a town where there were few people to emulate.
It was less a matter of low expectations (I did get into a four-year college and graduated within that timeframe) than of uncertain expectations, which was at once insulting and liberating: I felt the need to prove my intelligence, but at least I had a wide berth in going about it. The hope was that eventually I would be made useful. I delved into writing, I published a zine, I worked for whatever magazine would hire me. Admittedly there were a lot of false starts and wrong turns involved, not to mention very little money, and even in the face of a whole mess of discouragement I pressed on, because the alternative of not even attempting seemed just wasteful. I was quite convinced that what I had to offer was of value, that I was good at what I did, that I had sharp instincts and generally good taste, and that when combined, whether in Biopsy or in an essay for another magazine, they offered something that could not be gotten from someone else. Wanting a certain success and believing it can be obtained (which is distinct from thinking one is entitled to it) is actually very easy, very common, and very uninteresting. It is when people start agreeing with your self-assessment that things become more challenging.
There is something fearful in having even some of your greatest hopes realized. My desire to make use of my intelligence, and my ability to phrase that intelligence in a certain way, led to its recognition. Few perhaps will admit to a feeling of being let into something from the outside, but that is how it felt for me. And though I am on the periphery of that interior, it is not nothing; it requires adjustment. Ambition, properly met, should ideally give way to discipline and humility. Respect is largely seen as a reward, but it is also a kind of trust. I take earning someone’s attention seriously. I do not tailor pieces to any one person’s preferences, but I do write about subjects I deem interesting in a way that doesn’t waste people’s time, doing my best to write more carefully and argue more clearly without docking the style points I so deeply and, in my mind, rightfully cherish.
Humility is more of a challenge; my weaknesses are plentiful. Disappointment and setback are harder to take after having some successes. I have grown increasingly tired of being overlooked (as I tend to see it) rather than resigned to it. Boredom is no less alleviated; joy at the presence of advocates gives way to despair at the absence of critics; and being stuck in the suburbs (even one an hour’s train ride from Manhattan) means living in a kind of stasis, wherein a cool indifference to anything middlebrow gives way to ugly intolerance. Camaraderie among likeminded writers can just as easily give way to competition, and I am so fearful of the influence of others on my style and interests (that is, of more overtly skewing them for popularity) that I don’t really read a whole lot of contemporary writers. Sometimes I read to hate, and to judge others by standards that sometimes I can’t even meet. I cherish those days when I am more even-keeled, when I can better accept limitations while still trying to hone my strengths. There is no greater experience, for instance, than finding people in your field who are on the same wavelength, so to speak, but whose individual abilities stoke not envy but encouragement to improve. Indeed, one such friend, while enduring this or that career-related anxiety from me, offered some clarity: better than thinking of the job you want in 10 years is thinking of whose outlook or priorities you want in 10 years. I did not have an immediate answer, but it did allow me to reframe old nagging concerns in new, less burdensome ways. Talking with another friend about his job search, he said his main ambition was to work with “good people,” which was notable if only because I had literally never considered that a priority.
There is value in Smith’s and Dreher’s advocating for the simple, unambitious life, and certainly in cultivating rather than shunning one’s roots. In fact, Dreher would be happy to know that my immediate area in New Jersey is teeming with former high school classmates, many having married each other. Fairly impressive in the home state of “Born to Run.” Still, the zero-sum approach to ambition is unhelpful, and unfair to a whole swath of people whose idea of being driven is not so much material success (security would be nice, but that might be another matter) and blind status-seeking as simply doing good work. The ablest cure for the fallout of ambition is perspective: the reality that how we see ourselves and our skills is never the same as how others see them, and that in some cases, perhaps even most cases, how others see them is better. This happens at any scale. One hopes that Swift would appreciate the irony that he (along with his rival Daniel Defoe) is often confused for a children’s author; or that Peter O’Toole caused a mass walkout with his 1984 reading of “A Modest Proposal.” As Van Doren puts it: “He had won the war in which he hardly noticed he was fighting because he had fought with so much passion in a war which was not worth it.”