On a cool May day in 1758, a 10-year-old girl with red hair and freckles was caring for her neighbor’s children in rural western Pennsylvania. Within moments, Mary Campbell’s life changed forever when Delaware Indians kidnapped her and absorbed her into their community for the next six years. She became the first of some 200 known cases of white captives, many of whom became pawns in an ongoing power struggle among European powers, American colonists and Indigenous peoples straining to maintain their population, their land and their way of life.
While Mary was ultimately returned to her white family—and some evidence suggests she lived happily with her adoptive Indian community—stories such as hers became cautionary tales among white settlers, stoking fear of “savage” Indians and creating a paranoia that escalated into all-out Indian hating.
From the time Europeans arrived on American shores, the frontier—the edge territory between white man’s civilization and the untamed natural world—became a shared space of vast, clashing differences that led the U.S. government to authorize over 1,500 wars, attacks and raids on Indians, the most of any country in the world against its Indigenous people. By the close of the Indian Wars in the late 19th century, fewer than 238,000 Indigenous people remained, a sharp decline from the estimated 5 million to 15 million living in North America when Columbus arrived in 1492.
The reasons for this racial genocide were multi-layered. Settlers, many of whom had been barred from inheriting property in Europe, arrived on American shores hungry for Indian land—and the abundant natural resources that came with it. Indians’ alliances with the British during the American Revolution and the War of 1812 exacerbated American hostility and suspicion toward them.
Even more fundamentally, Indigenous people were just too different: Their skin was dark. Their languages were foreign. And their world views and spiritual beliefs were beyond most white men’s comprehension. To settlers fearful that a loved one might become the next Mary Campbell, all this stoked racial hatred and paranoia, making it easy to paint Indigenous peoples as pagan savages who must be killed in the name of civilization and Christianity.
Jonathan Lear describes his book Radical Hope (2006; German translation 2020) as a work of “philosophical anthropology”. Like an anthropologist, he is interested in what happened to the Crow tribe when they were moved onto reservations and their traditional way of life came to an end. Unlike an anthropologist, however, Lear is also concerned with the larger questions entailed by the possibility that a way of life could come to an end. One such question is ethical in nature: how should one live in relation to the prospect that one’s way of life may come to an end? Another such question is ontological, in the sense that it concerns the nature of that being for whom such a thing is possible.
This ontological dimension was intimated by something the last Crow chief, Plenty Coups, said when describing the end of his tribe’s traditional way of life. In recounting his life story, Plenty Coups described the period when his people moved onto the reservation this way: “But when the buffalo went away the hearts of my people fell to the ground, and they could not lift them up again. After this nothing happened.” Lear admits that he cannot know precisely what Plenty Coups meant when he said “nothing happened.” Was Plenty Coups depressed? (Lear notes in passing that the rest of Plenty Coups’ life certainly does not seem to be that of a depressed person.) Does he mean that his tribe could no longer go on living according to the traditional ways? These are plausible interpretations of what Plenty Coups might have meant. But Lear wants to pursue the possibility that something deeper was being communicated by Plenty Coups’ remark. He asks: “What if it gave expression to an insight into the structure of temporality: that at a certain point things stopped happening? What would he have meant if he meant that?” The implication here is that our sense of time, of things happening and our understanding of what happens, are bound up with a particular way of life. When that way of life comes to an end, the intelligibility of our world also collapses; for us, it is as if nothing more happens because nothing can make sense outside of that way of life.
For Lear, Plenty Coups’ remark points to “a particular form of human vulnerability”; a vulnerability we all share by virtue of being human. It is an ontological vulnerability because it concerns our particular way of being in relation to the world and to time. In posing the issue in these terms, Lear acknowledges a debt to the German philosopher Martin Heidegger (1889–1976). In his book Being and Time, Heidegger presented human existence as fundamentally concerned with making sense of the world in terms of its meaningful possibilities. Everyday objects, for example, are intelligible to us primarily through the way they express specific possibilities for their use: a hammer for hammering nails, a lectern for placing lecture notes on, etc. These possibilities are not infinite — the range of possibilities is determined by the specific culture and society we grew up in and in which we live our lives. What is significant about Heidegger’s account in this context is that culture and social life are not things which come after or exist alongside our relation to objects, but instead are the very medium through which objects become intelligible to us at all.
It is in this light that Lear reflects on the simple act of cooking a meal. Cooking is common to all human societies, but the meaning which the act of cooking a meal has for each of us depends on the culture and society in which the action is embedded. For the Crow, whose traditional way of life revolved around hunting and fighting, the intelligibility of cooking a meal would have depended upon its relation to the possibilities of hunting and fighting. With the collapse of their traditional way of life, cooking a meal could no longer be made sense of in those terms. Of course, the Crow could make sense of it otherwise in relation to the way of life which followed. But to someone bearing witness, as Lear puts it, to the demise of the traditional way of life, it is as if the act of cooking no longer counted as an intelligible act at all. And without the meaningfulness of cultural objects like the coup stick used by the Crow in battle, or of everyday acts like cooking in preparation for a hunt, there is no longer any socially meaningful way for the Crow to mark time. The Crow “ran out of whens,” as Lear puts it: “all Crow temporality had fitted within these categories — everything that happened could be understood in these terms — and thus it seems fair to say that the Crow ran out of time.” It is this possibility, peculiar to human beings as cultural creatures, that Lear seeks to understand when reflecting on the fate of the Crow people.
In part one, I noted that Lear draws on Heidegger’s Being and Time to make sense of our “ontological vulnerability” to a breakdown in meaning. As creatures whose existence fundamentally consists in making sense of things, we are to a great extent dependent upon the meaningful possibilities which are illuminated through our cultural and social practices. When this cultural foundation collapses, there is a sense in which the intelligibility of our world also collapses. One of the main ideas Lear explores is that even prior to such a collapse, there is a way in which this vulnerability can make itself felt.
In Being and Time, Heidegger discusses the individual’s experience of anxiety (Angst in German) as revealing something of great importance about human existence. Anxiety draws us out of what Heidegger calls our “crowd-self”, that is, the typical roles, worries and tasks with which we preoccupy ourselves in our everyday lives. Anxiety is the feeling or mood that strikes us when our daily preoccupations begin to lose their grip on us; when we begin to wonder about the point of it all and whether there isn’t a deeper meaning to our lives. Through this experience, the world itself becomes unfamiliar or uncanny. Consider, for example, a case in which a deadline that someone has been working towards all of a sudden loses its urgency for that person. Whereas before they had been wrapped up in the need to meet the deadline, and so focused on bringing together all those elements which are needed to make things happen, at that very moment the deadline no longer appears to them as something which has the same organizing and motivating significance for them. The experience is uncanny because a situation which had been familiar now becomes very unfamiliar, even though the person can still understand everything that is going on. What moments like this can reveal is that the intelligibility of our world is very much dependent on us, on our active taking up of possibilities and combining them together in the tasks and projects we choose to pursue, rather than a meaningful whole which we can simply take for granted. The anxiety arises in distancing ourselves from the possibilities with which we are normally fully engaged, and in the accompanying threat of a breakdown of intelligibility.
In a footnote to chapter two, Lear describes his book as a meditation on Heidegger’s idea of “being-towards-death”. This is more accurately put as “being on the edge of death”, because Heidegger explicitly associates it with the sense of anxiety described above, which can strike us at any moment and isn’t specifically tied to some point in the future when we will cease to exist. It is meant to highlight how our existence is always, at every moment, engaged in an existential struggle to ward off the utter meaninglessness that lies on the other side of existence, even if we rarely relate to our own existence in these terms. It is in the mood of anxiety, by coming face to face with the world in its uncanniness, that this existential struggle is revealed to each individual.
While Heidegger’s focus is on the individual’s experience of anxiety, Lear is interested in how this kind of experience might affect an entire community which is facing the prospect of cultural devastation. He invokes the idea of communal anxiety, explicitly in relation to the accounts of anxiety developed by both Heidegger and Kierkegaard (whom I’ll discuss in a future post). Lear’s argument in chapters two and three is mostly concerned with how such an anxiety, as felt by the Crow, may have been transformed through radical hope into the imaginative resources needed to survive the collapse of their way of life. Having considered some of the background to Lear’s argument, I will consider the argument itself in more detail in part three.
In the previous post in this series, I looked at Heidegger’s account of anxiety and how it is bound up with the finitude of human existence. Anxiety is a mood which exposes the intelligibility of the world as something which is contingent on our own existential struggle to make sense of it. Lear’s account of anxiety in the face of cultural devastation is influenced by Heidegger, but also by the Danish philosopher Søren Kierkegaard. It is through the influence of the latter that Lear emphasizes the ironic nature of this experience of anxiety. What is meant by irony here?
Lear introduces the notion of irony in chapter one of Radical Hope by reflecting on the criteria for a vibrant culture:
1. There must be established social roles that one can embody and interact with.
2. There must be standards of excellence associated with these roles, which give us a sense of the culture’s ideals.
3. There must be the possibility of constituting oneself as a certain sort of person — namely, one who embodies those ideals.
The sense of irony which Lear is concerned with arises historically when the possibility of constituting oneself as a certain sort of person (3) becomes problematic. In the case of the Crow, irony in this sense was impossible in the 1840s because the three criteria listed above cohered in such a way as to ensure their culture’s vibrancy. But a hundred years later, when the Crow had been moved onto a reservation, this coherence collapsed. While it was still possible to recognise the traditional social roles, the standards of excellence associated with most of these roles could no longer be realized. Intertribal warfare had been banned, traditional hunting had become impossible, and disease had almost wiped out a younger generation. Under these conditions, it became possible for the Crow to ask:
Among the warriors, is there a warrior?
One could call oneself a warrior, but it was no longer clear what pursuing the ideals of being a warrior might entail once intertribal warfare had been banned. With the breakdown of social roles and the disruption of patterns of upbringing, the very possibility of constituting oneself as a Crow subject was thrown into question. It is in this light that one might ask the question:
Among the Crow, is there a Crow?
As outsiders, we seem to have no trouble reformulating this question as: among those who call themselves members of the Crow nation, are there any members who live up to the ideals of being a Crow? But from the perspective of the Crow people themselves, there is a disorientation inherent in the question which is not felt by the observer. In his essay “A Lost Conception of Irony,” Lear explains this difference as follows:
“[F]rom the perspective of my Crow friends, the question has a different aura. It makes them anxious; or rather it names a core anxiety. I mean anxiety in the literal sense of disruptive separation from the world and disorientation. It is easy for us to hear the question as though it were coming from the superego — a question of whether the Crow fail to live up to their ideals. But from the perspective of my Crow friends, the ideal is every bit as much in question as they are.”
It is not merely a theoretical question for the Crow, but a practical matter of how one should live. And it produces anxiety because there is no longer a clear answer to the question of how to go on.
This experience of irony can affect all of us in coming to terms with our social roles. In such an experience, the question of what counts as excellence in a role is not simply an occasion to reflect upon whether we do in fact live up to its ideals; it leads to a moment of anxious disruption, as the very nature of those ideals is called into question. For Lear, this experience of irony raises the ethical question of how we can live in a way that remains open to it without falling into despair. For Kierkegaard, the figure of Socrates represented an exemplar of ironic existence. I would suggest that in Radical Hope, Lear offers us the figure of Plenty Coups, the last great chief of the Crow, as another such exemplar. I will consider this further in my next post.
When he was nine years old, Plenty Coups underwent a traditional rite of passage which involved leaving the tribe for a few days and, through solitude and fasting, experiencing a dream-vision. On his second night in the wilderness, Plenty Coups dreamt that he met a buffalo bull who turned into a man wearing a buffalo robe, and who showed him a plain in which countless buffalo emerged from a hole in the ground. Suddenly, the plains were empty, and out of the hole in the ground came animals which looked similar to the buffalo but were spotted. Plenty Coups was then shown an old man sitting under a tree, and was told that he was looking at himself. Finally, Plenty Coups witnessed a terrible storm in which the four winds blew down all the trees in the forest except one. He was told that inside the tree was the lodge of the Chickadee, and that the Chickadee-person was the one with the least physical strength but the strongest mind, who was willing to work for wisdom and never missed a chance to learn from others.
Plenty Coups recounted his dream to the tribal elders who then interpreted it. They said it foretold a time when the white man’s herds would replace the buffalo, and that only by becoming like the Chickadee and learning from the experience of others would the tribe be able to survive and hold onto its lands.
Jonathan Lear believes that Plenty Coups’ prophetic dream was most probably a response to the tribe’s communal sense of anxiety. Plenty Coups would have had this dream in 1855 or ’56, by which time the advance of white settlers had pushed rival tribes into greater proximity with one another, and the escalation in intertribal warfare and diseases such as smallpox had reduced the Crow’s population by about half. The dream was part of the process by which the tribe’s anxieties could be metabolized and represented in narrative form. And it gave Plenty Coups, as a future chief of the tribe, the imaginative resources needed to cope with the “storm” of cultural devastation that was coming. In particular, Lear thinks that the values represented in the dream by the Chickadee came to articulate a new form of courage.
For Lear this is a crucial point, because the primary virtue around which Crow life had revolved was courage in battle. The ultimate act of courage was symbolically represented by the planting of a coup-stick, which expressed a Crow warrior’s resolve to die rather than retreat. Lear analyses this and other acts of courage as marking a boundary around Crow life which demanded recognition even from the Crow’s enemies. This is what Lear calls the Crow’s “thick” conception of courage, by which he means a concept rooted in a particular culture and historical circumstances. What happened to the Crow, however, was that the possibilities for practicing their traditional way of life became restricted to such an extent that such thick concepts eventually became unintelligible. A virtue like courage simply could not be realized as it had been in the past. How does one retain a sense of virtue or ethics when the very concepts which had informed one’s cultural understanding of what is good collapse?
According to Lear, the values expressed in the dream through the figure of the Chickadee represented a kind of radical hope. It is radical in the sense that the values transcend the finite ethical forms manifested by thick ethical concepts. Plenty Coups’ vision was not of a future form of life, but of a commitment to the possibility of ethics even after the concepts with which one had understood the ethical ceased to make sense. Lear explains this point as follows:
“It is difficult to grasp the radical and strange nature of this commitment. For, on the one hand, Plenty Coups is witnessing the death of a traditional way of life. It is the death of the possibility of forming oneself as a Crow subject — at least, as traditionally understood. On the other hand, he is committed to the idea that by “listening as the Chickadee listens” he and the Crow will somehow survive. What could this mean? We would have to understand the Crow as somehow transcending their own subjectivity. That is, we would have to understand them as surviving the demise of the established ways of constituting oneself as a Crow subject. In that sense, it is no longer possible to be a Crow…. Still, on the basis of his dream, he commits himself to the idea that — on the other side of the abyss — the Crow shall survive, perhaps flourish again. The Crow is dead, long live the Crow! This is a form of hope that seems to survive the destruction of a way of life. Though it must be incredibly difficult to hold onto this commitment in the midst of subjective catastrophe, it is not impossible. And it is at least conceivable that this is just what Plenty Coups did.”
This kind of commitment, Lear argues, is ironic in Kierkegaard’s sense of the term. It is a recognition, or more precisely a hope, that by giving up a traditional way of life new possibilities will open up and another way of flourishing will become possible. It is a commitment undertaken despite the fact that these future possibilities cannot be comprehended in advance.
Lear draws on many biographical facts about Plenty Coups’ life to suggest the ways in which he followed the wisdom of the Chickadee in the face of cultural devastation. One episode Lear places great emphasis on is Plenty Coups’ participation in a ceremony at the Tomb of the Unknown Soldier in 1921, when he laid down his coup-stick and headdress. By this act, Lear believes, Plenty Coups acknowledged that the traditional forms of fitting or virtuous behaviour were no longer appropriate. But he did so in a way which was itself fitting, that demonstrated “in these radically altered circumstances” that it was still possible “to think about what it was appropriate to do.” Plenty Coups’ actions did not just mark an end to a way of life, but sought to creatively reinterpret traditional ideals from within a radically new context.
My comment: This immediately reminds me of James Baldwin’s legendary 1965 speech at Cambridge University: “…It comes as a great shock to discover that Gary Cooper killing off the Indians—when you were rooting for Gary Cooper—the Indians were you. It comes as a great shock to discover the country which is your birthplace and to which you owe your life and your identity has not in its whole system of reality evolved any place for you….” Here is the link to my post on this speech: https://www.pottbayer.de/wp-admin/post.php?post=3404&action=edit
My recommendation: a 2020-2021 exhibition at the Field Museum in Chicago: “I hope this exhibition helps people to honor their own cultural experiences in new ways and to identify with Indigenous people—to realign ourselves as Americans and understand that this is a very diverse country.”
“But when the buffalo went away the hearts of my people fell to the ground, and they could not lift them up again. After this nothing happened.”
“We live at a time of a heightened sense that civilizations are vulnerable. Events around the world (terrorist attacks, violent social upheavals, and also natural catastrophes, such as pandemics like Covid-19) leave us with an uncanny sense of menace. We seem to be aware of a shared vulnerability that we cannot quite name. I suspect that this feeling has also provoked the widespread intolerance. It is as though, without our insistence on the correctness of our own outlook, that outlook itself might collapse. If we could give a name (a concept, a word) to our shared sense of vulnerability, perhaps we could also live with it better.” (translated from the 2020 German edition of Jonathan Lear’s Radical Hope)
How to be alone, wake up from illusion, master the art of asking, fathom your place in the universe, and more.
After the year’s most intelligent and imaginative children’s books and best science books, here are my favorite books on psychology and philosophy published this year, along with the occasional letter and personal essay — genres that, at their most excellent, offer hearty helpings of both disciplines. Perhaps more precisely, these are the year’s finest books on how to live sane, creative, meaningful lives. (And since the subject is of the most timeless kind, revisit the selections from 2013, 2012, and 2011.)
1. A GUIDE FOR THE PERPLEXED
Werner Herzog is celebrated as one of the most influential and innovative filmmakers of our time, but his ascent to acclaim was far from a straight trajectory from privilege to power. Abandoned by his father at an early age, Herzog survived a WWII bombing that demolished the house next door to his childhood home and was raised by a single mother in near-poverty. He found his calling in filmmaking after reading an encyclopedia entry on the subject as a teenager and took a job as a welder in a steel factory in his late teens to fund his first films. These building blocks of his character — tenacity, self-reliance, imaginative curiosity — shine with blinding brilliance in the richest and most revealing of Herzog’s interviews. Werner Herzog: A Guide for the Perplexed (public library) — not to be confused with E.F. Schumacher’s excellent 1978 philosophy book of the same title — presents the director’s extensive, wide-ranging conversation with writer and filmmaker Paul Cronin. His answers are unfiltered and to-the-point, often poignant but always unsentimental, not rude but refusing to infest the garden of honest human communication with the Victorian-seeded, American-sprouted weed of pointless politeness.
Herzog’s insights coalesce into a kind of manifesto for following one’s particular calling, a form of intelligent, irreverent self-help for the modern creative spirit — indeed, even though Herzog is a humanist fully detached from religion, there is a strong spiritual undertone to his wisdom, rooted in what Cronin calls “unadulterated intuition” and spanning everything from what it really means to find your purpose and do what you love to the psychology and practicalities of worrying less about money to the art of living with presence in an age of productivity. As Cronin points out in the introduction, Herzog’s thoughts collected in the book are “a decades-long outpouring, a response to the clarion call, to the fervent requests for guidance.”
And yet in many ways, A Guide for the Perplexed could well have been titled A Guide to the Perplexed, for Herzog is as much a product of his “cumulative humiliations and defeats,” as he himself phrases it, as of his own “chronic perplexity,” to borrow E.B. White’s unforgettable term — Herzog possesses that rare, paradoxical combination of absolute clarity of conviction and wholehearted willingness to inhabit his own inner contradictions, to pursue life’s open-endedness with equal parts focus of vision and nimbleness of navigation.
A certain self-reliance permeates his films and his mind, a refusal to let the fear of failure inhibit trying — a sensibility the voiceover in the final scene of Herzog’s The Unprecedented Defence of the Fortress Deutschkreuz captures perfectly: “Even a defeat is better than nothing at all.”
2. HOW TO BE ALONE

If the odds of finding one’s soul mate are so dreadfully dismal and the secret of lasting love is largely a matter of concession, is it any wonder that a growing number of people choose to go solo? The choice of solitude, of active aloneness, has relevance not only to romance but to all human bonds — even Emerson, perhaps the most eloquent champion of friendship in the English language, lived a significant portion of his life in active solitude, the very state that enabled him to produce his enduring essays and journals. And yet that choice is one our culture treats with equal parts apprehension and contempt, particularly in our age of fetishistic connectivity. Hemingway’s famous assertion that solitude is essential for creative work is perhaps so oft-cited precisely because it is so radical and unnerving in its proposition.
While Sara Maitland lives in a region of Scotland with one of the lowest population densities in Europe, where the nearest supermarket is more than twenty miles away and there is no cell service (pause on that for a moment), she wasn’t always a loner — she grew up in a big, close-knit family as one of six children. It was only when she became transfixed by the notion of silence, the subject of her previous book, that she arrived, obliquely, at solitude. She writes:
I got fascinated by silence; by what happens to the human spirit, to identity and personality when the talking stops, when you press the off button, when you venture out into that enormous emptiness. I was interested in silence as a lost cultural phenomenon, as a thing of beauty and as a space that had been explored and used over and over again by different individuals, for different reasons and with wildly differing results. I began to use my own life as a sort of laboratory to test some ideas and to find out what it felt like. Almost to my surprise, I found I loved silence. It suited me. I got greedy for more. In my hunt for more silence, I found this valley and built a house here, on the ruins of an old shepherd’s cottage.
Maitland’s interest in solitude, however, is somewhat different from that in silence — while private in its origin, it springs from a public-facing concern about the need to address “a serious social and psychological problem around solitude,” a desire to “allay people’s fears and then help them actively enjoy time spent in solitude.” And so she does, posing the central, “slippery” question of this predicament:
Being alone in our present society raises an important question about identity and well-being.
How have we arrived, in the relatively prosperous developed world, at least, at a cultural moment which values autonomy, personal freedom, fulfillment and human rights, and above all individualism, more highly than they have ever been valued before in human history, but at the same time these autonomous, free, self-fulfilling individuals are terrified of being alone with themselves?
We live in a society which sees high self-esteem as a proof of well-being, but we do not want to be intimate with this admirable and desirable person.
We see moral and social conventions as inhibitions on our personal freedoms, and yet we are frightened of anyone who goes away from the crowd and develops “eccentric” habits.
We believe that everyone has a singular personal “voice” and is, moreover, unquestionably creative, but we treat with dark suspicion (at best) anyone who uses one of the most clearly established methods of developing that creativity — solitude.
We think we are unique, special and deserving of happiness, but we are terrified of being alone.
We are supposed now to seek our own fulfillment, to act on our feelings, to achieve authenticity and personal happiness — but mysteriously not do it on our own.
Today, more than ever, the charge carries both moral judgement and weak logic.
Maitland goes on to explore the underlying psychology of our unease from the fall of the Roman Empire to the rise of the “male spinster” and how to cultivate the five deepest rewards of solitude. Read more here.
3. WAKING UP
Nietzsche’s famous proclamation that “God is dead” is among modern history’s most oft-cited aphorisms, and yet, as is so often the case with such aphorisms, it is usually quoted shorn of its broader context, in a way that bespeaks the lazy reductionism with which we tend to approach questions of spirituality today. Nietzsche himself clarified the full dimension of his statement six years later, in a passage from Twilight of the Idols, where he explained that “God” simply signified the supersensory realm, or “true world,” and wrote: “We have abolished the true world. What has remained? The apparent one perhaps? Oh no! With the true world we have also abolished the apparent one.”
In Waking Up: A Guide to Spirituality Without Religion (public library | IndieBound), philosopher, neuroscientist, and mindful skeptic Sam Harris offers a contemporary addition to this lineage of human inquiry — an extraordinary and ambitious work of integration between science and spirituality, which Harris himself describes as “by turns a seeker’s memoir, an introduction to the brain, a manual of contemplative instruction, and a philosophical unraveling of what most people consider to be the center of their inner lives.” Or, perhaps most aptly, an effort “to pluck the diamond from the dunghill of esoteric religion.”
Harris begins by recounting an experience he had at age sixteen — a three-day wilderness retreat designed to spur spiritual awakening of some sort, which instead left young Harris feeling like the contemplation of the existential mystery in the presence of his own company was “a source of perfect misery.” This frustrating experience became “a sufficient provocation” that launched him into a lifelong pursuit of the kinds of transcendent experiences that gave rise to the world’s major spiritual traditions, examining them instead with a scientist’s vital blend of skepticism and openness and a philosopher’s aspiration to be “scrupulously truthful.”
Our minds are all we have. They are all we have ever had. And they are all we can offer others… Every experience you have ever had has been shaped by your mind. Every relationship is as good or as bad as it is because of the minds involved.
Noting that the entirety of our experience, as well as our satisfaction with that experience, is filtered through our minds — “If you are perpetually angry, depressed, confused, and unloving, or your attention is elsewhere, it won’t matter how successful you become or who is in your life — you won’t enjoy any of it.” — Harris sets out to reconcile the quest to achieve one’s goals with a deeper longing, a recognition, perhaps, that presence is far more rewarding than productivity. He writes:
Most of us spend our time seeking happiness and security without acknowledging the underlying purpose of our search. Each of us is looking for a path back to the present: We are trying to find good enough reasons to be satisfied now.
Acknowledging that this is the structure of the game we are playing allows us to play it differently. How we pay attention to the present moment largely determines the character of our experience and, therefore, the quality of our lives.
4. LETTERS OF NOTE
Virginia Woolf called letter-writing “the humane art” — an epithet only amplified today, in an age when we so frequently mistake reaction for response and succumb to expectations of immediacy that render impossible the beautiful, contemplative mutuality at the heart of the notion of co-respondence. This, perhaps, is why yesteryear’s greatest letters appeal to us more irrepressibly than ever.
For years, Shaun Usher has been unearthing and highlighting brilliant, funny, poignant, exquisitely human letters from luminaries and ordinary people alike on his magnificent website. This year, the best of them were released in Letters of Note: Correspondence Deserving of a Wider Audience (public library | IndieBound) — the aptly titled, superb collection featuring contributions from such cultural icons as Virginia Woolf, Roald Dahl, Louis Armstrong, Kurt Vonnegut, Nick Cave, Richard Feynman, Jack Kerouac, and more.
5. THE RISE
“You gotta be willing to fail… if you’re afraid of failing, you won’t get very far,” Steve Jobs cautioned. “There is no such thing as failure — failure is just life trying to move us in another direction,” Oprah counseled new Harvard graduates. In his wonderfully heartening letter of fatherly advice, F. Scott Fitzgerald gave his young daughter Scottie a list of things to worry and not worry about in life; among the unworriables, he listed failure, “unless it comes through your own fault.” And yet, as Debbie Millman observed in Fail Safe, her magnificent illustrated-essay-turned-commencement-address, most of us “like to operate within our abilities” — stepping outside of them risks failure, and we do worry about it, very much. How, then, can we transcend that mental block, that existential worry, which keeps us from the very capacity for creative crash that keeps us growing and innovating?
That’s precisely what curator and art historian Sarah Lewis, who has under her belt degrees from Harvard and Oxford, curatorial positions at the Tate Modern and the MoMA, and an appointment on President Obama’s Arts Policy Committee, examines in The Rise: Creativity, the Gift of Failure, and the Search for Mastery (public library | IndieBound) — an exploration of how “discoveries, innovations, and creative endeavors often, perhaps even only, come from uncommon ground” and why this “improbable ground of creative endeavor” is an enormous source of advantages on the path to self-actualization and fulfillment, brought to life through a tapestry of tribulations turned triumphs by such diverse modern heroes as legendary polar explorer Captain Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass. Lewis, driven by her lifelong “magpie curiosity about how we become,” crafts her argument slowly, meticulously, stepping away from it like a sculptor gaining perspective on her sculpture and examining it through other eyes, other experiences, other particularities, which she weaves together into an intricate tapestry of “magpielike borrowings” filtered through the sieve of her own point of view.
Female archers, lantern slide, c. 1920. (Public domain via Oregon State University Special Collections & Archives.)
Lewis begins with a visit with the women of Columbia University’s varsity archery team, who spend countless hours practicing a sport that requires equal parts impeccable precision of one’s aim and a level of comfort with the uncontrollable — all the environmental interferences, everything that could happen between the time the arrow leaves the bow and the time it lands on the target, having followed its inevitably curved line. From this unusual sport Lewis draws a metaphor for the core of human achievement:
There is little that is vocational about [contemporary] culture anymore, so it is rare to see what doggedness looks like with this level of exactitude… To spend so many hours with a bow and arrow is a kind of marginality combined with a seriousness of purpose rarely seen.
In the archers’ doggedness Lewis finds the central distinction that serves as a backbone of her book — far more important than success (hitting the bull’s-eye) is the attainment of mastery (“knowing it means nothing if you can’t do it again and again”), and in bridging the former with the latter lives the substance of true achievement. (The distinction isn’t unlike what psychologist Carol Dweck found in her pioneering work on the difference between “fixed” and “growth” mindsets.) Lewis writes:
Mastery requires endurance. Mastery, a word we don’t use often, is not the equivalent of what we might consider its cognate — perfectionism — an inhuman aim motivated by a concern with how others view us. Mastery is also not the same as success — an event-based victory based on a peak point, a punctuated moment in time. Mastery is not merely a commitment to a goal, but to a curved-line, constant pursuit.
This is why, Lewis argues, a centerpiece of mastery is the notion of failure. She cites Edison, who famously said of his countless fruitless attempts to create a feasible lightbulb: “I have not failed, I’ve just found 10,000 ways that won’t work.” In fact, Lewis points out that embedded in the very word “failure” — a word originally synonymous with bankruptcy, devised to assess creditworthiness in the 19th century, “a seeming dead end forced to fit human worth” — is the bias of our limited understanding of its value:
The word failure is imperfect. Once we begin to transform it, it ceases to be that any longer. The term is always slipping off the edges of our vision, not simply because it’s hard to see without wincing, but because once we are ready to talk about it, we often call the event something else — a learning experience, a trial, a reinvention — no longer the static concept of failure.
In its stead, Lewis offers another 19th-century alternative: “blankness,” which beautifully captures the wide-open field of possibility for renewal, for starting from scratch, after an unsuccessful attempt. Still, she considers the challenge of pinning down into plain language a concept so complex and fluid — even fashionable concepts like grit fail failure:
Trying to find a precise word to describe the dynamic is fleeting, like attempting to locate francium, an alkali metal measured but never isolated in any weighted quantity or seen in a way that the eye can detect — one of the most unstable, enigmatic elements on the Earth. No one knows what it looks like in an appreciable form, but there it is, scattered throughout ores in the Earth’s crust. Many of us have a similar sense that these implausible rises must be possible, but the stories tend to stay strewn throughout our lives, never coalescing into a single dynamic concept… The phenomenon remains hidden, and little discussed. Partial ideas do exist — resilience, reinvention, and grit — but there’s no one word to describe the passing yet vital, constant truth that just when it looks like winter, it is spring.
When we don’t have a word for an inherently fleeting idea, we speak about it differently, if at all. There are all sorts of generative circumstances — flops, folds, wipeouts, and hiccups — yet the dynamism it inspires is internal, personal, and often invisible… It is a cliché to say simply that we learn the most from failure. It is also not exactly true. Transformation comes from how we choose to speak about it in the context of story, whether self-stated or aloud.
One essential element of understanding the value of failure is the notion of the “deliberate incomplete.” (Cue in Marie Curie, who famously noted in a letter to her brother: “One never notices what has been done; one can only see what remains to be done.”) Lewis writes:
We thrive, in part, when we have purpose, when we still have more to do. The deliberate incomplete has long been a central part of creation myths themselves. In Navajo culture, some craftsmen and women sought imperfection, giving their textiles and ceramics an intended flaw called a “spirit line” so that there is a forward thrust, a reason to continue making work. Nearly a quarter of twentieth century Navajo rugs have these contrasting-color threads that run out from the inner pattern to just beyond the border that contains it; Navajo baskets and often pottery have an equivalent line called a “heart line” or a “spirit break.” The undone pattern is meant to give the weaver’s spirit a way out, to prevent it from getting trapped and reaching what we sense is an unnatural end.
There is an inevitable incompletion that comes with mastery. It occurs because the greater our proficiency, the smoother our current path, the more clearly we may spot the mountain that hovers in our gaze. “What would you say increases with knowledge?” Jordan Elgrably once asked James Baldwin. “You learn how little you know,” Baldwin said.
A related concept is that of the “near win” — those moments when we come so close to our aim, yet miss it by a hair:
At the point of mastery, when there seems nothing left to move beyond, we find a way to move beyond ourselves. Success motivates. Yet the near win — the constant auto-correct of a curved-line path — can propel us in an ongoing quest. We see it whenever we aim, climb, or create with mastery as our aim, when the outcome is determined by what happens at the margins.
Lewis goes on to illustrate these concepts with living examples from the stories of such figures as the great polar explorer Captain Scott, dance icon Paul Taylor, and social reformer Frederick Douglass. Read more here.
6. THE ACCIDENTAL UNIVERSE
It says something about physicist and writer Alan Lightman — the very first person to receive dual appointments in science and the humanities at MIT — that a book of his is not only among the best science books of the year, but also a masterwork of philosophy. But that is precisely what The Accidental Universe: The World You Thought You Knew (public library | IndieBound) is — a spectacular journey to the frontiers of theoretical physics, exploring how the possibility of multiple universes illuminates the heart of the human experience and our quest for Beauty, Truth, and Meaning. Lightman’s enchanting writing reveals him not only as a scientist of towering expertise, but also as an insightful philosopher and poet of the cosmos, partway between Seneca and Carl Sagan.
In the foreword, Lightman recounts attending a lecture by the Dalai Lama at MIT, “one of the world’s spiritual leaders sitting cross-legged in a modern temple of science,” and hearing about the Buddhist concept of sunyata, translated as “emptiness” — the notion that objects in the physical universe are vacant of inherent meaning and that we imbue them with meaning and value with the thoughts of our own minds. From this, Lightman argues while adding to history’s finest definitions of science, arises a central challenge of the human condition:
As a scientist, I firmly believe that atoms and molecules are real (even if mostly empty space) and exist independently of our minds. On the other hand, I have witnessed firsthand how distressed I become when I experience anger or jealousy or insult, all emotional states manufactured by my own mind. The mind is certainly its own cosmos. As Milton wrote in Paradise Lost, “[The mind] can make a heaven of hell or a hell of heaven.” In our constant search for meaning in this baffling and temporary existence, trapped as we are within our three pounds of neurons, it is sometimes hard to tell what is real. We often invent what isn’t there. Or ignore what is. We try to impose order, both in our minds and in our conceptions of external reality. We try to connect. We try to find truth. We dream and we hope. And underneath all of these strivings, we are haunted by the suspicion that what we see and understand of the world is only a tiny piece of the whole.
Science does not reveal the meaning of our existence, but it does draw back some of the veils.
7. SMALL VICTORIES
Amid Anne Lamott’s moving reflections on grief, grace, and gratitude is one especially enchanting essay titled “The Book of Welcome,” in which Lamott considers the uncomfortable art of letting yourself be seen:
Trappings and charm wear off… Let people see you. They see your upper arms are beautiful, soft and clean and warm, and then they will see this about their own, some of the time. It’s called having friends, choosing each other, getting found, being fished out of the rubble. It blows you away, how this wonderful event ever happened — me in your life, you in mine.
Two parts fit together. This hadn’t occurred all that often, but now that it does, it’s the wildest experience. It could almost make a believer out of you. Of course, life will randomly go to hell every so often, too. Cold winds arrive and prick you: the rain falls down your neck: darkness comes. But now there are two of you: Holy Moly.
8. THE TRUTH ABOUT TRUST
Unlike many other puzzles we confront, questions of trust don’t just involve attempting to grasp and analyze a perplexing concept. They all share another characteristic: risk. So while it’s true that we turn our attention to many complex problems throughout our lives, finding the answers to most doesn’t usually involve navigating the treacherous landscape of our own and others’ competing desires.
Trust implies a seeming unknowable — a bet of sorts, if you will. At its base is a delicate problem centered on the balance between two dynamic and often opposing desires — a desire for someone else to meet your needs and his desire to meet his own.
But despite what pop culture may tell us, decades’ worth of attempts to decode the signals of trustworthiness — sought in everything from facial expression to voice to handwriting — have proven virtually useless, and the last five years of research have rendered previous assertions about certain nonverbal cues wrong. (No, a sideways glance doesn’t automatically indicate that the person is lying to you.) As psychologist David DeSteno wryly observes, “If polygraphs were foolproof, we wouldn’t need juries.” He explains what makes measures of trust especially complicated:
Unlike many forms of communication, issues of trust are often characterized by a competition or battle…. It’s not always an adaptive strategy to be an open book to others, or even to ourselves. Consequently, trying to discern if someone can be trusted is fundamentally different from trying to assess characteristics like mathematical ability. … Deciding to be trustworthy depends on the momentary balance between competing mental forces pushing us in opposite directions, and being able to predict which of those forces is going to prevail in any one instance is a complicated business.
Contrary to long-held doctrine, isolated gestures and expressions aren’t reliable indicators of what a person feels or intends to do. Two types of context — what I call configural and situational — are essential for correct interpretation. And they’ve been missing in most attempts to discover what trustworthiness and its opposite look like.
To figure out this multifaceted puzzle, DeSteno, whose lab studies how emotional states shape our social and moral behavior, took a cross-disciplinary approach, turning to the work of economists, computer scientists, security officers, physiologists, and other psychologists, and enlisting the direct help of social psychologist David Pizarro and economist Robert Frank. With combined expertise spanning behavioral economics, evolutionary biology, nonverbal behavior, and emotional biases in decision making, they built, with equal parts rigor and humility, the richest framework for understanding trust that science has yet produced. Specifically, they focused on the two main components of trust — how it works and whether we’re able to predict who deserves it. DeSteno writes:
In the end, what emerged are not only new insights into how to detect the trustworthiness of others, but also an entirely new way to think about how trust influences our lives, our success, and our interactions with those around us.
9. THE ART OF ASKING
“Have compassion for everyone you meet, even if they don’t want it,”Lucinda Williams sang from my headphones into my heart one rainy October morning on the train to Hudson. “What seems cynicism is always a sign, always a sign…” I was headed to Hudson for a conversation with a very different but no less brilliant musician, and a longtime kindred spirit — the talented and kind Amanda Palmer. In an abandoned schoolhouse across the street from her host’s home, we sat down to talk about her magnificent and culturally necessary new book, The Art of Asking: How I Learned to Stop Worrying and Let People Help (public library | IndieBound) — a beautifully written inquiry into why we have such a hard time accepting compassion in all of its permutations, from love to what it takes to make a living, what lies behind our cynicism in refusing it, and how learning to accept it makes possible the greatest gifts of our shared humanity.
I am partial, perhaps, because my own sustenance depends on accepting help. But I also deeply believe and actively partake in both the yin and the yang of that vitalizing osmosis of giving and receiving that keeps today’s creative economy alive, binding artists and audiences, writers and readers, musicians and fans, into the shared cause of creative culture. “It’s only when we demand that we are hurt,” Henry Miller wrote in contemplating the circles of giving and receiving in 1942, but we still seem woefully caught in the paradoxical trap of too much entitlement to what we feel we want and too little capacity to accept what we truly need. The unhinging of that trap is what Amanda explores with equal parts deep personal vulnerability, profound insight into the private and public lives of art, and courageous conviction about the future of creative culture.
The most urgent clarion call echoing throughout the book, which builds on Amanda’s terrific TED talk, is for loosening our harsh and narrow criteria for what it means to be an artist, and, most of all, for undoing our punishing ideas about what renders one a not-artist, or — worse yet — a not-artist-enough. Amanda writes of the anguishing Impostor Syndrome epidemic such limiting notions spawn:
People working in the arts engage in street combat with The Fraud Police on a daily basis, because much of our work is new and not readily or conventionally categorized. When you’re an artist, nobody ever tells you or hits you with the magic wand of legitimacy. You have to hit your own head with your own handmade wand. And you feel stupid doing it.
There’s no “correct path” to becoming a real artist. You might think you’ll gain legitimacy by going to university, getting published, getting signed to a record label. But it’s all bullshit, and it’s all in your head. You’re an artist when you say you are. And you’re a good artist when you make somebody else experience or feel something deep or unexpected.
But in the history of creative genius, this pathology appears to be a rather recent development — the struggle to be an artist, of course, is nothing new, but the struggle to believe oneself to be one seems a uniquely modern malady. In one of the most revelatory passages in the book, Amanda points out a little-known biographical detail about the life of Henry David Thoreau — he who decided to live the self-reliant life by Walden Pond and memorably proclaimed: “If the day and the night are such that you greet them with joy, and life emits a fragrance like flowers and sweet-scented herbs, is more elastic, more starry, more immortal — that is your success.” It is a detail that, today, would undoubtedly render Thoreau the target of that automatic privilege narrative as we point a finger and call him a “poser”:
Thoreau wrote in painstaking detail about how he chose to remove himself from society to live “by his own means” in a little 10-foot x 15-foot hand-hewn cabin on the side of a pond. What he left out of Walden, though, was the fact that the land he built on was borrowed from his wealthy neighbor, that his pal Ralph Waldo Emerson had him over for dinner all the time, and that every Sunday, Thoreau’s mother and sister brought over a basket of freshly-baked goods for him, including donuts.
The idea of Thoreau gazing thoughtfully over the expanse of transcendental Walden Pond, a bluebird alighting onto his threadbare shoe, all the while eating donuts that his mom brought him just doesn’t jibe with most people’s picture of him as a self-reliant, noble, marrow-sucking back-to-the-woods folk-hero.
If Thoreau were alive today, steeped in a culture that tells him taking the donuts chips away at his credibility, would he have taken them? And why don’t we? Amanda writes:
Taking the donuts is hard for a lot of people.
It’s not the act of taking that’s so difficult, it’s more the fear of what other people are going to think when they see us slaving away at our manuscript about the pure transcendence of nature and the importance of self-reliance and simplicity. While munching on someone else’s donut.
Maybe it comes back to that same old issue: we just can’t see what we do as important enough to merit the help, the love.
Try to picture getting angry at Einstein devouring a donut brought to him by his assistant, while he sat slaving on the theory of relativity. Try to picture getting angry at Florence Nightingale for snacking on a donut while taking a break from tirelessly helping the sick.
To the artists, creators, scientists, non-profit-runners, librarians, strange-thinkers, start-uppers and inventors, to all people everywhere who are afraid to accept the help, in whatever form it’s appearing,
Please, take the donuts.
To the guy in my opening band who was too ashamed to go out into the crowd and accept money for his band,
Take the donuts.
To the girl who spent her twenties as a street performer and stripper living on less than $700 a month who went on to marry a best-selling author who she loves, unquestioningly, but even that massive love can’t break her unwillingness to accept his financial help, please….
Just take the fucking donuts.
But Thoreau, it turns out, got one thing right in his definition of success, which emanates from Amanda’s words a century and a half later:
The happiest artists I know are generally the ones who can manage to make a reasonable living from their art without having to worry too much about the next paycheck. Not to say that every artist who sits around the campfire, or plays in tiny bars, is “happier” than those singing in stadiums — but more isn’t always better. If feeling the connection between yourself and others is the ultimate goal, it can be harder when you are separated from the crowd by a 30-foot barrier. And it can be easier to do — though riskier — when they’re sitting right beside you. The ideal sweet spot is the one in which the artist can freely share their talents and directly feel the reverberations of their artistic gifts to their community. In other words, it works best when everybody feels seen.
As artists, and as humans: If your fear is scarcity, the solution isn’t necessarily abundance.
Read more and watch my conversation with Palmer here.
10. LEONARDO’S BRAIN
One September day in 2008, Leonard Shlain found himself having trouble buttoning his shirt with his right hand. He was admitted to the emergency room, diagnosed with Stage 4 brain cancer, and given nine months to live. Shlain — a surgeon by training and a self-described “synthesizer by nature” with an intense interest in the ennobling intersection of art and science, author of the now-legendary Art & Physics — had spent the previous seven years working on what he considered his magnum opus: a sort of postmortem brain scan of Leonardo da Vinci, performed five centuries after his death and fused with a detective story about his life, exploring what the unique neuroanatomy of the man commonly considered humanity’s greatest creative genius might reveal about the essence of creativity itself.
Shlain finished the book on May 3, 2009. He died a week later. His three children — Kimberly, Jordan, and filmmaker Tiffany Shlain — spent the next five years bringing their father’s final legacy to life. The result is Leonardo’s Brain: Understanding Da Vinci’s Creative Genius (public library | IndieBound) — an astonishing intellectual, and at times spiritual, journey into the center of human creativity via the particular brain of one undereducated, left-handed, nearly ambidextrous, vegetarian, pacifist, gay, singularly creative Renaissance male, who Shlain proposes was able to attain a different state of consciousness than “practically all other humans.”
Illustration by Ralph Steadman from ‘I, Leonardo.’
Noting that “a writer is always refining his ideas,” Shlain points out that the book is a synthesis of his three previous books, and an effort to live up to Kafka’s famous proclamation that “a book must be the axe for the frozen sea inside us.” It is also a beautiful celebration of the idea that art and science belong together and enrich one another whenever they converge.
Shlain argues that Leonardo — who painted the eternally mysterious Mona Lisa, created visionary anatomical drawings long before medical anatomy existed, made observations of bird flight in greater detail than any previous scientist, mastered engineering, architecture, mathematics, botany, and cartography, might be considered history’s first true scientist long before the word was even coined, presaged Newton’s Third Law, Bernoulli’s law, and elements of chaos theory, and was a deft composer who sang “divinely,” among countless other domains of mastery — is the individual most worthy of the title “genius” in both science and art:
The divergent flow of art and science in the historical record provides evidence of a distinct compartmentalization of genius. The river of art rarely intersected with the meander of science.
Although both art and science require a high degree of creativity, the difference between them is stark. For visionaries to change the domain of art, they must make a breakthrough that can only be judged through the lens of posterity. Great science, on the other hand, must be able to predict the future. If a scientist’s hypotheses cannot be turned into a law that can be verified by future investigators, it is not scientifically sound. Another contrast: Art and science represent the difference between “being” and “doing.” Art’s raison d’être is to evoke an emotion. Science seeks to solve problems by advancing knowledge.
Leonardo’s story continues to compel because he represents the highest excellence all of us lesser mortals strive to achieve — to be intellectually, creatively, and emotionally well-rounded. No other individual in the known history of the human species attained such distinction both in science and art as the hyper-curious, undereducated, illegitimate country boy from Vinci.
Using a wealth of available information from Leonardo’s notebooks, various biographical resources, and some well-reasoned speculation, Shlain goes on to perform a “posthumous brain scan” seeking to illuminate the unique wiring of Da Vinci’s brain and how it explains his unparalleled creativity.
11. THE ART OF STILLNESS
“Faith is the ability to honor stillness at some moments,” Alan Lightman wrote in his sublime meditation on science and spirituality, “and at others to ride the passion and exuberance.” In his conversation with E.O. Wilson, the poet Robert Hass described beauty as a “paradox of stillness and motion.” But in our Productivity Age of perpetual motion, it’s increasingly hard — yet increasingly imperative — to honor stillness, to build pockets of it into our lives, so that our faith in beauty doesn’t become half-hearted, lopsided, crippled. The delicate bridling of that paradox is what novelist and essayist Pico Iyer explores in The Art of Stillness: Adventures in Going Nowhere (public library | IndieBound) — a beautifully argued case for the unexpected pleasures of “sitting still as a way of falling in love with the world and everything in it,” revealed through one man’s sincere record of learning to “take care of his loved ones, do his job, and hold on to some direction in a madly accelerating world.”
Iyer begins by recounting a snaking drive up the San Gabriel Mountains outside Los Angeles to visit his boyhood hero — legendary singer-songwriter Leonard Cohen. In 1994, shortly after the most revealing interview he ever gave, Cohen had moved to the Mt. Baldy Zen Center to embark on five years of seclusion, serving as personal assistant to the great Japanese Zen teacher Kyozan Joshu Sasaki, then in his late eighties. Midway through his time at the Zen Center, Cohen was ordained as a Rinzai Zen Buddhist monk and given the Dharma name Jikan — Japanese for “silence.” Iyer writes:
I’d come up here in order to write about my host’s near-silent, anonymous life on the mountain, but for the moment I lost all sense of where I was. I could hardly believe that this rabbinical-seeming gentleman in wire-rimmed glasses and wool cap was in truth the singer and poet who’d been renowned for thirty years as an international heartthrob, a constant traveler, and an Armani-clad man of the world.
Cohen, who once described the hubbub of his ordinary state of mind as “very much like the waiting room at the DMV,” had sought in the sequestered Zen community a more extreme, more committed version of a respite most of us long for in the midst of modern life — at least at times, at least on some level, and often wholeheartedly, achingly. Iyer reflects on Cohen’s particular impulse and what it reveals about our shared yearning:
Leonard Cohen had come to this Old World redoubt to make a life — an art — out of stillness. And he was working on simplifying himself as fiercely as he might on the verses of one of his songs, which he spends more than ten years polishing to perfection. The week I was visiting, he was essentially spending seven days and nights in a bare meditation hall, sitting stock-still. His name in the monastery, Jikan, referred to the silence between two thoughts.
One evening — four in the morning, the end of December — Cohen took time out from his meditations to walk down to my cabin and try to explain what he was doing here.
Sitting still, he said with unexpected passion, was “the real deep entertainment” he had found in his sixty-one years on the planet. “Real profound and voluptuous and delicious entertainment. The real feast that is available within this activity.”
Was he kidding? Cohen is famous for his mischief and ironies.
He wasn’t, I realized as he went on. “What else would I be doing?” he asked. “Would I be starting a new marriage with a young woman and raising another family? Finding new drugs, buying more expensive wine? I don’t know. This seems to me the most luxurious and sumptuous response to the emptiness of my own existence.”
Typically lofty and pitiless words; living on such close terms with silence clearly hadn’t diminished his gift for golden sentences. But the words carried weight when coming from one who seemed to have tasted all the pleasures that the world has to offer.
Iyer beholds his encounter with Cohen with the same incredulous amazement that most of us modern cynics experience, at first reluctantly, when confronted with something or someone incomprehensibly earnest, for nothing dissolves snark like unflinching sincerity. For Cohen, Iyer observes, the Zen practice was not a matter of “piety or purity” but of practical salvation and refuge from “the confusion and terror that had long been his bedfellows.” Iyer writes:
Sitting still with his aged Japanese friend, sipping Courvoisier, and listening to the crickets deep into the night, was the closest he’d come to finding lasting happiness, the kind that doesn’t change even when life throws up one of its regular challenges and disruptions.
“Nothing touches it,” Cohen said, as the light came into the cabin, of sitting still… Going nowhere, as Cohen described it, was the grand adventure that makes sense of everywhere else.
We’ve lost our Sundays, our weekends, our nights off — our holy days, as some would have it; our bosses, junk mailers, our parents can find us wherever we are, at any time of day or night. More and more of us feel like emergency-room physicians, permanently on call, required to heal ourselves but unable to find the prescription for all the clutter on our desk.
Not many years ago, it was access to information and movement that seemed our greatest luxury; nowadays it’s often freedom from information, the chance to sit still, that feels like the ultimate prize. Stillness is not just an indulgence for those with enough resources — it’s a necessity for anyone who wishes to gather less visible resources. Going nowhere, as Cohen had shown me, is not about austerity so much as about coming closer to one’s senses.
If the notion of mental illness in animals seems like far-fetched anthropomorphism, a field of science that has been gathering momentum for more than 150 years strongly suggests otherwise. That’s precisely what Senior TED Fellow Laurel Braitman explores in Animal Madness: How Anxious Dogs, Compulsive Parrots, and Elephants in Recovery Help Us Understand Ourselves (public library | IndieBound). Braitman, who holds a Ph.D. in the history and anthropology of science from MIT, argues that we humans are far from unique in our capacity for “emotional thunderstorms that make our lives more difficult” and that nonhuman animals are bedeviled by varieties of mental illness strikingly similar to our own. With equal parts rigor and compassion, she examines evidence from veterinary science, psychology and pharmacology research, first-hand accounts by neuroscientists, zoologists, animal trainers, and other experts, the work of legendary scientists and philosophers like Charles Darwin and René Descartes, and her own experience with dozens of animals spanning a multitude of species and mental health issues, from depressed dogs to self-harming dolphins to canine Alzheimer’s and PTSD.
Braitman’s journey begins with one particularly troubled nonhuman animal — Oliver, the Bernese Mountain Dog she adopted, whose “extreme fear, anxiety, and compulsions” prompted her, in the way that a concerned parent on the verge of despair grasps for answers, to explore whether and how other animals could be mentally ill. Considering the tapestry of evidence threads she uncovered during her research, she writes:
Humans and other animals are more similar than many of us might think when it comes to mental states and behaviors gone awry — experiencing churning fear, for example, in situations that don’t call for it, feeling unable to shake a paralyzing sadness, or being haunted by a ceaseless compulsion to wash our hands or paws. Abnormal behaviors like these tip into the territory of mental illness when they keep creatures — human or not — from engaging in what is normal for them. This is true for a dog single-mindedly focused on licking his tail until it’s bare and oozy, a sea lion fixated on swimming in endless circles, a gorilla too sad and withdrawn to play with her troop members, or a human so petrified of escalators he avoids department stores.
Every animal with a mind has the capacity to lose hold of it from time to time. Sometimes the trigger is abuse or mistreatment, but not always. I’ve come across depressed and anxious gorillas, compulsive horses, rats, donkeys, and seals, obsessive parrots, self-harming dolphins, and dogs with dementia, many of whom share their exhibits, homes, or habitats with other creatures who don’t suffer from the same problems. I’ve also gotten to know curious whales, confident bonobos, thrilled elephants, contented tigers, and grateful orangutans. There is plenty of abnormal behavior in the animal world, captive, domestic, and wild, and plenty of evidence of recovery; you simply need to know where and how to find it.
Braitman is careful to acknowledge that such a notion is likely to unsettle our notions of human exceptionalism, and she offers a wise caveat:
Acknowledging parallels between human and other animal mental health is a bit like recognizing capacities for language, tool use, and culture in other creatures. That is, it’s a blow to the idea that humans are the only animals to feel or express emotion in complex and surprising ways. It is also anthropomorphic, the projection of human emotions, characteristics, and desires onto nonhuman beings or things. We can choose, though, to anthropomorphize well and, by doing so, make more accurate interpretations of animals’ behavior and emotional lives. Instead of self-centered projection, anthropomorphism can be a recognition of bits and pieces of our human selves in other animals and vice versa.
Braitman goes on to trace how our evolving understanding of animal psychology, from Charles Darwin to Jane Goodall, sheds invaluable light on things of deep concern to us humans — notions like anxiety, altruism, depression, and happiness.
Slingerland frames the paradoxical premise at the heart of his book with an illustrative example: a game called Mindball at his local science museum in Vancouver, in which two players sit opposite one another, each wearing an electrode-equipped headband that registers general activity in the brain, and try to mentally push a metal ball from the center of the table to the other player; whoever does this first wins. There is, of course, a rub:
The motive force — measured by each player’s electrodes, and conveyed to the ball by a magnet hidden underneath the table—is the combination of alpha and theta waves produced by the brain when it’s relaxed: the more alpha and theta waves you produce, the more force you mentally exert on the ball. Essentially, Mindball is a contest of who can be the most calm. It’s fun to watch. The players visibly struggle to relax, closing their eyes, breathing deeply, adopting vaguely yogic postures. The panic they begin to feel as the ball approaches their end of the table is usually balanced out by the overeagerness of their opponent, both players alternately losing their cool as the big metal ball rolls back and forth. You couldn’t wish for a better, more condensed illustration of how difficult it is to try not to try.
Our lives, Slingerland argues, are often like “a massive game of Mindball,” in which we find ourselves continually caught in a loop of trying so hard that we stymie our own efforts. As in Mindball, where victory comes only when a player relaxes and stops trying to win, we spend our lives “preoccupied with effort, the importance of working, striving, and trying,” only to find that the more we try to will things into manifesting, the more elusive they become. Slingerland writes:
Our excessive focus in the modern world on the power of conscious thought and the benefits of willpower and self-control causes us to overlook the pervasive importance of what might be called “body thinking”: tacit, fast, and semiautomatic behavior that flows from the unconscious with little or no conscious interference. The result is that we too often devote ourselves to pushing harder or moving faster in areas of our life where effort and striving are, in fact, profoundly counterproductive.
Art by Austin Kleon from “Show Your Work.”
Some of the most elusive objects of our incessant pursuits are happiness and spontaneity, both of which are strikingly resistant to conscious pursuit. Two ancient Chinese concepts might be our most powerful tools for resolving this paradox — wu-wei (pronounced oooo-way) and de (pronounced duh). Slingerland explains:
Wu-wei literally translates as “no trying” or “no doing,” but it’s not at all about dull inaction. In fact, it refers to the dynamic, effortless, and unselfconscious state of mind of a person who is optimally active and effective. People in wu-wei feel as if they are doing nothing, while at the same time they might be creating a brilliant work of art, smoothly negotiating a complex social situation, or even bringing the entire world into harmonious order. For a person in wu-wei, proper and effective conduct follows as automatically as the body gives in to the seductive rhythm of a song. This state of harmony is both complex and holistic, involving as it does the integration of the body, the emotions, and the mind. If we have to translate it, wu-wei is probably best rendered as something like “effortless action” or “spontaneous action.” Being in wu-wei is relaxing and enjoyable, but in a deeply rewarding way that distinguishes it from cruder or more mundane pleasures.
“Anxiety … makes others feel as you might when a drowning man holds on to you,” Anaïs Nin wrote. “Anxiety may be compared with dizziness. He whose eye happens to look down the yawning abyss becomes dizzy,” Kierkegaard observed. “There is no question that the problem of anxiety is a nodal point at which the most various and important questions converge, a riddle whose solution would be bound to throw a flood of light on our whole mental existence,” Freud proclaimed in his classic introductory lectures on psychoanalysis. And yet the riddle of anxiety is far from solved — rather, it has swelled into a social malady pulling countless numbers of us underwater daily. Among those most mercilessly fettered by anxiety’s grip is Scott Stossel, familiar to most as the editor of The Atlantic. In his superb mental health memoir, My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind (public library | IndieBound), Stossel follows in the tradition of Montaigne, using the lens of his own experience as a prism for illuminating insight into the quintessence of our shared struggles with anxiety. From his personal memoir he weaves a cultural one, painting a portrait of anxiety through history, philosophy, religion, popular culture, literature, and a wealth of groundbreaking research in psychology and neuroscience.
Why? Because anxiety and its related psychoemotional disorders turn out to be the most prevalent and undertreated form of clinically classified mental illness today, even more common than depression. Stossel contextualizes the issue with some striking statistics that reveal the cost — both financial and social — of anxiety:
According to the National Institute of Mental Health, some forty million Americans, nearly one in seven of us, are suffering from some kind of anxiety disorder at any given time, accounting for 31 percent of the expenditures on mental health care in the United States. According to recent epidemiological data, the “lifetime incidence” of anxiety disorder is more than 25 percent — which, if true, means that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetimes. And it is debilitating: Recent academic papers have argued that the psychic and physical impairment tied to living with an anxiety disorder is equivalent to living with diabetes — usually manageable, sometimes fatal, and always a pain to deal with. A study published in The American Journal of Psychiatry in 2006 found that Americans lose a collective 321 million days of work because of anxiety and depression each year, costing the economy $50 billion annually; a 2001 paper published by the U.S. Bureau of Labor Statistics once estimated that the median number of days missed each year by American workers who suffer from anxiety or stress disorders is twenty-five. In 2005 — three years before the recent economic crisis hit — Americans filled fifty-three million prescriptions for just two antianxiety drugs: Ativan and Xanax. (In the weeks after 9/11, Xanax prescriptions jumped 9 percent nationally — and by 22 percent in New York City.) In September 2008, the economic crash caused prescriptions in New York City to spike: as banks went belly up and the stock market went into free fall, prescriptions for anti-depressant and antianxiety medications increased 9 percent over the year before, while prescriptions for sleeping pills increased 11 percent.
Few people today would dispute that chronic stress is a hallmark of our times or that anxiety has become a kind of cultural condition of modernity. We live, as has been said many times since the dawn of the atomic era, in an age of anxiety — and that, cliché though it may be, seems only to have become more true in recent years as America has been assaulted in short order by terrorism, economic calamity and disruption, and widespread social transformation.
Fittingly, Alan Watts’s The Wisdom of Insecurity: A Message for an Age of Anxiety, written in the very atomic era that sparked the dawn of our present predicament, remains one of the best meditations on the subject. But, as Stossel points out, anxiety as a clinical category emerged only about thirty years ago. He traces anxiety’s rise to cultural prominence through the annals of academic history, pointing out that only three academic papers were published on the subject in 1927, fourteen in 1941, and thirty-seven in 1950. It wasn’t until psychologist Rollo May published his influential treatise on anxiety in 1950 that academia paid heed. Today, a simple Google Scholar search returns nearly three million results, and entire academic journals are dedicated to anxiety.
But despite anxiety’s catapulting into cultural concern, our understanding of it — especially as far as mental health stereotypes are concerned — remains developmentally stunted, having evolved very little since the time of seventeenth-century Jewish-Dutch philosopher Baruch Spinoza, who asserted that anxiety was a mere problem of logic and could thus be resolved with tools of reason. Stossel counters such oversimplification with a case for layered, complex causality of the disorder:
The truth is that anxiety is at once a function of biology and philosophy, body and mind, instinct and reason, personality and culture. Even as anxiety is experienced at a spiritual and psychological level, it is scientifically measurable at the molecular level and the physiological level. It is produced by nature and it is produced by nurture. It’s a psychological phenomenon and a sociological phenomenon. In computer terms, it’s both a hardware problem (I’m wired badly) and a software problem (I run faulty logic programs that make me think anxious thoughts). The origins of a temperament are many faceted; emotional dispositions that may seem to have a simple, single source — a bad gene, say, or a childhood trauma — may not.
Self-censorship in the digital age: We won’t be able to recognize ourselves
07.04.2014 · More than a century ago, Sigmund Freud showed how we censor ourselves. In the age of digital mass surveillance, we face self-censorship of a different dimension. We are more cautious, warier. Our behavior is changing drastically.
On February 24, 1998, back when Edward Snowden was just fourteen, the National Security Agency completed one of the most remarkable documents in the history and theory of communications media. The Internet itself had just recently shifted into a commercial mode and was hosting an ever-growing fraction of all two-way communication. Electronic intelligence officers, in concert with their “partners,” took notice.
The document said: “In the past, NSA operated in a mostly analog world of point-to-point communications carried along discrete, dedicated voice channels. [M]ost of these communications were in the air and could be accessed using conventional means. … Now, communications are mostly digital, carry billions of bits of data, and contain voice, data and multimedia. They are dynamically routed, globally networked and pass over traditional communications means such as microwave or satellite less and less. … To perform both its offensive and defensive missions, NSA must live on the network.”
Lurking in the shadows of the shadows
The NSA and its allies have indeed learned to “live on the network,” hovering over tweets and texts, emails and video calls, social networks, games, images, searches, and phones. They are not the only ones with eyes on the digital prize. The British GCHQ has been fiercely aggressive in pursuing electronic intelligence, the French DGSE has happily joined in with its own version of massive electronic surveillance, and the Germans, alongside the Americans and British, have grown very familiar with the NSA’s “crown jewels,” like the digital vacuum cleaner XKeyscore (capable of searching emails, chats, and browsing histories), using the program to capture hundreds of millions of German data sets. In one NSA document reported on by Der Spiegel, the NSA applauded “the German government [for] modif[ying] its interpretation of the G-10 privacy law … to afford the BND more flexibility in sharing protected information with foreign partners.”
Of course, prying eyes on the Internet also come from countries beyond Europe and North America. Anyone who doesn’t know that the Chinese and Russians have invested heavily in cyber-espionage must reside in some other solar system. Multinational corporations plead “shock” and “outrage” that their servers and data pipes were so thoroughly hoovered; they doth protest too much. Meanwhile, those same companies are themselves cross-correlating data on all of us at a staggering rate. Lurking in the shadows of the shadows are the cybercriminals, profitably snatching government and corporate data.
Reshaping the self
In fact, the most shocking thing I’ve read over the last year has not been that electronic espionage agencies spy electronically. Instead, it was a small salmon-colored text balloon lodged on the lower right of an NSA PowerPoint PRISM slide: “PRISM cost: ~ $20M per year.”
Twenty million dollars per year? An absolutely insignificant drop in the NSA’s budget. Of course, that low price depended on getting the data from the corporate data world, whether by pressure, law, or stealth. The very ease of monitoring suggested by this low eight-figure bill means that this debate is effectively over. Sure, this or that program will be curtailed. But no one, no institution, no treaty, law, or country is going to stop this worldwide harvesting of data.
In fact, if I listed the following mental benefits from a new pill or potion, you’d be rightly sceptical.
But all these flow from a simple activity which is completely free, involves no expensive equipment, chemicals, apps, books or other products.
I’ve also included my own very brief meditation instructions below to get you started.
But first, what are all these remarkable benefits?
1. Lasting emotional control
Meditation may make us feel calmer while we’re doing it, but do these benefits spill over into everyday life?
Desbordes et al. (2012) scanned the brains of people taking part in an 8-week meditation program, before and after the course.
While they were scanned, participants looked at pictures designed to elicit positive, negative and neutral emotional responses.
After the meditation course, activation in the amygdala, the emotional centre of the brain, was reduced in response to all pictures.
This suggests that meditation can help provide lasting emotional control, even when you are not meditating.
2. Cultivate compassion
Meditation has long been thought to help people be more virtuous and compassionate. Now this has been put to scientific test.
In one study participants who had been meditating were given an undercover test of their compassion (Condon et al., 2013).
They sat in a staged waiting area with two actors when another actor entered on crutches, pretending to be in great pain. The two actors sitting next to each participant both ignored the person in pain, sending an unconscious signal not to intervene.
Those who had been meditating, though, were 50% more likely to help the person in pain.
One of the study’s authors, David DeSteno, said:
“The truly surprising aspect of this finding is that meditation made people willing to act virtuous — to help another who was suffering — even in the face of a norm not to do so.”
3. Change brain structures
Meditation is such a powerful technique that, after only 8 weeks, the brain’s structure changes.
To show these effects, images of 16 people’s brains were taken before and after they took a meditation course (Hölzel et al., 2011).
Compared with a control group, grey-matter density in the hippocampus — an area associated with learning and memory — was increased.
The study’s lead author, Britta Hölzel, said:
“It is fascinating to see the brain’s plasticity and that, by practicing meditation, we can play an active role in changing the brain and can increase our well-being and quality of life.”
4. Reduce pain
One of the benefits of changes to the brain’s structure is that regular meditators experience less pain.
Grant et al. (2010) applied a heated plate to the calves of meditators and non-meditators. The meditators had lower pain sensitivity.
Joshua Grant explained:
“Through training, Zen meditators appear to thicken certain areas of their cortex and this appears to underlie their lower sensitivity to pain.”
5. Accelerate cognition
How would you like your brain to work faster?
Zeidan et al. (2010) found significant benefits for novice meditators from only 80 minutes of meditation over 4 days.
Despite their very brief period of practice — and compared with a control group who listened to an audiobook of Tolkien’s The Hobbit — meditators improved on measures of working memory, executive functioning and visuo-spatial processing.
The authors conclude:
“…that four days of meditation training can enhance the ability to sustain attention; benefits that have previously been reported with long-term meditators.”
Improvements seen on the measures ranged from 15% to over 50%.
6. Boost creativity

The right type of meditation can help solve some creative problems.
A study by Colzato et al. (2012) had participants take a classic creativity task: think up as many uses as you can for a brick.
Those using an ‘open monitoring’ method of meditation came up with the most ideas.
Rather than narrowing attention onto the breath, open monitoring involves attending to whatever thoughts and sensations arise, setting the mind free.
7. Sharpen concentration
At its heart, meditation is all about learning to concentrate, to have greater control over the spotlight of attention.
An increasing body of studies now underline the benefits of meditation for attention.
For example, Jha et al. (2007) sent 17 people who had not practised meditation before on an 8-week training course in mindfulness-based stress reduction, a type of meditation.
These 17 participants were then compared with a further 17 from a control group on a series of attentional measures. The results showed that those who had received training were better at focusing their attention than the control group.
8. Fight depression

A central symptom of depression is rumination: when depressing thoughts roll around and around in the mind.
Unfortunately, you can’t just tell a depressed person to stop thinking depressing thoughts; it’s pointless. Treating the symptoms of depression is partly about helping the person take control of their attention.
One method that can help with this is mindfulness meditation. Mindfulness is all about living in the moment, rather than focusing on past regrets or future worries.
A recent review of 39 studies on mindfulness has found that it can be beneficial in treating depression (Hofmann et al., 2010).
Since it is so beneficial, here is a quick primer on how to meditate.
The names and techniques of meditation are many and varied, but the fundamentals are much the same:
1. Relax the body and the mind
This can be done through body posture, mental imagery, mantras, music, progressive muscle relaxation, any old trick that works. Take your pick.
This step is relatively easy as most of us have some experience of relaxing, even if we don’t get much opportunity.
2. Be mindful
This one is a bit cryptic, but it means something like the following: don’t pass judgement on your thoughts, let them come and go as they will (and boy will they come and go!). When your mind wanders, try to nudge your attention back to its primary aim.
It turns out this is quite difficult because we’re used to mentally travelling backwards and forwards while making judgements on everything (e.g. worrying, dreading, anticipating, regretting etc.).
The key is to notice, in a detached way, what’s happening, but not to get involved with it. This way of thinking often doesn’t come that naturally.
3. Concentrate on something
Often meditators concentrate on their breath, the feel of it going in and out, but it could be anything: your feet, a potato, a stone.
The breath is handy because we carry it around with us. Whatever it is, though, try to focus all your attention onto it.
When your attention wavers, and it will almost immediately, gently bring it back. Don’t chide yourself, be compassionate to yourself.
The act of concentrating on one thing is surprisingly difficult: you will feel the mental burn almost immediately. Experienced practitioners say this eases with practice.
4. Concentrate on nothing
Most say this can’t be achieved without a lot of practice, so I’ll say no more about it here. Master the basics first.
This is just a quick introduction but does give you enough to get started. It’s important not to get too caught up in techniques but to remember the main goal: exercising attention by relaxing and focusing on something.
Try these things out first, see what happens, then explore further.
Understanding the psychology behind the way we tick might help us to tick even better.
A great deal of research has gone into the how and why behind our everyday actions and interactions, and the results are revealing. If you are looking for a way to supercharge your personal development, understanding the psychology behind our actions is an essential first step.
Fortunately, knowing is half the battle. When you realize all the many ways in which our minds create perceptions, weigh decisions, and subconsciously operate, you can see the psychological advantages start to take shape. It’s like a backstage pass to the way we work, and being backstage, you have an even greater understanding of what it takes to succeed.
The following 6 psychology facts can be viewed as a hacker’s guide to self-improvement, based on the brain’s default settings. So, that’s exactly what this is – your backstage pass to how our brain functions and how we can best avoid common misconceptions: (…)
Imagine you are asked to watch a short video in which six people — three in white shirts and three in black shirts — pass basketballs around. While you watch, you must keep a silent count of the number of passes made by the people in white shirts. At some point, a gorilla strolls into the middle of the action, faces the camera and thumps its chest, and then leaves, spending nine seconds on screen. Would you see the gorilla?
Almost everyone has the intuition that the answer is “yes, of course I would.” How could something so obvious go completely unnoticed? But when we did this experiment at Harvard University several years ago, we found that half of the people who watched the video and counted the passes missed the gorilla. It was as though the gorilla was invisible.
This experiment reveals two things: that we are missing a lot of what goes on around us, and that we have no idea that we are missing so much. To our surprise, it has become one of the best-known experiments in psychology. It is described in most introductory textbooks and is featured in more than a dozen science museums. It has been used by everyone from preachers and teachers to corporate trainers and terrorist hunters, not to mention characters on the TV show C.S.I., to help explain what we see and what we don’t see. And it got us thinking that many other intuitive beliefs that we have about our own minds might be just as wrong. We wrote The Invisible Gorilla to explore the limits of human intuition and what they mean for ourselves and our world.