Andrew Montin Sep 4, 2017

Short prehistory:

On a cool May day in 1758, a 10-year-old girl with red hair and freckles was caring for her neighbor’s children in rural western Pennsylvania. In a few moments, Mary Campbell’s life changed forever when Delaware Indians kidnapped her and absorbed her into their community for the next six years. She became the first of some 200 known cases of white captives, many of whom became pawns in an ongoing power struggle that included European powers, American colonists and Indigenous peoples straining to maintain their population, their land and way of life.

While Mary was ultimately returned to her white family—and some evidence points to her having lived happily with her adopted Indian tribe—stories such as hers became a cautionary tale among white settlers, stoking fear of “savage” Indians and creating a paranoia that escalated into all-out Indian hating.

From the time Europeans arrived on American shores, the frontier—the edge territory between white man’s civilization and the untamed natural world—became a shared space of vast, clashing differences that led the U.S. government to authorize over 1,500 wars, attacks and raids on Indians, the most of any country in the world against its Indigenous people. By the close of the Indian Wars in the late 19th century, fewer than 238,000 Indigenous people remained, a sharp decline from the estimated 5 million to 15 million living in North America when Columbus arrived in 1492.

The reasons for this racial genocide were multi-layered. Settlers, most of whom had been barred from inheriting property in Europe, arrived on American shores hungry for Indian land—and the abundant natural resources that came with it. Indians’ collusion with the British during the American Revolution and the War of 1812 exacerbated American hostility and suspicion toward them.

Even more fundamentally, Indigenous people were just too different: Their skin was dark. Their languages were foreign. And their world views and spiritual beliefs were beyond most white men’s comprehension. To settlers fearful that a loved one might become the next Mary Campbell, all this stoked racial hatred and paranoia, making it easy to paint Indigenous peoples as pagan savages who must be killed in the name of civilization and Christianity.

History:

Jonathan Lear describes his book Radical Hope (2006) (German translation 2020) as a work of “philosophical anthropology”. Like an anthropologist, he is interested in what happened to the Crow tribe when they were moved onto reservations and their traditional way of life came to an end. Unlike an anthropologist, however, Lear is also concerned with the larger questions entailed by the possibility that a way of life could come to an end. One such question is ethical in nature: how should one live in relation to the prospect that one’s way of life may come to an end? Another such question is ontological, in the sense that it concerns the nature of that being for whom such a thing is possible.

This ontological dimension was intimated by something the last Crow chief, Plenty Coups, said when describing the end of his tribe’s traditional way of life. In recounting his life story, Plenty Coups described the period when his people moved onto the reservation this way: “But when the buffalo went away the hearts of my people fell to the ground, and they could not lift them up again. After this nothing happened.” Lear admits that he cannot know precisely what Plenty Coups meant when he said “nothing happened.” Was Plenty Coups depressed? (Lear notes in passing that the rest of Plenty Coups’ life certainly does not seem to be that of a depressed person.) Does he mean that his tribe could no longer go on living according to the traditional ways? These are plausible interpretations of what Plenty Coups might have meant. But Lear wants to pursue the possibility that something deeper was being communicated by Plenty Coups’ remark. He asks: “What if it gave expression to an insight into the structure of temporality: that at a certain point things stopped happening? What would he have meant if he meant that?” The implication here is that our sense of time, of things happening and our understanding of what happens, are bound up with a particular way of life. When that way of life comes to an end, the intelligibility of our world also collapses; for us, it is as if nothing more happens because nothing can make sense outside of that way of life.

For Lear, Plenty Coups’ remark points to “a particular form of human vulnerability”; a vulnerability we all share by virtue of being human. It is an ontological vulnerability because it concerns our particular way of being in relation to the world and to time. In posing the issue in these terms, Lear acknowledges a debt to the German philosopher Martin Heidegger (1889–1976). In his book Being and Time, Heidegger presented human existence as fundamentally concerned with making sense of the world in terms of its meaningful possibilities. Everyday objects, for example, are intelligible to us primarily through the way they express specific possibilities for their use: a hammer for hammering nails, a lectern for placing lecture notes on, etc. These possibilities are not infinite — the range of possibilities is determined by the specific culture and society we grew up in and in which we live our lives. What is significant about Heidegger’s account in this context is that culture and social life are not things which come after or exist alongside our relation to objects, but instead are the very medium through which objects become intelligible to us at all.

It is in this light that Lear reflects on the simple act of cooking a meal. Cooking is common to all human societies, but the meaning which the act of cooking a meal has for each of us depends on the culture and society in which the action is embedded. For the Crow, whose traditional way of life revolved around hunting and fighting, the intelligibility of cooking a meal would have depended upon its relation to the possibilities of hunting and fighting. With the collapse of their traditional way of life, cooking a meal could no longer be made sense of in those terms. Of course, the Crow could make sense of it otherwise in relation to the way of life which followed. But to someone bearing witness, as Lear puts it, to the demise of the traditional way of life, it is as if the act of cooking no longer counted as an intelligible act at all. And without the meaningfulness of cultural objects like the coup stick used by the Crow in battle, or of everyday acts like cooking in preparation for a hunt, there is no longer any socially meaningful way for the Crow to mark time. The Crow “ran out of whens,” as Lear puts it, “all Crow temporality had fitted within these categories — everything that happened could be understood in these terms — and thus it seems fair to say that the Crow ran out of time.” It is this possibility, peculiar to human beings as cultural creatures, that Lear seeks to understand when reflecting on the fate of the Crow people.

In part one, I noted that Lear draws on Heidegger’s Being and Time to make sense of our “ontological vulnerability” to a breakdown in meaning. As creatures whose existence fundamentally consists in making sense of things, we are to a great extent dependent upon the meaningful possibilities which are illuminated through our cultural and social practices. When this cultural foundation collapses, there is a sense in which the intelligibility of our world also collapses. One of the main ideas Lear explores is that even prior to such a collapse, there is a way in which this vulnerability can make itself felt.

In Being and Time, Heidegger discusses the individual’s experience of anxiety (Angst in German) as revealing something of great importance about human existence. Anxiety draws us out of what Heidegger calls our “crowd-self”, that is, the typical roles, worries and tasks with which we preoccupy ourselves in our everyday lives. Anxiety is the feeling or mood that strikes us when our daily preoccupations begin to lose their grip on us; when we begin to wonder about the point of it all and whether there isn’t a deeper meaning to our lives. Through this experience, the world itself becomes unfamiliar or uncanny. Consider, for example, a case in which a deadline that someone has been working towards all of a sudden loses its urgency for that person. Whereas before they had been wrapped up in the need to meet the deadline, and so focused on bringing together all those elements which are needed to make things happen, at that very moment the deadline no longer appears to them as something which has the same organizing and motivating significance for them. The experience is uncanny because a situation which had been familiar now becomes very unfamiliar, even though the person can still understand everything that is going on. What moments like this can reveal is that the intelligibility of our world is very much dependent on us, on our active taking up of possibilities and combining them together in the tasks and projects we choose to pursue, rather than as a meaningful whole which we can simply take for granted. The anxiety arises in distancing ourselves from the possibilities with which we are normally fully engaged, and in the accompanying threat of a breakdown of intelligibility.

In a footnote to chapter two, Lear describes his book as a meditation on Heidegger’s idea of “being-towards-death”. This is more accurately put as “being on the edge of death”, because Heidegger explicitly associates it with the sense of anxiety described above, which can strike us at any moment and isn’t specifically tied to some point in the future when we will cease to exist. It is meant to highlight how our existence is always, at every moment, engaged in an existential struggle to ward off the utter meaninglessness that lies on the other side of existence, even if we rarely relate to our own existence in these terms. It is in the mood of anxiety, by coming face to face with the world in its uncanniness, that this existential struggle is revealed to each individual.

While Heidegger’s focus is on the individual’s experience of anxiety, Lear is interested in how this kind of experience might affect an entire community which is facing the prospect of cultural devastation. He invokes the idea of communal anxiety, explicitly in relation to the accounts of anxiety developed by both Heidegger and Kierkegaard (whom I’ll discuss in a future post). Lear’s argument in chapters two and three is mostly concerned with how such an anxiety, as felt by the Crow, may have been transformed through radical hope into the imaginative resources needed to survive the collapse of their way of life. Having considered some of the background to Lear’s argument, I will consider the argument itself in more detail in part three.

In the previous post in this series, I looked at Heidegger’s account of anxiety and how it is bound up with the finitude of human existence. Anxiety is a mood which exposes the intelligibility of the world as something which is contingent on our own existential struggle to make sense of it. Lear’s account of anxiety in the face of cultural devastation is influenced by Heidegger, but also by the Danish philosopher Søren Kierkegaard. It is through the influence of the latter that Lear emphasizes the ironic nature of this experience of anxiety. What is meant by irony here?

Lear introduces the notion of irony in chapter one of Radical Hope by reflecting on the criteria for a vibrant culture:

  1. There must be established social roles that one can embody and interact with.
  2. There must be standards of excellence associated with these roles, that give us a sense of the culture’s ideals.
  3. There must be the possibility of constituting oneself as a certain sort of person — namely, one who embodies those ideals.

The sense of irony which Lear is concerned with arises historically when the possibility of constituting oneself as a certain sort of person (3) becomes problematic. In the case of the Crow, irony in this sense was impossible in the 1840s because the three criteria listed above cohered in such a way that ensured their culture’s vibrancy. But a hundred years later, when the Crow had been moved onto a reservation, this coherence collapsed. While it was still possible to recognise the traditional social roles, the standards of social excellence associated with most of these roles could no longer be realized. Intertribal warfare had been banned, traditional hunting had become impossible, and mortality rates from disease had almost wiped out a younger generation. Under these conditions, it became possible for the Crow to ask:

Among the warriors, is there a warrior?

One could call oneself a warrior, but it was no longer clear what pursuing the ideals of being a warrior might entail once intertribal warfare had been banned. With the breakdown of social roles and the patterns of upbringing disrupted, the very possibility of constituting oneself as a Crow subject was thrown into question. It is in this light that one might ask the question:

Among the Crow, is there a Crow?

As outsiders, we seem to have no trouble reformulating this question as: among those who call themselves members of the Crow nation, are there any members who live up to the ideals of being a Crow? But from the perspective of the Crow people themselves, there is a disorientation inherent in the question which is not felt by the observer. In his essay “A Lost Conception of Irony,” Lear explains this difference as follows:

“[F]rom the perspective of my Crow friends, the question has a different aura. It makes them anxious; or rather it names a core anxiety. I mean anxiety in the literal sense of disruptive separation from the world and disorientation. It is easy for us to hear the question as though it were coming from the superego — a question of whether the Crow fail to live up to their ideals. But from the perspective of my Crow friends, the ideal is every bit as much in question as they are.”

It is not merely a theoretical question for the Crow, but a practical matter of how one should live. And it produces anxiety because there is no longer a clear answer to the question of how to go on.

This experience of irony can affect any of us in coming to terms with our social roles. In such an experience, the question of what counts as excellence in the role is not simply an occasion to reflect upon whether we do in fact live up to such ideals; it leads to a moment of anxious disruption, as the very nature of those ideals is called into question. For Lear, this experience of irony raises the ethical question of how we can live in such a way that remains open to it without leading to despair. For Kierkegaard, the figure of Socrates represented an exemplar of ironic existence. I would suggest that in Radical Hope, Lear offers us the figure of Plenty Coups, the last great chief of the Crow, as another such exemplar. I will consider this further in my next post.

When he was nine years old, Plenty Coups underwent a traditional rite of passage which involved leaving the tribe for a few days and, through solitude and fasting, experiencing a dream-vision. On his second night in the wilderness, Plenty Coups dreamt that he met a buffalo bull who turned into a man wearing a buffalo robe, and who showed him a plain in which countless buffalo emerged from a hole in the ground. Suddenly, the plains were empty, and out of the hole in the ground came animals which looked similar to the buffalo but were spotted. Plenty Coups was then shown an old man sitting under a tree, and was told that he was looking at himself. Finally, Plenty Coups witnessed a terrible storm in which the four winds blew down all the trees in the forest except one. He was told that inside the tree was the lodge of the Chickadee, and that the Chickadee-person was the one with the least physical strength but the strongest mind, who was willing to work for wisdom and never missed a chance to learn from others.

Plenty Coups recounted his dream to the tribal elders who then interpreted it. They said it foretold a time when the white man’s herds would replace the buffalo, and that only by becoming like the Chickadee and learning from the experience of others would the tribe be able to survive and hold onto its lands.

Jonathan Lear believes that Plenty Coups’ prophetic dream was most probably a response to the tribe’s communal sense of anxiety. Plenty Coups would have had this dream in 1855 or ’56, by which time the advance of white settlers had pushed rival tribes into greater proximity with one another, and the escalation in inter-tribal warfare and diseases such as smallpox had reduced the Crow’s population by about half. The dream was part of the process by which the tribe’s anxieties could be metabolized and represented in narrative form. And it gave Plenty Coups, as a future chief of the tribe, the imaginative resources needed to cope with the “storm” of cultural devastation that was coming. In particular, Lear thinks that the values represented in the dream by the Chickadee came to articulate a new form of courage.

For Lear this is a crucial point, because the primary virtue around which Crow life had revolved was courage in battle. The ultimate act of courage was symbolically represented by the planting of a coup-stick, which expressed a Crow warrior’s resolve to die rather than retreat. Lear analyses this and other acts of courage as marking a boundary around Crow life which demanded recognition even from the Crow’s enemies. This is what Lear calls the Crow’s “thick” conception of courage, by which he means a concept rooted in a particular culture and historical circumstances. What happened to the Crow, however, was that the possibilities for practicing their traditional way of life would become restricted to such an extent that such thick concepts eventually became unintelligible. A virtue like courage simply could not be realized as it had been in the past. How does one retain a sense of virtue or ethics when the very concepts which had informed one’s cultural understanding of what is good collapse?

According to Lear, the values expressed in the dream through the figure of the Chickadee represented a kind of radical hope. It is radical in the sense that the values transcend the finite ethical forms manifested by thick ethical concepts. Plenty Coups’ vision was not of a future form of life, but of a commitment to the possibility of ethics even after the concepts with which one had understood the ethical ceased to make sense. Lear explains this point as follows:

“It is difficult to grasp the radical and strange nature of this commitment. For, on the one hand, Plenty Coups is witnessing the death of a traditional way of life. It is the death of the possibility of forming oneself as a Crow subject — at least, as traditionally understood. On the other hand, he is committed to the idea that by “listening as the Chickadee listens” he and the Crow will somehow survive. What could this mean? We would have to understand the Crow as somehow transcending their own subjectivity. That is, we would have to understand them as surviving the demise of the established ways of constituting oneself as a Crow subject. In that sense, it is no longer possible to be a Crow…. Still, on the basis of his dream, he commits himself to the idea that — on the other side of the abyss — the Crow shall survive, perhaps flourish again. The Crow is dead, long live the Crow! This is a form of hope that seems to survive the destruction of a way of life. Though it must be incredibly difficult to hold onto this commitment in the midst of subjective catastrophe, it is not impossible. And it is at least conceivable that this is just what Plenty Coups did.”

This kind of commitment, Lear argues, is ironic in Kierkegaard’s sense of the term. It is a recognition, or more precisely a hope, that by giving up a traditional way of life new possibilities will open up and another way of flourishing will become possible. It is a commitment undertaken despite the fact that these future possibilities cannot be comprehended in advance.

Lear draws on many biographical facts about Plenty Coups’ life to suggest the ways in which he followed the wisdom of the Chickadee in the face of cultural devastation. One episode Lear places great emphasis on is Plenty Coups’ participation in a ceremony at the Tomb of the Unknown Soldier in 1921, when he laid down his coup-stick and headdress. By this act, Lear believes, Plenty Coups acknowledged that the traditional forms of fitting or virtuous behaviour were no longer appropriate. But he did so in a way which was itself fitting, that demonstrated “in these radically altered circumstances” that it was still possible “to think about what it was appropriate to do.” Plenty Coups’ actions did not just mark an end to a way of life, but sought to creatively reinterpret traditional ideals from within a radically new context.

My comment: This immediately reminds me of James Baldwin’s legendary 1965 speech at Cambridge University: “…It comes as a great shock to discover that Gary Cooper killing off the Indians—when you were rooting for Gary Cooper—the Indians were you. It comes as a great shock to discover the country which is your birth place and to which you owe your life and your identity has not in its whole system of reality evolved any place for you….” I have also posted this speech on my website.

My recommendation is the 2020–2021 exhibition at the Field Museum in Chicago: “I hope this exhibition helps people to honor their own cultural experiences in new ways and to identify with Indigenous people—to realign ourselves as Americans and understand that this is a very diverse country.”

Nina Sanders (Apsáalooke), guest curator of Apsáalooke Women and Warriors, on view at the Field Museum in Chicago until April 4, 2021. Here are two links: https://www.fieldmuseum.org/exhibitions/apsaalooke-women-and-warriors

https://culturalpropertynews.org/apsaalooke-women-and-warriors-in-chicago/

“But when the buffalo went away the hearts of my people fell to the ground, and they could not lift them up again. After this nothing happened.”

“We live in a time of an intensifying sense that civilizations are vulnerable. Events all over the world, terrorist attacks, violent social upheavals, and also natural disasters (pandemics such as Covid-19), leave us with an uncanny feeling of menace. We seem to be aware of a shared vulnerability that we cannot quite name. I suspect that this feeling has also given rise to the widespread intolerance. It is as if, without our insistence on the rightness of our own perspective, that perspective itself could collapse. If we could give our shared sense of vulnerability a name (a concept, a word), perhaps we could also live with it better.” (translated from the 2020 German translation of Jonathan Lear’s Radical Hope)

Stoicism And Digital Minimalism: An Interview With Computer Scientist And Bestselling Author Cal Newport

The unassuming Georgetown computer science professor Cal Newport has become one of this generation’s leading voices on how we can all work more wisely and more deeply. In his latest book Digital Minimalism: Choosing A Focused Life In A Noisy World, which was released just this week, the bestselling author of Deep Work introduces a philosophy for technology use that has already improved countless lives. With media consumption continuing to go way up (which, for most of us, means happiness and productivity continue to go way down) and the world becoming noisier every day, this book is an urgent call to action for anyone serious about being in command of their own life. The minimalism movement successfully led millions to opt out of the many possessions we’re told we’re supposed to crave and focus instead on the small number of things that bring the most meaning and value to our lives. The same ideology applies to our online lives. Digital clutter is stressful. We don’t need the constant connectivity, the pages and pages of apps, the incessant scrolling and clicking. New technologies can improve our lives if we know how best to leverage them.

Cal is also a fan of the Stoics. In our interview with Cal, he explains his interest and application of Stoicism, why the idea that less can be more has held up since ancient times, the importance of solitude and high-quality leisure, and so much more. Please enjoy our interview with Cal Newport! 

***

You’ve written about Stoicism a few times, which makes sense for a college professor but is a bit unexpected for a computer science professor. Do you remember how you were first introduced to Stoicism? 

I’ve always read widely in both philosophy and religious history, so Stoicism has been on my radar for a while. I remember reading William Irvine’s book, A Guide to the Good Life, around the time it came out. I also remember Tim Ferriss, during this period, was talking a lot about Seneca.

Why do you think it resonated with you?

At a high level, I’ve always liked the Ancient Greek model of philosophy as a blueprint for action. This is why in my books I always mix practical advice with more complicated theories and big-picture ideas. When I first started working in this “smart self-help” mash-up style, there was resistance from the publishing world. Self-help books were supposed to be written conversationally and have little intellectual content, and idea books were supposed to be smart and critical and never sully themselves with actual suggestions. This division is artificial. The Greeks had it right: what’s the point of thinking hard about issues related to your life if it’s not going to directly help your life? This has been a big inspiration for my approach to books.

At a lower level, Stoicism itself contains great psychological wisdom: reactive thoughts, more than any actual events, control our experience of our lives. Learning to find strength and joy in what you control over what you cannot sounds simple, but is profound when put into action.

You have a great post about Seneca on the myth of “free” and how most people miss the hidden costs of social media. As someone who has decided—famously—to not have any social media accounts, what do you feel like you’ve gained by that? Or avoided paying? Tell us what it’s like over on the other side. 

From Seneca, through Thoreau, to Kondo, the idea that less can be more has persisted throughout the history of civilization. The core insight is that focusing your limited energy on the things you know for sure are very valuable will return greater overall value than trying to dissipate this energy over many lower-value things in the vain attempt to not miss out on random scraps of value.

I don’t use social media because in my life as a writer I want to focus my energy like a laser on the small number of things that I’ve already learned provide me a big return: reading smart things, writing essays on my blog to test ideas, and writing books.

I’m sure there are many little bits of value I might have extracted by engaging with a subset of my readers through Twitter, or managing a social media consultant who posts on my behalf to Facebook. But the energy invested in these pursuits is energy taken away from the core activities that I know move the needle in my writing career, meaning my net return would likely be lower.

To put it more concretely, if we consider the counterfactual in which I’m a heavy social media user, I’d probably have a lot of Twitter followers but would also have written fewer books in total. I’ll take the books over retweets.

If you’ll indulge my ranting a little longer, I want to also note that social media has become particularly pernicious for people who are just getting started out in a competitive creative field. It provides you an activity that can make you feel busy, and important, and like you’re crushing things left and right, without demanding much that’s actually hard. But as far as the market is concerned, only the hard stuff matters!

In my experience, in almost every competitive field, the absolute key to both success and fulfillment is to follow Steve Martin’s famous advice to become so good they can’t ignore you (I wrote a book about this in 2012). Or as Jocko would put it: put down the damn phone and get after it!

Your new book Digital Minimalism is about reducing the time we spend online, focusing on a small number of activities that support things we deeply value. One of the things that Stoics talk about is needing to make time for philosophy—and how hard that is to do. Isn’t it crazy that people have been struggling with this stuff for so long? Have you found that digital minimalism is giving people more time to read and think and reflect on what really matters?

Last year, I led over 1,600 volunteers in an experiment where they stepped away from optional technologies in their digital lives to be reacquainted with what they really value.  One of the most common reports I heard from these volunteers was that they were both surprised and excited to rediscover how much they enjoyed the simple analog activities we used to take for granted, such as coming home from the library with a stack of random books or building something with their hands.

As you note, these issues are not new. Both Seneca and Aristotle wrote about the importance of high-quality leisure. Arnold Bennett wrote this great guide in the early 20th century titled How to Live on 24 Hours a Day, in which he argued that what you do with your free time is fundamental to the quality of your life. He argues that you should read and think about hard things instead of getting drunk and noodling around on the piano (the Victorian Age equivalent of surfing Twitter).

One of Seneca’s lines is about thinking about what we’ve become “slaves” to—really questioning anything that we’re powerless to stop checking or doing or spending time on. Is the fact that social media and technology are just so hard to quit a kind of proof that there is something manipulative and dangerous about them?

There seem to be two major reasons why we spend so much more time than we know is healthy looking at screens.

The first is that some of these tools—especially the major social media platforms—have specifically engineered their user experiences to foster compulsive use. Facebook, for example, used to be a fairly static platform. You might log in a few times a week to see if any friends had changed their relationship status or posted pictures from a vacation. When it moved to mobile, however, Facebook reengineered the experience so that it would send a rich stream of social approval indicators to its users (likes, photo tags, comments), mixed in with algorithmically optimized feed items meant to spark an emotional charge. Now instead of checking a few times a week, you compulsively click that little “f” icon on your phone twenty times an hour. This was not accidental—it was the Facebook executives’ strategy for boosting user engagement metrics to where they needed to be for the IPO to succeed.

Put another way: the “like” button wasn’t introduced for your benefit; it was introduced for the benefit of the early Facebook investors who were getting antsy for their 100x return.

The other reason that seems to keep people glued to their screens is that the screen fills a void. Life is hard. This hardness is especially manifest during those periods of downtime when you’re alone with your thoughts. People avoid these confrontations through constant, low-quality digital distraction, much in the way that people of another era might have dealt with these difficulties with heavy drinking.

But this is just a band-aid over a deeper wound.

As the ancients taught us, the sustainable response is to instead dedicate your free time toward things that matter. Take on as much responsibility as you can bear, seek out quality for the sake of quality (as Aristotle recommends in The Ethics), serve your community, connect with real people in real life and sacrifice for them.

All of this can seem daunting as compared to clicking “watch next” on your Netflix stream, but once engaged in these deeper pursuits, it’s hard to go back to the shallow.

You also popularized this concept of “Deep Work”—the increasingly rare ability to concentrate without distraction on a demanding task. You say this is a skill that can be trained. What are your recommendations to people who want to improve their ability to do deep work?

Remove from your phone any app where someone makes money off your attention when you open it. These apps are to your cognitive abilities what junk food is to your athletic abilities.

Spend regular periods of time in a state of solitude, by which I mean “free from inputs from other minds”: no phone, no earbuds, no screen, no books—just you and your thoughts. If you want to be good at thinking, by which I mean processing information and generating insights, you need to practice. (This definition of solitude comes from a great book called Lead Yourself First.)

Do interval training: pick a hard problem; set a timer; think intensely about the problem with zero distractions (not even the smallest glance at a phone) until the timer goes off. Start with a small amount of time, and once you become comfortable with that duration of focus, increase the time by 10-15 minutes.

One of the things the Stoics talk about is detaching from results or outcomes (we control the input on a given project, for example, but not how critics or the market receive it). How do you think about that with something stressful and uncertain like a book launch?

One of my rules during book launches is that I put my effort into taking good swings, by which I mean writing the best article I can, or giving the best interview I’m capable of in the moment. But then once the swing is done, put your attention immediately on the next. I don’t, for example, read comments or look at social media reaction for the things I’m putting out there. Once it’s out of my hands I want to move on.

Any good book recommendations? Any favorite Stoic quotes you want to share?

A few books I love that your audience would probably like too, but might not have heard about:

Lincoln’s Virtues, by William Lee Miller (a virtuosic and exuberant look at the development of Lincoln’s moral life)

Amusing Ourselves to Death by Neil Postman (a more pragmatic Marshall McLuhan: dissects how media influences how we think)

Medieval Technology and Social Change by Lynn White Jr. (classic work from history of technology that argues the horse stirrup caused feudalism; great example of both big think history and a reminder of the unintentional influences of tech on culture)

Cal’s new book Digital Minimalism is out now! Digital clutter is stressful. We don’t need the constant connectivity, the pages and pages of apps, the incessant scrolling and clicking. New technologies can improve our lives if we know how best to leverage them. Cal shows us how.

2014’s Best Books on Psychology, Philosophy, and How to Live Meaningfully | Brain Pickings.

by Maria Popova

How to be alone, wake up from illusion, master the art of asking, fathom your place in the universe, and more.

After the year’s most intelligent and imaginative children’s books and best science books, here are my favorite books on psychology and philosophy published this year, along with the occasional letter and personal essay — genres that, at their most excellent, offer hearty helpings of both disciplines. Perhaps more precisely, these are the year’s finest books on how to live sane, creative, meaningful lives. (And since the subject is of the most timeless kind, revisit the selections from 2013, 2012, and 2011.)

1. A GUIDE FOR THE PERPLEXED

Werner Herzog is celebrated as one of the most influential and innovative filmmakers of our time, but his ascent to acclaim was far from a straight trajectory from privilege to power. Abandoned by his father at an early age, Herzog survived a WWII bombing that demolished the house next door to his childhood home and was raised by a single mother in near-poverty. He found his calling in filmmaking after reading an encyclopedia entry on the subject as a teenager and took a job as a welder in a steel factory in his late teens to fund his first films. These building blocks of his character — tenacity, self-reliance, imaginative curiosity — shine with blinding brilliance in the richest and most revealing of Herzog’s interviews. Werner Herzog: A Guide for the Perplexed (public library) — not to be confused with E.F. Schumacher’s excellent 1978 philosophy book of the same title — presents the director’s extensive, wide-ranging conversation with writer and filmmaker Paul Cronin. His answers are unfiltered and to-the-point, often poignant but always unsentimental, not rude but refusing to infest the garden of honest human communication with the Victorian-seeded, American-sprouted weed of pointless politeness.

Herzog’s insights coalesce into a kind of manifesto for following one’s particular calling, a form of intelligent, irreverent self-help for the modern creative spirit — indeed, even though Herzog is a humanist fully detached from religion, there is a strong spiritual undertone to his wisdom, rooted in what Cronin calls “unadulterated intuition” and spanning everything from what it really means to find your purpose and do what you love to the psychology and practicalities of worrying less about money to the art of living with presence in an age of productivity. As Cronin points out in the introduction, Herzog’s thoughts collected in the book are “a decades-long outpouring, a response to the clarion call, to the fervent requests for guidance.”

And yet in many ways, A Guide for the Perplexed could well have been titled A Guide to the Perplexed, for Herzog is as much a product of his “cumulative humiliations and defeats,” as he himself phrases it, as of his own “chronic perplexity,” to borrow E.B. White’s unforgettable term — Herzog possesses that rare, paradoxical combination of absolute clarity of conviction and wholehearted willingness to inhabit his own inner contradictions, to pursue life’s open-endedness with equal parts focus of vision and nimbleness of navigation.

A certain self-reliance permeates his films and his mind, a refusal to let the fear of failure inhibit trying — a sensibility the voiceover in the final scene of Herzog’s The Unprecedented Defence of the Fortress Deutschkreuz captures perfectly: “Even a defeat is better than nothing at all.”

Sample this magnificent tome with Herzog on creativity, self-reliance, and making a living out of what you love and his no-bullshit advice to aspiring filmmakers, which applies just as brilliantly to any field of creative endeavor.

2. HOW TO BE ALONE

If the odds of finding one’s soul mate are so dreadfully dismal and the secret of lasting love is largely a matter of concession, is it any wonder that a growing number of people choose to go solo? The choice of solitude, of active aloneness, has relevance not only to romance but to all human bonds — even Emerson, perhaps the most eloquent champion of friendship in the English language, lived a significant portion of his life in active solitude, the very state that enabled him to produce his enduring essays and journals. And yet that choice is one our culture treats with equal parts apprehension and contempt, particularly in our age of fetishistic connectivity. Hemingway’s famous assertion that solitude is essential for creative work is perhaps so oft-cited precisely because it is so radical and unnerving in its proposition.

Solitude, the kind we elect ourselves, is met with judgment and enslaved by stigma. It is also a capacity absolutely essential for a full life.

That paradox is what British author Sara Maitland explores in How to Be Alone (public library | IndieBound) — the latest installment in The School of Life’s thoughtful crusade to reclaim the traditional self-help genre in a series of intelligent, non-self-helpy yet immeasurably helpful guides to such aspects of modern living as finding fulfilling work, cultivating a healthier relationship with sex, worrying less about money, and staying sane.

While Maitland lives in a region of Scotland with one of the lowest population densities in Europe, where the nearest supermarket is more than twenty miles away and there is no cell service (pause on that for a moment), she wasn’t always a loner — she grew up in a big, close-knit family as one of six children. It was only when she became transfixed by the notion of silence, the subject of her previous book, that she arrived, obliquely, at solitude. She writes:

I got fascinated by silence; by what happens to the human spirit, to identity and personality when the talking stops, when you press the off button, when you venture out into that enormous emptiness. I was interested in silence as a lost cultural phenomenon, as a thing of beauty and as a space that had been explored and used over and over again by different individuals, for different reasons and with wildly differing results. I began to use my own life as a sort of laboratory to test some ideas and to find out what it felt like. Almost to my surprise, I found I loved silence. It suited me. I got greedy for more. In my hunt for more silence, I found this valley and built a house here, on the ruins of an old shepherd’s cottage.

Illustration by Marianne Dubuc from “The Lion and the Bird,” one of the best children’s books of the year.

Maitland’s interest in solitude, however, is somewhat different from that in silence — while private in its origin, it springs from a public-facing concern about the need to address “a serious social and psychological problem around solitude,” a desire to “allay people’s fears and then help them actively enjoy time spent in solitude.” And so she does, posing the central, “slippery” question of this predicament:

Being alone in our present society raises an important question about identity and well-being.

[…]

How have we arrived, in the relatively prosperous developed world, at least, at a cultural moment which values autonomy, personal freedom, fulfillment and human rights, and above all individualism, more highly than they have ever been valued before in human history, but at the same time these autonomous, free, self-fulfilling individuals are terrified of being alone with themselves?

[…]

We live in a society which sees high self-esteem as a proof of well-being, but we do not want to be intimate with this admirable and desirable person.

We see moral and social conventions as inhibitions on our personal freedoms, and yet we are frightened of anyone who goes away from the crowd and develops “eccentric” habits.

We believe that everyone has a singular personal “voice” and is, moreover, unquestionably creative, but we treat with dark suspicion (at best) anyone who uses one of the most clearly established methods of developing that creativity — solitude.

We think we are unique, special and deserving of happiness, but we are terrified of being alone.

[…]

We are supposed now to seek our own fulfillment, to act on our feelings, to achieve authenticity and personal happiness — but mysteriously not do it on our own.

Today, more than ever, the charge carries both moral judgement and weak logic.

Maitland goes on to explore the underlying psychology of our unease from the fall of the Roman Empire to the rise of the “male spinster” and how to cultivate the five deepest rewards of solitude. Read more here.

3. WAKING UP

Nietzsche’s famous proclamation that “God is dead” is among modern history’s most oft-cited aphorisms, and yet as is often the case with its ilk, such quotations often miss the broader context in a way that bespeaks the lazy reductionism with which we tend to approach questions of spirituality today. Nietzsche himself clarified the full dimension of his statement six years later, in a passage from Twilight of the Idols, where he explained that “God” simply signified the supersensory realm, or “true world,” and wrote: “We have abolished the true world. What has remained? The apparent one perhaps? Oh no! With the true world we have also abolished the apparent one.”

Indeed, this struggle to integrate the sensory and the supersensory, the physical and the metaphysical, has been addressed with varying degrees of sensitivity by some of history’s greatest minds — reflections like Carl Sagan on science and religion, Flannery O’Connor on dogma, belief, and the difference between religion and faith, Alan Lightman on science and spirituality, Albert Einstein on whether scientists pray, Ada Lovelace on the interconnectedness of everything, Alan Watts on the difference between belief and faith, C.S. Lewis on the paradox of free will, and Jane Goodall on science and spirit.

In Waking Up: A Guide to Spirituality Without Religion (public library | IndieBound), philosopher, neuroscientist, and mindful skeptic Sam Harris offers a contemporary addition to this lineage of human inquiry — an extraordinary and ambitious masterwork of such integration between science and spirituality, which Harris himself describes as “by turns a seeker’s memoir, an introduction to the brain, a manual of contemplative instruction, and a philosophical unraveling of what most people consider to be the center of their inner lives.” Or, perhaps most aptly, an effort “to pluck the diamond from the dunghill of esoteric religion.”

Sam Harris by Bara Vetenskap

Harris begins by recounting an experience he had at age sixteen — a three-day wilderness retreat designed to spur spiritual awakening of some sort, which instead left young Harris feeling like the contemplation of the existential mystery in the presence of his own company was “a source of perfect misery.” This frustrating experience became “a sufficient provocation” that launched him into a lifelong pursuit of the kinds of transcendent experiences that gave rise to the world’s major spiritual traditions, examining them instead with a scientist’s vital blend of skepticism and openness and a philosopher’s aspiration to be “scrupulously truthful.”

Harris writes:

Our minds are all we have. They are all we have ever had. And they are all we can offer others… Every experience you have ever had has been shaped by your mind. Every relationship is as good or as bad as it is because of the minds involved.

Noting that the entirety of our experience, as well as our satisfaction with that experience, is filtered through our minds — “If you are perpetually angry, depressed, confused, and unloving, or your attention is elsewhere, it won’t matter how successful you become or who is in your life — you won’t enjoy any of it.” — Harris sets out to reconcile the quest to achieve one’s goals with a deeper longing, a recognition, perhaps, that presence is far more rewarding than productivity. He writes:

Most of us spend our time seeking happiness and security without acknowledging the underlying purpose of our search. Each of us is looking for a path back to the present: We are trying to find good enough reasons to be satisfied now.

Acknowledging that this is the structure of the game we are playing allows us to play it differently. How we pay attention to the present moment largely determines the character of our experience and, therefore, the quality of our lives.

This message, of course, is nothing new — half a century ago, Alan Watts made a spectacular case for it, building on millennia of Eastern philosophy. But what makes our era singular and this discourse particularly timely, Harris points out, is that there is now a growing body of scientific research substantiating these ancient intuitions, which he goes on to examine in fascinating detail.

Sample the book further with Harris on the paradox of meditation.

4. LETTERS OF NOTE

Virginia Woolf called letter-writing “the humane art” — an epithet only amplified today, in an age when we so frequently mistake reaction for response and succumb to expectations of immediacy that render impossible the beautiful, contemplative mutuality at the heart of the notion of co-respondence. This, perhaps, is why yesteryear’s greatest letters appeal to us more irrepressibly than ever.

For years, Shaun Usher has been unearthing and highlighting brilliant, funny, poignant, exquisitely human letters from luminaries and ordinary people alike on his magnificent website. This year, the best of them were released in Letters of Note: Correspondence Deserving of a Wider Audience (public library | IndieBound) — the aptly titled, superb collection featuring contributions from such cultural icons as Virginia Woolf, Roald Dahl, Louis Armstrong, Kurt Vonnegut, Nick Cave, Richard Feynman, Jack Kerouac, and more.

Sample this treasure trove further with E.B. White’s beautiful letter to a man who had lost faith in humanity, young Hunter S. Thompson’s advice to a friend on how to find one’s purpose and live a full life, comedian Bill Hicks’s piercing missive to a censoring priest on what freedom of speech really means, and Eudora Welty’s disarming job application to the New Yorker.

5. THE RISE

“You gotta be willing to fail… if you’re afraid of failing, you won’t get very far,” Steve Jobs cautioned. “There is no such thing as failure — failure is just life trying to move us in another direction,” Oprah counseled new Harvard graduates. In his wonderfully heartening letter of fatherly advice, F. Scott Fitzgerald gave his young daughter Scottie a list of things to worry and not worry about in life; among the unworriables, he listed failure, “unless it comes through your own fault.” And yet, as Debbie Millman observed in Fail Safe, her magnificent illustrated-essay-turned-commencement-address, most of us “like to operate within our abilities” — stepping outside of them risks failure, and we do worry about it, very much. How, then, can we transcend that mental block, that existential worry, that keeps us from the very capacity for creative crash that keeps us growing and innovating?

That’s precisely what curator and art historian Sarah Lewis, who has under her belt degrees from Harvard and Oxford, curatorial positions at the Tate Modern and the MoMA, and an appointment on President Obama’s Arts Policy Committee, examines in The Rise: Creativity, the Gift of Failure, and the Search for Mastery (public library | IndieBound) — an exploration of how “discoveries, innovations, and creative endeavors often, perhaps even only, come from uncommon ground” and why this “improbable ground of creative endeavor” is an enormous source of advantages on the path to self-actualization and fulfillment, brought to life through a tapestry of tribulations turned triumphs by such diverse modern heroes as legendary polar explorer Captain Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass. Lewis, driven by her lifelong “magpie curiosity about how we become,” crafts her argument slowly, meticulously, stepping away from it like a sculptor gaining perspective on her sculpture and examining it through other eyes, other experiences, other particularities, which she weaves together into an intricate tapestry of “magpielike borrowings” filtered through the sieve of her own point of view.

Female archers, lantern slide, c. 1920. (Public domain via Oregon State University Special Collections & Archives.)

Lewis begins with a visit with the women of Columbia University’s varsity archery team, who spend countless hours practicing a sport that requires equal parts impeccable precision of one’s aim and a level of comfort with the uncontrollable — all the environmental interferences, everything that could happen between the time the arrow leaves the bow and the time it lands on the target, having followed its inevitably curved line. From this unusual sport Lewis draws a metaphor for the core of human achievement:

There is little that is vocational about [contemporary] culture anymore, so it is rare to see what doggedness looks like with this level of exactitude… To spend so many hours with a bow and arrow is a kind of marginality combined with a seriousness of purpose rarely seen.

In the archers’ doggedness Lewis finds the central distinction that serves as a backbone of her book — far more important than success (hitting the bull’s-eye) is the attainment of mastery (“knowing it means nothing if you can’t do it again and again”), and in bridging the former with the latter lives the substance of true achievement. (The distinction isn’t unlike what psychologist Carol Dweck found in her pioneering work on the difference between “fixed” and “growth” mindsets.) Lewis writes:

Mastery requires endurance. Mastery, a word we don’t use often, is not the equivalent of what we might consider its cognate — perfectionism — an inhuman aim motivated by a concern with how others view us. Mastery is also not the same as success — an event-based victory based on a peak point, a punctuated moment in time. Mastery is not merely a commitment to a goal, but to a curved-line, constant pursuit.

This is why, Lewis argues, a centerpiece of mastery is the notion of failure. She cites Edison, who famously said of his countless fruitless attempts to create a feasible lightbulb: “I have not failed, I’ve just found 10,000 ways that won’t work.” In fact, Lewis points out that embedded in the very word “failure” — a word originally synonymous with bankruptcy, devised to assess creditworthiness in the 19th century, “a seeming dead end forced to fit human worth” — is the bias of our limited understanding of its value:

The word failure is imperfect. Once we begin to transform it, it ceases to be that any longer. The term is always slipping off the edges of our vision, not simply because it’s hard to see without wincing, but because once we are ready to talk about it, we often call the event something else — a learning experience, a trial, a reinvention — no longer the static concept of failure.

In its stead, Lewis offers another 19th-century alternative: “blankness,” which beautifully captures the wide-open field of possibility for renewal, for starting from scratch, after an unsuccessful attempt. Still, she considers the challenge of pinning down into plain language a concept so complex and fluid — even fashionable concepts like grit fail failure:

Trying to find a precise word to describe the dynamic is fleeting, like attempting to locate francium, an alkali metal measured but never isolated in any weighted quantity or seen in a way that the eye can detect — one of the most unstable, enigmatic elements on the Earth. No one knows what it looks like in an appreciable form, but there it is, scattered throughout ores in the Earth’s crust. Many of us have a similar sense that these implausible rises must be possible, but the stories tend to stay strewn throughout our lives, never coalescing into a single dynamic concept… The phenomenon remains hidden, and little discussed. Partial ideas do exist — resilience, reinvention, and grit — but there’s no one word to describe the passing yet vital, constant truth that just when it looks like winter, it is spring.

[…]

When we don’t have a word for an inherently fleeting idea, we speak about it differently, if at all. There are all sorts of generative circumstances — flops, folds, wipeouts, and hiccups — yet the dynamism it inspires is internal, personal, and often invisible… It is a cliché to say simply that we learn the most from failure. It is also not exactly true. Transformation comes from how we choose to speak about it in the context of story, whether self-stated or aloud.

One essential element of understanding the value of failure is the notion of the “deliberate incomplete.” (Cue in Marie Curie, who famously noted in a letter to her brother: “One never notices what has been done; one can only see what remains to be done.”) Lewis writes:

We thrive, in part, when we have purpose, when we still have more to do. The deliberate incomplete has long been a central part of creation myths themselves. In Navajo culture, some craftsmen and women sought imperfection, giving their textiles and ceramics an intended flaw called a “spirit line” so that there is a forward thrust, a reason to continue making work. Nearly a quarter of twentieth century Navajo rugs have these contrasting-color threads that run out from the inner pattern to just beyond the border that contains it; Navajo baskets and often pottery have an equivalent line called a “heart line” or a “spirit break.” The undone pattern is meant to give the weaver’s spirit a way out, to prevent it from getting trapped and reaching what we sense is an unnatural end.

There is an inevitable incompletion that comes with mastery. It occurs because the greater our proficiency, the more smooth our current path, the more clearly we may spot the mountain that hovers in our gaze. “What would you say increases with knowledge?” Jordan Elgrably once asked James Baldwin. “You learn how little you know,” Baldwin said.

A related concept is that of the “near win” — those moments when we come so close to our aim, yet miss it by a hair:

At the point of mastery, when there seems nothing left to move beyond, we find a way to move beyond ourselves. Success motivates. Yet the near win — the constant auto-correct of a curved-line path — can propel us in an ongoing quest. We see it whenever we aim, climb, or create with mastery as our aim, when the outcome is determined by what happens at the margins.

Lewis goes on to illustrate these concepts with living examples from the stories of such pioneering figures as the great polar explorer Captain Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass. Read more here.

6. THE ACCIDENTAL UNIVERSE

It says something about physicist and writer Alan Lightman — the very first person to receive dual appointments in science and the humanities at MIT — that a book of his is not only among the best science books of the year, but also a masterwork of philosophy. But that is precisely what The Accidental Universe: The World You Thought You Knew (public library | IndieBound) is — a spectacular journey to the frontiers of theoretical physics, exploring how the possibility of multiple universes illuminates the heart of the human experience and our quest for Beauty, Truth, and Meaning. Lightman’s enchanting writing reveals him not only as a scientist of towering expertise, but also as an insightful philosopher and poet of the cosmos, partway between Seneca and Carl Sagan.

In the foreword, Lightman recounts attending a lecture by the Dalai Lama at MIT, “one of the world’s spiritual leaders sitting cross-legged in a modern temple of science,” and hearing about the Buddhist concept of sunyata, translated as “emptiness” — the notion that objects in the physical universe are vacant of inherent meaning and that we imbue them with meaning and value with the thoughts of our own minds. From this, Lightman argues while adding to history’s finest definitions of science, arises a central challenge of the human condition:

As a scientist, I firmly believe that atoms and molecules are real (even if mostly empty space) and exist independently of our minds. On the other hand, I have witnessed firsthand how distressed I become when I experience anger or jealousy or insult, all emotional states manufactured by my own mind. The mind is certainly its own cosmos. As Milton wrote in Paradise Lost, “[The mind] can make a heaven of hell or a hell of heaven.” In our constant search for meaning in this baffling and temporary existence, trapped as we are within our three pounds of neurons, it is sometimes hard to tell what is real. We often invent what isn’t there. Or ignore what is. We try to impose order, both in our minds and in our conceptions of external reality. We try to connect. We try to find truth. We dream and we hope. And underneath all of these strivings, we are haunted by the suspicion that what we see and understand of the world is only a tiny piece of the whole.

[…]

Science does not reveal the meaning of our existence, but it does draw back some of the veils.

Lightman goes on to explore the relationship between science and spirituality, how dark energy explains why we exist, and what our yearning for immortality tells us about the universe.

7. SMALL VICTORIES

Beyond having written one of the finest books on writing ever published, Anne Lamott embraces language and life with equal zest, squeezing from the intersection wisdom of the most soul-stretching kind. Small Victories: Spotting Improbable Moments of Grace (public library | IndieBound) shines a sidewise gleam at Lamott’s much-loved meditations on why perfectionism kills creativity and how we keep ourselves small by people-pleasing to explore the boundless blessings of our ample imperfections, from which our most expansive and transcendent humanity springs.

Amid her moving reflections on grief, grace, and gratitude is one especially enchanting essay titled “The Book of Welcome,” in which Lamott considers the uncomfortable art of letting yourself be seen:

Trappings and charm wear off… Let people see you. They see your upper arms are beautiful, soft and clean and warm, and then they will see this about their own, some of the time. It’s called having friends, choosing each other, getting found, being fished out of the rubble. It blows you away, how this wonderful event ever happened — me in your life, you in mine.

Two parts fit together. This hadn’t occurred all that often, but now that it does, it’s the wildest experience. It could almost make a believer out of you. Of course, life will randomly go to hell every so often, too. Cold winds arrive and prick you: the rain falls down your neck: darkness comes. But now there are two of you: Holy Moly.

Read more here and here.

8. THE TRUTH ABOUT TRUST

“When you trust people to help you, they often do,” Amanda Palmer asserted in her beautiful meditation on the art of asking without shame. But what does it really mean to “trust,” and perhaps more importantly, how can we live with the potential heartbreak that lurks in the gap between “often” and “always”? That’s precisely what psychologist David DeSteno, director of Northeastern University’s Social Emotions Lab, explores in The Truth About Trust: How It Determines Success in Life, Love, Learning, and More (public library | IndieBound).

DeSteno, who has previously studied the osmosis of good and evil in all of us and the psychology of compassion and resilience, argues that matters of trust occupy an enormous amount of our mental energies and influence, directly or indirectly, practically every aspect of our everyday lives. But trust is a wholly different animal from the majority of our mental concerns. DeSteno writes:

Unlike many other puzzles we confront, questions of trust don’t just involve attempting to grasp and analyze a perplexing concept. They all share another characteristic: risk. So while it’s true that we turn our attention to many complex problems throughout our lives, finding the answers to most doesn’t usually involve navigating the treacherous landscape of our own and others’ competing desires.

[…]

Trust implies a seeming unknowable — a bet of sorts, if you will. At its base is a delicate problem centered on the balance between two dynamic and often opposing desires — a desire for someone else to meet your needs and his desire to meet his own.

But despite what pop culture may tell us, decades’ worth of attempts to decode the signals of trustworthiness — sought in everything from facial expression to voice to handwriting — have proven virtually useless, and the last five years of research have rendered previous assertions about certain nonverbal cues wrong. (No, a sideways glance doesn’t automatically indicate that the person is lying to you.) As DeSteno wryly observes, “If polygraphs were foolproof, we wouldn’t need juries.” He explains what makes measures of trust especially complicated:

Unlike many forms of communication, issues of trust are often characterized by a competition or battle…. It’s not always an adaptive strategy to be an open book to others, or even to ourselves. Consequently, trying to discern if someone can be trusted is fundamentally different from trying to assess characteristics like mathematical ability. … Deciding to be trustworthy depends on the momentary balance between competing mental forces pushing us in opposite directions, and being able to predict which of those forces is going to prevail in any one instance is a complicated business.

[…]

Contrary to long-held doctrine, isolated gestures and expressions aren’t reliable indicators of what a person feels or intends to do. Two types of context — what I call configural and situational — are essential for correct interpretation. And they’ve been missing in most attempts to discover what trustworthiness and its opposite look like.

To figure out this multifaceted puzzle, DeSteno, whose lab studies how emotional states shape our social and moral behavior, took a cross-disciplinary approach, turning to the work of economists, computer scientists, security officers, physiologists and other psychologists, and enlisting the direct help of social psychologist David Pizarro and economist Robert Frank. With combined expertise spanning behavioral economics, evolutionary biology, nonverbal behavior, and emotional biases in decision making, they built, with equal parts rigor and humility, the richest framework for understanding trust that science has yet produced. Specifically, they focused on the two main components of trust — how it works and whether we’re able to predict who deserves it. DeSteno writes:

In the end, what emerged are not only new insights into how to detect the trustworthiness of others, but also an entirely new way to think about how trust influences our lives, our success, and our interactions with those around us.

Read more here.

9. THE ART OF ASKING

“Have compassion for everyone you meet, even if they don’t want it,” Lucinda Williams sang from my headphones into my heart one rainy October morning on the train to Hudson. “What seems cynicism is always a sign, always a sign…” I was headed to Hudson for a conversation with a very different but no less brilliant musician, and a longtime kindred spirit — the talented and kind Amanda Palmer. In an abandoned schoolhouse across the street from her host’s home, we sat down to talk about her magnificent and culturally necessary new book, The Art of Asking: How I Learned to Stop Worrying and Let People Help (public library | IndieBound) — a beautifully written inquiry into why we have such a hard time accepting compassion in all of its permutations, from love to what it takes to make a living, what lies behind our cynicism in refusing it, and how learning to accept it makes possible the greatest gifts of our shared humanity.

I am partial, perhaps, because my own sustenance depends on accepting help. But I also deeply believe and actively partake in both the yin and the yang of that vitalizing osmosis of giving and receiving that keeps today’s creative economy alive, binding artists and audiences, writers and readers, musicians and fans, into the shared cause of creative culture. “It’s only when we demand that we are hurt,” Henry Miller wrote in contemplating the circles of giving and receiving in 1942, but we still seem woefully caught in the paradoxical trap of too much entitlement to what we feel we want and too little capacity to accept what we truly need. The unhinging of that trap is what Amanda explores with equal parts deep personal vulnerability, profound insight into the private and public lives of art, and courageous conviction about the future of creative culture.

The most urgent clarion call echoing throughout the book, which builds on Amanda’s terrific TED talk, is for loosening our harsh and narrow criteria for what it means to be an artist, and, most of all, for undoing our punishing ideas about what renders one a not-artist, or — worse yet — a not-artist-enough. Amanda writes of the anguishing Impostor Syndrome epidemic such limiting notions spawn:

People working in the arts engage in street combat with The Fraud Police on a daily basis, because much of our work is new and not readily or conventionally categorized. When you’re an artist, nobody ever tells you or hits you with the magic wand of legitimacy. You have to hit your own head with your own handmade wand. And you feel stupid doing it.

There’s no “correct path” to becoming a real artist. You might think you’ll gain legitimacy by going to university, getting published, getting signed to a record label. But it’s all bullshit, and it’s all in your head. You’re an artist when you say you are. And you’re a good artist when you make somebody else experience or feel something deep or unexpected.

But in the history of creative genius, this pathology appears to be a rather recent development — the struggle to be an artist, of course, is nothing new, but the struggle to believe one is an artist seems to be a uniquely modern malady. In one of the most revelatory passages in the book, Amanda points out a little-known biographical detail about the life of Henry David Thoreau — he who decided to live the self-reliant life by Walden Pond and memorably proclaimed: “If the day and the night are such that you greet them with joy, and life emits a fragrance like flowers and sweet-scented herbs, is more elastic, more starry, more immortal — that is your success.” It is a detail that, today, would undoubtedly render Thoreau the target of that automatic privilege narrative as we point a finger and call him a “poser”:

Thoreau wrote in painstaking detail about how he chose to remove himself from society to live “by his own means” in a little 10-foot x 15-foot hand-hewn cabin on the side of a pond. What he left out of Walden, though, was the fact that the land he built on was borrowed from his wealthy neighbor, that his pal Ralph Waldo Emerson had him over for dinner all the time, and that every Sunday, Thoreau’s mother and sister brought over a basket of freshly-baked goods for him, including donuts.

The idea of Thoreau gazing thoughtfully over the expanse of transcendental Walden Pond, a bluebird alighting onto his threadbare shoe, all the while eating donuts that his mom brought him just doesn’t jibe with most people’s picture of him as a self-reliant, noble, marrow-sucking back-to-the-woods folk-hero.

If Thoreau were living today, steeped in a culture that tells him taking the donuts chips away at his credibility, would he take them? And why don’t we? Amanda writes:

Taking the donuts is hard for a lot of people.

It’s not the act of taking that’s so difficult, it’s more the fear of what other people are going to think when they see us slaving away at our manuscript about the pure transcendence of nature and the importance of self-reliance and simplicity. While munching on someone else’s donut.

Maybe it comes back to that same old issue: we just can’t see what we do as important enough to merit the help, the love.

Try to picture getting angry at Einstein devouring a donut brought to him by his assistant, while he sat slaving on the theory of relativity. Try to picture getting angry at Florence Nightingale for snacking on a donut while taking a break from tirelessly helping the sick.

To the artists, creators, scientists, non-profit-runners, librarians, strange-thinkers, start-uppers and inventors, to all people everywhere who are afraid to accept the help, in whatever form it’s appearing,

Please, take the donuts.

To the guy in my opening band who was too ashamed to go out into the crowd and accept money for his band,

Take the donuts.

To the girl who spent her twenties as a street performer and stripper living on less than $700 a month who went on to marry a best-selling author who she loves, unquestioningly, but even that massive love can’t break her unwillingness to accept his financial help, please….

Everybody.

Please.

Just take the fucking donuts.

But Thoreau, it turns out, got one thing right in his definition of success, which emanates from Amanda’s words a century and a half later:

The happiest artists I know are generally the ones who can manage to make a reasonable living from their art without having to worry too much about the next paycheck. Not to say that every artist who sits around the campfire, or plays in tiny bars, is “happier” than those singing in stadiums — but more isn’t always better. If feeling the connection between yourself and others is the ultimate goal it can be harder when you are separated from the crowd by a 30-foot barrier. And it can be easier to do — though riskier — when they’re sitting right beside you. The ideal sweet spot is the one in which the artist can freely share their talents and directly feel the reverberations of their artistic gifts to their community. In other words, it works best when everybody feels seen.

As artists, and as humans: If your fear is scarcity, the solution isn’t necessarily abundance.

Read more and watch my conversation with Palmer here.

10. LEONARDO’S BRAIN

One September day in 2008, Leonard Shlain found himself having trouble buttoning his shirt with his right hand. He was admitted into the emergency room, diagnosed with Stage 4 brain cancer, and given nine months to live. Shlain — a surgeon by training and a self-described “synthesizer by nature” with an intense interest in the ennobling intersection of art and science, author of the now-legendary Art & Physics — had spent the previous seven years working on what he considered his magnum opus: a sort of postmortem brain scan of Leonardo da Vinci, performed six centuries after his death and fused with a detective story about his life, exploring what the unique neuroanatomy of the man commonly considered humanity’s greatest creative genius might reveal about the essence of creativity itself.

Shlain finished the book on May 3, 2009. He died a week later. His three children — Kimberly, Jordan, and filmmaker Tiffany Shlain — spent the next five years bringing their father’s final legacy to life. The result is Leonardo’s Brain: Understanding Da Vinci’s Creative Genius (public library | IndieBound) — an astonishing intellectual, and at times spiritual, journey into the center of human creativity via the particular brain of one undereducated, left-handed, nearly ambidextrous, vegetarian, pacifist, gay, singularly creative Renaissance male, who Shlain proposes was able to attain a different state of consciousness than “practically all other humans.”

Illustration by Ralph Steadman from ‘I, Leonardo.’

Noting that “a writer is always refining his ideas,” Shlain points out that the book is a synthesis of his three previous books, and an effort to live up to Kafka’s famous proclamation that “a book must be the axe for the frozen sea inside us.” It is also a beautiful celebration of the idea that art and science belong together and enrich one another whenever they converge.

Shlain argues that Leonardo — who painted the eternally mysterious Mona Lisa, created visionary anatomical drawings long before medical anatomy existed, made observations of bird flight in greater detail than any previous scientist, mastered engineering, architecture, mathematics, botany, and cartography, might be considered history’s first true scientist long before the word was coined for Mary Somerville, presaged Newton’s Third Law, Bernoulli’s law, and elements of chaos theory, and was a deft composer who sang “divinely,” among countless other domains of mastery — is the individual most worthy of the title “genius” in both science and art:

The divergent flow of art and science in the historical record provides evidence of a distinct compartmentalization of genius. The river of art rarely intersected with the meander of science.

[…]

Although both art and science require a high degree of creativity, the difference between them is stark. For visionaries to change the domain of art, they must make a breakthrough that can only be judged through the lens of posterity. Great science, on the other hand, must be able to predict the future. If a scientist’s hypotheses cannot be turned into a law that can be verified by future investigators, it is not scientifically sound. Another contrast: Art and science represent the difference between “being” and “doing.” Art’s raison d’être is to evoke an emotion. Science seeks to solve problems by advancing knowledge.

[…]

Leonardo’s story continues to compel because he represents the highest excellence all of us lesser mortals strive to achieve — to be intellectually, creatively, and emotionally well-rounded. No other individual in the known history of the human species attained such distinction both in science and art as the hyper-curious, undereducated, illegitimate country boy from Vinci.

Using a wealth of available information from Leonardo’s notebooks, various biographical resources, and some well-reasoned speculation, Shlain goes on to perform a “posthumous brain scan” seeking to illuminate the unique wiring of Da Vinci’s brain and how it explains his unparalleled creativity.

Peek inside his findings here.

11. THE ART OF STILLNESS

“Faith is the ability to honor stillness at some moments,” Alan Lightman wrote in his sublime meditation on science and spirituality, “and at others to ride the passion and exuberance.” In his conversation with E.O. Wilson, the poet Robert Hass described beauty as a “paradox of stillness and motion.” But in our Productivity Age of perpetual motion, it’s increasingly hard — yet increasingly imperative — to honor stillness, to build pockets of it into our lives, so that our faith in beauty doesn’t become half-hearted, lopsided, crippled. The delicate bridling of that paradox is what novelist and essayist Pico Iyer explores in The Art of Stillness: Adventures in Going Nowhere (public library | IndieBound) — a beautifully argued case for the unexpected pleasures of “sitting still as a way of falling in love with the world and everything in it,” revealed through one man’s sincere record of learning to “take care of his loved ones, do his job, and hold on to some direction in a madly accelerating world.”

Iyer begins by recounting a snaking drive up the San Gabriel Mountains outside Los Angeles to visit his boyhood hero — legendary singer-songwriter Leonard Cohen. In 1994, shortly after the most revealing interview he ever gave, Cohen had moved to the Mt. Baldy Zen Center to embark on five years of seclusion, serving as personal assistant to the great Japanese Zen teacher Kyozan Joshu Sasaki, then in his late eighties. Midway through his time at the Zen Center, Cohen was ordained as a Rinzai Zen Buddhist monk and given the Dharma name Jikan — Japanese for “silence.” Iyer writes:

I’d come up here in order to write about my host’s near-silent, anonymous life on the mountain, but for the moment I lost all sense of where I was. I could hardly believe that this rabbinical-seeming gentleman in wire-rimmed glasses and wool cap was in truth the singer and poet who’d been renowned for thirty years as an international heartthrob, a constant traveler, and an Armani-clad man of the world.

Cohen, who once described the hubbub of his ordinary state of mind as “very much like the waiting room at the DMV,” had sought in the sequestered Zen community a more extreme, more committed version of a respite most of us long for in the midst of modern life — at least at times, at least on some level, and often wholeheartedly, achingly. Iyer reflects on Cohen’s particular impulse and what it reveals about our shared yearning:

Leonard Cohen had come to this Old World redoubt to make a life — an art — out of stillness. And he was working on simplifying himself as fiercely as he might on the verses of one of his songs, which he spends more than ten years polishing to perfection. The week I was visiting, he was essentially spending seven days and nights in a bare meditation hall, sitting stock-still. His name in the monastery, Jikan, referred to the silence between two thoughts.

[…]

One evening — four in the morning, the end of December — Cohen took time out from his meditations to walk down to my cabin and try to explain what he was doing here.

Sitting still, he said with unexpected passion, was “the real deep entertainment” he had found in his sixty-one years on the planet. “Real profound and voluptuous and delicious entertainment. The real feast that is available within this activity.”

Was he kidding? Cohen is famous for his mischief and ironies.

He wasn’t, I realized as he went on. “What else would I be doing?” he asked. “Would I be starting a new marriage with a young woman and raising another family? Finding new drugs, buying more expensive wine? I don’t know. This seems to me the most luxurious and sumptuous response to the emptiness of my own existence.”

Typically lofty and pitiless words; living on such close terms with silence clearly hadn’t diminished his gift for golden sentences. But the words carried weight when coming from one who seemed to have tasted all the pleasures that the world has to offer.

Iyer regards his encounter with Cohen with the same incredulous amazement that most of us modern cynics experience, at first reluctantly, when confronted with something or someone incomprehensibly earnest, for nothing dissolves snark like unflinching sincerity. For Cohen, Iyer observes, the Zen practice was not a matter of “piety or purity” but of practical salvation and refuge from “the confusion and terror that had long been his bedfellows.” Iyer writes:

Sitting still with his aged Japanese friend, sipping Courvoisier, and listening to the crickets deep into the night, was the closest he’d come to finding lasting happiness, the kind that doesn’t change even when life throws up one of its regular challenges and disruptions.

“Nothing touches it,” Cohen said, as the light came into the cabin, of sitting still… Going nowhere, as Cohen described it, was the grand adventure that makes sense of everywhere else.

A century after Bertrand Russell admonished that the conquest of leisure and health would be of no use if no one remembers how to use them, Iyer points to the empirical paradox of modern time: we have more of it than ever, yet feel we have less. Citing a sociological study of time diaries that found Americans were working fewer hours than they had 30 years earlier but felt as if they were working more, he writes:

We’ve lost our Sundays, our weekends, our nights off — our holy days, as some would have it; our bosses, junk mailers, our parents can find us wherever we are, at any time of day or night. More and more of us feel like emergency-room physicians, permanently on call, required to heal ourselves but unable to find the prescription for all the clutter on our desk.

As most of us would begrudgingly admit, not without some necessary tussle with denial and rationalization, the challenge of staying present in the era of productivity is in no small part a product of our age itself. Iyer captures this elegantly:

Not many years ago, it was access to information and movement that seemed our greatest luxury; nowadays it’s often freedom from information, the chance to sit still, that feels like the ultimate prize. Stillness is not just an indulgence for those with enough resources — it’s a necessity for anyone who wishes to gather less visible resources. Going nowhere, as Cohen had shown me, is not about austerity so much as about coming closer to one’s senses.

Much like we find ourselves by getting lost, Iyer suggests, we inhabit the world more fully by mindfully vacating its mayhem:

Going nowhere … isn’t about turning your back on the world; it’s about stepping away now and then so that you can see the world more clearly and love it more deeply.

Read more about how to do that here.

12. ANIMAL MADNESS

If the notion of mental illness in animals seems like far-fetched anthropocentrism, a field of science that has been gathering momentum for more than 150 years strongly suggests otherwise. That’s precisely what Senior TED Fellow Laurel Braitman explores in Animal Madness: How Anxious Dogs, Compulsive Parrots, and Elephants in Recovery Help Us Understand Ourselves (public library | IndieBound). Braitman, who holds a Ph.D. in history and anthropology of science from MIT, argues that we humans are far from unique in our capacity for “emotional thunderstorms that make our lives more difficult” and that nonhuman animals are bedeviled by varieties of mental illness strikingly similar to our own. With equal parts rigor and compassion, she examines evidence from veterinary science, psychology and pharmacology research, first-hand accounts by neuroscientists, zoologists, animal trainers, and other experts, the work of legendary scientists and philosophers like Charles Darwin and René Descartes, and her own experience with dozens of animals spanning a multitude of species and mental health issues, from depressed dogs to self-harming dolphins to canine Alzheimer’s and PTSD.

Braitman’s journey begins with one particularly troubled nonhuman animal — Oliver, the Bernese Mountain Dog she adopted, whose “extreme fear, anxiety, and compulsions” prompted her, in the way that a concerned parent on the verge of despair grasps for answers, to explore whether and how other animals could be mentally ill. Considering the tapestry of evidence threads she uncovered during her research, she writes:

Humans and other animals are more similar than many of us might think when it comes to mental states and behaviors gone awry — experiencing churning fear, for example, in situations that don’t call for it, feeling unable to shake a paralyzing sadness, or being haunted by a ceaseless compulsion to wash our hands or paws. Abnormal behaviors like these tip into the territory of mental illness when they keep creatures — human or not — from engaging in what is normal for them. This is true for a dog single-mindedly focused on licking his tail until it’s bare and oozy, a sea lion fixated on swimming in endless circles, a gorilla too sad and withdrawn to play with her troop members, or a human so petrified of escalators he avoids department stores.

Every animal with a mind has the capacity to lose hold of it from time to time. Sometimes the trigger is abuse or mistreatment, but not always. I’ve come across depressed and anxious gorillas, compulsive horses, rats, donkeys, and seals, obsessive parrots, self-harming dolphins, and dogs with dementia, many of whom share their exhibits, homes, or habitats with other creatures who don’t suffer from the same problems. I’ve also gotten to know curious whales, confident bonobos, thrilled elephants, contented tigers, and grateful orangutans. There is plenty of abnormal behavior in the animal world, captive, domestic, and wild, and plenty of evidence of recovery; you simply need to know where and how to find it.

Braitman is careful to acknowledge that such a notion is likely to unsettle our notions of human exceptionalism and offers a wise caveat:

Acknowledging parallels between human and other animal mental health is a bit like recognizing capacities for language, tool use, and culture in other creatures. That is, it’s a blow to the idea that humans are the only animals to feel or express emotion in complex and surprising ways. It is also anthropomorphic, the projection of human emotions, characteristics, and desires onto nonhuman beings or things. We can choose, though, to anthropomorphize well and, by doing so, make more accurate interpretations of animals’ behavior and emotional lives. Instead of self-centered projection, anthropomorphism can be a recognition of bits and pieces of our human selves in other animals and vice versa.

Braitman goes on to trace how our evolving understanding of animal psychology, from Charles Darwin to Jane Goodall, sheds invaluable light on things of deep concern to us humans — notions like anxiety, altruism, depression, and happiness. Read more here.

13. TRYING NOT TO TRY

“The best way to get approval is not to need it,” Hugh MacLeod memorably counseled. We now know that perfectionism kills creativity and excessive goal-setting limits our success rather than begetting it — all different manifestations of the same deeper paradox of the human condition, at once disconcerting and comforting, which Edward Slingerland, professor of Asian Studies and Embodied Cognition at the University of British Columbia and a renowned scholar of Chinese thought, explores in Trying Not to Try: The Art and Science of Spontaneity (public library | IndieBound).

Slingerland frames the paradoxical premise at the heart of his book with an illustrative example: a game called Mindball at his local science museum in Vancouver, in which two players sit opposite one another, each wearing an electrode-equipped headband that registers general activity in the brain, and try to mentally push a metal ball from the center of the table to the other player; whoever does this first wins. There is, of course, a rub:

The motive force — measured by each player’s electrodes, and conveyed to the ball by a magnet hidden underneath the table—is the combination of alpha and theta waves produced by the brain when it’s relaxed: the more alpha and theta waves you produce, the more force you mentally exert on the ball. Essentially, Mindball is a contest of who can be the most calm. It’s fun to watch. The players visibly struggle to relax, closing their eyes, breathing deeply, adopting vaguely yogic postures. The panic they begin to feel as the ball approaches their end of the table is usually balanced out by the overeagerness of their opponent, both players alternately losing their cool as the big metal ball rolls back and forth. You couldn’t wish for a better, more condensed illustration of how difficult it is to try not to try.

Our lives, Slingerland argues, are often like “a massive game of Mindball,” in which we find ourselves continually caught in a loop of trying so hard that we stymie our own efforts. As in Mindball, where victory comes only when the player relaxes and stops trying to win, we spend our lives “preoccupied with effort, the importance of working, striving, and trying,” only to find that the more we try to will things into manifesting, the more elusive they become. Slingerland writes:

Our excessive focus in the modern world on the power of conscious thought and the benefits of willpower and self-control causes us to overlook the pervasive importance of what might be called “body thinking”: tacit, fast, and semiautomatic behavior that flows from the unconscious with little or no conscious interference. The result is that we too often devote ourselves to pushing harder or moving faster in areas of our life where effort and striving are, in fact, profoundly counterproductive.

Art by Austin Kleon from ‘Show Your Work.’

Some of the most elusive objects of our incessant pursuits are happiness and spontaneity, both of which are strikingly resistant to conscious pursuit. Two ancient Chinese concepts might be our most powerful tools for resolving this paradox — wu-wei (pronounced oooo-way) and de (pronounced duh). Slingerland explains:

Wu-wei literally translates as “no trying” or “no doing,” but it’s not at all about dull inaction. In fact, it refers to the dynamic, effortless, and unselfconscious state of mind of a person who is optimally active and effective. People in wu-wei feel as if they are doing nothing, while at the same time they might be creating a brilliant work of art, smoothly negotiating a complex social situation, or even bringing the entire world into harmonious order. For a person in wu-wei, proper and effective conduct follows as automatically as the body gives in to the seductive rhythm of a song. This state of harmony is both complex and holistic, involving as it does the integration of the body, the emotions, and the mind. If we have to translate it, wu-wei is probably best rendered as something like “effortless action” or “spontaneous action.” Being in wu-wei is relaxing and enjoyable, but in a deeply rewarding way that distinguishes it from cruder or more mundane pleasures.

Read more here.

14. MY AGE OF ANXIETY

“Anxiety … makes others feel as you might when a drowning man holds on to you,” Anaïs Nin wrote. “Anxiety may be compared with dizziness. He whose eye happens to look down the yawning abyss becomes dizzy,” Kierkegaard observed. “There is no question that the problem of anxiety is a nodal point at which the most various and important questions converge, a riddle whose solution would be bound to throw a flood of light on our whole mental existence,” Freud proclaimed in his classic introductory lectures on psychoanalysis. And yet the riddle of anxiety is far from solved — rather, it has swelled into a social malady pulling countless numbers of us underwater daily. Among those most mercilessly fettered by anxiety’s grip is Scott Stossel, familiar to most as the editor of The Atlantic. In his superb mental health memoir, My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind (public library | IndieBound), Stossel follows in the tradition of Montaigne, using the lens of his own experience as a prism for illuminating insight into the quintessence of our shared struggles with anxiety. From his personal memoir he weaves a cultural one, painting a portrait of anxiety through history, philosophy, religion, popular culture, literature, and a wealth of groundbreaking research in psychology and neuroscience.

Why? Because anxiety and its related psychoemotional disorders turn out to be the most prevalent and most undertreated form of clinically classified mental illness today, even more common than depression. Stossel contextualizes the issue with some striking statistics that reveal the cost — both financial and social — of anxiety:

According to the National Institute of Mental Health, some forty million Americans, nearly one in seven of us, are suffering from some kind of anxiety disorder at any given time, accounting for 31 percent of the expenditures on mental health care in the United States. According to recent epidemiological data, the “lifetime incidence” of anxiety disorder is more than 25 percent — which, if true, means that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetimes. And it is debilitating: Recent academic papers have argued that the psychic and physical impairment tied to living with an anxiety disorder is equivalent to living with diabetes — usually manageable, sometimes fatal, and always a pain to deal with. A study published in The American Journal of Psychiatry in 2006 found that Americans lose a collective 321 million days of work because of anxiety and depression each year, costing the economy $50 billion annually; a 2001 paper published by the U.S. Bureau of Labor Statistics once estimated that the median number of days missed each year by American workers who suffer from anxiety or stress disorders is twenty-five. In 2005 — three years before the recent economic crisis hit — Americans filled fifty-three million prescriptions for just two antianxiety drugs: Ativan and Xanax. (In the weeks after 9/11, Xanax prescriptions jumped 9 percent nationally — and by 22 percent in New York City.) In September 2008, the economic crash caused prescriptions in New York City to spike: as banks went belly up and the stock market went into free fall, prescriptions for anti-depressant and antianxiety medications increased 9 percent over the year before, while prescriptions for sleeping pills increased 11 percent.

[…]

Few people today would dispute that chronic stress is a hallmark of our times or that anxiety has become a kind of cultural condition of modernity. We live, as has been said many times since the dawn of the atomic era, in an age of anxiety — and that, cliché though it may be, seems only to have become more true in recent years as America has been assaulted in short order by terrorism, economic calamity and disruption, and widespread social transformation.

Fittingly, Alan Watts’s The Wisdom of Insecurity: A Message for an Age of Anxiety, written in the very atomic era that sparked the dawn of our present predicament, remains one of the best meditations on the subject. But, as Stossel points out, the notion of anxiety as a clinical category appeared only about thirty years ago. He traces anxiety’s rise to cultural prominence through the annals of academic history, pointing out that there were only three academic papers published on the subject in 1927, only fourteen in 1941, and thirty-seven in 1950. It wasn’t until psychologist Rollo May published his influential treatise on anxiety in 1950 that academia paid heed. Today, a simple Google Scholar search returns nearly three million results, and entire academic journals are dedicated to anxiety.

But despite anxiety’s catapulting into cultural concern, our understanding of it — especially as far as mental health stereotypes are concerned — remains developmentally stunted, having evolved very little since the time of seventeenth-century Jewish-Dutch philosopher Baruch Spinoza, who asserted that anxiety was a mere problem of logic and could thus be resolved with tools of reason. Stossel counters such oversimplification with a case for layered, complex causality of the disorder:

The truth is that anxiety is at once a function of biology and philosophy, body and mind, instinct and reason, personality and culture. Even as anxiety is experienced at a spiritual and psychological level, it is scientifically measurable at the molecular level and the physiological level. It is produced by nature and it is produced by nurture. It’s a psychological phenomenon and a sociological phenomenon. In computer terms, it’s both a hardware problem (I’m wired badly) and a software problem (I run faulty logic programs that make me think anxious thoughts). The origins of a temperament are many faceted; emotional dispositions that may seem to have a simple, single source — a bad gene, say, or a childhood trauma — may not.

Heraclitus

Panta rhei, “everything flows”

Πάντα ῥεῖ (panta rhei), “everything flows”, either was not spoken by Heraclitus or did not survive as a quotation of his. This famous aphorism used to characterize Heraclitus’ thought comes from Simplicius,[32] a Neoplatonist, and from Plato’s Cratylus. The word rhei (cf. rheology) is the Greek word for “to stream”, and is related to the etymology of Rhea according to Plato’s Cratylus.[33]

Heraclitus by Hendrick ter Brugghen

The philosophy of Heraclitus is summed up in his cryptic utterance:[34]

ποταμοῖσι τοῖσιν αὐτοῖσιν ἐμϐαίνουσιν, ἕτερα καὶ ἕτερα ὕδατα ἐπιρρεῖ.
Potamoisi toisin autoisin embainousin, hetera kai hetera hudata epirrei
“Ever-newer waters flow on those who step into the same rivers.”

The quote from Heraclitus appears in Plato’s Cratylus twice; in 401d as:[35]

τὰ ὄντα ἰέναι τε πάντα καὶ μένειν οὐδέν
Ta onta ienai te panta kai menein ouden
“All entities move and nothing remains still”

and in 402a:[36]

“πάντα χωρεῖ καὶ οὐδὲν μένει” καὶ “δὶς ἐς τὸν αὐτὸν ποταμὸν οὐκ ἂν ἐμβαίης”
Panta chōrei kai ouden menei kai dis es ton auton potamon ouk an embaies
“Everything changes and nothing remains still … and … you cannot step twice into the same stream”[37]

Instead of “flow”, Plato uses chōrei, “to change place” (cf. chōros, “space”).

The assertions of flow are coupled in many fragments with the enigmatic river image:[38]

Ποταμοῖς τοῖς αὐτοῖς ἐμβαίνομέν τε καὶ οὐκ ἐμβαίνομεν, εἶμέν τε καὶ οὐκ εἶμεν.
“We both step and do not step in the same rivers. We are and are not.”

Compare with the Latin adages Omnia mutantur and Tempora mutantur (8 CE), the Japanese tale Hōjōki (1212 CE), which contains the same image of the changing river, and the central Buddhist doctrine of impermanence.

 

Also in Ovid’s ‘Metamorphoses’:

Bk XV:176-198 Pythagoras’s Teachings: The Eternal Flux

 

‘Since I have embarked on the wide ocean, and given full sails to the wind, I say there is nothing in the whole universe that persists. Everything flows, and is formed as a fleeting image. Time itself, also, glides, in its continual motion, no differently than a river. For neither the river, nor the swift hour can stop: but as wave impels wave, and as the prior wave is chased by the coming wave, and chases the one before, so time flees equally, and, equally, follows, and is always new. For what was before is left behind: and what was not comes to be: and each moment is renewed.’

 

And from Ovid to Shakespeare:

We have to look no further than Shakespeare (who read Ovid in Golding’s translation) to confirm this point. Shakespeare quite literally plundered Ovid for stories and moved them directly into his plays – in Titus Andronicus or A Midsummer Night’s Dream, for example – and, like so many of his contemporaries, used Ovid as a sort of handbook for classical allusions and similes (as sad as Niobe, as crafty as Ulysses, as vain as Narcissus, as impetuous as Phaethon, as foolish as Icarus, and so on). Shakespeare lifts whole speeches from Ovid and adapts them to his purposes: for example, Prospero’s famous invocation of the spirits in The Tempest is adapted directly from Medea’s similar speech in the Metamorphoses (a speech Shakespeare had used before, in Macbeth). In Shakespeare’s early work, something like three quarters of the classical imagery is derived directly from Ovid’s poem. And if we want to see modern poets doing the same thing, we have only to look at, say, Eliot’s Waste Land, in which images from and references to Ovid are just as frequent. In fact, if one wants to have any sort of historical appreciation for the development of English poetry, understanding the influence of and the reference to Ovid is essential.

How Art Can Save Your Soul | Brain Pickings.

“Art can be a source of help with our problems — our innermost problems — the problems of the soul.”

“Art holds out the promise of inner wholeness,” British philosopher Alain de Botton wrote in Art as Therapy (public library), one of the best art books of 2013. He expounds the premise of the book in this fantastic “Sunday sermon” from The School of Life — the lecture series de Botton founded in 2008, premised on the idea that secular thought can learn a lot from the formats of religion, which went on to reimagine the self-help genre. De Botton argues that in the 19th century, culture replaced scripture as society’s object of worship, but we are no longer allowed to bring our fears and anxieties to this modern cathedral. “It is simply not acceptable to bring the aches and pains of our souls to the guardians of culture,” he laments. He goes on to explore how we can reclaim this core soul-soothing function of art from the grip of empty elitism and sterile snobbery, focusing on the seven psychological functions of art.

We are very vulnerable, fragile creatures in desperate need of support and we generally don’t get it. … Art [can be] a source of help with our problems — our innermost problems — the problems of the soul. … Art can be a form of self-help and there is nothing demeaning about the concept of self-help — only the way in which some of self-help has been done so far, but there is nothing wrong with it as a concept. …

There is nothing wrong with [art today]. It’s not the art that’s the problem — it’s the frame around the art. We are simply not encouraged to bring ourselves to works of art. . . . The impact of art is often not what it should be because the frame is wrong.

[…]

I believe that art should be propaganda of something [other than the Christian church] — not theology, but psychology. I believe that art should serve the needs of our psyche as efficiently and as clearly as it served the needs of theology for hundreds of years.

[youtube=http://www.youtube.com/watch?v=qFnNgTSkHPM&w=640&h=360]

And this article by Alain de Botton in the ‘Guardian’ also speaks to the issue:

Alain de Botton’s guide to art as therapy

Can visual art offer solace, hope and reassurance as music can? The writer chooses the works that make him feel less alone.

Bridge Over a Pond of Water Lilies (1899) by Claude Monet. Photograph: National Gallery London

It comes naturally to most of us to think of music as therapeutic. Almost all of us are, without training, DJs of our own souls, deft at selecting pieces of music that will enhance or alter our current moods for the better. Yet few of us would think of turning to the visual arts for this kind of help. Few of us involve paintings or sculptures in our emotional lives. We don’t have playlists of favourite images on our phones. We don’t assemble our own private galleries on our computers. The cost and prestige of art typically draw us back from such steps. The way the establishment presents art to us doesn’t invite us to bring ourselves into contact with works.

In the solemn galleries of museums, which is still where most of us pick up cues about how to behave around art, many of us are – in our hearts – a little lost (the gift shop is more helpful; it may be embarrassingly easier to have a fruitful time with the postcard than the original). We look at the caption and dutifully learn some key dates, the provenance and perhaps an explanation of an allegory. But could this really matter to me? What should art really be for?

The second question has long felt either vulgar and impatient or else simply unanswerable. This is dangerous. If art deserves its enormous prestige (and I think it does), then it should be able to state its purpose in relatively simple terms. I believe art is ultimately a therapeutic medium, just like music. It, too, is a vehicle through which we can do such things as recover hope, dignify suffering, develop empathy, laugh, wonder, nurture a sense of communion with others and regain a sense of justice and political idealism.

But for it to do any of these things for us, we need to approach art in the right sort of way. It needs to be framed not principally according to the criteria of art history (however interesting those can be), but according to a psychological method that invites us to align our deeper selves with artworks. What does a psychological therapeutic way of reading art look like? A selection of works suggests the way.

Hope
Bridge Over a Pond of Water Lilies (1899) by Claude Monet (above)

Monet’s painting is one of the most popular works in the Metropolitan Museum of Art in New York. This is worrying to many people of taste and sophistication, who take a taste for “prettiness” as a symptom of sentimentality, even stupidity.

The worry might be that the fondness for this kind of art is a delusion: those who love pretty gardens are in danger of forgetting the actual conditions of life, which include war, disease and political error and immorality. Audiences need art constantly to remind them of this kind of material, sophisticated types will propose, or they might end up deluded as to what life is actually like.

But this is to locate the problem in completely the wrong place. For most of us, the greatest risk we face is not complacency; few of us are likely to forget the evils of existence. The real risk is that we are going to fall into fury, depression and despair; the danger is that we will lose all hope in the human project.

It is this kind of despair that art is well suited to correct and that explains the well-founded popular enthusiasm for prettiness. Flowers in spring, blue skies, children running on the beach … these are the visual symbols of hope. Cheerfulness is an achievement and hope is something to celebrate.

Empathy
The Twilight of Life (1894) by Sydney Tully

The Twilight of Life by Sydney Strickland Tully. Photograph: Art Gallery of Ontario

It’s hard to take much of an interest in other people, especially perhaps elderly people. In Tully’s portrait, an elderly woman sits stooped and thoughtful against a stark background. We’re being encouraged to look for longer than we normally would. She used to be strong and decisive. She had lovers once; she carefully set out with a quiet thrill in the evening.

Now, she’s hard to love and maybe she knows this. She gets irritated, she withdraws. But she needs other people to care for her. Anyone can end up in her position. And there are moments when a lot of people – at whatever stage of life – are a bit hard to admire or like. Love is often linked to admiration: we love because we find another person exciting and sweet. But there’s another aspect to love in which we are moved by the need of the other, by generosity.

Tully is generous to her sitter. The painter looks with care into her face and wonders who she might really be.

Care
14th-century Venetian glass

Venetian glass. Photograph: Getty Images/DeAgostini

The glass workshops of Venice became famous in the medieval period for producing the most elaborate, delicate, transparent glassware mankind had ever known. Most of the time, we have to be strong. We must not show our fragility. We’ve known this since the playground. There is always a fragile bit of us, but we keep it very hidden. Yet Venetian glass doesn’t apologise for its weakness. It admits its delicacy; it makes the world understand it could easily be damaged.

The glass is not fragile because of a deficiency, or by mistake. It’s not as if its maker was trying to make it tough and hardy and then – stupidly – ended up with something a child could snap. It is fragile and easily harmed as the consequence of its search for refinement and its desire to welcome sunlight and candlelight into its depths. Glass can achieve wonderful effects, but the price is fragility. It is the duty of civilisation to allow the more delicate forms of human activity to thrive; to create environments where it is OK to be fragile. It’s obvious the glass could be smashed, so it makes you use your fingers tenderly. It is a moral tale about gentleness, told by means of a drinking vessel. This is training for the more important moments in life when moderation will make a real difference to other people. Being mature means being aware of the effect of one’s strength on others. CEOs please take note.

Sorrow
Fernando Pessoa (2007-08) by Richard Serra

Fernando Pessoa (2007-08) by Richard Serra. Photograph: David Levene for the Guardian

We’re often intensely lonely in our suffering. In an upbeat world that worships success, our miseries feel shameful. We’re not only sad, we’re sad at being the only ones that seem to be so. We can’t remove suffering from life, but we can learn to suffer more successfully – that is, with less of a sense of persecution or an impression that we have unfairly been singled out for punishment. Fernando Pessoa is a beautifully dark monumental work by Richard Serra, named after a Portuguese poet with a turn for lamentation (“Oh salty sea / how much of your salt is tears from Portugal”).

The work does not deny our sorrows, it does not tell us to cheer up or point us in a brighter direction. The large scale and monumental character of this sombre sculpture declare the normality and universality of grief. It is confident that we will recognise and respond to the legitimate place of solemn emotions in an ordinary life.

Rather than leaving us alone with our darker moods, the work proclaims them as central features of life. In its stark gravity, like many of the greatest works of art, it creates a dignified home for sorrow.

Work
At the Linen Closet (1663) by Pieter de Hooch

At the Linen Closet by Pieter de Hooch (1629-1684). www.bridgemanart.com

In this modest domestic scene by the 17th-century Dutch genre painter Pieter de Hooch, we see a couple of women busy with a household task. There are no soldiers, kings, martyrs or divine figures in sight; this is ordinary life as we know it to this day.

It can be hard to see beauty and interest in the things we have to do every day and in the environments where we live. We have jobs to go to, bills to pay, homes to clean and we deeply resent the demands they make on us. The linen closet itself could easily be resented. It is an embodiment of what could be seen as boring, banal, even unsexy.

But the picture moves us because we recognise the truth of its message. If only, like De Hooch, we knew how to recognise the value of ordinary routine, many of our burdens would be lifted. It gives voice to the right attitude: the big themes of life – the search for prosperity, happiness, good relationships – are always grounded in the way we approach little things. The statue above the door is a clue. It represents money, love, status, vitality, adventure. Taking care of the linen is not opposed to these grander hopes. It is, rather, the way to them. We can learn to see the allure of those who look after it, ourselves included.

That so many people revere this painting is hopeful; it signals that we know, deep down, that De Hooch is on to something important.

Appreciation
An Idyll: Daphnis and Chloe (1500-01) by Nicola Pisano

An Idyll: Daphnis and Chloe by Niccolo Pisano. Photograph: Copyright The Wallace Collection

In his Daphnis and Chloe, Pisano evokes the beginnings of love, a moment when the sweetness and grace of the other is intensely present to us. Daphnis regards Chloe as so precious, he hardly dares to touch her. All his devotion, his honour and his hopes for the future are vivid to him. He wants to deserve her; he does not know if she will love him and this doubt intensifies his delicacy. In his eyes, she absolutely cannot be taken for granted. Seen by someone in a long-term relationship, when habit has made the other completely familiar, this image comes across as particularly necessary, because of its power to return us to a forgotten sense of gratitude and wonder.

Relationships
The Agony in the Kitchen (2012) by Jessica Todd Harper

The Agony in the Kitchen by Jessica Todd Harper. Photograph: Copyright the artist

We’re surrounded by images of what relationships are like – many of which are deeply deceptive and harmful to our own chances of being contented with another person. People seldom talk with complete sincerity about what is happening inside their relationships.

Behind the silence lies a need to maintain face about one’s progress through one of the most significant challenges of adult life: a capacity to succeed at being happy with someone else.

We need works of art that can show us that our troubles are both sad and normal. We don’t need the diametric opposite of the saccharine images of Hollywood. Extremes of domestic violence are rare. But day‑to‑day struggles are universal, though often unrepresented and unseen.

In Jessica Todd Harper’s image, a couple has perhaps been planning to have a nice evening yet it has now all gone wrong. One person is feeling incensed; the other is perhaps crying.

Importantly, however, these could be nice people. We are not to condemn them. They are likable, but in the grip of a genuinely difficult problem. And you have been there yourself.

Art can function as a private repository for truths that are too peculiar and too unacceptable to be shared with people we know.

Consumerism
Cookmaid with Still Life of Vegetables and Fruit (c. 1620-25) by Nathaniel Bacon

Cookmaid with Still Life of Vegetables and Fruit by Nathaniel Bacon. Photograph: Tate

The idea of consumerism as evil is a scourge with which to beat the modern world. Yet at its best consumerism is founded on love of the fruits of the earth, delight in human ingenuity and due appreciation of the vast achievements of organised effort and trade. This painting takes us to a time when abundance was new and not to be taken for granted.

We are so afraid of greed that we forget how honourable the love of material things can be. In 1620, homage could be paid to the nobility of work and commerce, something that boredom and guilt make less accessible to us today. Perhaps we can learn from this picture. A good response to consumerism might not be to live without melons and grapes, but to appreciate what really needs to go into providing them.

The 13 Best Psychology and Philosophy Books of 2013 | Brain Pickings.

How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season’s subjective selection of best-of reading lists continues with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011’s here.)

(…)

‘When someone has spiritually awakened, he resembles the moon’s ‘residing’ in water: the moon does not get wet nor is the water shattered. Although the moon is a great, broad light, it lodges in the tiniest bit of water. The moon at its fullest, as well as the whole of the heavens, lodges within the dewdrop poised on a blade of grass, just as it lodges in any single bit of water. Spiritual awakening does not tear a person asunder; thus, it is like the moon’s not making a dent in the water. A person no more impedes his spiritual awakening than a dewdrop impedes the moon in the heavens. The deeper the reflection, the higher the light: how long the period of your spiritual awakening will last depends on how large your drop of water is and how full your moon is seen to be.’ (from the ‘Shobogenzo’ by Eihei Dogen)
 
 
(The Shōbōgenzō (Treasury of the True Dharma Eye) is the master work of the Japanese Sōtō Zen master Eihei Dōgen (1200-1253). It consists of a series of lectures or talks given to his monks, as recorded by his head monk Ejō, who became his Dharma successor, although Dōgen was involved in the editing and recording of some of the Shōbōgenzō. It is the first major Buddhist philosophical work composed in the Japanese language.
There were only two complete English translations of the Shōbōgenzō previous to this version: Gudo Nishijima and Chodo Cross’s Master Dōgen’s Shōbōgenzō in four volumes (available from Windbell Publications), and Shobogenzo, The Eye and Treasury of the True Law by Kosen Nishiyama and John Stevens. There are many translations of sections of the Shōbōgenzō, as well as many commentaries on Dōgen and his work; a search on this website will uncover articles on Dōgen and his teachings. The Nishijima and Cross translation in four volumes is now available for free download, courtesy of the BDK English Tripitaka Project, either from their site or from thezensite’s Dogen Teachings page. The complete Shōbōgenzō, translated by Rev. Hubert Nearman, is available here (this site) and from Shasta Abbey.
WARNING: the complete text is 1,144 pages in .pdf and 8,675 KB. Not recommended for dial-up modems. If you have difficulty downloading it from this website, try the Shasta Abbey link. Below are links to each of the 96 chapters and other parts of the book. All pages are in .pdf format.)

 

The 10 Best Psychology and Philosophy Books of 2012 | Brain Pickings.

After the best science books, art books, and design books of 2012, the season’s subjective selection of best-of reading lists continues with the most stimulating philosophy, psychology, and creativity books published this year. (Catch up on last year’s roundup here.)