Do you recall the story of Theseus and the Minotaur? How King Minos of Crete demanded, as tribute from Athens, seven young men and seven young women to be sacrificed to the Minotaur. How Theseus, the son of King Aegeus, volunteered to be sent to the Minotaur’s labyrinth. How Theseus, with the help of Ariadne, Minos’s daughter, killed the Minotaur. Then, how Theseus forgot his promise to his father to hoist white sails on his ship if he survived his ordeal and how Aegeus, seeing only black sails, committed suicide by throwing himself into the sea.

This story served as the basis for one of the most famous paradoxes in all of metaphysics. Plutarch tells us that Theseus’s ship was preserved as a historical relic. But as its wooden planks deteriorated, they were replaced, one by one, until none of the original planks remained. Theseus’s paradox asks whether the vessel remained the same ship.

Theseus’s paradox raises enduring questions about identity, change and continuity over time.

(Thomas Hobbes later asked a variant of this question. If the Athenians had reconstructed another ship out of the original planks, which vessel would be the true ship? Would both vessels be the genuine article, or would both be replicas?)

Various solutions have been offered to Theseus’s paradox. One approach is to argue that an object’s identity is determined not by its physical components, but by its form, function or continuity over time. Another approach argues that an object’s identity is subjective and depends on how it is perceived. In the case of a human being or a society, the philosopher John Locke argued that it is memory that serves as the basis of identity over time.

I raise all this as an entryway into issues of identity and selfhood that are the source of much controversy today.

If I were to identify the single most important word in the contemporary vocabulary, I would say it is “self.” Think of the many ways we use that word: self-help, self-esteem, self-efficacy, self-reflection and more. Our modern ideas about rights, privacy and identity are impossible to imagine without a conception of the self that rests on a person’s consciousness, personal identity, agency and self-awareness.

Various disciplines offer distinct perspectives on the self. Philosophers, for example, ask whether the self is a fixed entity that persists despite physical and mental changes or a dynamic, contingent, fluid and evolving object, and whether the self is an illusion or a social or cultural construct.

Neuroscientists examine the brain structures and mental processes that underlie self-awareness, self-referential thinking and a sense of agency. Psychologists, in turn, study the development of the self, the factors that influence self-perception and the ways that a person’s self-concept and self-esteem influence behavior, motivation and mental health.

Sociologists and anthropologists are especially interested in how social, cultural and environmental factors shape an individual's self-concept and self-identity and the role of social interactions, relationships, cultural norms and values in the development and expression of the self.

Then, there are historians and historically minded philosophers like Charles Taylor who examine the evolution of the very idea of the self.

Just think of the disparate ways contemporary social scientists talk about the self. There’s the self-reflexive self, the “inner eye” through which a person views their own life. There’s the “essential” self, which is defined by intersectional identities or by one’s self-defined identity. There’s the physical self, a person’s physical appearance and bodily sensations, but also their body image. There’s the psychological self, which refers to a person’s thoughts, emotions, beliefs and memories. There’s the social self, the idea that a person’s identity is shaped by their social roles, social relationships and social interactions and by context and cultural norms, as well as by a person’s personality. Then, too, there’s the narrative self, the stories that we tell ourselves that give our lives coherence and meaning.

To this list, we might add the fluid self, whose identity is anything but fixed. Isn’t that partly what we mean when we use terms like “metrosexual” or “bisexual”? In addition, there’s the looking-glass self, the self that is socially constructed, socially situated and the product of social interactions. There’s also the managed self, either self-managed or institutionally managed. And, of course, there’s the postmodern self, an unstable, fractured, fluid social construction shaped by language, discourse, cultural norms and a particular society’s power dynamics.

As a historian, I would argue that the self is not a transhistorical concept or phenomenon. It has a history.

In Sources of the Self, Taylor’s 1989 classic, the great Canadian philosopher explores the development of modern understandings of the self and their epistemological, ethical and political implications. He also shows how new conceptions of the self underpinned the emergence of the social sciences, which treated the self as an object of scientific study. In addition, the radical individuality that underlies modern selfhood gave rise to political individualism and to new notions of rights, privacy, independence and autonomy, which coexisted with intense longings for connection, community and a sense of belonging and thereby contributed to modern nationalism.

What defines the modern self? In Taylor’s view, its defining features include:

  • Individuality: This is the idea that each person is unique and autonomous and is responsible for her or his own thoughts, feelings and actions. This contrasts with earlier conceptions of the self that emphasized the individual's embeddedness within a larger community or cosmic order. Romanticism, with its emphasis on self-expression and the quest for personal self-fulfillment, is itself an outgrowth of this heightened emphasis on individuality.
  • Subjectivity, self-reflexivity and interiority: This is the sense of the self as defined by inner space and depth, which is accessed through self-reflection and self-consciousness. This emphasis on the psychological interior and on subjective emotions, desires, drives, passions and preferences, in turn, gave rise to a growing concern with personal authenticity and self-expression.
  • The affirmation of everyday life: This is the belief that life’s meaning is found in precisely those areas previously treated as private in a pejorative sense: child rearing, friendship, love, marital companionship and employment in a craft or profession. Moral worth was found in ordinary life, not in martial valor or participation in politics and governance.

As Keith Thomas demonstrates in The Ends of Life, his masterful, if inexplicably neglected, 2010 study of the origins of modern ideals of personal fulfillment through work, wealth and possessions, as well as the pleasures of friendship, family and sociability, a modern conception of the good life differed radically from its precursors. Gradually and unevenly, an older emphasis on honor, military prowess and the leisurely pursuit of politics (and, in some instances, an emphasis on religious practice) gave way to more bourgeois aspirations.

If you were to teach about the rise of modern selfhood, you certainly wouldn’t lack for sources or ideas. There’s Christopher Lasch’s The Minimal Self, the sequel to his bestselling The Culture of Narcissism, a critique of the ideas of self-actualization popularized by the psychologist Abraham Maslow. As in his earlier book, Lasch argues that American society “breeds and fosters a narcissistic psychopathology” that cultivates discontent and encourages a consumerist mentality.

There’s The Rise and Triumph of the Modern Self by Carl R. Trueman, a professor of biblical and religious studies, which traces the growth of an ethic of expressive individualism and psychological well-being and its relationship to what the book considers the contemporary obsession with subjective identities.

Then, there’s Dror Wahrman’s The Making of the Modern Self, a 2006 study of identity and culture in 18th-century England, which argues that this period witnessed a veritable revolution in the understanding of such identity categories as race, gender and class.

In 1938, Marcel Mauss, the French anthropologist and sociologist, observed that the modern idea of the self, “far from existing as the primordial innate idea, clearly engraved since Adam, in the innermost depths of our being,” is a recent historical development. The anthropologist Clifford Geertz later described this Western conception of the person, “a bounded, unique, more or less integrated motivational and cognitive universe, a dynamic center of awareness, emotion, judgment and action organized into a distinctive whole and set contrastively against other such wholes,” as “a rather peculiar idea within the context of the world’s cultures.”

Mauss and Geertz are certainly right to argue that contrasting ideas about the self offer an invaluable window into cultural difference across time and space. Also, I can’t think of a better way to understand the importance that contemporary society attaches to identity, rights and personal privacy than to offer students a multidisciplinary perspective on the making of the modern self and the ways that this development shaped aesthetic and moral sensibilities, political ideals and personal aspirations.

Let me close by offering yet another reason to adopt a big-picture approach to the self. It holds out the possibility of bringing disciplines into dialogue in ways that today’s siloed universities do not.

I left graduate school under an illusion: that the academy would be a place just like Yale in the 1970s—intellectually serious and deeply committed to ideas, culture, the arts, intense conversation and intellectual contemplation.

That fantasy didn’t arise out of nothing. Breakthrough ideas filled the air, and most of my fellow graduate students entered the academy for precisely the same reason that I did: because it seemed to offer an alternative to corporate culture and a critical vantage point on the violent, unequal society that had produced the Vietnam War.

We found ourselves in graduate school in an intellectual environment that could scarcely have been more stimulating or provocative—and where department lines meant little.

But after graduation, it didn’t take long before the more mundane realities of adulthood and academic life shattered our pipe dreams, as we struggled to find jobs, construct stable personal lives, earn tenure and make professional reputations.

To be sure, a few of my fellow history graduate students kept the faith. They remained true to the mission of democratizing history by working at open-air and living history museums or, later, by making rich historical resources available for free on the World Wide Web, producing historical documentaries or creating serious video games on historical themes. But most, alas, did not.

John Sayles’s directorial debut, Return of the Secaucus 7 (1980), and its more mainstream counterpart, Lawrence Kasdan’s The Big Chill (1983), gave visceral expression to what I and my grad school colleagues felt at that fraught historical moment, as the great dreams of the 1960s faded and more somber, sober, subdued realities took hold.

Even if you haven’t seen those pictures, you almost certainly know their plot: how a group of 30-something friends flirt, gossip, reflect nostalgically on their shared past and desperately struggle to hang onto their youthful self-image while trying to evade the harsher realities of adulthood. I wouldn’t call the films bittersweet. They’re too tinged with sadness, disappointment, regret and depression for that. But they do capture the widespread sense that the self-identified young radicals had become sellouts.

It may be too late for my generation to bring back to life an earlier conception of the academy as utopia, but, dear reader, it’s not too late for you. Reclaim the campus as a utopian space.

Realizing that vision will require the faculty to break free, at least in part, from Clark Kerr’s vision of the multiversity as an instrument dedicated, first and foremost, to the production of research and the training of human capital to strengthen America’s global competitiveness. Nor need the campus utopia resemble Robert Maynard Hutchins’s vision of a university of utopia organized around discussion of great books.

The utopia I speak of is a place where it is possible to imagine and realize a better world in all its diversity, where commonly held ideas and values are challenged and examined, where cross-campus dialogue, conversation, debate and free exchange regularly take place and where knowledge isn’t siloed. It is a community of learning, a community of inquiry and a community of problem-solvers. The university is a feasible utopia.

Yes, “utopia is possible. Yes, even now.” If only on our college campuses.

Steven Mintz is professor of history at the University of Texas at Austin.
