
Nostalgia, it’s said, is a hell of a drug.

Like other psychoactive and addictive drugs, nostalgia has a potent allure. It makes past experiences more emotionally intense and appealing than they were at the time. Much as drugs can mask pain, nostalgia is escapist. It encourages us to remember the good while forgetting the bad, offering temporary relief from present-day dissatisfactions and challenges.

The drug analogy also suggests how easy it is to overindulge in nostalgia, hindering our ability to appreciate or live in the present. And much as drug peddlers push their product, politicians, corporate interests, mass media and the culture industry exploit nostalgia for commercial and political ends.

The metaphor likewise captures the downside of remaining anchored in a bygone past. As a form of escapism, nostalgia can inhibit personal growth and hinder progress and innovation. Idealized, inaccurate nostalgia can also distort our understanding of the past.

Since the mid-17th century, American society has been prone to viewing history through the lens of declension and decline—as a fall from an earlier age of grace.

While it’s important to acknowledge backsliding, regression, setbacks and deterioration, an overemphasis on decline can foster pessimism, fatalism and resignation, contribute to an underappreciation of improvements and advances, and impede adaptation to changing circumstances.

When I look back on my own upbringing, it’s easy to succumb to nostalgia. I was lucky: I grew up in an environment far different from the social world we inhabit today. I grew up in an ethnic bubble, consisting of a host of institutions that have since fallen on tough times. I was embedded within an extended family, spending most weekends with cousins and aunts and uncles. It was a world that was broadly middle class, in which doctors drove Buicks, not Mercedes or BMWs.

It was also a child-centered world, far less structured and supervised than childhood is today. I was surrounded by vacant lots and fields. I had a dozen or more same-aged playmates nearby and spent most days playing outside. For better or worse, my middle-class childhood was also much more sheltered from adult realities and anxieties than childhood is today.

To be sure, the standard of living was lower and even more racially stratified than now. It was a world without color television, with linoleum floors and countertops, in which even Formica was out of reach.

I don’t want to romanticize or sentimentalize that world. It depended heavily on maternal sacrifice. Mothers (and my mom worked as a schoolteacher) were expected to serve as cooks, maids and chauffeurs and were responsible for organizing every facet of family life. Play was highly sex-segregated. As many as a third of children grew up in poverty.

Still, despite the Cold War backdrop, it was an optimistic world experiencing rapid economic growth and the beginnings of the freedom struggles that held out the promise of a society that would finally live up to its commitment to liberty and equality.

No wonder that many older Americans regard that era with nostalgia, as a kind of golden age.

Robert Putnam’s Bowling Alone tells the story of what happened to that world. Like David Riesman’s The Lonely Crowd and Charles Murray’s Coming Apart, it takes as its theme the erosion of family and community life and the breakdown of the institutions that “long lent American lives purpose, direction and happiness.”

Putnam’s argument, you may recall, is that social capital—the networks, norms and trust that facilitate coordination and cooperation for mutual benefit—has been in steep decline in the United States since the mid-20th century. The eclipse of earlier forms of connection and engagement is evident in diminished participation in civic and religious organizations, declining union membership, lower attendance at community events, and less frequent socializing with friends.

His organizing metaphor—declining participation in bowling leagues—serves as a defining symbol of the erosion of communal activity and social engagement.

In Putnam’s view, the decline in social capital has had profound implications for American society, democracy and even individuals’ physical and mental health. Trust in public institutions fell. Shared values, expectations and perspectives gave way to radically divergent cultures and experiences resting on class and education. Pessimism and despair surged.

Among the factors that he blamed for the decline in social capital were suburbanization, commuting, time pressures (especially for women), electronic entertainment, a retreat from organized religion and a generational shift in values. The massive postwar expansion of suburbia contributed to the decline of close-knit ethnic urban communities. The decline of the industrial economy, the weakening of labor unions and the rise of the knowledge economy contributed to an increase in economic stratification and a heightened emphasis on advanced education.

The concerns that Putnam expressed were not new. His thesis is only the most recent version of Ferdinand Tönnies’s 1887 argument about a long-term shift from Gemeinschaft to Gesellschaft—from social relations based on personal and family ties to those resting on impersonal and instrumental ties and formal organizations.

Many of the anxieties that Putnam expresses about individuation, isolation, disconnection, alienation and anomie had been voiced by Alexis de Tocqueville over a century and a half earlier.

I, a product of the 1960s-era commitment to participatory democracy, share Putnam’s concerns about the eclipse of community. That’s the title of Maurice Robert Stein’s 1960 study of the loss of close-knit, intimate, stable social relationships; the rise of more transient, impersonal and fragmented social interactions; and the spread of bureaucracies that standardize everyday life and disrupt traditional community bonds.

I, too, worry, like Robert Bellah in his 1985 study Habits of the Heart, about whether an individualistic emphasis on personal success and achievement can foster connections that transcend personal self-interest; whether a diverse, increasingly fragmented and politically and ideologically polarized society can successfully address the collective challenges it faces; and whether it is possible to craft a common moral language and shared narrative that would restore a sense of community and collective purpose.

My own research, which focuses largely on children, is driven, in part, by my fear that the United States is becoming less and less a child-friendly society.

Children today make up a smaller share of the population than ever before and, as a result, the United States has become a less child-centered and a more adult-focused society than it was a half century ago. After all, childless households are now the norm, and those families with children have fewer of them than at any time since the Great Depression. At the same time, fewer young adults—roughly 40 percent—believe that having children is an essential ingredient in a fulfilling life.

Childhood has become highly stratified—in terms of family structure, housing and income stability—along class lines. Now that a majority of the child-aged population is nonwhite, I worry that society will invest less in childhood.

To be sure, the United States spends more money on children—on schooling, entertainment, goods and services, and, yes, juvenile justice—than any other country. And in many respects, kids are doing better than ever. Not only are the rates of infant and child mortality lower than they have ever been, but teens smoke less, drink less and get pregnant less often, and are more likely to graduate from high school and enroll in college.

Still, the kids are not all right. Surveys suggest that children are less happy than in the past and more anxious and stressed. The onset of clinical depression, which typically came in a person’s 20s, now occurs much earlier.

While the increasing diagnosis of conditions like autism, ADHD and other neurological, biological and developmental disabilities is not in itself a sign of childhood distress, the experience of navigating a world that fails to accommodate these children’s needs, challenges and unique ways of processing information and interacting can certainly contribute to distress. It may well be that the way children are raised and schooled is exacerbating the propensity to exhibit or express certain kinds of behavior, much as the increase in childhood allergies appears to be a product, at least partially, of epigenetics—the ways that environment can alter the expression of genes.

Childhood today is caught up in a mess of contradictions.

Among the paradoxes of progress is the fact that there are more children suffering from chronic illnesses or living with disabilities than ever before—precisely because those children are far less likely to die prematurely.

Parents are much older, on average, than in the past, a development that helps explain the heightened stress on children’s safety and their physical and psychological well-being.

By many measures, kids are doing better than ever. But in other respects, kids are struggling. Today, a majority of children experience a major disruption during their childhood years (like parental separation or divorce), and a majority of children’s families are dysfunctional in one way or another, whether from alcoholism, drug abuse, physical or emotional neglect, domestic violence, or verbal, emotional, physical or psychological abuse.

After diminishing, educational disparities along lines of class, race, ethnicity and gender have again widened. More kids than ever are obese and physically inactive. Kids are overexposed to technology and social media, a trend associated with reduced attention spans, cyberbullying and social isolation. About a sixth of all children grow up in poverty, and many more suffer from family instability and hunger, live in unsafe neighborhoods, are exposed to violence and attend chaotic schools.

American childhood today is marked by glaring contradictions. Kids grow up faster than ever before but have few ways to demonstrate their growing maturity and competence. The lone exception is sports, and even there, athletic participation peaks in seventh grade.

Meanwhile, parental overprotection appears to have contributed to middle-class children who are less resourceful, less resilient and less self-reliant than in the past and more dependent and vulnerable to anxiety.

Adults think of childhood as a time of unequaled value but give kids less freedom and fewer opportunities to participate in free, unstructured, unsupervised outdoor group play than in the past.

Adults increasingly medicalize, psychologize and pathologize normal childish behavior. Marketers prey on kids with wiles once reserved for adults, while advertisers, magazines, the music industry, television and movies eroticize young girls.

At its best, childhood is a time of wonder and a period of growth, risk taking and freedom, of experimentation and exploration. It’s among life’s greatest adventures, and it should be an odyssey of self-discovery. But for many kids, perhaps most, childhood isn’t like that. School has become less joyful and social. Despite social media and the internet, children’s social lives are more isolated and lonely.

Bombarded with advertisements and prepackaged, corporate-produced fantasies, children inhabit a culture that has, I would assert, been colonized by commercial interests, with a profound, harmful influence on their development, the growing emphasis on girl power notwithstanding.

The phrase “the corporate colonization of children’s imagination” describes the growing influence of corporate interests and commercialism in children’s lives: the ways that businesses and marketing have permeated nearly every aspect of childhood, with implications for children’s development, values and behavior.

Corporations increasingly target children as a consumer demographic, using sophisticated marketing strategies to promote products that include toys, junk food, clothing, entertainment and even electronics. These marketing efforts often exploit children’s susceptibility to advertising and their influence on family purchasing decisions.

Since the deregulation of children’s advertising during the Reagan administration, there has been a proliferation of child-focused media content, much of which is heavily commercialized and serves as a platform for product placement and merchandising. Characters from TV shows, movies and video games are often used to market a wide range of products to children. More recently, advertisers have relied heavily on social media influencers to market products.

Children, exposed to product branding from a very young age, begin to define their identities in terms of the products they consume, the games they play, the clothes they wear, the shows they watch and the music they listen to. Early exposure to consumerism and various screens also affects their attention, self-esteem and social skills. And, of course, the commercialization of play, with branded toys and structured activities, has diminished opportunities for free, imaginative and unstructured play, which is crucial for child development.

Have I fallen into the nostalgia trap, treating the childhood that I experienced as a Platonic ideal while downplaying the joys of today’s electronically mediated childhood? Perhaps. Do I exaggerate the commercialization, commodification and colonization of childhood? Maybe. After all, children’s television shows, cereals and branded toys aren’t new.

Nevertheless, just as I worry about the dilution of a college education in the name of efficiency and cost-effectiveness at the expense of its developmental and transformational responsibilities, I fret, too, about children’s well-being.

I don’t much worry about premature exposure to adult themes, from sex and violence to complex social issues. My impression is that children generally ignore topics for which they aren’t developmentally ready.

But I am concerned about time spent on electronic devices, which has limited opportunity for physical activity, imaginative play and social interaction. I worry, too, about the heightened emphasis on academic achievement at a very young age and the embrace of a more structured, education-focused childhood, which not only diminishes the time for free play, exploration and leisure but makes learning less pleasurable.

I’m also vexed by the decline of empty lots and other noncommercial outdoor spaces where children can play and explore freely and by the increase in structured and screen-based entertainment, which, I fear, impedes the development of creativity, problem-solving and social skills. Yet another concern involves the commercialization of childhood, shaping and distorting children’s values, desires and self-image.

I, who was myself guilty of overparenting, also worry about excessive parental involvement in and supervision of children’s lives. While this can provide safety and support, it can also limit children’s opportunities to experience independence, take risks and learn from unstructured experiences.

As a historian, I know full well the dangers of romanticizing the past. But it’s also the case that the past can provide a critical vantage point on the present. History offers a unique lens through which we can see what’s been lost as well as what’s gained in the name of progress. Historical perspective also encourages a nuanced view that recognizes the benefits of progress while also being mindful of what might have been lost or compromised along the way. Most important of all, history compels us to consider the kind of future we want to create.

Let’s make it a good one.

Steven Mintz is professor of history at the University of Texas at Austin.
