

Over the past few months, I have had conversations with colleagues at multiple institutions of higher education who have reported the reluctance of universities to enforce norms—whether penalizing plagiarism and AI-generated work, putting away phones in educational settings, employing appropriate etiquette in communication, or meeting deadlines. About the same time, I started to see reports that more than half of recent college graduates are underemployed.

Naturally, I began to wonder if there was some connection between the two. A 2023 survey by Intelligent.com of 800 professionals involved in hiring suggests that there might be. The survey found that 38 percent of employers avoid hiring freshly minted graduates in favor of older employees. The reasons given were various, but they included 63 percent saying that recent graduates frequently can’t handle their workload, 61 percent saying they are frequently late to work, 59 percent saying they often miss deadlines and assignments, 58 percent saying that they get offended too easily, 57 percent saying they lack professionalism, 53 percent saying they struggle with eye contact during interviews and 52 percent saying they have poor communication skills.

This was startling to me, because universities have been increasingly marketing themselves as preparing students for the professional world. While I strongly disagree that the purpose of a university degree is job training, it does make one wonder why universities seem to be failing to deliver what they are marketing themselves as selling. Despite the insistence on funneling students to “job-ready” majors, the complaints listed here aren’t that undergraduates are majoring in the wrong subjects but that they lack soft skills and executive function—for example, the ability to communicate appropriately, meet deadlines, be on time and generally be professional. Given that these are skills that universities could instill even without tailoring themselves to “job readiness” by simply enforcing norms appropriate to scholarly life, one is led to wonder what is going on.

Let’s consider one example from the list of complaints above as a kind of case study: the reported inability of recent graduates to meet deadlines. Why don’t they seem to be learning this skill? Perhaps surprisingly, some faculty members argue that deadlines shouldn’t be enforced. While some have argued that being flexible with deadlines can increase rigor and engagement, two other kinds of arguments for flexible deadlines strike me as the most interesting and among the most popular. The first alleges that deadlines perpetuate inequities. The second asserts that enforcing deadlines is a kind of cultural imperialism—in fact, a form of white supremacy.

  1. Deadlines perpetuate inequities.

Faculty are increasingly sensitive to the problems that students face, which range from having to care for sick or elderly family members to lacking necessities like food or a place to live and significant mental health problems. When it comes to deadlines, faculty—quite reasonably—want to help students who are struggling. I think it strikes many professors as unfair that some students fall behind because of circumstances that are no fault of their own, and they feel that rigidly enforcing deadlines will solidify or increase inequities among their students. Perhaps they reason that if they can rectify this inequity by granting an ad hoc exemption to a deadline, they should. However, there is an argument to be made that such ad hoc exemptions themselves sometimes cause unfairness rather than redress it.

Of course, it’s well-known that when it comes to exemption requests, professors usually aren’t equipped to distinguish between those that are legitimate and those that are illegitimate. So, they will sometimes grant exemptions when in fact the student is being dishonest or the circumstances don’t truly warrant it—especially if the professor doesn’t want to deal with the potential ramifications of not granting a student an exemption. But false positives aren’t the biggest problem. The biggest problem, I think, is false negatives.

For example, let’s think about how ad hoc or discretionary exemptions affect students who struggle with unrecognized or indescribable mental health issues. Students who are suffering deeply may lack not only the willingness but the ability to articulate their reasons for an exemption. This is because people who have unusual experiences may lack the language to articulate their experiences in meaningful ways, precisely because their experiences are not common. In the case of disorders like depersonalization, a student’s experiences may be so bizarre that they may feel they are losing their mind. This means that not only will their professors likely not be able to understand or sympathize with what the student is experiencing, the student herself will likely be unable to express what she is experiencing, and she will almost certainly be resistant to trying to express it to a faculty member. Yet since the disorder has a prevalence of about 2 percent, with a mean onset of 16 years and an average of seven to 12 years for a diagnosis, it’s likely that most professors have interacted with such students without being aware of it. And this is just one underrecognized mental health issue of many.

When only those students who have the knowledge, temperament, circumstances and language to describe their difficulties get exemptions, this puts those students with undisclosed or undisclosable problems at a comparative disadvantage in their classes. While they are working under difficult conditions—perhaps more difficult than their classmates’—other students are given extra time to complete their assignments and they are not. Consequently, some students who may be suffering the most may be the most harmed, in the context of their classes, by professors selectively allowing exemptions.

While this argument certainly doesn’t show that all ad hoc or discretionary exemptions are wrong, I think it does show that the current drift to allow exemptions beyond a limited set of clear and well-defined cases (e.g., medical emergencies) is one that we should think carefully about. The reality is that faculty can’t ensure equitable outcomes for students and the practice of expansive ad hoc exemptions, I suspect, frequently causes unfairness rather than rectifying it.

To be sure, one could argue that this just means that there shouldn’t be deadlines at all. However, recent research suggests that students need more deadlines, not fewer. And, if we are setting our students up for failure in their postcollege lives by not enforcing them, “no deadlines” doesn’t seem to be a “student-centered” policy. Besides, “no deadlines” isn’t a realistic option: courses end and registrars need final grades. There will be deadlines even if it’s just a single one at the end of the course. So, the best way to help our students and not perpetuate inequities seems to be to encourage practices that will help students flourish in their academic and postacademic lives, such as setting and enforcing deadlines.

  2. Deadlines are a manifestation of white supremacy.

A second reason I’ve heard faculty offer that we shouldn’t enforce deadlines is that doing so is a kind of cultural imperialism. Usually, the argument is made by saying that deadlines, punctuality and the like are products of “whiteness” or “white supremacy culture.” I have some sympathy with this line of reasoning. There is a lot of interesting work in fields like history and the social sciences that supports the idea that cultures differ from place to place and that these differences have a real impact on how we think, feel and move about the world. For example, psychologists like Hazel Rose Markus and Shinobu Kitayama and evolutionary biologists like Joseph Henrich have argued that the differences between cultures lead to different views of the self and even different cognitive styles.

At the same time, the version of the argument that has been starting to crop up in academia—unfortunately imitating pop culture rather than influencing it—lacks the subtlety and circumspection of the corresponding scholarly work. Often drawing from a short paper by Tema Okun, professors have been identifying qualities like “perfectionism,” “objectivity,” “worship of the written word” and “a sense of urgency” as traits of “white supremacy culture.” This way of thinking reached its apotheosis in a widely ridiculed infographic by the Smithsonian Institution’s National Museum of African American History and Culture (NMAAHC) in the summer of 2020.

Critics have decried this way of thinking as racist. And not usually, as one might initially suspect, for being guilty of “reverse racism,” i.e., being racist toward white people. Rather, as Yascha Mounk explains (right about the hour mark), the objection is that saying qualities and norms like (in the case of the NMAAHC infographic) “objective, rational linear thinking,” “respect[ing] authority” and “follow[ing] rigid time schedules” are characteristics of white people reinscribes negative stereotypes and tropes about people of color. In other words, the primary objection made to this way of thinking isn’t that it is “reverse racism”; it’s that it’s just regular old-fashioned racism. Okun herself has lamented some of the use of her work, even updating it as an online book in response to the explosion of interest in it since 2020. Nevertheless, if one accepts this way of thinking, then deadlines themselves may be seen as a manifestation of white supremacy.

Now, I quite agree with many of the faults that Okun sees in our culture. To take just one of the items listed above, I agree that our culture is too “urgent”—at least in the sense that it encourages a perpetual cycle of overwork, even if it’s just busy work. Too many people, even in the academy, recoil from leisure as if it’s something toxic or wicked. And this is the case even though (as any academic knows) you need time to let your mind be idle and wander to be creative. This isn’t breaking news: Daoists pointed out millennia ago that forcing things is often counterproductive. In fact, given the popularity of Okun’s paper in popular culture, it’s strange that no one has made the argument that, in the context of the university, the real blow against white supremacy would be to get rid of the rigid focus on job training and return to the ideal of the liberal arts as a leisure activity.

Nevertheless, if universities are going to pitch themselves as making students “job-ready,” then they have an obligation to enforce the norms that will enable them to get and retain jobs—like meeting deadlines. Moreover, faculty who are concerned with social justice (as I am) should consider that the heaviest costs of failing to do so are likely to be borne by those from socially disadvantaged segments of society. Every culture has norms. There’s no way around that. If we believe that a particular culture should change, then members of universities and the broader public can make the case that it should change. But until that happens, we shouldn’t harm our most vulnerable students by not giving them the skills they need to be successful.

In sum, concerns about equity and white supremacy culture are not good reasons to refrain from enforcing deadlines. I take the arguments I’ve made to apply to other issues as well—being on time, putting away phones, employing appropriate etiquette in communication, etc. If universities market themselves as getting students “job-ready,” then enforcing these norms is a matter not just of honesty and contractual obligation but of fairness and social justice.

A Conjecture

So, why aren’t universities getting students job-ready? The arguments I’ve sketched here reveal a deep ambivalence inside universities about what exactly their mission is. While universities publicly sell their degrees as preparation for the job market, many faculty and administrators understand themselves as working to promote social justice, equity, mental health and similar goods. These objectives aren’t always compatible with one another. Indeed, one might wonder about the consistency of universities marketing themselves as making students ready to fit into the existing American work culture while also pitching themselves as wanting to drive social change. Perhaps this internal inconsistency about what it is we’re supposed to be doing and the consequent pulling in different directions has resulted in our not being successful at getting students job-ready. At the same time, universities don’t seem to be particularly good at producing social justice or equity, either. Maybe this is just a classic case of an institution trying to do everything and, consequently, doing nothing particularly well. If my speculations here are at all close to the mark, then it would behoove university leaders to think hard about what it is their institution is trying to accomplish.

In the meantime, if we’re selling ourselves as equipping students to thrive in the marketplace, then we must teach relevant norms. We can start by having the backbone to enforce those norms appropriate to scholarly life. Otherwise, as the cost of a degree continues to go up, and declining standards continue to erode the value of a university degree, more and more people are going to conclude that higher education isn’t worth it. We only have so much stock left with the public. We’d better not squander it.

Patrick J. Casey is an assistant professor of philosophy at Holy Family University.
