As students return to campus this fall, they (along with their families) are facing another round of pricey tuition bills. Last year, tuition, fees, room and board averaged about $20,000 for public four-year universities, according to the College Board. That’s more than double the total cost 25 years ago, even when adjusted for inflation. To add fuel to the fire, in a recent poll, nearly half of undergraduates said they learned “nothing” after their first two years of college.

Interestingly, that’s the period during which many students take courses that don’t align with their interests or career goals but instead belong to the “core curriculum” all students must complete to graduate.

Both of us are university educators, and it dismays us (and may surprise the reader) that such requirements are often arbitrary, backed by little or no data. For organizations grounded in research and experimentation, universities do surprisingly little analysis of the educational value of the courses they offer, beyond perfunctory student course evaluations.

The goal of universities should be to develop students into mature adults who are knowledgeable, able to function in a complex society and prepared for the next phase of their studies or careers.

College is a short period of time that costs a great deal of money, and neither should be wasted by requiring students to sit in large lecture halls taking introductory-level courses drawn from an arbitrarily chosen bucket. We need to reconsider that approach.

First, a group of accomplished people from across the country and from all walks of life, led by educators committed to evidence, could be convened to develop a list of core course requirements that all universities would use. They would carefully determine what basic knowledge -- algebra, foreign languages, basic science and so forth -- is truly necessary for anyone to be considered “educated.” But they shouldn’t stop there. Once those choices are made, universities should do what they do best -- experiment -- to evaluate how different core courses affect student outcomes. Deciding on appropriate outcomes, and when to measure them, should be part of the process.

Second, whatever is decided, universities should shift these core courses to online instruction. Students’ on-campus time is better spent on other endeavors, and it’s inefficient for every university in the country to design and teach the same core courses. Basic Chemistry is the same, whether it’s taught by a professor in Alaska or Arkansas. 

Instead, universities should create a marketplace of online courses to give students the best instruction available, even if it isn’t produced locally. Those courses could even be taken before students start college, much like the Advanced Placement courses hundreds of thousands of high school students take every year.

With those core courses out of the way, universities could direct their resources toward more focused curricula in which students don’t just learn basic facts but learn to think and function as mature adults.

Small-group courses would focus on developing skills like oral and written communication, interaction with peers, team behavior, leadership and -- just as important -- followership. Crucially, the instructors who lead these courses need to be teachers who excel in this type of environment. In many cases, the most impressive professors -- those who publish frequently or have conducted groundbreaking research -- won’t thrive in it. Universities should embrace the value of true teachers for this purpose.

A few of these courses would be required, but they, too, would be subject to experimentation to demonstrate that they contribute to students’ maturity. On-campus courses might include debate, art and architecture, and great books. Students could take an “innovation” course in which they’d interview professionals in a field of interest to understand the challenges those professionals face in the real world, then discuss possible solutions with classmates. Students would also be required to keep a personal digital portfolio of their accomplishments throughout their college years, which would form the basis for meetings with their advisers and aid self-reflection.

Students would, of course, also take elective courses -- again in small groups -- to help them prepare for a final capstone course. There, students would work in teams, in coordination with a professor, to address a real-world problem, whether building a model of a device or planning an event that addresses a social issue. This work should extend beyond the boundaries of the campus, through interviews with professors or experts across the United States or even internationally. The team leader would guide the group in writing up its findings as the final product of an exercise in real-world team leadership.

American universities are known the world over for faculty who do innovative research. But they haven’t always applied that innovative spirit to their own curricula. As technology advances and the cost of education soars, it’s time for institutions to rethink their approach and focus on preparing mature students who can best serve the country’s future generations.
