The skills-based assessments recommended by the Secretary of Education’s Commission on the Future of Higher Education could be “misleading” to students and parents because they would measure student performance at the institution-wide level rather than more specifically by area of study, a new study of one of the nation’s largest public university systems suggests.

Using data collected in the 2006 University of California Undergraduate Experience Survey, the study, “Institutional Versus Academic Discipline Measures of Student Experience: A Matter of Relative Validity,” found that undergraduates studying the same disciplines on different campuses have academic experiences more similar to each other than do students studying different subjects on the same campus.

Among the 58,000 undergraduates on eight campuses who participated in the survey, students who majored in the social sciences and humanities reported higher levels of satisfaction with their undergraduate education over all as well as better skills in critical thinking, communication, cultural appreciation and social awareness.

Students majoring in engineering, business, mathematics and computer science, meanwhile, reported more collaborative learning and demonstrated better mathematical skills. Engineering majors, as well as biological and physical science majors, also reported spending more time preparing for and attending classes.

Steve Chatman, a researcher at the University of California at Berkeley and the paper’s author, said that because of the differences in undergraduate experiences across majors within an institution, any attempt to capture an overall measure of performance across all of a college or university’s students “will necessarily be biased” by the makeup of its programs. The study was released Tuesday by the Center for Studies in Higher Education at Berkeley.

Universities like Harvard and Yale, with large concentrations of humanities and social sciences majors, would rank high for student satisfaction, while the Massachusetts Institute of Technology and the California Institute of Technology, both with large majorities of students majoring in engineering and hard sciences, would rank lower, despite providing undergraduate educations that are widely acknowledged as strong.

Chatman criticized student learning assessments like the Collegiate Learning Assessment (CLA) and the National Survey of Student Engagement (NSSE), which are administered to students across all majors or areas of study at an institution. Both were listed in the Spellings Commission report as examples of student learning assessments that could be adopted nationally.

The cumulative scores of a college or university will be determined not by the absolute skill levels of individual students, Chatman said, but rather by the distribution of students across academic disciplines. The study does not look at how skills have changed during a student’s enrollment, as with the value-added calculations done with assessments such as the CLA.

Roger Benjamin, president of the Council for Aid to Education and co-director of the CLA Program, said that institutional-level assessments are just one tool for colleges and universities to use in studying their students’ learning.

The assessment, he said, “does not claim to measure all of undergraduate education.” Although it is “a direct measure of certain student learning outcomes -- higher order skills -- widely believed to be important outcomes for undergraduate education,” it is not intended to be a comprehensive test of everything that an undergraduate has learned while in college.

“Someone who majored in journalism should do as well on a quantitative task as someone who majored in engineering,” Benjamin said. “And the same should be true, vice versa, for a task related to the humanities.”

In the paper, Chatman recommends the use of electronic portfolios and discipline-specific testing, ideas proposed earlier this year in Inside Higher Ed by Trudy W. Banta, a professor of higher education and vice chancellor for planning and institutional improvement at Indiana University-Purdue University at Indianapolis. “Banta proposed … a more viable alternative,” Chatman wrote in the paper’s introduction.

Portfolios are an alternative that Benjamin considers “not an option … because of reliability and validity problems that make scientific comparisons impossible.” He does, however, think that the CLA or similar assessments “should be used in concert with multiple indicators of performance.”

What matters most, Benjamin added, are the skills students develop while in college that transcend subject area. “Employers don’t care that much about what students major in, but they do care about the prospective employee’s higher-order skills.”
