
WASHINGTON -- Private colleges are the stereotypical laggards in the national accountability battle. Spurred in large part by independent college leaders' full-throated opposition to the federal government's push for national methods of measuring student learning, critics of higher education have characterized private colleges as foot-draggers unwilling to gauge their own success in educating students with standardized tools that might allow for comparisons to other institutions.

Like most stereotypes, this one is not complete fiction; it is still not hard to find private college leaders and faculty members, especially at highly selective institutions, who look at the long lines of students beating down their doors for admission and say, "Where's the problem?"

But try selling the "private colleges hate assessment" line on the 150 or so independent college provosts, deans and faculty members who gathered Sunday in a hotel conference room here on one of those rare summer afternoons (low 80s, manageable humidity) when you might actually want to be outside in the nation's capital. They were here for the summer meeting of the Council of Independent Colleges' Collegiate Learning Assessment Consortium, in which dozens of colleges are experimenting with the Council for Aid to Education's performance-based assessment tool.

The independent college council's consortium has existed since the Commission on the Future of Higher Education was but a gleam in Education Secretary Margaret Spellings's eye, and a sizable number of the consortium's members have been experimenting with the CLA since before it became the darling of the Spellings Commission and its chairman, Charles Miller. Other members of the CIC are just now joining the consortium, which is sponsored by the Teagle Foundation.

Whether they are veterans or "rookies" with the CLA, as David C. Paris, a CIC consultant and Hamilton College professor, described the 19 small colleges joining the consortium this year, all of them have decided to use the controversial national measure of student learning -- which has vocal critics and advocates alike -- or to keep using it, for a mix of reasons.

Richard H. Ekman, president of the Council of Independent Colleges, said the group started its consortium five years ago because its members realized that their "old ways of talking about [the quality of] our institutions, which were largely anecdotal," were not sounding as persuasive to policy makers and the public as they once did, and that amid "rumblings" of possible federal intervention, colleges of all types needed to find new methods for making the case that they were effectively educating students.

Not coincidentally, Ekman said, the council believed that small, intimate liberal arts colleges like most of its members would "fare quite well" on an instrument, like the CLA, designed to measure how institutions "add value" to their students on traditional liberal education skills such as critical thinking and problem solving.

Peter T. Ewell, who for nearly 30 years has been at the center of national and campus discussions about assessment and accountability, offered what the CIC's Paris called "grand, existential answers" to why dozens of small, independent colleges in the council have been experimenting with the CLA. His answers focused on the external pressures building on postsecondary institutions (from politicians and employers) to prove their value, and on the recognition by many institutions that they must find new and better ways to educate students who increasingly arrive with weaker academic credentials and who are accustomed to learning in ways that differ from how the institutions are accustomed to teaching.

Campus Experiences So Far

A panel of officials from some of the CIC colleges offered their own, more individual answers to why they had begun using the CLA or, in the case of those that had used it for multiple years, why they were continuing to -- in some cases despite meaningful reservations.

Mary Ann Coughlin, a professor of research and statistics at Massachusetts' Springfield College who is the institution's new assistant vice president for academic affairs, listed the many mechanisms the college has long used to assess its students' learning: the National Survey of Student Engagement, participation in a national survey of freshmen, annual surveys of its seniors and fifth-year alumni, and "benchmarking" data comparing its outcomes to those of other colleges.

"Does that list sound familiar?" she asked her peers in the audience. What became "abundantly clear to us," Coughlin said, is that despite all that data, "we really had limited direct evidence of our students' achievement in cognitive learning areas."

Springfield officials decided to join the CIC consortium this year and work with the CLA to try to collect that evidence. The college is particularly interested, Coughlin said, in gathering data that allow it to compare itself against "truly representative peer institutions," which it hopes to find among the other CIC members, which work closely together (and with the CLA's creators, to strengthen the assessment) in the context of the consortium. Springfield's major concern, Coughlin said, is how it will "not only get students to show up and take the CLA, but to give their best effort" -- a challenge that critics of the test have cited as significant given its three-hour length.

Another college new to the CIC consortium, Pennsylvania's Juniata College, was motivated to experiment with the CLA in part because of the same national pressure that "you all are under, from government, parents, students, to show what it is we're really all about," said James Lakso, the college's provost.

But characteristics closer to home drove its decision, too. Juniata's retention rates are somewhat lower than those of some of its peers, Lakso said, and its officials are hopeful that the CLA will show that the resource-poor institution does a good job of "adding value" to its students. The college is also looking to the CLA to help it diagnose why "our faculty are generally unsatisfied with the writing that students do," Lakso said. "We think we've got a writing problem -- and the CLA has confirmed that -- but we really don't know what that writing problem is."

Juniata's big challenge going forward with the CLA, Lakso asserted, will be getting its faculty on board. "I know this isn't true about your faculty," he said to chuckles from the audience, but "a lot of our faculty ... aren't wild about assessment. It makes the hair go up on the back of their necks."

"I have to persuade them that the CLA has meaning," he said.

The colleges that have been using the CLA for multiple years as part of the CIC consortium generally offered a mixed review of their experiences with the exam so far. (The CIC released a full report on the consortium's results to date, which offers a balanced portrait of pros and cons.) Joel Frederickson, professor and chair of psychology and acting associate dean for institutional assessment at Bethel University, in Minnesota, said the institution had felt pressure from its regional accreditor to measure student learning, and that, at its core, "what [the CLA] measures is what's important" for a Christian liberal arts college like Bethel -- critical thinking, written communication, problem solving and analytical reasoning.

But officials there have grown concerned about the great variability in results from year to year, an outcome that the CLA's critics say is not surprising given that most colleges, because the test is expensive, give it to just 100 students each year. "The first year, we looked great; another year, so-so," said Frederickson. "Another year, the results look horrible, like we're not adding any value. That's a difficult memo to send out" to faculty and staff.

The question of whether a college appears to have added value under the CLA model is complicated by the fact that the standard way of measuring that is by comparing a group of first-year students with a group of graduating seniors. But Bethel has found that "our freshmen seem to be very motivated when they take this," and "if your freshmen do really well, it's a big problem in value added," Frederickson said. "It really hurts us when your seniors don't seem as motivated."
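A rough sketch of the arithmetic behind Frederickson's complaint, using a simplified deviation-score framing (an illustration, not the CLA's exact scoring model): each cohort's mean score is compared with the mean that would be predicted from its entering academic preparation, and the college's apparent contribution is the gap between those two deviations.

\[ \Delta_{\text{fr}} = \bar{S}^{\,\text{actual}}_{\text{fr}} - \bar{S}^{\,\text{expected}}_{\text{fr}}, \qquad \Delta_{\text{sr}} = \bar{S}^{\,\text{actual}}_{\text{sr}} - \bar{S}^{\,\text{expected}}_{\text{sr}}, \qquad \text{value added} \approx \Delta_{\text{sr}} - \Delta_{\text{fr}} \]

On that arithmetic, freshmen who beat their predicted mean by, say, 50 points while seniors land exactly on theirs yield an apparent value added of about -50, even if nothing about the seniors' education went wrong.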

Pennsylvania's Allegheny College was in the first crop of members of the CLA consortium, in 2004-5, and its experience has on balance been a good one, said Linda C. DeMeritt, dean of the college.

The assessment tool has helped the liberal arts institution identify outcomes that have raised important questions about how its students learn -- that the more math courses students take, the better they score on the CLA's "make an argument" test, for instance, and that athletes show greater growth between the first and third year of college than non-athletes do. (The next step is figuring out why those outcomes occur, DeMeritt said.)

While Allegheny officials continue to believe that the CLA can be an important tool for helping to improve the college's curriculum and teaching effectiveness, they have grown skeptical about the test's utility in measuring "added value," in large part because results can vary based on when the test is given -- that is, "by means of strategic timing."

In the first three years it gave the test to freshmen, DeMeritt said, Allegheny did so during the second or third week of classes, "when incoming students are particularly serious" about their academic work. In each of those years, first-year students performed better on the CLA than would have been predicted based on their academic preparation. Given that Allegheny seniors performed at the expected levels, it appeared under the CLA's scoring regimen that the college did not help its students advance significantly over their time there.

Last year, Allegheny instead tested its freshmen during orientation, when students were "burned out, overwhelmed, and had too much going on," said DeMeritt. They scored below the expected level -- a result that would flatter the institution under the CLA's value-added calculation, but that didn't exactly enhance the college's confidence in the test's reliability as a measure of "added value."

"It's just one year, and it doesn’t really prove anything," DeMeritt said. "But it did make us pause and say, 'There are many other good things about CLA, so we're going to focus on the other things the CLA brings us.' "
