
This month, responding to four instances in which colleges admitted to having provided false information for its rankings, U.S. News & World Report published an FAQ on the issue. One of the questions: "Do you believe that there are other schools that have misreported data to U.S. News but have not come forward?" The magazine's answer: "We have no reason to believe that other schools have misreported data -- and we therefore have no reason to believe that the misreporting is widespread."

Less than three weeks later, another college -- Bucknell University -- came forward to admit that it had misreported SAT averages from 2006 through 2012, and ACT averages during some of those years.

The news from Bucknell left many admissions experts wondering whether there are larger lessons for colleges to learn, as report follows report of institutions submitting inaccurate information.

David Hawkins, director of public policy and research for the National Association for College Admission Counseling, said via e-mail that "these actions are the result and responsibility of both individuals and the institutions for which they work," but that there was also a broader context behind all of these incidents.

"The emphasis placed on an institution's 'selectivity,' particularly as defined by standardized test scores, has gone beyond the rational and become something of an obsession. NACAC believes it is time for all stakeholders, including institutions, rankings, bond rating companies, merit scholarships, boards of trustees, alumni, and many others, to reassess the emphasis that is placed on 'input' factors like standardized test scores, and focus on the value colleges add to students' postsecondary experiences once they are on campus, regardless of the supposed 'selectivity' of the campus."

Leaving Students Out of the Average

At Bucknell, the inaccurate data resulted from the college leaving some students' scores out of test averages. In a few cases, the omitted students had scores higher than those reported. But most of the excluded students had lower scores, so the result of leaving them out was to inflate Bucknell's averages. "[D]uring each of those seven years, the scores of 13 to 47 students were omitted from the SAT calculation, with the result being that our mean scores were reported to be 7 to 25 points higher than they actually were on the 1600-point scale," said a letter sent to the campus from John C. Bravman, the president. "During those seven years of misreported data, on average 32 students per year were omitted from the reports and our mean SAT scores were on average reported to be 16 points higher than they actually were."

The ACT scores were inaccurate in only some of those years, but for several of them the actual averages were one point lower than those reported.

Even though the inaccuracies were "relatively small," Bravman wrote that they were significant. Reporting false information "violated the trust of every student, faculty member, staff member and Bucknellian they reached. What matters is that important information conveyed on behalf of our university was inaccurate. On behalf of the entire university, I offer my sincerest apology to all Bucknellians for these violations of the integrity of Bucknell."

Bravman's letter said he was concerned that due to "national discussions about college admissions," some people "may reach the incorrect conclusion that the scores omitted were from some single cohort that people typically cite -- such as student-athletes, students from underrepresented communities, children of substantial donors, legacies and so on. All such speculation would be in error. The students came from multiple cohorts, and of course the university will not disclose their identity."

The false data were discovered after Bill Conley, a new vice president for enrollment management, noted that the mean SAT score for incoming students this year was about 20 points below last year's reported average. He then investigated, and found the pattern of false reporting.

In an interview Saturday, Bravman said that he believed a single person had been responsible for the false data. SAT and ACT scores were reported to the institutional research office in aggregate form, he said. So the institutional research officials relied on those aggregate data and never had the raw data that might have raised questions.

Bravman said that he has had discussions -- which he described as unsatisfactory -- with the person who was responsible for the reporting, and whom Bravman declined to identify. Bravman said that this person denied trying to make the university's admissions process look better either for internal or external audiences, and never offered a real explanation for what had happened.

"I'm very frustrated," Bravman said of these discussions. He said that it appeared to him to be "ignorance at best" or "incompetence at worst" in recognizing the importance of reporting accurate data.

Data on the Bucknell website have been corrected, and U.S. News & World Report, which was given the inaccurate data for rankings purposes, has been informed of the problem and given correct information, Bravman said.

In 2012, Claremont McKenna College, Emory University and George Washington University all submitted false data to U.S. News about undergraduate admissions, as did Tulane University's business school with regard to M.B.A. admissions.

Explaining the Pattern

Many admissions experts say that they are no longer surprised by these reports. (Inside Higher Ed's survey of admissions directors last year found that 91 percent believed that some institutions besides those that had been identified at the time had reported false scores or other data.) But these officials say that they are concerned about the underlying causes of these incidents, and about the impact of these scandals on the public perception of college admissions.

One longtime senior official in admissions who asked not to be identified said that the false reporting flows from the false impression that very few students get into college, and that a college's quality relates to its competitiveness. "The fact is," he said, "that there is just as much competition among colleges for students as among students for colleges." But market share and prestige are "tied to selectivity," which just adds to the pressure to be selective. This admissions official said that he suspected "that the misreporting ... is less due to deliberate deception, and more to self-rationalizing why certain students or groups of students ought not be included in a profile."

He added, however, that "there is no question that internal and external pressures to attract more applicants, accept fewer of them, and enroll more with ever-increasing scores have contributed to the angst felt by college admissions deans."

Lloyd Thacker, executive director of the Education Conservancy and a longtime critic of rankings, said via e-mail that "as long as commercial rankings are considered as part of an institution's identity, there will be pressure on college personnel to falsify ranking data. An effective way to curb such unethical and harmful behavior is for presidents and trustees to stop supporting the ranking enterprise and start promoting more meaningful measurements of educational quality."

Jerome A. Lucido, executive director of the University of Southern California Center for Enrollment Research, Policy and Practice, said that it was important to remember that outright falsifying reports was "only one way to manipulate" the rankings, and that many others are used as well. "They can also be manipulated by recruiting students who will not be admitted, by deferring to future semesters students who were not admitted for fall, and by counting faculty as teaching resources who only teach nominally or tangentially," Lucido said.

While many say that all kinds of manipulation are just "the way the game is played," Lucido said that it was "long past time to provide truly accurate public information and to concentrate on indicators of our results rather than our inputs."

Tulane M.B.A. Program Becomes 'Unranked'

Robert Morse, who leads the rankings process at U.S. News, did not respond to e-mail messages seeking his reaction to the news about Bucknell. In the past, he has said that the magazine relies on colleges to provide accurate information. The magazine has also been responding to the reports of data fabrication on a case-by-case basis.

On Thursday, the magazine announced that it was moving Tulane's M.B.A. program to "unranked" because the incorrect information the university's business school submitted had a significant impact on its ranking. When the business school recalculated its data, the M.B.A. program's average GMAT score for new students in fall 2011 was 631 (not the 670 previously reported). And when the school submitted an accurate count of the applications it received, its acceptance rate grew from 57 percent to 93 percent.

