A Guest Contribution to the Reality Check Blog by Arthur M. Cohen and Pam Schuetz*

Whether it goes by the name of exaggeration, half-truth, misrepresentation, distortion, or dissembling, lying is endemic in all of education. Lies vary in intent and magnitude -- but they tend to escalate under pressures of accountability. Over the past few years, state government efforts to link K-12 school funding and teacher and administrator tenure with student progress have led to widespread data falsifications. In at least half of the states, K-12 teachers and administrators have made test questions available in advance or changed students' incorrect responses. This form of cheating is predictable: The more that data are used in decision-making -- especially fiscal allocations -- the more they will be subject to corruption.

Dissimulation in higher education takes a different turn, with the most egregious assaults on the truth found among institutions competing with one another for funding and the intangibles of prestige. For example, most liberal arts colleges and universities, along with law schools, medical schools, and other specialized programs, regularly supply publications such as Forbes, The Princeton Review, Kiplinger, and U.S. News & World Report with data describing entering students' test scores, campus amenities, student-faculty ratios, alumni giving, applicant yield, retention, and graduation rates. These publications then aggregate the data into category and overall scores to rank institutions and programs. This system of ranking appeals to the public infatuation with seeing at a glance which institutions stand higher or lower than others, much like the football and basketball rankings carried in sports magazines and many daily newspapers.

It also gives rise to some flagrantly dishonest practices. One of the ranking criteria, for example, is the percentage of applicants an institution admits: A lower number is presumed to indicate a more selective school. Some colleges inflate the number of applicants by recruiting vigorously through advertisements; they may even print application forms in newspapers and send them in bulk to secondary schools. Then, when they reject a sizable number of applicants, sometimes more than 90 percent, they rise on the selectivity index. Other gambits include SAT-optional admissions, under which applicants decide whether to submit their test scores; most often only the highest-scoring applicants choose to do so. And for years some institutions have omitted the SAT scores of athletes, disadvantaged applicants, or simply those near the bottom of the scale before computing and publishing averages for their entering classes.

More blatant falsification of data occurs as well. Recently, Claremont McKenna College (California) admitted that an admissions dean had inflated SAT averages submitted to various government and ranking entities over the previous six years, fueling its rise from 11th to 9th place in the 2012 U.S. News rankings of liberal arts colleges. Similarly, Iona College (New York) found that a former provost had misreported student data to government agencies and widely circulated publications for almost a decade, inflating freshman SAT scores by 15 percent and four-year graduation rates by 20 percent, and understating the student-faculty ratio by 13 percent. And fifteen law schools were sued last year for misleading applicants about job prospects after graduation by reporting 90-plus percent graduate employment rates that included part-time, temporary, or other jobs not requiring a J.D. Some of these schools reported starting salaries for recent graduates as high as $160,000 per year -- comparable to salaries of graduates from Harvard and Yale -- rather than the much lower salaries actually earned by their own beginning lawyers. On the strength of such misleading data and rankings, many prospective students choose colleges and programs of study (some borrowing heavily to pay for the privilege of attending presumptively better institutions) and may receive neither the education nor the employment opportunities promised.

Community colleges are less susceptible to the rankings game. Their funding has historically been driven by student enrollment numbers rather than by measures of institutional quality or reputation. But as state support declines and student populations grow, funders and accreditors are demanding more proof of student success. Reporting on the number of credentials completed; rates of retention, graduation, and transfer to baccalaureate institutions; student progress through remedial classes; value-added learning outcomes; and rates of employment in the fields for which students have prepared is either already required or on the near horizon.

These data have the appearance of precision, but the devil is in the details. Success rates for full-time students? Part-timers? Entrants just out of high school? Transfers from other colleges? Older students? Job-placement rates for new entrants or for people who formerly worked in the field? Outcomes tend to vary by student characteristics and goals as well as by the ways in which institutional effectiveness is measured. Although community colleges do not participate in national ranking systems, they still report exaggerated data in attempts to demonstrate their value.

Over the last decade, several hundred so-called economic impact studies have been conducted for community colleges nationwide, putting implausibly high dollar values on the taxes paid by alumni and on the medical costs averted by their more healthful lifestyles. A study published by Long Beach City College (California) presented some bizarre statistical manipulations to estimate its fiscal benefits. One of its conclusions was that because 94 percent of California's prison population had not attended college, and since 16,000 people were enrolled at the college, the community was saving $18,000 per student per year, the difference in cost between keeping a person in college and keeping that person in jail, for a total of $288 million. (All one had to do to applaud that number was to assume that the college's entire student body would have been incarcerated had they not been attending classes.) In more direct evidence of cheating, the head of Edison College (Florida) admitted that an administrator had doctored transcripts to show that students had taken the necessary prerequisites for transfer to the college's baccalaureate programs, thus inflating the numbers as a way of justifying recent additions to the curriculum.
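The arithmetic behind the Long Beach claim is easy to reproduce, and just as easy to break. Here is a minimal sketch using the figures reported above; the 1 percent counterfactual incarceration rate in the second step is purely our illustrative assumption, not a figure from the study:

```python
# Implied arithmetic behind the Long Beach City College claim,
# using the figures reported in the study summarized above.
students_enrolled = 16_000      # students enrolled at the college
savings_per_student = 18_000    # claimed annual cost gap, jail vs. college ($)

# The study multiplies straight through...
claimed_savings = students_enrolled * savings_per_student
print(f"Claimed annual savings: ${claimed_savings:,}")  # $288,000,000

# ...which holds only if every enrolled student would otherwise be in jail.
# Under a less heroic assumption -- say, that 1 in 100 students would be
# incarcerated (an illustrative rate, not data) -- the figure collapses:
illustrative_rate = 0.01
adjusted_savings = claimed_savings * illustrative_rate
print(f"Savings at a 1% incarceration rate: ${adjusted_savings:,.0f}")
```

The point of the sketch is that the headline number is a multiplication away from its single, unstated assumption; change that assumption and the claimed benefit shrinks a hundredfold.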

The imprecision of reporting criteria offers one avenue for cheating, but self-reporting affords the widest loophole. Few external regulators or other data recipients have sufficient staff to audit the information coming from any of the colleges. And when inaccuracies are detected, the reports must be redone, but penalties or fines for misreporting data are rare. In cases generating the most adverse publicity, the problem is "fixed" with the demotion or dismissal of an administrator and pledges of future transparency. The problem, however, runs much deeper. Pressures to distort community college performance data are becoming as pervasive as in other sectors of education, and for the same reasons: the desire to aggrandize the institution and to maintain the appearance of conforming to regulations while garnering better students and more resources. Community colleges that do not aggressively seek to move ahead of their peers may fear reduced state funding, alumni donations, and applications from highly qualified students.

Simply requiring new institutional measures of student success will increase systemic mendacity. Therefore, as such measures are developed, researchers and practitioners must also develop authentic auditing regulations and procedures with enough teeth to ensure comparable, dependable reporting.

*Arthur M. Cohen is Emeritus Professor of Higher Education, UCLA, and a Member of the Advisory Board of The Center for Measuring University Performance, Arizona State University.
Pam Schuetz is Research Associate, Northwestern University.
