It's hardly shocking that a new federal report on adult literacy finds that the more formal education Americans have, the better they do on tests that measure practical literacy. "The National Assessment of Adult Literacy," released Thursday by the Education Department's National Center for Education Statistics, shows that citizens with a college education were significantly better able than their peers to understand and analyze the information they confront in their everyday lives. So, as Grover J. (Russ) Whitehurst, director of the Institute for Education Sciences, put it at a news conference Thursday: "Education works -- that's a good thing."

But at a time when colleges and universities are under the microscope and policy makers are increasingly seeking to measure the student "outcomes" that they are producing, the report is hardly a pat on the back for higher education.

Not only does it find that the average literacy of college educated Americans declined significantly from 1992 to 2003, but it also reveals that just 25 percent of college graduates -- and only 31 percent of those with at least some graduate studies -- scored high enough on the tests to be deemed "proficient" from a literacy standpoint, which the government defines as "using printed and written information to function in society, to achieve one's goals, and to develop one's knowledge and potential."

"This seems like another piece of hard evidence, a fairly clear indication, that the 'value added' that higher education gave to students didn't improve, and maybe declined, over this period," said Charles Miller, the former University of Texas regent who is heading the U.S. education secretary's Commission on the Future of Higher Education. "You have the possibility of people going through schools, getting a piece of paper for sitting in class a certain amount, and we don't know whether they're getting what they need. This is a fair sign that there are some problems here."

The report, which extrapolates its findings from a survey of 19,000 Americans aged 16 and up, aims to measure what the commissioner of the National Center for Education Statistics, Mark Schneider, called "reading for purpose" -- how well citizens can process information to do what's necessary to work and live (sample questions are available here).

It assesses three types of literacy: "prose literacy," the ability to comprehend continuous texts, like newspaper articles and the brochure that comes with a new microwave; "document literacy," the ability to understand and use documents to perform tasks, like reading a map or prescription labels; and "quantitative literacy," the skills needed to do things like balance a checkbook or calculate the interest on a loan from an advertisement.

Based on their scores, participants in the survey were deemed to have "basic," "intermediate" or "proficient" literacy (Whitehurst noted that a National Research Council committee that recommended the literacy levels initially called the highest level "advanced," but that department officials ultimately concluded that the skills required for that category -- comparing viewpoints in two editorials, for instance, or calculating the cost per ounce of different grocery items -- weren't really all that advanced.)

Over all, the average prose and document literacy scores for Americans were basically flat between 1992 and 2003, though the scores on quantitative literacy rose from an average of 275 to 283, out of a maximum of 500. The scores of women rose in two of the three categories (document and quantitative literacy) over that period, while those for men fell in two of the three (rising only in quantitative). Scores for black Americans rose, while those for Hispanics declined.

Scores rose as one moved up in educational attainment, as the table below, examining prose literacy, shows. But the table also shows that scores fell from 1992 to 2003 for virtually every educational level, and the declines were steepest, by and large, the further up the ladder one moved. The contrast was even steeper in the realm of document literacy. Scores declined by three points or less for those who had at most a high school degree, while the average document literacy score for college graduates dropped by 14 points, to 303 from 317, and by 17 points for those with some graduate education (to 311 from 328).

Average Prose Literacy Scores by Education Level, 1992 and 2003

Education level            1992  2003
Still in high school        268   262
Some high school            216   207
GED/equivalency             265   260
High school graduate        268   262
Vocational/trade school     278   268
Some college                292   287
Associate degree            306   298
College graduate            325   314
Graduate studies/degree     340   327

As the raw scores have declined over time, so too have the proportions of the college educated who proved themselves "proficient" on the literacy tests, the study finds. Thirty-one percent of college graduates tested as proficient in prose literacy in 2003, down from 40 percent in 1992; the proportion proficient in document literacy was 25 percent in 2003, down from 37 percent in 1992. Among those with at least some graduate school, 31 percent were proficient in document literacy in 2003, down from 45 percent in 1992.

Miller, Whitehurst and college officials offered a range of possible explanations for numbers that all of them viewed as troubling. Several of them cited societal factors such as declining interest in reading and a culture that increasingly "takes as heroes people who dropped out of school in eighth grade and made a gazillion dollars," as Ross Miller, director of programs at the American Association of Colleges and Universities, put it. Ross Miller said it was "hard not to be embarrassed by the data."

Others noted that significantly more Americans are involved in higher education today than was the case a decade ago -- 12 percent of the 2003 population were college graduates, compared to 10 percent in the 1992 study, for instance, an increase of about 4 million people -- and most of that growth has come among populations that tend to be academically underprepared. "It doesn't take a genius to look at the test scores that a lot of our urban schools produce to know something's not quite right there," said Ross Miller, though he made clear that he was not trying to transfer blame entirely to elementary and secondary schools.

He and other researchers agreed there was significantly more work to be done to determine whether (a) colleges are taking students who have been significantly underprepared by their previous schools, (b) the colleges are failing to catch those students up, or (c) both. The fact that Americans as a group are getting more formal education at higher levels "should have pushed up the levels of literacy in the country," said Schneider of NCES. Why it has not, he said, "gives us pause."

Like other commentators, Doug Hesse, a professor of English and head of the honors program at Illinois State University, who is active in the National Council of Teachers of English, has some theories. "This is exactly emblematic of what's going on in our culture now," he said, in that students (like most of us) are barraged with "flashes and bits of material" -- "here's a sound bite, here's a factoid" -- and "not really much asked to use the information or analyze it in some way."

Hesse also cited "sobering" data about the amount of time students spend on their studies. One study at Illinois State found that honors students were assigned an average of fewer than 50 pages of reading a week, and that two of five students acknowledged completing less than half of that work. "Students seem to spend a lot of time on Facebook, and when you think about the literate practices involved in Facebook, that's probably not contributing a lot to the scores on something like this literacy test," he said.

Charles Miller, the head of the federal higher education commission, said it was impossible to know for sure whether the damning data in the literacy report necessarily mean that colleges are doing too little to prepare their graduates to think for themselves. But what seems evident, he said, is that colleges need to be able to measure how much they are contributing to students' knowledge -- which they can do only by more consistently testing what their graduates know and have learned.

"We don't have a clue what they're really learning if you don't measure it," he said.
