
Keene State College released its first placement survey of a graduating class this fall, with plans to track graduates’ post-collegiate career outcomes annually. Of the 235 graduates from the class of 2012 who provided usable survey responses, 94 percent reported being employed or engaged in further education.

At Syracuse University, where outcome surveys have been administered for at least 30 years, 84 percent of the class of 2012 was working or attending graduate school.

Any comparison between the two institutions -- or between virtually any two colleges -- would be futile.

As just one example among several, “placement” at Keene State means that a graduate has found full- or part-time employment, a paid or unpaid internship, is a full- or part-time student in graduate school, or is completing a year of service.

At Syracuse, only a graduate who secures full-time employment or enters graduate school as a full-time student is considered placed. Each of the reports, however, was generated from responses from fewer than half of the institution's graduating class. At Syracuse University, 43 percent of the class of 2012 responded; at Keene State, 38 percent.

Defining Success Broadly
Gallup is pitching another approach to measuring graduates' quality of life (and colleges' success in producing them). See related article.

With tuition prices continuing to rise and greater numbers of graduates struggling in the job market, families, students and policy makers -- most visibly President Obama -- are increasingly questioning the “value” that colleges are providing.

In response to that mounting pressure, more colleges and universities are turning to alumni outcome surveys. But widely differing survey methods and varying interpretations make it hard for students or national policy makers to draw conclusions about how a college degree from one institution prepares graduates for the workplace compared to a degree from another college.

Many directors of career service programs recognize the flaws in outcome surveys, but say there’s no choice but to continue the surveys to appease the public while working to improve their quality and accuracy. Some national higher education organizations are developing guidelines so surveys can be used to compare one college to another. And individual colleges are revamping their surveys to gather more data and thoroughly analyze the results.

 
Proposed Best Practices

Best practice: Collect data from various sources (like LinkedIn) instead of relying on students’ survey responses.
Syracuse University: Used an online survey and sent reminder emails.
Keene State College: Used an online survey and reminded students to respond via emails and phone calls.

Best practice: Collect data until about six months post-graduation.
Syracuse University: Stops gathering data at the six-month mark.
Keene State College: Stopped surveying the class of 2012 in March 2013.

Best practice: Provide information about average salaries within colleges and majors.
Syracuse University: Includes average salary for the entire class of 2012 and for individual colleges and schools.
Keene State College: Does not include.

Best practice: Break down data by colleges or majors.
Syracuse University: Breaks down data for nine colleges and schools.
Keene State College: Plans to break down data by majors in future surveys.

Best practice: Provide examples of job titles, graduate schools or companies.
Syracuse University: Yes, for each college and school.
Keene State College: Does not include.

External Pressure

Federal lawmakers and national policy groups have called for laws that would require colleges to provide data on the post-collegiate lives of their alumni. This August, the Obama administration announced its plan to create a federal college rating system. The system could use measures such as the proportion of students receiving Pell Grants, the average amount of tuition, scholarships and loan debt, graduation and transfer rates, the salaries of graduates and the extent to which graduates get jobs and pursue advanced degrees. Education Secretary Arne Duncan has said he wants to have a final ratings system by December 2014, and to launch the system in the 2015 academic year.

Critics of the proposal question the availability and quality of data that would be used to measure colleges.

Previously, some members of Congress and think tanks called for legislation that would require colleges to release similar information, as well as for a student-level “unit record” system that would track graduates as they entered the work force. The 2008 renewal of the Higher Education Act included a provision that forbade the creation of such a federal data system, but that could be revisited when Congress reauthorizes the law, a process that began this fall.

Keene State’s director of institutional research, Cathryn Turrentine, says the college began its survey to stay ahead of that curve. Its Board of Trustees expects to see the employment rates of graduates, she says, and the college anticipates that the federal government will eventually require colleges to report outcomes.  In the future, the survey will ask about salary, as requested by the governing board, she says. The college also plans to survey alumni five years after they graduate to develop a fuller picture of students’ lives after they leave Keene State.

In its first iteration, Keene State’s report lists the proportion of graduates in each type of employment, as well as responses to questions about whether graduates’ jobs are closely or somewhat related to their studies and what other career-related experiences, like volunteer work or research, they completed while in college.

About 75 percent of employed respondents said their current position is either somewhat or closely related to their studies at Keene State. Of the respondents seeking postgraduate education, 87 percent felt that their studies at Keene had prepared them well or very well. When asked what career-related activities they participated in at Keene, respondents said part-time employment (43 percent), summer jobs (33 percent), paid or unpaid internships (26 percent) and student teaching (21 percent), among others.

The survey also asked recent alumni to describe other activities they are finding meaningful, aside from employment and further education. The college listed students’ answers, which included various hobbies, like art or beer brewing, as well as volunteer work and physical exercise.

The report from Syracuse is 82 pages longer than its counterpart from Keene State, but the two have much in common. Both reports ask graduates whether their work relates to their career goals (Syracuse) or their field of study (Keene State), and when and how they found a job.

Syracuse’s placement report goes quite a bit further. It details students’ placement type (full-time employment, part-time employment, internship, still looking, graduate school, or not seeking employment), provides the average salaries by school and college, and includes the companies or graduate schools where alumni are employed or enrolled, as well as a geographic summary of where alumni work.

The School of Architecture reported the highest placement rate, 97 percent, compared with a 69 percent placement rate for recent graduates of the College of Visual and Performing Arts. Even then, the comparisons are not direct. Only 27 percent of graduates from the College of Visual and Performing Arts responded to the survey, compared with 69 percent of graduates from the School of Architecture.

Comprehensive as it is, Syracuse’s survey is being re-examined. Career Services Director Mike Cahill says the university is reworking its survey this year, recognizing a “need to beef everything up.”

While the overhaul stems from increased pressure to show outcomes data, Cahill says he uses the survey to illustrate to prospective students and their parents that a student’s major does not predict job outcomes. Then, he says, he transitions into a conversation about how the university helps students prepare for life after graduation.

Administrators at DePaul University likewise have discussed how to increase the usefulness of the data the university collects and provides. Its most recent report, for the class of 2012, says 85 percent of graduates have found employment or are pursuing further education. Like Syracuse, DePaul provides separate reports for graduates of different colleges. It also lists the average salaries for each major, as well as a list of sample employers and job titles.

Career Center Director Gillian Steele says the university looks for trends in employment — like a move toward self-employment or the likelihood that internships led to jobs — to inform the services DePaul offers students. Internally, DePaul pulls data on specific groups of students, broken down by demographics such as race or first-generation status, to make sure it is helping all students, she says.

More Surveys – and Differences

As job placement surveys proliferate, so do the differences among them. Some colleges survey students before graduation, others as part of an exit checklist, and others six months after graduation. Not all surveys ask the same questions: Some are concerned with students’ opinions on the value of their education; others want to know specifics of students’ jobs or graduate programs.

Collected data, too, are summarized differently. At one university, a “placed” student may be anyone who is employed -- whether in a full-time job, a part-time job or an internship. Some colleges ask whether students are using their degree in their new job. Other universities collect enough responses to break down data by specific programs.

Many campus officials concede that the differences limit the usefulness of the data. “It makes comparing information across universities or for any group to speak on behalf of higher education communities literally impossible,” says Manny Contomanolis, director of Rochester Institute of Technology’s office of cooperative education and career services.

Colleges and universities need a standard model that includes consistent definitions for what the survey measures as well as timelines for conducting outcome surveys so national groups and prospective students can compare outcomes at various colleges, he says.

Increasing the comparability of the job placement reports wouldn’t solve all the problems with them, others say.

Jeff Strohl, director of research at Georgetown University’s Center on Education and the Workforce, questions whether surveys accurately portray recent graduates’ lives.

“We can imagine that people who place well are more likely to brag and report,” he says. “Those who are working at Starbucks or dog-walking might not.” (Turrentine of Keene State says a follow-up analysis she conducted of non-responders found no statistical difference between those who responded to the placement survey and those who did not.)

Any placement reports for those one year out of college are inconsequential, says Phil Gardner, director of the College Employment Research Institute at Michigan State University. Today’s graduates are likely to hold several different jobs before establishing a long-term career about five to seven years after graduation, he says.

Gardner recommends that universities collect data at least five years past graduation to determine whether students are benefiting from their education. But he doesn’t anticipate seeing a shift to longitudinal analysis, which can take time and money, when one-year-out surveys seem to appease the public and politicians.

“That’s what everybody is using because it’s probably the easiest number to get and you can manipulate it,” he says. “They’ve jumped on this because they think it’s easy and they think it tells them something. But it really doesn’t.”

Strohl agrees that those looking at survey results should be wary. He says he would “discount automatically” the average salary reports of graduates, as larger universities with more programs are going to see greater variations between the earnings of majors.

National Standards?

Some national organizations, including the National Association of Colleges and Employers and the National Career Development Association, are looking to release recommendations for outcome surveys that balance individual colleges’ needs with the ability to make some comparisons between colleges.

“One of the things we need to be somewhat cautious about is there seems to be a lot of emphasis on placement and salary, and higher education outcomes are much broader than that,” says Lisa Severy, president of the national career development association. “But because these are concrete and measurable, they tend to come up first.”

After releasing standards, NACE would want colleges and universities to provide survey results to the national organization so it could conduct its own analyses, says RIT’s Contomanolis, who heads a NACE task force on job placement surveys. Among the best practices advocated by Contomanolis is a move toward “knowledge rates” instead of response rates.

Rather than relying on students to fill out surveys, he argues, colleges can gain a better sense of students’ outcomes by using faculty members’ and employers’ reports on students’ post-college paths, as well as what students are reporting on professional networking sites like LinkedIn. Contomanolis recommends that information-gathering efforts continue until six months after graduation.

At career centers, many administrators are aware of their surveys’ shortcomings, and as the squeeze for data continues, some are fine-tuning their methods.

The University of Colorado at Boulder, where Severy is director of career services, has hired a research program manager to conduct a project on graduates’ destinations. The aim is to get a broader idea of what students are doing when they graduate, she says. The university gives some general information about its graduates and wants to be able to answer more questions — from students or national higher education policy analysts — about its graduates.

Some states are involved with outcome data as well, most notably Virginia, where the State Council of Higher Education compiled data on graduates’ earnings 18 months following degree completion and data on earnings of some graduates at five years post-graduation.

The “first generation” of these reports is being used to establish a fact base, says Tod Massa, director of policy research and data warehousing at the State Council of Higher Education for Virginia. The effort is a “refined, albeit limited” look at the earnings from graduates at specific institutions or in specific majors, Massa says, and students should use the tool to understand what earnings range a certain degree could put them in. But it won't tell them much of anything, he says, about the effectiveness of a university.
