
Success in campus internationalization efforts “is most often measured in the amount of activity, or in the inputs,” said Christa Olson, associate director of international initiatives for the American Council on Education. How many globally themed courses does a college offer, for example, or how many study abroad opportunities?

Then there’s the most commonly cited metric these days: “the number of bodies going out the door,” as Michael Vande Berg, vice president for academic affairs and chief academic officer for the Council on International Educational Exchange (CIEE), put it. “We continue to see college and university presidents who are fascinated by the notion of sending 20, 30, 40 percent of their students abroad,” Vande Berg said. But is this single-minded focus on bodily inputs -- not learning outcomes -- among the factors fueling skepticism about the value of the study abroad endeavor?

A forum sponsored by the U.S. Department of Education’s International Education Programs Service (IEPS) late last week included several sessions on assessment and international education (not surprisingly, given the Bush administration's focus on accountability in higher education generally). Attendees at a Friday morning panel described increasing pressure to communicate the learning outcomes of study abroad to accreditors and granting agencies. Internally, they described a need to assess programs not only to improve their own offerings and demonstrate the programs' worth to college administrators, but also, as one participant put it, to gain ammunition to “de-list” programs offered by affiliated outside providers that aren’t meeting a college’s standards.

At that session and another on Friday afternoon, speakers described various approaches to assessing not only study abroad programs, but international education efforts more generally. “The growth of studies in this area has been breathtaking,” said Vande Berg, who estimated that more than 1,000 studies on student learning abroad will be published this decade.

But, while Vande Berg stressed that international education office staffers without backgrounds in research methodology can team up with social scientists to carry out publishable studies, developing an internal culture of ongoing, systematic assessment is also important, said Jonathan Gordon, director of the Office of Assessment and an adjunct professor in the Sam Nunn School of International Affairs at Georgia Institute of Technology. In a recent survey of Georgia Tech alumni, graduates who studied abroad reported that they felt better prepared to find a job, were happier with their progress in the working world and were making more money. Finding out such information, Gordon said, can be a matter of just asking institutional researchers to add a question or two to the surveys they’re already sending out.

When it comes to evaluating internationalization efforts on a campus-wide level, Olson, of ACE, described a tool the association developed, in collaboration with six institutions (two universities, two liberal arts colleges and two community colleges), to assess progress toward nine learning outcomes. (Among them: “A globally competent student graduating from our institution ... Understands his culture within a global and comparative context ... Uses knowledge, diverse cultural frames of reference, and alternate perspectives to think critically and solve problems ... Accepts cultural differences and tolerates cultural ambiguity.”)

The project group developed a combination survey/ePortfolio approach to gauge progress toward those outcomes. Students can include a variety of “artifacts” -- writing samples, art projects, whatever else might be deemed to demonstrate learning -- in the portfolio, which is then evaluated using a rubric ACE developed. By combining the ePortfolio evaluations with data from the survey, colleges can determine, for instance, whether students from a particular ethnic background on a particular study abroad program met the designated international learning outcomes, Olson explained.

“The very strengths of this approach are in some ways its weakness,” she said. On the one hand, it allows for maximum flexibility. On the other, “How do you deal with all this information? How do you make it manageable?”

Other presenters Friday described painstaking efforts to quantify not only growth in intercultural competence, but also, of course, mastery of foreign languages. Steven Poulos, director of the South Asia Language Resource Center at the University of Chicago, is spearheading an effort to develop online assessments in Hindi and Urdu using STAMP, an adaptive testing technology developed at the University of Oregon. Previously, Poulos said, virtually no common tests were available in South Asian languages (an ACTFL oral proficiency interview in Hindi being one exception).

Common testing is needed, or at least potentially useful, Poulos said, not only for placement in study abroad and for jobs in business and government, but also for aiding inexperienced instructors. With colleges regularly hiring instructors of less commonly taught languages who have limited (or no) teaching experience -- and often on campuses where "they have zero assistance in teaching" -- a common test at least gives them a guide to work backward from, Poulos said.

Poulos said that faculty working on the project are currently developing Hindi reading and listening tests and an Urdu reading assessment. The plan is to create tests in listening, reading, speaking and writing in both languages. But, Poulos asked, when it comes to developing assessments in less commonly taught languages more generally, who is going to do it, and how? “The costs are extraordinary per student,” he said. The pool of people able to develop such tests is limited, and the number of people who would take them would be limited, too.

On a similar note, panelists Friday called attention to some of the missing data links. Vande Berg noted a dramatic dearth of discipline-specific data: How does study abroad contribute to educational outcomes in a particular field? And one attendee at Friday morning’s session asked, amid all the talk of evaluating the growth of American students going abroad, where are the studies measuring the impact of those students on their host locales?

In response, Tamera Marko, outreach coordinator for the Consortium for Latin American and Caribbean Studies at Duke University and the University of North Carolina at Chapel Hill, described a "Duke Engage" program in Colombia that the consortium is piloting this summer. Program leaders will collect data on each student's contribution to curriculum development projects at local libraries, in addition to collecting impressions from the Duke students, their “buddies” at a nearby Colombian university, and host families, all in parallel blogs. (Students won't have access to the latter two blogs, at least not immediately, and the details of how that information will be shared are still being worked out, Marko said.) Brian Whalen, president and CEO of the Forum on Education Abroad, mentioned that the journal Frontiers recently published an article surveying home stay families.

But over all, said Celeste Kinginger, an associate professor of applied linguistics and French at Pennsylvania State University, the perspectives of home stay families and other local residents are virtually absent from the study abroad literature.

“This is a very big problem,” Kinginger said. We’re depending only on the perspectives of students who have just recently arrived abroad, she said -- “who by definition know nothing about what’s going on around them.”
