The College Guidance Network (CGN), an informational resource hub for counselors and students, held a virtual roundtable last week to introduce AVA, the newest artificial intelligence–powered college counseling assistant.
Thousands showed up to the demonstration, many identifying themselves in the sidebar chat as high school counselors and independent admissions consultants. Before long they were flooding the chat with questions and concerns.
Brennan Barnard, director of college counseling at the Khan Lab School and the roundtable’s moderator, seemed both pleased and slightly uneasy about the turnout for the demo.
“This speaks to a moment of significant potential,” he said. “And also, I’m sure, some trepidation. But I for one am really hopeful.”
Students have been using ChatGPT and other generative AI tools to write essays for over a year now, a trend that has raised alarms but seems largely unstoppable. Even some college admissions offices have begun using AI to ease their workloads, however begrudgingly.
AVA, which will launch as a pilot this fall, is the latest AI counseling tool meant to replicate the work of a high school counselor or private admissions consultant. Proponents of the technology argue it could reduce the burden on overworked counselors and give students access to expertise and information 24-7 during the stressful application cycle. Critics worry it could be treated as a cheap substitute for high-impact counseling for the students who most need a human touch.
Angel Pérez, president of the National Association for College Admission Counseling (NACAC), lent his organization’s considerable heft to the project by partnering with CGN on AVA’s launch. He spoke at the roundtable about NACAC’s role in engaging admissions professionals to help train the bot.
“I think a lot of our members are kind of putting their heads in the sand about this issue. The truth is, we have to engage with this technology; it’s already here,” he told Inside Higher Ed. “It’s true we’re stepping into the unknown, but I would rather our profession be involved in informing this technology as it evolves. If we don’t, someone else with profit-driven, less-than-ideal motives will be the one doing this work.”
Katie Cameron, a high school counselor and assistant executive director of the Nebraska School Counselor Association, attended the roundtable out of curiosity. She has a 300-student caseload and was intrigued by the idea of using AVA to help her serve them better.
“As counselors, we do way more than just college prep,” she said. “I like the idea of it, especially if it saves us time on the simple tasks.”
Equity Hopes and Ethical Concerns
Jon Carson, CGN’s CEO, began building what would become an informational college-advising resource for students and families in 2019, when he went through what he called the “terrible experience” of helping his son apply to colleges.
“We were flying without instruments,” he told attendees at the virtual roundtable. “There were grotesque inequities: side doors, back doors, expensive consultants; it was hard to get the advice we needed … The expectation seemed to be that our 17-year-old was going to navigate this solo.”
The purpose of AVA, he said, is to “democratize advising.”
Currently, students at well-resourced public high schools and private boarding schools get frequent help from counselors with only a few dozen students on their caseloads, while students at cash-strapped public schools are lucky to schedule one meeting a year. AI chatbots can serve families with limited access to counseling services, Carson argued, which will help close the massive equity gap in college counseling.
AVA is also trained in multiple languages—Carson said it would launch with one or two dozen options—making it a potential game-changer for immigrant families who may struggle with language barriers in the application process.
Royel Johnson, an associate professor at the University of Southern California’s Rossier School of Education, said he sees the potential upside of introducing a tool like AVA to under-resourced high schools, especially if it’s offered by districts as a free resource for families.
But he also cautioned against entrusting AI to be inclusive in its advice and sensitive to students’ lived identities.
“These tools, when you try to make them colorblind, often end up exhibiting some form of racial bias,” he said. “They need to be trained for racial sensitivity, which is a very difficult undertaking.”
More than anything, Johnson worries that AI chatbots could exacerbate existing disparities in college counseling, especially if students who are less engaged in the admissions process are routinely redirected to the bot while higher-income, highly motivated students are granted access to more intensive human advising.
“The students who AI counselors are aimed at serving are also the ones in most need of contextualized, high-contact advising,” he said. “There are so many perils and promises here.”
Ethical concerns also hang over the enterprise. A frequently asked question at CGN’s roundtable concerned student data privacy, an issue the hosts seemed to be actively working on. The basic notion of AI’s place in the admissions process is hotly debated as well.
NACAC is partnering with several universities to form an AI ethics committee, which Pérez said would address ethical questions such as “Should counselors use AI to write recommendation letters?” (his answer: “They already are!”) and “Should students use AI to help outline their admissions essays?”
Carson said CGN is recruiting counselors to help refine AVA, aiming to build a “community of practice” of volunteers who will test the tool and send feedback.
“We don’t have all the answers, and this is the best way to build something folks feel they can trust,” he said at the roundtable. “Because that’s who this is for.”
More Than Just a Chatbot?
AVA, Carson noted, is not a replacement for counselors. It is not intended to curate individualized college lists or reassure first-gen students that they belong in a lecture hall. That’s human work, he said, and can’t be replicated by any AI, no matter how well trained.
But AVA can answer foundational questions about financial aid and application requirements, or help a student find the right framing device for the essay. In that way, AVA is more like a streamlined, reliable resource for frequently asked questions, Barnard said—a way to get students on the path toward college and free up counselors’ time.
On CGN’s website, AVA is referred to as “the first and only AI counseling assistant for students and families.” But there’s also Ivy, a comprehensive, generative AI counselor from the educational consulting and technology company CollegeVine.
CollegeVine co-founder Vinay Bhaskara drew a fine but important distinction between AVA and Ivy: the former, he said, is essentially a “chatbot with expertise,” a characterization he insisted was not meant to be belittling, whereas Ivy is a “personalized counseling system.”
At the roundtable, Carson said AVA was trained on the knowledge of hundreds of experts across 110 topics in college admissions. Ivy was developed with input from admissions experts as well, but it takes its cues primarily from student members’ individual CollegeVine profiles, Bhaskara said, which record their interests and aspirations while keeping track of deadlines and to-do lists during the application cycle. Ivy is also trained to be conversationally intelligent; it remembers previous discussions with students and brings them up again when relevant.
Inside Higher Ed received a private demonstration of Ivy last fall; based on demos of both AVA and Ivy, this reporter can say the distinction seems accurate.
“Because [Ivy] is integrated into the network, it knows you better,” Bhaskara said. “It’s totally different than ChatGPT. It’s offering something unique.”
That something sounds an awful lot like what a human counselor offers: personalized service, emotionally intelligent advising, a rapport that deepens with time. Bhaskara, like Carson, insists that his tool is meant to help counselors, not replace them. But he said it’s not a bad thing that AI can replicate the most essential parts of the job.
“AI has to be part of the future of this field,” he said. “The system has been calling out for more capacity for 20 years. But that’s not going to be solved with chatbots; it will be with comprehensive tools like ours.”
Cameron, the counselor from Nebraska, tried Ivy out after she received an ad in her inbox last fall, in the midst of a particularly hectic application season. She was faced with dozens of requests for recommendation letters, which she said often took an hour each to write; Ivy, she said, cut that down to mere minutes.
But Cameron isn’t too worried about AI taking her job, and neither are the members of the Nebraska School Counselor Association she helps lead. Anything that reduces counselors’ workloads and helps her students, she said, is worth trying.
The rest is just static.