Barnard launches AI for staff open enrollment, plans campuswide implementation of ‘concierge’ tool
- Kimberly Wing
Executive Vice President for Strategy and Chief Administrative Officer Kelli Murray introduced “Millie,” an AI program that “will eventually serve as a campuswide concierge.” Some Barnard employees expressed concerns about Millie and the College’s approach to AI.

Photo by Gabriela Valentin/The Barnard Bulletin
December 8, 2025
During Barnard’s open enrollment period for faculty and non-union staff to sign up for, change, or renew their health benefits, Executive Vice President for Strategy and Chief Administrative Officer Kelli Murray announced the launch of an AI pilot program called “Millie,” which would “help [employees] think through [their] health plan options.”
Introduced on November 6, Millie was trained on “everything about [Barnard’s] employee benefits” offered through the College’s insurance provider, Marshall+Sterling. Millie’s implementation during open enrollment was meant to test “[its] ability to provide resources across the College in the future.” Faculty and staff were directed to contact Millie through a designated email address and phone number.
Murray noted that engaging with Millie was “completely optional” and discouraged “[sharing] confidential medical history or anticipated medical needs” with the tool. Staff could also opt out of “helpful reminders via email, phone, and text” from Millie.
In a statement to The Bulletin, a Barnard spokesperson reported that “hundreds of employees used Millie at all hours of the day and night to instantaneously receive information and make informed decisions about their benefits.”
Murray also wrote that “Millie will eventually serve as a campus-wide concierge,” which would “connect [the Barnard community] to resources across many areas of the College.” An email that formally introduced Millie to faculty and staff on November 10 repeated this sentiment, stating that in the future, the AI would be “ready to help with questions and resources across all areas of Barnard life.”
Murray is the chair of the Barnard Artificial Intelligence Working Group, a group of 11 faculty and staff members who develop and recommend AI-related governance “through shared knowledge and pilot projects across the campus.” On November 24, the group invited students to participate in a survey that covered “experiences, expectations, and needs related to AI.” In addition to the survey, campuswide access to Gemini and NotebookLM and the introduction of Millie reflect Barnard’s goal to “develop campus-wide standards of AI literacy.”
The College continued to promote Millie in various email reminders about open enrollment throughout November.
However, Millie was met with concerns from faculty and staff, some of whom believe that the College’s pursuit of its goal to “engage with new tech” has reduced College programming and contributed to staff layoffs.
Nancy Worman, classics professor and chair of the Faculty Governance and Procedures Committee, stated that although Millie “strives to provide accurate and timely information,” faculty expressed concerns about the tool’s disclaimer to “always confirm details directly through official College resources such as the Benefit Guide, Workday, or the Human Resources Office before making any decisions.”
“This hardly inspires confidence – even more, it raises the question of what purpose this AI device actually serves, since it cannot be guaranteed to be accurate,” Dr. Worman wrote in a statement to The Bulletin.
“When it comes to AI, I always say that it is crucial to exercise thoughtful intention. That is one thing that AI can’t do, yet it is a cornerstone of human intelligence,” neuroscience professor Gabrielle Gutierrez (BC ’06) told The Bulletin.
“I’m concerned that Barnard is moving too fast and without enough thoughtful intention … I believe that we have a real opportunity at Barnard to live up to our mission and to engage AI with a uniquely critical and creative lens,” Professor Gutierrez continued. “Can you imagine a world where Barnard graduates take the lead in steering us towards an ethical and socially responsible technological future? That’s what I work for and teach for, but it feels like the administration is disconnected from that vision.”
In a statement to The Bulletin, Saima Akhtar, the Senior Associate Director of the Vagelos Computational Science Center (CSC), expressed her “[appreciation] that Barnard has begun developing an AI Working Group.” Akhtar noted, however, that the CSC was not consulted during the committee’s formation, nor were they invited to or included in the group’s meetings.
“For a topic as consequential as campus-wide AI policy, it’s essential that the committee’s membership, mission, and ongoing work be transparent to the broader Barnard community. Right now, many students, faculty, and staff are unsure what the committee is actually addressing,” Akhtar wrote.
“There are important questions I hope the committee will address openly, including how Barnard’s use of Google Gemini is being managed outside strictly FERPA-protected contexts, the terms of the partnership between Google and Barnard, how campus data is being stored and protected, and how emerging policies will guide AI use beyond the classroom,” she continued. “This includes privacy considerations around AI-assisted note taking in staff and faculty meetings, the introduction of automation tools in workplace processes and their implications for human labor, and broader questions of security and responsible technology use across campus.”
A staff member who wished to remain anonymous expressed concern about Murray’s qualifications as chair of the AI Working Group.
“The folks in ATLIS [Academic Technologies and Learning Innovation Services] have the expertise, but it seems like it’s Kelli Murray, who has no tech on her CV that I’ve observed, who’s driving this thing,” they said. “Is Barnard IT even involved? Are [the members of the AI Working Group] qualified?”
The staff member continued, “I think the majority of our students, faculty and non-senior staff are more interested in a culture of learning how to do [things] ourselves or together, rather than having a bot to do our bidding or having enough money to hire a human concierge.”
Professor Gutierrez expressed a similar sentiment. “We should be empowering our students to lead the decision making and policy making around AI at Barnard through an educational, community driven process rather than having administrators dictate to us,” she stated.
Representatives of the Barnard Human Resources leadership team did not respond to The Bulletin’s request for comment.
A Barnard spokesperson told The Bulletin that “the College is evaluating responsible ways to use AI to support our community” and that, “[o]ver the coming months, we will continue to assess opportunities to expand Millie’s scope to support broader operational needs.”

