On March 25, the Colleges of the Fenway hosted “From Classroom to Career in the Age of AI,” a panel featuring speakers from diverse industries. Panelists discussed how artificial intelligence (AI) is reshaping the workplace and how those in higher education can help prepare students for this new professional landscape.
“We are at a moment of rapid transformation … AI … is here and is reshaping how we work,” said Sarah Biggers, Director of Professional Development and Academic Programs at the Colleges of the Fenway (COF), as she began the March 25 panel discussion, “From Classroom to Career in the Age of AI.”
“[AI] is not just changing specific jobs — it’s changing how we learn as professionals, how we adapt, how we grow, and how we navigate careers,” she added.
To discuss how AI is changing the workplace and how higher education can help the next generation adapt, speakers from diverse industries offered their expertise. The panelists included:
- Robbin Beauchamp, Assistant Provost of CO-OPS and Careers, Center for Cooperative Education and Career Development at the Wentworth Institute of Technology
- Heather Mendonça, Senior Product Designer at Netflix
- Shelly Saturne ’11, Founder and CEO of Saturne Healing House
- Dr. Michael Spooner, Dean of the School of Healthcare Business and Technology at the Massachusetts College of Pharmacy and Health Sciences
- Alice Stein ’00, ’03MS, Founder and AI Strategy Executive at Stein & Partners
Shifting Expectations in the Workplace
Moderator Abraham Evensen Tena — Associate Professor of Illustration at the Massachusetts College of Art and Design — began the discussion by asking the panel, “What opportunities has AI opened in your field … [and] in what ways has AI changed the day-to-day operations in your field?”
Saturne responded that she uses AI daily. “In the digital business and tech space, AI has fundamentally shifted how quickly we can move from idea to execution,” she said. Tasks that previously took days to complete can now happen in minutes, which, as she noted, “allows teams to focus more on strategy rather than execution.”
Moreover, the new technology has rendered certain technical tasks more accessible across teams. Saturne said that “on a day-to-day basis, AI acts like a strategist, specifically a strategic assistant,” which can support communication, documentation, and decision-making.
For Stein, AI is “a force multiplier.” Her work requires synthesizing large amounts of data and performing market research. For these projects, “LLMs [large language models] enable me to intake a lot of data in a very quick period of time,” she said. “[AI] helps you to scale and to do much more than ever before.”
From a career services perspective, Beauchamp discussed how she and her colleagues teach students how to use AI appropriately when seeking jobs.
“Employers … are expecting people to have AI literacy already at this point, so it’s our job to teach them that,” she noted. Beauchamp also emphasized the importance of debunking the notion that using AI amounts to cheating. “We have to teach them [students] that it’s OK to use AI, but how to do it ethically, and how to not be afraid to talk about it with an employer.”
Spooner referred to today’s undergraduates as “AI natives,” noting that many of them have been using AI tools (e.g., NotebookLM, an online note-taking tool that uses Google Gemini to provide text or audio summaries of study materials) to supplement their coursework for years.
“We surveyed our students last year,” he said, “and almost 80% are using AI as a method to study, and they are doing it in a meaningful way.”
Honing AI Competencies
Tena reoriented the discussion toward the particular skills and competencies that matter most as AI becomes more integrated into the workplace.
“It’s about the effective use, it’s the critical thinking,” Spooner said. “It’s helping our students to understand what critical thinking means in this AI age.”
He pointed to opportunities such as hackathons, which “support the use of AI” while also requiring students to “take the result, show us what they’ve done with it, and provide a tangible product.” Spooner added that helping students with prompt engineering is crucial, and that “we have to take away the stigma that you can’t use AI.”
Stein spoke about how AI has eliminated entry-level jobs across many professions. “Educational providers … need to prepare the students to really come in at a much higher acumen, and be ready to take on more,” she said. Stein also emphasized the value of experiential learning in teaching and scrutinizing AI.
According to Beauchamp, it is imperative to prepare students for this new job landscape. “Our faculty can play such a crucial role,” she said. “If we in higher ed are not teaching our students how to use AI now, we are doing a huge disservice to them.”
Using AI Responsibly
Tena’s next question revolved around the responsible use of AI technology.
Spooner mentioned the possibility of bias and the need to teach students to question data. “We also have to acknowledge that there are environmental impacts, even though these are becoming incrementally lesser over time, as we train models … And this is very present in the minds of our students,” he said. From a healthcare perspective, Spooner cautioned that “patient data is sacred” and should not be entered into a tool like ChatGPT.
Stein advises her clients to assess risk before deployment: “Spend time evaluating your risks associated with it [AI] … It shouldn’t be an afterthought … it should be done upfront.”
Preparation During College
Tena’s subsequent question to the panel addressed how colleges might best prepare students for the workforce.
For Saturne, colleges can err by focusing solely on teaching tools. “Tools change,” she said, “but thinking skills don’t. Students don’t just need to know how to use AI; they need to understand how to think with it.”
Saturne elaborated that preparation should focus on AI literacy, critical thinking and discernment, and execution skills. “The students who will stand out, in my opinion, are not the ones who just use AI, but the ones who can combine AI with judgment, clarity, and ownership. Because at the end of the day, AI doesn’t just replace talent, it reveals it,” she said.
A Shared Responsibility
Tena’s last question turned to the topic of AI training, and how universities and employers might share this responsibility.
Beauchamp responded that many employers today are looking for applicants with data literacy, prompt engineering, ethical AI use, critical thinking, and, above all, interpersonal communication skills. Addressing university faculty, she advised, “Make sure that AI is part of your curriculum, regardless of what you teach.”
Speaking from a healthcare perspective, Spooner discussed teaching students how AI can supplement decision-making in healthcare.
“However, there is still that human in the loop component,” he said. “We are not going to give [away] our decision-making authority as physicians or advanced practice providers.” Spooner added that it is essential to teach students critical thinking skills.
During the Q&A with attendees, panelists discussed human storytelling, contextual thinking, and ethical questions concerning AI. Overall, the panelists encouraged audience members to be curious and optimistic about this emerging technology.