Student Analyst Spotlight: Carolina Oxenstierna
In late 2022, Carolina Oxenstierna (SFS ‘25) sat in an editorial meeting for The Georgetown Independent (The INDY). Between story pitches and article planning, one editor began experimenting with a brand-new chatbot, creating mock stories and headlines for the news magazine.
“He was generating the most crazy plots that he could and all these different versions of INDY articles,” Oxenstierna said. “We all said this was crazy and we should just stick with what we know to be good journalism.”
The chatbot was ChatGPT, released halfway through Oxenstierna’s undergraduate studies. And while she and The INDY staff did not think much of the large language model then, it made Oxenstierna aware of the growing impact artificial intelligence (AI) was having on the public—a technology that became central to her academic pursuits, led her to the Beeck Center for Social Impact + Innovation at Georgetown University, and shaped her career.
“I had been interested in journalism … and largely what accountability and oversight looked like in the digital age. That led me to studying cybersecurity and … what it means to uphold democratic principles, like freedom of expression, rights to privacy, and access to information in online spaces,” Oxenstierna said, referring to her early years at Georgetown. “I saw a lot of overlap between my interest in cybersecurity and AI. It was a very natural leap.”
As a Culture and Politics major in the School of Foreign Service, Oxenstierna enrolled in classes covering national AI policy, the international governance of AI, and the intersection of cybersecurity and AI, and began searching for a like-minded community. She joined the Georgetown AI Association (then known as the Responsible AI Network) and became a student analyst at the Red House, a center driving experimentation and innovation in education. At the time, Georgetown had no official guidelines on students and faculty using AI, so Oxenstierna spent her time at the Red House exploring how educational institutions could adapt to technological disruptions from ChatGPT and AI.
“I was really lucky to be at Georgetown, [a school] that focuses on interdisciplinary, rigorous analysis of major changes in global power dynamics and in the international arena,” Oxenstierna said. “I got to explore this question of what AI is, [but also] … its societal and political impacts across borders.”
Oxenstierna said she was drawn to the Beeck Center’s Student Analyst program by interdisciplinary questions ranging from “the philosophical aspects of a technology that calls into question the utility of humans” to the implications for democracies of automating large parts of the economy. Turning to a side of AI she had yet to explore, Oxenstierna joined the Government AI Hire, Use, Buy (HUB) Roundtable project. The HUB Roundtable was a series of discussions throughout 2024, run in collaboration with the Center for Security and Emerging Technology and the Georgetown Law Institute for Technology Law and Policy. The roundtables grappled with the legal liability questions that AI posed, examined AI’s potential to transform government services, and considered how the government could better attract and use AI talent.
“I had done a lot of research and work in class and through the Georgetown AI Association on what governments should be doing about AI,” Oxenstierna said. “I hadn’t really thought that much about the government using AI.”
Already familiar with a risk-based approach to AI policy from her classes and research, Oxenstierna began to understand government procurement as “a very practical way” to embed accountability and liability into AI systems. She set out to plan the content and invite attendees to one of the roundtable discussions, gathering stakeholders from the public and private sectors, nonprofits, and academia.
“I liked that the Beeck Center gave [student analysts] a lot of agency in how projects went,” Oxenstierna said. “I got to do a lot of novel research into … what were the right questions to be asking … and [how to be] the mediator between industry and government.”
Bringing stakeholders from across the AI field to the Beeck Center allowed Oxenstierna to gain “great insight into what these relationships should look like,” she said.
“It’s really important that industry and government speak to each other,” Oxenstierna said. “Yes, they can have one-off calls or luck out and have someone in government know people in industry. But I think that having places like the Beeck Center to formally host these kinds of dialogues—and create spaces for interaction and engagement and cooperation—is so critical for working together on shared goals.”
Not only was Oxenstierna seeing these conversations take shape at the Beeck Center, but she was also watching them play out as an intern at the White House Office of Science and Technology Policy. Her White House internship focused specifically on AI and emerging technology policy, and overlapped with her time working on the HUB Roundtable. In crossing sectors and engaging with stakeholders at the Beeck Center and in the White House, Oxenstierna began focusing on the idea of AI governance—a field she entered after graduating in May 2025.
“Coming out of the Beeck Center, my main takeaway was that there are lots of lessons to be learned about … how the government approaches deploying a new technology to serve its public,” Oxenstierna said, “and how that ultimately shapes standards for safety, accountability, liability, and transparency.”
Oxenstierna left the Beeck Center and her role at the White House in October 2024. Though she would graduate the next semester, she was not yet ready to exit the civic tech space at Georgetown, and became a program assistant at Tech & Society, a cross-campus ecosystem creating novel approaches for interdisciplinary collaboration, research, understanding, and action.
“I really wanted to stay involved in the Georgetown tech community, in any way possible,” Oxenstierna said about her time with Tech & Society. “I got to be at the center of all of these different and incredible projects and programs at Georgetown, trying to have difficult conversations and ask[ing] difficult questions.”
Now, just a few months after graduating, Oxenstierna has joined the Centre for the Governance of AI (GovAI), a nonprofit research body based in the United Kingdom. There, she works as a research fellow on U.S.-China cooperation related to AI security. While she has broadened her scope to the international level, Oxenstierna’s work continues to pursue the same goal she explored with the HUB Roundtable: coordinating civil society, governmental, and international actors to prioritize AI safety.
“We saw during the Cold War that great-power competition [in] technological developments leads us down high-risk paths, endangering civilians around the globe,” Oxenstierna said. “I think there need to be more spaces where you can reach across the water and talk to your so-called adversary on how to cooperate … on fundamentally the same goal of safe and secure AI designed for the people.”
Oxenstierna sees AI not only as interdisciplinary—as she says “it touches literally everything”—but also as demanding collaboration. From the federal government to academia, Oxenstierna has watched the chatbot she first encountered at The INDY evolve into an “institution-shaping force,” “the next frontier,” and “an existential risk.”
For students eager to break into this field, Oxenstierna highly recommends engaging with spaces like the Beeck Center, a place where curiosity meets real-world impact.
“I think the Beeck Center was an incredible space for me to explore different kinds of interests and how they intersect with one another—between the government and industry and civil society and academia,” Oxenstierna said. “No matter what stage one is in in their undergrad or grad career, I think it’s important to explore as much as you possibly can.”