Amid explosive interest in generative AI and rising concern about its impact on higher education, the Association of Pacific Rim Universities (APRU) this week published a white paper on the future of generative AI in higher education as part of APRU’s “University of the Future” initiative.
The 18-month project, backed by Microsoft, set up a network across APRU’s 60 member universities in Asia, the Pacific, and North and South America to gain a deeper understanding of the opportunities and challenges generative AI (genAI) poses for higher education and to identify ways to address knowledge gaps.
GenAI tools, such as ChatGPT and DALL-E 2, swiftly produce content, including text and images, that can be difficult to distinguish from human-produced content.
“Universities are currently grappling with these implications, and medium- and longer-term strategies will require a better shared understanding of these tools: how they work and how to balance risks and benefits,” said APRU Chief Executive Thomas Schneider.
This article is part of a series on Pacific Rim higher education and research issues published by University World News and supported by the Association of Pacific Rim Universities. University World News is solely responsible for the editorial content.
“Higher education is now at a stage where it needs to transition to a holistic, supported, and scaffolded approach to generative AI adoption,” the white paper notes, pointing to a “cautious and somewhat piecemeal approach to generative AI” so far.
The project’s academic lead, Simon Bates, vice-provost and associate vice-president, teaching and learning at the University of British Columbia (UBC), Canada, told University World News the aim was to “take a pulse check at a point in time” of how APRU universities are coming to terms with these tools and their impact on higher education, and to get universities thinking about the future.
Going beyond statements of principle
Initially, when OpenAI’s ChatGPT and other genAI tools were released in late 2022, many universities made statements on genAI use, but few went further than that.
“There’s a big gap between those principles and practical actions, whether in teaching and learning or university business processes or research,” Bates said. “This white paper aims to support this next stage” – with a balanced plan of action for institutions.
“At the same time as embracing the tools, we have to be deliberate about protecting elements of the teaching and learning experience, the same for graduate students with the research experience, that should not be short-circuited,” said Bates. “Getting that balance right will be the big challenge for universities in the next few years.”
Larry Nelson, Microsoft’s Asia regional business leader for education, said that because AI “has been around and integrated into a lot of things that we already do”, combined with the introduction of ChatGPT, “the importance of generative AI and AI in education is uniquely profound”.
He pointed to a need to work with universities. “We often overestimate the short-term impact of some of these innovations and changes and underestimate the long-term impact,” he told University World News. “So much of the innovation has taken place in universities; it makes a lot of sense to get involved, get engaged, and partner [with universities] around that,” he added.
CRAFT framework
Danny Liu, professor of educational technologies at the University of Sydney, Australia, authored the white paper, titled “Generative AI in Higher Education: Current practices and ways forward”.
“Universities were stuck in inaction. They didn’t quite know where to start,” Liu told University World News.
The white paper proposes the CRAFT framework, developed from literature reviews and feedback from several APRU-organised workshops. It sets out five key elements: culture, rules, access, familiarity, and trust. These help universities assess their current state and identify next steps in each area.
For responsible integration of genAI into education, research, and operations, universities need a balance of rules, access, and familiarity with genAI tools. A lack of any one of these may lead to ethical, privacy, security, or other challenges, according to the white paper.
These are underpinned by trust between students, educators, leadership, and partners such as industry, government, and the community, as well as trust in AI itself. All of these aspects must be part of the local and regional culture of institutions.
“We think all five elements (of CRAFT) are essential,” said Liu. “There are universities that are progressing better in terms of the rules, or access, or familiarity. But no university is up there for all five,” he noted.
Trust can be built via rules on responsible use of AI. But, for example, rules allowing staff to use AI to mark student work would not work, Liu explained, “because it would break the trust between the faculty and students”.
Students should be central to any discussions around rules, according to the white paper. “They are engaged, eager for guidance, and fully aware of how important proficiency with these tools is going to be as they move through and beyond their time at university.”
A culture of genAI acceptance and use
Developing trust, as opposed to eroding it with AI use, “helps, over time, to build culture and change culture” around technology acceptance and use, Liu pointed out.
Or as the white paper puts it: “Do we have a culture that looks far enough into the future so that we are preparing ourselves and our students for a radically transformed environment?”
Liu noted: “The future is generally positive, as long as we can shift the culture of higher education. It depends on whether we can help people to change their mindsets from locking down, restricting, banning, and being scared of genAI, to thinking: it’s here, our students are using it, it’s available for us to use.”
The report suggests that those in emerging economies may have a stronger cultural acceptance of technology, as it may be perceived as a route to economic advancement.
Experimentation in a safe environment
Michelle Banawan, professor at the Asian Institute of Management in the Philippines, attended the workshops that fed into the white paper, which she also peer reviewed. The paper “establishes a framework based on varying perspectives of developed countries and developing countries, so it’s really inclusive”, she told University World News.
It is not just a guide but also includes benchmarks for policymaking, formulating pedagogies in higher education, and conducting research, she added.
“It encourages us to explore, to encourage co-creation and experimentation, even when we do not have established use cases yet,” she said. “But the approach by which the university is trying to learn and to experiment is a ‘best practice’ in itself.”
UBC adopted a “creative sandbox” approach that encourages faculty and others to experiment with genAI, a practice the white paper says other universities should consider adopting.
Bates explained: “At UBC, we recognised that faculty need time and space to be able to experiment with these tools, to understand in a safe environment where they don’t need to worry about data security or intellectual property [IP], to see how, where, and if they [the tools] would fit within their courses and curriculum.”
It allowed “access to multiple LLMs [large language models] in a secure way, so the IP that faculty might put in their lecture notes or readings doesn’t go back into the model. We also diverted innovation funding to support experimentation in generative AI projects”, he said.
Need for balance
But the workshops also found that, for all the good genAI tools can bring, “there are also things we might inadvertently lose in universities by going too far and too fast down this road”, Bates pointed out, noting that unintended consequences were also observed in other technology-driven shifts, such as the spread of social media and the move online during the COVID-19 pandemic.
Universities need to recognise “the capabilities of these tools for supporting and personalising learning at scale that even the best teachers cannot do, but equally have to balance that with the very strong desire for human interaction and connection to support learning. Defining that balance is the task for universities”.
Nelson pointed to “driving a personalised learning experience in a way that doesn’t eliminate the teacher, the faculty, but helps streamline and create more time and space for them to add value where they add it most, which is working with students and delivering content and resources. A curriculum that extends scholarship is something that an AI platform has some potential to deliver”.
Nelson added: “These are tools that companies are looking for their employees to know how to use. So, being thoughtful about figuring out the right ways in which they can be integrated into our curriculum and integrated into the way students learn and teach around that, and build critical thinking skills in terms of how we use them, is important.”
Cognitive offloading
GenAI can make learning seem frictionless, but real understanding and mastery of any discipline require effort and practice, something universities of the future must address.
In September 2024, OpenAI introduced its latest genAI model, o1, code-named “Strawberry”. “It’s a cognition model that corrects itself and thinks about its thought processes,” said Banawan. “Universities are now trying to address ‘cognitive offloading’, where students put all the thinking into this technology. We wanted students to be aware of how they think.”
“Students need to develop a nuanced view of not only how to use these tools to support learning but when not to rely on them: using them productively whilst avoiding unhelpful cognitive offloading and potential over-reliance,” the white paper states.
An area to explore further with universities, according to Nelson, was “the science of learning itself, and how AI can inform, improve, or advance that, addressing some of the concerns around ‘cognitive offloading’ that may come through the overuse of AI in some cases”.
Importance of collaboration
The white paper notes that “collaboration within and between institutions will be a key to future success for the sector. This could be regional in scope or focused on particular issues of generative AI adoption and application”.
Liu noted that collaboration with AI-aligned industry and those at the forefront, like Microsoft, was important. “They have the connectivity, the clouds, and the foresight to see where technology is headed, and they can only succeed if they work with us, and we can only succeed if we work with them,” he said.
Christina Schönleber, APRU’s chief strategy officer, said: “Amid the transformative impact of generative AI on higher education, fostering multi-stakeholder collaboration in this safe space is more crucial than ever. By continuing to engage university leaders, educators, and students in the region with technology providers and industry partners, we can develop equitable AI solutions that cater to diverse institutional needs.
“These collaborative efforts not only support effective AI adoption but also aim to ensure that our educational systems remain resilient, innovative, and beneficial for the entire academic community.”