Learning, Analytics, AI, Trust - and the future of Universities

The U21 Educational Innovation Symposium will bring together figures from some of the world's leading universities later this month, under the theme 'Scoping the Future in Higher Education: Transition or Transformation?'. Ahead of the event, Professor Simon Buckingham Shum, the event's keynote speaker, shares a preview of the themes around Artificial Intelligence that he will address at the Symposium.

Professor Simon Buckingham Shum, keynote speaker, U21 Educational Innovation Symposium 2023

"First came the global pandemic, forcing teaching and learning online in a matter of weeks, but leaving an indelible mark (not to mention scars) in every institution. Many, however, developed and sustain new online capabilities.

And now generative artificial intelligence has catapulted AI into mainstream consciousness, scoring high on the university earthquake Richter scale, with aftershocks accompanying each upgrade. We are witnessing the largest rollout of AI in educational history, but in true Silicon Valley fashion, moving fast and breaking stuff. Artifacts that were until this year relatively trustworthy indicators of student understanding are suddenly more suspect.

Shockingly, we may need students to evidence their process and their ability to think on their feet at least as rigorously as we judge the documents they can produce. The starkest risk is that universities cannot assure the quality of learning if they continue to depend on artifacts that can be synthesised in a few seconds, and with a few minutes’ effort massaged beyond the detection of policing software. But the enormous opportunity is that, finally, the tectonic plates of assessment practice may shift in the direction that many have long called for.

The assessment shift from product to process points to the critical roles that well-designed learning strategies and accompanying technologies can play. Generative AI is undeniably a leap in capability, but its broader underpinning fields, such as data science, natural language processing, machine learning and other forms of AI, have been growing in higher education through the work of academic communities such as AIED (>50 years old) and its teenage cousin, Learning Analytics.

These tools provide ways to track how students work, not only what they can produce. Once we can map from clicks to constructs, we come much closer to making educationally meaningful claims. And even though analytics and AI cannot in principle model human learning in all its complexity, embracing imperfection as a feature, not a bug, opens new possibilities.

There are at least two levels of response to these system shocks, which seem to align with the symposium’s questions around university transition versus transformation. 

Firstly, and somewhat reassuringly, successful cases of introducing and embedding Learning Analytics/AIED into universities point to how generative AI can, in turn, be appropriated as an educational technology.

I will share stories from my time at The Open University (UK) and the University of Technology Sydney (UTS), where we’ve sought to grow the sociotechnical infrastructure that promotes the skillful use of such tools, respecting the agency of educators and students.

At the Connected Intelligence Centre, we invent, pilot, and evaluate EdTech that makes data actionable, closing the feedback loop to build student qualities that transcend the disciplines. The platforms we work with focus on critical and reflective writing, embodied teamwork, learning dispositions, workplace skills, and cultivating a sense of academic belonging.

Moreover, as AI closes the cognitive gap on us, humans need to move to higher ground, attending to the capabilities that will differentiate learners in the age of AI. No matter how promising an approach, however, without the trust of all system stakeholders, we will just add to the graveyard of EdTech corpses.

Not surprisingly, it turns out that trust is built through conversations — with different stakeholders spanning the “boardroom, staff room, server room, and classroom.” The need to forge and sustain trust also motivates our use of methods from different traditions such as human-centered design and deliberative democracy to build common ground among diverse stakeholders.

Secondly — and more speculatively since this will be an argument rather than an empirical report — we must surely use these shocks to the university system to ask more foundational questions. Assessment reform forces us to ask what we should value most in the graduates we turn out, as we confront ecological and democratic crises.

Nor can we discuss AI ethics in a purely educational frame, as though it does not aggravate both of these. The complexity of these interlocking systems is overwhelming our sensemaking capacity. If we keep asking why we do what we do, we soon run up against values: politics, culture, spirituality. This may force us out of our disciplinary niches, but if ever there was a time for universities to help re-integrate these fragments, is it not now?

I will suggest that The Matter With Things holds important clues, and welcome the chance to explore these urgent questions with you.

Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney where he serves as inaugural Director of the Connected Intelligence Centre.