For much of modern history, universities presented themselves as moral institutions: places charged with cultivating judgment, curiosity, and the capacity for independent thought. Over the past two decades, they have quietly reconstituted themselves as something else. Not communities of formation, but delivery systems. Not guardians of intellectual life, but platforms for credential distribution.

This shift is often described in economic terms. Universities are now competitive businesses. Students are customers. Degrees are products. But this language, while accurate, misses the deeper transformation. What is underway is not merely marketisation. It is infrastructuralisation.

Education is increasingly governed the way digital platforms are governed: through metrics, optimisation targets, dashboards, risk models, and automated systems of monitoring. The university no longer primarily asks what kind of people it is helping to form. It asks what outcomes it can demonstrate.

Retention rates. Completion rates. Employability scores. Satisfaction surveys. Attendance logs. Engagement analytics.

These numbers do not describe education.

They replace it.

Governance by Proxy

When institutions become large enough, complex enough, and financially exposed enough, they stop being steered directly by human judgment. They are steered by proxies.

Proxy measures are not neutral. They shape behaviour. What is measured becomes what matters. What cannot be easily measured recedes into organisational invisibility.

Care is difficult to quantify.

Belonging is difficult to audit.

Intellectual risk-taking does not fit neatly into spreadsheets.

But churn does.

Dropout rates do.

Credit accumulation does.

Ticket resolution times do.

So systems evolve to optimise for the legible.

In this environment, student distress does not register primarily as a human problem. It registers as a retention risk.

Support becomes something to be delivered efficiently rather than relationally. Guidance becomes something to be standardised rather than situated. Learning becomes something to be consumed rather than lived.

The language of care survives, but its substance is quietly hollowed out.

The Automation of Support

Facing rising mental health needs and shrinking staff capacity, universities increasingly turn to software.

Early-alert systems flag “at-risk” students.

Chatbots answer welfare questions.

Wellbeing platforms offer mood tracking and self-help modules.

AI tools promise personalised study guidance at scale.

These systems are presented as compassionate innovations. And in some narrow sense, they are: they make some help available where otherwise there would be none.

But they also perform a subtle substitution.

What used to be mediated through human relationships is increasingly mediated through interfaces.

This is not neutral.

When support becomes automated, responsibility becomes diffuse. No single person is accountable for the student’s experience. Harm becomes statistical rather than personal. Suffering becomes an anomaly in a dataset rather than a cry in a corridor.

The institution can say:

The system is in place.

The resource exists.

The box is ticked.

Care becomes a property of infrastructure rather than a practice of people.

Students as Managed Populations

Universities now operate less like communities and more like population-management systems.

Students are segmented.

Profiled.

Scored.

Categorised.

Who is likely to drop out?

Who is likely to succeed?

Who needs intervention?

Who can be left alone?

These questions are framed as supportive. But they also mirror techniques developed in advertising, insurance, and policing.

Prediction replaces understanding.

Correlation replaces conversation.

Risk scores replace relationships.

The student becomes legible to the institution primarily as a pattern.

This is a form of governance.

Not governance through law.

Not governance through explicit coercion.

But governance through architecture.

The environment quietly nudges behaviour, limits options, and structures attention.

The system does not ask:

What does this student need?

It asks:

What outcome must we produce?

The New Contract

The old implicit contract of higher education was imperfect but intelligible:

We will challenge you.

You will struggle.

You will be supported by people who know you.

The new contract is different:

We will provide access.

We will provide tools.

We will provide platforms.

You are responsible for making it work.

When students falter inside this model, their difficulties are framed as individual resilience problems rather than structural design failures.

Too anxious.

Too distracted.

Not engaged enough.

Not self-managing enough.

But what if the system itself is producing the conditions it then pathologises?

An environment optimised for throughput rather than formation.

For efficiency rather than attachment.

For scale rather than care.

AI and the Future University

Artificial intelligence will accelerate this trajectory.

Not because AI is malicious.

Not because it “takes jobs.”

But because AI is exceptionally good at extending proxy governance.

It excels at:

  • Monitoring at scale

  • Predicting behaviour

  • Personalising content

  • Automating feedback

These are precisely the functions that institutions under financial and political pressure want to expand.

The danger is not that AI will replace teachers.

The danger is that it will quietly replace the idea that education is a relationship at all.

Learning becomes interaction with systems rather than participation in a shared intellectual culture.

The university becomes a managed environment rather than a lived one.

What Is at Stake

If education is treated purely as infrastructure, then its purpose collapses into delivery.

Deliver credentials.

Deliver skills.

Deliver employability.

But education has historically done something else as well.

It has helped people become capable of thinking against prevailing systems.

Of recognising when inherited structures are unjust.

Of imagining alternatives.

A society that optimises education purely for economic throughput eventually produces highly skilled individuals who lack a language for meaning, responsibility, or collective life.

That is not a technical failure.

It is a civilisational one.

A Different Question

The pressing question is no longer:

How do we improve student satisfaction?

It is:

What kind of human beings are our educational systems quietly shaping?

Are we building environments that cultivate curiosity, patience, and intellectual courage?

Or environments that train people to self-optimise, self-monitor, and self-blame?

AI will not decide this for us.

But it will amplify whichever model we choose.

And at present, the model we are choosing is not education as formation.

It is education as managed flow.
