Further Education institutions are being asked to deliver more with less. Less budget. Fewer internal resources. Higher expectations from students, parents and governing bodies. At the same time, student behaviour is shifting faster than anything we have seen in years.
This thought was triggered after attending a Further Education conference in the UK last week. Across conversations with senior leaders, marketing teams and admissions professionals, one theme kept surfacing: the rules of student recruitment are changing in real time, and most institutions are still trying to catch up.
In nearly twenty years of working in marketing, this is the most significant structural change I have seen.
AI is no longer a future trend or a side conversation. It has become part of how students discover courses, compare colleges and make decisions long before they ever complete an application form. Whether we are ready for it or not, AI is now shaping the student journey at the earliest stages.
Here’s the thing: the key question for FE leaders is no longer whether AI is a threat or an opportunity. The real question is whether your institution is actively shaping that conversation, or whether someone else is shaping it for you.
A year ago, ChatGPT alone was processing around one billion prompts a day. Today that figure has more than doubled, and weekly active users are approaching 800 million. And that is before we account for the broader ecosystem of language models and AI-powered search tools.
At the same time, Google’s AI Overviews have expanded rapidly. In many search environments, users now receive direct answers without needing to click through to a website. That one shift has massive implications for FE marketing teams.
Traditional SEO was built on keywords, rankings and traffic. AI discovery is built on questions, answers and trust.
Students can now compare colleges, courses and outcomes without ever opening your website. So visibility is no longer just a traffic metric. Visibility is influence.
Google describes the decision journey as a trigger followed by a “messy middle” of exploration and evaluation. It is nonlinear by nature. Students move back and forth between options, opinions, channels and formats before committing.
For FE providers, this messy middle is where recruitment is won or lost.
I remember how this used to play out in more predictable ways. Students would attend an open day, collect a prospectus and talk to an advisor. Today many of those same questions are being asked privately through AI interfaces, often before a student ever speaks to a human.
The questions are practical on the surface but emotional underneath.
Do I need to be strong at maths to study engineering?
What is the real difference between Level 1 and Level 2?
Is this course coursework-heavy or exam-heavy?
If I struggle early, will I be supported or left behind?
What jobs does this path actually lead to?
These are not application-form questions. They are reassurance questions. Confidence questions. Identity questions.
AI is already answering them.
The only question is whether those answers are being drawn from your content or from someone else’s.
One of the biggest misconceptions in this space is that AI has to transform everything overnight to be commercially worthwhile. It does not.
A modest uplift in undecided enrolments can be significant. In one example, a college with 3,750 students aged 16–18 and funding of just over £5,000 per student could generate roughly £380,000 in additional annual revenue by influencing only 2 percent of undecided students. Over a three-year learner cycle, that easily moves beyond £1 million.
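The arithmetic behind that estimate can be sketched in a few lines, using the illustrative figures above (real funding rates and cohort sizes will vary by college, and the article's £380,000 figure includes some rounding):

```python
# Illustrative figures from the example above; not real college data.
students = 3750              # learners aged 16-18
funding_per_student = 5000   # approximate annual funding in GBP
influenced_share = 0.02      # 2% of undecided students swayed

extra_enrolments = students * influenced_share          # 75 students
annual_uplift = extra_enrolments * funding_per_student  # £375,000
three_year_uplift = annual_uplift * 3                   # over a learner cycle

print(f"Annual uplift: £{annual_uplift:,.0f}")
print(f"Three-year uplift: £{three_year_uplift:,.0f}")
```

Even at these conservative assumptions, the three-year figure clears £1 million, which is why a small shift among undecided students is commercially meaningful.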
This is not about winning students who were always going to choose you. It is about helping undecided students make a confident decision in your direction.
That is where AI visibility becomes measurable.
If FE teams need one immediate mindset shift, it is this: stop writing for keyword lists and start writing for real questions.
AI does not think in keyword clusters. It thinks in conversational intent.
Students and parents ask different questions at different moments of the journey. Early-stage exploration looks nothing like final-stage decision-making. Rational concerns and emotional concerns do not show up in the same language.
Most institutions already hold the data they need to respond properly. It is sitting in plain sight across internal search logs, enquiry forms, live chat transcripts, call centre notes, open day conversations and Search Console data.
When you bring those sources together, patterns become obvious. You start to see what students are actually asking, not what we assume they are asking.
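As a rough sketch of how that aggregation might work, the snippet below pools questions from a few hypothetical exports and counts the most frequent themes. The source lists and the normalisation rule are assumptions for illustration, not a prescribed toolchain:

```python
from collections import Counter

# Hypothetical exports from different touchpoints; in practice these would be
# loaded from site search logs, enquiry forms, chat transcripts and Search Console.
site_search = ["do i need maths for engineering?", "level 1 vs level 2"]
live_chat = ["Do I need maths for engineering", "is the course coursework heavy"]
enquiry_forms = ["what jobs does this course lead to", "Level 1 vs Level 2?"]

def normalise(question: str) -> str:
    """Lower-case and strip trailing punctuation so near-duplicates group together."""
    return question.lower().strip(" ?.!")

counts = Counter(normalise(q) for q in site_search + live_chat + enquiry_forms)

# The most frequent questions become the priority list for the content audit.
for question, freq in counts.most_common(3):
    print(freq, question)
```

Even this crude grouping surfaces the pattern: the same reassurance questions recur across channels, phrased slightly differently each time.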
The next step is where AI becomes genuinely useful inside your own strategy. You can model intent by persona, pressure-test your messaging and identify concerns your current content does not adequately answer.
This is usually where the content gap appears.
Once you build a bank of real student questions, you can audit your content against it with much more precision.
Do we answer these questions clearly in plain English?
Are answers easy for AI systems to parse and cite?
Are important explanations buried in PDFs?
Are we writing like educators speaking to students, or institutions speaking to institutions?
In many cases, this is not about producing huge volumes of new content. It is about improving structure, clarity and tone.
One strong explainer page can outperform ten thin FAQ pages every day of the week.
AI visibility is measured differently from traditional SEO performance.
Traffic still matters but it is no longer the full story. You also need to track whether your content is being referenced in AI responses, which course areas are being surfaced, how often your institution is cited and what the quality of incoming traffic looks like once users arrive.
In practice, we often see AI-referred visitors spend less time on site. Counterintuitively, that can be a positive signal. It often means they arrive better informed, more focused and closer to a decision.
That requires a more mature internal conversation about what “good performance” actually looks like.
The opportunity right now is real. We are still early and practical improvements can produce visible results quickly.
But passive optimisation will not be enough.
AI interfaces are evolving fast. Paid AI placements are coming. Browsers are becoming more conversational. Students will increasingly interrogate your content through AI layers without ever navigating your website in traditional ways.
This is not a short-term campaign. It is a long-term institutional capability.
Institutions that treat AI visibility the way smart teams treated SEO in the early days will create a meaningful first-mover advantage. Institutions that delay risk entering a conversation that has already been defined by others.
I am asked this often: if AI answers everything, do colleges still need websites?
Absolutely yes.
Your website is no longer just a destination. It is your authoritative data layer. It is what trains and informs how AI systems describe your offer, your outcomes and your positioning.
Structure matters. Accessibility matters. Plain English matters. Schema matters. Your content must work for both humans and machines.
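Schema here means structured data such as schema.org markup. As a minimal sketch (the course title and provider below are placeholders, not a real listing), a schema.org `Course` entry could be generated as JSON-LD like this:

```python
import json

# Placeholder details; a real page would use the college's own course data.
course_jsonld = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Level 2 Engineering",  # hypothetical course title
    "description": "A coursework-based introduction to engineering practice.",
    "provider": {
        "@type": "CollegeOrUniversity",
        "name": "Example College",  # hypothetical provider
    },
}

# Embedded in a page inside <script type="application/ld+json"> tags,
# this gives search engines and AI systems a machine-readable course summary.
print(json.dumps(course_jsonld, indent=2))
```

The point is not this particular snippet but the principle: structured, parseable descriptions of your courses are what AI systems can reliably cite.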
AI may do more of the talking, but your website ensures the right story is told.
We are moving into an answer economy.
Students now expect clear, honest answers at the exact moment they need them and AI is quickly becoming the interface through which those answers are delivered.
The institutions that will lead in this new reality are the ones that write for real questions, make answers easy for AI to trust, measure influence not just traffic and treat AI visibility as an ongoing strategic investment rather than a one-off fix.
In nearly two decades of working with education providers, I have rarely seen change happen at this speed.
Twelve months from now this conversation will look very different again.
The only real risk right now is standing still.