Student support teams face a growing volume of repetitive queries, from password resets to timetable changes, that consume valuable staff time and delay responses to more complex cases. These bottlenecks lead to slower turnaround times, inconsistent service levels, and frustration for both students and staff.
This presents a clear opportunity to apply AI-driven triage to automate routine interactions, speed up resolution, and create capacity for more personalised, high-value student support.
Hypothesis
We believe that introducing a conversational AI agent for initial query triage will reduce average response time by 40%, improve first-contact resolution, and free up staff capacity for complex cases.
This section outlines how the experiment is structured and delivered, from planning and setup to testing and measurement. It shows the practical steps, variations, and inputs that shape the pilot.
Process
The step-by-step process that guides the setup, delivery, and measurement of the experiment.
Map top 50 support queries from last year.
Identify the most common student queries from historical data to understand demand patterns and prioritise which use cases the AI should handle first.
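As an illustration of this step, a minimal sketch of the frequency analysis, assuming last year's tickets have been exported to a CSV; the file name and the "category" column are assumptions, not an existing export format.

```python
# Minimal sketch: count the most frequent query categories in a historical export.
# Assumes a hypothetical CSV export ("tickets_last_year.csv") with a "category" column;
# adjust the file name and column to match the actual ticket data.
import csv
from collections import Counter

counts = Counter()
with open("tickets_last_year.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[row["category"].strip().lower()] += 1

# The 50 most common categories become candidate intents for the triage agent.
for category, n in counts.most_common(50):
    print(f"{category}: {n}")
```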
Build an intent library from the HubSpot Knowledge Base.
Translate those frequent queries into structured “intents,” linking each to relevant Knowledge Base articles and carefully crafted prompts to guide the AI’s responses.
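One way to represent such an intent is sketched below as a small Python structure; the field names and example values are illustrative assumptions, not HubSpot's actual intent schema.

```python
# Illustrative intent record: field names and example values are assumptions,
# not HubSpot's schema. Each intent ties example phrasings to a Knowledge Base
# article and a prompt that guides the agent's response.
from dataclasses import dataclass

@dataclass
class Intent:
    name: str                      # e.g. "password_reset"
    example_queries: list[str]     # phrasings the agent should recognise
    kb_article_url: str            # Knowledge Base article the answer draws on
    prompt: str                    # guidance for how the agent should respond
    escalate_to_human: bool = False

password_reset = Intent(
    name="password_reset",
    example_queries=["I forgot my password", "Can't log in to the student portal"],
    kb_article_url="https://example.edu/kb/password-reset",  # placeholder URL
    prompt="Walk the student through the self-service reset; escalate if the account is locked.",
)
```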
Train Breeze Agent on sample tickets.
Feed representative support tickets into the AI agent to teach tone, terminology, and escalation paths, ensuring it mirrors the team’s preferred communication style.
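A hedged sketch of how sample tickets might be curated and anonymised before being shared with the agent; the JSONL layout and field names are assumptions, and loading the examples into Breeze happens through HubSpot's own tooling rather than this script.

```python
# Sketch only: anonymise and format sample tickets before sharing them for training.
# Field names and the JSONL layout are assumptions; the actual hand-off to the
# Breeze Agent is done in HubSpot's own tooling and is not shown here.
import json
import re

def anonymise(text: str) -> str:
    """Strip email addresses and long digit runs (IDs, phone numbers) from ticket text."""
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[email]", text)
    return re.sub(r"\d{6,}", "[id]", text)

tickets = [
    {"query": "Hi, I can't log in, my email is jane@uni.ac.uk",
     "resolution": "Sent self-service reset link", "escalated": False},
]

with open("sample_tickets.jsonl", "w", encoding="utf-8") as f:
    for t in tickets:
        t["query"] = anonymise(t["query"])
        f.write(json.dumps(t) + "\n")
```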
Pilot with 3 departments.
Deploy the triage agent in a controlled environment, gather live feedback from staff and students, and track response times and satisfaction metrics over a defined test window.
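For the measurement side, a minimal sketch of how the two headline metrics could be computed from pilot logs; the record layout is an assumption and should be adapted to however the pilot actually records tickets.

```python
# Sketch: compute average response time and first-contact resolution rate from pilot logs.
# The record layout is an assumption; adapt it to the pilot's actual logging.
from statistics import mean

pilot_tickets = [
    # response_minutes: time to first reply; contacts: interactions needed to resolve
    {"response_minutes": 12, "contacts": 1},
    {"response_minutes": 45, "contacts": 2},
    {"response_minutes": 8,  "contacts": 1},
]

avg_response = mean(t["response_minutes"] for t in pilot_tickets)
fcr_rate = sum(t["contacts"] == 1 for t in pilot_tickets) / len(pilot_tickets)

print(f"Average response time: {avg_response:.1f} minutes")
print(f"First-contact resolution: {fcr_rate:.0%}")
```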
Inputs & Prompts
The user groups and example queries used to train and evaluate the AI's understanding.
Undergraduate and postgraduate students seeking support via chat or online forms, typically for administrative, wellbeing, or course-related issues.
Data sources
The pilot uses only non-sensitive student data, such as name, email, course, year, and issue category; this is sufficient for routing and reporting but excludes academic performance and personal wellbeing records.
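To make that boundary concrete, a sketch of the routing record the agent would see; the field names are illustrative assumptions, not an existing schema.

```python
# Illustrative routing record: only the non-sensitive fields listed above, nothing more.
# Field names are assumptions, not an existing schema.
from dataclasses import dataclass

@dataclass
class RoutingRecord:
    name: str
    email: str
    course: str
    year: int
    issue_category: str   # e.g. "timetable_change", "password_reset", "wellbeing_signpost"
    # Deliberately excluded: grades, academic performance, personal wellbeing notes.
```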