A new approach to teaching
LSI's approach is referred to as AI-led pedagogy. What does that actually mean?
Why we started from a blank sheet
LSI was not designed to fix higher education. It was designed to ask a more specific question: if you were building a postgraduate institution today, knowing what AI can now do reliably, what would you build?
The honest answer for us was: certainly not what currently exists. The structures of a conventional university were shaped by information scarcity. Lectures existed because expertise was hard to access; standardised pacing existed because individual tuition did not scale. Those constraints have changed. AI cannot substitute for academic judgement, but it can reliably handle explanation, repetition, individualised examples, and continuous low-stakes feedback at a scale no human faculty has ever been able to match.
Building a new institution on that premise is different from adding a chatbot to an existing one. It changes admissions, content design, staff roles, the working week of a student, and the way assessment is structured. More than 150 specialised AI agents now operate across the institution, spanning course design, admissions, student support, compliance, and reporting, not as add-ons but as the working infrastructure on which the academic model sits.
Our model was reviewed by the Office for Students through a multi-year Degree Awarding Powers assessment, including scrutiny by an independent expert panel of senior academics from across the UK sector. The same Framework for Higher Education Qualifications, subject benchmark statements, external examiner system, and Quality Assurance Agency reference points that govern any UK degree govern ours.
How learning at LSI actually works
A student joining LSI is assigned a personal AI tutor at onboarding. The tutor is briefed on their professional background, learning preferences, and goals, and operates against a specific syllabus designed and approved by academic staff. Course content is mapped onto an interactive knowledge graph: every concept in a module is a node with explicit dependencies, and the tutor tracks individual mastery against that map.
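The knowledge-graph idea can be sketched in a few lines of code. This is an illustrative toy, not LSI's implementation; the names (`Concept`, `KnowledgeGraph`, `mastery`, the 0.7 threshold) are assumptions chosen for clarity:

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A node in the knowledge graph: one concept plus its dependencies."""
    name: str
    prerequisites: list = field(default_factory=list)  # names of prerequisite nodes

class KnowledgeGraph:
    def __init__(self):
        self.concepts = {}  # name -> Concept
        self.mastery = {}   # name -> float in [0, 1], one student's mastery

    def add(self, name, prerequisites=()):
        self.concepts[name] = Concept(name, list(prerequisites))
        self.mastery.setdefault(name, 0.0)

    def record(self, name, score):
        """Update mastery after a formative exercise."""
        self.mastery[name] = score

    def ready(self, name, threshold=0.7):
        """A concept is ready to study once all its prerequisites are mastered."""
        return all(self.mastery[p] >= threshold
                   for p in self.concepts[name].prerequisites)

    def revision_targets(self, threshold=0.7):
        """Previously attempted concepts that still fall below the mastery bar."""
        return [n for n, m in self.mastery.items() if 0 < m < threshold]

# A tiny two-node module map
g = KnowledgeGraph()
g.add("probability")
g.add("bayes_theorem", prerequisites=["probability"])
g.record("probability", 0.9)
print(g.ready("bayes_theorem"))  # True: the prerequisite is mastered
```

The point of the explicit dependency edges is that the tutor never has to guess what to revise: a weak score on a node flags both that node and anything downstream of it.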
In a typical week, a student working full-time might spend Tuesday evening working through a new concept in their preferred mode (short video, written explanation, or dialogue), with the AI tutor adapting examples to their working context. On Wednesday they might complete a formative exercise; the AI gives immediate feedback and surfaces a related concept they previously got wrong for revision. On Thursday they join a live seminar with their module leader and cohort, focused entirely on application, debate, and case work rather than information transfer, because that has already happened. On Friday they have a one-to-one with their personal academic tutor about their dissertation idea.
The point of this design is that the working professional studying at LSI gets more individualised academic contact than they would on a comparable part-time master's elsewhere, not less. The AI absorbs the parts of teaching that scale poorly with human time. The academic absorbs the parts that scale well only when they are freed from repeated delivery.
Where the human academic does the central work
Decisions about what is taught, how it is assessed, and what constitutes a degree sit entirely with academic staff: module leaders, programme leaders, the Academic Board, and external examiners. The AI tutor operates within boundaries set by them, using content they have designed and approved.
In practice, academic staff at LSI spend their time on the parts of the role where human judgement is decisive: programme design, content architecture, summative marking, moderation, live seminars, one-to-one supervision, dissertation supervision, pastoral oversight, and continuous improvement of the curriculum based on real student data. Academic recruitment includes a teaching exercise on flipped-learning design and on using AI tools effectively. The model expects more of academic staff in this configuration, not less, and it survives external scrutiny precisely because the academic role is foregrounded rather than diluted.
Students always have direct access to their module leader, a personal academic tutor, and a student success team. Live one-to-ones can be requested at any point.
When the AI gets things wrong
AI tutors can be confidently incorrect. Pretending otherwise would be dishonest, and it would also be a poor pedagogical signal to send to students who will spend their careers working alongside these tools. We design for it explicitly.
Three things follow. First, the AI provides only formative feedback, the low-stakes, learning-while-doing kind. Summative assessment, the marks that determine a degree, is conducted by qualified academic staff with internal moderation, external examiner scrutiny, and ratification at Module Examination Boards. A student's qualification rests on human academic judgement. Second, students can escalate any AI-generated formative response to academic review at any time, both to challenge the AI's view and to deepen the discussion. Third, the AI tutor's outputs are themselves audited as part of our Quality Assurance Framework, which is benchmarked against the QAA Quality Code and supervised by the Academic Board.
Trust in any feedback system rests on the ability to challenge it. We build that ability in by design, and we treat the AI as an instrument that academics oversee, not an authority that students are asked to accept.
What the fee pays for
LSI's online master's programmes are typically set at £11,900, with scholarships that reduce the cost to £4,950 for eligible students who need it. The fee covers a fully regulated UK master's degree with the academic infrastructure that implies: assessed modules, external examining, library and journal access, a personal academic tutor, live seminars, summative marking by qualified academics, structured pastoral support, and a recognised qualification awarded under UK frameworks.
What changes relative to a conventional programme is not the quality of academic input but the way academic time is spent. Staff are not paid to deliver the same lecture content year after year; their time goes into design, judgement, supervision, and continuous improvement. Students are not paying for scheduled delivery to a room of fifty; they are paying for continuously available personalised tuition, combined with regular live academic and pastoral contact. The model is more resource-intensive per student in the ways that affect learning outcomes, and lighter in the ways that do not.
A reasonable follow-up question is why anyone would pay this when general-purpose AI tools are freely available. The answer is similar to the answer to "why pay for a university when there are libraries". An individual using a frontier AI model can learn an enormous amount, but cannot validate their own competence, earn a regulated qualification, be challenged by an examiner who has spent years in the field, benefit from external moderation, or build the cohort relationships that become a professional network over a career. A general-purpose chatbot is also a different product from a tutor curated against a syllabus, mapped onto a tracked knowledge graph, briefed on the individual student, and operating inside an environment that knows what has been taught and what is due. The fee buys the regulated degree and the learning system that produces it.
Why we are structured as a non-profit
LSI is structured so that any surplus is reinvested in the institution rather than distributed to investors. There are no external shareholders seeking a financial return. The school has been funded by its founders from previous careers in technology and education, and revenue goes back into its growth, the AI infrastructure that supports it, and student scholarships.
The years of multi-disciplinary work to pass an OfS Degree Awarding Powers assessment, the delegation of authority to an external Board of Governors, the investment in a personal academic tutor system, the appointment of external examiners and academic governors - these are the costs of building an institution that intends to last and to be accountable to its students and to the regulator. There are much easier ways to make money than what LSI has committed to.
Who LSI is built for
LSI is not for every student, and we are explicit about that. It is built for mid-career professionals adapting to AI-driven change in their industries, and for career-switchers moving into technology, consultancy, and innovation-led roles. These are people who have already worked, who need flexibility around work and family commitments, who want their study to map closely to professional application, and who value being part of a cohort going through similar transitions.
For these students, the relevant question is not which institution has the longest history. It is which programme best equips them for the next phase of their career. For someone whose professional thesis is that the next decade will reward leaders who are fluent in AI and capable of building with it, studying inside an institution that is itself an AI-native operating model is a different proposition from studying AI elsewhere as a topic on a slide. We think that proposition will appeal to a particular kind of ambitious student. We do not expect it to appeal to everyone, and that is by design.
A note on the wider sector
The AI debate in higher education is real and serious. Concerns about consistency, academic accountability, the future of academic labour, and the student experience deserve careful answers rather than dismissal. We share most of them, which is part of why we built the model the way we did rather than layering AI onto a conventional structure.
Other UK universities are doing important work on AI in their own contexts, often within constraints (institutional history, scale, employment structures, validation arrangements) that a new institution does not face. The future of UK higher education will not be a single model. It will be several, and the sector will be stronger for having a range of approaches subject to public scrutiny. LSI is one of those approaches, built openly and regulated transparently. We expect to learn from how others are evolving, and we expect to contribute what we learn back.
Closing remarks
The institutions that matter in the next decade will be those that have thought carefully about what AI changes and what it does not. Information delivery has become abundant and instant. Academic judgement, intellectual challenge, sustained relationships with experts, peer cohorts, and the standards of a regulated qualification have not. An AI-native university is one credible response to that shift, and not the only one.
We have built ours in public, under regulatory scrutiny, with academic leadership in the foreground and students at the centre. We welcome the debate, and we welcome the scrutiny that comes with it.
Paymon Khamooshi
Co-founder and President
London School of Innovation