AI, education and ethics: who should profit from learning?
Anthony Seldon, vice-chancellor of the University of Buckingham and former headmaster of Wellington College, reflects on a recent taxi ride.
“Taxi drivers of old would [study] for two, three, four years,” he says, referring to “The Knowledge”, the infamously difficult test taken by London’s black cab drivers to measure their familiarity with the UK capital’s streets.
“They took great pride in the effort. Now it’s mostly worthless because there’s something that does it much better,” he says. “However good their knowledge, they can’t know what’s going to happen round the next corner.”
The world of education is heading for an unpredictable destination. New technologies, especially pattern-recognition software commonly referred to as artificial intelligence, are descending upon the learning process, changing how we retain and test knowledge.
Artificial intelligence in education relies on the idea that software can recognise patterns in students’ performance, highlighting strengths and weaknesses while enabling them to improve more rapidly.
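At its simplest, "recognising patterns in students' performance" can mean aggregating per-topic accuracy and flagging weak areas. The sketch below is a minimal illustration of that idea, not any vendor's actual method; the function names, threshold, and data shape are all assumptions.

```python
from collections import defaultdict

def topic_accuracy(attempts):
    """attempts: list of (topic, correct) pairs -> {topic: fraction correct}."""
    totals = defaultdict(lambda: [0, 0])  # topic -> [right, total]
    for topic, correct in attempts:
        totals[topic][0] += int(correct)
        totals[topic][1] += 1
    return {t: right / n for t, (right, n) in totals.items()}

def weakest_topics(attempts, threshold=0.6):
    """Topics where accuracy falls below an (illustrative) mastery threshold."""
    acc = topic_accuracy(attempts)
    return sorted(t for t, a in acc.items() if a < threshold)
```

Real systems layer far more sophisticated models on top, but the underlying signal is the same: which topics a student keeps getting wrong.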
Sir Anthony, who is a joint leader at the university’s Institute for Ethical AI in Education, says transformations in education are scarcely acknowledged compared with frequent discussions of technological shifts in other sectors.
“Government gets AI in transport, in health sciences, in retail,” he says. “It does not get AI in education — it doesn’t get the potential, it doesn’t get the risks. What’s more, it doesn’t know that it doesn’t get it.”
Many companies in education AI are still in the early stages. Up Learn, a UK-based company, has received £2.5m in backing, including funding from Holly and Sam Branson, children of entrepreneur Richard Branson.
Up Learn’s software provides animated video lessons, which continuously test students, and it guarantees an A*/A equivalent exam grade — or your money back.
“It’s constantly evaluating what you know and don’t know, taking into account what you’re getting right and wrong but also things like the rate at which you’re forgetting information, and the optimal time for you to re-practise topics to reduce forgetting,” says Guy Riese, Up Learn founder.
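The scheduling idea Mr Riese describes — modelling the rate of forgetting and timing re-practice to counter it — is usually illustrated with an Ebbinghaus-style exponential forgetting curve. Up Learn's actual model is proprietary; the sketch below is a standard textbook simplification, and the parameter names and 90 per cent recall target are assumptions for illustration.

```python
import math

def recall_probability(days_since_review: float, stability: float) -> float:
    """Exponential forgetting curve: P(recall) = e^(-t/S).
    `stability` S is how many days it takes recall to drop to 1/e."""
    return math.exp(-days_since_review / stability)

def next_review_day(stability: float, target_recall: float = 0.9) -> float:
    """Days until predicted recall decays to the target, i.e. when to re-practise."""
    return -stability * math.log(target_recall)
```

Under this model, material with a stability of 10 days should be revisited after roughly one day to keep predicted recall above 90 per cent; each successful review would then increase the stability, pushing the next review further out.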
In developing economies such as India, shortages of teachers are often cited in support of introducing technological solutions. However, there are concerns over the use of technological approaches in developing economies, especially when they are driven by profit-seeking organisations and perceived to reduce contact with teachers.
Commercial development has sometimes proved controversial in developing economies. Bridge International Academies, a for-profit organisation that says it has taught 1m nursery and primary school students globally since 2009, draws on “near real-time data, analytics and advanced technology to constantly make changes that improve pupil learning outcomes”.
It has drawn criticism of its for-profit model from Education International, a global federation of teachers' trade unions, and in the past faced challenges over the licensing of its schools in Kenya.
Bridge says it has licences in “all countries in which it supports community schools”, including Kenya, and that it is proud to tap new sources of financing “to tackle the global learning crisis with a model that has enabled national and state governments to transform their public education systems at speed and scale”.
Advocates of AI focus on the advantages of new methods. Mark Minevich, an adviser at IPsoft, an AI company, and former chief technology officer at IBM, suggests that technology will allow a process of "customising education for each individual student", as opposed to the "generic" approach that still prevails in classrooms. Such customisation depends on gathering far more data about each student's performance.
“AI will be introduced into classrooms, introduced into education, but it gives you an opportunity to analyse all this information,” says Mr Minevich. “I think AI has the distinct possibility to get all of this information on a continuous basis — almost like a pacemaker.”
This raises the spectre of privacy concerns similar to those currently echoing across the data-intensive world of online advertising. “It is challenging to implement this,” Mr Minevich adds. “It comes down to privacy and ethics.”
According to Mr Riese of Up Learn, constant data collection would represent an improvement on the current means of assessing students, by drawing on hundreds of hours of data, rather than a short exam. “It’s a flaw of the current exam system that some students must fail in order for the normalised grading system to work,” he argues. “If all students have mastered a domain, they should all get A*s.”
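Mr Riese's complaint is the difference between norm-referenced grading, where top grades are rationed to a fixed share of the cohort, and criterion-referenced grading, where anyone who clears a mastery threshold earns the grade. The sketch below contrasts the two; the top-10-per-cent quota and the 90-mark threshold are illustrative assumptions, not the actual rules of any exam board.

```python
def norm_referenced(scores, top_fraction=0.1):
    """Award A* only to the top fraction of the cohort, however well all did."""
    cutoff_index = max(1, int(len(scores) * top_fraction))
    cutoff = sorted(scores, reverse=True)[cutoff_index - 1]
    return ["A*" if s >= cutoff else "other" for s in scores]

def criterion_referenced(scores, mastery_threshold=90):
    """Award A* to everyone who clears a fixed mastery threshold."""
    return ["A*" if s >= mastery_threshold else "other" for s in scores]
```

With a cohort where every student scores 90 or above, the criterion-referenced scheme awards everyone an A*, while the norm-referenced quota still denies it to most of them — exactly the flaw Mr Riese points to.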
As with computerised taxi systems, there are unpredictable elements to the role of technology in education. It is much less clear what new software might mean for the aspects of education that cannot easily be quantified and assessed.
There is also a risk that new solutions can create deeper problems than the ones they solve. “It might make us happier, brighter — it might make us less balanced as learners,” says Sir Anthony. “Life isn’t just about racing.”