AI in Education: what trust leaders should be thinking about
This article looks at the opportunities, challenges and risks of generative AI for schools and trust leaders.
As a starting point, school and trust leaders should consider the Department for Education’s ‘Generative artificial intelligence (AI) in education’ guidance, which sets out the DfE’s position on the use of generative AI.
The opportunities
Whilst there are undoubtedly complex legal and moral issues to navigate with AI (some of which we will consider below), there are also significant potential opportunities to exploit. These include:
Personalised learning
This is thought to be one of the most promising applications of AI in education. AI can be used to analyse a student’s learning style, strengths and weaknesses and to tailor educational content to their specific needs. This means that students could learn at their own pace, focusing on areas where they need improvement and skipping over material they’ve already mastered. This could lead to improved learning outcomes and a more engaging educational experience.
AI could also operate as an artificial teaching assistant, with significant potential to help teachers personalise and differentiate learning when working with pupils and students.
Automating administrative tasks
AI could help to automate many of the administrative tasks that take up a significant amount of time for teachers and administrators. This includes tasks like marking, scheduling classes, and tracking student attendance. By automating these tasks, teachers might spend more time on instruction and less time on paperwork. Likewise, many administrative tasks could in theory be given to AI systems to monitor and action, or AI could be used to make those tasks quicker – for example writing letters, compiling and updating data, and so on.
Enhancing accessibility
For example, speech recognition technology could help students with disabilities to interact with digital learning materials. Similarly, AI-powered translation tools can make educational content accessible to students whose first language is not English or who speak very little English. It may also be possible to use AI to create tools that make learning more accessible for students with special educational needs and disabilities (SEND).
Predictive analytics
Although its outputs will need scrutiny, AI can analyse data to predict trends and outcomes, which can be particularly useful in education. For example, it might help to identify students who are falling behind, or who are at risk of doing so, allowing more effective and timely interventions to be put in place. It might also predict which teaching strategies will be most effective for a particular group of students.
Teaching the use of AI as a tool or skill
AI is already affecting the day-to-day lives of millions of people globally. Every reader of this article has interacted with it, because this article was written with the aid of AI (albeit with significant amendments and checks made by humans).
Like computing, the use of AI and how to get the best out of it is a skill and aptitude in and of itself, and current and future generations will be expected to know how to use it. Enterprising schools are therefore likely already ‘teaching’ AI and its application to pupils, or looking at ways to do so, to better inform them and maximise their future opportunities and prospects.
The risks
Whilst there are clear opportunities for those who can successfully apply and integrate AI into their schools, there are also risks that need to be considered and mitigated.
Data protection
Data protection is a central aspect of implementing AI because, ultimately, AI requires large amounts of data to function effectively.
In particular, AI is likely to require access to specific information about students and employees, including their work and/or academic performance, personal characteristics, and even behavioural patterns.
Where AI is used to assist with access to education, this heightens the issues around special category personal data and how it is accessed, used and managed.
These issues raise significant data protection questions for individuals and institutions, and suggest that data protection policies and privacy notices will need to be substantially updated and adapted to account for AI use.
The UK’s data protection regulatory framework is still developing in this area, though the ICO has released some helpful guidance on the data protection implications of AI.
As a key step prior to implementing any AI system, schools should complete a Data Protection Impact Assessment (DPIA) to assess the nature, scope, context and purposes of the proposed data processing, and to identify and minimise the risks to data protection and broader privacy rights. Schools and trusts will also need to ensure that their Data Protection Officer (DPO) is up to date on the issues around AI, so that the DPO can oversee the data protection strategy and its implementation, ensure compliance with data protection laws, serve as the point of contact for data protection enquiries, issues and breaches, and help to foster a data protection culture within the school or trust.
Discrimination
Whilst AI can be used as an analytical and predictive tool, thought and care needs to be given to the reliance placed on it. For example, an employer using AI to assess job candidates may find some of the insights offered by AI helpful, but AI should not be relied upon to make any decisions about which candidate is ultimately selected.
This is because various AI models have in the past demonstrated errors and bias arising from the data pools on which they draw, and biases in those data pools can skew an AI’s reasoning. Employers will need to keep in mind that AI could exacerbate existing equality, diversity and inclusion issues, such as a lack of ethnic or gender diversity in positions of senior leadership, where those issues are already present in schools and trusts and the AI used to help decide who to recruit is drawing its data from the sector.
Likewise, using AI to monitor pupil or staff performance may lead to issues of disability discrimination if the data pools used by the AI to compare performance do not take disability characteristics into account. At least for now, human input is likely to be needed to consider issues such as reasonable adjustments and individual circumstances. In any event, employers will need to treat insights provided by AI with care.
Plagiarism
Plagiarism is a serious concern in educational institutions and the advent of generative AI is likely to change the dynamics of this issue.
On the positive side, AI-powered tools are becoming increasingly effective at detecting plagiarism and can be used to scan vast databases of academic papers, books, and online content to identify instances of student plagiarism and even paraphrasing. This can help schools uphold academic integrity and ensure that students' work is original.
However, AI also has the potential to facilitate plagiarism and to make it harder to detect. Advanced AI models can generate text that is coherent, contextually relevant, and even creative. Students might be tempted to use AI text generators to produce essays or coursework that bypass the learning process, raising serious concerns about academic integrity. Likewise, teachers may use AI to generate class resources that intentionally or unintentionally plagiarise the work of others. There is also the potential for materials created by teachers using AI to be inaccurate.
Plagiarism therefore represents a potential threat to schools, as artists, authors and academics, amongst others, may take action to defend their intellectual property and copyright, although at this time legal action has focussed on the AI tech companies themselves.
To mitigate the risk of AI-enabled plagiarism and inaccuracies, schools and trusts might consider adapting their policies and practices. This could include:
- educating students and staff about the ethical use of AI, emphasising that using AI to generate academic work or learning materials raises accuracy, plagiarism and copyright issues;
- updating relevant policies such as staff and pupil codes of conduct to explicitly deal with generative AI; and
- considering the use of advanced plagiarism detection tools that can identify AI-generated text.
Conclusions
Like computing, the internet and social media, AI seems set to become more and more integrated into our everyday working and social lives. And like the technologies that went before it, AI will likely have positives and negatives. However, with the right thought, planning and advice, AI offers the sector significant opportunities.
AI is also developing quickly and, as in any other sector or industry, school and trust leaders cannot afford to ignore its potential impacts.
How we can help
At Wrigleys we have a wealth of experience of dealing with clients in the education sector. Our team combines expertise in governance and structural issues with practical experience of handling questions regarding student and parental rights and disputes, employment issues and more. If you are interested in the issues set out in this article, or require assistance with any other queries, we’d love to hear from you.
If you would like to discuss any aspect of this article further, please contact Alacoque Marvin or Michael Crowther or any other member of the education team on 0113 244 6100. You can also keep up to date by following Wrigleys Education on X.

The information in this article is necessarily of a general nature. The law stated is correct at the date (stated above) this article was first posted to our website. Specific advice should be sought for specific situations. If you have any queries or need any legal advice please feel free to contact Wrigleys Solicitors.