What every educator should be thinking about with AI
by Sarah Horrocks and Michelle Pauli of Connected Learning
With three out of five teachers already using generative AI in their work, most commonly for lesson planning and research, it’s important that educators use AI as safely as possible and are aware of the potential pitfalls.
We’ve all heard about some of the big picture concerns about AI, whether that’s ethics, bias, surveillance, edtech business models, environmental impact or academic integrity, and there are serious discussions to be had in education about these fundamental issues and risks.
But what do teachers and school leaders need to take into account to use AI safely and effectively in their work?
We’ve highlighted four areas for educators and senior school leaders to consider.
GDPR matters here too
Before using any AI tool that processes personal data, you must make sure it complies with UK GDPR regulations. Teachers could violate data protection law if the AI tool stores, learns from or shares personal data without proper safeguards.
Under UK GDPR, if you’re processing personal data – information that identifies an individual, such as pupil names, assessments, or work that contains identifiable details – you must have a ‘lawful basis’. This will depend on why you are processing data and will be determined by your specific circumstances. It must be one of the following: consent, contract, legal obligation, vital interests, public task, legitimate interests.
If you’re a school leader, you’ll need to work with your school’s data protection officer, IT lead, managed service provider or technical support company to understand exactly how each AI tool processes the data you put into it. Then you can create an approved list of AI tools suitable for your school.
Phrases such as “not reviewed by anyone”, “data is not used to train the artificial intelligence model or shared with other users”, “activity isn’t used for model improvements” and “prompts are not reviewed by humans” suggest that data will be safer.
As an educator, you should only use the generative AI tools provided by your school. Free AI tools often don’t have these safety considerations in place, so enterprise or professional AI tools incorporated into suites of secure tools for schools and colleges are much more likely to meet these requirements.
Copyright covers pupil work
Data protection and copyright protection are often confused – but they have important differences. Educators need to make sure they are not infringing any intellectual property rights if they use AI.
Importantly, when teachers use an AI tool to generate feedback on student assignments, the original student work remains the student's intellectual property – they are the copyright owner. The key issue is that AI systems could potentially learn from and incorporate student work into their models without proper permissions, which would violate copyright law. That’s why, as with the GDPR considerations, schools need to understand which products use AI and how, and set out clearly which tools are safe for teachers to use for their work.
Any use of AI tools must be transparent, particularly in critical educational areas such as providing student feedback, and teachers need to carefully review and adapt AI-generated feedback. Even when using secure AI tools with proper permissions and transparency, human supervision of the AI feedback process is essential.
AI literacy is cross-curricular
Discussing the actual use of AI with pupils can be challenging, because of age restrictions on tools and a lack of teacher knowledge or confidence. However, children are already using AI tools for their homework and for fun.
The latest Ofcom media use and attitudes report finds that half of children aged 8-17 have used AI tools, and many are using them for learning and/or schoolwork, including drafting documents and helping with writing style.
There’s currently much discussion about the need for AI literacy in schools and AI literacy frameworks – but also debate about what AI literacy covers.
Like online safety, AI literacy falls between subject areas: Computing, English, Humanities, PSHE/citizenship and RSE. In our Connected Learning newsletter we’ve highlighted concerns around young people’s use of AI companions, and around sexual abuse and exploitation. It needs a critical, ethical approach which is genuinely cross-curricular.
CPD is crucial
Using AI tools can save teachers time when it comes to lesson planning.
An Education Endowment Foundation (EEF) study found that teachers saved about 25 minutes a week if they used ChatGPT – a reduction of about 31% compared to those who did not use it.
However, there is also a time commitment involved in developing the knowledge and skills to use any new technology effectively – Tes found that 9% of teachers said AI had increased their workload – and this is even more true of a technology that is evolving as rapidly as generative AI, and where confidence among teachers varies so widely.
School leaders need to allow teachers time to learn about AI and develop their skills, ideally using the EEF’s essential building blocks to support effective CPD. Shorter but more frequent CPD refreshers and updates may be needed to keep on top of developments.
Learn more
Parent Zone has partnered with the Raspberry Pi Foundation to deliver free training to UK educators as part of Experience AI. The programme includes a wealth of resources to support the teaching of AI literacy.
The DfE has published a set of resources for teachers and school leaders, from early years to FE, on the safe and effective use of AI. The toolkits, with videos and workbooks, can be used as part of group CPD activities or worked through individually by teachers.
Sarah Horrocks and Michelle Pauli are the team behind Connected Learning – offering digital developments for educators, from AI to digital news.