24 JAN 2024

5 changes experts suggest on AI in colleges

The Department for Education should fund trials of potentially effective AI to check if it does actually boost students’ outcomes, government experts have said.


A long-term strategy on the use of generative artificial intelligence (AI) such as ChatGPT in schools and colleges is also needed, the government’s open innovation team said today.

The DfE had asked the team to explore the opportunities and risks for AI in education, including proposals on what needs to change.

Researchers publishing “educator and expert views” reviewed existing evidence and interviewed teachers across all phases of education, including four from further education institutions. 

Separately, the DfE has also updated its school and college technology standards on how devices should be accessible for students.

Here’s our round-up of everything you need to know …

1. ‘Flipped learning’ could increase
Experts warn a long-term generative AI strategy is needed to set “the direction” of travel. Long-term planning should explore how AI could change education models, including implications for the role of teachers and classroom-based learning. 

For example, “flipped learning” may become more pronounced, experts said. This is where students engage with learning materials outside of the classroom and come to a lesson with basic knowledge to participate in more “interactive activities”. 

This strategy should be “future-proofed to keep pace with technological advancement”.

Forums made up of students, experts and practitioners should also be set up to share knowledge about future developments in AI. 

2. Give colleges funding to evaluate ed tech impact
Experts said there is a “growing need” for a larger evidence base to help educators make informed decisions about the effectiveness of genAI tools. 

Key evidence gaps include its impact on students’ outcomes, especially for disadvantaged students and those with special educational needs and disabilities (SEND). 

Ministers should set “metrics that matter”, such as student outcomes over engagement, and ensure tools are pedagogically grounded and can be routinely evaluated. 

This will require incentives and resources, as colleges are “unlikely to do this themselves” and the ed tech sector has a “vested interest” in showing effectiveness. 

They suggest making funding available to colleges to carry out evaluations, as well as building on existing schemes such as the Oak National Academy curriculum quango. 

3. Research funding needed to help teachers detect AI
As AI-enabled academic malpractice rises and becomes more sophisticated, it will become harder for teachers to identify its use, experts warn. 

They say research funding is needed to support the development of tools that reliably detect AI-generated work, as well as other initiatives that could help. 

This includes watermarking, which embeds a recognisable unique signal into AI creations. 

Safety, privacy and data protection accreditations could help reassure users. 

4. Consider how to prevent ‘digital divide’
The curriculum should be updated to reflect how students use AI, or to integrate AI tools as an explicit part of learning and assessment. 

It should also be changed to meet employer needs going forward, which will require collaboration between employers, government, awarding bodies and educators. 

But experts warn generative AI could exacerbate “the digital divide” in education and there is already an emerging difference between state and independent schools’ use of the technology. 

Government should consider how to support access by all teachers and students, they said. Evidence-informed guidance and advice should be easily accessible through trusted platforms.

5. ‘Be transparent on impact evidence’, Keegan tells edtech firms
Experts warn more research is needed to better understand the intellectual property implications of generative AI. This includes the infringement of IP rights arising from the data input into generative AI models.

Traditional educational publishers could be left behind, the report warns, as teachers and students turn to generative AI to produce educational resources. 

“Support for educational publishers may be needed to ensure we have a sustainable publishing sector underpinning the education system,” it adds. 

Speaking today at the BETT show, education secretary Gillian Keegan also said “we should have the same expectations for robust evidence in edtech as we do elsewhere in education.

“Ed tech business should be leading the way – being transparent with buyers and promoting products based on great evidence of what works.”

What colleges need to know from updated tech guidelines…
Last week, the DfE said colleges should now assign a senior leadership team member to be responsible for digital technology, as part of updates to its technology standards guidance. 

They should then create a minimum two-year strategy including what devices might need to be refreshed or replaced. Laptops should be safe and secure as well as energy efficient. 

In another update today, colleges were told devices and software should support the use of accessibility features including for disabled students. 

Websites should be accessible for everyone and digital accessibility should be included in a college’s policy.

Original story via FE Week