About this policy
This policy outlines the guidelines and procedures related to malpractice and plagiarism, including the use of artificial intelligence (AI) and unattributed references. It is designed to ensure that AI is used responsibly and properly acknowledged, and to safeguard the integrity of assessments and educational outcomes. We will not tolerate anything less than the highest standards of academic integrity: cheating, including passing off the work of others as your own, will not be accepted. This policy applies to students, staff and volunteers involved in AHLU activities.
Definition of terms
‘Malpractice’, including maladministration, means ‘any act, default or practice which is a breach of the regulations that apply to the exam or assessment being taken. This can involve centre staff as well as students.’ (Source: https://www.jcq.org.uk/exams-office/malpractice/)
‘Plagiarism’ means ‘unacknowledged copying from, or reproduction of, third party sources or incomplete referencing (including the internet and AI tools)’. (Source: https://www.jcq.org.uk/wp-content/uploads/2023/07/Plagiarism-in-Assessments.pdf)
‘Artificial intelligence (AI)’ refers to technologies that simulate human-like intelligence and decision-making based on data analysis and pattern recognition. This includes generative AI products, e.g. OpenAI’s ChatGPT, and products integrated into productivity suites e.g. Microsoft Copilot.
‘AI misuse’ includes, but is not limited to, using AI tools to generate content or answers without proper acknowledgment, submitting AI-generated work as one’s own, and plagiarising AI-generated content.
1. Understanding AI and Its Risks
1.1 Risks of AI Use: The primary risks relate to plagiarism, misuse of AI-generated content and violations of academic integrity. A follow-on risk of over-reliance on AI tools is that students do not learn to use sources critically, to construct balanced and contextualised arguments, or to challenge statements based on their own substantiated judgements.
1.2 Treatment as Malpractice: AI misuse will be treated as a serious form of malpractice, in line with Art History Link-Up’s broader policies on academic integrity (insert link to Code of Conduct). Penalties for AI misuse may include grade reduction, re-assessment, communications with parents/guardians, reporting to exam board following an internal investigation, and withdrawal from the programme.
2. Proper Use of AI
2.1 When AI May Be Used: Students may use AI tools for research, but any such use must be acknowledged.
2.2 Acknowledgment of AI Use: If AI tools are used to assist in the creation of work, students must clearly acknowledge and reference the AI-generated content; failure to do so will be considered a breach of academic integrity. Where AI tools have been used as a source of information, a student’s acknowledgement must show the name of the AI tool used and should show the date the content was generated.
2.3 Confidentiality: Names, private details or other recognisable data must not be entered into AI tools such as ChatGPT. Doing so is a serious breach of confidentiality, as data entered into AI tools may currently be harvested by the technology.
Examples of AI malpractice are defined in Appendix A of the Joint Council for Qualifications guidance: https://www.jcq.org.uk/wp-content/uploads/2024/07/AI-Use-in-Assessments_Feb24_v6.pdf
3. Integration of AI in Learning
3.1 Teaching and Learning Material: Lead teachers should experiment with, and familiarise themselves with, AI technologies. Using these tools is crucial for understanding their potential and limitations in the educational context. AI tools may be useful for testing students’ understanding of AI-generated content and their ability to engage critically with the technology. For example, teachers may use AI tools to generate essays that fit varying assessment objective levels, both to explain a basic essay structure and to point out AI limitations, e.g. regarding visual analysis and the complexity and subtlety of argument.
3.2 Guidance and Support: Students will receive guidance on how to responsibly use AI tools and acknowledge their usage.
4. Prevention, Monitoring and Detection
4.1 AI Misuse Prevention: At the start of the Art History for Everyone Programme, teachers should set reading and research homework and use at least one lesson per term for essay-writing practice, to support students’ writing skills, to grow their confidence and to develop a good sense of every student’s individual working level, writing style and tone.
4.2 A focused discussion, at least once a year, on referencing, avoiding plagiarism and the use of generative AI.
4.3 AHLU will run all students’ submissions through plagiarism software.
4.4 Specific training for AHLU core teaching team on malpractice, plagiarism and the use of generative AI.
4.5 Inclusion of a specific clause in students’ code of practice regarding malpractice, plagiarism and the use of generative AI.
5. Consequences of Suspected Malpractice, Plagiarism or Misuse of AI
If malpractice, plagiarism or misuse of AI is suspected, a thorough investigation will be carried out before any formal action is taken.
For more information, refer to JCQ guidance on malpractice. https://www.jcq.org.uk/exams-office/malpractice/
Policy to be reviewed August 2025