Over the past 18 months or so, as GenAI technology and our knowledge and use of it have developed, so has our understanding of the challenges and ethical issues surrounding its responsible application. The initial (and continuing) focus has been on the challenge to academic integrity: the fear of an increased risk of cheating, alongside these tools’ tendency to “hallucinate” references. But this is just the tip of the iceberg.
The United Nations Educational, Scientific and Cultural Organization (Unesco) has released a series of guides that outline the main challenges and ethical issues in detail. You can read them in full, but in summary the issues are:
- The lack of regulation of GenAI tools
- Data protection and privacy concerns in how these tools collect, use and store personal data
- The cognitive biases that the tools have learned from the data corpus on which they have been trained (ie, the internet, and the wonders of Reddit)
- Specific issues around the production and dissemination of content that is discriminatory in terms of gender and diversity
- The equitable accessibility of these tools across the globe, particularly in places where there is extreme digital poverty
- The increased commercialisation of these tools, with higher performing “paid for” versions being available to those who can afford the subscription.
One ethical issue absent from this initial précis, however, is the environmental impact – particularly in terms of energy and water usage. We are all willing to swap plastic bags for reusable ones because we can see the potential environmental impact – but are we considering the energy consumed by every question we ask ChatGPT? Then there are copyright and intellectual property issues, over which court cases against OpenAI, the creator of ChatGPT, are ongoing.
Just as we cannot ban this technology because of the potential for cheating, we are unlikely to win the ideological war of banning or discouraging the use of these tools as a consequence of their environmental and ethical challenges. However, we can ensure that ethical and environmental issues are embedded in discussions about AI use and its implications and incorporated into future work to develop fair, transparent and responsible use of AI. Indeed, we can’t – and shouldn’t – single out AI for special treatment here, as we need to be aware of the environmental, data security and ethical implications of using any technology. Therefore, the higher education sector, including the University of Exeter, is advocating for ethical and responsible use of these tools in light of these issues. But for staff and students, what does this mean in practice?
To start the conversation, we have developed some central guidelines for the ethical and responsible use of GenAI tools. These are designed to be adapted at the local level to meet disciplinary needs but provide an insight into the institutional expectations for staff and students, and a framework from which disciplinary guidance can be developed.
Ethical and responsible use of GenAI tools
1. GenAI should be used where it enhances and adds value to the intended learning outcomes of a module, and/or the student learning experience – GenAI tools have the potential to help students collate large quantities of information while practising their critical skills in assessing GenAI outputs. In addition, there is growing research about how we can use GenAI to personalise learning, either by supporting staff to develop personalised learning resources or by acting as a study tutor for students. These tools should not, however, be used to replace learning or the intended learning outcomes.
2. The use of GenAI will be transparent, including declarations in student assessment and staff-generated learning materials, and adherence to referencing guidelines. Both staff and students must be transparent about when and how they have used GenAI as part of their work. As well as their learning potential, GenAI tools can enhance the learning design process, improve feedback writing and create efficiencies in busy workloads. Staff should model in their own use the transparency that we expect of students.
3. The use of GenAI will be in line with our information governance policies, with particular reference to data protection and GDPR. We must ensure that no personal or sensitive information is input into AI tools that are not GDPR compliant. If students wish to use GenAI tools as part of their data analysis, we should signpost them to existing software such as R or NVivo, where auto coding makes use of GenAI. Similarly, staff should not feed assessments into ChatGPT to generate feedback for students. Instead, educators could input their initial feedback notes to ensure that their feedback is constructive and kind, or follows a particular feedback model.
4. The use of GenAI must always be critical and reflexive. We encourage staff and students to apply their critical and evaluative skills to GenAI outputs as they would with any source materials, paying particular attention to a) the accuracy and source of information and b) the cognitive biases inherent in GenAI tools. This also applies to being aware of the challenges and ethical issues inherent in these tools and being critical in the choice of when and how to use them.
5. The use of GenAI tools will be in accordance with our policies and maintain academic integrity. Direct copying of AI-generated content falls under the definitions of plagiarism, misrepresentation and contract cheating set out in our teaching and quality assurance manual.
As an institution, we will continue to iterate our guidance as this technology and its use continue to develop apace. But we encourage institutions, staff and students to consider the above and ask:
- What in this list is/isn’t applicable to my discipline?
- What is missing from these guidelines?
- How will these principles impact my use of GenAI tools in practice?
Kelly-Louise Preece is head of educator development at the University of Exeter.