
Artificial intelligence (AI) tools

Information about the University's policy on the use of artificial intelligence tools

Artificial intelligence (AI) tools have had a lot of coverage in the news recently, including how they can be used in workplaces and universities. You may have heard of tools such as ChatGPT, DALL-E 2, Copilot, and Google Bard, although many more are available for different purposes.

When it comes to your course, inappropriate use of these tools can negatively impact your learning as well as affect your confidence in your qualification and ability.

While such tools may seem like time-savers, their potential and limitations are still not fully understood. So far, we know that the material they produce may be out of date, incorrect, or entirely fictitious, including false references and quotes.

We're also aware that because AI models are trained on the data they are exposed to, they can inherit biases from that data. Responses or information you take from such tools may therefore reflect these biases and reproduce discriminatory attitudes and beliefs.

Our stance on the use of AI tools

The University’s academic integrity policy (UPR AS14 Appendix III) sets out our stance on plagiarism, including fabricated referencing, which is a common problem with AI-generated material. It is therefore crucial that you do not use AI tools to generate an assessment and submit it as your own work; to do so will constitute academic misconduct.

When could I use an AI tool?

The only occasion on which you may use AI tools in your assessment is when you have explicit permission from your tutor in your assessment brief. Your assessment brief will include information on how to declare any use of such tools, and you can speak to your tutor for guidance.

If you do not declare your use, this will constitute academic misconduct. Our current University policy on academic misconduct already covers the misuse of such tools, but we are updating it to be clearer on the matter.

Unauthorised use of artificially generated (AI) material in researching or presenting material for an assessment is an academic misconduct offence, unless the use of AI tools is expressly permitted. Even where such use is expressly permitted, if you do not declare that you have used an artificial intelligence tool(s) in the production of your assessment, or you are dishonest about the extent to which such tools have been used, you will have committed academic misconduct.

Future of AI tools

As AI tools develop, Herts will continue to discuss their potential and limitations and review our policies. The University’s Graduate Attributes expect our students to be ethical and evidence-based in their work, and any future permitted use will therefore be in line with the academic integrity principles set out above.

If you have questions about how this might affect your assessments, please speak to your module leader, programme leader or personal tutor.