In the Department of Media and Communications, our priority is to ensure that students have the best possible opportunity to develop their critical thinking, hone their analytical skills and refine their ethical sensibilities in relation to the opportunities and challenges that media and communications present in theory and practice. Students share these goals, but in addition they want to develop and use associated skills (research, writing, presentation) in a way that makes sense to them in the context of their daily lives, and that will be useful once they leave LSE to pursue their careers.
The departmental policy distinguishes between AI-powered tools and generative AI tools.
- AI-powered tools use artificial intelligence to support (academic) learning, writing and research. They are now embedded in the ways in which students go about their day-to-day work, through tools like typing assistants, search engines, and referencing databases. Many of these use machine learning to identify and generate predictable outcomes (e.g. correct grammatical constructions in a particular type of sentence, or correct requirements for a particular referencing style).
- A sub-set of AI-powered tools, often called generative AI tools, use deep learning and Large Language Models (LLMs) to produce outputs that more closely mimic human activities, thought processes and outputs. These tools include Microsoft Co-Pilot, ChatGPT, Canva, Midjourney, and many others. Students tend to use generative AI tools to support their learning by activities such as summarising content, doing an initial survey of literature in an area, or simplifying articles and ideas.
This departmental policy addresses the reality that AI-powered and generative AI tools are now part of the academic landscape, and the need to balance the use of these tools with ethical awareness about their use, as well as ensuring the integrity of students’ learning, assessment and achievement in the Department.
Fundamentally, we both encourage and expect students to be careful and reflective in their use of AI-powered and generative AI tools. These tools may be used to support, but not replace, your own intellectual effort, both in class and in assessments, and should be used sparingly. What may appear a useful shortcut in the short term (for example, summarising literature in a particular area of media and communications scholarship) may limit the development of knowledge and insight in the longer term (for example, a more in-depth understanding of the different positions in the field and the nuance of their arguments). Moreover, the tools themselves present a range of ethical challenges – for example, in relation to their environmental impact, their use of intellectual property without permission, the exploitation of labour to train AI datasets, and unequal access to different generative AI tools (e.g. paid or unpaid versions).
In line with these expectations, students must adhere to the following rules:
During the year:
- You may use both AI-powered tools and generative AI tools (based on deep learning) during the course of learning and teaching throughout the year, with the limits listed below. In the case of generative AI tools, you should be aware that these tools are far from perfect and may produce incorrect or nonsensical results. When using generative AI tools, you should always review the content they deliver against other sources to ensure accuracy and evaluate whether or not the output is actually useful for you. Where you do use generative AI tools, we encourage you to use Microsoft Co-Pilot. The LSE has a site-wide licence for Co-Pilot, and this ensures that all students have access to the same level of generative AI support.
- You may not use generative AI tools to auto-translate the spoken content of lectures, because this contravenes the intellectual property rights of the teachers developing and delivering the content.
- You may not use generative AI tools to auto-translate the spoken content during seminars, because this may also contravene intellectual property rights of the teachers, and of students who may be presenting or contributing. Seminars are designed to be participatory, and your role in this is essential. You should focus on reflecting on the topics and readings related to the seminar with your fellow students, and engage in the discussions wherever you can.
For assessments:
- You may use AI-powered tools in the process of producing an assessment.
- You may not use any generative AI tools (including Microsoft Co-Pilot) in the process of producing an assessment, including writing drafts of all or part of a proposed submission; producing assignment structures, titles or topics; or changing your own writing into a different style.
- If, during the course of the year, you have used generative AI tools for learning about topics related to your assessment, you may of course use the learning you have developed while using them. However, you may not copy any of the content produced by a generative AI tool into your assessment (see examples below).
- All assessments should be written in your own English, and you may support your writing with AI-powered tools such as editorial assistants or thesaurus tools. However, you may not use translation software to translate all or part of your assessment from another language into English for submission, because your assessments should be your own original writing. Any writing that is not yours (e.g. citations from academic texts) should be properly referenced back to its original source.
Using generative AI tools in breach of these rules will be treated as a case of academic misconduct, with measures taken in accordance with the School’s expectations and policy on academic misconduct.
Examples:
A student uses a generative AI tool (e.g. Microsoft Co-Pilot) to obtain a summary of the Circuit of Culture model. They use this summary when reflecting on their course readings and lectures on the topic, to develop their understanding before and after a seminar on the topic – permitted
- The student then uses the understanding they have developed to write their assessment on the topic – permitted
- The student inserts the summary from the generative AI tool into their assessment on the topic – not permitted
A student uses translation software to check the meaning of a word or phrase in a course reading, primary source or in lecture slides – permitted
A student uses translation software to translate their original essay, from another language, into English, for submission – not permitted
A student enters key terms into a search engine to search for theoretical literature relevant to a particular aspect of Development Communication – permitted
A student enters a series of bullet points into Canva and asks it to create a presentation based on them – not permitted
A student uses Canva to draw a diagram of their conceptual framework for their dissertation – permitted
A student is writing an assignment and uses referencing software to insert references into the body of an essay and compile the reference list – permitted