Guidelines for Students on the Responsible Use of Generative AI. These guidelines apply to all assessment in the Faculty of Media & Knowledge Sciences from October 2025.
Generative AI refers to a class of artificial intelligence systems designed to create new content – such as text, images, music, videos, software code, and 3D models – based on patterns learned from existing data. Unlike traditional AI systems, which typically classify or analyse existing data, generative AI models produce new content. For example, they can generate a realistic image from a text prompt, compose a melody, simulate human conversation, or design game environments.
Popular examples of generative AI services include ChatGPT, Claude, Google Gemini, DALL·E, Adobe Firefly, and tools that can write songs or simulate human voices. These systems are powered by advanced machine learning models, in particular neural networks of a type known as transformers.
However, one limitation of generative AI is a phenomenon known as hallucination – when the system generates content that sounds plausible but is factually incorrect or entirely fabricated. This can happen because the model doesn't truly "understand" information in a human sense; it predicts what should come next based on patterns, not verified facts.
It is very important that students critically evaluate AI-generated content rather than assume it is always accurate.
Unless otherwise specified in the study-unit or assessment description, the use of generative AI in the preparation of work to be assessed is encouraged as outlined below. Keep in mind, however, that unacknowledged use of Generative AI in assessed work at the University of Malta is considered cheating and will be treated in the same way as other breaches of the University Assessment Regulations (2009).
2.1 Leverage AI to learn faster and deeper
Exploring ideas with AI can clarify concepts, generate examples, and reveal alternative perspectives. In art, design, and visual communication studies, AI software can assist students in generating initial concepts and ideas, enhancing digital design skills, and experimenting with different artistic elements. It can also offer students a platform to explore their creativity, fostering innovation and critical thinking skills.
2.2 Remember: the same tool that boosts learning can also facilitate cheating
Misrepresenting AI-generated work as solely your own violates academic integrity and will be treated as a breach of University regulations.
2.3 AI is a collaborator, not an author
The student presenting the work for assessment remains wholly responsible for accuracy, originality, structure, style, and academic honesty in every submission.
2.4 Verify before you trust
Large models occasionally “hallucinate” (i.e., provide incorrect facts, references, dates, and so on). Fact-checking any AI-generated content is mandatory.
If you cannot independently confirm a fact, claim, or citation produced by AI with reputable sources, do not use it. This applies both to written work and to practical work, given the editorial responsibilities of producers of artworks, particularly work presented as factual, such as articles, podcasts, and documentary films.
2.5 Use AI to refine essays, reports, and presentations—never to fabricate them
The use of Generative AI is permitted in the following cases (unless the assessment description specifies otherwise):
The critical analysis, judgement, and final work/wording must be the student’s own.
2.6 Protect private and proprietary data
Do not paste assessment briefs, unpublished research, personal information, or sensitive datasets into public AI services. These may then become available to other users of the service.
2.7 If you use AI, maintain an audit trail and include it in your work
Record every prompt or query you give to a generative AI model, in chronological order, in an appendix to your work. Generally, there is no need to include the actual output of the AI model, only the prompts or queries. However, individual study-units may have different requirements; check the assessment and study-unit descriptions or ask the study-unit coordinator to ensure you provide what is required. To maintain privacy, leave out any personal identifiers when compiling the appendix.
2.8 The use of Generative AI in practical projects
The use of AI in creative projects should serve as a tool to amplify creativity, not replace it. It is best used for repetitive tasks, freeing up more time for creative activities.
2.9 Copyright considerations when using Generative AI
If you use any AI-generated content in your work, it is your responsibility to ensure that it does not infringe existing copyrights. Where the content is based on or inspired by copyrighted material, the necessary permissions or licenses must be obtained.
2.10 Cite AI assistance transparently
Transparency should be exercised at all times. If Generative AI is used, the text should include a brief note such as “Draft refined with ChatGPT, 12 January 2025.” or “Initial ideas for video generated with Claude, 12th January 2025”. Use the Faculty citation style for full references (e.g., APA: OpenAI. (2025). ChatGPT [Large language model].).
In practical projects, make sure that any AI-generated content (images, soundtracks, voice-overs, and footage) is always acknowledged through the use of visual cues (lower thirds, overlays or watermarks) and clearly labelled/credited, so that the audience is made aware of its origin.
2.11 Keep within assessment-specific limits
If an assessment description does not allow Generative AI assistance, that rule overrides these guidelines. When unsure, consult your lecturer before using Generative AI. See also 2.7 above.
Generative AI is actively encouraged as a learning accelerator – provided you stay in control, verify every outcome, document your prompts, and uphold the highest standards of academic integrity by clearly stating when and how you used AI.
Remember that the unacknowledged use of Generative AI in assessed work at the University of Malta is considered cheating and will be treated in the same way as other forms of cheating in terms of Article 38(1)(f) of the University Assessment Regulations (2009), which states the following:
“Students shall not submit work which is not truly their own. In such cases, the student shall be called for an oral examination. If during the oral examination it is confirmed that there is a serious mismatch between the quality of work submitted and the performance of the student during the oral examination, a report shall be made to the University Assessment Disciplinary Board.”
The Registrar has also published FAQs on the use of Generative AI Tools in assessments at the University of Malta.
Notes