Designed for Integrity

Smart assessment design supports academic integrity and student engagement and can minimise cheating


The barriers we place on learners, intentionally or not, can exacerbate the stakes and promote the fears and feelings that lead students to cheat (Gray, 2022, p. 196).

If we set tasks that students actually want to do, teach them how to do those tasks well, and don’t schedule them all at once, fewer students might be tempted to cheat. But what about determined cheaters – can assessment design also improve assessment security? (Dawson, 2021, p. 129).

Good teaching can reduce some of the factors that may lead students to cheat. This involves a focus on ensuring that students are learning (Bertram Gallant, 2017). Before reviewing task design, consider the following: 

  • How do you connect academic integrity to subject and program learning outcomes and professional standards in your discipline?
  • How do you ensure that students are well prepared for tasks and that expectations are clear?
  • What opportunities are there for early formative feedback in line with clauses 15 and 30 of the Assessment Policy?
  • How do you ensure that students are aware of the risks of cheating and detection?
  • How can you increase students’ sense of belonging to the subject/discipline and academic community?
  • How can you manage subject and program assessment schedules to reduce stress? 
  • How do you model good practice in academic integrity?

Design for integrity needs to take into account:

  • Students’ diverse academic and language experiences (Murphy, 2022)
  • Ways that expected academic writing standards, norms, and authorship and contribution practices may differ from students’ previous experience, including their experience of authoring and sharing texts and files on social media (Corrin et al., 2019; Rogerson & Basanta, 2016).

As educators, we have a duty-of-care obligation to others and we must therefore act to address academic misconduct, but not without a consideration of the costs and burdens it places on others. (Gedajlovic & Wielemaker, 2020, p. 63)

A duty-of-care perspective on academic integrity recognises that we have shared obligations to take action to reduce misconduct, balanced against any costs and burdens. According to Gedajlovic and Wielemaker, we have a duty-of-care towards:

  • vulnerable students who might cheat as a result of bad choices and being preyed upon (Gedajlovic & Wielemaker, 2020) or blackmailed (Yorke et al., 2022) by third parties that wish to profit from them;
  • hardworking students who may be disadvantaged by others cheating;
  • our schools’ reputations that may be tarnished as a result of scandals;
  • our alumni who may see the value and legitimacy of their degrees diminished;
  • our communities who trust us to produce competent and ethical graduates; and
  • educators who can lose a sense of purpose if they come to believe that their institutions do not share their values regarding academic standards and integrity.

(Adapted from Gedajlovic & Wielemaker, 2020, p. 66)

A duty-of-care perspective focuses on the needs of people and the need to protect stakeholders from harm. When designing for integrity, a duty-of-care perspective involves setting clear expectations, providing guidance, ensuring that assessment does not lead to unnecessary stress and ensuring assessment security where appropriate. Staff also have a responsibility to detect and report misconduct.

It may not be feasible or desirable to ensure assessment security for all tasks, but assuring assessment security is a priority in high-stakes assessment tasks, especially those linked to assuring achievement of program learning outcomes (Dawson et al., 2020). In these cases, assessment security is vital to protecting the integrity of qualifications and the University’s reputation, and to ensuring fairness and justice (Dawson, 2021).

Assessment security is defined as: ‘measures taken to harden assessment against attempts to cheat. This includes approaches to detect and evidence attempts to cheat, as well as measures to make cheating more difficult’ (Dawson, 2021, p. 19).

Assessment security focuses on prevention and detection. It requires authentication and control of the circumstances under which work is produced:

Authentication involves ensuring that the student who submits the work has done the work themselves. Control of circumstances means ensuring that the work was completed under the expected conditions: with access to the expected information and tools, and to the systems needed.

The extent to which circumstances need to be controlled to ensure assessment security can depend on the learning outcomes assessed. The extent to which circumstances can be controlled depends to some extent on the mode of assessment.

  • Where a task addresses lower-level learning outcomes such as memorisation of content, restricting access to information, for example by not allowing ‘cheat sheets’, may be a necessary condition of the task to ensure assessment security.
  • In the case of higher-level learning outcomes that require students to apply or evaluate information, restricting access to content may be less important, but it remains important to restrict access to other information that might be specific to the task, such as access to the question in advance or access to other students’ responses.
  • The role of tools in ensuring assessment security will depend on the nature of the task and the learning outcomes assessed. For example, access to calculators may be restricted in a basic arithmetic test but allowed in a statistics exam (Dawson, 2021).
  • Restrictions can be difficult to enforce in online assessment. Where it is not feasible to enforce restrictions, fewer restrictions may be fairer, more consistent and more authentic, for example by allowing collaborative answers in complex tests (Villarroel et al., 2020).
  • Online exam settings provide opportunities to simulate real-world requirements for professional tasks (Butler-Henderson & Crawford, 2020), for example through interactive oral exams based on real-world scenarios (Sotiriadou et al., 2020), authentic scenarios that simulate professional environments (Villarroel et al., 2020) or tasks that have been co-designed with partners.



Three ways to address academic integrity in assessment design

  1. Reduce opportunities to find answers
    • Avoid re-using assessment questions.
    • Use non-traditional task types, e.g. professional tasks such as proposals, abstracts, briefing papers.
    • Base tasks on data specific to your context or that students collect themselves, e.g. use unique or personalised data sets or contexts.
    • Design questions that target higher level learning outcomes.
    • Check that questions and answers are not already available online.
    • Set up a Google alert to check for material from your subject online.
  2. Take a partnership approach to assessment
    • Provide opportunities for self-assessment, peer review and peer feedback.
    • Strengthen connections between academic and professional integrity in curriculum and assessment through co-design and co-assessment with university partners.
  3. Find ways to confirm authorship (authenticate)
    • Use sequenced or multi-faceted tasks, synchronous simulation tasks or part-seen assessments.
    • Collect evidence of work in progress, e.g. drafts, notes and plans.
    • Use post-assessment tasks.
    • Use vivas where appropriate in high stakes assessment.
    • Compare students’ exam and coursework performance.
    • For group tasks, monitor and assess the process of group work and collect evidence of group work e.g. meeting notes, records of online discussion, peer feedback.

Adapted from ASKe, Oxford Brookes University (2008), CRADLE (2018) and Villarroel et al. (2020)


Design to prevent e-cheating

E-cheating is ‘cheating that uses or is enabled by digital technology’ (Dawson, 2021, p. 4). E-cheating can involve:

  • Hardware such as computers, smartphones, smartwatches and earpieces.
  • New writing technologies including software and online tools e.g. automated algebra solvers, text generators and auto-paraphrasing tools.
  • The use of technology like e-commerce platforms (including contract cheating where it uses technology). (Adapted from Dawson 2021, pp. 4-5)

Paraphrasing tools and text generators

Tools that generate or paraphrase text based on artificial intelligence need to be considered in assessment design, and more broadly in terms of their impact on disciplinary and professional writing practices. Tools using natural language processors (NLPs) and natural language generators (NLGs) such as GPT-3 can now produce instant and unique text that can be difficult to differentiate from text written by a human (McKnight, 2021).

While GPT-3 and its successors can usefully and rapidly identify what has been thought, humans can contribute what has not yet been thought, but which they perceive is needed for a better world. (McKnight, 2021, p. 446)

Restricting access to and detecting the use of paraphrasing and text generation tools is difficult. Given that tools based on artificial intelligence are now widely used in many professional fields, higher education needs to find ways to work with rather than against these tools, and to promote ethical use of artificial intelligence.

In future, this is likely to involve new ways of thinking about the relationship between humans and technology in the practice of writing, with writing being produced by ‘networked human and non-human presences’ in a process of ‘assemblage’ (Deleuze & Guattari, 1988; McKnight, 2021) rather than by individual writers (McKnight, 2021, p. 447). Students will need opportunities to ‘critically engage with the contexts and practices that [they] will encounter in future workplaces’ and guidance on the ‘strategic and effective work of writing as assemblage’ (McKnight, 2021, p. 452). This could include a focus on futures literacy, or futures thinking (Mangnus et al., 2021; Mortensen et al., 2021).

In posthuman English, the teacher becomes a different kind of figure: an initiator, broker, colleague, collaborator, curator and mentor rather than trainer and invigilator (McKnight, 2021, p. 452).

Adapting to new writing technologies is an emerging and speculative field (McKnight, 2021). Some possible responses may be to:

  • Provide opportunities for students to use and critically evaluate the output from new writing technologies such as text generators, paraphrasing tools and grammar checkers.
  • Design tasks where students monitor and reflect on the tools they use to produce text.
  • Engage students as active participants in researching the impact of new writing technologies in your discipline or professional field.
  • Encourage students to explore the possibilities of new writing technologies in creating texts.
  • Recognise the ongoing importance of revising, drafting and editing skills through tasks and activities with an explicit focus on these skills and that develop students’ capacity for evaluative judgement (Tai et al., 2018) in relation to texts produced by new writing technologies.
  • Reconsider (with students, and in task design) what authenticity, originality and plagiarism, and writing practices look like in your discipline and professional field – now and in the future.
  • Consider the use of new writing technologies to promote divergent rather than convergent thinking.
  • Focus on developing students’ capacity for synthesis rather than simply paraphrasing, recognising that ‘synthesising is not only conducted by individual humans, but also by machines within a writing assemblage’ (McKnight, 2021, p. 450).
  • Consider the ethical dimensions of text and text production in relation to new writing technologies.
  • Consider (with students and colleagues) philosophical questions that arise from new writing technologies in relation to language, text, identity, creativity and consciousness.

(Strategies adapted from McKnight, 2021).

Contract cheating and file sharing

Contract cheating is outsourcing of assessment ‘to a third party, whether that is a commercial provider, current or former student, family member or acquaintance. It includes the unauthorised use of file-sharing sites, as well as organising another person to take an examination’ (TEQSA, 2017). Contract cheating includes services that are purchased, obtained through credit or exchange, or obtained for free. Recent research suggests that rates of contract cheating among students, including students at Western, are higher than previous estimates (Curtis et al., 2021).

Academic cheating services target students through social media and search engines, and through persuasive features on their websites that promote trustworthiness and personalised, ‘just-in-time’ support (Rowland et al., 2018), and offer a wide range of services. Sites can be difficult to differentiate from legitimate support. Students may not perceive their engagement with these sites, including file sharing, as cheating. Known academic cheating sites are blocked to users on the University network. Western is improving its capacity to detect contract cheating.

Students are warned of the risks of engaging with cheating sites and file sharing in the Academic Integrity Module, Learning Guides and through central and Library communications, but this message needs to be reinforced at all stages in all programs. Engagement with academic cheating sites includes:

  • Sharing assignments or course material;
  • Using online tools provided by these sites to check for plagiarism, grammar or spelling; and/or
  • Purchasing writing services or obtaining a copy of an assignment.

Contract cheating has been found to be most frequent in exams (particularly multiple-choice exams), traditional written assignments and text-based assignments completed under any conditions (Harper et al., 2021). All forms of assessment, including authentic assessment, can be outsourced (Ellis et al., 2020). Types of assessment that may be less prone to outsourcing include:

  • Reflections on practicums
  • Oral vivas
  • Individualised tasks
  • Tasks completed in class (Bretag et al., 2019).

Ways to reduce the risk of contract cheating through assessment design

  • Relate examination questions, particularly in take-home exams, to specific tutorial or practical activities that can’t be accessed by external providers. Request working calculations and sources.
  • Individualise or personalise questions to avoid sharing of solutions.
  • Separate quiz-type examinations into shorter, more frequent quizzes.
  • Where possible, ensure that tutorial sizes are not too large, and make tutorial activities interactive, to enable lecturers/tutors to know their students and their capabilities. This will assist tutors to recognise instances when the work is not the student’s own.
  • Integrate assessment tasks with tutorial activities, e.g. through in-class and online individual or group activities that are submitted during class; staged assignments so that early notes or drafts are submitted for review and feedback before the final paper; peer review/feedback on drafts; vivas; presentations followed by peer/tutor questions.
  • Ensure contact between student and academic staff in authentic tasks. Have students provide progress reports to confirm the work is genuine and that the final report submitted is the student’s own. Include personal reflection and follow-up discussion.
  • Design tasks that incorporate ‘participation, sharing, delivery or use in a real-world setting’ (Ellis et al. 2020) and include some external supervision where possible.

Useful resources


Bertram Gallant, T. (2017). Academic integrity as a teaching & learning issue: From theory to practice. Theory Into Practice, 56(2), 88-94.

Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., van Haeringen, K., Saddiqui, S., & Rozenberg, P. (2019). Contract cheating and assessment design: exploring the relationship. Assessment and Evaluation in Higher Education, 44(5), 676-691.

Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education, 159, 104024.

Corrin, L., Apps, T., Beckman, K., & Bennett, S. (2019). The myth of the digital native and what it means for higher education. In A. Attrill-Smith, C. Fullwood, M. Keep, & D. Kuss (Eds.), The Oxford handbook of cyberpsychology (pp. 98-114). Oxford University Press.

Curtis, G. J., McNeill, M., Slade, C., Tremayne, K., Harper, R., Rundle, K., & Greenaway, R. (2021). Moving beyond self-reports to estimate the prevalence of commercial contract cheating: an Australian study. Studies in Higher Education, 1-13.

Dawson, P. (2021). Defending assessment security in a digital world: Preventing e-cheating and supporting academic integrity in higher education. Routledge.

Deleuze, G., & Guattari, F. (1988). A thousand plateaus: Capitalism and schizophrenia. Bloomsbury Publishing.

Ellis, C., van Haeringen, K., Harper, R., Bretag, T., Zucker, I., McBride, S., Rozenberg, P., Newton, P., & Saddiqui, S. (2020). Does authentic assessment assure academic integrity? Evidence from contract cheating data. Higher Education Research & Development, 39(3), 454-469.

Gedajlovic, E., & Wielemaker, M. (2020). Neither abuse, nor neglect: A duty of care perspective on academic integrity. Canadian Perspectives on Academic Integrity, 3(3), 63-69.

Gray, B. C. (2022). Ethics, edtech, and the rise of contract cheating. In S. E. Eaton & J. C. Hughes (Eds.), Academic integrity in Canada (pp. 189-201). Springer.

Harper, R., Bretag, T., & Rundle, K. (2021). Detecting contract cheating: examining the role of assessment type. Higher Education Research & Development, 40(2), 263-278.

Mangnus, A. C., Oomen, J., Vervoort, J. M., & Hajer, M. A. (2021). Futures literacy and the diversity of the future. Futures, 132, 102793.

McKnight, L. (2021). Electric sheep? Humans, robots, artificial intelligence, and the future of writing. Changing English, 28(4), 442-455.

Mortensen, J. K., Larsen, N., & Kruse, M. (2021). Barriers to developing futures literacy in organisations. Futures, 132, 102799.

Murphy, G. A. (2022). Bringing language diversity into integrity research—what, why, and how. Journal of College and Character, 23(1), 76-91.

Rogerson, A. M. (2017). Detecting contract cheating in essay and report submissions: Process, patterns, clues and conversations. International Journal for Educational Integrity, 13(1), 10.

Rogerson, A. M., & Basanta, G. (2016). Peer-to-peer file sharing and academic integrity in the internet age. In T. Bretag (Ed.), Handbook of academic integrity (pp. 273-285). Springer.

Rowland, S., Slade, C., Wong, K.-S., & Whiting, B. (2018). ‘Just turn to us’: the persuasive features of contract cheating websites. Assessment & Evaluation in Higher Education, 43(4), 652-665.

Sotiriadou, P., Logan, D., Daly, A., & Guest, R. (2020). The role of authentic assessment to preserve academic integrity and promote skill development and employability. Studies in Higher Education, 45(11), 2132-2148.

Villarroel, V., Boud, D., Bloxham, S., Bruna, D., & Bruna, C. (2020). Using principles of authentic assessment to redesign written examinations and tests. Innovations in Education and Teaching International, 57(1), 38-49.

Yorke, J., Sefcik, L., & Veeran-Colton, T. (2022). Contract cheating and blackmail: a risky business? Studies in Higher Education, 47(1), 53-66.