Assessment Analytics Using Turnitin & Grademark in an Undergraduate Medical Curriculum


Peter Reed Simon Watmough Paul Duvall

Abstract

In recent times there has been increased interest in assessment feedback – evaluation of the University of Liverpool (UoL) Medical Curriculum has shown that students have real concerns about the feedback they receive (Reed & Watmough, 2015; Watmough & O’Sullivan, 2011). These concerns have been amplified in recent years by results from the National Student Survey (NSS).

Through the implementation of the Turnitin and Grademark systems to support the Electronic Management of Assessment (EMA), this study set out to evaluate the suitability of the systems and to investigate the potential of assessment analytics – the concept that assessment data can be examined to inform future practice and to provide a coherent, holistic view of staff and student performance.

Quantitative and qualitative data show that academic staff are positive about the implementation of these systems to support the assessment and feedback cycle, and that whilst the collection and analysis of data can be useful, it is not a panacea. The collection and analysis of such data also raise ethical considerations in relation to both staff and students.

Article Details

Section: Case Studies
Author Biographies

Peter Reed, University of Liverpool


Peter Reed | Lecturer (Learning Technology) | Institute of Learning & Teaching | Faculty of Health & Life Sciences | The University of Liverpool

Simon Watmough, University of Liverpool

Research Fellow, University of Liverpool

Paul Duvall, University of Liverpool

Lecturer (Medical Education), School of Medicine, University of Liverpool

References

BERA. (2011). Ethical guidelines for educational research. London.

Buckley, E., & Cowap, L. (2013). An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator’s perspective. British Journal of Educational Technology, 44(4), 562–570.
doi: http://dx.doi.org/10.1111/bjet.12054

Coates, H. (2009). Development of the Australasian survey of student engagement (AUSSE). Higher Education, 60(1), 1–17.
doi: http://dx.doi.org/10.1007/s10734-009-9281-2

Cooper, A. (2012). What is analytics? Definition and essential characteristics. JISC CETIS Analytics Series, 1(5), 1–10.

Ellaway, R. H., Pusic, M. V., Galbraith, R. M., & Cameron, T. (2014). Developing the role of big data and analytics in health professional education. Medical Teacher, 36(3), 216–222.
doi: http://dx.doi.org/10.3109/0142159X.2014.874553

Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664.
doi: http://dx.doi.org/10.1111/bjet.12028

Flyvbjerg, B. (2006). Five misunderstandings about case-study research. Qualitative Inquiry, 12(2), 219–245.
doi: http://dx.doi.org/10.1177/1077800405284363

Heinrich, E., Milne, J., Ramsay, A., & Morrison, D. (2009). Recommendations for the use of e-tools for improvements around assignment marking quality. Assessment & Evaluation in Higher Education, 34(4), 469–479.
doi: http://dx.doi.org/10.1080/02602930802071122

ISSE. (2013). The Irish Survey of Student Engagement (ISSE): Implementation of the 2013 national pilot.

Jensen, J. L., & Rodgers, R. (2001). Cumulating the intellectual gold of case study research. Public Administration Review, 61(2), 235–246.
doi: http://dx.doi.org/10.1111/0033-3352.00025

Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. Austin, Texas.

Johnson, M., Nádas, R., & Bell, J. F. (2010). Marking essays on screen: An investigation into the reliability of marking extended subjective texts. British Journal of Educational Technology, 41(5), 814–826.
doi: http://dx.doi.org/10.1111/j.1467-8535.2009.00979.x

Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, (July), 1–13.
doi: http://dx.doi.org/10.1080/02602938.2013.875117

Jordan, S. (2013). Using e-assessment to learn about learning. In D. Whitelock, W. Warburton, G. Wills, & L. Gilbert (Eds.), Proceedings of CAA 2013 International Conference, Southampton (pp. 1–12). Southampton.

Leckey, J., & Neill, N. (2001). Quantifying quality: The importance of student feedback. Quality in Higher Education, 7(1), 19–32.
doi: http://dx.doi.org/10.1080/13538320120045058

Lipsett, A. (2007, September). Students’ biggest concern is feedback. The Guardian. Retrieved from http://www.theguardian.com/education/2007/sep/12/highereducation.uk2

Meho, L. (2006). E-mail interviewing in qualitative research: A methodological discussion. Journal of the American Society for Information Science and Technology, 57(10), 1284–1295.
doi: http://dx.doi.org/10.1002/asi.20416

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.
doi: http://dx.doi.org/10.1080/03075070600572090

Rae, A. M., & Cochrane, D. K. (2008). Listening to students: How to make written assessment feedback useful. Active Learning in Higher Education, 9(3), 217–230.
doi: http://dx.doi.org/10.1177/1469787408095847

Reed, P., & Watmough, S. (2015). Hygiene Factors: Using VLE minimum standards to avoid student dissatisfaction. eLearning & Digital Media, 12(1).
doi: http://dx.doi.org/10.1177/2042753014558379

Riley, S. C. (2009). Student Selected Components (SSCs): AMEE Guide No 46. Medical Teacher, 31(10), 885–894.
doi: http://dx.doi.org/10.3109/01421590903261096

Rolfe, V. (2011). Can Turnitin be used to provide instant formative feedback? British Journal of Educational Technology, 42(4), 701–710.
doi: http://dx.doi.org/10.1111/j.1467-8535.2010.01091.x

Rolfe, V. (2012). Open educational resources: Staff attitudes and awareness. Research in Learning Technology, 20, 1–13.
doi: http://dx.doi.org/10.3402/rlt.v20i0.14395

Silverman, D. (2013). Doing qualitative research (K. Metzler, Ed.) (4th ed.). London: SAGE Publications.

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
doi: http://dx.doi.org/10.1177/0002764213479366

The National Student Survey. (2014). The National Student Survey 2014. Available from http://www.thestudentsurvey.com/the_nss.html

Times Higher Education. (2006, August). Courses deliver, but feedback falls short. Times Higher Education Supplement. Retrieved from http://www.timeshighereducation.co.uk/news/courses-deliver-but-feedback-falls-short/204943.article

University guide 2015: League table for medicine (2015). The Guardian. Retrieved from http://www.theguardian.com/education/ng-interactive/2014/jun/03/university-guide-2015-league-table-for-medicine

Watmough, S., & O’Sullivan, H. (2011). Medical students’ views on feedback in a PBL curriculum. In Association for Medical Education in Europe (AMEE). Vienna, Austria.

World Federation for Medical Education. (2003). Basic Medical Education: WFME Global Standards (The 2012 Revision). Available from www.wfme.org