Assessment Analytics Using Turnitin & Grademark in an Undergraduate Medical Curriculum
Keywords: eAssessment, Electronic Management of Assessment (EMA), analytics, assessment, medical education
In recent years there has been increased interest in assessment feedback: evaluation of the University of Liverpool (UoL) medical curriculum has shown that students have real concerns about the feedback they receive (Reed & Watmough, 2015; Watmough & O’Sullivan, 2011). These concerns have been amplified by results from the National Student Survey (NSS).
Through the implementation of the Turnitin and Grademark systems to support the Electronic Management of Assessment (EMA), this study set out to investigate the suitability of these systems and to explore the potential of assessment analytics: the concept that assessment data can be analysed to inform future practice and to provide a coherent, holistic view of staff and student performance.
Quantitative and qualitative data show that academic staff are positive about the implementation of these systems to support the assessment and feedback cycle, and that while the collection and analysis of assessment data can be useful, it is not a panacea. There are also ethical considerations, affecting both staff and students, in the collection and analysis of such data.
BERA. (2011). Ethical guidelines for educational research. London.
Buckley, E., & Cowap, L. (2013). An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator’s perspective. British Journal of Educational Technology, 44(4), 562–570.
Coates, H. (2009). Development of the Australasian survey of student engagement (AUSSE). Higher Education, 60(1), 1–17.
Cooper, A. (2012). What is analytics? Definition and essential characteristics. JISC CETIS Analytics Series, 1(5), 1–10.
Ellaway, R. H., Pusic, M. V., Galbraith, R. M., & Cameron, T. (2014). Developing the role of big data and analytics in health professional education. Medical Teacher, 36(3), 216–222.
Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664.
Flyvbjerg, B. (2006). Five misunderstandings about case-study research. Qualitative Inquiry, 12(2), 219–245.
Heinrich, E., Milne, J., Ramsay, A., & Morrison, D. (2009). Recommendations for the use of e-tools for improvements around assignment marking quality. Assessment & Evaluation in Higher Education, 34(4), 469–479.
ISSE. (2013). The Irish Survey of Student Engagement (ISSE): Implementation of the 2013 national pilot.
Jensen, J. L., & Rodgers, R. (2001). Cumulating the intellectual gold of case study research. Public Administration Review, 61(2), 235–246.
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. Austin, Texas.
Johnson, M., Nádas, R., & Bell, J. F. (2010). Marking essays on screen: An investigation into the reliability of marking extended subjective texts. British Journal of Educational Technology, 41(5), 814–826.
Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, (July), 1–13.
Jordan, S. (2013). Using e-assessment to learn about learning. In D. Whitelock, W. Warburton, G. Wills, & L. Gilbert (Eds.), Proceedings of CAA 2013 International Conference, Southampton (pp. 1–12). Southampton.
Leckey, J., & Neill, N. (2001). Quantifying quality: The importance of student feedback. Quality in Higher Education, 7(1), 19–32.
Lipsett, A. (2007, September). Students’ biggest concern is feedback. The Guardian. Retrieved from http://www.theguardian.com/education/2007/sep/12/highereducation.uk2
Meho, L. (2006). E-mail interviewing in qualitative research: A methodological discussion. Journal of the American Society for Information Science and Technology, 57(10), 1284–1295.
Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.
Rae, A. M., & Cochrane, D. K. (2008). Listening to students: How to make written assessment feedback useful. Active Learning in Higher Education, 9(3), 217–230.
Reed, P., & Watmough, S. (2015). Hygiene Factors: Using VLE minimum standards to avoid student dissatisfaction. eLearning & Digital Media, 12(1).
Riley, S. C. (2009). Student Selected Components (SSCs): AMEE Guide No 46. Medical Teacher, 31(10), 885–894.
Rolfe, V. (2011). Can Turnitin be used to provide instant formative feedback? British Journal of Educational Technology, 42(4), 701–710.
Rolfe, V. (2012). Open educational resources: Staff attitudes and awareness. Research in Learning Technology, 20, 1–13. doi:10.3402/rlt.v20i0.14395
Silverman, D. (2013). Doing qualitative research (K. Metzler, Ed.) (4th ed.). London: SAGE Publications.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
The National Student Survey. (2014). The National Student Survey 2014. Retrieved from http://www.thestudentsurvey.com/the_nss.html
Times Higher Education. (2006, August). Courses deliver, but feedback falls short. Times Higher Education Supplement. Retrieved from http://www.timeshighereducation.co.uk/news/courses-deliver-but-feedback-falls-short/204943.article
University guide 2015: League table for medicine (2015). The Guardian. Retrieved from http://www.theguardian.com/education/ng-interactive/2014/jun/03/university-guide-2015-league-table-for-medicine
Watmough, S., & O’Sullivan, H. (2011). Medical students’ views on feedback in a PBL curriculum. Paper presented at the Association for Medical Education in Europe (AMEE) conference, Vienna, Austria.
World Federation for Medical Education. (2003). Basic Medical Education: WFME Global Standards. The 2012 revision. Retrieved from www.wfme.org
The Journal of Perspectives in Applied Academic Practice has made every effort to ensure the accuracy of the contents of this journal; however, it makes no claims as to the authenticity or completeness of the articles published. Authors are responsible for obtaining copyright clearance for any images, tables, etc. supplied from an outside source.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.