Online MCQ Assessment Anxiety Amongst First Year Undergraduate Psychology Students: A Case Study

Dr Gareth Davies, Lews Castle College, UHI


This case study explores the experience of online MCQ assessment anxiety among first year undergraduate psychology students at the University of the Highlands and Islands (UHI) and presents a novel approach to mitigating it. It draws upon feedback from students who took the first iteration of an introductory psychology module and describes the effects of steps taken to mitigate some of the issues identified. The effectiveness of these steps is examined through the feedback received from this cohort and the subsequent cohort of students.

Context: The UHI model of programme delivery

UHI comprises 13 academic partners from across Scotland’s Highlands and Islands. The University offers a wide variety of courses, ranging from Higher National qualifications (SCQF Level 7) through to PhD (SCQF Level 12). One of its most notable features is that it delivers many of its degree programmes online, with teaching staff who are distributed across these 13 academic partners. Students following the introductory Psychology A module are therefore drawn from multiple locations across Scotland and, in some cases, further afield.

The content of some programmes and courses is delivered by video conferencing facilities located in academic partner colleges, and some programmes have face-to-face elements. However, UHI delivers programmes primarily through blended learning. This model enables students who might not otherwise be able to engage in higher education at ‘red-brick’ universities to follow undergraduate and postgraduate programmes. As such, UHI has a high proportion of non-traditional school-leaver students, many of whom live in remote and rural areas. The UHI model enables mature students with families, or students who might want to remain at home for financial reasons, to continue their higher education.

Anxiety associated with assessments is a recognised issue in academic practice. The literature frames this anxiety as test anxiety and the related concepts of computer anxiety and technostress. As far back as the 1950s, test anxiety in classroom situations has been identified as an issue and studied academically (McKeachie, Pollie, & Speisman, 1955; Alpert & Haber, 1960). Reber and Reber (2001) define anxiety as “a vague, unpleasant emotional state with qualities of apprehension, dread, distress and uneasiness” (Reber & Reber, 2001, p. 42). Anxiety in the context of tests is somewhat different and has been defined by Zeidner (1998) as behavioural, emotional and physiological responses to perceived negative consequences following an exam.

An early meta-analysis of 562 studies concluded that poor performance in tests was strongly correlated with test anxiety (Hembree, 1988). Birjandi and Alemi (2010) subsequently identified different reasons for test anxiety. The first is a lack of preparation for the test by the student, followed by students ‘cramming’ the night before the test. They further describe poor time management, lack of organisation of test material and poor study skills as other contributing factors. These factors can all be conceptualised as procedural or practical issues. Birjandi and Alemi (2010) also identify other factors which might be described as emotional in origin: worrying about previous exam performance; comparing one’s own performance with that of peers; and perceptions of the consequences of failure. Several definitions of computer anxiety exist. In the context of this case study, Parasuraman and Igbaria’s definition, ‘‘the tendency of individuals to be uneasy, apprehensive, or fearful about current or future use of computers’’ (Parasuraman & Igbaria, 1990, p. 329), appears appropriate. However, it is acknowledged that computer anxiety can also be described as a negative emotional state or negative cognition experienced by an individual when using a computer or computer equipment (Bozionelos, 2001). Korobili, Togia and Malliari (2009) noted that students’ attitudes towards computers and computer anxiety were closely related to the acceptance of computers, and suggested that further exploration of this finding is critical for educators developing the use of new technologies in academic practice.

‘Technostress’ is concerned with the effect of using ICT (Information and Communications Technology) for learning and of learning how to use ICT. ICT can include technology other than computers, such as software and internet platforms that are new to the user. Wang, Shu and Tu (2008) define technostress as a ‘‘reflection of one’s discomposure, fear, tenseness and anxiety when one is learning and using computer technology directly or indirectly that ultimately ends in psychological and emotional repulsion and prevents one from further learning or using computer technology.’’ (Wang et al., 2008, p. 3004). Monideepa, Qiang and Ragu-Nathan (2011) describe five factors that contribute to the onset of technostress: techno-invasion, techno-overload, techno-uncertainty, techno-insecurity and techno-complexity. Any or all of these factors might be perceived as an issue by novice first year undergraduate students. Monideepa et al. (2011) go on to detail how technical training and support can help mitigate the effects of technostress.

Concepts such as online test anxiety, computer anxiety and technostress are important issues for UHI, as for other education institutions that utilise virtual learning environments and other online learning platforms. UHI, as a distributed university, uses computer-based platforms to teach a significant proportion of its undergraduate and postgraduate courses, many of which are wholly online. As such, UHI needs to be aware of how the concepts described above may impact on student performance. This is particularly crucial for naive online learners, such as many first year undergraduate students. This case study of first year undergraduate students in the first iteration of a new module presents a potential ‘perfect storm’ of test anxiety, computer anxiety and technostress. How these issues affecting student performance were identified, addressed and subsequently evaluated is now described.

BSc Psychology at UHI

The BSc Psychology programme ran for the first time in September 2011 and was delivered wholly online via the Blackboard VLE platform, except for a mandatory face-to-face induction at the beginning of each year and a mandatory week-long residential event in Inverness. The programme was modular in nature, with each module worth 15 SCQF credits and 120 credits required in each of four academic years in order to graduate with Honours. In the first semester, students took the ‘Psychology A’ module, an introduction to the subject of psychology. The module was core for BSc Psychology students, but it was also taken by students from other degree programmes, and it ran for 10 weeks. In this first iteration there were 117 students. Learning material was delivered in weekly sessions and was supported by the discussion board facility, where students came together in a semi-synchronous manner with tutors to discuss various aspects of the learning material. The learning material often took advantage of the interactivity available through content delivered online.

The assessment strategy for the module was based around three assessments. Assessments 1 and 2 were online Multiple Choice Question (MCQ) tests of 20 MCQs each. The third assessment was a traditional essay. One of the reasons for using an MCQ assessment was to give students who might not perform well in an essay-type assessment the opportunity to benefit from a different type of assessment. The two MCQ assessments were delivered in weeks 4 and 7, with the essay falling in the final week of the module. Each MCQ was made available to take at any point during a 48-hour window. Once started, students were required to complete the assessment within 40 minutes. The standard 40% pass mark for a UHI undergraduate assessment was applied. Taking advantage of the in-built randomisation facility within Blackboard, questions were presented in a random order and the response choices were also randomised. This meant that the probability of two students having the questions presented in an identical way was very small.
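The scale of this randomisation can be illustrated with a quick back-of-the-envelope calculation. This is a minimal sketch, not a description of Blackboard's internals, and the number of answer options per question is not stated in the module description, so four options per question is an assumption here:

```python
from math import factorial

def identical_presentation_probability(n_questions: int, n_options: int) -> float:
    """Probability that two students see both the same question order and
    the same option order within every question, assuming each ordering is
    independent and uniformly random."""
    total_orderings = factorial(n_questions) * factorial(n_options) ** n_questions
    return 1 / total_orderings

# 20 questions per MCQ; 4 options per question is assumed, not stated.
p = identical_presentation_probability(20, 4)
print(f"Probability of identical presentation: {p:.2e}")
```

Under these assumptions there are 20! × (4!)^20, roughly 10^46, distinct presentations, so two students receiving an identical paper is vanishingly unlikely, which also makes answer-sharing during the 48-hour window far less straightforward.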

Students were given advice about taking the online MCQ assessment through online chats and notifications via Blackboard, including information about what to do if technical problems were encountered. Students were advised to email their tutor as soon as a problem was experienced and also to log a call with UHI’s technical helpdesk.

Reported issues with the online MCQ assessment

When the MCQ was delivered for the first time, students were able to make one attempt within a 48-hour window of opportunity. This maximised the flexibility for students to take the assessment when it suited them. This is especially important given UHI’s student profile with many students having family commitments and part time jobs. It is important to note that the majority of students were new to online learning. For many this would have been their first experience of MCQs and was combined with their inexperience of using UHI systems, including the UHI Helpdesk. This means that the challenges associated with undertaking the assessment were more extensive and complex than a test of subject knowledge and understanding.

At the end of the deployment of the first MCQ assessment, tutors set up a discussion board forum where students could leave feedback and reports of their experiences specifically related to the MCQ assessment. This proved to be a valuable resource for the module tutors, and quotes from this forum are used below. Early in the deployment of the MCQ, tutors began to receive messages from students reporting issues with an onscreen timer obscuring the ‘advance’ and ‘save’ buttons, making it difficult, if not impossible, to progress through the MCQs:

[There were] technical issues that occurred with this test – timer being wrong, timer blocking the ‘next’ button, questions not being visible or some not having access to the test at all. (Student feedback).

Additional issues soon became apparent, and efforts were made by the tutors to understand and resolve them with help from UHI’s Blackboard administrator and Helpdesk. One issue included browser incompatibility. Students were and still are able to access Blackboard through any browser of their choice, but some browsers are more compatible with Blackboard than others. This meant that students using certain browsers were experiencing difficulties whereas other students did not. As soon as this became apparent, an email was sent to all students advising them to use the most compatible browser at that time. They were also advised not to use browsers identified as being particularly problematic:

I found the test a little scary but I think mostly because I wasn't sure what to expect, I was thrown a little by having to restart the test due to using Internet Explorer but I know now to use Firefox. (Student feedback).

A further issue was broadband speed and reliability. UHI’s student profile is such that many students live in remote and rural locations. Poor broadband speed and reliability in some remote and rural areas compromised the ability of some students to complete the assessment in one sitting:

My problems were more technical which is outwith the control of the college maybe. The button to move onto the next question was mostly hidden throughout my test and although I pressed the return key it took ages to move on and then I was given an error message. This was very frustrating for me and panicked me that I would run out of time. Therefore I rushed the test and was left with time at the end to spare! I think my issues are maybe down to my PC. I live over 60 miles from the college so it wouldn't be feasible for me to take the test there. Anyway, it is a learning curve and I may try downloading a more up to date browser on my computer to see if that helps. (Student feedback).

To address this, an email was sent to students advising them, wherever possible, to take the assessment in a UHI Academic Partner college or Learning Centre where internet connectivity and browser compatibility were closer to optimum. Taking an online MCQ, especially for the first time, necessitated complex instructions. It is perhaps unsurprising that for some students, misunderstanding these instructions led to issues requiring an assessment re-set by a course tutor.

These issues led to tutors needing to reset the assessment for students so that another attempt could be made. Although no record was kept of the number of re-sets, it was estimated by the module tutors that up to a quarter of students taking the assessment needed to have the MCQ re-set. For some individuals, the process of resetting the MCQ was difficult and needed to be repeated several times:

I found the test daunting as I didn't see questions first time and then I managed to finally get it reset by [the tutor]. (Student feedback).

The effect of these issues on the overall pass rate (including students whose attempts were reset) was not surprising, with a higher than expected non-submission rate: of the 117 students, 10 failed to submit and 1 failed the assessment.
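As a quick sanity check on these figures, the headline rates work out as follows. This trivial sketch uses only the numbers reported above:

```python
cohort = 117          # students enrolled on the first iteration
non_submissions = 10  # failed to submit the first MCQ
fails = 1             # submitted but failed the assessment

passes = cohort - non_submissions - fails
print(f"Pass rate across the whole cohort: {passes / cohort:.1%}")                      # 90.6%
print(f"Pass rate among submitters:        {passes / (cohort - non_submissions):.1%}")  # 99.1%
```

In other words, almost all of the shortfall in the cohort-wide pass rate came from non-submission rather than failure, which is consistent with the technical problems described above rather than with weak subject knowledge.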

The issues experienced during the first MCQ assessment were not as profound during the second, where all submitting students passed. The number of non-submissions fell slightly from 10 to 8, and, interestingly, only 2 of the students who failed to submit the first MCQ assessment did not submit the second. This suggests that most students who received tutor support for problems experienced during the first MCQ were confident that any problems experienced during the second could be addressed. Clearly, the experience of the first MCQ assessment played a significant part in students’ ability to engage successfully with the second. This is demonstrated by an increase in the mean score from 15/20 for the first MCQ assessment to 17.5/20 for the second.

For students, any pre-existing anxiety associated with the assessment might have been exacerbated. The same could be said for pre-existing computer anxiety leading to an increase in ‘technostress’:

Getting used to the computer is a big thing for me and [I] feel that maybe I will eventually. It will only get harder the questions!!! Doubting myself sometimes and I just need to have faith that all will turn out ok. (Student feedback).

The problems of test anxiety and technostress experienced by students during the first MCQ assessment might have been further compounded by the MCQ being deployed so early in the module, and in the course more generally; by week 4, students had barely had enough time to become comfortable with the normal workings of the different features of Blackboard. Far from engendering confidence with ICT and online learning, the potentially negative experience of the MCQ assessment may have made pre-existing anxieties worse. In some cases this might lead to a student losing confidence with online learning in general and developing a negative regard for the subject (psychology).

Mitigation of online MCQ assessment anxiety

In September 2012, a new 20-credit module structure was introduced throughout UHI, meaning that modules were no longer worth 15 SCQF credits but 20. As such, the number of modules a student took in one semester fell from four to three, but the total number of credits required each semester remained the same. All undergraduate programmes were restructured to facilitate this change.

This afforded the opportunity to restructure Psychology A and to address the issues causing technostress identified with the previous MCQ assessments. At this juncture the module was also renamed Introduction to Psychology, as this was felt to better reflect its aims. The MCQ element was retained but reduced from two assessments to one, with the number of questions increased from 20 to 30 so that the total assessment tariff for the module remained appropriate for the new 20-credit format. The single MCQ assessment was also deployed later in the module. The essay element remained, but its word limit was increased to reflect the greater number of credits on offer. The volume of learning material in the module was also increased.

Prior to the deployment of the MCQ assessment, two formative assessments were introduced. In the first formative assessment, students were asked to devise three MCQ questions and identify the correct answers. Tutors constructed a second formative 20-question practice MCQ, derived from the submitted MCQs which had been deemed best by consensual tutor agreement. The first formative assessment was deployed in week 2 and the second formative assessment was deployed in week 3. Two weeks later, in week 5, the summative MCQ assessment was deployed. The submission date for the essay was unchanged.

The second formative assessment was deployed initially for 36 hours, to give students the experience of needing to complete the assessment within a fixed time window. However, if individuals did not manage to access the MCQ within that time frame, then they were given a further opportunity. This was done because the tutors felt that it was important for as many students as possible to experience the online MCQ. The online MCQ was deployed with comprehensive advice and guidance relating to the VLE and other technical issues relating to ICT in an attempt to minimise anxiety associated with the technology.

Informed by student feedback, the factors found to be associated with online MCQ assessment anxiety were: students’ perceived MCQ assessment competence; knowledge and understanding of the learning material; and perceived competence and confidence with the use of ICT, and more specifically the VLE environment. The changes made to the assessment regime were designed to mitigate some of the issues experienced by students and reported to module tutors through feedback. Other sources of information, including module results and the amount and type of direct MCQ support needed by students, also pointed to online MCQ assessment issues. Differences between the results for MCQ assessment 1 and MCQ assessment 2 taken by the first cohort pointed towards potential solutions, and combining this information with the student feedback suggested how the problem of online MCQ assessment anxiety might be mitigated for future cohorts. A small number of students did experience problems during MCQ assessment 2, but generally these were students who had not submitted a completed MCQ for the first assessment.

It is not possible to assess the effect of the changes made using all the different sources of information, but there are some useful indications of the effectiveness of the changes. No specific student feedback was gathered in relation to the MCQ assessment for the second student cohort to follow the module so no direct comparisons can be made. However, emails and other informal feedback did not indicate any widespread issues. General end-of-module feedback did not reveal any issues with the MCQ assessment either.

The most useful indicator available for comparison is the number of reset attempts that tutors were asked to carry out. Figures for the number of resets were not gathered for either cohort, but tutors agreed that the number of resets was much lower for the second iteration of the module, suggesting that far fewer students experienced difficulties in the second iteration than in the first (student numbers in each cohort were similar: cohort 1 n=117 and cohort 2 n=101). The overall pass rate for the second cohort was also better, suggesting that the changes made had some positive effect. It must be remembered, however, that this is a report of practice, and gathering primary data at the time was not a priority.

The original assessment strategy featured two summative MCQ assessments, whereas the new approach featured only one. This single summative MCQ assessment was deployed at a later point in the module, in an attempt to minimise technostress by enabling students to become more familiar with the VLE and with operating in an online, computer-based environment.

It is believed that the two formative assessments undertaken prior to the single summative MCQ assessment offered several advantages. The first formative assessment gave students the opportunity to devise MCQ questions for themselves. This had two perceived benefits. Firstly, it allowed those students who were unfamiliar with MCQs to think about what such an assessment requires, not in terms of the learning or the manner of deployment (via the VLE) but in terms of the mechanism of the assessment itself; that is, it increased students’ MCQ assessment competence. Thus, when students subsequently encounter an assessment of this nature, they have a better understanding of what an MCQ requires. Secondly, the development of MCQs by students demanded a deeper engagement with the material than might otherwise have been required by the summative assessment alone (Biggs, 1979). In order to develop an MCQ, a student needs to engage with the material and consider it in questioning terms, demanding more than superficial memory of facts. It is acknowledged that this process does not necessarily demand the deep levels of engagement required by an essay question, for example, but it does necessitate engagement at a deeper level than might otherwise be attained if this formative assessment were not offered. The second formative assessment facilitated further engagement with the learning materials to enhance students’ knowledge and understanding of the subject material (psychology), which may consequently reduce student assessment anxiety.
It is notable that the MCQs devised by students were perceived to be more challenging than the ones devised by the tutors for the initial MCQ assessments. This suggests that students did indeed engage with the learning material on a deeper level.

It is suggested that the introduction of these formative assessments reduced student anxiety by increasing MCQ assessment competence through experiential engagement with the process of developing MCQs. Online MCQ anxiety was also thought to be reduced by giving students the opportunity to gain experience of the VLE, and of ICT more generally.

Conclusions and food for thought

Anxiety that can be associated with online MCQ assessments might be mitigated through the careful use of formative assessments. However, it is important to acknowledge that there are potentially multiple factors that can contribute to anxiety related to online MCQ assessments. The formative assessments deployed as a consequence of feedback from students and other information addressed some of these factors. The formative assessments appeared to help reduce online MCQ assessment-related anxiety amongst a subsequent cohort of students. While the practice reported here is context specific to one institution, the findings may well be transferable to other institutions and courses where online MCQ assessments are used.

As lecturers, it is essential that we ensure we are testing students’ knowledge and understanding of the subject matter, rather than assessing students on their ability to cope with anxiety that we have in part caused through the way in which we deploy an assessment. Online testing is a pragmatic alternative to exams where getting students into the same physical location at the same time can prove difficult, and such tests make an important contribution to an assessment mix that offers students the opportunity to play to their different strengths. Nevertheless, it is important that students are not disadvantaged by the way in which an assessment is deployed. It is for this reason that further research is necessary to establish exactly what contributes to online MCQ assessment anxiety. The findings reported here point towards a complex interaction of factors, but they also indicate that online MCQ test anxiety, associated with computer anxiety and technostress, might be amenable to intervention. As such, the development and testing of interventions could yield more effective methods for mitigating online test anxiety.


The work of the module tutors at the time was invaluable in the development of the changes made to the module. The support for the module team by the BSc Psychology Programme Leader, Wendy Maltinsky, was crucial in ensuring the success of the module and the development of subsequent improvements to it.


Dr Gareth Davies is the Programme Leader for the Master of Education at UHI. He teaches on a number of different undergraduate degree programmes including the BSc Psychology as well as postgraduate programmes including the MA in Health and Wellbeing.