05 Jun
Why we can’t give you a simple set of rules for online assessment
By Hanelie Adendorff and Charmaine van der Merwe
In this blog, Hanelie Adendorff and Charmaine van der Merwe discuss why the current drive to take our assessments into the remote, online space could be a unique opportunity.
We are all in the same storm but not in the same boat.
In the first few weeks of emergency remote teaching, the focus was on your teaching. But just as your teaching matters, your assessment matters. It matters a lot, especially to students. According to Boud (2013), our “assessment practices are the single most influential driver of what our students do”. With this in mind, how do you take your assessment online? Is there a recipe, a simple toolkit, or maybe 10 tips? Or even just advice that doesn’t start with “It depends . . .” or “Maybe you can start by asking . . .”?
There is an intriguing process in science called “crystallisation.” It is as much art and magic as it is science. Seeing a nebulous solution turn clear and shimmering with little crystalline specks can produce a child-like sense of wonder.
The teaching and learning space boasts a similar occurrence – that moment when, out of the midst of conflicting and confusing positions and assertions, ideas coalesce into conception. And, much like crystallisation, it cannot be forced. It happens when the conditions are just right.
Could COVID-19 be such a set of circumstances for assessment in higher education? Could it create the conditions required to crystallise new thought and the moment that brings the murky space of conjecture and supposition into relief? Could these unique circumstances offer a unique opportunity to reconsider our beliefs about and approach to the assessment of student learning, whether face-to-face or online and remote?
In this complex time, you might be tempted to look for answers that promise to provide simple solutions. And, indeed, there are many of those out there, such as these strategies to prevent cheating in online assessments. So, why won’t we give you a similar simple set of rules for online assessment? Is it not true that the criteria in our assessment policy apply across contexts, irrespective of the mode of assessment?
The problem with simple strategies is that they often do not take context into account. Take the example of time restrictions and test availability. Time restrictions are not new, but why do we use them? In online assessment, they are often used to crystallise, magically, the murky waters of academic integrity by limiting the time available for consulting resources. But at what cost? How fair is this in our context, where students might struggle with limited digital literacy skills or inadequate ICT access?
So, how do you choose? And what do you choose: integrity or fairness? The answer is one that only you can give. And it is best guided by asking a few critical questions. It is here that we might offer some help, thinking (or questioning) this through with you, whilst guided by the SU assessment policy.
We could, for example, start by asking what the purpose of the assessment opportunity is. In low-stakes assessments aimed at helping students to stay on track, you might be able to shift the integrity requirement. However, in high-stakes assessments, where both integrity and fairness are critical, we would ask a few other questions. For example, are there other ways in which you can increase integrity (McGee, 2013)? And, if time restrictions do indeed prove to be the only workable option, how do you accommodate students who were excluded or negatively impacted due to hardware, connection or even power failures? How do you communicate this to them?
So, before you reach for the how-to guide on time restrictions (or any other suggested strategy), ask yourself whether it is the best choice for your context and for the outcomes of your module and assessment.
Assessments are stressful for students and, whilst some have found online assessment conducted in an assessment centre to be less trying (Cassady & Gridley, 2005), we cannot assume that the same holds for the current remote assessment approach. A recent empirical study of online-proctored exams, for example, found that “a general wariness of technology [combined] with students’ fear of testing” increased test anxiety (Woldeab & Brothen, 2019:8). The negative impact of anxiety on student performance in traditional test settings has been widely documented (see, for example, Cassady & Johnson, 2002; Von der Embse, Jester, Roy & Post, 2018). One can only imagine how additional stressors, such as concerns about sustained connectivity, might influence the outcomes of the assessment process.
We have not even touched on topics such as sequential question release and the disabling of the scroll-back option. Or the positive impact of online formative assessment, such as practice tests, on anxiety (Cassady & Gridley, 2005). So, you see, taking your assessment online is no simple thing governed by simple solutions (Cleland, McKimm, Fuller, Taylor, Janczukowicz & Gibbs, 2020). It is both art and magic, and science. Much like the art teacher, we can give (nuanced) advice and pointers; we can tell you how to mix paints and what works best in which circumstances. But the most breath-taking art pieces usually go beyond that, questioning what is and what is believed, and offering new ways of seeing and interacting with the world.
Maybe, just maybe, COVID-19 is our chance to rethink our approach to and revisit our beliefs about the assessment of student learning. To start grappling with the questions that really matter. To dare greatly. To fly – rather than flap – in the online space (Salmon, 2005). To crystallise amazing new assessment art and science.
We are indeed all in the same (COVID-19) storm but we are not necessarily in the same (assessment) boat.
“I was intrigued by these mono-prints my daughter made as part of her experimentation for an art project. It is about time, focussing on the changing light, and was inspired by our early-morning walks in the mountain during lockdown level 4. Although it does not follow the form of the scenery, I can see something of its beauty in this. The allure of these abstract renderings of a familiar scene reminded me of the uncertainty of this time during which all that was familiar suddenly took an unfamiliar form, and how there might be beauty even amidst this confusion.” – Hanelie
Boud, D. 2013. Enhancing learning through self-assessment. Oxford: RoutledgeFalmer.
Cassady, JC & Gridley, BE. 2005. The effects of online formative and summative assessment on test anxiety and performance. The Journal of Technology, Learning and Assessment, 4(1) [Online]. Available: https://ejournals.bc.edu/index.php/jtla/article/view/1648 [2020, June 2].
Cassady, JC & Johnson, RE. 2002. Cognitive test anxiety and academic performance. Contemporary Educational Psychology, 27(2), 270–295 [Online]. Available: https://doi.org/10.1006/ceps.2001.1094.
Cleland, J, McKimm, J, Fuller, R, Taylor, D, Janczukowicz, J & Gibbs, T. 2020. Adapting to the impact of COVID-19: Sharing stories, sharing practice. Medical Teacher, 1–4 [Online]. Available: https://doi.org/10.1080/0142159X.2020.1757635.
McGee, P. 2013. Supporting academic honesty in online courses. The Journal of Educators Online, 10(1) [Online]. Available: https://doi.org/10.9743/JEO.2013.1.6.
Salmon, G. 2005. Flying not flapping: A strategic framework for e-learning and pedagogical innovation in higher education institutions. Research in Learning Technology, ALT-J, 13(3), 201–218 [Online]. Available: https://doi.org/10.3402/rlt.v13i3.11218.
Smith Budhai, S. 2020. Fourteen simple strategies to reduce cheating on online examinations. Faculty Focus [Online]. Available: https://www.facultyfocus.com/articles/educational-assessment/fourteen-simple-strategies-to-reduce-cheating-on-online-examinations/ [2020, June 2].
Von der Embse, N, Jester, D, Roy, D & Post, J. 2018. Test anxiety effects, predictors, and correlates: A 30-year meta-analytic review. Journal of Affective Disorders, 227, 483–493 [Online]. Available: https://doi.org/10.1016/j.jad.2017.11.048.
Woldeab, D & Brothen, T. 2019. 21st century assessment: Online proctoring, test anxiety, and student performance. International Journal of E-Learning & Distance Education, 34(1), 1–10 [Online]. Available: https://search.proquest.com/docview/2291990031/fulltextPDF/5ED4C7926014F0BPQ/1?accountid=14049 [2020, June 2].
Dr Hanelie Adendorff obtained her master’s and PhD degrees in Chemistry. While lecturing in Chemistry at Stellenbosch University, she researched factors influencing students’ learning. In 2003 she took a transdisciplinary step to join the SU Centre for Teaching and Learning (CTL), where she still represents the Sciences as a senior advisor on the enhancement of teaching and learning. She has published extensively on LCT and other scholarly approaches, has facilitated a special interest group on decolonising science teaching and learning, and plays a leadership role in the university’s emergency remote teaching solutions during the COVID-19 lockdown. She enjoys game-based learning, infographics, storyboards and other creative approaches.
Charmaine van der Merwe
Charmaine van der Merwe is an advisor at the Centre for Teaching and Learning at Stellenbosch University. She still sees herself as a novice academic developer, who enjoys working with individual lecturers to think about their teaching in new ways. She is interested in exploring ways to shift towards more formative assessment and feedback in higher education.
Disclaimer: All views expressed in this post are those of the author(s) and do not represent those of the University of Stellenbosch.