EXAMINING EXAMS — PROCTORING IN A REMOTE WORLD
Many university students have not had class on campus for months. Where I work, courses began transitioning online as early as mid-March!
Since then, a lot of attention — and not unfairly so — has been placed on delivering virtual lectures. How should instructors use Zoom? How can they be charming through a webcam? What can be done to facilitate student engagement?
On the other hand, discussions surrounding exams — at least in my (biased) opinion — have not been as lively…
… which is a little strange, when you think about it. Traditionally, exams represent one of the more formal ways of assessing student performance. They feature rules and heightened supervision, intended to enforce integrity.
But now that university students can no longer be packed into rooms like the one above and held hostage for hours of misery, how have instructors been delivering midterms and finals?
For many, the answer has involved remotely proctored online exams.
WAIT — WHO AM I?
Hello, my name is Lyon!
You are reading an assignment from the “Mobile and Open Learning” course I am registered in — part of a master’s program I began just a few months ago.
I work at a university — known to some as “a place of mind” — amongst a wonderful team of people in the school of business.
My interest in online exams begins with the fact that a significant component of my job involves planning and building them. Supporting students during these very exams is also part of my job description.
This work leaves me at an intersection between instructional design and student experience. When consulting with me, instructors reveal various motivations for wanting their exams to be proctored remotely — very sensible reasons.
With every exam that goes by, though, my support role allows me to see firsthand how such decisions affect what students go through on the big day. The resulting experiences are not always the smoothest or most intuitive.
Any software company can make claims about what its product can technically do. I am more interested in how well those things can actually be done, and in whether such “features” end up hindering or complicating a student’s test-taking. That is a big priority for me.
With that being said, I hope to provide you with a wish list or checklist of sorts — four ideas, in my humble opinion, to think about when it comes to improving remote proctoring experiences for students.
1. MORE NATIVE PROCTORING FUNCTIONALITY
Proctoring solutions exist largely because much of the functionality they offer is absent within learning management systems.
They come in the form of browser extensions, web portals, standalone programs, and more. This means that students must have these properly installed and set up just to launch their exams. Then, they must navigate these strange landscapes for the next few hours — no wonder they run into technical issues and difficulties!
I believe that more responsibility should be shifted over to the learning management systems — outsourcing proctoring to a third party simply introduces more variables to manage (and to go wrong).
From a usability perspective, it just makes sense for an exam interface to be consistent with that of the LMS overall — something students are already familiar with.
2. RESPONSIBLE AND EFFECTIVE MONITORING
Some remote proctoring platforms lock students into a full-screen view of their exams, making it impossible for them to access other materials. Now that students write exams wherever they happen to be, though, this approach is far less effective on its own: we cannot tell whether a student has somebody else in the room with them, or whether they have access to other devices or materials.
In other words, monitoring student video and audio has become quite essential in the pursuit of ensuring academic integrity…
Of course, one of the most common concerns raised by students has to do with the distraction and anxiety of knowing that their every move can be watched and heard. Will they be flagged for cheating if the camera catches them staring blankly away from screen as they try to recall how to calculate manufacturing overhead?
Perhaps students should be recorded in brief snippets and only at randomized intervals, rather than constantly throughout the entire exam. Still images captured from webcams and screens can also be used to supplement and verify the information gathered.
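To make the idea concrete, here is a minimal sketch of how such randomized recording intervals might be scheduled. The function name, parameters, and the ten-minute windowing are all my own illustrative assumptions, not a description of any real proctoring product.

```python
import random

def schedule_snippets(exam_minutes, snippet_seconds=15,
                      snippets_per_10_min=2, seed=None):
    """Pick randomized start times (in seconds) for short recording snippets.

    Divides the exam into 10-minute windows and chooses a few random start
    times inside each window, so students are recorded only in brief bursts
    rather than constantly throughout the entire exam.
    """
    rng = random.Random(seed)
    window = 600  # 10 minutes, in seconds
    total = exam_minutes * 60
    starts = []
    for window_start in range(0, total, window):
        window_end = min(window_start + window, total)
        latest = window_end - snippet_seconds
        for _ in range(snippets_per_10_min):
            if latest > window_start:
                starts.append(rng.randint(window_start, latest))
    return sorted(starts)
```

A two-hour exam with two snippets per window would yield 24 short recordings, and no student could predict when any of them occur.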
Of course, it is imperative that these details be distributed in advance — students must know what to expect. Special emphasis should also be made to address privacy and security — ideally, no student data should ever need to be stored or accessed by the proctoring company at all.
3. INCLUSIVE HARDWARE / NETWORK REQUIREMENTS
Creating a solution that is less resource-intensive on student devices should be a priority as well — exams should not only run smoothly on the latest gadgets.
The exam interface needs to be responsive to the student’s screen size and resolution, and the experience should be as uniform as possible whether the student is on a PC or a Mac.
In fact, some proctoring solutions even support iPads and tablets now! As long as security and functionality are consistent across these different devices, I believe it is a good thing for students not to have to rely on traditional computers all of the time.
Bandwidth and other network factors must also be accommodated. Data transfer should be kept as low as possible, and best efforts should be made to ensure that access is available from overseas as well — mobility is key.
4. INTEGRATED COMMUNICATION CHANNELS
If students are being proctored and restricted from accessing anything outside of their exam, how are they supposed to communicate with their instructors or request technical support?
Based on my experience of actually building these, I am strongly advocating here for live chat rooms — integrated within the exam itself — for students to communicate with instructors, TAs, and support staff in real time. Throw in the ability to screen-share for advanced troubleshooting.
A broadcast tool would also be extremely useful, allowing information to be shared quickly with all test-takers at once. Think about the possibilities: “Typo on Question 33. Year should be 1997”, or “Don’t forget to upload your file within the next 5 minutes!”
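The core of such a broadcast tool is simple. Below is a minimal in-memory sketch of the idea — a real system would push announcements over websockets to each student’s browser, and the class name and session handling here are purely illustrative.

```python
class ExamBroadcaster:
    """Minimal broadcast hub: one announcement reaches every connected
    test-taker at once. Each 'session' is modeled as a message inbox."""

    def __init__(self):
        self.sessions = {}  # student_id -> list of messages received

    def connect(self, student_id):
        # Register a student when they launch the exam.
        self.sessions[student_id] = []

    def broadcast(self, message):
        # Deliver the same announcement to every active session.
        for inbox in self.sessions.values():
            inbox.append(message)

hub = ExamBroadcaster()
for sid in ("s001", "s002", "s003"):
    hub.connect(sid)
hub.broadcast("Typo on Question 33. Year should be 1997.")
```

The key design point is one-to-many delivery: the instructor types the message once, and nobody has to interrupt the exam to relay it student by student.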
If a technical issue is detected, relevant advice and troubleshooting information should automatically appear on student screens: “It appears that you are not currently connected to the internet. Please check your connection and return to the exam when you are ready. Lost time will be credited back.”
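Detecting that disconnection — and crediting the lost time back — could rest on a simple heartbeat from the exam page. The sketch below is one hypothetical way to track it; the timeout value and class design are my assumptions, and timestamps are passed in explicitly to keep the example deterministic.

```python
class HeartbeatMonitor:
    """Flags a session as disconnected when no heartbeat arrives within a
    timeout, and tallies the lost time so it can be credited back."""

    def __init__(self, timeout_seconds=30):
        self.timeout = timeout_seconds
        self.last_seen = {}     # student_id -> timestamp of last heartbeat
        self.lost_seconds = {}  # student_id -> total disconnected time

    def heartbeat(self, student_id, now):
        if student_id in self.last_seen:
            gap = now - self.last_seen[student_id]
            if gap > self.timeout:
                # Everything beyond the timeout counts as lost exam time.
                self.lost_seconds[student_id] = (
                    self.lost_seconds.get(student_id, 0) + gap - self.timeout
                )
        self.last_seen[student_id] = now

    def is_disconnected(self, student_id, now):
        return now - self.last_seen.get(student_id, now) > self.timeout
```

When `is_disconnected` turns true, the platform would surface the troubleshooting message above; when the student’s heartbeat resumes, `lost_seconds` tells the system exactly how much time to add back to their clock.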
SOME CLOSING REMARKS
A few years ago, Atoum et al. (2017) set out to design an intricate and automated online exam proctoring system. They claimed that “extensive experimental results demonstrate[d] the accuracy, robustness, and efficiency” of their solution (p.1609).
On paper at least, it was indeed an extremely complex and sophisticated approach of deterring academic dishonesty — driven by algorithms, speech detection, gaze estimation, and much more.
As I went through their work, though, it became increasingly clear to me that these gentlemen did not pay nearly as much attention to the student experience as they did to identifying and neutralizing avenues for cheating — at which they did a great job, to be fair…
They proposed a “wearcam” as a way of seeing what students themselves were seeing. This involved attaching “a regular wired webcam to a pair of eyeglasses”, justified by the fact that “webcams are becoming smaller in size, lighter in weight, cheaper over the years, and have real-time wireless capabilities” nowadays (Atoum et al., 2017, p. 1612). They experimented with headbands as well…
My skepticism revolves not around effectiveness, but the practicality of the idea and what it would mean for students. Are they expected to undergo these DIY projects themselves? What if they do not own glasses? What if their webcam is built into a laptop? Would this even be… comfortable for students writing an exam?
My job demands that I become familiar with educational tools, identify best practices, and develop workarounds to address gaps in functionality if needed. Positive student experience is what inspires much of this work, and I think focusing on that would benefit a lot of proctoring companies.
These companies should not simply treat the institutions they signed a contract with as the end user; rather, it should be the students themselves.
REFERENCES
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Journal of Asynchronous Learning Networks (JALN), 21(1), 146.
Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2018). Interaction of proctoring and student major on online test performance. International Review of Research in Open and Distributed Learning, 19(5). doi:10.19173/irrodl.v19i5.3698
Atoum, Y., Chen, L., Liu, A. X., Hsu, S. D. H., & Liu, X. (2017). Automated online exam proctoring. IEEE Transactions on Multimedia, 19(7), 1609–1624. doi:10.1109/TMM.2017.2656064
Reisenwitz, T. H. (2020). Examining the necessity of proctoring online exams. Journal of Higher Education Theory and Practice, 20(1), 118–124. doi:10.33423/jhetp.v20i1.2782