Case study: Ultra Test for assessment
Department of Computer Science
Tommy Yuan
Tommy Yuan shares his experiences of using the Ultra test tool for formative and summative assessment, an assessment method whose use has grown within the Computer Science Department recently.
Key reported advantages for staff include:
- Automated grading and feedback bringing significant time savings especially for larger cohorts
- Automated submission at the end of the allocated exam time easing the administration burden of dealing with late submissions
- Features such as randomisation of questions and question order reducing the likelihood of collusion and academic misconduct
- Question analysis allowing easier identification of questions that may be poor discriminators for future improvement of exams
For students, advantages included the ability to provide formative quizzes in preparation for exams. This evaluated well in student feedback on the module.
Potential drawbacks included the need to ensure a good fit between the assessment type and the use of the test tool, the chance of technical issues, and challenges around saving exam papers for internal or external review.
Watch their presentation:
VLE Test for assessment (Panopto viewer) (6 mins 59 secs, UoY log-in required)
Transcript
Hello. I'm Tommy Yuan, a Reader in the Computer Science department. I'm here to share my experiences of using the VLE Test tool for both formative and summative assessments. I will highlight the advantages and drawbacks and reflect on the outcomes from both staff and student perspectives.
Indeed, I have used VLE-based exams in several modules: two for our online MSc programmes, Algorithms and Data Structures and Software Engineering, and more recently one for our campus-based module, Foundations of AI and Machine Learning.
First, let me clarify what VLE exams are. Unlike traditional exams with questions and answers on paper or in a PDF file, VLE exams have questions implemented within a VLE platform such as Canvas or Blackboard. Students are required to write their answers in the designated areas of the VLE. These questions can be multiple-choice, fill-in-the-blank, or essay-style questions.
One of the key benefits of using VLE exams from my perspective is automated grading and feedback. For multiple choice and fill-in-the-blank questions, automated marking is possible. Additionally, automated feedback or example answers can be prepared beforehand, reducing the time needed for providing feedback during marking. These time savings are particularly significant for large cohorts, such as classes with hundreds of students.
We have also noted an additional benefit for marking. In contrast to paper-based exams, where student handwriting can sometimes be difficult to read, VLE exams typically result in clearer and more legible responses, as students either select or type their answers.
VLE exams also offer administrative savings. They utilise a countdown timer which automatically submits answers and closes the exam when time is up. For example, in a three-hour exam, the answers will be automatically submitted after 180 minutes if they haven't been submitted before then. This effectively eliminates late submissions, and with them late penalties, reducing the workload for the exam administration team.
Enhanced security measures are another advantage of VLE exams. Features like random ordering of the questions and options for each question can create unique papers for each student. This makes collusion less likely. Additionally, student activities during the exam, such as log in and submission times can be tracked. This information helps to deter academic misconduct.
After grading, the automatic question analysis feature provides insights into student performance on each question, which is helpful for identifying question difficulty levels and evaluating assessment quality. For example, you can easily identify difficult questions where the achieved marks are low, and easier questions where performance is high. It is important that the difficulty levels are appropriate for discriminating among individual students.
So far, I've focused on the advantages from a lecturer or administrator's perspective. From a student learning perspective, a key benefit, perhaps, is the use of formative quizzes. These quizzes typically consist of multiple-choice questions providing real-time scoring and instant feedback, so students can take the test at their own convenience and identify areas of weakness for improvement. Our student evaluations highlighted the usefulness of the weekly quizzes.
We have also encountered a few drawbacks or limitations with the current tools.
First, this is not a one-size-fits-all solution. Certain assessment types, such as project reports, practical demonstrations, or presentations, may not be easily replicated in the VLE environment.
Second, the success of the exam depends on stable internet connectivity and compatibility with the different devices and browsers students use. It is important to conduct a practice exam beforehand so that any issues can be detected early.
We have also noted that there is no functionality to save a VLE exam paper for internal or external review. When you have created your exam paper in the VLE after a lot of hard work and you want to save it for external review, unfortunately you can't. This shortfall was confirmed by the VLE Support Team. One workaround is to take screenshots and compile them into a file. Alternatively, you can draft your paper in a document and then implement it in the VLE.
With Ultra exams, we discovered a bug that unintentionally deletes manual feedback when a section score is updated. One workaround is to enter scores before writing feedback, and to always save the feedback before changing a score.
Overall, our experience of using VLE exams has been encouraging despite the limitations discussed above, and we have noted an increase in the use of VLE exams within the Computer Science department.
Guides related to this case study