Horses for Courses: Is it possible to construct meaningful Blackboard-based exams?

 By Peter Stordy



Abstract

For the past five years, I have successfully run Blackboard-based exams in two large postgraduate modules, despite their being an inherently inferior form of assessment. In this post, I reflect on my reasons for continuing. I’m not suggesting automatically marked Blackboard exams are a silver bullet for the horrors of marking hundreds of coursework or exam scripts, but they offer something worth considering, even at postgraduate level. Ultimately, it’s horses for courses.

Introduction

Tired of marking and moderating 300+ coursework scripts in two ‘soft’ technical modules*, I piloted Blackboard-based exams and have continued to use them ever since. The challenge was creating an exam which could be automatically marked and still fairly assess students’ achievement of the modules’ learning outcomes. After briefly providing some background, I present examples of how I [sort of] achieve this seemingly impossible requirement. The advantages and disadvantages of conducting Blackboard-convened exams are then discussed.

* Database Design (N=250); Introduction to Programming (N=70)

Background
Blackboard allows conveners to construct tests which can be taken under invigilated, time-constrained conditions, i.e. formal exams. Depending on how the tests are constructed, they can be automatically marked. I construct tests using Blackboard’s multiple-choice, multiple-answer, matching and fill-in-the-blanks tools. These can all be automatically marked and offer a degree of flexibility. However, these types of questions typically assess ‘knowledge’ and ‘understanding’ type learning outcomes. Using the techniques listed in the following section, I can also automatically assess students’ achievement of ‘applying’ and ‘analysing’ type learning outcomes. ‘Evaluate’ and ‘create’ type learning outcomes are perhaps beyond the scope of automatically marked exams (although AI might change this in the future).

Possible Solutions
  • For multiple-answer type questions, I’ve used students’ [coursework] answers from previous years rather than inventing my own. These answers tend to be more plausible distractors precisely because they were created by students. Being a multiple-answer type question, students can select one or more answers, and each answer can be weighted. For example, I have presented a short case study and asked students to model the data present. Students then compare their model with the various models presented (that is, students’ attempts from previous years) and are asked to select the most appropriate.
  • The “fill in the blanks” Blackboard test tool is useful for providing extended code or text with gaps for students to complete. Blackboard can be quite nuanced with its marking, particularly with the use of regular expressions (although I’ve never needed this level of functionality).
  • The Respondus LockDown Browser, required to prevent the use of unfair means during University online exams, stops students accessing external files and sites. However, there are a couple of workarounds:
    • External files can be linked to by creating a hyperlink in the Blackboard Test “Instructions” section. For example, extended code can be given to students to analyse and answer questions on. This is better than ‘clogging up’ the question text or printing the files for distribution during the exam.
    • For coding, a Trinket iframe can be embedded in a Blackboard test question, allowing students to test their own programming or HTML solutions before selecting a multiple-choice answer.
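The weighted multiple-answer marking described above can be sketched in Python. The weights and the cap-and-floor scoring scheme here are illustrative assumptions for a hypothetical question, not Blackboard’s internal algorithm (Blackboard lets the convener assign per-option credit, but its calculation is not exposed):

```python
# Illustrative partial-credit scoring for a multiple-answer question.
# Weights are hypothetical: B is the most appropriate model, A is
# partially correct, C and D are wrong (and penalised).
def score_question(selected: set[str], weights: dict[str, float],
                   max_mark: float) -> float:
    """Sum the weights of the selected options, capped at max_mark
    and floored at zero."""
    raw = sum(weights.get(option, 0.0) for option in selected)
    return max(0.0, min(max_mark, raw))

weights = {"A": 1.0, "B": 2.0, "C": -1.0, "D": -1.0}
print(score_question({"B"}, weights, 2.0))       # 2.0
print(score_question({"A", "C"}, weights, 2.0))  # 0.0
```

A negative weight on implausible options discourages students from simply selecting everything.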
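The “nuanced” fill-in-the-blanks marking mentioned above can also be illustrated in Python. Blackboard’s actual matching engine is internal to the platform; this sketch simply shows the idea of accepting a pattern of answers rather than one exact string, using a hypothetical database-design blank:

```python
import re

def mark_blank(response: str, pattern: str) -> bool:
    """Return True if the student's response matches the accepted
    pattern, ignoring case and surrounding whitespace."""
    return re.fullmatch(pattern, response.strip(), flags=re.IGNORECASE) is not None

# Accept "varchar(50)" with optional spaces around the parentheses
accepted = r"varchar\s*\(\s*50\s*\)"
print(mark_blank("VARCHAR (50)", accepted))  # True
print(mark_blank("varchar(50)", accepted))   # True
print(mark_blank("char(50)", accepted))      # False
```

A single pattern like this spares the convener from enumerating every acceptable spacing and capitalisation variant by hand.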



Advantages
For me, the advantages of convening automatically marked Blackboard-based exams are:

  • Despite their limitations, Blackboard’s “Test” tools are reliable enough, offer some flexibility to assess students in different ways, and present students with a familiar environment.
  • Assuming there are no glitches during the running of the exam (this hasn’t happened yet) and no errors in the questions asked (this has happened to me once), marking is completed almost immediately.
  • Beyond the time saved marking, the creation, maintenance and student support required to run coursework or paper-based exams can be overwhelming, and often goes unrecognised in work allocation models.
  • The number of students requiring “additional feedback” after the assessment (i.e. those who disagree with the mark awarded) almost disappears with a multiple-choice type exam.
  • Students, particularly those whose first language is not English, like multiple-choice questions: they place almost no demands on writing, listening or speaking skills.

Disadvantages
For me, the disadvantages of convening automatically marked Blackboard-based exams are:

  • Ultimately, it’s a compromise. If you want to assess students’ database design capabilities, you get them to design a database. Similarly, if you want to assess students’ computer programming capabilities, you get them to program. An automatically marked Blackboard-based exam is just a surrogate for the real thing.
  • The time gained from having the exam automatically marked has to be balanced against the enormous amount of time spent creating and implementing the online exam. Unless questions can be reused each year, there are no time savings … indeed, quite the opposite.
  • Automatically marked Blackboard exams are less forgiving of errors in their construction and design than paper-based exams or coursework.

Conclusion
On balance, the advantages of using automatically marked Blackboard Tests for online exams outweigh the disadvantages, provided:

  • Each year builds upon the previous year’s questions
  • There are enough students to justify the enormous amount of time spent constructing the exam
  • The learning outcomes lend themselves to this type of assessment (e.g. more technical modules)

Ultimately, it’s “horses for courses”.

Peter Stordy is a Senior University Teacher at the Information School.