Writing Better MCQs - Anatomy of a Well-Written MCQ

  By Dr Drew Tarmey & Dr Matt Mears

Anatomy of a Well-Written MCQ





A good SBA-style MCQ has three key components, each doing a specific job:

  • The Stem sets the scene. In clinical or applied contexts, this might be a short case description; in other disciplines, it could be a scenario or concept setup. It should be focused, relevant, and free of extraneous detail.
  • The Lead-In is the actual question. This needs to be a full, direct sentence that makes sense on its own. The goal is clarity: students should know what they’re being asked before they look at the answer options.
  • The Options consist of one best answer and four plausible distractors. All should be similar in tone, structure, and content. The distractors aren’t filler—they’re there to challenge misconceptions or test subtle distinctions.

A well-written question rewards understanding, not test-taking technique. There should be no grammatical clues, repeated phrases from the stem, or obviously incorrect throwaway options.

Spotting Common Pitfalls

The workshop made excellent use of flawed examples to show what not to do. While some were clearly exaggerated, they illustrated common mistakes that are easy to make.

Take this example:

Q: What enzyme converts glastorfs into menzintow? 

A. Glastorfsase 

B. Pultonase 

C. Reriroase 

D. Aswortase 

E. Blubonase

    Fake microbiology terms aside, this question suffers from a classic case of clanging: the correct answer is a word repeated from the stem. Students don’t need to understand the content to spot the pattern.

    Here are some other pitfalls to watch for, with examples of how they might show up in practice:

    Negative phrasing

    Questions like “Which of the following is NOT correct?” increase the likelihood of misreading under pressure, especially during high-stakes exams. For example:

Q: Which of the following is NOT a common side effect of corticosteroids?

    A. Hyperglycaemia

    B. Osteoporosis

    C. Weight gain

    D. Bradycardia

    E. Fluid retention

Students have to mentally flip every option, which adds unnecessary cognitive load. Instead, rephrase positively: “Which of the following is LEAST likely to occur...?” or, better yet, frame it as a straightforward question: “What is the most commonly reported adverse effect of corticosteroid treatment?”

      “All of the above” or multi-part options 

      These often turn the question into a logic puzzle rather than a knowledge test. For example:


      Q: Which of the following are features of nephrotic syndrome? 

      A. Proteinuria

      B. Hypoalbuminemia

      C. Oedema

      D. A & B

      E. A & C

                F. B & C 

Bundled options turn the question into a logic puzzle: a student who is sure of only one feature can eliminate several options at once without full understanding (and here all three are genuine features of nephrotic syndrome, so no single option is actually correct). Better to list plausible distractors and stick to one best answer.

        Grammatical cues 

        If only one option fits grammatically with the stem, students may spot the correct answer through syntax alone. For instance:


        Q: The patient was given an....

        A. Anticoagulant

        B. Beta-Blocker

        C. Calcium channel blocker

        D. Diuretic

        E. Thrombolytic

Only “anticoagulant” begins with a vowel, so it is the only option that fits the article “an” in the stem, making it a giveaway. In simple terms, it just ‘reads better’ if your first language is British English (not all languages operate like this, which may disadvantage non-native English speakers). The fix? Rewrite the stem to avoid such clues entirely, for example “Which drug was the patient given?”

          Implausible distractors

          Distractors that are obviously incorrect do nothing to test knowledge. For example:

Q: What is the primary gas in Earth's atmosphere?

          A. Oxygen

          B. Nitrogen

          C. Carbon Dioxide

          D. Helium

          E. Kryptonite

If “Kryptonite” is among the options, it weakens the credibility of the question: students reject E because it is scientifically invalid, not because they are drawing on their knowledge of the subject. Aim to include only plausible answers that reflect common errors or misunderstandings. Whilst one would hope the student has enough subject knowledge to recognise that Kryptonite is not real, they may panic in the exam and assume it is something they should know. We should also avoid writing questions that reference popular culture (unless, of course, mid-20th-century comic literature is the topic being tested!) or U.K.-specific slang terms.

            Uneven option lengths

            Students often gravitate to the longest or most complex-looking option, assuming it’s more likely correct. For instance:

Q: What is the treatment of choice for atrial fibrillation?

            A. Beta-Blocker

            B. Digoxin

            C. Catheter ablation or electrical cardioversion following pharmacological rate and rhythm control and risk stratification for stroke 

            D. Anticoagulation

            E. Amiodarone

Option C stands out by length alone, so many students will choose it regardless of knowledge. Keep options consistent in complexity and structure.
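Several of these pitfalls are mechanical enough to screen for automatically before a question reaches an exam paper. The helper below is a purely illustrative sketch (the `lint_mcq` function is hypothetical, not a workshop tool, and its heuristics are deliberately crude) showing how a question bank might be checked for negative phrasing, bundled options, clanging, and uneven option lengths:

```python
import re

def lint_mcq(stem: str, lead_in: str, options: list[str]) -> list[str]:
    """Flag common MCQ pitfalls. Heuristic sketch only, not a real linter."""
    warnings = []

    # Negative phrasing in the lead-in ("NOT", "EXCEPT") invites misreading.
    if re.search(r"\b(not|except)\b", lead_in, re.IGNORECASE):
        warnings.append("negative phrasing in lead-in")

    # Bundled options ("All of the above", "A & B") turn the item into a logic puzzle.
    if any(re.search(r"\ball of the above\b|\b[A-F] ?& ?[A-F]\b", o, re.IGNORECASE)
           for o in options):
        warnings.append("bundled / 'all of the above' option")

    # Clanging: an option echoing a distinctive (longer) word from the stem.
    stem_words = set(re.findall(r"[a-z]{6,}", stem.lower()))
    for o in options:
        if any(w in o.lower() for w in stem_words):
            warnings.append(f"possible clanging: {o!r}")

    # Uneven lengths: the longest option far exceeds the median length.
    lengths = sorted(len(o) for o in options)
    if lengths and lengths[-1] > 2 * lengths[len(lengths) // 2]:
        warnings.append("one option is much longer than the rest")

    return warnings
```

Run against the fake-enzyme question above, the clanging check fires on “Glastorfsase”; none of this replaces a colleague’s read-through, but it catches the cheapest giveaways early.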

              The Cover-Up Test

              One simple but powerful strategy introduced in the session was the cover-up test. It works like this: hide the answer options. Now read just the stem and lead-in. Can you tell what the question is asking? Can a student engage meaningfully with the task before they even see the options?
              If the answer is no, the question needs work.

              For example, this question fails the cover-up test:

Q: A 49-year-old man attends for pulmonary function tests… (lots of clinical information)

              Which of the following is correct?

The lead-in is vague. What kind of judgement is required: the most likely diagnosis? Interpretation of test results? Management? This requires students to do a lot of unnecessary mental cross-checking.
                Contrast that with:


Q: A 73-year-old woman has worsening breathlessness, needs four pillows to sleep and has ankle swelling.

                What is the single most likely diagnosis?


This question is direct, clear, and focused. Even without the options, students know what they’re being asked to consider and can begin answering from knowledge alone, which lessens the cognitive load of sitting the exam itself.

                Applying the cover-up test helps eliminate ambiguity and ensures your questions genuinely test the skills and knowledge you intend.




                Blueprinting: Aligning Content and Assessment

                No matter how well an individual question is written, a good exam is more than a collection of items. It should reflect the shape and priorities of the course itself. That’s where blueprinting comes in.

                Blueprinting means setting out in advance how much of the exam will focus on each topic area, and ensuring that this distribution aligns with the emphasis in the teaching.

                For example, if respiratory physiology was covered in four weeks of teaching and dermatology in one short session, then a fair MCQ paper might devote 30% of questions to the former and just 2% to the latter (even if the colleague who taught that session has written you lots of questions!).

                This alignment matters. Without it, students may feel blindsided by questions on some topics, or underprepared for areas that featured heavily in lectures but barely in the exam. It also makes the assessment more defensible in the face of student challenge.
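The arithmetic behind a blueprint is simple enough to automate. As an illustrative sketch (the `blueprint_counts` helper is hypothetical, not something used in the workshop), here is one way to turn blueprint percentages into whole-question counts, using largest-remainder rounding so the counts always sum to the paper length:

```python
def blueprint_counts(weights: dict[str, float], total_questions: int) -> dict[str, int]:
    """Convert blueprint percentages into whole-question counts.

    `weights` maps topic -> % of exam and should sum to 100.
    Largest-remainder rounding guarantees the counts sum exactly
    to `total_questions`.
    """
    if abs(sum(weights.values()) - 100) > 1e-6:
        raise ValueError("blueprint percentages must sum to 100")

    exact = {t: w * total_questions / 100 for t, w in weights.items()}
    counts = {t: int(x) for t, x in exact.items()}

    # Hand the leftover questions to the topics with the largest
    # fractional remainders, one each.
    leftover = total_questions - sum(counts.values())
    for t in sorted(exact, key=lambda t: exact[t] - counts[t], reverse=True)[:leftover]:
        counts[t] += 1
    return counts
```

For instance, the 30% respiratory / 2% dermatology split mentioned above becomes 15 and 1 questions respectively on a 50-item paper; the rounding step only matters when percentages do not divide evenly.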

                Blueprints don’t need to be bureaucratic. Here is an example from an end of year exam:

MCQ Blueprint

Topics                           % of exam
Eyes                             3
Cardiovascular                   15
Pulmonary                        15
Endocrine and Diabetes           10
Gastrointestinal                 15
Renal / urinary / men’s health   8
MSK                              4
Neurology                        10
Psychiatry                       4
Dermatology                      4
Haematology                      4
Infectious Disease               4
Ethics and Professionalism       4

OSCE* Blueprint

Task            % of exam
Communication   50
Examination     40
Emergency       10

(* OSCE - Objective Structured Clinical Examination)

Sharing a blueprint, such as the above, with students can help them structure their revision and reduce anxiety. It also encourages us, as educators, to reflect on whether our assessments really reflect our learning goals. One unintended but significant outcome of releasing MCQ blueprints to students at the University of Manchester was that student satisfaction around assessment increased significantly without undermining the validity of the assessments. Focus groups revealed that students were reassured that their revision plan matched what was expected of them, and they were more confident as a result. It also reassured them that a lot of thought was going into designing their assessments.



                Delivery Matters: Blackboard and NUMBAS

                The platform you use to deliver your MCQs affects not just the format, but also what kinds of knowledge and skills you can assess.

Blackboard Ultra is well-suited to standard SBA questions. It’s integrated with our institutional systems and works well for secure, summative assessment. Tests are marked automatically, which alleviates a lot of the marking burden!

However, it can be clunky for questions that require equations or numeric input. In the example below, the questions are not MCQs, but they emphasise the point: the two questions are identical, yet the first has display issues with rounding errors that needed an unusual workaround to produce the second version.


                NUMBAS, by contrast, was designed for maths and science disciplines. It allows for randomised variables, numerical input, and stepwise reasoning. This makes it a strong choice for formative assessment in STEM subjects, particularly when you want to assess calculation or process rather than simple selection.

It may be somewhat excessive for simple MCQ structures, but if you want to merge MCQs with single typed answers (see below), it is the more powerful tool. It is likewise integrated with the grade centre in Blackboard Ultra and can mark tests automatically, and it is a better platform in terms of accessibility, particularly for mathematical content.
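To make the idea of randomised variables concrete, here is a plain-Python sketch of a numeric item in the spirit of NUMBAS: each student sees different numbers, and marking accepts any response within a stated tolerance. (This is an illustration only; NUMBAS itself defines variables and marking in its own editor, and the Ohm’s-law question here is an invented example.)

```python
import random

def make_ohms_law_item(rng: random.Random) -> dict:
    """Build one randomised numeric question (illustrative, not NUMBAS syntax)."""
    v = rng.randint(6, 24)        # voltage in volts, randomised per student
    r = rng.choice([2, 3, 4, 6])  # resistance in ohms
    return {
        "prompt": f"A {v} V supply drives a {r} ohm resistor. "
                  f"What current flows, in amperes?",
        "answer": v / r,          # I = V / R
        "tolerance": 0.01,        # accept small rounding differences
    }

def mark(item: dict, response: float) -> bool:
    """Accept any response within the item's tolerance of the exact answer."""
    return abs(response - item["answer"]) <= item["tolerance"]
```

The pedagogical point survives the toy example: randomisation assesses the calculation process rather than recognition of a fixed answer, and a marking tolerance avoids penalising sensible rounding.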



                Remember, these are just tools to deliver and facilitate your MCQ assessments. Neither platform makes up for a poorly written question.

                Final Thoughts

                Writing better MCQs doesn’t mean starting from scratch. It means being more intentional. It means asking: What am I really testing? Is the question clear and fair? Do the options challenge the right misconceptions? And does the paper as a whole reflect what we’ve actually taught?

                So start with one question. Apply the cover-up test. Check it against your blueprint. Share it with a colleague. These small actions can make a big difference to the clarity, fairness, and effectiveness of your assessments.

                Because ultimately, a good MCQ doesn’t just measure learning. It supports it.

                Dr Drew Tarmey works at the School of Medical Sciences at the University of Manchester

                Dr Matt Mears (he/him/they/them) is a Senior University Teacher in Physics within the School of Mathematical and Physical Sciences, and has been responsible for the second year laboratory on and off since 2012. You can contact him by email (m.mears@sheffield.ac.uk) or just put a coffee chat into their diary.