
Daniel Stucke

Diagnostic questioning #Computing style

2 min read

Back in September I was keen to get the Computing teaching community to work together to write a bank of high quality multiple choice questions. In fact, rather poorly, it was the last thing I wrote on this blog! Head back to see why I think that good quality MCQs can be an invaluable part of assessment for learning. Some educators got in touch and sent me some questions, but a lack of common format and no central repository curtailed my enthusiasm for the idea.

Step in Diagnostic Questions. This fantastic site is the brainchild of @mrbartonmaths.

There are lots of sites out there that support MCQs, but there are a few reasons why I think this one stands out:

  • It’s free.
  • It’s really easy to add questions, including as images. So if people already have quizzes as HTML files, Moodle quizzes, PowerPoint slides or whatever, they can be transferred in here pretty easily with a simple screenshot upload.
  • Students have to explain their reasoning for every answer. And then they get to compare their explanations to the best ones from students around the world. Fantastic for them, and also fantastic for teachers to delve into understanding, or lack thereof.

It would be fantastic if some of the Computing teaching community could chip in with questions of their own. Whilst I’ve put most up so far, credit is due to the brilliant @mrocallaghanedu who wrote and shared them originally on his blog.

I plan to use small quizzes each lesson in the run-up to exams to ensure that students and I know what they know, and more importantly where the gaps are. They can revise to the gaps, and I can teach to them. Of course this will be supplemented with lengthier question types akin to those found in the exam. I also hope to get some of my more able students to write MCQs of their own. It’s no easy task and really tests your subject knowledge, a perfect task for those pushing A/A* grades this summer in the GCSE.

The Diagnostic Questions site:


Crowd-Sourcing Multiple Choice GCSE Computing Questions

6 min read

A plea to teachers of Computing GCSE, please join an effort to write a bank of high quality multiple choice questions to support the teaching of this course.

Why multiple choice questions?

There is an increasing body of research and writing showing that skilfully written multiple choice questions are an effective means of developing retention. They are easy to administer and easy to mark, particularly in an IT rich environment. My experience (and looking at OCR data, most other schools’ experience) is that students perform poorly on the written examination part of the GCSE. Their retention and recall of the knowledge needed to succeed in the exam is poor.

Well written multiple choice questions could be used:
- as lesson starters / plenaries etc to revisit learning covered earlier in the course, helping to develop retention
- as hinge questions in the middle of a lesson as part of the AfL process
- as small assessments of units of work
- without the wrong answers, as flashcards for learners’ revision

Further reading

My thoughts on this have been shaped by some excellent writing and research online…
- Joe Kirby - How to design multiple-choice questions
- Daisy Christodoulou - Research on multiple choice questions
- Robert Bjork - Multiple-Choice Tests Exonerated, at Least of Some Charges: Fostering Test-Induced Learning and Avoiding Test-Induced Forgetting
- Dylan Wiliam - When is assessment learning orientated
- Dylan Wiliam & Caroline Wylie - Diagnostic Questions: Is there value in just one?
- David Didau - How effective learning hinges on good questioning


Great! I have set up a Google Form for you to submit your questions and answers. I’ve split the Computing GCSE up into a few high level topic areas to categorise each question. Once you’ve shared some questions I will happily give you access to the spreadsheet behind the form and what will hopefully grow into a large bank of high quality questions. If you have lots that you’ve already prepared and want to send me an email with them in a different format, that would be wonderful. I’ll do my best to add them into the rest without you having to copy each one into the form.

7 Principles for Designing Multiple Choice Options

Quality questions that help develop retention need carefully crafted options for the answers. Please follow these guidelines when constructing your questions and answers.

With his permission, I’ve shamelessly stolen this list from Joe Kirby.

  1. The proximity of options increases the rigour of the question. For instance, if the question is ‘What year was the Battle of Hastings?’, options 1065, 1066, 1067, 1068 or 1069 are more rigorous than options 1066, 1166, 1266, 1366 or 1466. Of course, the question itself also determines the rigour: ‘80 is what percentage of 200?’ is much easier than ‘79 is what percentage of 316?’
  2. The number of incorrect options increases rigour. Three options give pupils a 33% chance of guessing the correct answer; five options reduce the chances of guessing to 20%; always create five rather than three or four options for multiple choice questions. A ‘don’t know’ option prevents pupils from blindly guessing, allowing them to flag up questions they’re unsure about rather than getting lucky with a correct guess. With this in mind the form will accept questions of the form:
    • 1 correct answer from a total of 4
    • 1 correct answer from a total of 5
    • 2 correct answers from a total of 5
    • 2 correct answers from a total of 6
      The further down that list, the less chance someone has of guessing the answer correctly.
  3. Incorrect options should be plausible but unambiguously wrong. If options are too implausible, this reduces rigour as pupils can too quickly dismiss them. For instance, in the question ‘What do Charles Dickens and Oliver Twist have in common?’, an implausible option would be that they were both bank robbers. However, if answers are too ambiguously similar, this creates problems. For instance, in the question ‘What happens in the plot of Oliver Twist?’, these options are too ambiguous: a) A young boy runs away to London b) An orphan falls in with a gang of street urchins c) A poor orphan is adopted by a wealthy gentleman d) A criminal murders a young woman and is pursued by a mob e) A gang of pickpockets abduct a young boy
  4. Incorrect options should be frequent misconceptions where possible. For example, if you know pupils often confuse how autobiographical ‘Oliver Twist’ is, create options as common confusions. These distractors flag up what pupils are thinking if they select an incorrect option: a) Both were born in a workhouse b) Both were separated from their parents and family c) Both were put in prison for debt d) Both had families who were put in prison for debt e) Both were orphans
  5. Multiple correct options make a question more rigorous. Not stating how many correct options there are makes pupils think harder. For example: Which characteristics of “Elegy Written in a Country Churchyard” can be seen as Romantic? a) It celebrates the supernatural. b) It is written in iambic pentameter. c) It emphasises emotion over reason. d) It deals with the lives of common people. e) It aspires to nature and the sublime.
  6. The occasional negative question encourages kids to read the questions more carefully. Once they get a question like ‘Which of these is NOT a cause of World War 1?‘ wrong, and realise why, they’ll work out they need to read questions again to double-check on what it is they’re asking.
  7. Stretch questions can be created with comparisons or connections between topics. What was common to both the USA and Germany during the Great Depression?
    a)     Jewish immigration increased
    b)     Membership of Ku Klux Klan increased
    c)     Public works projects were implemented
    d)     Government social programs were reduced
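
The guessing odds behind the question formats in principle 2 can be checked directly: a blind guess of k correct options out of n succeeds with probability 1 over C(n, k), the number of possible combinations. A minimal sketch in Python (the function name is my own illustration):

```python
from math import comb

def guess_probability(n_options, n_correct):
    """Chance of blindly guessing every correct option:
    one winning combination out of C(n_options, n_correct)."""
    return 1 / comb(n_options, n_correct)

# The four question formats the form accepts
for n, k in [(4, 1), (5, 1), (5, 2), (6, 2)]:
    print(f"{k} correct of {n}: {guess_probability(n, k):.1%}")
```

This gives 25%, 20%, 10% and about 6.7% respectively, confirming that each step down the list makes a lucky guess less likely.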

Still here? Then let us begin!

The form is available here. Don’t rush; fashion some great questions as you go through the year. Together we can make a powerful learning resource. And hopefully many hands will make light work of the task!

If you have a big list of questions to submit to the cause then please email them to me at dstucke [at] and I will endeavour to add them to the master list. If you would like access to the bank of questions then leave me a comment here, email me on the address above or send me a tweet. I’ll collate them into a shareable format, probably just a CSV or TXT file to start with, which can then be used to import into your response system of choice, be that Moodle quizzes, Socrative, ExitTicket or whatever you use.


Dan's Digest by danielstucke

1 min read

My blogging is not what it once was, although some new posts are in gestation. But I thought I’d try something different, and slightly more old fashioned, for sharing links and thoughts that I come across. In comes my brand new, very old fashioned email newsletter. Please sign up; Newsletter 1 is out soon, and you can always unsubscribe if there’s nothing of interest in it :)



Editorial (iOS App) Tags

2 min read

I’ve been using the excellent Editorial app on my iPad increasingly for all my writing and note taking needs. Editorial is a geeky combination of text editor and Python programming playground. You can create or install workflows that do everything from basic text formatting to highlighting verbs to sharing the text with other services like Evernote. It’s great, particularly if you’re writing for the web, as it’s ridiculously easy to use the built-in browser to find websites and then quickly copy links across to your writing.

You will quickly grow a large list of workflows that can become quite a pain to navigate. The latest update to Editorial has added tags. My top tip, and the point of this post, is to recommend that instead of using words to tag your workflows, use the built-in emoji that come with iOS 🔡🔧↗️🔗💻🔎📝. You’ll end up with a simple menu bar that doesn’t scroll way off the sidebar. Something like this:


I hope someone finds this tip useful!

On the topic of Editorial, here’s my first workflow and here is an extensive review from the fountain of all knowledge on Editorial, @viticci.


Confidence in predicting attainment

3 min read

How accurately can we set targets and predict pupils’ exam attainment?

At this time of year the pressure is on teachers and leaders in school to know exactly how their young people will perform in the GCSE exams that they are currently sitting.

There is an expectation from leaders, governors and of course Ofsted to accurately track, monitor and predict pupil progress and hence exam performance.

Our school, like every other in the country, sets targets for individual pupils at the start of the year and then asks teachers to measure each pupil’s progress against this target grade, and to predict their final exam grade. These are collated, and a range of performance measures for classes, subjects and the whole school are calculated.

But how accurate are these and how confident can we be in these predictions and targets? Our school has 155 young people in year 11. And some subjects have as few as 15 learners. One or two students having a bad day in the exam, or realising too late that they picked a subject that they really don’t enjoy, can have a large impact on the results. How much should we take this into account when setting targets and holding teachers and middle leaders to account?

This train of thought has led me to look at the use of confidence intervals. I’ve used the binomial proportion confidence interval to calculate the 95% confidence interval on the proportion of students predicted to achieve a C+ in each subject. This takes into account the size of the class.

I’ve ended up with results such as: English C+ = 79% +/- 6%. So we’re 95% sure they will end up with results between 73% and 85%. Computing, with a smaller number of students, comes in at 70% +/- 16%.
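
For anyone wanting to reproduce these figures, here is a minimal sketch of the normal-approximation (Wald) binomial interval. The function name is my own, and I’ve assumed the English cohort is the full year group of 155; with those inputs it reproduces the English figures above:

```python
import math

def binomial_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a
    binomial proportion: p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n).
    z = 1.96 gives the 95% interval."""
    margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - margin, p_hat + margin

# English: 79% of a 155-pupil cohort predicted C+
low, high = binomial_ci(0.79, 155)
print(f"95% CI: {low:.0%} to {high:.0%}")  # → 95% CI: 73% to 85%
```

Smaller classes widen the interval quickly, since the margin shrinks only with the square root of n.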

Is this an appropriate statistical measure to use?

The confidence intervals are quite large due to the small number of students involved. I’m not sure if this is appropriate, though. This measure is normally used when sampling a small part of a larger population. Is that what we are doing here, in that our Computing class is a small sample of the national group sitting that examination? Or are we actually sampling the whole population, where that population is simply all the students studying the subject at our school?

If this is not the right measure of confidence to use in this case, then what is?

Lots of questions that I’m hoping some of the data / statistics community can help answer. The stakes are high in schools now and the targets that departments and schools are set have high levels of accountability attached to them. It’s only fair that targets and predictions come with a suitable statistical window attached. How should that be calculated?