
Daniel Stucke

Assessing without levels - Milestones


At our school we took the decision last Summer to embrace the opportunities available to move away from National Curriculum levels. Our approach isn’t revolutionary, but I think it’s worth sharing.


I personally felt that there were numerous issues with the old NC levels. They were not as well understood by pupils, parents and teachers as everybody thought. A false sense of accuracy had developed as levels morphed into sub-levels: did anyone really know the difference between a 5c and a 5b? At a whole-school level, far too many schools, ourselves included, were chasing sub-levels around in circles looking for 'rapid and sustained progress'. Levels also lost so much detail: everyone would hang their hat on that one level, but a 5b could hide a myriad of important information. A student might have real strength in Shape and Data in Maths but be struggling with their Number and Algebra skills. We were also in the process of re-writing schemes for learning, so it made sense to tackle the two jobs together.

Desired outcomes

Initial work was done between myself and our school improvement partner. We looked at the core outcomes we wanted from our assessment systems. Primarily, we wanted to refocus assessment in the classroom on the learning: all assessment should help teachers and students understand which key concepts had been grasped and which had not. Secondly, we wanted a system that could report as efficiently and simply as possible to governors, leadership, teachers, students and parents which students were making expected progress in which subjects. On reflection, that is what all school-wide level analysis boiled down to, and when it involved chasing 2-3 sub-levels per year it was a nightmare.

Expected Progress

The National Curriculum and the GCSE programmes of study set in stone expected learning at the end of KS2, KS3 and KS4. School performance measures set 'expected' progress from KS2-4. Whilst measures are changing at both KS2 and KS4, we felt confident that learners would still join us banded into High/Middle/Low (5+/4/3-) attainment bands, and that expected progress will still be around the equivalent of three levels of progress.

With this in mind, our model takes each subject, splits it into three bands, and maps out the expected learning in each year to ensure that middle-band learners who join us progress to at least a grade C+ equivalence, high-level attainers to B+, and so on.

We asked each academic subject to map out the learning for each of their three bands, pasting the GCSE PoS outcomes into Y11 and any NC outcomes into Y8 (we have moved to a two-year KS3), then shuffling these around to form their high-level scheme of work.


Anyone who has read the NC or GCSE PoS's will know that the language used therein is not great for use with pupils and parents. We asked our teams to cut these down to the key 'milestones' that were crucial for progress in the subject, and to re-write them in language that pupils and parents would understand without dumbing them down too much; I do think we should avoid hiding away from subject-specific terminology. Staff were encouraged to use Bloom's and SOLO taxonomy language and structures as a guide in this process. This gave us a roadmap for each subject detailing exactly what we expected a student to learn each year, and it gave us a framework of expected knowledge and skills to assess learning against.

As an example the Computing roadmap is below:

Computing Roadmap

Reporting on progress

Each assessment window (we have four a year) we ask staff to report on each student’s progress. They simply report on whether students are making Significantly Above / Above / On / Below / Significantly Below expected progress. This judgment is made by taking a range of formative and summative assessment information and judging whether a student is on track to learn what is expected of them according to the milestone pathways.

Needless to say this makes reporting at class/subject/year group level incredibly easy: a simple tally and percentage of each grade allows us to monitor progress at this level.
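Since each judgement is one of just five codes, the whole-school summary really is a simple tally. Here is a minimal sketch in Python (the grade codes and the sample class data are hypothetical, just to illustrate the shape of the calculation):

```python
from collections import Counter

# Hypothetical codes: Significantly Above / Above / On / Below / Significantly Below
GRADES = ["SA", "A", "O", "B", "SB"]

def progress_summary(grades):
    """Tally each progress grade and return (count, percentage) per grade."""
    counts = Counter(grades)
    total = len(grades)
    return {g: (counts[g], round(100 * counts[g] / total, 1)) for g in GRADES}

# Example: one class's quarterly progress judgements
class_grades = ["O", "O", "A", "B", "O", "SA", "O", "A", "O", "SB"]
print(progress_summary(class_grades))
```

Rolling this up to subject or year-group level is then just a matter of grouping the judgements before tallying.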

Staff also record the next two key milestones that a student needs to master in order to make maximum progress. This might be a key skill that they should have mastered by now but are struggling with, or it might be an important topic that will be covered in coming weeks. Parents get a quarterly report detailing the SA/A/O/B/SB progress measures along with a pair of key milestones for each subject.

Reflections two terms in

I'm pleased with how things have gone so far. Staff worked incredibly hard over the Summer to write the frameworks to make this work. I do believe we have a system that has fulfilled our original aims: more assessment is focussed on specific areas of progress, the entire data collection and analysis system is far simpler, freeing up our middle and senior leaders, and reporting to governors et al. is simplified.

I worry we may have set our expectations too low. 'Expected progress' was set at the old equivalent of three levels of progress from KS2-4. We took the measure of 'expected' as per performance tables and mapped it to individuals, but sometimes individual expectations need to be different to those we 'expect' of classes or year groups. The more work I've done in successful schools during my NPQH, the more I've realised that setting an 'expectation' higher than that can lead to higher expectations from staff, parents and pupils of what progress is possible, which in turn leads to better progress. This was always planned as a flexible model that we could tweak as expectations and measures at KS2 and KS4 change and as the new PoS's come into force. We will review formally at the end of the year, and if we have to slide milestones around to raise expectations then so be it.

We have been through a brief spell of 're-calibration'. Analysing the data showed that more students than should have been were making 'expected progress' and fewer than should have been were making 'above expected progress'. On discussion with middle leaders and teachers it was clear that staff had set the bar a little too low when making judgements of 'expected progress', and conversely too high for 'above expected'. Staff were also forgetting where students had started their journey: those who joined us as level 4 learners and had worked hard for several years were being judged as making 'expected progress' because that's what staff had come to expect from them, when in fact, in terms of KS2-to-now measures, they were working well 'above expected'. Much as it pained me a little to talk levels again, the diagram below helped staff re-calibrate our progress statements in their minds.

Milestones mapped to levels and grades

In part two I’ll explain the implementation strategy we used to lead this change.


Diagnostic questioning #Computing style


Back in September I was keen to get the Computing teaching community to work together to write a bank of high quality multiple choice questions. In fact, rather poorly, it was the last thing I wrote on this blog! Head back to see why I think that good quality MCQs can be an invaluable part of assessment for learning. Some educators got in touch and sent me some questions, but a lack of common format and no central repository curtailed my enthusiasm for the idea.

Step in Diagnostic Questions. This fantastic site is the brainchild of @mrbartonmaths.

There are lots of sites out there that support MCQs; here are a few reasons why I think this one stands out:

  • It’s free.
  • It’s really easy to add questions, including as images. So if people already have quizzes as html files, or moodle quizzes or powerpoint or whatever then they can get transferred in here pretty easily with a simple screenshot upload.
  • Students have to explain their reasoning for every answer. And then they get to compare their explanations to the best ones from students around the world. Fantastic for them, and also fantastic for teachers wanting to delve into understanding, or the lack thereof.

It would be fantastic if some of the Computing teaching community could chip in with questions of their own. Whilst I’ve put most up so far credit is due to the brilliant @mrocallaghanedu who wrote and shared them originally on his blog.

I plan to use small quizzes each lesson on the run up to exams to ensure that students and I know what they know, and more importantly know where the gaps are. They can revise to fill the gaps, and I can teach to them. Of course this will be supplemented with lengthier question types akin to those found in the exam. I also hope to get some of my more able students to write MCQs of their own. It's no easy task and really tests your subject knowledge, a perfect task for those pushing A/A* grades this Summer in the GCSE.

The Diagnostic Questions site:


Crowd-Sourcing Multiple Choice GCSE Computing Questions


A plea to teachers of Computing GCSE, please join an effort to write a bank of high quality multiple choice questions to support the teaching of this course.

Why multiple choice questions?

There is an increasing body of research and writing showing that skilfully written multiple choice questions are an effective means of developing retention. They are easy to administer and easy to mark, particularly in an IT-rich environment. My experience (and, looking at OCR data, most other schools' experience) is that students perform poorly on the written examination part of the GCSE. Their retention and recall of the knowledge needed to succeed in the exam is poor.

Well written multiple choice questions could be used:
- as lesson starters / plenaries etc to revisit learning covered earlier in the course, helping to develop retention
- as hinge questions in the middle of a lesson as part of the AfL process
- as small assessments of units of work
- without the wrong answers, as flashcards for learners' revision

Further reading

My thoughts on this have been shaped by some excellent writing and research online…
- Joe Kirby - How to design multiple-choice questions
- Daisy Christodoulou - Research on multiple choice questions
- Robert Bjork - Multiple-Choice Tests Exonerated, at Least of Some Charges: Fostering Test-Induced Learning and Avoiding Test-Induced Forgetting
- Dylan Wiliam - When is assessment learning orientated
- Dylan Wiliam & Caroline Wylie - Diagnostic Questions: Is there value in just one?
- David Didau - How effective learning hinges on good questioning


Great! I have set up a Google Form for you to submit your questions and answers. I've split the Computing GCSE up into a few high-level topic areas to categorise each question. Once you've shared some questions I will happily give you access to the spreadsheet behind the form, which will hopefully grow into a large bank of high-quality questions. If you have lots that you've already prepared and want to send me an email with them in a different format, that would be wonderful. I'll do my best to add them in without you having to copy each one into the form.

7 Principles for Designing Multiple Choice Options

Quality questions that help develop retention need carefully crafted options for the answers. Please follow these guidelines when constructing your questions and answers.

With his permission, I’ve shamelessly stolen this list from Joe Kirby.

  1. The proximity of options increases the rigour of the question. For instance, if the question is 'What year was the Battle of Hastings?', options 1065, 1066, 1067, 1068 or 1069 are more rigorous than options 1066, 1166, 1266, 1366 or 1466. Of course, the question itself also determines the rigour: '80 is what percentage of 200?' is much easier than '79 is what percentage of 316?'
  2. The number of incorrect options increases rigour. Three options give pupils a 33% chance of guessing the correct answer; five options reduce the chance of guessing to 20%; always create five rather than three or four options for multiple choice questions. A 'don't know' option prevents pupils from blindly guessing, allowing them to flag up questions they're unsure about rather than getting lucky with a correct guess. With this in mind the form will accept questions of the form:
    • 1 correct answer from a total of 4
    • 1 correct answer from a total of 5
    • 2 correct answers from a total of 5
    • 2 correct answers from a total of 6
      The further down that list, the lower the chance that someone can guess the answer correctly.
  3. Incorrect options should be plausible but unambiguously wrong. If options are too implausible, this reduces rigour, as pupils can too quickly dismiss them. For instance, in the question 'What do Charles Dickens and Oliver Twist have in common?', an implausible option would be that they were both bank robbers. However, if answers are too ambiguously similar, this creates problems. For instance, in the question 'What happens in the plot of Oliver Twist?', these options are too ambiguous: a) A young boy runs away to London b) An orphan falls in with a gang of street urchins c) A poor orphan is adopted by a wealthy gentleman d) A criminal murders a young woman and is pursued by a mob e) A gang of pickpockets abduct a young boy
  4. Incorrect options should be frequent misconceptions where possible. For example, if you know pupils often confuse how autobiographical 'Oliver Twist' is, create options from common confusions. These distractors flag up what pupils are thinking if they select an incorrect option: a) Both were born in a workhouse b) Both were separated from their parents and family c) Both were put in prison for debt d) Both had families who were put in prison for debt e) Both were orphans
  5. Multiple correct options make a question more rigorous. Not stating how many correct options there are makes pupils think harder. For example: Which characteristics of “Elegy Written in a Country Churchyard” can be seen as Romantic? a) It celebrates the supernatural. b) It is written in iambic pentameter. c) It emphasises emotion over reason. d) It deals with the lives of common people. e) It aspires to nature and the sublime.
  6. The occasional negative question encourages kids to read the questions more carefully. Once they get a question like 'Which of these is NOT a cause of World War 1?' wrong, and realise why, they'll work out they need to read questions again to double-check what is being asked.
  7. Stretch questions can be created with comparisons or connections between topics. What was common to both the USA and Germany during the Great Depression?
    a)     Jewish immigration increased
    b)     Membership of Ku Klux Klan increased
    c)     Public works projects were implemented
    d)     Government social programs were reduced
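Principle 2's claim about guessing odds is easy to verify with a quick sketch (plain Python, nothing specific to any quiz tool), assuming a blind guesser picks a random combination of options:

```python
from math import comb

# Question formats accepted by the form: (correct answers, total options)
formats = [(1, 4), (1, 5), (2, 5), (2, 6)]

for k, n in formats:
    # A blind guess selects k of the n options at random; only one of the
    # comb(n, k) possible selections is fully correct.
    p = 1 / comb(n, k)
    print(f"{k} correct from {n}: {p:.1%} chance of guessing right")
```

The 2-from-6 format cuts a blind guesser's odds to roughly 6.7%, compared with 25% for the classic 1-from-4.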

Still here? Then let us begin!

The form is available here. Don't rush; fashion some great questions as you go throughout this year. Together we can make a powerful learning resource. And hopefully many hands will make light work of the task!

If you have a big list of questions to submit to the cause then please email them to me at dstucke [at] and I will endeavour to add them to the master list. If you would like access to the bank of questions then leave me a comment here, email me on the address above or send me a tweet. I'll collate them into a shareable format - probably just a csv or txt file to start with, which can then be imported into your response system of choice, be that Moodle quizzes, Socrative, ExitTicket or whatever you choose.


Evidence Based Teaching?


Oliver Quinlan’s got me thinking out loud…

…What if we spent all the money / time / efforts that our Teaching Assistants cost / provide on conducting detailed analysis of progress and then feeding this back to the teacher and learner?

John Hattie’s work is hard to ignore. But Oliver’s right. We continue to be blind to his findings.


SSAT #NC10 – Dylan Wiliam – Formative Assessment


The first of a series of notes / reflections on sessions at the 2010 SSAT National Conference.


Dylan Wiliam has the grand title of 'Emeritus Professor of Educational Assessment' at the Institute of Education in London.  He is a former Maths teacher and co-author of the book "Inside the Black Box".  He is a world renowned expert on assessment for learning, and was recently to be seen on BBC television in The Classroom Experiment.

My Notes

On learning environments & the role of the teacher: Teachers do not create learning.  Learners create learning.  Teaching is engineering of effective learning environments. Teachers should create engagement.  Learning environments should be regulated. Teachers should know when students are learning.  We should develop habits of mind.

On intelligence & environment: Intelligence is partly inherited.  IQ is the best predictor of GCSE performance. Predicts job placement/selection as you get older as people choose cognitive niches.  Environments create intelligence.  Intelligence creates environment.  We must create environments that challenge, foster high cognitive demand and are inclusive. We create amplifiers of success. E.g. January ice hockey players, picked young due to size, get the coaching & time - gaps increase.

On flow: Flow = match between challenge and capability. Csikszentmihalyi.  Children don't learn what we teach, this is why assessment is core.

On assessment: Pre tests. Diagnosis and remediation.  Manage flow of learning, goals vs horizons. Sometimes ok to move off the planned track. Don't test on the exact skill, generalise and test on application of that knowledge. Differentiate in terms of context and generalisation of knowledge.

5 key strategies in teaching:

  • Clarifying, understanding and sharing learning intentions.
  • Engineering effective discussions, tasks and activities that elicit evidence of learning.
  • Providing feedback to move learning forward.
  • Activating students as learning resources for one another.
  • The best teachers are clear about the start, the route and the readings along the way, changing course where needed.

On feedback & questioning: Middle-class kids 'get the code'; working-class kids are no less intelligent, they just don't get what we want - nodding, smiling, giving positive feedback to the teacher when they know it is wanted. Make the rules of the game clear to stop the game of 'guess what's in the teacher's head'. Posters of key words and rules of learning. Make own tests. Write tests for the class next door. Pseudonyms mean they can mark as well. Don't give pupils a choice of being picked - no hands-up rule. Keep them all on their toes and engaged.

Plan questions carefully to elicit understanding, not incorrect methods that are resulting in right answers. Old adage of giving pupils enough time to respond, average time by UK teachers is less than a second.  Cause thinking. Good questions depend upon the knowledge base of the students. Open/closed not bad. Cause thinking. Questions should be designed to provide data to inform teaching.  Marking is the penance for bad planning of learning!

Wait time for questioning. Think pair share if needed. Move locus of question around a classroom, answer, elaboration, evaluation. Back to original kid if needed to repeat answer and stay engaged - no opt-out no-hands up. Random pupil selection often less random than you'd think. Otherwise you are making the achievement gap worse.

Most teachers ask questions where you learn nothing from answers unless you hear the explanations. Right wrong questions no use for checking understanding. Have two correct answers out of six to choose from, if they all pick the right two then MOVE ON! Create questions with answers that highlight the common misconceptions, with enough possible answers that guessing is factored out. Drill down into the wrong answers. Hinge question based on important concept that future learning based on. Design questions with all right or all wrong answers. More than one right answer. Questions must be designed so that kids with the wrong thinking get a different answer. Right thinking must be the only way to the right answer.

Forget AfL in terms of record keeping, make teaching more agile and reflective. Mini whiteboards, use 1-5 fingers held up for abcde optional answers to questions posed.

On written feedback: Crucial to give a comment and a chance to effect change within the classroom. Avoid grades - no benefit at all, and they negate comments if the two are given together. Sometimes useful to delay feedback, it can be too quick, e.g. computer-based feedback leads to trial-and-improvement techniques. Kluger & DeNisi 1996 meta-research showed effect sizes highly variable: 40% made progress WORSE. Hard!

Key points:

  • Cause thinking. Response needed is to change behaviour or change goal.
  • Should be more work for the recipient than the donor. (maths, '5 are wrong - you find them')
  • Tell them what they can do to move forward.
  • Comment only marking.
  • Focussed marking, numbered responses/questions from their work. Not what's wrong, "what's next?" All pupils get same volume of work to do.
  • Refining assessment. 3/4 way through test. Read not mark, plan final 1/4 based on those tests.
  • Same roles and methods used for individuals and peers as to teacher based assessment.
  • Assessment is the bridge between teaching and learning. Best assessment leaves no mark in a book but leads to more learning.
  • 100 AfL techniques in a box available from the SSAT


This is the second time I have had the fortune of hearing Dylan speak in person.  However I have been aware of his work since writing an essay based on his book "Inside the Black Box" whilst completing my PGCE back at the dawn of my teaching career.  The first reflection is that Dylan's reliance on evidence based practice is admirable, each piece of advice is backed up with evidence from a myriad of university studies.  Whilst I suppose this shouldn't come as a surprise for such a respected Professor, it is so often not the case in education.

Dylan went to great lengths to model his methods with the room of 400+ people, encouraging group discussion with clear signals (hands up for everyone) when he needed our attention, along with asking random people in the audience (including those at the back!) for their answers.

The message at the heart of this session was that assessment is at the heart of good teaching.  Dylan reflected that pupils often don't learn what we teach them; sometimes they learn things that we hadn't even planned.  It is our role as educators to create environments within which they can learn together and then to use assessment as a means of focussing the learning onto the next steps.  Kristian and I had a discussion about pre-tests, and how little these seem to be used within schools.  How often are pupils taught things that they already know?  How much time is wasted in our classrooms due to only completing assessments at the end of a unit of work?

There was much to take from Dylan's seminar.  I will be making a renewed effort to think carefully about the questions that I pose in my lessons.  Questioning is one of the most difficult and poorly practised parts of teaching. Dylan had many tips on phrasing verbal questions and designing written ones that I will be taking back to my teaching and my department.  I will continue to promote Dylan's words that effective AfL is not to be found as ticks in books or as marks in grade books.  As a school we are having a renewed push on the quality of written feedback; I have designed some marking stickers for whole-school use and will share these here when they are ready.  I will take the ideas about the learning environment back to our school and see if we can improve this whole-school, promoting no-hands-up rules and also clear signage in classrooms.  I love the idea of pupils writing exam questions and will be using this extensively, particularly with my Y10 & 11 classes.

Key Questions

The conference was based around 3 key questions; I'll attempt to reflect on how each seminar helped answer them.

What should students learn? New things!  Using assessment before teaching to avoid wasting time covering old ground.

How should students learn? Collaboratively, in an environment that engages and stimulates them. By being given the time and opportunity to respond to feedback and questioning that has been designed to move learning on to the next level.

How do we remove the barriers to learning? Do not allow the enthusiastic to grow in confidence and skill whilst the weaker and disengaged hide on the fringes and watch the gap grow wider.  Make classroom learning rules clear to all.


Dylan's presentation slides are available on his website.