What doesn’t work?

#15MinForum – 07/10/16

At this week’s 15 Minute Forum we took a look at the Sutton Trust’s report, ‘What Makes Great Teaching?’. The report itself is a couple of years old now, but well worth a read if you missed it first time around…


The exciting work that our Learning Communities are doing revolves around developing expertise through cycles of inquiry. In doing so, our teachers are drawing on stimulus material from across a range of areas to identify practices highlighted as being particularly effective, then looking at how to adapt and refine these ideas in the context of their own classrooms. Many of the ideas about what works that feature in this report are therefore ideas that are being explored across the school already.

Our message to staff this year about investing in their own professional learning is underpinned by Dylan Wiliam’s mantra, ‘sometimes we have to stop doing good things in order to do even better things’. With this in mind, we used the 15 Minute Forum to take a look at the things that are highlighted as being ineffective – strategies or approaches for which the evidence base is weak, or which are simply regarded as being less efficient or effective than other alternatives.

What are the areas identified as ‘ineffective’?

  • Using praise lavishly.
  • Allowing learners to discover key ideas for themselves.
  • Grouping learners by ability (both in terms of allocating students to different teaching groups and in terms of within-class grouping).
  • Encouraging re-reading and highlighting to memorise key ideas.
  • Trying to address issues of confidence and low aspirations before trying to teach content.
  • Presenting information to learners in their preferred learning style.
  • Ensuring learners are always active, rather than listening passively, if you want them to remember.

Read the report for more! (The ineffective practices start on p22, but the whole article is well worth 10 minutes of your time.)


For more research into what works, this post from Professor Rob Coe is a great starting point…

 

Planning for growth…

Earlier this month I shared a quote with staff, hinting at the direction we are heading with our professional development and learning programme. In their excellent book, Professional Capital (2012), Andy Hargreaves (@HargreavesBC) and Michael Fullan (@MichaelFullan1) state:

What is needed is a profession that constantly and collectively builds its knowledge base and corresponding expertise, where practices and their impact are transparently tested, developed, circulated and adapted. There needs to be a continuous amalgamation of precision and innovation, as well as inquiry, improvisation and experimentation.

 

Over the last few months I’ve tried to synthesise their work and that of many others (which I’ve blogged about recently), while also reflecting on the successes and challenges we’ve faced in trying to engage staff with learning projects, which culminated in this week’s ‘Celebration of Inquiry’. Today I shared with staff the fruits of that lengthy process with the launch of the themes for next year’s Learning Communities… and I’m very excited…

The work of the Learning Communities will form the backbone of our professional development and learning programme next year. We’ve bought staff some time by committing to the introduction of a fixed professional learning slot: once a half-term, students will have a late start to facilitate the meeting of the Learning Communities (we toyed with making it a fixed weekly arrangement to open up a whole host of other development opportunities, but decided to take one step at a time).

Staff have now been asked to read the blurb for each of the 10 themes and identify which Learning Community they would most like to become part of for the year. Although there is some overlap between some of the themes, the overall scope is intentionally broad, not only to cater to a wide range of personal development and learning interests, but also to ensure that the work of the Learning Communities supports a range of strategic development priorities (i.e. priorities based on what we see as trends from observations etc, as well as priorities relating to the implementation of new courses with increasing challenge and linear assessment).

Within each Learning Community, staff will be supported with identifying and articulating specific inquiry questions to explore. It is likely that a range of different inquiry questions will be explored within each community, but these questions will nest within the group’s theme, allowing for individual direction but maintaining internal alignment of the community.

The focus of each Learning Community is underpinned by a piece of ‘essential reading’. For some of the communities, this consists of carefully selected articles and online resources (blogs, videos etc), while for other groups it is a carefully selected book. Once staff have indicated which community they would like to join, we buy the books! This key reading will be a stimulus for the professional learning of the group: books (or articles and web links where this constitutes the reading material) will be distributed before the start of our nice long summer, for staff to read ahead of the first meeting in mid-September! Individuals and groups will then derive specific inquiry questions to develop their own practice based on the learning stimulated by their reading and the discussion that ensues.

Each meeting of the Learning Communities will provide an opportunity for each member to share their progress, engage in some new learning/reflection on the reading or other stimulus, and plan for next steps. These meetings will be supported through the use of twilight INSET opportunities to develop all staff as coaches and as observers, as well as supporting staff with the development of research skills (e.g. framing inquiry questions, measuring impact through soft or hard data, and using peer observation (which will run to the agenda of the observed teacher) or student observation).

Over the course of the year, the intention is to draw upon the developing expertise within these communities and use it to support staff beyond them as well. The work will culminate in a ‘Celebration of Inquiry’ event – like the one we had earlier this week, but on a larger scale… I envisage that all staff will be involved in sharing their learning from across the year…

I for one can’t wait!

‘mark less, but mark better’

 The EEF’s review into marking, published last month, has come at an opportune time as we continue to embed and refine our approach to written formative feedback.

Is it groundbreaking? No. Is it worth a read anyway? Yes.

Photo Credit: *janine* via Compfight cc

We all know the score with regard to teacher workload – this report from the TUC in February of this year provides some interesting contextualising figures:

The most unpaid overtime is done by teachers and education professionals (with more than half of them working an average of 11.9 hours unpaid every week)

… and I’d wager that there are more than a few teachers who occasionally – or even routinely – do more than this average! While such figures will inevitably continue to colour the perception of many outside the profession, making a challenging climate for recruitment even more so, our focus at the moment is on doing what we can to support the staff already in our school and help them find a manageable balance.

According to the Government Response to the Workload Challenge, published in February last year, 53% of those who participated in the survey identified marking as one of the areas that represents an opportunity to reduce workload (only ‘recording, inputting, monitoring and analysing data’ featured more often in responses, at 56%). At around the same time as this report was published, we started the process of rethinking our assessment policy…

 

Borrowing a phrase I’d heard Christine Harrison use at a conference where she spoke about her work on AfL, one of the guiding principles for a central policy – one that we knew needed to work across the whole school in a range of contexts – was that we wanted consistency of principle rather than uniformity of practice. To this, we added the mantra (in relation to written assessment) that it should be done at the right time, for the right reasons (that is, ‘to support the progress of students’ rather than ‘to prove to an observer/inspector/line manager that I do it’!), and off we set…

Much of the final document focussed on the written feedback (i.e. ‘marking’) side of things; the classroom-based side of assessment and feedback (i.e. the ‘short cycle’ formative assessment I referred to in this post) is picked up elsewhere through our focus on classroom practice in the learning and teaching programme. It sets out minimum expectations and core principles (whilst avoiding being unnecessarily directive or prescriptive) in terms of the frequency of formative feedback, the importance of students being given time to reflect and respond to feedback (DIRT) etc. It also prompted us to make a few potentially risky decisions (for good reasons!), for example removing half-termly data drops and opting instead for a ‘live’ system. This allows subject areas and class teachers to add interim assessment data as and when summative assessments are completed, so that schemes of learning can be planned and scheduled in a way that makes sense for the learning and development of ideas, rather than being scheduled just so that the assessment data from each unit can be included in an arbitrary data drop every half term.


Photo Credit: MartinShapiro via Compfight cc

Is it all working perfectly? Not yet. However, we are convinced that the principles are the right ones, and we are taking every opportunity to remind staff that we want them marking at the right times and for the right reasons: we care about staff wellbeing and we care about the learning experience of our students. If we want our staff to work sensible hours, we recognise that there may be occasions where compromises have to be made. We want staff to make judicious, high-quality use of the red pen in a way that maximises impact at key times, and to ensure that time is prioritised for planning great learning experiences across lessons and units.

In this post on sharing effective practice in relation to marking and using this to refine practice across the school (emphasising the long-term developmental focus rather than short-term monitoring of compliance), Stephen Tierney (@leadinglearner) shares some interesting ideas on moving a policy from words on a page to tangible changes in practice. There are certainly some things for us to attend to in this respect over the remainder of the academic year – creating opportunities for staff to see the detail of what is happening around the school and what seems to be working, not only in terms of workload, but also in terms of what effective written feedback actually looks like. If we’re looking at written feedback through the lens of economy and efficiency, we have to look carefully at what has the most impact. We also have work to do around ensuring it has impact for the students: impact that they are able to reflect on in a meaningful way, and that they can articulate – for their own benefit, but also to others – the specific links between the feedback they are being given and the progress they are making.

Clearly, this has to be done alongside ongoing reflection on what the research is telling us, both the findings being gleaned as part of the work of one of our Professional Learning Project groups and the larger-scale, more robust findings of the EEF’s long-awaited review into marking, published last month. Although one of the key messages from the document is that the evidence on the impact of marking is actually fairly scant, don’t let that put you off reading it – there are still suggestions that emerge from the research that does exist, and many of these are to do with the fine details of how we mark, rather than recommendations that would affect the broad strokes with which we have set out the principles of our assessment policy.

 

“Does our marking approach require our pupils to work to remember or reach the correct answer?” (p12)

This is the question that struck me most in the whole report. It isn’t necessarily the most significant point, but given our recent reflection on the way we are using summative assessments formatively, and the conclusions we’ve reached about the need for students to think hard for themselves, this seems like a potentially useful rule of thumb when considering whether our marking is focussed on surface-level corrections or on the development of deeper understanding.

The contention presented by the EEF review seems to be that mistakes (something a student can do but has not done on this occasion) should be marked as incorrect but left for students to correct themselves, while errors (resulting from misunderstanding or from not yet having mastered something) should perhaps be dealt with by providing hints or questions that lead students towards a more complete and accurate understanding of the topic at hand.

Although the distinction between errors and mistakes isn’t made in the same way in this transcript from a talk given by Dylan Wiliam, he does place a similar emphasis on the idea that it should be a standard expectation that students do the thinking:

We suggested that instead of telling students that they got 15 out of 20, the [maths] teacher could, instead, tell them that five of their answers were wrong, and that they should find them and fix them. The important feature of this feedback, like comment-only marking, is that it engages students, leaving them with something to do. This technique was subsequently adopted by English teachers when they provided feedback on students’ final drafts of writing assignments. Rather than correcting spelling, punctuation and grammar, the teachers put a letter in the margin for each error in that line, using a G for an error in grammar, an S for a spelling mistake, a P for a punctuation error, and so on. For the stronger students, the teacher would simply put a dot rather than S/P/G in the margin for each error, and for the weaker students, the teacher might indicate where in the line the error was. The idea is that the feedback gives the learner something to do, so that the immediate reaction of the learner is that they have to think.

Elsewhere, in this fairly weighty review of research on formative feedback from Professor Valerie Shute (though not specifically about written formative feedback), there are some interesting comments about the idea of ‘directive feedback’ (providing corrective information) as opposed to ‘facilitative feedback’ (providing guidance and cues):

“Conventional wisdom suggests that facilitative feedback…would enhance learning more than directive feedback…yet this is not necessarily the case. In fact, some research has shown that directive feedback may actually be more helpful than facilitative—particularly for learners who are just learning a topic or content area (e.g., Knoblauch & Brannon, 1981; Moreno, 2004). Because scaffolding relates to the explicit support of learners during the learning process, scaffolded feedback in an educational setting may include models, cues, prompts, hints, partial solutions, as well as direct instruction (Hartman, 2002). Scaffolding is gradually removed as students gain their cognitive footing, thus directive feedback may be most helpful during the early stages of learning. Facilitative feedback may be more helpful later on, and the question is when. According to Vygotsky (1987), external scaffolds can be removed when the learner develops more sophisticated cognitive systems, where the system of knowledge itself becomes part of the scaffold for new learning.”

So, like everything else to do with learning and teaching, it isn’t straightforward (and this is before we even get into John Hattie and Helen Timperley’s work on the power of feedback and the importance of considering the ‘level’ at which feedback is directed: ‘task’, ‘process’, ‘self-regulation’ or just ‘self’…)

It is unlikely to be easy to prescribe for teachers, in simple black and white, exactly what they should do and when… nor should we need to, if we have faith in their professional judgement and intuition – based on a detailed understanding of each individual – about what sort of feedback is most appropriate at any point in time for any single student. Perhaps the emphasis should be on making sure that our staff are proficient in working with a range of strategies, that they have an appreciation of the rationale behind which strategies can work and when, and then trusting them to make the right decisions.

Photo Credit: nicolegalpern1 via Compfight cc

 

The EEF review makes various other suggestions based on the evidence that is already available, some of it reassuring in terms of the direction we are heading, some of it giving pause for thought and highlighting areas that we would do well to give more thought to… but then, if anyone in education reads a report like this and concludes that they’ve got it all nailed already, I suspect they either haven’t read the report properly, haven’t really understood it, or don’t really have a true appreciation of what is going on in their own setting…

Is anything in the report groundbreaking? No, but it offers some tangible suggestions around which we could do some developmental work with staff to ensure we get the most impact for the most reasonable amount of input. I’m looking forward to reading whatever comes next…

As we continue to look outside our own school to learn from others, I’m also rather intrigued by the idea of ‘marking the Michaela way’, which seems like a minimal (to say the least) whole-school approach to marking that could have a lot going for it… A thought-provoking read!

A few more interesting reads…