
Building a responsive teaching culture by designing and using hinge questions

Victoria Foster, Simon Underhill, Kim Gillingham, Hazel Brinkworth
Dulwich College Shanghai Pudong (Junior School), China

Perhaps the most common and challenging problem that all teachers face is establishing what students have learnt, what misunderstandings they have and what their next steps should be. This could be described as the search for the holy grail of teaching – valid and reliable interpretations of student learning.

Student learning is quite different from student performance (Soderstrom and Bjork, 2015). This is one of the most fundamental concepts for teachers to grasp when embedding effective assessment practices. Put simply, student performance relates to short-term retention whilst student learning involves long-term retention and the ability to transfer what has been learnt to different contexts – a far more complicated process.

Developing robust practices to assess student learning within the Junior School at Dulwich College Shanghai Pudong has become a major focus in recent years. In the past, our assessment practices included votes of confidence, teacher judgements and formal written test papers, but these approaches were rarely adequate for making valid inferences about students’ learning. We needed more frequent, reliable, formative check-in approaches that would lead to valid interpretations.

Our starting point

To try to tackle the messy business of capturing students’ knowledge and understanding, we wanted to establish a culture of hinge questioning – the practice of carefully planning the content and timing of a question within a unit of work in order to identify those who were being left behind in the learning process.

Hinge questions allow misconceptions to be noted early and concepts retaught if needed (Wiliam and Leahy, 2015). We focused on multiple-choice hinge questions (Barton, 2018), as shown in Figure 1.

Figure 1: Example of a multiple-choice hinge question

Which fraction is equivalent to the fraction shaded in this diagram?

Diagram where four out of six squares are shaded.

A: 4/2

B: 2/6

C: 3/5

D: 2/3

Students who answer A may not have a clear understanding of fractions as parts of a whole, perhaps placing the shaded parts over the unshaded parts. If they choose B, students have found the fraction of the unshaded part. Answer C might be chosen by students who have an idea of simplifying but think that they can do so by subtracting one from both the numerator and the denominator. D is the correct answer, reached by simplifying 4/6. It might not look like the obvious answer, as the diagram does not show two parts shaded out of three, making it less likely to be an option that students might guess and still be correct.
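
To make the design concrete, here is a brief illustrative sketch (in Python; a way of expressing the idea, not a tool from our classrooms) of how each option in Figure 1 could be paired with the misconception it is written to reveal. The wording of each diagnosis paraphrases the analysis above:

```python
# Illustrative sketch: each distractor in the Figure 1 question is tagged
# with the misconception it is designed to surface.
FRACTION_QUESTION = {
    "prompt": "Which fraction is equivalent to the fraction shaded in this diagram?",
    "correct": "D",
    "options": {
        "A": ("4/2", "shaded parts placed over unshaded parts, not parts of a whole"),
        "B": ("2/6", "found the fraction of the unshaded part"),
        "C": ("3/5", "'simplified' by subtracting one from numerator and denominator"),
        "D": ("2/3", None),  # correct: 4/6 with both parts divided by 2
    },
}

def diagnose(choice: str) -> str:
    """Return the diagnostic reading of a student's chosen option."""
    answer, misconception = FRACTION_QUESTION["options"][choice]
    if misconception is None:
        return f"{answer}: correct"
    return f"{answer}: possible misconception - {misconception}"

print(diagnose("C"))  # 3/5: possible misconception - 'simplified' by subtracting ...
```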

The use of technology was key in our implementation journey, as we could design and easily assess questions on platforms such as Plickers, Nearpod, Quizizz and GoFormative. These tools made quizzes accessible, collated answers in real time and encouraged a high level of engagement. They also allowed us to be more responsive – knowing when we needed to continue exploring an idea with students so that they gained further clarity, and when to move on to the next idea. Equally, they helped to pinpoint groups of students who had not yet acquired the desired knowledge.
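
As a rough illustration of this responsive step (a sketch only – none of these platforms exposes exactly this interface), collating a class’s answers and applying a move-on threshold might look like this; the 80 per cent threshold is an illustrative assumption, not a figure from our practice:

```python
from collections import Counter

def summarise_responses(responses: dict[str, str], correct: str,
                        move_on_threshold: float = 0.8) -> None:
    """Tally a class's hinge-question answers and flag whether to move on.

    `responses` maps student name -> chosen option. The 0.8 threshold is an
    illustrative assumption, not a prescribed figure.
    """
    tally = Counter(responses.values())
    total = len(responses)
    for option, count in sorted(tally.items()):
        print(f"Option {option}: {count}/{total}")
    if tally.get(correct, 0) / total >= move_on_threshold:
        print("Most students are secure - move on.")
    else:
        stuck = [name for name, choice in responses.items() if choice != correct]
        print("Revisit the concept with: " + ", ".join(stuck))

summarise_responses({"Ana": "D", "Ben": "C", "Caro": "D", "Dev": "B"}, correct="D")
```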

Staff began the process by identifying the threshold conceptual understandings and knowledge in each unit of work’s progression model. For example, a Year 5 physics unit of work on forces involves students understanding that forces are needed to put anything into motion or to keep it still, and that controlling forces affects the speed, direction or shape of an object. In terms of knowledge, students will need to know the different forces of friction, gravity, push, pull, air resistance and support; different ways to influence the size and direction of a force; and how and why forces work in pairs.

What we noted early on was that teachers needed to work collaboratively to identify the understandings they hoped students would achieve and the possible misconceptions they might hold. Short, reliable assessments in which every answer was plausible were required to identify those who had gained the desired understanding. Each question demanded an element of thinking, and time in which to do it, so that students had to reason their way to the correct answer rather than rely on a best guess.

The greatest difficulty was in creating the hinge questions themselves. To ascertain each student’s level of understanding, the answers needed to be plausible and similar enough that only genuine understanding would separate those who understood from those who didn’t. This required teachers to write the answers in syntax accessible to all, while taking into account students’ cognitive ability to distinguish similarly worded options. Additionally, teachers needed to anticipate the possible misconceptions attached to each desired understanding. This proved challenging, as assumptions about students’ general knowledge and experience were difficult to verify. All of these facets required time and solid collaboration amongst staff.

An example is shown in Figure 2. A group of Year 6 teachers wanted to assess students’ application of commas, colons and semi-colons in a passage of text. A hinge question was set using the Nearpod platform.

Figure 2: Example hinge question on Nearpod platform

Which of these sentences is punctuated correctly?

  1. When you go to the shop, buy the following: flour, sugar, and chocolate for the cake; bread, cheese, and ham for the sandwiches; and cat food, for the cat.
  2. When you go to the shop, buy the following: flour, sugar, and chocolate for the cake: bread, cheese, and ham for the sandwiches: and cat food, for the cat.
  3. When you go to the shop, buy the following: flour, sugar, and chocolate for the cake, bread, cheese, and ham for the sandwiches, and cat food for the cat.
  4. When you go to the shop, buy the following; flour, sugar, and chocolate for the cake; bread, cheese, and ham for the sandwiches; and cat food for the cat.

The feedback on how the students answered the question was immediate, allowing the teachers to see how to respond to individual needs.

After a few more days of reviewing the application of these punctuation tools, the teachers asked the students who hadn’t answered the first hinge question correctly a follow-up question in the form of an exit ticket. Students were required to punctuate the following complex list:

My favourite sports are tennis cricket and golf in the summer rugby football and lacrosse in the winter and badminton and swimming all year round.

Assessing in an online context

Our steps forward in using multiple-choice questions proved to be a great asset when we were faced with the challenges of online learning. Synchronous teaching brought many new challenges to effective pedagogy – in particular, how best to assess whether students were learning in a virtual environment.

We used multiple-choice questions to create summative mini-quizzes on domain-specific objectives from our English and maths curriculums. After trying to administer quizzes in a multitude of ways, we realised that if we were going to draw any valid inferences about the students’ learning, our assessments had to be timed and supervised. This resulted in us carrying out our assessments in synchronous live lessons, using online quiz tools that produced results in real time. To try to ensure that our inferences reflected learning rather than performance, we introduced a period of spacing between finishing teaching a concept and assessing it. This was to determine whether the desired learning had been committed to students’ long-term memory.
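
The spacing step can be expressed as a simple rule. The sketch below is our illustration of the idea (the ten-day interval is an assumption for the example, not a figure we prescribed), scheduling a delayed, supervised quiz after teaching ends and reading the immediate and delayed results together:

```python
from datetime import date, timedelta

SPACING = timedelta(days=10)  # illustrative interval, not a prescribed figure

def schedule_delayed_quiz(teaching_finished: date) -> date:
    """Date for the spaced, supervised quiz on a concept."""
    return teaching_finished + SPACING

def retention_reading(immediate_correct: bool, delayed_correct: bool) -> str:
    """Crude performance-versus-learning reading for one student."""
    if delayed_correct:
        return "retained after the spacing interval - likely learnt"
    if immediate_correct:
        return "performed in the lesson but not yet retained - revisit"
    return "not yet secure - reteach"

print(schedule_delayed_quiz(date(2021, 3, 1)))  # 2021-03-11
print(retention_reading(immediate_correct=True, delayed_correct=False))
```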

The future

We have focused on why we are asking each question, what we want to find out about each student’s understanding and, at the planning stage, when each hinge question should be asked. We are using the information these questions generate to adapt our teaching, either within the same lesson or in future lessons (Coe et al., 2019).

We are now planning to better inform our parents about their child’s learning, rather than their performance, by developing a live reporting system – sharing snapshots of a child’s journey. By identifying and collecting multiple snapshots, the learning journey becomes clearer in terms of what each of our students truly knows and understands, and these valid interpretations can be shared and celebrated.

References

Barton C (2018) On formative assessment in math: How diagnostic questions can help. Available at: www.aft.org/sites/default/files/ae_summer2018_barton.pdf (accessed 29 March 2021).

Coe R, Rauch CJ, Kime S et al. (2019) Great teaching toolkit: Evidence review. Evidence Based Education. Available at: https://assets.website-files.com/5ee28729f7b4a5fa99bef2b3/5ee9f507021911ae35ac6c4d_EBE_GTT_EVIDENCE%20REVIEW_DIGITAL.pdf (accessed 29 March 2021).

Soderstrom NC and Bjork RA (2015) Learning versus performance: An integrative review. Perspectives on Psychological Science 10: 176–199.

Wiliam D and Leahy S (2015) Embedding Formative Assessment: Practical Techniques for K-12 Classrooms. West Palm Beach, FL: Learning Sciences International.
