
Learning by Questions: Using evidence to develop teaching and learning software

Written by: Tony Cann
3 min read

The author of this article is the settlor of the Bowland Charitable Trust, which finances Learning by Questions.

This case study outlines the process adopted by the development team at Learning by Questions (LbQ) to develop an evidence-based app for the classroom. In particular, they were set the challenge of producing solutions that would:

  • Provide personalised, live feedback to both teachers and students
  • Facilitate teacher intervention based on need
  • Assess learning and progress through effective questioning
  • Engage students of all ages and confidence levels
  • Reduce teacher workload in planning and marking.

The initial development focused on mathematics mastery, an approach widely used in the east and south-east Asian countries that excelled in the 2015 OECD PISA assessment (OECD, 2016).

The team recruited over 50 primary and secondary schools in England and one academy in Atlanta, USA. These pilot schools adopted the technology and provided valuable feedback throughout the 12-month development phase.

Live feedback and teacher intervention

Wiliam (2011) states that ‘sharing high-quality questions may be the most significant thing we can do to improve the quality of student learning’, and believes that the sooner students receive feedback, the quicker real learning takes place. Similarly, in the USA, Wiggins (2012) describes effective feedback as ‘timely, concrete, specific, and useful’. With this in mind, the LbQ team developed the app with the capacity to offer ‘live feedback’ during lessons, generated by the software as soon as questions have been answered (see Powell, 2018).

The app has been designed so that students receive their questions and answer them on tablet screens. Once students start work, their names are displayed on a matrix on the teacher’s screen. Teachers can then access student responses when they want to monitor levels of understanding. This allows teachers to see instantly who has misconceptions and similar needs, those who need challenging and those needing some revision. Rather than circulating the classroom, hoping to spot errors and offer guidance, they know who to visit because they can see every individual’s progress live.
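To make the mechanism concrete, the following is a minimal sketch of how such a live matrix might be modelled in software. The student names, accuracy threshold and data structures here are purely illustrative assumptions, not LbQ's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class StudentRow:
    """One row of a hypothetical live matrix: a student's answer history."""
    name: str
    results: list = field(default_factory=list)  # True/False per question answered

    def accuracy(self):
        """Proportion of questions answered correctly so far, or None if none yet."""
        return sum(self.results) / len(self.results) if self.results else None

def needs_intervention(row, threshold=0.5):
    """Flag a student whose accuracy so far falls below an (illustrative) threshold."""
    acc = row.accuracy()
    return acc is not None and acc < threshold

# As answers arrive, each row updates and the teacher can see who to visit.
alice = StudentRow("Alice", [True, True, False, True])
ben = StudentRow("Ben", [False, False, True, False])
flagged = [row.name for row in (alice, ben) if needs_intervention(row)]
```

In this toy model, `flagged` would contain only "Ben", mirroring how the real app lets teachers see at a glance which individuals share a misconception rather than circulating in the hope of spotting errors.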

Effective questioning

Following the advice of Helen Drury (2018), one of the leading proponents of mathematics mastery in the UK, authors of the ‘question sets’ ensure that each topic contains questions that progress from basic practice and understanding to the ultimate goal of problem-solving. The questions appear in a self-paced format: when one question is successfully answered, another appears with no delay. If students answer incorrectly, they read the guidance and retry the question. Approximately 15,000 mathematics questions have already been published.
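The self-paced flow described above can be sketched as a simple state transition: a correct answer advances immediately to the next question, while a wrong answer returns the question's guidance for a retry. The questions, answers and guidance text below are invented for illustration and do not come from the LbQ question sets.

```python
# Illustrative question set: prompt, accepted answer, and retry guidance.
QUESTIONS = [
    {"prompt": "7 x 8 = ?", "answer": "56",
     "guidance": "Think of 7 x 8 as (7 x 4) x 2."},
    {"prompt": "9 x 6 = ?", "answer": "54",
     "guidance": "Use 10 x 6, then subtract one 6."},
]

def answer_question(index, response):
    """Self-paced transition: return (next_index, feedback).

    A correct response moves straight on to the next question with no delay;
    an incorrect one keeps the same index and returns the guidance to read
    before retrying.
    """
    question = QUESTIONS[index]
    if response == question["answer"]:
        return index + 1, "correct"
    return index, question["guidance"]
```

For example, a wrong answer to the first question returns index 0 again together with its guidance, while a correct answer moves the student on to question 1.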

LbQ in practice

Robert Powell’s book (2018) details case studies of six schools examining the impact of LbQ on teaching and learning. Powell observed lessons and interviewed teachers, leaders and students. Some of his key findings are summarized below:

  • All teachers reported that the high quality of the LbQ question sets had enhanced their teaching. They reported that the live feedback to them and their teaching assistants had improved their efficiency through targeted interventions.
  • All students reported their high engagement and particularly their liking for the instant feedback and the self-paced questions.
  • All six schools involved in the case studies reported gains in attainment. One primary school in a challenging area increased the proportion of pupils reaching the expected standard from 55% to 72%; one school in London had 12 pupils failing at the beginning of the year, but all passed their final SATs, 9 of them at a higher level; another school ran a ‘booster’ session at lunchtime for 12 pupils with special needs – 11 reached ‘expected’ level, with 4 at ‘greater depth’.
  • On workload, teachers reported that the time saved on planning and marking was significant, with one hour saved, on average, per lesson taught.

The evidence above from the six case studies is reinforced by other teachers involved in the 12-month pilot. 85 teachers stayed with the pilot for the full 12 months, and over 80 reported that:

  • their workload, including marking, had reduced
  • their students enjoyed maths more and grew in confidence
  • they were able to develop more effective interventions
  • the question sets compared favourably with other published resources.

Question sets for English and science are in development, with humanities and modern languages planned for the near future. The software has also been improved in response to feedback from the pilot schools. The future is indeed exciting.

LbQ was a finalist in four categories at the 2019 BETT show and was chosen as Innovator of the Year. Further details can be found at www.lbq.org.

References

Drury H (2018) How to Teach Mathematics for Mastery: Secondary School Edition. Oxford: Oxford University Press.

Organisation for Economic Co-operation and Development (OECD) (2016) PISA 2015: Results in Focus. Available at: https://www.oecd.org/pisa/pisa-2015-results-in-focus.pdf (accessed 21 September 2018).

Powell R (2018) Learning by Questions: The Power of Live Feedback. Blackburn: Learning by Questions Ltd.

Wiggins G (2012) Seven keys to effective feedback. Educational Leadership 70(1): 10–16.

Wiliam D (2011) Embedded Formative Assessment. Solution Tree Press.

