
Promoting student-led questions in the secondary science classroom: An analysis of the types of questions created by students

Written by: Adewale Magaji, School of Education, FEHHS, University of Greenwich, UK

This article reports a section from a study aimed at promoting formative assessment through student-led questions. Questions are used to initiate classroom discussions and form a prominent aspect of assessment for learning in science (Black and Wiliam, 1998). They can be used to scaffold learning and to identify where students are, where they need to go and how best to support progress (Dixson and Worrell, 2016). For questioning to be effective, therefore, teachers should engage students in developing questions and allow them to explore these. This view is corroborated by the Department for Education (2019), which suggests that teachers should allow students to share emerging understanding so that misconceptions can be addressed through structured talk activities. This can be achieved through student-led questions and feedback that identify gaps in students' knowledge. However, pedagogical challenges arise when teachers dominate questioning through the initiation, response and evaluation process (IRE; Cazden, 2001), leaving less time for students to create questions and give feedback. This is a concern, as research shows that students find it difficult to ask questions because they are not trained to do so (Mahmud, 2015). Formative assessment should therefore be a priority for all teachers, who can build assessment capability and competency by using a range of assessment strategies to promote students' learning (DeLuca et al., 2019), such as activities designed to develop student-led questions.

Experience of teaching and training teachers suggests that science teachers need to devote more time to supporting student-led questions (Mahmud, 2015). This can be achieved by modelling how to create questions using the revised Bloom’s taxonomy (Anderson and Krathwohl, 2001) and by helping students to understand the different types of questions and how they influence the quality of feedback required. This study therefore aims to answer the research question: What types of questions do science students ask and how can this promote learning?

Methodology

This research was informed by a constructivist paradigm promoting student-led questions. It allowed students to work collaboratively, sharing ideas, promoting reasoning and creating questions and feedback (Larkin, 2017). The study involved three science teachers and 137 students aged 12 to 14 years in a secondary school in England. The students were in the first, second and third years of secondary school education and comprised two groups: group one was the experimental group, while group two was the control. Both groups were taught similar topics during the academic year (Table 1) and were exposed to similar pedagogical strategies, such as those listed below. For example, while teaching a topic on food and digestion, students were given a scenario in which they assumed the role of a nutritionist helping teenagers to decide on a healthy and balanced diet to promote good health. The activity was supported by the strategies below, such as think-pair-share and problem-solving. This allowed students to ask questions and established a baseline for the types of questions they created.

  • starters and plenaries
  • think-pair-share questions
  • thinking time and talk partners/structured talk activities
  • practical work and investigation
  • using exam-style questions
  • exploring texts and rewording contents
  • researching and finding information
  • problem-solving tasks
  • collaborative and cooperative learning activities
  • argumentation, debate and using evidence
  • pose, pause, bounce and pounce (P-P-B-P).

Several National Curriculum topics were taught in the academic year (see Table 1) and students were encouraged to ask questions, using various strategies such as think-pair-share and P-P-B-P. Questions were collected from different year groups of students (Table 1) to ascertain whether age and time spent in school could influence the quality of questions created. In term two, the experimental group was trained, through teacher modelling, to create questions using the revised Bloom’s taxonomy (Anderson and Krathwohl, 2001). This involved the use of Bloom’s prompts, such as what, why, explain, apply, justify, design and create; these key terms enabled students to construct sentences when creating questions. Data collection involved lesson observations, focus groups and field notes. Questions created by students were compared by grouping them into low or high order using the revised Bloom’s taxonomy and calculating their frequency and percentages. Clarity was sought from teachers regarding any technical terms used. The varied sources of data supported triangulation (Robson, 2011).

Findings and discussion

The outcome (Table 1) shows samples of low order questions created by both the experimental and control groups of students.

Table 1: Frequency of questions created by the experimental and control group of students

| Student age (year group) | Topics covered | Samples of questions created (low order) | Frequency n (% within year group) |
| --- | --- | --- | --- |
| 12 (Year 7) | Biology: cells, reproduction, environment and feeding relationships | List the types of cells you know. | 28 (23) |
| 12 (Year 7) | Chemistry: atoms, elements and compounds, chemical reactions, acids and alkali, solid, liquid and gas | What is a compound? | 23 (19) |
| 12 (Year 7) | Physics: forces, energy resources, electricity | Describe the meaning of force. | 70 (58) |
| 13 (Year 8) | Biology: plants, variation and inheritance, classification, food and digestion | What is photosynthesis? | 157 (58) |
| 13 (Year 8) | Chemistry: chemical reactions – metal and acids, compounds, environmental chemistry | State the gases that cause pollution. | 54 (20) |
| 13 (Year 8) | Physics: sound and light, electricity, magnetism, forces | What is light reflection? | 58 (22) |
| 14 (Year 9) | Biology: blood cells, plants and plant hormones, food/digestion/enzymes | What are enzymes? | 29 (18) |
| 14 (Year 9) | Chemistry: organic chemistry, paints, air and atmospheres | What do saturated hydrocarbon and polymerisation mean? | 59 (36) |
| 14 (Year 9) | Physics: electromagnetic radiation, waves and signals, fuels/energy | Why are fossil fuels finite? | 74 (46) |

A sample low order question from Table 1, and the feedback it received, are highlighted below:

Student 1 question: List the types of cells you know.

Student 2 response: Plant and animal cells.

In contrast, a sample high order question and the feedback it generated, highlighted below, emerged after the experimental group was trained in using Bloom’s taxonomy:

Student 3 question: Explain what changes you would recommend as an alternative to eating high energy and fatty foods.

Student 4 response: Eat a small portion of food with salads, lots of vegetables and drink plenty of water instead of fizzy drinks because they are not healthy for you.

Student 5 response: Reduce fat in your food using machines that remove oil from fried foods.

The questions initially created by students, shown in Table 1, were low order, categorised as ‘remember and understand’ on Bloom’s taxonomy (Anderson and Krathwohl, 2001). Such questions, for example ‘What is a compound?’, do not promote thinking. The question asked by student 1 did not promote cognitive development and application of knowledge, as seen in the feedback from student 2. However, following the intervention of training the experimental group of students in using Bloom’s taxonomy, there was an improvement in the quality of questions and feedback, as evidenced by students 3, 4 and 5. Student 3 asked a high order question that may involve ‘evaluating and creating’, using Bloom’s taxonomy prompts such as ‘explain’ and ‘recommend’, implying that detailed feedback is required. The high order question allowed the students to embrace wait time, internalise their thought processes, and engage in collaboration and think-pair-share. This interactive and dialogic process resonates with the views of Gan et al. (2019), who assert that such collaborative learning is necessary to support student-led questions and feedback. In addition, engaging with high order questions enables students to challenge and explore misconceptions, create cognitive conflict among their peers, and discuss and share their varying ideas. This is supported by teacher 1, who said:

‘Some students helped others to reframe their questions if they thought it was not well presented but this was only seen in the students using Bloom’s taxonomy, as they had a guide to help them structure and ask high order questions.’

The comment above shows that the ability to create high order questions, and to understand the depth and requirements of each question, may influence the quality of feedback among students. Unfortunately, the teachers confirmed that they do not use Bloom’s taxonomy themselves and consequently have not trained students to use it to create questions. This study has shown that Bloom’s taxonomy can improve student-led questions and that there is a need to encourage and support teachers in using it in their classrooms. For example, teacher 2 said:

‘Using Bloom was effective in supporting students to develop questions as I identified three students who in normal lessons will not take part in activities but were engaged with the tasks, and asking questions and responding to other students’ questions, although it took them a while to do this.’

The pedagogical implication for science teachers is to consider how lessons are structured to promote student-led rather than teacher-led questions. This may involve combining the various pedagogical strategies mentioned earlier with Bloom’s taxonomy. This aligns with the views of Schildkamp et al. (2020) and Black and Wiliam (1998) that formative assessment should be an integrated element of instruction, requiring a change in the role of the teacher and a shift in the power relations between teachers and students so that they can be jointly responsible for the quality of teaching and learning in the classroom. For example, it took several lessons to train the experimental group in using Bloom’s taxonomy to create high order questions, and this should be an ongoing developmental process between science teachers and students. In addition, there was no difference in the quality of questions created by Year 7, 8 and 9 students – a concern, as the assumption would be that older students would have more highly developed skills in asking high order questions. The total numbers of questions created by Year 7, 8 and 9 students across the subject areas were 214 in biology, 106 in chemistry and 102 in physics. The higher number of biology questions may reflect how easily students could link lesson scenarios to real-life situations; however, this could be explored further. There was no improvement in the quality of questions among the control groups, in contrast to the experimental groups, who demonstrated an improvement in this aspect. This study shows that students are willing to ask questions but do not know how to. Therefore, teachers’ assessment capability and competency may be a barrier to promoting student-led questions, and urgent measures are required to address this.

Conclusion

The initial questions created by the experimental and control groups were low order, categorised as ‘remember and understand’. These questions involved factual recall and did not challenge students or promote cognitive development, despite the various pedagogical strategies used. However, training the experimental group in using Bloom’s taxonomy led to an improvement in the quality of questions and feedback, implying that teachers should embrace this pedagogy in supporting student-led questions. Professional development should therefore focus on supporting science teachers in using Bloom’s taxonomy to create differentiated questions, as well as in modelling to students how to create questions. This will help students to understand the various types of questions and the quality of feedback that can be achieved from them.

References

Anderson LW and Krathwohl DR (2001) A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.

Black P and Wiliam D (1998) Inside the Black Box: Raising Standards Through Classroom Assessment. London: School of Education, King’s College.

Cazden C (2001) Classroom Discourse: The Language of Teaching and Learning, 2nd ed. Portsmouth, NH: Heinemann.

DeLuca C, Willis J, Cowie B et al. (2019) Policies, programs, and practices: Exploring the complex dynamics of assessment education in teacher education across four countries. Frontiers in Education 4: 132.

Department for Education (DfE) (2019) ITT Core Content Framework. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/843676/Initial_teacher_training_core_content_framework.pdf (accessed 27 July 2020).

Dixson DD and Worrell FC (2016) Formative and summative assessment in the classroom. Theory Into Practice 55: 153–159.

Gan Z, He J and Mu K (2019) Development and validation of the Assessment for Learning Experience Inventory (AFLEI) in Chinese higher education. Asia-Pacific Education Researcher 28(5): 371–385.

Larkin D (2017) Planning for the elicitation of students’ ideas: A lesson study approach with preservice science teachers. Journal of Science Teacher Education 28(5): 425–443.

Mahmud M (2015) Questioning powers of the students in the class. Journal of Language Teaching and Research 6(1): 111–116.

Robson C (2011) Real World Research: A Resource for Users of Social Research Methods in Applied Settings, 3rd ed. West Sussex: Wiley.

Schildkamp K, van der Kleij FM, Heitink MC et al. (2020) Formative assessment: A systematic review of critical teacher prerequisites for classroom practice. International Journal of Educational Research 103: 101602.
