
Identifying and assessing students’ spoken language skills

Written by: Neil Mercer and Paul Warwick

In recent years, there has been a growing recognition of the need to help young people develop their abilities to use spoken language effectively. Employers commonly say that members of their workforce, especially those engaged in creative activities and customer-related roles, need well-developed skills in communication and collaborative problem solving. They want people who can make clear presentations, work well in teams, listen properly to people and solve problems collaboratively. Moreover, these are skills that equip young people for full participation in active learning in school, in democratic processes and life in general.

If it is accepted that schools should be helping children to develop such skills, then teachers need ways of monitoring and assessing the talk skills of their students in a classroom setting so that they can aid their development. It is for these reasons that, with funding from the Education Endowment Foundation and working with the London-based free school School 21, we created an oracy assessment toolkit for assessing how well children aged 11-12 can use spoken English for different purposes and in different contexts. This age group corresponds to the first year of secondary school in most schools in the UK (and in many other countries), and by focusing on it in particular, teachers could make an initial assessment of their new intake as they arrived in school. The toolkit is designed for assessing all students as speakers of English, not just those for whom English is a second language.

When developing the toolkit we examined the kinds of assessment tools that others had already developed and used. These included the schemes devised by the Assessment of Performance Unit (APU) in the UK back in the 1980s, the Scottish National Monitoring Survey, the assessments of public speaking made by the English-Speaking Union, the GCSE English speaking and listening assessments used by the various examination boards and the Key Stage 2 SATs speaking and listening tests, as well as several from outside the UK, including the schemes created by Oracy Australia. Although we are not concerned with assessing the developing use of English as a second language, we also looked at methods that have been used to assess the progress of second language learners. The survey of these diverse assessment tools proved very valuable, not least in helping us avoid reinventing wheels.

During the development of the Skills Framework we also consulted members of an expert group, including people with expertise in drama, English teaching, modern language teaching, linguistics, speech therapy and educational assessment. We were pleased with the quality of the constructive criticism they provided and with their enthusiastic support for the development of the toolkit. However, their comments did prompt some serious reflection about, and revision of, both the framework and our assessment tasks. The same is true for the secondary teachers (mainly of English, though also some modern foreign language teachers) we consulted and involved in testing out the assessment instruments. They were based in schools serving different kinds of catchments: inner London and Coventry, and rural Cumbria, Hertfordshire and Cambridgeshire. From them we gained much useful feedback on both the framework and the tasks, and the teachers provided valuable insights that helped us to avoid unnecessarily technical or complex ways of describing the skills.

The toolkit consists of a set of initial tasks, a set of curriculum-embedded, assessment for learning (AfL) tasks for use throughout the year and a set of end-of-year tasks, together with a system for assessing performance on these tasks and giving feedback to the children. We aimed to make the use of the toolkit as flexible as possible, so that teachers can use any or all of the AfL tasks at any point in the school year, with any number of children, depending on the circumstances within a school. The materials also include video examples of Year 7 students carrying out the tasks, and explanations of how these have been rated using the assessment scheme.

The Oracy Skills Framework (as shown in Figure 1) provides an important foundation for the toolkit. We felt the need to develop it as there did not seem to be an available comprehensive model of the various skills that are needed to use spoken language effectively across a range of situations. Moreover, most previous approaches to assessing oracy seemed to rely on performance criteria related to specific situations, rather than being underpinned by a more general framework. It seemed to us that, while some communicative tasks or situations differ regarding which skills or performance features are most important for effective communication, some skills – and perhaps even most – will have general relevance. So, for example, although the ability to project one's voice will be more important when making a public speech than when involved in group work, and building upon what others say will conversely be more important in group work, the ability to present one's ideas clearly to a specific audience is crucial in both types of task. By offering teachers this kind of framework, we considered that they could construct an 'oracy profile' for any student, which would not just be situation-specific. Thus a student might be given feedback that they are excellent at making a clear formal presentation to an audience, but need to develop their ability to listen to what others say in group discussions. However, we are aware of the limitations of profile scoring when there are high intercorrelations between profile components (Feinberg and Jurich, 2017).

Our skills framework for oracy was developed in several ways. Initially, we had some extended and productive discussions with our partners in School 21 about what constituted the effective use of spoken language, and what might realistically be expected of 11-year-olds in that respect. This made us all more aware of the diverse nature of the skills involved, with some being essentially 'physical' (such as voice control), some 'linguistic' (such as choice of vocabulary), some 'cognitive' (such as organisation of content) and some 'social and emotional' (such as the ability to manage a group discussion). Those different aspects became the key organising categories of the framework – see Figure 1. The assessment tasks have been designed to generate examples of young people's use of talk in three rather different situations:

  1. making a presentational speech on a specific topic
  2. working collaboratively in a group to discuss an issue and reach an agreement
  3. working in a pair, with one person helping the other to perform a particular task (in this case, construction of a Lego model) by only using spoken language.

 

As mentioned earlier, videos of Year 7 students carrying out these tasks are available on the toolkit website: www.educ.cam.ac.uk/oracytoolkit. All of the other toolkit material can be downloaded free from that site. A more detailed account of how the toolkit was developed and validated can be found in Mercer, Warwick and Ahmed (2017). Since the completion of our project, another scheme for assessing spoken language skills (which draws upon our own work) has become available as part of the LAMDA Level 2 Award in Speaking and Listening Skills (www.lamda.org.uk/examinations/schools-award). Both it and our toolkit show that it is possible to provide teachers with a framework for understanding the spoken language skills that their students will need to use to talk effectively in the various social situations they find themselves in; a set of tasks for assessing their students' oracy skills across a sample of such situations; and a rating scheme that provides a valid and fairly reliable way of assessing individual students' levels of competence and the progress they make over time. It is our hope that these developments will help to improve the amount and quality of oracy teaching in British schools, so that young people are better prepared for life in the 21st century.

 

References

Feinberg RA and Jurich DP (2017) Guidelines for interpreting and reporting subscores. Educational Measurement: Issues and Practice 36: 5-13. DOI: 10.1111/emip.12142.

Mercer N, Warwick P and Ahmed A (2017) An oracy assessment toolkit: Linking research and development in the assessment of students’ spoken language skills at age 11-12. Learning and Instruction 48: 51-60.
