Note: The author of this article is an Apple Professional Learning Specialist.
‘Well, that was fun!’ was a comment I heard after a recent professional learning session with teachers. We had focussed on using robots to teach computing. The teachers had all taken part, solved problems and there was a real buzz in the room. It was fun, but would it make a difference to their students? How could we evaluate the session to find out whether it would be effective in practice?
Thomas Guskey proposed a five-level model to evaluate the effectiveness of professional learning (Guskey, 2000). He defined evaluation as ‘the systematic investigation of merit or worth’. Let’s take a look at each level.
Participants’ reactions – the first level
It begins with the people in the room. You will often have a ‘gut feeling’ about this, reading the body language of the participants. You will see keen and attentive people, actively listening and doing. You will also hear the opposite, with people saying, “I’m hopeless at technology.” It is important to engage the teachers from their starting points. A survey before the session may help to gauge the participants’ prior knowledge and experience. However, confidence can influence the responses – often people know more than they realise, or more than they feel confident enough to explain.
A questionnaire after the session can ascertain whether they found it useful, whether the pace was appropriate and whether the content was relevant. Often, informal comments made during conversations can add to the evidence of the initial reactions.
Participants’ learning – the second level
We can then focus on the new knowledge and skills gained. For example, what have the teachers learned? In conversation, they may explain what they intend to do next. This learning would normally be demonstrated over time – for example, a journal or portfolio could capture the modification of their lessons to include the new knowledge.
There can be ‘unintended outcomes’, as described by Guskey (Guskey, 2002). Do the teachers have new attitudes towards technology? When they can see what they have learned, their confidence builds and their attitude to using technology can improve.
Organisation support and change – the third level
This level can be crucial to a successful impact on learning. Does the school organisation support the implementation of new learning? For example, was enough time given? Teachers need to follow up and reflect on their lessons, and then plan the next steps.
Does the school provide support with resources? Buying equipment can be a difficult subject to approach, with many pressures on the school budget. Senior leadership needs to be involved so they can see the potential value of the technology.
Participants’ use of new knowledge and skills – the fourth level
Moving on, the teacher may repeat the lesson or modify it for a different context. It is all about building the confidence to integrate the technology. Performance management records could be a source of evidence. Videoing lessons can enable the teachers to watch themselves in the classroom. This can help them reflect and see their new knowledge and skills in action.
Student learning outcomes – the fifth level
How do the children show their learning? Has the students’ attainment increased? Has the implementation of the professional learning influenced their physical or emotional well-being? Is their attendance better? Are they more confident?
Applying the levels
Recently, I have been working with a network of primary schools, focussing on the computing curriculum. In particular, using Sphero robots to engage the children with physical computing has proven effective, especially when compared with their previous experience of only using onscreen learning. The Sphero robots are small, spherical robots, programmed using tablets or computers (Sphero EDU, 2018). They roll quickly around the floor, and children and adults alike learn by programming them. The app enables the children to draw a path, use blocks of code based on Scratch (Scratch, 2018), or type programs in JavaScript. An Apple app called ‘Swift Playgrounds’ can teach the Swift programming language using Sphero robots (Apple, 2018).
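To give a flavour of the progression from blocks to text, here is a minimal sketch of the kind of JavaScript program a child might type to drive a Sphero in a square. The `roll(heading, speed, time)` call and the `startProgram()` entry point are assumptions based on the Sphero Edu app’s text-programming environment; `roll` is stubbed with a logger here so the sketch can run anywhere, without a robot.

```javascript
// Record of each roll, so the sketch can be checked without a robot.
const moves = [];

// In the Sphero Edu app, roll(heading, speed, time) is provided by the
// environment and physically moves the robot. Stubbed here (an assumption
// about the app's API) so the sketch runs outside the app.
async function roll(heading, speed, seconds) {
  moves.push({ heading, speed, seconds });
  console.log(`roll: heading ${heading} deg, speed ${speed}, ${seconds}s`);
}

// Entry point: the Sphero Edu app calls startProgram() automatically.
// Drive in a square: four rolls, turning 90 degrees each time.
async function startProgram() {
  for (let side = 0; side < 4; side++) {
    await roll(side * 90, 60, 1); // headings 0, 90, 180, 270
  }
}
```

Even a short program like this gives children a concrete path from drawing and block-based coding towards typed code, while keeping the same idea of sequencing instructions.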
In our example, I modelled the computing lessons, teaching alongside the teachers. This modelling enabled the teachers to see how the lesson can be structured. After the lesson, the teachers said, “Now that I’ve seen it in action, I feel I can lead a lesson with the Spheros.” They also said, “The children can show us how to do it!” These are the participants’ initial reactions.
To make the content relevant, links to the Barefoot Computing website were shared (Barefoot Computing, 2018). The website clearly defines the computational thinking concepts and why they are important. A teacher said, “I like that website, as it shows all the learning the children have demonstrated today.” This helps to build confidence, as teachers can see the value of the lesson and link it back to the curriculum. This shows the participants’ new learning.
With the Sphero robots, the senior leaders of the school were invited to join the lessons. They could see the children problem-solving, collaborating and getting excited about computing. If the impact on learning can be shown, then the purchase of technology can be justified. An after-school twilight session helped to share the experiences and convince the whole staff. Short videos and recordings of the children in action had a great effect too. This sharing and communication can influence organisation support and change.
Working with the teachers over the following weeks, we reviewed the curriculum. Through this coaching cycle, it was important to plan backwards – identifying the attitudes, behaviours or attainment we would like to see in the children, and working backwards to design the learning. This demonstrated the participants’ use of new knowledge and skills.
How did we gather evidence to gauge impact? The teachers noticed a behavioural change – one example was, “That child would never normally persevere with a problem, but they are shining in this lesson.” This is evidence of the students’ learning outcomes.
Conclusion
Guskey’s five-level model has certainly provoked thinking about student impact. Think of an example of professional learning you have experienced or led. Can you apply the five levels? How can you gather evidence to measure impact? We can all improve as we work together to make an impact on our students’ lives.
References
Guskey, TR (2000) Evaluating Professional Development. California: Corwin Press Inc.
Guskey, TR (2002) ‘Does it make a difference? Evaluating professional development’, Educational Leadership, 59(6), pp. 45–51.
Sphero EDU (2018) Available at: https://edu.sphero.com (accessed 7 January 2019).
Scratch (2018) Available at: https://scratch.mit.edu (accessed 7 January 2019).
Apple (2018) Everyone Can Code. Available at: https://www.apple.com/uk/everyone-can-code/ (accessed 7 January 2019).
Barefoot Computing (2018) Computational Thinking. Available at: https://barefootcas.org.uk/barefoot-primary-computing-resources/concepts/computational-thinking/ (accessed 7 January 2019).