Assessment and recognition for next-generation learning

SANDRA MILLIGAN AND ANGELA POLLOCK, UNIVERSITY OF MELBOURNE, AUSTRALIA

Background and purpose of this paper

In Australia in 2012, general learning competencies (referred to elsewhere by terms such as 21st-century skills, transferable skills or transversal skills) were formally incorporated into the national curriculum and subsequently adopted at every level, in all Australian schooling jurisdictions. Since then, there has been a steady stream of major national reports that further endorse the importance for all students of developing these transferable competencies (Gonski et al., 2011), having them formally assessed and recognised – from the Early Years (Victoria State Government, 2023) right through to school leavers (Education Council, 2020) – and ensuring that they are further developed in tertiary study (Noonan, 2019). It was recognised that these general learning competencies support depth of learning and engagement in other domains, such as literacy and subject study. They are thought to be fundamental to ‘next-generation learning’, affording the capacity to thrive in school and beyond (Stevenson, 2022) and providing a way to improve inclusion and to narrow the growing gaps in equity in Australian schooling.

In 2007, a research centre at the University of Melbourne, now known as Melbourne Metrics (MM), commenced a programme of research to support schools to teach, assess and recognise these complex transferable learning competencies. The centre works with industry partners to research the field; develop and supply assessment tools, resources and services to schools and early learning centres; provide advice and consultancy support for authorities and agencies in a variety of jurisdictions, both in Australia and internationally; and develop an evidence base to support reform of policy and practice (Melbourne Metrics, 2024).

The team recognised that schools find it difficult to introduce and sustain the necessary changes to pedagogy and learning design, often because traditional methods of assessment and certification are misaligned. Traditional assessment methods (written tests and exams) are sufficiently reliable to assess mastery of subject-based content and individual attainment in cognitive skills, and they provide the formal currency of student success. But they are not suitable for assessing general learning competencies, such as collaboration, entrepreneurship, having agency in learning or being a good citizen, a strong communicator, a good leader, empathetic, persistent or ethical. This gap in assessment capability has the effect of downplaying the value of general transferable capabilities in the eyes of parents, stakeholders and even teachers (Pilcher and Tori, 2018).

Changing what is assessed can be powerful in changing what is valued, and school leaders can use this as a lever with which to change learning design, pedagogy and reporting in a school to better align with new learning ambitions. Different assessment and reporting tools are required, together with encouragement, understanding and time to implement changes, plus a supportive policy framework.

This paper provides a brief report of findings of long-term, ongoing research being conducted in Australia to create the methods, tools and techniques that enable assessment and reporting to lead initiatives to embed general learning capabilities as valued learning outcomes in schools. Collectively, these tools, techniques and methods are described in this paper as a ‘next-generation’ (next-gen) approach to assessment and reporting. The approach has emerged from MM’s research and development programme, undertaken with research partners over 15 years, involving over 627,000 assessments, 8,500 schools or other providers, and 67,000 students in Australia and internationally.

Characteristics of ‘next-gen’ assessment and recognition 

The traditional and time-honoured approach to school assessments is acceptable for generating a reliable rank-order list of the degree to which learners have mastered knowledge in a domain of teaching, or specific cognitive skills and techniques (such as solving problems, using calculus or similar). It is, however, unsuited to assessing competence in any domain. Competence in a domain is the ability to do something in practice successfully or efficiently. It usually develops over time, with practice, and derives from application of a constellation of knowledge, skills, attitudes and values, applied for purpose, often in novel circumstances. Competence can only be assessed by observing the degree to which a person can successfully perform complex tasks that are valued in the real world. Frequently used in assessment in the professions, sport and the performing arts, assessment of competence is now needed for the mainstream of schooling, to assess the degree to which students have the transferable learning competencies that they need to thrive as learners. 

Table 1 provides a summary of the key characteristics that distinguish the approach generally used in traditional assessment and reporting design (in the first column) from the MM ‘next-generation’ approach to assessment and recognition, suitable for assessment of traditionally hard-to-assess transferable competencies (in the second column). The next-gen approach draws from a range of traditions in assessment, such as peer- and self-assessment, criterion-referenced or standards-referenced assessment, formative and summative thinking, and measurement modelling. In some senses, there is nothing inherently new in it. What is new is the combination and its use as a mainstream approach in both low- and high-stakes environments, systematically applied to the assessment of all students across the curriculum.

Table 1: Key characteristics of traditional vs next-gen assessments and recognition
Traditional approach | ‘Next-gen’ approach for hard-to-assess competencies
Assessment as event at end of learning | Assessment as process through learning
Assesses syllabus-based content knowledge | Assesses competence in performance
Assessment tools capture student response to short, standard, common tasks | Assessment tools capture considered judgements of teachers and students about performances on authentic, non-standard tasks with agreed design features
Fairness and rigour from ‘objectivity’: no human error or influence | Fairness and rigour from aggregating multiple human judgements
Standards established by setting pass/fail cut points or scores | Standards established via calibrated descriptions of behaviours associated with progressive development of expertise (progressions), providing a line of sight for development over age and stage of schooling
Scorers/assessors blind to learner identity | Learner known and understood by assessors
Fairness resides in standardisation of context, content and scoring | Fairness resides in validity and inclusion: fitness of assessment to context and interest
Teacher agency dominates | Learner agency maximised
Generates reliable ranking | Establishes reliable position on a described scale of progressive standards (developmental)
Reporting uses numeric scores in subjects | Reporting via profiles of learner strengths and patterns of competence, plus portfolios
Moderation via specification of content, plus examination and/or standardised testing | Moderation via integration of professional, statistical and process quality requirements

A key to this next-gen approach is how standards are defined. For each competency, four or five empirically validated levels of competence are defined for three or four stages of learning as children progress through schooling, with each level describing the typical pattern of behaviour that distinguishes different levels of expertise. The behaviour needs to be defined so as to be readily observable and interpretable by teachers, parents and employers. Teachers find these ‘progressions’ informative for teaching purposes. Learners use them to become aware and active participants in their own learning.
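
To make the shape of such a progression concrete, the following is a minimal sketch in Python, assuming a simple data structure of our own devising rather than MM’s actual implementation; the stage name and level codes are hypothetical, and the three level descriptors are borrowed from the ‘Being persistent’ row of Table 2 purely as placeholders.

```python
from dataclasses import dataclass

@dataclass
class Level:
    # One calibrated level in a progression: an observable pattern of behaviour
    # that teachers, parents and employers can recognise.
    code: str
    descriptor: str

@dataclass
class Progression:
    # A competency progression: ordered levels (low to high) for each stage of schooling.
    competency: str
    stages: dict[str, list[Level]]

    def descriptor_for(self, stage: str, code: str) -> str:
        # Look up the behavioural description behind a reported level.
        for level in self.stages[stage]:
            if level.code == code:
                return level.descriptor
        raise KeyError(f"No level {code!r} defined for stage {stage!r}")

# Hypothetical example: 'Middle years' and the P1-P3 codes are invented; the
# descriptors are taken from the 'Being persistent' row of Table 2 for illustration.
persistence = Progression(
    competency="Being persistent",
    stages={
        "Middle years": [
            Level("P1", "Stays on task when supported"),
            Level("P2", "Stays on task until something is finished"),
            Level("P3", "Persists despite challenges and setbacks"),
        ],
    },
)

print(persistence.descriptor_for("Middle years", "P2"))
```

In practice, a learner’s position on such a scale is established by aggregating multiple judgements against the descriptors (see Table 1), rather than by a single test score.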

There is a challenge for school leaders in all this, even when next-gen tools are to hand. Teachers need to learn about this approach, work with their colleagues across subjects, and craft learning designs, pedagogy and assessment practices that enable learners to learn, practise and demonstrate competence. The challenge for school leaders is to ensure that all staff are on board, and that all aspects of school organisation and design are aligned.

There is a further challenge, this one for system authorities. Summative assessments used for external reporting need sufficient precision, reliability and comparability to generate measures of growth in learning from year to year, and to secure the trust of all stakeholders with an interest in a learner’s attainments, including parents, employers and university selectors (Milligan et al., 2023). The robustness of assessment of competence depends on a range of factors: the usability and psychometric quality of the assessment instruments; the level of understanding of the applicable standards among teachers, students and stakeholders; trust in the comparability of the results; the authenticity of the opportunities that students have to demonstrate what they can truly do; the opportunity for assessors to observe these; and the utility and predictive validity of the reports to learners and stakeholders in the school and beyond.

At present, the MM research team is working with a range of organisations and schools to design what amounts to a next-gen credentialling and regulatory system – one that supports the development of trust in assessment and reporting of general capabilities conducted in schools. The approach is creating interest among tertiary institutions in Australia that seek trusted information about learners, broader than the purely academic, to support better matching of candidates to places in their courses.

Systematic and careful application of this next-gen approach can be seen emerging in a range of initiatives in Australia (SACE, 2022; BPLA, 2022; Melbourne Metrics, 2024), in the UK and elsewhere (for example, Lucas, 2022; Milligan et al., 2022).

A validated assessment architecture 

A provider using this next-gen approach, whether a school or a school system, typically has its own framework of learner competencies against which to report, often the result of years of consultation with its community. For instance, the Australian national curriculum includes general capabilities such as intercultural understanding and ethical understanding (ACARA, 2023). The South Australian Board of Secondary Education is working with a framework that includes competencies such as personal enterprise, principled action and collective engagement. The International Big Picture Learning Credential is based around five learning goals, which include knowing how to learn, empirical reasoning, social reasoning and quantitative reasoning. There are dozens of such frameworks, appearing on the surface to be different.

A challenge for providers is to report a student’s attainment against their own framework while ensuring that any external stakeholders can interpret, and sometimes compare, the level of attainment of young people from different providers. This is important when secondary school teachers seek to understand the strengths and needs of primary students transferring to their school from across a region, or for tertiary education recruiters seeking to be fair to candidates from different schools.

A solution to this problem is to use a common assessment architecture when reporting on general competencies, as exemplified by the MM assessment architecture now used across a range of different jurisdictions and programmes in Australia. The architecture establishes a common set of elements of competence, from which assessments suited to any framework can be operationalised. The approach capitalises on the phenomenon, noticed by researchers, that although competency frameworks might differ, the underlying knowledge, skills, attitudes and values that typically make up these competencies do not (Fadel, 2019). Expertise in any competency derives from the capacity of a person to seamlessly and unconsciously integrate particular elements in practice, and to apply them to any performance.

For instance, Table 2 illustrates how a competency called agency in learning, used in an MM programme, has been conceptualised as an amalgam of 12 elements of competence, drawn from a larger set of over 30 elements identified and defined by the MM research team. Any construct can be conceptualised with its own small, unique set of elements drawn from the larger set. Using this architecture provides a common language for the profession and for learners and other stakeholders, and a framework for building assessment tools that are broadly interpretable and comparable, and have utility beyond the local. Related resources for teachers and students build understanding of the elements of competence and what they look like in learner behaviour. This is not a rubric or a learning progression, but rather an example of the guidance that teachers find useful when building their understanding of a particular competence.

Table 2: Elements of agency in learning, broadly illustrating the nature of growth in the elements 
Element of agency in learning | Definition | Low expertise → High expertise
Acting with courage | The ability to act in the pursuit of a worthwhile goal in the face of adversity or uncertainty | Acts safely to protect self → Acts even if it means feeling uncomfortable → Risks negative consequences
Being a producer | The ability to produce things (performances, materials, products, works, experiments, designs) to develop knowledge or skill | Produces as asked or directed → Produces things to get to a finishing point → Continuously produces and reproduces, experimenting to extend skill or knowledge
Being open to the new | The ability to embrace new ideas, experiences and ways of doing things | Seeks the stability of the familiar → Open to incremental change → Seeks out new ideas, territories, contexts and methods
Being persistent | The ability to be tenacious and determined in pursuit of a goal | Stays on task when supported → Stays on task until something is finished → Persists despite challenges and setbacks
Being reflective | The ability to evaluate actions or thoughts for improvements | Accepts existing thinking or action → Evaluates thinking or action when things go wrong or do not match expectations → Takes lessons from all experiences and reflects to improve
Comprehending | The ability to interpret meaning from diverse communication media (spoken, non-verbal, written, performative or other media) | Understands literal meaning and explicit instruction → Interprets meaning from complex messaging or explicit instruction → Interprets context-dependent, nuanced meaning that may be implicit
Engaging in dialogue | The ability to engage in dialogue to connect to others, explore or negotiate meaning | Conducts transactional interactions → Connects with others to share ideas, information or feelings → Uses dialogue to deepen understanding, interpret ideas, negotiate and explore points of view, and develop new or novel perspectives
Managing ambiguity or uncertainty | The ability to operate in the face of unknowns or uncertainties | Seeks certainty → Accepts uncertainty or unpredictability as normal and unavoidable → Turns uncertainty into constructive experiences for growth
Pursuing goals | The ability to commit to social, personal or community goals | Recognises or sets goals → Works towards achieving goals → Demonstrates drive, focus or aspiration; goes above and beyond
Seeking depth in knowing and knowing how | The ability to seek deep knowledge and know-how in a domain | Focuses on gathering facts and surface skills → Inquisitive and investigative about the ‘why’ or ‘how’ of phenomena in the domain → Probes to understand and critique how knowledge and know-how are produced
Taking responsibility for others | The ability to support others to achieve their own or shared objectives | Works independently of others → Is helpful and assistive when asked → Takes the initiative to assist; is a recognised source of support and service
Using feedback | The ability to seek feedback for improved performance | Makes adjustments as instructed → Seeks feedback and input from authoritative figures such as parents, teachers and experts → Seeks and weighs feedback from a diverse range of interested sources in a community
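
To make the idea of a common architecture concrete, the following is a minimal sketch in Python, assuming a simple representation of our own devising rather than MM’s actual implementation. The element names and definitions are taken from Table 2, and ‘personal enterprise’ is borrowed from the South Australian framework mentioned above, but the particular subsets of elements assigned to each competency here are hypothetical.

```python
# A shared bank of elements of competence (names and definitions from Table 2;
# a full bank would hold the 30+ elements mentioned in the paper).
ELEMENT_BANK: dict[str, str] = {
    "Being persistent": "The ability to be tenacious and determined in pursuit of a goal",
    "Pursuing goals": "The ability to commit to social, personal or community goals",
    "Using feedback": "The ability to seek feedback for improved performance",
    "Acting with courage": (
        "The ability to act in the pursuit of a worthwhile goal "
        "in the face of adversity or uncertainty"
    ),
    # ... further elements omitted
}

# Competencies from different frameworks, each expressed as a subset of the same
# bank. These particular mappings are illustrative only, not the actual MM ones.
COMPETENCIES: dict[str, set[str]] = {
    "Agency in learning": {"Being persistent", "Pursuing goals", "Using feedback"},
    "Personal enterprise": {"Acting with courage", "Pursuing goals"},
}

def shared_elements(a: str, b: str) -> set[str]:
    # The elements two differently named competencies have in common, which is
    # what allows results reported against one framework to be interpreted
    # (and, where appropriate, compared) against another.
    return COMPETENCIES[a] & COMPETENCIES[b]

print(shared_elements("Agency in learning", "Personal enterprise"))
```

Because every competency resolves to elements drawn from the same bank, assessments and reports built on those elements remain broadly interpretable across providers, even where the framework labels differ.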

Conclusion

The approach to next-gen assessment and recognition described in this paper is spreading because schools see assessment as a powerful determiner of what is valued. Further, school leaders in these schools see reform of assessment and reporting as a lever with which to better align learning design, pedagogy and organisation in the school in the interests of improving learning.

This paper describes some of the key characteristics of next-generation assessment and recognition of general transferable competencies, and the new tools, methods and techniques required. A defining feature of the approach is that the diverse constructs underpinning the competency frameworks used by schools can be conceptualised through a common architecture of elements of competence. This common architecture provides the basis for a common language, common calibrated standards and a common understanding of the quality assurance required to generate trust in results from external stakeholders. The approach is now used in many Australian schools. Educators involved see themselves as designers of the learning needed for young Australians to thrive at school and beyond, supported by a fit-for-purpose assessment and recognition approach.
