Remarks by SPC’s Educational Quality and Assessment Programme (EQAP) Director Dr Michelle Belisle at the Launch of the Pacific Islands Literacy and Numeracy Assessment (PILNA) Report 2021



Her Excellency Charlotte Darrow, New Zealand High Commissioner; Counsellor Sophie Temby, Australian Government Department of Foreign Affairs and Trade; Miles Young, Director HRSD of the Pacific Community; representatives of ministries of education and foreign affairs; diplomatic posts, including DFAT and MFAT posts; partners from regional organisations; invited guests, ladies and gentlemen.

I welcome you all to the Pacific Islands Literacy and Numeracy Assessment report launch, marking the release of the PILNA 2021 results. The report you will hear about this afternoon is made possible through the generous support of the Australian Government and the New Zealand Government, through the Ministry of Foreign Affairs and Trade Aid Programme, and with technical support from the Australian Council for Educational Research.

The overarching purpose of PILNA as a long-term Pacific-wide regional assessment is to generate cognitive and contextual data that can be used to facilitate ongoing collaborative efforts to monitor and improve learning outcomes for children in Pacific Island countries.

The PILNA programme represents a commitment by Pacific Island governments and development partners to monitor the outcomes of education systems by measuring student achievement on a regular basis and within an agreed common framework. By building capacity through collaborative involvement of country representatives, the PILNA programme helps to strengthen learning assessments, standards and policies, while also supporting improvement in teaching and learning across the Pacific region.

In 2015, countries agreed to focus on six guiding principles to achieve the purpose of PILNA, and those principles continue to underpin the reporting for this fourth cycle of PILNA.

  1. PILNA content: The skills and concepts that form the content of PILNA are guided by the definitions and indicators outlined in the 2016 Pacific Regional Benchmarks for Literacy and Numeracy.
  2. Recognition of the value of good literacy and numeracy skills: The PILNA programme promotes the importance of literacy and numeracy skills as building blocks for children’s future learning opportunities. It also empowers citizens to communicate effectively, to make informed decisions and to take active control of their future.
  3. Assessment methodologies and types of data: The PILNA programme continues to improve its assessment methodologies to provide reliable and valid cognitive and contextual data at the regional and country level.
  4. Monitoring purpose of PILNA: The data generated from PILNA enable the monitoring of student learning in literacy and numeracy. Additionally, PILNA enables collection of background data on students, teachers and schools at regional and country levels.
  5. Intervention as the added value for countries: The PILNA programme adds value for countries by enabling them to use regional and country-level data as evidence of student learning achievement for the development of targeted intervention strategies.
  6. Collaboration among stakeholders for good quality data: The PILNA programme is designed in such a way as to enable a range of data collection with strict adherence to technical standards. Collaboration among organisations and governments is a critical feature of PILNA administrations.

The presentation this afternoon will highlight the results of PILNA 2021 in light of these guiding principles.

PILNA is focused on literacy and numeracy as defined collaboratively by regional experts in the regional benchmarks.

Literacy, in this context, refers to the “knowledge and skills necessary to empower a person to communicate through any form of language of their society and the wider world, with respect to all aspects of everyday life.”

Similarly, numeracy refers to the “knowledge and skills necessary to empower a person to be able to use mathematical processes, as well as the language of mathematics, for a variety of purposes, with respect to everyday life.”

While not an assessment of curriculum, as each participating country has its own curriculum and ways to measure student learning against that curriculum, the regional benchmarks in reading, writing and numeracy are built on shared expectations. The benchmarks represent consensus among curriculum experts from 15 participating countries with respect to what students should know and be able to do at the end of two, four, six and eight years of formal education.

Under the 2018 Pacific Regional Education Framework (PacREF), one of the four key priority areas is student outcomes and wellbeing. PILNA provides a tool to track student outcomes at two key levels: the end of four years and the end of six years of formal education. PILNA is also a set of instruments to collect and provide a wide range of data to help education systems around the region to understand student performance, the issues those systems face and possible ways in which to address those issues.

PILNA is one of nine cross-national learning assessments identified by the UNESCO Institute for Statistics that meet the criteria to measure SDG 4 Indicator 4.1.1, the proportion of children and young people achieving minimum proficiency in reading and mathematics.

The Nadi Declaration, coming out of the Conference of Commonwealth Education Ministers (CCEM) in February 2018, also highlights the need for member states to pay close attention to literacy and numeracy outcomes for students, and to address the gender gaps that exist in the performance of boys and girls. PILNA is one tool that highlights student performance overall and brings focus to performance differences between subgroups of students – by gender, by geography and by school authority.

Common across all of the various declarations, frameworks and strategies is the desire to understand and respond to challenges at all levels and to improve the quality of literacy and numeracy education for students in the Pacific region.

It is anticipated that students who meet or exceed the minimum expectations for their year level are well equipped to succeed in the learning opportunities provided in their formal education and to interact effectively in everyday life wherever literacy and numeracy are involved. Particularly important is the development of the critical thinking, problem-solving and decision-making skills that are part of both literacy and numeracy.

The reporting metric that was constructed for PILNA in 2016 was designed to achieve two main goals: first, to provide descriptions of what students can do at various points along the metric; and second, to show results in a way that can be interpreted consistently across all participating populations. This means that results can readily be compared across different parts of each country’s population (for example, across students from urban and non-urban areas, or between girls and boys). National results can also be compared with the average achievement across the region. The proficiency levels were developed during the analysis and reporting phase of PILNA 2015 to provide a consistent comparator for PILNA results across multiple cycles. In the coming months, proficiency levels for the new writing scale will also be developed in a similar way.

Students at each level can demonstrate the skills described at that level with guidance from the teacher, and are likely to demonstrate the skills of the preceding lower levels independently.

In 2020, three key enhancements to PILNA were introduced at the field trial stage to allow greater precision in describing students’ performance and to make PILNA more inclusive. The first of these is a block design approach that allows a greater number of items to be included in the assessment by having several different assessment booklets, each containing two blocks of items. By increasing the number of items in PILNA 2021, we are able to report more specifically on what students at the low end of the scale can do, and we have also been able to enact the second enhancement – splitting the literacy scale into two individual scales, one for reading and the other for writing. This allows us to report more clearly on which aspects of literacy students perform well and which pose more challenges for students and teachers in the region. Finally, 2021 marks the first time that PILNA has included accommodations for students with special needs, such as provision of large-print materials and accommodation for students needing extra time and/or quiet space to participate in the assessment.

As with all previous PILNA cycles, the main study in 2021 was not an isolated effort within a single calendar year. From item development through to reporting, a PILNA cycle is three years in duration. Item development was carried out in December 2019 in advance of field trials set for 2020. The onset of the COVID-19 pandemic and related restrictions posed challenges for the field trials, but the consensus was to carry on with them to ensure a PILNA 2021 main study would be possible. As a result, we were able to trial our mitigations for restrictions as well as the items and booklet design. We couldn’t have known then how valuable that would be. In 2021, the Steering Committee took the decision to proceed with the PILNA main study in spite of the challenges being faced. With our experience from the field trials in mind, we were able to carry out the administration and coding of the 2021 main study with in-country support from the ministries and departments of education, coupled with remote virtual support from the EQAP team and our colleagues at the Australian Council for Educational Research. A welcome result of this approach was the increased capacity of ministry personnel to carry out the logistics of administration and coding of PILNA, putting more ownership of the processes in the hands of the participating countries.

The timing of the PILNA 2021 main study also made it possible for the team to develop, trial and administer questionnaire items that specifically focused on disruptions to education in Pacific Island countries. Although the pandemic is one major source of disruption during the PILNA 2021 period, other challenges such as epidemics, natural disasters and climate-related events have also disrupted education around the region in the past three years. For the first time, we have students, teachers and school leaders reporting on how those disruptions affected teaching and learning and can relate that information back to the reading, writing and numeracy results.

As in the PILNA 2018 cycle, coding and scoring were the two methods used to assess students’ test responses. Student responses were first coded, meaning they were assigned to pre-defined response categories. Scoring occurs when a code is assigned a quantitative value (a score).

The advantages of a coding scheme are that additional information can be captured about student responses and that it provides information on why some incorrect choices (or distractors) are selected by students more often than others.

It is important to note that coders are asked to observe and record what the students have given as responses; they are not asked to mark the question as correct or incorrect. That process, called scoring, occurs afterwards, when values are assigned to particular codes to give full, partial or no credit for specific responses.

It is the coding data that is useful to teachers and ministries of education to better understand where students are in their learning and how best to intervene to fill gaps and correct misconceptions that get in the way of student achievement.

To help understand the context in which students are learning, we asked students to tell us about themselves through a student questionnaire. From the questionnaire data, we found that

  • Most students (80%) who participated in PILNA 2021 attended early childhood education (ECE) for at least one year. Similar proportions were observed across year four and year six students, and performance tended to be higher for year six students who attended ECE than for those who did not.
  • Students who performed better in reading tended to use the language they were assessed in across more settings. This suggests that using a language in everyday conversation may improve students’ reading ability in that language.
  • Students from wealthier households had higher average performance on the PILNA assessments. This was observed across all domains – numeracy, reading and writing – and across both year levels. These differences were larger at the year six level than at the year four level.
  • About half the students in the region frequently receive support from their caregivers in the form of homework help, guidance and encouragement. There is evidence that support from caregivers is associated with students’ performance in numeracy and reading at both year levels: students who met performance expectations in these areas had higher levels of caregiver support.
  • A high proportion of students (over 80%) in both year levels enjoy reading, writing and mathematics and identify them as important. Students who met the expected performance in a subject area had higher attitude scores for the subject and for school in general.
  • Only half the students in the region are, at least most of the time, cheerful, have good days and look forward to the next day. Many students in the region regularly experience positive well-being, but a substantial proportion do not. About one in five students across the region frequently experience challenges to well-being, such as feeling hungry, tired or upset, or not having enough friends.

Similarly, to understand the context from a teaching perspective, we asked teachers to tell us about themselves through a teacher questionnaire. From the questionnaire data, we found that

  • Most students have teachers who agree they have enough time to complete lessons in mathematics, reading and writing. However, a large proportion of students in both year levels have teachers who think they do not have enough time to work with slow learners. About one in three learners in Pacific schools may not get the teacher support they need if they fall behind or are slow to absorb new concepts.
  • Teachers identified behavioural and cognitive difficulties, observed in a substantial minority of their students, as the most common challenges to learning.
  • A high proportion of students in the region have teachers who are confident in teaching literacy and numeracy. In literacy, more students have confident teachers in areas that are structured and rule-based, such as spelling, punctuation and vocabulary. Fewer students have confident teachers in areas that require more subjective or complex teaching and assessment, such as quality of ideas and organisation and structure in writing.
  • A high proportion of students in the region have teachers who are experiencing stress in their job and feeling overwhelmed by it. About half the students have teachers who reported not having enough time in their personal life for managing their well-being.
  • High proportions of students had teachers who were satisfied with and proud of their jobs. The findings suggest that teachers over 35 years of age tended to have higher levels of job satisfaction than teachers under 35.

School leaders also provided information on factors that impacted both teaching and learning in their respective schools.

  • On average, two-thirds of students attended schools that closed for pandemic reasons. Countries experienced different levels of school closures across the categories collected (pandemics, epidemics, natural disasters and other reasons).
  • On average, 51% of PILNA 2021 students attended schools where learning materials were made available for pick-up from schools during closures. A further 21% of PILNA 2021 students attended schools where this measure was in place but not implemented.
  • 40% of students attended schools where learning materials were delivered physically to a student’s home. A further 18% of students attended schools where this measure was available but not implemented.
  • While communication with students and parents via social media and/or email was available to 36% of students, half of the students from PILNA 2021 (50%) attended schools where the measure was not available.
  • Measures to provide non-physical learning materials – emailed materials, materials available on websites, materials broadcast on radio, and materials broadcast on television – were also implemented by schools. However, only about one in five students attended schools where these measures were implemented.

The overall findings of PILNA 2021 show that across the region, on average, year four students are not meeting minimum expected performance levels in reading, with only 43% of students meeting or exceeding that benchmark level. On average, year four students are exceeding minimum expected performance levels in numeracy, with 67% of students meeting or exceeding the benchmark. Expected levels of performance have not yet been established for writing, but the average writing performance score is increasing.

A gender gap remains at year four, with approximately 10% more girls than boys meeting or exceeding minimum expected proficiency in both numeracy and reading. Girls’ writing performance at year four also exceeds that of boys.

Across the region, on average, year six students are exceeding the minimum expected performance level in numeracy, with 72% of students meeting or exceeding the benchmark level. On average, year six students are meeting the minimum expected performance level in reading, but only just, with 53% of students meeting or exceeding the benchmark level. Expected levels of performance have not yet been established for writing, but the average writing performance was about the same as in 2018.

At year six, the gender gap in numeracy is similar to that seen at year four, with 77% of girls meeting or exceeding the minimum expected proficiency level but only 67% of boys. In reading, however, 60% of girls meet or exceed the minimum expected proficiency level while only 46% of boys have reached or exceeded that level. In writing, girls continue to outperform boys, as has been the case in past PILNA cycles.

As has been the case in previous PILNA cycles, students have challenges with complex and critical thinking items, inference and interpretation, providing explanations, and developing a simple story or idea.

As captured through contextual questionnaires and evidenced in the cognitive results, high levels of stress for students, teachers and principals are having an impact on teaching and learning.

Responses to disruptions to education in the region remain physical in nature and are not equally accessible to all students, indicating that countries are not yet ready to cope with remote learning.

Priority areas for intervention: reading and formative assessment 

SPC-EQAP have identified two priority areas that apply across all recommendations: support is needed for student reading performance and for formative assessment practices. These areas can have substantial effects on student performance.

Student reading performance is a priority for intervention due to its underperformance in PILNA 2021 and its impacts on a diverse set of learning outcomes. Students require reading ability to not only engage with texts but also to understand problems and succeed in all subjects. The relatively low reading scores from PILNA 2021 may, therefore, result in compounding impacts on performance for this cohort of students. If left unaddressed, lower reading scores now may cause lower performance, in all subjects, as students progress through school. Targeted interventions for reading performance are needed from a household level to a governmental and regional level.

Formative assessment practices are also a priority for the region; evidence has shown these are one of the most impactful strategies for improving student performance, in all areas. Formative assessment can be described as frequent and informal assessment of student learning that identifies learning needs and allows teachers to adapt their teaching accordingly. Some consider this as ‘checking in’ with students on their understanding and identifying where extra support or teaching is needed. These practices can ensure a greater number of students understand new concepts and that slower learners are identified and can receive required supports. Given the number of students who are experiencing learning difficulties, formative assessment practices could have benefits across the region. These practices are powerful tools that can also improve teaching and learning for whole classes as well as individual students. It is recommended that regional entities, ministries of education, teacher education institutions, school leaders and teachers strengthen, develop and utilise formative assessment practices.

The digital interactive report that we are launching today is an exciting new development in the PILNA programme. Thematic and interest-based pathways in the report allow users to pursue the information that interests them the most and dig more deeply into the specifics revealed in the main study. Additionally, specific, targeted recommendations are articulated in the regional report that can inform action in the short and medium term. Three overarching recommendations emerge from PILNA 2021 in addition to those targeted recommendations:

1.     Use PILNA data effectively

Some regional findings from PILNA 2021, such as those on student performance, learning environments and wellbeing challenges, may justify immediate actions or interventions.

  • PILNA 2021 provides a wealth of data for stakeholders to review that could not be covered comprehensively in this regional report. Stakeholders are encouraged to explore this data and create additional regional understanding. 
  • Ministries of education and development partners are encouraged to use national data in ways that meet their local needs and use the additional understanding to implement actions for change where needed. 
  • Sharing country level data can enable new analyses of local circumstances and can help to share the task of understanding the large amounts of data. As PILNA data is broad, it may be valuable to many stakeholders, even beyond education. Ministries and departments of education are also encouraged to share national findings with all groups who can use them to create positive change such as teachers, education institutions, regional entities, and community organisations. 
  • Comparing PILNA data to wider data sets can give perspective to the results and allow for the PILNA findings to be understood in a wider context. Integrating PILNA data with existing or future data may enable broader analyses to be done and wider conclusions to be drawn. 

2.     Enable supplementary data to be collected and used between PILNA cycles 

  • PILNA findings may highlight areas, such as student underperformance, that require more evidence to understand or to justify interventions at a local level. These should be identified, and ministries should plan for strategies to monitor and evaluate these outcomes over time. 
  • Similarly, PILNA findings may highlight areas, such as student underperformance, that require further monitoring and evaluation to inform interventions or to better identify changes over time and as a result of targeted interventions. Where this is required, stakeholders should implement activities to collect, analyse, and disseminate this information to appropriate audiences. 
  • Formative assessment approaches should be both encouraged and supported at regional and national levels. These enable teachers and other educators to more closely monitor student performance over time and provide additional supports and learning more flexibly where needed. 

3.     Use efficient and scalable solutions to lift student performance and create positive changes

  • Educators and other stakeholders should collaborate to identify possible activities to address any identified student underperformance at local and regional levels. These should be implemented in a manner that is inclusive, equitable, and considers local and traditional approaches to education. 
  • The contextual data that PILNA 2021 collected gave valuable insights into the experiences of these groups and their environments. Some of the challenges that were identified may impact student performance; others may have impacts on wellbeing and the effective functioning of education systems. Stakeholders should collaborate to understand these issues at local and regional levels and implement actions to create positive change where necessary. 

The presentation today barely scratches the surface of the information gathered and analysed in the PILNA 2021 main study. More detail, specific recommendations and breakdowns of the results by gender and over time can be found in the digital regional report that is being launched today. National reports will follow in the coming weeks and will go even further in terms of specifics based on each country’s context.

Following the release of the results, EQAP officers will be travelling to participating countries to share national results in workshops and discussions with senior education officials, ministry officers, school leaders, teachers and national education stakeholders.

In mid-November, a regional data mining workshop is being held in Fiji. In that workshop, as was the case in 2019, country-nominated representatives will be guided by ACER and EQAP officers in exploring their national PILNA datasets and using the data to pose and answer questions that could inform decision makers on system level interventions to support student learning.

Over the coming months, additional research using the PILNA data will be undertaken to add to the growing body of knowledge around student learning outcomes and wellbeing.

We look forward to working with the ministries and departments of education, development partners and education stakeholders to make the most of the valuable data resource available through PILNA.
