How we measure success

Due to the complex nature of productivity issues, the influence of our work generally emerges only over long timeframes. Identifying changes in productivity performance or wellbeing that can be directly attributed to our work, as distinct from the many other factors that influence productivity performance, can be challenging.

The nature of our role means that the inquiry topics we undertake, and the approaches we use, which are defined in part by referring Ministers through Terms of Reference, can vary. This makes year-to-year comparison of some performance measures challenging. Our service performance is outlined in the “Our performance this year” section (pages 11 to 14 and 26 to 36), and our end-of-year reporting requirements under the Estimates of Appropriations 2022/23 (Finance and Government Administration Sector) are outlined on page 39.

Disclosure of Service Performance Reporting judgements

In determining key service performance information for our intended impacts and outputs, the Board has used judgement based on our purpose as embodied in the New Zealand Productivity Commission Act 2010, our vision, and the intended contribution of our impacts and outputs to achieving the outcomes we seek. The performance measures selected are at the discretion of our Board.

The “Our performance this year” section reports against the performance measures contained in the Statement of Performance Expectations 2022/23. Service performance information in this section is presented in accordance with PBE FRS 48.

Performance measures for our impacts and outputs have been selected for our key activities. They range from short-term to long-term (see Figure 1) and are described in more detail below.

In selecting measures, we have made judgements to determine which aspects of performance are relevant and material to readers. These measures also inform our internal management and decision making.

Our measurement methodology

The material judgements we apply to assessing and reporting on our impacts and outputs are specific to the assessment method:

Monitoring of media

  • References to the findings and recommendations in our reports and research, and mentions in Hansard, indicate the role of our work in generating discussion and debate. Mentions by third parties also indicate the level of public and political discussion and debate on our work.
  • We use the media monitoring service provided by Fuseworks to monitor third-party commentary (online and print, including Hansard) on our work and the Commission in general.
  • Fuseworks provides all mentions of the Productivity Commission based on keywords we identify. We manually check the Fuseworks report weekly to ensure mentions relate to the Commission and our work. We then assess and tag each mention as it relates to our outcome, impact and output measures. Our communications team use their judgement to assess the sentiment of commentary made on our work.
  • We use judgement to report information that gives an accurate and insightful representation of commentary or mentions, including their nature and source.

Monitoring and review of formal Government responses

  • Following the delivery of our final inquiry report to referring Ministers, we expect to receive a formal response from the Government indicating the degree of agreement with our recommendations. The response will also indicate the level of commitment to exploring or implementing our recommendations, which may lead to policy change.

Monitoring of milestones

  • The monitoring of milestones for our outputs demonstrates our ability to effectively manage our processes, meet deadlines and meet the expectations of our stakeholders.
  • For inquiries, our key milestones (the delivery of a draft report and a final report) are set by referring Ministers in the Terms of Reference. Intermediate milestones are then developed based on the scope described in the Terms of Reference, our knowledge of past inquiry delivery, our available capacity and the likely capacity of key stakeholders. These are approved by the Commission’s Board.
  • We publish milestones for our key outputs on our website and display our progress in achieving them during the inquiry process and following completion.

Survey

  • Evaluation using a survey provides us with both quantitative (through Likert-scale responses) and qualitative (through open-text questions) evidence of our effectiveness against our impact and output measures. The survey also allows us to gather a large amount of data from the stakeholders most involved in the inquiry, who are therefore in the best position to provide feedback on performance and impact.
  • Answer options used for the Annual Report take the form of Likert scales, often with six options: two positive, one neutral, two negative, and a “Don’t know” option. Exceptions to this are the initial qualifying questions gauging the respondent’s organisation type and their involvement in the inquiry. We also include an initial question on whether the inquiry has increased the respondent’s understanding of the topic, which is required for our impact measure on policy change (increasing understanding).
  • We run the survey through our Survey Monkey premium account, which enables us to design, operate, monitor and analyse the survey results.
  • Our measures specify which aggregated responses to report from each survey question. No judgement is applied to the interpretation of this data.
  • To evaluate our A Fair Chance for All inquiry, we wrote a survey featuring questions based on:
    • the measures and answer options as stated in our Statement of Performance Expectations 2022/23, and
    • the survey questions asked in the previous inquiry evaluation (Immigration) to ensure effective comparison.
  • We invited all inquiry participants from our database to complete a survey; this included participants who had made a submission or were actively involved in the inquiry. We had a response rate of 12% (146 responses from 1231 invitations). We have included a comparison with the results from our survey of the 2022 Immigration inquiry, which had a response rate of 35% (100 responses from 289 invitations). The A Fair Chance for All survey had a lower response rate because the nature of the inquiry attracted a broader range of interested parties, so the survey was sent to a far wider group of stakeholders.
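
As a quick check of the arithmetic behind the quoted response rates: 146 / 1231 ≈ 11.9% (reported as 12%), and 100 / 289 ≈ 34.6% (reported as 35%).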

Expert review

  • The expert review provides the Commission and its stakeholders with an independent view on where the inquiry performed well, and where there is room for improvement in future inquiries.
  • The primary evaluation frame for the expert review comes from the Commission’s six output measures. We exercise judgement in selecting summary comments that best address the measures and provide insight to the reader.
  • The A Fair Chance for All evaluation is the first time the Commission has brought together the traditional evaluation components (expert review, focus groups and survey) into one combined report. Previous inquiry evaluations delivered the review, focus group and online survey components separately. Commissioning the evaluation in this way was intended to enable greater triangulation and synthesis of findings across the various data sources, with a view to eliciting richer commentary and more robust, usable recommendations for future quality improvement.

Focus group

  • Focus groups allow us to gather qualitative data from key stakeholders and delve into the reasoning for their views on our work. The focus groups are run by independent consultants as part of the inquiry evaluation, to ensure results are captured without bias.
  • We exercise judgement in selecting summary comments from the focus groups that best address the measure and provide insight to the reader.
  • For the A Fair Chance for All inquiry, our evaluation consisted of nine interviews and two focus groups with a total of 17 participants across both groups. Focus groups were designed to capture different types of conversations. The first group focused on academics/subject experts and community sector representatives. The second group focused on public sector professionals.
  • The results from this evaluation are compared with those from the focus groups for the 2022 Immigration inquiry, which collected feedback from 12 people representing industry groups and other stakeholders who were actively involved in the inquiry process. That focus group process included individual interviews and two small group sessions.

To ensure a robust approach to assessing and reporting on our performance, we use the qualitative and quantitative methods described above, often in combination.

For the reporting year 2022/23, there were no constraints on collecting performance information.

The Commission sets targets for performance measures based on historical performance, taking into account factors that may affect future performance and opportunities for improvement. As such, future performance may differ from budgeted performance.

We review our performance measures each year. Any proposed changes are approved by our Board and outlined in our Statement of Performance Expectations for the following reporting period.

Our impact indicators

To support our aspiration to influence the behaviour of government, industry, and communities through our work, we look for evidence of our impact against three indicators.

Policies and behaviours change as a result of the Commission’s work. A greater understanding of our work leads to better uptake and implementation of our recommendations. This contributes to better decision making on the policies and programmes that could lead to improved productivity and wellbeing.

Generating discussion and debate. Wide-ranging discussion and debate by diverse voices is more likely to influence decision makers. Our reporting looks at evidence of our work being used by influencers, particularly those providing commentary on, or input into, policy and how and where our work is cited in those discussions.

Levels of engagement with, and responses to, our work. We look for feedback and mentions of our organisation that indicate our work plays a role in increasing the quality of analysis and advice on productivity-related topics and issues.

Our output measures

Our approach to performance measurement includes six output measures.

Right focus – the relevance and materiality of our inquiry and research reports.

Good process management – the timeliness and effectiveness of our processes.

Effective engagement – the quality of our engagement with interested parties.

Clear delivery of message – how well our work is communicated and presented.

High-quality work – the quality of our analysis and recommendations.

Overall quality – the overall quality of the work considering all factors.

Figure 1 How we measure the impact of our work
