
3. Findings

In general terms, the Inquiry was welcomed by a broad range of stakeholders across Aotearoa New Zealand. The topic was viewed as important and the Inquiry findings were of high interest to many people. The analytical frames used, and the breadth and depth of research commissioned to support the Inquiry, were overall seen as high-quality, robust and meaningful. This included support for te ao Māori frames and those that place whānau at the centre of analysis and advice. Particularly amongst the community sector, including some Māori and Pacific organisations, the Final Report is already being shared and used.

The wide approach to engagement, particularly at the beginning of the Inquiry, was valued by many stakeholders, and seen as a particularly effective way to work within an Inquiry that dealt with people who are persistently left out or do without. The new initiatives trialled by the Commission were overwhelmingly supported and considered valuable. However, some stakeholders who often make use of Commission inquiry findings did not always understand or appreciate the trade-offs this required, such as not publishing an issues paper because the consultation on the Terms of Reference had broadly served that purpose instead.

Structural and personnel changes at the Commission had a direct effect on the Inquiry, an impediment articulated by nearly every key respondent. This was amplified by other process obstacles, such as technical malfunctions and insufficient planning in some areas.

Although some people bemoaned the decision to focus on the public management system for the Final Report, most people supported this, or at least understood the rationale for the decision. Some of the qualms expressed by respondents concerning this focus reflected differing theories of change, in terms of when, where and how policy change can most easily be articulated and have impact.

The clarity of message was good or excellent for most people. Even for those who disagreed with the focus on the public management system, the narrative coherency of the Final Report was generally viewed positively.

As context to the following discussion on specific performance measures, it is helpful to remember the basics. The Inquiry delivered an insightful, comprehensive, and well-researched report within the timeframes agreed. The report provides a thorough and robust reference document to inform policy making and social change on the drivers behind persistent disadvantage, and the public sector mechanisms that can be considered for reducing it.

3.1 Performance measures

The following sections concerning performance measures are laid out as follows. First, the key qualitative findings for each performance measure are presented. More detail on these key findings, along with additional common findings, is then presented according to three types of feedback (intent and methodology; clarity and execution; useability). The coverage of themes differs between performance measures, in order to best capture the relevant data for each measure.

These primary qualitative findings are contextualised with relevant references to the further data sources described in section 2.3 (online survey, submissions summary report, engagement and media report), to highlight alignment or lack of alignment with the qualitative data. Each performance measure section is then wrapped up with a summary, inclusive of all the data sources, and concluded with findings from the Evaluation Project Director. At the end of this report, findings are recast as more general recommendations for the Commission.14

Throughout the report, the term ‘respondents’ refers to those who participated in interviews or focus groups for this evaluation. Where this report discusses further data sources, such as submissions summary report or online survey findings, the term respondents is further qualified.

3.1.1  Right focus

The Right Focus measure is defined as ‘the relevance and materiality of the Final Report in meeting the Terms of Reference’. The key findings in this area were:

  • The ambition to focus on system-level change was broadly supported, although views differed on execution and some stakeholders wished for a deep dive in particular sectors.
  • The dynamics and drivers contributing to persistent disadvantage were captured well across the suite of research, analysis and reports, although views differed on the ultimate impact and effectiveness of recommendations that focused mostly on the public management system.
  • The intention to include both longitudinal and theoretical data (e.g. learning systems) was an ambitious attempt to speak to a range of methodological views and theories of change. The result was questions from all sides, which probably means the balance was about right.

Intent and methodology

Slightly more than half the respondents agreed with the intent and methods behind the focus of the Inquiry. The ambition to cover system-level change, rather than diving too deeply into too many individual sectors, was seen as the right choice by these respondents. The ambition of scope behind the Inquiry was acknowledged and celebrated, with respondents noting the challenge of this accomplishment. One contributing factor to the ‘rightness’ of this choice was the importance of establishing broad coverage to understand persistent disadvantage, before diving too deeply into specific sectors.

Some respondents had concerns about the focus on the public management system. These concerns stemmed from a narrowing that respondents felt occurred between the early Inquiry materials and the Final Report, as recommendations came to focus on the public management system. Some respondents described this as a change from the Interim Report; however, the Interim Report states in its overview that, ‘Our interim recommendations focus on the overall settings of the “public management system”’15. One concern behind these choices was a view that focusing recommendations at the public management system minimised the role of other socio-economic drivers of persistent disadvantage. Drivers such as ‘design of the economy’, ‘job creation’ and ‘living wage movement’ came up repeatedly for these respondents. Another concern was a view that the Commission did not dive deeply enough into specific public management sector areas, such as health, housing, or child poverty reduction policies.

A different type of methodology concern came from people who felt the report was not grounded enough in quantitative data and evidence. This included a desire to see more longitudinal datasets as part of the Final Report. Respondents who held this opinion often referred to previous Commission inquiries as examples of what they had hoped to see, identifying a type of information that brings people to the discussion table.

It doesn’t engage enough at the data and evidence level. It is music to some people’s ears but turned off others. Focus Group Participant

Data people were frustrated because it was undercooked. People who wanted to see both systems change and data story integrated didn’t get it. It became more about the systems issues at the end. Interview Respondent

Note that discussions around data limitations are present in the Interim Report (Chapter 3), the quantitative report (Chapter 2) and are mentioned in the Final Report as well. Barriers to including the quantitative data within the Final Report are further discussed in the ‘Good Process Management’ section of this report.

Clarity and execution

The majority of respondents were favourable towards the execution of the focus that was ultimately decided upon. They supported the way the Inquiry was framed, identifying the language around persistent disadvantage at a systems level as accurate and clear. The dynamics and drivers feeding into the focus and recommendations were seen as both broad and deep, providing a robust base of knowledge and information to draw on. The inclusion and integration of frames such as mauri ora, He Ara Waiora and whānau-centred analysis contributed to a clear and well-positioned execution of the focus.

Respondents recognised the nature of this Inquiry as a different type of topic for the Commission, as well as arguably for the wider public discourse, and acknowledged that finding a focus was always going to be a challenge. They felt that execution of a piece of complex system analysis, particularly in a topic that went beyond traditional economic expertise, had been delivered well.

The Terms of Reference were vast, choices had to be made by the Productivity Commission on finding a coherent topic within a vast territory. Asked to build on many things across many disciplines [they] instead went to system findings and steered into higher level description of systems problems. Focus Group Participant

In terms of what was seen as less effective for execution and clarity of focus, a number of respondents mentioned a ‘disconnect’ between the Interim and Final Reports. In particular, respondents from the public sector felt the discussion around complexity and what it means to tackle complex problems in the public sector was diminished in the Final Report, relative to the Interim Report. Noting that this connection is made in places within the Final Report (pages 36 and 104), it may simply be that the relationship between complex systems and learning systems could have been clarified more directly for readers in the Final Report.

Similarly to the feedback around the intent of the focus, some respondents were disappointed with the lack of economic data in the Final Report. Although this data was provided in the Quantitative Report in July 2023, that approach did not seem sufficient to such respondents, who saw that ‘the intention to do economic data and then the shift to systems change never melded together’. This finding is discussed in more detail in the considerations section of this report.

On ToR basis, I thought it could be a strong report, but I was very surprised by the report that came out the other end - I was expecting specific data and evidence, rather than systemic issues. Focus Group Participant

[There was a lack] of engagement with business … it has very little about material production. The way we produce and distribute economic resources is fundamental to our definition of economic disadvantage. Focus Group Participant

Further data sources concerning right focus

Responses to the online survey aligned with participants who supported the focus of the Final Report. This is best contextualised in Question 6, which tested the likelihood of people using the Inquiry report as a resource and reference in the future. To this question, 76% of survey respondents agreed or strongly agreed, indicating that the focus was right for a majority of stakeholders. Question 17, which asked whether the report focused on the issues of most significance from the Terms of Reference, had 69% of survey respondents who agreed or strongly agreed.

Submissions from the Interim Report in the Your Feedback report also ‘endorsed’ system barriers as a focus for persistent disadvantage.16 However, some of the feedback to this focus decision pointed to uncertainty as to whether a focus on the public management system would result in the type of recommendations that lead to real change. For example, some submitters to the Interim Report disagreed that focusing on proposed system barriers would be sufficient to reduce persistent disadvantage, with a small group of submitters suggesting that changes elsewhere (beyond the public management system) were needed instead.

Summary of right focus performance

The Inquiry decided on, executed and communicated a focus that worked for many people and entities. The decision to focus on system-level findings provides a platform for future work, whereas a decision to provide more deep dives would arguably have skipped a consistency and frame-establishment step, potentially creating methodological siloes for future work.

The split in the primary data between people who agreed versus disagreed with the focus of the Inquiry is quite even. This indicates that no choice would have satisfied everyone.17 That appears to have been the case for the focus of the Fair Chance for All Inquiry.

There was a correlation between people who disagreed with the focus on the public management system and those who felt that economic data was underused in the Final Report. People holding this view were also most likely to feel that a particular sectoral deep-dive would have enhanced the report, often correlating with the sector in which the respondent worked. It is also interesting to note that some of the ‘missing sectors’ people wanted to see in the Final Report, such as job creation or the living-wage movement, do sit at least partially within the public management system. Some of the implications of these views are discussed further in the ‘Insights’ section of this report, regarding which methodological frames are seen as valid by whom.

There may have been an opportunity to communicate more effectively the trade-offs inherent in this narrowing choice to focus on the public management system. The Commission made a choice to focus at the system-level, rather than diving into sector-level analysis or recommendations. Based on respondent feedback, the Commission could have made this decision clearer in the Final Report. The recommendations in this section are therefore not around the focus choices that the Commission made, which were informed by robust feedback, engagement and expert knowledge of those on the Inquiry, but around the way those focus decisions were communicated.

Some of this scope discussion is interconnected with the approach taken to consult on the Terms of Reference, and may have been an implication of that approach. This is discussed in more detail within the 'New initiatives' section of this report.

Finding 1: The Final Report could have included a more proactive discussion around the limitations of public sector management levers and mechanisms alone to reduce persistent disadvantage. This discussion does occur within the Final Report but it could have been simplified and emphasised for readers.

Finding 2: The Commission could have more clearly telegraphed the evolution of ideas and findings from the Interim Report, through submissions, and the subsequent rationale for choices made in the Final Report. Specifically, the themed submission analysis on the Interim Report could have been accompanied by a more exhaustive rationale on all interim findings, recommendations and questions, mapping more clearly how these ideas did or did not find their way into the Final Report. The publishing of submissions provides a valuable record for those working on the topic in the future, and should continue.

3.1.2  Effective engagement

The effective engagement measure is defined as ‘the quality of engagement with interested parties’. The key findings in this area were:

  • Engagement was robust and thorough throughout most of the Inquiry. The Commission was perceived as having an authentic approach to engagement.
  • The Inquiry covered an impressive spread of engagement types. This range of approaches broadened the reach of voices included in the Final Report.
  • The level and quality of Pasifika and Māori engagement was seen as positive overall. Key partners saw the voices they represented incorporated throughout the process and within the Final Report.
  • The diversity of approaches at the beginning of the Inquiry was highly valued by a portion of stakeholders. This included reaching into channels not used as much in previous inquiries. However, these stakeholders wanted consultation and engagement to continue at the same level as the process used for the Terms of Reference.
  • Respondents raised questions about the sufficiency of engagement at political and decision-making levels, although others saw this as outside the mandate of the Commission. Views were inconclusive on what difference this may have made.

Intent and methodology

A majority of respondents valued the engagement approaches used in the Inquiry, finding them both effective as well as meaningful. This includes the breadth, depth and range of engagement.

It was actually a conversation, rather than ‘tell us what you think and we’ll take it away’. It was a dialogue. I think that was partly what fed into the confidence to go to a draft report rather than an issues paper, they felt they had a good feel. It’s a risk, when you go to report and recommendations, people do tend to focus on recommendations, but they had such deep engagement. Focus Group Participant

They were getting out and making contact with people, feeding back - we heard back, they were really good about it. They were really brave about coming out and talking to everyone, facing them. Focus Group Participant

Because the diversity of approaches, and the breadth of engagement, were so valued by a group of stakeholders, there was disappointment that this approach did not continue at the same level and reach throughout the entire Inquiry. A group of people wanted to see more consultation towards the middle and end segments of the work, mirroring the very wide, consultative approach to the Terms of Reference. Another group of people felt engagement approaches were inconsistent and thought the focus and supporting engagement should have been narrowed much earlier.

Clarity and execution

The authenticity of the Commission’s engagement came up as a theme for many respondents. This applied particularly to wānanga and talanoa, as well as to some of the early policy workshops held around the country. The Chair of the Commission was named several times in this context, as having a particularly authentic form of engagement, which many respondents valued.

[The] Commissioner honouring the lives of people he’s talking about was very well done. I heard him talk again last week and the attempts to honour lived experience came across as genuine. Focus Group Participant

In addition to the wānanga and talanoa, the level and approach to Pasifika and Māori engagement was seen as positive overall. This finding came both from respondents who participated in these processes, noting them as respectful and embodying types of engagement appropriate for working in te ao Māori and Pasifika, as well as those who were observers or recipients of this information.

The team are public sector workers so they’re comfortable with engaging with Te Ao Māori, if anything they were too reserved, worried about getting it wrong when they are actually quite capable. Interview Respondent

There was a general intention to listen to feedback provided, you could see your input was captured. Interview Respondent

These views included appreciation of the nuance of how Māori and Pasifika voices were incorporated, inclusive of both direct findings as well as contextual evidence.


A few respondents raised questions about the sufficiency of engagement with political and decision-making levels, views that were often expressed hand in hand with a wish to see clear, decisive action on reducing persistent disadvantage. Lack of engagement with senior decision makers arose as a theme, although some voicing this view noted at the same time that they were unsure what difference it would have made. Some respondents mentioned Ministers as well in this context.

Note that the Productivity Commission Act requires the Commission to ‘act independently’ in delivering its role. This means that the Commission would deliberately not engage with Ministers during the course of an inquiry, apart from providing written and verbal briefings on publication of interim and final reports. For this Inquiry, referring Ministers were briefed verbally for the Interim Report. Referring Ministers did not take up the opportunity to be briefed verbally for the Final Report.18

Though not a majority view, some respondents found the tone of the inquiry ‘too political’ and indicated ‘too strong a hand’ from Government. This can be viewed as a perception rather than evidence, as the Government was involved only in setting the Terms of Reference and in providing general guidance to the Commission via the letter of expectations, discussed in section 3.1.5.

Other respondents saw engagement with the political layer as outside the mandate of the Commission. These views are discussed further in the ‘Insights’ section of this report.

[There was] a big gap between consultation and decision makers at agencies with more connection to the appropriate ministers. Focus Group Participant

There was not much from [political layer] leaders - but this is not the Productivity Commission’s fault, it’s hard to get to that level. Focus Group Participant

Further data sources concerning effective engagement

Responses to the online survey aligned with participants who found the engagement process effective and enjoyable. This is best represented in Question 22, where 68% of survey respondents agreed or strongly agreed that they had sufficient opportunity to participate in the Inquiry, and Question 23, where 68% of survey respondents agreed or strongly agreed that the Commission was approachable.

Of the respondents, 66% had never engaged with the Commission on previous inquiries. This is further indication of the breadth of engagement on the inquiry.

The media and engagement report outlined ministerial engagement during the launch phase of the Inquiry, with a written briefing paper provided for all referring Ministers. This report notes that in-person briefings were offered and that no referring Ministers took up this invitation.

Summary of effective engagement performance

The engagement approach for the Inquiry was not only welcomed but celebrated by many. Given the commonality of this view across respondents, this is a point of note within the findings. Although it is likely too early to identify deeper impact from this engagement, current engagement impressions of the Commission amongst many stakeholders are positive.

The approach to engaging with Māori and Pasifika was highly rated. This opinion was held both by those who engaged directly within wānanga, talanoa or other fora, as well as by people from these communities who engaged with the inquiry in other ways.

Finding 3: The Commission may not have sufficiently considered the trade-offs inherent in its approach to engagement, consultation and feedback. Specifically, this could have included:

  1. the degree to which broad engagement approaches and activities were sustainable throughout the Inquiry;
  2. actively managing stakeholder expectations about this level of engagement, particularly during the more intensive analytical phases of the Inquiry; and
  3. actively weighing up the relative merits of different consultation processes, and communicating this clearly as part of engagement.

For example, if the Commission were to repeat the consultation exercise on the Terms of Reference for an inquiry, it should conduct engagement in a way that makes clear that the exercise is the primary (or only) method for feeding into framing the inquiry.

Finding 4: The Commission delivered well on engagement methods specific to Māori and Pasifika. The feedback was positive overall, which creates a good platform for future inquiry and work planning. Further investment in these capabilities would be valuable, in order to improve integration and understanding of these frames and lived experiences.

3.1.3  Good process management

The Good Process Management measure is defined as ‘the timeliness and quality of the inquiry process’. The key findings in this area were:

  • The Final Report was delivered on time, including the completion of significant research and engagement work programmes.
  • Staff turnover and changes within the Commission more broadly had a direct impact on the Inquiry. This contributed to resourcing challenges and some pockets of inconsistent information flow within the work team.
  • Role definitions and the recommendation-setting process both lacked clarity at times. Expectations between directors and commissioners were not always clear, which created some critical pinch points.
  • Impediments at Statistics NZ reduced access to the Integrated Data Infrastructure (IDI), impacting the overall Inquiry timing, insofar as they required the Final and Quantitative reports to be delivered separately.

Clarity and execution

The Final Report was delivered on time, including the completion of significant engagement and research work programmes (e.g. the publication of ten supplementary research reports during the course of the inquiry). This was a significant achievement, particularly considering the challenge of the topic at hand and some of the intervening variables, both internal and external to the Commission.

Staff turnover impacted the inquiry significantly. These changes were occurring across the Commission more generally and were not always specific to the inquiry. The impacts on the Inquiry included direction changes in engagement methods and inquiry focus.

There was a lack of clarity in some domains concerning the respective roles and inputs of directors and commissioners. This reduced clarity of communication in some parts of the work. It also created pinch points, where an unclear process had to be worked through mid-stream, rather than being settled in advance.

In future, we would put expectations on the table with clear roles and responsibilities, and we’d have that conversation right up front. Interview Respondent

The process to set and confirm recommendations potentially suffered from under-planning. Although this process was included as a scheduled milestone within the work programme, it may not have been given sufficient ‘wiggle room’ to work through the types of analytical and tactical decisions that inform a recommendation-setting exercise. Several internal interview respondents identified this as an area to work on. Respondents reflected that this was potentially an instance of ‘over-promising’ what the inquiry team could deliver within the timeframes and allocated resources. Additionally, family emergencies meant the inquiry Director was unavailable during parts of the process to confirm recommendations. The most sensible way to consider these challenges may be as key person risk, where in a small organisation like the Commission having back-up or acting duties assigned is not always possible or top of mind. These findings inform Finding 5.

A drive to produce both a deep and broad report required a thorough quality assurance process. The time and resource this required was underestimated, which had an impact on planning both internally and externally.

A desire to do everything had an impact on planning. Honestly, we were a bit slow in terms of recognising those limitations and how much that would impact on our plans. Interview Respondent

A few comments also arose concerning peer review processes, including significant technical failures of the systems in use. These included template malfunctions and issues with IT systems, which required many hours of mitigation and caused delays to producing report material drafts. Indirectly, these technical issues may also have affected team morale, although most respondents were not close enough to these issues to make such observations. Respondents who raised these issues clarified that the technical malfunctions have been remedied since the Inquiry.


Significant delays from Statistics NZ occurred in releasing IDI data outputs for Commission use. This was a result both of Cyclone Gabrielle affecting the Census process as well as more general backlogs in accessing IDI. This meant the Commission had to make a decision about the release of the quantitative report, which was then delayed until July 2023 in order to ensure it was robust and thorough. It also meant the quantitative analyses were not available until much later than planned, which limited the Commission's ability to integrate these findings more deeply alongside findings from other research and engagement.

In addition to limiting the quantitative data available for the Final Report, this delay also presented a communication and messaging challenge in keeping stakeholders engaged long enough to receive the Quantitative Report. This is mentioned again within the ‘Clear message delivery’ measure section of this report.

Further data sources concerning good process management

Responses to the online survey roughly aligned with some of these process and timing findings. This is best contextualised in Question 20, which tested survey respondent satisfaction with the Commission’s process. While 66% of survey respondents agreed or strongly agreed that they found the process satisfactory, 34% disagreed, strongly disagreed or did not respond. This is a lower rate of agreement than most of the other survey findings referenced in this report, and may indicate that some of the challenges discussed in this section were felt by stakeholders and participants.

Summary of good process management performance

It is important to remember that the Final Report was delivered on time. This is an accomplishment from a process point of view, considering the ambition and scope of the report, as well as some of the internal and external challenges the inquiry team faced along the way.

Many of the process challenges in the inquiry appear to be correlated, insofar as they represent issues that may have some similar causes. Particularly for a small organisation working on an ambitious inquiry topic, these process challenges had an impact on the ease and internal clarity of expectations for the inquiry.

Finding 5: The Commission could have reduced risk and resource pressure by building in mitigations to anticipate disruption. This could have included:

  • articulating role clarity across leadership functions more clearly, to ensure understanding across all relevant parties;
  • revisiting the scheduled process for confirming recommendations once resourcing changes and other delays occurred, to ensure the plan was still fit for purpose; and
  • considering acting arrangements as a mitigation for key-person risk and to anticipate personal events. Lessons learned from business continuity during COVID-19 could potentially have informed this approach.

Finding 6: The delays in accessing IDI had a significant impact on the quantitative component of the inquiry. This required the Final and Quantitative Reports to be published separately, which likely reduced the readership of the Quantitative Report. Considering the criticality of this component for the inquiry, the Commission should anticipate such potential IDI delays in the future.

3.1.4  High quality work

The ‘high quality work’ measure is defined as ‘the quality of the analysis, use of evidence, findings and recommendations in the Final Report’. The key findings in this area were:

  • The overall presentation and analytical frames worked for many people as a relevant and evidence-grounded way to analyse and present information.
  • The breadth and depth of research that informed the inquiry was named as valuable by many. This finding was common across a range of stakeholder demographics and demonstrates participant confidence in the work.
  • The inclusion of longitudinal datasets to improve understanding of persistent disadvantage was a key accomplishment. This is part of the Commission’s core purpose and function but accessing, analysing and presenting such data is not a simple undertaking. This can be seen as a success for the inquiry.
  • The use of te ao Māori frames within the Final Report was generally seen to be integrated authentically and in an analytically rigorous way. This finding was common across many respondents.
  • Although engagement in te ao Māori and Pasifika was viewed positively, the process around integrating the research grounded in these frames could have been better supported. Some te reo terms and concepts, such as mauri ora and mauri noho, required more effort and understanding in order to be used accurately. Integration of research and experiences arising from colonisation also required more nuanced conversations and time than the Commission had potentially anticipated.
  • Some respondents identified recommendations that did not feel grounded in evidence and/or analytical frames, to the same extent as other recommendations. This (real and/or perceived) logic gap presented a barrier for them in using the Final Report.

Intent and methodology

The analytical frames used in the Final Report were viewed as well-grounded and rigorous by many participants. The overall presentation of the data according to these frames worked for many people, including those who are experts in the topic.

The breadth and depth of research commissioned to inform the inquiry was named by many as extremely valuable. Not only did this directly enhance the quality of inquiry reports (Interim, Final and Quantitative), but it also provided a public resource to inform future thinking and changes. The commonality of this finding demonstrates confidence in the work, across a range of stakeholders. The online survey gathered similar findings.

The creation and inclusion of longitudinal datasets to improve understanding of persistent disadvantage provided a key contribution to public understanding of the topic. Although some participants wished for more sector-specific data, most agreed that the information created and presented by the Inquiry was valuable. This is the type of work that, according to its purpose and function, would be expected of the Commission. However, it is a challenging task to actually deliver and should thus be viewed as a success for the Inquiry.

Quite impressed with way the team managed data quantitative evidence (close to my heart) with bringing in different databases. Interview Respondent

Just the descriptive statistics showed inequity all over the place. So, there was evidence to suggest a problem that needs to be fixed, but it wasn’t granular enough to target or understand the mechanisms to deal with the issues. This inquiry helped fill in this gap. Interview Respondent

The use of te ao Māori frames, namely the Mauri ora and He Ara Waiora approaches, was named as authentic and well applied. This finding was articulated by a range of respondents. In addition to creating a way of viewing persistent disadvantage that is grounded in the experiences of Māori, the use of these frames creates analytical integration with similar topics of research and advice currently in the public sphere.19 Respondents also commented on the growing capability within the public sector, with the Commission as a good example of this, for understanding, discussing and referencing te ao Māori frames.

Execution and clarity

Some respondents extended their views of authentic application of te ao Māori frames to include the way that data and evidence around Māori and Pasifika people was incorporated into the Final Report. These respondents found the data to be nuanced and accurate, reflecting a wider understanding of the Māori and Pasifika experience. This included strengths-based as well as deficit-oriented data.

One area where room for improvement was identified was the articulation of some concepts from te ao Māori into non-Māori frames and language. For example, terms such as mauri ora and mauri noho may have been used more narrowly in the Final Report than they are in Māori communities.

Additionally, respondents noted that the intent to incorporate frames and experiences around colonisation and institutional racism was laudable, but the process took work to build understanding within the Commission. This work may have been integrated more smoothly if the inquiry had set aside more time to work through these topics with the research provider, considering the challenges inherent in bringing together differing backgrounds and frames of reference. Additional capability investment in te ao Māori, particularly when themes so central to the Māori experience are part of the evidence base, may have improved this process. This is captured in Recommendation 8.

Some respondents questioned whether the discussion of disadvantage versus persistent disadvantage differentiated enough between the two concepts. This distinction was addressed directly on page 25 of the Final Report, but some people may simply have wanted a more extensive discussion of the differentiation between types of disadvantage. Some of these comments may be mapped back to methodological differences, which are further discussed in the 'Insights' section of this report.

Some respondents identified recommendations that did not feel grounded in evidence and/or analytical frames to the same extent as other recommendations; one example was the Final Report recommendation around a social floor. This (real and/or perceived) logic gap presented a barrier for them in using the Final Report. Other respondents felt that some research was not represented in the way it was intended, although including specific examples would not be appropriate for this report; that finding should therefore be taken as generic rather than specific.

There are some policy recommendations which don’t drop out of the analysis. It’s disconnected. With the public management system, it’s really important to distinguish relevant alternatives. Interview Respondent

There is a disconnect between the thinking and recommendations - we’re conflating two things: inactive behaviour and formal systems - they’re different, one may be much more permissive. The social floor [for example], where did this come from? This didn’t follow from the analysis. The links don’t always match up in the report, maybe because of time limits. Focus Group Participant

Some more general comments by respondents around frames that did or did not work for them may have come from the new methods and approaches used in the inquiry, including the use of te ao Māori frames and experiences. Some of this feedback is further discussed in the new initiatives section of this report.


The theme of focus areas came up again within respondents’ discussion around ‘high quality work’, echoing concerns that, whilst the recommendations are impactful and linked to relevant evidence within a public management frame, the Inquiry overall creates a sense that public sector change is the primary way forward. While the Final Report does clarify that causes of persistent disadvantage are wider than the public sector (page 17 and throughout Chapter 2), some respondents were left with the impression that the inquiry advocated change only in this domain.

Further data sources concerning High Quality Work

Responses to the online survey aligned with participants who found the quality of work informing and resulting from the Inquiry to be high. This is best contextualised in Question 7, which tested how logical the flow from analysis to findings was. To this question, 73% of respondents agreed or strongly agreed with the logic flow. Online survey Question 9 presented a similar view, with 90% of survey respondents identifying the quality of analysis as acceptable, good or excellent.

The submissions report focused on the most common key themes. It did not provide detail around topic areas where few or no submissions were received. For example, the submissions report did not provide detail around the questions that asked how best to measure persistent disadvantage (Interim Report Chapter 3 findings, recommendations and questions),20 largely because of the small amount of feedback received on these themes. However, the decision not to include an exhaustive ‘question by question’ summary of submissions from the Interim Report may have affected respondent views around measurement questions in particular. Considering the number of respondents that described datasets and measurement of persistent disadvantage as ‘undercooked’ in the Final Report, taking explicit account of these submissions, few though they may have been, could have been particularly valuable. This is reflected in Recommendation 2.

A more exhaustive approach would have had resourcing and time implications, however, and this is a trade-off the Commission may already have considered. This report also acknowledges that a more exhaustive approach to mapping submissions analysis from the Interim to the Final Report would still have been unlikely to satisfy everyone.

Expert overview and recommendations

The breadth and depth of research that informed the inquiry was named as valuable by many - both the coverage of the research and its quality, and the way it linked through to findings and recommendations. This stood out as a key finding and valuable contribution from the inquiry. Although this is part of the Commission’s mandate, the delivery for this inquiry was particularly strong, establishing coverage across a range of frames, especially where existing information was scant. This can be seen as a key accomplishment for the inquiry, particularly given the aforementioned process obstacles.

Some respondents expressed a desire to see further longitudinal datasets and evidence at the core of the Final Report, specifically wishing for more detailed data in specific sectors or policy areas. The Commission chose to focus its limited quantitative capacity on understanding the disadvantage experience of the same cohort of people through time – in order to address the most critical research gap.

The inclusion and presentation of te ao Māori frames as an analytical lens enhanced the Final Report and was viewed positively. The Commission was seen as having understood and used these frames authentically and was encouraged to do more of this in the future. This finding did not, however, diminish the importance of continuing to build te ao Māori capability across the Commission.

Finding 7: The Commission delivered a significant breadth and depth of research in the inquiry, filling a research gap for Aotearoa New Zealand. This was part of the system-level approach to the inquiry, and it provided a key service for current and future users of information relating to persistent disadvantage.

Finding 8: Analysis within te ao Māori and Pasifika frames was delivered to a high standard. Respondents to this evaluation found the use of these frames generally authentic and representative of lived experience. However, there were places where greater capability at the Commission could have improved understanding between frames of experience.

3.1.5 Clear message delivery

The ‘clear message delivery’ measure is defined as ‘how well the work was communicated and presented in the Final Report’. The key findings in this area were:

  • The Final Report overall was seen as coherent, clear and well-articulated. It had a good logic flow that was easy to follow. Most respondents found the narrative and findings clear.
  • The relationship between productivity and wellbeing was clear to most, is clearly laid out in the Productivity Commission Act 2010 and was detailed at the beginning of the Final Report.21
  • So far, the Final Report and wider Inquiry findings are being used and referenced frequently in circles of NGOs, whānau-led or place-based initiatives and community organisations, and some pockets of public sector agencies traditionally associated with social services.
  • So far, the Final Report and wider inquiry findings are not being used nor referenced as much in traditional economist circles, including at public sector agencies traditionally associated with economic policy.

Intent and methodology

The Final Report was viewed as being clear, logical and easy to follow. Recommendations were clear, although some respondents wished for an even shorter, clearer set of messages.

I found sequencing quite helpful, not overwhelming. Focus Group Participant

Many respondents confirmed that the relationship between productivity and wellbeing was clear in the Final Report and broader inquiry materials. Although some respondents identified that the Final Report included ‘too much on wellbeing’, a close look at the Productivity Commission Act 2010 confirms that the relationship between productivity and wellbeing is part of the Commission’s direct mandate. The current Letter of Expectations from the Minister of Finance provides clear guidance on the relationship between productivity and wellbeing.22

The opening discussion on p17 of the Final Report could perhaps have been emphasised throughout the report, in order to clarify this. It is also possible that even with more overt clarification, some stakeholders would have continued to question the value of wellbeing measures and/or analysis, as they relate to productivity.

Clarity and execution

The inquiry products, including the Final Report, were overall seen as well-articulated and concise. Most respondents agreed that the frames fit the evidence and the recommendations, delivering a clear narrative. Not all readers accepted the rationale for focusing the Final Report recommendations on the public management system; even so, within that frame, respondents who disagreed with this choice found the narrative to be clear. It was noted, however, that the public sector frame might provide a stronger narrative for people engaged in the public service than for the wider public.

The Commission has created a pathways forward diagram which focuses around 3 themes. I think that’s coherent, it’s a strong public sector focus, and goes back to the purpose of Commission … the frame it’s taken, it has achieved coherence. Focus Group Participant

Some respondents identified that the important point that the recommendations should be kept together as a package was somewhat lost. Although this point was included in the Final Report, many people missed it.

The part of the story that is getting lost is that the recommendations are a package, but this is a small bone to pick really. Focus Group Participant

Most feedback that I’ve heard coming back has been cautiously positive, some very positive, [there are] messages in there that they can pick up, some recommendations they could sign up to. They can understand where recommendations come from. The downside [is that] one message is less clear than others, is [that] the recommendations are a package, you can’t cherry pick. Focus Group Participant

The separate release of the Final Report and the Quantitative Report had an impact on overall clarity of messaging.

But for the quantitative delay it was quite disappointing, we’re a bunch of data geeks and all that valuable stuff has gone. All the glamorous stuff is out there but I’m the only person in the data agencies who read it, and I have a whole team of data scientists and now they are not paying attention [once the Quantitative Report came out]. Focus Group Participant

This is discussed in more detail in the ‘Good process management’ measurement section of this report.


The immediate use of the Final Report is best illustrated by which sectors respondents observed already leveraging and referencing it. So far, the sectors using it frequently included:

  • NGOs and community organisations
  • Whānau-led initiatives and agencies/organisations that use a whānau-led frame
  • Place-based initiatives and movements

Focus groups and some interviews included clear, direct examples of this use so far.

In my work we call on this report, we write opinion pieces using it, it’s great to see the place-based initiatives recognised. I work from a place of hope, this report can validate that. I see how siloed the work is across [government] departments, and with this report I can draw on that. Focus Group Participant

This report has helped to reinforce some stuff we’ve been trying to land with people, and it’s useful to quote bits of. We need to drive cross-agency response, [to be] generative rather than punitive. Focus Group Participant

[This report] has galvanised people to think more strategically about what the learning system looks like in social paradigm, use the IDI to analyse this and the crisis space. The report has inputted into grappling with this, and adds a data point that we need to think differently around leadership and implementation. There are so many patterns but often when you’re in one agency you can’t see it - so it’s helpful to name it [across the system]. Focus Group Participant

Considering the novelty of this type of research for the Commission, the uptake and use by this range of stakeholders can be viewed as a success. Respondents internal to the Commission identified the focus on findings that would be valuable for these sectors as a deliberate choice. To have those groups and sectors using and celebrating the report as something that accurately and authentically covers, presents and analyses the lived experience they work within, is a real accomplishment for the Commission.

In terms of where the report is being referenced less, respondents identified less uptake amongst:

  • Economics think tanks, academics and experts;
  • Economic and/or fiscal policy agencies;
  • Cross-agency groups and initiatives; and
  • Senior public sector leaders and politicians.23

Focus groups and some interviews included clear, direct examples of this under-use so far.

Other reports have sunk in more than this one. I don’t think it is being paid attention to or will be. People aren’t discussing it. The initiatives I work with are gratified to be featured, but if nobody else is listening then how much does the report validate their responses? Focus Group Participant

It feels like a report for experts and public sector management, to be honest. The whole theme around accountability is written for government, but I wonder if there should be something to all those who participated in the inquiry - so at each stage they do a response. Something that is digestible would be a useful tool for communities struggling in this space. Focus Group Participant

Internal Commission respondents identified that the inquiry had planned to respond more fully to participants, particularly from community organisations, at the end of the Inquiry. However, given some of the challenges identified in section 3.1.3 on process, this response plan became unrealistic.

It's worth noting that this particular inquiry aimed more at communities and those that represent them than previous inquiries. However, as this is not a common space for the Commission to operate in, it was challenging to shift the organisation towards that direction. As with any new approach, skills need to be built and practised in order to be most effective.

This finding can also be contextualised with the pattern from previous inquiries of an ‘adoption curve’ time-lag in the uptake and use of inquiry findings. For some of the sectors that are not using the report immediately, at least not as much as the community and place-based sectors, it may be helpful to think in terms of a longer timeframe for uptake. For example, some previous Commission inquiries were referenced frequently and became key sources for those working in the relevant sectors,24 and this may prove to be the case for Fair Chance for All as time passes. Confirming that suggestion is outside the scope of this report. The slower uptake could reflect information overload for those working in the public sector and/or a lack of awareness of current research and analysis concerning wellbeing.

Further data sources concerning ‘clear message delivery’

Responses to the online survey aligned with participants who found the message delivery logical and clear. This is best contextualised in Question 11, which tested how clear survey respondents found the findings and recommendations. To this question, 92% of survey respondents agreed or strongly agreed that the findings and recommendations of the Final Report were clear. Online survey Question 24 presented a similar view, with 80% of survey respondents agreeing or strongly agreeing that the Commission communicated clearly.

Within the internal media report, some of the feedback from the launch event celebrates and notes the clarity of the Final Report.

Expert overview and recommendations

The standout finding in this performance measure is how much more some sectors are referencing and using the report than others. This is discussed further within section 4.3 of this report, as an illustration of different frames of meaning and value.

Finding 9: The Final Report spoke most clearly to community sector organisations, which are already making use of the findings in their work. However, there would have been value in creating accompanying messaging that could be circulated more easily across the community sector.

3.1.6  Overall quality

The ‘overall quality’ measure is defined as ‘the overall quality of the inquiry taking into account all factors’. This section is discussed more generally than the previous five performance measure sections. Overall, the inquiry was viewed positively, welcomed by many for the new information it brought to light, and largely regarded as high quality and analytically sound. The online survey results roughly align with this, with 38% of survey respondents stating that the inquiry had increased their understanding of persistent disadvantage ‘a lot’, and a further 50% stating that the inquiry had increased their understanding of persistent disadvantage ‘a little’. Only 12% of survey respondents did not find their understanding increased through the inquiry.

Many respondents saw this inquiry as a different type of work for the Commission, acknowledging the need for different tools, different approaches, different capabilities to address the complexity and reach of the topic.

It was a new type of inquiry for the Commission, especially on top of this there were the engagement expectations, with a much broader set of stakeholders than previous inquiries. Interview respondent

What I could see with this, and why I was interested from the start, was that they wanted to try something different and see if it uncovered something new. If they went with the traditional way of evidence, causality, they would probably find out what we already knew. So, they wanted to go in at the systems level and say, what is it here that’s stopping us from implementing change? I get the sense around this table that there’s a lot of frustration how it’s been done. Ok, it’s not the report we thought, they tried something new, innovation won’t always work - let’s look for the bits that did work rather than cut the whole thing down. So that when they do this again (I’m grateful they’re doing this review), let’s learn from what didn't work. Focus Group Participant

Overall, the Commission was seen as accomplishing a good result with a broad topic. It took decisions to narrow the scope into a frame that could be presented coherently and informatively to the public. Not everyone agreed with those choices, but many respondents either did agree, or saw the rationale behind the decision.

The Insights section that follows uncovers context and trends that may have affected the overall reception of the inquiry.

3.1.7  New initiatives

This inquiry used a number of new initiatives in gathering and sharing information for the work. Specifically, these included:

  • Public consultation on shaping the Terms of Reference;
  • Actively collaborating with particular groups using policy workshops;
  • Using wānanga and talanoa sessions to gather evidence;
  • Taking a systems approach, instead of focusing on specific policy areas (eg housing), and using systems-thinking tools and methods, including causal loop diagrams; and
  • Publishing a themed submissions analysis - a comprehensive analysis and published summary of submissions to the Interim Report.

Overall, these initiatives were valued by the majority of stakeholders. They were viewed as widening the pool of people feeding into the Inquiry, which was seen as increasing its interest and relevance to society at large. Many of these initiatives have been discussed throughout the report already. This section is therefore quite concise, providing only information that has not already been presented directly.

Wānanga and talanoa sessions were named by several respondents as extremely positive.

Collaborations and stakeholders are the key [with wānanga]– we should do this more often going forward, using the strengths of other agencies and organisations in engagement or collaboration, a lot more of this. Interview respondent

The systems approach was supported by many, as detailed in section 3.1.1. Respondents tended to speak about it in general terms rather than in detail, and did not name specific systems-thinking tools or methods used in the inquiry. It should be noted, however, that the interviews and focus groups did not prompt for responses around specific tools or methods.

The themed submissions analysis from the Interim Report provided insight into the majority view and a clear summary of what mattered most to submitters for inclusion in the Final Report. However, the inquiry did not provide an overview mapping all Interim Report findings, recommendations and questions to a decision to include or drop them from the Final Report. It is unclear what additional benefit this step would have added, noting that it is time- and resource-intensive. But providing a more exhaustive rationale on scope and focus in the Final Report, as a result of submissions, may have reduced some of the questions around the decision to focus recommendations on the public management system. The Commission regularly publishes submissions and is encouraged to continue this practice, alongside the continued exercise of theming submissions.

The consultation on the Terms of Reference created the strongest views amongst respondents as to the value of this initiative, also discussed in section 3.1.1. Many found it extremely valuable as an initiative that worked well for gathering a wide range of stakeholder input. The process followed was comprehensive.25

However, this activity was new for the Commission. Instead of consulting on the Terms of Reference, previous inquiries produced an issues paper early in the inquiry, seeking discussion and responses to the initial frames of reference through that mechanism. The consultation on the Terms of Reference, without clearer communication that it would replace an issues paper, may have created an expectation bind for the Commission. Although stakeholders were pleased to see the breadth of voices represented through the early Terms of Reference consultation, a group of respondents advised that they really missed the issues paper as a way to consolidate and respond to emerging frames. However, some of them reflected in the same breath that they may just be creatures of habit, having become used to an issues paper over years of responding to and working with Commission inquiries.

The implications of these trade-offs may not have been considered as deliberately in advance as they could have been. Should the Commission repeat the early consultation exercise, the implications and lessons from the Fair Chance for All Inquiry should be considered in more depth. Recommendation 1 speaks to this.

14. Note that findings do not always fit neatly under the performance measure section in which they are suggested. This speaks to the interrelated nature of performance measures more generally.

15. Page 17 of the Interim Report.


17. There is a cultural saying in the policy sector that ‘if everyone is a little bit disappointed then you have probably got it about right.’

18. The Commission provided a verbal briefing to the incoming Minister for Child Poverty Reduction, Hon Jan Tinetti, when portfolio changes occurred following Prime Minister Jacinda Ardern’s resignation. A briefing was also offered to an incoming Minister for Pacific Peoples, when this portfolio changed hands during the course of the inquiry, but Hon Barbara Edmonds did not take up the offer.

19. Recent examples include the Treasury Wellbeing Report ( and the Future for Local Government Review Final Report ($file/Te-Arotake_Final-report.pdf).

20. Specifically, Interim Report Findings 3.1-3.5, Recommendations 3.1-3.2 and Question 3.1 concerning the measurement of persistent disadvantage are not present within the Submissions Summary Report. This may be the case for other findings, recommendations and questions as well, but a full evaluation of that report was outside the core scope of this evaluation.

21. This connection was also discussed in depth in one of the supplementary research papers supporting the inquiry:


23. The timing of the Inquiry launch just before the 2023 election season may impact this.

24. For example, it is the personal experience of this Evaluation Project Director that the inquiry on Regulatory institutions and practices ( Regulatory-institutions-and-practices.pdf) was heavily referenced and had direct influence on improving New Zealand’s regulatory environment. However, this is anecdotal observation and may also simply be a result of the policy domains in the Project Director’s experience.

25. and