The 2016 PEFA Framework: Let’s not throw the baby out with the bath water

May 6, 2016 | Company News

Authors: Andrew Lawson and Isabel Bucknall

ODI’s recent paper raises important issues about the misuse of the PEFA Framework, which rightly deserve attention from governments, development agencies and consultants. Fiscus’ Andrew Lawson and Isabel Bucknall evaluate the strength of some of ODI’s arguments and call for a fairer assessment of the PEFA’s benefits for governments and donors alike.

The launch of ODI’s new report, “PEFA: What is it good for?”, at last week’s 2016 PEFA Conference offered a valuable platform for critical thinking about the use of the PEFA Framework. The authors, Sierd Hadley and Mark Miller, do an excellent job of highlighting the misuse of PEFA as a blueprint for PFM reform and present pragmatic recommendations for improving the process, which were echoed by many at the conference. These include strengthening the Summary Assessment, revising the donor approach to programming support to PFM reforms, encouraging problem-driven reform agendas and complementing PEFA assessments with Service Delivery Surveys and other tools more oriented towards the experience of the “end-users” of the PFM system.

These are all valuable recommendations, which deserve to be taken up more extensively by governments, development agencies, consultants and researchers who use the PEFA framework. However, their value risks being lost in the negative reactions likely to be generated by the hyperbole to which the authors fall prey in other parts of the report. In several areas their points are overstated and underestimate the importance of the PEFA Framework and its considerable value to governments and donors alike. In particular, we would take issue with two assertions made by the authors:


1) PEFA measures “Form” not “Function”

The authors argue that one reason PEFA assessments do not reveal the real strengths and weaknesses of a PFM system is that they measure how systems look rather than how they work in practice (“Form” over “Function”).

To a significant degree, this is an inherent risk of virtually any measurement system which, like the PEFA, seeks to assess the functionality of complex management processes within a limited time frame and with a modest budget. Such systems must frequently resort to proxy indicators, which can end up placing a heavier focus on form than on function. This problem will be familiar to any manager who has attempted to apply the ISO 9000 standard for the quality of company management systems (measurement against “fundamental concepts and principles of quality management”, including customer focus and factual approaches to decision-making) or the UK Investors in People standard for employee management. Wise company managers are conscious of this weakness, recognizing that high ISO 9000 scores might be achieved even where customer focus is not truly integrated into the company culture. For these reasons, such systems only bring real value when sensitively applied and interpreted, and the same is true of the PEFA framework.

Mindful of this risk, the PEFA framework puts considerable emphasis on the need to support conclusions with evidence and, as far as possible, to analyse and describe the functionality of each element of the PFM system. In the new Framework in particular, extensive emphasis is placed on the quality of the Summary Assessment and on the narrative description of each indicator. Moreover, great efforts have been made in structuring the dimensions of each indicator to capture function as well as form. This is noticeable even for those indicators related to internal controls, where functionality is especially difficult to measure in a snapshot assessment. Examples include:

  • PI-26 Internal audit: The first dimension assesses the extent to which government entities are subject to internal audit processes (form), while dimensions three and four require evidence that internal audits are actually implemented (preparation of audit programmes and availability of audit reports) and that audit findings are acted upon, considering the extent to which entities respond to and take action on the auditors’ recommendations. Since this indicator is scored using a “weakest link” methodology, any weakness identified in the functioning of internal audit prevents a higher score, however sound its form. A similar focus is found in indicator PI-30 on external audit.
  • PI-28 In-year budget reports: This indicator focuses on the comprehensiveness, accuracy and timeliness of information on budget execution provided within the budget year. The first and second dimensions focus on form: the coverage and comparability of reports and the timing of in-year budget reports. Function is measured in the third dimension, which analyses the accuracy of the data in in-year budget reports and hence whether they provide meaningful information that can support monitoring of budget execution.

Image: PEFA Indicators measuring form and function, Fiscus Ltd.


2) The similarity of PFM reforms across the African Continent is driven by PEFA

Highlighting the misuse of the PEFA as a simplistic guide to the design of PFM reforms is important. However, the evidence provided by ODI does not conclusively show that it is PEFA that drives similar reforms across countries. In fact, this trend of “isomorphic mimicry” (Andrews, 2009) was apparent before the launch of the first assessment framework in 2005. A 2000 paper by Diana Cook and Andrew Lawson (“Medium Term Expenditure Frameworks – panacea or dangerous distraction?”) found that MTEF implementation was being universally embraced as a standard response to failures to link policy, planning and budgeting, regardless of the context or the specific problems within each country. In other words, there was significant evidence of “isomorphic mimicry” before PEFA was even conceived.

In addition, reforms that appear to be isomorphic mimicry may not actually be so. ODI use the figure below to highlight the similarity of PFM reforms, but it does not capture the diversity of processes and systems that may sit under a particular reform label.

Figure: The similarity of PFM reforms in 31 African countries (Source: ODI 2016)

This is particularly true of MTEF reforms, whose content may vary from high-level medium-term fiscal forecasts to more detailed, performance-focused expenditure plans (see the typology of Medium-Term Fiscal Frameworks, Budget Frameworks and Performance Frameworks from Brumby in the 2013 World Bank publication). The same is true of “performance budgeting” reforms, which sometimes involve no more than the inclusion of outcome targets in the budgetary documentation of selected ministries and, in other cases, entail the government-wide introduction of programme budgets. The content of IFMIS reforms can also vary considerably. In short, the fact that reforms carry the same labels does not in itself mean that they are the same reforms.

Furthermore, the ubiquity of certain reforms simply reflects the fact that some reforms are universally necessary, regardless of country context. Is there a country where implementing a treasury single account or top-down budget ceilings is not a good idea or an early priority in the reform process?


Recognise the strengths and weaknesses of PEFA and use it wisely

In summary, ODI have done an important job in raising awareness of some of the limitations of the PEFA Framework and in pointing out the unfortunate tendency of some stakeholders to misuse it. However, some of their criticisms are unjustified. The PEFA has always done what it “says on the tin”, namely providing a “framework for assessing and reporting on the strengths and weaknesses of public financial management” (PEFA Framework 2016). The PEFA Secretariat provides clear guidance on what the PEFA does not do: it does not assess the causes of good or poor performance of PFM systems, nor does it assess the effectiveness of government policies.

Perhaps a better title for ODI’s paper would have been “PEFA: An Instrument Abused”. The PEFA has too often been used inappropriately in the past: some assessments have been superficial, going little beyond the analysis of form, and even good assessments have often been used mechanically to generate PFM reform “targets”. Yet some governments at the national and sub-national level have found ways to use the PEFA as a valuable first step in the PFM reform process: an identification of strengths and weaknesses that provides a springboard for a problem-driven process to explore the underlying causes of significant PFM weaknesses and to begin to identify a contextually driven road map for PFM reform.

The challenge is to ensure that this happens more frequently and more systematically. PEFA should be a complement to an “in depth, country-level, politically informed understanding of how PFM systems work” (ODI, 2016). To strengthen the implementation and use of PEFA, there clearly needs to be stronger engagement with donors to redefine how they use the PEFA and how they understand its purpose in the wider PFM reform process. The recommendations in the ODI report provide a useful step in this direction but, in our view, in its current form the report also risks throwing the baby out with the bathwater.

The words of the Head of the PEFA Secretariat, Lewis Hawke, in his opening remarks at the Budapest conference are important to remember here: “The PEFA is still not perfect, but its strengths outweigh its weaknesses, especially when combined with other information.”

Sources:

Andrews, M. (2009) Isomorphism and the Limits of African Public Financial Management Reform, HKS Working Paper No. RWP09-012, Cambridge MA

Overseas Development Institute (2016) PEFA: What is it good for? Discussion Paper, London

Oxford Policy Management (2000) Medium Term Expenditure Frameworks – panacea or dangerous distraction? OPM Review, Paper 2, Oxford

World Bank (2013) Beyond the Annual Budget: Global Experience with Medium Term Expenditure Frameworks, Public Sector Governance 73514, Washington DC