Capacity Assessment for Effective Delivery of Development Results in Kenya


Government of Kenya

Capacity Assessment for Effective Delivery of Development Results in Kenya

Managing for Development Results Capacity Scan

Implementation of CAP-Scan Process

January 3, 2012

Table of Contents

Acknowledgement
List of Acronyms
Summary
Introduction
I. Preparation of the Assessment
   A. Facilitation of the CAP-Scan Exercise
   B. Adaptation of the Tool
   C. Training of Participants
   D. Scope of the Assessment
II. Assessment of MfDR Capacity
   A. Methodology
   B. Results by Sector
   C. Results by MfDR Pillar
      1. Planning and Budgeting (scored 2.1)
      2. Monitoring and Evaluation (scored 1.7)
      3. Statistical Capacity (scored 2.2)
      4. Leadership (scored 2.4)
      5. Accountability and Partnerships (scored 2.5)
III. Government-Wide MfDR Action Plan
   A. Methodology
   B. Action Plan by MfDR Pillar
IV. Evaluation of the CAP-Scan Workshop
Annexes
   Annex 1: Expression of Interest from the Government of Kenya
   Annex 2: Definition of Capacity Building Stages
   Annex 3: The CAP-Scan Matrix for Kenya
   Annex 4: CAP-Scan Journal for Agriculture and Justice, Health and Medical Services, Higher Education and Planning
   Annex 5: CAP-Scan Journal for Prime Minister Office, Public Service, Tourism, Trade and Transport

List of Figures

Figure 1: Results at National Level by MfDR Pillar
Figure 2: Sectors Involved in Determining the National Score
Figure 3: Average Capacity Score by Sector
Figure 4: Results at National Level by MfDR Components
Figure 5: Government Score in Planning and Budgeting
Figure 8: Government Score in Monitoring and Evaluation
Figure 6: Government Score in Statistics
Figure 9: Government Score in Leadership
Figure 7: Government Score in Accountability and Partnerships
Figure 8: CAP-Scan Workshop Evaluation

List of Tables

Table 1: Components Scored by MfDR Pillars for the Capacity Assessment
Table 5: Capacity Scores by Sector for Each Pillar
Table 2: National MfDR Action Plan

Acknowledgement

The CAP-Scan Task Force gratefully acknowledges the active contributions of the forty-four officials from the Government of Kenya who dedicated their time to self-assess their respective sectors and to develop an action plan to reinforce national capacity to manage for development results. Their involvement during the three-day workshop held from May 30 to June 2 led to this report. Special gratitude is also extended to the Permanent Secretaries who appointed officials to participate in the exercise.

The Task Force also wishes to acknowledge the leadership of Joshua Mwiranga from the Office of the Prime Minister, who raised awareness of CAP-Scan at the national level and gathered government-wide support. The input of his colleagues in the Office of the Prime Minister is also worth recognizing, especially Mary Ndeto and Elijah Achoch, under the guidance of Emmanuel Lubembe, Acting Secretary and Head of the Public Service Transformation Department in the Office of the Prime Minister. The leadership of the Permanent Secretary, Office of the Prime Minister, was also instrumental. Also important to mention are Marco Varea, international facilitator, and Cyril Blet from the World Bank's Results Unit. Rosa Muraguri-Mwololo from UN-Habitat also contributed to facilitating the exercise.

The team thanks the World Bank, which provided support to implement CAP-Scan in Kenya with financial assistance from the Swiss Agency for Development and Cooperation. In particular, we express our gratitude to the World Bank country office in Kenya for supporting the conduct of the exercise in Nairobi, Kenya. The technical contributions and facilitation of Philip Jespersen from the Kenyan World Bank country office were also greatly appreciated.

A CAP-Scan exercise is a highly participatory process, and gratitude is due to all the contributors. We thank them for their inputs, comments and suggestions throughout the process.

List of Acronyms

CAP-Scan  Capacity Scan in Managing for Development Results
CSO  Civil Society Organization
DP  Development Partners
E-ProMIS  Electronic Projects Monitoring System
GoK  Government of Kenya
IFMIS  Integrated Financial Management Information System
GHRIS  Government Human Resource Information System
KENAO  Kenya National Audit Office
KNBS  Kenya National Bureau of Statistics
M&E  Monitoring and Evaluation
MfDR  Managing for Development Results
MoPNDV2030  Ministry of Planning, National Development and Vision 2030
MoF  Ministry of Finance
MTEF  Medium Term Expenditure Framework
MTP  Medium Term Plan
NIMES  National Integrated Monitoring and Evaluation System
OPM  Office of the Prime Minister
PER  Public Expenditure Review
STATCAP  Statistical Capacity Building

Summary

The Government of Kenya decided to self-assess its capacity to Manage for Development Results (MfDR) using the Capacity Scan (CAP-Scan) methodology, both to identify its strengths and weaknesses and to develop an action plan to improve the delivery of public goods. Under the leadership of the Office of the Prime Minister, forty-four government officials from ten key ministries (Agriculture; Justice, National Cohesion and Constitutional Affairs; Public Health and Medical Services; Education and Higher Education, Science and Technology; Planning, National Development and Vision 2030; Office of the Prime Minister; Public Service; Transport; Trade; and Tourism) contributed to assessing national MfDR capacity using the CAP-Scan measurement framework, adapted to the country context, across the following five main areas: Leadership, Monitoring and Evaluation (M&E), Planning and Budgeting, Accountability and Partnerships, and Statistics.

With an average score of 2.2 on a 4-point scale, Kenya is clearly implementing results-oriented approaches, but needs to follow a consistent government-wide approach to increase its effectiveness and performance. The figure below presents an overview of the national MfDR capacity scores for the five MfDR pillars.

Figure 1: Results at National Level by MfDR Pillar (scores on a 0-4 scale)

Planning and Budgeting: 2.1
Monitoring and Evaluation: 1.7
Statistics: 2.2
Leadership: 2.4
Accountability and Partnerships: 2.5
Average: 2.2

In 2003, the Kenyan Government committed to deliver targeted development results to Kenyans and adopted results-based management approaches to fast-track the implementation of the Economic Recovery Strategy for Wealth and Employment Creation (2003-2007).
This move included the implementation of results-based sector policies, plans and programmes, increased transparency and accountability in the public sector at the institutional level, and a broader push to inculcate a performance culture among civil servants.

According to the CAP-Scan results, the strongest national MfDR capacity lies in the Accountability and Partnerships area, as well as in the Leadership domain. Specifically, clarity of development orientations was recognized as an effective practice by all sectors, which stems from the efforts made in defining a long-term vision as well as results-oriented medium-term plans. A few components are scored high by only a limited number of sectors. Agriculture, Justice and Education ranked higher than others in inter-sectoral coordination for M&E. Tourism is the only sector with high capacity in terms of effective monitoring of public policies that permits adjustments to performance objectives.

The scores are fairly low in both the Monitoring and Evaluation (M&E) and Planning and Budgeting pillars. The most serious weakness, as pointed out by almost all sectors, is the capacity for monitoring and evaluating public policies. The capacity to have an administration geared toward development results, as well as the integration of M&E systems into decision making, are also scored fairly low. The following dimensions were likewise scored quite low: participation of non-state actors in budget preparation, the system for measuring user satisfaction, data quality assessment, and the alignment of partners on national priorities, as well as coordination amongst them.

Improvement in the capacity to manage for results could be achieved in the short to medium term. Participants prepared a national action plan focusing on low-cost quick wins to be fully implemented within the next 36 months. If improved, these results areas could have strong multiplier and reinforcing effects on other areas, thereby improving the country's overall capacity to manage for results in the foreseeable future. For example, improving the "Information system and decision-support tools" component within the M&E pillar, scored 1.8, is critical because the government-wide capacity to coordinate the aggregation of results hinges on it.
For the Planning and Budgeting pillar, priority is given to making the budget more consistent with national priorities through the establishment of balanced scorecards in line ministries. In terms of Accountability and Partnerships, civil servants underlined the need to increase public access to results through forums held to inform citizens and stakeholders, as well as through the branding of results obtained. A focus has also been put on increasing the reliability and credibility of data through the development of standardized data quality assessment protocols. A series of measures has been identified to incentivize the delivery of public services through a better-managed civil service workforce. The accomplishment of these actions would substantially improve the results of this assessment and the capacity to manage for development results in Kenya. With inclusive planning and follow-up, these measures could be introduced in the last quarter of 2011.

The CAP-Scan in Kenya is very timely, as the Government is on the verge of fully implementing its new constitution, promulgated on 27 August 2010, with a stronger will to achieve results. The findings aim to foster delivery of the national objectives set in the Vision 2030.

Introduction

The objective of this Managing for Development Results (MfDR) Capacity Scan (CAP-Scan) in Kenya has been twofold: first, to assess the Government of Kenya's strengths and weaknesses in MfDR procedures, practices and capacity at the Government level; and second, to design a plan focusing on key actions to foster the delivery of public goods and services, as well as to contribute toward informing ongoing efforts to implement MfDR-related activities. Both the self-assessment and the development of the action plan occurred during the CAP-Scan workshop held in Nairobi, Kenya, from May 30 to June 2, at the Safari Park Hotel. The result of the assessment provides a baseline against which progress in MfDR capacity can be measured through repeat assessments.

The CAP-Scan measurement framework provides a scoring system on a scale from zero to four, based on qualitative and quantitative assessments of the following five MfDR pillars: Leadership, Monitoring and Evaluation, Planning and Budgeting, Accountability and Partnerships, and Statistics. Each unit of the scale represents a stage in MfDR capacity, ranging from Awareness to Exploration, Transition and Full Implementation of MfDR. Six to seven results components, or indicators, adapted to best fit the Kenyan political and economic context, are scored for each pillar using the CAP-Scan self-assessment methodology. See Table 1 for a list of the components by pillar.

Table 1: Components Scored by MfDR Pillars for the Capacity Assessment

1. Planning & Budgeting
1.1 Budget consistency with national priorities
1.2 Budget preparation based on objectives and results
1.3 Participation of nongovernmental actors in budget planning and preparation
1.4 Intra-departmental coordination
1.5 Inter-sectoral coordination
1.6 Results management framework
1.7 Donors link programming to results

2. Monitoring & Evaluation
2.1 National planning geared to development results
2.2 Capacity for monitoring and evaluation of public policies
2.3 Information system and decision-support tools
2.4 System for measuring user satisfaction
2.5 Administration performance geared to development results
2.6 Harmonization of information requests by Development Partners (DPs)
2.7 Integration of M&E systems into the decision-making process

3. Statistical Capacity
3.1 Statistics strategy and plan
3.2 Data disaggregation
3.3 Extent of data
3.4 Data quality assessment
3.5 Capacity for conducting and exploiting countrywide surveys
3.6 Capacity for analysis and modeling
3.7 Performance measurement

4. Leadership
4.1 Commitment
4.2 Clarity and articulation of development orientations
4.3 Participation of non-state actors
4.4 Responsibility and delegation at the level of senior officials of the Administration
4.5 Integration of the decentralization dimension
4.6 Change in management
4.7 Human resources management

5. Accountability & Partnerships
5.1 Independence of the higher audit institutions
5.2 Parliament's role in oversight of Government action
5.3 Media independence
5.4 Public access to results
5.5 Coordination among DPs
5.6 Alignment of partners on national priorities
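The pillar-and-component structure of Table 1, combined with the zero-to-four scale, maps naturally onto a nested data structure. The sketch below is purely illustrative: the component subset follows Table 1, but the scores and variable names are invented, and the assumption that a pillar score is the simple average of its component scores is the editors' reading, not a formula stated by the report.

```python
# Illustrative sketch only: a fragment of a CAP-Scan-style matrix.
# Component names follow Table 1; the scores are invented examples
# on the 0-4 scale, not actual results from the Kenya assessment.
from statistics import mean

matrix = {
    "Planning & Budgeting": {
        "Budget consistency with national priorities": 2.25,
        "Budget preparation based on objectives and results": 2.0,
        "Intra-departmental coordination": 1.75,
    },
    "Monitoring & Evaluation": {
        "Capacity for monitoring and evaluation of public policies": 1.5,
        "Information system and decision-support tools": 1.75,
    },
}

# Assumed aggregation: pillar score = mean of its component scores,
# reported to one decimal place as in the report's tables.
pillar_scores = {pillar: round(mean(components.values()), 1)
                 for pillar, components in matrix.items()}
print(pillar_scores)
```

Storing component scores per pillar this way also makes it straightforward to recompute pillar averages after a repeat assessment, which is how the report envisages the baseline being used.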

The Government of Kenya (GoK) is of the firm belief that the road to achieving its Vision 2030 is one defined by a culture of results, assured by the application of MfDR principles in the public sector.[1] In this regard, installing the requisite capacity in public institutions to deliver results effectively and efficiently is of utmost importance. This forms the basis of the CAP-Scan. Kenya became the sixth African country to undertake the CAP-Scan exercise,[2] demonstrating the Government's strong commitment to managing for development results and to achieving the desired results from the utilisation of scarce public resources.

The capacity challenges encountered by the Government in implementing past and present development strategies have necessitated a range of capacity building interventions in the country. The country has also been engaged in results-based management since 2003 to fast-track the Economic Recovery Strategy for Wealth and Employment Creation. The practices put in place in that context include, among others: results-based sectoral policies, plans and programmes; increased transparency and accountability in the public sector; improved public service delivery through the use of rapid results approaches; the use of performance contracting at the institutional level; the development of service charters; and the establishment of performance appraisal systems.

GoK is currently implementing the Medium Term Plan of the Kenya Vision 2030 for the period 2008-2012. In this context, the CAP-Scan exercise in Kenya is extremely timely, as the government is committed to delivering the expected development results and increasing its effectiveness in that regard. There is a need to equip all ministries with the appropriate capacities to make this happen. The CAP-Scan exercise was used to identify key results policy areas in public sector management that needed strengthening for effective and efficient delivery of development results in Kenya.
[1] In April 2011, the Office of the Prime Minister expressed interest to the World Bank to conduct the CAP-Scan in key sectors of the Government of Kenya. In partnership with the Swiss Agency for Development and Cooperation (SDC), the World Bank confirmed its support for the CAP-Scan for Kenya. GoK's expression of interest to conduct the CAP-Scan is reproduced in Annex 1.
[2] Mauritania, Niger, Senegal, Malawi and Sierra Leone were the first five African countries to conduct CAP-Scan. Other exercises took place in Europe and Central Asia.

I. Preparation of the Assessment

A. Facilitation of the CAP-Scan Exercise

The CAP-Scan was conducted in Kenya under the leadership of the Office of the Prime Minister, which appointed a focal point who coordinated the whole self-assessment, with support from an international consultant and the World Bank's Results Unit. Specifically, the following four types of actors played a major role in the CAP-Scan:

CAP-Scan Task Force. Made up of senior civil servants from the Office of the Prime Minister as well as members of the Kenyan Community of Practice on MfDR, this team planned the process, secured resources and identified timelines. The team reports to the Permanent Secretary in the Office of the Prime Minister. The Task Force communicated with ministries and development partners to generate national support for the exercise and to identify capacity challenges. It provided administrative support and guidance to the entire process.

Sector Groups. Ministers and Permanent Secretaries designated a group of around five representatives from their own sector to participate in the entire exercise. Participants came from a mix of senior-level technical and managerial positions.

Focal Points. Each sector appointed a Focal Point in charge of facilitating group discussions throughout the exercise, toward reaching consensus on scores, and providing conceptual guidance on capacity status in terms of service delivery.

Rapporteurs. At least one participant from each sector was responsible for noting down the capacity scores and the justification for them.

B. Adaptation of the Tool

Before the three-day exercise, the CAP-Scan Task Force, together with development partners, tailored the tool to the Kenyan institutional context. Most of the changes were minor, clarifying wording issues and ensuring that the names of institutions would be understood in Kenya.
A results component in the Planning and Budgeting pillar, focusing on the percentage of donor funding vis-à-vis the total expenditure budget of line ministries, was removed, as it was unclear how it could effectively inform the overall assessment.

C. Training of Participants

All participants were trained on the basics of MfDR and the CAP-Scan methodology to ensure effective sector group discussions during the assessment. The training also equipped them with the requisite skills to roll out the CAP-Scan tool to other sectors and local authorities, i.e. counties. It took place on May 30, 2011, following the opening ceremony, which was attended by representatives of the Government, the international development community, civil society, and sector participants. Statements were made by representatives of the Government and development partners. A brief overview of MfDR concepts and the CAP-Scan was presented, and key inputs to the adaptation of the tool were generated during the discussion session.

D. Scope of the Assessment

The assessment targeted the following ministries, covering the economic and social sectors falling within the strategic priority areas of GoK's Medium Term Plan 2008-2012 of the Vision 2030:

- Agriculture;
- Justice, National Cohesion and Constitutional Affairs;
- Public Health and Medical Services;
- Education and Higher Education, Science & Technology;
- Planning, National Development and Vision 2030;
- Prime Minister's Office;
- Public Service;
- Trade;
- Transport; and
- Tourism.

The Office of the Prime Minister was included in this assessment given its national coordinating function for all national development programs. It is important to note that the Ministry of Finance did not participate in the exercise, which may affect the scores of the assessment in the Planning and Budgeting pillar. In total, more than fifty participants took active part in the exercise.

II. Assessment of MfDR Capacity

A. Methodology

Sectors scored their capacity, and the extent to which they have put MfDR-related practices and structures in place, following the CAP-Scan measurement framework. Each sector determined whether it was currently at the:

- Awareness level, recognizing the need to implement MfDR in the sector, with no concrete action conducted to date;
- Exploration stage, taking actions but with limited means to effectively implement a given strategy;
- Transition stage, advancing with the effective use of MfDR practices; or
- Full Implementation of MfDR practices.

The definition of each stage is reproduced in Annex 2. These stages span a four-point scale divided into 0.25 increments, with zero being the start of the Awareness level and four being complete Full Implementation. Sectors provided supporting explanations for each score, based on evidence contained in official documents, reproduced in Annexes 4 and 5.

The overall national score was obtained by averaging the sector scores, rather than through an agreement by all sectors on a national score: due to time constraints, it was decided to allocate more time to designing the action plan than to reaching consensus among sectors on a national score.

The following sub-sections outline the results as they were generated during the self-assessment. They provide an overview of the results by participating public sector, present the overall national score, and give a detailed analysis by MfDR pillar.

Figure 2: Sectors Involved in Determining the National Score (Agriculture, Transport, Justice, Public Services, Health and Medical Services, Prime Minister Office, Higher Education, Tourism and Planning)
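The scoring scheme just described (four named stages on a 0-4 scale in 0.25 increments, with the national score taken as the simple average of the sector scores) can be sketched as follows. This is an illustrative reading of the methodology, not code from the CAP-Scan tool, and the function names are invented:

```python
# Illustrative sketch of the CAP-Scan scoring arithmetic described above.
# Stage boundaries follow the text (0 = start of Awareness, 4 = complete
# Full Implementation); function names are hypothetical.

STAGE_THRESHOLDS = [
    (3.0, "Full Implementation"),
    (2.0, "Transition"),
    (1.0, "Exploration"),
    (0.0, "Awareness"),
]

def stage_for(score: float) -> str:
    """Map a 0-4 capacity score to its MfDR capacity stage."""
    if not 0.0 <= score <= 4.0:
        raise ValueError("CAP-Scan scores lie on a 0-4 scale")
    for threshold, name in STAGE_THRESHOLDS:
        if score >= threshold:
            return name
    return "Awareness"

def national_score(sector_scores: dict) -> float:
    """National score = simple average of the sector scores."""
    return round(sum(sector_scores.values()) / len(sector_scores), 1)

# With the reported national average of 2.2, Kenya sits in the
# Transition stage, matching the report's reading.
print(stage_for(2.2))  # Transition
```

Note that while individual component scores move in 0.25 increments, an averaged sector or national score need not fall on such an increment, which is why the report presents averages rounded to one decimal.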

B. Results by Sector

The first step of the assessment was for each sector to self-administer the CAP-Scan. Trade, Agriculture and Justice, and Health and Medical Services ranked highest in terms of MfDR capacity scores, with 2.8, 2.5 and 2.4 respectively. These three sectors are in the transition phase in terms of applying MfDR principles. The sectors with the lowest scores are three institutions with strong coordination mandates (the Prime Minister's Office and the ministries for Public Service and Planning), followed by Higher Education and Transport. Even the lowest-scoring institution has moved more than halfway through the exploration phase toward the transition phase. The average capacity score by sector is shown in Figure 3, and the results by sector for each pillar are detailed further below.

Figure 3: Average Capacity Score by Sector (0-4 scale: Awareness, Exploration, Transition, Full Implementation phases)

Ministries of Agriculture and Justice: 2.5
Ministry of Health and Medical Services: 2.4
Ministry of Higher Education: 2.1
Office of the Prime Minister: 1.6
Ministry of State for Planning and Vision 2030: 2.1
Ministry of State for Public Services: 2.1
Ministry of Transport: (not legible in source)
Ministry of Trade: 2.8
Ministry of Tourism: 2.2

Strengths in applying MfDR strategies exist among sectors in specific areas. For instance, the results-oriented framework in ministries was ranked fairly high by all sectors except Higher Education and Tourism. This 2.5 average score stems from the government's actions to implement results-based management approaches in all sectors. A few components are scored high by only a limited number of sectors. Agriculture, Justice and Education ranked higher than others in inter-sectoral coordination for M&E. Tourism is the only sector with high capacity in terms of effective monitoring of public policies that permits adjustments to performance objectives.
The Trade and Health sectors are unique in their capacity to ensure harmonized information requests from development partners. The Agriculture sector stands out in its capacity for conducting and exploiting country-wide surveys, while the Transport sector stands out in involving Civil Society Organizations (CSOs) and the private sector in the achievement of development results. Weaknesses have also been identified in specific results areas for all sectors. The inadequate capacity to analyze statistical data for forecasting purposes was assessed by all but two sectors as a very weak component in the country's capacity to manage for development results. The capacity to have an administration geared toward development results, as well as the integration of M&E systems for decision making, are also scored fairly low. The following dimensions were also scored quite low: participation of non-state actors in budget preparation, the system for measuring user satisfaction, data quality assessment, the alignment of partners on national priorities, and coordination amongst development partners. Some sectors also have weaknesses in their MfDR capacity that are unique to themselves. This is the case for Higher Education in terms of public access to results and for Tourism in terms of change in management.

Table 2: Capacity Scores by Sector for Each Pillar (— = value not legible in the source)

Sector | Planning and Budgeting | Monitoring and Evaluation | Statistics | Leadership | Accountability and Partnerships | Sector Average
Agriculture and Justice | 2.9 | 1.4 | 2.4 | 3.0 | 2.8 | 2.5
Health and Medical Services | 1.8 | 2.3 | 2.6 | 2.5 | 2.8 | 2.4
Higher Education | 2.2 | 1.6 | 2.3 | 2.3 | — | 2.1
Planning | 2.2 | 2.2 | — | — | — | 2.1
Prime Minister | 1.7 | 0.8 | 1.9 | 1.9 | 1.9 | 1.6
Public Service | 1.6 | 2.5 | 2.3 | 2.1 | — | 2.1
Tourism | 1.9 | 1.8 | 2.2 | 2.2 | 2.8 | 2.2
Trade | 2.5 | 2.9 | 2.7 | 2.7 | 3.0 | 2.8
Transport | 1.4 | 1.6 | 2.3 | 2.6 | — | —
National Average by Pillar | 2.1 | 1.7 | 2.2 | 2.4 | 2.5 | 2.2

C. Results by MfDR Pillar

Kenya was rated on average to have entered the transition phase in its MfDR capacity and implementation of MfDR-related strategies, with an average score of 2.2 on a 4-point scale, as detailed in Figure 4. The country's main strength, according to the results, lies in the implementation of reforms in the accountability and partnerships area, as well as in its capacity for leadership for results, with capacity scores of 2.5 and 2.4 respectively. All pillars except the one focusing on M&E are in the transition phase. The country scored lowest in the monitoring and evaluation pillar, with an average capacity score of 1.7, in the exploration phase. The following sub-sections explain the government-wide scores for each MfDR pillar and are structured in the same order in which the exercise was conducted.
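The aggregate scores reported above follow directly from the component values: a sector's average is the mean of its five pillar scores, rounded to one decimal place, and the national pillar averages are computed the same way across sectors. A minimal sketch of the calculation, using rows copied from Table 2 (sectors with illegible values omitted):

```python
# Sector average = mean of the five pillar scores (Planning and Budgeting,
# M&E, Statistics, Leadership, Accountability and Partnerships),
# rounded to one decimal place, as reported in Table 2.
pillar_scores = {
    "Agriculture and Justice":     [2.9, 1.4, 2.4, 3.0, 2.8],
    "Health and Medical Services": [1.8, 2.3, 2.6, 2.5, 2.8],
    "Prime Minister":              [1.7, 0.8, 1.9, 1.9, 1.9],
    "Tourism":                     [1.9, 1.8, 2.2, 2.2, 2.8],
    "Trade":                       [2.5, 2.9, 2.7, 2.7, 3.0],
}

def sector_average(scores):
    """Mean of the pillar scores, rounded to one decimal place."""
    return round(sum(scores) / len(scores), 1)

for sector, scores in pillar_scores.items():
    print(f"{sector}: {sector_average(scores)}")
# Trade: (2.5 + 2.9 + 2.7 + 2.7 + 3.0) / 5 = 2.76, reported as 2.8
```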
Detailed scores and justifications for each component by sector are reproduced in annexes 4 and 5.

Figure 4: Results at National Level by MfDR Components (scores on a 0.0-4.0 scale; — = value not legible in the source)

1. Planning and Budgeting
- Budget consistency with national priorities: 2.2
- Budget preparation based on objectives and results: —
- Participation of non-governmental actors in budget planning: 1.6
- Intra-departmental coordination: —
- Inter-sectoral coordination: 2.3
- Results management framework: 2.5
- Donors link programming to results: 2.3

2. Monitoring and Evaluation
- National planning geared to development results: —
- Capacity for monitoring and evaluation of public policies: 1.6
- Information system and decision-support tools: 1.8
- System for measuring user satisfaction: 2.1
- Administration performance geared to development results: 1.8
- Harmonization of information requests by Development Partners: 1.1
- Integration of M&E systems for decision making process: 1.8

3. Statistics
- Statistics strategy and plan: 1.8
- Data disaggregation: 2.6
- Extent of data: 2.3
- Data quality assessment: 2.1
- Capacity for conducting and exploiting country-wide surveys: 2.4
- Capacity for analysis and modeling: 1.9
- Performance measurement: 2.5

4. Leadership
- Commitment: 2.4
- Clarity and articulation of development orientations: 2.6
- Participation of non-state actors: 2.3
- Responsibility and delegation at the level of senior officials: 2.6
- Integration of the decentralization dimension: 2.4
- Change in management: 2.1
- Human resources management: 2.1

5. Accountability and Partnerships
- Independence of the higher Audit institutions: 3.1
- Parliament's role in oversight of Government action: 3.2
- Media independence: 2.8
- Public access to results: 2.1
- Coordination among DPs: —
- Alignment of partners on national priorities: 1.8

1. Planning and Budgeting: Scored 2.1

Figure 5: Government Score in Planning and Budgeting (component scores are listed in the sub-sections below)

1.1. Budget Consistency with National Priorities: 2.2

The extent to which the budget is organized around the 2008-2012 Kenyan Medium Term Plan, with funds allocated according to its priorities, is critical to achieving the desired results. Budget allocations are consistent with national priorities in only a limited number of sectors, however. Though the budgets of the ministries of Agriculture and Higher Education are structured along the sub-programs of their respective strategies, others, such as the Health sector, do not have their budget organized around key areas. Likewise, the budget of the Office of the Prime Minister does not demonstrate public service transformation as a priority area, and funds are often redirected to other activities. Underfunding, cost overruns, disasters and emergencies have been mentioned as causes for diverting funds from national priorities. This indicates that the Medium-Term Expenditure Framework (MTEF) is not fully functioning in Kenya, and that risk assessment and management are weak.

1.2. Budget Preparation Based on Objectives and Results

Annual Public Expenditure Reviews (PER) are conducted to assess whether targets have been met and funds adequately utilized, but it is unclear to what extent the information provided by the PERs informs the budget preparation so that resources are allocated based on each department's results and objectives. Likewise, a performance contracting mechanism is in place in most ministries, without a direct link to budget decisions.
This therefore indicates a weak linkage between planning, budgeting and results.

1.3. Participation of non-governmental actors in budget planning and preparation: 1.6

Citizens' input is collected during the planning and implementation processes, but not as part of the budget preparation. The agriculture and public service ministries are the only ones with budgetary hearings. In the agriculture sector, those are held in February, before the beginning of the next fiscal year in July, to give the public the opportunity to provide input on budget priorities. The hearings are advertised in newspapers. Parliamentary committees also play an important role in finalizing the budget. In other sectors, CSOs are involved in the planning process, for example through the health sector coordination committee. The participation of citizens and CSOs in fora informs the budget preparation only to a limited extent.

1.4. Intra-Departmental Coordination

Joint planning, budgeting and measurement of results span organizational boundaries for better effectiveness. Most sectors are cascading down results management approaches only to a limited extent, hindering coordination among departments. On the public financial management side, the preparation of the budget in the tourism sector, as in other sectors, is first done at the division level and then consolidated at the central level within the ministry. Others, such as the Ministry for Planning or the Office of the Prime Minister, noted that budget allocations are done in isolation, with limited consultation and input from divisions. Performance contract targets to be achieved by each department are determined once the budget for each department is agreed upon.

1.5. Inter-Sectoral Coordination: 2.3

Though coordination across sectors is recognized as instrumental to planning and implementing policies for the delivery of common outcomes, this effort has remained too weak in most sectors to inform the budget preparation. Only limited ad hoc inter-sectoral consultations occur, as is the case in the health sector. No guideline or M&E system on coordination exists at the moment.
The coordination bodies, such as the Ministry of Planning or the Office of the Prime Minister, are not involved in the budget preparation across sectors.

1.6. Results Management Framework: 2.5

Since 2003, the government has been moving toward using results-based practices as part of day-to-day public sector management. Ministries use results-based management tools such as performance contracting, the rapid results initiative and the performance assessment system. In the health sector, most managers use logical frameworks to achieve the desired results. However, some units in the agriculture sector, the wildlife service and the public service commission are not applying those practices to their day-to-day work. Capacity needs to be built for civil servants to clearly understand the results chain and move from a focus on outputs to a focus on outcomes and impacts.

1.7. Donors Link Programming to Results: 2.3

Donor funding is tied to the objectives and plans of most ministries, but not necessarily based on previous performance. Joint statements of intent and credit agreements based on the Kenyan medium-term plan are defined between development partners and the Government. However, some development partners set unrealistic targets, expecting to achieve impacts in only a year.

2. Monitoring and Evaluation: Scored 1.7

Figure 6: Government Score in Monitoring and Evaluation (component scores are listed in the sub-sections below)

2.1. National planning geared to development results

The Planning ministry is in charge of measuring progress against all aspects of the Kenyan medium term plan through the National Integrated Monitoring and Evaluation System (NIMES). Sectors are bound to report against NIMES indicators. Other systems are used to inform the Annual Progress Review report. However, the use of those reports for decision making is very limited in ministries and there is no Government-wide M&E policy. Furthermore, the reliability and quality of data are also questioned. In the health sector, performance data is collected biannually, whereas in the education sector it is collected quarterly. The transport sector ranked lowest in this results component: although good policies, such as traffic rules, are formulated, the progress made is not adequately monitored. Notably, there are weak linkages between sector and institutional M&E systems and the NIMES. Reports are published, but there is little to no use of the performance information.

2.2. Capacity for monitoring and evaluation of public policies: 1.6

Ministries have inadequate capacities to monitor and evaluate policies. Insufficient leadership has been provided by the Ministry of Planning to guide sectors and decentralized entities on M&E. Training on this topic is also not based on a capacity assessment, nor geared toward filling specific gaps.
The health sector has built the appropriate level of M&E capacity in its units related to HIV/AIDS, but has not put the same effort into other public health areas. The hiring of consultants to conduct the technical work is a common practice. The very weak capacity for M&E at the local level is becoming even starker as the capacity at the sector level grows.

2.3. Information system and decision-support tools: 1.8

The Government has put in place computerized information systems throughout its sectors. The Electronic Projects Monitoring System (E-ProMIS) is used for data acquisition, storage and analysis in different sectors. It tracks aid flows and makes them accessible to the public. However, only around 1,000 projects, such as ones from the Health and Education sectors, a fraction of the total number of Government of Kenya projects, are included in E-ProMIS. The Ministry of Finance does not insist on having projects consistently loaded into that system. The Integrated Financial Management Information System (IFMIS), the Government Human Resource Information System (GHRIS) and KenInfo are some other information systems in use, but they still operate as stand-alone systems. The tourism sector is still in the process of developing its information system. In the transport sector, only 50% of civil servants have access to computers, and only 20% of those know how to use them effectively, resulting in limited use of computer-based M&E systems. The NIMES is still manual, and hence limited in capturing real-time data. This is consistent with a broader Government-wide trend of low uptake of information systems.

2.4. System for measuring user satisfaction: 2.1

Assessing user satisfaction has been consistently practiced by Kenyan public services for almost a decade. The following tools are used to that effect: suggestion boxes, public complaints offices, hotlines and service delivery surveys. Service charters are posted at the entrance of public buildings, so that citizens know what services to expect and can react if they do not receive the announced services. The education sector even has client satisfaction as one of the indicators in its performance contract, and in the health sector complaints are examined by a hospital management committee.
The knowledge coming from user satisfaction measurement systems informs policies and processes only to a limited extent.

2.5. Administration performance geared to development results: 1.8

The Kenyan civil service is modernizing its processes using results-based management tools, though without a fully comprehensive and coherent framework. In each ministry, the monitoring unit is in charge of coordinating the performance contract signed with the Government. Signatories to the performance contracts are the Permanent Secretaries and Accounting Officers on the side of the ministry involved, and the head of the Public Service department and the Cabinet Secretary on the side of the Government. Performance data are measured at the individual, departmental and institutional levels. They are used by the ministries to provide rewards and sanctions, as well as to identify training needs. Each civil servant is bound by his or her performance appraisal agreed upon with management.

2.6. Harmonization of information requests by Development Partners (DP): 1.1

This component scored lowest in the M&E pillar, as the Government is still responding to diverse donor reporting requirements, which consumes the time of its civil servants. There is currently no government-wide policy in place for harmonized donor reporting. The health and trade sectors are the only sectors moving toward harmonization of information requests. In the health sector, this is due to the existing sector-wide approach, which brings together all development partners.

2.7. Integration of M&E systems for decision making process: 1.8

M&E information systems exist, but are used independently from each other across sectors. The NIMES, which is coordinated by the Ministry of Planning, collects a specific set of indicators from each sector to report on the progress made in implementing the medium term plan. The use of the data generated through NIMES for decision-making purposes appears to be limited, even though the data is widely shared.

3. Statistical Capacity: Scored 2.2

Figure 7: Government Score in Statistics (component scores are listed in the sub-sections below)

3.1. Statistics strategy and plan: 1.8

The 2008-2012 Strategic Plan for the national development of statistics is in place, under the coordination of the Kenya National Bureau of Statistics (KNBS), and applies to all sectors. KNBS is in charge of managing and coordinating the national statistical system to enhance statistical production and utilization. It is a semi-autonomous government agency within the Ministry of Planning, established in 2006, which is mandated to plan all official statistical programs. Data are collected from central planning units in ministries and feed into a national information system.

3.2. Data disaggregation: 2.6

Ministries are increasingly disaggregating their data, especially by gender and age. This is evident in the case of the Ministry of Agriculture, which has set up a gender mainstreaming unit to spearhead the production of gender-disaggregated data. Due to the Kenyan social and geographic context, data are also disaggregated by region and ethnicity. This information is used to recruit civil servants, as indicated by the Ministry of State for Public Service. Likewise, the Ministry of Tourism is using the collected disaggregated data to change its strategy, moving from a focus on beach and wildlife attractions to one on health, sports and eco-tourism.

3.3. Extent of data: 2.3

Most, but not all, sectors acknowledge that the set of data available is broad enough to measure most indicators related to national priorities, but little use is made of the information collected. This is the case for maternal health in the health sector and enrollment in tertiary institutions in the education sector.
The Trade ministry notes that areas of its plan, especially for Small and Medium Enterprises as well as the informal sector, are missing important data. At the local level, there is very little data available on the results of Constituency Development Fund projects.

3.4. Data quality assessment: 2.1

Data quality assessment mechanisms exist in the Government. However, due to lack of capacity, officers do not comply with the agreed standards, and mechanisms are of varying quality across the government. Ministries use different approaches: some sectors, such as the Ministry of Agriculture, follow advanced protocols, while others do not comply with any specific type of assessment at all. In the health sector, capacity building initiatives took place to foster the use of internationally recognized standards, protocols and ethics.

3.5. Capacity for conducting and exploiting country-wide surveys: 2.4

The KNBS is responsible for conducting country-wide surveys and has the capacity to do so, both in terms of manpower and technical ability. However, some surveys are not conducted regularly and lack political support. The Agriculture sector has not conducted an all-encompassing country-wide survey for a long time, though one is currently being prepared to define a baseline.

3.6. Capacity for analysis and modeling: 1.9

Manpower has been trained in some ministries, such as Agriculture and Education, to conduct effective data analysis and modeling. The World Bank Statistical Capacity Building (STATCAP) initiative is used in Kenya to increase capacity for analyzing and modeling data. Fourteen sectors have been supported by STATCAP, which has led to increased data production and use in most of them. Some surveys require researchers and statisticians to help analyze the data.

3.7. Performance Measurement: 2.5

A performance contracting tool is used throughout the government, across all sectors, at both the Permanent Secretary and senior manager levels.
However, adjustments to policies are not necessarily made based on the performance achieved, and this work is often perceived more as a requirement than as a tool to improve management effectiveness. The Ministry of Planning points out that one of the challenges faced is that some ministries set low targets and then surpass them in order to be ranked high and be rewarded.

4. Leadership: Scored 2.4

Figure 8: Government Score in Leadership (component scores are listed in the sub-sections below)

4.1. Commitment: 2.4

Top management is strongly committed to implementing and using results-based approaches. Interest in results-based management has been growing since 2003, when the Government first established policies on the matter. The launch of the Kenyan Community of Practice on MfDR in 2010, under the leadership of the Ministry of Planning, is another example of the government's commitment to achieving results. Systematic adoption of MfDR nonetheless remains challenging. Furthermore, leadership from the Ministry of Planning, as well as support from the Ministry of Finance and the Office of the Prime Minister, is inconsistent. Results reports are not always produced in a timely manner, nor are they effectively made accessible to the public or used as part of the decision making process.

4.2. Clarity and articulation of development orientations: 2.6

Both the Vision 2030 and its Medium Term Plan (MTP) form a clear map for the development of Kenya. The MTP clearly sets out targets, indicators and outputs to be achieved over its timeframe. A Vision 2030 Delivery Secretariat engages implementing ministries on the progress made. Sectors also have their own plans, setting out goals to achieve, in line with the Vision 2030. While there is a deliberate effort to align institutional and sector plans to national priorities, the measurement of results is still output based; hence the need to measure the performance of Ministries, Departments and Agencies based on outcomes.

4.3. Participation of non-state actors: 2.3

The new Kenyan constitution recognizes that public consultation is necessary in policy setting and planning, giving the public more leeway to participate in Government processes. Civic dialogue and ministerial stakeholder forums are organized. Specifically, the ministries of Transport, Agriculture and Health are almost at the full implementation stage, reflecting the fact that those sectors consistently involve stakeholders in policy formulation and implementation. The knowledge of CSOs on a specific topic is often limited, however, and first requires sensitization on technical issues before they can become full partners in the achievement of results.

4.4. Responsibility and delegation at the level of senior officials of the Administration: 2.6

The practice of delegation and accountability is embedded in the civil service, which clearly details roles and responsibilities for senior and middle level managers. There are clear job descriptions with key result areas for individual employees. Departmental and individual work plans are also prepared and derive from strategic plans. However, resistance exists in some ministries, where some civil servants are very rarely promoted. Incentives are also provided by improving the working environment of civil servants. Nonetheless, Ministries, Departments and Agencies are not fully empowered to manage for results: central government agencies such as the Treasury and the Ministries of State for Public Service and Planning have not fully delegated management and resource utilization to authorised officers.

4.5. Integration of the decentralization dimension: 2.4

Policy plans of some sectors highlight how the national and local levels will interact, though the sharing of responsibilities between those levels is not entirely delineated. The Health sector operates at all devolved levels, from the province level down to specific locations, with each level preparing strategic plans. In the Education sector, local institutions also participate in the planning process, but are left out when it comes to making decisions on budget allocations.
Partnerships with District Information and Documentation Centres have been established to bridge the information gap and ensure that local civil servants are well informed about public policies. More civil servants have been deployed at the devolved level in line with the new constitution.

4.6. Change in Management: 2.1

The 2005 Recruitment and Training Policy guides capacity building in the public service. There is an emphasis on performance-related courses, such as those focusing on leadership and change management. Even though mandatory training requirements are in place, only a limited number of civil servants have access to training due to inadequate funding. The coaching and mentoring aspects of capacity building remain overlooked.

4.7. Human Resources Management: 2.1

The 2005 Recruitment and Training Policy underlines the importance of inducting new civil servants and newly promoted officers, though the emphasis is not clearly put on managing for development results. The individual performance data collected through assessments is used as part of career progression. Individual performance is linked to departmental objectives and targets, as well as to the ministerial performance contract. This merit-based system is used for promotion and to provide other incentives. There is a general feeling among employees that they are underutilized and unappreciated, hence the need for people-friendly policies.

5. Accountability and Partnerships: Scored 2.5

Figure 9: Government Score in Accountability and Partnerships (component scores are listed in the sub-sections below)

5.1. Independence of the higher Audit institutions: 3.1

The Kenya National Audit Office's (KENAO) independence has been strengthened by the new constitution. Reports are made independently, after discussions with the audited persons. However, follow-up of audit findings is inconsistent, due to personal interests as well as a lack of integrity and accountability. Line ministries also have officers in charge of internal audit but, unlike KENAO's staff, they do not have tenure, which may ultimately compromise their independence. The Judiciary's independence is gradually being improved.

5.2. Parliament's role in oversight of Government action: 3.2

The legislature has the structure and resources to oversee government activities. It approves the budget, with parliamentary committees in charge of scrutinizing government expenditures. The newly formed budget office in parliament is also supposed to ensure that the budget is aligned with government policies, which is a move toward more accountability.

5.3. Media independence: 2.8

Media outlets are largely independent. Both public and private media have criticized public authorities. The media is seen as a tool to fight corruption and promote healthy living. The coverage of public policy issues has been quite substantial.

5.4. Public access to results: 2.1

The new constitution provides in its article 35 that citizens are entitled to request Government information and that the Government is bound to provide the sought information. Information on results is shared through the many Government websites, as well as the new Kenya Open Data platform. Before this platform, access to information was below expectations, with outdated data sets available online, such as on the KenInfo platform.

5.5. Coordination among DPs

There is no formal government policy on development partner coordination, though development partners have a Joint Assistance Strategy and meet regularly with the Government of Kenya in the Aid Effectiveness Group. The Ministry of Finance has a Department of External Resources, but coordination is weak. Groups on thematic areas, such as Agriculture or Democratic Governance, convene all development partners working on a sectoral topic. The Health sector has put in place a sector-wide approach involving several development partners in an agreed work program. By contrast, the Education sector works with each development partner separately.

5.6. Alignment of partners on national priorities: 1.8

The alignment of development partners with Kenyan priorities exists in some sectors. However, some development partners still provide support only in specific development areas determined by their own government's agenda.

III. Government-Wide MfDR Action Plan

A. Methodology

The CAP-Scan action plan is designed for the entire Kenyan public sector and includes all participating sectors. It is expected that this action plan will feed into the current Vision 2030 Medium Term Plan for 2008-2012. The process to design the CAP-Scan Action Plan was twofold. First, the participants were divided into five groups, each assigned to work on one MfDR pillar. Second, a plenary session was organized to discuss the findings from each group and benefit from the inputs of all represented institutions, resulting in an action plan fully developed by the participating government officials. The components addressed in the action plan were identified based on the results of the self-assessment and on the priority areas for MfDR in the Kenyan public sector. The plan focuses only on new and results-focused activities, as opposed to already planned activities. Groups used a framework to provide information on the MfDR components targeted in the action plan, stating the objective to achieve, the score or baseline obtained during the assessment, the target score to attain, the time required and the responsible institutions. In most cases, targets were discussed in terms of moving from one MfDR stage to another, following the CAP-Scan measurement framework. The time needed to complete the activities spans from one to three years. Progress made toward reaching the targets could be measured through repeat CAP-Scan exercises. In the action plan presented below, each action is numbered in line with the numbering of MfDR components in the CAP-Scan measurement framework. These dimensions, if improved, could have great multiplier and reinforcing effects on other areas, thereby improving the overall country capacity to manage for results in the foreseeable future. Table 3, reproduced on the following pages, presents the national action plan.

B. Action Plan by MfDR Pillar

Table 3: National MfDR Action Plan

1. Planning and Budgeting

1.1 Budget consistency with national priorities
Activities: 1. Advocate for the alignment of strategic plans and the budgeting process through meetings and forums. 2. Development of balanced scorecards in line ministries.
Objective: Increased budgeting in line with Vision 2030, MTP priorities and the MTEF.
Baseline score: 2.2 | Target score: 3.25 | Timeframe: 2 years
Responsible: Ministries of Planning and Finance as lead, with line ministries participating.

1.2 Budget preparation based on objectives and results
Activities: 1. Ministries to be trained in results-based budgeting techniques and the government budgeting cycle. 2. Implement RBM budgeting guidelines.
Objective: Budgeting for results is spearheaded.
Baseline score: 2 | Target score: 3 | Timeframe: 2 years
Responsible: Ministries of Planning and Finance; Central Planning Units within ministries.

1.4 Intra-departmental coordination
Activities: 1. Linking strategic plans at departmental and ministerial levels with the Vision 2030. 2. Defined synergetic coordination framework through a participatory process.
Objective: Intra-departmental synergy created.
Baseline score: 2 | Target score: 2.75 | Timeframe: 2 years
Responsible: Ministries of Planning and Finance as lead, with the participation of line ministries.

1.5 Inter-sectoral coordination
Activities: 1. Lobby for increased information sharing.
Objective: Strengthened inter-sectoral linkages.
Baseline score: 2.5 | Target score: 3
Responsible: Ministries of Planning and Finance as lead, with line ministries participating.

2. Monitoring and Evaluation

2.1 National planning geared to development results
Activities: 1. Integrate performance data into national policies and plans for results.
Objective: Use of performance data demonstrated.
Baseline score: 2 | Target score: 3.2 | Estimated completion: Jun-14
Responsible: MoPNDV2030; OPM.

2.2 Capacity for monitoring and evaluation of public policies
Activities: 1. Collect and analyze reliable data from the source. 2. Increased and better trained M&E staff.
Objective: Strengthened M&E capacity in MDAs.
Baseline score: 1.6 | Target score: 3 | Estimated completion: Jun-14
Responsible: MoPNDV2030; OPM.

2.3 Information system and decision-support tools
Activities: 1. Build a sustainable M&E infrastructure (IT-driven tools and mechanisms).
Objective: More effective tracking of progress.
Baseline score: 1.8 | Target score: 3.2 | Estimated completion: Jun-13
Responsible: MoF; MoPNDV2030.

2.4 System for measuring user satisfaction
Activities: 1. Findings of customer satisfaction surveys implemented. 2. Harmonized tools developed and results disseminated to relevant stakeholders.
Objective: User satisfaction measured in a standardized manner for better response to citizens' needs.
Baseline score: 2.1 | Target score: 3.5 | Estimated completion: Jun-12
Responsible: OPM.

2.5 Administration performance geared to development results
Activities: 1. M&E tools are integrated. 2. National integrated performance framework from departmental to national levels.
Objective: Increased focus on achieving results.
Baseline score: 1.8 | Target score: 3 | Estimated completion: Jun-12
Responsible: OPM.

2.6 Harmonization of information requests by development partners
Activities: 1. Save time on reporting to donors through the set-up of a systematic harmonized system.
Objective: Donor reporting is harmonized, decreasing time spent on reporting
Baseline score: 1.1 | Target score: 3.5 | Estimated completion: Jun-12
Responsible: OPM; MoF; MoPNDV2030

2.7 Integration of M&E systems into the decision-making process
Activities: 1. Link all M&E initiatives and integrate them into NIMES. 2. Develop an automated NIMES.
Objective: Increased access to and use of Government-wide progress information
Baseline score: 1.8 | Target score: 3.2 | Estimated completion: Jun-13
Responsible: MoPNDV2030; OPM; MoF

Pillar 3: Statistics

3.1 Statistics Strategy and Plan
Activities: 1. Develop a national strategy for statistics and a strategic plan.
Objective: Strengthened Government-wide statistical effectiveness
Baseline score: 1.8 | Target score: 3.5 | Estimated completion: 1 year
Responsible: Ministry of Planning & Vision 2030, in conjunction with the Kenya National Bureau of Statistics (KNBS)

3.4 Data Quality Assessment
Activities: 1. Develop standardized data quality assessment protocols.
Objective: Increased reliability and credibility of data
Baseline score: 2.1 | Target score: 4 | Estimated completion: 1 year
Responsible: Kenya National Bureau of Statistics (KNBS)

3.5 Capacity for conducting country-wide surveys
Activities: 1. Conduct 10 sector-specific surveys in 10 counties.
Objective: More comprehensive and representative country-wide data
Baseline score: 2.4 | Target score: 4 | Estimated completion: 2 years
Responsible: Kenya National Bureau of Statistics (KNBS)

3.6 Capacity for analysis and modeling
Activities: 1. Build well-equipped and facilitated personnel in counties and line ministries at the local level.
Objective: Staffed statistical units in sector ministries
Baseline score: 1.9 | Target score: 3 | Estimated completion: 4 years
Responsible: Ministries of Planning, Finance, and Public Service

Pillar 4: Leadership

4.1 Commitment
Activities: 1. Sensitize senior managers from line ministries to MfDR and provide training. 2. Incentivize performance management through a set of instruments (delegation, monetary, recognition) to build commitment to managing for results.
Objective: Improved staff capacity to manage for results
Baseline score: 2.4 | Target score: 3.5 | Estimated completion: 3 years
Responsible: National Assembly, line ministries, and county assemblies

4.3 Participation of Non-State Actors (NSAs)
Activities: 1. Develop a policy framework on the involvement of NSAs in public affairs. 2. Sensitize NSAs to increase ownership of projects and programs. 3. Reinforce NSA capacity in MfDR.
Objective: Increased contribution of NSAs in public affairs
Baseline score: 2.3 | Target score: 3 | Estimated completion: 3 years (the transition to the new constitution will determine how fast this can be implemented)
Responsible: National and County Assemblies and line ministries

4.6 Change management
Activities: 1. Strengthen and mainstream the implementation of results-based management tools, such as the rapid results initiative (RRI) and performance contracting. 2. Review results-based management courses and curricula.
Objective: Better trained managers for change
Baseline score: 2.1 | Target score: 2.5 | Estimated completion: 1 year to review the framework
Responsible: National and County Assemblies and line ministries

4.7 Human resources management
Activities: 1. Review the capacity building framework in line with the Constitution and Vision 2030. 2. Undertake capacity building activities on MfDR targeting all levels. 3. Empower managers with a new management style (delegation, executive coaching; top leaders are assigned to specific coaches). 4. Implement promotion schemes to increase mobility across sectors. 6. Transformative leadership for high-potential leaders.
Objective: Strengthened workforce committed to implementing the MfDR approach
Baseline score: 2.1 | Target score: 3 | Estimated completion: 2 years
Responsible: Ministry of State for Public Service

Pillar 5: Accountability and Partnerships

5.1 Independence of the higher audit institutions
Activities: 1. Ensure strict implementation of audit recommendations.
Baseline score: 3 | Target score: 3.75 | Estimated completion: 1 year
Responsible: Line ministries and DPM

5.2 Parliament's role in oversight of Government action
Activities: 1. Sensitize the electorate to the role of MPs and to issue-based politics, to ensure MPs are elected on the basis of their commitment to playing their role as public watchdog.
Objective: Increased awareness among citizens of the role of Members of Parliament
Baseline score: 3.25 | Target score: 3.75 | Estimated completion: Continuous
Responsible: Ministry of Justice and Legal Affairs / civil society

5.3 Media independence
Activities: 1. Increase the percentage of media coverage allocated to development issues.
Objective: More diverse media coverage
Baseline score: 2.75 | Target score: 3.5 | Estimated completion: Jun-12
Responsible: Ministry of Information and Public Communication

5.4 Public access to results
Activities: 1. Define a policy on access to information. 2. Strengthen district information and documentation centers. 3. Hold forums to inform the public. 4. Brand development projects.
Objective: Increased awareness and understanding of development results
Baseline score: 2 | Target score: 3 | Estimated completion: Continuous
Responsible: Ministry of Information and Public Communication / MNP&D

5.5 Coordination among development partners & 5.6 Alignment of development partners to national priorities
Activities: 1. Establish a policy on donor funding. 2. Strengthen the focal point for external resources to ensure harmonized use of donor funding.
Objective: Streamlined donors' contributions to national priorities
Baseline score: 1.75 | Target score: 3 | Estimated completion: Continuous
Responsible: MoF, MNP&D, OPM

IV. Evaluation of the CAP-Scan Workshop

An evaluation of the CAP-Scan workshop was conducted at its close. 70% of participants found the assessment useful or very useful to the Government of Kenya, and almost 90% now have a clear idea of Kenya's needs in MfDR. Almost 80% of the 22 respondents to the questionnaire circulated at the end of the workshop believe the CAP-Scan will result in improvements in Kenya's ability to manage for development results, and half of the respondents indicated that their sector will move further towards results-based planning and development. Almost 90% of participants are confident that they could repeat the CAP-Scan without the assistance of an outside facilitator, using this exercise as a baseline for future iterations. The following chart provides a detailed overview of the evaluation results.

Figure 10: CAP-Scan Workshop Evaluation
[Chart: share of responses from "Very useful" to "Not useful at all" (0-100%) for each question]

The Exercise:
- How useful do you find the CAP-Scan tool?
- How useful to you was the assessment of your government's ability to manage for development results (MfDR)?
- How clearly do you now understand your government's needs to be able to manage for development results?
- How confident are you that using the CAP-Scan will result in improvements in MfDR in the country?
- How likely is it that your Sector/Organization will move towards results-based planning and development?
- How confident are you in your ability to repeat the CAP-Scan assessment without the assistance of an outside facilitator?

The Facilitator(s):
- How do you rate the facilitator's ability to explain and communicate clearly?
- How do you rate the facilitator's knowledge of MfDR?
- How do you rate the facilitator's facilitation technique and skills?

Annexes

Annex 1: Expression of Interest from the Government of Kenya

