On Monday morning, I got a call from one of the radio stations asking me to join a discussion about the recent report, which appeared in the Weekend Post newspaper of 16th June 2012, on the performance of our ministries. The report centres on how most ministries seem not to be performing well, and on the controversy around some top-performing ministries whose results conflict with what public perception may hold about them. I have decided to continue my contribution on that piece here so that I can clarify a few issues around the matter, especially those that I might not have touched on, or glossed over, on account of time constraints.
Firstly, I would like to repeat that I am going to assume that the measurement tool that was/is being used to measure the performance of ministries has specified aspects, values and targets that have a direct relationship to the different ministries’ official mandates and formal functional responsibilities. I am saying this because any performance, no matter how good, that is unrelated to the core business of a ministry would mislead the public and anyone else trying to determine the relevance of that ministry to its core mandate. This is not to suggest that secondary or support services are not important, but simply to argue that it would be an anomaly to have excellent discharge of support services which does not translate into productive, high performance of the core business. It is for this reason that I am hoping that the performance measurement tool, as used on ministries in the public service, clearly differentiates between the measurement of these separate functions in ministries and their units.
Secondly, I am going to assume that the tool contains more generic measurables which could apply equally across ministries, and also those key aspects that can distinguish one particular ministry from the next, on account not only of differences in general mandate, but more importantly on the basis of more specialised functional responsibilities. This is critical because some services delivered by government are purely expert-defined, and the parameters of their details could be solely the experts’ responsibility, while others, especially social services, could require citizens to be the main players in defining patterns of needs and sometimes even the details of what such services should actually entail. These two dimensions require careful assessment as to where to draw the line between these types of services and when tools of measurement are appropriate or not. In the larger scheme of things, any results about the performance of a ministry must clearly show this type of relationship and the value attached to these different types of services. In our local setup, we do have ministries that combine these types of services, and therefore a simple generic tool applied equally across a ministry’s departments and units may not adequately measure what we think we are measuring.
Thirdly, while performance measurement tools are usually very technical and supposedly objective, it would be interesting to correlate the results about the various ministries’ performance against the more subjective citizen/public perceptions of the same ministries. It would be interesting, for instance, to see whether the public agrees that ministries such as that of Health and that of Infrastructure, Science and Technology should be among the top performers, as the tool’s results suggest. I am simply bringing this up as a way of saying that the tool we use will need to reconcile the objective aspects with the subjective ones. As is the norm, the objective details of the tool speak very well to issues of efficiency and economic use of resources as seen by the expert public officials, but the larger part of the subjective aspects would speak more to issues of effectiveness and synergy with citizen-espoused values and relevance. The latter part should, in my view, reflect the public’s understanding of not only the meaning and impact of these results but, more importantly, whether the citizen sees a clear and conscious correlation between what is measured and what he/she should be getting as a service from a particular ministry.
The above is where we ought to focus on the results in terms of what a given percentage (say 70 percent) actually translates into in taking a ministry closer to achieving its set goals. It also means being able to ascertain whether that percentage in the various measurables (such as the capacity of the ministry, the level of training of necessary human resources, the structural arrangement of its organs, the clarity of policies and roles, and the communication setups and relational linkages necessary for coordination, among other things) reflects a ministry’s efficient and effective use of the resources at its disposal. In brief, how do these results for each ministry speak to the elements of the various ministries’ strategies as reflected in their chapters in the National Development Plans, annual operational plans, and ministerial visions and missions, as all of these must, at all times, guide the performance of ministries?
Lastly, I wish to reflect on the effect of these results on the human resources tasked with the responsibility of driving ministerial performance. The newspaper report mentions lack of supervision as one of the key reasons for non-performance by ministries, and the big question is: where exactly is this gap? The setup of our ministries as we know them has different levels at which supervision ought to take place. The minister and his assistants (for those ministries that do have assistant minister(s)) take responsibility for political supervision of their ministries, and the permanent secretaries and their deputies take responsibility for the professional oversight of all the various specialisations in their ministries, whereas below this level are various levels of directors, divisional heads, and managers of all sorts. All of these have supervision as a key element of their daily functions.
I am wondering whether the report has singled out exactly where lack of supervision is an issue, and what sanctions apply for those who are identified as the reason ministries are not performing as expected. It is all well and good to spell out the reasons for non-performance, but equally important is what we do to ensure that we establish the right path towards higher levels of performance in the future. In brief, any level of poor or good performance by our ministries speaks to the efficiency and effectiveness of service delivery by the public service.

