EXECUTIVE SUMMARY

1. INTRODUCTION

A trend that is beginning to emerge in performance management is the linking of institutional objectives with performance appraisal. This not only promotes organizational effectiveness but also ensures constant monitoring of individual performance. In this way areas that need development are identified with a view to accelerating performance, thus contributing to the improvement of service delivery.

Realizing this trend, as well as the important role of senior managers in achieving institutional objectives, the Minister of Public Service and Administration in 1998 introduced a system of performance agreements for senior managers, including heads of department (HoDs), in the public service. Although the system allowed constant feedback on performance at the levels below HoDs, there was no systematic and coherent process through which the performance of HoDs could be assessed. As a result, the Public Service Commission (PSC) was requested by Cabinet to develop a framework for the evaluation of HoDs.

The PSC conducted research to determine how the evaluation of HoDs is dealt with in other countries. Valuable lessons were drawn from these experiences and have informed the approach to the evaluation of HoDs in South Africa. The framework was approved by Cabinet and first implemented for the evaluation of HoDs' performance during the 2000/2001 financial year.

Objectives of the framework: The framework aims to achieve the following:

Mandatory for national departments: As a first step the framework was made mandatory for the evaluation of heads of national departments. Notwithstanding this, seven of the nine provinces decided to implement the framework immediately. KwaZulu-Natal decided to delay implementation by one year. Only the Western Cape Province decided not to implement the framework, as it had already developed its own system.

Role of PSC: The PSC played a pivotal role in the management of the evaluation process during the first year of implementation. Given the PSC's role in the process and the fact that the evaluation framework is a new concept, it was deemed appropriate to conduct a thorough analysis of the first year of implementation with a view to evaluating its impact on performance management. This report presents the findings of this analysis and in the process reflects on the following:

2. METHODOLOGY

Panel members were requested to complete evaluation forms at the end of each evaluation panel meeting. The evaluation forms were designed to obtain feedback on the following:

Feedback was further obtained from executing authorities (EAs) during evaluation panel meetings. Individual members of the PSC who acted as chairpersons of the evaluation panels were able to reflect on their experiences during the evaluation process. The analysis of the evaluation process covers evaluations conducted at both national and provincial level.

3. FRAMEWORK FOR THE EVALUATION OF HEADS OF DEPARTMENT – AN OVERVIEW

In view of the fact that the framework for the evaluation of HoDs is a new concept in the South African public service, it is deemed necessary to briefly discuss the components of the framework. Chapter 2 of the report provides a detailed discussion of the framework.

4. STATISTICAL OVERVIEW

4.1 Statistical overview at National Level

Out of the 36 national HoDs, 12 were evaluated for the 2000/2001 financial year by means of the framework.

Evaluations deferred: The framework provides flexibility on the number of financial years to be covered by each evaluation period. As a result, evaluations for ten HoDs were deferred to include the 2001/2002 financial year (see page 11 of the report).

Services terminated: The services of the following HoDs were terminated before they could be evaluated:

The evaluation of 5 HoDs could not be activated for reasons unknown to the PSC (see page 11 of the report).

Deviation: The HoDs of the Presidency, the Department of Labour and the Department of Environmental Affairs and Tourism were evaluated by their respective Ministers using methods other than the framework approved by Cabinet.

The fact that EAs and HoDs may decide on the number of financial years to be covered for an evaluation period has had an impact on the total number of national HoDs evaluated.

4.2 Statistical overview at Provincial Level

23 HoDs evaluated: In the seven provinces, 23 HoDs out of a total of 76 were evaluated for the 2000/2001 financial year. Three additional evaluations were still pending at the time of drafting this report. Fifty-three HoDs could not be evaluated for a number of reasons.

The HoD of the Department of Education in Gauteng was evaluated by the Member of the Executive Council (MEC) using a different system from the approved framework.

4.3 Level of performance of HoDs evaluated

Out of the 35 HoDs evaluated at both national and provincial level, 31 were rated between satisfactory and excellent. This means that 89% of the HoDs evaluated achieved the outputs/outcomes agreed upon in their performance agreements. However, 9% of the HoDs evaluated performed below satisfactory, and the performance of 3% was rated unacceptable. The fact that 12% of the evaluated HoDs did not meet the required standard of performance raises serious concern, given the responsibility and important role of HoDs in achieving institutional objectives.

5. CRITICAL ANALYSIS OF THE FIRST IMPLEMENTATION

5.1 Evaluation panels

The framework provides for the use of a panel system to assist with the evaluation of HoDs. The evaluation panels appointed comprised Ministers, MECs, HoDs and external stakeholders, including portfolio committee members. Although the involvement of Ministers and MECs in evaluation panels was regarded as invaluable, their heavy schedules led to administrative pitfalls.

The use of members of the PSC as chairpersons of the evaluation panels provided the necessary objectivity and consistency in the evaluation process.

5.2 Support by the Secretariat

All evaluation panels must be supported by a secretariat appointed by the EA. In all the evaluations the Office of the Public Service Commission (OPSC) served as secretariat to the evaluation panels. The panel members commended the support of the secretariat during the evaluations.

5.3 Documentation used for evaluation

Many of the problems experienced during the evaluation process can be attributed to the documentation submitted for the evaluation of HoDs. Documents submitted did not always conform to the requirements of the framework. Problems were specifically encountered with the quality of performance agreements and verification statements compiled by HoDs. The verification statement is expected to provide a detailed account of achievement by an HoD against the contents of his/her performance agreement, and constitutes the most important evidence used by evaluation panel members in deciding on the level of performance.

No synergy between documents: The regulatory framework in the Public Service emphasizes the importance of aligning the individual performance objectives of senior managers with departmental strategic plans. This not only ensures consistent monitoring of individual performance but also ensures evaluation of organizational achievements over a given period. Notwithstanding this requirement, some performance agreements were not based on the strategic plans of their departments. In some cases there was no synergy between the key performance areas in the performance agreements and the key achievements outlined in annual reports.

5.4 Evaluation of managerial competencies of HoDs

The guidelines for the evaluation of HoDs issued by the PSC provided EAs with the choice to make use of a 360-degree evaluation instrument. The purpose of this optional instrument was to provide feedback on the managerial competence of HoDs and to identify areas where development was required. Only one executing authority at national level opted to use this instrument. In the provinces, 12 of the 23 HoDs evaluated made use of it.

5.5 Advice provided by the panel

The evaluation panel provides written advice to the EA after it has considered all the information submitted to it through the documentation and through discussion with the EA and the HoD during the evaluation session. Feedback obtained from EAs indicated that the advice assisted them in deciding on the performance bonus to be awarded for outstanding performance and on the actions to be taken where performance did not meet expectations. Concerns were, however, raised with regard to the rating scale used. These concerns focused on the fact that the rating scale did not specify how much of a cash bonus should be awarded for each performance level; this was left to the discretion of the EA. EAs indicated that the resulting inconsistencies could create unhappiness amongst HoDs.

6. RECOMMENDATIONS

Emanating from the conclusions drawn during the first implementation of the framework, the following recommendations are made to improve the process for the evaluation of HoDs:

6.1 Sound Public Service Leadership

The ratings awarded to HoDs suggest that there is a high level of satisfaction with their ability to provide effective leadership to their respective departments. Of the 35 HoDs evaluated at both national and provincial level, 89% met or exceeded the expectations contained in their performance agreements.

Developmental areas: One particular area requiring further attention, prominent in the advice of a number of panels, was the need to promote coordination and cooperation between departments. Public Service leadership therefore needs to improve the manner in which it coordinates the machinery of government in meeting its overarching objectives.

Training to be provided: Performance evaluation can only succeed in meeting its objectives if it is adequately integrated with other human resource development practices. It is therefore recommended that the Public Service Commission engage the South African Management Development Institute (SAMDI) on the nature of training to be provided to management at this level.

6.2 Disappointing levels of participation

It was expected that the first year of implementation would be slow to unfold. However, the lack of response from EAs, despite several follow-ups, is of great concern. Of the 12 HoDs evaluated during the first year of implementation, only five (5) were evaluated by the end of the 2001/2002 financial year. The remaining seven (7) were evaluated during the first quarter of the 2002/2003 financial year. This illustrates a clear lack of urgency in finalizing the evaluation process.

Evaluation periods: A large number of evaluations were deferred by EAs and HoDs to include the 2001/2002 financial year. Whilst this was in line with the provisions of the framework, it is not a sound practice, given the practical administrative problems associated with it. This approach also defeats the principle of regular evaluation and performance feedback.

In order to expedite the evaluation process, HoDs must begin compiling verification statements immediately after the end of a financial year. It is recommended that evaluation periods not exceed one financial year and that evaluations be held no later than six months after the publication of annual reports.

6.3 Phased-in approach for provinces results in inconsistent norms and standards

The framework was initially made obligatory for national HoDs. Although the majority of provinces decided to implement the framework, progress in implementation has been slow.

The fact that HoDs in the Western Cape are evaluated only by their executing authorities indicates that the province's system differs in norms and standards from the framework approved by Cabinet. In a public service in transition it is important that its leaders are evaluated in a consistent manner.

Obligatory for provinces: It is recommended that implementation of the framework be made compulsory for the provinces. This will address the delays experienced in commencing with the evaluation process and will ensure that the same norms and standards are applied to all HoDs in the public service.

6.4 Improvement of the composition of evaluation panels

In view of the logistical constraints associated with the changing schedules of Ministers, and given the benefits derived from their participation as panel members, it is not advisable to include more than one Minister in an evaluation panel.

Use of different stakeholders: EAs are also encouraged to appoint panel members from the stakeholders of the department and from external clients, in order to provide holistic feedback on the performance of HoDs.

6.5 Inadequate performance agreements and verification statements

The contents and quality of performance agreements did not in all instances facilitate the effective evaluation of performance.

In terms of the evaluation framework, performance agreements form the basis upon which HoDs are evaluated. The Department of Public Service and Administration has emphasized that if no performance agreement is signed, no evaluation can occur. A major concern was therefore the delay in the conclusion of performance agreements between EAs and their HoDs. Such cases reflect non-adherence to a regulatory requirement. A mechanism is required to facilitate agreement between executing authorities and HoDs where such agreement is not forthcoming. It is also the PSC's opinion that intervention is required to ensure that the performance agreements of HoDs are up to standard.


Verification statements: The purpose of verification statements is to provide a detailed account of actual performance against each of the agreed objectives and outputs contained in performance agreements. In many instances, however, HoDs did not ensure that the information contained in their verification statements provided enough substance for panel members to apply their minds.

Assistance with performance agreements: Considering the role of the PSC in facilitating and overseeing the evaluation process, the PSC would be in the best position to provide the intervention required. It is therefore proposed that the performance agreements of all HoDs be filed with the PSC by the end of August of each year. In the case of newly appointed HoDs, performance agreements should be filed within three months of their date of appointment. The PSC will provide the necessary advice where required.

6.6 The vital role performed by the secretariat in the evaluation process

Although the framework provided for the use of alternative personnel as secretariat, all EAs opted for the use of the OPSC. This made the task of Commissioners as chairpersons and facilitators of the evaluation process easier, in that they could effectively supervise arrangements made by the secretariat.

OPSC acts as secretariat: Given that the PSC chairs all the evaluations, use of the OPSC as the secretariat will make coordination easier, and the same norms and standards will be maintained in the collation of documentation. In view of this, it is proposed that the OPSC provide secretarial support to all evaluations. Initially this was not budgeted for; additional resources will therefore be required to support this function.

More succinct summary: Although there was general satisfaction with the manner in which the secretariat summarized and packaged the documents, the PSC believes that a more succinct summary should be provided by the secretariat.

6.7 Assessment of managerial competence

The use of the 360-degree evaluation instrument provided by the PSC in its guidelines could have contributed positively to assessing the managerial competencies of HoDs and to identifying areas of development. In cases where 360-degree feedback was obtained, important and useful information on the managerial competencies of the HoD was provided to HoDs and EAs.

Evaluation of managerial competence: The new performance management system for the Senior Management Service (SMS), which applies with effect from the 2002/2003 financial year, provides that performance agreements be supplemented with key managerial competencies. It is therefore proposed that, in the interim, the use of the 360-degree evaluation instrument be made compulsory in the evaluation of all HoDs. This will facilitate feedback on managerial competence.

6.8 Problems experienced with the use of the rating scale

Some inconsistencies have been experienced with the use of the rating scale provided by the framework in the awarding of cash bonuses.

Parameters of cash bonus: It is therefore proposed that the rating scale and the parameters of cash bonuses attached to the new performance management system for the SMS be used instead.

The elements of the rating scale and the parameters of cash bonus are the following:

Table 1: Rating scale and parameters of cash bonus

| Level | Definition | Parameters of cash bonus |
|-------|------------|--------------------------|
| 5 | Outstanding performance | 6-8% |
| 4 | Performance significantly above expectations | 3-5% |
| 3 | Fully effective | N/A |
| 2 | Performance not fully satisfactory | N/A |
| 1 | Unacceptable performance | N/A |

7. CONCLUSION

Through the implementation of this framework, Government has sent a clear message to HoDs indicating its commitment to monitoring and ensuring improved levels of performance in the public service. It is trusted that the recommendations made by the PSC in this report will assist in facilitating the evaluation process. The success of the evaluation process, however, remains largely dependent on the buy-in and cooperation of executing authorities and HoDs.