1. INTRODUCTION

1.1 PERFORMANCE MANAGEMENT AS AN INSTITUTIONAL PRIORITY

Institutions, both in the private and public sectors, are continuously searching for ways to improve productivity and service delivery. The impact of individual performance levels on institutional success is undeniable. Managing employee performance is therefore accorded a very high priority in most institutions.

Objectives linked: Modern practices in performance management have moved towards linking performance appraisal with the achievement of predetermined institutional objectives. Performance management in such cases becomes a systematic process through which institutions involve their employees, as individuals or as members of a group, in improving organizational effectiveness in the accomplishment of institutional missions and goals.

Responsibility of managers: Managers have the most significant impact on the achievement of institutional objectives, and the effective monitoring of their performance and competency levels should therefore be accorded a very high priority. It is also by the very nature of their responsibilities that managers can be held accountable for the achievement of predetermined objectives and goals. This is even more important in the case of heads of institutions, who take responsibility for the execution of agreed business plans. The effective monitoring of their performance provides valuable information on institutional successes or failures and alerts institutions to areas where urgent intervention is required.

Development: Performance management not only measures performance but also identifies areas where individuals may require personal development. Performance management therefore forms an integral part of any career management programme and should not be viewed as an instrument solely designed to deal with poor performers. In fact, the incentive schemes normally associated with performance management systems serve as a tool to encourage improved levels of performance.

As indicated in the Public Service Commission’s (PSC) Report on the Management of Senior Managers’ Performance Agreements, there is a definite need for an effective system in South Africa to manage and monitor the performance of managers within the context of a public service in transformation.

1.2 PERFORMANCE AGREEMENTS

Implemented 1998: Realizing the need to link institutional objectives with the performance appraisal of senior managers, the Minister for Public Service and Administration introduced a system of performance agreements in 1998 for senior managers in the public service, including heads of department (HoDs). The management of the system below the level of heads of department provided for constant feedback on performance between supervisors and their staff. However, there was no systematic and coherent process in place through which the performance of heads of department could be assessed.

Ministries were not provided with assistance to manage the evaluation process in a meaningful way and in many instances Ministries may not have had the necessary capacity to deal with performance management. Many HoDs also indicated that they did not receive systematic and comprehensive feedback on their performance. In addition, there was no systematic way in which Government received feedback on the achievement of its priorities.

1.3 FRAMEWORK FOR THE EVALUATION OF HEADS OF DEPARTMENT

PSC tasked: Given the problems experienced with the evaluation of HoDs’ performance, the PSC was tasked by Cabinet to develop a framework to assist executing authorities.

Research: The PSC, in developing the framework, conducted research to determine how the evaluation of HoDs is dealt with in the public services of Australia, New Zealand, Canada, Singapore and the United Kingdom. Valuable lessons were drawn from the experiences of these countries and have informed the approach to the evaluation of HoDs in South Africa.

Emanating from these lessons, as well as the inputs from the executing authorities (EAs) and the HoDs, a formal evaluation framework for heads of department was developed and approved by Cabinet. The framework proposed uniform but flexible structures and processes according to which the performance of all HoDs can be evaluated by EAs. It was designed to assist EAs, whilst at the same time attempting to ensure a holistic and objective evaluation of performance.

Principles: The principles that underpinned the development of the framework are the following:

Objectives: Given the mandate provided by Cabinet, the framework developed by the PSC aimed to achieve the following:

The framework was first implemented for the evaluation of HoDs’ performance during the 2000/2001 financial year. The functioning of the framework is discussed in detail in Chapter 2 of this report.

1.4 ROLE OF THE PSC

Manage process: As will be discussed in Chapter 2, the PSC plays a pivotal role in the evaluation of HoDs. In essence, the PSC is responsible for the management of the evaluation process through the involvement of its members as Chairpersons of evaluation panels and its Office as secretariat to evaluation panels. Evaluation panels comprise peers, clients, Ministers and other role players appointed by Executing Authorities to advise them on the evaluation of their HoDs.

Given its role in the evaluation process, the PSC is therefore in the best position to reflect on its experiences during the first year of implementation of the evaluation framework.

1.5 PURPOSE OF THE REPORT

Analysis of implementation: The evaluation framework for HoDs is a new concept in the public service. It is therefore necessary to assess whether the framework has achieved the desired results. In view of the PSC’s involvement, it was deemed appropriate to conduct a thorough analysis of the first year of implementation of the framework with a view to evaluating its impact on performance management. This report presents the findings of this analysis and in the process reflects on the following:

1.6 METHODOLOGY

Panel members were requested to complete evaluation forms at the end of each evaluation panel meeting. The evaluation forms were designed to obtain feedback on the following:

Feedback was further obtained from EAs during evaluation panel meetings. Individual members of the PSC who acted as Chairpersons of evaluation panels were able to reflect on their experiences during the evaluation process. The analysis of the evaluation process covers evaluations conducted at both national and provincial level.

1.7 CONCLUSION

The intention of this report is to provide a critique of the first year of implementation of the framework. It is, however, the PSC’s desire that the report will create awareness amongst those HoDs and EAs that did not participate during the first year of implementation, and that it will serve as a catalyst for the successful management of the HoD evaluation process in future.

2. FRAMEWORK FOR THE EVALUATION OF HEADS OF DEPARTMENT – AN OVERVIEW

2.1 INTRODUCTION

Since the framework for the evaluation of HoDs is a new concept in the South African public service, it is deemed necessary to briefly discuss the components of the framework.

2.2 EVALUATION PANELS

EAs appoint panels: EAs must appoint evaluation panels to assist them with the evaluation of their HoDs. Although the nomination of members to serve on evaluation panels is left to the discretion of executing authorities, the framework provides for consultation with HoDs. The evaluation panels can reflect all stakeholders as dictated by the nature of the department concerned and may also involve the peers of HoDs.

Chaired by PSC: Each evaluation panel appointed by EAs for HoDs of national departments is chaired by either the Chairperson or Deputy-Chairperson of the PSC. Panels appointed for provincial HoDs are chaired by the Commissioner resident in that province or, in his/her absence, by one of the nationally nominated Commissioners (other than the Chairperson or Deputy-Chairperson). The involvement of the PSC on these panels is to ensure, as an independent role player, that the evaluation process is fair and equitable and that the same norms and standards are applied during the evaluation process.

Provide advice to EAs: The role of evaluation panels is to advise EAs on the performance of their HoDs.

2.3 SUPPORT BY SECRETARIAT

Office of PSC: All evaluation panels are supported by a secretariat to be appointed by the executing authority. For this purpose executing authorities can make use of staff in their own offices/ministries or appoint external consultants. The PSC has made the services of its Office available to EAs to act as secretariat to their evaluation panels.

The role of the secretariat is to collate and process all information received from HoDs and executing authorities into a reporting format for the evaluation panels and to take minutes of proceedings during meetings of the evaluation panels.

2.4 PARTICIPATION OF EXECUTING AUTHORITIES IN THE EVALUATION MEETINGS

EAs take final decisions: All EAs participate in discussions of the evaluation panels and provide input when deemed necessary or when required by the panel. The advice emanating from the evaluation panel is not binding on executing authorities, and they remain responsible for the final decisions emanating from the evaluation process. However, it is advisable for executing authorities not to deviate from the advice of the evaluation panel unless valid reasons can be provided. If such reasons exist, it is good practice for the executing authority to minute the reasons in his/her decision on the performance of the HoD. The reasons will be conveyed to the President/Premier and the HoD involved.

2.5 EVALUATION PROCESS

Link to financial years: The evaluation of HoDs is aligned to the planning and Medium Term Expenditure Framework (MTEF) cycles. It therefore follows that evaluation periods are linked to financial years. Executing authorities and their HoDs may decide on the number of financial years to be covered by an evaluation. The minimum period to be covered by an evaluation is one financial year.

HoDs and their EAs must complete negotiations and sign performance agreements by the end of April of each year. Progress made in relation to the set objectives in the performance agreements must be reviewed on a regular basis as agreed to between EAs and their HoDs.

Information used: The following information for the relevant financial year(s) must be used during the evaluation process:

Role of secretariat: The designated secretariat collates all information submitted to it into a summary and forwards it to the evaluation panel for consideration. During the evaluation process, evaluation panels obtain inputs from both the executing authority and the HoD.

The EA can also make use of further evaluation methodology, such as the 360-degree evaluation, to assess all aspects of the HoD’s managerial performance. The purpose of evaluating managerial competencies is to contribute to good management practice. In cases where this methodology is used, it becomes the responsibility of the secretariat to manage the process and provide the panel with a summary of feedback from the 360-degree questionnaires.

Role of panel: After the evaluation panel has considered all information submitted to it, it provides advice in writing to the relevant executing authority. This advice indicates the level of performance of the HoD with regard to the key performance areas in the performance agreement. Areas of development identified are also provided. The executing authority, with due consideration to this advice, takes a decision on the awarding of a cash bonus and other actions to be taken in respect of the performance of his/her HoD.

The results of the evaluation process must be forwarded to the President and the Premiers.

2.6 REVIEW OF THE EVALUATION PROCESS IN CASES OF DISSATISFACTION

Where a HoD is dissatisfied with a decision of the executing authority regarding the evaluation she/he may request a review of the matter. The performance agreements of HoDs provide for a dispute settlement procedure. According to this procedure a person acting as mediator must be identified.

Review Committee: As a first step, disputes emanating from the performance evaluation of HoDs must be referred to the agreed person. If, however, the dispute cannot be resolved by such a person, the matter can be referred to a Review Committee. A national HoD must lodge his/her dissatisfaction with a Review Committee consisting of the Deputy President and the Minister for Public Service and Administration or their nominees.

A provincial HoD must lodge his/her dissatisfaction with a Review Committee consisting of the Premier and an MEC nominated by the Premier. A Director-General in the Office of a Premier can refer his/her dispute to a Review Committee consisting of the Deputy President and the Minister for Public Service and Administration or their nominees.

2.7 GUIDELINES THAT COMPLEMENT THE REGULATORY PART OF THE FRAMEWORK

PSC issues guidelines: In addition to the regulatory part of the framework as discussed in paragraphs 2.2 to 2.6, the Commission also issues Guidelines on an annual basis to assist executing authorities during the evaluation process. The Guidelines contain advice on the administrative arrangements during the evaluation process and also include proforma documents and instruments to be used by HoDs and executing authorities.

2.8 CONCLUSION

The design of the framework discussed in this Chapter clearly illustrates that it was modeled to assist executing authorities. The question, however, is whether its application has succeeded in attaining this objective.

 

3. STATISTICAL OVERVIEW

3.1 INTRODUCTION

Mandatory for national departments: Cabinet decided that as a first step the framework would be mandatory for the evaluation of heads of national departments. Premiers were, however, encouraged by the Minister for Public Service and Administration to implement the framework as well. As a result, seven of the nine provinces decided to implement the framework immediately. KwaZulu-Natal decided to delay the implementation of the framework by one year. Only the Western Cape Province decided not to implement the framework, as it had already developed its own system.

This chapter provides an analysis of the extent to which the framework was implemented at national and provincial level. A statistical overview of the number of HoDs evaluated, both nationally and provincially, is provided.

3.2 STATISTICAL OVERVIEW OF EVALUATIONS CONDUCTED AT NATIONAL LEVEL

12 HoDs evaluated: Out of 36 national HoDs, 12 were evaluated for the 2000/2001 financial year by means of the framework.

Full details of the HoDs that were evaluated are contained in Table 1:

Table 1: HoDs evaluated for 2000/2001 by means of the framework

 

DEPARTMENTS            HEADS OF DEPARTMENT
DACST                  Mr RM Adams
DEFENCE                Gen S Nyanda
DEFENCE SECRETARIAT    Mr JB Masilela
DPSA                   Mr RM Ramaite
EDUCATION              Mr T Mseleku
GCIS                   Mr J Netshitenzhe
HEALTH                 Dr A Ntsaluba
PSC                    Mr M Sikhosana
SAMDI                  Prof J Mokgoro
SAPS                   Comm J Selebi
SOCIAL DEVELOPMENT     Ms A Bester
WATER AFFAIRS          Mr M Muller

 

Evaluations deferred: As indicated in Chapter 2, the evaluation of HoDs is aligned to the planning and Medium Term Expenditure Framework (MTEF) cycles. Evaluation periods are therefore aligned to financial years. The framework provides flexibility on the number of financial years to be covered by an evaluation period. As a result, evaluations for the following HoDs were deferred to include the 2001/2002 financial year:

Two of the HoDs who deferred their evaluation to include 2001/2002 were newly appointed and had not completed the full financial year. The HoD for Statistics South Africa was also appointed at the beginning of 2001.

Services terminated: The services of the following HoDs were terminated before they could be evaluated:

Although correspondence was received from the following Ministries indicating the desire to evaluate their HoDs, the evaluation process was not activated for reasons unknown to the PSC:

Deviation approved: The HoDs for the Presidency and the Department of Environmental Affairs and Tourism were evaluated by their respective Ministers using methods other than the framework.

In the case of the Department of Transport, a full-time appointment had not yet been made.

The fact that EAs and HoDs may decide on the number of financial years to be covered by an evaluation obviously has an impact on the total number of national HoDs evaluated for the 2000/2001 financial year.

Process slow to unfold: As this was the first year of implementation, it was expected that the process would be slow to unfold. This proved to be so in practice. By the end of March 2002 only five HoDs had been evaluated. The remaining seven were evaluated in the period April 2002 to August 2002.

3.3 STATISTICAL OVERVIEW AT PROVINCIAL LEVEL

23 HoDs evaluated: Out of 76 HoDs in the seven provinces that decided to implement the framework, a total of 23 were evaluated for the 2000/2001 financial year. Table 2 indicates the total number of HoDs evaluated in each province:

Table 2: Number of HoDs evaluated in each province

Province         Total HoDs   HoDs evaluated   Department                                 HoD
Eastern Cape     11           2                Office of the Premier                      Dr ME Tom
                                               Treasury                                   Mr M Tom
Free State       11           1                Office of the Premier                      Mr K de Wee
Gauteng          11           2                Agriculture                                Ms Hannekom
                                               Finance and Economic Affairs               Ms Hlatshwako
Limpopo          10           5                Public Works                               Dr S Phillips
                                               Local Government                           Mr P Ramogoma
                                               Office of the Premier                      Ms MB Monama
                                               Sports & Culture                           Mr JPE Ndhambi
                                               Finance, Economics & Tourism               Mr MB Mphahlele
Mpumalanga       11           2                Office of the Premier                      Adv. S Soko
                                               Social Development                         Mr NJ Mabilo
Northern Cape    11           4                Education                                  Mr MT Moraladi
                                               Housing and Local Government               Mr P Govender
                                               Sport, Arts & Culture                      Mr HH Esau
                                               Transport, Roads & Public Works            Mrs PMN Mokhali
North West       11           7                Office of the Premier                      Dr Bakane-Tuoane
                                               Health                                     Dr Gosnel
                                               Traditional & Corporate Affairs            Ms Sebego
                                               Transport, Road & Public Works             Mr Sebekwane
                                               Agriculture, Conservation & Environment   Mr Wills
                                               Finance                                    Mr Tjie
                                               Education                                  Dr Karodia

Note: The HoD of the Department of Education in Gauteng was evaluated by the MEC using a different system to the framework.

Three additional evaluations were still pending at the time of drafting this report. However, fifty-three (53) HoDs could not be evaluated for a number of reasons. Table 3 indicates the reasons for non-evaluation and the number of HoDs in each category:

Table 3: Reasons for non-evaluation and the number of HoDs in each category

Reason for non-evaluation                                        Number of HoDs
Service terminated                                               4
Appointed in an acting capacity                                  6
Suspended                                                        2
On sick leave                                                    1
Newly appointed and had not completed the full financial year    7
Performance agreement not signed                                 2
Excluded because framework was only piloted in province          9
Evaluation period extended                                       2
No response received                                             14
Documents not submitted                                          3
Evaluations pending                                              3

3.4 SUMMARY OF RATINGS AWARDED TO HODS

A standard rating scale was used in the assessment of the performance of HoDs. The rating scale is reflected in the table below:

Table 4: Rating scale used for the evaluation of HoDs

Rating   Definition of score
5        Excellent - Performance far exceeds the standard.
4        Above satisfactory - Performance is significantly higher than the standard.
3        Satisfactory - Performance fully meets the standard expected in all areas of the job.
2        Below satisfactory - Performance is below the standard required for the job in key areas.
1        Unacceptable - Performance does not meet the standard expected.

As indicated, one of the objectives of the framework was to provide feedback on the extent to which the performance areas contained in the performance agreements had been achieved. The ratings awarded to HoDs at national and provincial level are summarized in the table below and provide an indication of the extent to which HoDs succeeded in meeting the demands placed on them through their performance agreements:

Table 5: Summary of ratings awarded to HoDs

 

Rating   Definition of score    HoDs at national level   HoDs at provincial level   Total
5        Excellent              5                        0                          5
4        Above satisfactory     7                        9                          16
3        Satisfactory           0                        10                         10
2        Below satisfactory     0                        3                          3
1        Unacceptable           0                        1                          1

From the contents of Table 5, it is clear that 14% of the HoDs evaluated at national and provincial level were rated excellent, 46% were rated above satisfactory and 29% were rated satisfactory. This means that 89% of the evaluated HoDs achieved the key performance areas they were expected to achieve. However, 9% of the HoDs evaluated performed below satisfactory and the performance of 3% was rated unacceptable. This means that 12% of the HoDs evaluated did not meet the required standard of performance, which raises serious concerns given their important role and responsibility in achieving organizational effectiveness.
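
The percentages above follow directly from the counts in Table 5. For readers who wish to verify them, the short Python sketch below is purely illustrative (the counts are those reported in the table, rounded to whole percentages):

    # Illustrative only: reproduces the percentages quoted above from the
    # counts in Table 5 (35 HoDs evaluated in total).
    ratings = {
        5: ("Excellent", 5),
        4: ("Above satisfactory", 16),
        3: ("Satisfactory", 10),
        2: ("Below satisfactory", 3),
        1: ("Unacceptable", 1),
    }

    total = sum(count for _, count in ratings.values())  # 35

    for level, (label, count) in sorted(ratings.items(), reverse=True):
        print(f"Level {level} ({label}): {count}/{total} = {count / total:.0%}")

    # Levels 3 to 5 met or exceeded the standard: 5 + 16 + 10 = 31 of 35.
    met = sum(count for level, (_, count) in ratings.items() if level >= 3)
    print(f"Met or exceeded the standard: {met}/{total} = {met / total:.0%}")  # 89%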

3.5 CONCLUSION

The limited number of HoDs that were evaluated is obviously a matter of concern. The number of evaluations that were deferred to include the 2001/2002 financial year could, however, suggest that EAs and HoDs may not have been ready to deal with the demands placed on them by the evaluation framework. There may also have been an element of avoidance due to concerns about potential conflict situations arising from the evaluation process. It is, however, expected that those HoDs and EAs that were involved in the evaluation process will convey the positive aspects thereof to their peers and that fears that may have been prevalent will be allayed.

4. CRITICAL ANALYSIS OF THE FIRST IMPLEMENTATION

4.1 INTRODUCTION

Based on the experience of the PSC during the management of the evaluation process and the feedback obtained from HoDs and EAs, each principle of the framework, as well as the guidelines that support these principles, is evaluated in this Chapter.

4.2 EVALUATION PANELS

As indicated in Chapter 2, the framework provides for the use of a panel system to assist with the evaluation of HoDs. The aim of these panels is to provide holistic and objective advice to EAs on the performance of their HoDs. Panel members should ideally have insight into the core functions of the department as well as the performance of the HoD.

Composition of panels: Evaluation panels appointed for the evaluation of HoDs, both nationally and provincially, comprised Ministers (MECs in the case of provinces), HoDs, external stakeholders and, in some instances, members of the relevant portfolio committees. Ministers and MECs on evaluation panels made an invaluable contribution through their political insight and by providing feedback on the achievement of intersectoral objectives. There is no doubt that collaboration and cooperation within the same clusters is becoming a critical dimension of modern leadership in public services, since the final objective is the achievement of government's overall outcomes through the work of individual departments.

Postponements: There was, however, a significant administrative pitfall in involving Ministers on evaluation panels. Due to their ever-changing schedules, it often happened that evaluation meetings had to be postponed or that the composition of panels had to be altered at late notice.

Peers of HoDs provide insight: The involvement of HoDs as peers on evaluation panels was useful in that panels were provided with insight into the dynamics involved in managing a public service department. HoDs also added value in providing inputs on the context and regulatory framework within which departments operate.

Other stakeholders: In the provinces more use was made of external stakeholders such as NGOs, parastatals and academic/professional institutions as panel members. Only in two cases at national level was use made of such external stakeholders. Participation of such stakeholders contributed significantly to the depth of debate and provided a useful external perspective on service delivery achievements.

Care, however, should be taken not to nominate panel members who have vested interests in a department, for example through contracts, as this could jeopardize the objectivity of the process. Persons having a material interest in a department could be less inclined to provide objective comment during panel meetings.

Appointment not confirmed: EAs should confirm panel members’ appointment in writing. Thereafter the Chairperson communicates with the panel members regarding the evaluation process. There were instances where panel members indicated that they were not aware of their appointment as panel members, and as a result they declined participation in the process. Not only did this delay the process as new panel members had to be appointed, but the lack of communication prevented the inclusion of key individuals. To obtain optimal value from the process, serious consideration should be given to the selection of panel members. The impression of quickly putting a panel together to "get on with the job" should be avoided.

PSC chaired panels: As an independent role player, the PSC played a pivotal role in ensuring that the evaluation process was fair and equitable. Members of the PSC chaired evaluation panels and ensured consistency in the manner in which each HoD was evaluated. The same Agenda (Annexure 1) was used at all evaluation meetings.

Feedback obtained from panel members indicated that the involvement of the PSC provided the necessary objectivity to the process. The role of the Chairperson in clarifying the process prior to and during the meeting assisted in facilitating proceedings. Much time was invested in this process, and the smooth evaluations that followed bear testimony to these efforts.

PSC focused discussions: At certain panel meetings there was a tendency to focus discussions on issues of the day rather than on the activities and achievements of the HoD during the period under review. The chairpersons therefore had an important role to play in ensuring that discussions remained relevant to the evaluation of the HoD as well as the evaluation period involved.

Panel members were required to raise questions for clarification with the relevant HoD and EA after going through all the documentation for the evaluation of HoDs. These questions would then be forwarded to the relevant HoDs and executing authorities. In most cases panel members did not raise questions before the meeting but only during it. Although this did not pose a problem, in evaluation meetings where the questions raised are adverse, it could be argued that the HoD has not been given sufficient time to provide supporting evidence on his/her performance.

Another concern emanating from the failure to formulate questions beforehand is that panel members may not be analyzing the documents prior to the meeting. If this is true, HoDs may be unfairly prejudiced by the lack of proper application of the mind during their evaluation.

4.3 SUPPORT PROVIDED BY THE SECRETARIAT

PSC provided secretariat: All evaluation panels must be supported by a secretariat to be appointed by the executing authority. In all national and provincial evaluations the Office of the Public Service Commission (OPSC) served as secretariat to the evaluation panels.

Secretariat commended: The role of the secretariat was to provide administrative support to the panel. In doing so, cross-references were made between the main documents used during the evaluation process (i.e. the verification statement, the business plan and the annual report). Possible questions to be directed at HoDs and EAs were also identified by the secretariat. The intention of these questions was to obtain clarification on aspects raised in the documentation for the evaluation of HoDs. Panel members commended the support provided by the secretariat in this regard. According to panel members, the cross-references simplified and expedited the verification of objectives and achievements through a scrutiny of the relevant documentation.

More succinct summary required: Although there was general satisfaction with the manner in which the secretariat summarized and packaged the documents, the PSC believes that a more succinct summary should be provided.

The use of the OPSC as secretariat in all the evaluations ensured that the same norms and standards were maintained in the compilation of documents. It should be noted, however, that delays in the submission of documents by departments as well as the quality of documents submitted, impacted negatively on the evaluation process. This aspect will be dealt with further in this chapter.

4.4 PARTICIPATION OF EXECUTING AUTHORITIES IN THE EVALUATION MEETINGS

Participation of EAs essential: EAs participated with enthusiasm during evaluation meetings at both national and provincial level. In the majority of evaluation meetings EAs interacted with the panel and with their HoDs during the first phase of the meeting. During this phase, EAs provided an overview of their HoDs’ performance during the period under review and responded to questions of clarity raised by panel members. They could also in turn pose questions to the HoDs during the meeting. Their presence assisted in providing further inputs to the responses provided by their HoDs.

After panel members had raised all the questions they deemed appropriate, both the EA and the HoD were excused. EAs therefore did not form part of the final deliberations that led to the advice formulated by evaluation panels. This was necessary to ensure impartial and objective advice.

There were instances in the provincial evaluations where executing authorities could not attend because of their heavy schedules. Such a situation is not desirable, as was noted by panel members. Inputs from executing authorities were, however, received in writing in those instances. The presence of executing authorities provides tremendous benefit to the process.

4.5 EVALUATION PROCESS

a) Documentation used for evaluation

Quality of documents not as required: The documents used during the evaluation process are discussed in paragraph 2.5. The most significant problems experienced by the PSC during the evaluation process can be attributed to the quality of documentation submitted by Ministries and HoDs.

Not all documents submitted: Despite the fact that copies of the framework and guidelines for the evaluation of HoDs were forwarded to Ministries, outlining the process and providing proforma formats for certain documents, there was still a lack of understanding of what documentation should be forwarded to the PSC for evaluation. As a result, in some cases not all documents were submitted and in other cases documents submitted did not conform to the requirements of the framework. Much time was wasted correcting documents. In provinces particularly, there was confusion over whose responsibility it was to ensure that all the documentation was submitted to the secretariat. This delayed the process. In the national departments, however, this worked very well, as the Ministries’ offices took responsibility for ensuring that documentation reached the OPSC.

No synergy between documents: The regulatory framework in the Public Service emphasizes the importance of aligning the individual performance objectives of senior managers with departmental strategic plans. This not only ensures consistent monitoring of individual performance but also enables the evaluation of organizational achievements over a given period. Notwithstanding this requirement, there were performance agreements that were not based on the strategic plans of the departments concerned. In some cases there was no synergy between the key performance areas in the performance agreements and the key achievements outlined in annual reports.

Performance criteria lacking: Performance agreements emphasized outputs rather than outcomes. The performance criteria used were in the majority of cases limited to target dates and seldom provided qualitative criteria for the measurement of performance.

Performance agreements not signed: Performance agreements were not signed in all instances. In such cases, the evaluations could not proceed. This raises serious concerns with regard to non-adherence to regulatory requirements. It also raises concerns around relationship issues between EAs and their HoDs. This area requires attention and, where necessary, intervention.

Verification statements not completed correctly: Verification statements did not in all instances conform to the requirements of the framework. The verification statement provides a detailed account of achievements by an HoD against the contents of his/her performance agreement and constitutes important evidence to be used by evaluation panel members in deciding on the level of performance. The verification statements were in some cases not signed by EAs, as required by the guidelines for the evaluation of HoDs. Signing of verification statements indicates agreement on the achievement of key performance areas, and the absence of a signature could prejudice the HoD when preparing his/her input. In sixty-five percent (65%) of the cases EAs did not provide a brief summary of the performance of HoDs in the verification statement. Although this was resolved by affording EAs an opportunity to provide an overview of the performance of the HoD at the evaluation meeting, these HoDs did not have the benefit of obtaining feedback from their EAs beforehand.

Verification statements did not correlate with performance agreements: There were instances where there was no correlation between the verification statement and the performance agreement. Key performance areas identified in the performance agreement were different from those outlined in the verification statement. One could assume that in these cases the key performance areas might have changed during the year. If this is true, then one could conclude that performance management is ineffective, since such changes could have been addressed during quarterly reviews and the necessary adjustments made. Where verification statements were irreconcilable with performance agreements, verification statements were referred back to HoDs for amendment.

Annual reports lack detail: Annual reports did not always discuss the achievements of departments in sufficient detail. This made it difficult to link departmental achievements to the key individual achievements of the HoDs. There appears to be a gap between the reporting requirements prescribed by National Treasury and the manner in which departments comply with these requirements. Instead of reporting on the achievement of key departmental objectives, departments report on their activities.

b) Evaluation of managerial competencies of HoDs

360-degree not used sufficiently: The guidelines for the evaluation of HoDs issued by the PSC provided EAs with the choice to make use of a 360-degree evaluation instrument. A copy of the instrument is attached as Annexure 2. The purpose of this optional instrument was to provide feedback on the managerial competence of HoDs and to identify areas where development was required. The instrument was included in the guidelines because the verification statement merely allowed an evaluation based on the achievement of objectives. Performance agreements did not provide for the evaluation of managerial competencies.

The majority of EAs, however, opted not to make use of the 360-degree evaluation instrument. Out of the twelve (12) national HoDs evaluated, only one was evaluated by means of this instrument. In the provinces, twelve (12) out of 23 HoDs were evaluated by means of this instrument. Feedback obtained from the provinces that used the instrument indicated that it provided valuable feedback on the managerial competencies of HoDs. There was, however, a concern that not all managerial competencies as outlined in the questionnaire apply to all HoDs.

c) Advice by the panel

Advice provided on level of performance: The evaluation panels provided advice in writing to EAs after they had considered all information submitted to them. The advice provided by the panels contained a summary that indicated the extent to which an HoD had met the targets and expectations set out in his/her performance agreement. The advice further indicated an assessment of the level of performance of the HoD and areas that needed development. An example of the format in which the advice was provided is attached as Annexure 3.

Parameters of cash bonus: According to the rating scale used in terms of the framework, an HoD whose performance was rated either above satisfactory or excellent (levels 4 and 5 of the rating scale) could be awarded a cash bonus as determined by the Minister for Public Service and Administration. Criticism was raised that the rating scale did not specify the size of the cash bonus to be awarded at each level. This was left to the discretion of the EA.

This meant that an HoD rated at level 4 could be awarded the same amount of cash bonus awarded to an HoD rated at level 5. EAs felt that this discrepancy could create unhappiness amongst HoDs.

Results of evaluations: As indicated in Chapter 3, 89% of the HoDs evaluated nationally and provincially were rated between satisfactory and excellent. There were few cases where the performance of heads of provincial departments was rated below satisfactory. Only in one case was the performance of a head of a provincial department rated as unacceptable.

Despite concerns about the rating scale, EAs indicated that the advice provided by the panel assisted them in deciding on actions based on the performance of the HoDs. HoDs also provided positive feedback on the advice provided by panels on their performance.

Certain HoDs contacted the PSC to enquire about the results of their evaluations, which indicates that some EAs did not communicate the results. This is a matter of concern to the PSC, as it is the role of EAs to communicate these results. The PSC has also not received any feedback indicating deviation from the advice provided by the panels. It is assumed that all the EAs whose HoDs were evaluated accepted the advice provided by the panels.

4.6 CONCLUSION

From the above discussion it is clear that problems were encountered during the evaluation process. This could, however, be expected, given that this was the first year of implementation. The PSC has submitted proposals to Cabinet that will address certain of the concerns raised. The recommendations of the PSC are discussed in detail in Chapter 5.

5. CONCLUSIONS AND RECOMMENDATIONS

5.1 INTRODUCTION

The analysis provided in Chapters 3 and 4 of this report has enabled the PSC to draw certain conclusions regarding the first year of implementation of the framework for the evaluation of HoDs. Emanating from these conclusions, a number of recommendations are also made in this chapter on how to improve the process for the evaluation of HoDs.

5.2 SOUND PUBLIC SERVICE LEADERSHIP

HoDs met expectations: The ratings awarded to HoDs suggest that there is a high level of satisfaction with their ability to provide effective leadership to their respective departments. Out of the 35 HoDs evaluated at national and provincial level, 31 were awarded a rating of satisfactory, above satisfactory or excellent. This means that 89% of the HoDs that were evaluated met or exceeded the expectations contained in their performance agreements. The next round of evaluations, involving larger numbers of HoDs, should indicate whether this trend is representative of the public service as a whole.

Developmental areas: Panels were afforded the opportunity to provide advice on areas where HoDs required further development. One particular area requiring further attention, prominent in the advice of a number of panels, was the need to promote coordination and cooperation between departments. Public service leadership therefore needs to improve the manner in which it coordinates the machinery of government in meeting its overarching objectives.

Recommendations

Training to be provided: Performance evaluation can only succeed in meeting its objectives if it is adequately integrated with other human resource development practices. It is therefore recommended that the Public Service Commission engage the South African Management Development Institute (SAMDI) on the nature of training to be provided to management at this level. The Commission will assess the training programmes already provided by SAMDI and provide inputs on the training required emanating from the HoD evaluation process.

 

5.3 DISAPPOINTING LEVELS OF PARTICIPATION

Lack of response: It was expected that the first year of implementation would be slow to unfold. However, the lack of response from EAs, despite several follow-ups, is of great concern. A total of six (6) EAs at national level did not indicate whether they intended to evaluate their HoDs or whether they were going to defer the evaluation to include the next financial year. This impacted negatively on the planning process.

The lack of response is also in contravention of national norms and standards as the framework was mandatory for heads of national departments.

Lack of urgency: Although twelve (12) national HoDs were evaluated during the first year of implementation of the framework, only five (5) had been evaluated by the end of the 2001/2002 financial year. The remaining seven (7) were evaluated during the first quarter of the 2002/2003 financial year. This illustrates a clear lack of urgency in finalizing the evaluation process.

Provinces: In provinces such as Gauteng, Mpumalanga, the Free State and the Northern Cape, the process took a long time to commence. Unfortunately, a cut-off date had to be set to allow preparation for the 2001/2002 evaluations. Had this not been the case, a much higher number of HoDs could have been evaluated in these provinces for the 2000/2001 financial year.

Recommendations

Evaluation compulsory: Executing authorities and HoDs should take note of the fact that evaluation in terms of the framework is compulsory and that non-adherence amounts to a transgression of national norms and standards. The Public Service Commission will report non-adherence to the National and Provincial Cabinets and the national/provincial legislatures.

Expedite evaluation: In order to expedite the evaluation process, HoDs must begin with the compilation of verification statements immediately after the end of a financial year. Executing authorities must commit themselves to concluding the evaluation process not later than six months after the publication of annual reports.

FOSAD to provide inputs: FOSAD should arrange a workshop to discuss what can be done to facilitate and expedite the implementation of the framework. During this workshop, FOSAD should also formulate recommendations on improvements to the framework.

5.4 EXTENDED EVALUATION PERIODS ARE PROBLEMATIC

Evaluation period: A large number of evaluations were deferred by executing authorities to include the 2001/2002 financial year.

Whilst this decision was in line with the provisions of the framework, which allowed evaluation periods to span more than one financial year, it is not a sound practice for the following reasons:

Recommendations

Evaluate annually: In view of the implications discussed above, evaluation periods for HoDs should not exceed one financial year. All HoDs should therefore be evaluated on an annual basis.

5.5 PHASED-IN APPROACH FOR PROVINCES RESULTS IN INCONSISTENT NORMS AND STANDARDS

The framework was, as indicated, initially only made obligatory for national HoDs. Although the majority of provinces decided to implement the framework, progress in implementation has been slow. This might be attributed to the fact that provinces, even where the Premier has adopted the framework, feel less compelled to prioritise its implementation.

The framework attempts to provide objective advice on the performance of HoDs. In order to ensure a higher level of objectivity, the involvement of external stakeholders in the evaluation process, as provided for by the panel system, becomes critical.

Western Cape differs: In the Western Cape, HoDs are evaluated only by their executing authorities. In a normal work environment, evaluation of staff by their direct supervisors may be good practice. However, the dynamics involved in the relationship between an executing authority and his/her HoD differ to a very large extent from a normal supervisor/staff member relationship.

The framework adopted by Cabinet and the system applied in the Western Cape clearly apply different sets of norms and standards. In a public service in transition it is important that its leaders are evaluated in a consistent manner. For this very reason, the DPSA has recently introduced a performance management system that has to be applied to all senior managers. The HoD evaluation framework will be amended to accommodate this system.

Recommendation

Obligatory for provinces: Implementation of the evaluation framework must be made obligatory for all provinces. This should address problems experienced with delays in commencing with the evaluation process and will ensure that the same norms and standards are applied to all HoDs in the public service.

5.6 COMPOSITION OF EVALUATION PANELS CAN BE IMPROVED UPON

As previously discussed, the involvement of Ministers as panel members caused delays in the evaluation process. Their ever-changing schedules necessitated the postponement of evaluations to accommodate their availability.

Benefit of involving Ministers: This limitation in the involvement of Ministers, however, does not outweigh the benefits derived from their participation. The presence on evaluation panels of Ministers of departments operating in the same sectors provided a perspective during panel meetings that would otherwise not have been obtained. Had they not been present, the nature and depth of the advice provided would have been adversely affected. It is, however, not deemed advisable to include more than one additional Minister on an evaluation panel, given the logistical constraints experienced.

Involving clients: Very few clients served on the panels of national HoDs. This is a matter that should be rectified in the next round of evaluations. Clients of departments provide valuable feedback on the extent to which HoDs succeed in meeting service delivery demands and on the client-orientation of their departments.

Recommendations

Composition of panels: It is recommended that, in addition to the member of the PSC, evaluation panels should not comprise more than four members. It is proposed that panels comprise the following members:

Executing authorities should, after consultation with panel members on their availability for the panel, confirm their appointment in writing. During the consultation process, the role of the panel should be explained.

5.7 INADEQUATE PERFORMANCE AGREEMENTS AND VERIFICATION STATEMENTS

Performance agreements: The contents of performance agreements did not facilitate the effective evaluation of performance in all instances. As indicated, performance criteria lacked qualitative elements. Clear criteria that would allow objective measurement of performance need to be incorporated in performance agreements. The responsibilities of HoDs contained in legislation and subordinate legislation were also not referred to. These financial and administrative responsibilities form the core of an HoD's task as a manager.

It was also noticed that performance agreements tend to reinforce the functioning of departments in silos by limiting outputs to the defined functional terrains of departments. The overarching objectives of government are not addressed.

In terms of the evaluation framework, performance agreements form the basis upon which HoDs are evaluated. The Department of Public Service and Administration has emphasized that if no performance agreement is in place, no evaluation can occur. A major concern therefore was the delay in the conclusion of performance agreements between executing authorities and their HoDs. Such cases reflect non-adherence to a regulatory requirement. A mechanism is required to facilitate agreement between executing authorities and HoDs where such agreement is not forthcoming.

It is the PSC's opinion that intervention is required to ensure that performance agreements of HoDs are up to standard from the following perspectives:

Verification statements: The purpose of verification statements is to provide a detailed account of actual performance against each of the agreed objectives and outputs contained in performance agreements. In many instances, however, HoDs did not ensure that the information contained in their verification statements provided enough substance for panel members to apply their minds. This might stem from the fact that the compilation of verification statements is seen as an additional burden on HoDs' already full work programmes.

Recommendations

PSC to ensure quality control: Considering the role of the PSC in facilitating and overseeing the evaluation process of HoDs, it is the PSC's opinion that it would be in the best position to provide the intervention required. It is therefore recommended that:

Verification statements to be prioritized: HoDs must accord the necessary priority to the completion of their verification statements. The Commission will advise HoDs where the contents of their verification statements are not sufficient.

5.8 A VITAL ROLE PERFORMED BY SECRETARIAT IN THE EVALUATION PROCESS

Positive feedback on support by Office: As indicated in Chapter 4, there was positive feedback from panel members on the support provided by the Office of the PSC as secretariat to the evaluation process. The Office ensured that documents were packaged and summarized in the same manner for all evaluations.

Use of Office facilitated co-ordination: Although the framework provided for the use of alternative personnel as secretariat, all executing authorities opted for the use of the Office. This made the task of Commissioners as chairpersons and facilitators of the evaluation process easier, in that they could effectively supervise arrangements made by the secretariat. For this reason, it is highly unlikely that the coordination of the evaluation process would have been as effective had use been made of personnel outside the Office of the PSC.

More succinct summary required: Although there was general satisfaction with the manner in which the secretariat summarized and packaged the documents, the PSC believes that a more succinct summary should be provided by the secretariat.

Recommendations

Use of Office to be mandatory: Considering the benefits discussed above, it is recommended that the Office of the Public Service Commission serve as secretariat to all evaluation panels. Initially, this was not budgeted for. Additional resources will therefore be required to support this function.

More concise summary: It is further recommended that the secretariat provide a more concise summary capturing the salient issues emanating from the performance of the HoD.

5.9 ASSESSMENT OF MANAGERIAL COMPETENCE LACKING

The use of the 360-degree evaluation instrument, provided by the Public Service Commission in its guidelines, could have contributed positively to assessing the managerial competencies of HoDs and to identifying areas of development. In cases where 360-degree feedback was obtained, important and useful information was provided to HoDs and EAs on the managerial competencies of the HoD.

Lack of understanding: It is the PSC's opinion that the limited use of the 360-degree instrument may stem from anxiety amongst HoDs about being evaluated by their peers, staff and clients. The purpose of the 360-degree instrument, i.e. to provide holistic feedback on competencies, might not have been understood.

Recommendations

To be mandatory: The new performance management system for the SMS, which applies with effect from the 2002/2003 financial year, provides that performance agreements be supplemented with key managerial competencies. In the interim it is proposed that the use of the 360-degree evaluation instrument be made compulsory in the evaluation of all HoDs. This will facilitate feedback on managerial competence.

5.10 PROBLEMS EXPERIENCED WITH THE USE OF THE RATING SCALE

The Commission is in agreement with the view expressed by certain executing authorities that the link between the cash bonus system for senior managers and the framework for the evaluation of HoDs is not as clear as it should be. The rating scale used does not indicate what cash bonus should be awarded.

The new performance evaluation system for the SMS implemented with effect from 1 April 2002, provides for the use of a uniform rating scale for senior managers. The elements of the rating scale are as follows:

"LEVEL 5: OUTSTANDING PERFORMANCE - Performance far exceeds the standard expected of a member at this level. The appraisal indicates that the jobholder has achieved exceptional results against all performance criteria and indicators and maintained this in all areas of responsibility throughout the year.

LEVEL 4: PERFORMANCE SIGNIFICANTLY ABOVE EXPECTATIONS - Performance is significantly higher than the standard expected in the job. The appraisal indicates that the member has achieved better than fully effective results against more than half of the performance criteria and indicators and fully achieved all others throughout the year.

LEVEL 3: FULLY EFFECTIVE - Performance fully meets the standard expected in all areas of the job. The appraisal indicates that the member has achieved effective results against all significant performance criteria and indicators and may have achieved results significantly above expectations in one or two less significant areas throughout the year.

LEVEL 2: PERFORMANCE NOT FULLY SATISFACTORY – Performance is below the standard required for the job in key areas. The appraisal indicates that the member has achieved adequate results against many key performance criteria and indicators but has not fully achieved adequate results against others during the course of the year. Improvement in these areas is necessary to bring performance up to the standard expected in the job.

LEVEL 1: UNACCEPTABLE PERFORMANCE - Performance does not meet the standard expected for the job. The appraisal indicates that the member has not met one or more fundamental requirements and/or is achieving results that are well below the performance criteria and indicators in a number of significant areas of responsibility. The member has failed to demonstrate the commitment or ability to bring performance up to the level expected in the job despite management efforts to encourage improvement."

In terms of this new rating scale, a cash bonus between 6 and 8% may be awarded for performance rated as outstanding (rating 5). Where performance is rated as significantly above expectations (rating 4), a cash bonus between 3 and 5% may be awarded.
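
To illustrate how these percentage bands translate into actual amounts, the sketch below applies them to a hypothetical annual package. The R500 000 package figure is purely an assumption made for the sake of the example and does not come from the report:

    # Illustrative only: applies the SMS cash bonus bands quoted above to a
    # hypothetical annual package. The R500 000 figure is an assumption for
    # the sake of the example, not a figure from the report.
    BONUS_BANDS = {
        5: (0.06, 0.08),  # outstanding performance: 6-8%
        4: (0.03, 0.05),  # significantly above expectations: 3-5%
    }

    def bonus_range(annual_package: float, rating: int):
        """Return the (minimum, maximum) cash bonus for a rating, or None if no bonus applies."""
        band = BONUS_BANDS.get(rating)
        if band is None:
            return None  # ratings 1 to 3 attract no cash bonus under this scale
        return annual_package * band[0], annual_package * band[1]

    package = 500_000.0  # hypothetical annual package, in Rand
    for rating in (5, 4, 3):
        result = bonus_range(package, rating)
        if result is None:
            print(f"Rating {rating}: no cash bonus")
        else:
            print(f"Rating {rating}: R{result[0]:,.0f} to R{result[1]:,.0f}")

Under these assumptions, an HoD rated 5 would receive between R30 000 and R40 000, and an HoD rated 4 between R15 000 and R25 000, which also illustrates the overlap concern raised earlier: the bands alone do not prevent a level 4 award from approaching a level 5 award.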

Recommendation

Apply SMS ratings: Although the new performance management system for the SMS is only applicable with effect from the 2002/2003 financial year, the Public Service Commission recommends that the rating scale and parameters attached to the system be used immediately in respect of HoDs. This will eliminate concerns raised regarding the inconsistent awarding of cash bonuses.

 

5.11 CONCLUSION

Government committed to improve performance: Government, through the implementation of this framework, has sent a clear message to HoDs indicating its commitment to monitor and ensure improved levels of performance in the public service. The challenge that faces the role players involved in implementing the evaluation framework is to ensure that it meets its intended objectives of measuring performance and facilitating development.

Progress to be monitored: The limited extent to which the framework was implemented during the first year of its application raises concerns about the priority attached to it by HoDs and their Executing Authorities, especially given the fact that it was mandatory for heads of national departments. Progress in this respect will be closely monitored by the Public Service Commission.

It is trusted that the recommendations made by the PSC in this report will assist in facilitating the evaluation process. The success of the evaluation process, however, remains largely dependent on the buy-in and cooperation of EAs and HoDs.