Department of Basic Education on its state of readiness for the 2014 Annual National Assessments and National Senior Certificate Examination

Basic Education

02 September 2014
Chairperson: Ms N Gina (ANC)

Meeting Summary

The Department of Basic Education (DBE) gave a detailed presentation on its readiness in terms of systems and learner preparation, for both the Annual National Assessment (ANA) and National Senior Certificate (NSC). Thereafter, the administration of both examinations was addressed.

The DBE dispelled the myth that confidence in the public schooling system was declining, pointing out that there had been no significant growth in the number of independent schools. It was looking at closing small schools which were not educationally and economically viable, keeping departmental resource constraints in mind. Teacher qualifications had improved since 1994, but this had not yet translated into better learning outcomes. The challenge was that teachers who were expected to teach learners in their mother tongue had themselves been taught in English. Until that was addressed, the situation would not improve. It was conceded that over the years, training quality and support structures for ground staff had been neglected, hence the lack of dividends in terms of learning outcomes. There had been an improvement in the country's international test standings, but it was not yet good enough.

The DBE was continuously trying to improve and enhance examination and assessment functionality. The state of examinations and assessment in South Africa was that exams were well established, with rigorous exam processes. The national question papers were of a high standard, and contributed to improved teaching and learning. The school-based assessment (SBA) was showing signs of improvement and the marking system was steadily improving. The DBE, together with the provincial education departments (PEDs), had monitored the preparations for ANA 2014 through bi-monthly meetings.  All reports from the PEDs indicated that they were ready for ANA 2014.  Provincial visits conducted by the DBE in May and June had verified the state of readiness of each province. All schools had received the timetable for the ANA. The DBE and the PEDs were ready to administer the 2014 NSC examination and an assessment system that was internationally comparable and credible.

The Department was asked to give details on the procedure and criteria for the employment and payment of markers. How ready were schools for the new maths papers? Would any schools struggle to provide adequate accommodation and furniture for the examinations? How would the Department ensure that exam question papers were kept secure? Would learners at schools where no teaching had taken place, such as in the Northern Cape and Eastern Cape, be able to write matric? Members were also concerned that the lack of a reading culture hindered parents' ability to assist their children, and suggested the introduction of mobile reading clinics.

Meeting report

Briefing by Department of Basic Education (DBE)
Mr Paddy Padayachee, Acting Director General, DBE, outlined the purpose and content of the twofold presentation. The first presentation was directed at readiness in terms of systems and learner preparation, for both the Annual National Assessment (ANA) and the National Senior Certificate (NSC). Thereafter, the administration of both examinations would be addressed.

Mr Mathanzima Mweli, Acting Deputy Director General, DBE, depicted the landscape of the country's schools and dispelled the myth that confidence in the public schooling system was declining, as there had been no significant growth in the number of independent schools. The foundation phase made up the majority of the schooling system, followed by the intermediate, senior and further education and training (FET) bands, with pre-grade R constituting the lowest number in the system.

The Department was looking at closing small schools which were not educationally and economically viable, keeping departmental resource constraints in mind. Small schools as a whole constituted 26% of the total schools in the system. In the implementation of the Curriculum and Assessment Policy Statement (CAPS), R7.7 billion had been spent on textbooks alone. The final phase of CAPS (senior phase and grade 12) had cost the most, at R3.5 billion. The retention rate and retrieval of books had improved, with the Western Cape and Eastern Cape having experienced problems in prior years. That notwithstanding, provinces were doing well in this regard, and the 2014-2015 budget was being used to supplement, as required, to move toward universal coverage.

The proportion of qualified teachers had improved since 1994, from 53% to approximately 98%, but this had not yet translated into better learning outcomes. This went to the issue of the foundation phase. The challenge was that teachers who were expected to teach learners in their mother tongue had themselves been taught in English. Until that was addressed, the situation would not improve.

It was indicated that officials visited schools which had previously been a challenge, until the implementation of departmental monitoring. It was conceded that over the years, training quality and support structures for ground staff had been neglected, hence the lack of dividends in terms of learning outcomes. There had been an improvement in the country's international test standings, yet it was not good enough. All provinces save the Western Cape had improved in that regard, but despite the drop, the Western Cape had maintained its position at the top of the national grid. The 'good schools' (those in quintiles four and five) had shown that they also needed attention in terms of the international tests.

The performance of districts showed that departmental interventions were working. Between 2012 and 2013, no provinces went down in terms of the NSC, and only three of the nine provinces saw a decline in results in terms of the ANA; the rest went up. In terms of the supplementary exams, the North West had emerged as the highest performing. In terms of quality, all provinces had shown an upward trend in the number of Bachelor-level passes. The quality of grade 3 mathematics was not that promising, whereas the quality of performance in home languages was better. This trend intensified per grade. The senior phase proved to be the weakest link in the system, hence it was advocated that all interventions should be grounded before the senior phase.

In terms of efficiency, learners took more than 12 years to complete schooling because of perpetual repetition, informed by the skills and knowledge deficit suffered by learners coming from the lower grades. The Department's assessment was that the actual dropout rate was 13-15%, and not 50% as put forward by some analysts and academics. The throughput rate was low, but the dropout rate was not as high as popularly indicated. It was pointed out that the poor performance of learners by international standards was partly due to the lack of a culture of reading. Parents, and South African society as a whole, did not read -- the majority were non- and basic readers, as opposed to advanced readers. Countries that did well in reading also tended to perform well in mathematics and science.

Dr Jennifer Joshua, Director: Curriculum Implementation and Quality Improvement (GET), DBE, explained the genesis of the reading reports, which stemmed from the Minister's request for an audit that recommended a national reading plan. The DBE had developed such a plan and provinces had aligned their individual plans accordingly. The provinces reported on this quarterly and these reports were analysed, with feedback being given to the provinces. The Free State had developed norms for reading fluency and comprehension. The norms stipulated how many words a learner should learn per term, the number of books to be read and passages written. This enabled them to match reading ages with chronological ages, and to put the appropriate interventions in place.

The National Strategy for Learner Attainment contained all the interventions across phases and grades in the system. The provinces reported on a quarterly basis and the reports were analysed, with feedback being given to the provinces. An example was Gauteng, which had developed a languages and mathematics strategy focused on assisting all underperforming schools. Scripted lesson plans and resources were provided, followed by monitoring, and it appeared that such a strategy was working. The FET underperforming schools and first-time grade 12 schools were adopted by senior management service (SMS) members. Subject advisors visited schools and determined whether the curriculum coverage was in line with the national teaching plan, and where there was a discrepancy, advice was given. There were also vacation camps held for all gateway subjects -- languages, maths and science -- where senior managers interacted with learners' books to check what work had been completed against the annual teaching plan. This was an effective means of monitoring and evaluation.

The oversight visit served to survey all interventions in terms of the ANA and the NSC. These processes served to triangulate the information and identify overlaps in efforts to improve learner performance. The Department undertook visits to all provinces, and the provinces presented their progress, challenges and interventions. The Northern Cape was cited as an example. Common tests had been set in that province, and as soon as results for the ANA and the NSC were released, the province did an analysis of learner conceptual problems and performance, and devised improvement plans. All interventions were then organised around those improvement plans so that the key areas were addressed and learners mastered the concepts. The plans were disseminated down to district level and each school customised the plans to suit the areas of challenge for that particular school. In terms of NSC preparations, the province had identified mentors for under-performing schools. There were also other programmes, such as Telematics and HeyMath!, which assisted learners to master mathematical concepts. Teachers had also been trained in language across the grades.

The Chairperson asked for a comparison with provinces that were not performing as well, in terms of their interventions.

Dr Joshua cited the Eastern Cape. The province faced specific challenges, by virtue of it being the largest and rural in nature. Of the 23 districts, the deep rural schools were under-resourced and had trouble managing interventions. Capacity issues in terms of maths and science teachers were also being experienced, as in other provinces. There were attempts on the province's part to ameliorate the situation through the accountability programmes.

Mr Mweli emphasised that all provinces were implementing improvement plans. Preparations were made a quarter in advance. In October, all provinces were expected to bring improvement plans to the planning meeting when ANA results were available, and again in January when NSC results were available, to factor in these outcomes. Interventions had in fact been instituted two years earlier, with the introduction of CAPS. Another important forum was the Minister's meeting with district directors. These meetings were focused on specific issues such as curriculum coverage. Districts were selected at random to present. Further, despite the volatile fiscal environment, provinces were still setting aside budgetary allowances for improvement plans. More work needed to be done to measure the impact of teacher development activities specifically, among other initiatives. Not all provinces were making the effort to provide the requisite 40 reading books per learner, in keeping with the National Strategy for Learner Attainment (NSLA). Many schools did not have access to the number of books required by the sector plan, and that continued to be a challenge.

Dr Rufus Poliah, Chief Director: Exams, DBE, introduced the presentation as an investigation into the authenticity and credibility of the outcomes and outputs of the system. The NSC examination system had developed over a 19-year period. The national assessment system was a fledgling in comparison, but it was forecast that it too would enjoy the same status as the NSC examination system in the years to come. Internationally, South African examinations were comparable. The system was responsive to contrasts and changes. A priority area was the independent centres, which had a vested interest in the outcomes of the exams. Independent learning centres whose integrity the Department was not confident of would have their examinations administered by provincial departments. Another threat to be kept in mind was advancements in technology. In summary, in looking at the state of examinations in the country, the question papers were instructive as they were of a high standard, and it was this high standard that was driving teaching and learning standards. The administration was fairly efficient, and the success of examinations in the country could be attributed to the highly committed staff. The last 19 years had been a progression toward a single national standard, with the provincial systems being brought closer to each other based on national norms and standards that were continually being developed. Two areas remained challenging -- the school-based assessments and improving the marking system.

The national assessment was still a new system, but the provinces had developed the required capacity and there had been an improvement in the quality of the registration data. Printing capacity had improved, although five provinces had their printing outsourced. 85% of the data had been captured in 2013, and it was hoped that in 2014, close to 90% would be captured. Another attribute of the national assessment was that South Africa was one of the few countries in the world that administered a national assessment to grades 6 and 9 on an annual basis, and also that the tests were administered in September, with results being delivered by the beginning of December at the latest.

A significant achievement was that diagnostic analysis was beginning to add a new dimension to teaching and learning. As there were problems with the senior phase in terms of standards, it was conceded that there were problems with grades 10 and 11. Therefore the grade 11s would be writing a common mathematics and physical science examination for the first time in 2014, as part of the initiative to improve standards at that level. That initiative would progress incrementally across subjects in grade 11 and to grade 10. Results were not an end in themselves, so data utilisation was of importance, and in the past year the Department had done a lot in terms of using feedback effectively in the system. Trend analysis, in terms of international standard comparisons and internal year-to-year comparisons, still needed work.

Regarding the NSC, enrolment of full-time learners had dropped by about 24 000, while part-time learners had increased by about 8 000. The number of invigilators and markers had remained fairly stable, with 118 marking centres. Provincially, KZN had the largest enrolment number.   Despite the decrease in enrolment, the number was in keeping with the 2010 numbers. The part-time enrolments were steadily increasing and the Committee was assured that this did not mean learners were being moved from full-time enrolments. Instead, the part-time learners constituted failures from previous years. Learners that took seven subjects or more on a part-time basis were continually monitored.  In terms of subject enrolments, history had seen an increase. The mathematics and maths literacy numbers remained constant, 42% and 48% respectively. The gender distribution also remained constant.  45.4% of the learner population was male, with 54.6% being female.

Monitoring was one of the key elements in terms of administering a credible exam. To this end, there was an early warning system that could pick up problems before they occurred, making the system proactive. The five pillars of the enhanced monitoring approach that were continually being improved upon, were development of norms and standards, mediation of norms and standards, coordination of examination processes, monitoring and support of Provincial Education Departments (PEDs), and evaluation and feedback.  What was stressed was the four-tiered approach in which every level in the system was to take responsibility for the level directly preceding it. This would make the responsibility of the national Department more streamlined and focused. Evaluation of the examination system was done according to the 14 target success indicators. The indicators were made explicit in the system to all provinces and participants, and they were held accountable according to these indicators. At the end of every cycle, every province would be assessed in terms of the indicators.

Mrs Priscilla Ogunbanjo, Director: Examination and Assessment, DBE, went through the exam cycle, indicating what had been done with regard to the state of readiness.  Provinces had already registered candidates and centres on the exam system.  Most of the PEDs did not register learners in grade 12 in 2014, but had rolled over the data from grade 10. The advantage was that the profile of a candidate was readily available on the system, detailing their specific requirements, subject changes, concessions etc.  Provinces had sent out two schedules of entries to schools for verification of the accuracy of registration data.

A notable improvement was that a number of provinces had developed a system of going to the districts with a computer programme enabling the candidates or principals to correct data directly on the system. Subject changes, immigrant candidates and special concessions had been appropriately managed. The DBE was conducting an audit of the registration data in terms of readiness for provision of examination material, to ensure that every candidate was catered for. All enhancements on the examination computer system would be completed by 30 September 2014. The dry run on the system would have commenced by 30 September 2014 and would be done in conjunction with Umalusi. All independent centres had been evaluated to ensure that they satisfied the criteria for registration. In cases where there was doubt as to the integrity of the centre, the examination would be administered by the PED or would be closely monitored. Concessions that had been granted included additional time, amanuensis and scribes. Repeat candidates whose School-Based Assessment (SBA) marks would have exceeded the five-year validity period in 2014 (candidates who had written between 2008 and 2013) were granted a special one-year extension by the Minister.

All 258 question papers for November 2014 and March 2015 examinations had been set and moderated by the DBE to ensure a national standard. The question papers had been edited and quality assured by Umalusi and handed over to PEDs in accordance with their printing plans.  Adaptation of the question papers for blind, partially sighted and deaf candidates was complete.  Brailling of the paper for the blind was in progress.

Mr Mweli had alluded to the provincial preparation of candidates for the mathematics paper, for instance. This was an indication of the changes in the papers which had taken place, based on CAPS. Mathematics had been reduced to two papers, with previously optional Euclidean geometry integrated into paper two, and probability into paper one. 10% of paper one was thus new content, and 30% of paper two was new content. Exemplars had already been developed from grade 10 (two years previously) and examination guidelines had been reviewed for all subjects. Economics would have two papers, as opposed to one in previous years. All home language papers would have a short text replaced by a long text. History would include an additional essay, and consumer studies would require an additional question. The setting panels in physical science and mathematics had been increased, as had those in other subjects where capacity allowed. Separate panels would set papers one and two. However, at the level of moderation there was a team of moderators to moderate across the multiple papers of the subject, to ensure that there was a balance. A fairness review had been introduced for the first time in 2014, in the light of issues experienced with the dramatic arts paper in 2013. This measure ensured that there was no prejudice, offence to any groups, bias or stereotyping in the papers. An additional tier of editing had been introduced as a final quality assurance measure.

In terms of printing, packing and distribution, progress had gone according to plan. For security reasons, question papers were made available to PEDs only a week before printing commenced. The earliest printing could commence was 1 August 2014.  Storage facilities across all PEDs had been inspected and security had been improved at distribution points.  Question papers would be distributed to examination centres on the morning of the examination, except in the Western Cape. In the Northern Cape, question papers would be stored only at schools which were far from the districts. That province had thus successfully converted some schools into nodal points which were equipped with the requisite security measures. The Western Cape would deliver to schools in weekly consignments.

The examinations would commence on 27 October and conclude on 28 November 2014. Invigilators were in the process of being trained across all PEDs. A common national Manual on Invigilation was used across all examination centres. Learners would be required to sign a pledge on 17 October at a pledge-signing ceremony. This was being done to inculcate a sense of commitment to compliance with the examination code of conduct.

The writing of the examinations would be monitored by the DBE, PEDs and Umalusi.  DBE had appointed 30 independent monitors who would be deployed to the provinces.  Schools with a history of previous irregularities would be closely monitored, although the rate of irregularities had decreased with the introduction of the pledge.  2014 would see the introduction of a common answer book across the provinces, except for two provinces whose surplus of previous answer books needed to be exhausted.  All PEDs had a clear control process for the management of answer books. All scripts were to be returned to the district offices on the same day. Scripts were to enjoy the same attention and security as the question paper. Four PEDs would be using bar coded labels in the 2014 NSC examinations, which would ensure better script control. All scripts were to be correctly labelled with the centre number and sealed at school level. All scripts were to be controlled and checked at each stage of the process and all PEDs would follow specific norm times for the return of scripts between collection points. The average was one hour, but exceptions were indicated by provinces and calculated times would be supplied accordingly.

In terms of marking, markers were appointed according to the Personnel Administrative Measures (PAM) criteria, and most PEDs had added learner performance as an additional criterion. National marking guideline discussions would be hosted in Pretoria from October for all subjects to ensure standardisation of marking across PEDs. Internal moderators were to be appointed for each subject at each marking centre. The DBE would appoint external moderators to evaluate the marking at each centre.

A tolerance range had been introduced to control differences in moderation. This was the agreed degree of deviation between the marked and the moderated mark. It was not to exceed 2-3% (not more than one mark) at the total mark level and at the question level. The tolerance range for every paper in every subject would be fixed at the national marking guideline discussions. Where it was found that the tolerance range had been exceeded in more than 50% of the moderated batch, the batch would be re-marked. If the range was exceeded in exactly 50% of the scripts moderated, additional scripts were to be moderated. Where the range was exceeded in fewer than 50% of scripts, the batch would be accepted.

Chief markers and internal moderators would be evaluated during the national training session on their compliance with the tolerance range. If, after multiple opportunities, a marker or moderator was not able to adhere to the tolerance range, they would not be authorised as a chief marker or internal moderator. Given that the chief markers and internal moderators had already been appointed, their services would not be terminated, so the DBE would arrange for support for these officials at the marking centre.

The authorisation process was such that, prior to the commencement of marking, each marker would be given a batch of ten scripts to mark. If more than 50% of the scripts were within the tolerance range, the marker would be authorised. If fewer than 50% of the scripts were within the tolerance range, the marker would be given a second batch. If the marker was still outside the tolerance range, they would be called into a consultative meeting, where their future involvement would be determined.
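The authorisation rule described above amounts to a simple per-batch threshold check. The sketch below is a hypothetical illustration only: the function names, the reading of "within tolerance" as an absolute difference between the marked and moderated mark, and the sample marks are assumptions made for illustration, not part of the DBE's marking systems.

    # A minimal, hypothetical sketch of the marker-authorisation rule described
    # above. Function names, the tolerance definition and the sample marks are
    # assumptions for illustration only, not DBE systems or data.

    def within_tolerance(marked, moderated, tolerance=1):
        # A script counts as "within tolerance" if the marked and moderated
        # marks differ by no more than the agreed tolerance for that paper.
        return abs(marked - moderated) <= tolerance

    def authorise_marker(batches, tolerance=1):
        # Each batch holds ten (marked, moderated) mark pairs. A marker is
        # authorised once more than 50% of a batch falls within tolerance;
        # otherwise a second batch follows, and failing that, a consultative
        # meeting determines the marker's future involvement.
        for batch in batches[:2]:
            agreed = sum(within_tolerance(m, mod, tolerance) for m, mod in batch)
            if agreed / len(batch) > 0.5:
                return "authorised"
        return "consultative meeting"

    first_batch = [(62, 62), (55, 54), (70, 72), (48, 48), (81, 80),
                   (66, 69), (59, 59), (73, 73), (50, 52), (44, 44)]
    print(authorise_marker([first_batch]))  # "authorised" (7 of 10 within tolerance)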

Centralisation of the small-enrolment subjects, which allocated the marking of scripts from all nine provinces to one province for a particular subject, would be piloted in 2014. The pilot would serve as a sounding board for extending the approach to subjects with larger enrolments at a later stage. A province allocated a specific subject would take full responsibility for the marking of that subject, which would entail appointing markers, senior markers, chief markers and internal moderators. The province was responsible for the provision of training, a venue, accommodation for markers, and the management and security of the marking process. There was a bilateral arrangement between PEDs where the number of scripts was less than 1 000. The DBE would coordinate and monitor the process nationally to ensure that there were no glitches. No PED would be allowed to mark a subject where all the quality assurance requirements could not be complied with. Dance Studies and Agricultural Technology would be marked centrally. Centralised marking of deaf and Braille scripts would be done by the Western Cape and Gauteng Education Departments for the third year.

The DBE had conducted an audit of provincial SBA moderation systems in July 2014 in two districts per province. The second moderation would be in October, evaluating the assessment tasks and learner evidence.  Provincial moderation would be complete by 16 October, and all SBA marks captured by 15 November.  The common assessment task for life orientation had been developed and was to be written on 5 September.  National marking guidelines discussion would be conducted in Pretoria for life orientation, ensuring the standardisation of marking across all PEDs.  Marking was to be completed by 14 December, and mark capturing completed by 18 December.  Proposals in respect of adjustments for the pre-standardisation meeting were due by 20 December, with the meetings to be held on 21-22 December.  The Umalusi standardisation meeting was to be held on 23 December. Results would be checked by the DBE, PEDs and Umalusi from 23 to 30 December. Umalusi’s approval meeting would be on 30 December.

The Ministerial announcement on the 2014 NSC results would be on 5 January 2015. The release of results to candidates by schools would be on 6 January 2015. Re-marks and re-checks would take place in early February. Candidates would be given 14 days to apply for a re-mark or re-check. If a candidate was still unsatisfied, the candidate would be allowed to view his/her script. Candidates could write a supplementary exam based on specific criteria.

Detailed analysis of results would be available. Qualitative analytical reports from chief markers and internal moderators would be consolidated for distribution to schools. Outcomes of the Umalusi standardisation meetings would be available to curriculum specialists and examining panels. Workshops would be conducted with teachers and subject advisors. Under-performing schools would be brought to account and improvement plans devised. The 2014 certificates were scheduled to be released to candidates by the end of May 2015. The serious irregularities experienced in 2013 had included the Dramatic Arts question on rape, marker appointments in KZN, lost scripts and copying, although copying had been reduced.

Mr Qetelo Moloi, National Research Coordinator: DBE, said that the DBE, in conjunction with the PEDs, would administer ANA tests on all learners in grades 1-6 and 9.  Mathematics and languages would be tested, as it was believed that these skills were vital to learners negotiating any curriculum. A pilot study would be done on a representative sample of schools and learners at grades 7 and 8 from 16 – 19 September 2014. The timetable had been sent to PEDs and schools in the first quarter of the school year. Key systems for the monitoring of preparations and processes of ANA included bi-monthly meetings with Heads of Exams from PEDs and at least one monitoring visit to individual PEDs by the DBE before tests were written.

The DBE had visited all PEDs between May and June, and the focus had been on assessing the state of readiness for ANA 2014. At that stage, 102 tests for ANA had been developed; for the foundation phase, these were in all 11 official languages. Grades 3, 6 and 9 were key transitional phases, and were therefore priority areas for ANA. The tests had been submitted to experts who served on the ANA advisory committee, and all inputs received were used to finalise the tests. The tests were print-ready by the end of May 2014, and adaptation for learners with special needs had been completed. Registration of learners included capturing essential information, and to date seven million learners had been registered on the GET platform. Three of the PEDs had completed printing and packing. Only the Northern Cape was still in the process of printing. Tests for the Free State, Limpopo and North West were being printed by CTP Printers. KZN and Gauteng materials were being printed by Lebone Litho Printers in Gauteng.

Both service providers were to complete distribution to nodal points by 12 September 2014. Tests were to be written in all schools from 16-19 September. Principals would allocate teachers to invigilate classes that they did not ordinarily teach. Independent schools could participate in the ANA for the purpose of being evaluated for subsidies, and would be invigilated by a district official. The DBE would employ independent monitors to reinforce the monitoring that would be done by the DBE, PEDs and district officials.

Marking would be preceded by memo discussions at national and district levels. School Management Teams (SMTs) would moderate the marking before schools issued reports to parents. Scripts from participating independent schools would be marked by appointed markers at selected centres managed by PEDs. All marks and assessment data per learner would be captured on existing IT systems and eventually uploaded on to the national GET system. DBE professionals would compile provincial and district results and a diagnostic report. The Minister would release the national report in December. The diagnostic report would be given to schools in January for teachers to address the identified learning deficiencies. An independent service provider had been appointed by the DBE to verify all processes. The service provider would monitor the administration of the tests in a random sample of 125 schools per province, and would independently mark, process and report performance from the sampled schools.

Discussion
The Chairperson asked for an explanation of what was meant by the reference to 85% registration. It had been mentioned, as a request, that the issue of subsidies for independent schools be discussed; yet the question at present was why the ANA tests were not compulsory. Was everything according to plan on the question of markers, in light of the KZN issue in 2013? Was the discussion not skewed if the emphasis was on marking at the final stage of schooling, when learners were marked throughout the system but monitored only at the end? This spoke to the issue of teacher development and what was being done in that regard. What was the procedure for the employment and payment of markers? How ready were schools in terms of the new maths papers, given that the sections were previously optional and the teachers themselves were not familiar with the material?

Mr Mweli said that whenever there was a change in the curriculum, there were bound to be negative effects on the results. The Department had thus deliberated on ways to mitigate the negative repercussions. Changes in mathematics had been identified as a potential challenge, so training had commenced in 2012 with the grade 10s. The Department had monitored the system performance from 2012 to 2014 -- at that stage, grade 12s. The learners were indeed struggling with the new content, and it would have been unrealistic to expect otherwise. The request to include these aspects in the curriculum had come from universities.

Dr Poliah clarified that there was 100% registration of learners for examinations. The 85% referred to the capturing of the marks at the end of the process. The other 15% would constitute learners who did not write, or had not had their data captured for various reasons.

Mr T Khoza (ANC) asked about the situation where schools struggled with accommodation and furniture. He also inquired about the percentages of concessions, and what was done where schools failed to identify the need for concessions.

Dr Poliah said that there was a robust policy in place that audited the examination venues in terms of furniture and other resources.  No learner was adversely affected due to inadequate furniture. The Department had asked all provinces to send out circulars to all districts informing them of concessions, and there had subsequently been an increase in the number of requests from disadvantaged communities.  Another hurdle was that in order to qualify for the concessions, certain documents had to be obtained from professionals to whom disadvantaged communities did not have access, so provincial offices accommodated such professionals. On the other hand, the more affluent communities tended to abuse such concessions.  As a result, there were strict quality assurance and auditing mechanisms in place to regulate that process.

Mr D Mnguni (ANC) asked about the difference in learner numbers between grades 1 and 2, and why that number of learners did not continue on to grade 2 from grade 1.   What was the procedure regarding migrant learners from other provinces, as well as foreign learners from outside the country, especially in the border provinces?   He wondered whether it was sound to keep question papers at the schools and asked what the security procedure was.

Dr Poliah explained that there were mechanisms in place to accommodate migrant learners from one province to another on the national database. As to the storage of question papers at schools, the Department was not comfortable with the arrangement in the Western Cape, and warned the province annually of the risks involved. Papers were delivered on a weekly basis, so no question paper remained at a school for more than five days. The Western Cape had done a thorough audit of all its schools, which had strongroom facilities, and principals were trained to use such resources. A double-lock system had also been introduced, using a container with an unlocking code which was sent to the school the morning before the examination. The Northern Cape had adopted the modus operandi suggested by Mr Mnguni; in the main, Northern Cape schools had their papers delivered daily.

Ms A Lovemore (DA) commented that the Internal Efficiency of Schools Systems report issued by the DBE in October 2013 had stated that the dropout rate was 50%, and not 13-15%. Also, the provincial training was not accredited, or up to standard -- what was being done to address that? In terms of improvement plans, what was the quality assurance procedure during the year, and not just at exam time, as most of these plans were untenable or unrealistic? What was to be done about the situation in the Northern Cape, where teaching had not taken place for months, and in the Eastern Cape, where there were many teacher vacancies? Were the learners in those provinces going to write matric? What had become of the recommendations in the National Senior Certificate report? Umalusi had indicated that they would no longer tolerate obviously inflated marks, with specific reference to Life Orientation -- how were they going to do that? Was the Department aware of the circular sent out by Dr M. Ishmael, Head of Curriculum and Assessment at the Northern Cape Department of Education, instructing teachers that all learners must receive a minimum 60% mark for their school-based assessments, and what was to be done about that? The soundness of the post-marking and the verification of the marking done by Umalusi was queried. Further, the statistical soundness of the ANA process was unconvincing. What had motivated the sample size of 125 per province for moderation, given the different learner populations across provinces?

The Department said the circular from the Northern Cape was intended to encourage schools to do well, and not intended to fix results. The wording had lent itself to a detrimental interpretation.

Dr Poliah explained that the verification process involved the candidates’ prior marks being compared to their examination mark.  If the exam mark was above the tolerance range, it would be brought down and vice versa. Thus the primary filter was the statistical moderation.

In terms of quality assurance of turnaround plans, Mr Mweli said that the ideal situation was self-managing schools, where a district plan should be informed by individual schools' plans, and district plans should culminate in provincial plans. The observation thus meant that the Department had to go a level lower than the districts. Yet, if the national Department was to continue going to individual schools, what was the function of the districts and provinces, and when was the national Department going to fulfil its own mandate? Training was to be evaluated in terms of its impact, and an evaluation plan was to be in place before the training commenced. It was conceded that even within the system, analysts confused dropout rates with repetition rates. The two concepts were intertwined but not equivalent; to argue otherwise was too simplistic.

Mr Moloi said that there was good statistical motivation behind the sample size. The Department was in fact over-sampling, in order to accommodate margins of error, deviation and weightings. This did not disadvantage the larger provinces.

Ms J Basson (ANC) required clarity on schools classified as quintile 4 and 5 according to the specified criteria, while having learners who would be classified as disadvantaged. What was being done to address the problem of such schools not accommodating learners from surrounding indigent areas? In terms of the senior phase being the weakest link in mathematics specifically, what was the problem and what was being done to address it? Learners were sometimes punished for the carelessness of markers or invigilators -- what was being done to eradicate this? In terms of the hot spots -- the Northern Cape in particular -- what plan of action was in place to assist those learners? Further clarification was sought on the 85%, and whether it was linked to the voluntary nature of the tests. The keeping of scripts at schools in the Northern Cape should be investigated, even though there were stringent security measures in place.

The issue was the structure of the schooling system. The tendency was for the focus to be on grade 12, with weaker teachers being sent to the lower grades. The Free State, for instance, had adopted interesting solutions, deploying deputy principals to monitor grades 8 and 9. The nature of the schooling system was such that any external assessment would be prioritised; hence the ANA would inculcate discipline in all stakeholders.

Ms D van der Walt (DA) was of the opinion that the process of readiness began on the first day of teaching in the classroom, which meant readiness for the whole year in terms of resources and qualified teachers. It was acknowledged that the nation did not read and therefore could not assist the learners; but in trying to create a culture of reading, a person who wanted to donate books or libraries would face numerous bureaucratic hindrances, and that should not be the case. What was the system regarding the retrieval of books? Why did the table reflect no registrations for ANA for the North West? Where were markers and mentors sourced from, and were they qualified? What was the split between maths and maths literacy? Every year, markers complained about payment. Were there certain criteria that markers were made to sign, acknowledging that their payment was performance-based?

Mr Mweli highlighted the fact that the Department had set extended targets to 2019 to encourage the shift from maths literacy to mathematics. The goal was to have a 70/30 split, where 70% did mathematics and 30% maths literacy. It was going to require a concerted effort and advocacy from parents, communities and schools, as well as a mind shift away from being 100% pass-rate orientated. Learners were allowed to keep workbooks, but were required to return textbooks for future generations. A possible solution was to introduce incentives for increased retrieval rates at schools. Any donation of books was welcome, and there was a 'book flood' initiative, targeting malls and requesting donations of books from the public. It was agreed that adult literacy was a problem, but the Kha Ri Gude Mass Literacy Campaign had made significant inroads.

Dr Poliah said that the qualifications of markers and moderators were not a foolproof method of ensuring the quality of the marking. The criteria were applied very strictly, and there was the additional criterion of learner performance. There was a move toward a competency test. The Department had gone to all provinces to conduct an audit of markers.

Mr Padayachee indicated that quintile 1-3 schools were automatically no-fee schools. Quintile 4-5 schools could voluntarily become no-fee schools by applying to the Head of Department (HOD) and the Member of the Executive Council (MEC). There were issues in terms of extending funding, but some provinces had reclassified some schools. Thus a school should make an application to the HOD, and if the funding was available, the school could be reclassified.

Ms H Boshoff (DA) asked whether mobile reading clinics could be implemented, as these would be especially beneficial to the rural areas. There had been a problem with the direct translation of question papers, which had led to incorrect sentence construction, for instance; what was being done about that? What happened in cases where exam papers did not arrive on time and schools had to print the paper themselves, placing extra pressure on learners? How many subjects were in the centralised marking scheme, and how many provinces were involved? How many independent schools were on board in terms of administration of tests and marking?

Mr Mweli was in agreement that mobile libraries were a brilliant idea. It had been tried on a small scale, but more needed to be done.

Dr Poliah said the Department had experts who translated the material, with the added measure of back-translation in the ANA, which was to be extended to the NSC.

The Chairperson postponed the adoption of minutes and adjourned the meeting.
 
