National Benchmark Tests Project & standards for National Examination & Assessment Systems: Department of Higher Education

Basic Education

18 August 2009
Chairperson: Ms F Chohan (ANC)

Meeting Summary

Higher Education South Africa (HESA) briefed the Committee on results received from the National Benchmark Tests Project. The project was developed to demonstrate inefficiencies in Higher Education and to address concerns about how to interpret the new National Senior Certificate. There were difficulties in identifying students’ educational needs and there was a lack of appropriate curriculum flexibility at entry to meet these needs. The project aimed to provide additional information about performance in core, underlying areas of Academic Literacy, Quantitative Literacy and Mathematics. The tests categorised students into three bands: proficient, intermediate or basic. HESA identified challenges with students who placed in the intermediate band, which would affect their chances of completing a degree of acceptable quality within a reasonable time. The universities would have to put in place programmes to assist these students with their needs. Students in the proficient band did not need much assistance. An institution that admitted students who fell into the basic band had to be able to demonstrate to the Department of Education that it was involved in initiatives to help these students. The Department of Education stated that people should be intolerant of the level of failure reflected in the matric examinations. It maintained that the matric results were an accurate assessment of the level of students in the country.

The Committee asked about a statement made by the HESA Chairperson to the Higher Education Portfolio Committee that most students in their first year were unable to read, write and spell, doubting whether most students could be functionally illiterate unless the matric exams were severely compromised or there was something else wrong. HESA clarified that the Chairperson’s earlier comments reflected the results of tests written at some historically Afrikaans universities. Members also thought that there must be a serious discrepancy between the National Senior Certificate (NSC) exams, which most students passed, and the National Benchmark Tests (NBT), which the majority did not pass. They questioned what the consequences were of failing the Benchmark Tests, whether these were used to indicate what level of support was required, or whether they were used to “gate-keep” and block some students from admission. The Committee
agreed that something had to be done about meeting students’ educational needs. The NBT should prompt universities to institutionalise their support mechanisms; it was not supposed to be used to turn students away. Members also queried the statement that many students were being “coached” at schools, and examined the distinction between good coaching, which was essentially supportive teaching, and coaching merely to get through the exam papers. The Department was asked to comment on whether it had erred in regard to the recent maths exams, and Members commented that there needed to be greater synergy between the NSC level and the NBT, so that there could ideally be one test for entrance, qualification and support assessment. They also queried whether the fact that the NBT was written four months after students’ last studies had a bearing on the results.

Members also commented that other countries would not let matriculants enter university before undergoing a bridging course, that perhaps it was unrealistic to expect degrees to be completed within three years, which placed strain on the schooling system, that there were cultural and practical differences between schools and universities, and that the fact that many students were entering universities that taught in their second or third languages was a major factor. HESA agreed that it would prefer a four-year curriculum.

The Department briefed the Committee on the examination cycle and the characteristics of the National Examination and Assessment System. Students were encouraged to attain educational excellence.
The assessment system should promote good teaching and learning practices. The Department ensured standards and quality in question papers by applying strict appointment criteria for examiners and moderators. Exam papers were rigorously moderated. The Department described the results of the Impact Analysis it had commissioned to analyse data on the NSC examinations, which showed that learners’ limited proficiency in English contributed to poor performance, and that access to the examination through vernacular languages led to higher levels of success. Comments on the 2008 exam papers were received from chief markers, universities, teachers’ unions and the general public, and generally agreed that most question papers were of an appropriate standard, were based on different cognitive levels, catered for low, medium and high achievers, and adequately assessed the content as stipulated in the National Curriculum Statement. However, Maths Papers I and II were deemed too easy and Physical Science Paper 1 was too difficult.

Members reiterated that a single assessment of school leavers was desirable, to inform the extent to which learners achieved a general education in schools, whether the student was university material, and whether learners would be able to cope in any faculty in any institution. A more determined effort was required to find out what was needed at the Higher Education level, which was perhaps not catered for at school level. They commented that the statement that students at matric level could not read, write or spell was incorrect, and that it had cast unfortunate doubts on the standards of exams and schools. Members questioned whether HESA and the Department understood each other, but were assured that they had been working together for some years. HESA added that universities had wanted to insist that students pass Maths Paper III, but that the Department was unwilling to allow this as it did not have sufficient teachers to teach the subject in all schools. Members agreed that there must be further discussion on this issue, especially in light of the curriculum planning. Members reiterated that there were flaws in the two tests, and that the purpose of the NBT should be to indicate the flaws and where support was needed. Members were generally satisfied with the standards of education, but reiterated the need to bridge the gaps between school and university levels, and to resolve language and cultural challenges. The Department and HESA were asked to return within a month and give a further report. It was noted that Prof Eloff should also ideally attend, or at least clarify on what his statements had been based.

Meeting report

Opening Remarks
The Chairperson stated that both Higher Education South Africa (HESA) and the Department of Education (DoE) were to brief the Committee on their views of the National Benchmark Tests (NBT) Project, and to highlight any other problems. The Chairperson said that the Committee wanted to know about the schooling system and the readiness of school leavers for tertiary education. The Committee was concerned about learners who, although passing matric to a sufficiently high level to be accepted into tertiary institutions, were nonetheless found, upon testing by the institutions, not to be up to standard for various tertiary degrees. It was therefore important to have a discussion with the people responsible for the National Benchmark Tests (NBT) and the people in the DoE responsible for the curriculum and matric examinations.

Higher Education South Africa (HESA) Briefing on the National Benchmark Tests Project
Professor Nan Yeld, Dean of the Centre for Higher Education Development, University of Cape Town, briefed the Committee on behalf of HESA on the results from the National Benchmark Tests Project (NBTP). She was the principal investigator of the NBT Project.

Prof Yeld stated that the NBTP was developed to demonstrate inefficiencies in Higher Education (HE) and to address concerns about how to interpret the new National Senior Certificate (NSC). There were difficulties in identifying students’ educational needs and there was a lack of appropriate curriculum flexibility at entry to meet these needs. It was important to note that HE’s idea of what the core domains of knowledge and skills were might differ from what was deemed most salient by the school-leaving system. It was inevitable that there would always be slightly different results.

The NBT aimed to provide additional information about performance in core, underlying areas. The core areas consisted of Academic Literacy (AL), Quantitative Literacy (QL) and Mathematics [see document for definitions]. At the design phase of the NBT, HESA decided not to test history, life sciences or physics because it did not want people to see the NBT as a direct alternative to the National Senior Certificate (NSC). Most institutions wanted students to enter tertiary education with a decent level of maths, the ability to deal properly with reading material and the ability to express themselves.

The NBT categorised students into three bands: proficient, intermediate or basic. The study looked at whether students would have problems in HE if they had problems in certain domains, for instance whether their levels of proficiency in academic literacy would impede their progress. Students in the proficient band did not need much assistance. HESA identified challenges with students who placed in the intermediate band; this grading could affect their chances of completing a degree of acceptable quality within a reasonable time. The universities would have to put in place programmes to assist these students with their needs. There were serious problems with those in the basic band. An institution that admitted these students had to be able to demonstrate to the DoE that it was involved in initiatives to help them. HESA was not saying that students could not read or write, but that students exhibited challenges in certain domains, and that the institutions must be prepared to work to overcome these challenges.

Prof Yeld discussed how benchmarks were derived. The process was fundamentally different from examination paper design procedures and from the norm-referenced standardisation processes of the NSC. The benchmark setting process was not about whether students could pass a paper or not, but was based on a set of probability assessments made by first-year lecturers. The core question was whether a student would experience academic difficulties that might affect his or her ability to pass an item, and how severe those difficulties would be.

NBT information could be delivered at an individual level and a group level, which was the level of a faculty, qualification or institution. The project involved over three hundred academics from across the sector. Planning for the NBTP occurred in 2005, and implementation occurred in May 2009.

Prof Yeld discussed the data based on the February 2009 pilots [see graphs in document]. She noted that the media had focused on academic literacy. Some of the publicity that the NBTP generated was not a result of information put out by HESA, but was rather in response to broader questions about Outcomes Based Education (OBE). Taken together, the students in the basic and intermediate bands indicated that the majority of students entering higher education needed some support in academic literacy. This should not be a surprise, as most students chose to study at universities using their second or third language. Most students needed only a little help, and fewer needed a great deal of help.

Mr Duncan Hindle, Director-General, DoE, added that he hoped the media would focus on the fact that the majority of students, with some help, should be able to succeed at universities. Studies showed that the Commerce Faculties had the highest number of students who were proficient.

Quantitative Literacy (QL) results showed a real problem in levels of performance. There were highly selected students at university level who could not do QL, and very few students fell into the proficient band for mathematics. Universities had noted a declining level of preparation for university mathematics over the previous few years. The benchmark cut-scores for mathematics were set against the mathematics curriculum for Papers I and II, and against what the HE sector believed students needed to know to be able to cope with first-year study in mathematics.

Several conclusions could be drawn from the NBT results: for example, that the NSC examination could be set at too low a level, possibly approaching that of the former standard grade; that the NBTP test in mathematics could have been unrealistically difficult; or that the new curriculum statement for mathematics might not be taught in its entirety at schools. When NSC maths exam results were compared to NBT maths results, students scored lower on the NBT.

Prof Yeld stressed that the NBT was primarily aimed at HESA coming to grips with important core domains and revealing what it needed to do to meet the needs of its students. [See graph on last page of document]

Discussion
The Chairperson noted that Prof Yeld had stated that tertiary institutions wanted students with “a decent level of maths, the ability to deal with reading material and the ability to express oneself”. She stated that the question was whether the matric examinations delivered these requirements. She added that Professor Theuns Eloff, the HESA Chairperson, had told the Higher Education Portfolio Committee that most students in their first year were unable to read, write and spell. These were students who had already passed the matric examinations, which were supposedly set at a fairly high and rigorous standard. This immediately brought to the fore the question why, if HESA wanted to test the extent of literacy and numeracy in students, it did not regard the matric results as sufficient for its purposes.

Prof Yeld stated that even though results were improving, there would be an unacceptably high failure rate if the NSC reflected the true performance of all students. Although, for instance, the pass rate in English was high, these passes were in fact at very low levels.

Mr Nkosi Sishi, Chief Director: Educational Measurement, Assessment and Examinations, Department of Education, added that the matric results showed that 20% of matriculants qualified for entry into universities. He stated that people should be intolerant of the level of failure in the matric examinations. He thought the matric results were an accurate assessment of the level of students in the country.

The Chairperson stated that this would be debated later on. Members needed to understand what the issues were.  It could not be true that most students were illiterate, as that would either mean that the matric examinations were severely compromised or there was something else wrong.

Prof Yeld stated that she had not seen the reports on which Prof Eloff based his comments.

The Chairperson stated that Prof Yeld should clarify if the comments were based on the same tests that she was describing.

Prof Yeld stated that they were not based on the same tests. Prof Eloff had based his comments on some language tests that were written at historically Afrikaans universities.

The Chairperson stated that she thought that Prof Eloff, as the Chairperson of HESA, had been talking to the NBT Report. Now it seemed that his comments were based on another report. She read from the Business Day article, published on 13 August 2009, that Prof Eloff had stated that “most of SA’s first-year university students could not read, or write or comprehend - nor could they spell. He said that universities that conducted regular competency tests in English and Afrikaans reported a decline in standards”. This remark was directed at the standard of education at schools.

The Chairperson noted that the Maths NBT incorporated Maths Papers I and II into its maths exam, and that it also tested whether students were proficient at a level sufficient for university. This showed that while most students passed the NSC maths, they did not do well on the NBT, which tested what was required at university. She asked if this was a fair interpretation.

Prof Yeld stated that the NBT tested only what was in Maths Papers I and II and did not test any additional mathematics.

The Chairperson asked why HESA tested maths again if it was not testing anything additional.

Prof Yeld stated that the maths was being tested because very different results were being seen, and there were very high failure rates, even when students received high marks in the NSC maths exam.

The Chairperson noted that these results showed either that the NSC maths examination standard was not sufficiently high, or that HESA was not sure whether students could pass the NBT.

Prof Yeld stated that it was possible for a student writing the NSC exam simply to cram before the exam and achieve good results. The new Outcomes Based Education system was trying to do something about this. It was not clear whether students who received 80% and above for the NSC exam would be able to pass the NBT. There was also some speculation on whether everything included in the syllabus for Maths Papers I and II was being taught.

The Chairperson stated that maths was a subject that could not be swotted up and passed, unless there were serious challenges with the security of the maths exam.

Prof Yeld stated that this was not quite correct. There were schools that coached students for the exam. There were people who looked at the exemplars. Maths was not exempt from this.

The Chairperson stated that the crux of the matter was that there was some serious discrepancy. The point of OBE was to test the ability to apply knowledge. The Committee had to look at the consequences of the NBT exam, as opposed to the matric exam. Failing the matric exam had serious consequences. She wondered what the consequences were if students did not pass the NBT.

Prof Yeld stated that students failing the NBT would be placed in an extended programme. Universities also did not have to select these students.

The Chairperson stated that legally, a student was allowed to enter a university if he or she passed matric at a specific level.

Prof Yeld stated that there were minimum requirements set by the NSC, and universities had the right to set additional requirements.

The Chairperson noted that this seemed to be “gate-keeping”.

Prof Yeld stated that the tests were designed for placement purposes. She did not think that students would take the NBT seriously if there were no consequences for failing.

The Chairperson asked for confirmation that if a prospective student applied for engineering studies, he or she would have to write the NBT test first, and the final selection depended on the results.

Prof Yeld answered that it depended on the institution. She stated that something had to be done about meeting students’ educational needs.

The Chairperson addressed the maths results. She asked how fair the assessments of the degree of preparedness were, also in terms of the scarce skills needed in the country. She understood that it was unfair to say that students would not succeed if they did not do well at maths. There were some students who did well in the matric exams and dismally on the NBT. If the NBT was used to get universities to support students, it should not be used as a gate-keeping exercise. The NBT should be advising universities to institutionalise their support mechanisms, rather than being used to turn students away.

Prof Yeld agreed with this statement completely. HESA was trying to do something about the extremely low pass rates.

Ms Penelope Vinjevold, Deputy Director-General: Further Education and Training, Department of Education, also agreed with the Chairperson. The DoE wanted to concentrate on the maths results. There was huge variation between the two results. The majority of students who received 80% or more in the maths matric exams fell into the proficient or intermediate NBT bands. She was not sure if students thought the NBT was very important. In addition, these students had not been academically engaged for four months, since writing their matric examinations. She wondered if the NBT could be administered in July or August, now that students understood the importance of the test. She wanted to know if the NBT results were going to be used in September and October as part of the entrance requirements for all universities.

Prof Yeld stated that University of Cape Town was using the results in the Faculty of Engineering and some other faculties as an indicator. The NBT was rigorously designed to assess performance. This point could be argued. 

Mr D Smiles (DA) asked the DoE to comment on the statement that some students were being “coached” at schools. He noted that quite a few students had passed with 100%. In his experience of teaching mathematics, he had never seen a 100% pass.

Mr Hindle stated that any exam paper would inevitably have a section that most students could complete successfully, while other sections would identify the top students. He was worried about the notion that too much coaching was occurring. In the end, teaching was about putting information across and preparing learners to pass the exams. He did not think that coaching students to pass exams successfully was a bad thing. The DoE wanted teachers to teach the proper curriculum and to prepare students properly. He urged some caution on the notion that coaching students was wrong; good preparation should be encouraged rather than discouraged.

Ms C Dudley (ACDP) addressed the issue of discrepancies between the different tests. She asked if any similar studies were done in other countries, and if the tests could be compared elsewhere. She was also concerned about the coaching issue, saying that no matter how much students were “coached”, they had to have some form of understanding themselves, so they could apply the knowledge to other questions. She did not see a problem with the argument.

Mr Hindle addressed the last graph in the presentation that looked at NBT and NSC maths results. He stated that there was a large difference and the DoE knew it had to address the issue of the discrepancy between the two results. However, the trend line showed that half the NSC results were remarkably similar to the NBT results. What the DoE heard today was very different from what it had heard over the past week, and the media reports had been very damaging to many people. There were a large number of young people who were told that even though they achieved 90% in matric, they were deemed “unable” to read or write. However, the DoE acknowledged that there were very serious performance and quality issues and was trying to resolve the issues. The DoE was involved with a highly structured engagement with HESA, particularly around issues of curriculum and assessment. HESA was very involved in devising the curriculum, assessment and examination process. In many regards, HESA was helping to set the standard of the NSC, and was also involved in the standardisation of results.

The Chairperson asked why, if this was the case, there was a need for further assessments on subjects like mathematics. She would understand the need for further assessments if there was no collaboration between HESA and the DoE. The gate-keeping process worried her greatly. The issue was still why, if the maths matric exam was at a sufficiently high level, the universities still wanted to test the proficiency of the students after that exam.

Mr Hindle stated that this had been the first NSC examination. There was a concern about grade inflation; however, since 2004 the matric results had been either stable or declining, so there could not be grade inflation.

Prof Yeld stated that over the years, testing had been done using the Alternative Admissions Research Project (AARP). It showed declining levels of performance in the domain areas, a result that did not match results from the NSC. Because of the major curriculum innovation in schools, HESA thought it was important to develop a lens through which to interpret the results of the NSC. The DoE had admitted that it had erred with certain subjects, such as mathematics. HESA had acted responsibly to find out what the needs of students were.

The Chairperson agreed that students needed additional support. However, she still had a problem with gate-keeping. She asked if the DoE agreed that it had fallen short with the maths paper.

Mr Sishi noted that Prof Yeld had effectively disagreed with what Prof Eloff had stated. This distinction was crucial. The issue should be clarified so that the public would not be confused. All educational institutions in the world debated the issue of gate-keeping. People needed to know that the South African national exam system had not been established overnight, but was more than 150 years old. The system had benchmarking tools, tests and research.

The Chairperson asked Mr Sishi to answer the specific question of whether the DoE got it wrong in the maths paper, and if it was too easy. She asked if this issue was corrected. She wanted the DoE to give the Committee a sense of what the standard of the matric exam was when compared internationally. She stated that the DoE could answer these questions during their presentation to the Committee.

Mr Sishi stated that he thought that clarity had been provided. The purpose of the DoE’s presentation was to show that South Africa had a world class examination system.

Ms Vinjevold stated that the papers had been written in very early November. The papers were given to HE institutions, which had not commented on the quality. She stated that the NBT wanted to assess the ability of students when they were asked questions phrased slightly differently from the NSC, and said that the way that tests were set could confuse some students. The DoE looked at the papers item by item. HE institutions stated that if a student received 50% in the maths paper in 2009, that student would have passed the previous Higher Grade mathematics. She added that it would have been more useful not to make statements to the press about the results. The DoE had not heard any feedback from HESA on which concepts to improve and test, and HESA needed to give feedback on the benchmark testing per item so that the DoE could improve the curriculum.

The Chairperson stated that there must be synergy between the NBT and matric levels. There should be one test for entrance, qualification and support assessment. She asked what the percentage was that separated the basic, intermediate and proficient bands. Given that the primary aim of the tests was so that universities could provide bridging support for the students, she thought that South Africa’s educational system was a little “awkward”, because it had introduced the unusual practice of letting matriculants enter university at this stage, whereas many other countries would offer bridging or finishing courses before the decision was taken to allow them to university. This took away the problem of bridging having to be offered at the universities. She wondered what kind of support was initiated in SA universities.

Prof Yeld stated that there was a cut-score between the basic and intermediate bands, and another cut-score between the intermediate and proficient bands. Cut-scores were derived by a large group of academics who taught first-year students. They made probability judgments on each item and assessed how important it was for students to pass or fail those items; these judgments were aggregated into a percentage.
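
As a purely illustrative sketch (this is not part of the meeting record, and the NBTP’s exact aggregation rule was not described in the meeting): in a standard-setting exercise of this kind, each of $J$ judges typically estimates the probability $p_{ij}$ that a borderline student would succeed on item $i$ of $N$ items, and a cut-score is then commonly taken as the average of these judgments, expressed as a percentage:

$$\text{cut-score} = \frac{100}{J \cdot N} \sum_{j=1}^{J} \sum_{i=1}^{N} p_{ij}$$

On this reading, a student whose percentage score falls below the lower cut-score would be placed in the basic band, between the two cut-scores in the intermediate band, and above the upper cut-score in the proficient band.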

She clarified that HESA had tried to develop placement tests, not entrance tests. Problems with results and grade inflation occurred in many countries, and were not exclusive to South Africa. Internationally, South Africa was at the forefront of creating the innovations that institutions needed to meet the needs of students. She stated that it was going to be a long time before the twelve-year school system prepared students properly to complete their degrees in three years; internationally, this arrangement was not the norm. The United Kingdom and the United States of America had four-year degrees, and she felt that requiring a degree course to be finished in three years was not fair on the school system. HESA looked forward to a four-year curriculum.

Mr Sishi added that the DoE felt that the values underpinning the NBTP did not differ from values upheld by the DoE. He did not think the matric exams were under attack unfairly. However, people should be assured that the matric exams were of a high standard. The DoE still had a long way to go on how it wanted to prepare students for matric exams. The DoE also wanted to prepare students to succeed at any benchmarking test in the world. He was sure that universities would receive students that were properly prepared. The examinations in the country were not just prepared by the officials in the DoE, but by a panel. The exams were moderated externally.

Standards for the National Examination and Assessment System (NEAS): Briefing by Department of Education
Mr Sishi briefed Members on the examination cycle and the characteristics of the National Examination and Assessment System (NEAS). The characteristics of the system included fitness for purpose, integrity and public confidence, efficiency and cost effectiveness, and transparency. [See document for full details]

The National Strategy for Learner Attainment (NSLA) encouraged students to attain educational excellence. The assessment system should promote good teaching and learning practices, and should include systematic feedback of information to all role players. The South African Qualifications Authority (SAQA) ensured that tests were internationally comparable, and set standards to generate globally recognised qualifications. The National Qualifications Framework (NQF) Bill looked to the advancement and recognition of learning to enhance the values of a democratic society and the well-being of citizens. There were also Quality Councils (QCs) to advise the Minister, to develop and implement policy and criteria on qualifications and assessments, and to perform quality assurance.

The DoE ensured standards and quality in question papers by applying strict appointment criteria for examiners and moderators. Examination papers were rigorously moderated. The process consisted of pre-writing preparation, the writing session, internal moderation, finalisation after moderation, external moderation, finalisation of moderation, translation and editing, proofreading, adaptation and quality assurance. The examinations assessed knowledge, skills and values, critical thinking, problem solving and analytical skills.

The DoE was involved in preparing exemplar question papers for Grades 10 to 12, setting examination guidelines, and the ongoing training of examiners and moderators. The Department was involved in provincial coordination, regular monitoring and support, and the establishment of the National Examination Board and the District and School Assessment Irregularities Board. The DoE was also involved in marking initiatives such as the national training of chief markers.

The Department received comments on the 2008 exam papers from chief markers, universities, teachers’ unions and the general public.
Generally, the comments reflected that question papers were of an appropriate standard, were based on different cognitive levels, catered for low, medium and high achievers, and adequately assessed the content as stipulated in the NCS. There were a few exceptions, as comments had said that the Maths Papers I and II were too easy and Physical Science Paper 1 was too difficult. Examining panels reviewed the comments and made the necessary adjustments.

In May 2008, the DoE commissioned an Impact Analysis of data on the NSC examination. The aim was to gauge the level of performance of learners and their response to new ways of setting papers. The Analysis showed that learners’ limited proficiency in English contributed to poor performance, and that access to the examination through the vernacular languages led to higher levels of success. The DoE also tried translating four question papers into nine African languages. Forty-eight schools, mainly rural and peri-urban (township) schools, participated. Control and experimental groups wrote the papers: the control group was given papers written in English only, while the experimental group was given bilingual papers. The DoE found that there was no significant difference between the performance of learners in the control and experimental groups, for either urban or rural learners, and that learners’ understanding of the subject matter was very poor.

The Department compared the NSC November 2008 question papers to the NSC 2007 papers. The Physical Science paper was quite similar in terms of content coverage, with an incremental improvement reflecting the 29% of additional content. The standard of the English paper remained the same. The level of difficulty, cognitive levels, and types of questions asked in the History paper for November 2008 were an improvement on the 2007 paper. Other subjects that were compared included Accounting, Life Sciences and Mathematical Literacy.
 
Discussion
The Chairperson asked the DoE what concerned it about the maths examination and results.

Mr Sishi stated that he and HESA had both expressed their concern about the maths.

The Chairperson noted that the tests were benchmarked against international tests. She asked what benchmarking was involved.

Prof Yeld stated that the benchmarking was done in the HE sector. The NBT was quality assured by the Assessment Systems Corporation in Michigan and the educational system at Princeton University.

The Chairperson stated that a single assessment of school leavers was needed, to inform the extent to which learners achieved a general education in schools, whether the student was university material, and the extent to which learners would be able to cope in any faculty in any institution. The difficulty related to the throughput of students once they arrived at universities. Learners tended not to be able to achieve a three-year degree within three years. There was a genuine need for bridging the gap between schools and universities. This did not mean that the general education received at schools was of an inferior quality. There were language and culture issues to contend with. The Portfolio Committee on Higher Education was looking into the issue of the cultural gap between schools and universities. Most students were operating at second- or third-language levels when they reached university. What was required was a more determined effort to find out what was needed at the HE level, which was perhaps not catered for at school level. Bridging had to occur without compromising general education at schools. Many countries had a bridging course built into the curriculum, although this was not part and parcel of the South African education system. It was a misstatement to say that students at matric level could not read, write or spell; this cast unnecessary doubt on the standards of the exams and schools. The discourse was about students who had performed well in the schooling system, but who needed support at university level.

Prof Yeld stated that HESA had not gone to the media with the NBT results, and valued its relationship with the Department. She wanted to add a comment about the maths papers. Currently, HE institutions could not include in their entrance requirements that students needed to write Maths Paper III, although this was desirable because this paper contained much that would be required for engineering. The problem was that many schools did not have teachers competent to teach the content of this paper. If HE institutions asked that students write this paper, the DoE saw this as putting pressure on schools to offer the course. This was an issue that needed to be resolved.

The Chairperson asked if the HE institutions would differentiate between the students who did Paper III and those who did not, or if they would make all students write the paper.

Prof Yeld stated that the HE institutions did not test anything that was not in Papers I and II, as doing so would eliminate all the students who could not write Paper III. She reiterated that there were not enough teachers who could teach Paper III.

The Chairperson stated that there needed to be a discourse on the issue. If it was known that engineering students would need that extra maths capacity, then schools would feel pressured to provide that level of maths. If the DoE and HESA wanted students to perform at a particular level, then they should be measuring students’ capacity at that level. If there were problems around students performing at the level of Maths Paper III, then they could say there was a challenge.

Prof Yeld reiterated that the NBT did not test anything that was not in the curriculum.

The Chairperson asked again why universities wanted to re-test the maths curriculum if the matric examinations had already tested Maths Papers I and II.

Prof Yeld answered that it was because there were different results.

Mr Hindle added that any examination paper only tested a selection of the curriculum. If one part was tested, this would give one set of results, but if another part was tested there might be different results. The difference also came in the way questions were asked and the context in which they were asked. A more determined engagement was needed between HESA and DoE. He was not sure that on an aggregate level, they were getting different results.

The Chairperson stated that engagement was important if they were planning the curriculum for the next three years. She added that Prof Eloff had been invited to attend the meeting; however, he had not been too accommodating. The Committee would invite him to attend again. The Committee would reconvene in a month to receive inputs from the DoE on its engagement with HESA and the NBT. If the two entities were testing the same subjects, they should be getting the same results. More determined and definite answers were needed for some of the Committee’s questions. She thought the next engagement would result in a lot more certainty.

Ms Dudley asked why there was a problem in having more than one assessment, as her logic told her that there should be a clearer understanding from both assessments. She did not think that universities would “block” students based on the results of the NBT alone. She thought they would take into consideration the school exam and the NBT. This could guide HE institutions. She wondered if the DoE understood what HESA was saying, and if HESA thought that the DoE understood what they were trying to say.

Ms Vinjevold stated that the DoE had been working with HESA since 2005. The DoE did not see that there was a huge difference between the NSC exam and the NBT. The DoE understood the benchmark tests and there was not a lack of understanding. However, more feedback was needed about concepts that must be taught more thoroughly. More information was needed to improve the system.

Prof Yeld added that the principle of using more than one test was very well established in the higher education system. This was not a new practice at all. The HE system used the NSC as eligibility criteria. The NBT was a more rigorously designed set of tests.

Ms Vinjevold said that the DoE’s concerns related to the purposes for which the NBT would be used. Its support was based on the assurance that the NBT would be used to provide support to students. The DoE was concerned that there were young disadvantaged students who did not know what the curriculum was and what the test would contain. This would “churn up” the system. The DoE did not know if the NBT would continue to be used for its original intention of supporting students. This was worrying.

Mr Smiles stated that he needed a clear explanation from HESA as to whether their expectations were too high or if the standards for assessment were too high.

The Chairperson stated that she could not allow this unfair comment. HESA’s findings were not about high or low expectations. Prof Yeld was querying whether the full curriculum was taught at schools, and where emphasis was placed within certain subjects.

Mr Smiles stated that people in Higher Education were questioning Basic Education as to the readiness of students for university. This could be a result of what was happening in classrooms. The DoE had to ensure that students received a quality education that prepared them for entering the labour market. There were definite quality issues in classrooms that needed to be addressed.

The Chairperson stated that she did not think that the DoE would disagree with this statement.

Prof Yeld did not think that the bar was set too high. She did not think that HESA would ever really get the same results as the DoE even if they worked together very closely. HE practitioners looked at the curriculum and what they thought was important. HESA tested students for a different purpose than the DoE’s tests, so that it was likely that the results could differ.

The Chairperson stated that Members were talking about differentials in the outcomes of the tests. Either there were deep flaws in the way the two tests failed to complement each other or to assess the same things, or this showed that people given a test at one time would achieve different results than if they had been given the same test at another time. Whatever the case, it was something that the experts and educators needed to resolve, deciding how to proceed. The tests must show where the gaps were and where support was needed. HESA and the DoE needed to make inputs into this issue. Her impression was that the issues arose because both entities were at an early stage of assessment. She was quite satisfied that the standard of education was fine and there was no real crisis. However, there was a need to bridge the gap between school level and university level, and to resolve the language and cultural challenges. The DoE had to reach out to HESA and engage on these issues. She asked that both return to the Committee in a month and give Members some sense of the discussions that were held.

Ms G Gina (ANC) believed that students would perform better if they were made to write Maths Paper III.

Ms Vinjevold stated that if Maths Paper III were added to the NSC, the DoE would have to remove another examination paper. If the DoE brought something new into the curriculum, it would have to take something else out.

Ms J Kloppers-Lourens (DA) asked what criteria were used for the compilation of the examining panel. She believed the DoE should use teachers on the panel. She also thought that the issue of coaching should be revisited. There were different definitions of coaching. Some teachers drilled questions and answers, and this had a definite effect on results.

Mr Hindle stated that teachers were involved in the setting of exam papers. He agreed that there were various definitions of coaching. Good coaching was close to good teaching and should be encouraged. The bad kind of coaching was when teachers taught students only to pass the exams, but did not teach them anything beyond that. This should be interrogated.

Mr Hindle added that it was clear that Prof Yeld’s research was not the same as that upon which Prof Eloff had reported. It was a pity that he was not at the meeting. He suggested that the Committee needed to call upon him to make a formal statement about the confusion. It would be better if HESA gave a written statement regarding the matter.

The Chairperson stated that the Committee had invited Prof Eloff to the session but he had indicated that he could not attend. The Committee would invite him the following month. The DoE could meet with Prof Eloff to clarify what his statements were based on. She knew there were two different tests, but she had not known there were two different findings. She added that the engagement had been very constructive and thanked HESA and the DoE for their contributions.

The meeting was adjourned.
