Lack of science education keeping Maltese children behind
Malta is lagging behind international education tests, but Evarist Bartolo says the one-size-fits-all model of testing from America should be resisted by the EU
Malta’s education system must give more importance to the teaching of science, or students will be unable to improve international test rankings.
A study undertaken by the Malta Union of Teachers painted a true picture of Maltese students’ performance in the international educational rankings TIMSS, PIRLS and PISA.
Statistician Vincent Marmara said the study showed science teaching enjoyed lower priority in State and Church schools than in independent schools, and that more of an emphasis had to be placed on this subject if Malta is to do better in the tests. “More focus needs to be put on science subjects, especially at a primary level,” Marmara said.
In PISA (Programme for International Student Assessment), which focuses on maths, science and reading, Malta obtained a score of 465, below the international average of 493.
TIMSS (Trends in International Mathematics and Science Study) gave Malta a score of 481, ranking it 22nd out of the 39 participating countries, also below the 500 average.
The story is the same with PIRLS (Progress in International Reading Literacy Study) – which deals solely with reading – with Malta obtaining a score of 452, well under the 500 average.
Students attending local independent schools performed better in PISA and TIMSS, but Church school students did best in PIRLS.
Marmara said that children from better socio-economic backgrounds performed better in the tests, and that there are evident differences in the backgrounds of students attending State, Church and independent schools.
But the MUT study shows that the tests’ criteria were not catered for in the Maltese education system, in contrast to the situation in other countries, some of which base their curriculum on the assessments themselves, with the objective of obtaining a better ranking.
While most educators had heard of TIMSS, PIRLS and PISA, there was no general in-depth knowledge of what they consist of. “The study clearly showed that only around 28% of educators were either knowledgeable or very knowledgeable about the tests. Three-quarters of educators know only very little about what they involve,” Marmara said.
There was also an incorrect perception about Maltese students’ performance in the tests, with around 40% of educators thinking children were attaining average scores, when in reality they are performing below average.
Educators’ involvement also leaves much to be desired. Less than 17% are involved in PISA, and under 9% in PIRLS. Likewise, less than 5% said the tests were given a high level of importance in the educators’ workplace.
On average, educators said their level of preparation for the tests was low. And when it comes to students, a considerable proportion of respondents (around 30%) said there was no preparation at all for the assessments.
The majority of educators were either “neutral” about the importance of the tests, or else saw the assessments as “just a test”, despite the fact that Malta was being ranked against other countries on its performance, Marmara said.
Only a fraction of respondents said the assessments provide a true picture of students’ knowledge.
A considerable proportion of participants (over 60%) said the language of the assessments had a significant impact on Malta’s results.
Working group to evaluate study’s results
The MUT has recommended that a working group evaluate the results, so that it can raise the bar when it comes to teaching science.
“A decision has to be made on whether Malta should continue participating in the international tests,” the MUT said. “There has to be a discussion of which language should be used for the tests, since this is evidently having an impact on the results... The educational system’s mentality also has to change, making it more oriented towards problem-solving skills instead of having vast content in its syllabi.”
The MUT said the low preparedness levels of all those involved in the tests – not only of those who administer them – had to be addressed, and that ways of bringing the national curriculum more in line with the international assessments should be explored.
The union suggested that the Education Ministry should look to address students’ perception of the tests as irrelevant, and reconsider the period in which the tests are carried out, since this has a bearing on the results and the students’ level of interest.
EU needs its own testing system
Education minister Evarist Bartolo questioned the validity of international testing systems and their relevance to the reality in Malta, while welcoming the MUT’s initiative to carry out a study on the matter.
Bartolo said Malta faces a number of challenges when it comes to its performance in the tests, such as the fact that disabled children are integrated into mainstream education, and the number of foreign children in schools whose native language isn’t English.
“So we need to ask ourselves about the worth of these tests, and what agenda they have. Do they have an educational agenda, an economic one?” Bartolo said, referring to the fact that the studies are carried out by the rich nations’ think-tank, the Organisation for Economic Co-operation and Development, which he called the “educational industrial complex”.
“I don’t say this to discredit the TIMSS, PISA and PIRLS surveys, but one must examine what these tests measure, and how they measure it.”
Bartolo said the EU should not accept “the American hegemony” over testing of education systems, and that European countries should develop their own surveys, taking into consideration the values and skills their schools develop in students. “The countries which perform best in these tests aren’t those which promote inquiry-based learning, but those that push students to learn things by heart.”
He said the EU was working towards the European Education Area, and that every country should find its own path in determining how to measure the efficacy of its education model, but not with a one-size-fits-all method.