General information Note
Launch of major international study of Irish Students’
Performance in Science, Mathematics and Reading in PISA 2015
Under strict embargo until Tuesday, 06 December 2016 at
11.00 Central European Time (i.e. 10.00 Irish time).
1. Background
1.1. What is PISA?
The Programme for International Student Assessment (PISA) is a project of the Organisation
for Economic Co-operation and Development (OECD) that aims to measure how well
students, at age 15, are prepared to meet the challenges they may encounter in future life,
including education. PISA takes place every three years and assesses students in the three
domains of Reading, Mathematics and Science.¹
In 2015, the PISA assessments were administered in 72 countries/economies. The scores of
two countries/territories are not included in the ranking tables.
PISA is based on a random sample of 15-year-old students in each of the participating
countries. Over half a million students took part in PISA. In 2015, 167 post-primary schools
in Ireland took part, giving a weighted school-level response rate of 99.3%, well above the
required 80%. After exemptions, refusals and absences were taken into account, 5,741
students completed the assessment, giving a weighted and unweighted student response rate
of 88.6%, which also exceeds the required 80%.
In Ireland, Third Year students account for 60.5% of students in PISA 2015, Transition Year
students for 26.7%, Fifth Year students for 10.9%, and Second Year students for 1.8%.
PISA tests were scaled to an OECD mean of 500 and a standard deviation of 100 in the first
cycle in which a domain had major status – 2000 for reading, 2003 for Mathematics, and 2006
for Science. This means that performance in 2015 can be linked back to these and subsequent
cycles.
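As a rough illustration of this scale convention, the linear transformation that maps raw proficiency estimates onto a mean of 500 and a standard deviation of 100 might be sketched like this. The raw values are made up, and PISA's actual scaling is IRT-based and considerably more involved:

```python
# Hedged illustration of the PISA scale convention: raw estimates are linearly
# transformed so that the reference mean is 500 and the standard deviation 100.
import statistics

def rescale(scores, target_mean=500.0, target_sd=100.0):
    m = statistics.mean(scores)
    sd = statistics.pstdev(scores)  # population standard deviation
    return [target_mean + target_sd * (x - m) / sd for x in scores]

raw = [-1.2, -0.3, 0.0, 0.4, 1.1]          # hypothetical logit-scale estimates
scaled = rescale(raw)
print(round(statistics.mean(scaled), 1))    # 500.0
print(round(statistics.pstdev(scaled), 1))  # 100.0
```

Because the transformation is fixed in the first major cycle for a domain, later cycles can be linked back to it, which is what makes trend comparisons possible.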
1 Throughout the PISA report, the terms reading, mathematics and science are used as shorthand for reading literacy, mathematical literacy, and scientific literacy.
1.2. How is PISA administered?
PISA is implemented on behalf of the OECD by a group of five institutions, led in 2015 by the
Educational Testing Service (ETS), USA. Between 2000 and 2012, the consortium was led by the
Australian Council for Educational Research (ACER). In 2015, the PISA assessments were
administered in 72 countries/economies. See Appendix 1.
PISA 2015 consisted of a field trial and main study. In Ireland, these were managed by the
Educational Research Centre (ERC) on behalf of the Department of Education and Skills (DES).
The actual assessments in schools were administered by inspectors from the DES and, in a
small number of cases, representatives of the ERC.
Each cycle of PISA focuses on one 'major domain' (reading, Mathematics or Science), to
which the majority of testing time is devoted. The 'minor domains' provide a less detailed
account of achievement. Science was the major domain in PISA 2015², while reading and
Mathematics were minor domains.
1.3 What does PISA assess?
The subject areas or ‘domains’ assessed by PISA are reading, mathematics and science. In
each cycle of PISA, one of these subjects is the main focus (‘major domain’) of the assessment,
with less emphasis placed on the remaining subjects (‘minor domains’).
PISA 2015 focused on science as the major domain. The Science assessment measures students' ability to:
o explain phenomena scientifically,
o evaluate and design scientific enquiry, and
o interpret data and evidence scientifically.
PISA also collects contextual information through questionnaires completed by students,
parents and principals, as well as some information from national sources (such as the DES
post-primary database). The ERC will conduct more detailed national analyses of Irish
performance on PISA, drawing on that contextual data, and reports will follow in 2017 and
2018.
2 Science was the major assessment domain in two PISA cycles, 2006 and 2015. This means that detailed information is
available on performance in those years, including performance by subscale.
1.4 How did the tests in 2015 differ from earlier PISA assessments?
The 2015 test itself differed in many ways from previous PISA cycles:
In 2012, countries had the option of participating in computer-based assessments of Reading
and Mathematics in addition to the traditional paper-based assessments of Mathematics,
Reading and Science; Science was the only domain without a computer-based component. In
2015, PISA moved from a paper-based to a computer-based assessment in most participating
countries, including Ireland. This meant that Irish students completed their Science
assessment on computer for the first time in the 2015 round.
The move to computer-based tests also brought a change in the type of questions asked for
the major domain, Science. The number of trend items (questions similar to those asked in
earlier tests) was increased, and new questions, including interactive questions, were posed.
This meant that questions differed in both content and presentation from those used in 2006,
when Science was last the major domain, and in 2012.
PISA 2015 for the first time assessed students’ ability to conduct scientific enquiry by asking
them to design (simulated) experiments and interpret the resulting evidence. This was made
possible through the use of interactive presentations, where students’ actions determined
what they saw on the screen. Twenty-four items included in the main study (or about 13%)
were interactive.
The number of questions used in the Science assessment increased since the last time Science
was a major domain: 103 questions were used in 2006 and 184 in PISA 2015. Of the 184 items
included in the PISA 2015 Science assessment, 85 were trend tasks used in previous PISA
surveys (which had to be adapted for delivery on computer) and 99 were new Science tasks.
In PISA 2015, the OECD worked with a new team of contractors who introduced new scaling
methodologies, new approaches to dealing with not-reached items, and new strategies for
dealing with items that demonstrated significant country-by-item interactions. As a result,
the methodologies used to calculate students' achievement in the tests changed. For
example, in PISA 2015, not-reached items (i.e. unanswered items at the end of test booklets)
were treated as not administered, whereas in previous PISA cycles they were counted as
wrong answers when estimating student proficiency. This new treatment of not-reached
items might result in higher scores than would have been estimated in the past for countries
with many unanswered items.
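A simplified sketch of this change in the treatment of not-reached items follows. Percent correct is used here as a stand-in for PISA's IRT-based proficiency estimates, and the response pattern is hypothetical:

```python
# Sketch of the methodological change described above (simplified; PISA's
# actual scaling is IRT-based). Trailing unanswered items at the end of a
# booklet ("not-reached") were scored as wrong before 2015, but treated as
# not administered in 2015.
# Responses: 1 = correct, 0 = wrong, None = unanswered.

def percent_correct(responses, not_reached_as_wrong):
    # Not-reached items are the run of unanswered items at the END of the booklet.
    n = len(responses)
    while n > 0 and responses[n - 1] is None:
        n -= 1
    reached = [r for r in responses[:n] if r is not None]
    correct = sum(reached)
    if not_reached_as_wrong:
        denominator = len(reached) + (len(responses) - n)  # pre-2015 treatment
    else:
        denominator = len(reached)                         # 2015 treatment
    return 100.0 * correct / denominator

booklet = [1, 1, 0, 1, None, None]  # student ran out of time on the last two items
print(percent_correct(booklet, not_reached_as_wrong=True))   # 50.0
print(percent_correct(booklet, not_reached_as_wrong=False))  # 75.0
```

The same response pattern yields a higher estimate under the 2015 treatment, which is why countries with many unanswered items might score higher than under the old rules.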
2. Performance of Irish students in PISA 2015: summary view
2.1 Summary tables

                                       SCIENCE (Major Domain)   MATHEMATICS         READING
Outcomes for students in PISA          OECD      All PISA       OECD     All PISA   OECD    All PISA
Placing in 2015 (35 OECD / 70 PISA)    13th      19th           13th     18th       3rd     5th

Table 1: Placing of Irish students in PISA 2006-2015. *Mathematics was the major domain in 2012.
                     SCIENCE (Major Domain)      MATHEMATICS                 READING
Outcomes for         Irish    OECD    EU         Irish    OECD    EU         Irish    OECD    EU
students in PISA     mean     mean    mean       mean     mean    mean       mean     mean    mean
Mean score in 2015   502.6    493.2   489.1      503.7    490.2   487.1      520.8    492.5   486.0

Table 2: Mean scores achieved
15-year-olds in Ireland performed above the OECD average on all three of the competency
subscales for Science, with a relative strength on the "explain phenomena scientifically"
subscale.
Science: PISA subscales   explain phenomena    evaluate and design    interpret data and
                          scientifically       scientific enquiry     evidence scientifically
OECD average              493                  493                    493
Ireland                   505                  500                    500

Table 3: Mean scores achieved on Science competency sub-scales
2.2 Proficiency levels
PISA 2015 describes seven levels of proficiency for the overall Science scale. Proficiency levels define
the skills, abilities and competencies that students scoring within specific ranges can demonstrate.
At Level 2, students are beginning to display the key competencies that will enable them to
participate effectively and productively in life situations. Hence, at-risk students are those who
perform below Level 2. The top performers in each domain attain Proficiency Level 5 or higher.
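As an illustration, mapping a continuous score onto these proficiency bands might look like the sketch below. The cut points are the approximate, rounded boundaries of the PISA Science proficiency levels and should be treated as illustrative rather than authoritative:

```python
# Sketch of how a continuous PISA score maps onto proficiency bands.
# The cut points below are approximate, rounded Science scale boundaries;
# treat them as illustrative rather than authoritative.
import bisect

CUTS  = [261, 335, 410, 484, 559, 633, 708]            # approx. lower bounds
BANDS = ["Below 1b", "1b", "1a", "2", "3", "4", "5", "6"]

def science_level(score):
    return BANDS[bisect.bisect_right(CUTS, score)]

print(science_level(502.6))  # "3": Ireland's 2015 mean falls in Level 3
print(science_level(400))    # "1a": below the Level 2 baseline of ~410
```

On this mapping, "below Level 2" (the at-risk group) means a score under roughly 410, and "top performers" (Level 5 or higher) means roughly 633 and above.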
PISA 2015                       % of students      % of students at
                                below Level 2      or above Level 5
SCIENCE       Ireland           15.3               7.1
              OECD average      21.2               7.7
              EU average        21.6               7.4
MATHEMATICS   Ireland           15.0               9.8
              OECD average      23.4               10.7
              EU average        23.9               9.7
READING       Ireland           10.2               10.7
              OECD average      20.1               8.3
              EU average        20.4               7.5

Table 4: Percentage of students at key proficiency benchmarks in PISA 2015
The table below summarises the position of Irish students in the PISA 2015 assessments relative to selected countries.

SCIENCE (Major Domain)
o Students' mean scores significantly higher than Irish students' in 14 countries, including Singapore, Hong Kong, Korea, Chinese Taipei, Japan, Finland, Canada, New Zealand, Australia and Estonia.
o Irish students' mean score not significantly different from those in 10 countries, including the United Kingdom, Germany, the Netherlands, Switzerland, Belgium, Denmark, Poland, Portugal and Norway.
o Students' mean scores significantly lower than Irish students' in 45 countries, including Hungary, the Czech Republic, Austria, France, Sweden, Spain and Italy.

MATHEMATICS
o Significantly higher in 14 countries, including Singapore, Hong Kong, Korea, Chinese Taipei, Japan, Finland, Canada, the Netherlands, Denmark, Slovenia, Switzerland and Estonia.
o Not significantly different in 5 countries: Belgium, Germany, Poland, Norway and Austria.
o Significantly lower in 50 countries, including Australia, the United Kingdom, the United States, Portugal and the Czech Republic.

READING
o Significantly higher in 1 country: Singapore.
o Not significantly different in 6 countries: Hong Kong-China, Canada, Finland, Estonia, Korea and Japan.
o Significantly lower in 62 countries, including the United Kingdom, New Zealand, Norway, the Netherlands, the United States and Sweden.

Table 5: Relative position in 2015
3. How well did Irish students perform on Science?
3.1. Comparing major domain (2015) to major domain (2006)
An analysis of trends indicates that Ireland's mean score on Science in 2015 (502.6) was
similar, in terms of statistical significance, to the score achieved in PISA 2006 (508.3), the last
time Science was a major domain. In all PISA cycles to date, including 2015, Ireland’s mean
performance on Science has been significantly higher than the OECD average.
A number of countries, including Ireland (-5.8) and Northern Ireland (-8.0), experienced non-
significant changes in performance between 2006 and 2015.
Finland (-32.7), New Zealand (-17.1) and Australia (-16.9) were among the countries that
showed significant declines in performance between 2006 and 2015.
3.2 Comparing major domain (2015) to minor domain (2012)
NOTE – The International Report on PISA 2015 places considerably less emphasis on this
comparison than on the more reliable 2006-2015 comparison.
Ireland's performance in Science dropped from 522.0 in 2012 to 502.6 in 2015, a statistically
significant decline.
Ireland's average three-year trend over the period 2006 to 2015 indicated a non-significant
change in score across PISA cycles.
Figure 1: Mean scores on the overall Science scale in Ireland and the average across OECD countries, 2006 to
2015
There were significant declines in performance between 2012 and 2015 in eleven countries
and on average across OECD countries. The largest differences were observed in Hong Kong
(-31.7), Poland (-24.4), Korea (-22) and Ireland (-19.4). Performance dropped by 7.1 score
points in Northern Ireland but this was not statistically significant.
Ireland’s mean percent correct score on interactive items was about the same as the OECD
average, and lower than in any of the comparison countries. While Ireland’s performance on
the non-interactive new items was marginally above the corresponding OECD average, it was
lower than in any of the comparison countries.
[Figure 1 data. Ireland: 508.3 (2006), 508.0 (2009), 522.0 (2012), 502.6 (2015). OECD average: 497.7 (2006), 500.8 (2009), 501.0 (2012), 493.1 (2015).]
3.3 Proficiency levels
PISA 2015 describes seven levels of proficiency for the overall Science scale. Proficiency levels
define the skills, abilities and competencies that students scoring within specific ranges can
demonstrate. At Level 2, students are beginning to display the key competencies that will
enable them to participate effectively and productively in life situations related to Science
and technology and in future education in these fields. The top performers in Science attain
Proficiency Level 5 or higher.
In summary, lower-achieving students in Ireland performed better than the OECD average,
while higher-achieving students performed at or around the OECD average. Ireland performs
less well than its comparison countries among its higher-achieving students.
In Ireland, a greater proportion of students achieves baseline (Level 2) Science proficiency
(84.7%) than in OECD countries on average (78.8%). Among a set of comparison countries,
including those with the highest average performance in PISA, six countries (Estonia, Japan,
Singapore, Canada, Finland and Korea) have a greater proportion of students achieving
baseline proficiency than Ireland. The proportion of students achieving baseline proficiency in
Ireland is greater than in New Zealand, the United Kingdom, Australia, and Northern Ireland,
and considerably greater than in the Netherlands, the United States, Sweden and France.
In Ireland, 7.1% of students perform at or above Level 5 in Science, a similar proportion as
performs at this level across the OECD (7.7%) on average. Northern Ireland (6.8%) and France
(8.0%) also have similar proportions of top performers as Ireland. All other comparison
countries have greater proportions of students at Level 5 and above.
There was a significant decline in the percentage of students performing at or above Level 5
in Ireland between 2006 and 2015 (from 9.4% to 7.1%), while the difference between Irish
students’ mean scores was relatively small (-5.8 points), and the percentages performing
below Level 2 were almost identical.
Compared to most of the selected countries, Ireland displays considerably less variability in
scores on the overall Science scale, meaning the gap between higher and lower performers
in Ireland is smaller than in most comparison countries.
3.4 Gender
Boys (507.7) now significantly outperform girls (497.2) on the overall Science scale in Ireland
for the first time. This is a change from previous PISA cycles in Ireland where the difference
between the performance of boys and girls in Ireland was not statistically significant. The
score difference between boys and girls in Ireland is now 10.5 points compared to the OECD
average difference of 3.5 points.
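Statements about statistical significance throughout this note rest on comparing a difference in means with its standard error. The sketch below is a simplification: PISA's published standard errors come from replicate weights and the two groups are not strictly independent samples, and the standard errors used here are hypothetical (only the two means are quoted above):

```python
# Hedged sketch of the kind of test behind "boys (507.7) significantly
# outperform girls (497.2)". PISA derives standard errors from replicate
# weights; here we simply assume SEs are given. SEs below are hypothetical.
import math

def mean_diff_significant(mean_a, se_a, mean_b, se_b, z=1.96):
    """Two-sided test at ~5% level, treating the two means as independent."""
    diff = mean_a - mean_b
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(diff) > z * se_diff

# With hypothetical standard errors of ~3 points for each group mean,
# the 10.5-point gap clears the ~8.3-point significance threshold:
print(mean_diff_significant(507.7, 3.0, 497.2, 3.0))  # True
```

The same logic explains why a small score change (such as Ireland's -5.8 in Science since 2006) can be non-significant while a larger one (-19.4 since 2012) is significant.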
Also of note is that the mean score of boys dropped by 16.2 points between 2012 and 2015,
while the mean score of girls dropped by 22.8 points. However, girls in Ireland still score
significantly above the OECD average for girls in Science. Boys in Ireland score significantly
above the OECD average for boys.
3.5 What key conclusions can be drawn in relation to performance in Science?
Ireland has maintained its position relative to the OECD average, as its mean score is
significantly higher than the OECD average. Ireland has experienced a small and non-
significant change in performance since 2006 (when Science was last a major domain).
However, Ireland, like many higher-performing countries, has experienced a significant drop
in performance between 2012 and 2015. Ireland’s drop is the fourth largest drop in
performance across participating countries.
An analysis of Irish students' performance on PISA Science items shows that Irish students
performed less well on new Science items specifically developed for computer-based testing
and administered for the first time in PISA 2015.
Students in Ireland performed reasonably well on trend items in Science, which were
transferred directly from paper to computer for the first time in 2015. However, there was a
drop of 2.3 percentage points between 2012 and 2015 on a cluster of 53 such trend items,
which had been administered on paper in 2006, 2009 and 2012 and on computer in 2015.
Performance on these items dropped by 4.2 percentage points on average across OECD
countries between 2012 and 2015, suggesting a somewhat weaker mode effect in Ireland.
Relative to other countries that performed at a similar level to Ireland in 2012 Science,
performance in Ireland was poorer on the 99 new Science items administered for the first time
in 2015, including interactive items. This suggests that the decline experienced in Ireland in
2015 arose both from the change to computer-based assessment and from the types of items
added to the PISA item pool in 2015 (that is, new Science items designed for administration
on computer only).
In considering changes in performance in 2015, it should be noted that Ireland’s score
increased significantly between 2009 and 2012, by some 20.8 score points. If performance
had not increased by this amount, performance in Ireland would probably not have fallen by
19.4 score points in 2015. The ERC has suggested that performance on Science in 2012
reflected the impact of the revised Science syllabus introduced in 2003, for first examination
in 2006. Improvements arising from the introduction of the 2003 syllabus may also have prevented
a decline in PISA 2009 Science (when reading and Mathematics dropped substantially).
Boys' performance in Science was significantly stronger than girls' for the first time.
Questionnaire results show that Irish boys have greater confidence and proficiency in
computer usage than Irish girls; it is suggested that this is a contributory factor in the
differences in performance.
Boys in Ireland also had significantly higher scores on measures of interest and engagement
in Science including enjoyment of Science, instrumental motivation to learn Science, and
interest in broad Science topics. Boys also engaged more frequently than girls in Science-
related activities such as watching TV programmes about Science and visiting websites about
Science topics. However, students in Ireland (both boys and girls) reported considerably lower
levels of engagement in science-related activities than did students on average across OECD
countries.
Performance in Science points to the need for ongoing curricular reform and
implementation in this area at primary and post-primary levels. For example, the outcomes
associated with the description of capabilities at Level 5 of the PISA Framework for Science
are more advanced than the objectives of the previous Junior Cycle Science syllabus.
The new Junior Cycle Science specification introduced in September 2016 should help to
address this. A new focus is given to the Nature of Science, which aims to promote greater
engagement with, and thinking about, how Science works. There is considerably greater
scope for students to carry out scientific investigations, communicate about Science, and
develop an appreciation of the role and contribution of Science and scientists to society.
There are also new assessments which offer students a chance to demonstrate their
achievement as instigators of experiments and projects related to Science, and as authors of
scientific research reports.
4. How well did Irish students perform on Mathematics?
4.1 Comparing minor domain (2015) to major domain (2012)
Mathematics was assessed as a minor domain in PISA 2015. The Mathematics items
administered on computer for the first time in PISA 2015 comprise a subset of the items
administered on paper in PISA 2012.
Irish students, with a mean score of 503.7, performed significantly above the OECD average
mean score on Mathematics of 490.2 in PISA 2015. Ireland was placed 13th of the 35 OECD
countries and 18th overall of the 70 countries/economies in Mathematics performance.
Ireland’s mean score on PISA Mathematics was above the OECD average in both PISA 2012
and PISA 2015. Ireland’s relatively stable performance between 2012 and 2015 is remarkable
in that 8 of the top 30 highest performing countries in 2012 experienced large and significant
declines in 2015 including Korea, Poland and the Netherlands, while five experienced
significant increases, including Sweden, Norway and Denmark. It is unclear if the transition to
computer-based assessment is responsible for these changes and if so, why.
4.2 Proficiency levels
In Ireland, 15% of students perform below Level 2 on PISA Mathematics (indicating weak
performance) compared with 23.4% on average across OECD countries. Only a small number
of comparison countries (Singapore, Japan, Canada and Estonia) have fewer students
performing below Level 2.
On the other hand, 9.8% of students perform at Levels 5-6 (indicating strong performance in
Mathematics), compared with an OECD average of 10.7%. In Ireland, in general, lower-
performing students do well relative to their counterparts in other PISA countries, while
higher-performing students do less well. Given that overall Mathematics performance in
Ireland is above the OECD average, this suggests that higher-achieving students in Ireland are
underachieving.
There is a relatively narrow gap between lower-achieving and higher-achieving students in
Ireland compared to other OECD countries. Other countries with a relatively low spread of
Mathematics achievement include Denmark, Estonia and Norway. The spread of
Mathematics achievement in Northern Ireland is similar to Ireland.
4.3 Gender
In general, Irish boys outperformed Irish girls in Mathematics. Ireland had one of the largest
gender gaps in Mathematics among OECD countries. The difference (16.1 points) is
statistically significant and is larger than the corresponding OECD average difference (7.9),
also in favour of male students. Other comparison countries with relatively large differences
in favour of male students included Germany (16.6 points), Japan (13.8), Switzerland (12.0),
and the United Kingdom (11.6).
The proportions of boys (14.1%) and girls (15.8%) performing below Level 2 are broadly similar.
However, in Ireland, 12.9% of boys, compared with 6.5% of girls, perform at Levels 5-6
(indicating high performance). The proportion of boys performing at Levels 5-6 in Ireland is
equivalent to the OECD average, while fewer girls in Ireland than on average across OECD
countries perform at these levels.
It is a matter of concern that just 6.5% of girls in Ireland performed at or above Level 5 in
Mathematics. This is below the OECD average of 8.9% in 2015 and down from 8.2% in 2012.
4.4 What key conclusions can be drawn in relation to performance in Mathematics?
Ireland maintained and even improved somewhat on its performance in Mathematics
compared to earlier PISA cycles. Ireland’s mean score in Mathematics has been above the
OECD average in both 2012 and 2015. PISA 2015 was the first cycle in which all participating
students in Ireland had studied the new mathematics curriculum for Junior Certificate and
Leaving Certificate, colloquially known as “Project Maths”. However, it is difficult to assess
the impact of the new curriculum on performance, given the other changes to PISA in 2015,
including the transition to a computer-based platform. It may be that in Ireland, changes
arising from the new curriculum contributed to Ireland’s stable performance compared to
other countries, enabling students to maintain and even improve a little on their
performance, and deal more effectively with the change to computer-based assessment. It
will be interesting going forward to note whether further improvement takes place as the
new curriculum becomes more established in the system.
The relatively poor performance of Irish 15-year-old girls in Mathematics compared to Irish
boys is puzzling, especially when the opposite occurs in reading. Underperformance among
girls is not reflected in the Junior Certificate examination where, on average, girls achieve
higher grades in Mathematics than boys. Also, there was no significant difference in the
performance of boys and girls in Mathematics or Science for fourth class or second year
students, as indicated in the TIMSS 2015 study. The gender difference observed in Ireland in
2015 is likely to arise from the transition to computer-based assessment (with girls finding
the transition more challenging than boys) and from differences in performance on aspects
of PISA Mathematics (girls in Ireland did significantly less well on the Shape and Space content
area in 2012).
The underperformance of higher-achieving students in Ireland in Mathematics is a matter of
concern but is not altogether unexpected. The need to challenge the more-able students is a
constant theme of inspection reports in Mathematics. The new Junior Cycle Framework with
its emphasis on formative assessment across all subjects should contribute to an enhanced
performance among students when it is fully rolled out. However, teachers also need to set
more challenging, real-life Mathematics tasks for higher-performing students, within the
parameters of existing syllabi.
The limited spread of results, i.e. the relatively narrow gap between lower-achieving and
higher-achieving students in Ireland compared to other OECD countries is very welcome, as
are relatively small differences in performance across schools. However, these patterns may
be linked to the relatively weak performance of higher-achieving students in Ireland,
compared with higher-achievers in other OECD countries, coupled with relatively strong
performance among lower-achieving students. There is also a view that teaching may focus
on lower achievers (through direct instruction, memorisation, repetition, frequent testing
and revision) rather than also on the types of learning from which higher achievers would
benefit (problem solving, discussion, collaborative learning, student-oriented instruction,
use of control strategies and so on).
These patterns suggest a need to raise performance across all achievement levels – among
high achieving students because they appear to be under-performing, and among lower-
achieving students because they continue to lack key mathematical skills needed for their
future lives.
Although average performance in Ireland was significantly above the OECD average in 2012
and 2015, Ireland still performed less well than a number of European countries, including
Estonia, Switzerland, the Netherlands, Denmark and Finland. There is still plenty of scope for
improvement in performance in Mathematics. Furthermore, stronger performance on
Mathematics might contribute to improved performance on Science, especially as the new
interactive items on computer-based Science in PISA place a stronger premium on aspects of
mathematical proficiency.
5. How well did Irish students perform in Reading?
5.1 Comparing minor domain (2015) to major domain (2009)
Reading was last a major assessment domain in 2009. The reading items in PISA 2015
comprised items that had been administered on paper in PISA 2009; in 2015 they were
administered on computer for the first time. Because these items had been used in
paper-based assessments in previous cycles, the transition to computer-based assessment
does not appear to have unduly affected students in Ireland in this domain.
The average score in reading in Ireland in 2015, when reading was a minor domain, is 520.8,
compared to an average score of 495.6 in 2009. However, Ireland's relatively poor
performance in 2009 was not consistent with earlier cycles of PISA.
It was confirmed in 2012 that the decline in reading observed in Ireland in 2009 had been
reversed, and the outcomes for PISA 2015 confirmed this.
5.2 Comparing minor domain (2015) to minor domain (2012)
Reading was a minor domain in 2006 and 2012. The average score in reading in Ireland in
2015 is slightly lower than, but not significantly different from, the score achieved in 2012.
The mean score for girls dropped back to 526.9 in 2015 from 537.7 in 2012.
Mean scores for boys rose slightly from 509.2 in 2012 to 515.0 in 2015. This rise in boys'
scores compensated to an extent for the decline in the performance of girls.
5.3 Proficiency levels
10.2% of Irish students performed at the lowest level of proficiency (i.e. below Level 2) in
2015. This is about the same as in Estonia, Canada, Finland and Singapore. This is significantly
lower than the corresponding OECD average of 20.1% and lower too than the percentage of
students who perform below Level 2 in Northern Ireland (15.3%).
10.7% of students perform at or above Level 5. This is slightly higher than the OECD average
of 8.3%. The proportion of students performing at or above Level 5 in Ireland is below that
of a number of other countries with high average performance including Singapore (18.4%),
Canada (14.0%) and Finland (13.7%). Ireland has more higher-achieving students than
Northern Ireland (6.0%).
While the range of reading achievement in Ireland is significantly smaller than on average
across OECD countries, there is a need to increase the proportion of students scoring at or
above Level 5. There is also a need to ensure that those scoring below Level 2 have adequate
skills for their future lives.
5.4 Gender
Irish girls significantly outperform Irish boys on PISA 2015 in overall reading. The gender
difference is significantly smaller in Ireland compared to the OECD average difference.
The proportion of lower-achieving boys in reading (12.3%) is greater than that of lower-
achieving girls (8.0%). The proportions for both genders are well below the corresponding
OECD averages (24.4% and 15.6% respectively).
The percentage of higher achieving girls – those performing at Levels 5-6 (9.9%) – is close to
the OECD average for girls (9.0%). The percentage of higher-achieving boys – those
performing at Levels 5-6 – is 10.7% compared with an OECD average of 6.8%.
The percentage of higher-achieving girls has decreased since 2012 while the percentage of
higher achieving boys has increased. This suggests that further improvement in reading in
Ireland could be achieved if higher-achieving girls could return to the performance level they
achieved in 2012.
5.5 What key conclusions can be drawn in relation to performance in reading?
Our performance in reading continues to be at a very high standard. Just one country,
Singapore, has a significantly higher mean score in 2015.
The percentage of students performing at or above Level 5 in reading in Ireland is only slightly
above the corresponding OECD average proportions, while other high-performing countries
such as Finland and Canada have higher percentages of students reaching this benchmark.
This is consistent with patterns identified in earlier cycles of PISA. There is a need to stretch
and challenge our more able students so that they will perform better at this level. We need
to ensure that girls, in particular, cope more effectively with the transition to computer-based
assessment of reading.
There is still a significant gap between the performance of girls and boys in reading; however,
the gap is among the lowest in the set of comparison countries. It is also close to the
difference in Northern Ireland. The National Strategy: Literacy and Numeracy for Learning and Life
2011-2020, which was rolled out from 2011, may have been instrumental in ensuring higher
scores in reading for boys. It may also be the case that boys adjusted more easily to the
requirements of computer-based testing, compared with girls.
6. Factors influencing student performance in 2015
6.1 Test-related factors
It is unclear to what extent the transition to computer-based testing contributed to the large
drop in mean performance in Science in Ireland between 2012 and 2015. Analysis of item-
level data suggests that, compared with a set of countries that performed at the same level on
PISA Science as Ireland did in 2006, students in Ireland performed disproportionately poorly
on the new Science items (i.e., those developed specifically for computer-based testing in
2015), and especially on items described as ‘interactive’.
The persistently low levels of engagement with ICT for school-related tasks, both in school
and for homework, among students in Ireland may explain, in part, why students in Ireland
struggled with the new PISA Science items, particularly those that required them to run
simulations while manipulating a number of variables at the same time.
The TIMSS data also show that Irish 15-year-olds’ use of ICT in the classroom as a tool to
support teaching and learning is significantly below the international average.
The stronger perceived ICT competence among male students, and their greater involvement
in social discussions about ICTs, may also have contributed to the greater gender difference
in Science in Ireland in 2015, compared with earlier PISA cycles.
The computer-based items in reading and Mathematics were items that had been used in
previous paper-based PISA assessments. These items were not described as
“interactive” and would not have made the same demands on students unaccustomed to
computer-based testing. By contrast, the demands made of students by the new interactive
Science items were quite different from those made by the computer-based trend items in
reading and Mathematics.
Notably, over half of students in Ireland in PISA 2015 had never taken a computer-based test
before. These students had significantly lower scores on Science, reading and Mathematics
than students who had taken a computer-based test on at least one occasion.
6.2 Student characteristics
Student economic, social and cultural status (ESCS) is a strong predictor of achievement in
Ireland and across OECD countries, accounting for 12.7% of the variance in Science
achievement in Ireland and 12.9% across OECD countries on average. Elements within the
ESCS index (parent occupation, parent education and home possessions) were all found to
have significant positive correlations with achievement in Science, with parent occupation
having the strongest association with achievement in Science, reading and Mathematics
among students in Ireland.
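The “variance explained” figure is the R² obtained when achievement scores are regressed on the ESCS index. As an illustration only (the ESCS values and scores below are invented, not PISA data), the calculation for a simple linear regression reduces to the squared correlation:

```python
# Illustrative only: toy ESCS values and science scores, not real PISA data.
escs = [-1.2, -0.8, -0.3, 0.0, 0.4, 0.9, 1.3, 1.8]
score = [455, 470, 490, 498, 505, 520, 531, 548]

n = len(escs)
mean_x = sum(escs) / n
mean_y = sum(score) / n

# Pearson correlation between ESCS and score.
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(escs, score))
sxx = sum((x - mean_x) ** 2 for x in escs)
syy = sum((y - mean_y) ** 2 for y in score)
r = sxy / (sxx * syy) ** 0.5

# In a simple linear regression, the variance explained is r squared.
variance_explained = r ** 2
print(f"{variance_explained:.1%} of score variance explained by ESCS")
```

With the published figure of 12.7% for Ireland, the implied correlation between ESCS and Science achievement is about 0.36.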
According to parents, 83.8% of students in Ireland have attended some form of pre-primary
education (see footnote 3). The comparable figure for OECD countries on average is 56.01%.
Students who attended pre-primary education scored higher on Science, reading and
Mathematics than students who did not.
There has been a small increase in the percentage of students categorised as coming from an
immigrant background in Ireland between PISA 2012 (10.1%) and PISA 2015 (14.4%).
However, there were no significant differences observed in overall Science performance
among students in Ireland based on immigrant and language background. The performance
of native and immigrant students had aligned by 2015, compared to a gap in their performance
in 2006. The only difference observed on the basis of immigrant status is in reading, with
native students scoring some 25 points higher than immigrant students who speak a language
other than English or Irish at home (see footnote 4).
Gender was a significant marker for student performance with girls performing better than
boys in reading and boys performing better than girls in Mathematics and Science.
Interest in ICT among students in Ireland is significantly greater than among their OECD
counterparts. Students in Ireland do not differ from their OECD counterparts on their use of
ICT outside of school for leisure, but they use ICT less as a topic of social interaction compared
to students on average across OECD countries. However, students in Ireland use ICT at school
in general and outside of school for schoolwork significantly less than students across OECD
countries on average.
Boys and girls differ in their familiarity with, and use of, ICT. Compared to girls, boys feel more
competent and autonomous with regard to ICT use. The two groups do not differ in their
overall interest in ICT or in their use of ICT at school in general.
6.3 Student attitudes towards and engagement with Science
It is encouraging that students in Ireland reported a large and significant increase in their
enjoyment of Science between 2006 and 2015 and that Ireland’s mean score on this index is
now significantly above the OECD average. It is also noteworthy that instrumental motivation
3 e.g., playschool, Montessori, pre-school, naíonra (Irish language pre-school), Early Start, Traveller pre-school programme.
4 It should be noted, however, that large standard errors arising from the small numbers of students in the immigrant groups may mask differences that exist between the groups on the basis of immigrant and language background.
to learn Science has increased significantly, perhaps because students are more aware of the
importance of Science in the context of their future careers.
The disparity in Science performance between boys and girls is reflected in attitudes to
Science. Irish boys score significantly higher on enjoyment of Science and interest in broad
Science topics than girls. They also engage more frequently in Science activities than girls.
However, boys and girls do not differ significantly in their motivation to learn Science.
It is positive to note that around eight out of ten students in Ireland, and more girls than boys,
are studying or intending to study Science to Leaving Certificate level. However, fewer than
one-third of students in Ireland expect to be in a Science-related career at age 30, although
this is slightly above the OECD average.
Although there has been an increase in Ireland’s mean score on the PISA index of Science
activities, it still lags behind the corresponding OECD average score, and students in Ireland
continue to report low involvement in activities such as visiting websites about Science
topics, reading Science magazines, and attending Science clubs. Frequency of engagement
with Science activities is associated with higher performance on PISA Science and this may
explain some of the findings in relation to Science.
6.4 School-related factors
Students’ participation in education is also associated with achievement. Those deemed at
risk (see footnote 5) of early school leaving score significantly lower on Science than those
deemed not at risk. The majority of students (87.5%) are considered not at risk; the 12.5%
of students at risk of early school leaving have significantly lower mean scores on overall
Science, reading and Mathematics.
Between-school variation in performance on Science accounts for 11.5% of the total variation
in Science performance in Ireland compared to 30.1% across OECD countries on average, and
50.0% in countries such as the Netherlands, Hungary and Bulgaria. This can be interpreted as
an indicator of equity in the system.
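The between-school share of variation is the between-school variance expressed as a proportion of total variance (an intraclass correlation). A minimal sketch with invented scores for three hypothetical schools, not actual PISA data:

```python
# Illustrative only: invented science scores grouped by school, not PISA data.
schools = {
    "A": [505, 512, 498, 520],
    "B": [495, 488, 501, 492],
    "C": [510, 507, 515, 503],
}

all_scores = [s for scores in schools.values() for s in scores]
grand_mean = sum(all_scores) / len(all_scores)

# Between-school variance: spread of school means around the grand mean,
# weighted by school size.
between = sum(
    len(scores) * ((sum(scores) / len(scores)) - grand_mean) ** 2
    for scores in schools.values()
) / len(all_scores)

# Total variance of all individual scores.
total = sum((s - grand_mean) ** 2 for s in all_scores) / len(all_scores)

share = between / total  # between-school share of total variation
print(f"Between-school share: {share:.1%}")
```

A low share, as reported for Ireland (11.5%), means schools differ little from each other and most variation lies between students within the same school.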
The difference in Science performance between students in advantaged and disadvantaged
schools (see footnote 6) is 67.3 points in Ireland and 103.8 points across OECD countries on
average. In Ireland, 19.8% of students attend a socioeconomically disadvantaged school.
Students in socioeconomically disadvantaged schools score significantly below the OECD
average in PISA Science, both in Ireland and across OECD countries on average. School
average ESCS is also significantly related to performance in reading and Mathematics.
5 Students in Ireland deemed at risk of leaving school early are those who indicated the Junior Certificate as the highest level of formal education they expect to complete.
6 The OECD classifies schools by level of disadvantage based on the proportion of students scoring in the bottom, middle and top quartiles on the ESCS index for a particular country/economy.
While performance in Science dropped in all school types, the decline was greatest in girls’
secondary schools where there was a significant drop of 14 score points. Students in boys’
secondary schools have the highest mean scores in Science. In addition, students attending
schools in the School Support Programme (SSP) under DEIS have significantly lower mean
scores on Science compared to students in non-SSP schools. Students in fee-paying schools
outperform students in non-fee-paying schools in Science, reading and Mathematics.
ICT usage in Irish schools, as it relates to schoolwork, whether at home or at school, lags well
behind OECD average levels in both 2006 and 2015. Interest in ICT among students in Ireland
is significantly greater than among their OECD counterparts. This suggests that student
interest needs to be translated into real ICT usage and skills development in Irish classrooms.
Characteristics of Science classes that are related to performance in Science include the
disciplinary climate in classrooms, teacher support, and feedback from teachers, as well as
specific teaching practices including adaptive instruction, inquiry-based instruction, and
teacher-directed Science instruction. Inquiry-based practices are reported to be particularly
useful for the physical and life Sciences.
6.5 Changes in the curriculum and related initiatives
Student performance may be influenced by changes in the curriculum and related initiatives.
The roll-out of the National Strategy: Literacy and Numeracy for Learning and Life (2011-2020)
between 2011 and 2016 may have contributed to Ireland maintaining its relatively stable
position in reading and Mathematics between 2012 and 2015. It may also have contributed
to a rise in the reading scores of boys in PISA 2015.
7. Summary
7.1 What were the main strengths in students’ performance in PISA 2015?
Ireland continues to maintain high standards in Science, Mathematics and in reading.
The mean performance of Irish students in reading was significantly above the OECD average.
Ireland ranked 3rd out of the 35 OECD countries and 5th of all 70 participating countries.
The mean scores for Ireland in both Mathematics and Science are significantly above the
average scores for OECD countries, and Ireland’s performance ranked 13th among all OECD
countries in both domains in 2015.
Ireland has improved on its performance in Mathematics compared to other PISA cycles and
is now significantly above the average for the second cycle in a row.
There is good progress in terms of maintaining and improving the performance of lower-
achieving students. Lower-achieving students perform significantly better than the OECD
average in reading, Science and Mathematics.
Compared to other countries Ireland displays considerably less variability in scores on the
overall Science, Mathematics and reading scales. This can be interpreted as an indicator of
equity in the system. Equally, it may indicate that higher-achieving students are
underperforming.
It is encouraging that students in Ireland report a large and significant increase in their
enjoyment of Science between 2006 and 2015 and that Ireland’s mean score on this index is
now significantly above the OECD average.
7.2 What are the main areas for development?
Compared to other countries we are doing well, but internationally we are not in the top
group, apart from in reading. If we are to be competitors in the global arena, we have to look
at the performance of Asian economies such as Japan and Hong Kong (China) and countries
like Estonia and Finland closer to home.
Our performance on the new interactive items in the computer-based Science assessment
suggests that Irish students are disadvantaged by the patterns of computer usage in
schools.
There is a need to emphasise problem solving and higher-level thinking in Mathematics and
Science, in order to stretch our higher-achieving students.
Students in Ireland continue to report low involvement in activities such as visiting websites
about Science topics, reading Science magazines and attending Science clubs. There should
be a focus on how to increase Irish students’ interest and involvement in Science-related
extra-curricular activities, especially outside the context of transition year.
Ireland has one of the largest gender gaps in Mathematics, with boys outperforming girls at
all levels, particularly at Levels 5-6. There is a need to increase girls’ interest in Mathematics
and strengthen their performance.
7.3 What actions are already underway to address these issues?
We have made a strong commitment in the Action Plan for Education 2016-2019 to focus on
improving achievement and seeking continual improvement in standards.
Ongoing curricular reform across the system, with the roll-out of Junior Cycle reform since
2014, is expected to improve performance at all levels.
A new Science specification has been introduced this year for junior-cycle students. This has
been developed to significantly enhance our students’ ability in a full range of Science skills
including their ability to apply their scientific knowledge and interpret scientific phenomena.
It also includes a new strand called the Nature of Science. The PISA report shows that
characteristics of Science classes that are related to performance in Science include feedback
from teachers and inquiry-based instruction. These are all aspects of the new specification.
The Junior Cycle for Teachers (JCT) support service has a designated Science team, which
began providing CPD for teachers in January 2016 and is supporting teachers in their
engagement with the learning outcomes and assessment in Junior Cycle Science.
The Mathematics curriculum for Junior Certificate, colloquially known as “Project Maths”, will
be updated with a further Junior Cycle Mathematics specification, to be published in 2017 and
implemented in 2018 as part of junior cycle reform.
The Department’s new Digital Strategy for Schools 2015-2020: Enhancing Teaching, Learning
and Assessment, which was launched in October 2015, sets out a clear vision that is focussed
on realising the potential of digital technologies to transform the learning experiences of
students by helping them become engaged thinkers, active learners, knowledge constructors
and global citizens who participate fully in society and the economy. This policy sets out a
5-year plan to embed ICT in teaching, learning and assessment through an ambitious
programme of work for the period 2015-2020. Implementation of the actions in the plan will
support additional use of information and communications technology (ICT) and digital
learning across the curriculum including in Science and in Mathematics.
Other steps that are being taken or that are underway include:
• A sustained system-wide focus on the implementation of actions to support numeracy as proposed in the National Strategy: Literacy and Numeracy for Learning and Life 2011-2020, informed by the interim review of the Strategy which is due to be published shortly.
• A STEM Education Policy Statement, due to be published in early 2017, which will further support teaching and learning in Science and Mathematics at all levels of the education system.
• Two new short courses, Coding and Digital Media Literacy, developed by the NCCA as part of junior cycle reform.
• A continued focus, through the school self-evaluation (SSE) process, on promoting improvement across the domains of learner outcomes, learner experiences and teachers’ practices.
• Continued support by the DES of initiatives and programmes which promote innovative teaching and learning experiences in Mathematics and Science in primary and post-primary schools, such as I-Wish, and in general competency in spatial awareness, such as the National Spatial Skills Project.
The DES will continue to support important Science events in the educational calendar such
as the BT Young Scientist and SciFest.
8. Comparison with TIMSS results from last week
A number of the findings of the PISA tests are consistent with last week’s TIMSS results, i.e.
commentary in relation to high-achievers and to performance of low-achievers. Ireland was above
the PISA mean score and above the TIMSS centrepoint for both Mathematics and Science. (Note that
the TIMSS results did not include Reading, but its sister survey, PIRLS (Progress in International Reading
Literacy Study), in 2011 confirmed high reading scores for Ireland, which is consistent with the PISA
results for Reading.)
Appendix 4 of this note contains details identifying the similarities between the PISA and TIMSS tests,
in terms of their aims, focus and structure. It also includes tables showing how countries that
participated in both PISA and TIMSS in 2015 compared in terms of their performance in each test,
relative to Ireland.
APPENDIX 1 Mean country/economy scores, standard deviations and standard errors for the overall science scale, and positions relative to the OECD and Irish means, for all participating countries/economies (PISA 2015)
Mean SE SD SE IRL Mean SE SD SE IRL
Singapore 555.6 (1.20) 103.6 (0.90) ▲ Lithuania 475.4 (2.65) 90.9 (1.41) ▼
Japan 538.4 (2.97) 93.5 (1.65) ▲ Croatia 475.4 (2.45) 89.3 (1.25) ▼
Estonia 534.2 (2.09) 88.9 (1.10) ▲ Argentina Cities 475.2 (6.28) 85.8 (2.72) ▼
Chinese Taipei 532.3 (2.69) 99.6 (1.92) ▲ Iceland 473.2 (1.68) 91.2 (1.15) ▼
Finland 530.7 (2.39) 96.2 (1.31) ▲ Israel 466.6 (3.44) 106.4 (1.63) ▼
Macao (China) 528.5 (1.06) 81.4 (0.96) ▲ Malta 464.8 (1.64) 117.6 (1.51) ▼
Canada 527.7 (2.08) 92.4 (0.88) ▲ Slovak Republic 460.8 (2.59) 98.9 (1.53) ▼
Viet Nam 524.6 (3.91) 76.6 (2.34) ▲ Greece 454.8 (3.92) 91.9 (1.84) ▼
Hong Kong (China) 523.3 (2.55) 80.6 (1.41) ▲ Chile 447.0 (2.38) 86.0 (1.34) ▼
B-S-J-G (China) 517.8 (4.64) 103.4 (2.47) ▲ Bulgaria 445.8 (4.35) 101.5 (2.10) ▼
Korea 515.8 (3.13) 95.2 (1.47) ▲ UAE 436.7 (2.42) 99.1 (1.06) ▼
New Zealand 513.3 (2.38) 104.1 (1.43) ▲ Uruguay 435.4 (2.20) 86.5 (1.26) ▼
Slovenia 512.9 (1.32) 95.2 (1.08) ▲ Romania 434.9 (3.23) 79.1 (1.72) ▼
Australia 510.0 (1.54) 102.3 (0.92) ▲ Cyprus 432.6 (1.38) 92.8 (1.16) ▼
United Kingdom 509.2 (2.56) 99.7 (1.02) O Moldova 428.0 (1.97) 86.0 (1.35) ▼
Germany 509.1 (2.70) 99.3 (1.48) O Albania 427.2 (3.28) 78.5 (1.45) ▼
Netherlands 508.6 (2.26) 100.9 (1.49) O Turkey 425.5 (3.93) 79.3 (1.89) ▼
Switzerland 505.5 (2.90) 99.5 (1.55) O Trinidad + Tobago 424.6 (1.41) 93.8 (1.10) ▼
Ireland 502.6 (2.39) 88.9 (1.33) Thailand 421.3 (2.83) 78.5 (1.58) ▼
Belgium 502.0 (2.29) 100.2 (1.24) O Costa Rica 419.6 (2.07) 70.0 (1.21) ▼
Denmark 501.9 (2.38) 90.3 (1.14) O Qatar 417.6 (1.00) 98.7 (0.74) ▼
Poland 501.4 (2.51) 90.8 (1.34) O Colombia 415.7 (2.36) 80.4 (1.29) ▼
Portugal 501.1 (2.43) 91.8 (1.08) O Mexico 415.7 (2.13) 71.4 (1.09) ▼
Norway 498.5 (2.26) 96.2 (1.30) O Montenegro 411.3 (1.03) 85.3 (0.88) ▼
United States 496.2 (3.18) 98.6 (1.40) O Georgia 411.1 (2.42) 90.6 (1.31) ▼
Austria 495.0 (2.44) 97.3 (1.31) ▼ Jordan 408.7 (2.67) 84.4 (1.59) ▼
France 495.0 (2.06) 102.0 (1.45) ▼ Indonesia 403.1 (2.57) 68.4 (1.64) ▼
Sweden 493.4 (3.60) 102.5 (1.37) ▼ Brazil 400.7 (2.30) 89.2 (1.27) ▼
Czech Republic 492.8 (2.27) 95.3 (1.45) ▼ Peru 396.7 (2.36) 76.7 (1.41) ▼
Spain 492.8 (2.07) 88.0 (1.08) ▼ Lebanon 386.5 (3.40) 90.4 (1.76) ▼
Latvia 490.2 (1.56) 82.2 (1.06) ▼ Tunisia 386.4 (2.10) 64.9 (1.57) ▼
Russian Federation 486.6 (2.91) 82.4 (1.05) ▼ FYR of Macedonia 383.7 (1.25) 84.8 (1.29) ▼
Luxembourg 482.8 (1.12) 100.4 (1.08) ▼ Kosovo 378.4 (1.70) 71.3 (1.11) ▼
Italy 480.5 (2.52) 91.4 (1.35) ▼ Algeria 375.7 (2.64) 69.3 (1.51) ▼
Hungary 476.7 (2.42) 96.3 (1.58) ▼ Dominican Rep. 331.6 (2.58) 72.5 (1.80) ▼
OECD Average 493.2 (0.43) 94.4 (0.23) ▼
EU Average 489.1 (0.70) 95.0 (0.33) ▼
Significantly above the OECD average ▲ Significantly higher than Ireland
At OECD average O Not significantly different from Ireland
Significantly below the OECD average ▼ Significantly lower than Ireland
OECD countries are in regular font, partner countries/economies are in italics. Argentina, Malaysia and Kazakhstan are omitted, as
coverage is too small to ensure comparability (OECD, 2016b). Data for four Argentinian cities are provided.
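The “significantly higher/lower than Ireland” flags in these tables can be approximated from the published means and standard errors with a two-sample z-test. This is a simplified sketch only: the OECD’s own comparisons additionally account for linking error, so results may differ at the margins.

```python
import math

def compare_to_ireland(mean, se, irl_mean=502.6, irl_se=2.39):
    """Flag a country as above (▲), below (▼) or not different (O) from Ireland.

    Simplified two-sample z-test on published means and standard errors;
    defaults are Ireland's overall Science figures from the table above.
    """
    z = (mean - irl_mean) / math.sqrt(se ** 2 + irl_se ** 2)
    if z > 1.96:
        return "▲"
    if z < -1.96:
        return "▼"
    return "O"

print(compare_to_ireland(555.6, 1.20))  # Singapore
print(compare_to_ireland(509.2, 2.56))  # United Kingdom
```

For example, the United Kingdom’s mean (509.2) is 6.6 points above Ireland’s, but the combined standard error (about 3.5) leaves the difference just short of significance, hence the “O” flag in the table.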
APPENDIX 2 Mean scores, standard deviations and standard errors for all participating
countries/economies on the PISA 2015 reading scale and positions relative to the mean scores for
Ireland and on average across OECD countries
Mean SE SD SE IRL Mean SE SD SE IRL
Singapore 535.1 (1.63) 98.7 (1.06) ▲ Israel 479.0 (3.78) 113.1 (1.97) ▼
Hong Kong (China) 526.7 (2.69) 85.8 (1.45) O Argentina-Cities 475.3 (7.19) 90.5 (3.38) ▼
Canada 526.7 (2.30) 92.8 (1.33) O Lithuania 472.4 (2.74) 94.4 (1.55) ▼
Finland 526.4 (2.55) 93.9 (1.53) O Hungary 469.5 (2.66) 97.0 (1.67) ▼
Ireland 520.8 (2.47) 86.2 (1.47) Greece 467.0 (4.34) 98.2 (2.40) ▼
Estonia 519.1 (2.22) 87.5 (1.22) O Chile 458.6 (2.58) 88.1 (1.74) ▼
Korea 517.4 (3.50) 97.0 (1.73) O Slovak Republic 452.5 (2.83) 104.2 (1.75) ▼
Japan 516.0 (3.20) 92.4 (1.83) O Malta 446.7 (1.78) 120.6 (1.46) ▼
Norway 513.2 (2.51) 98.8 (1.69) ▼ Cyprus1 442.8 (1.65) 102.3 (1.27) ▼
New Zealand 509.3 (2.40) 105.0 (1.68) ▼ Uruguay 436.6 (2.55) 96.6 (1.60) ▼
Germany 509.1 (3.02) 100.1 (1.62) ▼ Romania 433.6 (4.07) 95.1 (2.12) ▼
Macao (China) 508.7 (1.25) 82.1 (1.07) ▼ UAE 433.5 (2.87) 105.7 (1.39) ▼
Poland 505.7 (2.48) 89.6 (1.30) ▼ Bulgaria 431.7 (5.00) 114.6 (2.60) ▼
Slovenia 505.2 (1.47) 91.8 (1.29) ▼ Turkey 428.3 (3.96) 82.4 (2.00) ▼
Netherlands 503.0 (2.41) 101.0 (1.65) ▼ Costa Rica 427.5 (2.63) 79.2 (1.55) ▼
Australia 502.9 (1.69) 102.7 (1.12) ▼ Trinidad-Tobago 427.3 (1.49) 104.1 (1.34) ▼
Sweden 500.2 (3.48) 101.8 (1.47) ▼ Montenegro 426.9 (1.58) 94.1 (1.20) ▼
Denmark 499.8 (2.54) 87.3 (1.16) ▼ Colombia 424.9 (2.94) 89.8 (1.53) ▼
France 499.3 (2.51) 112.0 (2.02) ▼ Mexico 423.3 (2.58) 78.0 (1.47) ▼
Belgium 498.5 (2.42) 100.2 (1.52) ▼ Moldova 416.2 (2.52) 97.8 (1.48) ▼
Portugal 498.1 (2.69) 92.0 (1.14) ▼ Thailand 409.1 (3.35) 79.8 (1.69) ▼
United Kingdom 498.0 (2.77) 96.7 (1.09) ▼ Jordan 408.1 (2.93) 94.1 (1.78) ▼
Chinese Taipei 497.1 (2.50) 93.2 (1.66) ▼ Brazil 407.3 (2.75) 100.2 (1.54) ▼
United States 496.9 (3.41) 99.8 (1.58) ▼ Albania 405.3 (4.13) 96.6 (1.85) ▼
Spain 495.6 (2.36) 87.3 (1.40) ▼ Qatar 401.9 (1.02) 110.6 (0.97) ▼
Russian Fed. 494.6 (3.08) 87.4 (1.39) ▼ Georgia 401.3 (2.96) 103.6 (1.81) ▼
B-S-J-G (China) 493.9 (5.13) 108.9 (2.90) ▼ Peru 397.5 (2.89) 89.1 (1.60) ▼
Switzerland 492.2 (3.03) 97.9 (1.69) ▼ Indonesia 397.3 (2.87) 76.0 (1.77) ▼
Latvia 487.8 (1.80) 84.8 (1.48) ▼ Tunisia 361.1 (3.06) 81.6 (1.91) ▼
Czech Republic 487.3 (2.60) 100.5 (1.74) ▼ Dominican Rep. 357.7 (3.05) 84.9 (1.94) ▼
Croatia 486.9 (2.68) 90.7 (1.56) ▼ FYR Macedonia 351.7 (1.41) 99.2 (1.24) ▼
Viet Nam 486.8 (3.73) 72.6 (2.03) ▼ Algeria 349.9 (3.00) 72.7 (1.56) ▼
Austria 484.9 (2.84) 101.1 (1.54) ▼ Kosovo 347.1 (1.57) 78.3 (1.10) ▼
Italy 484.8 (2.68) 93.8 (1.62) ▼ Lebanon 346.5 (4.41) 115.5 (2.61) ▼
Iceland 481.5 (1.98) 99.4 (1.66) ▼ OECD Average 492.5 (0.46) 96.0 (0.27) ▼
Luxembourg 481.4 (1.44) 106.6 (0.98) ▼ EU Average 486.0 (0.52) 97.5 (0.30) ▼
Significantly above OECD average ▲ Significantly higher than Ireland
At OECD average O Not significantly different from Ireland
Significantly below OECD average ▼ Significantly lower than Ireland
OECD countries are in regular font, partner countries/economies are in italics. Argentina, Malaysia and Kazakhstan are
omitted, as coverage is too small to ensure comparability (OECD, 2016b). Data for four Argentinian cities are provided.
APPENDIX 3 Mean scores, standard deviations and standard errors for all participating
countries/economies on the PISA 2015 mathematics scale and positions relative to the mean scores
for Ireland and on average across OECD countries
Mean SE SD SE IRL Mean SE SD SE IRL
Singapore 564.2 (1.47) 95.4 (0.83) ▲ Hungary 476.8 (2.53) 93.8 (1.70) ▼
Hong Kong (C) 547.9 (2.98) 90.1 (1.51) ▲ Slovak Republic 475.2 (2.66) 95.4 (1.61) ▼
Macao (China) 543.8 (1.11) 79.9 (1.13) ▲ Israel 469.7 (3.63) 103.4 (2.16) ▼
Chinese Taipei 542.3 (3.03) 102.9 (1.95) ▲ United States 469.6 (3.17) 88.5 (1.52) ▼
Japan 532.4 (3.00) 88.2 (1.74) ▲ Croatia 464.0 (2.77) 88.3 (1.56) ▼
B-S-J-G (China) 531.3 (4.89) 106.0 (2.45) ▲ Argentina-Cities 456.3 (6.91) 88.5 (3.44) ▼
Korea 524.1 (3.71) 99.7 (1.77) ▲ Greece 453.6 (3.75) 89.4 (1.79) ▼
Switzerland 521.3 (2.92) 95.7 (1.58) ▲ Romania 444.0 (3.79) 86.3 (2.10) ▼
Estonia 519.5 (2.04) 80.4 (1.06) ▲ Bulgaria 441.2 (3.95) 97.2 (2.37) ▼
Canada 515.6 (2.31) 87.7 (1.05) ▲ Cyprus1 437.1 (1.72) 92.4 (1.06) ▼
Netherlands 512.3 (2.21) 91.5 (1.46) ▲ UAE 427.5 (2.41) 96.5 (1.29) ▼
Denmark 511.1 (2.17) 80.6 (1.22) ▲ Chile 422.7 (2.54) 85.4 (1.41) ▼
Finland 511.1 (2.31) 82.2 (1.26) ▲ Turkey 420.5 (4.13) 81.9 (2.35) ▼
Slovenia 509.9 (1.26) 87.8 (1.29) ▲ Moldova 419.7 (2.47) 90.1 (1.53) ▼
Belgium 507.0 (2.35) 97.4 (1.47) O Uruguay 418.0 (2.50) 86.6 (1.65) ▼
Germany 506.0 (2.89) 89.0 (1.39) O Montenegro 417.9 (1.46) 86.6 (1.35) ▼
Poland 504.5 (2.39) 87.6 (1.67) O Trinidad-Tob. 417.2 (1.41) 96.0 (1.19) ▼
Ireland 503.7 (2.05) 79.8 (1.38) Thailand 415.5 (3.03) 81.5 (1.95) ▼
Norway 501.7 (2.23) 84.9 (1.10) O Albania 413.2 (3.45) 86.2 (1.62) ▼
Austria 496.7 (2.86) 95.1 (1.83) O Mexico 408.0 (2.24) 75.0 (1.28) ▼
New Zealand 495.2 (2.27) 92.1 (1.31) ▼ Georgia 403.8 (2.78) 93.9 (2.19) ▼
Viet Nam 494.5 (4.46) 83.7 (2.71) ▼ Qatar 402.4 (1.27) 98.8 (1.00) ▼
Russian Fed. 494.1 (3.11) 83.1 (1.32) ▼ Costa Rica 400.3 (2.47) 68.4 (1.43) ▼
Sweden 493.9 (3.17) 90.1 (1.74) ▼ Lebanon 396.2 (3.69) 101.1 (1.99) ▼
Australia 493.9 (1.61) 93.1 (1.23) ▼ Colombia 389.6 (2.29) 77.2 (1.35) ▼
France 492.9 (2.10) 95.2 (1.53) ▼ Peru 386.6 (2.71) 82.6 (1.37) ▼
United Kingd. 492.5 (2.50) 92.6 (1.36) ▼ Indonesia 386.1 (3.08) 79.8 (1.99) ▼
Czech Republic 492.3 (2.40) 90.7 (1.74) ▼ Jordan 380.3 (2.65) 85.8 (2.08) ▼
Portugal 491.6 (2.49) 95.7 (1.30) ▼ Brazil 377.1 (2.86) 89.2 (1.73) ▼
Italy 489.7 (2.85) 93.6 (1.67) ▼ FYR Macedonia 371.3 (1.28) 95.9 (1.60) ▼
Iceland 488.0 (1.99) 92.9 (1.33) ▼ Tunisia 366.8 (2.95) 84.2 (2.34) ▼
Spain 485.8 (2.15) 84.7 (1.27) ▼ Kosovo 361.5 (1.63) 75.3 (1.40) ▼
Luxembourg 485.8 (1.27) 93.6 (1.18) ▼ Algeria 359.6 (2.95) 71.1 (1.53) ▼
Latvia 482.3 (1.87) 77.5 (1.19) ▼ Dominican Rep. 327.7 (2.69) 68.5 (2.01) ▼
Malta 478.6 (1.72) 110.3 (1.38) ▼ OECD 490.2 (0.44) 89.5 (0.26) ▼
Lithuania 478.4 (2.33) 86.5 (1.39) ▼ EU 487.1 (0.48) 90.2 (0.29) ▼
Significantly above OECD average ▲ Significantly higher than Ireland
At OECD average O Not significantly different from Ireland
Significantly below OECD average ▼ Significantly lower than Ireland
OECD countries are in regular font, partner countries/economies are in italics. Argentina, Malaysia and Kazakhstan are
omitted, as coverage is too small to ensure comparability (OECD, 2016b). Data for four Argentinian cities are provided.
APPENDIX 4 – PISA & TIMSS – Key Features and Tabular Comparisons of 2015 results
In spring 2015, post-primary schools in Ireland participated in two large international studies of
education: the Programme for International Student Assessment (PISA) and Trends in International
Mathematics and Science Study (TIMSS). To minimise the burden on schools, the sampling of schools
for the two studies was done collaboratively, and no school participated in both.
Comparing PISA and TIMSS
Both studies provide information on the achievement of students in Ireland and allow for comparison with that of students in other countries. However, there are also important distinctions between them. Table 4.1 provides a summary of the key features of each study, and how they compare to each other in terms of their aims, focus and structure.
PISA TIMSS
PISA has taken place every three years since 2000 and Ireland has participated in each cycle of the study, allowing for examination of changes in performance over time.
TIMSS has taken place every four years since 1995. Ireland has participated in two cycles of the study at Second Year: 1995 and 2015. This means that Ireland’s performance on TIMSS in 2015 can be compared to Ireland’s performance in 1995.
PISA assesses student achievement in three domains: reading, mathematics and science. In each cycle, one domain is the main focus of the assessment and the other two are assessed as minor domains. In 2015, science was the main focus of the assessment, with less emphasis on mathematics and reading.
TIMSS measures student achievement in two domains: mathematics and science. Each domain is given equal weighting in each cycle of the study. Because each domain is given equal weighting, performance on specific content areas in both mathematics and science can be examined in each cycle.
PISA assesses 15-year-old students (the age at which students in most participating countries are at the end of compulsory schooling). Within selected schools, students are selected from across a number of different grade levels. In Ireland, about 60% of students who participated in PISA 2015 were in Third Year, just over 25% were in Transition Year, about 11% were in Fifth year and just under 2% were in Second Year.
TIMSS is a grade-based study, assessing students at Fourth grade (Fourth class in Ireland) and Eighth grade (Second Year). Within a selected school, either one or two intact classes are randomly selected, and all students in these classes participate. Because TIMSS tests at both Fourth and Eighth grade, and because the tests used at each grade level are informed by the same assessment framework, general links can be made between performance at primary and post-primary level.
PISA also collects context information from students, school principals and parents and links their responses to student performance. Some national-level data, including data from school science co-ordinators, is also collected. This information provides a rich context in which to interpret the achievement results of students.
TIMSS collects information from principals and the mathematics and science teachers of the selected students. Information from the mathematics and science teacher questionnaires is linked to individual students, allowing for examination of the links between classroom processes and student achievement.
Table 4.1: Key features of PISA and TIMSS
PISA aims to measure how well students are prepared to meet the challenges they may encounter in future life and, while it is informed by the content of school curricula, the focus of the assessment is on students’ ability to apply knowledge and skills in unfamiliar, real-life situations. PISA aims to measure the outcomes of schooling in order to provide guidance to governments on shaping educational policies.

TIMSS aims to measure school-based learning and the extent to which students have mastered essential mathematics and science concepts, content and procedures. TIMSS also aims to help participating countries make data-based decisions about how to improve teaching and learning in mathematics and science.

PISA test content is presented in clusters. Students see a stimulus (such as a text or graphic) and a set of accompanying questions. For maths, a stimulus typically has three related questions, while for science, it has five. Some follow a multiple-choice format, while others are constructed response items that require answers of varying lengths.

Much of the TIMSS test content consists of short, standalone questions, with little accompanying text. In a small number of cases, a short stimulus text is presented with two to three associated questions.

In 2015, the PISA tests were delivered on computer in most participating countries for the first time, and the science test included some interactive items.

In 2015, TIMSS tests were delivered on paper. For the next cycle, in 2019, countries will have the option of electronic testing.

Students are allocated two hours to complete the PISA tests, with a short break in the middle. In 2015, 46% of students took a test that contained science and reading items, 46% took a test that contained science and maths items, and 8% took a test that contained items from all three domains.

Students are allocated 90 minutes to complete the TIMSS tests, with a short break in the middle. Half of students spend the first 45 minutes answering science questions, and half spend it answering mathematics questions.
Performance on Science and Mathematics

Table 4.2 presents the countries that participated in both TIMSS and PISA in 2015, in descending order of their performance on the Science and Mathematics tests, as well as their performance relative to Ireland’s mean Science and Mathematics scores.
Science:
The top-performing countries in both studies include Singapore, Japan, Korea, Hong Kong and Chinese Taipei. These five countries, along with Slovenia, were the only ones of the 28 countries that participated in both studies to significantly outperform Ireland in Science in both TIMSS and PISA in 2015. Students in Ireland performed at similar levels to students in the United States and in England (the United Kingdom in PISA) on Science in both studies. Some differences are also observed between the performances of countries across the two studies. For example, New Zealand and Australia performed significantly less well than Ireland on TIMSS Science, but significantly better than Ireland on PISA Science, while the Russian Federation significantly outperformed Ireland on TIMSS Science, but performed significantly less well than Ireland on PISA Science. Canada significantly outperformed Ireland on PISA Science, but its performance on TIMSS was similar to Ireland’s.
Mathematics:
As in Science, there are similarities between PISA and TIMSS in terms of the countries that outperformed Ireland; indeed, Ireland’s relative position among the 28 countries that participated in both TIMSS 2015 and PISA 2015 is the same in both studies. Students in Ireland performed at a similar level to students in the United States and England on TIMSS Mathematics, but significantly outperformed students in these countries (the United Kingdom in PISA) on PISA Mathematics. As in Science, the Russian Federation achieved a mean Mathematics score that was significantly higher than Ireland’s in TIMSS, but performed significantly less well than Ireland on PISA Mathematics. Conversely, Slovenia performed significantly less well than Ireland on TIMSS Mathematics, but had a significantly higher mean Mathematics score than Ireland on PISA Mathematics.
Table 4.2: Countries that participated in TIMSS and PISA 2015, presented in descending order of their performance on the Science and Mathematics assessments and with reference to their position relative to Ireland’s mean score

Science (TIMSS) | Science (PISA) | Mathematics (TIMSS) | Mathematics (PISA)
Singapore | Singapore | Singapore | Singapore
Japan | Japan | Korea, Rep. of | Hong Kong SAR
Chinese Taipei | Chinese Taipei | Chinese Taipei | Chinese Taipei
Korea, Rep. of | Canada | Hong Kong SAR | Japan
Slovenia | Hong Kong SAR | Japan | Korea, Rep. of
Hong Kong SAR | Korea, Rep. of | Russian Fed. | Canada
Russian Fed. | New Zealand | Canada | Slovenia
England | Slovenia | Ireland | Ireland
Ireland | Australia | United States | Norway
United States | United Kingdom | England | New Zealand
Hungary | Ireland | Slovenia | Russian Fed.
Canada | Norway | Hungary | Sweden
Sweden | United States | Norway | Australia
Lithuania | Sweden | Lithuania | United Kingdom
New Zealand | Russian Fed. | Israel | Italy
Australia | Italy | Australia | Malta
Norway | Hungary | Sweden | Lithuania
Israel | Lithuania | Italy | Hungary
Italy | Israel | Malta | Israel
Turkey | Malta | New Zealand | United States
Malta | Chile | UAE | UAE
UAE | UAE | Turkey | Chile
Qatar | Turkey | Georgia | Turkey
Thailand | Thailand | Lebanon | Thailand
Chile | Qatar | Qatar | Georgia
Georgia | Georgia | Thailand | Qatar
Jordan | Jordan | Chile | Lebanon
Lebanon | Lebanon | Jordan | Jordan

Shading key: Mean score significantly above Ireland’s; Mean score not significantly different from Ireland’s; Mean score significantly below Ireland’s.
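The significance comparisons behind the shading key can be sketched as follows: two country means differ significantly (at roughly the 5% level) when the gap between them exceeds about 1.96 standard errors of the difference. This is a simplified illustration; the actual PISA and TIMSS comparisons use replication-based standard errors and, across cycles, link errors. The means and standard errors below are invented, not values from either study.

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Rough two-sided z-test for a difference between two country means.

    Assumes independent samples; the SE of the difference is the root of
    the summed squared standard errors.
    """
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(mean_a - mean_b) > z_crit * se_diff

# Hypothetical figures: country A at 520 (SE 2.4) vs an Ireland-like 503 (SE 2.0)
print(significantly_different(520, 2.4, 503, 2.0))  # True: 17 > 1.96 * 3.12
```

A difference of only a few score points with similar standard errors would fall inside the 1.96-SE band and be reported as "not significantly different".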
Engagement, Motivation and Attitudes to Science
TIMSS measures students’ attitudes towards learning Science. Four aspects are described here: how much students like Science, how confident they are in Science, how much they value Science, and how engaging they find their Science lessons. Proportionally fewer students in Ireland than on average across all countries that participated in TIMSS reported liking Science very much (33% compared to 37%), while considerably more indicated that they do not like learning Science (26% compared to 19%) (Table 4.3). More students in Ireland (26%) than on average across TIMSS countries (22%) indicated that they are very confident in Science, but far fewer students in Ireland reported that they strongly value Science (30% compared to 40%). Almost 80% of students in Ireland indicated that they experienced engaging or very engaging Science lessons, marginally below the average percentage across all countries that participated in TIMSS (83%).
Very much like learning Science (%) | Like learning Science (%) | Do not like learning Science (%)
Ireland: 33 | 41 | 26
International Average: 37 | 44 | 19

Very confident in Science (%) | Confident in Science (%) | Not confident in Science (%)
Ireland: 26 | 36 | 38
International Average: 22 | 39 | 40

Strongly value Science (%) | Value Science (%) | Do not value Science (%)
Ireland: 30 | 43 | 27
International Average: 40 | 41 | 19

Very engaging lessons (%) | Engaging lessons (%) | Less than engaging lessons (%)
Ireland: 45 | 34 | 21
International Average: 47 | 36 | 17
Table 4.3: Percentages of TIMSS students indicating how much they like Science, are confident in Science, value Science and find their Science lessons engaging (Ireland and TIMSS International average)
Like TIMSS, PISA also gathers data on engagement, motivation and attitudes. In 2015, these focused on Science. Table 4.4 shows that, overall, students in Ireland engage less frequently in Science-related activities than their counterparts on average across OECD countries. The activities students in Ireland engage in most frequently are watching TV programmes about Science and visiting Science-related websites, while fewer than 2% are involved in Science clubs, compared with 8% on average across OECD countries. Male students in Ireland (-0.20) had a significantly higher mean score than female students (-0.55) on the index. The correlation between engagement in Science-related activities and overall Science performance in Ireland was 0.31, indicating that higher achievers in Science tend to engage in Science-related activities more often.
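The 0.31 figure above is a Pearson correlation coefficient computed on the PISA dataset. A minimal sketch of that computation, using invented engagement-index values and Science scores rather than real survey data, is:

```python
import math

def pearson(xs, ys):
    """Pearson correlation: covariance divided by the product of the
    two standard deviations (computed here without the 1/n factors,
    which cancel)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical engagement-index values and Science scores for six students
engagement = [-0.8, -0.4, -0.3, 0.0, 0.2, 0.6]
science = [460, 490, 510, 500, 530, 560]

r = pearson(engagement, science)  # positive: higher engagement, higher scores
```

A value near 0 would indicate no linear association; the reported 0.31 is a moderate positive association between engagement and Science performance.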
Index of Engagement in Science-related Activities

Mean (SE): Ireland -0.37 (0.02) | OECD -0.02 (0.00)

Percent of students who (very often or regularly), Ireland % (SE) | OECD % (SE):
Watch TV programmes about Science: 16.8 (0.5) | 23.0 (0.1)
Borrow or buy books on Science topics: 6.3 (0.3) | 11.1 (0.1)
Visit websites about Science topics: 14.4 (0.5) | 19.1 (0.1)
Read Science magazines or Science articles in newspapers: 8.5 (0.4) | 15.8 (0.1)
Attend a Science club: 1.6 (0.2) | 8.3 (0.1)
Table 4.4: Mean score of students in Ireland and on average across OECD countries on the PISA index of engagement
in Science-related activities, and component items (2015)
For more information: TIMSS www.erc.ie/timss PISA www.erc.ie/pisa