
    Scientific Literacy for a Knowledge Society

    Glen Aikenhead Professor Emeritus

    University of Saskatchewan, Saskatoon, Saskatchewan, Canada

    Graham Orpwood Professor Emeritus

    York University, Toronto, Ontario, Canada

    Peter Fensham Emeritus Professor Monash University, Australia, and

    Adjunct Professor Queensland University of Technology, Australia

    Published as: Aikenhead, G.S., Orpwood, G., & Fensham, P. (2011). Scientific literacy for a knowledge society. In C. Linder, L. Ostman, D.A. Roberts, P-O. Wickman, G. Erickson, & A. MacKinnon (Eds.), Exploring the landscape of scientific literacy (28-44). New York: Routledge, Taylor and Francis Group.


    The role of science education in preparing students for the world of work was

    taken seriously by OECD (1996a,b) when it launched studies on how the world of

    work was changing, especially in science-related occupations. At about the same

    time, the Royal Society of Arts (RSA) initiated a similar

    study in Britain (Bayliss, 1998). Together these studies found that the nature of

    work in developed countries has changed in three ways: in kind, in the

    requirements for performance, and in the lack of permanence of one’s

    engagement. These changes in the world of employment are driven by new forms

    of information technology, design requirements, and technology transfer. For

    example, oral and written literacies are no longer adequate; digital literacy is a

    new essential. The resulting new knowledge enables innovation of processes and

    products, locally and globally.

    This new knowledge is increasingly becoming a primary source of

    economic growth in knowledge-based economies. Taken together, these changes

    characterize a Knowledge Society (Gilbert, 2005). A knowledge-based economy

    and a Knowledge Society create a need to re-examine SL in developed countries.

    The chapter begins with an account of the features of a knowledge-based

    economy and a Knowledge Society, as these features affect the interpretation of

    SL. A policy position about SL-in-action is developed and analyzed in terms of

    considerations that would confront decision makers who determine school science

    curriculum policy. Special attention is paid to the choices that exist in types of

    school science content, and to the powerful influence of educational assessment

    practices.

    A Knowledge-Based Economy


    In a knowledge-based economy, knowledge takes three forms: codified, tooled,

    and personal (Chartrand, 2007). Codified knowledge is communicated in many

    different forms as meaning. Both sender and receiver must know the code if the

    message is to convey semiotic or symbolic meaning (Foray & Lundvall, 1996).

    Newly codified knowledge is often converted into legal property held by a

    company or institution: property that can be bought and sold through copyright.

    Tooled knowledge is also communicated in many different ways but

    always as function (Chartrand, 2007). It is often protected by patents. Tooled

    knowledge takes two forms: hard and soft. Hard tooled knowledge is a physical

    implement or process that manipulates or responds to matter-energy. A scientific

    instrument is tooled knowledge that can extend human perception into domains

    referred to as electrons, quarks, galaxies, and the genomic blueprint of life, for

    instance. To accomplish this, scientific tools probe beyond human perception to

    produce numbers, which are often converted into graphics. Numbers and graphics

    are in turn treated by scientists and technologists as observations. Thus, many

    scientific observations today involve a cyborg-like relationship between a human

    and an instrument – “instrumental realism” (Ihde, 1991). Soft tooled knowledge,

    on the other hand, includes the standards embedded in an instrument (e.g., its

    designed voltage – 12, 110, or 220), as well as the instrument’s programming, its

    operating instructions, and the techniques required to optimize its performance.

    Technology transfer, a key process in economic growth, involves tooled

    knowledge (soft and hard) being transferred from one company or institution to

    another.

    Personal knowledge contrasts with both codified and tooled knowledge

    because it consists of bundles of memory and trained

    reflexes of nerve and muscle of a scientist, engineer, or technician (Chartrand,

    2007; Foray & Lundvall, 1996). Importantly, some personal knowledge can be


    codified or tooled, but some inevitably remains tacit (Polanyi, 1958). Personal

    knowledge is legally protected as know-how.

    Ultimately, however, all knowledge in a knowledge-based economy is

    personal knowledge because without a human to decode or push the right buttons,

    codified and tooled knowledge remain meaningless or functionless (Foray &

    Lundvall, 1996). This has implications for science education in a society

    propelled by a knowledge-based economy. It means that a country’s knowledge-

    based economy relies, in part, on its people and their ability to code and decode

    semiotic or symbolic meaning and machine-instrument function. This know-how

    is one indicator of competitiveness in the global economy, and thus it is an

    important consideration for SL in a knowledge-based economy.

    A country’s scientific and technological know-how is characterized by a

    blend of knowledge and action. Codified knowledge is about communicating;

    tooled knowledge (soft and hard) is about functioning; and personal knowledge is

    about participating in some way in the economy. The purpose of ‘knowing’ and

    ‘having knowledge and know-how’ is thus linked to innovation in science-related

    occupations.

    In the context of a knowledge-based economy, it is evident that the

    disciplines of science and technology are socially interrelated to such an extent

    that they form one heterogeneous domain – “technoscience” (Désautels, 2004;

    Fleming, 1989). For instance, R&D (research and development) is a conventional

    form of technoscience. Désautels, among others, has argued for this broader focus

    in school science. In this chapter, we adopt Désautels’s technoscience formulation

    and draw on Hurd (1998) in designating technoscience as “science-technology”

    (ST).

    Any perspective on SL that restricts its meaning to codified scientific

    knowledge will be inadequate for students’ future participation in their country’s


    knowledge-based economy. Such a perspective ignores tooled and personal

    knowledge, and it ignores technology.

    A Knowledge Society

    When the engine of a country’s economy is knowledge-based, we have a

    Knowledge Society, in which the meaning of wealth creation has changed.

    “Where wealth was once related to resources and industrial processes, it is now a

    consequence of ever-renewing knowledges necessary to innovate, design,

    produce, and market products and service” (Carter, 2008, p. 621).

    In an earlier work about the Knowledge Society, Gilbert (2005) explored

    the educational implications of these societal changes, based on a comparison to

    existing conditions in most educational systems:

    1. knowledge is about acting and doing to produce new things, rather than

    being only an accumulation of established information; and

    2. what one does with knowledge is paramount, not how much knowledge

    one possesses.

    Consequently in a Knowledge Society, value is associated with:

    1. knowing how to learn, knowing how to keep learning, and knowing when

    one needs to know more, rather than knowing many bits of content from a

    canonical science curriculum;

    2. knowing how to learn with others, rather than only accumulating

    knowledge as an individual;

    3. using knowledge as a resource for resolving problems rather than simply

    as a catalogue of ‘right’ answers; and

    4. acquiring important competences (skills) in the use of knowledge, rather

    than only storing it.

    Change is the norm in a Knowledge Society. It follows that learning in such a

    society should have a dynamic character that equips students to adapt to change,


    to generate new knowledge, and to continue to improve performance (Bybee &

    Fuchs, 2006; Fraser & Greenhalgh, 2001).

    Implications for SL

    A Knowledge Society relies on expertise in science-technology (ST) employment,

    and on the capacity of its citizens to deal with ST-related situations in their

    everyday lives. Participation in a Knowledge Society by employers, employees,

    and citizens calls for knowledge to be treated in conjunction with acting –

    ‘knowing-in-action’ – and not as stored facts, abstractions, and algorithms. Both

    expert knowledge and citizen knowledge are reconceptualized in this chapter in

    terms of ST knowing-in-action.

    Some research programs in science education already support a

    Knowledge Society. These programs treat knowledge as contextualized social

    action – knowing-in-action. For example, Gaskell (2002) maps out partnerships

    between science education and industry, in which “knowledge is to be learned in

    the context of doing … ‘working knowledge,’ as opposed to abstract propositional

    knowledge” (p. 64). Further examples include research programs described by

    Roth and Calabrese Barton (2004) and by Roth and Lee (2004), and informed by

    activity theory associated especially with Lave and Wenger (1991).

    Competence at ST knowing-in-action (i.e., competence at ST-related work

    skills or competence in resolving ST-related events and issues) is not simply a

    matter of ‘applying’ knowledge mastered in a conventional science classroom. A

    curriculum change, and thus a curriculum policy change, is required. Most science

    content in the typical curriculum is not directly useable in ST-related occupations

    and everyday situations. There are several reasons for this, established by research

    on situated cognition: “Thinking … depends on specific, context-bound skills and

    units of knowledge that have little application to other domains” (Furnham, 1992,

    p. 33).


    The first reason is that a conventional science curriculum’s content must

    be deconstructed and then reconstructed and integrated into the idiosyncratic

    demands of the specific everyday context (Jenkins, 1992; Layton, 1991; Ryder,

    2001). “This reworking of scientific knowledge is demanding, but necessary as

    socio-scientific issues are complex. It typically involves science from different

    sub-disciplines, knowledge from other social domains, and of course value

    judgements and social elements” (Kolstø, 2000, p. 659).

    Second, school knowledge tends to be compartmentalized in most

    students’ minds, separate from their out-of-school (life-world) knowledge, and the

    two knowledge systems simply do not interact (Hennessy, 1993;

    Solomon, 1984). The third reason is that science content used in ST-rich

    workplaces tends to be procedural scientific knowledge (knowing-in-action),

    which is simply different from the propositional knowledge of canonical school

    science (Gott, Duggan, & Johnson, 1999; Law, 2002). Because different science

    content is applicable in each setting, there is very little transfer of knowledge.

    Fourth, the purpose and accountability of the workplace and the science

    classroom differ dramatically. Consequently, workplace science is qualitatively

    different from school science (Chin et al., 2004).

    These established research findings have significant implications for SL as

    a curriculum concept. SL in a Knowledge Society is necessarily literacy-in-action

    – oral, written, and digital literacy-in-action. Consequently SL as an educational

    outcome takes on an active, rather than a passive connotation. SL is not about

    ‘How much do you know?’ but instead, ‘What can you learn when the need

    arises?’ and ‘How effectively can you use your learning to deal with ST-related

    events in the work world or the everyday world of citizens?’ The shift in outcome

    – from ‘knowing that’ to ‘knowing how to learn and to use this relevant content’ –

    would represent a radical shift in school science curriculum policy. At the same

    time, it would resonate with the goals for a knowledge-based economy discussed


    by Gilbert (2005), and also by Guo (2007). In short, acquiring knowledge

    (‘knowing that’) would be replaced by capacity building (‘knowing how to learn

    and knowing-in-action’) as the primary mission for school science. For a

    Knowledge Society, the primary meaning for SL becomes SL-in-action.

    In terms of Roberts’s Visions I and II of SL (Chapter 1, this volume), our

    proposed school science policy position suggests a shift away from Vision I,

    given its concentration on theoretical reasoning and its inward-looking focus on

    the products and processes of science itself. Instead, ours is an intermediary

    position between Visions I and II, but favoring Vision II, given its concentration

    on a combination of theoretical, technological, and practical reasoning and its

    outward-looking focus on ST-related situations.

    SL-in-action requires that students come to understand their ST

    knowledge as having a purpose beyond simply ‘knowing that.’ Examples of such

    purposes include: getting and keeping ST-related employment, informing daily

    activities, analyzing socio-scientific issues, and comprehending global concerns.

    As a curriculum concept, SL-in-action is highly promising as a way to increase

    the relevance of school science, and to engender habits of life-long learning.

    Choosing School Science Content: A Theoretical Framework

    Because we treat SL for a Knowledge Society as knowing-in-action associated

    with capacity building for life-long learning, we reposition ourselves in this

    chapter with respect to what counts as worthwhile knowledge to teach and assess

    in science education. To achieve SL-in-action for a Knowledge Society, we

    require an innovative way to determine the content, processes, and contexts for

    school science. Together this triad will be called “school science content.”

    Content without context is ephemeral. Processes without content or context (i.e.,

    generic skills) are powerless – a lesson learned from the failure of Science: A

    Process Approach, a 1960s elementary science program in the United States, and

    its counterpart in the United Kingdom, Science 5-13 (Millar & Driver, 1987).


    We emphasize a distinction between the canonical science content found

    in conventional science curricula, on one hand, and on the other hand, relevant

    science content – the type of science content actually used by employees in ST-

    related occupations and by the public coping with ST-related events and issues. In

    short, the distinction is between academic decontextualized knowledge (Roberts’s

    Vision I to the extreme) and relevant contextualized knowing-in-action (Vision

    II), respectively.

    We propose a theoretical framework for school science content based on

    two principles: relevance and who decides what is relevant for SL-in-action. The

    two principles are depicted in Table 2.1. Who decides what is relevant is

    represented in column 1; and column 2 represents, as a consequence, various

    types of school science content. Our framework recognizes seven groups of

    people who currently decide, or who could reasonably decide, what is to be

    included in school science content. The categories, based on the work of Fensham

    (2000) and Aikenhead (2006), are not discrete but overlap and interact in various

    ways. To work toward achieving SL-in-action for a Knowledge Society,

    curriculum developers can draw from several of these categories, and the resulting

    curricula will most likely consist of different combinations of categories.

    [INSERT TABLE 2.1 HERE]

    A conventional school science curriculum emerges from the first category,

    “wish-they-knew science” in Table 2.1. This category embraces the subject matter

    of scientific disciplines (the science curriculum). Wish-they-knew science

    predominates in Vision I versions of SL.

    The other six categories in Table 2.1 reflect the work world of employers

    and employees, as well as the everyday world of citizens, in which ST content

    pertains to phenomena and events not normally of interest to most university

    science professors (scholarly academics), education officials, and currently many

    science teachers. The other six categories represent knowing-in-action, by and


    large, and are therefore supportive of SL-in-action for a Knowledge Society. They

    reflect an emphasis on Vision II SL.

    Space does not allow us to review the research related to each category in

    Table 2.1 (see Aikenhead, 2006, Ch. 3), but two categories are summarized here,

    “functional science” and “have-cause-to-know science.” They clarify suitable

    school science content for SL-in-action.

    Functional Science

    Functional science is the science content that has functional value to ST-

    rich employment and to ST-related everyday events. Systematic research has

    produced a wealth of general and specific results. For example, industry personnel

    placed ‘understanding science ideas’ at the lowest priority for judging a recruit to

    their industry. Why? The answer comes from the ethnographic research by

    Duggan and Gott (2002) in the United Kingdom, Rodrigues and colleagues (2007)

    in Australia, Law (2002) in China, Lottero-Perdue and Brickhouse (2002) in the

    United States, and Aikenhead (2005) in Canada. The researchers’ in situ

    interviews with people in ST-related occupations indicated that the science

    content used by these science graduates in the workplace was so context specific

    it had to be learned on the job. High school and university science content was

    rarely drawn upon.

    Thus, an important quality valued by both employers and employees in

    ST-related employment is the capacity to learn ST content on the job. Of course,

    school science content for the purpose of preparing students for ST-related

    occupations must include science concepts, but the choice of these concepts can

    be a functionally relevant choice, not a scholarly academic choice (i.e., opting for

    wish-they-knew science). The science content that underpins local contexts of

    interest works better for teaching students how to learn and use ST as needed

    (Aikenhead, 2006, Ch. 6).


    Have-Cause-to-Know Science

    This category represents science content identified by ST experts who

    consistently interact with the general public on real-life matters pertaining to ST,

    and who know the problems citizens encounter when interacting with these

    experts (Fensham, 2002). Out of the diverse research reported in the literature (see

    Aikenhead, 2006, Ch. 3), the research program undertaken by Law and her

    colleagues (2000) in China best clarifies have-cause-to-know science. Their

    project determined the have-cause-to-know science for two different curricula:

    one aimed at citizens’ capabilities at coping with everyday events and issues, and

    the other aimed at socio-scientific decision making (Law, 2002; Law et al., 2000).

    For the first curriculum, societal experts (e.g., people who work with home and

    workplace safety issues; and in medical, health, and hygiene areas) agreed that the

    public had cause to know basic scientific knowledge related to an event with

    which people were trying to cope, and to know specific applications of that

    knowledge (knowing-in-action). Most of all, they should be able to critically

    evaluate cultural practices, personal habits, media information, and multiple

    sources of conflicting information (Law, 2002). During their interviews, the

    experts noted public misconceptions, superstitions, and cultural habits detrimental

    to everyday coping.

    For Law’s second curriculum (citizens’ participation in socio-scientific

    decision making), experts were selected from Hong Kong’s democratic

    institutions (the legislature, a government planning department, and a civilian

    environmental advocacy group) and were interviewed. The researchers concluded

    that the public’s have-cause-to-know science for decision making was very

    similar to that required for everyday coping, except socio-scientific decision

    making drew upon more complex skills to critically evaluate information and

    potential solutions (Law, 2002). The societal experts acknowledged that

    socio-scientific decisions often rely more on applying values than on applying


    specific science content, a result duplicated in the United States with academic

    scientists at several universities (Bell & Lederman, 2003).

    Overall, the Chinese ST experts placed emphasis on a citizen’s capacity to

    undertake self-directed learning (life-long learning), but placed low value on a

    citizen’s knowing particular content from a typical science curriculum. This result

    is similar to the research findings for functional science.

    Summary: Pertinent Considerations for Making a Policy Decision

    SL that nurtures economic growth requires science education to promote

    capacity building in which future workers and savvy citizens learn how to learn

    ST as the need arises. Life-long learning for a Knowledge Society ensues. The

    school science content essential to achieve this end is not solely “wish-they-knew

    science,” but instead any combination of categories of school science content

    (Table 2.1) that leads to SL-in-action.

    Curriculum policy makers are expected to consider what Deng identifies

    as the “institutional” domain of curriculum making (Chapter 3, this volume).

    Policy makers can be assisted by considering examples of the implications for the

    other two domains Deng identifies: “programmatic” and “classroom.” For

    instance, SL-in-action in the programmatic domain is exemplified by:

    1. a project that designs context-based school science materials for authentic

    ST social practices (Bulte, Chapter xx, this volume), which combined

    functional and personal-curiosity science content;

    2. a Canadian grade 10 textbook, Logical Reasoning in Science &

    Technology (Aikenhead, 1991), which combined functional, have-cause-

    to-know, personal-curiosity, and wish-they-knew science content;

    3. an AS-level textbook in the United Kingdom, Science for Public

    Understanding (Hunt & Millar, 2000), which combined enticed-to-know,

    have-cause-to-know, and wish-they-knew science content;


    4. the “public understanding of science” curriculum (Algemene

    Natuurwetenschappen) in The Netherlands (De Vos & Reiding, 1999),

    which combined have-cause-to-know, need-to-know, and wish-they-knew

    science content; and

    5. the Science, Technology, Environment in Modern Society project in Israel

    (Dori & Tal, 2000), which combined functional, have-cause-to-know, and

    wish-they-knew science.

    Our SL-in-action policy in the classroom domain is illustrated by, for example:

    1. Carlone’s (2003) research program about physics teachers who offered

    Active Physics at their school, which combined wish-they-knew,

    functional, and personal-curiosity science content; and

    2. Kortland’s (2001) research program into students’ learning how to make

    decisions in the context of a waste management module, which combined

    functional, have-cause-to-know, personal-curiosity, and wish-they-knew

    science.

    Although concrete examples of program and classroom embodiments are useful to

    policy makers, the assessment of students’ learning is also a central consideration.

    We turn next to an exploration of the role of assessment in science education

    reform during the past several decades, and the potential implications for

    considering adoption of an SL-in-action policy for school science.

    Potential Assessment Issues Surrounding SL-in-Action

    Common sense dictates that there should be a strong, clear relationship between

    the substance of a curriculum and the assessment of students’ learning based on it.

    The relationship can be complex, though, and the complexity becomes more

    apparent when the policy represents a considerable change from typical practice,

    as indeed SL-in-action would.

    Orpwood (2001) recounts two well-known major shifts in science

    curriculum policy: the first was a shift to science processes and the structure of


    science in about 1960, and the second was a shift to STS/STSE, beginning early

    in the 1980s. In both cases, the development and acceptance of appropriate assessment

    procedures and strategies lagged about 20 years behind the adoption (on paper) of

    the new policy. In the former case, Orpwood notes that “The first significant

    ‘performance assessments’ ... were designed in England in the early 1980s by the

    Assessment of Performance Unit (APU, 1983) fully 20 years after the goal of

    instilling ‘inquiry skills’ in students had first been introduced into the curriculum”

    (p. 143). We also note the 20-year lag regarding the shift to STS/STSE. Only

    recently has the Assessment and Qualifications Alliance in England published its

    General certificate of education: Science for public understanding 2004 (AQA,

    2003). In about the same time frame, OECD’s Programme for International

    Student Assessment (PISA) has made available some good examples of how to

    assess students on STS/STSE content.

    The goal of SL-in-action cannot wait 20 years for suitable assessment

    procedures to be developed and implemented, as was the pattern for the previous

    two major shifts in science curriculum policy. Events in the world of work could

    well marginalize the canonical school science curriculum, making it more

    irrelevant than it is currently observed to be, as evidenced in the Statement of

    Concern at the beginning of this volume.

    In the remainder of this section we offer an analysis of why educational

    assessment has such a powerful influence over the selection and implementation

    of science curriculum policy. The first part of the analysis examines four

    functions of educational assessment and some technical factors (notably validity

    and reliability) that are integral to understanding how different kinds of

    assessment serve different purposes. The second part is about power distribution

    in educational institutions. In the third part we apply these analytical insights to

    the case of SL-in-action, with particular reference to potential considerations for

    policy makers, educational researchers, and science educators.


    Functions of Educational Assessment

    Our first function of educational assessment concerns accountability. The

    public, media, and politicians in most Western democracies are increasingly

    demanding that education systems and individual schools provide evidence of

    their effectiveness, productivity, and ‘value’ for taxpayers’ investment. In some

    countries, this has taken the form of ‘league tables’ of schools, based on the

    results of examinations designed for other purposes (e.g., GCSE and A levels in

    England & Wales). Elsewhere, international assessments such as Trends in

    International Mathematics and Science Study (TIMSS) and PISA are used as

    proxies for school and system effectiveness. Because of the authority that TIMSS

    and PISA command, their reports can have a profound impact on educational

    policy despite the very obvious limitations and inadequacies of their paper-and-

    pencil character and their objective to transcend local realities (Fensham, 2007).

    Finally, in some countries, special tests are designed with this accountability

    agenda explicitly in mind. In Canada, the Pan-Canadian Assessment

    Program is a case in point.

    A key expectation of assessments designed for accountability is the

    comparison among schools or school systems. The reliability of the assessment is

    therefore very important. For this reason, multiple choice tests are often used

    because these can be designed with high reliability for recall of factual

    information (i.e., superficial, codified, scientific knowledge, but not tooled

    knowledge, nor personal knowledge, nor technological knowledge). Recall is

    easier to assess with multiple-choice items than is the more complex knowing

    demanded by SL-in-action.

    The PISA program was charged with assessing students’ preparedness for

    21st century life. Unlike TIMSS, the PISA program is not constrained to be a

    narrow test of school curriculum content. In its tests of reading, mathematics, and

    science in 2000, 2003, and 2006, PISA has demonstrated that compatibility


    between high reliability and the assessment of more complex knowing-in-action is

    possible with the use of a wider variety of item types.

    Student certification and selection, a second function of assessment, is

    perhaps the most traditional role for assessment. Examinations are used in the

    majority of countries throughout the world for the purposes of certifying students’

    completion of one stage of education and for selecting them for subsequent

    opportunities either in education or employment. This type of assessment can

    have the advantage of moving the certification role of education from the school

    level to an external (and supposedly impartial) agency level, thus ensuring an

    equality of standards countrywide. This advantage, however, becomes a

    disadvantage by inhibiting desired changes in curriculum policy and curriculum

    implementation when students can only be assessed by these externally designed

    assessment instruments, usually the paper-and-pencil type.

    When the assessment of more complex goals is included, it has

    traditionally been engineered using open-ended or constructed-response items

    with the advantages and disadvantages that such items always have. They offer

    students an opportunity to demonstrate their knowledge and skills in more

    diverse ways than multiple-choice items do. Thus, open-ended or constructed-

    response items have greater validity. But reliable (consistent) marking remains a

    challenge. Moreover, the assessment is still confined to what can be written

    within a fixed (usually short) period of time.

    In contrast, our vision of SL-in-action for a Knowledge Society

    emphasizes validity as central to assessment. Students would be given time

    to solve a problem and the freedom to draw on a variety of resources, as is the

    case in ST-related employment in the everyday world. However, marking these

    types of responses requires one-on-one expert observation and evaluation of a

    student’s performance. Although this is not impossible – music and dance have


    been evaluated this way for years – it is very rare in a science context, even

    though its validity is superior.

    School improvement is a third function of educational assessment. Many

    people hold the view that schools whose students achieve well in examinations

    are ‘good schools’ and that, correspondingly, schools with lower aggregate results

    are poor and in need of improvement (a view we would argue is misguided).

    Assessment is therefore the basis for determining the quality of individual schools

    and for measuring whether they are improving.

    In the Canadian province of Ontario, for instance, a whole new

    government bureaucracy has been developed with a view to improving school

    performance to meet government-set targets. On the face of it, this is a good plan.

    However, the use of assessment to label schools raises the stakes for schools,

    which can, in turn, have an undesirable impact on teaching and learning. In a

    recent study, for example, researchers found that some teachers try to avoid

    teaching a grade level in which provincial testing takes place; and that, in some

    schools, principals discourage teaching subjects that are not tested (Sinclair,

    Orpwood, & Byers, 2007).

    In any agenda to improve schools, assessments that mirror those designed

    for accountability can narrow the scope of the curriculum taught, as mentioned

    above. School improvement needs to be conceptualized more broadly. While

    assessment results can form a component, they should form only part of a broader

    description of the character and effectiveness of schools. For example, in the case

    of SL-in-action, analyzing the range of activities in which students are engaged

    can offer better evidence than statistics based on written tests that are narrow in scope.

    A fourth function of assessment focuses on improving student learning. In

    recent years, science educators (e.g., Black & Wiliam, 1998; Bell & Cowie, 2001)

    have argued persuasively that the most important purpose for assessment, which

    they call ‘assessment for learning’ or ‘formative assessment,’ is also the most


    neglected. Assessment for learning aims squarely at the individual student’s

    learning and is designed to have an immediate positive impact on that learning. It

    represents the antithesis of the other three approaches to assessment because: it is

    individual in its context; it is classroom-based; it is designed and practised

    exclusively by teachers; it lacks secrecy – indeed, the sharing of assessment

    criteria with students is a key to its success; and its results do not necessarily

    require documentation or reporting. It truly implements the view articulated by

    Wiggins (1993) that assessment is something we should do with students rather

    than to them.

    In summary, all four functions of educational assessment are valid in their

    own terms. All offer potentially valuable contributions to education and, in an

    ideal system, all would have a place. However, political and professional

    pressures tend to favor some over others, and the resulting imbalance can affect

    curriculum change. In particular, the choice of assessment function can threaten

    the implementation of SL-in-action. Imbalances also create a hierarchy among the

    four sets of purposes based on the political and professional power of

    those controlling each level of assessment.

    Power over Purpose

    Senior levels of government typically have the power and resources to

    invest more in assessment than lower levels of government, senior bureaucrats

    more than junior bureaucrats, and school principals more than classroom teachers.

    It follows, therefore, that the needs and interests of the more senior levels will

    tend to take precedence over those below them. It is likely that international and

    national assessments designed for system accountability or student

    certification/selection will have greater power over those designed locally for

    promoting student learning. Importantly, the first three functions of educational

    assessment have greater power over the fourth function that supports student


    learning. Evidence of this hierarchy lies in the financial

    resources devoted to each type of assessment.

    For example, in Ontario the results of provincial mathematics and

    language assessments are used to make judgments about schools as a whole.

    Moreover, the content of these tests has a significant steering effect on the

    teaching and assessment carried out by teachers, whether or not this is appropriate

    (Sinclair, Orpwood, & Byers, 2007). This hierarchical competition among

    functions of assessment also means that reliability-related criteria – critically

    important in large-scale and high-stakes assessments – are likely to be of more

    significance than validity-related criteria – usually of much greater significance in

    the curriculum-related assessments at the school and classroom levels.

    The implications for assessment are clear. It is cheaper, simpler, easier,

    and more aggrandizing to measure students’ memorization or accumulation of

    facts, abstractions, and algorithms than to assess the more complex competencies

    that make up a richer vision of SL for a Knowledge Society. Thus, in a

    competition for resources, political power wins out over educational purpose every

    time, and SL-in-action is likely to suffer.

    Assessing Knowing-in-Action

    If SL-in-action is to become a reality, how should it be assessed when

    competence is judged as acting effectively, and assessment is seen as describing

    and monitoring that process? For a Knowledge Society, the focus is less on ‘what

    you know’ and more on ‘what you can learn and what you can do with it in the

    context of the everyday world of work and responsible citizenship.’ Science is

    understood in the context of technological and social challenges faced by

    individuals and societies.

    We propose that assessment move correspondingly away from science

    knowledge in a static and de-contextualized sense (traditional assessment), and

    even beyond performance assessment with its focus on students’ ability to


    perform science experiments. Assessment can move toward giving students real-

    world tasks where they learn relevant ST content and use this content to achieve a

    broader goal.

    Real-world tasks can be of two kinds. First, more appropriate for

    elementary school, students can be given a simple task that requires the use of

    their knowledge, but its use is left to the student to determine. Examples of this

    kind of assessment can be found in the Assessment of Science & Technology

    Achievement Project (Orpwood & Barnett, 1997). A Grade 1 student who has

    been taught about the senses (seeing, hearing, touching, tasting, and smelling) is

    asked to design a game for students who are blind. A Grade 6 student who has

    been taught about methods of heat transmission (conduction, convection, and

    radiation) is asked to design a cup that will keep a chocolate drink hot the longest.

    Second, more appropriate at a senior level, real-world tasks can be broader

    and less concrete. Students can be asked for solutions to societal challenges to

    which there may be no right answers (Driver, Leach, Millar, & Scott, 1996). PISA

    does include a number of these types of challenges. Functional, have-cause-to-

    know, and need-to-know science content can be assessed through students’

    analysis of socio-scientific decision scenarios. This is illustrated by the Science

    Education for Public Understanding Program (SEPUP, 2003) in the United

    States (Thier & Daviss, 2001), which explicitly connects its relevant science

    content (functional and have-cause-to-know science) to the wish-they-knew

    science of the country’s National Science Education Standards (NRC, 1996).

    Assessing student decision making on ST-related events and issues has been a

    research program in a number of countries (Gaskell, 1994; Kolstø, 2001;

    Kortland, 1996; Ratcliffe & Grace, 2003; Sadler, 2009; Zeidler & Sadler, Chapter

    yy, this volume). For policy makers, all of these examples are rich sources for

    alternative procedures for assessing SL-in-action. These assessment methods go

    beyond marking answers as right or wrong, based on their matching a


    predetermined answer or based on a checklist of predetermined skills. Such

    simple methods run counter to preparing students for a knowledge-based

    economy.

    Science students need to be exposed to situations where they can

    respond in a scientifically literate manner, and assessment

    specialists need to describe and monitor such responses. Because SL-in-action is

    ‘knowing how to learn and knowing how to use that learning,’ assessment of SL-

    in-action must describe and monitor those complex processes.

    Conclusion

    We succinctly reiterate two crucial concepts – ‘SL-in-action’ and ‘school science

    content.’ A Knowledge Society requires employers, employees, and citizens to

    develop the capacity to treat knowledge in terms of action – knowing-in-action. In

    science education this becomes SL-in-action.

    We conceptualize school science content as a triad of scientific content,

    processes, and contexts, in which content and processes are invariably context-

    bound as they are in the world of employment (i.e., context-bound content and

    context-bound processes). This specified meaning for ‘school science content’

    harmonizes science education with SL-in-action and ultimately with a Knowledge

    Society.

    A major implication for policy concerning student assessment logically

    follows. Context-bound science instruction and learning are by and large

    incompatible with universal assessment ideologies found in many national and

    international testing institutions. In the agenda of a Knowledge Society, non-

    universal types of assessment for science-technology (ST) learning are given

    precedence over issues of accountability, student certification, student selection,

    and school improvement.

    SL-in-action is about capacity building either for competence in ST-

    related occupations, or for resolving ST-related events and issues. If policies,


    curricula, teacher education programs, teacher professional development, and

    student assessment in science education do not support SL-in-action, they hinder a

    country’s knowledge-based economy and thereby undermine its Knowledge

    Society (Munby, Hutchinson, Chin, Versnel, & Zanibbi, 2003).

    We have offered policy makers a theoretical framework for choosing

    school science content in line with a Knowledge Society. In addition, we have

    clarified an ideology that would guide the development of student assessment

    policies that support a Knowledge Society. Current national and international

    assessment conventions and practices must give way to new ones so that all

    citizens develop a rich range of school science outcomes needed in a Knowledge

    Society.

    References

    Aikenhead, G.S. (1991). Logical reasoning in science & technology. Toronto,

    Ontario: John Wiley of Canada.

    Aikenhead, G.S. (2005). Science-based occupations and the science curriculum:

    Concepts of evidence. Science Education, 89, 242-275.

    Aikenhead, G.S. (2006). Science education for everyday life: Evidence-based

    practice. New York: Teachers College Press.

    APU (Assessment of Performance Unit) (1983). Science at age 11. London:

    Department of Education and Science.

    AQA (Assessment and Qualifications Alliance) (2003). General certificate of

    education: Science for public understanding 2004. Manchester: AQA. (and

    subsequent years)

    Bayliss, V. (1998). Redefining work: An RSA initiative. London: RSA.

    Bell, B., & Cowie, B. (2001). Formative assessment and science education.

    Dordrecht, The Netherlands: Kluwer Academic Publishers.


    Bell, R.L., & Lederman, N.G. (2003). Understandings of the nature of science and

    decision making on science and technology based issues. Science Education,

    87, 352-377.

    Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in

    Education, 5(1), 7-71.

    Bybee, R.W., & Fuchs, B. (2006). Preparing the 21st century workforce: A new

    reform in science and technology education. Journal of Research in Science

    Teaching, 43, 349-352.

    Carlone, H.B. (2003). Innovative science within and against a culture of

    “achievement.” Science Education, 87, 307-328.

    Carter, L. (2008). Globalization and science education: The implications for

    science in the new economy. Journal of Research in Science Teaching, 45,

    617-633.

    Chartrand, H.H. (2007). Ideological evolution: The competitiveness of nations in

    a global knowledge-based economy. Unpublished PhD dissertation,

    University of Saskatchewan, Saskatoon, Canada.

    Chin, P., Munby, H., Hutchinson, N.L., Taylor, J., & Clark, F. (2004). Where’s

    the science? Understanding the form and function of workplace science. In E.

    Scanlon, P. Murphy, J. Thomas, & E. Whitelegg (Eds.), Reconsidering

    science learning. New York: RoutledgeFalmer, pp. 118-134.

    Désautels, J. (2004). Technobiological literacies and citizenship education. In J.

    Lewis, A. Magro, & L. Simonneaux (Eds.), Biology education for the real

    world: Student-teacher-citizen (Proceedings of the Fourth Conference of

    European Researchers in Didaktic of Biology) (pp. 9-26). Toulouse, France:

    École nationale de formation agronomique.

    De Vos, W., & Reiding, J. (1999). Public understanding of science as a separate

    subject in secondary schools in The Netherlands. International Journal of

    Science Education, 21, 711-719.


    Dori, Y.J., & Tal, R.T. (2000). Formal and informal collaborative projects:

    Engaging in industry with environmental awareness. Science Education, 84,

    95-113.

    Driver, R., Leach, J., Millar, R., & Scott, P. (1996). Young people’s images of

    science. Buckingham, UK: Open University Press.

    Duggan, S., & Gott, R. (2002). What sort of science education do we really need?

    International Journal of Science Education, 24, 661-679.

    Fensham, P.J. (2000). Issues for schooling in science. In R.T. Cross & P.J.

    Fensham (Eds.), Science and the citizen for educators and the public.

    Melbourne: Arena, pp. 73-77.

    Fensham, P.J. (2002). Time to change drivers for scientific literacy. Canadian

    Journal of Science, Mathematics and Technology Education, 2, 9-24.

    Fensham, P.J. (2007). Values in the measurement of students’ science

    achievement in TIMSS and PISA. In D. Corrigan, J. Dillon, & R. Gunstone

    (Eds.), The re-emergence of values in science education (pp. 215-229).

    Rotterdam: Sense Publishers.

    Fleming, R. (1989). Literacy for a technological age. Science Education, 73, 391-

    404.

    Foray, D., & Lundvall, B. D. (1996). The knowledge-based economy: From the

    economics of knowledge to the learning economy. In OECD, Employment and

    growth in the knowledge-based economy (pp. 11-32). Paris: OECD.

    Fraser, S.W., & Greenhalgh, T. (2001). Coping with complexity: Educating for

    capability, British Medical Journal, 323, 799-803.

    Furnham, A. (1992). Lay understanding of science: Young people and adults’

    ideas of scientific concepts. Studies in Science Education, 20, 29-64.

    Gaskell, P.J. (1994). Assessing STS literacy: What is rational? In K. Boersma, K.

    Kortland, & J. van Trommel (Eds.), 7th IOSTE symposium proceedings:


    Papers. Part 1 (pp. 309-320). Endrecht, The Netherlands: IOSTE Conference

    Committee.

    Gaskell, P.J. (2002). Of cabbages and kings: Opening the hard shell of science

    curriculum policy. Canadian Journal of Science, Mathematics and

    Technology Education, 2, 59-66.

    Gilbert, J. (2005). Catching the knowledge wave? The Knowledge Society and the

    future of education. Wellington: New Zealand Council for Educational

    Research.

    Gott, R., Duggan, S., & Johnson, P. (1999). What do practising applied scientists

    do and what are the implications for science education? Research in Science

    & Technological Education, 17, 97-107.

    Guo, C-J. (2007). Issues in science learning: An international perspective. In S.K.

    Abell & N.G. Lederman (Eds.), Handbook of research on science education

    (pp. 227-256). Mahwah, NJ: Lawrence Erlbaum Associates.

    Hennessy, S. (1993). Situated cognition and cognitive apprenticeship:

    Implications for classroom learning. Studies in Science Education, 22, 1-41.

    Hunt, A., & Millar, R. (2000). AS science for public understanding. Oxford:

    Heinemann.

    Hurd, P.D. (1998). Scientific literacy: New minds for a changing world. Science

    Education, 82, 407-416.

    Ihde, D. (1991). Instrumental realism: The interface between philosophy of

    science and philosophy of technology. Bloomington, Indiana: Indiana

    University Press.

    Jenkins, E. (1992). School science education: Towards a reconstruction. Journal

    of Curriculum Studies, 24, 229-246.

    Kolstø, S.D. (2000). Consensus projects: Teaching science for citizenship.

    International Journal of Science Education, 22, 645-664.


    Kolstø, S.D. (2001). ‘To trust or not to trust, …’ – pupils’ ways of judging

    information encountered in a socio-scientific issue. International Journal of

    Science Education, 23, 877-901.

    Kortland, J. (1996). An STS case study about students’ decision making on the

    waste issue. Science Education, 80, 673-689.

    Kortland, J. (2001). A problem posing approach to teaching decision making

    about the waste issue. Utrecht, The Netherlands: University of Utrecht Cdβ

    Press.

    Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral

    participation. Cambridge: Cambridge University Press.

    Law, N. (2002). Scientific literacy: Charting the terrains of a multifaceted

    enterprise. Canadian Journal of Science, Mathematics and Technology

    Education, 2, 151-176.

    Law, N., Fensham, P.J., Li, S., & Wei, B. (2000). Public understanding of science

    as basic literacy. In R.T. Cross & P.J. Fensham (Eds.), Science and the citizen

    for educators and the public. Melbourne: Arena, pp. 145-155.

    Layton, D. (1991). Science education and praxis: The relationship of school

    science to practical action. Studies in Science Education, 19, 43-79.

    Lottero-Perdue, P.S., & Brickhouse, N.W. (2002). Learning on the job: The

    acquisition of scientific competence. Science Education, 86, 756-782.

    Millar, R., & Driver, R. (1987). Beyond processes. Studies in Science Education,

    14, 33-62.

    Munby, H., Hutchinson, N.L., Chin, P., Versnel, J., & Zanibbi, M. (2003,

    February). Curriculum and learning in work-based education: Implications

    for education in the new economy. Paper presented at the National Invitational

    Conference, “School-to-Work and Vocational Education: The New

    Synthesis,” Temple University Center for Research in Human Development

    and Education, Philadelphia.


    NRC (National Research Council). (1996). National science education standards.

    Washington, DC: National Academy Press.

    OECD (1996a). The knowledge-based economy. Paris: OECD Publications.

    OECD (1996b). Employment and growth in a knowledge-based economy.

    Washington, DC: OECD Publications and Information Centre.

    OECD (2007). PISA 2006: Science competencies for tomorrow’s world. Paris:

    OECD.

    Orpwood, G. (2001). The role of assessment in science curriculum reform.

    Assessment in Education, 8(2), 135-151.

    Orpwood, G., & Barnett, J. (1997). Science in the National Curriculum: An

    international perspective. Curriculum Journal, 8, 331-349.

    Polanyi, M. (1958). Personal knowledge. Chicago: University of Chicago Press.

    Ratcliffe, M., & Grace, M. (2003). Science education for citizenship: Teaching

    socio-scientific issues. Philadelphia: Open University Press.

    Rodrigues, S., Tytler, R., Darby, L., Hubber, P., Symington, D., & Edwards, J.

    (2007). The usefulness of a science degree: The “lost voices” of science

    trained professionals. International Journal of Science Education, 29, 1411-

    1433.

    Roth, W-M., & Calabrese-Barton, A. (2004). Rethinking scientific literacy. New

    York: RoutledgeFalmer.

    Roth, W-M., & Lee, S. (2004). Science education as/for participation in the

    community. Science Education, 88, 263-291.

    Ryder, J. (2001). Identifying science understanding for functional scientific

    literacy. Studies in Science Education, 36, 1-42.

    Sadler, T.D. (2009). Situated learning in science education: Socio-scientific issues

    as contexts for practice. Studies in Science Education, 45, 1-42.


    SEPUP. (2003). SEPUP News. Berkeley, CA: Science Education for Public

    Understanding Project, Lawrence Hall of Science, University of California at

    Berkeley. Retrieved June 4, 2008, from www.sepup.com.

    Sinclair, M., Orpwood, G., & Byers, P. (2007, April). Supporting school

    improvement in mathematics. Paper presented at the annual meeting of the

    American Educational Research Association, Chicago, IL.

    Solomon, J. (1984). Prompts, cues and discrimination: The utilization of two

    separate knowledge systems. European Journal of Science Education, 6, 277-

    284.

    Thier, H.D., & Daviss, B. (2001). Developing inquiry-based science materials: A

    guide for educators. New York: Teachers College Press.

    Wiggins, G. (1993). Assessing student performance: Exploring the purpose and

    limits of testing. San Francisco: Jossey-Bass.