2006/ED/EFA/MRT/PI/14

Background paper* prepared for the Education for All Global Monitoring Report 2006, Literacy for Life

Approaches to monitoring and evaluation in literacy programmes

H.S. Bhola

This paper was commissioned by the Education for All Global Monitoring Report as background information to assist in drafting the 2006 report. It has not been edited by the team. The views and opinions expressed in this paper are those of the author(s) and should not be attributed to the EFA Global Monitoring Report or to UNESCO. The papers can be cited with the following reference: “Paper commissioned for the EFA Global Monitoring Report 2006, Literacy for Life”. For further information, please contact [email protected]

* Commissioned through the UNESCO Institute for Education (UIE)



APPROACHES TO MONITORING AND EVALUATION IN LITERACY PROGRAMMES

H.S. Bhola

I BACKGROUND AND CONTEXT

The need for establishing appropriate, organized, and duly comprehensive Monitoring and Evaluation (M&E) systems, integrated into the programs and projects of education and development, is well understood today - - on the one hand, to inform the processes of policy formulation and policy analysis, and on the other hand, to enable an efficient and effective achievement of program objectives. Establishing an M&E system is by no means a simple design-and-plug operation. Sufficiently comprehensive and effective M&E systems must evolve within a complex and continuous dialectic between what theory compels and what the experiential-empirical world of practice permits or prohibits.

There is yet another complexity. As “standpoint” theorists point out, both those who develop literacy policies and programs and those who design related M&E systems assume particular standpoints of theory and ideology and use specific norms and criteria in identifying exemplars of good practice on the ground. This behavior is, of course, disciplined by the theoretical tradition and collective knowledge of the professional community and by the criterion of “Strong Objectivity” in the overall process (Harding 1998; Durham 1998).

The text that follows is anchored in an epistemological triangle defined by systems thinking, dialectical thinking and constructivist thinking. Positivist thinking is not rejected outright but is seen as a particular construction of reality in settings where assumptions of direct and immediate causality, and of relative certainty in a specific context, can be made.

NEW DIRECTIONS IN THEORY AND PRACTICE OF MONITORING AND EVALUATION

Advances in the theory and methodology of M&E have been impressive during the last forty years (Kellaghan, Stufflebeam & Wingate 2003; Mathison 2005). One of the more important advances in the theory and practice of M&E may have been the move away from linearity to a systems view, though this is not always done self-consciously, and in its applications the systems view often becomes more systematic and less systemic. Along with this advance is what has come to be called the paradigm dialog and the resulting accommodations between positivism and constructivism. Mixed models are now routinely used. Quantitative methods are joined with qualitative, analytical and historical methods to create fuller understandings of reality.

Understanding of institutional realities and “cultural competence” is demanded from evaluators (Patton 2000; Bhola 2003c, 2004b, 2004d; Thomas & Stevens 2004). There has been a keener understanding of the intersections between theory and ideological values, which then affect both the choice of models and methods and the subsequent interpretations of data. External evaluations are asked to be balanced by internal evaluations which, in turn, are required to be participatory, so as to serve the needs and interests of all relevant stakeholders, including learners, field workers, facilitators, families, and communities.

FROM THEORY AND IDEOLOGY TO AN OPERATIONAL FRAME

Making monitoring and evaluation “operational” within adult literacy [1] projects, programs and campaigns is an old concern (Bhola 1973). The advances in theory and practice sketched above have provided a new frame to operationalize "Built-in" or "In-Built" Monitoring and Evaluation systems (Bhola 1983, 1984, 1989b, 1990, 1995a, 1998a). [2]

First and foremost, Built-in Monitoring and Evaluation is seen as a "System" carefully constructed from a systemic and systematic analysis of a Program which is also seen as a "System", with its four parameters of Contexts, Inputs, Processes, and Products (CIPP). The Program system and the Monitoring and Evaluation system are seen as mutually integrated and to be fully and functionally interfaced in terms of the program aspects to be focused on, and the indicators to be used to monitor and evaluate change in those aspects.

The second essential element of the Built-in M&E System is to make the agent of action also the agent responsible for evaluation - - by self-evaluation or co-evaluation. The Built-in M&E is thus a system in which information is used immediately at the level where it is produced and then sent upwards (Bottom-Up) for collation, analysis, interpretation and utilization at each successive level. It is a two-way flow of information, as feedback-information from above is then fed back (Top-Down) to each of the levels below, now suffused with a larger holistic perspective.

The Built-in monitoring and evaluation system is praxis, not policing; inventing, not importing; it is synthetic rather than analytic; collective rather than bureaucratic; democratic rather than autocratic; participative rather than hierarchical; and empowering rather than disempowering. Such a system creates intelligent institutions and cultures of information within which informed decisions are made to plan and deliver literacy services to those youth and adults who need literacy to transform their identities and communities (Bhola 1990, 1995a, 1998a; Senge 1990; Chapman & Mahlck 1993; Lind 1996).

INTERFACING THE M&E OPERATIONAL FRAME WITH FACTS OF LITERACY INITIATIVES ON THE GROUND

As the “EFA Global Monitoring Report 2002: Education for All: Is the World on Track?” stated: “The meaning of literacy has changed radically since the World Conference on Education for All in Jomtien in 1990. Conceived now in the plural as 'literacies', and embedded in a range of life and livelihood situations, literacy differs according to purpose, context, use, script and institutional framework. But these conceptual advances have not been matched by the priority accorded to it in policy and resource allocation, in part because many governments perceive the expansion of primary education as the main driver for the eradication of illiteracy (UNESCO 2002).”

The above statement points to some of the most serious challenges in interfacing the operational frame of an M&E system with a literacy program on the ground.

First and foremost is the question of defining and delineating literacy itself: how to accommodate differentiations in the languages of literacy, and in the levels of difficulty and knowledge hidden in the words being read; and, at the same time, be able to abstract for generalization the essential literacy skills and the core of development-oriented knowledge across those diverse social spaces, for "ensuring a unified vision, framework and plan" from the local to the global - - and yet meet "the challenge of celebrating diversity" (United Nations 2000, p. 16). This should encourage us to go back from multiple literacies to an essential "semiotic" definition of literacy skills, so that we are able to measure competence in coding and decoding, in terms of general levels within different language systems - - complemented with separate tests for content in the texts relevant to contexts (Langer 1942; Bhola 1989; 1997).

There are additional pedagogical and pragmatic questions: how to include the teaching of numeracy and essential life skills as part of literacy; and how to serve out-of-school children, adolescents, and adults, whether or not they attended school?

There is an analytical-methodological challenge: a “causal dynamic” for the use of literacy in poverty reduction must be constructed and embedded in the design and delivery of the literacy program, and then used for developing indicators to be used in Monitoring and Evaluation (M&E), to be able to make assertions about the consequences of literacy for learners, their families, and communities (a schematic illustration follows at the end of this section).

Finally, the policy challenge: are policy makers going to make commitments to literacy promotion; link literacy with policies and plans for poverty reduction; commit to M&E approaches that take an expanded view of monitoring and evaluation and do not simply stay with statistics; and expend the resources necessary for making it all possible?
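The “causal dynamic” mentioned above can be made concrete with a small schematic example. The sketch below, in Python, is a purely illustrative construction: the links in the chain, the indicator names, the data sources and the figures are all invented for the purpose and are not drawn from any particular program. It shows how one monitoring indicator can be attached to each link of an assumed causal chain running from instruction, to the use of literacy, to a livelihood outcome, so that assertions about the consequences of literacy can be traced link by link rather than asserted wholesale.

# Hypothetical sketch: a causal chain from literacy instruction to livelihood
# outcomes, with one monitoring indicator attached to each link. All names,
# links, sources and figures are invented for illustration.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # what is measured
    source: str      # where the data would come from
    baseline: float  # value before the intervention (invented)
    current: float   # latest monitored value (invented)

    def change(self) -> float:
        return self.current - self.baseline

# Each link of the assumed causal chain carries its own indicator, so that a
# claim about the "consequences of literacy" can be examined link by link.
causal_chain = [
    ("instruction -> basic literacy skills",
     Indicator("share of learners decoding a simple text", "end-of-course test", 0.20, 0.65)),
    ("basic skills -> everyday use of literacy",
     Indicator("share of new literates reading signs or market prices", "household interview", 0.15, 0.40)),
    ("use of literacy -> livelihood outcome",
     Indicator("share reporting better market transactions", "follow-up survey", 0.10, 0.22)),
]

for link, ind in causal_chain:
    print(f"{link}: {ind.name} ({ind.source}): "
          f"{ind.baseline:.0%} -> {ind.current:.0%} (change {ind.change():+.0%})")

The point of the sketch is not the code but the discipline it stands for: each indicator is tied to a stated link in the assumed causal chain, so a break in the chain shows up in the data rather than in anecdote.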

II GROUNDING THE MONITORING AND EVALUATION FRAME IN THE EXPERIENTIAL WORLD: ANALYSIS OF SELECT CASES OF LITERACY PRACTICES

The dialectic between theory and practice is a continuous, never-ending process. The GMR–2006 with Literacy as its theme, now in process, provides an appropriate marker for putting the conceptual frame for Built-in Monitoring and Evaluation to the test of experience provided by the practice of literacy in the field. Empirical tests in the context of dialectical thinking are, of course, not Pass/Fail tests, nor are they meant to provide proof or disproof, or to help discover packages of good practices ready to be picked up and installed in house. Dialectical thinking helps deconstruct narratives and structures of monitoring and evaluation not to deduce axioms but to develop “insights” that may then assist in the creative process of re-stating theory and/or re-inventing practice in new contexts.

In looking for a sample of literacy experiences, one cannot but look at the universe of literacy work - - which at the present moment in the history of literacy work reveals a depressing picture without promise of any immediate positive change. While there might be “many a flower that blooms unseen” on the landscape of literacy work, there are only a few examples of literacy projects, programs or campaigns that have come to attention at national and international levels. The last two decades are indeed marked by continued neglect of literacy and by decline in the fortunes of on-going initiatives that had first come on the horizon in the 1970s and 1980s. Policy declarations seeking to promote literacy have been frequent and bold, but have all too often been later subverted and scrapped by policy makers.

Yet, hoping against hope, a search for a sample of literacy projects, programs or campaigns that might illuminate the processes under consideration was begun with an e-mail survey of some 300 active adult literacy educators from around the world, requesting help in identifying, on the basis of personal knowledge and experience, literacy programs, projects or campaigns that had self-consciously and proactively designed and implemented what could be considered a built-in or in-built monitoring and evaluation system. No more than half a dozen usable responses were received. A tight time-frame for sending in responses may have been the reason for such a disappointing level of response. One of the responses was telling. Dr. Wim Hoppers, then the Moderator of EDF, remarked: "There is very little in-built systematic monitoring of literacy programmes that I have come across. Usually sponsors are only interested in immediate outputs, i.e. numbers of people trained!" This has indeed also been this author’s impression.

In the meanwhile, it was discovered that Built-in Monitoring and Evaluation is now a part of the chatter on the Internet. The descriptor "Built-in Monitoring and Evaluation" listed 756,000 entries, and "In-Built Monitoring and Evaluation" listed 46,500. However, even a quick scanning of entries revealed that both authors and bibliographers had equated collaborative and participatory evaluations with built-in/in-built evaluations of some sort (www.Google.com, accessed March 2, 2005).

The International Handbook on Educational Evaluation (Kellaghan, Stufflebeam & Wingate 2003) does not have any entries in its index on Built-in or In-Built Monitoring or Evaluation, though it does make references to diagnostic monitoring and performance monitoring. It also includes items on international comparative studies of achievement and lists EFA monitoring as well. Similarly, the Encyclopedia of Evaluation (Mathison 2005) does not have an entry on Built-in or In-Built Evaluation, though it does have entries on participatory evaluation, participatory monitoring and evaluation, participatory rural appraisal, and performance-based monitoring. With all these references to diagnostic and performance monitoring and to collaborative and participatory evaluations and appraisals, it is possible to assume that there are literacy projects, large and small, around the world that “have managed to create and nurture cultures of M&E that serve their purposes” (personal note from Adama Ouane, April 8, 2005).

The sample of monitoring and evaluation studies finally selected was chosen first by reputation and then by convenience of access to documentary resources related to those studies.

Of the six cases, three are from Asia, two from Latin America and one from Africa. Unsurprisingly, four of the cases were from the E-9 countries.

MEXICO: NATIONAL INSTITUTE FOR ADULT EDUCATION (INEA)

The National Institute for Adult Education (INEA) was created in 1981 with the mandate of promoting universal literacy, which was called a fundamental initial phase for all education and the essential element for social equality (Moreno-Cedillos 2002). Between the politics of the North American Free Trade Agreement (NAFTA) of January 1994 and the policies and practices of adult learning proclaimed at the conclusion of CONFINTEA V at Hamburg (1997), the Mexican Government gave INEA a new ideology and policy orientation: adult and youth education as a process and a right; work as an indispensable component of adult education in the context of lifelong learning; priority to women and indigenous populations; a model to overcome the compensatory and supplementary vision associated with adult education; and the commitment to achieve universal literacy, with guaranteed use of computers and communication technologies with connectivity to the Internet for communities.

Literacy was to provide a second-chance education for those 15 years old or older who wished to complete primary and secondary education. There was to be a strong emphasis on technical and vocational training. INEA was to build cooperation with communities and businesses. There was to be mutual recognition of skills acquired in the labor market and in formal and nonformal education. Equivalencies were to be established between INEA courses and secondary education curricula.

While INEA continues to provide a whole series of programs (as a number of NGOs continue to serve different groups and constituencies with their adult education needs), the government’s Model for Life and Work (MEVyT) remains INEA's flagship program, based on curriculum designs to achieve particular behavioral objectives and delivered with the help of new Information and Communication Technologies. In 2001, over 3 million people received INEA assistance; 134,648 completed their program. Another program worth mentioning is its Community Education program serving indigenous peoples. In 2001, some 34,000 community instructors were active in communities, educating learners regarding health, nutrition and work.

While records of coverage, attendance and attainment have been kept for all of the programs under INEA, the MEVyT initiative has had the most systematic monitoring in relation to learners since the mid-1990s. A three-step learner "evaluation" is conducted: the first is diagnostic, based on an initial interview; the second is formative, a process of self- and co-evaluation; and the third is evaluation for accreditation, which also involves qualitative evaluation of learners' achievement. In addition, Final Examinations are held nation-wide for certification. The backbone of this system is the Automated Follow-up and Pass System (SASA), ensuring reliable records for service delivery and follow-up.

Numerical data suggest that evaluation is considered to be important for delivery as well as follow-up, but nothing conclusive has been available by way of knowledge of results for the institutions of delivery or for learners (Literacy Exchange: Mexico, accessed February 2005).

BRAZIL: ALPHABETIZACAO SOLIDARIA

Paulo Freire's ideological model of literacy as conscientization was resonant in the work of the Brazilian Conference of Bishops during 1958-1964, but it was overwhelmed by the Brazilian Literacy Movement (MOBRAL), proclaimed by the state on International Literacy Day, September 8, 1967, with the objective of promoting literacy among illiterate adolescents and adults within the framework of continuing education, and within the ideological frame of "modernization and reinforcement of political structures." MOBRAL was followed by Fundacao Educar (Educate Foundation), which sought to offer literacy programs for youth and adults, started with the support of civil society. This too was abandoned in 1990, a victim of the World Bank's Structural Adjustment Program.

In 1995, Alphabetizacao Solidaria (the Solidarity in Literacy Program) was organized as a forum to project civil society interests in literacy. By 1997, a partnership had become operative, bringing together the government, civil society and the private sector. The strategy was to bring promising youth from communities to university campuses for preparation as Literacy Trainers in a one-month module, followed by four months in communities to run classes for four days each week, at three hours per class period. There was a higher agenda: to enhance the individual potential of these trainees to be able to empower other social entities to generate social change.

Alphabetizacao Solidaria monitors attendance, and uses a self-evaluation system with its teachers, officials and monitoring staff. A project report states that at the end of 2001, 1,578 municipalities in 20 Brazilian states had been reached, with 120,000 teachers having worked with 2,410,000 students - - 60 percent of these teachers and learners were from rural areas. Classroom-level evaluations reported that the percentage of students able to reproduce phrases in the language of literacy rose from 13.53% to 30.37%, and the percentage able to produce text from 6.7% to 24.05%. The percentage of students with no reading skills decreased from 55.0% to 14.5%. The percentage of those able to use additional means of estimation, comparison and problem-solving increased from 60% to 75%. The program had impact through interaction with other programs of education and extension: school attendance "increased significantly" in communities where the program had worked, and 71% of the participating municipalities had reportedly institutionalized youth and adult education programs (Literacy Exchange: Brazil, accessed February 2005).

SOUTH AFRICA: ITHUTENG AND ABET

At the time of Independence in 1994, the need for literacy among the illiterate and poor was not immediately recognized. The government focussed instead on adult basic education and training (ABET) to prepare workers employed in the formal economy to better serve the needs of rapidly changing technologies.

In 1996, realizing that ABET as structured and delivered was not serving the interests of all of the people, the Department of Education at the center launched its Ithuteng (Ready to Learn) campaign for those with low literacy skills or no literacy skills at all. Ithuteng recruited learners only at ABET levels 1 and 2 and thus was a program within a program, with adult literacy serving as the springboard for further learning under ABET. During the five years of its life (until 2002/03), Ithuteng had served some 500,000 adults.

Monitoring and evaluation of Ithuteng is, in principle, part and parcel of the M&E system designed for the larger ABET project which, in turn, uses the paradigm of the Outcome-Based Education (OBE) system designed for the total system of public education in South Africa, integrating implementation with continuous assessment of the performance of learners. Under the earlier OBE scheme, curriculum is conceptualized as sets of critical outcomes, organized into clusters of units for each level. Each unit contains a listing of measurable outcomes that must be achieved by all learners at an acceptable level of competent performance. Prior knowledge acquired at work or through self-learning can, after testing, qualify for certification leading to portable qualifications. All qualifications finally fit into the National Qualifications Framework (NQF) of the South African Qualifications Authority.

Studies on the utilization of monitoring data by policy makers or practitioners of Ithuteng or ABET are hard to come by. Educators outside the governmental structures, though, had been saying that the OBE system was too complicated and had had completely unrealistic expectations of the teachers and educational administrators of the early post-Apartheid years. The report of a review committee on “Curriculum 2005” established by the Government came to the conclusion that, sadly, “neither the social nor the personal goals” of learners had been achieved. In the maze of elaborate listings of testable outcomes, non-integrated (and unsuitable) learning support materials, accumulations of multiple credits, and an array of rules for certification of qualifications, the most basic skills of reading and writing with comprehension, and of foundational mathematics, had been lost sight of (Chisholm 2000; Jansen 2004).

Some stand-alone evaluation studies of literacy projects run by NGOs have been more positive. The impact of ABET projects on their own facilitators and on learners in communities has been positive. These NGO projects have indicated remarkable levels of achievement on the part of facilitators, both as promoters of literacy and as activists in development; and learners have been both “learning and discerning” - - in Freire’s words, using the word to read the world (Bhola 1998a, 1998c; Literacy Exchange: South Africa, accessed February 2005).

INDIA: NATIONAL LITERACY MISSION (NLM)

The National Literacy Mission (NLM) was launched in 1988, a decade after the inauguration of the National Adult Education Programme (NAEP), and it was in more than one way continuous with the earlier initiative. The three program objectives of (i) literacy, (ii) awareness, and (iii) functionality remained in focus in the design of Monitoring and Evaluation (M&E).

The most important innovation of the NLM was its organization. First, it was a federal program implemented by states. Second, the program at the center, with state collaboration, translated into 574 Total Literacy Campaigns (TLCs) at the level of districts.

The NLM had planned for a comprehensive and systematic M&E, anticipating data flows from learners' groups on the ground to the district headquarters, to the states and then to the Center. Internal monthly monitoring was also expected to be complemented with formal internal evaluations when required. External spot checks were also part of the monitoring plans. The hope was to collect policy-oriented feedback that could be collated for use at six-month intervals. Concurrent Monitoring as well as Quick Appraisals were to be conducted at all the various levels and locations of the overall system. Data in the monitoring system were to be used to develop program evaluations in all the aspects of formative evaluation, evaluation of instruction, and outcome evaluation.

In addition to in-house monitoring and evaluations, large-scope evaluations of TLCs were mandated, to be paid for by the central government. At least three evaluations had to be conducted in each TLC: (1) a quick appraisal at the beginning; (2) a mid-term evaluation; and (3) an end-term evaluation. The list of questions that evaluations should find answers to was also established. The state governments had to contract these evaluations to outside experts - - in university departments, specialized institutes, or NGOs that had acquired a reputation for doing good evaluation work and were on the approved list. Normative frames for methodology, for testing achievement, and for the definition of success were also provided.

By 2002, some 14 years after its launch, the NLM, using goal-oriented, area-specific and time-bound strategies, had covered 574 of the nation's 598 districts. As many as 302 districts were already offering continuing education programs. Total enrollment had touched 125 million adults, with some 12 million volunteers leading learners' groups. As many as 71.45 million had become literate (Rao 2002).

An evaluative account based on some 97 external evaluations of TLCs conducted in districts spread all over India had produced interesting results. Enthusiasm for the literacy program was far higher than had been expected, and the literacy program had changed the social geography of communities just by its presence in those communities. Enrollments were higher for women than for men. The success rate for literacy acquisition was typically between 75% and 95%. Males and females had about the same scores in tests, and the so-called weaker sections of the community showed strong performance with equal or superior scores. The teaching of numeracy did not fare well. Consequences of literacy first appeared in terms of self-affirmation and self-esteem - - this was especially significant for females. New learners used their literacy skills to read printed signs in their environment, wrote letters and made better transactions in the market.

They obtained useful information for personal and family health, in growing trees, and in casting valid votes. Most of all, they dared to speak up! Awareness - - a radical concept borrowed from Freire - - had been tamed to mean knowledge of the government's existing development programs! Functional knowledge in the primers was not enough, and there was neither the know-how nor the capital to engage in any livelihood activities. Structural changes in the communities were outside the locus and context of control of learners (Literacy Exchange: India, accessed February 2005; Bhola 2003b).

BANGLADESH: DHAKA AHSANIA MISSION

The Dhaka Ahsania Mission (DAM), one of the world’s most reputed NGOs today, was established back in 1958. Since 1999, DAM has been in “Operational Relations with UNESCO.” With a nation-wide reach, it offers a whole range of services for socio-economic and cultural development, including non-formal education for all groups - - children, adolescents and adults - - linked with appropriate continuing education. To get an idea of its commitment and scope of work in adult literacy, it should be noted that by early 1999 as many as 122,875 learners in 4,274 centers had participated in DAM’s literacy work, and as many as 873 continuing education centers were functioning to reinforce the literacy and functional skills of its neo-literates. DAM’s work has expanded considerably over the years and is marked today by the highest levels of professionalism in all the aspects of program design, management of implementation and in-built monitoring and evaluation (Rahman 1999).

FROM THE PHILIPPINES: AN MIS SYSTEM OF PROMISE

During 1996-98, SEAMEO INNOTECH of the Philippines developed an MIS for the nation-wide NFE Accreditation and Equivalency Programme run by the Bureau of Nonformal Education of the Department of Education, Culture and Sports. A PowerPoint presentation based on the above, in 20 slides, explicates the objectives and the structure of the MIS.

This comprehensive MIS is supplemented and complemented by evaluations of projects as appropriate, covering all of the Government’s programs: early childhood care and education; non-formal primary education; literacy for youth and adults; continuing education covering post-literacy programs and vocational skills training; and linkages between formal and nonformal education programs and with development services. All interventions are oriented to socio-economic and cultural development and ideologically committed to gender parity, women's empowerment, and inclusion of the marginalized.

The system as planned covers the whole range of variables under each of the four systems parameters: inputs, processes, outputs/outcomes and contexts. The MIS is structured as a tight frame; however, the accommodation of the participation of all stakeholders and the use of both quantitative and qualitative methodologies permit a balance between the systematic and systemic aspects of the overall MIS. The MIS is thus able to illuminate the macro-level perspectives of policy makers and to serve the micro-level needs of field workers. It monitors the performance of the delivery system (regarding training, materials production, and capacity building) on the one hand, and learner achievements and changes in communities on the other.

Computer technologies are used for the storage of data and the utilization of information, as monthly, quarterly, bi-annual and annual reports are generated at various levels of the system (PowerPoint 2005).

Under globalization, economic growth is primarily anchored in the formal economy. The formal economy requires formal education, and the certification of knowledge and skills learned in nonformal settings as well as of prior knowledge learned informally at work. Accreditation of institutions authorized to test and certify prior knowledge, and to offer non-formal education programs, becomes a necessity. This in turn demands standard setting and the establishment of equivalencies between formal and non-formal education, as also in regard to prior knowledge. A second PowerPoint presentation, on the Nonformal Education Accreditation and Equivalency (NFE A&E) System of SEAMEO INNOTECH of the Philippines, in 23 slides, explains the underpinnings, objectives and structure of the accreditation and equivalency system in use (PowerPoint 2005).

The NFE A&E System does not merely administer standardized tests, but includes entry-level assessment and counseling, and actual learning interventions including individual tutorials, learning groups, self-instructional modules and resource speakers - - leading to certification. The alternative curriculum, while made equivalent to the curriculum in formal education, meets the learning needs of those in the nonformal streams. Those who pass the “Equivalency Tests” are offered bridging programs for further academic work, or helped to enter or re-enter the world of work.

OVERVIEW OF CASES

An overview of the country cases confirms that the discourses on the purposes, methods and values used in the design and implementation of M&E today resonate strongly with the global advances in theory and research in M&E. The use of the four systems parameters - - inputs, processes, outputs/outcomes and contexts - - is embedded in almost all of the cases. The new ideologies of collaboration and participation seem to have taken hold in all the six contexts. Of course, cultural and social adaptations are made in particular contexts to the cultures of institutions and the cultural identities of individuals in communities (Bhola 2003d).

The overview points to another important reality: that of the tension between collaboration and control, and between participation and the transfer of expert knowledge and values. Structures for M&E have to be built systematically by defining roles, and rules for relating those roles, which over time can become highly routinized and bureaucratized. As a result, the systemic can be squeezed out of systems thinking. Enabling the emerging M&E systems to approximate more and more closely the idealized Built-in Monitoring and Evaluation system will require continuous vigilance to ensure that the systemic in this framework is not overwhelmed by the systematic.
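The four systems parameters that recur across these cases can also be read as a data model for the records an MIS stores. The following sketch is a hypothetical illustration only: the field names and figures are invented and do not reproduce the SEAMEO INNOTECH schema or any other agency's format. It shows one learning-center record grouped under context, inputs, processes and outputs/outcomes, and a trivial aggregation that serves the macro-level view of policy makers while each record still serves the micro-level needs of field workers.

# Hypothetical sketch of an MIS record organized by the four systems
# parameters (context, inputs, processes, outputs/outcomes). Field names and
# figures are invented; this is not any agency's actual schema.

center_report = {
    "context":   {"district": "District A", "language": "Language X", "period": "2005-03"},
    "inputs":    {"facilitators_trained": 4, "primers_distributed": 120},
    "processes": {"sessions_held": 16, "average_attendance": 22},
    "outputs":   {"learners_enrolled": 30, "learners_completing_level_1": 18},
}

def total(reports, parameter, field):
    """Aggregate one numeric field across many center records (macro view)."""
    return sum(r[parameter][field] for r in reports)

reports = [center_report]  # in practice, one record per learning center and period
print("Learners completing level 1:",
      total(reports, "outputs", "learners_completing_level_1"))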

III NEEDED: A SYSTEMATIC INITIATIVE OF M&E IN LITERACY PROGRAMMES

Monitoring and evaluation are actions with a purpose: to illuminate policy and to guide practice. M&E is more than common sense, and doing M&E will require more than the technical skills of data collection, data storage and report writing. Capacity building for M&E initiatives regarding adult literacy will require (1) conceptual-analytical capacities to understand the nature of literacy itself; (2) skills in system design and the ability to interface systems of implementation with systems of M&E; and (3) technical skills such as indicator writing and instrument design. Unfortunately, such capacities do not exist in all those places where they are needed. National-level initiatives will be necessary in most Third World countries.

DEFINING LITERACY OPERATIONALLY

The place to start has to be to understand the nature of literacy and to define literacy in operational terms in any particular context. The splintering of literacy into literacies has to be repaired by using a semiotic definition of literacy - - "as the ability to make symbolic transformations of reality in writing, of reality already transformed into speech (Bhola 1997)." Levels of literacy achievement may then be determined within the context of the symbol systems used in the particular language of literacy, and by making comparisons between ratios of successes achieved across programs, languages, and regions. Literacy may be seen as layered, so that all who join a literacy program will get their first layer of literacy as initial literacy, to be followed by subsequent layers of literacy with greater fluency in coding and decoding (or the acquisition of a second literacy in an additional code), and with more and more content of essential life skills as relevant in contexts. Literacy, we should understand, has to be quite "autonomous" in the moment of teaching, but will become "ideological" in the process of learning (Bhola 1989). We cannot write two billion individualized prescriptions for two billion illiterates and semi-literates. For teaching literacy, standardization of content and methods will have to be done within large enough contexts.

DESIGNING SYSTEMS IN INTERFACE

It will be necessary to look at literacy projects, programs and campaigns as systems. M&E will also have to be conceptualized as a system. Suitable government policies will have to be proclaimed and budgetary allocations made. The delivery of literacy programs will have to be institutionalized. Literacy work should not be left to the vagaries and uncertainties of so-called partnerships. Policies as proclaimed will have to be operationalized as goals to be achieved, with clear targets for coverage according to a time schedule. These goals should be translated into curriculum to be delivered in particular contexts. If numeracy and essential life skills are part of the definition of literacy, these should actually be dealt with in the curriculum.
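One way to read the suggestion above about comparing "ratios of successes achieved across programs, languages, and regions" is as a simple normalization: attainment at each defined level of literacy is expressed as a proportion of enrolment, so that programs operating in different languages and contexts can be compared on a common scale, while the content of the levels themselves remains defined separately for each language system. The sketch below is illustrative only; the program names, levels and figures are invented.

# Hypothetical sketch: expressing attainment at defined literacy levels as
# ratios of enrolment, so programs in different languages and regions can be
# compared on a common scale. Names and figures are invented.

programs = {
    "Program A (Language X)": {"enrolled": 1000, "level_1": 700, "level_2": 350},
    "Program B (Language Y)": {"enrolled": 400,  "level_1": 300, "level_2": 90},
}

for name, p in programs.items():
    ratios = {level: p[level] / p["enrolled"] for level in ("level_1", "level_2")}
    print(name, {level: f"{ratio:.0%}" for level, ratio in ratios.items()})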

Another important antecedent to the construction of an M&E system is planning for implementation, and clarifying the means-and-ends calculus of goal attainment. Without the articulation of the embedded dynamics of such a system of action in regard to its inputs, processes of intervention, and intended outputs located in a particular context, it is impossible to identify the indicators that should be part of the monitoring process, or to develop appropriate evaluation agendas.

METHODS AND TECHNIQUES OF M&E

The methods and techniques of designing, pre-testing and administering instruments will have to be taught. Interview and focus group processes will have to be learned as well. Converting qualitative data into numbers, and/or writing narratives that capture the meanings embedded in numbers and narratives, will have to be taught. Finally, patterns of utilization of the information developed from data must be institutionalized.

A MONUMENTAL INITIATIVE IN CAPACITY BUILDING

The following tabulation suggests what needs to happen in M&E in literacy work during the next few years (a schematic sketch of the reporting flow it implies follows the tabulation):

___________________________________________________________

National Level

Developing clear and coherent literacy policies, as also integrated and interlinked policies and procedures for a Built-in M&E System.

Establishing protocols and statistical frames for use at all the various levels of the literacy promotion system, as well as for the collection of literacy data by national census authorities.

Conducting or commissioning research for operationalizing the concepts of multiple levels of literacy, and for establishing equivalencies between and among literacy programs from different social contexts and language communities (Bhola 1999).

Developing prototypes of curriculum for the teaching of literacy, numeracy and, most importantly, of "essential life skills" - - to build connections with the grand ultimate goal of poverty reduction - - for adaptation and adoption in a whole variety of contexts.

Conducting or commissioning capacity building schemes to undertake the training implied in the above and in the following, using a multiplier model.

Conducting or commissioning special evaluation studies on the utilization of literacy by new literates, using data already in the M&E system, with supplementation as necessary.

___________________________________________________________

Mid-Level

Conducting or commissioning action research to serve sub-national contexts, related to developing differentiated levels of literacy achievement, and preparing appropriate curricula and associated achievement tests to enable such differentiations.

Conducting or commissioning research related to developing equivalencies between and among programs across languages and social contexts.

Translating and adapting frameworks and instruments for data collection and data reporting.

Meeting all of the training needs of the districts and sub-districts within their jurisdiction by using multiplier models.

Conducting overall evaluations of the utilization and impacts of literacy learning on the lives of people within their own jurisdiction, and assisting in such studies undertaken by national authorities.

____________________________________________________________

District/Sub-district Level

Providing for all the training needs of facilitators in learner centers.

Collating data received from facilitators in the field and writing analytical reports using both numbers and the critical reflections made by facilitators.

Collecting data in the field to serve the needs of overall evaluations conducted or required by planners or policy makers at higher levels.

Conducting periodic monitoring visits to learner centers within their jurisdiction, and conducting feedback sessions with facilitators and learners.

Writing monthly reports to be sent up to the next level.

____________________________________________________________

Literacy Center Level

Maintaining attendance records.

Writing meaningful reflections in the Facilitator Logbook.

Conducting an end-of-the-month "sharing session" with learners in the group.

Writing monthly reports to be sent up to the next level.

_____________________________________________________________
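The reporting chain running through the tabulation above (literacy center to district/sub-district, to mid-level, to national level) is, in effect, the "Bottom-Up" flow of the Built-in M&E system described in Part I, with feedback travelling back down. The sketch below is schematic only: the level names, fields and figures are invented, and a real system would of course carry many more variables. It shows how monthly center reports might be collated upward into district figures and a national summary, with that summary fed back to each district as the "Top-Down" perspective.

# Schematic sketch of the bottom-up/top-down flow of a Built-in M&E system:
# monthly center reports are collated at each level, and a national summary
# is fed back down. Level names, fields and figures are invented.

from collections import defaultdict

center_reports = [
    {"district": "District A", "attendance": 22, "sessions": 16},
    {"district": "District A", "attendance": 18, "sessions": 14},
    {"district": "District B", "attendance": 25, "sessions": 16},
]

# Bottom-up: collate center reports into district totals.
district_totals = defaultdict(lambda: {"centers": 0, "attendance": 0, "sessions": 0})
for report in center_reports:
    d = district_totals[report["district"]]
    d["centers"] += 1
    d["attendance"] += report["attendance"]
    d["sessions"] += report["sessions"]

# Collate all center reports into a national summary.
national_average = sum(r["attendance"] for r in center_reports) / len(center_reports)

# Top-down: feedback returned to each district with the larger perspective.
for name, d in district_totals.items():
    print(f"{name}: {d['centers']} centers, "
          f"average attendance {d['attendance'] / d['centers']:.1f} "
          f"(national average {national_average:.1f})")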

Fortunately, some examples of capacity building for M&E of adult literacy/adult education initiatives are already available. A UIE-sponsored, five-day international seminar-workshop focussed directly on building M&E systems in adult education that would be truly gender sensitive (Medel-Anonuevo 1999). The workshop assumed a systems perspective with its four parameters of (i) inputs, (ii) processes, and (iii) outputs/outcomes within a (iv) context. Within this systems perspective, the workshop insisted that M&E go beyond counting and expanding enrollments and participation in income-generation activities - - which could indeed “school the minds” of women for continued subjugation by learning coping mechanisms and accommodations with existing discriminations. M&E specialists, they suggested, should focus on existing power relations and write indicators that go beyond number gaps, measuring instead changes in women's participation and in the levels of gender discrimination and gender oppression (Medel-Anonuevo 1999).

An on-going IIZ/DVV project for the promotion of adult education in Guinea (Conakry), West Africa, puts considerable emphasis on capacity building in general, and in M&E approaches in particular, among facilitators and first-line managers. The key concepts are participation, self-evaluation, and measuring progress by engaging in “outcome mapping” (IIZ/DVV Documents, 2004; based on a summary prepared by Christine Glanz of UIE).

IV MERITS AND BENEFITS OF MONITORING AND EVALUATION: CHANGING LIVES; TRANSFORMING SOCIETIES

Capacity building for policy development, planning, implementation, and for the construction of built-in monitoring and evaluation systems will need resources in the short run, but in the long run the developing nations of the world will reap benefits that will transform their economic, social and political realities.

In-Built Monitoring and Evaluation in literacy programs has inherent merits and multiple benefits: for individual learners as persons and as citizens, and for their facilitators, enabling them to grow as persons and professionals, and perhaps also as activists on behalf of their communities. What is personal development from the individual perspective is, of course, Human Resource Development from the institutional perspective. Insisting on data collection by each participant in the program, and on reflection on what is learned, creates intelligent organizations with renewed cultures of institutions (Senge 1990; Bhola 1995a). At higher levels of planning and policy making, it is possible to establish informed norms of accountability and to make more sensible studies of cost-effectiveness.

Finally, Built-in M&E can act as a strategy for social reform and cultural renewal. Without it, the United Nations/UNESCO hope of putting literacy in the service of sustainable development and human freedom in general is nearly impossible to realize. The role of a good M&E structure can be stretched to include the promotion of a creative dialectic between traditional and modern knowledge and cultures. As people begin to contend with information they themselves have produced on the basis of their own personal knowledge, and have consolidated by methods that make sense to them, they are going to put this newly learned information by the side of their traditional knowledge and beliefs whenever it is possible. Thereby, they will be able to renew their traditions as they contend with modernization, which has become a global phenomenon (Bhola 2002b). Last, but not the least, Built-in M&E systems can be a bonanza for researchers, given what they could do with the data, both quantitative and qualitative, available in M&E systems, from the local to the global.

_______________________________________________________________

APPROACHES TO MONITORING AND EVALUATION OF LITERACY PROGRAMMES: NOTES / REFERENCES / BIBLIOGRAPHY

NOTES

[1] In actual practice, both in the developed and the developing world, adult literacy programs become adult education programs as the emphasis shifts from the teaching of literacy skills to the teaching of substantive knowledge, values and skills for citizenship and livelihoods. On the other hand, programs conceptualized as adult education discover the need for teaching or refreshing the literacy skills of learners. Understandably, monitoring and evaluation of adult literacy and adult education programs are not always possible to separate in theory or practice. Indeed, adult literacy statistics are often used as proxies to monitor progress in adult education.

[2] The conceptualization and testing of the “Framework for Built-in Monitoring and Evaluation”, and of the associated “Action Training Model”, were possible within the context of a series of training workshops in several English-speaking countries of Sub-Saharan Africa from the mid-1970s to the late 1980s, with the support of the UNESCO Institute for Education and the German Foundation for International Development (DSE), as also of GTZ and SIDA.

REFERENCES AND BIBLIOGRAPHY

Alkin, M.C. and S. Taut. 2002. Unbundling evaluation use. Studies in Educational Evaluation, 29(1): 1-12.

Archer, David and Sara Cottingham. 1996. The REFLECT Mother Manual. London: Actionaid.

Bhola, H.S. 1973. Making evaluation operational in functional literacy programs. Literacy Discussion (Quarterly Journal of the Ex-UNESCO International Institute for Adult Literacy Methods, Tehran, Iran), 4(4): 457-491. [Translated into French and Spanish.]

Bhola, H.S. 1983. Action Training Model: An Innovative Approach to Training of Literacy Workers. (Unit of Cooperation with UNICEF and Women’s Program. Notes and Comments Series # 128). Paris: UNESCO.

Bhola, H.S. 1984. Building a Built-in Evaluation System (in Botswana): A Case in Point. Paper presented to the Joint Meeting of the Evaluation Research Society and the Evaluation Network [now American Evaluation Association], San Francisco, CA, October 10-13, 1984. ERIC Clearinghouse Document No. ED 256 779.

Bhola, H.S. 1989a. Literacy as a social process; literacy as a social intervention. ASPBAE Courier (Journal of the Asian South Pacific Bureau of Adult Education. Special Issue to Commemorate International Literacy Year 1990), 47: 6-14.

Bhola, H.S. 1989b. Training evaluators in the Third World: Implementation of the Action Training Model (ATM) in Kenya. Evaluation and Program Planning (A Pergamon Journal), 12(3): 249-288.

Bhola, H.S. 1990. Management information systems (MIS). (Part III, Chapters 5-8, pp. 75-152). In: H.S. Bhola. Evaluating "Literacy for Development": Projects, Programs, and Campaigns. Hamburg: UNESCO Institute of Education / Bonn: German Foundation for International Development. (Also in French, Spanish, Arabic, and Persian.)

Bhola, H.S. 1994. A Source Book for Literacy Work: Perspectives from the Grassroots. Paris: UNESCO / London: Jessica Kingsley.

Bhola, H.S. 1995a. Informed decisions within a culture of information: Updating a model of information development and evaluation. Adult Education and Development, 44: 75-84.

Bhola, H.S. 1995b. A Policy Analysis and Program Evaluation: National Literacy Program in Namibia. Windhoek, Namibia: Directorate of Adult Basic Education, Ministry of Education and Culture, Government of the Republic of Namibia. (Sponsored by SIDA and UNICEF (United Nations Children's Fund).)

Bhola, H.S. 1997. Literacy. In: John Feathers and Paul Sturges (eds.). International Encyclopedia of Information and Library Sciences (p. 277-280). London/New York: Routledge.

Bhola, H.S. 1998a. Constructivist capacity building and systemic evaluation of adult basic education and training in South Africa: One discourse, two scripts. Evaluation, 4: 1-22.

Bhola, H.S. 1998b. Program evaluation for program renewal: A study of the National Literacy Program in Namibia (NLPN). Studies in Educational Evaluation, 24(4): 303-330.

Bhola, H.S. 1998c. They are learning and discerning: Evaluation of an adult education project of the National Literacy Co-operation of South Africa. Studies in Educational Evaluation, 24(2): 153-177.

Bhola, H.S. 1999. Equivalent curriculum development as situated discourse: A case in the context of adult education in Namibia. Curriculum Inquiry, 29(4): 459-484.

Bhola, H.S. 2000a. A discourse on impact evaluation: A model and its application to a literacy intervention in Ghana. Evaluation: The International Journal of Theory, Research and Practice, 6(2): 161-177.

Bhola, H.S. 2000b. Developing discourses on evaluation. In: Daniel L. Stufflebeam, George F. Madaus, and Thomas Kellaghan (eds.). Evaluation Models: Viewpoints on Educational and Human Service Evaluation (p. 383-394). Second Edition. Boston/Dordrecht: Kluwer Academic Publishers.

Bhola, H.S. 2002a. A discourse on educational leadership: Global themes, postmodern perspectives. Studies in Philosophy and Education (A Kluwer Journal. Special Issue, edited by Louis F. Miron and Gert J.J. Biesta), 21(2): 181-202.

Bhola, H.S. 2002b. Reclaiming old heritage for proclaiming future history: The knowledge-for-development debate in African contexts. Africa Today, 49(3): 3-21.

Bhola, H.S. 2003a. Introduction [to the section on Social and Cultural Contexts of Educational Evaluation]. In: Thomas Kellaghan, Dan L. Stufflebeam, and Lori Wingate (eds.). International Handbook on Educational Evaluation (p. 383-390). Dordrecht: Kluwer Academic Publishers.

Bhola, H.S. 2003b. A model and method of evaluative accounts: Development impact of the National Literacy Mission (NLM) of India. Studies in Educational Evaluation, 28: 273-296. [A Pergamon/Elsevier Journal.]

Bhola, H.S. 2003c. Monitoring and Evaluation of Adult Education and Adult Learning. Paper prepared for use at the Thematic Workshop on Monitoring and Evaluation of Adult Learning, September 7, 2003, organized by the UNESCO Institute of Education and the UNESCO Institute for Statistics, and held at the CONFINTEA V Mid-Term Review Conference, September 6-11, 2003, Bangkok, Thailand.

Bhola, H.S. 2003d. Social and cultural contexts of educational evaluation: A global perspective. In: Thomas Kellaghan, Dan L. Stufflebeam, and Lori Wingate (eds.). International Handbook on Educational Evaluation (p. 391-409). Dordrecht: Kluwer Academic Publishers.

Bhola, H.S. 2003e. Papers and proposals submitted for consideration to the AE/NFE Unit, Department of Education, Government of Tanzania, as part of a UNESCO-sponsored consultation on constructing a comprehensive and coherent model of capacity building for Adult Education/Nonformal Education (AE/NFE) in Tanzania, including: (1) Guide for a Nationwide Needs Assessment for Capacity Acquisition and Capacity Building in AE/NFE Sector; (2) A Manual on Theory and Practice of AE/NFE System for Use with and by the Trainers of Trainers in a Cascading Model of National Capacity Building; and (3) A Guidelines Document for Use in the Delivery of Training as Adapted to Various Groups and Levels of Professionals and Providers within the AE/NFE System. October 13 - November 16, 2003.

Bhola, H.S. 2004a. Adult education for poverty reduction: Political economy analysis in systems theory perspective. Adult Education and Development (IIZ/DVV Journal), 62: 13-24.

Bhola, H.S. 2004b. Educational Evaluation - - A Discussion Paper, for the lead session on Evaluation of the International Expert Meeting organized by the Division of Educational Policies and Strategies, Section for Educational Policy Analysis and Evaluation, UNESCO, during 19-23 April 2004, UNESCO House, Paris.

Bhola, H.S. 2004c. Policy implementation: Planning and actualization. Journal of Educational Planning and Administration, XVIII(3): 295-312.

Bhola, H.S. 2004d. Reflections that Compel Actions: The International Meeting of Experts on Educational Policy Analysis and Educational Evaluation, UNESCO, Paris, 19-23 April 2004. 40 pages.

Chapman, D. and L. Mahlck. 1993. From Data to Action: Information Systems in Educational Planning. Paris: UNESCO.

Chisholm, Linda et al. 2000. A South African Curriculum for the Twenty First Century: Report of the Review Committee on Curriculum 2005. Pretoria: Department of Education. http://education.pwv.gov.za./policies_reports/reports_2000//2005.htm

Durham, Meenakshi Gigi. 1998. On the relevance of standpoint epistemology to the practice of journalism: The case of “strong objectivity”. Communication Theory, 8(2): 117-140.

Elmore, R. 1996. Getting to scale with good educational practice. Harvard Educational Review, 6(1): 1-26.

Fisk, Edward. 2000. EFA: Status and Trends - - Assessing Learning Achievement. Paris: UNESCO.

Harding, Sandra G. 1998. Is Science Multicultural? Postcolonialism, Feminism, and Epistemology. Bloomington, IN: Indiana University Press.

India, Government of. 1979. Monitoring the National Adult Education Programme. Report of the All India Seminar, August 28-31, 1978. New Delhi: Directorate of Adult Education and Council for Social Development.

Inter-agency Commission (UNDP, UNESCO, UNICEF, World Bank). 1990. World Conference on Education for All: Meeting Basic Learning Needs, 5-9 March 1990, Jomtien, Thailand. Final Report. New York: Inter-agency Commission.

International Literacy Institute (ILI)/UNESCO/UIE/UIS. 2002. Analytical Review of Four LAP Country Case Studies. Literacy Assessment Practices (LAP) in Selected Developing Countries. Philadelphia: ILI.

Jansen, Jonathan D. 2004. Importing outcome-based education into South Africa: Policy borrowing in a post-communist world. In: David Phillips and Kimberly Ochs (eds.). Educational Policy Borrowing: Historical Perspectives (p. 199-220). Oxford: Symposium Books.

Kellaghan, Thomas, Dan L. Stufflebeam, and Lori Wingate (eds.). 2003. International Handbook on Educational Evaluation. Dordrecht: Kluwer Academic Publishers.

Langer, Susanne K. 1942. Philosophy in a New Key: A Study in the Symbolism of Reason, Rite and Art. Cambridge: Harvard University Press.

Lind, Agneta. 1996. "Free to Speak Up": Overall Evaluation of the National Literacy Program in Namibia. Windhoek, Namibia: Directorate of Adult Basic Education, Ministry of Basic Education and Culture, Government of Namibia. (Sponsored by SIDA (Swedish International Development Authority) and UNICEF (United Nations Children's Fund).)

Literacy Exchange: World Resources on Literacy - - Brazil. <http://www.rrz.uni-hamburg.de/UNESCO-UIE/literacyexchange/brazil/brazildata/htm>

Literacy Exchange: World Resources on Literacy - - India. <http://www.rrz.uni-hamburg.de/UNESCO-UIE/literacyexchange/india/indiadata/htm>

Literacy Exchange: World Resources on Literacy - - Mexico. <http://www.rrz.uni-hamburg.de/UNESCO-IE/literacyexchange/mexico/mexicodata/hm>

Literacy Exchange: World Resources on Literacy - - South Africa. <http://www.rrz.uni-hamburg.de/UNESCO-UIE/literacyexchange/southafrica/southafricadata/htm>

Lonsdale, Michele, and Doug McCurry. 2004. Literacy in the New Millennium. Adelaide, Australia: National Centre for Vocational Education Research (NCVER).

Mathison, Sandra (ed.). 2005. Encyclopedia of Evaluation. Thousand Oaks, CA: Sage Publications. (Sponsored by the American Evaluation Association.)

Medel-Anonuevo, Carolyn. 1999. Breaking Through: Engendering Monitoring and Evaluation in Adult Education. Hamburg: UNESCO Institute of Education.

Moreno-Cedillos, Alicia (of CREFAL, Mexico). 2002. Mexico Case Study. Literacy Assessment Practices (LAP) in Selected Developing Countries. Discussion document prepared for ILI/UNESCO. LAP 2nd Experts' Meeting at UNESCO, Paris, 7-8 March 2002. ILI/UNESCO.

NIEPA (National Institute of Educational Planning and Administration, Government of India). 1985. Towards Restructuring Indian Education: Citizens' Perception. New Delhi: The Institute.

Okech, Anthony, Roy A. Carr-Hill, Anne R. Katahoire, Teresa Kakooza, and Alice N. Ndidde. 1999. Report of the Functional Adult Literacy Programme in Uganda. Kampala: Ministry of Gender, Labour and Social Development in collaboration with the World Bank Mission in Uganda.

Ott, J.S. 1989. Organizational Culture Perspectives. Chicago: Dorsey Press.

Oxenham, John, Abdoul Hamid Diallo, Anne Ruhweza Katahoire, Anne Petkova-Mwangi, and Oumar Sall. 2002. Skills and Literacy Training for Better Livelihoods: A Review of Approaches and Experiences. Washington, D.C.: The World Bank, Africa Region.

Patton, Michael. 2000. Utilization-Focussed Evaluation. In: Daniel L. Stufflebeam, George F. Madaus, and Thomas Kellaghan (eds.). Evaluation Models: Viewpoints on Educational and Human Services Evaluation (p. 425-438). Second Edition. Boston/Dordrecht: Kluwer Academic Publishers.

Philippines, Regarding the. http://www.unesco.org/education/literacy_2000/prizes.html. PowerPoint files received from [email protected] on February 24, 2005.

Rahman, Ehsanur. 1999. Monitoring participation in literacy and continuing education: The Dhaka Ahsania Mission experience. In: Medel-Anonuevo, Carolyn (ed.). Breaking Through: Engendering Monitoring and Evaluation in Adult Education (p. 43-56). Hamburg: UNESCO Institute of Education.

Rao, I.V. Subba. 2002. A new approach to literacy assessment in India. In: Madhu Singh (ed.). Institutionalizing Lifelong Learning: Creating Conducive Environments for Adult Learning in the Asian Context (p. 270-291). Hamburg: UNESCO Institute for Education.

Schwandt, Thomas A. 2005. The centrality of practice in evaluation. American Journal of Evaluation, 26(1): 95-105.

Senge, P.M. 1990. The Fifth Discipline: The Art and Science of Organizational Learning. New York: The Falmer Press.

Singh, Madhu (ed.). 2002. Institutionalizing Lifelong Learning: Creating Conducive Environments for Adult Learning in the Asian Context. Hamburg: UNESCO Institute for Education.

Thomas, V.G. and F.I. Stevens (eds.). 2004. Co-constructing a contextually responsive evaluation framework: The Talent Development Model of School Reform. [Special Issue] New Directions for Evaluation, 101. San Francisco: Jossey-Bass.

Trochim, W.M.K. 1989. An introduction to concept mapping for planning and evaluation. Evaluation and Program Planning, 12(1): 1-16.

UNESCO. 1976. The Experimental World Literacy Programme: A Critical Assessment. Paris: UNESCO/UNDP.

UNESCO. 1986. UNESCO's Standard-Setting Instruments, V3, B4. Paris: UNESCO. [Including materials from the UNESCO General Conference, 20th Session, Paris, 1978. Also including Revised Recommendations Concerning the International Standardization of Educational Statistics, adopted by the General Conference at its 20th session, Paris, 27 November 1978.]

UNESCO. 1996. Education for All: Achieving the Goal. Working Documents for the Mid-Decade Meeting of the International Consultative Forum, June 16-19, 1996, Amman, Jordan. Paris: UNESCO. (The International Consultative Forum is an alliance of UNESCO, UNICEF, UNDP, and the World Bank.)

UNESCO Institute for Education. 1997a. Fifth International Conference on Adult Education: A Key to the 21st Century. Hamburg, 14-18 July 1997. Paris: UNESCO.

UNESCO Institute for Education. 1997b. Adult Education. The Hamburg Declaration. The Agenda for the Future. (Fifth International Conference on Adult Education, 14-18 July 1997.) Hamburg: UIE.

UNESCO. 2000. Final Report. World Education Forum. The Dakar Framework for Action, Education for All: Meeting Our Collective Commitments, adopted by the World Education Forum, Dakar, Senegal, 26-28 April 2000. Paris: UNESCO.

UNESCO. 2002. EFA Global Monitoring Report 2002. Education for All: Is the World on Track? Paris: UNESCO.

UNESCO. 2003. EFA Global Monitoring Report 2003/4. Gender and Education for All: The Leap to Equality. Paris: UNESCO.

UNESCO. 2005. EFA Global Monitoring Report 2005. Education for All: The Quality Imperative. Paris: UNESCO.

UNESCO. 2006. EFA Global Monitoring Report 2006. Special Theme: Literacy. Paris: UNESCO.

United Nations. 2000. Literacy for All. A United Nations Literacy Decade Discussion Paper. <http://lacnet.unicttaskforce.org/Docs/Literacy%20for%20all_Discussion%20paper.pdf>