Skeleton Curriculum as a Tool for Facilitating Constructivist Learning Online

David A. Ireland
Heriot-Watt University, Edinburgh, UK

Abstract

This paper outlines the development of a shared knowledge-constructing method for tutors to facilitate constructivist learning online. A ‘skeleton curriculum’ tool is proposed as a means of realizing this method, and an initial evaluation is detailed that sought to determine (a) how students rated the ease of use of this tool compared to a discussion board tool, and (b) how tutors rated the quality of student work produced when using it compared to when using a discussion board. Students’ and tutors’ tool preferences were also determined. Results indicate that tutors thought the quality of student work was slightly better when the skeleton curriculum was used, and that as tutors’ ratings for the quality of student work with the skeleton curriculum tended towards being better than with a discussion board, students’ tool preference also tended towards the skeleton curriculum. This initial evaluation has provided rapid, cost-effective feedback for use in the further development of the skeleton curriculum tool for research purposes.

Introduction

Learning online is fast becoming an accepted form of distance education throughout the world (Phillips, 2004). This global acceptance has for the most part been fostered by the progressive development of more advanced technologies — technologies that are better able to facilitate learning. Facilitating traditional, face-to-face learning is a complex task, as it requires an in-depth knowledge of the theoretical principles underpinning the process of learning and an appreciation for how these principles apply to real-world situations. This task is more complex online, not necessarily because of any significant change in theory, but in part because of the additional requirement to have a working knowledge of how to use the associated, technology-based facilitation tools (e-learning tools). The research described here sets out to develop a relatively simple method for tutors to facilitate constructivist learning online. A research context and theoretical background are provided. The development of a prototype e-learning tool is briefly described and an initial evaluation is detailed. Conclusions and a brief discussion are included.

A Research Context

Heriot-Watt is an established UK university, internationally renowned for its business and management programs. As with many other such universities, Heriot-Watt has been proactive in developing these programs for the online learning market. The Management Programme (MP), which was originally developed to facilitate learning via traditional, face-to-face tutoring (provided by international, approved support centres), is one that has been developed to also facilitate learning via online, e-learning tools. MP development is based on an iterative design cycle whereby e-learning tools are first implemented according to in-house instructional design principles and then modified according to feedback obtained from MP students, tutors, etc. The development of the overall learning facilitation method and the evaluation of the prototype e-learning tool described below are based on the analysis of such feedback (Ireland, 2002; Ireland, Graves, & Hare, 2003).

Theoretical Background

Constructivist Learning Theory

Facilitating learning requires an in-depth knowledge of the theoretical principles underpinning the process of learning. As the learning process can be viewed from differing psychological perspectives, however, multiple and possibly conflicting theoretical principles exist. MP learning is viewed from the perspectives of cognitive and social psychology (see Bandura, 1970; Cronbach, 1954; Fontana, 1988; Jarvis, 1987; Kolb & Fry, 1975; Phillips & Soltis, 1998). Learning is, therefore, defined as a change in internal cognitive structures that is shaped by an individual’s interpretation of, and subsequent action on, all of the information, objects, persons, groups, etc., in his/her environment. In addition, the MP learning process is underpinned by the theoretical principles associated with a moderate approach to social constructivism. Constructivism posits that knowledge is actively constructed by the individual (Hendry, 1996; Mayer, 1992) while striving to make sense of the world on the basis of personal filters: experiences, goals, curiosities and beliefs (Cole, 1992). Social constructivism posits that reality is a constructive process embedded in socio-cultural practices (Duffy & Cunningham, 1996) and that knowledge is viable in social as well as personal contexts (Tobin & Tippings, 1993). In terms of learning, the theoretical principles associated with social constructivism suggest that learners make and test tentative interpretations of new experiences until satisfied (Perkins, 1991), and that they construct meaning through social interactions within the groups to which they belong (von Glasserfeld, 1995; Willis, 1998). Whereas an extreme approach to social constructivism is narrow and limited to certain kinds of learning situations (Merrill, 1991), a moderate approach can be generic enough to be relevant to a wide variety of learning situations (Wilson, 1997). For example, Merrill (1992) suggests that learning can be active yet not always collaborative, and that testing can be integrated and consistent with pre-specified learning content and objectives yet also include separate assessment of achievement. Similarly, Mergel (1998) suggests that instruction can be pre-determined, sequential, and criterion-referenced for introductory learning and yet more constructivist for advanced knowledge acquisition.

Constructivist Instructional Design

Facilitating learning also requires an appreciation for how the theoretical principles underpinning the process of learning apply to real-world situations. How constructivist principles, in particular, influence instructional design has been discussed by a number of theorists (see Jonassen, 1994; Lebow, 1993; Willis, 1995) who propose a type of constructivist instructional design model. Jonassen (1999) goes a step further and proposes a model for designing constructivist learning environments. The core constructivist principles adhered to by the in-house MP instructional designers were that learning is an active process (Spiro, Feltovich, Jacobson, & Coulson, 1991) involving engagement and experimentation (Squires, 1999) that occurs most effectively in context (Jonassen, 1991). Accordingly, the MP online learning environment was developed to promote scaffolding and coaching of knowledge (Conway, 1997; Hannafin, Hannafin, Land, & Oliver, 1997) and encourage open interpretations and explanations (Cobb, 1994). Scaffolding and coaching of knowledge was promoted using asynchronous conferencing methods, i.e., discussion board tools, which allowed students to interact with their tutors and peers. Open interpretations and explanations were encouraged using Internet relay chat methods, i.e., chat room tools, which allowed students to interact with their peers. By using these learning facilitation methods and their associated e-learning tools, it was expected that MP students would actively engage in experimentation via discussions with their tutors and peers, and gain an authentic context via chatting with their peers. Analysis of MP student feedback, however, has provided evidence to suggest that these methods have not been as successful at promoting active, authentic learning as originally hoped. Specifically, MP students used neither the discussion board tools to engage in experimentation nor the chat room tools to gain an authentic context (Ireland, 2002). Further investigation into possible reasons for this has provided evidence to suggest that certain student groups require more direct and directed support from their tutors in order to engage in experimentation (Ireland et al., 2003) and, as a result, acquire context from them instead of from their peers.

Results from this further investigation seem to be in agreement with Merrill’s view that learning can be active yet not always collaborative, since sometimes individual learning is more effective (Merrill, 1992). The results also seem to be in agreement with Perkins’ (1999) view that unlimited student control of learning creates problems of accountability, and O’Donnell’s (2000) view that not all learners benefit from having almost unlimited control over their own learning.

Development and Initial Evaluation

Learning Facilitation Method Development

Overall, MP feedback has indicated a need for its online learning facilitation methods to become more explicit and thus more controlled. A corollary of this need is that current MP e-learning tools will have to be modified. To understand how this need can be accommodated, a brief description is required of how the MP’s current and proposed methods function to facilitate learning.

Current Methods

The asynchronous conferencing methods employed by the MP provide a virtual location where registered students can go to access information about their specific course topics. Tutor-initiated topics and associated information resources can be reviewed and subsequently commented on by students. These comments are open for all to see and so can be accessed and subsequently commented on again by other students and the tutor. The comment and review process is unrestricted, and so it can either continue until the topics have been exhausted or remain stagnant, leaving the topics untouched. In addition, the pace at which the process progresses can be either rapid or sluggish, depending on the rate at which students choose to comment. Theoretically, providing students with such an open and unrestricted method should facilitate their learning by allowing them to take control and direct collaborative construction of meaning. In reality, however, MP students have been unable either to collaborate (due to a lack of student contributions) or to manage collaboration (due to a high number and rapid rate of student contributions). At this point, it is important to note that contributions from MP tutors are usually made only in response to questions directed at them; otherwise they leave the students to explore the topics. Research literature suggests that, in fact, effective conferencing is brought about via tutors who moderate as discussion leader and group facilitator (Eastmond, 1995; McConnell, 1994; Paulsen, 1995). However, the MP has been developed to facilitate learning online in addition to its traditional face-to-face delivery, i.e., it is not yet a purely online programme, and so its tutors are responsible for undertaking other, varied tasks.

Proposed Methods

In order to accommodate MP student needs without any significant increase in tutor task load, an alternative method for facilitating constructivist learning online was needed. This alternative method had to provide tutors with a more explicit means of controlling student experimentation while requiring as little additional knowledge as possible. According to Jonassen’s model for designing constructivist learning environments, one way to control student experimentation is through a shared knowledge modeling/building method (Jonassen, 1999). This method facilitates learning by posing a common problem or question to individual students, who then share their solutions and answers after these have been constructed and submitted. The obvious advantage of this method for the MP is that it accommodates the need for students to have more directed learning activities, and does so in such a way as to reduce the potential for interference from other students’ activity (or lack thereof). As this method functions in much the same way as the conferencing method does, it would benefit MP tutors via task familiarity. For example, tutors could initiate with a problem or question in much the same way as they do with a topic, and the students could begin to understand the problem/question by accessing the very same supporting information resources that are already available for the conferencing method. In addition, students could contact tutors directly, as they can in the conferencing method. The key difference between the two methods is that topic information is co-constructed in the conferencing method, whereas solutions or answers are first individually constructed and then shared in the modeling/building method. It is this difference in construction that provides the basis for modification of the current MP e-learning tools.
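To make the contrast concrete, the fragment below is a minimal, hypothetical sketch (in Java, the language of the research prototype) of the one rule that separates the two methods: when a contribution becomes visible to other students. None of the class or method names below belong to the MP software; they are illustrative assumptions only.

    // Hypothetical sketch: the visibility rule that distinguishes the two methods.
    enum FacilitationMethod { CONFERENCING, MODELING_BUILDING }

    class Contribution {
        final String studentId;
        final String text;
        boolean releasedByTutor = false;   // only meaningful for MODELING_BUILDING

        Contribution(String studentId, String text) {
            this.studentId = studentId;
            this.text = text;
        }

        // Can a peer (another student) read this contribution?
        boolean visibleToPeers(FacilitationMethod method) {
            if (method == FacilitationMethod.CONFERENCING) {
                return true;                // comments are open for all to see as soon as posted
            }
            return releasedByTutor;         // shared only after individual construction and tutor review
        }
    }

Under the conferencing method every comment is immediately open to co-construction; under the modeling/building method the same content is constructed individually and only becomes visible to peers once the tutor releases it.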

E-Learning Tool Prototyping

An Overview

The modified e-learning tool, which can be seen in Figures 1–3, is a type of knowledge constructor and shall hereafter be referred to as a Skeleton Curriculum. The skeletal metaphor was chosen because only the structure of an MP curriculum is provided; ‘fleshing out’ the content is the responsibility of the student(s).

Figure 1. Skeleton Curriculum: Beginning Window View

Physically, a skeleton curriculum is an online interface that displays a common problem or question together with a list of the key concepts deemed to constitute the knowledge base needed to solve the problem or answer the question. The interface allows the individual student to interact directly with the list of concepts and so provides functions to enter, submit, view, and edit textual content.

Figure 2. Skeleton Curriculum: View of the Information Resources Window

The expectation is that individual students will review the available information resources and begin to write their own curriculum content. In doing so, it is hoped, students will construct meaning and come to understand enough about the key concepts to then provide a solution or answer. The role of the tutor is primarily to assess the quality of students’ solutions or answers and subsequently provide feedback. As the tutors have full access to the content provided by each student, they can address the concepts least understood by that student and so suggest possible avenues for improving solutions or answers. In addition, the tutor can make content submitted by any student available to any other student, and so knowledge is shared as well as constructed.

Figure 3. Skeleton Curriculum: View of a Key Concept Window
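Read together with Figures 1–3, the description above implies a simple data model: a common question, an ordered list of key concepts, per-student text for each concept, and tutor feedback plus a sharing decision. The following Java sketch is a hypothetical reconstruction of that model only; the class, field, and method names are assumptions for illustration and are not taken from the prototype’s source code.

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of the skeleton curriculum data model.
    class SkeletonCurriculum {
        final String question;                    // the common problem or question
        final List<String> keyConcepts;           // five to seven key concepts in this evaluation

        // concept -> (studentId -> entry): each student 'fleshes out' each concept individually
        final Map<String, Map<String, Entry>> entries = new LinkedHashMap<>();

        SkeletonCurriculum(String question, List<String> keyConcepts) {
            this.question = question;
            this.keyConcepts = keyConcepts;
            for (String concept : keyConcepts) entries.put(concept, new LinkedHashMap<>());
        }

        // Student enters or edits draft text for one key concept.
        void enterOrEdit(String concept, String studentId, String text) {
            entries.get(concept).computeIfAbsent(studentId, id -> new Entry()).text = text;
        }

        // Student submits the draft for tutor review.
        void submit(String concept, String studentId) {
            entries.get(concept).get(studentId).submitted = true;
        }

        // Tutor attaches feedback and, optionally, shares the entry with other students.
        void review(String concept, String studentId, String feedback, boolean share) {
            Entry entry = entries.get(concept).get(studentId);
            entry.tutorFeedback = feedback;
            entry.sharedWithClass = share;
        }
    }

    class Entry {
        String text = "";
        boolean submitted = false;
        boolean sharedWithClass = false;
        String tutorFeedback = null;
    }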

Technical Architecture in Brief

The evaluated skeleton curriculum is only a prototype and so has been implemented in Java for research purposes. However, the technical architecture needed to actually implement a skeleton curriculum is no more advanced than existing web technologies; only a server with, say, a MySQL database configured to be accessed via some scripting mechanism, such as Perl or PHP, is needed. Obviously tutors and students need not know anything about this architecture, as they interact directly with a skeleton curriculum via the web at the browser interface (interested readers should see Ireland, 2004). What is important is that no additional costs would be incurred by the MP, as this technical architecture already exists and is currently being used to implement their discussion boards.
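To illustrate how modest the server-side requirements are, the fragment below sketches persistence for one student entry using standard JDBC against a MySQL database. It is a hedged example only: the table name, column names, connection details, and example values are hypothetical, and a production deployment would more likely use the existing Perl or PHP scripting layer than Java (Java is used here simply to keep a single language across the examples in this paper).

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class SkeletonCurriculumStore {
        // Hypothetical schema: one row per (question, concept, student) entry.
        private static final String CREATE_TABLE =
            "CREATE TABLE IF NOT EXISTS sc_entry (" +
            "  question_id INT NOT NULL," +
            "  concept     VARCHAR(255) NOT NULL," +
            "  student_id  VARCHAR(64)  NOT NULL," +
            "  body        TEXT," +
            "  submitted   BOOLEAN DEFAULT FALSE," +
            "  PRIMARY KEY (question_id, concept, student_id))";

        public static void main(String[] args) throws Exception {
            // Placeholder connection details; requires the MySQL Connector/J driver on the classpath.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/mp", "mp_user", "secret")) {
                con.createStatement().execute(CREATE_TABLE);

                // Insert or overwrite one student's draft for one key concept (MySQL REPLACE syntax).
                String sql = "REPLACE INTO sc_entry (question_id, concept, student_id, body) "
                           + "VALUES (?, ?, ?, ?)";
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setInt(1, 1);
                    ps.setString(2, "Example key concept");
                    ps.setString(3, "student-042");
                    ps.setString(4, "Draft curriculum content written by the student...");
                    ps.executeUpdate();
                }
            }
        }
    }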

Evaluation Methodology

Experimental Design

Five Management tutors with experience using discussion boards to facilitate constructivist learning were trained to use the prototype skeleton curriculum and asked to employ it as a replacement for their usual discussion boards in one of their classes. The choice of class was left to the discretion of the tutor but depended upon the criteria that students within the class had to have (a) experience using discussion boards, and (b) been previously assessed by the tutor with respect to using discussion boards to undertake course work. A result of having such criteria is that the implemented skeleton curricula were associated with various management topics. Also, as only limited time and resources were available for the evaluation, tutors were required to implement just one question with between five and seven key concepts (not a full-scale curriculum with many questions). Based on the tutors’ choices, 58 students were made available for the evaluation. These students were also trained to use the prototype skeleton curriculum prior to using it in class.

Methods

Upon completion of the classes, both the tutors and the students were asked to rate, on a Likert-type scale (Likert, 1932) of 1 through 5, the ease of use of the skeleton curriculum compared to a discussion board (from previous experience), where 1 = much harder than a discussion board, 3 = neither harder nor easier, and 5 = much easier than a discussion board. The tutors were then asked to rate, in the same way, the quality of each student’s work produced when using the skeleton curriculum compared to when using a discussion board (in the past), where 1 = much worse than when using a discussion board, 3 = the same for both, and 5 = much better than when using a discussion board. Finally, both the tutors and the students were asked to choose which of the two tools they preferred using, a binary choice where 0 = discussion board and 1 = skeleton curriculum.

Results and Analysis

Table 1 shows students’ choice between the skeleton curriculum and a discussion board.

Table 1. Students’ Choice of Tool

Choice          Discussion Board    Skeleton Curriculum
Observed N      25                  33
Total N = 58

Students’ ratings for the ease of use of skeleton curriculum compared to a discussion board can be found in Table 2.

Table 2. Students’ Ease of Use Ratings

Rating          1     2     3     4     5
Observed N      4     20    19    15    0
Total N = 58; µ = 2.78; σ = 0.918

Tutors’ ratings for the quality of student work when using skeleton curriculum compared to when using a discussion board can be found in Table 3.

Table 3. Tutors’ Quality of Student Work Ratings

Rating          1     2     3     4     5
Observed N      0     4     17    22    15
Total N = 58; µ = 3.83; σ = 0.901
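The reported means and standard deviations can be recovered directly from the frequency counts in Tables 2 and 3, and the one-sample t statistics used in the analysis below follow from them. The short program below is a minimal sketch of that arithmetic, assuming (as the reported values imply) that σ is the sample standard deviation and that the comparison value for the t-tests is the scale mid-point of 3.

    // Recomputes the summary statistics in Tables 2 and 3 and the one-sample t statistics.
    public class SummaryStats {
        public static void main(String[] args) {
            summarize("Students' ease of use ratings", new int[] {4, 20, 19, 15, 0});
            summarize("Tutors' quality of work ratings", new int[] {0, 4, 17, 22, 15});
        }

        static void summarize(String label, int[] countsForRatings1to5) {
            int n = 0;
            double sum = 0.0, sumOfSquares = 0.0;
            for (int rating = 1; rating <= 5; rating++) {
                int count = countsForRatings1to5[rating - 1];
                n += count;
                sum += (double) count * rating;
                sumOfSquares += (double) count * rating * rating;
            }
            double mean = sum / n;
            double sd = Math.sqrt((sumOfSquares - n * mean * mean) / (n - 1)); // sample SD
            double t = (mean - 3.0) / (sd / Math.sqrt(n));                     // one-sample t vs. mid-point 3
            System.out.printf("%s: N = %d, mean = %.2f, SD = %.3f, t(%d) = %.3f%n",
                              label, n, mean, sd, n - 1, t);
        }
    }

Running this reproduces µ = 2.78, σ = 0.918 and t(57) = -1.858 for the students’ ratings, and µ = 3.83, σ = 0.901 and t(57) = 6.995 for the tutors’ ratings.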

Students’ Choice of Tool

The Binomial Test was used to determine whether a significant difference exists between the proportion of students choosing a discussion board and the proportion of students choosing the skeleton curriculum (2-tailed). The null hypothesis (H0): there is no significant difference between the proportions. The alternative hypothesis (H1): there is a significant difference between the proportions. With a test proportion = 0.5, p > 0.05. For this result H0 cannot be rejected in favor of H1. There is no significant difference between the proportion of students choosing a discussion board and the proportion of students choosing the skeleton curriculum.

Students’ Ease of Use Ratings

A One-Sample T-test was used to determine whether a significant difference exists between the observed mean for students’ ease of use ratings and the theoretical mean based on a population with a normal distribution (2-tailed). The null hypothesis (H0): there is no significant difference between the observed and theoretical means. The alternative hypothesis (H1): there is a significant difference between the observed and theoretical means. With an observed mean = 2.78 and a theoretical mean = 3, t(57) = -1.858; p > 0.05. For this result H0 cannot be rejected in favor of H1. There is no significant difference between the observed mean for students’ ease of use ratings and the theoretical mean based on a population with a normal distribution.

Both the Kendall’s tau (τ) statistic and Spearman’s rank (rs) correlation were used to determine whether a significant association exists between students’ ease of use ratings and their choice of tool (2-tailed). The null hypothesis (H0): there is no significant association. The alternative hypothesis (H1): there is a significant association. With τ = 0.354, n = 58, p < 0.05; with rs = 0.381, n = 58, p < 0.05. For both of these results, H0 can be rejected in favor of H1. There is a significant association between students’ ease of use ratings and their choice of tool. As the associated bivariate correlation coefficient = 1.000, this association is both strong and positive.

Pearson’s Chi-Square (χ2) was subsequently used to verify the significance of this association using the same H0 and H1. With χ2 = 11.229, df = 3, p < 0.05. For this result, H0 can be rejected in favor of H1. This confirms that there is a significant association between students’ ease of use ratings and their choice of tool. In addition, Phi (Φ) and Cramer’s V were subsequently used to determine the significance of the strength of this association as indicated by the bivariate correlation coefficient. With Φ = 0.440, p < 0.05; with Cramer’s V = 0.440, p < 0.05. These results indicate that the strength of the association is significant.

Tutors’ Quality of Student Work Ratings

The One-Sample T-test was also used to test whether a significant difference exists between the observed mean for tutors’ quality of student work ratings and the theoretical mean based on a population with a normal distribution (2-tailed). The same H0 and H1 apply as in the case of the One-Sample T-test for students’ ease of use ratings. With an observed mean = 3.83 and a theoretical mean = 3, t(57) = 6.995; p < 0.05. For this result H0 can be rejected in favor of H1. There is a significant difference between the observed mean for tutors’ quality of student work ratings and the theoretical mean based on a population with a normal distribution.

Both the Kendall’s tau (τ) statistic and Spearman’s rank (rs) correlation were also used to determine whether a significant association exists between tutors’ quality of student work ratings and students’ choice of tool (2-tailed). Again, the same H0 and H1 apply as in the case of students’ ease of use ratings and their choice of tool. With τ = 0.569, n = 58, p < 0.05; with rs = 0.613, n = 58, p < 0.05. For both of these results, H0 can be rejected in favor of H1. There is a significant association between tutors’ quality of student work ratings and students’ choice of tool. As the associated bivariate correlation coefficient = 1.000, this association is also both strong and positive.

Pearson’s Chi-Square (χ2) was again subsequently used to verify the significance of this association using the same H0 and H1. With χ2 = 22.261, df = 3, p < 0.05. For this result, H0 can be rejected in favor of H1. This confirms that there is a significant association between tutors’ quality of student work ratings and students’ choice of tool. In addition, Phi (Φ) and Cramer’s V were again used to determine the significance of the strength of this association as indicated by the bivariate correlation coefficient. With Φ = 0.620, p < 0.05; with Cramer’s V = 0.620, p < 0.05. These results indicate that the strength of the association is significant.

Miscellaneous

Due to the very small number of tutors giving ease of use ratings and a choice of tool, their data have been neither analyzed nor tabulated. Instead, they are described here simply to provide a general context for the initial evaluation. Of the 5 tutors, 3 rated the skeleton curriculum as neither harder nor easier to use than a discussion board, 1 rated it slightly easier to use than a discussion board, and 1 rated it slightly harder to use than a discussion board. In addition, 3 of the 5 tutors chose the skeleton curriculum over a discussion board and the remaining two chose a discussion board over the skeleton curriculum.
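For readers who wish to check the tool-choice result without statistical software, the sketch below recomputes an exact two-tailed binomial test for 33 of 58 students choosing the skeleton curriculum against a test proportion of 0.5, using only standard Java. Doubling the upper tail is one common convention for the two-tailed p-value in the symmetric p = 0.5 case; with these counts the resulting p-value is well above 0.05, consistent with the non-significant result reported above.

    // Exact two-tailed binomial test: 33 of 58 choices for the skeleton curriculum, test proportion 0.5.
    public class ChoiceBinomialTest {
        public static void main(String[] args) {
            int n = 58;                 // students making a choice
            int k = 33;                 // students choosing the skeleton curriculum
            double upperTail = 0.0;     // P(X >= k) under Binomial(n, 0.5)
            for (int x = k; x <= n; x++) {
                upperTail += binomialPmf(n, x, 0.5);
            }
            double pTwoTailed = Math.min(1.0, 2.0 * upperTail);   // symmetric case, test proportion 0.5
            System.out.printf("Two-tailed p = %.3f%n", pTwoTailed);
        }

        // P(X = k) for X ~ Binomial(n, p), built up factor by factor to avoid overflow.
        static double binomialPmf(int n, int k, double p) {
            double coefficient = 1.0;
            for (int i = 1; i <= k; i++) {
                coefficient *= (double) (n - k + i) / i;   // running binomial coefficient C(n, k)
            }
            return coefficient * Math.pow(p, k) * Math.pow(1 - p, n - k);
        }
    }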

Conclusions

Although a greater proportion of students seemed to prefer the skeleton curriculum over a discussion board, the difference between the proportion of students choosing the skeleton curriculum and the proportion choosing a discussion board was not significant. Accordingly, no conclusions can be made about students’ e-learning tool preference. Similarly, although more students rated the skeleton curriculum as being slightly harder to use than a discussion board, the difference between the observed mean for students’ ease of use ratings and the theoretical mean was not significant, and so no conclusions can be made about students’ thoughts on the ease of use of the skeleton curriculum compared to a discussion board. However, as a significant association existed between students’ ease of use ratings and their choice of tool, and as this association was positive and significantly strong, it can be concluded that as students’ ratings for the skeleton curriculum tended towards being easier to use than a discussion board, so too did their preference for e-learning tool tend towards the skeleton curriculum.

As more tutors rated the quality of student work as being slightly better when using the skeleton curriculum than when using a discussion board, and as the difference between the observed mean for tutors’ quality of student work ratings and the theoretical mean was significant, it can be concluded that tutors thought their students’ quality of work was slightly better when using the skeleton curriculum than when using a discussion board. In addition, as a significant association existed between tutors’ quality of student work ratings and students’ choice of tool, and as this association was positive and significantly strong, it can also be concluded that as tutors’ ratings for the quality of student work with the skeleton curriculum tended towards being better than with a discussion board, so too did students’ preference for e-learning tool tend towards the skeleton curriculum.

Discussion

Theoretical Limitations

The main theoretical limitation of this initial evaluation is that, although the obtained data have given an idea of what students thought the ease of use of the skeleton curriculum was like compared to a discussion board tool, and of what tutors thought about the quality of student work when using it compared to when using a discussion board, they have not given any idea of why tutors and students thought what they did. Knowing the latter is particularly important, as it would specifically address the theory upon which the prototype skeleton curriculum tool was developed.

Methodological Limitations

The research described here set out to develop a relatively simple method for tutors to facilitate constructivist learning online. As a result, the relatively large amount of data obtained in this case came mostly from students rather than from both students and tutors, and is therefore not as informative as it might have been. However, the research has at least provided some fairly rapid initial findings about the shared knowledge modeling/building method.

Prototype Limitations

The prototype skeleton curriculum is fairly minimal in terms of available functions. This minimalist approach to functionality was chosen in order to reduce the effort needed to learn how to use the prototype for research purposes. Real-world implementations of a skeleton curriculum may want to include many more functions.

General Implications

The proposed shared knowledge modeling/building method works in part by breaking down curriculum content and specifying its key concepts in advance, and in part by sharing any student work only after it has been completed and reviewed by the tutor. From a theoretical perspective, it would seem that a less social, less constructivist method such as this could prove useful for certain online student groups, for example groups which require additional support during the initial stages of knowledge acquisition. From a real-world perspective, it would seem that the prototype skeleton curriculum tool functions to provide such additional support, and does so in a cost-effective manner. Armed with Heriot-Watt University’s experience from this initial evaluation, other universities might want to look into exactly how their e-learning tools facilitate learning online, paying close attention to what improvements could be implemented.

Acknowledgements

I would like to thank Dr. Nestor Milyaeva of Heriot-Watt University for his programming assistance during the prototyping of the skeleton curriculum tool used for this research. I would also like to thank the various tutors involved in this initial evaluation of the skeleton curriculum tool.

References

Bandura, A. (1970). Social learning theory. New Jersey: Prentice-Hall.

Cobb, P. (1994). Constructivism and learning. In T. Husen & T. N. Postlethwaite (Eds.), International Encyclopedia of Education (pp. 1049–1051). Oxford: Pergamon Press.

Cole, P. (1992). Constructivism revisited: A search for common ground. Educational Technology, 33(2), 27–34.

Conway, J. (1997). Educational technology’s effect on models of instruction. Retrieved December 16, 2004, from http://www.copland.udel.edu/~jconway/EDST666.htm.

Corich, S. (2004). Instructional design in the real world: A view from the trenches (Book Review). Educational Technology & Society, 7(1), 128–129.

Cronbach, L. J. (1954). Educational psychology (pp. 49–51). USA: Harcourt, Brace and Company.

Duffy, T., & Cunningham, D. (1996). Constructivism: Implications for the design and delivery of instruction. In D. H. Jonassen, (Ed.), Handbook of research for educational communications and technology (pp. 170–198). New York: Simon and Schuster.

Eastmond, D. (1995). Alone but together. New Jersey: Hampton Press.

Fontana, D. (1988). Psychology for teachers (pp. 125–164). UK: The British Psychological Society.

Hannafin, M. J., Hannafin, K. M., Land, S. M., & Oliver, K. (1997). Grounded practice and the design of constructivist learning environments. Educational Technology Research and Development, 45(3), 101–117.

Hendry, G. D. (1996). Constructivism and educational practice. Australian Journal of Education, 40(1), 19–45.

Ireland, D. A. (2002). Culture as a potential factor affecting the development of open learning programmes. In A. Thatcher, J. Fisher, & K. Miller (Eds.), The 3rd International Cyberspace Conference on Ergonomics (pp. 651–658). Johannesburg, South Africa: The University of Witwatersrand.

Ireland, D. A., Graves, R., & Hare, P. G. (2003). Tutor activity in open learning systems: Using local practice to develop support technologies. In Proceedings of the 15th Triennial Congress of the International Ergonomics Association. Seoul, Korea.

Ireland, D. A. (2004). User-modifiable learning objects: An e-learning support technology for students with English as a second language. In Proceedings of the 5th Annual Irish Educational Technology Users Conference. Co. Kerry, Ireland: Institute of Technology Tralee.

Jarvis, P. (1987). Adult learning in the social context. London: Croom Helm.

Jonassen, D. H. (1991). Objectivism versus constructivism: Do we need a new philosophical paradigm? Educational Technology Research and Development, 39(3), 5–14.

Jonassen, D. H. (1994). Thinking technology. Educational Technology, 34(4), 34–37.

Jonassen, D. H. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional-design theories and models (Vol. II, pp. 215–239). New Jersey: Lawrence Erlbaum Associates.

Kolb, D. A., & Fry, R. (1975). Toward an applied theory of experiential learning. In C. Cooper (Ed.), Theories of group process. London: Wiley.

Lebow, D. (1993). Constructivist values for systems designers: five principles toward a new mindset. Educational Technology Research and Development, 41, 4–16.

Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 140.

McConnell, D. (1994). Implementing computer supported cooperative learning. London: Kogan Page.

Mayer, R. E. (1992). Cognition and instruction: Their historic meeting within educational psychology. Journal of Educational Psychology, 84, 405–412.

Mergel, B. (1998). Instructional Design and Learning Theory. Retrieved December 16, 2004, from http://www.usask.ca/education/coursework/802papers/mergel/brenda.htm

Merrill, M. D. (1991). Constructivism and instructional design. Educational Technology, 31(5), 45–53.

Merrill, M. D. (1992). Constructivism and instructional design. In T. M. Duffy & D. H. Jonassen (Eds.), Constructivism and the technology of instruction: A conversation (pp. 99–114). New Jersey: Lawrence Erlbaum.

O’Donnell, A. M. (2000). Constructivism by design and in practice: A review. Issues in Education, 3(2), 285–294.

Paulsen, M. F. (1995). Moderating educational computer conferences. In Z. L. Berge & M. P. Collins (Eds.), Mediated communication and the online classroom. New Jersey: Hampton Press.

Perkins, D. N. (1991). Technology meets constructivism: Do they make a marriage? Educational Technology, 31(5), 19–23.

Perkins, D. N. (1999). The many faces of constructivism. Educational Leadership, 57(3), 6–11.

Phillips, D. C., & Soltis, J. F. (1998). Perspectives on learning (pp. 41–52). USA: Teachers College Press.

Phillips, V. (2004, July 17). Online degrees — Public acceptance. LLC Virtual University Gazette Newsletter. Retrieved October 25, 2005, from http://www.geteducated.com/

Spiro, R. J., Feltovich, P. J., Jacobson, M. J., & Coulson, R. L. (1991). Knowledge representation, content specification, and the development of skill in situation-specific knowledge assembly: Some constructivist issues as they relate to Cognitive Flexibility theory and hypertext. Educational Technology, 31(9), 22–25.

Squires, D. (1999). Educational software for constructivist learning environments: Subversive use and volatile design. Educational Technology, 39(3), 48–54.

Tobin, K., & Tippings, D. (1993). Constructivism as a referent for teaching and learning. In K. Tobin (Ed.), The practice of constructivism in science education (pp. 3–12). New Jersey: Lawrence Erlbaum.

von Glasserfeld, E. (1995). A constructivist approach to teaching. In L. P. Steffe & J. Gale (Eds.), Constructivism in education (pp. 3–15). New Jersey: Lawrence Erlbaum.

Willis, J. (1995). A recursive, reflective instructional design model based on constructivist-interpretivist theory. Educational Technology, 35(6), 5–23.

Willis, J. (1998). Alternative instructional design paradigms: What’s worth discussing and what isn’t. Educational Technology, 38(3), 5–16.

Wilson, B. (1997). Reflections on constructivism and instructional design. In C. R. Dills & A. A. Romiszowski (Eds.), Instructional development paradigms (pp. 63–80). New Jersey: Educational Technology Publications.