
Special Issue on knowledge transformation, design and technology
Three papers from the Technology-Enhanced Learning Research Programme

Does technology enhance learning? Despite the intervention of numerous researchers in both the learning and computing sciences, this is by far the most cited question in the field on the part of practitioners and policymakers alike, as well as a steady stream of researchers. Slowly but surely, however, the last decade or so has seen some acknowledgement of the question’s limitations, and its technocentrism in ascribing roles to technology that ignore context and activity. Attention has shifted to how technology mediates learning, how different technologies can shape learning and teaching in different ways, and how technology can be designed to influence the behaviours of teaching and teachers. The successor to the Teaching and Learning Research Programme, the Technology-Enhanced Learning Research Programme, funded jointly by the Economic and Social Research Council and the Engineering and Physical Sciences Research Council, flagged this awareness, stressing the need for interdisciplinary research that incorporates research on design, implementation and evaluation, thus recognizing explicitly how complex research in the field of TEL could and should be.

As part of this work, attention is implicitly turning to the limitations of the enhancement metaphor, which encourages a belief that technology enters into the culture of learning like some addition of an ingredient to a recipe. More importantly, the metaphor of enhancement takes for granted the immutability of what is to be enhanced. It leads to a tendency to render knowledge itself as an invariant in the transformation of learning, asking whether the acquisition of some knowledge is easier, or faster, or more efficient with technology than without it. The limitations of this view lie in a failure to recognize that knowledge itself is mediated by the computer presence, and that understanding how knowledge is reshaped with technology, rethinking what can be learned, is every bit as important as asking how effectively a given piece of learning can be effected. This is what Seymour Papert calls the ‘10% challenge’: the need for researchers to spend at least a little time – 10% will do – thinking about epistemology rather than only cognition or pedagogy.

One approach is to stress metacognitive themes, looking for evidence that technology-based learning focuses attention on problem solving, the development of heuristic strategies or the affective dimensions of engagement and motivation. However, this is only a partial recognition of Papert’s challenge. The alternative is to consider the epistemological rather than the (meta-)cognitive, asking what happens to the knowledge at stake as it is transformed by the advent of new representational infrastructures made possible by technology; how technology can be transformational not only in terms of how knowledge is accessed, but also of how the landscape of knowledge is potentially reconfigured if technology is designed and deployed thoughtfully (Kaput et al. 2002; Wilensky & Papert 2010).

An important corollary to this approach is that there is a reciprocal role for technology, in which the study of new representational forms offers a window (Noss & Hoyles 1996) on current practice and pedagogy. Any transformative technology takes time to adopt its own conventions and establish its distinctive cultures – the much-cited ‘filming the stage’ of early film is a good example: the camera transformed the notion of performance, of direction, of plot and so on. The study of this transformation can tell us much about learning: how far established pedagogic practice – or even innovatory practice – is in fact constrained and shaped by existing technology, and how new kinds of pedagogies become possible with new technologies (and, in fact, how old ones can be realistically implemented for the first time).

A good example of this last point is in the first paper in the trio that makes up the special issue. The Ensemble project sets out to explore the ways in which semantic web technologies could enable case-based learning, an established pedagogic strategy aimed at ‘bringing reality into the classroom’. However, like all the TEL projects, Carmichael & Tscholl set out not only to observe but also to design and implement, bringing together state-of-the-art semantic web tools in novel ways to bring this pedagogy to life, to assist in achieving its aim of basing learning on experience and enculturation.

In fact, a rather surprising outcome is reported, with the researchers describing ‘a shift in our understanding of the affordances of case-based learning’: to which we might add, a shift in the roles that technology plays in shaping it. It turns out that case-based learning is a pedagogy that technology certainly makes possible, but, in tandem, its key assumptions need re-examination in the light of technological tools. Until now, the rhetoric of reality has been based on the ways that data can be brought to life – and the semantic web clearly has a role to play here since it has the potential to aggregate data from multiple sources into interactive and visualizable representations. What has emerged is that in this process, the idea of ‘reality’ turns out to be more problematic than at first sight. Thus, the appeal to reality of the case-based learning approach cannot ignore that, however much it is an aspiration, the simple harnessing of new resource discovery and visualization tools is not sufficient. In fact, the very notion of authenticity and the roles it serves need to be re-examined and new questions asked. Seeing digital technology as a succession of new and increasingly powerful ways to make learning more and more ‘authentic’ may be to miss important points about learning and the roles of the digital technologies.

The Ensemble researchers report that their exploration of semantic web technologies for case-based learning has led to new insight into the nature and scope of case-based learning and, more generally, the idea and utility of ‘authenticity’. At the same time, this work has fed into a better understanding of different aspects of the ‘semantic web toolbox’, and how to make the most of its different affordances.

This approach resonates with the paper by Mavrikis et al., in which a new representational infrastructure is described for the learning of mathematical generalization – arguably the key idea that lies behind the school mathematics curriculum. Just as the Ensemble team was led to reconceptualize the case-based strategy, the MiGen researchers had to consider carefully what new knowledge elements should be flagged as central in the system under development – not necessarily those that are attended to in the conventional pedagogic setting.

This necessity to make implicit epistemologies explicit is a key element of technology-enhanced learning research. For example, the complexity of the idea of ‘variable’, quite often taken for granted in mathematical learning contexts, becomes critical in the MiGen microworld (called the eXpresser), as it becomes a tool with which a student’s objectives can be realized. More generally, we see in the paper how the design of the system leads to a redrawing of the epistemological landscape, so that the introduction of technology focuses more directly on the knowledge actually at stake (in this case the elusive idea of generalization), providing the student with tools that assist in seeing what really matters (the effect on the general by constructing the specific).

The MiGen research reinforces the notion that the complexity of an idea is related to how it is represented, and that the relative learnability of a concept is intimately connected to the representational forms with which it is expressed. Generalization is expressed algebraically; but algebra in this case is a means to an end, not an end in itself. In fact, it often looks the other way round to the student, who is required to learn the rules of algebra and construct for herself (or more often ignore) what the algebra is for.

This necessity to reconstruct the epistemological landscape lies at the heart of the constructionist framework (Harel & Papert 1991). Constructionism looks like a pedagogic strategy (‘children learn best when they are building stuff and sharing it’). However, it is more than that, generating, in research terms, the necessity to reconceptualize the knowledge elements of what is to be learned and how this network of knowledge elements is reconfigured as new representational forms are imported. MiGen shows nicely that what starts as finding new ways to express old knowledge points to the evolution of new knowledge – MiGen algebra is not the same as conventional algebra, and exploring the relationship between them is a key element of the research.

The constructionist paradigm lies at the heart of Laurillard et al.’s paper, which focuses on teachers rather than (only) their students in the design of their ‘Learning Designer’ tool. It has been developed to help teachers conceptualize, construct, and implement new learning designs with new technologies, because it is argued that ‘teachers need a theory-informed way of representing the critical characteristics of good pedagogy as they discover how to optimise learning technologies’.

Here too the challenge is as much epistemological as it is technical and pedagogical. As Laurillard and her colleagues point out, pedagogical knowledge is difficult to learn and it is difficult to express. Indeed, the lack of an effective and widely shared language for thinking about pedagogic knowledge – and especially pedagogic knowledge mediated by technology – is a major challenge for those seeking to innovate and is, perhaps, one factor that has made it so difficult to embed technology in pedagogic practice. One important contribution, then, of this paper is its focus on ontology, which formalizes the concepts and relations embedded in the Learning Designer. It is claimed that using the conceptual objects available in the design tool to express their pedagogic design gives teachers a different way of coming to recognize the knowledge landscape of technology-based pedagogy.

This approach allows principles and concepts to be introduced to the teacher who is constructing a sequence of learning activities. And in this, there is a clear resonance with the MiGen paper, seeking to focus learners (in this case the learners are teachers) on what matters in the new domain of knowledge that is being constructed. The Learning Designer is not providing tools that merely codify or ‘enhance’ practice, but seeks to transform the pedagogic landscape by affording ‘users’ the opportunity to see how the knowledge elements of their field (e.g. Q&A sessions in a lecture) can be reconfigured in a technology-rich setting (as, for example, FAQs and a discussion forum with different student experience properties).

One final point concerns the idea of offering an analysis of the quality of a teacher’s design in the form of a pie chart and a bar chart that represent the types of learning facilitated and the types of learning experience engendered. This is a core element of the system (and one which, it appears, is very well received by those using it). The key issue here is that these kinds of visualization are not merely new ways to represent existing knowledge, but clearly – as is evident from the snippets of comments presented – new knowledge for the teachers concerned. The visualizations represent new pedagogic design elements that are largely invisible, expressing implications of practice that remain implicit in standard practice.

The papers in this issue represent the tip of a rather large iceberg, consisting of eight major research projects and a wide range of thematic and methodological issues in the field of technology-enhanced learning (tel.ac.uk). Together, they aim to contribute to the research effort that explores and strengthens new interconnections between design and evaluation, and between computer science and the learning sciences. More broadly, I hope that these papers and the output of the programme as a whole continue to make a contribution to the literature beyond the question of technological ‘enhancement’, to look again at the nature of knowledge and how we come to know it.

Richard Noss
Director, Technology Enhanced Learning Programme,
London Knowledge Lab,
Institute of Education,
University of London,
London, UK

References

Harel I. & Papert S., eds (1991) Constructionism. Ablex Publishing Corporation, Norwood, NJ.

Kaput J., Noss R. & Hoyles C. (2002) Developing new notations for a learnable mathematics in the computational era. In Handbook of International Research in Mathematics Education (ed. L.D. English), pp. 51–75. Lawrence Erlbaum Associates, Mahwah, NJ.

Noss R. & Hoyles C. (1996) Windows on Mathematical Meanings: Learning Cultures and Computers. Kluwer, Dordrecht, The Netherlands.

Wilensky U. & Papert S. (2010) Restructurations: reformulations of knowledge disciplines through new representational forms. Paper presented at Constructionism 2010, Paris, France.
