Exploring the role of visualization and engagement in Computer Science Education
Naps, T., et al. (2003). Exploring the role of visualization and engagement in Computer Science Education. inroads – The SIGCSE Bulletin, 35(2), 131–152.
Introduction
• Report of the Working Group on Improving the Educational Impact of Algorithm Visualization (USA, Hong Kong, Finland, Germany).
• Experimental studies designed to substantiate the educational effectiveness of visualization technology do not consistently bear out its assumed benefits.
• Two key obstacles to visualization technology’s widespread adoption:
– From the learner’s perspective, the visualization technology may not be educationally beneficial.
– From the instructor’s perspective, the visualization technology may simply incur too much overhead to make it worthwhile.
• Visualization technology has been successfully used to actively engage learners in activities such as:
– Constructing their own input data sets
– Making predictions regarding future visualization states
– Programming the target algorithm
– Answering strategic questions about the visualization
– Constructing their own visualizations
• Visualization technology, no matter how well it is designed, is of little educational value unless it engages learners in an active learning activity.
• Based on this engagement taxonomy and a set of effectiveness metrics, the report presents a framework for experimental studies of visualization effectiveness.
Background
• Commonly accepted suggestions about algorithm animations drawn from experience:
– Provide resources that help learners interpret the graphical representation
– Adapt to the knowledge level of the user
– Provide multiple views
– Include performance information
– Include execution history
– Support flexible execution control
– Support learner-built visualizations
– Support custom input data sets
– Support dynamic questions
– Support dynamic feedback
– Complement visualizations with explanations
Engagement Taxonomy
• Levels of learner involvement in an educational situation that includes visualization:
– No viewing
– Viewing
– Responding (answering questions)
– Changing (modifying the visualization)
– Constructing
– Presenting
Determining Effectiveness of Visualization
• Develop a basis for defining metrics for determining the effectiveness of visualization
• Show how Bloom’s taxonomy can be used to give a concrete definition of expectations for a learner’s understanding
• Provide a specific example of applying Bloom’s taxonomy in the context of algorithmics and data structures
• Explore factors that could be measured in order to demonstrate learning improvement
• Explore additional factors that can be collected to help profile the learners and provide a better context for data analysis
Algorithmics and Data Structures in the Context of Bloom’s Taxonomy
Other Factors
• Learner’s progress
• Drop-out rate
• Learning time
• Learner satisfaction
Covariant Factors
• Learning style
• Learner's familiarity with using visualization technology
• Learning orientation
• Other background information
An Experimental Framework
• General Hypotheses
– Viewing vs. No Viewing
• Viewing results in equivalent learning outcomes to no visualization (and thus no viewing).
– Responding vs. Viewing
• Responding results in significantly better learning outcomes than viewing.
– Changing vs. Responding
• Changing results in significantly better learning outcomes than responding.
– Constructing vs. Changing
• Constructing results in significantly better learning outcomes than changing.
– Presenting vs. Constructing
• Presenting results in significantly better learning outcomes than constructing.
– Multiple Engagements
• A mix of several forms of engagement is natural, and we expect this to occur in experiments, especially in the latter types of engagement.
An Experimental Framework
• Participants
• Materials and Tasks
• Procedure
– The overall experimental procedure is rather straightforward: pre-test, use materials to perform tasks, post-test.
• Evaluation Instruments
– Pre-test and post-test
– Background
– Task-specific
• Time on task, learner progress, drop-out rate
– Learner feedback
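The evaluation instruments above reduce to a few simple aggregate statistics per experiment group. As a hypothetical sketch (the field names `pre`, `post`, and `minutes` are assumptions for illustration, not from the paper), computing learning gain, drop-out rate, and time on task might look like:

```python
# Hypothetical sketch of aggregating the evaluation data named above:
# pre-/post-test scores, time on task, and drop-out rate.
# Field names are illustrative assumptions, not from the paper.

def summarize(participants):
    """Summarize one experiment group's pre/post scores and drop-outs."""
    # Participants with no post-test score dropped out of the study.
    finished = [p for p in participants if p["post"] is not None]
    dropout_rate = 1 - len(finished) / len(participants)
    mean_gain = sum(p["post"] - p["pre"] for p in finished) / len(finished)
    mean_time = sum(p["minutes"] for p in finished) / len(finished)
    return {"dropout_rate": dropout_rate,
            "mean_gain": mean_gain,
            "mean_time_on_task": mean_time}

group = [
    {"pre": 40, "post": 70, "minutes": 25},
    {"pre": 55, "post": 80, "minutes": 30},
    {"pre": 35, "post": None, "minutes": 10},  # dropped out before post-test
]
stats = summarize(group)
```

Comparing `mean_gain` across engagement conditions (viewing vs. responding, etc.) is what the general hypotheses call for; the covariant factors would enter as additional per-participant fields.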
Example Experiments
Area: Introduction to Programming
Topic: Recursion
Hypothesis: Changing vs. Viewing
• A program visualization tool is needed (the Alice 3D animation tool, http://www.alice.org).
• Viewing could mean watching an animation in which a skater (or some other figure) skates to a cone on the ice, avoiding a collision.
– The condition of nearness to the cone is used to control a recursive call to the animation method (tail recursion).
• Constructing and Changing could mean the learner constructs the animation, using recursive calls to glide the skater to the cone without colliding.
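Alice expresses this with its own 3D animation primitives; the control structure the learner builds, however, is just a tail-recursive method guarded by a nearness test. A minimal sketch of that structure (the skater reduced to a position on a line; `STEP`, `NEAR_ENOUGH`, and `glide_to_cone` are assumed names for illustration, not part of the Alice API):

```python
# Hypothetical sketch of the tail-recursive animation method described
# above: the skater glides one step at a time and stops when close
# enough to the cone to avoid a collision. Names are illustrative.

STEP = 1.0          # distance the skater glides per animation step
NEAR_ENOUGH = 0.5   # stop this far from the cone to avoid colliding

def glide_to_cone(skater_pos, cone_pos):
    """Recursively glide toward the cone until nearly touching it."""
    distance = cone_pos - skater_pos
    if distance <= NEAR_ENOUGH:   # base case: near enough, stop gliding
        return skater_pos
    # glide one step (without overshooting), then recurse (tail call)
    return glide_to_cone(skater_pos + min(STEP, distance - NEAR_ENOUGH),
                         cone_pos)

final_pos = glide_to_cone(0.0, 10.0)  # skater halts just short of the cone
```

In the Changing condition the learner would modify pieces such as the nearness condition or the step size and observe the effect on the animation; in the Constructing condition the learner writes the recursive method itself.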
Some Problematic Issues
• The differing complexity of basic concepts within the field of algorithmics.
• Algorithmics includes knowledge that can be viewed as both conceptual and related to implementation.
• Algorithm analysis is a complicated undertaking that has parts belonging to several levels in Bloom’s taxonomy.
The End
Thank you for listening.