Next Practices for OER Quality Evaluation | Lisa Petrides

DESCRIPTION

Keynote at the Learning Analytics and Knowledge conference (LAK 2013), Leuven, Belgium - for the Learning Object Analytics for Collections, Repositories & Federations workshop, by Lisa Petrides, entitled "Next Practices for OER Quality Evaluation: Using Analytics to Support Continuous Improvement"

TRANSCRIPT

Page 1

Next Practices for OER Quality Evaluation: Using Analytics to Support Continuous Improvement

LAK 2013 - Learning Object Analytics for Collections, Repositories & Federations

April 9, 2013

Lisa Petrides, Ph.D.

ISKME 2013

Page 2

ISKME (Institute for the Study of Knowledge Management in Education): Research, tools, and services to advance teaching and learning

• Study (research)
• Open (open knowledge networks)
• Build (training and design)

Page 3

OER Commons.org

Page 4

Open Author

Page 5

Open
Key drivers and next practices

Key drivers:
• Uncertainty around implementation of new learning standards
• Decreases in education funding
• New learning standards
• Increased demand for analytics related to the use of online resources

Next practices:
• OER evaluation tools
• Custom analytics

Page 6

Analytics – For What Purpose
From resource discovery to improved teaching and learning

Key questions → target outcomes, by initiative:

• Learning Registry: How can we support resource discovery through shared metadata and paradata standards? → Resource discovery

• NSDL schema for paradata exchange: What resource usage patterns can and should be tracked and shared? How can paradata support resource and technology improvements? → Technology and resource improvements

• Site-specific initiatives (examples include Open High School Utah, Carnegie Mellon OLI, edX and others): Which resources are learners spending time on? How do usage patterns map to assessment outcomes? → Curriculum improvements

• ISKME – OER Commons: Are resources meeting learning standards? If yes, how? If no, why not? What factors make resources reusable by teachers/learners? What makes an exemplary resource exemplary? → Enhanced teaching and learning practices; Curriculum improvements
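To make "paradata" concrete: below is a minimal sketch of the kind of usage record these initiatives exchange. The shape loosely follows the activity-style records the Learning Registry popularized, but every field name here is illustrative, not the exact NSDL or Learning Registry schema.

    import json

    # Illustrative paradata record: "an educator rated this resource 4 of 5".
    # All field names are assumptions for illustration, not the exact
    # Learning Registry / NSDL paradata schema.
    paradata_record = {
        "activity": {
            "actor": {"objectType": "educator"},
            "verb": {
                "action": "rated",
                "measure": {"scaleMin": 1, "scaleMax": 5, "value": 4},
                "date": "2013-04-09",
            },
            "object": "https://www.oercommons.org/courses/example-resource",
        }
    }

    print(json.dumps(paradata_record, indent=2))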

Page 7

OER Quality Evaluation
EQuIP tool for evaluating resources on alignment to state standards

Rubric dimensions:

1. Alignment to the depth of the CCSS (Common Core State Standards)
2. Key shifts in the CCSS
3. Instructional supports
4. Assessment
5. Overall rating for the lesson/unit

Page 8

OER Quality Evaluation
Achieve tool for evaluating resources on quality dimensions

Rubric dimensions:

1. Quality of explanation of the subject matter
2. Utility of the materials designed to support teaching
3. Quality of assessments
4. Quality of technological interactivity
5. Quality of instructional and practice exercises
6. Opportunities for deeper learning
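As a concrete sketch of how an evaluation against either rubric might be stored (the names and the 0-3 score scale are assumptions, not ISKME's actual implementation), each evaluation is simply a record of per-dimension scores plus a comment, which is what the dashboard and report views on the following slides aggregate:

    import statistics
    from dataclasses import dataclass, field

    # Hypothetical record of one rubric evaluation; dimension keys follow
    # the Achieve rubric above, and the 0-3 scale is an assumption.
    @dataclass
    class RubricEvaluation:
        resource_id: str
        evaluator: str
        scores: dict = field(default_factory=dict)
        comment: str = ""

    evaluation = RubricEvaluation(
        resource_id="oer-12345",
        evaluator="teacher-42",
        scores={
            "explanation_of_subject_matter": 3,
            "utility_of_materials": 2,
            "quality_of_assessments": 2,
            "technological_interactivity": 1,
            "instructional_and_practice_exercises": 3,
            "opportunities_for_deeper_learning": 2,
        },
        comment="Strong practice exercises; interactivity is limited.",
    )

    # A crude overall-quality signal: the mean across dimensions.
    print(statistics.mean(evaluation.scores.values()))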

Page 9

Analytics Use Case
Supporting teacher professional development around finding, creating, evaluating and aligning resources

Project leaders, district administrators, and state curriculum developers working with teachers to identify quality resources that are aligned to learning standards

Key Questions the Analytics Help to Answer:

• Are my teachers finding the resources they need?
• Are they reaching our district’s goals for identifying and evaluating resources?
• What activities do teachers need more support in?
• Where should I focus my professional development efforts with my teachers?
• Are teachers able to see what dimensions of a resource need to be improved for it to be considered exemplary?

Page 10

Analytics Use Case
Supporting improvements on learning resources

Key Questions the Analytics Help to Answer (for ISKME):

• What distinguishes resources with high ratings from those with low ratings?
• What is it that makes a resource exemplary?
• What factors contribute to the use and reuse of resources by teachers?
• How can we encourage the creation of high quality resources through our tools and supports?

Page 11

Example Dashboard View
Resources by evaluation scores

[Chart: resources plotted by evaluation score on two of the rubric dimensions, quality of explanation of subject matter and quality of technological interactivity]
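Behind a view like this is a simple aggregation: group evaluations by resource and average each dimension's scores. A sketch with invented records:

    from collections import defaultdict

    # Invented evaluation records: (resource_id, dimension, score).
    evaluations = [
        ("oer-1", "explanation_of_subject_matter", 3),
        ("oer-1", "explanation_of_subject_matter", 2),
        ("oer-1", "technological_interactivity", 1),
        ("oer-2", "explanation_of_subject_matter", 2),
        ("oer-2", "technological_interactivity", 3),
    ]

    # Collect scores per (resource, dimension), then average: these means
    # are the numbers a dashboard chart would plot.
    by_cell = defaultdict(list)
    for resource_id, dimension, score in evaluations:
        by_cell[(resource_id, dimension)].append(score)

    for (resource_id, dimension), scores in sorted(by_cell.items()):
        print(resource_id, dimension, round(sum(scores) / len(scores), 2))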

Page 12

Example Dashboard View
Evaluation activities by user

[Dashboard: evaluation activity counts for individual users, with an indicator of whether goals for evaluating (or tagging) resources have been met]
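The goal indicator itself is a small computation: compare each user's completed evaluations (or tags) against a target count. A sketch with invented numbers; the threshold would come from the district's own goals:

    # Invented per-user activity counts.
    evaluations_completed = {
        "Michael Sander": 14,
        "Jessalyn Katona": 6,
        "Marta Levy": 10,
    }

    GOAL = 10  # assumed target number of evaluations per user

    for user, count in evaluations_completed.items():
        status = "goal met" if count >= GOAL else f"{GOAL - count} to go"
        print(f"{user}: {count} evaluations ({status})")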

Page 13

Example Report
User comments on evaluated resources

• All qualitative comments can be exported to a CSV file for content analysis

• Comments can provide insight into needed improvements to the resource, what is good about the resource, and ways the resource can be used in the classroom
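A sketch of that export using Python's standard csv module (the comment rows are invented; in practice they would come from the evaluation store):

    import csv

    # Invented comment records.
    comments = [
        {"resource_id": "oer-1", "user": "teacher-42",
         "comment": "Works well as a warm-up activity."},
        {"resource_id": "oer-2", "user": "teacher-7",
         "comment": "Needs answer keys for the practice exercises."},
    ]

    # Write one row per comment; the resulting file can be loaded into
    # any content-analysis tool.
    with open("evaluation_comments.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["resource_id", "user", "comment"])
        writer.writeheader()
        writer.writerows(comments)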

Page 14

Next Phase: Custom Analytics
Examples of additional data we are collecting through Open Author

Open Author analytic → Indicator of…

• # of subheadings by resource → Whether resources can be broken into smaller parts. How “modular” is the resource collection?

• # of external URLs by resource → Whether resources are being combined with other resources. How “remixable” are the resources?

• # of versions of a resource by the original author; # of versions by other authors → How many derivatives are being made of the resources, and by whom? Are resources in the collection adaptable?

• Reasons provided by users for changing an existing resource → Why and how resources are changed. What makes a resource adaptable?
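The first two indicators can be computed straight from a resource's markup. A minimal sketch using only the standard library, assuming resources are stored as HTML with subheadings marked up as h2-h4 (both assumptions about Open Author's internals):

    from html.parser import HTMLParser

    class IndicatorCounter(HTMLParser):
        """Counts subheadings (h2-h4) and external links in resource HTML."""

        def __init__(self):
            super().__init__()
            self.subheadings = 0
            self.external_urls = 0

        def handle_starttag(self, tag, attrs):
            if tag in ("h2", "h3", "h4"):
                self.subheadings += 1
            elif tag == "a":
                href = dict(attrs).get("href", "")
                if href.startswith(("http://", "https://")):
                    self.external_urls += 1

    counter = IndicatorCounter()
    counter.feed("<h2>Part 1</h2><p><a href='http://example.org'>source</a></p>"
                 "<h2>Part 2</h2>")
    print(counter.subheadings, counter.external_urls)  # 2 1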

Page 15

What This All Means
Continuous improvement of resources toward enhanced learning

If one of our hypotheses is correct, namely that resources with the highest overall quality rating on our Achieve rubric are also found to have:

• The highest rating on dimension 5: Quality of instructional and practice exercises
• More subheadings than other resources (more modular)
• More external URLs than other resources (more remixable)
• More versions created (more reusable)

…this could lead to:

• ISKME builds prompts into Open Author to encourage the creation of resources that have these components
• This leads to the creation of new resources that potentially better meet learning standards and teaching needs
• The newly created resources are then analyzed through the analytics
• This creates a continuous cycle of resource and tool enhancement, towards improved teaching and learning
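Testing that hypothesis is, at bottom, a correlation question: do the proposed indicators move with the overall rating? A sketch with invented per-resource data (statistics.correlation requires Python 3.10+):

    from statistics import correlation

    # Invented per-resource data: overall Achieve rating alongside the
    # modularity / remixability / reusability indicators from Open Author.
    overall_rating  = [3, 1, 2, 3, 2, 1]
    n_subheadings   = [8, 2, 4, 9, 5, 1]
    n_external_urls = [5, 1, 3, 6, 2, 0]
    n_versions      = [4, 0, 2, 5, 1, 0]

    for name, values in [("subheadings", n_subheadings),
                         ("external URLs", n_external_urls),
                         ("versions", n_versions)]:
        r = correlation(overall_rating, values)
        print(f"overall rating vs # of {name}: r = {r:.2f}")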

Page 16

Lisa Petrides, President
Email: [email protected]
Twitter: @lpetrides

Institute for the Study of Knowledge Management in Education

Half Moon Bay, California