TRANSCRIPT
Data Informed Decision Making that Improves Teaching and Learning
MEIM's 15th Annual Technology Conference & Expo Dec. 2, 2005
Why are educators so fired up about data?
Superintendents ask
• How do we know if teachers are teaching our curriculum?
• How do we maximize the value of dollars spent for assessment and data management?
• Are all of our students achieving at acceptable levels?
Professional learning communities ask
• What is it we want our students to know and be able to do?
•How will we know when they have learned it?
•What will we do when students are not learning?
Improving Student Achievement Is The Reason.
Why are educators so fired up about “data?”
Creating some common language about data in schools
What are the major systems?
How are they related?
What have districts done?
Where do we want to go?
4 Major Data & Technology Systems in Schools
• Student Information Systems
• Assessment Systems
• Data analysis systems
• Data warehouse
Data analysis process
From Matt Stein, Making Sense of the Data: Overview of the K-12 Data Management and Analysis Market, Eduventures, Inc., Nov. 2003.
What is a Student Information System?
• Registers new students
• Demographic information (address, emergency contacts, etc.)
• Attendance
• Scheduling of classes
• Achievement data
• Examples include: CIMS, Skyward, Chancery, Pentamation, Zangle, etc.
It is not keeping track of what is going on in classrooms.
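The kind of record an SIS maintains can be sketched as a simple data structure. Everything below is illustrative only, not any vendor's actual schema:

```python
# A minimal sketch of a Student Information System record; all field
# names and sample values here are hypothetical, not a vendor schema.
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    student_id: str
    name: str
    address: str
    emergency_contacts: list[str] = field(default_factory=list)
    schedule: list[str] = field(default_factory=list)         # class sections
    attendance: dict[str, str] = field(default_factory=dict)  # date -> status
    grades: dict[str, str] = field(default_factory=dict)      # course -> grade

# Registering a new student, scheduling a class, logging attendance
becky = StudentRecord("123", "Becky", "1 Main St")
becky.schedule.append("Math 4 - Section 2")
becky.attendance["2005-12-02"] = "present"
print(becky.attendance["2005-12-02"])  # present
```

Note that nothing in this record captures what happens instructionally in the classroom, which is the point of the slide above.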
What is an Assessment System?
• Tool for gathering achievement information
– Some deliver item banks
• Benchmark by NCS Pearson
• MAP by the Northwest Evaluation Association
– Some deliver intact tests
• Assess2Learn by Riverside
• EdVision by Scantron
• Homeroom by Princeton Review
– Most are web-based
It is assessing what is going on in classrooms.
Who needs what data?
• Administrators, public, legislators
– Evaluation
– Accountability
– Long range planning
• Teachers, parents, students
– Diagnosis
– Prescription
– Placement
– Short range planning
– Very specific achievement info
e.g., What percent met standards on 4th grade MEAP math? Are students doing better this year than they were doing last year?
e.g., Who understood this concept? Why is Becky having trouble reading?
A single assessment cannot meet all needs.
Large Grain Size Fine Grain Size
What is a “data analysis system?”
• The vendor maps your data to their system
• Predefines the kinds of analyses staff will do
• Allows user to create answers to questions
• Lots of nice graphs, lists, etc.

Examples: AMS by TurnLeaf, SAMS by Executive Intelligence, QSP, STARS by SchoolCity, Pinnacle by Excelsior, Inform by Pearson.

D’Tool and TestWiz are “sort of” data analysis systems. FileMaker lets districts invent their own system.
What is a data warehouse?
• It brings all the various sets of data together
– Financial data
– Personnel data
– Building infrastructure data
– Student demographic information
– Student program information
– Student achievement information
– Example: Center for Educational Performance and Information’s Michigan Education Information System
• (80% of the work is data cleansing.)
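The cleansing work that dominates warehouse-building is mostly normalizing keys so records from different systems can be joined. A minimal sketch, with hypothetical field names and sample records:

```python
# A sketch of the "data cleansing" step that dominates warehouse work:
# student records from different systems carry inconsistently formatted
# IDs that must be normalized before the sources can be joined.
# All field names and sample records are hypothetical.

def normalize_id(raw_id):
    """Strip whitespace and leading zeros so ' 00123' and '123' match."""
    return raw_id.strip().lstrip("0")

# Demographic records as exported from a Student Information System
sis_records = [
    {"student_id": "00123", "name": "Becky", "grade": 4},
    {"student_id": " 456 ", "name": "Juan",  "grade": 4},
]

# Achievement records as exported from an assessment system
test_records = [
    {"student_id": "123", "test": "MEAP Math", "met_standard": True},
    {"student_id": "456", "test": "MEAP Math", "met_standard": False},
]

# Build a lookup keyed on the cleaned ID, then join the two sources.
by_id = {normalize_id(r["student_id"]): r for r in sis_records}
joined = []
for t in test_records:
    demo = by_id.get(normalize_id(t["student_id"]))
    if demo is not None:  # unmatched records would be flagged for review
        joined.append({**demo, **t,
                       "student_id": normalize_id(t["student_id"])})

print(joined[0]["name"], joined[0]["met_standard"])  # Becky True
```

Multiply this by dozens of source systems, each with its own ID conventions, and the "80% of the work" figure becomes plausible.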
What’s in CEPI’s data warehouse?
• Single Record Student Database (SRSD)
• Registry of Educational Personnel (REP)
• Financial Information Database (FID)
• School Infrastructure Database (SID)
• Student Test and Achievement Repository (STAR)
• School Code Master
• MEAP, ACT, and SAT results
Why some things aren’t in a warehouse…
• Hoarding
• Stray or overlooked data
• Easier to ignore
• Not sure what it is or how to measure it
How are these things related?
You can have a Student Info System and nothing else.
You can have an assessment system and nothing else (but most assessment systems “depend” on data from the SIS).

There is no point in having a data analysis system unless you have data. If you have a SIS & an assessment system, you’ll probably want a data analysis system.

The State of Michigan is creating a data warehouse. A data analysis system could also use data from the warehouse. A data analysis system can bring the pieces together without a warehouse.
Oakland Schools Board of Education agreed to spend up to $1,600,000 in 2005-06 to make Pearson Benchmark “Lite” & Inform available to all districts.
What we are trying to do: Provide Technology that Will Help
• Improve teaching and increase learning for all
• Useful reports for teachers, principals and district administration
• Common assessments tied to GLCEs
• Item banks tied to GLCEs
• Multiple district on-ramps
Project Planning Process
• Fall 2003 – Meetings with focus groups
• Fall 2004 – Create RFP
• Oct 2004 – Meeting with Assessment, Curriculum and Technology directors from Oakland districts to discuss requirements
• Dec 2004 – RFP sent out to bid
• Jan 2005 – 10 responses received
• May 2005 – Committee selects products
• July 2005 – Oakland Schools BOE approval
Vendor Ratings: Oakland & LEA Members Only (N = 15)

[Chart: The SAS-DAT Team (Oakland only) rated Vendor A, Vendor B, and Pearson on each criterion below, using a five-point scale from Strongly Disagree to Strongly Agree. Items are arranged by “Importance” rating, from higher to lower:]
• Useful reports for teachers
• Useful longitudinal reports for administrators
• Comprehensive data analysis tool
• Useful reports for administrators
• Aligns to district curriculum
• Software is user friendly
• Easy to create & manage tests
• Useful item & test statistics for administrators
• Scanning & scoring is good
• Useful longitudinal student profile for teachers
• Allows creation of multiple item types
• Helps with collaboration on instruction
• Provides item bank
• SIS interface is good
• Training model is good
• Useful reports for students & parents
• Web-based testing is good
• Provides instructional resources
Measure, Manage and Maximize Student Achievement
Benchmark Test Results
By Test
• This view displays one or all tests that the selected student population has taken. Student scores are plotted across a proficiency scale.
• The view displays the percentage of students who scored within the range of each level on the proficiency scale.

Benchmark Test Results
By Standard
• This view displays each assessed standard and graphs the percentage of students who mastered and did not master the standard on each assessment.
• Selecting a single test displays detailed results by standard for that test.
• Selecting all tests displays student performance on the standards over time.

Benchmark Test Results
By Individual - View Mastery Details
• This view displays all mastery records for the given student, sorted by standard.
• This represents a detailed running record of a student’s mastery across all benchmark tests.

Benchmark Test Results
Item Analysis
• Click on the question number to see the question itself.
• Click on the icon next to the question number to see a breakdown of the item’s performance by demographic category.

Benchmark Test Results
Frequency Distribution
• This view plots a line-dot graph based on the test frequency distribution, and calculates the range, mean, standard deviation, and standard error.
• In addition to this baseline data, you can choose to plot up to four graphs for particular demographic groups.
• The sample displays the distribution of female scores compared to the overall baseline.
• The view also displays how the scores fall along the selected proficiency scale.
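The summary statistics named for the frequency-distribution view (range, mean, standard deviation, standard error) are straightforward to compute from a list of scores. A minimal sketch, not Pearson's implementation:

```python
import math

def score_statistics(scores):
    """Compute the summary statistics a frequency-distribution view
    reports: range, mean, standard deviation, and standard error."""
    n = len(scores)
    mean = sum(scores) / n
    # Sample standard deviation (n - 1 in the denominator)
    variance = sum((s - mean) ** 2 for s in scores) / (n - 1)
    std_dev = math.sqrt(variance)
    return {
        "range": max(scores) - min(scores),
        "mean": mean,
        "std_dev": std_dev,
        # Standard error of the mean: std dev divided by sqrt(n)
        "std_error": std_dev / math.sqrt(n),
    }

# Hypothetical benchmark-test scores for one class
stats = score_statistics([72, 85, 90, 66, 78, 95, 88, 70])
print(stats["mean"])   # 80.5
```

Comparing a demographic group against the baseline, as the view does, amounts to running this on the subgroup's scores and overlaying the two distributions.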
Pearson Benchmark
Benchmark Lite ends here
Pearson School Systems
*** School District Self-Guided Product Tour
Principal’s Dashboard
All users can run queries and reports(Teachers, principals, counselors, etc.)
All tests are also broken down by Concepts (“Strands”)
Parent’s / Student’s Dashboard
Oakland Schools Support
• Models defined to support diverse needs of districts and multiple on-ramps
• Monetary support – Oakland Schools resources aligned
• Curriculum, Item Banks, and Assessments delivered to all districts
Professional Development for LEAs
• Using data to inform instruction
• Using Benchmark & Inform for grouping and differentiation
• Using Benchmark with Common Assessments
• Using Benchmark for Classroom Assessments
• Administrator use of Inform
• SIP Planning using both products
Current Status
Pearson Benchmark & Inform
[Bar chart: number of districts at each rollout stage (Inform Data Validation, Inform, Benchmark Lite, Benchmark Full); the four bar values were 25, 20, 26, and 18 districts.]
As of November 21, 2005 (Completed, Scheduled, or Planned)
Early successes
Lake Orion High School
• 5 departments
• 14 courses
• 36 teachers (about 25%)
• 72 sections
• Over 2,200 scan sheets
Phase I (Sept-Nov)
• Meet individually with department heads
• Review exams with course teams
• Create answer keys
• Verify data
• Distribute results to participating teachers
• Review detailed results with participating teachers
• All-staff professional development (11-11-05)
Impact of Phase I
• Improved dialogue between participating teams
– Discussion and modification of course assessment schedule
– Question issues
– Assessment design
• Increased participation
• Improved teacher comfort level with common assessment procedures
Phase 2 (Nov-Jan)
• Try online testing
• Try using rubrics
• Additional course benchmarks
• Build new tests
• Identify & train department experts
Phase 3 (Jan-March)
• Initiate middle school implementation
– Benchmarks
– Create common assessments for core courses
– Collaborate with high school departments
– Coach high school teams
Phase 4 (March – August)
• Create and administer benchmark assessments in all high school courses
• Administer common assessments in middle schools
• Design/modify instructional practices based on data
Inform
• Create structure for naming/filing queries for
– Principals
– Teachers
• Create a consistent set of queries for each
• Teach all principals to run their own queries
• Get additional test data into Inform
“Favorite Queries/Reports” to Facilitate Initial Pearson Inform Training

Depending on an individual’s access permissions, “Favorite Queries” can be viewed at the district, school and class levels.