Suggested Citation: Russell, C., Ward, C., Harms, A., St. Martin, K., Cusumano, D., Fixsen, D., Levy, R., & LeVesseur, C. (2016). District Capacity Assessment Technical Manual. Chapel Hill, NC: National Implementation Research Network, University of North Carolina at Chapel Hill.
District Capacity Assessment (DCA) Technical Manual
Developed in collaboration between NIRN and MIBLSI
July 2016
Table of Contents

- Preface
  - Purpose of This Manual
  - Audience
- Overview of the District Capacity Assessment (DCA)
  - Description of the District Capacity Assessment
  - History of DCA
  - Role of Implementation Science within Education
  - Need for a Measure of District Capacity for Implementation
- Validation of Assessments
  - Approaches to Validity
  - Focus and Process of Current Validity Work
- Initial Development of the District Capacity Assessment
  - Construct Definitions
  - Items and Rubric
- Content Validation Process: 4-Part Survey Protocol
  - General Survey Development
  - Test Content Participants
  - Content Validation Survey Elements
- Test Content Survey Description
  - Consent and Edits
  - Item Analysis
  - DCA Construct
  - Sequencing, Format and Frequency
  - Analysis of Content Validity Survey Results/Decision Rules
- DCA Test Content Validation Results
  - Improvement Compared to Other Measures
  - Construct Definitions
  - Frequency of Assessment
  - Comprehensive and Clear Sections
  - Item Analysis
  - Item Match with Constructs
  - Sequencing of Items
- Response Process: Think Aloud Protocols
  - Response Process Overview
  - General Protocol Development
  - Response Process Participants
- Usability Testing: Continuous Improvement Process
  - Usability Testing Overview
  - Usability Testing Plan
  - Usability Testing Results and Modifications to the Measure
- Preliminary Reliability Results
  - Descriptive Statistics
  - Bivariate Correlations
  - Cronbach's Alpha Coefficients
  - Exploratory Factor Analysis
- Current and Future Uses of the District Capacity Assessment
  - Appropriate Use of the DCA
  - Future Validation of the DCA
- References
- Appendix A: Content Validation Surveys
- Appendix B: Think Aloud Protocol Guide
Preface
Purpose of This Manual

The purpose of the District Capacity Assessment (DCA) Technical Manual is to provide background information on the technical adequacy of the DCA (Ward et al., 2015). The current version draws on a rich history of previous work assessing district capacity. Notably, it includes significant modifications from earlier iterations, including revised items, a scoring rubric, and a glossary of terms. This version of the DCA was released in the spring of 2015 following a thorough development process and early validation work, resulting in a high-quality assessment of district capacity for implementation of effective innovations. Validity evidence collected during the assessment development process is rarely obtained, and when it is obtained it is seldom presented in detail (Carretero-Dios & Perez, 2007). This technical manual details the development process to date, the validity work that has been completed, the usability testing efforts accomplished, and an outline of next steps to continue the work toward a fully established assessment for local education agencies (LEAs).

Audience

This manual was written for state, regional, and local agencies that are considering or already using the DCA to assess district capacity for implementation of effective innovations. It can inform the selection process an agency may engage in when choosing an assessment of capacity. Additionally, DCA administrators, facilitators, and respondents may use this manual to deepen their background knowledge of the development and validation of the DCA.
Overview of the District Capacity Assessment (DCA)

Description of the District Capacity Assessment

The District Capacity Assessment (DCA) is a 26-item, team-based self-assessment developed to assist Local Education Agencies (e.g., school districts) in the implementation of effective innovations that benefit students (Ward et al., 2015). A District Implementation Team, inclusive of district leadership, uses the DCA to assist with the development of an action plan to improve capacity for implementation of an effective innovation (EI), to help monitor the action plan's effectiveness in improving overall capacity, and to support the development of a consistent district-wide structure for supporting initiatives and practices across schools. Additionally, the DCA can be used as part of a feedback structure to a state or regional education body to improve and focus the work of individuals who support districts.
The district team works through the items with a specific effective innovation in mind. An effective innovation is "anything that is new to a district and that is intended for use to improve effectiveness or efficiency. The innovation was developed based on the best available evidence (e.g., evaluation results, research findings)" (Ward et al., p. 29). Consequently, a team is able to use the DCA with any or all innovations occurring within the system, or to use the assessment with its most prominent initiative. The DCA is grounded in the understanding that districts must develop capacity in the Active Implementation Frameworks (Fixsen et al., 2005) to reach desired outcomes from an innovation. Ward et al. (2015) define district capacity as the development of "systems, activities, and resources that are necessary for schools to successfully adopt and sustain Effective Innovations" (p. 5). Key organizational activities required for strong implementation and sustainability of efforts are organized into three critical Implementation Drivers: Leadership, Competency, and Organization.
- Leadership: Active involvement in facilitating and sustaining systems change to support implementation of the effective innovation through strategic communication, decisions, guidance, and resource allocation. Leadership includes: Leadership and Planning.
- Competency: Strategies to develop, improve, and sustain educators' abilities to implement an effective innovation as intended in order to achieve desired outcomes. Competency Drivers include: Performance Assessment, Selection, Training, and Coaching.
- Organization: Strategies for analyzing, communicating, and responding to data in ways that result in continuous improvement of systems and supports for educators to implement an effective innovation. Organization Drivers include: Decision Support Data System, Facilitative Administration, and Systems Intervention.
The suggested schedule for conducting a DCA is twice a year, about every six months. An administration in the February/March timeframe has been found to be strategic in informing the district budgeting process for the upcoming school year, followed by a repeated administration
six months later. Throughout the administration of the 26-item self-assessment, a rubric is used to anchor current functioning with a score of 0, 1, or 2. The DCA requires specific roles, including a DCA Administrator, Facilitator, Note Taker, and Respondents. Preparation for the administration of the assessment includes committing the time for the DCA administration, identifying roles, and securing leadership support for the administration and for the use of the results in action planning (see the DCA Administration training course at http://implementation.fpg.unc.edu/resources/district-capacity-assessment-dca for more training on how to administer the DCA). During administration, the team uses a simultaneous and public voting process in which respondents simultaneously hold up either a finger or a response card to indicate their vote of 0, 1, or 2 for each item. Voting is guided by requirements included in the DCA scoring guide (rubric). The facilitator contributes to the process by providing contextualization for items and rubric requirements. While scoring is important, the discussions occurring throughout the administration process serve as critical links to action planning. Upon completion of the DCA, the team enters its results into a web-based application provided by SISEP.org, which stores data from all administrations. It is important to note that the DCA does not end when the last item is scored. Rather, the team then moves into developing an action plan that assigns activities to improve the district's capacity to support the identified EI.

History of DCA

The State Implementation and Scaling-up of Evidence-based Programs (SISEP) Center began in October 2007. One of its goals was to find or develop implementation capacity assessments. By June 2008, a review of available measures had been completed by Sandra Naoom and Michelle Duda (SISEP) and Amanda Fixsen (Portland State University graduate student). Content derived from this effort was then reviewed by Rob Horner (University of Oregon) and George Sugai (University of Connecticut), who were co-developers of SISEP. In December 2008, the first draft of State Capacity Assessment (SCA) items was circulated among SISEP Evaluation Advisory Board members David Mank, Mark Greenberg, and Mitchell Yell. In 2009, the SISEP staff began to develop items related to state capacity, regional capacity, and district capacity. By January 2010, a first draft of district capacity items was produced. In March 2010, Carol Sadler (State Transformation Specialist in Oregon; developer of Effective Behavior & Instructional Support Systems [EBISS] in the Tigard-Tualatin school district) conducted a crosswalk of the EBISS District System Support Plan (DSSP) assessment tool and the draft of SISEP district capacity items. This led to a broader discussion of related district assessment work taking place in Oregon. In late 2010, Rob Horner convened a group of individuals who had established and were using some form of an assessment of district capacity. The group included SISEP (Michelle Duda), EBISS (Carol Sadler and Erin Chaparro), Response to Intervention (David Putnam), and Positive Behavior Interventions and Supports (Rob Horner). Concept areas, the logical relationships
between items and concept areas, the wording of items, scaling rubrics, and so on were major topics for discussion in the periodic meetings that occurred over the next several months. Subsequently, another program, Michigan's Integrated Behavior and Learning Support Initiative (MIBLSI), led by Steve Goodman, created the District Multi-Tiered Systems of Support Capacity Assessment (DMCA) in 2011. The assessment was highly informed by the DCA but tailored for Michigan teams and specific to the effective innovation of Multi-Tiered Systems of Support (MTSS). In an effort to use feedback to shape the items, language, and format of the assessment, the DMCA was put through a content validation study and usability testing. In March 2012, the first complete draft of the SISEP District Capacity Assessment (DCA) was distributed to SISEP staff for its first use in districts in active scaling states. In June 2014, Rob Horner (University of Oregon) convened the SISEP team and MIBLSI team to refine, reconcile, and produce a nationally agreed-upon District Capacity Assessment, using quantitative and qualitative data collected on the usability of both the original DCA used by SISEP and the DMCA used by MIBLSI. As a result of the reconciliation, content validity, and usability testing processes described in this manual, the revised District Capacity Assessment (v6.0) was developed and is currently in use by SISEP active scaling states, National Technical Assistance Centers, and other SEA-funded implementation work.

Role of Implementation Science within Education

Increased attention is being paid to how innovations are implemented because students cannot benefit from educational practices they do not experience. While this seems obvious (and it is), education systems are working to develop the implementation capacity to help all teachers make good use of evidence-based practices that enhance the quality of education and outcomes for all students. Strong pressure to implement solutions to overcome challenges or problems in social systems such as education is not new; however, pressure to draw solutions from a growing portfolio of strategies with documented outcomes narrows the pool of innovations from which we can choose. In this quest to effect meaningful change in educational outcomes, we must direct our attention both to what effective innovations are selected and to how they are implemented. In short, efforts to improve socially significant outcomes for students and families require strong collaborative systems supporting the implementation of practices selected to address targeted challenges. "How" practices are implemented is as important as "what" strategies are sought to fix the problem. In 2005, the National Implementation Research Network (NIRN) released a monograph synthesizing implementation research findings across a range of fields. Based on these findings, NIRN developed five overarching frameworks referred to as the Active Implementation Frameworks. The Active Implementation Frameworks help define what needs to be done, how to establish what needs to be done, who will do the work and when, and how to establish a hospitable environment for the work to accomplish positive outcomes (Blase, Fixsen, Naoom & Wallace, 2005). The Active Implementation Frameworks (AIFs) are universal and apply to attempts to use any innovation. For more information and resources on the Active
Implementation Frameworks, visit the Active Implementation Hub: http://implementation.fpg.unc.edu/modules-and-lessons
Table 1. Active Implementation Frameworks

| Framework | Definition |
| --- | --- |
| Usable Innovations | To be usable, an innovation must not only demonstrate the feasibility of improving outcomes, but also must be well operationalized so that it is teachable, learnable, doable, and readily assessable. |
| Implementation Stages | Stages of implementation require thinking through the right activities for each stage to increase the likelihood of successful use of the AIFs and the practice. Stages are exploration, installation, initial implementation, and full implementation. |
| Implementation Drivers | Key components of the infrastructure and capacity that influence the successful use of an innovation. There are three driver domains: Competency (selection, training, coaching, fidelity), Organization (decision support data systems, facilitative administration, systems intervention), and Leadership (adaptive, technical). |
| Improvement Cycles | Iterative processes by which improvements are made and problems solved based on the Plan-Do-Study-Act cycle (3 types of cycles: rapid-cycle problem solving, usability testing, and practice-policy communication cycles). |
| Implementation Teams | Teams are accountable for planning and seeing the implementation process through to full implementation. |
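The DCA's item-level rubric described earlier (26 items, each scored 0, 1, or 2, with every item categorized under the Leadership, Competency, or Organization driver) lends itself to simple summaries by driver domain. The sketch below is purely illustrative: the item-to-driver mapping, the example scores, and the "percent in place" summary are assumptions for demonstration, not part of the published DCA or its SISEP.org reporting application.

```python
# Hypothetical sketch of tallying DCA-style team votes by Implementation
# Driver domain. The mapping and the percent summary are illustrative
# assumptions only, not the DCA's own reporting logic.
from collections import defaultdict

VALID_SCORES = {0, 1, 2}  # rubric anchors: Not / Partially / Fully in Place

def tally_by_driver(item_scores, item_to_driver):
    """Sum team-voted scores within each driver domain and report each
    domain as a percentage of the points possible (2 per item)."""
    totals = defaultdict(int)
    counts = defaultdict(int)
    for item, score in item_scores.items():
        if score not in VALID_SCORES:
            raise ValueError(f"item {item}: score must be 0, 1, or 2")
        driver = item_to_driver[item]
        totals[driver] += score
        counts[driver] += 1
    # Each item is worth at most 2 points, so percent = total / (2 * n).
    return {d: 100 * totals[d] / (2 * counts[d]) for d in totals}

# Example with a made-up 6-item subset of the 26 items:
mapping = {1: "Leadership", 2: "Leadership", 3: "Competency",
           4: "Competency", 5: "Organization", 6: "Organization"}
scores = {1: 2, 2: 1, 3: 0, 4: 2, 5: 1, 6: 1}
print(tally_by_driver(scores, mapping))
# Leadership 75.0, Competency 50.0, Organization 50.0
```

A summary like this could feed the action-planning discussion the manual emphasizes, by showing which driver domain has the least capacity in place.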
Developing the skills, knowledge, and abilities of Local Education Agencies (LEAs) to use the Active Implementation Frameworks is imperative for the sustained and effective use of evidence-based practices so that socially significant and meaningful outcomes are obtained. Without these cooperative and aligned supports, the result is often inequities in outcomes for staff and students (Skiba, Middelberg, & McClain, 2013; Fuchs & Deshler, 2007).

Need for a Measure of District Capacity for Implementation

Attempts to analyze components of implementation have taken several approaches, such as: very general measures that do not specifically address core implementation components (e.g., Landenberger & Lipsey, 2005; Mihalic & Irwin, 2003); measures specific to a given innovation that may lack generality across programs (e.g., Olds, Hill, O'Brien, Racine, & Moritz, 2003; Schoenwald, Sheidow, & Letourneau, 2004); or measures that only indirectly assess the influences of some of the core implementation components (e.g., Klein, Conn, Smith, Speer, & Sorra, 2001; Aarons, Cafri, Lugo, & Sawitzky, 2012). In order for LEAs to support schools to
successfully use and sustain the use of evidence-based practices, it is essential to have reliable and valid measures of implementation components. This information informs the district's planning for effective supports to school staff and assists in assessing progress toward implementation capacity. Additionally, these data can be used to conduct rigorous research on effective and efficient implementation supports. Despite these earlier efforts, the need remains for a measure that addresses core implementation components and is generalizable across innovations. In response to this void, a series of Implementation Capacity Assessments has been developed that spans the educational system from the State Education Agency (SEA) to the school level. These measures target "implementation capacity," with a focus on the systems, activities, and resources that are necessary to successfully adopt, use, and sustain effective innovations. Included in this series are the State Capacity Assessment (SCA), Regional Capacity Assessment (RCA), District Capacity Assessment (DCA), and Drivers Best Practices Assessment (DBPA). Importantly, these capacity assessments are "action assessments." That is, they promote actions to support implementation of best practices through the rich discussions that occur during the administration process.
Validation of Assessments

Approaches to Validity

Validity is considered the most important issue in assessment. Establishing validity significantly influences the accuracy of assessments and the ability of an assessor to assign meaning to their results (Popham, 2008). In education, assessments are routinely used within a cycle of school improvement. These data have the power to sway resource allocation and determine priorities for action planning within a district or school. In light of this, it is essential that assessments be developed in a technically sound manner with appropriate attention paid to psychometric properties such as reliability and validity. Evidence must show that the assessment captures what it was intended to measure and that the meaning and interpretation of test scores are consistent with each intended use. The American Psychological Association (APA) recommends the use of strong psychometric procedures in the design of assessments as a way to reduce or eliminate bias within the assessment (APA, 2010, p. 13). Historically, approaches to establishing validity have focused on three areas: 1) content validity, 2) criterion validity, and 3) construct validity. Typically each of these three areas is conceptualized in isolation and reported separately. While attending to validity in this way can lead to a better understanding of how well an assessment measures a construct, Messick (1995) proposed an alternative method in which validity is considered one large concept with a number of validity sub-areas that should be investigated to validate an assessment tool fully. The Standards for Educational and Psychological Testing (American Educational Research Association, 2014) reinforce Messick's alternative method, stating that best practice is to report findings as five sources of evidence to determine the overall validity of an assessment. These
five sources of validity are: 1) test content, 2) response process, 3) internal structure, 4) relationship to other variables, and 5) consequence of testing.

Table 2. Sources of Validity

| Source of Validity | Description | Example Methodologies |
| --- | --- | --- |
| Test Content | Instrument characteristics such as themes, wording, format of items, tasks, questions, instructions, guidelines, and procedures for administration and scoring | Basis for items/literature review; qualifications of authors and reviewers; item writing process; review by panel of experts; vetting and editing process |
| Response Process | Fit between the items and the process engaged in by individuals using the assessment | Think aloud protocols |
| Internal Structure | Analysis of patterns and trends among items that allow items to be reduced to larger constructs based on relationships between them | Factor analysis |
| Relationship to Other Variables | Relationship of test scores to variables external to the test | Relationship between a test score and an outcome: predictive evidence, concurrent evidence, convergent evidence, divergent evidence |
| Consequence of Testing | Intended and unintended consequences of test use | Purpose, use, and outcomes of test administration, including arguments for and against |
Technical adequacy in the area of validity relies on integrating multiple sources of evidence, but no source of evidence is considered inherently better. It is the relevance and quality of the evidence that matters. While reporting multiple sources of validity is the expectation for an assessment to be considered valid, gathering evidence across all five areas is a lengthy process, not a discrete activity. Therefore, validity evolves over time as additional sources of evidence become relevant to collect and report at varying stages of the assessment development and use process.
A critical first step in instrument development is gathering evidence of how soundly the test content measures the construct. This should be gathered as a part of the test development process. Gathering evidence of test content establishes the appropriateness of the conceptual framework and how well items represent the construct (Sireci & Faulkner-Bond, 2014). Test content validity is considered an important feature when developing an instrument because it represents the extent to which items adequately sample the construct (Gable & Wolf, 1993; Beck & Gable, 2001). Other forms of validity and reliability do not carry as much weight without first establishing strong test content validity. Scales and items that are poorly developed can leave an assessment biased, flawed, or otherwise unable to elicit the quality responses needed for a sound measure of the construct at hand. Consequently, the quality of the construction of the DCA hinges on important content factors, such as how well the instructions are written, how clearly items are phrased, and the format and appropriateness of the scale that is used.

Following test content evaluation and subsequent editing of the assessment based on the results, it is beneficial to ensure that participants interpret the instrument as expected. Evidence of the response process is determined by the extent to which participant responses are aligned with the intended interpretation of scores (Smith & Smith, 2007). The purpose is to observe participant performance strategies and responses, such as how participants approach and analyze particular items. This enables investigators to rethink or reformat items that have been misinterpreted and to remove any items that do not represent the construct (Standards, 2014, p. 12).

Focus and Process of Current Validity Work

Test content and response process elements of validity, along with usability testing, were addressed through a multi-phase, multi-method approach collecting both qualitative and quantitative responses. Following the initial development of the DCA items and scoring rubric in the fall of 2014, a 4-part survey was developed in November 2014 to collect feedback from experts and practitioners regarding the assessment. In December 2014, think aloud protocols were completed, and through the winter of 2015 usability testing was completed. Following each phase of work with the DCA, the assessment was refined based on the feedback and information gathered. The results of each stage and the modifications made are discussed in further sections of this technical report.
Figure 1. Phases of Initial Development, Validity, and Usability

- Construct Definition, Item Generation: use of previous capacity assessments; feedback from administrations of previous assessments; advancements in implementation science and implementation capacity
- Test Content (4-Part Survey Process): Consent and Edits (including track changes); Item Analysis; Construct; Sequencing, Frequency and Format
- Response Process: Think Aloud Protocol
- Usability Testing and Refinement: improvement cycle based on Plan-Do-Study-Act

Initial Development of the District Capacity Assessment

Construct Definitions

The DCA is designed to measure practices as operationalized within the Active Implementation Frameworks, meaning that the school district works to provide specific supports for a program both to benefit end users and to sustain practices over time. The supports assessed are those discussed within implementation science research. Terminology such as effective innovation, capacity for implementation, and implementation drivers are all concepts embedded within the assessment and must be well understood by those interacting with the tool. Martinez, Lewis and Weiner (2014) point out that current language and definitions used within implementation
science are not consistent, leading to variance in how constructs are described within research articles and instruments assessing implementation. In an area of research, such as implementation science, where the relative immaturity of the content area leads to variance in the use of critical terms, it is essential that the constructs presented are well stated and visible within the assessment. Constructs used within the DCA were defined to align with the National Implementation Research Network's (NIRN) Active Implementation Frameworks and definitions (see Table 1).

Items and Rubric

Items from previous versions of the DCA and the DMCA were used to set the stage for initial item generation. Factors influencing item generation included: a process of comparing similar items between assessments; careful consideration of items found on only one assessment; use of feedback collected from administrators and facilitators of previous DCA administrations; and recent advances in the field of implementation science. Items included in previous measures were deleted when deemed inappropriate or ineffective, and new items were created to fill gaps within the assessment. Careful consideration was given to features outlined by Haynes (1995) and DeVellis (2012), with attention to how well each item reflected the scale's purpose, decreasing redundancy within the assessment, reading difficulty level, length of an item, and avoiding multiple negatives, double-barreled items, confusing language, and negatively versus positively stated wording. Item generation concluded with 28 items in the assessment, including a scoring rubric for each item reflecting "Fully in Place," "Partially in Place," or "Not in Place." Each item was categorized within one of the Implementation Drivers of Leadership, Organization, and Competency. Accompanying introductory sections, instructions, and tools for administration and scoring were developed to support the appropriate use of the tool. The sections include: Introduction and Purpose, DCA Administration Fidelity Checklist, DCA Scoring Form, Action Planning, and Glossary.
Content Validation Process: 4-Part Survey Protocol

General Survey Development

A content validity survey was developed to gather feedback on four components: the importance/relevance of each DCA item; the attainability of each item; definitions of terms and constructs; and sequencing, frequency, and format. Designing the survey in four components provided shorter and more manageable segments of work for participants, which minimized the risk of participant fatigue. Separation of critical aspects of the validation process also aided in the analysis of results.

Test Content Participants

The number of participants suggested for a content validation survey varies from 2-20 (Gable & Wolf, 1994; Grant & Davis, 1997; Lynn, 1986; Tilden, Nelson, & May, 1990; Waltz, Strickland, & Lenz, 1991). What is important is that the end group of participants is representative of the range of experience, background, and expertise that is desired for a full review of the assessment. The DCA content validity survey results included feedback from 34 participants. Initially, 56 individuals received the request for survey participation, resulting in a 57% response rate. Individuals approached to participate met one of the following criteria:
1. A researcher with at least one publication in the area of implementation science;
2. A staff member with NIRN who provided national technical assistance related to implementation science;
3. Staff from Michigan's Integrated Behavior Learning Support Initiative (MIBLSI) or partners of NIRN (e.g., staff from different partnering states and districts) who provided technical assistance to the implementation of effective innovations at the state or regional levels; or
4. School district practitioners directly involved in the training and/or coaching structure for district implementation teams (DITs) within a SISEP active state or a partnering MIBLSI district.
Table 3. Test Content Validation Participants
• Research/National Technical Assistance Providers: 4
• State/Regional Technical Assistance Providers: 19
• District Practitioners: 11
• Total: 34
A large response pool was desired to serve multiple functions. Putting the DCA in front of individuals who had used or administered previous versions of the DCA gave those who had been involved with early capacity assessment work an opportunity to build an understanding of the proposed changes. The DCA developers also valued concurrent feedback from those who had not previously interacted with an assessment of this type. Additionally, the larger participant group allowed input from those who facilitate, train, and support district implementation teams to be examined along with input from practitioners on district teams who serve in a broad range of educational roles.

Participants who were not a part of MIBLSI or NIRN staff were offered a stipend of $200 for their participation in the content validation process if this task fell outside of the scope of their work responsibilities or if they needed to allocate time outside of their normal work schedule. Compensation was provided to their employer if survey completion fell within the scope of their work responsibilities, occurred during work hours, and participants had permission from their employer. Alternatively, a participant could offer his or her services in kind with no payment exchanged. Involvement as a reviewer provided participants with a unique opportunity to preview the DCA, shape the next version of the assessment, and be recognized as DCA contributors.

Content Validation Survey Elements

An array of questions can be asked to elicit feedback from participants within a content validation survey. The most consistently addressed portion of a content validation survey is the rating of items in areas such as relevance and clarity. Haynes, Richard, and Kubany (1995) suggest including all sections of the assessment within content validation. This includes: instructions, response formats and response scales, and relevance and representativeness, along with probing respondents to share what inferences they believe can be drawn from the information gathered after the assessment has been completed. As an additional support, Haynes et al. outline a number of elements that may be relevant for a content validation survey. It is stated that not all questions may be relevant for all assessments, but that intentional consideration of the suggested elements should help inform the development of a content validation survey. Table 4 outlines the test content elements suggested by Haynes et al., whether each element was considered appropriate for the content surveys related to the DCA, and the survey in which included elements are addressed. Table 5 lists each survey and the components included within that specific survey. A copy of each survey is included in Appendix A.
Table 4. Test Content Elements

Elements suggested by Haynes and included in the test content validation surveys (Consent and Edits; Item Analysis; DCA Constructs; Sequencing, Frequency, Format):
• Precision of wording or definition of individual items
• Item response form (e.g., scale)
• Temporal parameters of responses
• Instructions to participants
• Method and standardization of administration
• Components of an aggregate, factor, response class
• Definition of domain and construct
• Sequence of items or stimuli
• Function-instrument match

Elements suggested by Haynes but not considered necessary for the surveys:
• Array of items selected (questions, codes, measures)
• Situations sampled
• Behavior or events sampled
• Scoring, data reduction, item weighting
• Method-mode match
Table 5. Test Content Validity Survey Components

Consent and Edits
• Consent form
• Opt in/out of listing as a DCA contributor
• Downloadable PDF of DCA
• Upload DCA with edits, suggestions, and questions provided through track changes

DCA Item Analysis
• Attainability of each DCA item rated on a 3-point scale
• Importance of each DCA item rated on a 3-point scale
• Opportunity to select the 5 most critical DCA items

DCA Constructs
• Comprehensiveness of each DCA construct definition rated on a 3-point scale
• Clarity of each DCA construct definition rated on a 3-point scale
• Open-ended comments on construct definitions
• Indication of the best fit for each DCA item within a subscale

Sequencing, Frequency, Format
• Suggestions for reordering DCA items
• Frequency the DCA should be administered
• Comprehensiveness of each DCA section rated on a 3-point scale
• Clarity of each DCA section rated on a 3-point scale
• Open-ended comments on sections of the DCA
• If the participant had experience administering a previous version of the DCA or another capacity assessment, asked: 1) whether the current version of the DCA is an improvement over previous versions and 2) to give input on what benefits have been experienced from using the DCA or DMCA in the past
Each segment of the content validation survey began with a welcome statement and a short video outlining how to interact with that specific segment of the survey (video length ranged from approximately 1.20 minutes to 3 minutes). At the conclusion of each survey segment, a question was posed asking participants to report how long (in minutes) it took to complete the survey, along with a thank-you page containing the link to the next segment of the survey. Table 6 shares the average duration for each segment of the survey along with the average total completion time for all four surveys. Within the first survey, participants were asked first to read the current version of the DCA and make track changes within a Word document denoting questions, suggestions for rewording, re-ordering, etc. Following the initial read and track changes, participants completed the remaining three sections of the 4-part survey. A 3-point Likert scale response with anchors of "Very," "Somewhat," and "Not at All" (coded as 3, 2, or 1, respectively, for analysis purposes) was provided as the response option set for appropriate items on the content validity survey.
Table 6. Minutes Spent Completing Test Content Validation Survey
• Consent and Edits: average 89, range 23–200
• Item Analysis: average 26, range 10–60
• Construct: average 23, range 5–75
• Sequencing, Frequency and Format: average 20, range 6–60
• Total: average 157, range 40–275
Test Content Survey Description

Consent and Edits

A unique aspect of the DCA content validity survey included an opportunity to provide feedback, questions, and comments directly within the DCA assessment itself. After watching a video that provided training in how to utilize the review/track changes options in Microsoft Word, participants were directed to download a Word version of the DCA and then provide their feedback and suggestions directly in the downloaded DCA. Once the review was completed, they were asked to upload their comments and track changes of the DCA to the electronic content validation survey so that the DCA developers had access to their comments and track changes. This review activity enabled participants to focus and provide their feedback on certain sections, items, or even specific sentences within the DCA. The goal of this atypical process was to have participants working within the DCA as they were providing feedback, rather than solely collecting feedback within an online survey. Participants were asked to pay particular attention to the scoring rubric, as having a rubric was a new addition to the DCA. The emphasis on gathering feedback in this detailed way right within the document provided DCA developers with a significant quantity of comments, questions, and edits through track changes to assist with improving clarity and improving the quality of the detailed scoring rubric. The developers also hypothesized that the process of providing comments and track changes directly in the DCA would result in participants having greater familiarity with the DCA, thus improving the quality of their feedback in subsequent segments of the content validation survey.

Item Analysis

Following the completion of the comments and track changes edits to the DCA, participants were next directed to a survey where they had an opportunity to rate the attainability and importance of each item included within the DCA. A three-point Likert scale was provided to quantify this information. Additionally, after rating the items, participants were asked to select what they believed to be the top five most critical items included in the DCA. The purpose of this step was to help further discern which items participants viewed as critically important.

DCA Constructs

To assess the clarity and comprehensiveness of the specific definitions used for constructs included within the DCA, each construct definition was presented with an opportunity for participants to give feedback through both quantitative and qualitative means. Specifically, participants were asked to rate on a 3-point scale the comprehensiveness as well as the level of clarity of each construct definition. Finally, comments were collected about the definitions provided. Content validation also offers an opportunity to gather information from respondents on whether there is a strong match between the item and the construct under which it is clustered. Participants were asked to indicate, for each item within the DCA, which Implementation Driver (Leadership, Organization, or Competency) the item best fits within, based on the definition of the driver. This information was used to assist in the final mapping of DCA items into Implementation Drivers, or domains, within the assessment.

Sequencing, Format, and Frequency

Considering the flow of moving from item to item, participants were asked if they had any suggestions for changes to the order of the items. If yes, they were asked to provide their sequencing suggestions as well as any other comments in an open-ended format. Frequency of administration also was addressed through a series of questions related to how often the DCA should be used by a district team, the latency of time in the cycle from action to improvement, and the frequency with which this type of assessment would assist with problem solving. As with most other items included in this survey, open-ended responses were also available for items related to frequency.

In an effort to dive deeply into the quality of the sections within the DCA, including the Introduction and Purpose, Administration and Fidelity Checklist, DCA Scoring Form, Scoring Guide, Action Planning, and Glossary, questions about comprehensiveness and clarity were included. Respondents again used a 3-point Likert scale to respond and had space for open-ended comments.

The content validation survey concluded with a brief question specifically for those who had experience with a previous version of the DCA or the DMCA. If a participant had experience with these tools, he or she was asked two open-ended questions. These items asked about perceived benefits of the revised DCA in comparison to previous administrations. Participants then rated, on a sliding scale from 0-10 with 0 representing "Not an Improvement" and 10 indicating "Significant Improvement," whether the current version of the DCA was an improvement over the assessment with which they had previously worked.
Analysis of Content Validity Survey Results / Decision Rules

A variety of quantitative and qualitative responses were elicited throughout the test content survey. Quantitative and qualitative responses were organized together across areas of the DCA in an effort to triangulate data and enhance decision making. Qualitative responses from the open-ended questions within the survey were combined with comments, edits, questions, and suggestions from the track changes documents provided by each participant. Prior to analysis of the DCA results, decision rules were developed and agreed upon by the DCA developers. Decision rules support an unbiased use of results. All comments, edits, questions, and suggestions from survey results and the track changes documents were read and considered by developers. However, the level of editing and changes that were employed was mediated by quantitative results. Items using a Likert rating, such as a 3- or 10-point scale, were analyzed using a content validity index (CVI) score for each item. A CVI of at least 0.80 is considered to be a good criterion for accepting an item as valid (Davis, 1992). Other qualitative data were reported by the number of participants responding a particular way, with predetermined cut scores set for analysis.
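Because the decision rules described above hinge on these quantitative summaries, the following minimal Python sketch illustrates one common way such values can be computed. This is our illustration, not the authors' analysis code: the ratings are hypothetical, and the choice to count ratings of 2 ("Somewhat") or 3 ("Very") as favorable for the CVI proportion is an assumption for demonstration.

```python
# Illustrative sketch only: a CVI-style proportion and the average rating
# to which the manual's later cut points (e.g., 2.5 on the 3-point scale)
# are applied. Ratings use the manual's coding: Very = 3, Somewhat = 2,
# Not at All = 1.

def item_cvi(ratings):
    """Proportion of raters giving a favorable rating (assumed here: 2 or 3).

    Davis (1992) treats a CVI of at least 0.80 as a good criterion for
    accepting an item as valid.
    """
    favorable = sum(1 for r in ratings if r >= 2)
    return favorable / len(ratings)

def average_rating(ratings):
    """Mean rating across raters."""
    return sum(ratings) / len(ratings)

# Hypothetical ratings from 10 reviewers for a single item.
ratings = [3, 3, 2, 3, 2, 3, 3, 2, 3, 3]
cvi = item_cvi(ratings)        # 1.0: meets the 0.80 criterion
avg = average_rating(ratings)  # 2.7: above a 2.5 cut point
```

A design note: a proportion-based CVI and an average rating answer slightly different questions, which is one reason a developer team would fix decision rules, as the DCA developers did, before looking at the data.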
DCA Test Content Validation Results

Improvement Compared to Other Measures

Seventy-six percent (n=16) of respondents to the DCA test content validation survey had previously completed a similar assessment, such as the DCA or DMCA. On a response scale of 0-10, the new version of the DCA was given an average improvement rating of eight. Comments provided indicated that the new version was simplified, had shorter and more concise items, and was improved by the addition of a scoring rubric. This information was used to confirm the continued work on the new version of the DCA. In fact, a high number of responses not only endorsed this version as an improvement over previous measures, they also endorsed this tool as positively impacting the assessment of implementation capacity.

Construct Definitions

A major task within a content validation study is to determine whether those working within the field accept and support the definitions provided within the assessment. How comprehensive and clear the foundational constructs are affects whether respondents will begin an assessment with a common language and understanding of the foundations of the assessment. Due to the complexity of the DCA constructs and the infancy of the study of implementation science, it was essential to gather feedback on how well the construct definitions were developed. Within the survey, construct definitions for capacity, leadership, organizational environment, and competency were provided. Respondents reviewed definitions within their downloaded copy of the DCA by providing comments and/or track changes in the Consent and Edits portion of the survey. Additionally, within the DCA Constructs segment, respondents were asked to rate the comprehensiveness and clarity of each definition. Definitions that received an average rating below 2.5 would be revised. Quantitative survey results indicated no significant revisions were needed for the capacity and competency definitions. Comments provided within the DCA Constructs survey and track changes were considered; however, few to no changes were made to these definitions. Leadership and Organizational Environment did not fare as well on the quantitative feedback. Due to this finding, comments provided were used to rewrite both definitions.
Table 7. Construct Definition Decision Rules and Results
• Capacity: met or exceeded comprehensiveness threshold (Yes); met or exceeded clarity threshold (Yes); no revisions
• Competency: comprehensiveness (Yes); clarity (Yes); no revisions
• Leadership: comprehensiveness (Yes); clarity (No); rewrote definition
• Organization: comprehensiveness (No); clarity (No); rewrote definition

Decision cut point: average rating of less than 2.5 for comprehensiveness or clarity
Decision rule: revise definition based on comments

Frequency of Assessment

An important question in determining the suggested schedule and use of the assessment was addressed in items asking about the frequency with which district teams should use the assessment. Respondents were asked not only about how often to administer the assessment but also about the duration between assessments. In other words, respondents were asked about the time needed for a team to see growth/change in their DCA results along with the timing of teams using the DCA results to inform district-level action planning. The decision cut point was set at more than 70% of respondents suggesting one option for frequency.

Results for this section of the content validation survey were inconclusive; the decision rule cut point was not met. For all questions related to frequency, most results were split between annual assessment and bi-annual assessment of the DCA. While the results were similar between the two options, more respondents shared that a twice-a-year assessment schedule would be most beneficial to teams. Comments also indicated that a more rigorous schedule was warranted for such a difficult area of work for teams. Reviewers felt that directing teams to return to the assessment on a regular basis would help keep the teams focused and moving forward on tasks related to capacity for support of effective innovations. The majority recommendation to assess twice annually was adopted. Reviewers did point out that less frequent assessment may be appropriate at later stages of implementation and/or once a threshold had been met. The DCA development team determined that this feedback could be further considered after additional use of the assessment, usability testing, and when a significant number of teams interacting with the assessment have reached further stages of implementation coupled with above-threshold results on the DCA.
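The frequency decision rule can be sketched as a simple tally. The following Python fragment is our illustration, not the authors' code; the response list is hypothetical, constructed to mirror the reported split (roughly 50% bi-annually, 38% annually, 9% quarterly, 3% other).

```python
# Illustrative sketch: applying the frequency decision rule, which required
# more than 70% of respondents to converge on one option before it could be
# adopted outright.
from collections import Counter

def frequency_decision(responses, cut_point=0.70):
    """Return (top_option, share, rule_met) for a list of frequency choices."""
    counts = Counter(responses)
    option, n = counts.most_common(1)[0]
    share = n / len(responses)
    return option, share, share > cut_point

# Hypothetical distribution for 34 respondents.
responses = (["Bi-Annually"] * 17 + ["Annually"] * 13 +
             ["Quarterly"] * 3 + ["Other"] * 1)
option, share, met = frequency_decision(responses)
# Here share is 0.5, so the 70% rule is not met and the developers would
# fall back on the majority response, as the manual describes.
```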
Table 8. Frequency of DCA Administration Decision Rules and Results

Percent of respondents suggesting each frequency, and the decision to suggest that frequency:
• Monthly: 0%; No
• Quarterly: 9%; No
• Bi-Annually: 50%; Yes (majority response)
• Annually: 38%; No
• Every 2 Years: 0%; No
• Other: 3%; No

Decision cut point: more than 70% of respondents suggest one option for frequency
Decision rule: use the recommendation as the suggested frequency

Comprehensive and Clear Sections

The DCA provides a wealth of information and support through the inclusion of sections such as the Introduction and Purpose, Scoring Guide, Fidelity Assessment, Glossary, etc. These sections help to ensure a common understanding of why the assessment is completed and provide clear directions on how to complete the administration and scoring process in a standardized way. To ensure that these sections were well developed, survey participants were asked to read through the sections and provide edits, comments, and questions within the track changes document. Respondents also shared feedback regarding the comprehensiveness and clarity of these sections. All sections included in the DCA met the threshold for comprehensive and clear format and language. DCA developers continued to analyze and attend to the track changes edits to ensure that the highest quality discussion was provided in each section. Significant rewriting was not needed, but based on suggestions, small edits to the content and format did occur.

Table 9. Section Comprehensiveness and Clarity Decision Rules and Results
• Introduction and Purpose: average comprehensiveness 2.9; average clarity 2.8; no changes
• Administration and Fidelity Checklist: average comprehensiveness 3.0; average clarity 2.8; no changes
• DCA Scoring Form: average comprehensiveness 3.0; average clarity 2.9; no changes
• Scoring Guide: average comprehensiveness 3.0; average clarity 2.8; no changes
• Action Planning: average comprehensiveness 2.7; average clarity 2.6; no changes
• Glossary: average comprehensiveness 2.8; average clarity 2.8; no changes

Decision cut point: average rating of less than 2.5 for comprehensiveness or clarity
Decision rule: revise sections based on comments
Item Analysis

The hallmark of content validation is to ensure comprehensive and clear items, and the item analysis portion of the content validation process is the most time consuming and most important aspect. Within the validation process of the DCA, items and the item detail included in the scoring rubric were analyzed together to ensure each met high quality standards. When analyzing the data compiled for each item, developers first considered item ratings on importance and how many reviewers rated the item as one of the top five most important items within the DCA. This information was used initially to determine whether the item would need significant rewriting or only small edits based on suggestions. If an item met the Content Validity Index (CVI) criterion as an important item, DCA developers kept the item and only used comments and edits from the track changes document as a guide for identifying small edits, such as spelling, grammar, or word order, which ultimately led to enhancing the item. If an item was rated low on importance, reviewers considered whether the item was necessary and, if so, used feedback from the track changes document to rewrite the item. Information from the attainability rating gave insight into which items reviewers considered difficult for districts to attain. This information was used by the developers to prioritize resources to assist districts in their efforts to develop capacity.

At the conclusion of the item analysis, the DCA developers combined two of the items and deleted one item. Edits to each item were made, including edits to how the item was defined in the scoring guide. The edits made were based on reviewer feedback that was provided through the track changes within the assessment tool. The results of the survey and track changes feedback related to item analysis are organized in Table 10. It is noteworthy that the number of comments, questions, and edits received for an item did not depend on the importance rating; feedback was offered related to improving the quality of the item, and therefore the number of suggested edits fluctuated independently of the other ratings. Another key discovery was that the majority of lower ratings for attainability were for items related to the Competency Driver. This clustering of items reviewers considered difficult for districts to attain facilitated action planning for developing future resources.
Table 10. Item Analysis Decision Rules and Results

Item   Average Importance Rating          Average Attainability Rating   Times Rated a Top 5 Most Important Item
1      3.00                               2.97                           19
2      2.92                               2.86                           18
3      2.92                               2.88                           4
4      2.94                               2.85                           12
5      2.60                               2.52                           4
6      2.75                               2.67                           2
7      2.83                               2.31*                          7
8      2.94                               2.92                           14
9      2.86                               2.83                           9
10     2.69                               2.50                           7
11     2.50                               2.50                           0
12     2.67                               2.39*                          8
13     3.00                               2.83                           5
14     3.00                               2.81                           25
15     2.75                               2.78                           2
16     2.92                               2.75                           2
17     2.97                               2.78                           15
18     2.89                               2.75                           3
19     3.00                               2.83                           8
20     2.64                               2.25*                          0
21     2.42* (deleted)                    2.11*                          0
22     2.81                               2.39*                          2
23     2.56* (combined with item #20)     2.17*                          0
24     2.81                               2.47*                          1
25     2.69                               2.39*                          4
26     2.83                               2.33*                          4
27     2.75                               2.22*                          1
28     2.86                               2.44*                          4

Decision cut point (importance): Content Validity Index (CVI) below 2.5
Decision rule: *eliminate or substantially change the item

Decision cut point (attainability): CVI below 2.5
Decision rule: *develop an action plan to create resources to assist teams with action planning and attaining the item

The number of times an item was rated as a top 5 most important item was used to further validate the CVI rating.
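The two cut points above can be expressed as a small classifier. The sketch below is our illustration of the Table 10 logic, not the authors' code; the action strings and the helper name are hypothetical.

```python
# Illustrative sketch: applying the Table 10 decision rules to one item.
# Importance below the 2.5 cut point flags elimination or substantial
# rewriting; attainability below 2.5 flags resource development.
CUT_POINT = 2.5

def apply_decision_rules(importance, attainability):
    """Return the actions triggered for one DCA item."""
    actions = []
    if importance < CUT_POINT:
        actions.append("eliminate or substantially change the item")
    if attainability < CUT_POINT:
        actions.append("develop resources to assist teams with attainment")
    return actions or ["keep; small edits from track-changes feedback only"]

# Item 21 from Table 10 (2.42, 2.11) triggers both rules; item 1 (3.00, 2.97)
# triggers neither and is kept with only minor edits.
flagged = apply_decision_rules(2.42, 2.11)
kept = apply_decision_rules(3.00, 2.97)
```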
Item Match with Constructs

Matching items to the constructs of the assessment helps to ensure that all constructs are a match to the overall assessment. Construct matching can assist in setting up item mapping to subscales of the assessment. In the case of the DCA, reviewers were not consistently matching items to the constructs identified by the authors. This may have been mediated by the variety of definitions used within the field of implementation science and the novelty of the concepts to some practitioners. For these reasons, the DCA development team determined that the developers who were most knowledgeable about the implementation science constructs would map the items, using the comments provided by reviewers as an additional guide/resource.

Table 11. Item Match with Constructs Decision Rules and Results
• Greater than 70% of respondents aligned the item to the same construct (3 items): 2, 13, 22
• 50-70% of respondents aligned the item to the same construct (20 items): 1, 3, 4, 5, 6, 7, 8, 9, 10, 12, 14, 15, 16, 17, 18, 19, 20, 21, 24, 25, 26
• Less than 50% of respondents aligned the item to the same construct (3 items): 11, 23, 27

Decision cut point: less than 70% of respondents align an item to the same construct
Decision rule: authors will use results, comments, and personal knowledge of the constructs to map an item to a construct

Sequencing of Items

The large majority of reviewers reported that the current order of the items was sufficient. Due to this feedback, only minor edits were made, and these were based on the edits made during the item analysis and/or a comment made by a reviewer that did suggest an item change. Overall, the order of the items seemed appropriate for the tool and was adequately sequenced for district teams to complete the assessment. The decision rule for reordering items was to move an item if more than 50% of respondents suggested moving it; 77% of reviewers suggested no reordering of items. A few items were reordered based on reviewer comments and due to edits to the assessment items.
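The agreement measure behind Table 11 can be sketched as the share of reviewers choosing the modal driver for an item. This Python fragment is our illustration with hypothetical reviewer assignments, not the authors' analysis code.

```python
# Illustrative sketch: item-construct agreement as in Table 11. An item whose
# modal-driver share falls below the 70% cut point is mapped by the authors
# rather than by reviewer consensus.
from collections import Counter

def construct_agreement(assignments):
    """Share of reviewers who chose the most common driver for an item."""
    _, n = Counter(assignments).most_common(1)[0]
    return n / len(assignments)

# Hypothetical assignments for one item across the three Implementation Drivers.
assignments = ["Competency"] * 8 + ["Organization"] * 2
share = construct_agreement(assignments)   # 0.8
author_maps_item = share < 0.70            # False: reviewer consensus stands
```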
Response Process: Think Aloud Protocols

Response Process Overview

Response process is used as a part of the validation process to collect further evidence of the alignment between assessment purpose and directions and the resulting thinking and acting by those using the assessment. Alignment between participant responses and the intended interpretation of the assessment is evaluated (Smith & Smith, 2007). While response process is not historically a widely used source of validity, it is highlighted as a critical element of validation within the Standards for Educational and Psychological Testing (American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education, 2014). To gather these data, participants are expected to verbally report their thoughts during a section-by-section and item-by-item walkthrough of the assessment. The objective is to capture participants' cognitions, performance strategies, thoughts, feelings, beliefs, and experiences as they respond to assessment items. Johnstone, Bottsford-Miller, and Thompson (2006) point to this process as a way to gather valuable information around potential assessment design problems that may lead to inconsistency in how items or assessment directions are interpreted. This step enables assessment developers to rethink or reformat items that have the potential to be misinterpreted, which increases the internal validity of the assessment.

General Protocol Development

To more deeply understand a participant's thoughts while working through the response process, cognitive interviews, otherwise known as think aloud protocols, can be used. While working through a think aloud protocol, participants share aloud what they are thinking, doing, and feeling as they engage with an assessment. In an effort to standardize the think aloud process and add ease to the collection of feedback, a Think-Aloud Protocol Guide (TAP Guide) was developed to collect this information while participants progressed through the DCA. The TAP Guide is intended to be an efficient strategy for gathering evidence of validity. The TAP Guide is a script used during a think aloud containing several best practices promoted by researchers who have developed methods to standardize the observation and recording of verbal reporting data (Conrad, Blair, & Tracy, 1999; Conrad & Blair, 1996; Willis, 1999). The TAP Guide is included in Appendix B.

The TAP Guide includes scripted instructions, a practice phase, and clear instructions on how someone administering the think aloud protocol should respond to participant input. The TAP Guide begins by briefly explaining the purpose of the response process. It then describes the work the reviewer will do while reading the assessment aloud: reviewers voice everything that comes to mind as they verbally answer each of the items. To acclimate reviewers to voicing aloud what comes to mind as they complete the assessment, a practice phase is conducted. During the think aloud protocol, DCA content developers collected qualitative data in real time, along with occasionally probing reviewers to encourage further dialogue about what came to their mind as they read the assessment aloud. At the conclusion of the think aloud protocol, follow-up questions are used as an opportunity to address reviewer questions that arose during the protocol, ask clarifying questions regarding specific items and directions, and summarize reviewers' general impressions of the assessment.
Response Process Participants

Willis (1999) suggests that recruitment of participants should emphasize diversification based on characteristics of interest that will support a variety of viewpoints providing feedback on the assessment. Large sample sizes are not required because the purpose is not statistical estimation but rather qualitative analysis. Within individual interviewing procedures, Virzi (1992) recommends the use of four or five participants, which has been shown to adequately uncover 80% of the construct-irrelevant variance. For this aspect of the validation process, four participants were identified. Efforts were made to select individuals who have either differing roles in supporting district implementation or various levels of experience in using previous iterations of district capacity assessments. Roles included an administrator, a school psychologist, and an MTSS Coordinator, representing a variety of experiences with implementation work. Within the reviewer group, two individuals had worked closely with MIBLSI to implement an effective innovation and two others had more ancillary participation in the MIBLSI work.

Each participant provided feedback through a one-on-one meeting with one of the DCA developers using the described TAP Guide. One participant reviewed the entire assessment from start to finish. In the interest of receiving high-quality feedback without fatiguing the reviewers, one additional respondent reviewed the Introduction and Purpose and the DCA Administration Fidelity Checklist, while two additional participants reviewed the Scoring Guide, which also included a review of the DCA items. All participants were asked to refer to the glossary as needed, and when this occurred, reviewers were asked to give feedback on the portion of the glossary that they accessed.

Response Process Results and Modifications to the Measure

The time needed to complete the response process varied from one reviewer to another due to the variety of sections upon which each reviewer provided feedback. On average, this process took two to three hours per reviewer. Results were documented in the notes section of the TAP Guide by capturing, whenever possible, what the reviewer said verbatim. Those administering the think aloud protocol did so using the prescribed directions within the protocol. The response process results were analyzed and acted upon by the developers following the completion of the think aloud procedures. Qualitative results were summarized, and actionable feedback was shared with the group for consideration. For this portion of the content validation process, no significant changes to the DCA were necessary; however, minor improvements were made (e.g., item and scoring guide re-wording) in an effort to improve the clarity of the assessment. The response process was considered valuable, as it highlighted difficult-to-read sentences, inconsistencies in language, and wording that could be interpreted multiple ways. Comments and suggested edits were used within a final editing process to ensure consistency and clarity in wording across the DCA.
Usability Testing: Continuous Improvement Process

Usability Testing Overview

Usability testing was completed to test the feasibility of the assessment and administration processes. Usability testing is a planned series of improvement cycles (Plan-Do-Study-Act cycles). Specifically, small cohorts of DCA administrations (N = 4-5) were completed in four intentional improvement cycles (see Figure 2). The goal of usability testing is to progressively improve the administration and scoring process by identifying and addressing challenges encountered before broadly using the assessment. The key to usability testing is having a team that:

• Plans - Leads the improvement planning process and develops the scope of the test for use of the assessment
• Does - Engages in using the assessment as outlined in the planning phase
• Studies - After each data collection cycle, the team studies what is working (or not) using data
• Acts - Identifies actions the team will take and implements those actions in another data collection cycle with a different cohort

By engaging in four to five improvement cycles, approximately 80% of the problems with the assessment itself can be eliminated (Nielsen, 2000). This improves the administration and scoring experience of those using the assessment (for more information on usability testing: http://implementation.fpg.unc.edu/module-5/topic-2-usability-testing).

Figure 2. Usability Testing
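The "four to five cycles" figure comes from Nielsen's problem-discovery model. A minimal sketch, assuming Nielsen's commonly cited per-test discovery rate of about 31% (the function name here is ours, not from the manual):

```python
# Nielsen's cumulative problem-discovery model: after n independent
# tests (or cycles), the expected share of problems found is
# 1 - (1 - L)^n, where L is the per-test discovery rate (~0.31 in
# Nielsen's published data).
def problems_found(n_tests: int, discovery_rate: float = 0.31) -> float:
    return 1.0 - (1.0 - discovery_rate) ** n_tests

for n in range(1, 6):
    print(f"{n} cycle(s): {problems_found(n):.0%}")
# -> roughly 31%, 52%, 67%, 77%, 84%: four to five cycles bracket 80%
```

Under this model, a fourth cycle reaches about 77% of problems and a fifth about 84%, which is why the manual's four intentional cycles were expected to surface roughly 80% of the issues.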
Usability Testing Plan

The usability testing plan comprised several components, including: 1) selection criteria for determining who would participate in each cohort of a cycle; 2) the scope of the test, including elements to study; 3) criteria for success or reconciling information; and 4) processes for data collection. Details of the usability testing plan undertaken for the DCA are outlined in Table 12.
In each improvement cycle, 3-4 District Capacity Assessments were completed within two to three State Education Agencies, including Michigan, Washington, and North Carolina. Districts varied in size (e.g., urban, rural), need, and demographics.
Table 12. Usability Testing Plan

1. Selection Criteria

Administrators:
• Trained on the administration of the DCA
• NIRN staff or MIBLSI staff

Sites for Administration:
• Districts actively engaged in installing an effective innovation
• Districts actively engaged in developing their capacity to support effective use of innovations in partnership with MIBLSI or NIRN
• Districts with executive leadership support for use of the DCA

2. Scope of Test

Areas of Study:
• Communication and preparation for the DCA administration process
  o Obtaining commitment to the time for the administration
  o Obtaining leadership support for administration
  o Identifying appropriate respondents
• Administration protocol with fidelity
• Items and scoring rubric
  o Clarity of language
  o Sequencing of items
• Participant responsiveness
• Training implications

3. Criteria for Success

Communication and Preparation:
• Commitment obtained to the time, and leadership support, as evidenced by 90% or more of respondents staying for the entire length of the administration
• Appropriate respondents were identified for administration

Administration Protocol with Fidelity:
• 90% or greater of items on the administration protocol reported to be completed

Items and Scoring Rubric:
• Fewer than 10 items needing revision for clarity purposes
• Fewer than 2 changes in sequence

Participant Responsiveness:
• Majority of respondents (80%) reported to be engaged and positive

4. Process for Data Collection

At the end of each improvement cycle, the development team met to review the results of the administrations, including DCA results, length of administration, respondents, adherence to the fidelity checklist, and qualitative feedback from administrators.
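To make the criteria for success in Table 12 concrete, here is a hypothetical sketch of how a team might check one cycle's results against the thresholds. All field names and the example figures are illustrative, not from the manual:

```python
# Check one improvement cycle's results against Table 12's criteria for
# success. Field names and the example data below are hypothetical.
def meets_criteria(cycle: dict) -> dict:
    return {
        "respondents_stayed": cycle["stayed_full_length"] / cycle["respondents"] >= 0.90,
        "protocol_fidelity": cycle["protocol_steps_done"] / cycle["protocol_steps"] >= 0.90,
        "item_clarity": cycle["items_needing_revision"] < 10,
        "sequencing": cycle["sequence_changes"] < 2,
        "responsiveness": cycle["engaged_respondents"] / cycle["respondents"] >= 0.80,
    }

example_cycle = {
    "respondents": 10, "stayed_full_length": 10,
    "protocol_steps": 20, "protocol_steps_done": 20,
    "items_needing_revision": 3, "sequence_changes": 0,
    "engaged_respondents": 9,
}
print(meets_criteria(example_cycle))  # every criterion is met here
```

A cycle that fails any criterion would prompt an "Act" step (revise items, adjust communication, retrain administrators) before the next cohort.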
Usability Testing Results and Modifications to the Measure

The number of improvements identified for the different areas (e.g., communication, administration protocol, items and scoring rubric, participant responsiveness, and training implications) decreased by the end of the fourth improvement cycle. In addition, the criteria for success for each area of study were met by the end of the fourth improvement cycle. Examples of improvements for the different areas studied and acted upon are listed below in Table 13.

Table 13. Areas Identified for Improvement Based on Usability Testing

Communication & Preparation:
• More guidance developed around team composition and respondents
• Length of time requested for administration increased from 1.5 hours to 2 hours to include time for review of results
• One-page handout introducing the DCA was developed for communication purposes

Administration Protocol:
• 100% adherence to the administration protocol on administrations

Items & Scoring Rubric:
• Minor wording changes to a total of 5 items (3 in cycle 1, 2 in cycle 2)
• Sequencing of items was reviewed but not changed

Training Implications:
• Facilitation skills identified and training materials refined
• Process developed for the prioritization of areas for action planning using DCA results

Participant Response:
• Engaged and positive throughout all administrations within 4 improvement cycles
Preliminary Reliability Results

Descriptive Statistics

One hundred and ninety-five DCAs within 18 states have been conducted since the release of the instrument. A series of descriptive statistics, initial reliability analyses, and exploratory factor analyses were conducted. Results of these analyses are presented below. Given the low scores and little variability in several of the items, it was determined that a floor effect is currently occurring for those items, such as the items comprising the Coaching subscale. Before making any significant changes to the items themselves and rescaling the instrument, the developers determined additional data were needed.
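The descriptive statistics reported below (Tables 14 and 15) are simple summaries: mean, standard deviation, and observed range. A sketch using Python's standard library on simulated total scores; the data here are random and illustrative, not the actual administrations:

```python
import random
import statistics

# Simulate 195 total DCA scores (26 items, each scored 0-2, so totals
# fall in 0-52) and summarize them the way Table 14 does. Illustrative
# data only; real analyses would use the actual administration records.
random.seed(1)
scores = [sum(random.randint(0, 2) for _ in range(26)) for _ in range(195)]

mean = statistics.mean(scores)
sd = statistics.stdev(scores)  # sample standard deviation
lo, hi = min(scores), max(scores)
print(f"M = {mean:.2f}, SD = {sd:.2f}, Range = {lo}-{hi}")
```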
Table 14. DCA Scale and Subscale Descriptive Statistics (N = 195)

Scale/Subscale                          M      SD     Range
Total DCA Score                         22.34  9.85   0-48
Leadership Composite                    9.67   3.76   0-16
Competency Composite                    5.10   3.32   0-16
Organization Composite                  7.57   4.09   0-19
Leadership Subscale                     7.17   2.32   0-10
Planning Subscale                       2.50   2.02   0-6
Performance Assessment Subscale         1.75   1.05   0-4
Selection Subscale                      1.47   1.11   0-4
Training Subscale                       1.37   1.17   0-4
Coaching Subscale                       0.51   0.99   0-4
Decision Support Subscale               2.32   1.87   0-6
Facilitative Administration Subscale    4.23   2.34   0-11
Systems Intervention Subscale           1.03   0.80   0-2
Table 15. DCA Item Descriptive Statistics (N = 195)

Item                                                                    M     SD
Leadership Subscale
1. There is a DIT to support implementation of EI                       1.63  0.66
2. DIT includes someone with executive leadership authority             1.58  0.69
3. DIT includes an identified coordinator(s)                            1.20  0.77
7. Funds are available to support the implementation of the EI          1.47  0.65
17. BITs are developed and functioning to support implementation of EIs 1.29  0.68
Planning Subscale
8. District has an implementation plan for the EI                       0.76  0.74
9. DIT actively monitors the implementation of the plan                 0.60  0.75
18. BIT implementation plans are linked to district improvement plan    1.14  0.90
Performance Assessment Subscale
13. DIT supports the use of a fidelity measure for implementation of the EI  1.11  0.79
26. Staff performance feedback is on-going                              0.64  0.61
Selection Subscale
20. District uses a process for selecting staff (internal and/or external) who will implement and support the EI  0.67  0.67
21. Staff members selected to implement or support the EI have a plan to continuously strengthen skills  0.80  0.67
Training Subscale
22. DIT secures training on the EI for all district/school personnel and stakeholders  0.95  0.72
23. DIT uses training effectiveness data                                0.42  0.66
Coaching Subscale
24. DIT uses a coaching service delivery plan                           0.34  0.61
25. DIT uses coaching effectiveness data                                0.17  0.45
Decision Support Data Systems Subscale
14. DIT has access to data for the EI                                   1.06  0.73
15. DIT has a process for using data for decision making                0.61  0.74
19. BITs have a process for using data for decision making              0.66  0.71
Facilitative Administration Subscale
4. DIT uses an effective team meeting process                           0.97  0.72
5. District outlines a formal procedure for selecting EIs through the use of guidance documents  0.42  0.57
6. District documents how current EIs link together                     0.65  0.66
10. District utilizes a communication plan                              0.60  0.60
11. District uses a process for addressing internal barriers            0.83  0.48
16. District provides a status report on the EI to the school board     0.76  0.72
Systems Intervention Subscale
12. District uses a process to report policy relevant information to outside entities  1.03  0.80

Note. All item scores range from 0-2. DCA = District Capacity Assessment; DIT = District Implementation Team; EI = Effective Innovation; BIT = Building Implementation Team.
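The floor effect noted earlier can be screened for directly from item means like those in Table 15: on a 0-2 scale, items with means near 0 leave little room to discriminate among low-capacity districts. A sketch, where the 0.50 cutoff is our illustrative choice rather than the manual's:

```python
# Flag potential floor effects among DCA items using means reported in
# Table 15. The 0.50 cutoff on the 0-2 scale is illustrative only.
item_means = {
    "23. DIT uses training effectiveness data": 0.42,
    "24. DIT uses a coaching service delivery plan": 0.34,
    "25. DIT uses coaching effectiveness data": 0.17,
    "1. There is a DIT to support implementation of EI": 1.63,
}

floor_items = [item for item, m in item_means.items() if m < 0.50]
for item in floor_items:
    print("possible floor effect:", item)
# flags items 23, 24, and 25, including both Coaching subscale items
```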
Bivariate Correlations

Pearson product-moment correlation coefficients were computed to determine relationships between the District Capacity Assessment (DCA) subscales. This analysis determined that the majority of subscales (Leadership, Planning, Performance Assessment, Selection, Training, Coaching, Decision Support, Facilitative Administration, Systems Intervention) are significantly correlated with each other (see Table 16). The only exceptions are: Leadership and Coaching (r = -.002, n = 195, p > .05), Systems Intervention and Training (r = .089, n = 195, p > .05), and Systems Intervention and Coaching (r = .113, n = 195, p > .05). It is important to note that the mean score for Coaching (M = .51) is lower than other subscale mean scores, which may be contributing to the aforementioned exceptions.

Table 16. Bivariate Correlations Between DCA Subscales
Subscale                         1       2       3       4       5       6       7       8
1. Leadership                    --
2. Planning                      .504**  --
3. Performance Assessment        .500**  .633**  --
4. Selection                     .321**  .470**  .501**  --
5. Training                      .280**  .529**  .465**  .591**  --
6. Coaching                      -.002   .269**  .287**  .384**  .449**  --
7. Decision Support Data System  .393**  .775**  .617**  .518**  .522**  .314**  --
8. Facilitative Admin.           .526**  .682**  .529**  .515**  .536**  .274**  .632**  --
9. Systems Intervention          .181*   .196**  .148*   .154*   .089    .113    .146*   .318**
Note. DCA = District Capacity Assessment. ** p < .001 (2-tailed). * p < .05 (2-tailed).

Cronbach's Alpha Coefficients

Cronbach's alpha coefficients were computed to determine the internal consistency of the District Capacity Assessment (DCA), the DCA composite scales, and the DCA subscales. The total DCA has strong internal consistency, with a Cronbach's alpha coefficient of .908. The three composites also have adequate internal consistency: Leadership (α = .794), Competency (α = .791), and Organization (α = .805). The eight subscales vary in internal consistency. The Planning subscale (α = .797), Coaching subscale (α = .832), and Decision Support Data Systems subscale (α = .818) all have adequate internal consistency. The Leadership subscale (α = .689), Performance Assessment subscale (α = .224), Selection subscale (α = .563), Training subscale (α = .606), and Facilitative
Administration subscale (α = .678) are all below the adequate level of internal consistency (i.e., .700). See Table 17 for all results from these analyses.

Table 17. Cronbach's Alpha Coefficients

Scale/Subscale                            Cronbach's Alpha    Cronbach's Alpha Based on Standardized Items    Number of Items
DCA                                       .908    .908    26
Leadership Composite                      .794    .794    8
Competency Composite                      .791    .802    8
Organization Composite                    .805    .811    10
Leadership Subscale                       .689    .687    5
Planning Subscale                         .797    .808    3
Performance Assessment Subscale           .224    .230    2
Selection Subscale                        .563    .563    2
Training Subscale                         .606    .607    2
Coaching Subscale                         .832    .854    2
Decision Support Data Systems Subscale    .818    .818    3
Facilitative Administration Subscale      .678    .690    6
Note. DCA = District Capacity Assessment.

Exploratory Factor Analysis

The 26 items of the District Capacity Assessment (DCA) were subjected to principal components analysis (PCA). Prior to interpreting the PCA results, the suitability of the data for factor analysis was assessed by inspecting the Kaiser-Meyer-Olkin (KMO) value and Bartlett's Test of Sphericity. The KMO value (.865) met the necessary level to run a factor analysis, and Bartlett's Test of Sphericity reached statistical significance. The PCA revealed five components with eigenvalues exceeding 1 within the DCA. These five factors explain 32.16%, 9.66%, 6.36%, 5.51%, and 5.05% of the variance, respectively. See Table 18 for the results. It is interesting to note that the DCA was developed with nine different factors in mind; however, only five factors were identified in this analysis. Factor 1 consists of those items related to data and the use of data for action planning at both district and building levels. Using data to evaluate competency supports (e.g., training and coaching) was a prevalent theme of Factor 2. The third factor was comprised of items that address leadership coordination and team functioning. Items addressing competency and facilitative administration activities and
supports comprised the fourth factor. Finally, the fifth factor consisted of those items related to policy, communication, reporting, and other systems intervention supports. The DCA developers determined that additional data are needed to further analyze the internal structure of the instrument before making significant changes to the items themselves and the composition of the subscales. This decision was based on factors such as the theory the DCA is built upon, the floor effect observed in low scores across several items due to the current state of the field, and sample size.

Table 18. Five Factor Solution Simplified

Factor    Items Loaded into Factor (DCA Subscale)    Factor Loadings

1    19 (DSDS)                          .832
     18 (Planning)                      .818
     17 (Leadership)                    .714
     14 (DSDS)                          .698
     15 (DSDS)                          .688
     8 (Planning)                       .651
     9 (Planning)                       .583
     13 (Performance Assessment)        .560
     16 (Facilitative Administration)   .443
2    25 (Coaching)                      -.856
     24 (Coaching)                      -.849
     23 (Training)                      -.662
     21 (Selection)                     -.494
     26 (Performance Assessment)        -.486
3    2 (Leadership)                     .815
     1 (Leadership)                     .770
     3 (Leadership)                     .745
     4 (Facilitative Administration)    .693
4    7 (Leadership)                     .711
     20 (Selection)                     .533
     22 (Training)                      .473
     11 (Facilitative Administration)   .434
5    12 (Systems Intervention)          .739
     5 (Facilitative Administration)    .666
     10 (Facilitative Administration)   .481
     6 (Facilitative Administration)    .363

Note. Major loadings for each item are bolded. DCA = District Capacity Assessment; DSDS = Decision Support Data Systems; DIT = District Implementation Team; EI = Effective Innovation; BIT = Building Implementation Team.
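The reliability and factor analyses above follow standard formulas. A self-contained sketch on simulated item scores (illustrative data, not the 195 administrations), assuming NumPy is available: Cronbach's alpha computed as k/(k-1) times (1 minus the sum of item variances over the variance of total scores), and the Kaiser criterion, which retains components whose eigenvalues of the item correlation matrix exceed 1:

```python
import numpy as np

# Simulate 30 respondents x 6 items, each scored 0-2, sharing a common
# underlying level so the items correlate. Illustrative data only.
rng = np.random.default_rng(0)
base = rng.integers(0, 3, size=(30, 1))
items = np.clip(base + rng.integers(-1, 2, size=(30, 6)), 0, 2).astype(float)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Kaiser criterion: keep principal components whose eigenvalues of the
# item correlation matrix exceed 1; eigenvalue / k gives the proportion
# of total variance each component explains.
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
retained = eigvals[eigvals > 1]
explained = eigvals / k

print(f"alpha = {alpha:.3f}")
print(f"components retained: {retained.size}")
print(f"variance explained by first component: {explained[0]:.1%}")
```

The KMO and Bartlett's test checks reported in the manual are additional suitability diagnostics not shown in this sketch.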
Current and Future Uses of the District Capacity Assessment

Appropriate Use of the DCA

As with all assessment instruments, there are appropriate uses of the District Capacity Assessment. These include the following:

• District self-assessment and progress monitoring used to guide and improve implementation capacity building
• Coaching for district and building implementation teams on developing the systems, structures, functions, and roles necessary to adopt and sustain implementation of EIs
• Coaching for implementation specialists at the regional and state levels on the development of district and building implementation teams to engage in capacity building
• Feedback on materials, resources, and learning tools to support implementation specialists and implementation teams in capacity building
• Research on the structures, roles, and functions necessary for effective and sustained implementation of EIs, and the associations between these and fidelity measures of the EIs and student outcomes

The DCA should not be used as a high-stakes evaluation tool of a DIT. The DCA's validity and reliability are still being assessed. Its principal purpose is for use as an action assessment to assist districts and their schools to implement evidence-based practices that benefit students.

Future Validation of the DCA

Next steps in the development and validation process of the DCA include designing and conducting research to further examine the DCA's internal structure (e.g., factor analysis), its relationship to other variables (e.g., predictive, concurrent, convergent, and divergent validity analyses), and its consequential validity, that is, the intended and unintended consequences of using the DCA. The DCA development team is currently in the process of designing the research to address these areas of validation and securing the funds to accomplish this task.
References
Aarons, G. A., Cafri, G., Lugo, L., & Sawitzky, A. (2012). Expanding the domains of attitudes towards evidence-based practice: The Evidence Based Attitudes Scale-50. Administration and Policy in Mental Health and Mental Health Services Research, 5, 331-340.

American Educational Research Association, American Psychological Association, National Council on Measurement in Education, & Joint Committee on Standards for Educational and Psychological Testing. (2014). Standards for educational and psychological testing. Washington, DC: AERA.

American Psychological Association. (2010). American Psychological Association ethical principles of psychologists and code of conduct. Retrieved from http://www.apa.org/ethics/code/index.aspx

Beck, C. T., & Gable, R. K. (2001). Ensuring content validity: An illustration of the process. Journal of Nursing Measurement, 9, 201-215.

Blase, K. A., Fixsen, D. L., & Naoom, S. F. (2005). Operationalizing implementation: Strategies and methods. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute.

Carretero-Dios, H., & Pérez, C. (2007). Standards for the development and review of instrumental studies: Considerations about test selection in psychological research. International Journal of Clinical and Health Psychology, 7, 863-882.

Conrad, F. G., & Blair, J. (1996). From impressions to data: Increasing the objectivity of cognitive interviews. Proceedings of the Section on Survey Research Methods, Annual Meetings of the American Statistical Association, 1-10. Alexandria, VA: American Statistical Association.

Conrad, F. G., Blair, J., & Tracy, E. (1999). Verbal reports are data! A theoretical approach to cognitive interviews. Proceedings of the Federal Committee on Statistical Methodology Research Conference, Tuesday B Sessions, Arlington, VA, 11-20.

Davis, L. (1992). Instrument review: Getting the most from your panel of experts. Applied Nursing Research, 5, 194-197.

DeVellis, R. F. (2012). Scale development: Theory and applications. Thousand Oaks, CA: SAGE Publications.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Retrieved May, 2016, from http://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature

Fuchs, D., & Deshler, D. (2007). What we need to know about responsiveness to intervention (and shouldn't be afraid to ask). Learning Disabilities Research and Practice, 22, 129-136.

Gable, R., & Wolf, M. (1994). Instrument development in the affective domain: Measuring attitudes and values in corporate and school settings. New York, NY: Evaluation in Education and Human Services.

Grant, J. S., & Davis, L. L. (1997). Selection and use of content experts for instrument development. Research in Nursing & Health, 20, 269-274.

Haynes, S. N., Richard, D. C. S., & Kubany, E. S. (1995). Content validity in psychological assessment: A functional approach to concepts and methods. Psychological Assessment, 3, 238-247.

Johnstone, C. J., Bottsford-Miller, N. A., & Thompson, S. J. (2006). Using the think aloud method (cognitive labs) to evaluate test design for students with disabilities and English language learners. Minneapolis, MN: National Center on Educational Outcomes.

Klein, K. J., Conn, A. B., Smith, D. B., & Sorra, J. S. (2001). Is everyone in agreement? An exploration of within-group agreement in employee perceptions of the work environment. Journal of Applied Psychology, 86, 3-16.

Landenberger, N. A., & Lipsey, M. W. (2005). The positive effects of cognitive-behavioral programs for offenders: A meta-analysis of factors associated with effective treatment. Journal of Experimental Criminology, 1, 451-476.

Lynn, M. (1986). Determination and quantification of content validity. Nursing Research, 35, 382-385.

Martinez, R. G., Lewis, C. C., & Weiner, B. J. (2014). Instrumentation issues in implementation science. Implementation Science, 9, 118.

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50, 741-749.

Mihalic, S., & Irwin, K. (2003). From research to real world settings: Factors influencing the successful replication of model programs. Youth Violence and Juvenile Justice, 1, 307-329.

Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Retrieved June, 2016, from http://www.useit.com/alertbox/20000319.html

Olds, D. L., Hill, P. L., O'Brien, R., Racine, D., & Moritz, P. (2003). Taking preventive intervention to scale: The nurse-family partnership. Cognitive and Behavioral Practice, 10, 278-290.

Popham, W. J. (2008). All about assessment / A misunderstood grail. Educational Leadership, 66, 82-83.

Schoenwald, S. K., Sheidow, A. J., & Letourneau, E. J. (2004). Toward effective quality assurance in evidence-based practice: Links between expert consultation, therapist fidelity, and child outcomes. Journal of Clinical Child and Adolescent Psychology, 33, 94-104.

Sireci, S., & Faulkner-Bond, M. (2014). Validity evidence based on test content. Psicothema, 26, 100-107.

Skiba, R. J., Middelberg, L., & McClain, M. (2013). Multicultural issues for schools and EBD students: Disproportionality in discipline and special education. In H. Walker & F. Gresham (Eds.), Handbook of evidence-based practices for students having emotional and behavioral disorders. New York: Guilford.

Smith, E. V., & Smith, R. M. (2007). Rasch measurement: Advanced and specialized applications. Maple Grove, MN: JAM Press.

Tilden, V. P., Nelson, C. A., & May, B. A. (1990). Use of qualitative methods to enhance content validity. Nursing Research, 39, 172-175.

Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34, 457-468.

Waltz, C. F., Strickland, O. L., & Lenz, E. R. (1991). Reliability and validity of norm-referenced measures. Measurement in Nursing Research, 161-194.

Ward, C., St. Martin, K., Horner, R., Duda, M., Ingram-West, K., Tedesco, M., Putnam, D., Buenrostro, M., & Chaparro, E. (2015). District Capacity Assessment. University of North Carolina at Chapel Hill.

Willis, G. (1999). Cognitive interviewing: A "how to" guide (Reducing survey error through research on the cognitive and decision processes in surveys). Meeting of the American Statistical Association (pp. 1-37). Durham, NC: RTI International.
Appendix A: Content Validation Surveys
DCA Consent and Edits

Welcome
Play the video below to learn about how to interact with Survey #1.

Consent
First and Last Name *
By clicking on the consent checkbox below, I acknowledge that I have read and agree to the terms of participation outlined in the District Capacity Assessment (DCA) Content Validation Research Consent Form. *
I consent to participate as an expert reviewer for the District Capacity Assessment.
May we include your name as a DCA reviewer? Your name will appear after the cover page. * (Yes / No)

Response Time
Please upload your edited version of the District Capacity Assessment. * (Upload)
DCA Item Analysis

Welcome
1. First and Last Name
View the video below to learn about how to interact with Survey #2.

Item Feedback
Each item within the DCA is provided in the left-hand column. Read through each individual item and respond regarding:
• How attainable the specific DCA item is for the District Implementation Team
• How important the specific DCA item is for successful implementation of Effective Innovations (EI)
• Which items are the top 5 most critical items to include in the DCA

For each item, reviewers rated "How attainable is this item for a district team?" and "How important is this item for successful implementation of Effective Innovations?" on a three-point scale (Very / Somewhat / Not at All), and selected the 5 most critical items to include in the DCA.

1. There is a District Implementation Team (DIT) to support implementation efforts of Effective Innovations (EI)
2. The DIT includes someone with executive leadership authority
3. DIT uses an effective team meeting process
4. DIT includes an identified coordinator (or coordinators)
5. District guidance documents outline a formal procedure for selecting EIs
6. District documents how current initiatives/practices link together
7. Funds are available to support the implementation of EIs
8. District has an implementation plan
9. DIT actively monitors the implementation of the plan
10. The district uses a process for addressing internal barriers
11. District uses a process to report policy relevant information to outside entities
12. DIT uses a measure of fidelity for the use of the EI
13. DIT has access to data
14. DIT has a process for using data for decision making
15. District provides a status report to the school board
16. District utilizes a communication plan
17. Building Implementation Teams (BITs) are developed and functioning to support implementation of EIs
18. BIT implementation plans are linked to district improvement plan
19. BITs have a process for using data for decision making
20. A process is followed for recruiting staff (internal and/or external) to implement the EI
21. A process is in place to evaluate selection outcomes
22. Staff members selected have a plan to strengthen skills necessary for success
23. A process is in place to evaluate selection outcomes
24. DIT secures training for all district/school personnel and stakeholders
25. DIT uses training effectiveness data
26. DIT uses a coaching service delivery plan
27. DIT uses coaching effectiveness data
28. Staff performance feedback is perpetual

Time to Respond
Approximately how many minutes did it take you to complete this "DCA Item Analysis" survey?
DCA Construct
Welcome
1. First and Last Name *
Play the video below to learn about how to interact with Survey #3.
Construct Definitions
Provide feedback on the definitions and descriptions that are used within theDCA *
Is thedefinition comprehensive
*
Is thedefinition clear
*
Provide any comments about thisdefinition / description
Capacity:organization,activities, andsystems thatexist at thedistrict leveland have adirect effecton thesuccess ofbuildingleadershipteams toadopt andsustainevidence-basedpractices.
VerySomewhatNot at All
VerySomewhatNot at All
Competency:mechanisms
![Page 50: Fall 08 District Capacity Assessment (DCA) Technical Manual](https://reader031.vdocuments.site/reader031/viewer/2022020621/61ea54a47180246938792004/html5/thumbnails/50.jpg)
mechanismsto develop,improve, andsustain one'sability toimplement aninterventionas intended inorder tobenefitchildren,families, andcommunities.
VerySomewhatNot at All
VerySomewhatNot at All
Organization:mechanismsto create andsustainhospitableorganizationaland systemenvironmentsfor effectiveservices.
VerySomewhatNot at All
VerySomewhatNot at All
Leadership:emphasis onproviding therightleadershipstrategies fortypes ofleadershipchallenges.Theseleadershipchallengesoften emergeas part of thechangemanagementprocessneeded tomakedecisions,provideguidance, andsupportorganization.
VerySomewhatNot at All
VerySomewhatNot at All
![Page 51: Fall 08 District Capacity Assessment (DCA) Technical Manual](https://reader031.vdocuments.site/reader031/viewer/2022020621/61ea54a47180246938792004/html5/thumbnails/51.jpg)
Item-Construct Alignment
For each item within the DCA, indicate which Implementation Driver:
Leadership, Organization or Competency, that the item best fits
within based on the definition of each driver listed below.
Competency: mechanisms to develop, improve, and sustain one's ability to implement an
intervention as intended in order to benefit children, families, and communities.
Organization: mechanisms to create and sustain hospitable organizational and system
environments for effective services.
Leadership: emphasis on providing the right leadership strategies for types of leadership challenges.
These leadership challenges often emerge as part of the change management process needed to
make decisions, provide guidance, and support organization.
Implementation Driver *

(Response options for each item: Leadership / Organization / Competency / Unsure)

1. There is a District Implementation Team (DIT) to support implementation of Effective Innovations (EI)
2. The DIT includes someone with executive leadership authority
3. DIT uses an effective team meeting process
4. DIT includes an identified coordinator (or coordinators)
5. District guidance documents outline a formal procedure for selecting EIs
6. District documents how current initiatives/practice link together
7. Funds are available to support the implementation of EIs
8. District has an implementation plan
9. DIT actively monitors the implementation of the plan
10. The district uses a process for addressing internal barriers
11. District uses a process to report policy relevant information to outside entities
12. DIT uses a measure of fidelity for the use of the EI
13. DIT has access to data
14. DIT has a process for using data for decision making
15. District provides a status report to the school board
16. District utilizes a communication plan
17. Building Implementation Teams (BITs) are developed and functioning to support implementation of EIs
18. BIT implementation plans are linked to district improvement plan
19. BITs have a process for using data for decision making
20. A process is followed for recruiting staff (internal and/or external) to implement the EI
21. A process is followed for selecting staff who will implement the EI
22. Staff members selected have a plan to strengthen skills necessary for success
23. A process is in place to evaluate selection outcomes
24. DIT secures training for all district/school personnel and stakeholders
25. DIT uses training effectiveness data
26. DIT uses a Coaching service delivery plan
27. DIT uses coaching effectiveness data
28. Staff performance feedback is perpetual
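Responses to this item-classification question can be summarized with a simple tally of how often each Implementation Driver was chosen per item. A minimal sketch in Python; the function name and the sample responses below are hypothetical illustrations, not part of the DCA validation data:

```python
from collections import Counter

DRIVERS = ("Leadership", "Organization", "Competency", "Unsure")

def tally_driver_assignments(responses):
    """Per item, count how many respondents assigned each Implementation
    Driver, and report the modal (most frequently chosen) driver.

    `responses` maps a respondent id to a dict of {item_number: driver}.
    """
    per_item = {}
    for ratings in responses.values():
        for item, driver in ratings.items():
            if driver not in DRIVERS:
                raise ValueError(f"unknown driver: {driver!r}")
            per_item.setdefault(item, Counter())[driver] += 1
    modal = {item: c.most_common(1)[0][0] for item, c in per_item.items()}
    return per_item, modal

# Hypothetical responses for two of the 28 DCA items:
responses = {
    "expert_a": {1: "Leadership", 26: "Competency"},
    "expert_b": {1: "Leadership", 26: "Competency"},
    "expert_c": {1: "Organization", 26: "Competency"},
}
counts, modal = tally_driver_assignments(responses)
print(modal[1])                  # Leadership
print(counts[26]["Competency"])  # 3
```

Items where the modal driver wins by only a narrow margin (as with item 1 above) would be natural candidates for revision or follow-up probing.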
DCA Sequencing, Frequency and Format
Welcome
1. First and Last Name
Play the video below to learn about how to interact with Survey #4.
Order of Items
Read through the items on the DCA, considering the order in which the items are currently organized. Consider the flow and ease of moving from item to item for a district team.
Do you have suggestions for changes to the order of questions? *
Yes, significant reordering (6 or more suggestions)
Yes, some reordering (5 or fewer suggestions)
No reordering suggestions

Provide any other comments about the order of items within the assessment
Sequencing of Questions
Drag items from the left-hand list into the right-hand list to order them.
1. There is a District Implementation Team (DIT) to support implementation of Effective Innovations (EI)
2. DIT includes someone with executive leadership authority
3. DIT uses an effective team meeting process
4. DIT includes an identified coordinator (or coordinators)
5. District guidance documents outline a formal procedure for selecting EIs
6. District documents how current initiatives/practice link together
7. Funds are available to support the implementation of EIs
8. District has an implementation plan
9. DIT actively monitors the implementation of the plan
10. The district uses a process for addressing internal barriers
11. District uses a process to report policy relevant information to outside entities
12. DIT uses a measure of fidelity for the use of the EI
13. DIT has access to data
14. DIT has a process for using data for decision making
15. District provides a status report to the school board
16. District utilizes a communication plan
17. Building Implementation Teams (BITs) are developed and functioning to support implementation of EIs
18. BIT implementation plans are linked to district improvement plan
19. BITs have a process for using data for decision making
20. A process is followed for recruiting staff (internal and/or external) to implement the EI
21. A process is followed for selecting staff who will implement the EI
22. Staff members selected have a plan to strengthen skills necessary for success
23. A process is in place to evaluate selection outcomes
24. DIT secures training for all district/school personnel and stakeholders
25. DIT uses training effectiveness data
26. DIT uses a Coaching service delivery plan
27. DIT uses coaching effectiveness data
28. Staff performance feedback is perpetual

Provide any other comments about the order of items within the assessment
Clearly state the ordering changes you would suggest. If it is helpful, also include the rationale for this order change.
Item Number    Description of what needs to change
Suggestion 1
Suggestion 2
Suggestion 3
Suggestion 4
Suggestion 5
Provide any other comments about the order of items within the assessment
Frequency of Assessment
Please provide feedback on the 3 items below related to frequency of using the DCA. *
(Response options for each item: Monthly / Quarterly / Bi-Annually / Annually / Every Two Years / Other)
How often would it be helpful for a district implementation team to administer this assessment?

How long between assessments would a team likely see growth/change in their DCA results?

How often would the DCA results inform district-level action planning?

Please give more information about what other frequency would be helpful for this assessment

Comments about the frequency with which district teams should complete this assessment
Format
2. Provide feedback on the following sections of the DCA:

For each section, rate "Is the section comprehensive?" and "Is the section clear?" (Very / Somewhat / Not at All), and provide Comments:

Introduction and Purpose
Administration and Fidelity Checklist
DCA Scoring Form
Scoring Guide
Action Planning
Glossary
Previous Experience
Have you previously completed an assessment of district capacity such as the District Capacity Assessment (DCA; SISEP) or the District MTSS Capacity Assessment (DMCA; MiBLSi)? *
Yes, the District Capacity Assessment (DCA; SISEP)
Yes, the District MTSS Capacity Assessment (DMCA; MiBLSi)
No
Unsure
Other
Consider your experience with the DCA *
What benefits have you and/or your team experienced from previously completing an assessment of district capacity?
What would have improved your previous DCA experience?
Is this current version of an assessment of district capacity an improvement compared to what you have worked with before?
(Scale: Not an Improvement … Significant Improvement)

Comments
Consider your experience with the DMCA *
Response Time
What benefits have you and/or your team experienced from previously completing an assessment of district capacity?
What would have improved your previous DMCA experience?
Is this current version of an assessment of district capacity an improvement compared to what you have worked with before?
(Scale: Not an Improvement … Significant Improvement)

Comments
Appendix B: Think-Aloud Protocol Guide
Think-Aloud Protocol Guide (TAP-Guide): Instructions
The TAP-Guide is a data collection blueprint used by researchers who are in the process of developing an instrument. It contains several best practices advocated by researchers who have developed rigorous verbal reporting methods called think-aloud protocols. The TAP-Guide is intended to be an efficient strategy for gathering evidence of validity based on response process and may provide valuable information around various design problems that introduce construct-irrelevant variance (e.g., unclearly defined instructions, items, and response categories).
How to Complete the TAP-Guide
Step #1 & #2: Complete Demographic Information & Individual/Team Profile
Indicate the name of the observer, the date, the number of participants, and the name of the participating individual/team. Begin each cognitive interview with a review of individual/team characteristics regarding professional role, level of professional experience, and any other identifying information pertinent to the study.

Step #3: Review Conditional Probes and Taxonomy of Possible Response Problems
To improve validity and objectivity during the think-aloud session, the TAP-Guide provides a standardized format for probing and a taxonomy of possible participant problems. Investigators can use condition-specific probes when a participant's verbal reports signal a potential problem that warrants their use. Investigators should be well versed in the major response stages a participant is likely to pass through and the major problem types for which participants provide evidence when answering an item.

Step #4: Begin TAP Protocol Part I: Establish Rapport, TAP Directions, Modeling and Practice of Examples
Investigators should present standardized instructions, model an example question, and provide a practice phase. The practice phase gives participants an opportunity to practice thinking aloud and to ensure that their verbalizing meets the expectations of the investigator. Afterward, the investigator should ask participants whether they have any questions and then proceed.

Steps #5 & #6: TAP Protocol Parts I, II, & III: Collect Data
Investigators should be prepared to record data in the first two sections, which support the collection of introspective data on items by making available a section for qualitative observation data and a checklist of the major response stages and problem types.
Step #7: TAP Protocol Part IV: Collect Retrospective Data on Instrument Directions, Items, Response Categories, Scoring Rubric, and General Observations/Questions
There is no script for follow-up questions. Instead, investigators should address issues and questions that arise as a result of the think-aloud protocol. The TAP-Guide does, however, provide a framework for facilitating the follow-up interview: TAP Protocol Part IV provides qualitative data recording sections on instrument directions, items, response categories, the scoring rubric, and general observations or questions.
Think-Aloud Protocol (TAP) Guide: Introspective & Retrospective Data Collection

Observer: ____________ Date: ____________ # of Participants: ________ Participating Individual/Team: ____________

Team Profile: Please identify at least two characteristics regarding participants' roles and the team's level of experience (e.g., job title, years of experience, trainings attended).

Review Conditional Probes: C stands for "conditional"; P for "probe"

C: Participant cannot answer; does not provide a protocol
P: "What was going through your mind as you tried to answer the question?"

C: Answer after a period of silence
P: "You took a little while to answer that question. What were you thinking about?"

C: Answer with uncertainty (e.g., "um" and "ah," changing an answer, etc.)
P: "It sounds like the question may be difficult. If so, can you tell me why?" "What occurred to you that caused you to change your answer?"

C: Answer contingent on certain conditions being met ("…if you don't need a super precise number.")
P: "You seem a little unsure. Was there something unclear about the question?"

C: Erroneous answer; verbal report indicates misconceptions
P: Clarify the participant's understanding of the particular word, concept, etc. Probe this misconception.

C: Participant requests information initially instead of answering
P: "If I weren't available or able to answer, what would you decide it means?" "Please elaborate."
Review Taxonomy of Possible Respondent Problems

Response stages (columns): Understanding (problems with comprehension of the item); Task Performance (understands the item but has difficulty executing: retrieval, deduction, etc.); Response Formatting (differences between the response and the response options).

Problem types (rows):

Lexical (meaning of words)
• Understanding: Trouble comprehending the meaning of words, concepts, phrases, etc. (e.g., defining "capacity building")
• Task Performance: Understands the meaning but has trouble differentiating (e.g., "capacity building": does CHAMPS training count?)
• Response Formatting: Differences in meaning between the response and the provided category labels (e.g., 8 participants vs. "many")

Omission/Inclusion (understanding the scope of a word/item)
• Understanding: Trouble understanding the scope and limits of a word (e.g., "capacity building": individual and/or organizational)
• Task Performance: No explicit decision rule for including/excluding instances from a category (e.g., "organizational capacity": include "community level"?)
• Response Formatting: Involves using a response option that was not explicitly provided (e.g., answering 1.5 vs. 0, 1, or 2)

Temporal (involves a time period)
• Understanding: Trouble grasping the meaning of temporal terms (e.g., "last year": calendar year vs. past 12 months)
• Task Performance: Understands the meaning but assigns an incorrect reference (e.g., "current month" mistaken for the previous month because the month just changed)
• Response Formatting: Differences between the response time period and the provided category labels (e.g., 6 months vs. "often")

Logical (involves semantics)
• Trouble comprehending the inclusion of semantic devices (e.g., and, or, non-, other than, un-, etc.)
• The inclusion of false presuppositions in a question (e.g., how many times a month do you provide teacher consultation?)
• Item involves contradictions/tautologies (e.g., necessary requirement, forward planning, the truth is false, great fidelity but bad implementation)

Computational
• Residual category: assign after all others have been considered
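Because the taxonomy is a grid of response stages by problem types, coded TAP observations can be tallied per cell with very little machinery. A minimal sketch in Python; the function names and the sample observations are illustrative, not part of the TAP-Guide itself:

```python
from collections import Counter

# Columns (response stages) and rows (problem types) of the taxonomy grid.
STAGES = ("Understanding", "Task Performance", "Response Formatting")
PROBLEM_TYPES = ("Lexical", "Omission/Inclusion", "Temporal", "Logical", "Computational")

def code_observation(item, stage, problem_type):
    """Validate one coded TAP observation as an (item, stage, type) record.

    Per the guide, "Computational" is a residual category and should be
    assigned only after the other problem types have been ruled out.
    """
    if stage not in STAGES:
        raise ValueError(f"unknown response stage: {stage!r}")
    if problem_type not in PROBLEM_TYPES:
        raise ValueError(f"unknown problem type: {problem_type!r}")
    return (item, stage, problem_type)

def problem_matrix(observations):
    """Count coded observations per (stage, problem_type) cell of the grid."""
    return Counter((stage, ptype) for _item, stage, ptype in observations)

# Hypothetical codes from one think-aloud session:
obs = [
    code_observation(1, "Understanding", "Lexical"),
    code_observation(5, "Understanding", "Lexical"),
    code_observation(12, "Response Formatting", "Temporal"),
]
matrix = problem_matrix(obs)
print(matrix[("Understanding", "Lexical")])  # 2
```

Cells with high counts across sessions point to the instrument elements (directions, items, or response categories) most likely to introduce construct-irrelevant variance.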
TAP Protocol Part I: Establish Rapport, TAP Instructions, Modeling and Practice of Examples

Checklist
☐ Establish Rapport
☐ Provide Explicit Instructions
☐ Provide Clear Expectations

"We will begin the survey soon, starting from the beginning by reading the directions and then proceeding to the items. After reading each item, think aloud as you reflect and problem solve. After you have read each item, respond to it in your own words. Do not feel pressured to answer each item correctly, as no evaluation of you or your rating performance will occur. Please act and talk as if you are talking to yourself, and be completely natural and honest about your rating process and reactions. Also, feel free to take as long as needed to adequately verbalize."

"Do you have any questions?" "Can you say, in your own words, what the expectations are?"

"To sum up, we are less interested in the answers participants provide than in how they are thinking about them. Remember, we are interested in how participants solve and think about items, how they feel, and the beliefs they have as they respond to survey items."

☐ Model TAP
Model Example: "Let's start with an example. I will go first and then you can do the next one":
• "ISD Implementation Plan operationally defines steps for addressing equity issues related to educational programming." "Indicate the level of implementation from 0 to 2, 0 indicating not in place, 1 indicating partially in place, and 2 indicating fully in place."
  o When I read this, I'm not exactly sure how "equity issues" are defined. It could mean trying to reduce gaps between subgroups, or it could mean something more specific. It's been a while since I've actually seen our ISD implementation plan, so I'm not really sure whether our plan does address equity issues. Because I don't know, and I don't remember us ever talking about it, I would score this a 0, but I will probably want to check with my other team members at some point.

☐ Provide Practice
Participant Example: "Now you try: ISD implementation team consists of a diverse group of professionals."
  o "Indicate the level of implementation from 0 to 2, 0 indicating not in place, 1 indicating partially in place, and 2 indicating fully in place."
  o Provide suggestions to correct and praise to encourage.
  o Ask participants whether they have any questions and then proceed.
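The 0–2 rating scale used in the modeled and practice items above lends itself to a simple summary score. A minimal sketch in Python; the percentage summary is a hypothetical illustration only, since the DCA Scoring Guide defines the official scoring procedure:

```python
def dca_percent(ratings):
    """Summarize 0-2 item ratings as a percentage of points possible.

    `ratings` maps item_number -> 0, 1, or 2 (not in place, partially in
    place, fully in place). Illustrative only; see the DCA Scoring Guide
    for the instrument's actual scoring rules.
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    for item, score in ratings.items():
        if score not in (0, 1, 2):
            raise ValueError(f"item {item}: rating must be 0, 1, or 2")
    return 100.0 * sum(ratings.values()) / (2 * len(ratings))

# Hypothetical ratings for four items:
print(dca_percent({1: 2, 2: 1, 3: 0, 4: 1}))  # 50.0
```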
![Page 68: Fall 08 District Capacity Assessment (DCA) Technical Manual](https://reader031.vdocuments.site/reader031/viewer/2022020621/61ea54a47180246938792004/html5/thumbnails/68.jpg)
TAP Protocol Part II: Collect Concurrent Data on Instrument Directions

Qualitative Data

Respondent Problems
☐ Understanding (comprehension of item)  ☐ Task Performance (difficulty executing)  ☐ Response Formatting (response options)
☐ Lexical (meaning of words)  ☐ Omission/Inclusion (scope of a word/item)  ☐ Temporal (time period)
☐ Logical (semantic devices [and, or, other than, etc.], false presuppositions, contradictions, etc.)
☐ Computational (residual category: assign after all others have been considered)
TAP Protocol Part III: Collect Concurrent Data on Items

Item 1
Qualitative:
Respondent Problems:
☐ Understanding (comprehension of item)  ☐ Task Performance (difficulty executing)  ☐ Response Formatting (response options)
☐ Lexical (meaning of words)  ☐ Omission/Inclusion (scope of a word/item)  ☐ Temporal (time period)
☐ Logical (semantic devices [and, or, other than, etc.], false presuppositions, contradictions, etc.)
☐ Computational (residual category: assign after all others have been considered)

Item 2
Qualitative:
Respondent Problems:
☐ Understanding  ☐ Task Performance  ☐ Response Formatting  ☐ Lexical  ☐ Omission/Inclusion  ☐ Temporal  ☐ Logical  ☐ Computational

*Continue for all 28 items

TAP Protocol Part IV: Collect Retrospective Data on Instrument Directions, Items, Response Categories, Scoring Rubric, and General Observations/Questions

Directions

Items
![Page 69: Fall 08 District Capacity Assessment (DCA) Technical Manual](https://reader031.vdocuments.site/reader031/viewer/2022020621/61ea54a47180246938792004/html5/thumbnails/69.jpg)
Response Categories
Scoring Rubric
General Observations/Questions