

Performance Incentives to Improve Community College Completion: Learning from Washington State’s Student Achievement Initiative

A State Policy Brief

Nancy Shulock
Institute for Higher Education Leadership and Policy

Davis Jenkins
Community College Research Center

March 2011


Acknowledgements

This policy brief is based on an ongoing evaluation of Washington State’s Student Achievement Initiative by the Community College Research Center (CCRC) and the Institute for Higher Education Leadership and Policy (IHELP). In addition to the authors of this brief, our research team includes Vivian Cazanis, Colleen Moore, and Jeremy Offenstein of IHELP and John Wachen of CCRC. We are indebted to the administrators, faculty, and staff we interviewed at 17 of Washington State’s 34 community and technical colleges. We also want to thank the staff of the Washington State Board for Community and Technical Colleges, who have provided invaluable input on our analysis and who reviewed drafts of this brief. Thanks also to Thomas Bailey and Kevin Dougherty, who provided helpful comments on an earlier draft of this brief. Funding for this research is provided by the Bill & Melinda Gates Foundation.

About the Authors

Nancy Shulock is executive director of IHELP and a professor of public policy and administration at Sacramento State University. She has authored numerous policy reports on community college performance, finance, and student success and on college readiness and higher education accountability. She has been an academic administrator and a fiscal analyst for the California State Legislature.

Davis Jenkins is a senior research associate at CCRC. He is one of the nation’s leading researchers on ways to improve community college student success and institutional performance. He recently authored CCRC’s Redesigning Community Colleges for Completion: Lessons from Research on High-Performance Organizations.

The Community College Research Center (CCRC), Teachers College, Columbia University, conducts research on the major issues affecting community colleges in the United States and contributes to the development of practice and policy that expands access to higher education and promotes success for all students.

Community College Research Center
Teachers College, Columbia University
525 West 120th Street, Box 174
New York, NY 10027

http://ccrc.tc.columbia.edu

The Institute for Higher Education Leadership & Policy (IHELP) at Sacramento State produces research and services for policymakers, practitioners, and educators to enhance leadership and policy for higher education in California and the nation.

Institute for Higher Education Leadership & Policy
California State University, Sacramento
6000 J Street, Tahoe Hall 3063
Sacramento, CA 95819

http://www.csus.edu/ihelp


Table of Contents

Inside This Brief

Introduction: Washington State’s Student Achievement Initiative and the National Imperative to Improve Community College Completion

The Student Achievement Initiative: A Next-Generation Performance Incentive Policy
    Limitations of Traditional Approaches to Measuring and Rewarding Success
    Development and Design Features of the SAI
    How the SAI Addresses the Limitations of Earlier Policies

Designing Community College Performance Incentives: Policy Challenges and Choices
    Designing a Valid and Useful Measurement System
    Designing an Effective Performance Funding System
    Fostering Conditions for Institutional Change

Next Steps in the National Conversation

Endnotes

References


Inside This Brief

In 2007, Washington State launched an innovative performance incentive policy for community colleges, called the Student Achievement Initiative (SAI). The purpose of the SAI is to use data and performance funding to motivate colleges to implement systemic changes in practice that lead to improved student outcomes. The SAI seeks to address some of the key shortcomings of previous state higher education performance incentive policies and so has attracted a lot of interest from policymakers and funders nationally. By drawing on initial observations from our ongoing evaluation of the SAI as well as knowledge about earlier-generation performance funding policies attempted elsewhere, this brief aims to inform the conversation currently going on in many states about using state policy levers to meet ambitious state and national goals for increased college attainment. To that end, this brief calls particular attention to a number of policy choices that the Washington community and technical college system faced as it designed and began to implement the SAI across its 34 colleges. These are some of the same key choices policymakers and college leaders in other states will confront should they begin to design and enact performance incentive policies as means to improve community college student outcomes:

• Complexity in the measurement framework: What is the best way to balance simplicity and comprehensiveness in selecting the specific measures to use in a performance incentive system that rewards intermediate milestones as well as completions?

• Nature of attainment data to be reported: Should cross-sectional or longitudinal student data be used to compute achievement points? If cross-sectional data are used, how can they be connected to longitudinal cohort analysis to better inform institutional improvement efforts?

• Defining college performance: Is it more prudent to reward colleges on the basis of gains in “total points” or on the basis of gains in “points per student”?

• Proportion of performance-based funding: What portion of the total institutional budget should be distributed on the basis of performance?

• Source of performance funds: Should performance dollars be new appropriations or should they be reallocated from institutional base budgets? What is a reasonable time period and process for reaching this ultimate level of performance funding?

• Mechanism for allocating performance funds: Should performance be factored into the basic funding formula (regardless of funding level and source) or should performance funds be allocated as a bonus on top of regular funding?

• Buy-in and engagement: What activities, such as the creation of a broadly representative task force for developing overarching design principles or the establishment of a low-stakes “learning year” during early implementation, are conducive to gaining the support of key college personnel, such as college presidents and institutional researchers?

• Technical assistance: What sorts of assistance should states provide to help colleges learn how to use data to continuously improve programs and services?


Introduction: Washington State’s Student Achievement Initiative and the National Imperative to Improve Community College Completion

Amid growing signs of America’s weakening position in the global economy, federal and state policymakers and major foundations have set ambitious goals for increasing postsecondary attainment in the United States. Given changing U.S. demographics, it has become clear that these national goals are attainable only with vastly improved outcomes among traditionally less-successful populations, which include academically underprepared students who come from poor primary and secondary schools, adults who are working in low-wage jobs or are unemployed, and immigrants with limited English proficiency. These populations enroll predominantly in the nation’s community colleges. Accordingly, much of the national mobilization around college completion is focused squarely on community colleges as well as on state policy, because community colleges receive most of their funding from state and local sources.

Given declining public resources for public higher education and skyrocketing enrollments, better outcomes from community colleges will require gains in productivity. Colleges will need to learn how to increase student progression and success with the same or fewer resources. Doing so will require community colleges to make systemic changes in their policies and practices.

It is within this context that the Washington State Student Achievement Initiative (SAI) has attracted national attention. As states strive to increase community college student success, many are looking at two policy levers—data and performance funding—to motivate and guide colleges to implement changes in practice that lead to improved student outcomes. Washington’s State Board for Community and Technical Colleges is at the forefront of efforts to develop better measures of student success and to use those enhanced measures as the basis for rewarding colleges for increasing student achievement. In adopting and implementing the SAI, the State Board is seeking to use data and fiscal incentives to drive systemic institutional improvement and thus increase productivity across its 34-college system.

This brief examines key policy issues raised by Washington State’s experience to date with its vanguard performance incentive policy. It draws on our observations from the first year of a three-year, Gates Foundation-funded evaluation of the SAI as well as from our experience with these issues in a national context. Because the evaluation of the SAI and the SAI policy itself are still in their early stages, it is too early to assess the long-term effectiveness of this effort. However, insights from Washington State’s early experience with the policy can help inform the conversation currently occurring in many states on how to use state policy levers to meet ambitious state and national goals for increased college completion. To that end, the brief examines policy challenges and choices that the Washington community and technical college system has grappled with as it designed and is now implementing the SAI, and that policymakers and college leaders in other states will have to address in their efforts to improve community college outcomes.


The Student Achievement Initiative: A Next-Generation Performance Incentive Policy

Neither performance measurement nor performance funding is new in U.S. higher education. Both have decades-long histories as part of the growing interest in public accountability and efficient government.1 But the traditional means of measuring and accounting for outcomes in community colleges have proven inadequate,2 and many of the earlier-generation state performance funding policies have been abandoned.3 In developing the Student Achievement Initiative, the Washington community and technical college system sought to address the shortcomings of earlier policies, including previous policies enacted in its own state.

Limitations of Traditional Approaches to Measuring and Rewarding Success

Traditional measures of postsecondary student outcomes are typically focused on completion rates for students in degree programs. Yet community colleges serve many students who start in remedial or even adult basic skills programs and who may take years to progress toward credentials. Measures that focus on completion in a set period of time thus put colleges that serve large numbers of underprepared students at a disadvantage. As such, they might discourage colleges from serving these students, which is contrary to the community college “open access” philosophy. Moreover, while outcome measures such as graduation rates are important, they provide no information on where students struggle on their paths to earning a credential, thus offering colleges little information on what they can do to improve their performance.

Previous attempts to design performance funding schemes for postsecondary education have also proven inadequate.4 A primary reason is that they have typically been devised without much input from college educators, who often do not embrace the definitions of “performance” reflected in the chosen metrics. Systems that reward completion alone have been especially unpopular across community colleges because they do not reflect the full range of missions assigned to the colleges. By providing rewards primarily for completion, these earlier performance funding systems were feared to encourage colleges to turn away from their historic mission to serve underprepared students, who are the least likely to succeed. Many performance funding systems set targets for colleges to meet in order to earn rewards, which led to controversies about reasonable expectations for open-access institutions. Finally, these earlier performance funding systems were financially unstable. Proposals to carve performance funding from college base budgets were met with stiff political resistance. But funding systems that relied on “new” money often fell victim to budget cuts as institutions fought to protect their base budgets at the expense of special programs.

Another weakness of the traditional approaches to both performance measurement and performance funding is that the impact of these policies did not penetrate to the level of classroom faculty and student support personnel whose interactions with students must change before substantial improvements in student learning and completion can be expected.5 Thus, these tools have had more relevance in the arenas of public accountability and political decision-making than in institutional improvement and effectiveness.

Perhaps because of these limitations, there is scant evidence, if any, that performance reporting and performance funding, separately or in combination, have led to any fundamental changes in institutional practice or effectiveness.6

Development and Design Features of the SAI

Like many states, Washington implemented performance funding in the 1990s, but that effort was short-lived, lasting only from 1997 to 1999. Its demise is attributable to a number of factors, including lack of support from the colleges, which were being affected by a policy that was developed and imposed by a coalition consisting largely of legislators and business leaders.7 This time around the impetus came from within the system. In conceiving the SAI, the State Board was responding to heightened expectations from policymakers and the public for accountability and improved college outcomes, but the Board did this in a proactive way intended to reflect the mission and values of the college system better than the earlier efforts had done.

In September of 2007, the State Board for Community and Technical Colleges launched the Student Achievement Initiative for Washington’s two-year colleges, following its development by a task force comprising State Board members, college trustees, presidents, and faculty representatives. The policy that emerged from the task force reflected a set of design principles that would guide the implementation of the SAI (see box, “SAI Design Principles”).8 These principles indeed shaped the development of the two main components of the policy—the measurement framework and the funding mechanism.

SAI Design Principles

Overall Principles:
• The initiative leads to improved educational attainment for students, specifically the goal of reaching the “tipping point” (of at least a year of college and an occupational certificate) and beyond.
• The initiative allows colleges sufficient flexibility to improve student achievement according to their local needs.
• The initiative results in the identification and implementation of successful practices to improve student achievement system-wide.

Principles for Measurement:
• Performance measures recognize students in all mission areas and reflect the needs of the diverse communities served by colleges.
• Performance measures must measure incremental gains in students’ educational progress irrespective of mission area.
• Measures are simple, understandable, and reliable and represent valid points in students’ educational progress.
• Measures focus on student achievement improvements that can be influenced by colleges.

Principles for Incentive Funding:
• Colleges are rewarded for improvements in student achievement.
• Funding is structured so that colleges compete against themselves for continuous improvement rather than competing with each other.
• Funding is stable and predictable, and cumulative over time.
• Incentive funding rewards student success and becomes a resource for adopting and expanding practices leading to further success.
• New funds provide the greatest incentive.

SAI Measurement Framework

The Student Achievement Initiative sets forth a framework for computing the number of “achievement points” attained by students enrolled in the state’s 34 public two-year colleges. Colleges earn points when students achieve one or more of these educational milestones, which are organized along a continuum from remedial programs (which include adult basic education and pre-college, “developmental” education) up through the completion of credentials and training programs. An achievement point is generated for attainments in any of four areas:

(1) Building toward college-level skills (two types of attainments)
    a. Basic skills gains—increase in skill level based on a standardized test (multiple points attainable by a student)
    b. Passing pre-college writing or math courses with a grade that would qualify a student to advance to the next level (multiple points attainable by a student)

(2) First year retention and progress
    a. Earning 15 quarter credits of college-level course work
    b. Earning 30 quarter credits of college-level course work

(3) Completing college-level math (passing math courses required for either technical or academic associate degrees)

(4) Completions (degrees, occupational certificates, apprenticeship training)

Notably, the achievement points encompass the full range of mission areas, including basic skills, remedial, workforce education, and academic transfer. The State Board provides the colleges with quarterly student unit record data (the colleges operate on a quarter term system) to track trends in the attainment of achievement points by their students. The data are intended to help colleges identify opportunities for improvement and evaluate the impact of efforts to increase achievement. The achievement points are computed for each college based on the total achievements by all students over the course of one year (with quarterly reporting along the way).
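To make the framework’s mechanics concrete, here is a minimal sketch of how a college’s annual point total might be tallied from milestone records. The milestone identifiers, record layout, and the `annual_points` function are hypothetical constructs for illustration, not the State Board’s actual data definitions or computation.

```python
from collections import Counter

# Hypothetical milestone categories loosely mirroring the four SAI areas.
# Each attainment of a listed milestone generates one achievement point.
POINT_CATEGORIES = {
    "basic_skills_gain",         # area 1a: gain on a standardized basic skills test
    "precollege_course_passed",  # area 1b: pre-college writing or math passed
    "earned_15_credits",         # area 2: first 15 college-level quarter credits
    "earned_30_credits",         # area 2: first 30 college-level quarter credits
    "college_math_completed",    # area 3: college-level math passed
    "completion",                # area 4: degree, certificate, or apprenticeship
}

def annual_points(milestone_records):
    """Count achievement points a college earns in one academic year.

    `milestone_records` is an iterable of (student_id, milestone) tuples for
    milestones attained during the year. Totals are cross-sectional: every
    enrolled student who reaches a milestone that year generates a point,
    regardless of when the student first enrolled.
    """
    by_category = Counter()
    for student_id, milestone in milestone_records:
        if milestone in POINT_CATEGORIES:
            by_category[milestone] += 1
    return sum(by_category.values()), dict(by_category)

# Invented example: three students generate four points in a year.
records = [
    ("s1", "basic_skills_gain"),
    ("s1", "precollege_course_passed"),
    ("s2", "earned_15_credits"),
    ("s3", "completion"),
]
total, breakdown = annual_points(records)
print(total, breakdown)  # 4, with one point in each of four categories
```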

SAI Funding Mechanism

The measurement framework yields data on total achievement points earned by category of achievement, but colleges receive funding for increases in total achievement points attained by students in a given year compared to the baseline year. The baseline year was 2006-07 for the first round of performance funding, but it became the prior year after that initial round. Funding per point gain is determined by the amount of dollars available to fund the initiative and the total point gain achieved across the system. The State Board estimates the size of the point gain at the start of the year and announces the approximate value of a point gain that can be earned. Funding per point gain in the first two years of performance funding was $31 and $40, respectively.
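The allocation rule implied above is simple arithmetic: the available pool divided by the system-wide point gain sets a dollar value per point, and each college is paid for its own gain over the baseline. The sketch below illustrates that arithmetic only; the college names and figures are invented, and this is not the State Board’s allocation code.

```python
def allocate_performance_funds(pool_dollars, baseline_points, current_points):
    """Split a performance funding pool across colleges by point gains.

    `baseline_points` and `current_points` map college -> total achievement
    points in the baseline year and the current year. Colleges with no gain
    earn no award; the dollar value per point is pool / total system gain.
    """
    gains = {
        college: max(current_points[college] - baseline_points.get(college, 0), 0)
        for college in current_points
    }
    total_gain = sum(gains.values())
    if total_gain == 0:
        return {college: 0.0 for college in gains}, 0.0
    dollars_per_point = pool_dollars / total_gain
    awards = {college: gain * dollars_per_point for college, gain in gains.items()}
    return awards, dollars_per_point

# Purely illustrative figures: a $1.8 million pool shared by three colleges
# whose combined point gain is 45,000, yielding $40 per point gained.
baseline = {"College A": 400_000, "College B": 300_000, "College C": 350_000}
current = {"College A": 420_000, "College B": 310_000, "College C": 365_000}
awards, rate = allocate_performance_funds(1_800_000, baseline, current)
print(round(rate, 2), {college: round(award) for college, award in awards.items()})
```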

The implementation schedule for SAI funding awards was designed to help colleges understand and adapt to the metrics and funding rules:

• 2006-07 was established as the base year for measuring colleges’ initial performance.

• 2007-08 was designated as the “learning year” for colleges to begin to use data and develop institutional responses prior to the introduction of performance funding; each college received $51,000 in seed money (which was added to their base budgets).

• 2008-09 was the second year of seed money ($66,000 of base budget funding per college).

• In fall 2009 colleges collectively received $1.8 million for performance gains in 2008-09, in addition to retaining the increased base budget funding from the prior years. At a rate of $31 per point gain, the average performance-generated funding award was $60,000 per district. Of the $1.8 million, $800,000 was granted on a one-time basis by the Ford Foundation and was not carried forward into college base budgets.

• In 2010-11 colleges will retain the state-funded awards from the prior years and receive an additional $1.8 million for performance gains achieved in 2009-10, funded at $40 per point gain. The Gates Foundation provided $800,000 of this amount, which, as a one-time grant, will not be carried forward into college base budgets.

The SAI funding mechanism does not affect the formula by which the bulk of the system’s budget is allocated to colleges. SAI funding is provided as a financial reward in addition to the amount each college receives through the system’s basic funding mechanism. Once earned, however, the reward is a bonus to be added to the college’s base budget. For now, the SAI funds constitute less than one percent of the system’s total budget. The intent is to gradually increase reward amounts.

How the SAI Addresses the Limitations of Earlier Policies

The measurement system devised for the SAI addresses one of the key concerns with traditional metrics by monitoring student progression through intermediate milestones on the way to program completion, in addition to tracking completions. The milestones were chosen on the basis of research into where students tend to struggle and drop out.9 If students can be helped to reach these key transition points, research suggests they will build momentum toward completion. A continuum of milestones was developed to span all mission areas of the colleges, with all students, both noncredit and credit, eligible to earn points as they progress. Through another design feature that aimed to improve upon traditional accountability for outcomes, the State Board gives colleges timely data on achievement points earned by students to help them identify areas of weakness and opportunities for improvement.

In an effort to increase support for the initiative from across the college system, the State Board convened a task force of college representatives to develop the SAI measurement system, informed by the research on progression milestones. The task force’s decision to define performance measures around progression points has resonated with policy reform initiatives across the country that are seeking to improve college completion rates.10 Some of the better known examples are the Complete College America initiative, which has developed metrics in concert with the National Governors Association; the Achieving the Dream initiative and its Developmental Education Initiative; and the Access to Success initiative of the Education Trust and National Association of System Heads. As activity around intermediate measures grows, the experience of Washington’s SAI will be increasingly valuable in helping to establish connections between intermediate attainments and progression toward ultimate outcomes.

The design of the funding features of the SAI was also an attempt to rectify some weaknesses in previous performance funding models. By rewarding colleges with funding for helping students reach these intermediate points, rather than just for completion, a greater portion of college efforts is recognized. Concerns about creating disincentives for colleges to serve underprepared students are addressed in two ways. First, the point system emphasizes the progression of students in and through pre-college study, effectively according great value to colleges’ work with underprepared students. Second, colleges earn points based on their own improvement over time rather than on any targets or comparisons with sister colleges. Thus, colleges that serve more disadvantaged students are not penalized for doing so.

The underlying premise of the SAI is that the reporting of data and the financial rewards for student achievement will motivate and guide colleges to undertake systemic changes in practice to improve student outcomes on a substantial scale. These systemic changes are what so many states are seeking today as they attempt to raise education levels. It is especially noteworthy that, unlike many state performance funding systems that were imposed from the outside, in this case the State Board took the lead in developing the SAI. The internal ownership and management of the initiative is, then, another way in which the SAI attempts to overcome the shortcomings of previous policies.


Designing Community College Performance Incentives: Policy Challenges and Choices

The Washington community and technical colleges deserve great credit for introducing a performance incentive policy that seeks to address the limitations of earlier policies and that is focused centrally on improving student outcomes. The State Board identified student success as such an important priority for the future of Washington that it took the risky step of pioneering a new approach across an entire community college system. Beginning the design process with a set of principles developed by a task force with representation from across the system helped immensely in garnering initial buy-in and in easing some of the anxiety that necessarily accompanies policy innovation. Still, the process involved making hard choices about the design of the incentive system.

In this section we describe core policy issues that the Washington State two-year college system faced in designing the SAI, with a focus on its three primary components:

• Performance measures
• Performance funding
• Support for institutional change.

We discuss how decisions about these issues seem to be playing out early on in the implementation of the SAI in Washington State, and we present key choices for policymakers to consider in developing their own performance incentive systems to improve college performance while ensuring accountability.

Designing a Valid and Useful Measurement System

The experience to date with the SAI in Washington suggests that a performance measurement framework based on intermediate measures of student progression is viewed by faculty and staff as more valid than traditional measurement systems and thus can help colleges to focus on student success. Most of what are considered leading states in addressing performance accountability have now begun to incorporate measures of student progression. In deciding how to attach points to the progression measures, the SAI designers opted for simplicity and timeliness of reporting by using annual counts of total points earned by all students. This approach has raised some challenges for colleges in trying to use achievement point data to inform their efforts to improve programs and services on the ground.

Washington State’s Approach

Washington’s effort to design performance measures for the SAI began from a strong foundation that used research from the State Board research staff and the Community College Research Center to help determine the key points where students get stuck and the achievements that provide momentum for further progress and completion. In our interviews with colleges, we heard widespread support for this research, which helped to lend credibility to the measures.

Grounding the performance measures in a set of design principles helped to avoid some of the pitfalls of other measurement systems. For example, the commitment to recognize student progression across mission areas confronted head-on one of the most common complaints about performance measurement—that it penalizes colleges that serve large numbers of educationally disadvantaged students. Similarly, the design principle to measure only those achievements that can be influenced by colleges sent a strong message that this internal effort would avoid the problems encountered by those measurement systems designed by outsiders who do not understand the challenges that colleges face. Among the most important design principles, which was indeed followed, is that the measures should be simple and understandable. Too many measurement systems have failed to deliver because they have tried to incorporate too many measures. It is especially impressive that a broadly representative task force was able to agree on a simple framework.


Our preliminary field research in Washington State indicates that, among those who are familiar with it, the achievement point framework is quite popular across the colleges. In part, this seems to be a result of its ability to provide a common vocabulary around a common goal—student progression. It is important to recognize that the SAI was not introduced in a vacuum. The colleges already had a range of student retention and strategic enrollment management initiatives underway, and they are finding the SAI achievement point framework helpful in giving cohesion to a number of sometimes fragmented existing initiatives. In this environment of heightened public accountability, some college leaders are finding the framework valuable for communicating with trustees, accreditation agencies, grantors, and others about their student success goals and their efforts to meet them. The framework also seems to have elevated the importance of adult basic skills and developmental education. By rewarding achievements in those areas, the initiative has signaled not only that these are important functions of the colleges but also that student success in those areas is vital to increased attainment of certificates and degrees.

Key Choices for Policymakers

While the overall SAI achievement point framework has proved to be compelling to college leaders in and outside of Washington, three of the specific design decisions reflected in the framework raise some trade-offs that should be considered carefully by those thinking of adopting a performance measurement system.

(1) Complexity in the measurement framework

First, in seeking simplicity in the measures across the full spectrum of missions, the State Board chose to focus on a subset of achievement points among many points that could have been selected. There is always a trade-off between simplicity and comprehensiveness in constructing a performance measurement framework, and it is not surprising that arguments have been presented for more, or different, achievements to be counted. In our interviews with college personnel, numerous suggestions surfaced for “tweaking” the point system. Since funding is involved, it is not surprising that these concerns tend to reflect the program area and job duties of the person expressing them. For example, some English faculty question why a point is generated for passing college math but not for passing college English, despite the fact that both are required for an associate degree. Similarly, some faculty in academic transfer programs question why no points are generated for a transfer, since many students transfer without earning an associate degree, generating no points despite the significant achievement. Many such suggestions were offered, indicating that people are potentially responsive to the incentives embedded in the point system and illustrating the trade-offs inherent in the State Board’s decision to design a simple measurement system. Had the Board developed a framework that included the additional measures suggested to us, we might have heard complaints that the system was too complicated.

(2) Nature of attainment data to be reported

Second, the achievement points are computed for each college based on the total achievements by all students over the course of one year (with quarterly reporting along the way). This cross-sectional reporting of achievements in a given time period is quite different from tracking student progress over time. For example, tracking student progress over time—often called “cohort tracking”—would allow reporting of how many students complete developmental education within one year, or how many complete college math within two years. The method chosen for the SAI more simply reports how many of a college’s students earn a particular achievement point in a given year, regardless of how long they have been enrolled. Collecting data on rates of progression of student cohorts over time is more complicated and would involve a lag time that the designers wanted to avoid. In addition, the SAI designers wanted a simple way to show incremental student gains made each year—information that is meaningful to policymakers but less useful to the colleges themselves as they work to identify timely student progression and to evaluate the impact of various interventions.

At the same time, the State Board has recognized that SAI data are most valuable when used in combination with local student-level transcript data. Staff are now working with institutional researchers at the colleges to understand how best to use SAI data together with local data to generate information useful for improving programs and services. These efforts have the potential to promote sharing among colleges of effective practices—something that could not be done with SAI data alone. Colleges ultimately need to conduct cohort tracking to identify strategies for improving student progression and evaluating and further improving those efforts. For example, if a college implemented a new college-wide approach to intake and orientation, it could use local data to generate a new cohort that experienced that innovation and prior cohorts that had not experienced it. It could then use the SAI framework to compare outcomes across cohorts. That is, the rate at which the new cohort earned college readiness and first-year progression points could be compared to the rate for previous cohorts. The SAI measures are valuable when used as high-level indicators to see where, along pathways toward completion, students are most likely to drop out, and to set priorities accordingly. Student-level cohort data can then take the analysis to a deeper level as the progress of subgroups of students can be analyzed with respect to existing or new policies and interventions.
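As a rough sketch of the cohort comparison just described, the following function computes the share of an entering cohort that reaches a given milestone within a fixed window, so that a post-redesign cohort can be compared with earlier ones. The record layout, cohort labels, and milestone name are hypothetical, not the SAI data specification.

```python
from datetime import date

def milestone_rate(cohort, milestone, window_days):
    """Share of a cohort's students who attain `milestone` within `window_days`
    of their first enrollment.

    `cohort` is a list of dicts with keys: 'student_id', 'entry_date', and
    'attainments' (a dict mapping milestone name -> date attained, if ever).
    """
    if not cohort:
        return 0.0
    hits = 0
    for student in cohort:
        attained_on = student["attainments"].get(milestone)
        if attained_on and (attained_on - student["entry_date"]).days <= window_days:
            hits += 1
    return hits / len(cohort)

# Invented example: compare a pre-redesign cohort with a post-redesign cohort.
fall_2008 = [
    {"student_id": "a", "entry_date": date(2008, 9, 22),
     "attainments": {"earned_15_credits": date(2009, 6, 12)}},
    {"student_id": "b", "entry_date": date(2008, 9, 22), "attainments": {}},
]
fall_2009 = [
    {"student_id": "c", "entry_date": date(2009, 9, 21),
     "attainments": {"earned_15_credits": date(2010, 3, 20)}},
    {"student_id": "d", "entry_date": date(2009, 9, 21),
     "attainments": {"earned_15_credits": date(2010, 6, 11)}},
]
for label, cohort in [("fall 2008", fall_2008), ("fall 2009", fall_2009)]:
    print(label, milestone_rate(cohort, "earned_15_credits", window_days=365))
```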

(3) Defining college performance

Third, again in the interest of keeping things simple, the Board chose gains in total points as the common denominator to use in distributing performance bonuses. In effect, this becomes the “bottom line” definition of performance. However, a measure based on total points is highly correlated with college size (i.e., enrollment). A college may achieve point gains by recruiting more students or otherwise experiencing an enrollment increase, while taking no particular actions to help students make better progress. Indeed, a college could generate a point gain even if, on average, students earned fewer points than in the prior year, if the enrollment gain were large enough to outweigh the loss in average points per student. Our analysis of the points generated by colleges during the initial period for which the first performance funding was awarded under the SAI (2006-07 through 2008-09) found that about one-fifth of the gain in points was attributable to enrollment increases and about four-fifths to gains in points per student.
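The one-fifth/four-fifths finding reflects a simple decomposition: total points equal enrollment times points per student, so a year-over-year gain can be split into an enrollment effect, a points-per-student effect, and a small interaction term. The sketch below shows that arithmetic with invented figures; it is our illustration, not the analysis the authors ran.

```python
def decompose_point_gain(enroll_base, pts_per_student_base,
                         enroll_curr, pts_per_student_curr):
    """Split a year-over-year gain in total points into the part due to
    enrollment growth, the part due to change in points per student, and
    an interaction term (total points = enrollment * points per student).
    """
    total_gain = (enroll_curr * pts_per_student_curr
                  - enroll_base * pts_per_student_base)
    enrollment_effect = (enroll_curr - enroll_base) * pts_per_student_base
    per_student_effect = (pts_per_student_curr - pts_per_student_base) * enroll_base
    interaction = (enroll_curr - enroll_base) * (pts_per_student_curr - pts_per_student_base)
    return {
        "total_gain": total_gain,
        "enrollment_effect": enrollment_effect,
        "per_student_effect": per_student_effect,
        "interaction": interaction,
    }

# Invented example: enrollment grows from 10,000 to 10,200 students while
# average points per student rise from 2.00 to 2.15.
print(decompose_point_gain(10_000, 2.00, 10_200, 2.15))
# Total gain is 1,930 points: 400 from enrollment growth, 1,500 from higher
# points per student, and 30 from the interaction of the two.
```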

We believe that if a measurement system is to encourage greater student progress (and overall system productivity), it helps to use a measure that gauges outcomes with respect to inputs. This could be some version of points per student or points per dollar of expenditure. We favor the former because it would inspire needed focus on student progress. Tracking absolute points may be consistent with state and national goals to increase the volume of college degrees, but in a period in which colleges will be compelled to increase the number of graduates with fixed or declining resources, tracking points per student or points per dollar may make more sense.

Designing an Effective Performance Funding System

Performance funding is a complex topic with interconnected technical and political factors involved. The history of performance funding has been a checkered one at best, with most state policies proving unsustainable.11 It is fair to describe performance funding as an unsettled paradigm with a lack of consensus on effective design principles.

We noted earlier that Washington’s SAI has addressed some of the serious limitations of early generations of performance funding by developing a definition of performance that encompasses all mission areas and avoids penalizing colleges that serve underprepared students. But far less settled is the question of how to reward performance financially, once performance is defined. We find it helpful to distinguish among three features of any funding system that attempts to create fiscal incentives for performance, that is, three choices that policymakers face:

(1) The amount of performance funds: What portion of the total institutional budget is distributed on the basis of performance?

(2) The source of performance funds: Are the performance dollars new appropriations or are they reallocated from institutional base budgets, or some combination?

(3) The mechanism for allocating performance funds: Is performance factored into the basic funding formula (regardless of funding level and source) or are performance funds allocated as a bonus on top of regular funding?

We have observed that the second and third choices are often conflated in discussions about the design of performance funding. It is often assumed that if new funds are identified to support performance they somehow need to be set aside and used to reward performance after base institutional funds are distributed on the basis of enrollment. But new funds could be added to the base with the augmented total distributed by means of a formula that rewards enrollment and performance in some combination. Conversely, a percentage of the base budget could be set aside to be used as bonus funds to reward performance without changing the basic enrollment-driven funding model for the balance of each institution’s budget. The point is that there are choices to be made on all three dimensions of performance funding design.

Washington State’s Approach

In designing the funding mechanism, the State Board and the SAI task force drew upon expert opinion about performance funding and made a set of considered choices. Based on this advice, the State Board and the task force designed a performance funding approach that (1) at least initially devotes less than one percent of the system budget to performance with no clear plans to phase in increases; (2) relies, in principle, on new funding alone; and (3) leaves the base funding formula unchanged while allocating SAI performance dollars as bonuses. To mitigate opposition to the new policy, Washington started small. The “learning year,” in which colleges were given non-competitive funding to lay the groundwork locally, and the small budgetary stakes thereafter have reduced resistance up to this point. The State Board has also stressed that performance funding is for purposes of internal improvement and not for punishment or inappropriate comparisons.

Key Choices for Policymakers

Washington State has so far gone through two cycles of SAI performance funding. Based on this experience, and in light of a dramatically changed fiscal situation in Washington and most other states, we can examine in hindsight crucial issues that flow from initial design decisions.

(1) The amount of performance funds

Definitive evidence on how much performance funding it takes to stimulate systemic change is lacking (because most past attempts have dedicated only small amounts to performance). However, the consensus among the college leaders we interviewed in Washington is that the amount allocated so far through the SAI—less than one percent of the budget—is insufficient to inspire the kinds of fundamental systemic changes the State Board is seeking. To date, colleges seem to have spent the modest funding they received from the SAI on relatively small-scale, isolated activities. Some states that have borrowed heavily from the design of the SAI measurement framework have, in fact, chosen different approaches to funding.12 In contrast to Washington, Ohio has set 20% as the budget target for full implementation of its community college performance funding policy, which is based in part on the Washington achievement point model, and other states have similarly set their sights on a significant portion of budgets dedicated to funding performance. Still, regardless of the portion decided on, we believe that it is prudent to phase in performance funding gradually so that colleges have time to learn from the data and implement better practices. A slow, staged implementation toward a significant level of funding supports the goal of continuous improvement. While policymakers may push for quicker implementation for public accountability purposes, such approaches are understandably viewed as punitive and counterproductive from inside the academy.

(2) The source of performance funds

In Washington, the State Board was advised by outside experts at the outset that using new funds is more effective in creating fiscal incentives to increase student success. Certainly new funds are more politically popular, as no college would choose to lose base funds over earning new funds. But whether new funds are more effective than reallocated funds in changing behavior is an unresolved empirical question. Regardless, the design principle to rely on new funds became impossible to follow as the budget situation deteriorated in Washington State. As the community college system budget was being reduced significantly over the last two years, the SAI was funded increasingly through base reallocation. The popularity of the SAI with the governor and the legislature very likely inoculated the system against even larger budget cuts, as lawmakers agreed to preserve some funding that might otherwise have been cut as long as it was used for performance. Yet the college leaders we interviewed did not seem to have considered the political capital that the SAI may have bought them; instead they focused on what they perceive as a reneging on the promise not to take away funds to support the SAI. One lesson emerging from this difficult situation may be that more research is needed to examine the impact of new base budget incentives on institutional behaviors. What is clear, however, given the fiscal realities facing public postsecondary education, is that states implementing new performance funding programs over the foreseeable future should not structure them around new funding if they have any intention of devoting a significant amount of funds to performance.

(3) The mechanism for allocating performance funds

As with the prior choice of new versus reallocated funds, this choice necessarily confronts a powerful institutional point of view that colleges are already underfunded for basic enrollment and operations. According to this view, major new initiatives should be accompanied by specially designated funds. Colleges see increasing numbers of what they call “unfunded enrollments” and believe they can justify base budget increases on those grounds alone. If they are to be expected to make fundamental performance gains, they expect special funding designated for that purpose. This attitude has become pervasive across American higher education as lawmakers repeatedly add categorical programs or other budget line items to accomplish policy objectives.

The bonus mechanism adopted by the State Board in Washington reflects this perspective by leaving the base budget allocation methodology unchanged and creating a pool of additional bonus funding. Yet even in the best of budgetary times, this arrangement could be seen as carrying the message that colleges are funded to operate, irrespective of performance levels, and are expected to improve their performance only to the extent that special funding becomes and remains available. In the worst of budgetary times, as have hit Washington and other states, the bonus mechanism becomes very difficult to explain. We believe that it is not credible to tell colleges they are getting a bonus for performance when, from their perspective, they have had their budgets cut by far more than the amount of the bonus. Several college leaders we interviewed were upset by what they see as the “raw deal” of having to earn back even just a portion of the funds they have lost. Although not materially different in terms of total resources, it seems that a more credible message would be that today’s fiscal realities require the state to invest its scarce resources in success. That, in turn, would require a phased-in change to the way colleges are funded—moving away from straight enrollment funding toward funding colleges in part for their ongoing work to help students make progress and complete their education. Colleges might still complain about having to earn back their lost funds, but the reason for having to do so would be clear: they are living in a new fiscal reality in which taxpayers invest in performance.

The most important lesson from the performance funding aspect of the SAI implementation to date is that communication is critical. Ways must be found to simplify explanations of what is by nature a very complex undertaking. Despite the vigorous communication efforts of the State Board staff, we were told at every campus we studied that most faculty and staff, including high-level administrators, have no familiarity with the complexities inherent in performance funding, including the design choices noted above and the means available for defining “performance” to avoid perverse incentives, such as discouraging enrollment of underprepared students. States will be well served by engaging college leaders in ongoing discussions about the goals, choices, and strategies attendant to performance funding in their states. Protecting funding levels in Washington and elsewhere may increasingly require college systems to incorporate performance incentives into their base funding formulas. As state policymakers work to invest state funds most productively to increase college attainment and improve economic competitiveness, current methods of funding colleges may no longer be politically and economically sustainable. Common ground among policymakers and colleges might be found in the claim that performance funding can enhance total funding which, in turn, allows college faculty and staff to focus more intently on student success.


Fostering Conditions for Institutional Change

We have already described what will be required for the data to become more useful for identifying effective practices and for the funding incentives to be more likely to motivate systemic change. Research and reform around the country have shown that data and incentives alone may not be sufficient, however. Colleges can collect and report data as a compliance activity without learning anything or doing anything as a result.13 New fiscal incentives often fail to penetrate the collective consciousness of institutions, and even if they do, they can lead to gaming or other accounting devices that only give the appearance of improvement toward the intended outcomes.14 Here we consider the conditions that promote systemic change and how states can foster them.

Washington State’s Approach

The State Board’s strategy for achieving institutional change has primarily involved gaining internal buy-in and engaging in extensive educational sessions about the SAI across the system, particularly during the early stages of planning and implementation. Acting on the knowledge that performance reporting and performance funding policies imposed from the outside have not worked well and have not gained buy-in from the institutions they affect, the State Board formed a task force with broad representation from across the system to develop the SAI. College presidents were a critical part of the task force, as on-the-ground leadership is vital to instilling a culture supportive of continuous improvement. That has been an invaluable factor in the sustainability of the SAI to this point. Also valuable was the learning year that helped colleges understand the new initiative in a low-stakes environment. Throughout that year the State Board engaged in a major communication effort, with extensive site visits and televised sessions broadcast to various segments of the college communities. In the subsequent years, because of resource constraints and the need to address other priorities, the Board reduced the scope of its communication efforts but attempted to keep the colleges informed through the system’s presidents’ council and through its work with the various statewide professional councils that have representatives from every college. Board staff also worked extensively with college institutional researchers to address identified problems with the data, knowing that the validity of the data is a precondition for the success of the initiative. Recently, the State Board began to work with the colleges to provide much more hands-on technical assistance on using data for improvement. There is a good infrastructure in place across the system that allows the Board to do this on a system-wide basis. Working with the various professional councils (e.g., basic skills, workforce, and academic transfer) and including college institutional researchers in these meetings, the Board can greatly improve the prospects for data-informed improvement across the system.

Key Choices for Policymakers

Achieving a change in institutional culture in any organization is difficult, and doing so demands a great deal from institutional leaders. But in higher education, with shared governance and shared leadership, the change process is especially challenging.

(1) Buy-in and engagement

From our interviews we identified presidents and institutional researchers as key players influencing a college’s response to the SAI—the former for obvious reasons and the latter because researchers’ attitudes about the validity and usefulness of data can promote or impede willingness to use the data as a basis for decision-making. For their part, presidents face some strategic choices. For example, do they frame the SAI as a discrete initiative and work with faculty leaders and others to implement it? Or does that risk a backlash of “just another thing we have to add to our list when we’re already over-worked”? Is it more effective to frame the initiative as a way to bring together existing college initiatives and establish a common vocabulary around them? Presidents also face a choice about how quickly and thoroughly to engage the campus—particularly the faculty—around the SAI. Given a history of faculty opposition to performance funding and a history of many short-lived reforms, is it best to take a more muted approach at the outset, perhaps until it is abundantly clear to faculty that the initiative is not going away?

Another important choice to be made by presidents involves strategies for using the performance funds earned by their colleges. In Washington, some presidents have used the funds to support small-scale efforts aimed at improving student success, such as a new orientation program for special populations or additional advising staff. Other presidents have so far chosen to fold SAI funding into the overall college budget. The former investments, while garnering support among the direct recipients, were too small to have a systemic impact, while the latter approach limited awareness of the funds that are intended to spur new institutional behaviors.

Research on practices associated with organizational improvement in and outside of higher education suggests that deep engagement of faculty and staff with student data is another precondition of fundamental change in institutional policies and practices.15 Achieving this kind of engagement requires political and technical strategies. Politically, it is important to get influential faculty on board as spokespersons for the value of these kinds of data. It also requires a substantial commitment of faculty and staff time, including the time of institutional researchers.

(2) Technical assistance

Washington’s experience suggests that technical assistance to colleges is both critical and time-consuming. Moreover, providing technical assistance to colleges on using data to continuously improve programs and services is a new role for many state higher education agencies. At the very least, state agencies should provide forums where faculty and staff can get together from across colleges and discuss key problems with student progression and share promising solutions, backed up with evidence. In Washington, the State Board has partnered effectively with the various statewide councils of college personnel, in a number of cases bringing different councils together to share promising practices. Many other states have similar statewide organizations and could seek to work with them strategically as the Washington State Board has done.

In a period in which community colleges, not richly funded to begin with, are facing draconian budget cuts even as enrollments increase, any discussion of changing the measures and means by which colleges are funded is going to be politically sensitive. Faculty and staff are likely to argue that such changes should wait until college budgets return to more normal levels. Yet it is precisely because the current climate has likely become the “new normal” that states such as Washington are charting new ground in seeking to use scarce resources in ways that provide incentives for colleges to improve outcomes. Communication is essential in this uncertain and challenging environment. Fear and resistance are natural reactions, and a lack of information increases these tendencies. State agencies must explain clearly the goals and strategies of performance incentive reforms to college leaders who, in turn, need to communicate these issues across their campuses. Systemic change is unlikely to take hold unless the majority of stakeholders understand the political and economic realities that are shaping strategies around performance reporting and funding.

Next Steps in the National Conversation

The Student Achievement Initiative is a vitally important effort from which there is, and will continue to be, much to learn. The Washington State Community and Technical College System deserves accolades for its willingness to innovate to such an extent. The stature of community colleges and the rates of student success across the country would likely be higher if all state community college systems shared the spirit, commitment, and tenacity of the Washington system.

Evaluation of the SAI continues through 2012. Even as Washington considers refinements to its own model, the national conversation around intermediate measures, performance funding, and institutional change can benefit from Washington’s experience to date in implementing the SAI. In addition to considering the policy challenges and choices we presented above, we hope that those involved in similar efforts will put some thought into additional questions to further advance collective knowledge about these issues (see box, “Performance Incentive Policies: Further Questions for Consideration”).

National efforts to increase college completion and use resources more productively stand to be enriched by the Washington experience. Through its integrated application of new approaches to data and funding, the Student Achievement Initiative is raising important questions, yielding some useful answers, giving states a model to adapt to their own circumstances, and generally providing hope for better student outcomes in the nation’s precious community college sector.

Performance Incentive Policies: Further Questions for Consideration

Questions About Performance Measurement Systems:

• How can annual count data on student achievement, such as data from the SAI achievement point system, be used most effectively in combination with student longitudinal cohort data as a basis for identifying problems and improving practices?

• Given the value of cohort data but the delays attendant to tracking progress over time, how can colleges get timely feedback on the impact of their actions to help students?

• Does public accountability for intermediate student achievements, short of completion, help or hurt community college efforts to convey their value to external stakeholders (general public, business, trustees, etc.) who may place more value on the award of credentials?

Questions About Performance Funding Systems:

• What kinds of messaging and communication strategies can increase understanding across colleges and among policymakers about the purposes of performance funding and the design choices involved?

• How can performance funding systems best provide incentives for colleges to serve underprepared students?

• How does a college’s attitude toward performance funding change with experience over time, and how does it vary with respect to its relative performance?

Questions About Promoting Systemic Institutional Change:

• What campus practices seem most conducive to fostering widespread engagement with data for purposes of learning how to improve student outcomes?

• What good models exist for integrating performance data into a college’s strategic planning and institutional improvement processes?

• How can colleges innovate at scale in order to have the best chance of achieving systemic change?



References

Burke, J. C., & Associates (Eds.). (2002). Funding public colleges and universities for performance: Popularity, problems, and prospects. Albany, NY: Rockefeller Institute Press.

Burke, J. C., & Associates (Eds.). (2005). Achieving accountability in higher education: Balancing public, academic, and market demands. San Francisco, CA: Jossey-Bass.

Dougherty, K. J., & Hong, E. (2006). Performance accountability as imperfect panacea: The community college experience. In T. Bailey & V. S. Morest (Eds.), Defending the community college equity agenda. Baltimore, MD: Johns Hopkins University Press.

Dougherty, K. J., & Kerrigan, M. R. (with Crosta, P., Nienhusser, H. K., & Walker, N.). (2007). Fifty states of Achieving the Dream: State policies to enhance access to and success in community colleges across the United States. New York, NY: Columbia University, Teachers College, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?UID=504

Dougherty, K. J., & Natow, R. S. (2009). The demise of higher education performance funding systems in three states (CCRC Brief No. 41). New York, NY: Columbia University, Teachers College, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?uid=694

Dougherty, K. J., Natow, R. S., Hare, R. J., & Vega, B. E. (2010). The political origins of state-level performance funding for higher education: The cases of Florida, Illinois, Missouri, South Carolina, Tennessee, and Washington (CCRC Working Paper No. 22). New York, NY: Columbia University, Teachers College, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?UID=819

Indiana Commission for Higher Education. (n.d.). Reaching higher. Retrieved from http://www.in.gov/che/2349.htm

Jenkins, D. (2011). Redesigning community colleges for completion: Lessons from research on high-performance organizations (CCRC Working Paper No. 24, Assessment of Evidence Series). New York, NY: Columbia University, Teachers College, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?UID=844

Leinbach, D. T., & Jenkins, D. (2008). Using longitudinal data to increase community college student success: A guide to measuring milestone and momentum point attainment (CCRC Research Tools No. 2). New York, NY: Columbia University, Teachers College, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?uid=570

Morest, V. S., & Jenkins, D. (2007). Institutional research and the culture of evidence at community colleges (Culture of Evidence Series, Report No. 1). New York, NY: Achieving the Dream and Community College Research Center, Teachers College, Columbia University. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?uid=515

Offenstein, J., & Shulock, N. (2009). Community college student outcomes: Limitations of the Integrated Postsecondary Education Data System (IPEDS) and recommendations for improvement. Sacramento, CA: California State University, Sacramento, Institute for Higher Education Leadership and Policy. Retrieved from http://www.csus.edu/ihelp/PDFs/R_IPEDS_08-09.pdf

Offenstein, J., & Shulock, N. (2010). Taking the next step: The promise of intermediate measures for meeting postsecondary completion goals. Boston, MA: Jobs for the Future. Retrieved from http://www.csus.edu/ihelp/PDFs/R_TakingtheNextStep_091410.pdf

Ohio Board of Regents. (2010). State share of instruction handbook: Providing the methodology for allocating state share of instruction funds for fiscal year 2010 and fiscal year 2011. Retrieved from http://regents.ohio.gov/financial/selected_budget_detail/operating_budget_1011/handbook-cc.pdf

Tennessee Higher Education Commission. (2010). 2010 outcomes based funding formula model presentation. Retrieved from http://www.tn.gov/thec/Divisions/Fiscal/funding_formula_presentation.html

Washington State Board for Community and Technical Colleges. (2007, September 12). Regular meeting agenda item. Retrieved from http://www.sbctc.ctc.edu/college/education/proposal_to_board_sept07.pdf

Wellman, J. V. (2002). Statewide higher education accountability: Issues and strategies for success. Washington, DC: National Governors Association.

Zumeta, W. (2001). Public policy and accountability in higher education: Lessons from the past and present for the new millennium. In D. E. Heller (Ed.), The states and public higher education policy: Affordability, access, and accountability (pp. 155–197). Baltimore, MD: Johns Hopkins University Press.


Endnotes

1. See Dougherty, Natow, Hare, & Vega (2010); Wellman (2002); and Zumeta (2001).
2. Offenstein & Shulock (2009).
3. See Burke & Associates (2005) and Dougherty & Reid (2007).
4. See Burke & Associates (2005) and Dougherty et al. (2010).
5. Burke & Associates (2002).
6. Dougherty & Hong (2006).
7. Dougherty & Natow (2009).
8. Washington State Board for Community and Technical Colleges (2007).
9. Leinbach & Jenkins (2008).
10. Offenstein & Shulock (2010).
11. See Burke & Associates (2005) and Dougherty & Hong (2006).
12. See, for example, Tennessee Higher Education Commission (2010), Ohio Board of Regents (2010), and Indiana Commission for Higher Education (n.d.).
13. A national survey of community college institutional research practices found that top administrators generally do not use data on student outcomes for decision making. This is the case even in some colleges that have well-staffed research departments and strong information systems. See Morest & Jenkins (2007).
14. See Burke & Associates (2005) and Dougherty & Hong (2006).
15. For a review of research on ways community colleges can improve organizational performance, see Jenkins (2011).