
Southeast Evaluation Association

September 2006

President: Kaye Kendrick Secretary: Kathy McGuire Treasurer: Ghazwan Lutfi

P.O. Box 10125, Tallahassee, FL 32302

Newsletter

THE PRESIDENT'S COLUMN

SEA Annual Conference Announced for January 2007

Hello SEA Members and Readers:

This has been a busy year for SEA, starting with a successful annual conference at the Tallahassee Civic Center. It had more than 160 participants – perhaps a little more rain than some of the out-of-town folks are used to – but the programs and networking were phenomenal.

Our new President-Elect, Betty Serow, has already begun working on next year’s conference, which promises to be even bigger and better.

Our Program Committee co-chairs, Christine Johnson and Mary Kay Falconer, have already organized three workshops: categorical data and survival analysis (May 3), qualitative data analysis (May 23), and Florida's new state contracting legislation (August 22). We are looking forward to more state-of-the-art programs as the year progresses.

Our new Treasurer, Ghazwan Lutfi, Professor at Florida A & M University, Yahong Zhang, our Intern and Graduate Student at Florida State University, and I have been enhancing SEA’s internal management systems. We are implementing QuickBooks on-line, so that we may provide more efficient accountability and communication. This new system will improve our capability to maintain our member and mailing lists, monitor our budget and expenses, and account for our programs.

By the end of the year, we hope to be able to accept credit card payments for membership and program fees. Our vision also includes website enhancements to better support member services and networking. If you have website design expertise and are willing to help, please let me know.

Ultimately, we are working toward doing our part in the Southeast to support the American Evaluation Association’s mission to: (1) improve evaluation practices and methods, (2) increase evaluation use, (3) promote evaluation as a profession, and (4) support the contribution of evaluation to the generation of theory and knowledge about effective human action.

I am honored to be serving as your President for 2006. I am having so much fun getting to know my fellow Board members. The synergy of our group is very special, with members who have the most interesting backgrounds, special skills, and talents. So, please join us – be a member, be a participant, be a volunteer! You may reach me at (850) 509-5927 or e-mail me at [email protected] with your ideas, suggestions, and questions.

Kaye Kendrick, SEA President

SEA Members and Friends,

Mark your calendars now! Our next SEA Annual Conference, “Ethics, Evaluation and Accountability,” will be held on January 18-19, 2007. Our keynote speaker will be Dr. Michael Morris, a Professor in the Department of Psychology at the University of New Haven in Connecticut. Mike is editor of the Ethical Challenges section in the American Journal of Evaluation, Chair of the Ethics Committee of the American Evaluation Association, and author of the “Ethical Considerations in Evaluation” chapter in The International Handbook of Educational Evaluation.

As in years past, Dr. Morris also will conduct a full-day pre-conference workshop. Because he will be using a case-based approach, space will be limited. The workshop will be on January 17, 2007. Keep your eyes open for registration materials in the fall.

Start thinking now about theme-related papers you might want to contribute, or panel sessions you might want to organize. Look for the ‘call for papers’ in September.

Susan McNamara and I are co-chairing the conference committee this year, and we are looking for members interested in being part of the fun. If you enjoy doing logistics, want to help with registration, are interested in reading the abstracts submitted for inclusion, or just want to be more active in your organization, please contact me at [email protected] or (850) 893-5522.

Betty Serow

President-elect and Conference Co-Chair

Want to get involved in the SEA?

Well, here’s how! Contact SEA at our website e-mail address: [email protected]. There are numerous opportunities for members to help out…no contribution is too small!

Thanks to everyone who helped with this newsletter: Yahong Zhang, Christine Johnson, Kathy McGuire, Mary Kay Falconer, Betty Serow, Christopher Sullivan and other SEA board members!


PAGE 2

REVIEW OF RECENT JOURNAL ARTICLES

Accounting for the Value of Performance Measurement from the Perspective of Midwestern Mayors

Alfred Tat-Kei Ho

Journal of Public Administration Research and Theory, April 2006: 217-237

Many of the past debates about the impact of performance measurement and performance budgeting have been normative or descriptive. How performance measurement is integrated into decision making remains a “black box.” This article studies the decade-long question, “Does performance measurement matter?” by examining how and why Midwest mayors perceive value in performance measurement. The results show that the tool is perceived positively, but its impact on decision making depends on whether performance measurement is integrated into strategic planning, goal setting, and internal communication between city council members and departmental staff, and on whether major stakeholders are involved in developing performance measures. The article discusses the implications for future results-oriented reforms and concludes that simply reporting performance information in budgetary or public documents is not enough. Rather, a more comprehensive look at the implementation issues of performance measurement and performance budgeting is necessary.

The full article is available at: http://www.southeastevaluation.com/Ho%202006.pdf

Efforts to Improve Public Policy and Programs through Data Practice: Experiences in 15 Distressed American Cities

Beth C. Weitzman, Diana Silver and Caitlyn Brazill

Public Administration Review, May/June 2006: 386-399

Philanthropies and government agencies interested in children’s issues are encouraging localities to improve the process of collecting, linking, and sharing micro-data and aggregated summary statistics. An implicit assumption of these efforts is that outcomes will improve as a result of the new approaches. In this article, the authors examine efforts to improve data practice in 15 distressed American cities. Interviews conducted in these cities revealed variation in the types of information collected, dissemination, and intended audiences. They identify significant challenges to these efforts, including adequate resources, turf battles, technical problems, access to information sources, inconsistent leadership, and absence of political will. This study finds that little is known about the impact of these initiatives on decision making. Assumptions that improved data practice will lead to improved policy making have not yet been realized in these cities.

The full article is available at: http://www.southeastevaluation.com/Weitzman%202006.pdf

The Triumph of Numbers: Knowledges and the Mismeasure of Management

Ralph P. Hummel

Administration & Society, March 2006: 58-78

We live in a world of numbers, but numbers have become so dominant that we consider nothing to be real unless it can be measured and mathematized. How did we come to live by the numbers? To answer, Husserl unraveled the secret of how geometry must have evolved, giving one model to explain how a tool deeply involved in measuring for human purposes could become a tool of totally detached knowledge. The tale of geometry prefigures how knowledge becomes not only devoid of human purposes but is capable of turning against humanity as we become careless of human concerns.

The structure of work in modern organizations is now seen in the light of the evolution of measurement. But measurement has proceeded from the rough and ready to increasing perfection. This drive to perfection is part of a civilization-wide movement toward—ironically—a thoughtless faith in pure ideas, an idealism devoid of context or content, rationalism estranged from reasonability. The history of this development reminds us of what is humanly at stake when we engage in the misuse of measurement.

The full article is available at: http://www.southeastevaluation.com/Hummel%202006.pdf

BOOK REVIEW

Civil Society: Measurement, Evaluation, Policy

Helmut K. Anheier, Earthscan Press, April 2004

Civil society — comprising the activities of non-state organizations, institutions and movements — has in recent years emerged as the major force for change in the realms of politics, public policy and society, both globally and locally. Yet, despite the crucial importance of this political phenomenon to the principle and practice of democracy, it eludes definition and systematic understanding. This book provides a comprehensive and flexible framework for the definition, measurement, analysis and interpretation of civil society based on the innovative “Civil Society Diamond.” Written as a guide for both practitioners and academics, the book presents precise and insightful solutions to the issues of how to understand the concept of civil society, where to locate it theoretically and empirically, and which techniques are best suited to its measurement.

The approach presented here has been successfully adopted across a wide range of civil society organizations in over 50 countries. The author draws on and applies a diverse repertoire of indicators, tools and data — suitable for various organizational forms, practical contexts and theoretical perspectives — which measure the effectiveness of civil society initiatives and reveal certain strategic and policy options. The aim is to promote and facilitate structured, informed and fruitful dialogue both within civil society organizations and between them and the governmental, corporate and academic actors with whom they are now so integrally linked.



PAGE 3


BIOGRAPHIES OF SEA OFFICERS - 2006-2007

Kaye Kendrick, CPA
SEA President (2006)

Kaye is the current President of SEA. She has a business and management consulting practice which specializes in performance enhancement, with a mission to foster organizational harmony, healthy families and holistic health. Her firm is a licensed CPA firm. Kaye enjoys assisting organizations and people in their accomplishments. She also enjoys swimming, kickboxing, flowers, and spending time with her husband and two teenage daughters.

Fran Berry, Ph.D.
SEA Past-President (2005)

Fran is Director of the Askew School of Public Administration and Policy at Florida State University, and holds the Frank P. Sherwood Professorship in Public Administration. She has published widely, and her scholarly research explores these topics: (1) policy innovation, diffusion and change; (2) strategic management in state, local and nonprofit agencies; and (3) implementation, evaluation and utilization of policy and administrative reforms. Since moving to Florida in 1990, she has served as project director on over three dozen projects with Florida governmental and nonprofit agencies. Before joining the faculty, Fran worked as director of research and executive leadership at the Council of State Governments. She teaches public policy theory, policy evaluation and development; strategic management; and intergovernmental relations.

Betty Serow, Ph.D., M.P.H.
President-Elect (2006), Conference Co-Chair

Betty has been a member of SEA for over 15 years, and has chaired various Association committees. She works in the Office of Planning, Evaluation and Data Analysis at the Florida Department of Health. Her role there is making data easily available to and understandable by people working in communities to improve the health and well-being of their citizens. Betty has been working in the area of community development and public health for over 25 years. In her spare time she sings with the Tallahassee Community Chorus, and travels as much as possible.

Ghazwan A. Lutfi, Ph.D.
SEA Treasurer

Ghazwan is an Associate Professor in the Department of Educational Leadership and Human Services, College of Education, Florida A&M University. He has expertise in educational research methods, statistics, assessment, and program evaluation. His research interests are in survey methods, accountability, and program evaluation. He has served as an evaluator and statistical consultant for various programs and grants, and he has presented and published several articles.

Kathy McGuire
SEA Secretary, Essential Skills Workshop Chair

Kathy is the Deputy Director for the Florida Legislature’s Office of Program Policy Analysis and Government Accountability (OPPAGA). She has been doing evaluation work for the past 25 years. She has enjoyed attending the SEA conferences for years and is happy to serve on the board this year. When she is not at work she enjoys traveling with her family, reading, and dancing.

Susan R. McNamara, M.S.
Conference Committee Co-Chair

Susan has been involved in SEA since its first conference, was Communications Chair for 14 years, and designed the original SEA website and listserv. She was presented a lifetime membership award in 2003. Susan currently works at the Agency for Health Care Administration, in the Medicaid Research and Policy unit of the Quality Management Bureau. Her responsibilities include managing the Medicaid Reform Evaluation contract; assisting with the Research Conference; and researching issues for the public concerning health care for the uninsured. She is married to Bob O’Lary. Susan’s hobbies include collecting comedy music and playing computer games.

Christine E. Johnson, M.S.
Program Committee Co-Chair

Christine joined SEA in 2002 and co-chaired the 2003 conference. She is an Associate in Research at the Learning Systems Institute, Office of the Provost, Florida State University, where she manages research and policy projects related to K-12 education. She has 27 years of experience in educational policy research, evaluation, performance measurement, and project management for a wide variety of organizations, including Florida State University, the Florida Governor’s Office, state agencies, and private, non-profit organizations. She holds an M.S. degree in psychology from FSU and has completed coursework toward a doctorate in psychology with a minor in research and evaluation.

Mary Kay Falconer, Ph.D.
SEA Program Committee Co-Chair

After completing her doctorate in sociology/demography in 1983 at Florida State University, Mary Kay worked for 2 years at the Florida Department of Health and Rehabilitative Services. For the next 11 years, she was an analyst and then director of a legislative committee in the Florida Legislature. Her next career move was to the School of Social Work as a Research Associate in the Institute for Health and Human Services Research at Florida State University. In July 2003, she was hired as a senior evaluator at the Ounce of Prevention Fund of Florida to conduct research on participants in Healthy Families Florida and other programs funded by the agency.


PAGE 4

METHODOLOGY: MULTIPLE STAKEHOLDER INTERESTS IN PERFORMANCE ANALYSIS

How to Address the Issue of Multiple Stakeholder Interests in Performance Analysis for the Public Services

Xiaohu Wang, Ph.D.
Associate Professor, University of Central Florida
[email protected]

This essay examines a key issue in conducting outcome-oriented performance analysis for public services. Outcome-oriented performance analysis (OPA) is broadly defined as the use of quantitative outcome performance indicators to systematically analyze organizational performance in order to improve organizational performance and accountability. In practice, OPA is a process of developing analysis questions, measures, theories, and designs, and analyzing empirical data to draw analytical conclusions and recommendations for performance improvement and accountability. It consists of the following components:

· Determining questions for the analysis;
· Defining and measuring performance;
· Developing theories on what impacts performance;
· Designing the mode of data collection to answer the analysis questions;
· Collecting data;
· Conducting the analysis; and
· Completing the analytical report.

What are the multiple interests in OPA and why do they exist?

One major difficulty in OPA for the public services is the existence of multiple interests in the analysis. The absence of a common market to assess the economic value of a product makes it difficult to develop a set of goals collectively agreed upon by stakeholders of public services, and therefore stakeholders are left with great discretion to determine their own priorities of service goals and preferences of implementation strategies. Moreover, stakeholders of a public service agency often have diverse and changing interests, and these interests are likely inconsistent, or even contradictory to each other, in that the fulfillment of one harms another. For example, a desire to improve students’ academic performance measured by standard test scores may be inconsistent with the interest to increase accessibility to an educational service, and contradictory to the goal of educational cost control.

Consequently, it is likely that different stakeholder groups want different questions answered in performance analysis. The existence of multiple analysis questions is not a problem when measures are available and designs are properly developed to answer them. Nevertheless, OPA implementation becomes problematic when inconsistent or contradictory questions are raised. Even in the simplest goal-setting process, with just two stakeholders and two possible goals, the possible combinations of goal preferences can be more than OPA analysts can reconcile, and the chance that the two stakeholders arrive at the same goal preference is slim. The possibility of consistent preferences drops dramatically as more stakeholders or more goals become involved.
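Wang’s combinatorial point can be made concrete with a small calculation. The sketch below is illustrative only and rests on an assumption the essay does not make: that each stakeholder independently picks one of the G! possible orderings of G goals, with every ordering equally likely.

```python
# Illustrative sketch (not from the essay): probability that all stakeholders
# independently choose the same preference ordering of the goals, assuming
# each of the G! possible orderings is equally likely.
from math import factorial

def p_all_agree(stakeholders: int, goals: int) -> float:
    orderings = factorial(goals)               # G! possible preference orderings
    return (1 / orderings) ** (stakeholders - 1)

print(p_all_agree(2, 2))   # two stakeholders, two goals: 0.5
print(p_all_agree(5, 4))   # five stakeholders, four goals: about 3e-06
```

Even under this charitable uniform assumption, agreement collapses quickly as stakeholders or goals are added, which is the essay’s point.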

Changing Questions in OPA

Analysis questions may change in OPA, either because of the natural progression of the analysis as new data and measures become available or, more often, because of stakeholders’ changing expectations for the OPA. Oftentimes, stakeholders have expectations that are not articulated at the beginning of the analysis. This results in significant confusion in the goal-setting process and possible delay in the design and implementation of OPA.

Changing analysis questions, like a moving target, can lead to a series of changes in OPA measures and designs and pose new requirements for data analysis. For example, an OPA of a healthcare educational program originally designed to examine an immediate outcome of psychosocial achievements could later be asked to address issues of participants’ educational achievement. As “educational achievements” are outcomes completely different from “psychological status,” this change of analysis questions requires the development of new measures and possibly new designs.

A design problem often occurs when such a change happens. Because the program (or the process) being analyzed is designed to influence one set of outcomes, its impact on other outcomes needs to be defined and theorized before any analysis can be performed to achieve an acceptable level of internal validity in the design. The impact of changing questions on the analysis design can be elaborated in two situations, in which the change poses two analysis goals that are either (1) related or (2) unrelated. It can be established that, if the goals are related, the program impacts both goals, although the forms of impact differ.

How to Stabilize the Question-Asking Process

There may be some truth to the argument that OPA is a process for stakeholders to learn about their organizations and to raise new questions as the analysis process evolves. Nevertheless, an OPA analyst needs to have a set of analysis questions fixed for a period of time long enough to complete an analysis process that consists of development of the design, data collection, and data analysis.

Although it is always a good idea to try to articulate the questions at the beginning of the analysis and achieve a consensus among stakeholders about them, this may be difficult, as stakeholders may be unclear about their expectations of the analysis or those expectations may change. Nevertheless, some strategies can help stabilize the question-asking process. The keys seem to be articulating the goals and purposes of the analysis as fully as possible before it starts and staying flexible in designs. The intention is to surface stakeholders’ different expectations before the analysis starts, so that a compromise among these expectations can be reached and a series of consistent analysis questions can be established. This strategy consists of a few steps, as follows.

First, there should be a thorough discussion of the purposes of the analysis. Is OPA for accountability or for management process improvement? A focus on accountability may suggest that OPA should include more questions on the outcomes that stakeholders prefer, while for an OPA focusing on managerial improvement, more questions on the managerial process, as well as the interrelationships among inputs, outputs, and outcomes, may be needed.

Is OPA for performance monitoring or for performance evaluation? Frequent observations are required for performance monitoring, which focuses on consistent performance improvement. However, frequent observations may not be strictly required for performance evaluation (auditing), which often emphasizes the thoroughness of analysis designs and measurements.

(Continued. See: Multiple Stakeholder Issues on Page 6)



PAGE 5

The purpose of SEA’s brown bags and workshops is to offer training by experts in key areas important to both current and potential SEA members. Our goal is to serve professionals who have:

(1) diverse roles, ranging from internal auditors overseeing programs to evaluators at small, private non-profits accountable to state agencies; and

(2) varying levels of professional experience, ranging from basic to more advanced applications of evaluation tools and techniques.

Upcoming SEA Events

Fall 2006 Essential Skills Training

SEA’s Essential Skills Training provides a comprehensive overview of program evaluation. The objectives are to: (1) introduce and increase knowledge of basic program evaluation concepts, procedures, and standards of professional practice and (2) reflect on the role of program evaluation in program planning and development.

The Essential Skills training is scheduled for sometime this fall at the Pepper Building (downtown and across from the Capitol) and will be taught by:

Dr. Fran Berry, Director of the Askew School of Public Administration at Florida State University;

Dr. Betty Serow of the Office of Planning, Evaluation and Data Analysis at the Florida Department of Health;

Dr. Linda Schrader, Professor, Program Evaluation, College of Education, FSU;

Dr. Mary Kay Falconer of the Ounce of Prevention Fund of Florida; and

Mr. Gary VanLandingham, Director, and Dr. Steve Harkreader, Methodologist, OPPAGA.

This three-day class is of value to:

(1) those who would like a refresher course on the main concepts and issues in program evaluation;

(2) newly appointed evaluation or program review professionals, grants managers, or professionals with evaluation responsibility; and

(3) those who manage evaluation projects within their organizations.

The classes highlight the fact that planning and doing evaluation are both technical and consultative endeavors that allow a variety of approaches for different skill levels and resources. They also demonstrate how valuable program evaluation can be across substantive areas and technical specializations.

The workshop program is provided on Page 7 of this newsletter.

Fall 2006 Workshop on Sampling and Statistical Power Analysis

This training event will be targeted at intermediate and advanced levels. The presenter for this workshop will be Dr. Dan McGee, Chair of the Department of Statistics at Florida State University. A recent survey of the SEA membership indicated substantial interest in this workshop, as well as in others that will focus on other statistical techniques. (For a taste of the topic, a small illustrative calculation follows the registration details below.) The workshop will take place on:

Friday, September 29th, 9:00 a.m. - 12:00 p.m.
Room #302, Pepper Building, 111 West Madison Street

Workshop fees are $15 for SEA members, $30 for non-members, and free for graduate students.

To register, please send an e-mail to [email protected].
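As a flavor of the workshop topic, here is a minimal sketch of one common power calculation, written in Python with the statsmodels library. It is not drawn from Dr. McGee’s materials, and the effect size, alpha, and power values are illustrative assumptions.

```python
# A minimal power-analysis sketch (illustrative values, not workshop material):
# solve for the per-group sample size of a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Detect a medium effect (Cohen's d = 0.5) with a two-sided test,
# alpha = 0.05, and 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                   alternative='two-sided')
print(round(n_per_group))  # roughly 64 participants per group
```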

2006 PLANS FOR WORKSHOPS AND BROWN BAGS

Fall 2006 Social

Join us for a social event in Tallahassee to facilitate networking among SEA members and encourage new members to get involved in SEA. Historically, we have held our social just before the Christmas holidays. We thought October might be better timing for everyone, including graduate students, who are joining SEA in increasing numbers.

Please mark your calendars and join us at Chez Pierre on Tuesday, October 3, 5:00 - 7:30 p.m.

Fall 2006 Roundtable on Building the Capacity of Non-Profits

The Ounce of Prevention Fund of Florida, Inc., is helping SEA organize a roundtable on performance measurement and the infrastructure required to do performance measurement and use it for internal decision-making. Panelists will include evaluators from the Ounce of Prevention Fund and other organizations that work closely with non-profits on evaluation.

This workshop will support the mission of SEA’s newly initiated Not-for-Profit Support Committee, which is designed to improve the in-house evaluation capabilities of not-for-profit organizations and to provide the knowledge and tools necessary to procure quality evaluation services. (See http://www.southeastevaluation.com/programnp.php)

SEA Program Committee Co-Chairs, Christine Johnson and Mary Kay Falconer


If you have any recommendations for future workshops, including topics and presenters, please don’t hesitate to contact Program Committee Co-Chairs Christine Johnson ([email protected]) or Mary Kay Falconer ([email protected]).

We look forward to seeing you at upcoming events this year!


PAGE 6

Workshop on Qualitative Data Analysis

On May 23, Dr. Linda Schrader, an SEA board member and faculty at Florida State University’s College of Education, and Dr. Dan Kaczynski, faculty at the University of West Florida’s College of Professional Studies, conducted a half-day workshop on qualitative data analysis, Evaluating through a Qualitative Lens: Data Analysis and Reporting. The workshop filled to capacity (40 participants) within only a few days. We were especially pleased with the response from FSU graduate students in the colleges of education, information, public administration and social work, who comprised almost half of the attendees.

With an emphasis on group discussion and small group activities, the workshop presenters reviewed the nature of qualitative inquiry, its role in the evaluation process, collection and coding of qualitative data, and validation and reporting of findings. Also included was a brief description of computer-assisted qualitative data analysis software (CAQDAS), including the recently released NVivo 7. Because of the high level of interest in this area, we may include similar workshops as part of the annual conference program.

August 2006 Workshop on New State Contracting Legislation (SB 2518)

In follow-up to the Fall 2005 workshop on privatization and state contracting, SEA sponsored a second workshop, Florida’s New State Contracting Legislation: What It Says, What It Means, at the R. A. Gray Building auditorium on Tuesday, August 22.

Fred Springer, Interim Executive Director of the new Office of Efficient Government at the Florida Department of Management Services, reviewed major elements of the new legislation, how they relate to existing processes, and what to expect over the next six months.

Walter Sachs, Staff Director for Contracts at the Florida Department of Children and Families, talked about how to develop a business case from a state agency perspective. There was no registration fee.

PowerPoint presentations and handouts have been posted on the SEA website: http://www.southeastevaluation.com/program.php. A copy of the new legislation is available at: http://www.flsenate.gov/Welcome/index.cfm?CFID=59174261&CFTOKEN=26497783.

On the left side of the webpage, jump to SB 2518.

Multiple Stakeholder Issues (Continued from Page 4)

Is OPA question-driven? Is data mining possible? In my experience, stakeholders often prefer a question-driven process in which they keep raising questions as the analysis progresses and their understanding of the analysis process evolves. OPA can be a question-driven process if and only if plentiful measures and a large amount of data are available. An abundance of data and measures makes possible a data-mining process that allows analysts to answer analysis questions raised by stakeholders on an ongoing basis.

Is OPA assumption-driven or data-driven? OPA can be assumption-based when the data collected are estimated from multiple assumptions. In an analysis, particular data points may not be available, so estimation is needed to complete the database. The accuracy of an estimate depends on the appropriateness of the assumptions made. Stakeholders should know that the results of the analysis will change as the assumptions change. (A brief numeric illustration follows the third strategy below.)

Second, demonstrate possible answers to probable OPA questions to discover stakeholders’ true interest in the questions and answers. Stakeholders have a better idea about their expectations for the OPA when a possible answer is given to an analysis question. The answer, followed by the question “Is this what you want?”, is often a powerful tool for finding stakeholders’ real intents.

Third, present clearly the cost of a new set of questions. The financial cost of the analysis, as well as its possible political implications, should be made clear to stakeholders on a regular basis. Once they clearly understand the associated financial and political costs, stakeholders may have second thoughts about changing the direction of an analysis.
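To make the assumption-sensitivity point concrete, here is a small, purely hypothetical illustration; the numbers and the two imputation rules are invented for this sketch and do not come from the essay.

```python
# Hypothetical illustration: one missing quarterly value is estimated under
# two different assumptions, and the reported annual average shifts with it.
quarters = [72.0, 75.0, None, 81.0]   # Q3 value is missing

# Assumption A: the missing quarter equals the mean of the observed quarters.
observed = [q for q in quarters if q is not None]
estimate_a = sum(observed) / len(observed)

# Assumption B: the missing quarter carries forward the prior quarter's level.
estimate_b = quarters[1]

for label, estimate in (("mean of observed", estimate_a),
                        ("carry forward", estimate_b)):
    filled = [estimate if q is None else q for q in quarters]
    print(label, round(sum(filled) / len(filled), 2))
# The annual average differs under the two assumptions, so stakeholders should
# be told which assumption the reported figure rests on.
```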

Workshops Held to Date This Year:

Advanced Statistical Analysis Workshop with OPPAGA

We began the year by partnering with OPPAGA for a 3-hour workshop on statistical analysis of risk with categorical dependent variables. Relative risk and odds ratios were covered in chi-square analysis and binary logistic regression. Other techniques included in the workshop were Kaplan-Meier, to measure the probability of an event not occurring at a unit of time, and Cox Proportional Hazard Modeling. (A rough illustration of the basic risk measures follows the next paragraph.)

The instructors were Dr. Steve Harkreader and Jason Gaitanis, OPPAGA staff. The instructors used SPSS statistical software to illustrate the techniques with data and output. Historically, staff with OPPAGA have been a valuable source of expertise for SEA, and we appreciate their invitation for SEA members to participate.
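As a rough flavor of the basic risk measures covered (this sketch is not the instructors’ SPSS material, and the 2x2 counts are invented), relative risk and the odds ratio can be computed directly from a 2x2 table, with a chi-square test from SciPy:

```python
# Invented 2x2 table (not workshop data): rows = exposed / unexposed,
# columns = event occurred / event did not occur.
from scipy.stats import chi2_contingency

a, b = 30, 70    # exposed group: 30 events out of 100
c, d = 15, 85    # unexposed group: 15 events out of 100

risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
relative_risk = risk_exposed / risk_unexposed   # 0.30 / 0.15 = 2.0
odds_ratio = (a * d) / (b * c)                  # (30*85)/(70*15), about 2.43

chi2, p_value, dof, expected = chi2_contingency([[a, b], [c, d]])
print(relative_risk, round(odds_ratio, 2), round(p_value, 4))
```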


Directions to the Essential Skills Workshop


PAGE 7

2006 Essential Skills Training

WHEN: Fall 2006. Sign up for one, two, or all three days.

WHO SHOULD ATTEND: Experienced researchers who want a refresher; new evaluation or program professionals; and those who manage evaluation projects for their organizations.

CONTENT of the TRAINING PROGRAM:

DAY 1: Program Evaluation
Introduction to Program Evaluation
Types of Evaluation
Ethical Issues with Human Subjects
Planning an Evaluation

DAY 2: Monitoring, Process, and Outcome Evaluation

Introduction to Process Evaluation
Designing Process Evaluations
Introduction to Outcome Evaluation
Designing Outcome Evaluations
Relating Results to Program Costs

DAY 3: Data Collection Techniques and Using Evaluation Results

Data Collection Techniques - Interviews, Focus Groups, and Surveys
Communicating Evaluation Results
Evaluation Utilization

INSTRUCTORS: Top-notch in the field of program evaluation:
Dr. Fran Berry, Director, Askew School of Public Administration, FSU
Dr. Betty Serow, Office of Planning, Evaluation, and Data Analysis, Dept. of Health
Dr. Linda Schrader, Professor, Program Evaluation, College of Education, FSU
Dr. Mary Kay Falconer, Senior Evaluator, Ounce of Prevention Fund of Florida
Gary VanLandingham, Director, OPPAGA
Dr. Steve Harkreader, Methodologist, OPPAGA

LOCATION:

Conference room 302 of the Pepper Building at 111 West Madison Street in Tallahassee. Parking is available at Kleman Plaza, with an entrance on Duval Street (one-way north) near the Duval and Jefferson Street intersection. For a map of downtown Tallahassee, please go to http://www.oppaga.state.fl.us/location/downtown.html.

For more information and the registration form, please go to SEA’s website: http://www.southeastevaluation.com/skills.php

DON’T MISS IT!