
2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report

Technical Report

May 2006

U.S. Department of Education
Institute of Education Sciences
NCES 2006–179

Ruth Heuer
Brian Kuhr
Mansour Fahimi
T.R. Curtin
Marjorie Hinsdale
Lisa Carley-Baxter
Pat Green
RTI International

Linda Zimbler
Project Officer
National Center for Education Statistics


U.S. Department of Education
Margaret Spellings
Secretary

Institute of Education Sciences
Grover J. Whitehurst
Director

National Center for Education Statistics
Mark Schneider
Commissioner

The National Center for Education Statistics (NCES) is the primary federal entity for collecting, analyzing, and reporting data related to education in the United States and other nations. It fulfills a congressional mandate to collect, collate, analyze, and report full and complete statistics on the condition of education in the United States; conduct and publish reports and specialized analyses of the meaning and significance of such statistics; assist state and local education agencies in improving their statistical systems; and review and report on education activities in foreign countries.

NCES activities are designed to address high priority education data needs; provide consistent, reliable, complete, and accurate indicators of education status and trends; and report timely, useful, and high quality data to the U.S. Department of Education, the Congress, the states, other education policymakers, practitioners, data users, and the general public.

We strive to make our products available in a variety of formats and in language that is appropriate to a variety of audiences. You, as our customer, are the best judge of our success in communicating information effectively. If you have any comments or suggestions about this or any other NCES product or report, we would like to hear from you. Please direct your comments to:

National Center for Education Statistics
Institute of Education Sciences
U.S. Department of Education
1990 K Street NW
Washington, DC 20006–5651

May 2006

The NCES World Wide Web Home Page address is http://nces.ed.gov
The NCES World Wide Web Electronic Catalog is http://nces.ed.gov/pubsearch

Suggested Citation
Heuer, R., Kuhr, B., Fahimi, M., Curtin, T.R., Hinsdale, M., Carley-Baxter, L., and Green, P. (2006). 2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report (NCES 2006-179). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

For ordering information on this report, write:
U.S. Department of Education
ED Pubs
P.O. Box 1398
Jessup, MD 20794–1398

Or call toll free 1–877–4ED–Pubs

Content Contact: Aurora D’Amico (202) 502–7334 aurora.d’[email protected]


Executive Summary

The 2004 National Study of Postsecondary Faculty (NSOPF:04), conducted by RTI International (RTI) and sponsored by the U.S. Department of Education’s National Center for Education Statistics (NCES), is a nationally representative study that collects data regarding the characteristics, workload, and career paths of full- and part-time postsecondary faculty and instructional staff at public and private not-for-profit 2- and 4-year institutions in the United States. Conducted previously in 1988, 1993, and 1999, it serves a continuing need for data on faculty and instructional staff.

For the first time, NSOPF:04 is being conducted as a component study of the 2004 National Study of Faculty and Students (NSoFaS:04). The student component—the 2004 National Postsecondary Student Aid Study (NPSAS:04)—is a nationally representative study of students enrolled in all levels of postsecondary education. Historically, there has been considerable overlap in the institutions selected for participation in NSOPF and NPSAS; therefore, institution sampling and contacting activities for both studies were coordinated to help minimize response burden on institutions and to improve data collection efficiency.

This report describes the methodology and findings of NSOPF:04, which took place during the 2003–04 academic year. A field test, conducted in the 2002–03 academic year, was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study. The 2004 National Study of Postsecondary Faculty Field Test Methodology Report (Heuer et al. 2004) is available from NCES.

This methodology report covers NSOPF:04 only. NPSAS:04 procedures and results—provided in a separate report—are discussed here only as they affect or overlap with those outlined for NSOPF:04.

Target Population and Sample Design

The NSOPF:04 sample consists of postsecondary institutions and their full- and part-time faculty and instructional staff. The sampled institutions represent all public and private not-for-profit Title IV-participating, degree-granting institutions in the 50 states and the District of Columbia, as reported in the 2002 Integrated Postsecondary Education Data System (IPEDS) data files. Stratified, systematic samples of institutions and faculty were designed to allow detailed comparisons and high levels of precision. A customized cost/variance optimization program was implemented to efficiently secure targeted levels of precision for key estimates.

A two-stage sampling methodology was utilized. In the first stage, the institution sample was drawn based on a probability proportional to size (PPS) selection methodology, where each institution was assigned a composite measure of size (MOS) that reflected the number of eligible faculty and instructional staff in each of six strata. A sample of 1,080 postsecondary institutions was selected for participation; 1,070* of these were eligible. Each institution was asked to provide a list of all of the full- and part-time faculty and instructional staff that the institution employed during the fall 2003 term. Institutions were asked to include all employees with faculty status (both instructional and non-instructional) and all others with instructional responsibilities, regardless of faculty status. A total of 980 institutions provided a list suitable for sampling.

* Throughout this report, faculty and institution counts are rounded to the nearest 10 to protect the confidentiality of faculty and institutions. However, percentages cited are based on the original unrounded numbers.

In the second stage of sampling, full- and part-time faculty and instructional staff employed by participating institutions as of November 1, 2003 were selected. Sampling was conducted on a flow basis, as lists were received, checked for accuracy, and processed. A total of 35,630 faculty were sampled from participating institutions. Of these, 34,330 were eligible.
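The two-stage design described above can be illustrated with a short sketch. The code below is a generic illustration only, not the NSOPF:04 sampling program: stage 1 draws institutions systematically with probability proportional to a composite measure of size (MOS), and stage 2 draws an equal-probability systematic sample of faculty from a participating institution's list. All frame sizes, MOS values, field names, and within-institution sample sizes are hypothetical; only the 1,080-institution target comes from the text.

```python
# Illustrative two-stage selection sketch (not the actual NSOPF:04 program).
import random

def pps_systematic(units, mos_key, n):
    """Systematic selection of n units with probability proportional to size (MOS)."""
    total = sum(u[mos_key] for u in units)
    interval = total / n
    start = random.uniform(0, interval)
    picks = [start + i * interval for i in range(n)]   # equally spaced selection points
    selected, cumulative = [], 0.0
    pick_iter = iter(picks)
    next_pick = next(pick_iter)
    for u in units:
        cumulative += u[mos_key]
        while next_pick is not None and next_pick < cumulative:
            selected.append(u)                         # unit's MOS interval contains the pick
            next_pick = next(pick_iter, None)
        if next_pick is None:
            break
    return selected

def systematic_sample(frame, n):
    """Equal-probability systematic sample of n records from a faculty list."""
    interval = len(frame) / n
    start = random.uniform(0, interval)
    return [frame[min(int(start + i * interval), len(frame) - 1)] for i in range(n)]

# Hypothetical institution frame; the composite MOS would combine expected
# faculty counts across the six strata described above.
institutions = [{"inst_id": i, "mos": random.randint(50, 2000)} for i in range(3000)]
stage1 = pps_systematic(institutions, "mos", n=1080)

# Hypothetical faculty list from one participating institution, sampled on a
# flow basis as lists arrive.
faculty_list = [{"inst_id": stage1[0]["inst_id"], "person_id": p} for p in range(800)]
stage2 = systematic_sample(faculty_list, n=40)
print(len(stage1), len(stage2))
```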

Instrumentation

The NSOPF:04 institution questionnaire was designed to be self-administered via the Internet; the NSoFaS:04 website for institutional participation provided secure access to the questionnaire and information about each component of the study. To expedite completion, it could also be administered as a computer-assisted telephone interview (CATI), if necessary. The instrument was divided into major sections that collected information on the number of faculty and instructional staff employed at the target institution, the policies and practices that affected full-time faculty and instructional staff, the policies and practices that affected part-time faculty and instructional staff, and the percentage of undergraduate instruction assigned to various instructional personnel.

The NSOPF:04 faculty instrument was also designed as a web-based instrument for self-administration via the Internet and by CATI for nonresponse follow-up. The faculty website, like the institution website, provided secure access to the self-administered questionnaire as well as additional information about the study.

Both instruments were designed to accommodate the mixed-mode data collection approach and to ensure the collection of high-quality data. Design considerations included appropriate question wording for both self-administered and telephone interviews, and checks for out-of-range or inconsistent values. The faculty instrument consisted of the following eight sections grouped by topic:

• employment during the fall 2003 term (including academic rank, tenure status, and field of teaching);

• academic and professional background (including highest degree earned and employment history);

• institutional responsibilities and workload (including instructional activities and other work responsibilities performed in a typical week);

• scholarly activities (including productivity, funding of scholarly activities, and field of research);

• job satisfaction and retirement plans;

• monetary compensation (including income from the institution and other sources, structure of the employment contract, and household income);

• sociodemographic information (including gender, race, date of birth, marital status, number of dependent children, and citizenship); and

• opinions about working conditions at the institution.


Institution Contacting

Sampled institutions were contacted by mail, e-mail, and telephone beginning in spring 2003 to allow institutions sufficient time to plan for the study and to resolve any potential roadblocks to participation. Institution contacts were designed to verify institutional eligibility, secure timely participation in each survey component, and identify a staff person at each institution—called the Institution Coordinator—to respond to all NSoFaS:04 data requests. The Institution Coordinator was mailed an introductory letter and accompanying information packet, and then contacted by telephone to confirm the institution’s intent and ability to participate within schedule constraints. At this time, each coordinator was asked to complete a Coordinator Response Form that confirmed the data items requested for each component of NSoFaS:04 and the projected deadlines for completion of the study. Upon request, project staff prepared additional information packets for Institutional Review Boards (IRBs) and other deliberative bodies within institutions to secure the institution’s participation.

Beginning in fall 2003, each Institution Coordinator was mailed a binder containing complete specifications for participation. Institution Coordinators were asked to provide electronic lists of all eligible faculty and instructional staff on November 1, 2003, and to complete the institution questionnaire by December 6, 2003. Follow-up activities continued with the Institution Coordinator until all requested data was supplied.

Of the 1,070 eligible institutions, 980 (91 percent unweighted and weighted) provided faculty lists, and 920 (86 percent unweighted; 84 percent weighted) completed the institution questionnaire.

Help Desk and Interviewer Training

Training programs were developed for help desk operators who would respond to questions of sample members attempting to complete the web-based survey and for telephone interviewers who would conduct the nonresponse follow-up. Help desk operators received specific training in “frequently asked questions” regarding the instrument and technical issues related to completion of the self-administered questionnaire via the Internet. In addition, help desk operators received the same training as telephone interviewers because they were expected to complete the instrument over the telephone if requested by a caller. The telephone interviewer training focused on techniques for successfully locating and interviewing sample members, and covered such topics as administrative procedures required for case management, quality control of interactions with sample members and other contacts, and the organization and operation of the web-based faculty instrument to be used in data collection.

Faculty Locating and Survey Completion

NSOPF:04 data collection procedures were designed to locate sample members, encourage prompt completion of the self-administered questionnaire via the Internet, and conduct telephone interviews with nonrespondents.

Upon receipt of faculty lists, contact information for the sampled faculty and instructional staff was reviewed and assessed for completeness. Incomplete information was supplemented by searches of the institution’s website for telephone and address information. Intensive tracing was performed when all telephone numbers for a respondent were exhausted.


Faculty data collection utilized a mixed-mode approach; sample members could participate either by web-based self-administered questionnaire or by an interviewer-administered telephone interview. Participation was initially requested in a letter that provided instructions both for completing the web questionnaire and for completing the interview via CATI. Periodic reminder letters and e-mail messages were sent to nonrespondents to encourage their participation.

After 4 weeks, interviewers began calling the sample members directly to attempt a CATI interview. An early-response incentive was provided to encourage prompt completion of the instrument. Incentives were also offered to sample members who refused or were unresponsive.

Of the 34,330 eligible sample members, 26,110 (76 percent, unweighted and weighted) completed the faculty questionnaire during a field period from January to October of 2004. Seventy-six percent of respondents completed the self-administered web questionnaire, and 24 percent were interviewed by telephone. The average time to complete the survey was 30 minutes.
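As a quick arithmetic check on the figures above, the response rate and mode split can be reproduced from the reported counts. The snippet below is purely illustrative; because counts are rounded to the nearest 10 and the published percentages are based on unrounded numbers, the results agree only approximately.

```python
# Recompute the reported faculty response rate and approximate mode split
# from the rounded counts given in the text.
eligible = 34_330
respondents = 26_110

response_rate = respondents / eligible          # about 0.76, i.e., 76 percent
web_share, cati_share = 0.76, 0.24              # reported shares of respondents

web_completes = round(respondents * web_share)  # approximate count of web completions
cati_completes = respondents - web_completes    # approximate count of CATI completions

print(f"response rate: {response_rate:.1%}")
print(f"web completes: {web_completes:,}  CATI completes: {cati_completes:,}")
```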

Evaluation of Operations and Data Quality

Evaluations of operations and procedures focused on the joint institution contacting endeavor, the timeline for data collection from institutions (faculty lists and institution questionnaires) and faculty (CATI and self-administered interviews), tracing and locating procedures, refusal conversion efforts, the effectiveness of incentives, and the length of the faculty interview.

Results of the data quality evaluations included the following:

• Eighty-two percent of faculty list counts were within 10 percent of the corresponding institution questionnaire counts. Discrepancies between list counts and IPEDS, which is based on a narrower definition of faculty, were larger but followed expected patterns, with list counts exceeding those from IPEDS.

• Item nonresponse was below 15 percent for 87 of the 90 items in the institution questionnaire and for 141 out of the 162 items in the faculty questionnaire.

• Of the 26,550 eligible sample members who started the interview, 570 (2 percent) broke off before completing the interview. Of these, 430 broke off before completing the workload section and were not considered to be partial completes. Of the 140 partial completes, 48 percent broke off in the scholarly activities section; 9 percent broke off in the job satisfaction section; 29 percent in the compensation section; 11 percent in the characteristics section; and 4 percent in the opinions section.

• A new assisted coding system, used to code field of teaching, highest degree field, and principal field of scholarly activity, coded 77 percent of verbatim strings; the remaining 23 percent required manual coding. (A generic sketch of this kind of assisted coding follows this list.)

• A recoding of 10 percent of teaching, research, and highest degree verbatim strings showed 71 percent were coded correctly, 13 percent incorrectly, and the remaining 15 percent were too vague to code. The coding performed by web respondents was more often accepted as correctly coded than that done by CATI interviewers.


• Of the approximately 25,760 postsecondary institutions coded in the faculty instrument, 1,130 (4 percent) were initially deemed uncodeable. Based on the institution information collected, however, 1,030 of these institutions were positively identified and recoded.
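The report does not describe the internals of the assisted coding system, so the following is only a generic illustration of the approach referenced in the bullets above: a verbatim text string is matched against a lookup of standard field labels (for example, Classification of Instructional Programs fields), and strings without a sufficiently close match are routed to manual coding. The labels, codes, and threshold here are hypothetical.

```python
# Generic illustration of assisted coding of verbatim strings (not the
# NSOPF:04 system): match free-text responses against a small, hypothetical
# set of standard field labels and flag weak matches for manual coding.
import difflib

FIELD_LABELS = {                       # hypothetical subset of field labels and codes
    "Biological sciences": "26",
    "Business administration": "52",
    "English language and literature": "23",
    "Mechanical engineering": "14",
    "Psychology": "42",
}

def assisted_code(verbatim, threshold=0.6):
    """Return (label, code) for the closest match, or None to route to manual coding."""
    lowered = {label.lower(): label for label in FIELD_LABELS}
    match = difflib.get_close_matches(verbatim.strip().lower(), lowered.keys(),
                                      n=1, cutoff=threshold)
    if not match:
        return None                    # too vague or unrecognized: manual coding
    label = lowered[match[0]]
    return label, FIELD_LABELS[label]

for text in ["mech. engineering", "psych", "history of underwater basket weaving"]:
    print(text, "->", assisted_code(text) or "manual coding")
```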

NSOPF:04 Data Files and Products

NSOPF:04 data can be accessed both through the NCES Data Analysis System (DAS) for public use and through electronically documented, restricted access data files (with associated Electronic Codebooks). The public-use DAS may be accessed on the NCES website at http://nces.ed.gov/das/.

Using DAS, researchers are able to

• create their own analysis tables;

• view the highlights of report findings, with figures and tables, for various postsecondary topics;

• see a comprehensive listing of analyses regarding postsecondary education and download the reports; and

• view and download DAS table parameter files (TPFs) used to generate report tables.

An ongoing series of descriptive statistical reports may be accessed online or ordered through NCES as they are released. Descriptive reports focus on topics of interest, such as undergraduate teaching, teaching with technology, distance education instruction, gender and racial/ethnic composition of the faculty population, tenure status, work activities and compensation, and characteristics of part-time faculty. Publications available for public use may be downloaded or ordered at http://nces.ed.gov/pubsearch/getpubcats.asp?sid=011.


Foreword

This report describes the methods and procedures used for the data collection effort of the 2004 National Study of Postsecondary Faculty (NSOPF:04). NSOPF:04 serves a continuing need for data on faculty and instructional staff, all of whom directly affect the quality of education in postsecondary institutions.

We hope that the information provided here will be useful to a wide range of interested readers and that the results reported in the forthcoming descriptive summary report will encourage others to use the NSOPF:04 data. We welcome recommendations for improving the format, content, and approach, so that future methodology reports will be more informative and useful.

C. Dennis Carroll
Associate Commissioner
Postsecondary Studies Division


Acknowledgments

The authors gratefully acknowledge the assistance of staff members of the National Center for Education Statistics (NCES) and the Institute of Education Sciences (IES) for their advice, guidance, and review in the design and conduct of the study, and in the preparation of this document. We are especially grateful to Linda Zimbler, the NSOPF:04 Project Officer; James Griffith, Program Director, Postsecondary Longitudinal and Sample Survey Studies (PLSSS); Tracy Hunt-White, Statistician (PLSSS); and C. Dennis Carroll, Associate Commissioner, Postsecondary Studies Division (PSD), for their constructive input and review.

Particular thanks are extended as well to the Technical Review Panel members, who provided considerable insight and guidance in the development of the design and instrumentation of this study. For their careful review of the document, we also extend particular thanks to Ralph Lee, NCES Office of the Deputy Commissioner (ODC); Professor James Palmer, Illinois State University; Michael Cohen, Bureau of Transportation Statistics; and Marilyn Seastrom, Chief Statistician and Program Director, NCES Statistical Standards Program (ODC).

We also extend our thanks to the project staff members of contractors RTI International, MPR Associates, Inc. and Pinkerton Computer Consultants, Inc. A number of staff from these organizations—including statisticians, analysts, survey managers, programmers, data collectors, and interviewers, too numerous to name here—worked long hours on this study. At RTI, we are especially indebted to Sallie Fiore, Sharon Powell, Lynne Hawley, and Craig R. Hollingsworth for their excellent and tireless efforts in preparing the drafts and final version of this report.

Most of all, we are indebted to the many Chief Administrators, Institution Coordinators, institution respondents, and faculty and instructional staff members who participated in the NSOPF:04 study. Their willingness to take the time to share information made this study a success.


Contents

Executive Summary .......... iii
Foreword .......... ix
Acknowledgments .......... xi
Chapter 1 Overview of NSOPF:04 .......... 1
  1.1 Background and Purpose of NSOPF:04 .......... 1
  1.2 Methodological Issues and Changes for NSOPF:04 .......... 2
    1.2.1 Combining NSOPF and NPSAS .......... 2
    1.2.2 Institution Sampling and List Collection .......... 2
    1.2.3 Faculty Sampling and Data Collection .......... 3
  1.3 NSOPF:04 Products .......... 4
Chapter 2 Design and Implementation of NSOPF:04 .......... 7
  2.1 Sampling Design .......... 7
    2.1.1 Institution Frame .......... 8
    2.1.2 Faculty Frame .......... 9
    2.1.3 Institution Sample Selection .......... 10
    2.1.4 Faculty Sample Selection .......... 11
  2.2 Instrumentation .......... 13
    2.2.1 Development of Instrumentation .......... 13
    2.2.2 Instrument Programming .......... 15
    2.2.3 Institution Questionnaire .......... 15
    2.2.4 Faculty Questionnaire .......... 18
  2.3 Institution Data Collection .......... 25
    2.3.1 Institution Website .......... 26
    2.3.2 Institution Contacting .......... 28
  2.4 Faculty Data Collection .......... 33
    2.4.1 Faculty Website .......... 33
    2.4.2 Locating and Interviewing Procedures .......... 34
  2.5 Data Collection Systems .......... 38
    2.5.1 Instrument Development and Documentation System .......... 38
    2.5.2 Integrated Management System .......... 39
Chapter 3 Data Collection Outcomes .......... 41
  3.1 Institution Data Collection Results .......... 41
    3.1.1 Institution Participation .......... 41
    3.1.2 Institution Survey Completion Timing .......... 44
  3.2 Faculty Data Collection Results .......... 45
    3.2.1 Response Rate .......... 45
    3.2.2 Locating and Survey Completion .......... 47
    3.2.3 E-mail Contacting Efforts .......... 52
    3.2.4 Refusal Conversion Efforts .......... 52
    3.2.5 Incentives .......... 53
  3.3 Burden and Effort .......... 54
    3.3.1 Faculty Survey Completion Timing .......... 54
    3.3.2 Help Desk .......... 59
    3.3.3 Interviewer Hours .......... 60
    3.3.4 Number of Calls .......... 60
  3.4 Conclusions .......... 62
Chapter 4 Evaluation of Data Quality .......... 63
  4.1 List Quality .......... 63
    4.1.1 List Types .......... 63
    4.1.2 List Data Quality .......... 65
  4.2 Item Nonresponse .......... 67
  4.3 Faculty Data Quality .......... 68
    4.3.1 Item Mode Effects .......... 68
    4.3.2 Breakoffs .......... 71
    4.3.3 Classification of Instructional Programs (CIP) Coding .......... 72
    4.3.4 IPEDS Coding .......... 73
    4.3.5 Monitoring .......... 74
    4.3.6 Interviewer Feedback .......... 74
    4.3.7 Instrument Feedback .......... 78
  4.4 Comparisons with NSOPF:99 .......... 78
Chapter 5 Data File Development and Imputation .......... 81
  5.1 Overview of the NSOPF:04 Data Files .......... 81
  5.2 Data Coding and Editing .......... 82
  5.3 Data Perturbation .......... 83
  5.4 Imputation Methodology .......... 83
    5.4.1 Imputation Methods .......... 84
    5.4.2 Imputation of Faculty Data .......... 85
    5.4.3 Imputation of Institution Data .......... 86
    5.4.4 Evaluation of Imputations .......... 86
  5.5 Derived Variables .......... 88
Chapter 6 Weighting and Variance Estimation .......... 91
  6.1 Institution Weights .......... 92
    6.1.1 Adjustment for Institution Multiplicity .......... 93
    6.1.2 Nonresponse Adjustment .......... 93
    6.1.3 Poststratification Adjustment .......... 94
  6.2 Faculty Weights .......... 94
    6.2.1 Selection Probability for Faculty .......... 94
    6.2.2 Adjustment for Faculty Multiplicity .......... 95
    6.2.3 Adjustment for Unknown Eligibility Status .......... 95
    6.2.4 Nonresponse Adjustments .......... 95
    6.2.5 Poststratification/Raking Adjustment .......... 95
  6.3 Variance Estimation .......... 97
  6.4 Design Effects and Standard Errors .......... 100
References .......... 101
Appendixes
  A. Sampling Details .......... A-1
  B. Technical Review Panel .......... B-1
  C. Facsimile Instruments .......... C-1
  D. Item Crosswalks .......... D-1
  E. Endorsements .......... E-1
  F. Contacting Materials .......... F-1
  G. Training Materials .......... G-1
  H. Quality of Lists .......... H-1
  I. Nonresponse Bias Analysis .......... I-1
  J. CIP Code Mapping .......... J-1
  K. Analysis Variables .......... K-1
  L. GEM Adjustment Procedure .......... L-1
  M. Design Effects .......... M-1


Table                                                                                Page

1. Schedule of major NSOPF:04 data collection activities: 2004 .......... 3
2. Institution frame for the NSOPF:04, by Carnegie code, institution control, and degree granted: 2004 .......... 9
3. Distribution of NSOPF:04 institution universe and sample, by institution control and degree granted: 2004 .......... 11
4. Distribution of NSOPF:04 faculty sample sizes and completed interviews by faculty stratum: 2004 .......... 13
5. Content and formatting changes to the NSOPF:99 institution questionnaire in preparation for the NSOPF:04 instrument: 2004 .......... 17
6. Overview of the NSOPF:04 questionnaire for faculty and instructional staff: 2004 .......... 19
7. Items removed from the NSOPF:04 faculty/instructional staff questionnaire following the NSOPF:04 field test and estimated time savings: 2004 .......... 20
8. Content and formatting changes to the NSOPF:99 faculty questionnaire in preparation for the NSOPF:04 instrument: 2004 .......... 22
9. Number and percentage of institutions providing faculty lists, by cycle of National Study of Postsecondary Faculty (NSOPF): 1988 to 2004 .......... 42
10. Number and percentage of institutions providing faculty lists, by type of institution: 2004 .......... 42
11. Number and percentage of institutions providing institution questionnaires, by type of institution: 2004 .......... 43
12. Number and percentage of institutions providing institution questionnaires, by mode of administration: 2004 .......... 43
13. Average and maximum completion time, in seconds, for forms in the institution questionnaire: 2004 .......... 45
14. Number of sampled, eligible, and responding faculty and response rates, by institution type: 2004 .......... 46
15. Faculty response rates, by date lead letter package was mailed: 2004 .......... 47
16. Locate and interview rates, by intensive tracing efforts: 2004 .......... 48
17. Locate rates, by intensive tracing source: 2004 .......... 48
18. Faculty and instructional staff requiring intensive tracing procedures, by employment status and institution type: 2004 .......... 49
19. Faculty locating and survey completion results, by employment status and institution type: 2004 .......... 50
20. Response rates and mode of completion, by employment status and institution type: 2004 .......... 51
21. Faculty response rates, by incentive period: 2004 .......... 54
22. Average on-screen, transit, and total survey completion time, in minutes, for the faculty questionnaire, by mode: 2004 .......... 55
23. Average on-screen, transit, and total completion time, in minutes, by time of day and mode: 2004 .......... 56
24. Average and maximum completion time, in seconds, for forms in the faculty instrument: 2004 .......... 56
25. Response pattern, by help desk problem type: 2004 .......... 60
26. Total and average number of calls, by completion status and mode of completion: 2004 .......... 61
27. Average call attempts, by reached answering machine: 2004 .......... 61
28. Number of submitted faculty lists, by type of institution and transmittal mode: 2004 .......... 64
29. Faculty list types, by NSOPF cycle: 1999 and 2004 .......... 64
30. Type of discrepancies encountered, by type of institution: 2004 .......... 66
31. Percentage differences between sources of data across all cycles of NSOPF: 1988 to 2004 .......... 66
32. Mean differences between sources of data across all cycles of NSOPF: 1988 to 2004 .......... 67
33. Satisfaction with various aspects of job, by mode of administration: 2004 .......... 71
34. Faculty instrument forms where more than 15 sample members terminated the interview: 2004 .......... 72
35. Summary of coding results for fields of teaching, research, and highest degree, by mode of administration: 2004 .......... 73
36. Weighted estimates obtained based on the 1999 and 2004 survey data for a series of key analytical variables: 2004 .......... 79
37. Description of missing data codes: 2004 .......... 82
38. Prevalence of missing/imputed data for the faculty and institution surveys: 2004 .......... 84
39. Before and after imputation distributions of key faculty questionnaire variables: 2004 .......... 87
40. Before and after imputation distributions of key institution questionnaire variables: 2004 .......... 89
41. Counts of sampled, eligible, and participating institutions, by institution type: 2004 .......... 92
42. Faculty counts obtained from EAP:03, by institution type and employment status .......... 96
43. Faculty counts obtained from EAP:03 and NSOPF:04 sampling frame, by institution type, race/ethnicity, and gender .......... 96
44. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics: 2004 .......... 100

Figure                                                                               Page

1. The 2004 National Study of Faculty and Students institution website home page .......... 26
2. The 2004 National Study of Faculty and Students institution website status screen .......... 28
3. The NSOPF:04 faculty website home page .......... 34
4. NSOPF:04 faculty data collection overview .......... 35
5. Contacting and survey completion outcomes: 2004 .......... 46
6. Cumulative response rates, by mode of completion: 2004 .......... 52


Chapter 1 Overview of NSOPF:04

This document describes the study design, procedures, and outcomes for the 2004 National Study of Postsecondary Faculty (NSOPF:04), which was conducted for the National Center for Education Statistics (NCES) of the U.S. Department of Education, Washington, DC, as authorized by Title I, Section 153, of the Education Sciences Reform Act of 2002 [PL 107-279]. For the 2004 cycle, NSOPF:04 was conducted as a component study of the 2004 National Study of Faculty and Students (NSoFaS:04) under contract by RTI International,1 with the assistance of MPR Associates, Inc., and Pinkerton Computer Consultants, Inc. Results for the student component, the 2004 National Postsecondary Student Aid Study (NPSAS:04), are provided in a separate methodology report (Cominole et al.).

This introductory chapter provides an overview of NSOPF:04, including a description of the background and purpose of the study, the types of policy-relevant issues addressed, the changes to the study from previous cycles, the data and reports generated from the study, and the schedule of data collection activities.

1.1 Background and Purpose of NSOPF:04

NSOPF:04 was a comprehensive nationwide study of the characteristics, workload, and career paths of postsecondary faculty and instructional staff.2 The study was based on a nationally representative sample of all full- and part-time faculty and instructional staff at public and private not-for-profit 2- and 4-year degree-granting institutions in the United States. The NSOPF:04 full-scale sample consisted of 35,630 faculty and instructional staff selected from 980 sampled institutions in the 50 states and the District of Columbia.3

NSOPF:04 is the fourth cycle of the National Study of Postsecondary Faculty. Previous studies, conducted in 1988, 1993, and 1999 (called NSOPF:88, NSOPF:93, and NSOPF:99, respectively), provided national profiles of faculty and instructional staff in postsecondary institutions, national benchmarks for faculty productivity and workload, and information on institutional policies and practices that affect faculty. NSOPF:04 expanded this information in two ways: (1) it allowed for comparisons to be made over an extended period of time, and (2) it helped examine emerging issues concerning faculty, such as changes related to increased use of the Internet and distance education.

NSOPF:04 was designed to address a variety of policy-relevant issues concerning faculty, instructional staff, and postsecondary institutions. The study included faculty and institution questionnaires covering general policies concerning faculty. Information obtained from these two sources helped address important questions about postsecondary education, such as the following:

1 RTI International is a trade name of Research Triangle Institute.
2 References to “faculty” in this report include instructional staff and others (e.g., administrators) with faculty status (who may or may not have instructional duties).
3 Throughout this report, faculty and institution counts are rounded to the nearest 10 to protect the confidentiality of faculty and institutions. However, percentages cited are based on the original unrounded numbers.


• What are the background characteristics of full- and part-time faculty?

• What are their workloads and how is their time allocated between classroom instruction and other activities?

• What are the current teaching practices and uses of technology among postsecondary faculty and instructional staff?

• How satisfied are they with current working conditions and institutional policies?

• How are faculty and instructional staff compensated by their institutions? How important are other sources of income?

• What are the career and retirement plans of faculty and instructional staff?

• What retirement packages are available to faculty and instructional staff?

• Have institutions changed their policies on granting tenure to faculty members? Are changes anticipated in the future?

1.2 Methodological Issues and Changes for NSOPF:04

1.2.1 Combining NSOPF and NPSAS

NSOPF:04 was, in one respect, unlike any previous cycle of NSOPF, as it was conducted in tandem with another major study, NPSAS:04, under one overarching contract: NSoFaS:04. NCES recognized that, historically, there has been considerable overlap in the institutions selected for participation in NSOPF and NPSAS. By combining the two independent studies under one contract, NCES sought to minimize the response burden on institutions and to realize data collection efficiencies. The NSOPF:04 and NPSAS:04 studies retain their separate identities. The purpose of this report is to summarize the methodology of NSOPF:04; sampling and data collection procedures for NPSAS:04 are referred to only as they are combined with, or impact, the parallel procedures for NSOPF:04.

The combination of NSOPF:04 and NPSAS:04 into NSoFaS:04 had important implications for the NSOPF:04 institution sample design and institution contacting procedures. Institutions for the NSOPF:04 sample were selected as a subsample of the NPSAS:04 sample institutions.4 This combination resulted in a somewhat larger sample of institutions for the full-scale study than in previous NSOPF cycles (1,070 eligible institutions compared to 960 in 1999) and created a need to balance the design requirements of both studies in all institution-related study procedures.

4 The larger NPSAS sample includes about 400 schools not eligible for NSOPF, including less-than-two-year and proprietary schools, and schools located in Puerto Rico. It also includes about 140 institutions that were NSOPF-eligible but not included in the sample because the precision requirements for NSOPF could be met without their inclusion.

1.2.2 Institution Sampling and List Collection

Apart from the changes necessitated by combining NSOPF:04 and NPSAS:04, as noted above, the key change in sampling procedures for NSOPF:04 was its use of a customized cost/variance optimization technique. This procedure was designed to identify the allocation that would accommodate all analytical objectives of this survey while minimizing data collection costs. As with the institution-level sampling, a customized cost/variance optimization technique was used to determine the optimal allocation of faculty to the sampling strata.
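The report does not spell out the optimization program here (details appear in section 2.1 and appendix A), but the general idea of a cost/variance trade-off can be illustrated with the classical optimal (Neyman-type) allocation under a linear cost model: the sample allocated to a stratum grows with the stratum's size and variability and shrinks with its per-unit cost. The strata, unit standard deviations, and per-case costs below are hypothetical; only the overall faculty sample size comes from the text.

```python
# Illustrative cost/variance optimal allocation (Neyman allocation with
# per-unit costs), not the customized NSOPF:04 optimization program.
# n_h is proportional to N_h * S_h / sqrt(c_h): larger, more variable strata
# receive more sample; more expensive strata receive less.
from math import sqrt

# Hypothetical strata: population size N, unit standard deviation S, cost per case c.
strata = {
    "public 4-year":  {"N": 250_000, "S": 1.2, "c": 30.0},
    "private 4-year": {"N": 120_000, "S": 1.0, "c": 35.0},
    "public 2-year":  {"N": 200_000, "S": 0.9, "c": 25.0},
    "private 2-year": {"N": 30_000,  "S": 0.8, "c": 25.0},
}

def optimal_allocation(strata, total_n):
    """Allocate total_n units across strata to minimize variance for a fixed cost."""
    weights = {h: s["N"] * s["S"] / sqrt(s["c"]) for h, s in strata.items()}
    scale = total_n / sum(weights.values())
    return {h: round(w * scale) for h, w in weights.items()}  # rounding may shift totals slightly

print(optimal_allocation(strata, total_n=35_630))
```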

In previous cycles, delays in receiving faculty lists created critical delays in sampling and contacting respondents during the optimal time to reach them (i.e., prior to the close of the regular academic year). Because the perceived burden of NSoFaS:04 would likely be greater than that of the individual studies by themselves, an advance notification and early contacting strategy was developed for this cycle. The purpose of advance notification and early contacting was to provide sufficient time to resolve any roadblocks to participation, to allow the Institution Coordinator sufficient time to plan staffing and resources for the study, and to allow sufficient time for the completion of any review process the institution required, thereby facilitating completion of data collection prior to the deadline.

For faculty list collection, procedures were developed that would encourage institutions to provide lists of faculty and complete related documentation (including the institution questionnaire) online. On the NSoFaS:04 website, a secure tool for uploading lists was provided to eliminate the need for institutions to send data files through conventional mail.

The institution questionnaire was designed as a single integrated web/computer-assisted telephone interview (CATI) instrument; there was no hardcopy instrument, although a facsimile was provided to allow dissemination of questions to different departments.

Table 1 summarizes the data collection schedule for the full-scale study.

Table 1. Schedule of major NSOPF:04 data collection activities: 2004

Activity                                                                  Start date1           End date2
Select institution sample                                                 May 22, 2002          August 25, 2002
Institutional recruitment/early contacting of institution coordinators3   March 10, 2003        September 29, 2003
Obtain faculty lists4                                                     September 29, 2003    July 11, 2004
Implement institution questionnaire                                       September 29, 2003    October 22, 2004
Select faculty samples                                                    November 6, 2003      July 12, 2004
Send mail and e-mail to faculty                                           January 15, 2004      October 1, 2004
Implement faculty web questionnaire                                       January 15, 2004      October 6, 2004
Implement faculty CATI interviewing                                       February 12, 2004     October 5, 2004

1 This is the date on which the activity was initiated for the first applicable institution and/or its associated faculty.
2 This is the date on which the activity was completed for the last applicable institution and/or its associated faculty.
3 The Chief Administrator’s office at each institution was contacted to appoint an Institution Coordinator, who served as the primary point of contact to deal with specific survey-related questions, correspondence, and follow-up.
4 Faculty sampling rates were determined based upon frame counts using Integrated Postsecondary Education Data System (IPEDS) information, and samples were selected on a rolling basis as lists were received.
NOTE: CATI = computer-assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

1.2.3 Faculty Sampling and Data Collection

Precision goals for NSOPF:04 were to secure national-level survey estimates with precisions comparable to or better than those of NSOPF:99 for the overall faculty population. As with institution-level sampling, a customized cost/variance optimization technique was used to allocate the sample faculty to the institution and person strata while minimizing cost and variance. Further details about faculty sampling may be found in section 2.1; sample allocation to strata is fully detailed in appendix A.

The sample was substantially larger than in the previous cycle: 35,630 faculty were sampled for NSOPF:04, of whom 34,330 were eligible. The final eligible sample for NSOPF:99 was 19,210. Criteria for faculty eligibility are discussed in section 2.1.2.

Prior to sampling, faculty counts from all lists provided by participating institutions were checked against both the Integrated Postsecondary Education Data System (IPEDS) and the counts provided by the institution on their institution questionnaire. (In 1999, the IPEDS comparison was used as a quality control check only when institution questionnaire counts were absent). As in NSOPF:99, institutions were contacted to resolve any discrepancies between data sources.

As in past cycles, faculty data collection utilized a mixed-mode approach; however, for NSOPF:04, sample members could participate only by a web-based self-administered questionnaire or by an interviewer-administered telephone interview—there was no hardcopy version of the questionnaire. The participation of sample faculty members was initially requested in a letter that provided instructions both for completing the web questionnaire and for calling to complete the interview via CATI. After 4 weeks, interviewers contacted the sample faculty members who had not completed the questionnaire to attempt a telephone interview. An early-response incentive was provided to encourage prompt completion of the instrument. Refusal or nonresponse incentives were also offered to selected sample members. Incentives are discussed in section 3.2.5.

1.3 NSOPF:04 Products

Data from the full-scale study will be used by researchers and policymakers to examine a wide range of topics, including who faculty are, what they do, and whether and how they are changing over time. NSOPF:04 provides data on each of these topics. The NCES Data Analysis System (DAS) for public release has been constructed from the data and is available to the public at http://nces.ed.gov/das. Electronically documented, restricted access data files with associated Electronic Codebooks (ECBs) are also available to qualified researchers.

The following types of reports are products of NSOPF:04: (1) this methodology report, providing details of sample design and selection procedures, data collection procedures, weighting methodologies, estimation procedures and design effects, and the results of nonresponse analyses; and (2) a series of descriptive statistical reports on key topics of interest. These topics include undergraduate teaching, faculty work activities and compensation, gender and racial/ethnic composition, and characteristics of part-time faculty. NSOPF:04 publications can be accessed electronically through the NCES website at http://nces.ed.gov/pubsearch/getpubcats.asp?sid=011.

Special tabulations are available on a limited basis from the National Education Data Resource Center (NEDRC) upon request. Use of NEDRC services is most appropriate for well defined questions that are likely to yield a few tables. It is recommended that those requiring more extensive research and in-depth analysis apply for direct access to the restricted access data files. Questions regarding NEDRC services may be directed by e-mail to [email protected] or to Aurora D’Amico at aurora.d'[email protected] or (202) 502-7334.


The remainder of this report contains the details of various activities. Chapter 2 details the survey design and implementation. Data collection outcomes are reported in chapter 3. Chapter 4 presents evaluations of the quality of data collected from institutions and faculty. Chapter 5 details procedures for data file development and imputation. Chapter 6 reports on procedures for weighting and variance estimation.


Chapter 2 Design and Implementation of NSOPF:04

This chapter provides a detailed summary of the design and implementation of the 2004 National Study of Postsecondary Faculty (NSOPF:04) full-scale study. Sampling of institutions and of faculty and instructional staff is discussed in detail. In addition, instrument design and data collection procedures are described.

A Technical Review Panel (TRP) meeting was held on September 8–9, 2003. The panel, composed of nationally recognized experts in higher education, reviewed the impact of methodological changes in sampling and data collection, including combining NSOPF:04 with NPSAS:04, the elimination of paper instruments, shortening the data collection period, and revisions to the instruments. The list of panel members is provided in appendix B.

2.1 Sampling Design

NSOPF:04 employed a two-stage sampling methodology for selection of eligible faculty and instructional staff based on a cost/variance optimization process, details of which are provided in appendix A. In the first step, samples of eligible institutions were selected within the following 10 institutional strata:

• public doctoral;

• public master’s;

• public baccalaureate;

• public associate;

• public other/unknown;

• private not-for-profit doctoral;

• private not-for-profit master’s;

• private not-for-profit baccalaureate;

• private not-for-profit associate; and

• private not-for-profit other/unknown.

In the second step, samples of faculty members were selected within sampled institutions using a stratified systematic sampling where the six strata were defined in the following hierarchical order:

• Hispanic;

• non-Hispanic Black;

• Asian and Pacific Islander;

• full-time female;

• full-time male; and


• all other.

The institution frame comprised all 3,380 eligible postsecondary institutions, while the faculty frame included all faculty and instructional staff in the corresponding institutions, estimated at approximately 1.1 million individuals (Zimbler 2001).5

The composition and eligibility definitions for these frames are outlined below.

2.1.1 Institution Frame

The institution frame for NSOPF:04, like those for previous NSOPF cycles, consisted of all institutions meeting the following criteria:

• located in the 50 states or the District of Columbia;

• classified as participating in Title IV6 student aid programs;

• public or private not-for-profit;

• 2- or 4-year degree-granting;

• offers educational programs designed for students beyond high school;

• academically, occupationally, or vocationally oriented; and

• makes programs available to the public.

The resulting frame was a subset of that used for the National Postsecondary Student Aid Study (NPSAS:04): NSOPF:04 did not include the private for-profit, less-than-2-year, non-degree-granting, or Puerto Rican institutions that were included in NPSAS:04.

The institution frame for NSOPF:04 was constructed from the Winter 2001–02 Integrated Postsecondary Education Data System Data Collection (Winter:02 IPEDS) file. To allow precise survey estimates for sectors of interest to the education community, this set of institutions was stratified based on institution control and level of degree offered. Institution control distinguished between public and private not-for-profit institutions, while level of degree offered was based on the 2000 Carnegie classification system7 for segmentation of institutions. Table 2 summarizes the number of the eligible institutions for each of the resulting 10 primary institutional strata, based on the Winter:02 IPEDS file.

5 This was used as a preliminary estimate and was adjusted later.
6 Postsecondary institutions which have signed Title IV federal student aid program participation agreements with the U.S. Department of Education.
7 The Carnegie Classification is a taxonomy of colleges and universities in the United States according to such variables as degrees awarded, number of fields covered, and specialization.


Table 2. Institution frame for the NSOPF:04, by Carnegie code, institution control, and degree granted: 2004

Degree granting | Total | Carnegie code | Public | Private not-for-profit
Total | 3,380 | † | 1,700 | 1,680
Doctoral | 300 | 15, 16, and 52 | 190 | 110
Master’s | 590 | 21 and 22 | 270 | 320
Bachelor’s | 570 | 31, 32, and 33 | 90 | 480
Associate’s | 1,180 | 40 and 60 | 1,030 | 150
Other/unknown | 730 | 51, 53–59, and unknown | 110 | 620
† Not applicable.
NOTE: For sampling purposes, public baccalaureate, private associate, and other/unknown institutions are collapsed into a single stratum. Definitions of Carnegie codes are available at http://www.carnegiefoundation.org/classification. The institution universe counts include institutions that were added after the sample was selected to account for institutions that became eligible for NSOPF:04 after construction of the institution sampling frame from the Winter:02 IPEDS. Also, the 44 institutions that had an unknown Carnegie code at the time of sample selection have been reassigned to their appropriate strata. Therefore, there are no longer any institutions with unknown Carnegie codes in the sample, but some still remain in the universe. Numbers have been rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), Fall 2000.

2.1.2 Faculty Frame

The second-stage sampling frame for NSOPF:04 includes faculty and instructional staff in the eligible postsecondary institutions. This includes both instructional faculty and faculty with no instructional responsibilities (e.g., research or administrative faculty) as well as staff with instructional responsibilities regardless of faculty status. In summary, eligible individuals for the NSOPF:04 study included any faculty and instructional staff who

• were permanent, temporary, adjunct, visiting, acting, or postdoctoral appointees;

• were employed full- or part-time by the institution;

• taught credit or noncredit classes;

• were tenured, nontenured but on tenure track, or nontenured and not on tenure track;

• provided individual instruction, served on thesis or dissertation committees, advised, or otherwise interacted with first-professional, graduate, or undergraduate students;

• were in professional schools (e.g., medical, law, dentistry); or

• were on paid sabbatical leave.

Ineligible individuals for NSOPF:04 included staff who:

• were graduate or undergraduate teaching or research assistants;

• had instructional duties outside of the United States, unless on sabbatical leave;

• were on leave without pay;

• were not paid by the institution, e.g., those in the military or part of a religious order;

• were supplied by independent contractors; or

• otherwise volunteered their services.


2.1.3 Institution Sample Selection

The administration of NSOPF:04 consisted of a sample of 35,630 faculty and instructional staff across a sample of 1,080 institutions in the 50 states and the District of Columbia. This section provides details regarding the composition and construction of the institution sampling frame and methods used for selection of the institution sample.

Institution frame construction

The institution sample was selected using Chromy’s sequential probability minimum replacement (PMR) sampling algorithm (Chromy 1979) to select institutions with probabilities proportional to a composite measure of size, details of which are provided in appendix A. For this purpose, each institution was assigned a measure of size (MOS) based on the number of eligible faculty and instructional staff and students in the given institution. Specifically, the composite size measure was the sum of cross products of sampling rates and population sizes for the groups, operating as the expected combined sample size at an institution. This measure was designed to ensure that student and faculty in certain minority strata would have a higher chance of selection. For faculty, these minority strata included:

• Hispanic;

• non-Hispanic Black or African American;

• Asian and Pacific Islander;

• female, full-time employee;

• male, full-time employee; and

• all others.

It should be noted that the MOS for each institution was calculated to reflect the number of students in the given institutions, since for this administration the institution samples for NPSAS:04 and NSOPF:04 were selected jointly. That is, precision requirements for NSoFaS:04 were considered jointly by reflecting both the faculty and student design objectives. Faculty counts needed for MOS calculations were initially obtained from the Fall Staff Survey component of the Winter:02 IPEDS data collection. However, this source could not provide all information necessary to classify faculty members into one of the above sampling strata. For instance, in a number of institutions faculty counts were not reported, while for others reported counts were not indexed by race and ethnicity. As a result, the missing information was imputed in two steps. In the first step, unreported (missing) faculty counts were imputed, while in the second step, faculty reported as unknown race/ethnicity or nonresident aliens were distributed among the known race categories using a special procedure, details of which are provided in appendix A.
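To make the construction concrete, the following sketch (in Python) computes a composite MOS of this form. It is an illustration only: the stratum sampling rates are placeholders rather than the NSoFaS:04 production values, and the proportional redistribution of unknown race/ethnicity counts is shown as one common approach, not the exact procedure documented in appendix A.

```python
# Minimal sketch of a composite measure of size (MOS): the sum over sampling
# groups of (group sampling rate) x (group frame count). Rates below are
# placeholders, not the NSOPF:04 production values.

ILLUSTRATIVE_FACULTY_RATES = {
    "hispanic": 0.06,
    "black_non_hispanic": 0.05,
    "asian_pacific_islander": 0.05,
    "full_time_female": 0.03,
    "full_time_male": 0.02,
    "all_other": 0.02,
}

def redistribute_unknown(counts, unknown):
    """Distribute faculty reported with unknown race/ethnicity (or as nonresident
    aliens) proportionally across the known categories. Shown only as one
    common approach; the actual procedure is documented in appendix A."""
    total = sum(counts.values())
    if total == 0 or unknown == 0:
        return dict(counts)
    return {stratum: n + unknown * n / total for stratum, n in counts.items()}

def composite_mos(faculty_counts, student_component=0.0):
    """Expected combined sample size at an institution: faculty component plus
    the analogous student component, since the NPSAS:04 and NSOPF:04
    institution samples were drawn jointly."""
    faculty_component = sum(ILLUSTRATIVE_FACULTY_RATES[s] * n
                            for s, n in faculty_counts.items())
    return faculty_component + student_component

# Example: imputed faculty counts for one institution, with 65 faculty of
# unknown race/ethnicity spread proportionally before the MOS is computed.
counts = {"hispanic": 40, "black_non_hispanic": 30, "asian_pacific_islander": 25,
          "full_time_female": 300, "full_time_male": 400, "all_other": 500}
counts = redistribute_unknown(counts, unknown=65)
print(round(composite_mos(counts, student_component=12.5), 2))
```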

Institution sample selection

The institution sampling frame was constructed from the IPEDS-IC files and was partitioned into institutional strata based on institutional control, highest level of offering, and Carnegie classification.8 As mentioned earlier, the sample of institutions was selected with probability proportional to size (PPS) based on the number of faculty and students at each institution, using Chromy’s sampling algorithm. Sample sizes and their corresponding sampling rates were established using a customized cost/variance optimization procedure, which aimed to identify the allocation that would accommodate all analytical objectives of this survey while minimizing data collection costs.

8 More detailed information about the Carnegie classification can be found in appendix A.
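Chromy’s sequential PMR algorithm involves additional machinery (controlled expected hit counts, minimum replacement), but the central idea, that an institution’s chance of selection is proportional to its composite MOS, can be conveyed with ordinary PPS systematic sampling. The sketch below is therefore a simplified stand-in for, not an implementation of, the algorithm actually used.

```python
import random

def pps_systematic_sample(mos, n, seed=2004):
    """Select n units with probability proportional to size using ordinary PPS
    systematic sampling. This is a simplified stand-in for Chromy's sequential
    probability-minimum-replacement algorithm, which additionally controls
    expected hits for large (certainty) units."""
    total = sum(mos)
    interval = total / n                        # sampling interval on the MOS scale
    rng = random.Random(seed)
    start = rng.uniform(0, interval)            # random start
    targets = [start + k * interval for k in range(n)]

    selected, cumulative, t = [], 0.0, 0
    for i, size in enumerate(mos):
        cumulative += size
        while t < n and targets[t] <= cumulative:
            selected.append(i)                  # unit i is hit (possibly more than once)
            t += 1
    return selected

# Example: 4 institutions drawn from 10 with MOS-proportional probabilities.
sizes = [120.0, 45.0, 300.0, 15.0, 60.0, 220.0, 80.0, 35.0, 150.0, 25.0]
print(pps_systematic_sample(sizes, n=4))
```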

Table 3 summarizes the distribution of the resulting sample of institutions for NSOPF:04. Subsequent to selection of the sample, the resulting institutions were contacted and asked to provide lists of eligible faculty and instructional staff for their institutions.

Table 3. Distribution of NSOPF:04 institution universe and sample, by institution control and degree granted: 2004

Degree granting | Universe, total | Sample, total | Universe, public | Sample, public | Universe, private not-for-profit | Sample, private not-for-profit
Total | 3,380 | 1,080 | 1,700 | 680 | 1,680 | 400
Doctoral | 300 | 300 | 190 | 190 | 110 | 110
Master’s | 590 | 200 | 270 | 120 | 320 | 80
Bachelor’s | 570 | 160 | 90 | 30 | 480 | 130
Associate’s | 1,180 | 350 | 1,030 | 340 | 150 | 10
Other/unknown | 730 | 70 | 110 | 10 | 620 | 60
NOTE: The universe and sample counts include institutions that were added after the sample was selected to account for institutions that became eligible for NSOPF:04 since construction of the institution sampling frame from the Winter:02 IPEDS. Also, the 44 sample institutions that had an unknown Carnegie code at the time of sample selection were reassigned to their appropriate strata. Therefore, there are no longer any institutions with unknown Carnegie codes in the sample, but some still remain in the universe. Universe and sample counts are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

2.1.4 Faculty Sample Selection

This section provides an overview of the faculty sample selection procedures, including the methods used for frame construction and the technical details of the cost/variance optimization process used to determine the initial sample sizes and calculate the needed sampling rates.

Faculty frame construction

The sampling frames for selection of faculty and instructional staff were constructed institution-by-institution. Each sampled institution was asked to provide a complete listing of eligible full- and part-time faculty and instructional staff. The majority of lists were delivered electronically; however, some of these lists were abstracted from online sources such as institution directories or supplied on paper.

Faculty sample selection

The sample of faculty was selected using an equal probability stratified systematic sampling, within cells indexed by institutional and faculty strata. As detailed in the next section, a customized cost/variance optimization program was utilized.


Determining initial faculty sample sizes and sample allocation

A special cost/variance optimization program was used to determine the desired allocation of respondents to institution-by-person strata, the goal of which was to secure at least the same level of precision for key estimates as those achieved during the previous administration of the survey. This optimization process, which is detailed in appendix A, consisted of the following steps (a simplified sketch of the underlying allocation principle follows the list):

• establishing precision requirements for key estimates;

• constructing a cost model specific to the structure of the NSOPF:04 sample;

• developing a relative variance model; and

• determining the optimum sample allocation.
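As a point of reference for the steps above, the textbook form of this cost/variance tradeoff, minimizing total data collection cost subject to a variance ceiling for a stratified design, has a closed-form solution. The sketch below illustrates that principle with hypothetical stratum inputs; it is not the customized NSOPF:04 optimization, which is documented in appendix A.

```python
import math

def min_cost_allocation(W, S, c, variance_target):
    """Textbook minimum-cost allocation for stratified sampling: minimize
    sum(c_h * n_h) subject to sum(W_h^2 * S_h^2 / n_h) <= variance_target
    (finite population correction ignored). The optimum is
    n_h = (W_h * S_h / sqrt(c_h)) * sum_j(W_j * S_j * sqrt(c_j)) / variance_target.
    Illustration of the principle only, not the customized NSOPF:04 procedure."""
    lam = sum(Wj * Sj * math.sqrt(cj) for Wj, Sj, cj in zip(W, S, c)) / variance_target
    return [max(1, math.ceil(lam * Wh * Sh / math.sqrt(ch)))
            for Wh, Sh, ch in zip(W, S, c)]

# Hypothetical inputs for three faculty strata:
W = [0.10, 0.30, 0.60]    # stratum population shares
S = [0.9, 1.1, 1.0]       # unit standard deviations for a key estimate
c = [40.0, 25.0, 25.0]    # cost per completed interview (hypothetical)
print(min_cost_allocation(W, S, c, variance_target=0.0004))   # [186, 862, 1566]
```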

Faculty sample selection

Faculty members were sampled as faculty lists were received from participating institutions. Prior to selecting the faculty sample for a given institution, expected sample sizes for each faculty stratum were calculated using the institution-specific faculty list counts and sampling rates. These sampling rates were then modified, as necessary, for the reasons given below.

• Rates were increased across all faculty strata to ensure that at least ten faculty members were selected from each institution, if possible.

• Rates were increased within faculty strata to guarantee that at least one faculty member was selected per stratum within each institution, if possible.

• The sample yield was monitored throughout the months during which faculty lists were received, and the faculty sampling rates were adjusted periodically for institutions for which sample selection had not yet been performed to ensure that the desired faculty sample sizes were achieved.

Stratified systematic sampling was used to select faculty members from the faculty lists. Specifically, from each list (institution), sample faculty were selected within each faculty stratum defined by race/ethnicity, gender, and employment status using the corresponding rate for the given institution-faculty stratum, with academic field serving as an implicit sort variable (see the sketch below). Whenever a list contained insufficient data to identify faculty strata, a systematic sample of faculty was selected using the overall sampling rate for the institution. For hard copy lists, the resulting sample was then keyed to create an electronic file. Table 4 provides a summary of the required sample sizes, which were determined based on the cost/variance optimization process, and the resulting completed interviews by faculty stratum.
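A minimal sketch of this within-institution selection step follows. The rate-floor adjustments mirror the bulleted rules above, the record fields are illustrative assumptions, and the production sampling programs were more elaborate.

```python
import random

def apply_rate_floors(rates, counts, min_total=10):
    """Inflate stratum sampling rates, where possible, so that each nonempty
    stratum expects at least one selection and the institution's expected
    yield reaches min_total (mirroring the adjustments listed above)."""
    adjusted = {s: max(rates.get(s, 0.0), min(1.0, 1.0 / n))
                for s, n in counts.items() if n > 0}
    expected = sum(adjusted[s] * counts[s] for s in adjusted)
    if 0 < expected < min_total:
        factor = min_total / expected
        adjusted = {s: min(1.0, r * factor) for s, r in adjusted.items()}
    return adjusted

def systematic_sample(units, rate, seed=2004):
    """Equal-probability systematic selection at the given rate. Callers sort
    `units` by academic field first, so field acts as an implicit stratifier."""
    if rate <= 0 or not units:
        return []
    interval = 1.0 / rate
    start = random.Random(seed).uniform(0, interval)
    picks, k = [], 0
    while start + k * interval < len(units):
        picks.append(units[int(start + k * interval)])
        k += 1
    return picks

# Example: one faculty stratum within one institution.
stratum_list = sorted(
    [{"name": "A", "academic_field": "History"},
     {"name": "B", "academic_field": "Biology"},
     {"name": "C", "academic_field": "Mathematics"},
     {"name": "D", "academic_field": "English"}],
    key=lambda f: f["academic_field"])           # implicit sort by academic field
print(apply_rate_floors({"full_time_female": 0.03}, {"full_time_female": 120}))
print(systematic_sample(stratum_list, rate=0.5))  # roughly half of the sorted list
```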


Table 4. Distribution of NSOPF:04 faculty sample sizes and completed interviews by faculty stratum: 2004

Faculty stratum | Required sample size | Completed interviews
Total | 24,500 | 26,100
Non-Hispanic Black | 1,600 | 2,060
Hispanic | 1,300 | 1,700
Asian and Pacific Islander | 900 | 1,610
Other full-time female | 4,600 | 5,850
Other full-time male | 8,300 | 8,500
Other part-time | 7,800 | 6,380
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The information supplied for each sampled faculty member (e.g., name, academic field, residence) was checked against that of faculty previously selected from other institutions to identify and eliminate respondents sampled twice. Duplicates were eliminated from the sample of the current institution. Once the de-duplication process was complete, the institution’s final sample file was created and added to the master dataset, which contained all sampled faculty members and their relevant sampling information.
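The de-duplication step can be thought of as a key lookup against the master dataset, as in the sketch below; the matching fields shown are illustrative assumptions rather than the exact rules used.

```python
def deduplicate_against_master(new_sample, master,
                               key_fields=("name", "academic_field", "residence")):
    """Drop newly sampled faculty whose identifying key matches a record already
    in the master dataset (i.e., a person sampled earlier at another
    institution). The key fields here are illustrative assumptions."""
    def key(record):
        return tuple(str(record.get(f, "")).strip().lower() for f in key_fields)

    already_sampled = {key(r) for r in master}
    return [r for r in new_sample if key(r) not in already_sampled]

# Example: retained records are then appended to the master dataset.
master = [{"name": "J. Smith", "academic_field": "Chemistry", "residence": "NC"}]
new = [{"name": "J. Smith", "academic_field": "Chemistry", "residence": "NC"},
       {"name": "L. Chen", "academic_field": "History", "residence": "VA"}]
kept = deduplicate_against_master(new, master)
master.extend(kept)        # master now contains each sampled person once
```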

2.2 Instrumentation

This section describes the institution and faculty instruments that were developed for the NSOPF:04 full-scale study conducted during the 2003–04 academic year with a national sample of postsecondary institutions and faculty and instructional staff. Data collection for the study was by self-administered questionnaires on the Internet or computer-assisted telephone interviews (CATIs) with web nonrespondents. In contrast to the data collection approach for NSOPF:99, no paper-and-pencil questionnaire options were provided.9 Facsimiles of the electronic instruments, which provide item wording, response options, and information on respondent groups, are included in appendix C.

9 A “facsimile” of the institution questionnaire—what the electronic instrument might have looked like if it was rendered as a hard-copy document—was included with the binder materials distributed to Institution Coordinators. However, this 12-page document was marked “informational copy only” and was not used for data collection.

2.2.1 Development of Instrumentation

Project staff from RTI and MPR Associates were responsible, respectively, for developing and implementing study instrumentation for NSOPF:04 and for ensuring that the instruments, where possible, retained analytic comparability with earlier data collection rounds of the study. Revisions to the institution and faculty/instructional staff instruments built upon the NSOPF:99 instruments, and included the comments and suggestions of the Technical Review Panel (TRP), sample respondents contacted after the study for additional information, and other government officials and postsecondary researchers. (Copies of the NSOPF:99 data collection instruments for postsecondary institutions and faculty/instructional staff are included as appendixes A and B, respectively, in Abraham et al. 2002.) In May 2002, meetings with the TRP were conducted to review the relevance of policy issues examined in NSOPF:99, the importance of emerging issues (such as increased use of the Internet and distance education) not included in the 1999 instruments, and the consequences of adding, revising, or deleting items from the NSOPF:99 instruments.10

Following contract award for NSOPF:04, project staff developed and tested multiple versions of the institution and faculty/instructional staff instruments. A field test version of the instrumentation was developed at the start of the 2002–03 academic year and closely reviewed by members of the study TRP, government officials, postsecondary researchers, and other interested individuals. Then during the fall and spring terms of 2002–03, field test data collection for NSOPF:04 permitted the evaluation of the revised institution and faculty/staff instrumentation under conditions comparable to those to be employed during the NSOPF:04 full-scale study.11

Several policy, methodological, and practical concerns guided the development of instrumentation for NSOPF:04. To ensure the comparability of data elements from earlier rounds of the postsecondary faculty study in 1988, 1993, and 1999, one of the primary objectives of instrumentation was to maintain the trend analyses for this national, cross-sectional study. However, this goal was balanced by the importance of adequately considering emerging issues, while at the same time developing instruments that could be completed quickly and efficiently by sample members. For example, almost 70 percent of the institution responses for the 1999 study were obtained via paper-and-pencil questionnaire, and the average time to complete the institution questionnaire was 90 minutes. For the NSOPF:99 faculty questionnaire, over one-half (54 percent) of the respondents completed hardcopy instruments, with an average web and paper questionnaire completion time of 51 minutes; the average CATI completion time was 55 minutes.

Based on these considerations, the goals for the NSOPF:04 instrumentation included several elements:

• All data collection would be completed electronically, using web-based self-administered questionnaires, with telephone interviews for those who did not respond to the web self-administered questionnaires. No paper and pencil instruments would be received.

• All data collection instruments for the study would be shorter than the NSOPF:99 instruments, thus simultaneously increasing response rates while reducing the potential for bias and the need for costly refusal conversion efforts. The targets for average time to complete the instruments were set at 45 minutes for the institution questionnaire and 30 minutes for the faculty/instructional staff questionnaire.

• Consistent with the transition to all-electronic data collection, the NSOPF:04 instrumentation was designed to be easier for sample members to complete, to be easier for the study team to process, and to provide higher quality data.

• Finally, the instrumentation team sought to address emerging issues as well as to maintain comparability with earlier rounds of the study.

10 One important element in this process was a consideration of recent literature in the field; for example, Developing the 2004 Faculty Survey: Themes from the Literature on Postsecondary Education, developed by the American Institutes for Research (Berger et al. 2002).
11 Field test data collection for the institution questionnaire took place from September 2002 through June 2003; faculty/instructional staff field test data collection lasted from January 2003 through June 2003.


With these goals established, planning and design for the NSOPF:04 institution and faculty/instructional staff questionnaires began. Specification for both instruments was in RTI’s Instrument Development and Documentation System (IDADS), a tool developed specifically for the design of complex electronic data collection instruments (see also section 2.5.1). Using IDADS, instrument designers entered information about each instrument item, including the variable data definition, formatting, and the desired on-screen presentation.12 For each of the NSOPF:04 instruments, designers specified the variable names and labels, values and value labels, “applies to” fields, and variable definitions (e.g., numeric, continuous, maximum and minimum values, field size, etc.).

2.2.2 Instrument Programming

Despite the different data collection modes for NSOPF:04, the self-administered web instruments for the institution and faculty/instructional staff respondents were identical to their corresponding CATI instruments. Both instruments were web-based products, located on U.S. Department of Education servers. The instruments were developed using Microsoft Corporation’s Active Server Pages (ASP) web programming language.13 This approach resulted in a computer-assisted data collection program that facilitated the preloading of full-screen data entry and editing of “matrix-type” responses. The web and CATI system presented interviewers with screens of questions to be completed, with the software guiding the respondent through the interview. Inapplicable questions were skipped automatically based on prior response patterns. On-screen clarification was available for all items.14 The instrument also provided real-time error checking for inconsistent or out-of-range responses and minimized the potential for inadvertently skipped items.
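The instruments themselves were ASP applications, but the routing and editing behavior described above can be illustrated with a short sketch (written here in Python); the item identifiers, wording, and ranges are hypothetical simplifications, not items from the actual instruments.

```python
# Illustrative only: a minimal model of automatic skips and real-time range
# checks of the kind described above. Items and ranges are hypothetical.

QUESTIONS = [
    {"id": "Q1", "text": "Any instructional duties in Fall 2003? (1=Yes, 2=No)",
     "valid": {1, 2}},
    {"id": "Q2", "text": "Number of classes taught for credit (0-40)",
     "range": (0, 40),
     "ask_if": lambda answers: answers.get("Q1") == 1},   # skipped when Q1 = No
]

def run_interview(get_response):
    """Administer the items in order, applying skip logic and range checks."""
    answers = {}
    for q in QUESTIONS:
        if "ask_if" in q and not q["ask_if"](answers):
            continue                          # inapplicable item skipped automatically
        while True:
            value = get_response(q["text"])
            if "valid" in q and value in q["valid"]:
                break
            if "range" in q and q["range"][0] <= value <= q["range"][1]:
                break                         # out-of-range responses are rejected
            print("Please enter a valid response.")
        answers[q["id"]] = value
    return answers

# Example (console stand-in for a web form or CATI screen):
# answers = run_interview(lambda prompt: int(input(prompt + " ")))
```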

2.2.3 Institution Questionnaire

Instrumentation activities for the NSOPF:04 institution questionnaire began in May 2002 with revisions to the NSOPF:99 instrument. Project staff began working with a revised version of the NSOPF:99 instrument that incorporated the lessons learned from the NSOPF:99 data collection, including the comments and suggestions for instrumentation provided by both the NSOPF TRP and a small number of study respondents who were contacted for additional information after the completion of NSOPF:99 data collection.

This information formed the input for the NSOPF:04 field test institution questionnaire that was administered to a purposive sample of 150 postsecondary institutions during the 2002–03 academic year. The interpretation of responses from the field sample members that completed the instrument (77 percent of the sample of institutions that were eligible to participate), results of debriefing sessions with institution contact personnel for the field test who were responsible for encouraging response from the institutions, and data collection timing information for the field test also served to inform revisions to the full-scale study institution questionnaire.

12 In addition to instrument development, IDADS also provides a reference system for instrument reviewers and testers and serves as the data documentation system for the data products developed.
13 Active Server Pages (ASP) dynamically produce hypertext markup language (html) pages designed to facilitate information retrieval across the Internet. ASP code includes small embedded programs or scripts that are processed on a web server when accessed by users employing browser programs such as Netscape or Internet Explorer. Before responses are returned to a user, the request typically accesses databases and develops a customized response.
14 Each data collection screen or form for the NSOPF:04 field test faculty instrumentation included a link to a page of “help text” prepared specifically for the item and including key definitions, descriptions of respondents to whom the item applied, and other useful information. In an attempt to shorten the administration time for the full-scale instrument, the help text was shortened and appeared on the same form as the question wording and response options. This reduced the need for loading a separate web page for help. A separate help text web page was available for the institution questionnaire for both the field test and full-scale versions of the instrument.

After careful consideration of this input and examination of the data collected during the 1998–99 academic year—including the patterns of responses and missing data, as well as time to complete estimates—instrument revisions were implemented. Like the NSOPF:99 institution questionnaire, the NSOPF:04 instrument was divided into major sections that collected information on the number of faculty and instructional staff employed at the target institution; the policies and practices that affected, respectively, full-time and part-time faculty and instructional staff; and the percentage of undergraduate instruction assigned to various instructional personnel. Descriptions of the information included in these sections follow (see also the instrument facsimile in appendix C):

• The first section (items 1A and 1B) collected information on the number of faculty and instructional staff employed either full time or part time at the target postsecondary institution during the fall term of the target academic year (2003–04). For NSOPF:04, institution personnel were requested to provide these counts “as of November 1, 2003 (or during the fall term of the 2003–04 academic year when your faculty lists are considered complete).”

• Institution instrument items 2 through 13 defined the second section of the questionnaire and collected information on the employment of the target institution’s full-time faculty and instructional staff. After first collecting information on the numbers of these personnel who entered or exited full-time employment during the previous academic year (2002–03 school year), this section examined the characteristics and policies of the target institution’s tenure system, employee benefits, union representation (if any), and personnel evaluation, as applied to full-time faculty and instructional staff.

• The third section of the institution questionnaire (items 14 through 18) examined the employment of the target institution’s part-time faculty and instructional staff. This section used items similar to those for full-time faculty and instructional staff in the previous section. These items included the availability of retirement plans to part-time faculty, the availability of and institution-level support for various types of employee benefits, and the characteristics of the institution’s personnel evaluation system.

• The fourth instrument section included a single question (19) that collected information on the percentage of the target institution’s undergraduate instructional activities assigned to various instructional groups, including full-time faculty and instructional staff, part-time faculty and instructional staff, teaching assistants such as graduate students, and other individuals.

• The last section of the NSOPF:04 institution questionnaire (item 20) collected respondent contact information and feedback on data collection. This section attributed the item responses for the entire institution questionnaire to individual respondents at the institution, which allowed data collection staff to recontact respondents for clarification of responses. These data elements—respondent name, job title, telephone number, and e-mail address—were not maintained after data collection was completed.


Appendix D provides a crosswalk of NSOPF:04 institution questionnaire items to the institution questionnaires from NSOPF:88, NSOPF:93, and NSOPF:99. Table 5 notes how the NSOPF:04 questionnaire differs from the NSOPF:99 questionnaire. As noted in this table, nine items from the NSOPF:99 questionnaire were eliminated from the NSOPF:04 institution questionnaire, 14 items were revised, and three items for NSOPF:99 were repeated without change.

Table 5. Content and formatting changes to the NSOPF:99 institution questionnaire in preparation for the NSOPF:04 instrument: 2004

NSOPF:99 item | Content | Action | NSOPF:04 item | Changes
1 | Numbers full/part-time faculty and instructional staff | Revised | 1 | Slight wording and instruction changes
2 | Change in total number of full-time faculty and instruction staff over the past 5 years | Deleted
3 | Policies to decrease the number of full-time faculty and instructional staff | Deleted
4 | Availability of tenure system | Unchanged | 3
5 | Changes in full-time faculty and instructional staff between fall terms | Revised | 2 | One response option added (item 2f), slight wording change throughout, distinction among tenured, tenure track, and not tenure track eliminated
6 | Number of staff considered for/granted tenure | Revised | 4/5 | Asked as two questions with first as gate item
7 | Maximum number of years on tenure track | Unchanged | 6
8 | Changes in tenure policy in past 5 years | Revised | 7/8/7sp | Broken into three items; response options revised (Option E, discontinued tenure, asked only of respondents who answered “no” to tenure availability)
9 | Other actions to reduce tenured faculty | Deleted
10 | Number of full-time positions sought to hire | Unchanged | 9
11 | Retirement plans available to full-time staff | Deleted
12 | Employee benefits available to full-time faculty and instructional staff | Revised | 10A/10B | Broken into two items, part 10A serves as gate question; response categories for benefits changed to All, Some, None, Don’t know; fully and partially subsidized categories combined
13 | Additional employee benefits available to full-time faculty and instructional staff | Revised | 11 | Response categories for benefits changed to All, Some, None, and Don’t know; slight wording change
14 | Percentage of salary contributed by institution to benefits | Deleted
15 | Collective bargaining for full-time faculty and instructional staff | Revised | 12 | Percentage of faculty represented by union eliminated
16 | Teacher assessment with full-time faculty and instructional staff | Revised | 13 | Response options changed to Yes, No, Don’t Know; “Other, specify” option was eliminated
17 | Availability of retirement plans for part-time faculty and instructional staff | Revised | 14 | Item wording revised for web data collection
18 | Types of retirement plans available for part-time faculty and instructional staff | Deleted
19 | Criteria for eligibility for retirement plans for part-time faculty and instructional staff | Deleted
20 | Employee benefits available to part-time faculty and instructional staff | Revised | 15A/15B | Broken into two items, part 15A serves as gate question; response categories for benefits changed to All, Some, None, Don’t know; fully and partially subsidized categories combined
21 | Additional employee benefits available to part-time faculty and staff | Revised | 16 | Response categories for benefits changed to All, Some, None, and Don’t know; slight wording change
22 | Benefit eligibility criteria for part-time faculty and instructional staff | Deleted
23 | Percentage of salary contributed by institution to benefits | Deleted
24 | Collective bargaining for part-time faculty and instructional staff | Revised | 17 | Percentage of faculty represented by union eliminated
25 | Teacher assessment with part-time faculty and instructional staff | Revised | 18 | Response options changed to Yes, No, Don’t Know; “Other, specify” option was eliminated
26 | Undergraduate instruction by instructional staff type | Revised | 19 | Response options changed
NOTE: Numbers in table correspond with the question number in the instrument.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

2.2.4 Faculty Questionnaire

The NSOPF:04 questionnaire for faculty and instructional staff was divided into several sections that described the study and respondents’ rights (informed consent); nature of employment; academic and professional background; instructional responsibilities and workload; scholarly activities; job satisfaction; compensation; background characteristics; and opinions. Included within the final section, where applicable, were items that collected address information for sample members who were eligible for response incentives. (See section 3.2.5 for additional information about the early-response and refusal conversion incentives.) Table 6 describes the instrument sections, including the number of forms (or screens) and data elements in each. Like the instrumentation for the study waves in 1988, 1993, and 1999, the NSOPF:04 faculty and instructional staff questionnaire emphasized descriptive and behavioral attributes rather than attitudinal measures.

The design of the faculty and instructional staff questionnaire included input from members of the NSOPF:99 TRP and representatives of offices of the U.S. Department of Education, as well as an analysis of the data collected during the 1999 study. Because the NSOPF:99 instrument took 55 minutes to complete, designers made a concerted effort to shorten the instrument and make it more efficient.15 Several questions were eliminated, and other questions were shortened or otherwise simplified. The instrument was then evaluated in a field test carried out during the 2002–03 academic year under conditions similar to those employed during the full-scale study in 2003–04.

15 Efficiency for the NSOPF:04 instrument was gained by developing a shorter, tighter, and more focused interview that used state-of-the-art technology and design techniques. The sections and items were rearranged, coding procedures revised considerably to be interactive, skip patterns were employed, range checks were inserted, and other changes were implemented to make the instrument operate more efficiently.

Table 6. Overview of the NSOPF:04 questionnaire for faculty and instructional staff: 2004

Section | Forms/items¹ | Examples of content
Total | 81/183² |
Informed consent | 6/0 | Description of the NSOPF:04 study and respondents’ rights as participants.
A. Nature of employment | 17/18 | Does the respondent have instructional responsibilities during the Fall 2003 term? Does the respondent have faculty status? When did the person begin working? What are the respondent’s rank, tenure status, and teaching field?
B. Academic and professional background | 16/23 | What is the respondent’s highest degree? Where, when, and in what area was it earned? Is this the respondent’s first academic job? Where else did the person work? Does the respondent teach? How long has the person been teaching?
C. Instructional responsibilities and workload | 13/66 | How many hours during an average week does the sample member spend on instruction, research, and other activities? How many classes are taught, and what are their characteristics (e.g., duration, number/type of students, evaluation type)? What level of advising and individual instruction is offered?
D. Scholarly activities | 7/20 | What scholarly activities have sample members completed in their lifetime and during past 2 years? What is their principal scholarly field? Are scholarly activities funded?
E. Job satisfaction | 2/10 | How satisfied is the respondent with instructional duties and employment at the target school? What are the person’s retirement plans?
F. Compensation | 7/12 | What is the respondent’s compensation from the target institution and all other sources? What is the structure of the employment contract? What is the household income?
G. Sociodemographic characteristics | 8/13 | What is the respondent’s sex, date of birth, race/ethnicity, marital status, citizenship, and disability status? Does the person support dependents?
H. Opinions | 2/5 | What are the respondent’s opinions about the faculty reward system at the target institution? Would the sample member seek an academic career again?
I. Incentive information | 3/16 | Where applicable, these forms also collected address information from sample members qualified for nonresponse incentives.
1 The faculty/instructional staff questionnaire was divided into forms (screens) and items. Each form was structured to include related items. The first number is the number of forms in the section, and the second number is the number of items included on those forms. Forms and items were often skipped based on the responses to earlier forms and items.
2 The number of items in the faculty questionnaire (183) differs from the number of faculty items reported elsewhere in this document (e.g., 162 analysis variables and 144 stochastically imputed variables) because some items were for internal use only (e.g., verbatim text strings used to code field of teaching [see section 4.3.3], school name and city to code IPEDS [see section 4.3.4], and contact information for sending incentives).
NOTE: IPEDS = Integrated Postsecondary Education Data System.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Following the field test, additional items were modified and eliminated to reach the desired 30-minute interview. The average CATI and web interview for the NSOPF:04 field test took 42 minutes, considerably longer than anticipated. The results of the NSOPF:04 field test reliability reinterview (Heuer et al. 2004), the policy relevance of each instrument item, and the input received from responding sample members, telephone interviewers/help desk staff, and members of the NSOPF:04 TRP were used to identify 27 forms for elimination from the full-scale study. Table 7 describes the NSOPF:04 field test items on these forms that were eliminated following the field test. The table also provides the average time to complete each item during the field test.16 After adjusting this average time to complete by the proportion of the overall respondent population that reached the item (i.e., complicated and time-consuming items will have little impact on the average time to complete the entire interview if most respondents do not attempt the item), these item reductions were expected to reduce the average time to complete the full-scale instrument by approximately 7 minutes.
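The adjustment described above amounts to weighting each eliminated item’s average completion time by the share of respondents who reached it, as in the following sketch; the values shown are hypothetical rather than the field test figures underlying table 7.

```python
def expected_savings(dropped_items):
    """Expected reduction in overall completion time from dropping items: each
    item's average time is weighted by the proportion of respondents who
    actually reached it. The values below are hypothetical, not the field test
    figures behind table 7."""
    return sum(avg_minutes * proportion_reached
               for avg_minutes, proportion_reached in dropped_items)

dropped = [
    (0.9, 0.18),   # item reached only by part-time faculty
    (1.3, 1.00),   # item reached by essentially all respondents
    (0.5, 0.62),
]
print(round(expected_savings(dropped), 2))   # 1.77 minutes
```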

Table 7. Items removed from the NSOPF:04 faculty/instructional staff questionnaire following the NSOPF:04 field test and estimated time savings: 2004

Instrument change | Estimated time savings (in minutes)
Total | 7.483
Q7: Part-time faculty: years employed part-time | 0.163
Q17B: Holds Ph.D. in addition to professional degree | 0.005
Q17C: Year received doctoral degree | < 0.001
Q17C2VS: Doctoral field: verbatim | < 0.001
Q17C2CD: Online coding: doctoral field | < 0.001
Q17C3: Online coding: doctoral degree institution | 0.003
Q17D2: Online coding: bachelor’s degree institution | 0.642
Q19C: Number classes taught at other postsecondary institution | 0.065
Q20: Non-postsecondary education jobs related to teaching field | 0.102
Q22: Total number of postsecondary institutions employed as faculty | 0.222
Q25: First postsecondary faculty position—academic rank | 0.115
Q29: Previous job related to teaching field | 0.178
Q30: Years teaching in postsecondary institutions | 0.152
Q34A–Q34D: Percent allotment of other time | 1.273
Q40A–Q40G: Uses of website | 0.410
Q43A–Q43D: Plan/develop instruction/employment opportunities | 0.865
Q44A–Q44F: Training opportunities | 0.933
Q45: Hours professional training in 2003 | 0.452
Q52Aicat: Categorical items for Q52AA–AG nonrespondents | 0.105
Q58: Primary funding source | 0.115
Q59: Number of grants/contracts | 0.130
Q60A: Total funding grants/contracts | 0.058
Q60B: Range total funding grants/contracts | 0.012
Q63: Age expecting to stop working at postsecondary institution | 0.338
Q76A–Q76E: Type of disability | 0.015
Q78: Number of dependents | 0.235
Q84: Respondent comments and suggestions | 0.895
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

It should be noted that approximately 7 minutes of the overall time to complete the field test interview were associated with “transit” time—in other words, the time involved to transmit information to each respondent, to “write” the form and related text onto each sample member’s screen, to transmit the responses back for storage, and to begin the transmission of the next item. Interview transit times are dependent on many factors such as server bandwidth, processing efficiency, and instrument content (e.g., other things being equal, the transit time for a form with little text and graphic information will be less than another form with more text and graphics). Notably, transit times are often dependent on factors beyond the control of instrument designers. For example, the type of internet connection used by the sample member (telephone dial-up modem versus direct Ethernet connection with fiber optic lines) and the number of other users on the respondent’s internet service provider at the time will affect transit time. Section 3.3.1 describes the interview completion time and transit times for the NSOPF:04 full-scale faculty and instructional staff questionnaire.

16 The average time to complete each item is based on the time each respondent took to answer each item, as well as the time required to transmit the data collection image to the respondent and to transmit and write the information when the response was completed.

To reduce transmission time from that experienced in the NSOPF:04 field test, project staff carefully reviewed the code efficiency of the web applications. The project also utilized an outside and independent review of the study procedures and programs. TechSages LLP, a computer consulting firm located in Durham, NC, reviewed the NSOPF:04 field test computer code. The group offered several recommendations for optimizing the code to improve execution speed including changes to database connectivity implementation, code structure, and variable scoping.

In addition to the changes in data processing and the reduction in the number of items included in the questionnaire noted above, instrument designers also implemented several other content related changes for the full-scale study. These included the following:

• Instrument designers eliminated the faculty/instructional staff questionnaire’s online help and replaced it with more targeted information placed directly on the form containing the question. For the field test, a callable help screen was available for each form of the faculty interview. By selecting a help button at the bottom of each form, the respondent could review a screen of related definitions, examples, and other information about the item. While these help screens provided useful information, accessing them did require the transmission of an additional form and, consequently, an increase in the interview completion time. While adding text, such as definitions or examples, does increase the transit time of a screen, the increase is negligible relative to the increases in interview time that would be obtained by accessing and transmitting a second web page of help text for the item. Adding definitions and examples to the original form of the interview reduced the need for help screens.

• For the full-scale study’s questionnaire, project staff also developed an online assisted-coding routine for the respondent’s academic area or discipline. Assisted coding provided significant time savings over the online coding in the field test, which used two pull-down boxes for each academic discipline. The assisted-coding procedure developed for the full-scale study eliminated pull-down boxes for common disciplines (e.g., mathematics or English), considerably reducing the time each respondent took to code academic field. The pull-down boxes were available for unusual disciplines or when the sample member was not satisfied with the result of the assisted-coding activity. To use the assisted-coding routine, the sample member entered the name of the relevant academic field, and then confirmed or discarded the results of the matches with an assisted-coding dictionary developed from the Classification of Instructional Programs (U.S. Department of Education 2002). (A minimal sketch of this kind of dictionary lookup follows this list.)

• Instrument developers also improved item wording, especially by using screen fills to reduce item wording.


• Finally, the full-scale study instrument combined a number of instrument screens, thus reducing the number of overall forms and the number of data transmissions. (For example, forms Q65 and Q80 in the full-scale study instrument combined previously independent forms.)
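A minimal sketch of this kind of dictionary-assisted lookup follows; the handful of CIP-style entries and the simple substring match are illustrative assumptions, not the production dictionary or matching logic.

```python
# Minimal sketch of dictionary-assisted coding against a few Classification of
# Instructional Programs (CIP) style entries; the real NSOPF:04 dictionary and
# matching rules were more extensive.

CIP_SAMPLE = {
    "27.0101": "Mathematics, General",
    "23.0101": "English Language and Literature, General",
    "26.0101": "Biology/Biological Sciences, General",
    "45.1001": "Political Science and Government, General",
}

def suggest_codes(entry, dictionary=CIP_SAMPLE, max_hits=5):
    """Return candidate (code, title) pairs whose titles contain the
    respondent's normalized entry; the respondent then confirms a match or
    falls back to the full pull-down lists."""
    needle = entry.strip().lower()
    hits = [(code, title) for code, title in dictionary.items()
            if needle and needle in title.lower()]
    return hits[:max_hits]

print(suggest_codes("mathematics"))   # [('27.0101', 'Mathematics, General')]
```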

Table 8 compares and contrasts the faculty and instructional staff instruments used for the NSOPF:04 and NSOPF:99 full-scale studies. As noted in this table, 39 items were eliminated from the 1999 instrument, 51 items were simplified or otherwise revised, 1 item was added, and 3 items were unchanged.

Table 8. Content and formatting changes to the NSOPF:99 faculty questionnaire in preparation for the NSOPF:04 instrument: 2004

NSOPF:99 item | Content | Action | NSOPF:04 item | Changes
1 | Instructional duties | Revised | 1 | Wording change to highlight that teaching includes credit and noncredit courses; on-screen descriptions of instructional duties
2 | Credit status of instructional duties | Revised | 2 | Response options for instructional duties item changed to Yes/No for credits awarded for classes
3 | Principal activity | Revised | 4 | Public service option added; other specify for administration removed
4 | Faculty status | Revised | 3 | Faculty status "defined" as at target institution
5 | Full- and part-time status | Revised | 5 | Response category order changed
 | | New | 6 | For part-time faculty/instructional staff, is position your primary employment
6 | Reason working in part-time position | Revised | 8 | Wording for response option modified; reason for holding PT position eliminated
7 | Year began job | Revised | 9 | Stem wording revised
8 | Rank | Revised | 10 | Open-ended specify field eliminated; examples given for "other" response option
9 | Year achieved rank | Revised | 11 | Stem modified to specify at "any institution"; response population subset to professors or associate professors only
10 | Tenure status/date of tenure | Revised | 12/13 | Stem modified to specify tenure at "any institution"
11 | Duration of contract | Deleted
12 | Type of appointment | Deleted
13 | Chair of department | Deleted
14 | Principal field of teaching | Revised | 16 | Assisted coding of teaching field discipline using Classification of Instructional Programs added
15 | Principal field of research | Revised | 54 | Stem wording changed to field of "scholarly activities"; assisted coding of discipline using Classification of Instructional Programs added; response population subset to respondents without specified teaching field
16 | Degrees obtained (year received, field, and name, city, state of institution awarding) | Revised | 17A1, 17A1B, 17A2, 17A3, 17A4, 17D | Formatted for web data collection; stem wording changed. 17A1: highest degree only collected; 17A1B: if highest degree reported is professional degree, does respondent also have PhD; 17A2: year received highest degree; 17A3: assisted coding of discipline using Classification of Instructional Programs added; 17A4: name, city, and state of institution awarding highest degree collected, with respondent-assisted online coding of institution using Integrated Postsecondary Education Data System added; 17D: year bachelor's degree awarded (if highest degree above bachelor's)
17 | Working toward a degree | Deleted
18 | Degree working toward | Deleted
19 | Primary employment | Deleted
20 | Outside consulting | Deleted
21 | Other professional employment | Revised | 18 | Stem changed to include "all" positions outside of target institution
22 | Number of other professional jobs during fall term | Revised | 19A/19B | Formatted to include gate question; number of jobs expanded to include information on "full-time jobs" at other postsecondary institutions
23 | Total jobs held in postsecondary education | Deleted
24 | First and most recent jobs in higher education: years held, institution type, primary responsibility, employment status and title | Revised | 21/23/24/26 | NSOPF:04 simplified this question from 18 to 4 data elements. Item for 2004 asks if current job was first postsecondary education position, when position began, employment status, and tenure status of the position
25 | Years teaching in higher education | Deleted
26 | Number of positions ever held outside of higher education | Revised | 27 | Changed to positions ever held outside postsecondary education since highest degree
27 | Job status of those positions | Deleted
28 | First and most recent jobs outside of higher education: type of employer and primary responsibility | Revised | 28 | Item simplified from 10 to 1 data elements. Item now collects only the employment sector of most recent job; first professional position outside higher education eliminated
29 | Scholarly activities during career; scholarly activities during past 2 years | Revised | 52A/52B | Formatted for web instrument; joint/sole responsibility eliminated; stem wording and item strings revised
30 | Average time spent in activities per week | Revised | 31 | Item strings reworded and revised to include more examples; open-ended specify field eliminated
31 | Allocation of working time, preferred allocation of working time | Revised | 32 | Preferred allocation eliminated; item reformatted for web instrument; response categories combined, reworded, and simplified (e.g., asked only about time at target institution, focus changed to instructional activities, professional growth/administration/service combined)
32 | Committee assignments | Revised | 48 | Reformatted for web; stem wording revised to eliminate student level and number of committees chaired and served
33 | Number of classes taught | Revised | 35A | Reformatted for web; item expanded to include the number of "classes/sections" taught for credit and not for credit; wording revised to include "taught for credit toward degree"
34 | Number of different courses taught | Deleted
35/36 | Number of remedial classes taught; number of remedial classes not creditable towards degree | Revised | 35B | Item wording revised to include "remedial or developmental classes"; second item on distance education added
37 | Number of continuing education classes taught | Deleted
38 | Number of noncredit continuing education classes taught | Deleted
39 | Number of students in all noncredit classes | Deleted
40 | Number of classes taught for credit | Revised | | See question 35A above
41 | Details on up to five credit classes, including discipline; description (weeks class met, credit hours, hours class met/week, number teaching assistants, number students, class team taught, hours per week respondent taught, and remedial and/or distance education); level of students; instructional method; and instructional medium | Revised | 36/37 | Reformatted for web; gate item of teaching/lab assistants added; class description matrix simplified; information collected included number of weeks and hours per week respondent taught class, credits for the class, number of students, and level of students
42 | Undergraduate evaluation methods | Revised | 38 | Stem wording revised; response options added, deleted, and revised; response values reworded
43 | Websites | Revised | 39 | Stem wording changed to include all instructional duties; response population subset to persons with instructional duties
44 | How websites used | Deleted
45 | E-mail | Deleted
46 | Student percentage using e-mail | Deleted
47 | Hours spent responding to student e-mail | Revised | 41 | Stem wording revised to include "communicating with students"; response population subset to persons with instructional duties
48 | Internet access available | Deleted
49 | Individual instruction | Revised | 46/47/47B | Gate question added; stem wording changed; item reformatted for web
50 | Contact hours with advisees | Revised | 50 | Reformatted for web; stem wording revised
51 | Office hours | Revised | 51 | Stem wording expanded to include in-person and online office hours
52 | Engaged in research | Revised | 53 | Stem wording revised; reference period the entire academic year
53 | Type of primary research | Revised | 56 | Stem wording revised to include "principal scholarly activity"; reference period the academic year; open-ended specify field eliminated
54 | Engaged in funded research | Revised | 55 | Stem wording revised to include "scholarly activities at target school" and exclude funding from basic salary; reference period the academic year
55 | Principal/co-principal investigator on funded research | Deleted
56 | Number supported by grants | Deleted
57 | Sources of funding | Deleted
58 | Total number of grants | Deleted
59a | Total funds | Deleted
59b | How received funds were used | Deleted
60 | Evaluation of facilities and resources | Deleted
61 | Use of institutional funds | Deleted
62 | Number and type of administrative committees | Deleted
63 | Hours spent on administrative committee work | Revised | 49 | Reformatted for web; stem wording revised to include examples
64 | Union membership | Revised | 14/15 | Item reformatted for web; gate question added
65 | Satisfaction with instructional duties | Revised | 61 | Number of response options reduced; new options added
66 | Job satisfaction | Revised | 62 | Number of response options reduced
67 | Likelihood of leaving job | Deleted
68 | Age to stop working at postsecondary institution | Deleted
69 | Factors influencing possible decision to leave | Deleted
70 | Most important factor regarding decision | Deleted
71 | Option to draw on retirement | Deleted
72 | Retired previously | Unchanged | 64
73 | Early retirement option | Deleted
74 | Age planning to retire | Unchanged | 65
75a/76 | Basic salary for academic year/compensation for calendar year | Revised | 66/66B | Reformatted for web with follow-up screen for nonrespondents; stem wording revised to stress confidentiality; item wording revised to simplify response categories and provide examples
75b | Basis of basic salary | Revised | 67/68/69 | Reformatted for web into separate items; item wording revised to collect contract length and other pay arrangements; open-ended specify field eliminated
77 | Income of spouse/significant other | Deleted
78 | Number of persons in household | Deleted
79 | Household income | Revised | 70A/70B | Reformatted for web with follow-up screen for nonrespondents; stem wording revised to include respondent's salary reported earlier and onscreen definition of household income; follow-up screen for item nonrespondents added
80 | Number of dependents | Revised | 79 | Item changed to number of dependent children
81 | Gender | Unchanged | 71
82 | Month and year of birth | Revised | 72 | Birth month eliminated
83 | Ethnicity | Revised | 73 | Reformatted for web instrument
84 | Race | Revised | 74 | Response options reordered to match current federal standards for collecting racial information
85 | Disability | Revised | 75 | Stem wording revised to include additional on-screen definitions
86 | Type of disability | Deleted
87 | Marital status | Revised | 77 | Wording and order of response options modified
88 | Employment of spouse/significant other | Deleted
89 | Country of birth | Revised | 80 | Revised to ask if born in U.S.
90 | Citizenship status | Revised | 81 | Visa status and distinction between native/naturalized citizenship eliminated
91 | Parent and spouse education level | Deleted
92 | Opinions about target institution | Revised | 82/83 | Number of response options reduced; new options added
— | Open-ended comments | Deleted

NOTE: Numbers in table correspond with the question number in the instruments. Question numbers 7, 20, 22, 25, 29, 30, 33, 34, 40, 42, 43, 44, 45, 57, 58, 59, 60, 63, 76, and 78 in the NSOPF:04 faculty questionnaire were eliminated before data collection, and the instrument was not renumbered.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

2.3 Institution Data Collection

The goals of the institution data collection for the NSOPF:04 study were to collect a list

of full- and part-time faculty and instructional staff (referred to as a “faculty list”) from each sampled institution and to obtain a completed questionnaire from each sampled institution. As described in section 2.1.4, the faculty list was used for selecting the faculty sample and also


provided the contact information used for faculty data collection activities. The institution questionnaire, detailed in section 2.2.3, collected information on the policies and practices affecting full- and part-time faculty and instructional staff. To facilitate the process of obtaining faculty lists and completing the institution questionnaire, an institution website was developed, and for each sampled institution, the Chief Administrator (CA) was asked to appoint an Institution Coordinator (IC).

2.3.1 Institution Website

The NSoFaS:04 website served a number of functions for both the NSOPF:04 and

NPSAS:04 studies. For institutions, it was a central repository for all study documents and instructions. It allowed for the uploading of electronic lists of faculty and instructional staff. In addition, it housed the institution questionnaire for the Institution Coordinator to complete online. Figure 1 presents the home page of the NSoFaS:04 website.

Figure 1. The 2004 National Study of Faculty and Students institution website home page

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Faculty and Students (NSoFaS:04) website.

Visitors to the website were provided with the following links (see navigation bar on the left side of the screen):

• Early Contacting provided information about the early institution contacting process for NSoFaS:04 for the initial stage. Section 2.3.2 provides details of early institution contacting.


• About NSOPF (faculty) provided information on the study’s mandate and research objectives, with a link to National Center for Education Statistics (NCES) reports from previous study cycles.

• About NPSAS (student) provided comparable information (as noted above) for the student component of NSoFaS:04.

• Instructions provided links that allowed institution staff to view and print copies of various NSOPF:04 and NPSAS:04 forms (in pdf format).

• Endorsements listed the 25 national organizations that endorsed both studies. (The 24 NSOPF:04 endorsements are listed in appendix E; one endorsement was applicable only to proprietary schools that were eligible for NPSAS:04 but ineligible for NSOPF:04).

• Frequently Asked Questions (FAQs) included questions and answers concerning all stages of data collection for both components of NSoFaS:04.

• Help provided the help desk toll-free number and e-mail address for contacting project staff, along with instructions for logging in.

• Contact Us contained address information for RTI International.

• Other NCES Sites provided links to three NCES websites with more information about NCES programs and how to order publications.

All data entry applications were protected by Secure Sockets Layer (SSL) encryption. Further security was provided by an automatic “time out” feature, through which a user was automatically logged out of the NSOPF:04 institution questionnaire if the system was idle for 30 minutes or longer. The system did not use any persistent “cookies,”17 thus adhering to the Department of Education’s privacy policy.
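
As an illustration only, a 30-minute idle timeout enforced without persistent cookies might look like the sketch below. The framework (Flask) and all names here are assumptions; the report does not describe how the institution application was actually implemented.

    # Sketch only: enforces a 30-minute idle timeout using a session-scoped cookie.
    # Framework choice (Flask) and names are assumptions, not the study's implementation.
    import time
    from flask import Flask, session, redirect

    app = Flask(__name__)
    app.secret_key = "replace-with-a-real-secret"
    IDLE_LIMIT_SECONDS = 30 * 60

    @app.before_request
    def enforce_idle_timeout():
        session.permanent = False  # session cookie only; nothing persists on disk
        last_seen = session.get("last_seen")
        now = time.time()
        if last_seen is not None and now - last_seen > IDLE_LIMIT_SECONDS:
            session.clear()        # log the user out after 30 idle minutes
            return redirect("/login")
        session["last_seen"] = now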

A status screen, shown in figure 2, indicated which stages of institution data collection were completed (denoted by a check mark) and allowed institutions to select those stages that were not yet completed. Once a stage was completed, it was no longer accessible via the Web.

17 A persistent cookie is a piece of information, such as an IPEDS ID, that can be stored in a file on the user’s computer. This information could then be used to identify a computer without the user even logging into the application.


Figure 2. The 2004 National Study of Faculty and Students institution website status screen

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Faculty and Students (NSoFaS:04) website.

2.3.2 Institution Contacting

The eligible institution sample for the NSoFaS:04 consisted of 1,630 institutions, of

which 1,070 were sampled for NSOPF:04 as well as NPSAS:04. These 1,070 institutions were recruited to participate in both components of NSoFaS:04 (NSOPF:04 and NPSAS:04). The fielding of NSOPF:04 and NPSAS:04 together as the National Study of Faculty and Students was one of three changes made in the institution contacting procedures for this cycle of NSOPF.

The second change was to administer the institution questionnaire as a web or CATI instrument, with no hardcopy equivalent.

The third change was to begin recruiting institutions and initiating coordinator contacts in March 2003—a full 8 months prior to the November reference date for the fall term, and roughly 5 to 6 months earlier than the September start dates of previous cycles. This change was prompted by the need to draw a faculty sample and subsequently contact sampled faculty for participation prior to the 2004 summer break. It was hoped that the additional lead time would allow schools to better plan for the staffing and resources required for participation within the study’s schedule constraints, allow institutions additional time to initiate and complete any internal review procedures they felt necessary, and also allow the contractor time to work with institutions to resolve any potential roadblocks to their participation. This advance notification was intended both to speed up receipt of faculty lists, and to positively impact the institution response rate. By sampling and contacting faculty earlier in the academic year, it was hoped that a higher faculty response rate could be achieved.


Prior to the field test, endorsements from organizations that had previously endorsed NSOPF and/or NPSAS were renewed and extended, as appropriate, to both NSoFaS:04 component studies. An effort was also made to solicit new endorsements from other organizations as well. In all, 25 organizations endorsed both components of NSoFaS:04; 24 of these were relevant to NSOPF:04.18 These endorsements were featured on all project letterhead and pamphlets and on the NSoFaS website. In addition, several of these organizations continued to promote the study throughout the data collection period in newsletters and other communications with their member institutions. See appendix E for a list of the 24 organizations that endorsed NSOPF:04.

For NSOPF:04, data collection proceeded in four stages:

• verification;

• institution recruitment;

• advance notification of the coordinator; and

• faculty list and institution questionnaire data collection procedures.

Procedures for each stage of data collection are outlined below.

Verification

Verification began on January 23, 2003, and was completed prior to the start of institution recruitment on March 10, 2003. Institution contactors were trained to contact each institution at its main number, verify address information, and confirm the name and contact information for the CA at the institution. They also confirmed that the school was Title IV eligible and open to the general public during the fall 2003 term.

Institutions flagged as potentially ineligible—including closed institutions and institutions that indicated they were not Title IV eligible or open to the general public—were forwarded to project staff for review. Project staff also reviewed instances of sampled institutions merging with other institutions (sampled or unsampled), possible changes in mission that could affect the institution’s sampling strata, and changes in name or address, to confirm the institution was eligible and correctly identified.

Institution recruitment and advance notification of the coordinator

Institution recruitment began on March 10, 2003. The Chief Administrator (CA) at each institution sampled for NSoFaS:04 was sent the following materials (see appendix F for copies of these letters and pamphlets):

• a cover letter, printed on NCES letterhead, providing background information on NSOPF:04 and NPSAS:04;

• an NSoFaS:04 pamphlet summarizing the objectives of both NPSAS:04 and NSOPF:04, and providing background information and selected findings for each component;

18 One of the 25 organizations, associated with for-profit schools ineligible for NSOPF, was asked only for an endorsement for NPSAS.


• an NSOPF:04 pamphlet, included to show what had been prepared for mailing to the sampled faculty;

• a NPSAS:04 pamphlet, included to show what had been prepared for mailing to sampled students; and

• a project timeline outlining the flow of activities for both component studies of NSoFaS:04, and the projected schedule for each.

A team of institution contactors followed up with the CA by telephone. The CA was asked to name an Institution Coordinator (IC) by completing the Designation of Coordinator form online, or providing the information over the telephone. Once the IC was identified, they were mailed an identical packet, with a cover letter informing them that they would be mailed complete instructions for their participation in each component in September.

During this advance notification stage of data collection, ICs were asked to complete an online Coordinator Response Form (CRF) which could also be administered by CATI (see appendix F). This instrument confirmed that the institution could supply the items requested for the faculty and student lists within the stated schedule constraints. It also contained items designed to expedite collection of student record information for the student component.

ICs who indicated that a formal review process (such as an Institutional Review Board [IRB] review) was necessary before their institution would agree to participate were forwarded additional project materials as appropriate. A complete IRB approval packet was prepared for this purpose and mailed to the IC upon request. This packet included copies of instruments, as well as complete descriptions of relevant survey procedures (e.g., confidentiality and informed consent).

Faculty list collection procedures

Complete instructions for participation in both NSOPF:04 and NPSAS:04 were sent to all designated ICs on September 29, 2003. Binders continued to be mailed to ICs on a flow basis as they were designated. The mailing, which was packaged in a three-ring binder, included the following materials:

• a cover letter describing the study, the institution’s password, IPEDS unit ID,19 and URL (web address) necessary to access the NSoFaS:04 website (a separate letter was created for NPSAS:04-only sampled institutions);

• a copy of the letter that went to the CA, and a facsimile of the Designation of Coordinator form;

• a complete list of endorsements;

• a project timeline outlining the flow of activities for both component studies of NSoFaS:04, and the projected schedule for each;

• instructions for preparing the list of faculty and instructional staff, including a list of data elements requested, and a suggested file layout;

19 Chief Administrators and Institution Coordinators used their institution IPEDS unit ID and a password to authenticate to the institution website. Faculty and instructional staff were assigned a study ID and password to authenticate to the faculty website.


• complete instructions for participation in each phase of NSoFaS:04; and

• a list of transmittal options for sending faculty lists, by mail, e-mail, and direct upload to the NSoFaS:04 website, together with an express courier packet and label for mailing the lists if required.

The instructions directed the ICs to provide a list of full- and part-time faculty and instructional staff, including all personnel who had faculty status or any instructional responsibilities during the fall 2003 term. Institutions were encouraged to submit an electronic list by uploading it to the secure website. The data items requested for each listed faculty or instructional staff member were as follows (a sketch of one possible file layout appears after this list):

• full name;

• academic discipline;

• department/program affiliation;

• full-time/part-time status;

• gender;

• race/ethnicity;

• employee ID number (to eliminate duplicates from sample); and

• contact information (institution and home mailing address, institution and home e-mail address [if available], and home and campus telephone numbers).
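
Below is a purely hypothetical sketch of one way such a list might be laid out as a comma-separated file. The column names and the single record are invented for illustration; they are not the suggested file layout that was distributed to institutions.

    # Hypothetical faculty-list layout for illustration only; the suggested file
    # layout sent to institutions is in the instructions binder, not reproduced here.
    import csv

    COLUMNS = [
        "full_name", "academic_discipline", "department", "employment_status",
        "gender", "race_ethnicity", "employee_id",
        "institution_address", "home_address", "institution_email", "home_email",
        "campus_phone", "home_phone",
    ]

    with open("faculty_list.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerow({
            "full_name": "Doe, Jane",
            "academic_discipline": "Mathematics",
            "department": "Mathematics and Statistics",
            "employment_status": "Full-time",
            "employee_id": "000000",   # used only to eliminate duplicates from the sample
        })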

Follow-up with ICs was conducted by telephone, mail, and e-mail. Telephone prompts to the ICs were made for institutions that had not provided lists. To minimize the number of contacts made to an IC, prompting for NSOPF:04 was combined with prompting for NPSAS:04. E-mail prompts to ICs, keyed to pending project deadlines, were regularly utilized. E-mail prompts focused on timely completion of requested materials and encouraged review of the instructions for participation. As faculty lists were received, they were reviewed for completeness, readability, and accuracy. Additional follow-up to clarify the information provided or retrieve missing information was conducted by the institution contactors as necessary.

Counts of full- and part-time faculty were collected in both the institution questionnaire and in the faculty lists. For each institution, the counts of full- and part-time faculty were checked against those provided in the institution questionnaire and against 2001 IPEDS Fall Staff Survey data. IPEDS data were used for discrepancy checks whenever institution questionnaire data were unavailable but also served as an additional check to catch inaccuracies in matching questionnaire/list data that otherwise would not have been discovered. For further details regarding quality control checks, see section 4.1.2.
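
The count comparison can be illustrated with a minimal sketch. The fallback order (questionnaire counts first, then IPEDS) follows the text above, but the tolerance threshold and function name are assumptions made for the example; the actual quality control rules are described in section 4.1.2.

    # Illustrative discrepancy check: compare faculty counts from the list against
    # questionnaire counts, falling back to IPEDS counts when the questionnaire is
    # missing. The 25 percent tolerance is an assumption for this sketch.
    def flag_count_discrepancy(list_count, questionnaire_count=None, ipeds_count=None,
                               tolerance=0.25):
        benchmark = questionnaire_count if questionnaire_count is not None else ipeds_count
        if benchmark is None or benchmark == 0:
            return True  # nothing to compare against; route to manual review
        return abs(list_count - benchmark) / benchmark > tolerance

    print(flag_count_discrepancy(list_count=350, questionnaire_count=500))  # True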

Reimbursement for the time and staff involved in providing the faculty list was offered to institutions indicating difficulty in complying with the request within schedule constraints. A refusal conversion letter was mailed to institutions that had not responded by November 21, 2003. The letter underscored the offer of reimbursement. Beginning in May 2004, a flat $500 reimbursement was offered to institutions for providing the outstanding faculty and student lists


by the end of June. This offer was extended both to explicit refusals and to schools that had indicated cooperation but had yet to comply.

For institutions lacking the resources to provide a complete list of full- and part-time faculty despite the offer of reimbursement, list information was, if possible, abstracted from course catalogs, faculty directories, and other publicly available sources. Those institutions for which usable lists were identified were notified of this sampling procedure; institutions which indicated that they did not want their faculty included in the sample were excluded. Faculty lists abstracted in this fashion were reviewed for completeness against IPEDS before being approved for sampling. Faculty list collection continued through July 11.

Institution questionnaire data collection procedures

Institution Coordinators were asked to complete the institution questionnaire (described in section 2.2.3) using the study's institution website. Institution questionnaire follow-up was conducted simultaneously with follow-up for lists of faculty. If an institution was unable to complete the questionnaire online, efforts were made to collect the information over the telephone. This often involved contacting multiple offices within the institution, as questions about benefits and tenure policies were most often answered by human resources and/or the academic affairs office, while questions about faculty counts and turnover were typically answered by institutional research staff.

To expedite data collection, missing questionnaire data was, in some instances, abstracted directly from benefits and policy documentation supplied by the institution, or publicly available on the institution’s website. In addition, several large multi-campus systems provided data for their campuses at a system level or indicated that specific policy and benefits information was the same for all related campuses.

Refusal conversion efforts for the institution questionnaire were conducted with institutions regardless of whether they supplied a list of faculty. After August, institutions which had not completed the questionnaire were offered a reimbursement of $50 for providing the questionnaire within schedule constraints. Data collection for the institution questionnaire closed on October 22, 2004.

Administrative systems and procedures

To efficiently track all mail and telephone follow-up (both incoming and outgoing) and processing and sampling activities, the study utilized an Institution Contacting System (ICS) specifically designed to meet the needs of the NSoFaS:04 project. The ICS was accessible to contactors, Call Center20 supervisors, and project staff. The NSoFaS:04 ICS was designed so that a change in status (e.g., a completed Designation of Coordinator form) automatically generated the next step (e.g., a mailout to the IC and an automatic appointment for telephone follow-up). Electronic call notes documented the outcome of every conversation. The system allowed interviewers to set appointments for future follow-up. Through the ICS, the interviewer had the ability to designate an IC, provide contact information, and access the institution questionnaire and other data collection instruments. The ICS gave interviewers the ability to generate an automatic e-mail to ICs containing the password and IPEDS unit ID required for access. The

20 RTI’s Call Center Services provides telephone, web, and tracing services for a wide variety of projects, and operates two call centers: one in Raleigh, NC, and one in Greenville, NC.


problem report form feature of the ICS allowed institution contactors to immediately forward specific call notes to an e-mail box monitored by project staff. This ensured that refusals, requests for remails, and calls requiring follow-up by project staff were handled promptly.
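
A highly simplified sketch of the kind of status-driven rule the ICS applied follows. The status names and follow-up actions are assumptions made for illustration, not the ICS's actual codes.

    # Sketch of an event-driven contacting rule: a status change generates the next
    # step. Status and action names are illustrative, not the ICS's actual values.
    NEXT_STEPS = {
        "coordinator_designated": ["mail_coordinator_packet", "schedule_phone_followup"],
        "faculty_list_received":  ["review_list", "check_counts_against_ipeds"],
        "refusal_reported":       ["forward_problem_report", "assign_refusal_conversion"],
    }

    def on_status_change(case_id, new_status, queue):
        """Append the follow-up actions triggered by a case's new status."""
        for action in NEXT_STEPS.get(new_status, []):
            queue.append((case_id, action))

    work_queue = []
    on_status_change("institution-001", "coordinator_designated", work_queue)
    print(work_queue)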

Quality Circle meetings, attended by interviewers, supervisors, team leaders, and project staff, were held on a weekly basis to share ideas for gaining institutional cooperation and suggestions for improving procedures. Project staff solicited feedback from call center personnel on the ICS, scripts, and handling problems reported by respondents (e.g., difficulties accessing the website).

2.4 Faculty Data Collection

The NSOPF:04 utilized a mixed-mode data collection methodology that allowed sample

members to participate either by web-based self-administered questionnaire or via an interviewer-administered telephone interview. At the start of faculty data collection, introductory materials were sent to sample members via first class mail as well as electronic mail (if an e-mail address was available). The initial letter included instructions for completing the self-administered questionnaire on the Internet or by calling a toll-free number to complete a telephone interview. After an initial 4-week period, telephone interviewers began calling sample members. The self-administered web instrument remained available to respondents throughout data collection. An early-response incentive, designed to encourage sample members to complete the self-administered questionnaire prior to outgoing CATI calls, was offered to sample members who completed the questionnaire within 4 weeks of the initial mailing. Incentives were also offered to selected sample members as necessary (i.e., those who refused and other nonrespondents).

2.4.1 Faculty Website

The website for the NSOPF:04 served a dual purpose. The primary function was to

provide access to the web questionnaire for the sampled faculty and instructional staff. The secondary function was to provide information about the study, the selected sample, the sponsor, the contractor, and confidentiality. In addition to the information available on the site, links were provided to other relevant sites (e.g., NCES). The home page of the NSOPF:04 faculty website is depicted in figure 3.

The initial login page provided access to the self-administered questionnaire. The login process involved entering a specific study ID and password, which were provided to the respondent in every letter and e-mail message. Respondents could also obtain their study ID and password by sending an e-mail to the project, or by contacting a help desk agent at the NSOPF:04 toll-free number.

As with the institution application, the web instrument was protected by SSL encryption, an automatic time out feature, and omission of any persistent cookies.


Figure 3. The NSOPF:04 faculty website home page

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

2.4.2 Locating and Interviewing Procedures

The NSOPF:04 faculty data collection design involved locating sample members,

providing an opportunity for the faculty or instructional staff to complete the self-administered questionnaire, and following up with web nonrespondents after 4 weeks to conduct a computer-assisted telephone interview. The data collection period lasted approximately 9 months (January 15 through October 6, 2004). Data collection activities for faculty are shown in figure 4.


Figure 4. NSOPF:04 faculty data collection overview

(Figure shows the data collection flow: faculty contact information was loaded into the database on a flow basis as lists became available from schools; cases with missing or non-unique telephone numbers, or from schools supplying e-mail addresses for fewer than 75 percent of selected faculty members, received advance tracing; a lead letter mailing began web data collection; cases that did not complete the web questionnaire moved to CATI follow-up after 4 weeks; cases without a good telephone number went to intensive tracing; and each case ended as a completed survey, a final nonrespondent, or unable to locate.)

1 If a home address was available for the sample member, the lead letter package was mailed to the home. If there was no home address, the package was mailed to the school address. If there was no specific school address available, the package was mailed to the main address on file for the school. Sending packages to the home address resulted in a higher response rate compared to sending packages to the school address (78 percent versus 67 percent; χ2 = 565.6, p < .0001).
2 The web interview option was available throughout data collection, even after telephone follow-up began.
3 The sample member's office and home telephone numbers were called by computer-assisted telephone interview (CATI) interviewers. If no specific telephone number was available for the sample member, the school's main telephone number was used.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).
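
The chi-square comparison of response rates by mailing address in note 1 can be reproduced in outline as below. The cell counts are hypothetical stand-ins chosen only to show the computation, since the note reports the rates and test statistic rather than the underlying counts.

    # Sketch of a 2x2 chi-square test of response rate by mailing address.
    # The counts below are invented for illustration; the report gives only the
    # rates (78 vs. 67 percent) and the resulting statistic.
    from scipy.stats import chi2_contingency

    #            responded, did not respond
    table = [[7800, 2200],   # lead letter sent to home address (hypothetical)
             [6700, 3300]]   # lead letter sent to school address (hypothetical)

    chi2, p_value, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p_value:.4g}")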


Mailouts

Faculty and instructional staff were sent a lead letter, instructions for accessing the web instrument via the Internet or with the assistance of a telephone interviewer, and a study pamphlet. (Examples of these materials are included in appendix F.) The lead letter introduced the study and listed the organizations that endorsed the study. If an e-mail address was available for a sample member, the introduction to the study was also sent via e-mail.

Periodically throughout the data collection period, reminder letters and e-mail messages were sent to nonrespondents to encourage their participation and to notify them of the incentive, if applicable. Examples of these follow-up contacts are included in appendix F.

Locating

Identifying a valid mailing address and telephone number for all selected faculty and instructional staff sampled from known institutions was critical to the success of the NSOPF:04. Locating activities were conducted in two stages: advance tracing, which took place before data collection began, and intensive tracing conducted during data collection.

Advance tracing. Upon receipt of faculty lists from participating institutions, contact information for the sampled faculty and instructional staff was reviewed and assessed for completeness. Schools for which fewer than 75 percent of the sampled cases had e-mail addresses (n = 430) were selected for tracing before being sent a lead letter. Prior to CATI operations, home contact information was sent to Telematch to obtain the latest telephone numbers.
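
The selection rule for advance tracing can be expressed directly, as in the sketch below; the data structure is an assumption made for illustration.

    # Illustrative selection of schools for advance tracing: any school where fewer
    # than 75 percent of its sampled faculty have an e-mail address on the list.
    def schools_needing_tracing(sampled_cases, threshold=0.75):
        """sampled_cases maps a school ID to a list of dicts with an 'email' field."""
        flagged = []
        for school_id, cases in sampled_cases.items():
            with_email = sum(1 for c in cases if c.get("email"))
            if cases and with_email / len(cases) < threshold:
                flagged.append(school_id)
        return flagged

    example = {"school_a": [{"email": "x@a.edu"}, {"email": None}, {"email": None}],
               "school_b": [{"email": "y@b.edu"}, {"email": "z@b.edu"}]}
    print(schools_needing_tracing(example))  # ['school_a']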

Initial tracing efforts included searches on the school’s website for contact information. When this was not an option, more extensive database searches were employed during intensive tracing. In some cases, the searches confirmed or updated the contact information provided by the institution; in other cases, the searches resulted in new contact information. All locating information obtained as a result of these searches was loaded into the NSOPF:04 database.

Intensive tracing. Intensive tracing was performed on a case if advance tracing did not yield a telephone number for loading in CATI, or if the case was designated as a dead end in CATI (i.e., there were no more telephone numbers to call for the case). The following steps were performed by the tracing unit to locate sample members.

• Check the preloaded information using an online directory assistance search. This step was intended to identify the easy-to-locate cases (e.g., cases with the correct telephone number but the wrong area code).

• Conduct credit bureau database searches. The tracing staff had access to various proprietary databases (TransUnion, Equifax, and Experian) containing current address and phone listings for the majority of consumers with a credit history.

• Conduct additional intensive tracing. This step included (but was not limited to) searches using Lexis-Nexis and FastData, directory assistance calls, and searches of institution websites for campus directories.

Tracing staff checked all new leads procured during their tracing efforts to confirm the addresses and telephone numbers that were obtained. When a telephone number for a sample member was confirmed, telephone interviewing resumed for that case. Cases with new address


information were mailed a lead-letter packet. If the tracing staff located a new e-mail address for a sample member, the information was loaded into the database for future e-mail reminders and other mailings to nonrespondents.

Staff training

The mixed-mode design of the NSOPF:04 data collection required the development of three separate training programs for data collectors: help desk training, CATI interviewer training, and tracing training. In addition, separate training sessions were conducted for supervisors and monitors.

Detailed NSOPF:04 interviewer manuals were distributed at the outset of each training session. These manuals served both as an instruction guide for the training lectures, discussions, and practical exercises and as a reference guide for use after completion of training. Supplemental chapters that covered additional duties were provided for supervisors, monitors, and help desk agents. The manual's table of contents and an agenda for telephone interviewer training are included in appendix G.

All training sessions included a study overview, a review of the confidentiality requirements, a demonstration interview, an in-depth review of the instrument, hands-on practice exercises with the instrument, and open-ended coding modules. In addition, the help desk and telephone interviewer training sessions included the following additional topics:

• Help desk agents reviewed the “frequently asked questions” in detail, with a focus on responses to technical issues as well as instrument-specific questions, and instructions for documenting each call to the study hotline.

• Telephone interviewers were trained in techniques for gaining cooperation of sample members, and of other contacts, as well as techniques for addressing the concerns of reluctant participants and for avoiding refusals.

Self-administered questionnaires

The first phase of data collection, lasting 4 weeks after the lead letters were mailed, provided an opportunity for respondents to complete the self-administered questionnaire via the Internet before the telephone follow-up calls began. The web interview site remained available 24 hours a day, 7 days a week, thereby giving sample members the option to complete the questionnaire online during the entire 9 months of data collection.

Help desk operations

The NSOPF:04 help desk opened on January 15, 2004, in anticipation of the first respondent calls after the lead-letter mailing. The help desk staff were available to assist sample members who had questions or problems accessing and completing the self-administered questionnaire. A toll-free hotline was set up to accept incoming help desk calls. If technical difficulties prevented a sample member from completing the self-administered questionnaire, a help desk staff member, also trained to conduct telephone interviews, would encourage the caller to complete a telephone interview rather than to attempt the self-administered questionnaire.

All incoming calls from sample members were documented using the help desk software. In addition to this primary documentation function, the software provided information needed to


verify a sample member's identity, supplied login information (study ID and password) for the web questionnaire, and provided a means of tracking calls that could not be resolved immediately.

The help desk software also provided project staff with reports on the types and frequency of problems experienced by sample members, as well as a way to monitor the resolution status of all help desk inquiries.

Telephone interviewing

Telephone prompts to nonrespondents began on February 12, 2004, at the end of the early-response incentive period. CATI procedures included attempts to locate, gain cooperation from, and interview study sample members who had not completed the questionnaire online. Interviewers encouraged respondents to complete the interview by telephone as soon as they made contact. However, if the sample member expressed a preference for completing the self-administered questionnaire via the Internet, a callback was scheduled for 1 week later. During these callbacks, interviewers again prompted the faculty members to complete the questionnaire by telephone.

Refusal conversion procedures were used to gain cooperation from individuals who refused to complete the questionnaire. When a refusal was first encountered, either because the sample member refused or because a “gatekeeper” (secretary or spouse) refused on behalf of the sample member, the case was referred to a refusal conversion specialist. Refusal conversion specialists were selected from among those interviewers most skilled at obtaining cooperation and were given training in refusal conversion techniques tailored to NSOPF:04. The refusal training emphasized ways to gain cooperation, overcome objections, address the concerns of gatekeepers, and encourage participation.

2.5 Data Collection Systems

2.5.1 Instrument Development and Documentation System

The Instrument Development and Documentation System (IDADS) is a controlled web

environment in which project staff developed, reviewed, modified, and communicated changes to specifications, code, and documentation for the NSOPF:04 instrument. All information relating to the NSOPF:04 instrument was stored in a Structured Query Language (SQL) Server database and was made accessible through Windows and web interfaces. There are three modules within IDADS: specification, programming, and documentation.

Initial specifications were generated within the IDADS specification module. This module enabled access for searching, reviewing, commenting on, updating, exporting, and importing information associated with instrument development. All records were maintained individually for each item, which provided a historical account of all changes requested by both project staff and NCES.

Once specifications were finalized, the programming module within IDADS produced hypertext markup language (HTML), Active Server Pages (ASP), and JavaScript template program code for each screen based on the contents of the SQL Server database. This output included screen wording, response options, and code to write the responses to a database, as well as code to automatically handle such web instrument functions as backing up and moving forward, and recording timer data. For questions that had changed significantly since the field


test, the programming staff edited the automatically generated code to customize screen appearance and program response-based routing. For questions with minor changes, the programming staff simply modified the program code used in the field test.
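
In the same spirit, specification-driven screen generation can be sketched as below. The item specification fields, the generated HTML, and the function name are invented for illustration and bear no relation to the actual ASP/JavaScript templates that IDADS produced.

    # Toy example of generating a screen from an item specification, loosely in the
    # spirit of IDADS's programming module. All fields and output are assumptions.
    ITEM_SPEC = {
        "item_id": "Q61",
        "wording": "How satisfied are you with your instructional duties?",
        "response_options": ["Very dissatisfied", "Somewhat dissatisfied",
                             "Somewhat satisfied", "Very satisfied"],
    }

    def render_screen(spec):
        options = "\n".join(
            f'  <label><input type="radio" name="{spec["item_id"]}" value="{i}"> {text}</label>'
            for i, text in enumerate(spec["response_options"], start=1))
        return f'<form id="{spec["item_id"]}">\n  <p>{spec["wording"]}</p>\n{options}\n</form>'

    print(render_screen(ITEM_SPEC))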

The documentation module contained the finalized version of all instrument items, their screen wording, and variable and value labels. Also included were the more technical descriptions of items such as variable types (alpha or numeric), information regarding to whom the item was administered, and frequency distributions for response categories. The documentation module was used to generate the instrument facsimiles and the Electronic Codebook (ECB) input files.

2.5.2 Integrated Management System

All aspects of the study were under the control of an Integrated Management System

(IMS). The IMS was a comprehensive set of desktop tools designed to give project staff and NCES access to a centralized, easily accessible repository for project data and documents. The NSOPF:04 IMS consisted of three components: the management module, the Receipt Control System (RCS), and the Case Management System (CMS).

The management module of the IMS contained tools and strategies to assist project staff and the NCES project officer in managing the study. All information pertinent to the study was located there, accessible via the Internet, in a secure desktop environment. Available on the IMS website were the project schedule, monthly progress reports, daily data collection reports and status reports (available through the RCS described below), project plans and specifications, project information and deliverables, instrument specifications, staff contacts, the project bibliography, a document archive, and frequencies for the faculty and institution data. The IMS management module also had a download area from which the client and subcontractors retrieved large files when necessary.

The Receipt Control System (RCS) was an integrated set of systems that monitored all activities related to data collection, including tracing and locating. Through the RCS, project staff were able to perform stage-specific activities, track case statuses, identify problems early, and implement solutions effectively. RCS locator data were used for a number of daily tasks related to sample maintenance. Specifically, the mailout program produced mailings to sample members, the query system enabled administrators to review the locator information and status for a particular case, and the mail return system enabled project staff to update the locator database. The RCS also interacted with the Case Management System and tracing unit databases, sending locator data among the three systems as necessary.

The Case Management System (CMS) was the technological infrastructure that connected the various components of the CATI system, including the questionnaire, utility screens, databases, call scheduler, report modules, links to outside systems, and other system components. The call scheduler assigned cases to interviewers in a predefined priority order. In addition to delivering appointments to interviewers at the appropriate time, the call scheduler also calculated the priority scores (the order in which cases need to be called based on preprogrammed rules), sorted cases in non-appointment queues, and computed time zone adjustments to ensure that the sampled respondents were not phoned outside the specified calling


hours.21 The call scheduler also allowed callbacks to be set and assigned status codes to the case. Using an algorithm based on the previous call results, the call scheduler determined which telephone number (e.g., home or work) associated with the case should be called next.

21 Call Center hours were 9:00 a.m. to 11:00 p.m. Monday through Friday, 9:00 a.m. to 5:00 p.m. Saturday, 1:30 p.m. to 9:30 p.m. Sunday, Eastern Standard Time. The CMS was programmed to account for time zones such that respondents would not be called after 9:00 p.m. local time. Work numbers were only called 8:00 a.m. to 5:00 p.m. Monday through Friday, local time.
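
A condensed sketch of the kind of scheduling logic described above follows. The priority rules, statuses, and calling-hour checks shown are simplified assumptions loosely based on footnote 21, not the CMS's actual algorithm.

    # Simplified call-scheduler sketch: pick the next number to dial, respecting
    # local calling hours (see footnote 21) and the result of the previous attempt.
    # The rules below are illustrative assumptions, not the CMS algorithm itself.
    from datetime import datetime, timedelta

    def next_number(case, now_utc):
        """Return 'home', 'work', or None depending on local time and last result."""
        local = now_utc + timedelta(hours=case["utc_offset_hours"])
        hour, weekday = local.hour, local.weekday()        # Monday == 0
        work_ok = weekday < 5 and 8 <= hour < 17           # work: weekdays, 8 a.m.-5 p.m.
        home_ok = 9 <= hour < 21                           # home: not after 9 p.m. local
        if case["last_result"] == "no_answer_home" and work_ok:
            return "work"
        if home_ok:
            return "home"
        return "work" if work_ok else None

    case = {"utc_offset_hours": -5, "last_result": "no_answer_home"}
    print(next_number(case, datetime(2004, 2, 16, 19, 0)))  # 2 p.m. ET on a Monday -> 'work'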


Chapter 3 Data Collection Outcomes

The success of the 2004 National Study of Postsecondary Faculty (NSOPF:04) was dependent upon achieving high levels of cooperation at all stages of the data collection process. The data collection results—namely the institution and faculty response rates, along with the results of the efforts that contributed to those rates—are the focus of this chapter.

3.1 Institution Data Collection Results

3.1.1 Institution Participation

Of the 1,080 institutions selected to participate in NSOPF:04, 1,070 were eligible

institutions.22 Of the eligible institutions, 97 percent (unweighted) appointed an Institution Coordinator (IC) to assist with study requirements and 85 percent completed the Coordinator Response Form (CRF), indicating their initial intent to participate in both components of the study and adhere to project timelines. Ultimately, 91 percent (unweighted) of the eligible institutions provided a list of faculty and 86 percent completed institution questionnaires.

Fifty-seven institutions indicated having policies that required the 2004 National Study of Faculty and Students (NSoFaS:04) survey request to be submitted to their Institutional Review Board (IRB) for formal approval. One advantage of the advance notification period was that it allowed the contractor sufficient time to prepare customized IRB approval packages for submission to each of these institutions. This procedure expedited the approval process and alleviated the burden on the IC. Of the 60 institutions that were sent IRB approval packages, all but three approved participation in NSOPF:04.

Faculty lists

Two key changes in data collection procedures had the potential to affect faculty list participation rates for NSOPF:04: the advance notification initiative begun in March 2003, and the decision to combine the data collection efforts for NSOPF:04 with the 2004 National Postsecondary Student Aid Study (NPSAS:04) under NSoFaS:04. Table 9 compares the participation rate achieved in NSOPF:04 with the rates achieved in previous cycles.

22 Ineligible institutions included institutions treated as mergers and reported for by other institutions, closed institutions, and institutions that did not meet eligibility requirements. Numbers of institutions are rounded to the nearest 10.


Table 9. Number and percentage of institutions providing faculty lists, by cycle of National Study of Postsecondary Faculty (NSOPF): 1988 to 2004

NSOPF cycle | Number of eligible institutions | Number of institutions providing list | Unweighted percent participation rate1
NSOPF:88 field test | 110 | 100 | 91.4
NSOPF:88 full-scale study | 480 | 450 | 93.5
NSOPF:93 field test | 140 | 120 | 89.0
NSOPF:93 full-scale study | 960 | 820 | 84.9
NSOPF:99 field test | 160 | 150 | 90.1
NSOPF:99 full-scale study | 960 | 820 | 85.4
NSOPF:04 field test | 150 | 130 | 89.9
NSOPF:04 full-scale study | 1,070 | 980 | 91.3
1 Percentages are based on the number of eligible institutions within the row under consideration, and are based on original unrounded numbers.
NOTE: Numbers of eligible institutions and numbers of institutions providing lists are rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

A total of 980 eligible institutions (91 percent, both unweighted and weighted) provided a faculty list, with all institutional strata exceeding a weighted participation rate of 85 percent. The breakdown of institutions providing faculty lists, by institution type, is presented in table 10.

Table 10. Number and percentage of institutions providing faculty lists, by type of institution: 2004

Institution type | Number eligible | Number participating | Unweighted percent participation rate1 | Weighted percent participation rate1
Total | 1,070 | 980
Public doctoral | 190 | 180 | 92.7 | 93.2
Public master's | 120 | 100 | 89.7 | 89.1
Public bachelor's | 30 | 30 | 92.9 | 88.4
Public associate's | 330 | 290 | 89.1 | 87.4
Public other | 10 | 10 | 100.0 | 100.0
Private not-for-profit doctoral | 110 | 100 | 92.0 | 95.6
Private not-for-profit master's | 80 | 80 | 92.6 | 86.8
Private not-for-profit bachelor's | 130 | 120 | 94.6 | 93.1
Private not-for-profit associate's | 10 | 10 | 75.0 | 96.0
Private not-for-profit other | 60 | 60 | 93.3 | 91.8
1 Percentages are based on the number of eligible institutions within the row under consideration, and are based on original unrounded numbers.
NOTE: Number of eligible and participating institutions are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).
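
For readers unfamiliar with the distinction, the unweighted and weighted participation rates reported in tables 9 through 11 differ only in whether institutions' sampling weights enter the calculation. The sketch below uses fabricated institutions and weights purely to show the arithmetic.

    # Unweighted vs. weighted participation rates (illustrative numbers only).
    # Unweighted: share of eligible institutions that participated.
    # Weighted: the same share after applying each institution's sampling weight.
    eligible = [
        {"participated": True,  "weight": 4.0},
        {"participated": True,  "weight": 1.5},
        {"participated": False, "weight": 2.5},
    ]

    unweighted = sum(i["participated"] for i in eligible) / len(eligible)
    weighted = (sum(i["weight"] for i in eligible if i["participated"])
                / sum(i["weight"] for i in eligible))
    print(f"unweighted = {unweighted:.1%}, weighted = {weighted:.1%}")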

Institution questionnaire

A total of 920 institutions, representing 84 percent of eligible institutions, completed the institution questionnaire. Table 11 provides a breakdown of institution participation by strata.


Table 11. Number and percentage of institutions providing institution questionnaires, by type of institution: 2004

Institution type | Number eligible | Number participating | Unweighted percent participation rate1 | Weighted percent participation rate1
Total | 1,070 | 920 | 86.1 | 84.2
Public doctoral | 190 | 170 | 86.5 | 84.7
Public master's | 120 | 110 | 90.5 | 89.6
Public bachelor's | 30 | 30 | 100.0 | 100.0
Public associate's | 330 | 290 | 89.1 | 83.6
Public other | 10 | 10 | 87.5 | 98.9
Private not-for-profit doctoral | 110 | 90 | 80.4 | 83.7
Private not-for-profit master's | 80 | 70 | 81.5 | 79.8
Private not-for-profit bachelor's | 130 | 110 | 83.8 | 77.7
Private not-for-profit associate's | 10 | 10 | 75.0 | 86.0
Private not-for-profit other | 60 | 50 | 76.7 | 76.2
1 Percentages are based on the number of eligible institutions within the row under consideration, and are based on original unrounded numbers.
NOTE: Number of eligible and participating institutions are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 12 shows the breakdown of completed institution questionnaires by mode of administration. Those completed in computer assisted telephone interview (CATI) mode include instances where the questionnaire was finalized with interviewer assistance (e.g., questionnaires wholly or partially data-entered by project staff from information supplied on hardcopy by the institution) and questionnaires completed, wholly or partly, by CATI. Web completions are defined as those questionnaires transmitted as complete by the institution, although some institutions may have provided some responses in CATI. Nearly 81 percent of institutions completed the institution questionnaire using the Web, and about 19 percent completed it with the help of an interviewer. By comparison, in 1999, the only previous cycle in which a web questionnaire was used, 69 percent of the questionnaires were completed on paper and 31 percent were completed on the Web. The percentage of interviews completed at least in part by CATI is fairly consistent with the percentage of questionnaires completed with interviewer assistance in previous cycles.

Table 12. Number and percentage of institutions providing institution questionnaires, by mode of administration: 2004

Mode     Number of participating institutions   Unweighted response rate (percent)1
Total                                     920                                100.0
Web                                       740                                 80.5
CATI                                      180                                 19.5
1 Percentages are based on the total number of participating institutions, and are based on original unrounded numbers.
NOTE: Number of participating institutions are rounded to the nearest 10. Detail may not sum to totals because of rounding. CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


3.1.2 Institution Survey Completion Timing

The timing analysis was conducted by embedding time stamps in the programming code for each form (screen) in the survey. From these time stamps, the number of seconds spent on each screen (on-screen time) and the transit time between screens (i.e., the time required to transmit data to the server, for the server to store the data and assemble the next page, and for the page to be transmitted and loaded on the computer) were calculated. A cumulative on-screen time and a cumulative transit time for the institution survey also were calculated from the time stamps. The sum of the cumulative on-screen and transit times was the total instrument time (i.e., the number of minutes it took to administer the institution questionnaire).
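
This calculation can be sketched in a few lines. The snippet below is a minimal illustration only, assuming a hypothetical log of (form, served, received) time stamps rather than the actual NSOPF:04 data structures; the field layout and names are not those of the study's system.

    # Minimal sketch of the timing calculation described above (hypothetical data layout).
    # Each screen visit is recorded as (form_id, served_at, received_at) time stamps.
    from datetime import datetime

    def instrument_times(visits):
        """Return (on_screen, transit, total) time in seconds for one case.

        `visits` is a chronologically ordered list of (form_id, served_at, received_at)
        tuples: served_at = page delivered to the respondent, received_at = data posted back.
        """
        on_screen = sum((recv - served).total_seconds() for _, served, recv in visits)
        # Transit time: gap between posting one form and the next form being delivered.
        transit = sum(
            (visits[i + 1][1] - visits[i][2]).total_seconds()
            for i in range(len(visits) - 1)
        )
        return on_screen, transit, on_screen + transit

    # Example: two screens, 30 and 45 seconds on screen, 5 seconds of transit between them.
    visits = [
        ("I1", datetime(2004, 2, 3, 10, 0, 0), datetime(2004, 2, 3, 10, 0, 30)),
        ("I2", datetime(2004, 2, 3, 10, 0, 35), datetime(2004, 2, 3, 10, 1, 20)),
    ]
    print(instrument_times(visits))  # (75.0, 5.0, 80.0)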

Unlike most questionnaires, which require the respondent to complete the survey in sequential order, the institution questionnaire included a status screen that allowed respondents to jump to particular questions they could answer, and skip over ones they could not answer. For most institutions, the questionnaire was completed in multiple internet sessions and, in some cases, by multiple people at the institution.

The target time to complete the institution questionnaire was 45 minutes. Based on the time stamps for each form, the actual time to complete the questionnaire ranged from less than 1 minute to 4 hours and 9 minutes, with an average of 35 minutes.23 Of these 35 minutes, approximately 31 minutes, on average, were spent answering questions (on-screen time) and 5 minutes, on average, were spent in transit. These numbers may be misleading because some institutions may have completed the sample hardcopy version of the questionnaire in advance, so their time to complete the web questionnaire simply reflected the time it took to key in their responses.

Table 13 reports the average and maximum times (in seconds) to complete each form in the institution instrument. Ten forms (screens) of the institution survey took more than 1 minute to complete, on average. Each of these forms required the respondent to look up information and/or requested several pieces of information, which accounts for the longer times on these screens.

23 The average time excludes 28 cases with unexplained negative transit times. Some very short survey completion times may be attributed to institutions that answered only a small subset of items in the questionnaire. Very long survey times may be the result of the respondent repeatedly timing out while in the questionnaire (e.g., stepping away to answer the phone).


Table 13. Average and maximum completion time, in seconds, for forms in the institution questionnaire: 2004

                                                                      Time in seconds
Form    Description                                                  Average   Maximum   Number of cases
I1      Number full-/part-time faculty, fall 2003                        211     1,319               920
I1B     Have full-/part-time faculty, fall 2003                           25       166               200
I2      Changes in number of full-time faculty                           252     1,585               920
I2A     Reason for discrepancy, I1A and I2G                               84       538               240
I3      Full-time tenure: has tenure system                               26       434               920
I4      Full-time tenure: number considered for tenure, 2002-03           62       620               750
I5      Full-time tenure: number granted tenure, 2002-03                  16       164               650
I6      Full-time tenure: maximum years on tenure track                   39       399               740
I7      Full-time tenure: institution actions, last 5 years               55       436               740
I7SP    Full-time tenure: number early retirees, last 5 years             53       586               400
I8      Full-time tenure: discontinued tenure system, last 5 years        15       125               190
I9      Full-time faculty: positions sought to fill, fall 2003            45       403               910
I10A    Full-time faculty: benefits available                            111     1,091               910
I10B    Full-time faculty: benefits subsidized                            58       443               900
I11     Full-time faculty: other benefits available                       80       427               910
I12     Full-time faculty: union representation                           20       408               910
I13     Full-time faculty: teaching assessment                            64       457               910
I14     Part-time benefit: retirement plan                                38       435               910
I15A    Part-time faculty: benefits available                             61       479               910
I15B    Part-time faculty: benefits subsidized                            35       319               530
I16     Part-time faculty: other benefits available                       52       327               910
I17     Part-time faculty: union representation                           14       200               910
I18     Part-time faculty: teaching assessment                            41       320               910
I19     Undergraduate instruction: percent assignment                    126       941               920
I20     Comments/suggestions                                             143       785               920
NOTE: The number of cases per form varies due to the interview skip logic. Outliers for each form were top coded (mean + 3 standard deviations). Numbers of cases are rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).
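
The top coding rule noted in the table above can be illustrated briefly. This is a sketch of the stated rule (cap at the mean plus three standard deviations), not the project's production code, and it leaves open whether the population or sample standard deviation was used.

    # Illustrative top coding of per-form completion times at mean + 3 standard deviations.
    import statistics

    def topcode(times):
        """Cap each value at the mean plus three standard deviations of the input."""
        mean = statistics.mean(times)
        sd = statistics.pstdev(times)  # the report does not say whether the population or sample SD was used
        cap = mean + 3 * sd
        return [min(t, cap) for t in times]

    # Nineteen typical screen times of 25 seconds and one extreme value of 600 seconds:
    times = [25] * 19 + [600]
    print(topcode(times)[-1] < 600)  # True: the outlier is capped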

3.2 Faculty Data Collection Results

Faculty data collection efforts for NSOPF:04 consisted of three essential steps: locating (identifying telephone numbers and addresses for sample members), contacting (carrying out the necessary steps to reach the faculty member), and encouraging survey completion by web-based self-administration or CATI. This section describes the results of the NSOPF:04 data collection effort and evaluates the effectiveness of the data collection procedures used in locating, contacting, and interviewing sample members.

3.2.1 Response Rate

Overall contacting and survey completion results for the faculty contact phase of NSOPF:04 are presented in figure 5. Of the 35,630 cases in the original sample, 1,300 (4 percent) were excluded because they were ineligible for the study or deceased. Of the 34,330 eligible sample members, 29,820 (87 percent) were contacted and 26,110 completed the survey, for an unweighted and weighted response rate of 76 percent, achieved in the 9-month period from January 15 to October 6, 2004.
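
For reference, the unweighted and weighted response rates cited throughout this chapter follow the standard definitions: the share of eligible cases that responded, and the same ratio computed on the sampling weights. The sketch below is illustrative only; the eligibility flag, response flag, and weight field names are hypothetical.

    # Minimal sketch of the unweighted and weighted response rate calculations
    # (field names are hypothetical, not the study's actual variable names).
    def response_rates(cases):
        eligible = [c for c in cases if c["eligible"]]
        respondents = [c for c in eligible if c["responded"]]
        unweighted = len(respondents) / len(eligible)
        weighted = sum(c["weight"] for c in respondents) / sum(c["weight"] for c in eligible)
        return unweighted, weighted

    cases = [
        {"eligible": True,  "responded": True,  "weight": 12.0},
        {"eligible": True,  "responded": False, "weight": 30.0},
        {"eligible": False, "responded": False, "weight": 15.0},  # excluded from both rates
    ]
    print(response_rates(cases))  # (0.5, ~0.29)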


Figure 5. Contacting and survey completion outcomes: 2004

[Flowchart] Sample (n = 35,630) divides into exclusions (n = 1,300: ineligibles = 1,250, deceased = 50), contacted cases (n = 29,820), and not-contacted cases (n = 4,500). Contacted cases divide into respondents (n = 26,110: completed survey = 24,360, partial survey = 140, abbreviated survey = 1,610) and nonrespondents (n = 3,720: refusals = 2,400, time ran out = 1,320).

NOTE: Numbers rounded to the nearest 10. Detail may not sum to totals because of rounding. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 14 shows the response rates of faculty by institution type. Response rates range from 67 percent (weighted) of faculty at public bachelor’s degree-granting institutions to 91 percent (weighted) of faculty at private not-for-profit associate’s degree-granting institutions.

Table 14. Number of sampled, eligible, and responding faculty and response rates, by institution type: 2004

                                                   Faculty                           Percent response rate2
Institution type                     Sampled   Eligible1   Responding1            Unweighted   Weighted
Total                                 35,629      34,330        26,110                  76.1       75.6

Institution level
2-year                                 9,188       8,830         6,440                  73.0       73.7
4-year non-doctorate-granting          8,747       8,430         6,720                  79.7       78.6
4-year doctorate-granting             17,694      17,070        12,950                  75.8       75.0

Institution control
Public                                23,280      22,450        17,120                  76.2       76.0
Private not-for-profit                12,349      11,880         8,990                  75.7       74.7

Institution type
Public doctoral                        9,827       9,500         7,460                  78.6       78.1
Public master's                        3,485       3,350         2,620                  78.1       78.5
Public bachelor's                        693         680           510                  75.4       67.4
Public associate's                     9,129       8,770         6,420                  73.1       73.7
Public other                             146         140           110                  73.6       73.3
Private not-for-profit doctoral        4,652       4,470         3,160                  70.7       68.2
Private not-for-profit master's        3,020       2,890         2,270                  78.6       78.5
Private not-for-profit bachelor's      3,218       3,120         2,520                  80.8       78.7
Private not-for-profit associate's       242         240           190                  79.8       91.0
Private not-for-profit other           1,217       1,160           850                  73.1       70.6
1 Numbers rounded to the nearest 10.
2 Percentages are based on the number of eligible faculty within the row under consideration. Percentages are based on original unrounded numbers.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table 15 presents faculty response rates by when the lead letter package was mailed. Response rates range from 56 percent for one of the later mailings to 80 percent for the first mailing.

Table 15. Faculty response rates, by date lead letter package was mailed: 2004

Date mailed                       Number eligible1   Response rate (percent)2
Total                                       34,330                       76.1
January 14-February 13, 2004                18,690                       80.3
February 14-March 13, 2004                   7,910                       76.3
March 14-April 13, 2004                      1,230                       74.6
April 14-May 13, 2004                        3,330                       67.1
May 14-June 13, 2004                         1,520                       55.9
June 14-July 13, 2004                        1,260                       62.7
July 14-July 21, 2004                          390                       67.5
1 Numbers rounded to the nearest 10.
2 Percentages are based on the number of eligible faculty within the row under consideration. Percentages are based on original unrounded numbers.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

3.2.2 Locating and Survey Completion

Most of the faculty lists provided by the institutions contained contact information for sample members, including the sample member's name, office telephone number, school name, school address, and department. For some cases, home addresses also were provided. In addition, a number of approaches were used to locate faculty and instructional staff, including advance tracing, the initial mailing to all sample members, follow-up letters and e-mails to nonrespondents, telephone tracing (interviewers calling telephone numbers provided on the faculty lists as well as any additional numbers obtained during the course of making those calls), and intensive tracing (i.e., using consumer databases, internet searches, and criss-cross directories).

Before the start of data collection, schools’ faculty lists were assessed for completeness of contact information. As necessary, advance tracing, described in section 2.4.2, was conducted. As shown in table 16, the contact information provided by the school proved effective in contacting faculty and instructional staff; 83 percent of sample members required no intensive tracing, while the remaining 17 percent required intensive tracing. Intensive tracing was required when a case did not have a telephone number associated with it or the CATI calls had exhausted all numbers for the case without reaching the sampled individual. Approximately 52 percent of cases sent to intensive tracing were located, compared to 90 percent of cases not sent to intensive tracing. Further, only 40 percent of cases sent to intensive tracing completed an interview compared with 80 percent of cases not sent to intensive tracing.
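
The routing rule just described can be stated compactly. The sketch below expresses it with hypothetical case-record fields; it is an illustration of the rule as stated, not the project's case-management code.

    # Sketch of the intensive-tracing routing rule described above (field names hypothetical).
    def needs_intensive_tracing(case):
        """A case goes to intensive tracing if it has no telephone number, or if CATI
        calling has exhausted every known number without reaching the sample member."""
        no_phone = len(case.get("phone_numbers", [])) == 0
        exhausted = case.get("all_numbers_exhausted", False) and not case.get("reached", False)
        return no_phone or exhausted

    print(needs_intensive_tracing({"phone_numbers": [], "reached": False}))  # True
    print(needs_intensive_tracing({"phone_numbers": ["555-0100"], "all_numbers_exhausted": True, "reached": False}))  # True
    print(needs_intensive_tracing({"phone_numbers": ["555-0100"], "reached": True}))  # False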


Table 16. Locate and interview rates, by intensive tracing efforts: 2004

                                                       Located                    Completed survey
Intensive tracing status              Total    Number1   Percent2              Number1   Percent3
Total                                35,629     29,820       83.7               26,110       73.3
Intensive tracing required            5,943      3,080       51.8                2,360       39.7
No intensive tracing required        29,686     26,750       90.1               23,750       80.0
1 Numbers rounded to the nearest 10.
2 Percentages are based on the total within the row under consideration.
3 Percentages are based on the number of eligible faculty within the row under consideration. Percentages are based on original unrounded numbers.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 17 provides an overview of the primary sources used by tracers during the intensive tracing process. Tracers generally used multiple sources when tracing a case, so no one source can be pinpointed as the one that resulted in the “locate.” Among the sources used most frequently for intensive tracing were internet searches, directory assistance, and various consumer database searches.

Table 17. Locate rates, by intensive tracing source: 2004

                                                                                         Located
Tracing source                                                            Total    Number   Percent1
Internet search                                                           3,726     1,739       46.7
Directory assistance                                                      3,529     1,730       49.0
Consumer database search—Lexis-Nexis                                      2,911     1,251       43.0
Reverse phone lookup—FastData                                             2,238     1,105       49.4
Name search—FastData                                                      3,276     1,531       46.7
Address search—FastData                                                   2,279       997       43.7
Neighbor search—FastData                                                     14         4       28.6
Directory Assistance Plus—FastData                                          526       216       41.1
Consumer database search—TransUnion                                       2,374     1,131       47.6
Consumer database search—Experian search on Social Security number        1,394       723       51.9
Consumer database search—Experian address search                          1,690       762       45.1
Other collateral source                                                   1,714       768       44.8
1 Percentages are based on the total within the row under consideration.
NOTE: Most cases were traced using multiple sources; therefore, row totals and percents are not mutually exclusive.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The breakdown of faculty and instructional staff requiring intensive tracing, by faculty status and institution type, is presented in table 18. Thirty-two percent of part-time faculty required intensive tracing, compared to 7 percent for full-time faculty (χ2 = 3806.9, p < .0001). Seventeen percent of faculty at public institutions required intensive tracing compared to 16 percent at private not-for-profit institutions (χ2 = 16.5, p < .0001).
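
These comparisons are standard chi-square tests of independence on the underlying counts. As an illustration only, and assuming SciPy is available, the full-time versus part-time comparison can be reproduced approximately from the rounded counts in table 18 (the study's own test would have used unrounded data):

    # Chi-square test of independence (intensive tracing by employment status),
    # using the rounded counts from table 18; SciPy is assumed to be available.
    from scipy.stats import chi2_contingency

    #            required intensive tracing, did not
    full_time = [1544, 21891 - 1544]
    part_time = [4210, 13008 - 4210]

    chi2, p, dof, expected = chi2_contingency([full_time, part_time], correction=False)
    print(round(chi2, 1), p < 1e-4)  # statistic close to the reported 3806.9, p < .0001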


Table 18. Faculty and instructional staff requiring intensive tracing procedures, by employment status and institution type: 2004

                                                             Intensive tracing
Employment status and institution type        Total       Number   Percent1
Total                                        35,629        5,943       16.7

Employment status
Full-time                                    21,891        1,544        7.1
Part-time                                    13,008        4,210       32.4
Unknown employment status                       730          189       25.9

Institution control
Public                                       23,280        4,019       17.3
Private not-for-profit                       12,349        1,924       15.6

Institution type
Public doctoral                               9,827          951        9.7
Public master's                               3,485          455       13.1
Public bachelor's                               693          134       19.3
Public associate's                            9,129        2,465       27.0
Public other                                    146           14        9.6
Private not-for-profit doctoral               4,652          733       15.8
Private not-for-profit master's               3,020          473       15.7
Private not-for-profit bachelor's             3,218          462       14.4
Private not-for-profit associate's              242           27       11.2
Private not-for-profit other                  1,217          229       18.8
1 Percentages are based on the number of sample members within the row under consideration.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The results of faculty and instructional staff locating and survey completion, by faculty status and institution type, are shown in table 19. Ninety percent of full-time faculty members were located, compared with 75 percent of part-time faculty (χ2 = 1414.6, p < .0001). Eighty-one percent of full-time faculty completed the survey, compared with 69 percent of part-time faculty (χ2 = 903.8, p < .0001). When examined by institution type, locate rates ranged from 80 to 88 percent. Survey completion rates ranged from 71 percent for faculty at private not-for-profit doctorate-granting institutions to 81 percent at private not-for-profit baccalaureate-granting institutions.


Table 19. Faculty locating and survey completion results, by employment status and institution type: 2004

                                                             Located                               Completed survey
Employment status and institution type   Total sample   Number1   Percent2   Number eligible1   Number1   Percent3
Total                                          35,629    29,820       83.7             34,330    26,110       76.1

Employment status
Full-time                                      21,891    19,580       89.5             21,390    17,250       80.6
Part-time                                      13,008     9,740       74.9             12,270     8,430       68.7
Unknown employment status                         730       500       68.8                660       420       64.3

Institution control
Public                                         23,280    19,520       83.9             22,450    17,120       76.2
Private not-for-profit                         12,349    10,300       83.4             11,880     8,990       75.7

Institution type
Public doctoral                                 9,827     8,600       87.5              9,500     7,460       78.6
Public master's                                 3,485     2,950       84.5              3,350     2,620       78.1
Public bachelor's                                 693       560       81.7                680       510       75.4
Public associate's                              9,129     7,280       79.8              8,770     6,420       73.1
Public other                                      146       130       85.6                140       110       73.6
Private not-for-profit doctoral                 4,652     3,770       81.1              4,470     3,160       70.7
Private not-for-profit master's                 3,020     2,540       84.0              2,890     2,270       78.6
Private not-for-profit bachelor's               3,218     2,800       87.0              3,120     2,520       80.8
Private not-for-profit associate's                242       200       84.3                240       190       79.8
Private not-for-profit other                    1,217       990       80.9              1,160       850       73.1
1 Numbers rounded to the nearest 10.
2 Percentages are based on the number of sample members within the row under consideration. Percentages are based on original unrounded numbers.
3 Percentages are based on the number of eligible sample members within the row under consideration. Percentages are based on original unrounded numbers.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The results of faculty and instructional staff survey completion by mode of data collection are presented in table 20. A total of 19,780 respondents (76 percent) completed the self-administered questionnaire and 6,330 respondents (24 percent) completed the CATI interview. (It should be noted that 59.2 percent completed the survey during the early phase, without telephone follow-up.) While NSOPF:04 exceeded the goal of having 50 percent of completed surveys come in by web, a substantial portion of these web surveys was completed only after the respondent had been called by a CATI interviewer.


Table 20. Response rates and mode of completion, by employment status and institution type: 2004

                                                              Total complete         Self-administered              CATI
Employment status and institution type   Number eligible1   Number1   Percent2     Number1   Percent3    Number1   Percent3
Total                                              34,330    26,110       76.1      19,780       75.8      6,330       24.3

Employment status
Full-time                                          21,390    17,250       80.6      13,980       81.0      3,280       19.0
Part-time                                          12,270     8,430       68.7       5,500       65.2      2,940       34.8
Unknown employment status                             660       420       64.3         300       71.9        120       28.1

Institution control
Public                                             22,450    17,120       76.2      12,850       75.1      4,270       24.9
Private not-for-profit                             11,880     8,990       75.7       6,930       77.1      2,060       22.9

Institution type
Public doctoral                                     9,500     7,460       78.6       6,090       81.6      1,370       18.4
Public master's                                     3,350     2,620       78.1       1,980       75.6        640       24.4
Public bachelor's                                     680       510       75.4         360       69.6        160       30.4
Public associate's                                  8,770     6,420       73.1       4,350       67.7      2,070       32.3
Public other                                          140       110       73.6          70       68.9         30       31.1
Private not-for-profit doctoral                     4,470     3,160       70.7       2,520       79.9        640       20.2
Private not-for-profit master's                     2,890     2,270       78.6       1,700       74.8        570       25.2
Private not-for-profit bachelor's                   3,120     2,520       80.8       1,950       77.4        570       22.6
Private not-for-profit associate's                    240       190       79.8         160       83.7         30       16.3
Private not-for-profit other                        1,160       850       73.1         600       70.6        250       29.5
1 Numbers rounded to the nearest 10.
2 Percentages are based on the number of eligible sample members within the row under consideration. Percentages are based on original unrounded numbers.
3 Percentages are based on the number of completed interviews within the row under consideration. Percentages are based on original unrounded numbers.
NOTE: All percents are unweighted. Reporting excludes 1,300 cases determined to be ineligible for study. CATI = computer assisted telephone interview. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Eighty-one percent of full-time faculty completed the self-administered survey, compared to 65 percent of part-time faculty (χ2 = 776.6, p < .0001). Seventy-seven percent of faculty and instructional staff at private not-for-profit institutions completed the self-administered survey, compared to 75 percent of faculty at public institutions (χ2 = 13.6, p < .0002). Self-administered web survey completion rates by institution type ranged from 68 percent for public associate’s degree-granting schools to 84 percent for private not-for-profit associate’s degree-granting schools. The cumulative response rate, overall and by mode, is shown in figure 6.


Figure 6. Cumulative response rates, by mode of completion: 2004

[Line graph: cumulative response rate (percent, 0 to 100) by month of data collection (January through October 2004), with separate lines for Overall, Web, and CATI completions.]

NOTE: Mode of completion for respondents who switched modes was determined by the mode at the time of survey completion. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

3.2.3 E-mail Contacting Efforts

E-mail addresses of faculty and instructional staff were requested in the faculty lists. Where e-mail addresses were not provided by the institution, efforts were made through an advance search of the institution's online directory for e-mail addresses of sample members as well as other database searches. In addition, some sample members provided e-mail addresses when contacted by a telephone interviewer. E-mail addresses were available for 27,980 (82 percent) of the 34,330 eligible sample members.

Periodically throughout the data collection period, e-mail messages were sent to nonrespondents to encourage their participation (see appendix F). Sample members who received e-mails were more likely to complete the survey (78 percent) than sample members to whom no e-mail reminders were sent (53 percent; χ2 = 1867.2, p < .0001). Respondents with e-mail addresses were also more likely to complete the self-administered web questionnaire (79 percent) than were respondents who were not sent e-mail reminders (55 percent; χ2 = 976.1, p < .0001).

3.2.4 Refusal Conversion Efforts

Refusal conversion measures were used to gain cooperation from individuals who refused to participate when contacted by telephone interviewers. Refusals came not only from sample members, but also occasionally from other household members or other contacts (such as secretaries).24 Whenever a refusal was encountered, unless it was deemed hostile, the case was referred to a specialist trained in refusal conversion techniques. Refusal conversion specialists were chosen based on their performance as interviewers, with those who were the most skilled in obtaining cooperation given additional training in converting refusals. This training was tailored to the concerns of faculty members and gatekeepers regarding participation, and focused on gaining cooperation and encouraging participation.

Ten percent of contacted cases refused to participate at some point during data collection. However, 18 percent of these cases were successfully converted and eventually completed the survey. Fifty-nine percent of the converted cases completed the web self-administered questionnaire, and 41 percent completed a telephone interview. An abbreviated instrument, consisting of sections A (nature of employment), B (academic/professional background), and G (sociodemographic characteristics) from the faculty instrument, was developed to convert nonrespondents by offering a shorter (10 minute) interview. The abbreviated instrument, used only in the final 3 weeks of data collection, yielded 1,610 interviews.

3.2.5 Incentives

For the NSOPF:04 full-scale data collection, three types of incentives were offered to eligible sample members. In accordance with findings from the NSOPF:04 field test incentive experiment25 (Heuer et al. 2004), incentives were offered during two phases of data collection: an initial early-response incentive period and a nonresponse incentive period. In addition to those periods, refusal incentives were made available following initial refusals. During each incentive phase, respondents were offered the choice of a $30 check or a $30 gift certificate to Amazon.com.

The initial early-response incentive was offered to all sample members for completion of the questionnaire within the first 4 weeks of data collection. The early-response incentive was designed to increase the response rate during the initial phase of data collection, promote a higher rate of web self-administered responses, and reduce costs associated with telephone interviewing. Following the initial 4-week period, CATI telephone prompting began. During this second phase of the study, no incentive was offered to respondents for completing the interview. All nonrespondents from the first phase were contacted by telephone and asked to complete the survey, either on the phone or via the Web at their convenience.
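
Taken together with the refusal and nonresponse incentives described below, the phased design can be summarized in a small decision rule. This is a hypothetical sketch of the logic as described in this section, with made-up flag names rather than the project's actual case-management fields.

    # Hypothetical sketch of which incentive applied to a case at a given point in data
    # collection (phases and amounts as described in the text; flag names are invented).
    def incentive_offer(weeks_elapsed, refused, weeks_of_cati_prompting):
        if weeks_elapsed <= 4:
            return "early-response incentive ($30 check or gift certificate)"
        if refused:
            return "refusal incentive ($30, announced in the refusal conversion letter)"
        if weeks_of_cati_prompting >= 8:
            return "nonresponse incentive ($30)"
        return "no incentive (CATI prompting phase)"

    print(incentive_offer(3, False, 0))    # early-response incentive
    print(incentive_offer(10, True, 4))    # refusal incentive
    print(incentive_offer(14, False, 9))   # nonresponse incentive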

Any sample member who refused to participate in the study was flagged for the refusal incentive. A refusal conversion letter was sent out to explain the study, to request that the sample member reconsider the decision not to participate, and to announce the reinstitution of the $30 incentive for participating.

24 Nearly 77 percent of all refusals were made by sample members, while the remaining 23 percent were made by other household members or other contacts. Of the sample members who initially refused, 17 percent eventually completed an interview.

25 The field test experimental design consisted of three randomly assigned early-response incentive groups who were offered $0, $20, or $30 to complete the self-administered questionnaire over the Internet within 3 weeks of the initial mailing, and two nonresponse incentive groups of $0 and $30 for those who had not completed the survey by a certain date during data collection. The early-response incentive yielded 31 and 34 percent response rates for the $20 and $30 incentives, respectively, compared with a 16 percent response rate for the control group. The nonresponse incentive yielded a 47 percent response rate for those offered $30 and a 34 percent response rate for the control group. The differences between the treatment and the control groups were statistically significant for both phases of the experiment; however, the apparent difference in amounts ($20 versus $30) for the early-response incentive period, while in the expected direction, was not statistically significant.


Nonresponse incentives were introduced after 8 weeks of CATI prompting of all nonrespondents who had not already been offered the refusal incentive. Letters and e-mail prompts were sent periodically to nonrespondents throughout the data collection period. All correspondence mentioned the incentive when it was available to sample members. Table 21 provides a breakdown of the types of incentives offered and the results of each incentive period.

Table 21. Faculty response rates, by incentive period: 2004

                                      Number1                                      Percent2
Incentive offered            Eligible   Responded          Eligible3   Responded4   Response rate5
Total                          34,330      26,110              76.1        100.0            76.1
Early-response incentive       34,330      15,010              43.7         57.5            43.7
Period of no incentive         19,320       5,250              27.2         20.1            15.3
Refusal incentive               2,410         570              23.6          2.2             1.7
Nonresponse incentive          11,660       5,280              45.3         20.2            15.4
1 Numbers rounded to the nearest 10.
2 Percentages are based on original unrounded numbers.
3 Percentages are based on the number of eligible sample members within the row under consideration.
4 Percentages are based on the total number of respondents.
5 Percentages are based on the total number of eligible sample members.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

These results indicate that the combination of early-response incentive and other later incentives was required to reach the targeted response rate within the data collection schedule. While the early-response incentive was effective in getting 44 percent of the eligible sample members to complete the survey within the initial 4 weeks of the study, and an additional 15 percent of the sample members completed within 8 weeks after the initial incentive period, the cumulative response rate after 12 weeks was only 59 percent. The refusal and nonresponse incentives were undoubtedly helpful in attaining the additional 17 percent needed to reach the 76 percent response rate.
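
As a quick check, the cumulative figures in this paragraph follow directly from the response-rate column of table 21 (rounded percentages as reported there):

    # Cumulative response rate by incentive phase, using the rounded percentages in table 21.
    early     = 43.7   # completed during the 4-week early-response period
    no_incent = 15.3   # completed during the no-incentive CATI prompting phase
    print(round(early + no_incent, 1))  # 59.0, the cumulative rate after about 12 weeks
    refusal   = 1.7
    nonresp   = 15.4
    print(round(early + no_incent + refusal + nonresp, 1))  # 76.1, the final response rate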

3.3 Burden and Effort

3.3.1 Faculty Survey Completion Timing

Like the institution timing analysis, the faculty timing analysis was conducted by embedding time stamps in the programming code for each form (screen) in the survey. From these time stamps, the number of seconds spent on each screen (on-screen time) and the transit time between screens (i.e., the time required to transmit data to the server, the time for the server to store the data and assemble the next page, and the time for the page to be transmitted and loaded on the computer) were calculated. A cumulative on-screen time and a cumulative transit time for the faculty survey also were calculated from the time stamps. The sum of the cumulative on-screen and transit times was the total instrument time (i.e., the number of minutes it took to administer the faculty questionnaire).

Following the 1999 cycle of NSOPF—which averaged over 50 minutes—the faculty questionnaire was shortened substantially, with a goal of achieving a 30-minute survey. The NSOPF:04 field test averaged 42 minutes. Based on the time stamps for each form in the full-scale instrument, the time to complete the entire survey ranged from 8 minutes to 3 hours and 6 minutes,26 with an average time of 30 minutes. Of these 30 minutes, approximately 26 minutes, on average, were spent answering questions (on-screen time) and 3 minutes, on average, were spent saving data and loading forms (transit time).

Table 22 presents the overall timing data by mode for completed surveys (excluding partial and abbreviated interviews). Average on-screen time was significantly longer for CATI respondents than for web respondents (27 and 26 minutes, respectively; t = –4.46, p < .0001), while the average transit time was significantly shorter for CATI respondents than for web respondents (1 and 4 minutes, respectively; t = 34.94, p < .0001). Presumably, the longer on-screen time for CATI respondents is due to the time it takes to read text out loud, and to the fact that the respondent may ask questions. The shorter transit time for CATI is likely due to the use of a high-speed internet connection by interviewers. Some web respondents may have used slower dial-up connections, which increase transit time. Overall, the interview took less time for CATI respondents than for web respondents (29 minutes and 30 minutes, respectively; t = 7.80, p < .0001).

Table 22. Average on-screen, transit, and total survey completion time, in minutes, for the faculty questionnaire, by mode: 2004

                               All respondents                      Web respondents                     CATI respondents
Portion of interview   Average time   Number of cases1      Average time   Number of cases1      Average time   Number of cases1
Total                          29.7             24,360              30.1             18,630              28.5              5,730
Onscreen                       26.5             24,360              26.3             18,630              27.1              5,730
Transit                         3.2             24,360               3.8             18,630               1.4              5,730
1 Numbers rounded to the nearest 10. Abbreviated and partial interviews excluded.
NOTE: Detail may not sum to totals because of rounding. CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The onscreen, transit, and total times were significantly shorter for surveys that were completed during business hours (Monday through Friday, 9:00 a.m. to 6:00 p.m.) compared to those completed during evening and weekend hours (onscreen: 26 and 27 minutes, respectively; t = 4.79, p < .0001; transit: 3 and 4 minutes, respectively; t = 17.71, p < .0001; total: 29 and 31 minutes, respectively; t = 10.29, p < .0001), as shown in table 23. This may be due to faster internet connections for web respondents at their offices compared to their homes or time pressures during the workday.

26 Excludes the two highest outliers, both with transit times greater than 4 hours.


Table 23. Average on-screen, transit, and total completion time, in minutes, by time of day and mode: 2004

                                            Web respondents                                         CATI respondents
                            Weekdays 9am–6pm          Evenings/weekends            Weekdays 9am–6pm          Evenings/weekends
Portion of interview   Average time   Number of cases1   Average time   Number of cases1   Average time   Number of cases1   Average time   Number of cases1
Total                          29.2             11,620           31.4              7,010           28.5              3,710           28.6              2,020
Onscreen                       26.0             11,620           26.9              7,010           27.0              3,710           27.2              2,020
Transit                         3.3             11,620            4.6              7,010            1.5              3,710            1.4              2,020
1 Numbers rounded to the nearest 10.
NOTE: Abbreviated and partial interviews excluded, as well as two outliers. Detail may not sum to totals because of rounding. CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 24 provides the average and maximum times (in seconds) to complete each form in the faculty instrument. Seven forms (screens) in the faculty survey took more than 1 minute to administer, on average. These tended to be the more complicated forms and those that collected multiple pieces of information on a single screen. These forms are described in greater detail below.

Table 24. Average and maximum completion time, in seconds, for forms in the faculty instrument: 2004

                                                                                          Time in seconds
Questionnaire form   Description                                                         Average   Maximum   Number of cases1
Q1        Instructional duties, any                                                           19       102            26,110
Q2        Instructional duties related to credit courses/activities                           12        64            24,310
Q3        Faculty status                                                                      10        97            26,110
Q4        Principal activity                                                                  16        87            26,110
Q5        Employed full or part time at this institution                                       7        56            26,110
Q6        Part-time employment is primary employment                                           8        46             8,370
Q8        Part-time but preferred full-time position                                           8        63             8,340
Q9        Year began current job                                                              21       131            26,110
Q10       Rank                                                                                14        85            26,110
Q11       Rank, year attained professor or associate professor                                27       194             9,500
Q12       Tenure status                                                                       12       100            26,110
Q13       Tenure, year attained at any postsecondary institution                              19       160             8,440
Q14       Union status                                                                         9        74            26,110
Q15       Union status, reason not a member                                                   12        82            20,850
Q16VS     Principal field of teaching-verbatim                                                24       134            26,110
Q16AC     Principal field of teaching-autocode                                                30       143            23,590
Q16CD     Principal field of teaching-manual code                                             54       230             7,480
Q17A1     Highest degree                                                                      18       131            26,110
Q17A1B    Hold PhD and professional degree                                                     5        27             2,120
Q17A2     Highest degree date awarded                                                         12        94            25,870
Q17A3VS   Highest degree field-verbatim                                                       16        96            25,860
Q17A3AC   Highest degree field-autocode                                                       16       104            24,710
Q17A3CD   Highest degree field-manual code                                                    30       189             6,570
Q17A4     Highest degree institution-code                                                     51       245            25,850
Q17A4A    Highest degree institution-info for later coding                                    27        87             1,270
Q17D      Bachelor's degree date awarded                                                      13        96            23,460
Q18       Other current jobs, number of jobs                                                  15        82            26,110
Q19A      Other current jobs, full-time employment                                             7        66             8,290
Q19B      Other current jobs, number in postsecondary instruction                             10        75             8,130
Q21       First postsecondary job, current job is first                                       17        86            26,110
Q23       First postsecondary job, year began                                                 18        99            14,310
Q24       First postsecondary job, part or full time                                           8        55            26,110
Q26       First postsecondary job, tenure status                                              11        68            14,780
Q27       Other jobs, any outside postsecondary since degree                                  11        75            26,110
Q28       Other jobs, sector of previous job                                                  24       123            26,110
Q31       Hours worked per week                                                               94       338            24,580
Q32       Percent distribution of work activities                                             78       311            24,330
Q35A      Number of classes taught, credit and noncredit                                      44       205            23,600
Q35B      Number of classes taught, remedial and distance education                           22       127            21,240
Q36       Teaching assistant in any credit class                                               9        66            20,230
Q37       Number and types of classes taught (up to five classes)                             99       402            20,220
Q38       Tools instructor used to evaluate undergraduate students                            68       234            16,430
Q39       Website for any instructional duties                                                14       140            23,020
Q41       Hours per week, e-mailing students                                                  16        87            23,020
Q46       Individual instruction, any                                                         15        94            24,550
Q47       Individual instruction, number of students                                          20       124             8,230
Q47B      Individual instruction, number of hours                                             22       131             7,880
Q48       Hours per week, committees/advisees/office hours                                    61       251            24,530
Q52A      Career publications/presentations                                                  100       427            24,490
Q52B      Recent publications/presentations                                                   47       275            21,190
Q53       Scholarly activity, any                                                             13        93            24,470
Q54VS     Scholarly activity, principal field-verbatim                                        25       203               540
Q54AC     Scholarly activity, principal field-autocode                                        14        53               410
Q54CD     Principal research field-manual code                                                29       139               270
Q56       Scholarly activity, description                                                     17        86            14,000
Q55       Scholarly activity, any funded                                                      12        76            13,930
Q61       Satisfaction, authority/resources/salary/benefits                                   60       212            24,450
Q65       Retirement plans/history                                                            18        91            24,440
Q66       Income, from institution/other sources                                             111       403            24,420
Q66B      Amount of total individual income (range)                                           25       194             2,730
Q67       Type of contract, length of unit                                                    18       190            24,410
Q68       Income paid per course/credit unit or term                                          11        75             6,260
Q69       Amount of income paid per course/credit unit or term                                20       184             5,170
Q70A      Amount of total household income                                                    33       171            24,400
Q70B      Amount of total household income (range)                                            13        80             3,400
Q71       Gender                                                                               9       152            25,990
Q72       Age, year of birth                                                                   6        47            25,990
Q73       Race/ethnicity, Hispanic/Latino                                                      4        45            25,980
Q74       Race                                                                                 9        65            25,980
Q75       Disability, any                                                                     10        70            25,980
Q77       Marital status, fall 2003                                                            7        49            25,980
Q79       Dependent children, number                                                           8        48            25,980
Q80       United States birth/citizenship status                                               7        55            25,970
Q82       Opinion, institution fairness                                                       32       125            24,360
Q83       Opinion about choosing an academic career again                                      8        57            24,360
1 Numbers rounded to the nearest 10.
NOTE: The number of cases per form varies due to the interview skip logic. Outliers for each form were topcoded (mean + 3 standard deviations).
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Q31 and Q32. The questions that asked for the number of hours per week spent on work activities, Q31 (by paid and unpaid activities at the target institution and outside that institution), and the percent distribution of work activities, Q32, took 94 and 78 seconds, respectively, to administer. Each of these forms took longer when administered by telephone interviewers than when self-administered via the web instrument. Q31 averaged 91 seconds for web respondents compared with 103 seconds for CATI respondents (t = –13.64, p < .0001). Web respondents averaged 76 seconds on Q32 compared with an average time of 83 seconds for CATI respondents (t = –7.59, p < .0001). The complexity of these questions may have led to the longer times for CATI administration, as respondents often asked interviewers to repeat the question and examples, and asked questions about the appropriate category for certain types of activities.

Q37 and Q38. Two consecutive forms, Q37 and Q38, asked for a great deal of information on a single screen. Q37 was a matrix-style question that asked six questions about each of the credit classes (up to five) the respondent taught. This form took 99 seconds, on average, to administer, with CATI respondents taking significantly less time than web respondents (94 and 100 seconds, respectively, t = 4.26, p < .0001). The matrix of items on Q37, visually different from the rest of the forms in the questionnaire, likely took web respondents extra time to make sense of and answer.

Q38 asked respondents to identify which of 10 different types of student evaluation tools were used in their classes and whether they were used in all, some, or none of the classes. This form took an average of 68 seconds to administer, with CATI respondents taking significantly longer than web respondents (93 and 60 seconds, respectively, t = –49.69, p < .0001).

Q48. This form asked for the number of hours per week the respondent spent on four activities (thesis/dissertation committees, administrative committees, with advisees, and office hours). On average, respondents took 61 seconds to complete this form, with CATI respondents taking significantly longer than web respondents (72 and 57 seconds, respectively, t = –24.37, p < .0001).

Q52A. Q52A, which asked for the number of career publications or presentations in seven categories, took an average of 100 seconds to complete. This may have required respondents to locate their curricula vitae and count the number of publications. CATI respondents spent significantly more time on this item than web respondents (106 and 98 seconds, respectively, t = –5.88, p < .0001).


Q66. The form asking about respondents’ compensation from the target institution and from other sources, Q66, took 111 seconds to complete, on average. This form consisted of six income questions, which were considered to be among the most sensitive items in the questionnaire. Average time to complete this form was shorter for web respondents (109 seconds) than for CATI respondents (118 seconds; t = –7.60, p < .0001).

3.3.2 Help Desk

To gain a better understanding of the problems encountered by faculty members attempting to complete the web self-administered questionnaire, software was developed to record each help desk incident that occurred during data collection. For each occurrence, help desk staff confirmed contact information for the sample member, recorded the type of problem, described the problem and resolution, noted its status (pending or resolved), and recorded the approximate time it took to assist the faculty member. Help desk staff were trained not only to answer any calls received from the help desk hotline, but also to conduct telephone interviews when needed. Help desk staff members assisted sample members with questions about the web instrument and provided technical assistance to sample members who experienced problems while completing the self-administered web survey. Help desk agents also responded to voice mail messages left by respondents when the call center was closed.
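
A minimal sketch of the kind of incident record described above follows; the field names are illustrative and not the project's actual schema.

    # Illustrative structure for a help desk incident record as described above.
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class HelpDeskIncident:
        case_id: str
        contact_info_confirmed: bool
        problem_type: str            # e.g., "browser settings", "lost ID/password"
        description: str
        resolution: str
        status: str                  # "pending" or "resolved"
        minutes_spent: float
        logged_at: datetime = field(default_factory=datetime.now)

    incident = HelpDeskIncident(
        case_id="F123456",
        contact_info_confirmed=True,
        problem_type="lost ID/password",
        description="Respondent could not locate the study ID from the lead letter.",
        resolution="Verified identity and re-sent credentials by e-mail.",
        status="resolved",
        minutes_spent=6.5,
    )
    print(incident.status)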

Help desk staff assisted 3,860 faculty members (11 percent of the sample). Eighty-one percent of these cases called the help desk only once, 12 percent called twice, 4 percent called three times, and 3 percent called four or more times. Of the 3,860 faculty members who called the help desk, 2,940 (76 percent) eventually completed the survey.

Twenty-nine percent of the problems reported by faculty members who called the help desk were for miscellaneous issues. The miscellaneous issues were first coded into specific issues and then these issues were coded into five broader categories as shown in table 25. First time calls included setting an appointment for the CATI interview, providing a new phone number or e-mail address, promising to complete by phone at a later date, or promising to complete the survey on the Web. Nearly 7 percent of help desk contacts were faculty members calling in to refuse. Follow-up calls to the help desk (6 percent) included faculty members checking on the incentive, or verifying that they had completed the survey. Other miscellaneous issues were less than 2 percent of all contacts. Slightly more than 1 percent of help desk calls reported that the faculty member was not at the phone number, e-mail address, or college that was contacted.

Other specific issues handled by the help desk included requests to complete the survey by telephone (21 percent), questions about the study (19 percent), browser setting and computer problems (14 percent), requests for study ID and/or password (12 percent), errors in questionnaire programming (3 percent), questions about questionnaire content (2 percent), website being down or unavailable (1 percent), and routing/skip problems (less than 1 percent).


Table 25. Response pattern, by help desk problem type: 2004

Type of problem                                                      Number   Percent
Total                                                                 5,151     100.0

Miscellaneous                                                         1,491      29.0
   First time calls (set call back date/time, etc.)                     698      13.6
   SM called in to refuse                                               352       6.8
   Follow-up calls (checking on incentive, verifying complete)          284       5.5
   Other                                                                 94       1.8
   SM not at this number/college                                         63       1.2

Called in to complete by phone                                        1,078      20.9
Question about study                                                    964      18.7
Browser settings/computer problems                                      694      13.5
Study identification (ID) code/password                                 626      12.2
Program error                                                           130       2.5
Questionnaire content                                                   114       2.2
Website unavailable                                                      47        .9
Routing/skip problems                                                     7        .1
NOTE: Detail may not sum to totals because of rounding. SM = sample member.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

3.3.3 Interviewer Hours

A total of 17,639 telephone interviewing staff hours (including help desk staffing, telephone follow-up calls, and CATI interview hours) were expended during faculty data collection. These hours do not include supervision, monitoring, administration, and Quality Circle meetings. The average time spent per completed CATI interview was 2.7 hours and per completed interview overall (including web completes) was 0.7 hours. The average time to administer the CATI was 29 minutes, which shows that a majority of interviewer time was spent on other activities. These other activities focused on contacting and locating the sample member, with a small portion of time devoted to bringing up a case, reviewing its history, and closing the case (with the appropriate reschedule, comment, and disposition). A significant proportion of the web completes occurred after the period of telephone follow-up began and were completed only after several CATI follow-up calls had been made to the respondent.

3.3.4 Number of Calls

Telephone interviewers made 226,777 call attempts to faculty members during the NSOPF:04 data collection period (see table 26). The number of calls per case ranged from 0 to 152. On average, six calls27 were made to each sample member. Those who were not interviewed received the highest average number of calls. An average of four call attempts was required for respondents, compared to an average of 13 call attempts for nonrespondents (t = 60.9, p < .0001). Faculty members who completed the web self-administered questionnaire were called significantly fewer times, with an average of three call attempts per completed survey, compared to an average of eight calls to CATI respondents (t = 41.5, p < .0001).

27 This figure includes cases where no call attempts were made, either because the respondent completed the questionnaire via the Web before CATI calling began, or the individual could not be located.


Table 26. Total and average number of calls, by completion status and mode of completion: 2004

Completion status/mode    Number of cases1   Number of calls   Average calls per case
Total                               35,630           226,777                      6.4
Interviewed                         26,110           102,946                      3.9
Not interviewed                      9,520           123,831                     13.0
By mode                             26,110           102,946                      3.9
   Web complete                     19,780            53,621                      2.7
   CATI complete                     6,330            49,325                      7.8
1 Number of respondents rounded to nearest 10.
NOTE: Detail may not sum to totals because of rounding. CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Call screening is a growing problem for studies that rely on the telephone as a mode of contact. Devices such as telephone answering machines can be used to screen unwanted calls. Of the 19,394 cases called by telephone interviewers, 15,183 cases (78 percent) reached an answering machine at least once (see table 27). Interviewers made significantly more calls to cases where an answering machine had been reached at least once (mean attempts = 13) than they did to cases where no answering machine was reached (mean attempts = 5; t = –46.81, p < .0001). Likewise, cases where an answering machine had been reached at least once were less likely to have completed the interview (54 percent) than cases where no answering machine was reached (63 percent; χ2 = 92.4, p < .0001).

Table 27. Average call attempts, by reached answering machine: 2004

                                                      Cases called in CATI                            Completed cases
Result of call attempt                    Number of cases   Average number of calls      Number of cases1   Average number of calls
Reached answering machine at least once            15,183                      13.4                 8,230                      11.2
Never reached an answering machine                   4,211                       5.4                 2,630                       4.0
1 Numbers rounded to the nearest 10.
NOTE: Excludes 16,240 completed cases that were never called by telephone interviewers because they completed the self-administered questionnaire during or soon after the early-response period of data collection. Some of the cases called by telephone interviewers actually completed the web self-administered questionnaire.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Looking only at completed cases, significantly fewer calls were required to obtain a completed interview when no answering machine was reached (mean attempts = 4) compared to cases in which an answering machine was reached at least once (mean attempts = 11, t = -40.69, p < .0001). Those who possessed answering machines were included in the survey definition of “accessible”; however, it took considerable persistence and resources (in the form of repeated call attempts) to reach these faculty members. This finding demonstrates that answering machines and other call screening devices are increasing the effort that must be expended to reach these cases, thereby driving up interviewing costs.

In addition, cases where an answering machine was reached on more than one-half of the call attempts required significantly more effort to contact and interview. The mean number of attempts for cases that reached an answering machine less than one-half of the time was 10 compared to 13 (t = -15.3, p < .0001) for cases that reached an answering machine more than one-half of the time. Similarly, among completed cases, significantly fewer calls were needed to complete an interview with cases where an answering machine was reached less than one-half of the time (mean attempts = 8) compared with those where an answering machine was reached more than one-half of the time (mean attempts = 11; t = -12.9, p < .0001).

3.4 Conclusions

Of the 1,070 eligible institutions, 980 (91 percent, unweighted and weighted) provided faculty lists and 920 (86 percent) completed the institution questionnaire. A total of 26,110 faculty and instructional staff completed the faculty survey for a 76 percent response rate. Approximately three-quarters (76 percent) of respondents completed the web self-administered questionnaire rather than the CATI (24 percent). Strategies that helped attain this response rate included tracing, e-mail contacting, and refusal conversion efforts, along with targeted incentives.


Chapter 4. Evaluation of Data Quality

Evaluations of data quality serve to identify problems with the data collection processes and instruments in order to remedy them for the next cycle of the study. Project staff evaluated faculty list quality, item nonresponse, item mode effects, breakoffs, coding, quality control monitoring of interviewers, and interviewer feedback. The results of these evaluations are presented in this chapter.

4.1 List Quality

4.1.1 List Types

Faculty lists may be characterized both by type of media—whether they are electronic or hardcopy—and by method of transmission (e.g., fax or mail, e-mail, electronic upload). For the 2004 National Study of Postsecondary Faculty (NSOPF:04), institutions were asked to provide a single, unduplicated (i.e., with duplicate entries of names removed) electronic list of faculty in any commonly used and easily processed format (e.g., ASCII fixed field, comma delimited, spreadsheet). These preferred electronic file formats are far less labor intensive to process than paper lists and are more easily unduplicated by ID number. However, as in previous cycles, paper lists were accepted, as were multiple files (e.g., separate files of full- and part-time faculty) and lists in electronic formats that did not lend themselves to electronic processing (such as word processing formats).

For the first time, institutions were given the option to transmit their electronic faculty lists via a secure upload to the National Study of Faculty and Students (NSoFaS:04) website and were encouraged to do so. (In previous cycles, direct electronic transfer was available only via file transfer protocol [FTP], an option that few institutions utilized.) Institutions were also given the option of sending a CD-ROM, diskette, or paper list containing the list data, or of sending the list via e-mail (as an encrypted file, if necessary).

As shown in table 28, the vast majority of lists received were in electronic formats. Of the 980 participating institutions, 830 (85 percent) supplied an electronic list by upload, e-mail, CD-ROM, or diskette. Institutions showed a clear preference for direct upload: 590 institutions (60 percent of all lists and 71 percent of electronic lists) delivered their data in this manner.

NSOPF:04 clearly benefited from the increased capability and willingness of institutions to supply lists in electronic formats, compared to previous cycles. As table 29 shows, 65 percent of institutions supplied an electronic list for NSOPF:99, with a majority of them in CD-ROM or diskette formats sent by mail.


Table 28. Number of submitted faculty lists, by type of institution and transmittal mode: 2004

                                                          Number of institutions providing lists via six transmittal modes
                                      Number of sample                       Electronic                      Abstracted from
Institution type                      institutions1       Total    Upload    & paper    Diskette    Paper    web directory    E-mail

Total                                 1,080                980      590       #          40          #        140              200

Public doctoral                         190                180      120       #           #          #         30               30
Public master's                         120                100       60       #          10          #         10               30
Public bachelor's                        30                 30       20       #           #          #         10                #
Public associate's                      340                290      170       #          30          #         30               60
Public other                             10                 10       10       #           #          #          #                #
Private not-for-profit doctoral         110                100       70       #           #          #         20               20
Private not-for-profit master's          80                 80       50       #           #          #         10               10
Private not-for-profit bachelor's       130                120       70       #           #          #         20               30
Private not-for-profit associate's       10                 10       10       #           #          #          #                #
Private not-for-profit other             60                 60       30       #           #          #         20               10

# Rounds to zero.
1 Number of institutions rounded to nearest 10.
NOTE: Detail may not sum to totals because of rounding and because of duplicative forms of list transmittal modes.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 29. Faculty list types, by NSOPF cycle: 1999 and 2004

                                          NSOPF:99                         NSOPF:04
                                Number of        Unweighted      Number of        Unweighted
Type of list                    institutions     percent1        institutions     percent1

Total                           820              100.0           980              100.0

Paper                           290               35.0            10                0.6
Electronic (FTP or upload)2      10                1.1           590               60.4
Electronic (e-mail)             220               26.6           200               20.7
CD-ROM or diskette              310               37.2            40                4.0
Abstracted from web resource3     —                  —           140               14.2

— Not available.
1 Percentages are based on original unrounded numbers.
2 FTP was utilized only in 1999; upload was utilized only in 2004.
3 In 1999, lists abstracted from web resources were processed as, and included with, paper lists.
NOTE: Numbers rounded to the nearest 10. Detail may not sum to totals because of rounding. FTP = file transfer protocol.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

For institutions that indicated they lacked the staff or resources to compile a list of faculty on their own within schedule constraints, it was sometimes possible to abstract a list from employee directories, course schedules, or course catalog listings available on the institution’s website or through other web resources. As in past cycles (where course catalogs or directories were used as lists of last resort), all such lists were reviewed to ensure they were sufficiently complete for sampling (i.e., included both full- and part-time faculty and did not systematically exclude any subset of faculty and instructional staff). It should be noted that in past cycles, course catalogs and directories comprised a large percentage of lists supplied on (or processed as) paper. While the web listings utilized for NSOPF:04 required more processing than electronic lists (including reformatting into a spreadsheet or re-keying), they proved, overall, to be far less problematic for processing and sampling than an equivalent paper list. Only 15 percent of institutions submitted paper lists or had lists abstracted from web resources for NSOPF:04; this compares to 35 percent of institutions that submitted paper lists (including lists abstracted from web resources) in NSOPF:99.

4.1.2 List Data Quality

As in prior administrations of this study, secured faculty lists were evaluated for accuracy and completeness of information before they were processed for sampling. To facilitate quality control, faculty list counts were compared against counts obtained from the following supplementary sources:

• the institution questionnaire, or the file layout form if a questionnaire was not completed but an overall faculty count was supplied;

• the 2001 Integrated Postsecondary Education Data System (IPEDS) Fall Staff Survey;

• the Contact Information and File Layout (CIFL) form, which included faculty counts and was used when questionnaire data were unavailable; and

• frame data from the 1999 survey (NSOPF:99).

Discrepancies between the faculty list and other sources in counts of full- and part-time faculty that fell outside the expected range were investigated. All institutions whose submitted lists failed any checks were recontacted to resolve the observed discrepancies.

Because of time and definitional differences between NSOPF and IPEDS, it was expected that the faculty counts obtained from the institutions and IPEDS would include discrepancies. Consequently, quality control checks against IPEDS were less stringent than those against the institution questionnaire. However, list count comparisons against IPEDS and NSOPF:99 data were useful in identifying systematic errors, particularly those related to miscoding of the employment status of faculty members. Table 30 shows the types of discrepancies encountered by type of institution.
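A minimal sketch of this type of quality control check is shown below, assuming a hypothetical working file with one row per institution; the column names and tolerance thresholds are illustrative and do not reflect the actual check specifications.

```python
# Illustrative sketch of a list-count discrepancy check; thresholds are hypothetical.
import pandas as pd

counts = pd.read_csv("institution_counts.csv")             # hypothetical QC extract

def pct_diff(list_count, other_count):
    """Percent difference of the faculty list count relative to another source."""
    return 100.0 * (list_count - other_count) / other_count

counts["diff_ipeds"] = pct_diff(counts["list_total"], counts["ipeds_total"])
counts["diff_quex"] = pct_diff(counts["list_total"], counts["quex_total"])

# Looser tolerance against IPEDS than against the institution questionnaire (QUEX).
flagged = counts[(counts["diff_ipeds"].abs() > 50) | (counts["diff_quex"].abs() > 25)]
print(f"{len(flagged)} institutions flagged for recontact")
```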


Table 30. Type of discrepancies encountered, by type of institution: 2004

                                                        Discrepant with                   Insufficient data
                                      Sampled
Institution type                      institutions      IPEDS     QUEX     Unreadable     Needed for sampling     CIFL

Total                                 1,080              300       280      10             180                     190

Public doctoral                         190               70        50       #              40                      30
Public master's                         120               30        30       #              10                      20
Public bachelor's                        30               10        10       #              10                      10
Public associate's                      340               80        90       #              30                      60
Public other                             10                #         #       #               #                       #
Private not-for-profit doctoral         110               40        30       #              20                      20
Private not-for-profit master's          80               20        20       #              20                      10
Private not-for-profit bachelor's       130               30        40       #              30                      20
Private not-for-profit associate's       10                #         #       #               #                       #
Private not-for-profit other             60               20        20       #              20                      10

# Rounds to zero.
NOTE: IPEDS is the National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System; QUEX refers to the institution questionnaire; CIFL refers to the contact information and file layout form. Numbers rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table 31 shows the percent differences between the three sources of data for all cycles of NSOPF (1988, 1993, 1999, and 2004). The discrepancies between the faculty lists and institution questionnaire counts have declined over time. Also, table 32 shows mean differences between sources of data across all cycles of NSOPF. More details regarding the quality of faculty lists secured for NSOPF:04 are provided in appendix H.
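The tabulations underlying tables 31 and 32 could be produced along the following lines. The sketch is illustrative, uses the same hypothetical per-institution file as above, and shows only the LIST-IPEDS comparison.

```python
# Illustrative sketch of binning percent differences (table 31) and testing mean
# differences with a paired t-test (table 32) for the LIST-IPEDS comparison.
import pandas as pd
from scipy import stats

counts = pd.read_csv("institution_counts.csv")             # hypothetical QC extract
pct = 100.0 * (counts["list_total"] - counts["ipeds_total"]) / counts["ipeds_total"]

bins = [-float("inf"), -50, -30, -10, 10, 30, 50, float("inf")]
labels = ["<-50", "-50 to -31", "-30 to -11", "-10 to 10", "11 to 30", "31 to 50", ">50"]
distribution = pd.cut(pct, bins=bins, labels=labels).value_counts(normalize=True)
print(distribution.mul(100).round(1).reindex(labels))

t_stat, p_value = stats.ttest_rel(counts["list_total"], counts["ipeds_total"])
mean_diff = (counts["list_total"] - counts["ipeds_total"]).mean()
print(f"mean difference = {mean_diff:.1f}, t = {t_stat:.2f}, p = {p_value:.4g}")
```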

Table 31. Percentage differences between sources of data across all cycles of NSOPF: 1988 to 2004

                                              Percent difference in faculty counts
                      Number of
Comparison     Year   institutions    <-50    -50 to -31    -30 to -11    -10 to 10    11 to 30    31 to 50    >50

LIST-IPEDS     1988   410              8.0        5.6          14.9          35.4         16.6         7.6     12.0
               1993   660              5.0        5.2          11.3          25.4         23.8        13.3     16.0
               1999   770              6.4        6.5          13.6          33.7         23.0         6.8      9.9
               2004   980              2.4        4.1          12.2          32.4         23.1         9.3     16.6

QUEX-LIST      1988   410              1.9        3.9          16.6          51.2         15.1         2.4      8.8
               1993   750              3.7        6.5          13.2          41.7         12.3         6.1     16.5
               1999   770              1.4        2.7           7.0          72.3          5.6         3.2      7.8
               2004   900              1.2        1.3           3.8          82.6          4.9         2.0      4.1

QUEX-IPEDS     1988   410              3.9        6.8          15.9          34.6         20.0         7.8     11.0
               1993   690              2.3        4.5           9.2          26.6         25.4        12.6     19.3
               1999   790              3.3        6.6          11.4          40.7         22.0         6.7      9.5
               2004   900              1.6        2.1           9.2          37.1         24.6         9.5     16.1

NOTE: LIST refers to the faculty list provided by sampled institutions; IPEDS is the National Center for Education Statistics’ Integrated Postsecondary Education Data System; QUEX refers to the institution questionnaire. Numbers of institutions are rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table 32. Mean differences between sources of data across all cycles of NSOPF: 1988 to 2004

                                          Mean difference            Mean percent difference in
                      Number of           (standard error1)          faculty counts (standard error1)
Comparison     Year   institutions

LIST-IPEDS     1988   410                   3.0   (17.3)               14.1*    (3.8)
               1993   660                  88.4*  (22.6)               24.8*    (3.1)
               1999   770                  24.8   (13.8)                9.8*    (2.1)
               2004   980                  57.5*  (13.1)               29.0*    (3.2)

QUEX-LIST      1988   410                   8.5   (16.1)               11.4*    (3.2)
               1993   750                  23.5   (16.7)              142.4   (106.8)
               1999   770                  16.1   (11.2)               14.9*    (2.7)
               2004   900                   7.6    (8.8)                5.1*    (1.2)

QUEX-IPEDS     1988   410                  11.6   (14.7)               15.8*    (3.6)
               1993   690                  96.3*  (21.5)               36.4*    (5.2)
               1999   810                  53.5*  (12.8)               18.5*    (2.7)
               2004   900                  69.0*   (9.4)               30.2*    (3.3)

LIST-IPEDS2    1988   330                 -12.3   (10.9)                1.2     (1.1)
               1993   520                  34.2*   (9.4)                7.4*    (1.0)
               1999   640                   9.8    (9.8)                2.7*    (0.8)
               2004   790                  10.0    (9.3)                5.7*    (0.7)

QUEX-LIST2     1988   370                 -12.1    (8.4)               -1.1*    (0.8)
               1993   600                 -22.0    (7.9)               -0.1*    (0.8)
               1999   700                 -18.5*   (6.0)               -0.1*    (0.9)
               2004   850                   2.0    (2.4)                0.6     (0.3)

QUEX-IPEDS2    1988   350                   1.5    (9.1)                1.4     (1.1)
               1993   540                  35.2*   (8.2)                8.6*    (0.9)
               1999   690                   6.7    (8.5)                2.7*    (0.7)
               2004   740                  29.9*   (8.9)                7.8*    (0.7)

* Statistically significant at alpha = .05, based on paired t-test.
1 Standard errors assume simple random sampling.
2 Observations with percent differences greater than 50 in absolute value were excluded.
NOTE: LIST refers to the faculty list provided by sampled institutions; IPEDS is the National Center for Education Statistics’ Integrated Postsecondary Education Data System; QUEX refers to the institution questionnaire. Numbers of institutions are rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

4.2 Item Nonresponse

Recent studies (for example, DeRouvray and Couper 2002) using web self-administered questionnaires have shown higher than usual rates of missing data when the “refuse” and “don’t know” options are presented on the screen. To limit the rate of nonresponse in the institution and faculty instruments, the refusal option was unavailable to respondents and the “don’t know” option was limited to selected screens where the respondent might not know the answer (e.g., expected age at retirement). On the information page at the start of the questionnaire, respondents were instructed to click the “continue” button to proceed to the next question if they wished to decline to answer a question.

For the institution questionnaire, items with a high rate of missing responses were often those that required lookup by an office on campus other than the Institution Coordinator’s (e.g., Human Resources or Academic Affairs) and might reflect a lack of cooperation from those other offices. Two of the 90 items in the questionnaire had more than 15 percent of the data missing. Details of institution item nonresponse, including the nonresponse bias analysis, are presented in appendix I.

Thirty-four of the 162 items in the faculty questionnaire had more than 15 percent of the data missing.28 With the exception of the income items, which were expected to have higher rates of refusal due to their sensitive nature, the primary reason item nonresponse exceeded 15 percent for these items is that each applies to a relatively small subset of respondents (i.e., small denominator) and these items were not included in the abbreviated instrument. The nonresponse bias analysis and details of faculty item nonresponse are presented in appendix I.
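An item nonresponse screen of the kind described above could be computed as in the following sketch. The file name, the assumption that survey item variables begin with “Q,” and the use of the missing data codes from table 37 (chapter 5) are illustrative.

```python
# Illustrative sketch of an item nonresponse screen; codes follow table 37 in chapter 5.
import pandas as pd

faculty = pd.read_csv("faculty_items.csv")                  # hypothetical item-level extract
items = [c for c in faculty.columns if c.startswith("Q")]

missing = faculty[items].isin([-1, -7, -9])                 # don't know / not reached / no answer
applicable = ~faculty[items].isin([-3, -5])                 # exclude legitimate skips

nonresponse_rate = missing.sum() / applicable.sum()
high = nonresponse_rate[nonresponse_rate > 0.15].sort_values(ascending=False)
print(f"{len(high)} items exceed 15 percent item nonresponse")
```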

4.3 Faculty Data Quality

4.3.1 Item Mode Effects

The NSOPF:04 faculty instrument was designed to minimize potential mode effects by using a single instrument for both self-administration and CATI. However, whenever multiple modes are used for data collection, the possibility of mode effects is inherent. Because respondents were offered the option of completing the interview by themselves on the web or with an interviewer, there was the potential for bias due to self-selection or other factors that cannot be accounted for. Therefore, these results should be interpreted as describing how respondents in different modes of administration answered the survey questions, not as true mode differences.

Due to the large sample size, nearly all test statistics used to measure differences between self-administered and CATI respondents were significant. Reporting all of these statistically significant differences is not substantively meaningful; therefore, only differences of five percentage points or greater are reported.29,30
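For reference, the two-proportion z statistics cited below can be computed as in the following sketch. The counts are placeholders rather than NSOPF:04 results, and the pooled-variance form of the test is an assumption.

```python
# Illustrative two-proportion z-test (pooled variance); inputs are placeholders.
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))              # two-sided normal p-value
    return p1 - p2, z, p_value

diff, z, p = two_proportion_ztest(17400, 20000, 5000, 6200)  # hypothetical web vs. CATI counts
if abs(diff) >= 0.05:                                        # report only gaps of 5 points or more
    print(f"difference = {100 * diff:.1f} percentage points, z = {z:.2f}, p = {p:.3g}")
```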

For this analysis, 47 variables were selected, covering the following topic areas: demographic variables, descriptive items, factual items, and opinion-based questions. Criteria for selection of items included importance to the content of this study. Items for which project staff had concerns that there might be mode effects (e.g., complex matrix items) were also selected. Although not presented in tables, the following discussion on item mode effects is based on special tabulations from the 2004 NSOPF faculty data.

28 The items included in this analysis are listed in appendix K (Q1 through Q83). The number of items differs from the number of faculty items reported elsewhere in this document. For example, the difference between the number of analysis variables (162) and the number of items in the faculty questionnaire (183) occurs because some items in the faculty questionnaire were for internal use; similarly, there are fewer stochastically imputed variables (144) than analysis variables (162) because some variables had no missing data after logical imputations were performed.
29 For questions where means were used, the unit of measurement, range of answers, and standard deviation were evaluated to determine which statistically significant differences to report.
30 Footnotes are used to report differences in the 3 to 4 percent range, since these could be seen as indicative of a substantively important difference.

Demographics

Compared to their CATI counterparts, web respondents were more likely to be White (Q74E: 87 percent versus 81 percent, z = 11.65, p < .001). Conversely, CATI respondents were more likely to be Black or African American (Q74C: 14 percent versus 7 percent, z = -16.98, p < .001) than their web counterparts. No mode differences were observed for gender or age.31

Descriptors

Web respondents were more likely than CATI respondents to report research as their primary activity (Q4: 11 percent versus 6 percent, z = 11.62, p < .001), be employed full-time (Q5: 73 percent versus 53 percent, z = 29.72, p < .001), be an assistant professor (Q10: 19 percent versus 12 percent, z = 17.81, p < .001), be tenured (Q12: 34 percent versus 28 percent, z = 8.79, p < .001) or be on the tenure track (Q12: 16 percent versus 10 percent, z = 11.68, p < .001), and not be employed outside the target institution (Q18: 72 percent versus 59 percent, z = 61.15, p < .001). CATI respondents were more likely than web respondents to report teaching as their principal activity (Q4: 77 percent versus 70 percent, z = -10.73, p < .001), be an instructor (Q10: 27 percent versus 17 percent, z = -17.49, p < .001), not be on the tenure track (Q12: 51 percent versus 41 percent, z = 28.58, p < .001), and be employed outside the target institution (Q18: 31 percent versus 22 percent, z = -14.54, p < .001).32

Factual items

Twenty-four factual items were chosen, based on their importance to the study objectives. These factual items were expected to show few, if any, mode differences. These questions centered on eight main topic areas: number of classes taught, year began first postsecondary job, employment sector of previous job, hours per week spent on various tasks, percent time spent on various tasks, use of various methods in the classroom, other activities, and publications.

Classes taught. There were no significant differences observed in mean number of credit and noncredit classes taught at the target postsecondary institution (Q35A1 and Q35A2).

Year began first postsecondary job. There was no significant difference in the mean year web respondents began their first postsecondary job (Q23) compared to their CATI counterparts.

Employment sector of previous job. Web respondents were more likely to have no other job prior to their current position (Q28: 10 percent versus 5 percent, z = 12.21, p < .001) than were CATI respondents.33

31 Two measures showed differences at the 3 and 4 percent level. Web respondents (57 percent) were more likely to be male (Q71) than their CATI counterparts (53 percent, z = 5.58, p < 0.001), and web respondents (7 percent) were more likely to be Asian (Q74B) than their CATI counterparts (4 percent, z = 8.44, p < 0.001).
32 Three measures showed differences at the 3 and 4 percent level. Web respondents were more likely to report administration as their primary activity (Q4: 9 percent versus 6 percent, z = 7.52, p < 0.001) and to be an associate professor (Q10: 17 percent versus 13 percent, z = 7.54, p < 0.001). CATI respondents were more likely than web respondents to not report an academic title (or to use the “other” category) (Q10: 23 percent versus 19 percent, z = -6.92, p < 0.001).
33 CATI respondents were more likely to have been employed in an elementary or secondary school prior to their current position (Q28: 20 percent versus 16 percent, z = -7.37, p < 0.001).

Hours per week spent on various tasks. Web respondents reported spending more time on paid tasks at the target institution (Q31A), on average, than their CATI counterparts (37 hours versus 31 hours, t = 22.70, p < .001), while CATI respondents reported spending more time on paid tasks outside the institution (Q31C) than web respondents (12 hours versus 7 hours, t = -20.37, p < .001). No significant differences were found on hours spent on unpaid tasks at the institution (Q31B), unpaid tasks outside the institution (Q31D), or hours spent e-mailing students each week (Q41).

Percentage of time spent on various tasks. Respondents were asked to provide the percentage of time they spent on undergraduate instructional activities (Q32A), graduate instructional activities (Q32B), research activities (Q32C), and other activities (Q32D). CATI respondents reported spending a greater percentage of their time each week on instructional activities with undergraduates than web respondents (61 percent versus 53 percent, t = -13.71, p < .001).34

Use of various methods in the classroom. Of the 11 methods in question, only four showed a significant difference by mode. Compared to web respondents, CATI respondents were more likely to report using multiple choice exams (Q38A: 61 percent versus 55 percent, z = -6.64, p < .001), using essay midterm or final exams (Q38B: 61 percent versus 51 percent, z = -10.99, p < .001), and to report using service learning experiences (Q38J: 33 percent versus 26 percent, z = 25.23, p < .001). Web respondents were more likely to report using a website for instructional duties (Q39) compared to CATI respondents (45 percent versus 35 percent, z = 13.47, p < .01).35

Publications. The average number of articles published in refereed journals in their careers (Q52AA) was no different for web and CATI respondents.

Opinion

Thirteen opinion-based questions were evaluated for mode differences. Eight of these questions asked how satisfied respondents were with various aspects of their job, including: authority to make decisions, technology-based activities, equipment/facilities, institutional support for teaching improvement, workload, salary, benefits, and job overall (Q61 and Q62). As shown in table 33, CATI respondents were significantly more likely to report being either somewhat or very satisfied with five of the eight items—including equipment/facilities, institutional support for teaching improvements, workload, salary, and job overall—compared to web respondents.36 These differences may be due to the effect of social desirability on responses when an interviewer is involved.

34 Web respondents reported spending a greater percentage of their time each week on research (Q32C) compared to CATI respondents (15 percent versus 12 percent, t = 11.63, p < 0.001). 35 Three items showed differences at the 4 percent level. Compared to web respondents, CATI respondents were more likely to report using multiple drafts of written work (Q38E: 43 percent versus 39 percent, z = -4.48, p < 0.001), oral presentations by students (Q38F: 65 percent versus 61 percent, z = -4.52, p < 0.001), and student evaluations of each other’s work (Q38H: 41 percent versus 37 percent, z = -4.52, p < 0.001). 36 Two additional questions showed significant differences at the 3 and 4 percent level. CATI respondents were more likely than web respondents to report being somewhat or very satisfied with institutional support for technology based instructional activities (Q61B: 89 percent versus 85 percent, z = -6.96, p < 0.001) and the benefits available to them (Q62C: 74 percent versus 71 percent, z = -4.15, p < 0.001).


Table 33. Satisfaction with various aspects of job, by mode of administration: 2004

                                                                                  Percent
Item     Description                                                         Web         CATI

Q61C     Satisfaction with equipment/facilities                              77.5        83.9
Q61D     Satisfaction with institutional support for teaching improvement    68.8        82.7
Q62A     Satisfaction with workload                                          77.0        83.6
Q62B     Satisfaction with salary                                            60.7        70.2
Q62D     Satisfaction with job overall                                       87.3        92.4

NOTE: Percentages are based on those indicating they were somewhat or very satisfied with that aspect of their job. CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The remaining five opinion-based questions asked respondents to indicate whether they agreed or disagreed that teaching was rewarded, part-time faculty were treated fairly, female faculty were treated fairly, and racial minorities were treated fairly (Q82); and whether they would choose an academic career again (Q83). CATI respondents were more likely than web respondents to somewhat or strongly agree that good teaching was rewarded (82 percent versus 76 percent, z = -9.35, p < .001) and part-time faculty were treated fairly (75 percent versus 65 percent, z = -13.62, p < .001).37

4.3.2 Breakoffs

A total of 27,350 sample members started the faculty interview. Of these, 800 were deemed ineligible. Of the 26,550 eligible sample members who started the interview, 26,110 completed either a full, abbreviated,38 or partial interview.39 An additional 10 cases either refused to be included as respondents or provided insufficient data to be useful. The remaining 430 broke off before completing the workload section (C) and were not considered to be partial completes.

Table 34 lists the forms (screens) that had more than 15 breakoffs. In most cases, the forms with the highest number of breakoffs required detailed recall or requested sensitive information.

37 One additional question showed a significant difference at the 3 percent level. CATI respondents were more likely than web respondents to either somewhat or strongly agree that female faculty members are treated fairly (Q82: 91 percent versus 88 percent, z = -6.04, p < 0.001). 38 The abbreviated interview consisted of sections A (nature of employment), B (academic/professional background) and G (sociodemographic characteristics) of the faculty interview. 39 Interviews that broke off after completing section C (workload) were considered partial completes. Of the 140 respondents who did so, 48 percent broke off in the scholarly activities section (D), 9 percent in the job satisfaction section (E), 29 percent in the compensation section (F), 11 percent in the characteristics section (G), and 4 percent in the opinions section (H).


Table 34. Faculty instrument forms where more than 15 sample members terminated the interview: 2004

Form1     Description                                                                   Number of breakoffs

Q2        Instructional duties related to credit courses/activities                     30
Q3        Faculty status                                                                30
Q4        Principal activity                                                            20
Q17A4     Highest degree school coding                                                  30
Q31       Hours per week, paid and unpaid tasks at institution and elsewhere            40
Q32       Percent of time spent on instruction, research, and other activities          40
Q37       Description of each class taught (number of weeks, credits, students, etc.)   30
Q52A      Career publications/presentations                                             30
Q66       Income, from institution/other sources                                        20

1 The faculty/instructional staff questionnaire was divided into forms (screens) and items. Each form was structured to include related items.
NOTE: Numbers rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

4.3.3 Classification of Instructional Programs (CIP) Coding

The assisted coding system was designed for the NSOPF:04 full-scale study to decrease respondent burden by reducing the time and effort needed to code responses. The assisted coding system was used to code field of teaching, highest degree field, and principal field of scholarly activity; the codes for each of these fields were identical (see appendix J for a list of codes). Respondents were asked to provide a verbatim string. The assisted coding system parsed the string, looking for key words or phrases that matched categories in the database. If a match was located, a list of possible fields was provided for the respondent to choose from. If a match was not located, or the respondent rejected the fields provided by the system, the respondent could code the field manually. This involved choosing a general category from the 32 categories provided in a drop-down box and then selecting the specific category within the general category. There were a total of 136 specific categories, but within a general category there were never more than 19 specific categories to choose from.
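The keyword-matching idea behind the assisted coder can be sketched as follows. This is a simplified illustration: the keyword table below is a tiny hypothetical stand-in for the actual lookup database of general and specific categories (see appendix J), and the category labels shown are not the actual codes.

```python
# Simplified illustration of assisted coding by keyword matching; the lookup
# table and category labels are hypothetical stand-ins for the appendix J codes.
KEYWORDS = {
    "biology": ("Natural sciences", "Biological sciences"),
    "chemistry": ("Natural sciences", "Physical sciences"),
    "nursing": ("Health sciences", "Nursing"),
    "history": ("Humanities", "History"),
}

def suggest_categories(verbatim: str):
    """Return (general, specific) category pairs whose key words appear in the string."""
    tokens = verbatim.lower().replace("/", " ").split()
    return [pair for word, pair in KEYWORDS.items() if word in tokens]

matches = suggest_categories("Molecular Biology")
if matches:
    print("Respondent chooses from:", matches)              # assisted match found
else:
    print("No match; respondent selects general, then specific, category manually.")
```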

The anticipated benefit of performing this coding within the interview is obvious for web respondents: the sample member can see the categories and select the appropriate general and specific category. For telephone-administered interviews, this real-time coding may also improve data quality by capitalizing on the availability of the respondent to clarify coding choices at the time the coding is performed; interviewers were trained to use probing techniques to assist in the coding process.

The assisted coding system coded 75 percent of field of teaching strings, 79 percent of highest degree strings, and 50 percent of field of research strings. The assisted coding matches were accepted more readily by CATI interviewers than by web respondents for field of teaching and highest degree (teaching: 86 percent versus 69 percent, χ2 = 703.7, p < .0001; highest degree: 90 percent versus 74 percent, χ2 = 711.8, p < .0001) but the difference was not significant for field of scholarly activity (57 percent versus 48 percent, χ2 = 2.0, p < 0.16).

As part of the data evaluation activities, a random sample of 10 percent of the results for each of the three Classification of Instructional Programs (CIP) codings (teaching, research, and highest degree) was selected. An expert coder evaluated the verbatim strings for completeness and for the appropriateness of the assigned codes, determining whether a string was too vague to code or whether a different code should be assigned.

Overall, 71 percent of those sampled for recoding were coded correctly, 13 percent were incorrectly coded, and 15 percent of the strings were too vague to determine whether they were correctly coded. Table 35 shows the results of the 10 percent recode, by mode. The expert coder agreed with the coding performed by the web respondent more often than that done by the CATI interviewer (χ2 = 9.69, p = 0.002).

Table 35. Summary of coding results for fields of teaching, research, and highest degree, by mode of administration: 2004

                                          Web respondents                                                CATI respondents
                        Coding       Percent                    Percent too       Coding       Percent                    Percent too
CIP field item          attempts     coded        Percent       vague to          attempts     coded        Percent       vague to
                        sampled      correctly    recoded       code              sampled      correctly    recoded       code

Total                   3,786        72.3         14.0          13.7              1,184        67.7         11.6          20.8
Teaching                1,949        72.3         14.4          13.2                651        67.4         12.4          20.1
Research                   39        53.8         28.2          17.9                  5        60.0         20.0          20.0
Highest degree          1,798        72.7         13.2          14.1                528        68.0         10.4          21.6

NOTE: Detail may not sum to totals because of rounding. CIP = Classification of Instructional Programs; CATI = computer assisted telephone interview.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

In addition to the 10 percent recode, all strings that were not coded, were partially coded (into a general area but not a specific discipline), or were coded “other” were evaluated by the expert coder and upcoded into the appropriate CIP categories, where possible. Of the 52,018 verbatim strings provided, a total of 1,506 strings (3 percent) qualified for this upcoding; 79 percent of these were from web respondents and 21 percent from CATI respondents. Of the 1,506 strings for which upcoding was attempted, 82 percent were upcoded, 18 percent were too vague to code, and less than 1 percent were correctly coded as “other.”

4.3.4 IPEDS Coding

The faculty instrument included a coding system that assisted web respondents and interviewers in collecting postsecondary institution information. This system was designed to improve data quality by allowing respondents to clarify coding choices at the time coding was performed. To assist in the coding process, web respondents were given detailed instructions on screen that enabled them to locate the postsecondary institution. In addition to these on-screen instructions, interviewers were given additional supervised training on how to effectively probe and code respondents’ answers.

The institution coding system assigned a six-digit IPEDS identifier for the postsecondary institution that awarded the respondent’s highest degree. To facilitate coding, the coding system requested the state and city in which the school was located; from that information, a list of possible schools was displayed, allowing the respondent to select the correct school. The system relied on a look-up table of institutions constructed from the IPEDS institution database.
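A sketch of the look-up step is shown below; the IPEDS-based look-up table and its column names are hypothetical stand-ins for the actual database used in the instrument.

```python
# Illustrative sketch of the state/city look-up used for IPEDS coding.
import pandas as pd

ipeds = pd.read_csv("ipeds_lookup.csv")                     # hypothetical IPEDS extract

def candidate_schools(state: str, city: str) -> pd.DataFrame:
    """Return institutions in the given state and city for the respondent to choose from."""
    mask = (ipeds["state"].str.upper() == state.upper()) & \
           (ipeds["city"].str.lower() == city.lower())
    return ipeds.loc[mask, ["unitid", "name"]]

print(candidate_schools("NC", "Durham"))
```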

Of the approximately 25,760 institutions coded over the course of data collection, 1,130 were initially deemed uncodeable. However, based on the information collected (institution name, location, level, and control), 1,025 institutions were positively identified and recoded during the data file editing stage of the project. Of the remaining 105 uncodeable institutions, 65 provided insufficient data, 20 were identified as closed, 10 were identified as foreign, and 10 were online institutions for which no IPEDS ID was available.

4.3.5 Monitoring

Regular monitoring of telephone data collection serves a number of goals, all aimed at maintaining a high level of data quality. These objectives are to identify problem items; to improve interviewer performance by reinforcing good interviewing behavior and discouraging poor behavior; to detect and prevent deliberate breaches of procedure, such as data falsification; and to assess the quality of the data collected.

Two types of monitoring were performed during the NSOPF:04 data collection. The first type was monitoring by project staff, which involved listening to the interview and simultaneously viewing the progress of the interview on screen, using remote monitoring telephone and computer equipment. Project staff evaluated such things as whether the interviewer sounded professional, probed for complete answers, and handled refusal cases appropriately. Interviewers received feedback on their skills, and additional training was provided, if necessary. When monitoring interviews, project staff also evaluated whether the interview was functioning properly and identified questions in the interview that were difficult to administer so that those items could be revised in future studies.

The second type of monitoring, quality assurance monitoring, was conducted by specially trained monitoring staff within the call center. Similar to project staff monitoring, the monitoring system provided for simultaneous listening and viewing of the interview. Monitors evaluated the interviewer-respondent interchange on whether the interviewer (1) delivered the question correctly and (2) keyed the appropriate response. Each of these measures was quantified and daily, weekly, and cumulative reports were produced. Monitoring took place throughout data collection, although monitoring efforts were scaled back around the 19th week due to lighter caseloads corresponding with the end of the academic year for many schools.

Of the 3,221 items monitored, a total of 28 question delivery errors and 14 data entry errors were observed.40 This yielded an average error rate of 0.9 percent for question delivery and 0.4 percent for data entry.

40 The data for one monitoring session on April 20, 2004 (61 items observed) were removed from the analysis because the monitor did not follow proper procedure.

4.3.6 Interviewer Feedback

Quality Circle meetings

Quality Circle meetings provided opportunities for interviewers, supervisors, and project staff to discuss data collection issues. These meetings were scheduled regularly throughout the data collection period to ensure that CATI interviews were being conducted in the most effective manner. Interviewer representation was determined by a supervisor so that all staff would have the opportunity to attend these meetings. Project staff updated interviewers and supervisors on the progress of data collection and gathered information to solve problems encountered by interviewers while conducting interviews. The minutes from these meetings were prepared by project staff and distributed to all interviewers and supervisors. Meeting minutes were available in hardcopy and online. Examples of issues raised in Quality Circle meetings included the following.

Progress of data collection. Project staff provided updates regarding the interviews completed to date and goals for the upcoming week. This information benefited both the interviewers and technical staff by recognizing interviewers’ efforts and encouraging continued professionalism.

CATI Case Management System (CMS) issues. Interviewers had an opportunity to report CMS issues that required project staff review and discussion. Using the information provided by interviewers, project staff resolved these issues throughout data collection.

Data collection reminders. Several issues were stressed throughout data collection: reminders to verify address information for cases that needed to be remailed and addresses for incentive checks, how to handle eligibility questions, and tips for locating sample members who were part-time employees. Interviewers were also reminded to complete problem sheets (see later section in this chapter) for any cases that needed attention.

Instrument issues. During the Quality Circle meetings, project staff clarified specific items in the instrument for the interviewers. These items were brought to the attention of project staff in problem sheets, through project staff monitoring, or during the Quality Circle meetings themselves. Discussions focused on how to properly code responses (e.g., for Q10, adjunct faculty should be coded as “other”; for questions expecting a numeric response, answers between zero and one should be rounded up to one).

Coding. The majority of online coding during data collection was accurate, based on evaluation of verbatim strings and the codes assigned (see the earlier section in this chapter on CIP coding), although in some cases the verbatim string was too vague to code. Interviewers were reminded to ask the sample member for the necessary level of detail while entering the verbatim string.

Web issues. A number of web-related issues were raised during Quality Circle meetings. Some sample members reported problems connecting to the website, so interviewers were asked to first try to collect the data via CATI or to have someone from the help desk assist the sample member in getting connected. Interviewers were reminded to clearly state the study web address (URL) to sample members.

Problem sheets

When interviewers encountered problems during an interview, a description of the issue was documented in the form of an electronic problem sheet. Project and interviewer supervisory staff regularly reviewed these problem sheets and worked on resolving problems, as appropriate. Approximately 1,169 problem sheets were submitted during the data collection period.

Problem sheets were used as follows:

• To address technical CMS issues. Interviewers documented details of the front-end issues so that a programmer could resolve them.

• To report system and web delays or access problems.

• To document sample member contact information as a workaround for front-end issues.


• To alert project staff to questions about sample member eligibility, contact information, and refusals.

• To record incorrect data that were entered (but not corrected) for a case and to note cases where project staff needed to take specific action.

Project and interviewer supervisory staff ensured that issues pertinent to data collection were resolved as soon as possible.

Interviewer debriefing

A debriefing meeting was held at the end of data collection. The purpose of this meeting was to elicit feedback from the interviewers on various aspects of the data collection process, particularly the administration of the faculty questionnaire. In attendance were telephone interviewers, help desk operators and their supervisors, selected project staff, and the study project officer. The debriefing session was highly informative and gave project staff a wealth of information that will inform instrumentation and data collection activities for future studies.

Project staff asked interviewers which items in the instrument were problematic. Interviewers responded with general comments as well as item-specific ones, based on their interviewing experience.

General comments. Interviewers reported that sample members repeatedly indicated that parts of the questionnaire did not apply to them. Typically, these respondents were part-time faculty or those who taught at community colleges, medical schools, or other specialty schools.

Interviewers felt that the pop-up boxes used to confirm out-of-range values were intrusive and slowed the pace of the interview unnecessarily. They recommended that pop-up boxes be used sparingly in future web questionnaires.

Question 1. Interviewers felt that the first question in the interview, which asked whether the respondent had instructional duties, was too long and “wordy.” They recommended that the question be shortened or broken into parts.

Question 3. Interviewers reported that adjunct faculty did not know what was meant by faculty status.

Question 9. The second sentence in the wording of this item (“consider promotions in rank as part of the same job”) was confusing for respondents. Interviewers suggested restructuring the question to include that information before the respondent attempts to answer the question.

Question 15. Q15 (reason for not being a member of a union) had a high rate of don’t know responses. Interviewers said this was because adjunct faculty often did not know whether unions were available.

Questions 16, 17, and 54. Interviewers were quite pleased with the new assisted coding system for field of teaching, highest degree, and scholarly activity. It proved to be less burdensome for them, although they indicated some difficulties finding the exact categories that the respondent wanted.

Question 17. Interviewers reported that IPEDS coding screens (Q17A4) were easy to use. One concern was that some schools were listed in the wrong city.


Question 31. Sample members had difficulty distinguishing between paid and unpaid activities, and their ideas of each often differed from the examples provided in the instrument. Some respondents were upset at having to account for their time. Interviewers reported that respondents found this set of questions (Q31, Q32) difficult to answer because it required accounting for a lot of information and breaking it down precisely.

Questions 31, 41, 47B, 48. Interviewers pointed out that sample members had a hard time answering the hours-per-week questions when the activity was something they did only a couple of weeks out of the term (e.g., advising students). They thought some other unit of time might make it easier to collect this information.

Questions 32, 37, 47. Project staff questioned whether there was any confusion over “first-professional students.” Interviewers indicated that some faculty at technical schools did not know what was meant by first-professional students.

Question 35. Interviewers reported that sample members often were unclear what was meant by the term “distance education” in Q35C and suggested including the words “Internet courses” in the question wording.

Question 37. Interviewers indicated that this screen took a lot of time to complete and that those who taught unstructured courses found these items difficult to answer.

Question 50. Advising of students (Q50) was a difficult concept for some sample members in the field test and the wording was changed in the full-scale instrument to clarify the meaning. Interviewers indicated this was still a problem and suggested changing the definition provided on-screen. Respondents also wanted clarification of whether this was designated advisees only or whether it included other advising.

Question 52. Interviewers reported that Q52 (number of scholarly works) was administered fairly smoothly; most respondents had a general idea of the number of publications and presentations although a few consulted their resumes. A small number of respondents had numbers of publications that exceeded the maximum allowed and became upset that their volume of scholarly activity was not properly reflected. Interviewers reported that respondents seemed to get tired around this point in the instrument and felt that combining screens for Q52A and Q52B would improve the flow and reduce burden on the respondents.

Question 53. In the field test, respondents sometimes reported confusion over what was meant by scholarly activity. The question text was revised, and this problem was not reported in the full-scale questionnaire. However, some respondents were unsure whether to report only scholarly activities associated with the target institution or all scholarly activities.

Question 62. In the field test, Q62C (satisfaction with benefits) was not answered by many respondents (mostly part-timers) because they did not receive benefits. The wording was slightly altered for the full-scale questionnaire; however, interviewers reported that many part-time and adjunct faculty still could not answer this question. In particular, some respondents were unsure whether the question was asking about medical benefits or other benefits.

Questions 66 and 70. Sample members complained that Q66 and Q70 (income) items were intrusive. Interviewers suggested that having scripted text for why this question is asked would be helpful. Interviewers felt that income questions were unnecessarily repetitive.


Question 74. Respondents insisted that “Caucasian” be listed among the response options (in parentheses after “White”). Interviewers suggested adding scripted text to explain why race is asked about on this form.

Question 82. Q82D (racial minorities treated fairly) had more than 10 percent missing when administered in CATI. Interviewers explained that some part-time and adjunct faculty did not have an opinion on this set of items. They suggested adding a “no opinion” option for each item on this form.

Interviewers who worked on the field test requested that Q84 (feedback textbox) be put back in the instrument as many sample members wanted to provide feedback.

4.3.7 Instrument Feedback

Two issues with the faculty instrument became apparent in the data editing process. The first issue had to do with Q1, which asked whether the respondent had any instructional duties. Despite question wording intended to get the respondent to think beyond classroom teaching, half of the respondents who said they did not have any instructional duties provided responses indicating they did have instructional duties on other items in the instrument (i.e., taught one or more credit or noncredit classes [Q35A1 > 0 or Q35A2 > 0], provided any individual instruction [Q46 = 1], spent time on thesis or dissertation committees or comprehensive exam or orals committees [Q48 > 0], indicated that teaching was their principal activity [Q4 = 1], or spent time on undergraduate or graduate instructional activities [Q32A > 0 or Q32B > 0]). Items Q32A and Q32B contradicted Q1 most often. Rather than reconciling these cases in the data editing phase, future cycles of NSOPF would benefit from asking follow-up questions immediately after Q1 for respondents who say they do not have instructional duties.
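A consistency check of this kind could be expressed as in the following sketch, using the item conditions listed above; the file name and the assumption that Q1 is coded 1 = yes, 0 = no are hypothetical.

```python
# Illustrative sketch of the Q1 consistency check described above.
import pandas as pd

fac = pd.read_csv("faculty_items.csv")                      # hypothetical item-level extract

taught_elsewhere = (
    (fac["Q35A1"] > 0) | (fac["Q35A2"] > 0)                 # credit or noncredit classes taught
    | (fac["Q46"] == 1)                                     # provided individual instruction
    | (fac["Q48"] > 0)                                      # thesis/dissertation or exam committees
    | (fac["Q4"] == 1)                                      # principal activity was teaching
    | (fac["Q32A"] > 0) | (fac["Q32B"] > 0)                 # time on instructional activities
)

inconsistent = fac[(fac["Q1"] == 0) & taught_elsewhere]     # said "no instructional duties" on Q1
print(f"{len(inconsistent)} respondents contradict their Q1 response")
```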

The other issue concerned items asking about first-professional students (Q32B, Q37E, Q47A, and Q47B). This term was apparently misunderstood by many faculty and instructional staff at 2-year institutions, who indicated that they taught first-professional students at that institution. While on-screen examples of first-professional programs were available on some of these forms, in the future it is advised that a check against the level of the target institution be inserted into the instrument logic for questions concerning first-professional students.

4.4 Comparisons with NSOPF:99

To assess the consistency of survey estimates between the current and prior administrations of NSOPF, weighted estimates were obtained from the 1999 and 2004 survey data for a series of key analytical variables. The results of these assessments are summarized in table 36.


Table 36. Weighted estimates obtained from the survey data for a series of key analytical variables: 1999 and 2004

                                                                                 Weighted estimates
Variable                                                                         1999           2004

Percent of full-time faculty who were tenured                                    53.1           47.3
Percent of part-time faculty who were not on tenure track                        78.3           86.2
Percent of faculty who were part time                                            42.6           43.7
Percent of part-time faculty who had retired from another position               15.4           19.6
Percent of full-time faculty whose principal activity was teaching               64.5           62.5
Percent of full-time faculty whose principal activity was research               11.3           14.2
Average percent of time that full-time faculty taught undergraduates             41.3           43.2
Percent of full-time faculty with a doctorate                                    57.7           59.6
Average number of hours full-time instructional faculty taught per week          11.0           11.1
Average number of hours part-time instructional faculty taught per week           7.3            7.7
Average number of recent refereed publications, full-time faculty                 3.9            2.2
Average number of recent refereed publications, part-time faculty                 1.2            0.5
Average number of career refereed publications, full-time faculty                16.0           16.0
Average number of career refereed publications, part-time faculty                 4.4            4.1
Average basic income of full-time faculty                                      $56,841        $67,239
Average basic income of part-time faculty                                       11,613         11,010
Average consulting income of full-time faculty who consulted                     8,221          7,379
Average consulting income of part-time faculty who consulted                    10,579         10,908
Average household income, full-time faculty                                    163,127        117,702
Average household income, part-time faculty                                    125,693         92,636
Percent of full-time faculty who were Asian                                       5.5            9.0
Percent of part-time faculty who were Asian                                       2.9            3.8
Percent of full-time faculty who were Black                                       4.9            5.8
Percent of part-time faculty who were Black                                       4.3            5.7
Percent of full-time faculty who were Hispanic                                    3.4            3.5
Percent of part-time faculty who were Hispanic                                    3.9            3.5
Percent of full-time faculty who were White                                      85.1           81.7
Percent of part-time faculty who were White                                      87.6           86.9
Percent of full-time faculty who were female                                     36.3           38.6
Percent of full-time faculty in agriculture and home economics                    0.6            2.5
Percent of full-time faculty in business                                          7.4            6.4
Percent of full-time faculty in education                                         8.7            7.6
Percent of full-time faculty in engineering                                       2.4            4.9
Percent of full-time faculty in fine arts                                         9.3            6.4
Percent of full-time faculty in health sciences                                  12.2           13.9
Percent of full-time faculty in first-professional health science1                4.3            6.6
Percent of full-time faculty in humanities                                       18.1           13.5
Percent of full-time faculty in natural sciences                                 16.1           22.4
Percent of full-time faculty in social sciences                                   9.6           10.5
Percent of full-time faculty in all other programs                               15.7           12.1

1 First-professional health science is a subset of health sciences (previous row).
NOTE: Detail may not sum to totals due to rounding. Differences in estimates between NSOPF:99 and NSOPF:04 may be due to a number of factors, including actual changes over time, differences in how an item was asked between the two years (see table 8), and data editing and imputation procedures (see chapter 5).
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Chapter 5 Data File Development and Imputation

This chapter provides an overview of all procedures used in the development of data files, including descriptions of data editing processes, data swapping, statistical imputations, and derived variable creation.

5.1 Overview of the NSOPF:04 Data Files

Data obtained from the 2004 National Study of Postsecondary Faculty (NSOPF:04) faculty and institution questionnaires are contained in two restricted data files (faculty and institution), which are available on CD-ROM to researchers who have applied for and received authorization from the National Center for Education Statistics (NCES) to access restricted research files. The restricted data files are documented by an Electronic Codebook (ECB), a Windows-based interface that allows users to view descriptive information and statistics about variables and to select variables for extraction into SAS or SPSS data files. The faculty and institution data files can be merged together for joint analysis. The following files were produced:

Faculty data file. Provides faculty-level questionnaire data collected from 26,110 respondents. These data have been edited, swapped, and imputed. The file contains survey variables (variables that start with Q), derived variables (variables that start with X), and study weights for the faculty file (WTA00) and for the combined faculty and institution files (WTC00—or contextual weight). It also contains replicate weights for variance estimation for the faculty file (WTA01-WTA64) and for the combined faculty and institution files (WTC01-WTC64), the imputation flags (variables that start with F), and INSTID (the IPEDS ID) that will allow faculty file data to be merged with institution file data.

Institution data file. Provides institution-level data collected from 920 institutions. These data have been edited, perturbed, and imputed. The file contains the institution survey variables (variables that start with the letter I), derived variables (variables that start with the letter X), study weight for the institution file (WTB00), replicate weights for variance estimation for the institution file (WTB01-WTB64), imputation flags (variables that start with the letter FI), and INSTID (the IPEDS ID) that will allow data on the institution file to be merged with data on the faculty file.

The faculty and institution files can be merged together for joint analysis by performing a match merge using the variable INSTID. Please note that not all institutions that completed the institution questionnaire have responding faculty, and not all faculty have associated institution questionnaire data. For this reason, when analyzing the faculty and institution data together, responses should be weighted using the contextual weight variables on the faculty file.

The NSOPF:04 institution and faculty analysis variables are presented in appendix K.


5.2 Data Coding and Editing

The NSOPF:04 data were coded and edited using procedures developed and implemented for previous NCES-sponsored studies. These coding and editing procedures were refined during the field test for use in the processing of NSOPF:04 full-scale data.

A large part of the data editing and coding was performed within the data collection instruments, including range edits; across-item consistency edits; and coding of fields of teaching, scholarly activities, and highest degree. During and following data collection, the data were reviewed to confirm that the data collected reflected the intended skip-pattern relationships. At the conclusion of data collection, special codes were inserted in the database to reflect the different types of missing data. Data may be missing for a number of reasons; for example, the item may not have been applicable to certain respondents, or a respondent may not have known the answer to the question. Table 37 lists the set of consistency codes used to assist analysts in understanding the nature of missing data associated with the NSOPF:04 data elements. With the exception of the not applicable codes, missing data were stochastically imputed (see section 5.4). In addition, for hierarchical analyses and for developing survey estimates for faculty members at sample institutions that both provided faculty lists and responded to the institution survey, contextual weights were produced for this subset of responding faculty members. These weights, which sum to less than the weighted total for all responding faculty and instructional staff, are named WTC00 and can be found in weights.dat on the ECB file.

Table 37. Description of missing data codes: 2004

Missing data code   Description
-1                  Don’t know; later set to missing and imputed
-3                  Not applicable (item was intentionally skipped)
-5                  Not applicable (item was asked but respondent indicated it was not applicable)
-7                  Item was not administered (abbreviated interview) or reached (partial interview); later imputed
-9                  Respondent did not provide an answer; later imputed
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The data cleaning and editing process for NSOPF:04 consisted of the following steps:

Step 1. Review of one-way frequencies for every variable to identify missing or blank values and to check the reasonableness of values. This involved replacing blank or missing data with -9 for all variables in the instrument database and examining frequencies for reasonableness of data values.

Step 2. Review of two-way cross-tabulations between each gate-nest41 combination of variables to check data consistency. Legitimate skips were identified using the interview programming code as the specification defining all gate-nest relationships, and -9 values that were blank because of legitimate skips were replaced with -3 (legitimate skip code). Additional checks ensured that the legitimate skip code did not overwrite valid data and that no skip logic was missed. In addition, if a gate variable was missing (-9), the -9 was carried through the nested items.

41 Gate variables are items that determine subsequent instrument routing. Nest variables are items that are asked or not asked, depending on the response to the gate question. For example, in the faculty questionnaire, Q1 (which asks whether the respondent had instructional duties) determines whether Q2 (which asks whether the respondent’s instructional duties were related to credit courses/activities) is asked. Q1 is a gate item and Q2 is a nested item. Q2 is asked only if the response to Q1 was “yes.”

Step 3. Identify and code items that were not administered because of a partial or abbreviated interview. This step replaced -9 values with -7 (item not administered) based on the section completion and abbreviated interview indicators.

Step 4. Recode “don’t know” responses to missing. This step replaced -1 (don’t know) values with -9 (missing) for later stochastic imputation. For selected items for which “don’t know” seemed like a reasonable response, variables were created both with and without the “don’t know” category.

Step 5. Identify items requiring recoding. During this stage, previously uncodeable values (e.g., text strings) collected in the various coding systems were upcoded, if possible (see sections 4.3.3 CIP coding and 4.3.4 IPEDS coding).

Step 6. Identify items requiring range edits, logical imputations, and data corrections. Descriptive statistics for all continuous variables were examined. Values determined to be out-of-range were either coded to the maximum (or minimum) reasonable value or set to missing for later imputation. Logical imputations were implemented to assign values to legitimately skipped items whose values could be implicitly determined from other information provided. Data corrections were performed where there were inconsistencies between responses given by the sample member.

Concurrent with the data cleaning process, detailed documentation was developed to describe question text, response options, recoding, range edits, logical imputations, data corrections, and the “applies to” text for each delivered variable.
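To make the reserved-code logic in steps 1 through 4 concrete, the short sketch below applies the same edits to a single hypothetical record. It is written in Python purely for illustration and is not the production editing system used for NSOPF:04; the gate-nest pair mirrors the Q1/Q2 example in footnote 41, and the records shown are invented.

# Illustrative sketch of the reserved-code edits described in steps 1-4.
# Q1 is a gate item (instructional duties) and Q2 is nested under it;
# the records are hypothetical.

MISSING, LEGIT_SKIP, NOT_ADMIN, DONT_KNOW = -9, -3, -7, -1

def edit_record(rec, administered=True):
    # Step 1: replace blank/None values with the missing code (-9).
    for item in ("Q1", "Q2"):
        if rec.get(item) in (None, ""):
            rec[item] = MISSING

    # Step 2: apply the gate-nest rule. If the gate says "no" (0), the nested
    # item was legitimately skipped (-3); if the gate itself is missing,
    # carry the -9 through to the nested item.
    if rec["Q1"] == 0:
        rec["Q2"] = LEGIT_SKIP
    elif rec["Q1"] == MISSING:
        rec["Q2"] = MISSING

    # Step 3: flag items never reached in a partial/abbreviated interview.
    if not administered:
        rec = {k: (NOT_ADMIN if v == MISSING else v) for k, v in rec.items()}

    # Step 4: recode "don't know" to missing so it is imputed later.
    return {k: (MISSING if v == DONT_KNOW else v) for k, v in rec.items()}

print(edit_record({"Q1": 0, "Q2": None}))   # {'Q1': 0, 'Q2': -3}
print(edit_record({"Q1": 1, "Q2": -1}))     # {'Q1': 1, 'Q2': -9}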

5.3 Data Perturbation

A restricted faculty-level data file was created for release to individuals who apply for and meet standards for such data releases. While this file does not include personally identifying information (i.e., name and Social Security number), other data (i.e., institution, Integrated Postsecondary Education Data System [IPEDS] ID, demographic information, and salary data) could be manipulated in such a way as to appear to identify the data record of a particular faculty member. To protect further against such situations, some of the variable values were swapped between faculty respondents. This procedure perturbed the data and added additional uncertainty to them. Thus, associations made among variable values in an attempt to identify a faculty respondent may be based on the original or on edited, imputed, and/or swapped data. For the same reasons, the data from the institution questionnaire were also swapped to avoid data disclosure.

5.4 Imputation Methodology

The NSOPF:04 data files include institution-level and faculty-level data obtained from the institution and faculty surveys. All non-verbatim and non-text variables on the NSOPF:04 files that had missing values have been imputed. Specifically, a total of 144 variables were stochastically imputed for the faculty data, and 87 variables were stochastically imputed for the institution data. All remaining missing data, such as the name of the postsecondary institution that awarded a faculty respondent's highest degree, were deemed not suitable for imputation. Most of the


variables were imputed using a weighted sequential hot-deck imputation procedure. A number of variables, including gender and race/ethnicity, were imputed using a combination of cold-deck and logical imputation during the data editing process before the data file was considered ready for stochastic imputation. The specific imputation method used for each variable is specified in the imputation flags on the final restricted datasets.

Table 38 shows the number of faculty and institution variables that were imputed, grouped by the percentage of cases for which a value was imputed. For example, data for 26 of the 144 faculty variables were imputed for less than 1 percent of all faculty respondents, whereas data for 7 of the faculty variables were imputed for more than 15 percent of the faculty respondents.

Table 38. Prevalence of missing/imputed data for the faculty and institution surveys: 2004

Percent imputed              Faculty variables    Institution variables
Total                                      144                       87
Less than 1 percent                         26                        0
Between 1 and 5                             11                       58
Between 5 and 10                            93                       15
Between 10 and 15                            7                       11
Over 15                                      7                        3
NOTE: There are fewer stochastically imputed variables for the faculty and institution questionnaires than their corresponding analysis variables, since a subset of such items had no missing values after application of logical imputation.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

5.4.1 Imputation Methods

In broad terms, there are three methods of imputation: logical, cold-deck, and hot-deck imputation. Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions. Cold-deck imputation involves replacing the missing values with data from sources such as data used for sampling frame construction. While resource intensive, these methods often obtain the actual value that is missing. Consequently, attempts were made to fill in the missing values of data using these two methodologies, to the extent possible. In contrast, stochastic imputation methods, such as sequential hot-deck imputation, rely on the observed data to provide replacing values (donors) for records with missing values.

Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. Under this methodology, each respondent record has a chance to be selected for use as a hot-deck donor, but the number of times a respondent record can be used for imputation is controlled.
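The following sketch illustrates the general idea of weighted hot-deck donor selection within a single imputation class. It is a simplified illustration in Python, not RTI's production procedure: here donors are drawn at random with probability proportional to their sampling weights and a cap limits donor reuse, whereas the actual weighted sequential procedure makes a single pass through the sorted file and controls the expected number of times each donor is used.

import random

# Simplified sketch of weighted hot-deck imputation within one imputation
# class. Donors are drawn with probability proportional to their sampling
# weight, and no donor may be reused more than MAX_USES times. Records and
# weights below are hypothetical.

MAX_USES = 2

def hot_deck_impute(records, item, weight="wt", seed=2004):
    rng = random.Random(seed)
    donors = [r for r in records if r[item] is not None]
    uses = {id(d): 0 for d in donors}
    for rec in records:
        if rec[item] is not None:
            continue
        pool = [d for d in donors if uses[id(d)] < MAX_USES]
        donor = rng.choices(pool, weights=[d[weight] for d in pool], k=1)[0]
        rec[item] = donor[item]          # copy the donor's reported value
        uses[id(donor)] += 1
    return records

sample = [{"wt": 40.0, "salary": 62000}, {"wt": 10.0, "salary": 48000},
          {"wt": 25.0, "salary": None}, {"wt": 15.0, "salary": None}]
print(hot_deck_impute(sample, "salary"))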

To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are strong predictors of each item being imputed were defined. For this study, imputation classes were developed by using a Chi-squared Automatic Interaction


Detection (CHAID) analysis. The CHAID segmentation process divides the data into groups based on the most significant predictor of the item being imputed. This procedure is then repeated using the remaining predictor variables to split each of the emerging groups into smaller subgroups. In this process, a number of subgroups created during a previous iteration may be merged back to form new subgroups. This splitting and merging process continues until no more statistically significant predictors are found, at which point imputation classes are defined from the resulting segments. When dealing with categorical variables, the CHAID process may merge categories of such variables that are found not to be significantly different. Similarly, continuous variables are categorized to create the strongest categorical predictors of the item in question.

Using RTI’s sequential hot-deck method of imputation, once imputation classes are constructed, records within each class are sorted before the process of donor selection begins. If more than one sorting variable is chosen, a serpentine sort is performed, in which the direction of the sort (ascending or descending) on each subsequent sorting variable reverses each time the value of the preceding variable changes. The serpentine sort minimizes the change in respondent characteristics from one record to the next.
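A minimal sketch of a serpentine sort, with hypothetical sort keys, is shown below; it orders records on the first variable and then reverses the sort direction on the next variable each time the value of the preceding variable changes, so that adjacent records remain as similar as possible.

def serpentine_sort(records, keys):
    # Illustrative serpentine sort: records are sorted by the first key,
    # then within each group the next key is sorted in a direction that
    # flips each time the value of the preceding key changes.
    def _sort(recs, depth, descending):
        if depth == len(keys):
            return recs
        key = keys[depth]
        recs = sorted(recs, key=lambda r: r[key], reverse=descending)
        out, group, current = [], [], None
        flip = descending
        for r in recs + [None]:                  # sentinel to flush the last group
            if r is not None and (current is None or r[key] == current):
                group.append(r)
                current = r[key]
            else:
                out.extend(_sort(group, depth + 1, flip))
                flip = not flip                  # reverse direction for the next group
                group, current = [r], (r[key] if r else None)
        return out
    return _sort(records, 0, False)

data = [{"size": s, "type": t} for s in (1, 2) for t in ("a", "b", "c")]
print([(r["size"], r["type"]) for r in serpentine_sort(data, ["size", "type"])])
# -> [(1, 'a'), (1, 'b'), (1, 'c'), (2, 'c'), (2, 'b'), (2, 'a')]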

It should be noted that, for this study, a distinction was made between legitimate and non-legitimate missing items for imputation. All responses that were missing as a result of a refusal were set to missing and then imputed. Likewise, if the interview was terminated early and some questions were not asked of the respondent, the missing value was assigned in those cases as well. However, respondents could legitimately skip questions that did not apply to them; in these cases, the missing responses were coded as legitimate skips (-3) and were not imputed.

5.4.2 Imputation of Faculty Data

Item imputation for the faculty questionnaire was performed in several steps. In the first step, the missing values of gender, race, and ethnicity were filled—using cold-deck imputation—based on the sampling frame information or institution record data. These three key demographic variables were imputed prior to any other variables since they were used as key predictors for all other variables on the data file.

After all logical and cold-deck imputation procedures were performed, the remaining variables were imputed using the weighted sequential hot-deck method. Initially, variables were separated into two groups: unconditional and conditional variables. The first group (unconditional) consisted of variables that applied to all respondents, while the second group (conditional) consisted of variables that applied to only a subset of the respondents. That is, conditional variables were subject to “gate” questions. After this initial grouping, these groups were divided into finer subgroups as detailed next.

The unconditional group was divided into two subgroups based on the percent of missing values: less than 1 percent versus greater than 1 percent missing. The conditional variables were divided into three subgroups based on the level of conditionality where this level was essentially determined by the sequence of the questionnaire. For variables in the conditional group, the questionnaire skip patterns were reviewed and variables were grouped according to which variables determine the values of other variables. After these subgroups were constructed, missing values of the variables were imputed in order from lowest percent missing to highest


percent missing within each subgroup, first for the unconditional variables and then for the conditional variables in an ascending level of their conditionality.

All unconditional variables that had less than one percent missing were imputed using imputation classes defined by a combination of gender, race, and ethnicity. Moreover, institution type,42 institution size, and faculty type43 were used as sort variables to place like records in closer proximity and thereby improve the donor selection process. The imputation classes for the remaining unconditional variables (those with more than one percent missing) and all conditional variables were determined by a CHAID analysis based on the key demographic variables that were logically imputed and all imputed variables that had less than one percent missing. After all variables were imputed, consistency checks were applied to the entire faculty data file to ensure that the imputed values did not conflict with other questionnaire items, observed or imputed. This process involved reviewing all of the logical imputation and editing rules as well.

42 Institution type consisted of a cross-classification of control (public versus private not-for-profit) and degree type (doctoral, master’s, baccalaureate, associate’s, and other).
43 Faculty type (stratum) is based on faculty demographics, such as gender, race/ethnicity, and employment status.

5.4.3 Imputation of Institution Data

The imputation process for the missing data from the institution questionnaire involved similar steps to those used for imputation of the faculty data. The missing data for variables were imputed using the weighted sequential hot-deck method. Analogous to the imputation process for the faculty data, the variables were partitioned into conditional and unconditional groups. The unconditional variables were sorted by percent missing and then imputed in order from the lowest percent missing to the highest. The conditional group was partitioned into three subgroups based on the level of conditionality for each variable, and then imputed in that order. The imputation class for both unconditional and conditional variables consisted of the institution sampling stratum, and the sorting variables included the number of full-time and part-time faculty members.

5.4.4 Evaluation of Imputations

A common measure for determining whether an imputation method produces acceptable results is the similarity of the before- and after-imputation distributions within imputation classes. For evaluation of the imputation results, distributions were considered similar when absolute differences were less than 5 percent, where the absolute difference was calculated by comparing the before- and after-imputation weighted frequencies. If absolute differences were greater than 5 percent, the unweighted distributions were examined to see whether the large differences were due to small imputation cells. When possible, such cases were evaluated and resolved by collapsing neighboring imputation classes. The before- and after-imputation distributions of several key variables are presented in table 39 for the faculty data and table 40 for the institution data. For more information regarding the bias due to item nonresponse, refer to appendix I.
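The 5 percent rule amounts to comparing weighted category shares computed from reported values only with shares computed after imputed values are added. A minimal sketch, using invented (value, weight) pairs rather than NSOPF:04 data:

def weighted_shares(pairs):
    # pairs: iterable of (category, weight); returns weighted percent by category
    total = sum(w for _, w in pairs)
    shares = {}
    for cat, w in pairs:
        shares[cat] = shares.get(cat, 0.0) + w
    return {cat: 100.0 * w / total for cat, w in shares.items()}

def flag_large_shifts(before, after, threshold=5.0):
    # Compare before- and after-imputation weighted distributions and flag
    # categories whose absolute difference exceeds the threshold
    # (in percentage points).
    b, a = weighted_shares(before), weighted_shares(after)
    return {cat: (b.get(cat, 0.0), a.get(cat, 0.0))
            for cat in set(b) | set(a)
            if abs(a.get(cat, 0.0) - b.get(cat, 0.0)) > threshold}

# Hypothetical example: tenure status before imputation (respondents only)
# versus after imputation (respondents plus one imputed case).
before = [("tenured", 30.0), ("not tenured", 70.0)]
after = [("tenured", 30.0), ("not tenured", 70.0), ("tenured", 12.0)]
print(flag_large_shifts(before, after))
# both categories shift by 7.5 percentage points and are flagged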



Table 39. Before and after imputation distributions of key faculty questionnaire variables: 2004

                                                  Before imputation          After imputation
Variable description and category                 Number      Percent        Number      Percent

Faculty status
  Total                                           26,050      100.0          26,110      100.0
  No faculty status                                1,670        7.1           1,680        7.2
  Had faculty status                              24,390       92.9          24,430       92.8

Principal activity
  Total                                           26,030      100.0          26,110      100.0
  Teaching                                        18,660       73.2          18,710       73.2
  Research                                         2,470        8.9           2,470        8.9
  Public service                                     260        1.0             260        1.0
  Clinical service                                 1,260        4.6           1,270        4.7
  Administration                                   2,070        7.2           2,070        7.3
  On sabbatical                                      380        1.4             380        1.4
  Other activity                                     940        3.7             950        3.7

Employed full or part time at this institution
  Total                                           26,100      100.0          26,110      100.0
  Full time                                       17,750       62.4          17,750       62.4
  Part time                                        8,350       37.6           8,360       37.6

Rank
  Total                                           26,090      100.0          26,110      100.0
  Not applicable                                     640        2.7             640        2.7
  Professor                                        5,220       18.9           5,220       18.9
  Associate professor                              4,210       14.8           4,210       14.8
  Assistant professor                              4,620       16.1           4,620       16.1
  Instructor                                       5,050       20.5           5,050       20.5
  Lecturer                                         1,230        5.3           1,230        5.3
  Other title                                      5,140       21.7           5,140       21.7

Tenure status
  Total                                           25,930      100.0          26,110      100.0
  Tenured                                          8,390       30.5           8,420       30.5
  On tenure track but not tenured                  3,840       13.4           3,860       13.4
  Not on tenure track                             11,330       47.5          11,430       47.6
  Not tenured-no tenure system                     2,380        8.6           2,390        8.6

Highest degree
  Total                                           26,090      100.0          26,110      100.0
  No degree                                          250        1.1             250        1.1
  Doctorate degree                                12,180       44.5          12,180       44.5
  First-professional degree                        2,010        7.4           2,010        7.4
  Master of fine arts/social work                  1,190        4.6           1,190        4.6
  Other master’s degree                            8,080       32.5           8,090       32.5
  Bachelor’s degree                                1,870        7.8           1,870        7.8
  Associate’s degree or equivalent                   390        1.6             390        1.6
  Certificate/diploma-undergrad program              140        0.5             140        0.5

Household income (range)
  Total                                           21,500      100.0          26,110      100.0
  $1-24,999                                          460        2.4             560        2.3
  25,000-49,999                                    2,520       11.6           2,990       11.4
  50,000-74,999                                    4,690       22.1           5,630       21.8
  75,000-99,999                                    4,500       20.6           5,520       20.9
  100,000-149,999                                  5,460       25.2           6,690       25.5
  150,000-199,999                                  2,110        9.9           2,590       10.0
  200,000-300,000                                  1,330        6.3           1,600        6.1
  More than 300,000                                  430        2.0             530        2.0

Race/ethnicity, Hispanic/Latino
  Total                                           26,030      100.0          26,110      100.0
  Not Hispanic/Latino                             24,320       95.9          24,400       95.9
  Hispanic/Latino                                  1,700        4.1           1,700        4.1

Race, American Indian or Alaska Native
  Total                                           25,590      100.0          26,110      100.0
  Not American Indian/Alaska Native               25,060       98.2          25,570       98.2
  American Indian/Alaska Native                      530        1.8             540        1.9

Race, Asian
  Total                                           25,590      100.0          26,110      100.0
  Not Asian                                       24,030       94.0          24,480       93.8
  Asian                                            1,560        6.0           1,630        6.2

Race, Black or African American
  Total                                           25,590      100.0          26,110      100.0
  Not Black/African American                      23,450       93.7          23,940       93.8
  Black/African American                           2,140        6.3           2,170        6.2

Race, Native Hawaiian or other Pacific Islander
  Total                                           25,590      100.0          26,110      100.0
  Not Native Hawaiian/Pacific Islander            25,500       99.7          26,020       99.7
  Native Hawaiian/Pacific Islander                    90        0.3              90        0.3

Race, White
  Total                                           25,590      100.0          26,110      100.0
  Not White                                        3,670       12.2           3,780       12.3
  White                                           21,920       87.9          22,330       87.7

NOTE: Numbers rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

5.5 Derived Variables

For NSOPF:04, a total of 45 institution-level and 130 faculty-level derived variables were constructed to simplify access to standard queries useful to analysts, as well as to enhance substantive analysis. Since research questions often require independent or control variables, this set of derived variables was added to the faculty data files. The 45 institution-level derived variables were also added to the institution data files. Multiple sources of data were used to create institution-derived variables, including selected 2000–01 and 1997–98 IPEDS surveys, the Carnegie classification system, and NSOPF:04 sampling information.


Table 40. Before and after imputation distributions of key institution questionnaire variables: 2004

                                                  Before imputation          After imputation
Variable description and category                 Number      Percent        Number      Percent

Full-time benefit: medical insurance
  Total                                           900         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             860          95.8          880          95.9
  Some                                             40           3.7           40           3.7
  None                                              #           0.1            #           0.1

Full-time benefit: dental insurance
  Total                                           900         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             820          88.1          830          88.2
  Some                                             40           4.3           40           4.2
  None                                             40           7.2           40           7.1

Full-time benefit: disability insurance
  Total                                           890         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             790          86.8          810          86.7
  Some                                             60           6.8           60           6.8
  None                                             40           5.9           40           6.2

Full-time benefit: life insurance
  Total                                           900         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             810          92.2          840          92.3
  Some                                             50           4.4           50           4.4
  None                                             30           2.9           30           2.9

Full-time benefit: child care
  Total                                           880         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             200          16.2          210          16.2
  Some                                             50           3.1           60           3.2
  None                                            620          80.3          650          80.2

Full-time benefit: retiree medical insurance
  Total                                           860         100.0          920         100.0
  Not Applicable                                    #           0.5            #           0.4
  All                                             470          50.6          520          50.4
  Some                                            210          21.6          220          21.0
  None                                            180          27.4          190          28.2

Full-time benefit: cafeteria-style plan
  Total                                           880         100.0          920         100.0
  Not Applicable                                    #           0.4            #           0.4
  All                                             260          29.4          270          29.1
  Some                                             20           1.8           20           1.8
  None                                            600          68.4          630          68.7

Full-time benefit: wellness program
  Total                                           880         100.0          920         100.0
  Not Applicable                                    #           0.4           10           0.4
  All                                             570          65.2          600          65.2
  Some                                             30           3.8           30           3.9
  None                                            270          30.6          290          30.6

Part-time benefit: medical insurance
  Total                                           900         100.0          920         100.0
  Not Applicable                                   10           0.4           10           0.4
  All                                              80           8.1           80           8.1
  Some                                            390          34.4          400          34.4
  None                                            420          57.1          430          57.2

Part-time benefit: dental insurance
  Total                                           900         100.0          920         100.0
  Not Applicable                                   10           0.4           10           0.4
  All                                              70           7.9           70           7.8
  Some                                            330          30.0          340          29.8
  None                                            490          61.8          500          62.1

# Rounds to zero.
NOTE: Institution counts are rounded to the nearest 10 to protect the confidentiality of faculty and institutions. However, percentages cited are based on the original unrounded numbers. Detail may not sum to totals due to rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Chapter 6 Weighting and Variance Estimation

Three sets of analysis weights were calculated for this administration of the 2004 National Study of Postsecondary Faculty (NSOPF:04), details of which are provided in this section. First, a set of analysis weights was calculated for institutions responding to the institution survey. Next, analysis weights were constructed for responding faculty, which reflected the selection probabilities of institutions providing faculty lists and selection of faculty members within sample institutions. In addition, a set of contextual weights was calculated to use when linking faculty and institution survey data. These analysis weights were constructed as the product of corresponding sampling weights and adjustment factors for frame multiplicity, nonresponse, and poststratification to known control totals. As detailed in the following sections, each component of the final analysis weights represents either the inverse of a selection probability or a weight adjustment to reduce bias.

The institution analysis weights were computed as the product of the following five weight components and adjustment factors:

(1) institution sampling weight (WT1);

(2) institution multiplicity adjustment factor (WT2);

(3) institution nonresponse adjustment factor (WT3);

(4) institution poststratification adjustment factor (WT4); and

(5) institution ratio adjustment factor (WT5).

In order to compute the analysis weights for faculty, first a set of primary sampling unit (PSU) weights were created for institutions providing faculty lists. These interim weights, which are of no analytical utility, were only used as component weights for construction of the final analysis weights for faculty members. Ultimately, the faculty analysis weights were computed as the product of the following nine weight components and adjustment factors:

(1) institution sampling weight (WT1);

(2) institution multiplicity adjustment factor (WT2);

(3) institution nonresponse adjustment (WT3)44;

(4) institution poststratification adjustment factor (WT4);

(5) faculty sampling weight (WT5);

(6) faculty multiplicity adjustment factor (WT6);

(7) faculty unknown eligibility adjustment factor (WT7);

(8) faculty nonresponse adjustment factor (WT8); and

(9) faculty poststratification adjustment factor (WT9).

44 Note that separate sets of nonresponse and poststratification adjustment factors (WT3 and WT4) were constructed here, different from those calculated above, because the set of institutions providing faculty lists was not the same as the set responding to the institution questionnaire.


Analogous to the calculation of analysis weights for the faculty, a set of contextual weights was constructed for the subset of faculty whose corresponding institutions had responded to the institution survey. Table 41 summarizes the distribution of institutions providing faculty lists and responding to the institution questionnaire by sampling stratum.

Table 41. Counts of sampled, eligible, and participating institutions, by institution type: 2004

Institution type                        Sampled        Eligible       Responded:      Responded:
                                        institutions   institutions   faculty list    questionnaire
Total                                   1,080          1,070          980             920
Public doctoral                           190            190          180             170
Public master’s                           120            120          100             110
Public bachelor’s                          30             30           30              30
Public associate’s                        340            330          290             290
Public other                               10             10           10              10
Private not-for-profit doctoral           110            110          100              90
Private not-for-profit master’s            80             80           80              70
Private not-for-profit bachelor’s         130            130          120             110
Private not-for-profit associate’s         10             10           10              10
Private not-for-profit other               60             60           60              50
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

RTI’s weighting software, GEM (Generalized Exponential Modeling) (Folsom and Singh 2000), was used to calculate all weight adjustment factors. Taking advantage of an iterative proportional fitting algorithm and the logit method, GEM provides a comprehensive weighting program that can use a large number of predictor variables to create a more balanced set of weights while automatically curtailing extreme weights that can reduce the efficiency of weighted estimates. For more details on the GEM adjustment procedure, see appendix L. The following sections provide details of the steps taken to construct the resulting weights.

6.1 Institution Weights

The institution sampling frame for the NSOPF:04 included a total of 3,380 eligible units, detailed composition of which is provided in section 2.1.1. Reflecting the probability proportional to size scheme of sample selection, the probability of selection for institution i in stratum r was calculated by:

π_ri = (n_r × S_ri) / S_r+   for non-certainty selections
     = 1                     for certainty selections

where:

n_r  = sample size for stratum r,
S_ri = composite measure of size for institution i in stratum r, and
S_r+ = composite measure of size for all institutions in stratum r.


The initial sample consisted of 1,220 institutions. However, this sample was reduced to a subsample of institutions, since a smaller sample was deemed adequate to secure all precision requirements of NSOPF:04. Therefore, the sampling weight for institution i in stratum r was calculated as a function of its initial and subsequent selection probabilities. With R_r representing the subsampling rate in stratum r, the sampling weight for the i-th institution in that stratum was calculated by:

WT1_ri = (1/π_ri) × (1/R_r)

It should be noted that, during the sample refreshing step, institutions were added to the sample, resulting in a total sample of 1,080 institutions for NSOPF:04.
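As a worked illustration of the two formulas above, the short sketch below combines a selection probability and a subsampling rate into a sampling weight; the stratum figures are invented and are not NSOPF:04 values.

# Hypothetical stratum: 50 institutions sampled, a composite size measure of
# 500 for institution i out of a stratum total of 100,000, and a subsampling
# rate of 0.9 applied when the initial sample was reduced.
n_r, S_ri, S_rplus, R_r = 50, 500.0, 100_000.0, 0.9

pi_ri = min(1.0, n_r * S_ri / S_rplus)   # selection probability (capped at 1 for certainty units)
WT1 = (1.0 / pi_ri) * (1.0 / R_r)        # sampling weight adjusted for subsampling

print(round(pi_ri, 3), round(WT1, 2))    # 0.25 4.44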

6.1.1 Adjustment for Institution Multiplicity

During the institution recruitment and faculty list sampling stages, a number of institutions were identified that had two or more records listed on the Integrated Postsecondary Education Data System (IPEDS). In some cases this was caused by institutions that had recently merged, while in other cases the sample institution had sent a single faculty list covering multiple campuses. For sampling purposes, combined faculty lists that could not be separated were treated as merged institutions and identified under a single IPEDS ID for purposes of tracking survey results.

For institutions with more than one chance of selection, a multiplicity adjustment factor was calculated by estimating, as if the selections were independent, the probability that each record could be selected. Consequently, when an institution had n chances of selection, its probability of selection was calculated by:

1 - [(1 - π_r1) × (1 - π_r2) × ... × (1 - π_rn)]

Next, a multiplicity adjustment factor for the i-th sample institution was calculated by:

WT2_i = 1 / {[1 - (1 - π_r1) × (1 - π_r2) × ... × (1 - π_rn)] × WT1_i}

If the given institution did not require such adjustment, its multiplicity adjustment factor was set to unity. This way, the product of WT1 and WT2 equals the reciprocal of the resulting multiple chance of selection for the institutions with positive multiplicity, and equals WT1 for all other institutions.
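The sketch below works through the multiplicity adjustment for a hypothetical institution listed under two IPEDS records; the selection probabilities are invented for illustration.

import math

# Hypothetical institution with two IPEDS records and therefore two chances
# of selection; pi lists the per-record selection probabilities and WT1 is
# the sampling weight attached to the record actually selected.
pi = [0.25, 0.10]
WT1 = 1.0 / 0.25

combined = 1.0 - math.prod(1.0 - p for p in pi)   # overall chance of selection
WT2 = (1.0 / combined) / WT1                      # multiplicity adjustment factor

print(round(combined, 3), round(WT2, 3), round(WT1 * WT2, 3))
# 0.325 0.769 3.077 -> WT1 x WT2 equals the reciprocal of the combined probability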

6.1.2 Nonresponse Adjustment

For calculating the analysis weights for institutions responding to the institution questionnaire, an institution (questionnaire) level nonresponse adjustment factor (WT3) was constructed using the product of the institution sampling weights (adjusted for multiplicity) and


the faculty counts from the frame.45 For this purpose, the institutional respondent definition provided in section 3.1.1 was used to identify the institution subset belonging to the denominator of this adjustment factor. The resulting adjustment factors, which were calculated using GEM within cells defined by the 10 sampling strata and region, aimed to reduce or eliminate nonresponse bias in survey estimates. Construction of the nonresponse adjustment cells was based on variables that were deemed to be predictive of response status and available for both respondents and nonrespondents.

45 The NSOPF:04 sample of institutions was selected using probabilities proportional to the number of faculty and instructional staff in each institution. Consequently, calculation of the analysis weights included multiplication of sampling weights (adjusted for multiplicity) by the faculty counts within each of the sampling strata. This means that a subset of institutions, particularly those with small sampling weights (such as certainty institutions), could end up with weights that are less than one.

6.1.3 Poststratification Adjustment

A set of poststratification adjustment factors (WT4) was calculated for the 920 institutions responding to the questionnaire using the GEM program. Specifically, nonresponse-adjusted weights for these institutions were ratio-adjusted to the counts of institutions obtained from the sampling frame. Moreover, an additional adjustment factor was calculated to ensure that weighted counts of faculty obtained from the institution survey data would coincide with those obtained from the faculty survey data. As detailed in the next section, the final analysis weights for faculty included ratio adjustments to counts of faculty obtained from the Employees by Assigned Position Survey (EAP) conducted in the Winter 2003–04 IPEDS data collection cycle. In order to achieve the needed concurrence between the weighted estimates obtained from the institution and faculty surveys, the poststratified weights of the 920 institutions were ratio adjusted to the corresponding weighted totals from the faculty data. With this last adjustment factor computed (WT5), the final analysis weight for each responding institution (WTB00) was calculated by:

WTB00 = WT1 × WT2 × WT3 × WT4 × WT5

6.2 Faculty Weights

The final analysis weights for faculty were constructed as the product of the final institution weights for the 980 institutions that provided faculty lists (PSU weights), the inverse of the selection probabilities for faculty, and a series of adjustment factors at the faculty level. The needed PSU-level weights, which are different from those calculated above for the 920 institutions responding to the institution questionnaire, were calculated by calibrating the product of the institution sampling weights (adjusted for multiplicity) and the faculty frame counts to the institution counts within each of the sampling strata. Note that since a minimum weighted response rate of 85 percent was secured overall and within each of the sampling strata for institutions providing faculty lists, a nonresponse adjustment factor was not calculated for these institutions. Operationally, these institutions were assigned a nonresponse adjustment factor of unity, i.e., WT3 = 1.

6.2.1 Selection Probability for Faculty

The overall faculty sampling strata were defined as the institution sampling strata crossed with the faculty strata within institutions. The sample faculty members were systematically



selected from the faculty lists at institution-specific rates that were inversely proportional to the institution’s probability of selection, as dictated by the sample design. That is, the overall stratum sampling rate divided by the institution’s probability of selection:

f_s|i = f_s / π_ri

where f_s represented the overall faculty sampling rate and π_ri represented the institution’s probability of selection. The sampling weights (WT5) for each of the 35,630 sample faculty members were calculated as the reciprocal of these institution-specific faculty sampling rates.

6.2.2 Adjustment for Faculty Multiplicity

Faculty members who worked at more than one eligible institution during the 2003–04 academic year had multiple chances of being selected, since they could have been selected from any of the eligible institutions at which they worked. When this was the case, the resulting multiplicity was adjusted for by dividing the sampling weight of the given faculty member by the number of eligible institutions at which he or she worked. Specifically, the faculty multiplicity weight adjustment factor was defined as WT6 = 1/M, where M is the multiplicity, or the number of eligible institutions at which the sample faculty member worked, based on the interview data.

6.2.3 Adjustment for Unknown Eligibility Status

For nonresponding faculty members whom project staff were unable to contact, the final eligibility status could not be determined. These faculty members were treated as eligible, and their weights were adjusted to compensate for the small portion of faculty members who were actually ineligible. These weight adjustment factors (WT7), which were calculated within cells defined by a cross-classification of institution and faculty types, represented the estimated eligibility rates among faculty members with known eligibility status. For faculty members known to be eligible, the weight adjustment factor was set to one.
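Within an adjustment cell, this amounts to scaling the weights of unknown-status cases by the eligibility rate observed among cases whose status is known, as in the minimal sketch below with invented weighted counts.

# Hypothetical adjustment cell: weighted counts of sample faculty with known
# eligibility status, plus cases whose status could not be determined.
eligible_wt, ineligible_wt = 950.0, 50.0

# WT7: estimated eligibility rate among cases with known status; applied to
# unknown-status cases, and set to 1.0 for cases known to be eligible.
wt7_unknown = eligible_wt / (eligible_wt + ineligible_wt)
wt7_known_eligible = 1.0

print(wt7_unknown, wt7_known_eligible)   # 0.95 1.0 -> an unknown-status weight of 40.0 becomes 38.0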

6.2.4 Nonresponse Adjustments

As reported earlier, faculty-level response rates were less than 85 percent, both overall and within a number of sampling strata. Subsequent to a nonresponse bias analysis, details of which are provided in appendix I, adjustment factors were calculated within cells indexed by a cross-classification of the faculty and institution strata and length of time to respond. Again, the weighting program, GEM, was used to create the needed nonresponse adjustment factor (WT8) for each of the 26,110 responding faculty members.

6.2.5 Poststratification/Raking Adjustment

To ensure population coverage, nonresponse adjusted weights were further adjusted to match published faculty totals. Specifically, these weights were raked along two dimensions to control totals that were constructed using the Winter 2003-04 Employees by Assigned Position Survey (EAP:03). This source was used to obtain the total number of full- and part-time faculty members by institution type. Moreover, the NSOPF:04 sampling frame was used to generate the distribution of faculty members by race/ethnicity and gender, detailed construction of which is


provided in section 2.1.4 and appendix A. The resulting two raking dimensions are summarized in tables 42 and 43. The raking adjustment factors (WT9) were calculated using GEM.

Table 42. Faculty counts obtained from EAP:03, by institution type and employment status

Employment status   Total       1         2         3        4         5        6         7        8        9       10
Total               1,185,661   306,119   143,540   20,459   355,577   13,473   138,161   99,021   66,803   7,392   35,116
Full-time           662,407     238,168   90,183    11,213   116,491   6,514    94,688    44,198   41,709   3,881   15,362
Part-time           523,254     67,951    53,357    9,246    239,086   6,959    43,473    54,823   25,094   3,511   19,754
NOTE: Institution types are defined as follows: 1 = public doctoral, 2 = public master’s, 3 = public bachelor’s, 4 = public associate’s, 5 = public other, 6 = private not-for-profit doctoral, 7 = private not-for-profit master’s, 8 = private not-for-profit bachelor’s, 9 = private not-for-profit associate’s, 10 = private not-for-profit other.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), 2003 Employee by Assigned Position (EAP).

Table 43. Faculty counts obtained from EAP:03 and NSOPF:04 sampling frame, by institution type, race/ethnicity, and gender

Race/gender        Total       1         2         3        4         5        6         7        8        9       10
Asian              80,704      35,993    8,804     980      11,968    567      14,203    3,583    2,430    184     1,992
Asian male         53,201      25,463    5,843     627      6,123     371      9,772     2,241    1,373    97      1,291
Asian female       27,503      10,530    2,961     353      5,845     196      4,431     1,342    1,057    87      701
Black              68,790      11,908    10,974    1,920    25,133    882      6,621     4,430    4,614    426     1,883
Black male         33,699      6,126     5,557     990      11,164    396      3,625     2,205    2,371    197     1,068
Black female       35,091      5,782     5,416     930      13,969    486      2,996     2,224    2,243    229     816
Hispanic           41,833      9,564     5,118     557      17,405    255      4,124     2,414    1,397    173     825
Hispanic male      23,177      5,654     2,761     317      9,284     133      2,451     1,243    717      103     513
Hispanic female    18,656      3,910     2,358     239      8,121     122      1,673     1,171    680      70      312
Other              994,330     248,656   118,643   17,002   301,071   11,769   113,212   88,594   58,360   6,608   30,416
Other male         575,155     158,016   66,289    9,751    154,787   6,751    74,456    49,303   33,395   3,450   18,956
Other female       419,175     90,639    52,354    7,251    146,284   5,018    38,756    39,291   24,966   3,158   11,460
Faculty            1,185,661   306,119   143,540   20,459   355,577   13,473   138,161   99,021   66,803   7,392   35,116
Male               685,238     195,259   80,451    11,686   181,358   7,652    90,305    54,994   37,857   3,848   21,828
Female             500,423     110,860   63,089    8,773    174,219   5,821    47,856    44,027   28,946   3,544   13,288
NOTE: Institution types are defined as follows: 1 = public doctoral, 2 = public master’s, 3 = public bachelor’s, 4 = public associate’s, 5 = public other, 6 = private not-for-profit doctoral, 7 = private not-for-profit master’s, 8 = private not-for-profit bachelor’s, 9 = private not-for-profit associate’s, 10 = private not-for-profit other.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), 2003 Employee by Assigned Position (EAP).
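Conceptually, raking to the two dimensions summarized in tables 42 and 43 proceeds as in the generic iterative proportional fitting sketch below. The NSOPF:04 adjustment factors were produced with GEM rather than this simple algorithm, and the records and control totals shown are hypothetical.

# Generic two-dimension raking (iterative proportional fitting) sketch.
# Each record carries a nonresponse-adjusted weight plus its category on the
# two raking dimensions; controls give the known population totals.
def rake(records, controls_a, controls_b, dim_a, dim_b, iters=25):
    for _ in range(iters):
        for dim, controls in ((dim_a, controls_a), (dim_b, controls_b)):
            sums = {}
            for r in records:
                sums[r[dim]] = sums.get(r[dim], 0.0) + r["wt"]
            for r in records:                 # scale weights so each margin hits its control
                r["wt"] *= controls[r[dim]] / sums[r[dim]]
    return records

records = [{"wt": 10.0, "emp": "full", "sex": "f"}, {"wt": 10.0, "emp": "full", "sex": "m"},
           {"wt": 10.0, "emp": "part", "sex": "f"}, {"wt": 10.0, "emp": "part", "sex": "m"}]
raked = rake(records, {"full": 30.0, "part": 20.0}, {"f": 22.0, "m": 28.0}, "emp", "sex")
print([round(r["wt"], 2) for r in raked])     # -> [13.2, 16.8, 8.8, 11.2]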

Finally, the eligibility definitions for NSOPF:04 include non-faculty personnel who provide instruction but do not have teaching as their principal activity. Since there were no published counts for such faculty members that could be used for weighting, nonresponse-adjusted weights were used to develop an estimate for this small subgroup. For this purpose, all 560 such non-faculty members retained their nonresponse-adjusted weights—totaling 26,803—as their final weights and were not included in the above raking process.46 That is, a value of one was assigned to the raking adjustment factor for these respondents when calculating their final analysis weights. In general, however, the final analysis weights (WTA00) were computed as the product of the institution and the faculty component weights by:

WTA00 = WT1 × WT2 × WT3 × WT4 × WT5 × WT6 × WT7 × WT8 × WT9

46 This means that the sum of the weights for all 26,110 respondents, which includes the 560 non-faculty members, is 1,211,744 = 1,185,661 + 26,083.


6.3 Variance Estimation

The 2004 National Study of Postsecondary Faculty (NSOPF:04) sampling design was a stratified two-stage design. A stratified sample of postsecondary institutions was selected with probabilities proportional to a composite measure of size at the first stage, and a stratified systematic sample of faculty and instructional staff was selected from sample institutions providing lists at the second stage. Because of this complex sampling design, statistical analyses should be conducted using software packages that properly account for the employed survey design through use of survey weights.

Most commonly used statistical procedures assume that data are obtained from a simple random sample; that is, that the observations are independent and identically distributed. When the data have been collected using a complex sampling design, the simple random sampling assumption usually leads to underestimating the sampling variance, which in turn leads to artificially narrow confidence intervals and liberal hypothesis test results; that is, rejecting the null hypothesis when it is true more often than indicated by the nominal Type I error level (Carlson et al. 1993).

Statistical strategies that have been developed to address this issue include: first-order Taylor series expansion of the variance equation; balanced repeated replication; and the Jackknife approach (Wolter 1985). Software packages that have been developed for analyzing complex sample survey data include SUDAAN, WesVar, Stata, and SAS. SUDAAN is a commercial product developed by RTI. Further information can be obtained from the website http://www.rti.org/sudaan. WesVar is a product of Westat, Inc., for which additional information can be obtained from the website http://www.westat.com/wesvar. Stata is a product of StataCorp LP; additional information about Stata can be found at the following website: http://www.stata.com. SAS information may be found on the SAS corporate website: http://www.sas.com. Also, the National Center for Education Statistics (NCES) has developed a software tool called the Data Analysis System (DAS) for analysis of complex survey data. Information about DAS is available from the website http://nces.ed.gov/das.

The variance estimation strategy chosen for NSOPF:04 has aimed to satisfy the following requirements and design features:

• variance reduction due to stratification at all stages of sampling;

• unequal weighting effects due to nonresponse adjustment and poststratification;

• variance inflation due to clustering;

• estimation of linear and nonlinear statistics such as quantiles; and

• variance reduction due to finite population corrections at the PSU (institution) stage of sampling and the high sampling rates in certain strata.

Commonly applied bootstrap variance estimation techniques satisfy the first four requirements. To meet the last requirement, however, the methodology developed by Kaufman was applied (Kaufman 2004). This methodology incorporates the finite population correction factors at both stages of sampling. However, for NSOPF:04, application of this method reflected the finite population correction factor at the first stage only where sampling fractions were often high. At the second stage, where the sampling fractions were generally low, the finite population correction factor was set to 1.00.


The Kaufman methodology was used to develop a vector of 64 bootstrap sample weights that are included on the analysis file, along with the full sample analysis weights. Replicate weights were set to zero for units not selected in a particular bootstrap sample while weights for other units were inflated for the bootstrap subsampling. Note that analogous to the full sample weights, these replicate weights were also poststratified to the same set of control totals for calibration.

The number of replicate weights was set at 64 based on an empirical investigation of the behavior of variance estimates as the number of replicates increased. This investigation showed that the stability of variance estimates improved with increasing numbers of replicates and became fairly stable for most estimates when between 50 and 55 replicate weights were used. Also, a similar process of generating replicate weights was used for the institution file except that all procedures relating to the second stage of sampling were omitted.

The vector of B replicate weights allows for computing additional estimates for the sole purpose of estimating a variance. With the 64 sets of replicate weights, the variance of any statistic, θ̂, can be estimated by separately calculating the statistic of interest from each replicate and then using the variability among the resulting estimates to calculate the variance of the given statistic by:

var(θ̂) = (1/B) × Σ_b=1..64 (θ̂*_b - θ̂)²

where θ̂*_b is the estimate based on the b-th set of replicate weights and B = 64.
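The replicate-variance formula can be applied directly once the statistic of interest has been computed under each set of replicate weights. The sketch below does this for a weighted mean, using made-up data and only four replicates instead of the 64 provided on the NSOPF:04 files.

# Variance of a weighted mean from replicate weights: recompute the estimate
# under each replicate, then average the squared deviations from the
# full-sample estimate. Data and replicate weights below are made up.
y = [52000.0, 61000.0, 47000.0, 70000.0]                      # analysis variable
w_full = [30.0, 20.0, 25.0, 25.0]                             # full-sample weights
w_reps = [[0.0, 40.0, 25.0, 35.0], [60.0, 0.0, 25.0, 15.0],
          [30.0, 20.0, 0.0, 50.0], [30.0, 20.0, 50.0, 0.0]]   # replicate weights

def wmean(y, w):
    return sum(yi * wi for yi, wi in zip(y, w)) / sum(w)

theta = wmean(y, w_full)
var = sum((wmean(y, w) - theta) ** 2 for w in w_reps) / len(w_reps)
print(round(theta, 1), round(var ** 0.5, 1))                  # estimate and its standard error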

Once the replicate weights are provided, this estimate can be produced by survey software packages such as SUDAAN, Stata, and WesVar. In each case, the analyst should specify the full study weight and the replicate weights appropriate for the given analysis. Below is an example of generic SUDAAN code for producing point estimates and their associated standard errors using replicate weights that reflect the reduction in variance due to the finite population correction (fpc) at the institution stage of sampling. The symbols /* and */ in the code indicate the beginning and end of a comment. Note that the dataset does not need to be sorted.

proc descript data=/* insert filename */ design=brr;
  weight STUDYWEIGHT;
  repwgt BRRWT01-BRRWT64;
  var /* insert variables */;
  subpopn /* insert domain of interest if analysis domain is a subset of faculty members */;
  print nsum mean semean / style=nchs;
run;

Again, it should be noted that there are three sets of study (analysis) weights and their corresponding replicate weights. These weights are:

1. Institution weights
   Analysis: WTB00
   Replicate: WTB01-WTB64

2. Faculty weights
   Analysis: WTA00
   Replicate: WTA01-WTA64

3. Contextual weights
   Analysis: WTC00
   Replicate: WTC01-WTC64

Should analysts decide to use the Taylor Series Linearization method for approximating standard errors of estimates, the design structure (level of clustering) should be specified by identifying the analysis strata and primary sampling units (PSU). Below is an example of generic SUDAAN code to produce estimates and standard errors using Taylor Series approximation. The symbols /* and */ in the code indicate the beginning and end of a comment. Note that the dataset must be sorted by analysis strata and analysis PSUs.47

proc descript data=/* insert filename */ design=wr;
  nest analysis stratum analysis PSU;
  weight STUDYWEIGHT;
  var /* insert variables */;
  subpopn /* insert domain of interest if analysis domain is a subset of faculty members */;
  print nsum mean semean / style=nchs;
run;

For each of the three types of analyses—institution, faculty, and contextual (merged)—specific design variables, which are generically named analysis stratum and analysis PSU in the above code, need to be identified. These variables are available on corresponding final datasets as described below.

• Institution design variables
  Analysis PSU: IPSU
  Analysis Stratum: ISTRATUM

• Faculty design variables
  Analysis PSU: FPSU
  Analysis Stratum: FSTRATUM

• Contextual design variables
  Analysis PSU: FPSU
  Analysis Stratum: FSTRATUM

47 Please note that DAS uses the Taylor linearization approach for variance estimation and for calculation of DEFF and DEFT.


6.4 Design Effects and Standard Errors

Table 44 provides estimates of design effects for selected faculty data. These estimates, which consist of the ratio of the variance of estimates under the employed design to that under simple random sampling (DEFF)48 and the square root of this ratio (DEFT),49 are typically used as measures of the efficiency of a sample design. The larger the design effect, the larger the variance of the estimate relative to what would have been obtained under simple random sampling where all units have the same chance of selection.

The standard errors were calculated using SUDAAN with the replicate weights that were calculated for these data, details of which are provided in section 6.3. The average design effect for the listed key faculty estimates in table 44 was 1.88. This indicates that, because of differential sampling and weight adjustments, the variance of an estimate under the employed design is, on average, 1.88 times the variance that would have been obtained from a simple random sample of the same size with a 100 percent response rate. Equivalently, the original sample size should be divided by 1.88 to obtain the effective sample size under the employed design. More detailed tables are available in appendix M.
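Given the design-based standard error and the standard error expected under simple random sampling, DEFF, DEFT, and the effective sample size follow directly. The sketch below uses the Q12 (tenure) row of table 44 below; because the published DEFF of 2.24 was computed from unrounded standard errors, the rounded inputs reproduce it only approximately.

# DEFF, DEFT, and effective sample size for the Q12 (tenure) row of table 44.
n = 26_110
se_design, se_srs = 0.45, 0.30

deff = (se_design / se_srs) ** 2        # variance ratio: design vs. simple random sampling
deft = deff ** 0.5                      # root design effect
n_effective = n / deff                  # equivalent simple-random-sample size

print(round(deff, 2), round(deft, 2), round(n_effective))   # 2.25 1.5 11604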

Table 44. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics: 2004

Item                                                 Number1   Percent     Design    SRS      DEFF   DEFT
                                                               estimate    SE        SE
Q1: Percent with instructional duties                26,110    97.0        0.14      0.11     1.69   1.30
Q2: Percent with some credit instruction             26,110    90.2        0.32      0.18     2.94   1.71
Q3: Percent who had faculty status                   26,110    92.2        0.26      0.17     2.39   1.54
Q4: Percent whose principal activity was teaching    26,110    73.8        0.34      0.27     1.58   1.26
Q4: Percent whose principal activity was research    26,110     8.8        0.18      0.18     1.06   1.03
Q6: Percent part-time is primary employment           8,360    18.0        0.34      0.24     1.99   1.41
Q8: Percent part-time preferred full-time             8,360    28.0        0.38      0.28     1.83   1.35
Q10: Percent with academic rank of professor         26,110    21.2        0.31      0.25     1.53   1.24
Q12: Percent with tenure                             26,110    39.8        0.45      0.30     2.24   1.50
Q15: Percent nonunion union not available            20,880    11.7        0.27      0.20     1.91   1.38
Q19A1: Percent with other job that is full-time      26,110    11.5        0.25      0.20     1.62   1.27
Q35A1: Percent teaching a single credit class        26,110    73.2        0.37      0.27     1.87   1.37
Q37F1: Percent with no TA in first class             21,460    93.7        0.19      0.15     1.66   1.29
Q37C2: Percent meet > 3 hours for second class       15,280    27.2        0.34      0.28     1.56   1.25
Q39: Percent with website for instruction            26,110    84.9        0.30      0.24     1.55   1.24
Q62A: Percent not “very satisfied” workload          26,110    31.5        0.62      0.38     2.69   1.64
Q64: Percent retired from another position           26,110    57.6        0.33      0.31     1.20   1.10
Q68: Percent paid by the course exclude salary        6,740    34.1        0.62      0.52     1.43   1.20
Q77: Percent marital status single                   26,110    34.8        0.58      0.52     1.22   1.11
Q77: Percent marital status married                  26,110    69.0        0.54      0.32     2.88   1.70
Q81: Percent United States citizen                   26,110    36.9        0.97      0.59     2.71   1.65
1 This number reflects the total number of eligible respondents for each item, rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

48 The design effect (DEFF) is the variance estimate of an estimated parameter under the survey design divided by the variance estimate of an estimated parameter for a simple random sample of the same size. 49 The root design effect (DEFT) is the square root of the design effect (DEFF).


References

Abraham, S.Y., Steiger, D.M., Montgomery, M., Kuhr, B.D., Tourangeau, B., Montgomery, B., and Chattopadhyay, M. (2002). 1999 National Study of Postsecondary Faculty (NSOPF:99) Methodology Report (NCES 2002–154). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Berger, A., Kirshtein, R., Zhang, Y., and Carter, K. (2002). Developing the 2004 Faculty Survey: Themes from the Literature on Postsecondary Education. Washington, DC: American Institutes for Research.

Carlson, B.L., Johnson, A.E., and Cohen, S.B. (1993). An Evaluation of the Use of Personal Computers for Variance Estimation with Complex Survey Data. Journal of Official Statistics, 9(4):795-814.

Chen, P., Penne, M.A., and Singh, A.C. (2000). Experience with the Generalized Exponential Model of Weight Calibration for the National Household Survey on Drug Abuse. American Statistical Association (ASA) Proceedings on Survey Research Methodology Section, 604-609.

Chromy, J.R. (1979). Sequential Sample Selection Methods. Proceedings of the American Statistical Association, Section on Survey Research, pp. 401–406.

Chromy, J.R. (1987). “Design Optimization with Multiple Objectives.” Proceedings of the American Statistical Association Section on Social Statistics, pp. 194-199.

Cohen, S.B. (1997). An Evaluation of Alternative PC-Based Software Packages Developed for the Analysis of Complex Survey Data. The American Statistician, 51(3): 285-292.

Cominole, M., Siegel, P., Dudley, K., Roe, D., and Gilligan, T. (Forthcoming). National Postsecondary Student Aid Study 2004 (NPSAS:04) Methodology Report (NCES 2006–180). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

DeRouvray, C., and Couper, M.P. (2002). Designing a Strategy for Reducing “No Opinion” Responses in Web-Based Surveys. Social Science Computer Review, 20(1):3–9.

Deville, J.C., and Särndal, C.E. (1992). Calibration Estimators in Survey Sampling. Journal of the American Statistical Association (JASA), 87, 376-382.

Education Sciences Reform Act of 2002. (2002). Public Law 107-279, Title I, Section 153. Available: http://www.ed.gov/policy/rschstat/leg/PL107-279.pdf

Folsom, R.E. and Singh, A.C. (2000). The Generalized Exponential Model for Sampling Weight Calibration for Extreme Values, Nonresponse, and Poststratification. Proceedings of the American Statistical Association, Section on Survey Research.

Folsom, R.E., and Witt, M.B. (1994). Testing a New Attrition Nonresponse Method for SIPP. ASA Proceedings on Survey Research Methodology Section, 428-433.


Heuer, R.E., Cahalan, M., Fahimi, M., Curry-Tucker, J.L., Carley-Baxter, L., Curtin, T.R., Hinsdale, M., Jewell, D.M., Kuhr, B.D., and McLean, L. (2004). 2004 National Study of Postsecondary Faculty (NSOPF:04) Field Test Methodology Report (NCES 2004–01). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Kaufman, S. (2004). Using the Bootstrap in a Two-Stage Design when Some Second-Stage Strata have only One Unit Allocated. Proceedings of the American Statistical Association, Section on Survey Research Methods.

Kojaku, L., and Nuñez, A. (1998). Descriptive Summary of 1995-96 Beginning Postsecondary Students, With Profiles of Students Entering 2- and 4-Year Institutions (NCES 1999–030). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

Nettles, M.T., Perna, L.W., and Bradburn, E.M. (2000). Salary, Promotion, and Tenure Status of Minority and Women Faculty in U.S. Colleges and Universities (NCES 2000–173). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

U.S. Department of Education (2001). National Postsecondary Student Aid Study: Student Financial Aid Estimates for 1999–2000 (NCES 2001–209). Washington, DC: National Center for Education Statistics.

U.S. Department of Education (2002). Classification of Instructional Programs: 2000 Edition (NCES 2002–165R). Washington, DC: National Center for Education Statistics.

Wolter, K. (1985). Introduction to Variance Estimation, Springer-Verlag: New York, NY.

Zimbler, L.J. (2001). Background Characteristics, Work Activities, and Compensation of Faculty and Instructional Staff in Postsecondary Institutions: Fall 1998 (NCES 2001–152). Washington, DC: U.S. Department of Education, National Center for Education Statistics.


Appendix A Sampling Details


A.1 Institution Frame Construction

The institution sample selection was based on a probability proportional to size (PPS) methodology, in which each institution was assigned a composite measure of size (MOS) that reflected the number of eligible faculty and instructional staff in each of the following six hierarchical strata, in the given order of inclusion:1

• Hispanic;

• non-Hispanic Black;

• Asian and Pacific Islander;

• full-time female;

• full-time male; and

• all other.

Faculty counts needed for MOS calculations were initially obtained from the Fall Staff Survey Component of the Winter 2001-2002 Integrated Postsecondary Education Data System (IPEDS) Data Collection (Winter:02 IPEDS). However, this source could not provide all of the information necessary to classify faculty members into one of the above sampling strata. For instance, faculty counts were not reported in a number of institutions, while for others, reported counts were not indexed by race and ethnicity. As a result, the missing information had to be imputed in two steps. As detailed in the next section, the first step consisted of imputing unreported (missing) faculty counts, while in the second step, faculty reported as having unknown race/ethnicity or as nonresident aliens were distributed among the known race categories using a special procedure. Subsequent to these two steps, faculty members in each institution were classified into one of the six sampling strata.

A.1.1 Imputation of Missing Faculty Counts

As summarized in table A-1, starting with the 3,379 eligible institutions in the NSOPF:04 universe, the Winter:02 IPEDS provided faculty counts for 3,148 institutions, including counts of faculty with unknown race/ethnicity and those listed as nonresident aliens.2 Of these 3,148 institutions, 59 were main campuses (parents) that reported to IPEDS the total faculty at the main campus as well as those for their branch (child) campuses. Among the 231 institutions that had no reported faculty counts, 80 were children of the 59 parent campuses and the remaining 151 were campuses without any reported faculty counts.

1 It should be noted that since the institution samples for both the National Study of Postsecondary Faculty (NSOPF:04) and the National Postsecondary Student Aid Study (NPSAS:04) were selected jointly, the MOS for each institution was calculated to reflect the number of students in each institution as well. 2 Note that the universe count of 3,379 does not reflect the two institutions that were added after sample selection to refresh the sample.


Table A-1. Composition of NSOPF:04 institutions based on availability of the Winter:02 IPEDS data

Institution type  Frequency
All eligible institutions  3,379
With IPEDS faculty counts  3,148
    Non-parent  3,089
    Parent  59
Without IPEDS faculty counts  231
    Children  80
    Other  151
NOTE: Parent refers to main campuses. Children refers to branch campuses. IPEDS = Integrated Postsecondary Education Data System.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The IPEDS allows institutions to provide combined faculty counts for themselves and their branch campuses. The unreported faculty counts for the 80 child campuses were reallocated from their parents according to the following steps. Here, all child institutions corresponding to a parent institution were included, even if such institutions were not eligible for NSOPF:04.

• For the 75 child institutions whose parents had a Carnegie code (accredited, degree-granting colleges and universities), the total count of faculty for each faculty group was reallocated from the parent institution such that the parent retained twice as many faculty members as each of the children for the given group. This matches the imputation procedure used for IPEDS. (A brief sketch of both reallocation rules follows this list.)

• For the remaining five child institutions whose parents did not have a Carnegie code, the total count of faculty for each group was allocated equally between the parent and its child.
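The sketch below illustrates the two reallocation rules with made-up counts; it is not the production code, it ignores rounding to whole faculty members, and it assumes that a parent's reported total for a faculty group is simply split among the parent and its children.

    # Illustrative sketch of the parent/child reallocation rules described above.
    # total: faculty count reported by the parent for one faculty group
    # n_children: number of child campuses reporting through that parent
    def reallocate(total, n_children, parent_has_carnegie_code=True):
        if parent_has_carnegie_code:
            # Parent retains twice as many faculty as each child: 2x + n_children * x = total.
            child_share = total / (n_children + 2)
            parent_share = 2 * child_share
        else:
            # Total allocated equally between the parent and its child campus(es).
            child_share = total / (n_children + 1)
            parent_share = child_share
        return parent_share, [child_share] * n_children

    print(reallocate(90, 1))         # (60.0, [30.0])
    print(reallocate(90, 1, False))  # (45.0, [45.0])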

There were 151 institutions in the NSOPF universe that were not eligible for IPEDS imputation3 and had no reported faculty counts in the Winter:02 IPEDS. In order to calculate MOS for such institutions, missing counts of faculty members were imputed using a methodology similar to that used to impute all IPEDS data. Specifically, the following steps were taken:

• If data were available from the Fall Staff Survey (IPEDS-S:97 or -S:99), these data were used without any adjustments, with preference given to the more recent data.

• If data were not available from either the IPEDS-S:97 or -S:99, faculty counts were imputed as a function of student counts according to the following steps:

− Using the IPEDS 2000 Fall Enrollment dataset, for each institution the full-time equivalent (FTE) for students was calculated using the following formula:

FTE = Full-time + (1/3) × Part-time

3 Institutions not in-scope for imputation for the Fall Staff Survey of IPEDS included those that were child institutions or had fewer than 15 full-time employees.


− Each institution was assigned to an imputation group based on institution type and within each group the institution with the closest FTE was selected as the donor institution. Subsequently, the missing faculty counts for each subgroup were imputed using the following ratio estimator, in which the function “Integer” indicates the integer part of the resulting number:

Faculty_Missing = Integer[ Faculty_Donor × ( FTE_Missing / FTE_Donor ) ]
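The donor-based imputation just described is sketched below with hypothetical counts; the field names, the two faculty subgroups, and the candidate donor pool are illustrative assumptions, not the actual imputation groups or file layout used for NSOPF:04.

    import math

    # Illustrative sketch of the donor-based ratio imputation described above.
    def impute_faculty(missing_inst, candidate_donors, subgroups):
        # Select the donor with the closest student FTE within the imputation group.
        donor = min(candidate_donors, key=lambda d: abs(d["fte"] - missing_inst["fte"]))
        ratio = missing_inst["fte"] / donor["fte"]
        # Integer part of the ratio estimator for each faculty subgroup.
        return {g: math.floor(donor["faculty"][g] * ratio) for g in subgroups}

    donors = [{"fte": 1200.0, "faculty": {"full_time": 60, "part_time": 45}},
              {"fte": 3000.0, "faculty": {"full_time": 150, "part_time": 90}}]
    print(impute_faculty({"fte": 1500.0}, donors, ["full_time", "part_time"]))
    # {'full_time': 75, 'part_time': 56}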

A.1.2 Imputation of Missing Faculty Stratification Information

The majority of the 3,379 NSOPF:04 institutions included faculty members whose race and ethnicity were reported as nonresident alien or as unknown. These race/ethnicity categories had to be reconciled so that such faculty could be allocated to the six sampling strata. As detailed next, this process was carried out separately for faculty with reported unknown race/ethnicity and for faculty who were reported as nonresident aliens.

A large number of institutions included faculty for whom race/ethnicity was reported as unknown. These faculty were assigned to known race categories available from IPEDS (non-Hispanic Black, Hispanic, Asian/Pacific Islander, American Indian/Alaska Native, and White) using the following steps.

• For each institution, the percentage of faculty with known race/ethnicity was obtained for groups indexed by gender and employment status (part-time and full-time).

• When the reported race/ethnicity counts of faculty within a gender/employment group were at least 50 percent of the total faculty count for that group (including those with unknown race/ethnicity and nonresident aliens), the count of faculty with unknown race/ethnicity was distributed to each of the race categories in proportion to the reported counts within the gender/employment group.

• Conversely, when the reported race/ethnicity counts of faculty within a gender/employment group were less than 50 percent of the total faculty count, faculty with unknown race/ethnicity were distributed to each of the race categories in proportion to the average race distribution for that gender/employment group within classes indexed by level of institution (2- and 4-year) and region. These average group distributions were constructed from institutions that reported race/ethnicity for more than 50 percent of their faculty in the five categories. (A brief sketch of this allocation rule follows the list.)
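The sketch below, referenced in the preceding bullet, shows the 50 percent rule for a single gender/employment group at one institution. The counts and the class-average distribution are made up, and the separate treatment of nonresident aliens (described below) is omitted.

    # Illustrative sketch of allocating faculty with unknown race/ethnicity.
    # counts: reported faculty by known race category for one gender/employment group
    # unknown: count of faculty with unknown race/ethnicity in that group
    # class_avg: average race distribution for the institution's level/region class
    def allocate_unknown(counts, unknown, class_avg):
        known_total = sum(counts.values())
        group_total = known_total + unknown
        if group_total > 0 and known_total / group_total >= 0.5:
            shares = {r: c / known_total for r, c in counts.items()}  # proportional to reported counts
        else:
            shares = class_avg                                        # fall back to the class average
        return {r: counts.get(r, 0) + unknown * shares.get(r, 0.0) for r in shares}

    counts = {"Hispanic": 2, "Black": 5, "Asian/Pacific Islander": 3,
              "American Indian/Alaska Native": 0, "White": 40}
    class_avg = {"Hispanic": 0.020, "Black": 0.050, "Asian/Pacific Islander": 0.050,
                 "American Indian/Alaska Native": 0.002, "White": 0.878}
    print(allocate_unknown(counts, unknown=10, class_avg=class_avg))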

Tables A-2 through A-5 summarize the resulting average distributions and other statistics for each of the 16 institution groups within the four gender/employment groups. In addition to faculty with unknown race/ethnicity, many institutions included counts of nonresident alien faculty for whom race/ethnicity was not reported. In such cases, counts of nonresident alien faculty members were distributed among the known race/ethnicity groups within groups indexed by gender and employment status. For this purpose, the needed distributions for nonresident


aliens were obtained from the 1990 through 1999 Survey of Earned Doctorates (SED) data,4 since the distribution of reported race/ethnicity based on IPEDS is not representative of nonresident alien faculty members.

Table A-2. Race/ethnicity distribution of full-time male faculty and percent unknown race, by level and region of institution: 2002

Columns: Class number; Institution level; OBE region; number of institutions with 0–49 percent of race/ethnicity known; number of institutions with 50–100 percent known; and the percent mean distribution of all faculty by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1 4-year or above 1 8 185 1.6 2.7 4.0 0.2 91.5 2 4-year or above 2 6 436 2.0 5.0 5.0 0.1 88.0 3 4-year or above 3 15 345 2.0 3.8 4.6 0.2 89.5 4 4-year or above 4 8 218 1.7 1.6 3.3 0.9 92.5 5 4-year or above 5 8 461 1.7 10.5 4.5 0.3 83.0 6 4-year or above 6 2 156 4.6 4.8 5.8 1.3 83.4 7 4-year or above 7 1 51 2.2 1.0 3.4 1.5 91.9 8 4-year or above 8 9 259 3.7 3.5 9.3 0.4 83.2 9 At least 2- but less than 4-year 1 4 45 1.5 4.4 1.9 0.2 92.0 10 At least 2- but less than 4-year 2 10 124 2.5 5.5 2.6 0.2 89.3 11 At least 2- but less than 4-year 3 1 156 1.4 5.1 2.6 1.2 89.7 12 At least 2- but less than 4-year 4 1 133 0.9 1.4 0.9 3.1 93.7 13 At least 2- but less than 4-year 5 1 366 1.6 8.9 1.2 0.5 87.8 14 At least 2- but less than 4-year 6 1 129 9.8 3.2 1.8 2.6 82.7 15 At least 2- but less than 4-year 7 1 43 2.3 1.1 0.7 5.6 90.3 16 At least 2- but less than 4-year 8 3 193 7.5 4.6 6.7 1.4 79.9 NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA). SOURCE: U.S. Department of Education, National Center for Education Statistics, 2002 Integrated Postsecondary Education Data System (IPEDS).

Accordingly, in region 1 there were eight 4-year-or-above institutions at which race/ethnicity was known for less than 50 percent of full-time male faculty members, and 185 institutions at which race/ethnicity was known for at least 50 percent of full-time male faculty. The corresponding numbers for at least 2- but less than 4-year institutions in region 1 are 4 and 45, respectively.

4 http://www.nsf.gov/statistics/srvydoctorates/


Table A-3. Race/ethnicity distribution of part-time male faculty and percent unknown race, by level and region of institution: 2002

Columns: Class number; Institution level; OBE region; number of institutions with 0–49 percent of race/ethnicity known; number of institutions with 50–100 percent known; and the percent mean distribution of all faculty by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1 4-year or above 1 41 152 1.4 3.7 2.1 0.2 92.6 2 4-year or above 2 70 372 1.8 6.0 2.5 0.1 89.6 3 4-year or above 3 64 296 1.1 4.6 2.4 0.2 91.7 4 4-year or above 4 56 170 0.7 3.0 1.1 0.9 94.4 5 4-year or above 5 85 384 1.5 13.4 2.2 0.3 82.6 6 4-year or above 6 27 131 6.2 6.6 3.8 1.1 82.4 7 4-year or above 7 12 40 3.1 1.6 1.7 2.0 91.6 8 4-year or above 8 66 202 3.6 4.1 9.5 0.6 82.2 9 At least 2- but less than 4-year 1 18 31 1.9 5.3 1.2 0.1 91.5 10 At least 2- but less than 4-year 2 16 118 2.9 7.5 2.8 0.2 86.7 11 At least 2- but less than 4-year 3 13 144 1.8 6.2 1.6 0.8 89.6 12 At least 2- but less than 4-year 4 28 106 1.7 3.4 0.7 2.3 92.0 13 At least 2- but less than 4-year 5 55 312 1.6 10.6 0.6 0.5 86.8 14 At least 2- but less than 4-year 6 26 104 11.5 4.2 1.4 1.9 81.0 15 At least 2- but less than 4-year 7 10 34 4.9 1.4 0.8 9.6 83.3 16 At least 2- but less than 4-year 8 17 179 6.6 4.5 6.8 1.4 80.8 NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA). SOURCE: U.S. Department of Education, National Center for Education Statistics, 2002 Integrated Postsecondary Education Data System (IPEDS).

Table A-4. Race/ethnicity distribution of full-time female faculty and percent unknown race, by level and region of institution: 2002

Columns: Class number; Institution level; OBE region; number of institutions with 0–49 percent of race/ethnicity known; number of institutions with 50–100 percent known; and the percent mean distribution of all faculty by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1 4-year or above 1 7 186 1.9 2.8 3.5 0.2 91.6 2 4-year or above 2 46 396 2.5 6.5 4.1 0.2 86.8 3 4-year or above 3 15 345 1.9 4.7 3.3 0.4 89.8 4 4-year or above 4 7 219 1.3 2.4 1.9 1.1 93.3 5 4-year or above 5 11 458 2.1 13.8 2.3 0.6 81.3 6 4-year or above 6 5 153 4.9 7.4 4.2 1.7 81.9 7 4-year or above 7 2 50 2.4 0.8 4.9 1.2 90.8 8 4-year or above 8 17 251 3.7 3.0 9.3 0.6 83.4 9 At least 2- but less than 4-year 1 2 47 1.1 3.5 0.9 0.3 94.3 10 At least 2- but less than 4-year 2 3 131 1.7 6.0 1.9 0.2 90.2 11 At least 2- but less than 4-year 3 0 157 1.1 6.5 2.0 0.5 89.9 12 At least 2- but less than 4-year 4 0 134 0.6 1.5 0.9 2.8 94.2 13 At least 2- but less than 4-year 5 0 367 1.3 13.0 0.7 0.3 84.6 14 At least 2- but less than 4-year 6 1 129 8.3 5.0 1.7 2.2 82.8 15 At least 2- but less than 4-year 7 1 43 2.8 0.6 0.3 7.3 89.0 16 At least 2- but less than 4-year 8 4 192 7.6 5.4 7.9 1.4 77.8 NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA). SOURCE: U.S. Department of Education, National Center for Education Statistics, 2002 Integrated Postsecondary Education Data System (IPEDS).


Table A-5. Race/ethnicity distribution of part-time female faculty and percent unknown race, by level and region of institution: 2002

Columns: Class number; Institution level; OBE region; number of institutions with 0–49 percent of race/ethnicity known; number of institutions with 50–100 percent known; and the percent mean distribution of all faculty by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1 4-year or above 1 40 153 1.4 2.0 2.6 0.1 94.0 2 4-year or above 2 92 350 2.6 6.6 3.5 0.2 87.1 3 4-year or above 3 63 297 1.5 5.7 2.9 0.3 89.7 4 4-year or above 4 52 174 1.2 2.3 1.3 1.1 94.2 5 4-year or above 5 91 378 1.9 15.0 1.8 0.1 81.3 6 4-year or above 6 33 125 6.3 8.5 2.6 0.8 81.8 7 4-year or above 7 11 41 3.0 0.7 2.0 1.6 92.7 8 4-year or above 8 66 202 5.3 3.2 8.2 0.5 82.8 9 At least 2- but less than 4-year 1 18 31 1.9 2.7 1.6 0.1 93.6 10 At least 2- but less than 4-year 2 14 120 3.2 7.0 2.4 0.2 87.3 11 At least 2- but less than 4-year 3 12 145 1.4 6.9 1.3 0.8 89.7 12 At least 2- but less than 4-year 4 21 113 0.9 1.6 0.5 1.8 95.2 13 At least 2- but less than 4-year 5 54 313 1.5 13.8 0.6 0.3 83.8 14 At least 2- but less than 4-year 6 22 108 12.7 5.1 1.1 1.7 79.4 15 At least 2- but less than 4-year 7 10 34 2.8 1.0 1.0 5.8 89.4 16 At least 2- but less than 4-year 8 18 178 6.3 4.9 8.0 1.6 79.2 NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA). SOURCE: U.S. Department of Education, National Center for Education Statistics, 2002 Integrated Postsecondary Education Data System (IPEDS).

Using data from the prior 10 years of SED, counts of doctorate recipients with temporary resident status were obtained to construct surrogate distributions of race/ethnicity for nonresident alien faculty members. Specifically, the average race/ethnicity distributions of these individuals were calculated within groups indexed by gender, type of institution (public and private), and region. The appropriate distribution(s) were applied to the number of nonresident aliens in each institution to allocate such counts to one of the known race/ethnicity categories. Tables A-6 and A-7 provide a summary of the resulting average distributions and other statistics for nonresident aliens for each gender.


Table A-6. Race/ethnicity distribution of full- and part-time male temporary residents constructed from the Survey of Earned Doctorates: 1990–1999

Columns: Class number; Control; OBE region; percent distribution of temporary residents by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1  Public  1  4.4  4.9  69.5  0.0  21.2
2  Public  2  4.0  3.2  71.2  0.0  21.6
3  Public  3  3.7  4.2  70.2  0.0  21.9
4  Public  4  5.8  4.5  70.8  0.0  18.9
5  Public  5  5.0  4.0  69.9  0.0  21.0
6  Public  6  8.3  3.2  69.0  0.0  19.5
7  Public  7  6.2  3.3  62.3  0.1  28.2
8  Public  8  6.3  2.2  63.4  0.0  28.0
9  Private  1  4.8  2.9  51.4  0.0  40.8
10  Private  2  5.0  4.0  60.5  0.0  30.6
11  Private  3  3.8  3.0  65.4  0.0  27.8
12  Private  4  6.0  4.8  61.2  0.0  28.1
13  Private  5  7.8  6.0  60.1  0.0  26.2
14  Private  6  6.7  3.7  60.3  0.0  29.2
15  Private  7  4.2  7.9  56.6  0.0  31.2
16  Private  8  3.5  1.8  59.1  0.0  35.6
NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA).
SOURCE: National Science Foundation, 1990–1999 Survey of Earned Doctorates (SED:90-99).

Table A-7. Race/ethnicity distribution of full- and part-time female temporary residents constructed from the Survey of Earned Doctorates: 1990–1999

Columns: Class number; Control; OBE region; percent distribution of temporary residents by race/ethnicity (Hispanic, Black, Asian/Pacific Islander, American Indian/Alaskan Native, White).

1  Public  1  6.9  3.8  60.7  0.2  28.4
2  Public  2  6.2  2.8  64.1  0.0  26.8
3  Public  3  5.1  4.1  68.1  0.0  22.7
4  Public  4  6.8  4.4  69.4  0.1  19.4
5  Public  5  6.4  4.0  65.4  0.0  24.2
6  Public  6  9.1  2.7  66.9  0.1  21.2
7  Public  7  9.4  2.8  56.3  0.0  31.5
8  Public  8  7.3  2.0  57.9  0.1  32.8
9  Private  1  5.4  1.8  49.8  0.0  43.0
10  Private  2  5.9  4.8  52.3  0.0  37.0
11  Private  3  2.2  2.4  64.8  0.0  30.6
12  Private  4  4.1  5.7  52.5  0.0  37.7
13  Private  5  10.1  6.3  49.1  0.0  34.5
14  Private  6  8.9  1.4  66.5  0.0  23.3
15  Private  7  0.0  0.0  0.0  0.0  0.0
16  Private  8  4.3  2.9  55.6  0.0  37.2
NOTE: Office of Business Economics (OBE) regions are defined as 1 = New England (CT, ME, MA, NH, RI, VT); 2 = Mid East (DE, DC, MD, NJ, NY, PA); 3 = Great Lakes (IL, IN, MI, OH, WI); 4 = Plains (IA, KS, MN, MO, NE, ND, SD); 5 = Southeast (AL, AR, FL, GA, KY, LA, MS, NC, SC, TN, VA, WV); 6 = Southwest (AZ, NM, OK, TX); 7 = Rocky Mountains (CO, ID, MT, UT, WY); 8 = Far West (AK, CA, HI, NV, OR, WA).
SOURCE: National Science Foundation, 1990–1999 Survey of Earned Doctorates (SED:90-99).


A.2 Institution Sample Selection

For sampling purposes, 10 institution strata were defined for the NSOPF:04 based on Carnegie classification codes and control. Since the institution sample for the faculty study was selected jointly with that for students, the 10 NSOPF sampling strata were collapsed from the related student-based 58 strata (STRAT58) for the 2004 National Study of Faculty and Students (NSoFaS:04). Table A-8 summarizes the distribution of the resulting sample of institutions by stratum for NSOPF:04. Table A-9 provides a crosswalk between the two sets of strata. Institution sample sizes and their corresponding sampling rates were established using a customized cost/variance optimization procedure, which aimed to identify the allocation that would accommodate all analytical objectives of this survey while minimizing data collection costs.

Table A-8. Distribution of NSOPF:04 full-scale institution sample, by type and Carnegie classification: 2004

Degree granting  Total  Public  Private not-for-profit
Total  1,080  680  400
Doctor's  300  190  110
Master's  200  120  80
Bachelor's  160  30  130
Associate's  350  340  10
Other/Unknown  70  10  60
NOTE: Detail may not sum to totals because of rounding. Numbers have been rounded to the nearest 10.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

A.3 Faculty Frame Construction

All sampled institutions were contacted to provide lists of faculty and instructional staff who were eligible for NSOPF:04. For this purpose, each institution was requested to provide a complete list of full- and part-time faculty and instructional staff as of November 1, 2003 (or during the fall term of the 2003–04 academic year).

A.3.1 List Request and Requirements

Each institution was given several options for providing faculty lists, including uploading an electronic copy of the list to a secure website, sending the list as an e-mail attachment, or mailing the list on diskette using the provided shipment material. It was requested that files containing the faculty lists follow a specific layout. Acceptable file formats included ASCII fixed field, ASCII comma-delimited, or Excel spreadsheet. For those institutions not capable of providing electronic lists of faculty and instructional staff, paper lists were suggested as a last choice of format. In addition to campus and home contact information, institutions were asked to provide basic demographic information such as gender, race/ethnicity, academic field, and employment status of each faculty member.
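For illustration only, a comma-delimited faculty list of the general kind requested might look like the two records below. The field names, their order, and the sample entries are hypothetical; they are not the layout specified in the NSOPF:04 list request materials.

    last_name,first_name,gender,race_ethnicity,academic_field,employment_status,campus_email,home_phone
    Smith,Jane,F,White,Biological sciences,Full-time,jsmith@example.edu,555-0101
    Lopez,Carlos,M,Hispanic,History,Part-time,clopez@example.edu,555-0102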

Towards the end of the list collection period, online course catalogs and institution websites were used to abstract lists for those sampled institutions that had failed to provide faculty lists, yet had online sources that could provide adequate information about their faculty


and instructional staff. In all, 139 such lists were abstracted to supplement the lists that had been supplied directly by sample institutions. Online resources were approved for abstraction based on the completeness and inclusiveness of the information provided.

A.4 Faculty Sample Selection

The sample of faculty was selected using stratified systematic sampling within cells indexed by institutional and faculty strata, as summarized in tables A-9 and A-10. Table A-9 presents the complete list of institutional strata used for NSoFaS and indicates their correspondence to the strata used for NSOPF. Moreover, institution eligibility rates from the prior administrations of NSOPF were available only at a different level of aggregation (sector), a listing of which is provided in table A-11. Table A-12 provides a summary of faculty counts by NSoFaS institutional and NSOPF faculty strata, which were used for sample allocation based on the Winter:02 IPEDS. Strata marked as not applicable (†) in table A-12 are specific to NPSAS:04.

Table A-9. NSoFaS institutional sampling strata: 2004

NSoFaS institution strata Institution type NSOPF strata 1 Public less than 2-year NPSAS only 2 Public 2-year associate’s 4 3 Public 2-year other – degree-granting 4 4 Public 2-year other – NPSAS only NPSAS only 5 Public 4-year master’s 2

6 Public 4-year bachelor’s 3 7 Public 4-year non-doctoral other 3,5 8 Public 4-year doctoral 1 9 Public 4-year doctoral other 2 10 Public 4-year NPSAS only NPSAS only

11 Private not-for-profit less-than-4-year associate’s 9 12 Private not-for-profit less-than-4-year other – degree-granting 9 13 Private not-for-profit less-than-4-year other – NPSAS only NPSAS only 14 Private not-for-profit 4-year master’s 7 15 Private not-for-profit 4-year bachelor’s 8

16 Private not-for-profit 4-year other 8,9,10 17 Private not-for-profit 4-year doctoral 6 18 Private not-for-profit 4-year doctoral master’s 7 19 Private not-for-profit 4-year doctoral other 7,8,10 20 Private not-for-profit 4-year NPSAS only NPSAS only

21 Private for-profit less-than-2-year NPSAS only 22 Private for-profit 2-year or more NPSAS only 23 CA Public 2-year 4 24 CA Public 4-year 1,2,3,5 25 CA Private not-for-profit 4-year 6,7,8,10


26 CT Public 2-year 4 27 CT Public 4-year 1,2,3 28 CT Private not-for-profit 4-year 6,7,8,9,10 29 DE Public 2-year NPSAS only 30 DE Public 4-year 1,2

31 DE Private not-for-profit 4-year 6,8,10 32 GA Public 2-year 4 33 GA Public 4-year 1,2,3,4,5 34 GA Private not-for-profit 4-year 6,7,8,10 35 IL Public 2-year 4

36 IL Public 4-year 12 37 IL Private not-for-profit 4-year 6,7,8,10 38 IN Public 2-year 4 39 IN Public 4-year 1,2,3 40 IN Private not-for-profit 4-year 6,7,8,10

41 MN Public 2-year 4 42 MN Public 4-year 1,2,3 43 MN Private not-for-profit 4-year 6,7,8,9,10 44 NE Public 2-year 4 45 NE Public 4-year 1,2

46 NE Private not-for-profit 4-year 7,8,10 47 NY Public 2-year 4 48 NY Public 4-year 1,2,3,4,5 49 NY Private not-for-profit 4-year 6,7,8,10 50 OR Public 2-year 4

51 OR Public 4-year 1,2,5 52 OR Private not-for-profit 4-year 7,8,10 53 TN Public 2-year 4 54 TN Public 4-year 12 55 TN Private not-for-profit 4-year 6,7,8,9,10

56 TX Public 2-year 4 57 TX Public 4-year 1,2,3 58 TX Private not-for-profit 4-year 6,7,8,10 SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table A-10. Faculty sampling strata within institution: 2004

Faculty strata  Faculty type
1  Non-Hispanic Black faculty
2  Hispanic faculty
3  Asian faculty
4  Other full-time female faculty
5  Other full-time male faculty
6  Other part-time faculty
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table A-11. NSOPF:99 institutional sectors used to obtain institution eligibility rates: 2004

Sector  Sector type
1  Public 4-year
2  Private not-for-profit 4-year
4  Public 2-year
5  Private not-for-profit 2-year
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table A-12. Faculty population counts, by NSoFaS institutional and NSOPF faculty strata: 2004

Faculty strata NSoFaS institution strata Total 1 2 3 4 5 6

Total 1,074,011 62,578 37,821 73,196 198,179 317,958 384,279

1 † † † † † † † 2 177,454 13,073 5,798 3,858 27,718 27,872 99,135 3 4,469 223 39 58 682 828 2,639 4 † † † † † † † 5 41,689 2,991 829 1,833 10,249 15,007 10,780

6 12,931 899 238 532 2,712 4,236 4,314 7 6,594 172 94 331 1,395 2,180 2,422 8 176,687 7,267 4,604 18,439 38,125 75,878 32,374 9 25,245 3,321 465 1,388 5,305 8,261 6,505 10 † † † † † † †

11 3,838 251 106 124 859 938 1,560 12 700 49 33 25 119 103 371 13 † † † † † † † 14 24,427 790 401 876 4,874 6,435 11,051 15 35,501 2,351 554 1,167 7,981 12,173 11,275

16 12,773 722 180 400 1,917 3,200 6,354 17 59,170 3,103 1,743 5,787 10,421 23,490 14,626 18 23,282 1,237 563 758 4,760 6,850 9,114 19 8,023 572 145 341 1,476 2,929 2,560 20 † -† † † † † †

21 † † † † † † † 22 † † † † † † † 23 56,878 3,195 5,282 4,651 7,262 7,852 28,636 24 49,043 1,496 3,037 8,965 7,716 14,454 13,375 25 25,263 1,038 1,111 2,584 3,724 7,370 9,436

26 2,791 163 68 59 436 394 1,671 27 3,955 164 121 270 830 1,551 1,019 28 7,350 215 186 828 1,447 2,777 1,897 29 965 87 16 21 159 106 576 30 1,250 117 16 112 340 641 24

31 768 63 9 12 76 91 517 32 8,627 1,817 97 129 1,394 1,241 3,949 33 11,110 991 171 801 2,798 4,361 1,988 34 6,582 1,253 148 476 1,264 2,396 1,045 35 15,787 1,187 420 453 1,886 2,090 9,751


36 13,634 786 386 1,595 3,144 5,217 2,506 37 18,990 806 415 1,551 3,652 6,264 6,302 38 3,601 146 34 46 476 486 2,413 39 11,839 360 255 935 2,614 4,612 3,063 40 6,173 257 109 255 1,133 2,193 2,226

41 6,198 136 65 123 1,257 1,526 3,091 42 7,227 152 136 597 1,755 3,125 1,462 43 5,779 116 96 208 1,265 1,800 2,294 44 2,346 19 41 19 442 593 1,232 45 4,455 116 98 346 988 1,854 1,053

46 2,148 39 40 94 474 711 790 47 16,189 1,152 671 546 2,210 2,390 9,220 48 26,275 1,972 1,041 2,047 4,046 7,309 9,860 49 48,793 2,287 1,465 3,691 7,977 14,892 18,481 50 6,286 113 206 220 1,067 1,054 3,626

51 6,715 76 158 550 1,509 2,337 2,085 52 2,556 21 45 79 519 882 1,010 53 3,827 379 33 34 760 719 1,902 54 7,560 596 100 540 1,795 3,084 1,445 55 6,289 474 74 371 1,449 2,665 1,256

56 26,411 2,242 3,185 831 3,751 4,137 12,265 57 27,253 1,133 2,136 2,658 5,817 10,547 4,962 58 10,315 393 558 582 2,154 3,857 2,771 † Not applicable. NOTE: Faculty strata are defined as 1 = Non-Hispanic Black, 2 = Hispanic, 3 = Asian, 4 = Other full-time female, 5 = Other full-time male, 6 = Other part-time. Blank strata are NPSAS-only and do not apply to NSOPF. NSoFaS = 2004 National Study of Faculty and Students. SOURCES: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04), U.S. Department of Education, National Center for Education Statistics, 2002 Integrated Postsecondary Education Data System (IPEDS).

A.4.1 Determining Initial Faculty Sample Sizes and Sample Allocation

This section provides an overview of the faculty sample selection procedure, which includes technical details of the cost/variance optimization process for selection of the initial sample sizes as well as specifications for calculation of initial and final (adjusted) sampling rates. A customized cost/variance optimization program was developed to determine the desired allocation of respondents to institution-by-person strata, which aimed to secure at least the same level of precision for key estimates as those achieved during the previous administration of the survey. This optimization process consisted of the following steps:

a. establishing precision requirements for key estimates;

b. constructing a cost model specific to the structure of the sample;


c. developing a relative variance model; and

d. determining the optimum sample allocation.

A.4.2 Precision Requirements for Key Estimates

The precision goals were to secure national-level survey estimates with precisions comparable to or better than those of NSOPF:99 for the overall faculty population. For this purpose, the following two publications were reviewed to establish 268 key national-level estimates:

• Background Characteristics, Work Activities, and Compensation of Faculty and Instructional Staff in Postsecondary Institutions: Fall 1998 (NCES 2001–152) (Zimbler 2001); and

• Salary, Promotion, and Tenure Status of Minority and Women Faculty in U.S. Colleges and Universities (NCES 2000–173) (Nettles, Perna, and Bradburn 2000).

A.4.3 Cost/Variance Optimization

As mentioned earlier, a customized cost/variance optimization program was developed to determine the desired allocation of respondents to institution-by-person strata. The cost model necessary to support the cost/variance optimization process was the following:

C = C0 + Σ(h=1 to 240) Ch nh + Σ(h=1 to 240) Σ(k=1 to 14) Chk nh nhk

where C represents the total cost of the NSoFaS, C0 represents the “fixed costs” that do not depend on the number of sample institutions or persons, Ch represents the variable cost per participating institution in stratum h, Chk represents the variable cost per respondent in person stratum k within institution stratum h, nh represents the number of participating institutions selected from stratum h, and nhk represents the number of responding persons selected from the given stratum.

Only the components of variable cost, Ch and Chk, had to be estimated to support the cost/variance optimization. For this purpose, they were estimated using the spreadsheet developed for the study budget. The cost per participating institution was then estimated by holding the numbers of responding persons constant while varying the numbers of participating institutions. Likewise, the variable cost per participant was estimated by holding the number of participating institutions constant while varying the number of participating persons.
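A minimal evaluation of this cost model is sketched below with made-up cost figures and sample sizes. It assumes, consistent with the form of the model, that nhk is the number of respondents per participating institution in the given cell; the dictionaries and their keys are illustrative only.

    # Illustrative evaluation of the cost model C = C0 + sum(Ch*nh) + sum(Chk*nh*nhk).
    def total_cost(c0, c_inst, c_person, n_inst, n_person):
        cost = c0
        cost += sum(c_inst[h] * n_inst[h] for h in n_inst)            # institution-level costs
        cost += sum(c_person[(h, k)] * n_inst[h] * n_person[(h, k)]   # person-level costs
                    for (h, k) in n_person)
        return cost

    print(total_cost(
        c0=100000.0,
        c_inst={1: 400.0, 2: 350.0},            # cost per participating institution, by stratum
        c_person={(1, 1): 30.0, (2, 1): 25.0},  # cost per respondent, by (institution, person) stratum
        n_inst={1: 50, 2: 40},
        n_person={(1, 1): 20, (2, 1): 15},
    ))                                          # 179000.0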

A.4.4 Relative Variance Model

The following model was used to represent the relative variance of estimates in different domains:


RelVar(ŷgd) = (1/CV²gd) Σh Σk [ (1/(rh nh) − 1/Nh) × Wdhk × UWEhk × (σ²1gd + σ²2gd + σ²3gd + σ²4gd CV²md) ]
            + (1/CV²gd) Σh Σk [ (Wdhk × UWEhk) / (αdhk nhk) × (σ²1gd + σ²2gd + σ²3gd + σ²4gd) ]

where the parameters of this model were defined for each institution stratum h and person stratum k as follows:

• Wdhk = proportion of domain d members who belong to stratum (h,k)

• UWEhk = unequal weighting effect within stratum (h,k)

• σ²1gd = the variance between institution strata

• σ²2gd = the variance between institutions within strata

• σ²3gd = the variance between strata

• σ²4gd = the variance between participants within person strata

• CVgd = σgd / ȳgd = coefficient of variation among observations

• rh = stratum h institution response rate

• CVmd = coefficient of variation of cluster sizes (m) among domain d members

• αdhk = proportion of stratum (h,k) members who belong to domain d

The proportion of domain d members who belong to stratum (h,k), Wdhk, and the proportion of stratum (h,k) members who belong to domain d, αdhk, were estimated using prior survey data. The components of variance (σ²1gd, σ²2gd, σ²3gd, and σ²4gd) were computed using the

method of moments procedures in SAS Proc Nested, which resulted in some negative estimates. Unequal weighting effects, UWEhk, were computed based on the statistical analysis weights. Since these values were highly variable, it was decided that they were not reliable estimates of the unequal weighting effects. Consequently, all the UWEs were set to a constant value of 1.05. Finally, the coefficient of variation, CVmd, of cluster sizes was computed for the members of each analysis domain using the same prior survey data.

A.4.5 Optimum Sample Allocation

The technique developed by Chromy (1987) was used to determine the sample allocation for the institution and person strata. This technique aimed to satisfy all precision constraints while minimizing the cost and relative variance models discussed earlier. The initial results from the optimization process were discussed with the National Center for Education Statistics (NCES) and the Technical Review Panel (TRP) in August 2002 before further refinements were applied to the resulting samples.


The results of this initial sample optimization exercise were used as the basis for the sample of 1,080 institutions for NSOPF:04. As in previous cycles of NSOPF, all institutions with a Carnegie classification as public doctoral or private not-for-profit doctoral institutions were selected with certainty. After selecting the sample institutions, further refinements were made to determine which binding constraints could be relaxed in the optimization procedure. As precision constraints were iteratively relaxed during the optimization process, the sample size distributions were constrained to achieve approximately the institution- and person-level marginal distributions that were requested by NCES following the August 2002 TRP meeting. The optimization process was rerun conditional on the sample of institutions that had already been selected to determine the optimum allocation of the faculty sample sizes to these institutions. The results of this conditional optimization were used to set the final faculty sample rates, as discussed below. Note that the corresponding respondent counts are provided in tables 4 and 14 in the main body of this methodology report.

Table A-13. Target number of NSOPF:04 respondents, by institution and faculty strata: 2004

Institution stratum  Respondents
Total  24,500
Public doctor's  6,200
Public master's  2,700
Public bachelor's  600
Public associate's  7,500
Public other  500
Private not-for-profit doctor's  2,600
Private not-for-profit master's  1,900
Private not-for-profit bachelor's  1,700
Private not-for-profit associate's  100
Private not-for-profit other  700

Faculty stratum  Respondents
Total  24,500
Non-Hispanic Black  1,600
Hispanic  1,300
Asian  900
Other full-time female  4,600
Other full-time male  8,300
Other part-time  7,800

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

A.5 Sampling Rates

Initial population-level sampling rates were calculated and adjusted in several steps to obtain the final institution-level sampling rates, a brief outline of which is listed below.

a. Calculation of the initial population-level sampling rates

b. Calculation of the conditional sampling rates, given the sample institutions

c. Adjustment of the sampling rates for expected rates of faculty ineligibility and nonresponse

d. Adjustment of the sampling rates for expected rates of institution ineligibility and nonresponse

e. Adjustment of the sampling rates (iteratively) to ensure that


− no cell will have a sampling rate larger than 1;

− minimum sample size for each institution will be 10; and

− maximum sample size for each institution will be controlled.

In order to facilitate communication of computational details, the following notations will be used throughout this section.

− i ∈ [1,58] indexes institutional strata (STRAT58);

− j ∈ [1,6] indexes faculty strata (FACSTR);

− k ∈ [1,4] indexes institutional sector (SECTOR);

− Nij+ total number of faculty members in the i-jth institution-faculty stratum; and

− nij+ required sample of faculty respondents in the i-jth institution-faculty stratum.

A.5.1 Calculation of the Initial Sampling Rates

The initial sampling rates were calculated as the ratio of the required sample sizes to the population counts within cells indexed by the institutional and faculty strata. That is,

SR⁰ij = nij+ / Nij+ ,  i ∈ [1,58], j ∈ [1,6]

These rates represent the faculty sampling rates that would have been used if a census of all institutions had been conducted, with all institutions being eligible and participating in the study.

A.5.2 Calculation of the Conditional Sampling Rates

Faculty sampling rates were computed for the institutions in the sample, conditional on the institutions that had been selected (as if they would all be participants). The conditional sampling rate was calculated for each sample institution as the product of its initial sampling rate and institution sampling weight. That is, the initial conditional sampling rate for the jth faculty stratum in the ith institutional stratum, given the selection of the lth institution, was given by:

CR⁰ij|l = SR⁰ij / πil

Here, πil represents the probability of selection of the lth sample institution in the ith institutional stratum.


A.5.3 Adjustment of Conditional Sampling Rates for Expected Rates of Faculty Ineligibility and Nonresponse

Initial sampling rates were adjusted to account for anticipated faculty ineligibility and nonresponse. Since the only reliable information for this purpose was available at the institution sector level, based on results from NSOPF:99, this adjustment was carried out at the sector level. The needed rates for this adjustment are summarized in table A-14.

Table A-14. Faculty level adjustment rates, by institutional sector: 2004

Sector  Percent eligibility rate  Percent response rate
Total  95  73
Public 2-year  95  70
Public 4-year  95  75
Private not-for-profit 2-year  95  69
Private not-for-profit 4-year  95  70
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

With FEk and FRk representing the expected rates of faculty level eligibility and response for the kth sector, respectively, the adjusted sampling rate to account for faculty level attrition for the jth faculty stratum for all institutions in the kth institutional sector is given by:

CR¹ij|l = CR⁰ij|l × (1/FEk) × (1/FRk)

Because of the above adjustment, and because some sample institutions were expected to be ineligible, application of the resulting sampling rates to the frame counts for the eligible sample institutions could produce sample sizes far from the proposed total sample of 35,671 faculty members. Consequently, the adjusted sampling rates were ratio-adjusted to the desired total in two steps. First, the revised sample size for the jth faculty stratum in the ith institutional stratum was computed as follows, where Nijl represents the frame count of faculty for the lth institution in the (i-j)th stratum:

n¹ij+ = nij+ × [ 35,671 / Σijl (Nijl × CR¹ij|l) ] ,  i ∈ [1,58], j ∈ [1,6]

Next, verifying that Σijl (Nijl × CR²ij|l) = 35,671, the ratio-adjusted sampling rates were calculated for the eligible sample institutions as:

CR²ij|l = (n¹ij+ / Nij+) × (1/FEk) × (1/FRk) × (1/πil)
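The chain of adjustments in sections A.5.1 through A.5.3 is illustrated below for a single institution-faculty cell; all of the inputs are made up, and the ratio adjustment toward the 35,671 target is omitted.

    # Illustrative rate adjustments for one cell (i, j) and one sample institution l.
    n_required = 120          # nij+, required respondents in the cell
    N_frame = 4000            # Nij+, frame count of faculty in the cell
    pi_il = 0.25              # institution selection probability
    FE, FR = 0.95, 0.75       # expected faculty eligibility and response rates (sector level)

    SR0 = n_required / N_frame    # initial population-level rate (A.5.1)
    CR0 = SR0 / pi_il             # conditional on the selected institution (A.5.2)
    CR1 = CR0 / (FE * FR)         # inflated for ineligibility and nonresponse (A.5.3)
    print(round(SR0, 4), round(CR0, 4), round(CR1, 4))   # 0.03 0.12 0.1684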


A.5.4 Adjustment of Sampling Rates for Expected Rates of Institution Ineligibility and Nonresponse

The above rates were further adjusted to account for anticipated institution level ineligibility and nonresponse, as summarized in table A-15. With IEk and IRk representing the expected rates of institution level eligibility and response for the kth sector, the revised rates were calculated as:

CR³ij|l = CR²ij|l × (1/IEk) × (1/IRk)

These calculations were performed on the data file of all eligible sample institutions among the final sample of 1,080 institutions. The eligibility rate, IEk, was set to 1.00 for all institutions that were known to be eligible; eligibility status was known for all sample institutions.

Table A-15. Institution level adjustment rates by institutional sector (rates based on the revised RTI proposal for the 2004 National Study of Faculty and Students): 2004

Sector Percent eligibility rate Percent response rate

Total 99 87

Public 2-year 99 84

Public 4-year 100 90

Private not-for-profit 2-year 95 95

Private not-for-profit 4-year 100 83 SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Again, because of the above adjustment, the resulting total sample of faculty members was larger than the proposed sample of 35,671. The resulting total count for each institution-faculty stratum, used for further adjustments, was calculated by:

n*ij+ = Σl (Nijl × CR³ij|l)

A.5.5 Final Adjustment of Sampling Rates

The final sampling rates had to satisfy a number of design requirements before they could be used for selection of sample faculty members. Specifically, all sampling rates had to be bounded by 1, while ensuring that the resulting sample sizes fell between 10 and a reasonable maximum for each sample institution. Achieving these objectives, starting with the above initial conditional sampling rates, entailed an iterative process as described next.

• Calculate the revised sample size for the jth faculty stratum of the lth sample institution in the ith institutional stratum by:

− Let Max = 1,000

− Let n¹ijl = Nijl × CR³ij|l


− If n¹ijl > Nijl then n¹ijl = Nijl

− If n¹i+l = Σj n¹ijl > Max then n¹ijl = n¹ijl × (Max / n¹i+l)

− If n¹i+l < 10 then n¹ijl = n¹ijl × (10 / n¹i+l)

− If n¹ij+ = Σl n¹ijl ≠ n*ij+ then n¹ijl = n¹ijl × (n*ij+ / n¹ij+)

• Review the distribution of the total sample size for each institution, ni+l, to detect potential outliers. Subsequently, the value of Max was set to a new limit, determined after implementation of the above step.

• Next, the subsequent conditional sampling rates were calculated as the ratio of the latest sample sizes to their corresponding population counts. That is,

CR⁴ij|l = n¹ijl / Nijl

• Calculate the next revised sample size for the jth faculty stratum of the lth sample institution in the ith institutional stratum by:

− Let n²ijl = Nijl × CR⁴ij|l

− If n²ijl > Nijl then n²ijl = Nijl

− If n²i+l = Σj n²ijl > Max then n²ijl = n²ijl × (Max / n²i+l)

− If n²i+l < 10 then n²ijl = n²ijl × (10 / n²i+l)

− If n²ij+ = Σl n²ijl ≠ n*ij+ then n²ijl = n²ijl × (n*ij+ / n²ij+)

The above steps were repeated until no more adjustments were needed. That is, until all sampling rates were less than or equal to one, no institution sample size was less than 10 or greater than the established maximum number, and the sample size was close to what was expected for every institution-faculty stratum. At this point, the final sampling rates were calculated by:

CR(f)ij|l = n(f)ijl / Nijl , where the superscript (f) denotes the final, converged values.
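A single pass of this iterative adjustment, for one sample institution, is sketched below with made-up numbers. The cross-institution rescaling toward n*ij+ and the repetition until convergence are omitted; note that rescaling a small institution up to the minimum of 10 can push a stratum back above its frame count, which is why the steps are repeated.

    # Illustrative single pass of the size adjustments for one sample institution.
    # n: provisional sample size by faculty stratum; frame: the institution's frame counts.
    def adjust_once(n, frame, max_size=1000, min_size=10):
        n = {j: min(v, frame[j]) for j, v in n.items()}   # cap each stratum at its frame count
        total = sum(n.values())
        if total > max_size:                              # scale down to the maximum
            n = {j: v * max_size / total for j, v in n.items()}
        elif total < min_size:                            # scale up to the minimum of 10
            n = {j: v * min_size / total for j, v in n.items()}
        return n

    print(adjust_once({1: 4.0, 4: 2.5, 6: 1.5}, frame={1: 3, 4: 40, 6: 25}))
    # {1: 4.28..., 4: 3.57..., 6: 2.14...}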

A.6 Faculty Sample Selection

Faculty members were sampled as faculty lists were received for participating institutions. Prior to selecting the faculty sample for a given institution, expected sample sizes


for each faculty stratum were calculated using the institution-specific faculty list counts and sampling rates. Because the actual list counts were now available, these sampling rates were then modified, as necessary, for the reasons given below.

• Rates were increased across all faculty strata to ensure that at least ten faculty members were selected from each institution, if possible.

• Rates were increased within faculty strata to guarantee that at least one faculty member was selected per stratum when the calculated rates called for the selection of less than one faculty member, if possible.

• The sample yield was monitored throughout the months during which faculty lists were received, and the faculty sampling rates were adjusted periodically for institutions for which sample selection had not yet been performed to ensure that the desired faculty sample sizes were achieved.

Stratified systematic sampling was used to select faculty members from the faculty lists. Using PROC SURVEYSELECT in SAS, lists were sorted in a serpentine fashion by the academic field, race/ethnicity, gender, and employment status of the faculty members, and individuals were systematically selected within faculty strata.
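The selection itself used PROC SURVEYSELECT in SAS; the sketch below shows only the underlying idea of systematic selection within a sorted stratum, written in Python, with the serpentine sort and the SAS-specific options omitted and an arbitrary illustrative sort key and seed.

    import random

    # Illustrative systematic selection within one faculty stratum.
    def systematic_sample(frame, rate, seed=2004):
        if not frame or rate <= 0:
            return []
        interval = 1.0 / rate                              # sampling interval
        point = random.Random(seed).uniform(0, interval)   # random start
        selected = []
        while point < len(frame):
            selected.append(frame[int(point)])
            point += interval
        return selected

    # Sort the stratum by field, race/ethnicity, gender, and employment status,
    # then take roughly one in four.
    stratum = [("History", "White", "F", "FT", f"id{i}") for i in range(20)]
    stratum.sort(key=lambda rec: (rec[0], rec[1], rec[2], rec[3]))
    print(len(systematic_sample(stratum, rate=0.25)))      # 5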

These procedures had to be modified for lists that were received on hard copy. Quite often, these paper lists contained little information about the faculty members’ race/ethnicity, gender, and employment status. When this information was not available, and therefore faculty strata could not be identified, a systematic sample of faculty was selected using the overall sampling rate for the institution. If this information was provided, however, a systematic sample was selected using the largest stratum-specific sampling rate. This initial sample (sub-frame) was then keyed to create an electronic file, avoiding data entry for the entire list. Subsequently, extra faculty members were subsampled out using PROC SURVEYSELECT to achieve the needed allocation of faculty from the given institution.

After the sample of faculty had been selected for an institution, the available information on the sampled faculty members, including name, academic field, institution, race/ethnicity, and residence, was compared with that of faculty who had already been selected from other institutions. When a duplicate was detected, it was eliminated from the current institution's sample so that no faculty member would be included in the sample twice.
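
A simplified version of this duplicate check is sketched below. The matching key shown is an assumption for illustration; as noted above, the actual comparison drew on name, academic field, institution, race/ethnicity, and residence.

# Illustrative sketch of removing cross-institution duplicates before adding a
# new institution's sample to the master dataset (matching key is assumed).
def remove_cross_institution_duplicates(new_sample, master_records):
    seen = {(p["name"].lower(), p["field"], p["residence"]) for p in master_records}
    kept = []
    for person in new_sample:
        key = (person["name"].lower(), person["field"], person["residence"])
        if key not in seen:        # keep only faculty not already sampled elsewhere
            kept.append(person)
            seen.add(key)
    return kept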

Once the de-duplication process was complete and the institution's final sample file was created, the last step in sample selection was to add that file to the master dataset. The master dataset contained all sampled faculty members and their relevant sampling information.

A.7 Appendix A References

Chromy, J.R. (1987). Design Optimization with Multiple Objectives. Proceedings of the American Statistical Association, Section on Social Statistics, 194–199.


Kojaku, L., and Nuñez, A. (1998). Descriptive Summary of 1995-96 Beginning Postsecondary Students, With Profiles of Students Entering 2- and 4-Year Institutions (NCES 1999–030). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Nettles, M.T., Perna, L.W., and Bradburn, E.M. (2000). Salary, Promotion, and Tenure Status of Minority and Women Faculty in U.S. Colleges and Universities (NCES 2000–173). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

U.S. Department of Education, National Center for Education Statistics (2001). National Postsecondary Student Aid Study: Student Financial Aid Estimates for 1999–2000 (NCES 2001–209). Washington, DC: U.S. Government Printing Office.

Zimbler, L.J. (2001). Background Characteristics, Work Activities, and Compensation of Faculty and Instructional Staff in Postsecondary Institutions: Fall 1998 (NCES 2001–152). Washington, DC: U.S. Department of Education, National Center for Education Statistics.


Appendix B Technical Review Panel


National Study of Postsecondary Faculty (NSOPF:04) Technical Review Panel


Eugene Anderson Research Associate American Council on Education Center for Policy Analysis &

Office of Minorities in Higher Education One Dupont Circle NW Washington, DC 20036-1193 Telephone: (202) 939-9393 Fax: (202) 785-2990 E-mail: [email protected] Roger Baldwin Professor Michigan State University 429 Erickson Hall East Lansing, MI 48824 Telephone: (517) 355-6452 Fax: (517) 353-6393 E-mail: [email protected] Ernst Benjamin Senior Consultant American Association of University Professors 1012 14th Street NW Suite 500 Washington, DC 20005-3465 Telephone: (202) 737-5900 E-mail: [email protected] May Chen Assistant Vice Chancellor Community College of Baltimore County 800 South Rolling Road Baltimore, MD 21228-5317 Telephone: (410) 869-1135 Fax: (410) 869-1265 E-mail: [email protected]

Peggye Cohen Assistant VP for Institutional Research George Washington University 2122 I Street NW Suite 809 Washington, DC 20052 Telephone: (202) 994-6509 Fax: (202) 994-0709 E-mail: [email protected] Valerie Conley Assistant Professor Ohio University Department of Counseling and Higher Education McCracken Hall 388 Athens, OH 45701 Telephone: (740) 593-9426 Fax: (740) 593-0477 E-mail: [email protected] John Curtis Director of Research American Association of University Professors 1012 14th Street NW Suite 500 Washington, DC 20005-3465 Telephone: (202) 737-5900 Ext. 3049 Fax: (202) 737-5526 E-mail: [email protected] Eric Dey Associate Professor University of Michigan 610 East University 2117 School of Education Building Ann Arbor, MI 48109-1259 Telephone: (734) 647-1651 Fax: (734) 764-2510 E-mail: [email protected]


Elaine El-Khawas Professor George Washington University Dept. of Educational Leadership 2134 G Street NW Washington, DC 20052 Telephone: (202) 994-2196 E-mail: [email protected] James Fairweather Professor Michigan State University 416 Erickson Hall East Lansing, MI 48824 Telephone: (517) 353-3387 Fax: (517) 353-6393 E-mail: [email protected] Martin Finkelstein Professor Seton Hall University Department of Educational Leadership,

Management and Policy (ELMP) 418 Kozlowski Hall South Orange, NJ 07079 Telephone: (973) 275-2056 Fax: (201) 761-9758 E-mail: [email protected] Jon Fuller Senior Fellow National Association of Independent Colleges

and Universities 4 Brian Drive Carlisle, PA 17013 Telephone: (717) 240-0462 Fax: (717) 240-0486 E-mail: [email protected]

Rachel Hendrickson Coordinator of Higher Education Programs National Education Association 1201 16th Street NW Washington, DC 20036 Telephone: (202) 822-7115 Fax: (202) 822-7624 E-mail: [email protected] Frederick Jacobs Professor American University School of Education 4400 Massachusetts Avenue NW Washington, DC 20016-8030 Telephone: (202) 885-1187 Fax: (202) 885-2124 E-mail: [email protected] Jacqueline King Director, Center for Policy Analysis American Council on Education One Dupont Circle NW Suite 800 Washington, DC 20036 Telephone: (202) 939-9559 Fax: (202) 785-2990 E-mail: [email protected] David Leslie Chancellor College of William and Mary School of Education 324 Jones Hall P.O. Box 8795 Williamsburg, VA 23187-8795 Telephone: (757) 221-2349 Fax: (757) 221-2988 E-mail: [email protected]


Jennifer Lindholm Associate Director CIRP Higher Education Research Institute UCLA/GSEIS 3005 Moore Hall Box 951521 Los Angeles, CA 90095-1521 Telephone: (310) 825-8622 Fax: (310) 206-2228 E-mail: [email protected] Alexander McCormick Senior Scholar The Carnegie Foundation 51 Vista Lane Stanford, CA 94305 Telephone: (650) 566-5149 Fax: (650) 326-0278 E-mail: [email protected] Mike Middaugh Assistant VP Office of Institutional Research University of Delaware 325 Hullihen Hall Newark, DE 19716 Telephone: (302) 831-2021 Fax: (302) 831-8530 E-mail: [email protected] Michael Nettles Executive Director Educational Testing Service Center for Policy Studies & Research Rosedale Road, R028, Mailstop 19-R Princeton, NJ 08541 Telephone: (609) 734-1236 Fax: (609) 734-1755 E-mail: [email protected]

James Palmer Professor Illinois State University Educational Administration and Foundations Campus Box 5900 De Garmo Hall 341 Normal, IL 61790-5900 Telephone: (309) 438-2041 Fax: (309) 438-8683 E-mail: [email protected] Kent Phillippe Senior Research Associate American Association of Community Colleges One Dupont Circle NW Suite 410 Washington, DC 20036 Telephone: (202) 728-0200 Ext. 222 Fax: (202) 833-2467 E-mail: [email protected] Richard Richardson Professor New York University 239 Greene Street Suite 300 New York, NY 10003-6674 Telephone: (212) 998-5640 Fax: (212) 995-4041 E-mail: [email protected] Terrence Russell Executive Director Association for Institutional Research Florida State University 222 Stone Building 600 West College Avenue Tallahassee, FL 32306-4462 Telephone: (850) 644-4470 Fax: (850) 644-8824 E-mail: [email protected]


Jack Schuster Professor Claremont Graduate University School of Educational Studies Harper Hall 202 150 East 10th Street Claremont, CA 91711-6160 Telephone: (909) 607-3795 E-mail: [email protected] Craig Smith Assistant Director American Federation of Teachers 555 New Jersey Avenue NW Washington, DC 20001 Telephone: (202) 879-4559 Fax: (202) 393-6386 E-mail: [email protected] Peter Syverson Vice President for Research Council of Graduate Schools Suite 430 One Dupont Circle NW Washington, DC 20036 Telephone: (202) 223-3791 Fax: (202) 331-7157 E-mail: [email protected] Robert Toutkoushian Associate Professor Indiana University Department of Educational Leadership and

Policy Studies 201 North Rose Ave Education 4220 Bloomington, IN 47405 Telephone: (812) 856-8395 E-mail: [email protected]

Holly Zanville Associate Vice Chancellor for Academic Affairs University of Oregon Campus Oregon State System of Higher Education Susan Campbell Hall P.O. Box 3175 Eugene, OR 97403-1075 Telephone: (541) 346-5726 Fax: (541) 346-5764 E-mail: [email protected] National Association of Student Financial Aid

Administrators (NASFAA) Mary Ann O’Connor Project Director National Association of Student Financial Aid

Administrators (NASFAA) 1129 20th Street NW Suite 400 Washington, DC 20036 Telephone: (202) 785-0453 Fax: (202) 785-1487 E-mail: [email protected] Kenneth Redd Director of Research and Policy Analysis National Association of Student Financial Aid

Administrators (NASFAA) 1129 20th Street NW Suite 400 Washington, DC 20036-3453 Telephone: (202) 785-0453 Fax: (202) 785-1487 E-mail: [email protected]


National Study of Postsecondary Faculty (NSOPF:04) Staff Contact List


U.S. Department of Education/

U.S. Government Elizabeth Hoover Contract Specialist U.S. Department of Education 550 12th Street, SW Room 7116 Washington, DC 20202-4447 Telephone: (203) 245-6166 E-mail: [email protected] Maggie Cahalan Team Leader, Secondary/Postsecondary Cross-cutting Program Policy and Program Studies Service, U.S. Department of Education 400 Maryland Avenue SW Room 6W118 Washington, DC 20202 Telephone: (202) 401-1679 Fax: (202) 401-5943 E-mail: [email protected] C. Dennis Carroll Associate Commissioner, Postsecondary Studies

Division National Center for Education Statistics, U.S. Department of Education 1990 K Street NW Room 8107 Washington, DC 20006-5652 Telephone: (202) 502-7323 Fax: (202) 502-7460 E-mail: [email protected]

Michael Cohen Assistant Director for Survey Programs Bureau of Transportation Statistics 400 7th Street SW Room 4432 Washington, DC 20590 Telephone: (202) 366-9949 Fax: (202) 366-3385 E-mail: [email protected] James Griffith Program Director, Postsecondary Longitudinal

and Sample Survey Studies-Postsecondary Studies Division

National Center for Education Statistics U.S. Department of Education 1990 K Street NW Room 8005 Washington, DC 20006 Telephone: (202) 502-7387 Fax: (202) 502-7460 E-mail: [email protected] Brian Harris-Kojetin Senior Statistician Statistical Programs and Standards Office of Management and Budget 725 17th Street NW Room 10201 Washington, DC 20503 Telephone: (202) 395-3093 Fax: (202) 395-7245 E-mail: [email protected] Gregory Henschel Program Analyst National Institute on Postsecondary Education,

Libraries and Lifelong Learning Institute of Education Sciences, U.S. Department of Education 555 New Jersey Avenue NW Suite 627 Washington, DC 20208-5531 Telephone: (202) 219-2082 Fax: (202) 501-3005 E-mail: [email protected]


Ricardo Hernandez Lead Research Analyst Office of Vocational and Adult Education

(OVAE), U.S. Department of Education Organizational Structure and Offices

Potomac Center Plaza 550 12th Street SW Room 11137 Washington, DC 20065 Telephone: (202) 245-7818 Fax: (202) 205-8793 E-mail: [email protected] Lisa Hudson Education Statistician National Center for Education Statistics, U.S. Department of Education 1990 K Street NW Room 9035 Washington, DC 20006 Telephone: (202) 502-7358 Fax: (202) 502-7490 E-mail: [email protected] Tracy Hunt-White Education Statistician National Center for Education Statistics, U.S. Department of Education 1990 K Street NW Room 8121 Washington, DC 20006 Telephone: (202) 502-7438 Fax: (202) 502-7460 E-mail: [email protected]

Paula Knepper Senior Technical Advisor, Postsecondary

Studies Division National Center for Education Statistics, U.S. Department of Education 1990 K Street NW Room 8104 Washington, DC 20006-5652 Telephone: (202) 502-7367 Fax: (202) 502-7372 E-mail: [email protected] Roslyn Korb Program Director Postsecondary Cooperative Programs and

Analysis and Dissemination Program, Postsecondary Studies Division, National Center for Education Statistics U.S. Department of Education, NCES 1990 K Street NW Room 8132 Washington, DC 20006-5652 Telephone: (202) 502-7378 Fax: (202) 219-1529 E-mail: [email protected] Edith McArthur Demographer National Center for Education Statistics, U.S. Department of Education 1990 K Street NW Room 9115 Washington, DC 20006-5652 Telephone: (202) 502-7393 Fax: (202) 502-7490 E-mail: [email protected]


Frank Shaw Budget Analyst National Endowment for the Humanities Office of Planning and Budget 1100 Pennsylvania Avenue NW Room 403 Washington, DC 20506 Telephone: (202) 606-8428 Fax: (202) 606-8619 E-mail: [email protected] Larry Suter National Science Foundation Research, Evaluation, and Communications 4201 Wilson Boulevard Suite 855 Arlington, VA 22230 Telephone: (703) 306-1650 Fax: (703) 306-0434 E-mail: [email protected] Linda Zimbler Project Officer, NSOPF:04 National Center for Education Statistics, U.S. Department of Education, 1990 K Street NW Room 8123 Washington, DC 20006-5652 Telephone: (202) 502-7481 E-mail: [email protected] MPR Associates Ellen Bradburn Senior Research Associate MPR Associates 2150 Shattuck Avenue Suite 800 Berkeley, CA 94704 Telephone: (510) 849-4942 Fax: (510) 849-0794 E-mail: [email protected]

Susan Choy Vice President MPR Associates, Inc. 2150 Shattuck Avenue Suite 800 Berkeley, CA 94704 Telephone: (510) 849-4942 Fax: (510) 849-0794 E-mail: [email protected] Gary Hoachlander President MPR Associates 2150 Shattuck Avenue Suite 800 Berkeley, CA 94704 Telephone: (510) 849-4942 Fax: (510) 849-0794 E-mail: [email protected]


Pinkerton Computer Consultants, Inc. Daniel Heffron Statistician Pinkerton Computer Consultants, Inc. 2750 Prosperity Avenue Suite 300 Fairfax, VA 22031 Telephone: (703) 245-7388 Fax: (703) 245-7560 E-mail: [email protected] RTI International Janet Austin Technical Staff Assistant/TRP Coordinator RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-7101 Fax: (919) 541-7014 E-mail: [email protected] Lisa Carley-Baxter Survey Manager, NSOPF RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 485-2616 Fax: (919) 541-1261 E-mail: [email protected] James Chromy Senior Statistical Advisor RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-7019 Fax: (919) 541-6416 E-mail: [email protected]

Jayme Curry-Tucker Education Analyst RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 485-5592 Fax: (919) 541-7014 E-mail: [email protected] T.R. Curtin Instrumentation Task Leader, NSOPF RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-6538 Fax: (919) 541-7014 E-mail: [email protected] Mansour Fahimi Statistical Task Leader, NSOPF RTI International 6110 Executive Boulevard Suite 420 Rockville, MD 20852-3907 Telephone: (301) 230-4675 Fax: (301) 230-4647 E-mail: [email protected] Patricia Green Project Director, NSOPF RTI International 203 North Wabash Suite 1900 Chicago, IL 60601 Telephone: (312) 456-5260 Fax: (312) 456-5250 E-mail: pgreen @rti.org


Ruth Heuer Methodology Report Task Leader, NSOPF RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-6457 Fax: (919) 541-7014 E-mail: [email protected] Marjorie Hinsdale-Shouse Data Collection Task Leader, NSOPF RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-7368 Fax: (919) 541-7250 E-mail: [email protected] Donna Jewell Deputy Project Director, NSOPF RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-7266 Fax: (919) 541-6178 E-mail: [email protected]

Brian Kuhr Institutional Coordinator, NSoFaS RTI International 203 North Wabash Suite 1900 Chicago, IL 60601 Telephone: (312) 456-5263 Fax: (312) 456-5250 E-mail: [email protected] John Riccobono Director, Education Surveys Division RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-7006 Fax: (919) 541-7014 E-mail: [email protected] Peter Siegel Research Statistician RTI International P.O. Box 12194 Research Triangle Park, NC 27709 Telephone: (919) 541-6348 Fax: (919) 541-6416 E-mail: [email protected]


Appendix C Facsimile Instruments

Institution Instrument

Faculty Instrument


Institution Instrument


NSOPF:04 Institution Instrument

Full-Scale Study Facsimile


►Introduction: Number of Faculty and Instructional Staff Form: I1 Name: I1a Label: Number full-time faculty, fall 2003, reported Name: I1b Label: Number part-time faculty, fall 2003 Form Administered To: All institutions StemWording: As of November 1, 2003 (or during the Fall Term of the 2003-2004 academic year when your faculty lists are considered complete), how many full-time and part-time faculty and instructional staff were employed by [FILL INSTNAME]? Please report the total number of persons (i.e., headcount) rather than full-time equivalents (FTEs). (Please enter a number in each box; if none, enter "0".) NOTE: By faculty and instructional staff, we mean any faculty PLUS any other employees with instructional responsibilities, regardless of whether or not they have faculty status. Please choose "Help" for additional details. * a. Full-time faculty and instructional staff. * b. Part-time faculty and instructional staff.

►SECTION A: Full-Time Faculty and Instructional Staff Form: I2 Name: I2a Label: Full-time numbers: faculty, fall 2002 Name: I2b Label: Full-time numbers: changed from part to full time, 2002-03 Name: I2c Label: Full-time numbers: hired, 2002-03 Name: I2d Label: Full-time numbers: retired, 2002-03 Name: I2e Label: Full-time numbers: left for other reasons, 2002-03 Name: I2f Label: Full-time numbers: changed from full to part time, 2002-03 Name: I2g Label: Full-time numbers: faculty, fall 2003, calculated Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Please provide the following information about changes in the number of full-time faculty and instructional staff between the 2002 and 2003 Fall Terms at this institution. (Please enter a number in each box; if none, enter "0".) * a. Total at start of 2002-2003 academic year (on or about November 1, 2002) * b. Number who changed from part-time to full-time status during 2002-2003 academic year

(between Nov. 1, 2002 and Nov. 1, 2003) * c. Number of new hires during 2002-2003 academic year * d. Number retired between Nov. 1, 2002 and Nov. 1, 2003


* e. Number who left for other reasons during 2002-2003 academic year * f. Number changed from full-time to part-time status during the 2002-2003 academic year * g. Total number as of Nov. 1, 2003 (or at the start of the 2003-04 academic year) Form: I2A Label: Full-time numbers: inconsistent count reason Form Administered To: Institutions with full-time faculty and instructional staff whose provided counts of full-time faculty and instructional staff are inconsistent (I1a ≠ I2g) StemWording: You provided two different counts of the number of full-time faculty as of November 1, 2003: ( I1a = [FILL I1a] and I2g = [FILL I2g]) . Please back up and correct the number or provide an explanation of this discrepancy. Reason for discrepancy: [ ] Form: I3 Label: Full-time tenure: has tenure system Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Does [FILL INSTNAME] have a tenure system for any full-time faculty and instructional staff? 1 = Yes, has a tenure system 2 = Currently no tenure system, but still have tenured staff 3 = No tenure system Form: I4 Label: Full-time tenure: number considered for tenure, 2002-03 Form Administered To: Institutions with a tenure system for full-time faculty and instructional staff StemWording: During the 2002-2003 academic year (i.e., Fall 2002 through Spring 2003), how many full-time faculty and instructional staff at your institution were considered for tenure? * (Please enter a number in the box; if none, enter "0".)


Form: I5 Label: Full-time tenure: number granted tenure, 2002-03 Form Administered To: Institutions with a tenure system for full-time faculty and instructional staff who considered at least one faculty member for tenure during the 2002-2003 academic year StemWording: Of the [FILL I4] faculty members considered for tenure during the 2002-2003 academic year, how many were granted tenure? * (If none, enter "0".) Form: I6 Label: Full-time tenure: maximum years on tenure track Form Administered To: Institutions with a tenure system for full-time faculty and instructional staff StemWording: For those on a tenure track but not tenured, what is the maximum number of years full-time faculty and instructional staff can be on a tenure track and not receive tenure at [FILL INSTNAME] ? 0 = No maximum 1 = 1 2 = 2 . . . 15 = 15 16 = More than 15 years Form: I7 Name: I7a Label: Full-time tenure: changed tenure policy Name: I7b Label: Full-time tenure: more stringent tenure standards Name: I7c Label: Full-time tenure: downsized tenured faculty Name: I7d Label: Full-time tenure: replaced tenured with fixed term Name: I7e Label: Full-time tenure: offered early retirement Form Administered To: Institutions with a tenure system for full-time faculty and instructional staff StemWording: During the past five years, has your institution done any of the following? * a. Changed policy for granting tenure to full-time faculty and instructional staff * b. Made the standards more stringent for granting tenure to full-time faculty

and instructional staff * c. Reduced the number of tenured full-time faculty and instructional staff through

downsizing


* d. Replaced some tenured or tenure-track full-time faculty and instructional staff with

full-time faculty and instructional staff on fixed term contracts * e. Offered early or phased retirement to any tenured full-time faculty or

instructional staff 0 = No 1 = Yes Form: I7SP Name: I7e2 Label: Full-time tenure: number early retirees, last 5 years Form Administered To: Institutions offering early or phased retirement to any tenured full-time faculty and instructional staff StemWording: You said your institution offered early or phased retirement. How many full-time faculty and instructional staff took this during the past five years? * (If none, enter "0") Form: I8 Label: Full-time tenure: discontinued tenure system, last 5 years Form Administered To: Institutions with no tenure system for full-time faculty and instructional staff StemWording: Did [FILL INSTNAME] discontinue the tenure system within the last five years? 0 = No 1 = Yes Form: I9 Label: Full-time faculty: positions sought to fill, fall 2003 Form Administered To: Institutions with full-time faculty and instructional staff StemWording: How many full-time faculty and instructional staff positions was your institution seeking to fill for the 2003 Fall Term? * (If none, enter "0")


Form: I10a Name: I10aa Label: Full-time benefit: medical insurance Name: I10ab Label: Full-time benefit: dental insurance Name: I10ac Label: Full-time benefit: disability insurance Name: I10ad Label: Full-time benefit: life insurance Name: I10ae Label: Full-time benefit: child care Name: I10af Label: Full-time benefit: retiree medical insurance Name: I10ag Label: Full-time benefit: cafeteria-style plan Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Are the following employee benefits available to all, some, or none of the full-time faculty and instructional staff at [FILL INSTNAME]? * a. Medical insurance or medical care * b. Dental insurance or dental care * c. Disability insurance program * d. Life insurance * e. Child care * f. Medical insurance for retirees * g. "Cafeteria-style" benefits plan (a plan under which staff can trade off

some benefits for others, following guidelines established by the institution) –1 = Don’t know 1 = All 2 = Some 3 = None Form: I10b Name: I10ba Label: Full-time benefit: medical insurance subsidized Name: I10bb Label: Full-time benefit: dental insurance subsidized Name: I10bc Label: Full-time benefit: disability insurance subsidized Name: I10bd Label: Full-time benefit: life insurance subsidized Name: I10be Label: Full-time benefit: child care subsidized Name: I10bf Label: Full-time benefit: retiree medical insurance subsidized Name: I10bg Label: Full-time benefit: cafeteria-style plan subsidized Form Administered To: Institutions that provide at least one employee benefit to full-time faculty and instructional staff StemWording: [IF ONE BENEFIT SELECTED ON FORM I10] Is this employee benefit subsidized by your institution? (Subsidized means paid for completely or in part by the institution.)


[ELSE IF MORE THAN ONE BENEFIT SELECTED ON FORM I10A] Are these employee benefits subsidized by your institution? (Subsidized means paid for completely or in part by the institution.) [ENDIF] * Medical insurance or medical care * Dental insurance or dental care * Disability insurance program * Life insurance * Child care * Medical insurance for retirees * Cafeteria-style benefits plan (a plan under which staff can trade off some benefits

for others, following guidelines established by the institution) 0 = Not subsidized 1 = Fully/partially subsidized Form: I11 Name: I11a Label: Full-time benefit: wellness program Name: I11b Label: Full-time benefit: spouse tuition remission Name: I11c Label: Full-time benefit: children tuition remission Name: I11d Label: Full-time benefit: housing Name: I11e Label: Full-time benefit: transportation/parking Name: I11f Label: Full-time benefit: paid maternity leave Name: I11g Label: Full-time benefit: paid paternity leave Name: I11h Label: Full-time benefit: paid sabbatical leave Name: I11i Label: Full-time benefit: employee assistance program Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Are the following employee benefits available to all, some, or none of the full-time faculty and instructional staff at [FILL INSTNAME]? * a. Wellness program or health promotion * b. Tuition remission/grants for spouse at this or other institutions * c. Tuition remission/grants for children at this or other institutions * d. Housing/mortgage; rent * e. Transportation/parking * f. Paid maternity leave * g. Paid paternity leave


* h. Paid sabbatical leave * i. Employee assistance program –1 = Don’t know 1 = All 2 = Some 3 = None Form: I12 Label: Full-time faculty: union representation Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Are any full-time faculty and instructional staff legally represented by a union (or other association) for purposes of collective bargaining with [FILL INSTNAME]? 0 = No 1 = Yes Form: I13 Name: I13a Label: Full-time assessment: student evaluations Name: I13b Label: Full-time assessment: student test scores Name: I13c Label: Full-time assessment: student career placement Name: I13d Label: Full-time assessment: other student performance Name: I13e Label: Full-time assessment: department chair evaluations Name: I13f Label: Full-time assessment: dean evaluations Name: I13g Label: Full-time assessment: peer evaluations Name: I13h Label: Full-time assessment: self-evaluations Form Administered To: Institutions with full-time faculty and instructional staff StemWording: Are any of the following used as part of institution or department/school policy in assessing the teaching performance of full-time instructional faculty/staff at this institution? Used for Teaching Assessment: * a. Student evaluations * b. Student test scores * c. Student career placement * d. Other measures of student performance * e. Department/division chair evaluations


* f. Dean evaluations * g. Peer evaluations * h. Self-evaluations –1 = Don’t know 0 = No 1 = Yes

►SECTION B: Part-Time Faculty and Instructional Staff Form: I14 Label: Part-time benefit: retirement plan Form Administered To: Institutions with part-time faculty and instructional staff StemWording: In this next section, we will be asking you to consider [FILL INSTNAME]'s part-time faculty and instructional staff. Are any retirement plans available to part-time faculty or instructional staff at your institution? 0 = Not available to any part-time faculty and instructional staff 1 = Yes, available to some part-time faculty and instructional staff 2 = Yes, available to most part-time faculty and instructional staff 3 = Yes, available to all part-time faculty and instructional staff (Reminder: Part time refers to an individual's employment status at the institution rather than to the amount of instruction done by the individual.) Form: I15a Name: I15aa Label: Part-time benefit: medical insurance Name: I15ab Label: Part-time benefit: dental insurance Name: I15ac Label: Part-time benefit: disability insurance Name: I15ad Label: Part-time benefit: life insurance Name: I15ae Label: Part-time benefit: child care Name: I15af Label: Part-time benefit: retiree medical insurance Name: I15ag Label: Part-time benefit: cafeteria-style plan Form Administered To: Institutions with part-time faculty and instructional staff StemWording: Are the following employee benefits available to all, some, or none of the part-time faculty and instructional staff at [FILL INSTNAME]? * a. Medical insurance or medical care * b. Dental insurance or dental care * c. Disability insurance program


* d. Life insurance * e. Child care * f. Medical insurance for retirees * g. "Cafeteria-style" benefits plan (a plan under which staff can trade off some benefits for others,

following guidelines established by the institution) –1 = Don’t know 1 = All 2 = Some 3 = None Form: I15b Name: I15ba Label: Part-time benefit: medical insurance subsidized Name: I15bb Label: Part-time benefit: dental insurance subsidized Name: I15bc Label: Part-time benefit: disability insurance subsidized Name: I15bd Label: Part-time benefit: life insurance subsidized Name: I15be Label: Part-time benefit: child care subsidized Name: I15bf Label: Part-time benefit: retiree medical insurance subsidized Name: I15bg Label: Part-time benefit: cafeteria-style plan subsidized Form Administered To: Institutions that provide at least one employee benefit to part-time faculty and instructional staff StemWording: Still thinking only of part-time faculty, [IF ONE BENEFIT SELECTED ON FORM I15A] is this employee benefit subsidized by your institution? [ELSE IF MORE THAN ONE BENEFIT SELECTED ON FORM I15A] are these employee benefits subsidized by your institution? [ENDIF] (Subsidized means paid for completely or in part by the institution.) *Medical insurance or medical care * Dental insurance or dental care * Disability insurance program * Life insurance * Child care * Medical insurance for retirees


* "Cafeteria-style" benefits plan 0 = Not subsidized 1 = Fully/partially subsidized Form: I16 Name: I16a Label: Part-time benefit: wellness program Name: I16b Label: Part-time benefit: spouse tuition remission Name: I16c Label: Part-time benefit: children tuition remission Name: I16d Label: Part-time benefit: housing Name: I16e Label: Part-time benefit: transportation/parking Name: I16f Label: Part-time benefit: paid maternity leave Name: I16g Label: Part-time benefit: paid paternity leave Name: I16h Label: Part-time benefit: paid sabbatical leave Name: I16i Label: Part-time benefit: employee assistance program Form Administered To: Institutions with part-time faculty and instructional staff StemWording: Are the following employee benefits available to all, some, or none of the part-time faculty and instructional staff at [FILL INSTNAME]? * a. Wellness program or health promotion * b. Tuition remission/grants for spouse at this or other institutions * c. Tuition remission/grants for children at this or other institutions * d. Housing/mortgage; rent * e. Transportation/parking * f. Paid maternity leave * g. Paid paternity leave * h. Paid sabbatical leave * i. Employee assistance program –1 = Don't know 1 = All 2 = Some 3 = None


Form: I17 Label: Part-time faculty: union representation Form Administered To: Institutions with part-time faculty and instructional staff StemWording: Are any part-time faculty and instructional staff legally represented by a union (or other association) for purposes of collective bargaining with [FILL INSTNAME]? 0 = No 1 = Yes Form: I18 Name: I18a Label: Part-time assessment: student evaluations Name: I18b Label: Part-time assessment: student test scores Name: I18c Label: Part-time assessment: student career placement Name: I18d Label: Part-time assessment: other student performance Name: I18e Label: Part-time assessment: department chair evaluations Name: I18f Label: Part-time assessment: dean evaluations Name: I18g Label: Part-time assessment: peer evaluations Name: I18h Label: Part-time assessment: self-evaluations Form Administered To: Institutions with part-time faculty and instructional staff StemWording: Are any of the following used as part of institution or department/school policy in assessing the teaching performance of part-time instructional faculty/staff at this institution? Used for Teaching Assessment: * a. Student evaluations * b. Student test scores * c. Student career placement * d. Other measures of student performance * e. Department/division chair evaluations * f. Dean evaluations * g. Peer evaluations * h. Self-evaluations –1 = Don't know 0 = No 1 = Yes


►SECTION C: All Faculty and Instructional Staff Form: I19 Name: I19a Label: Undergraduate instruction: percent full-time faculty Name: I19b Label: Undergraduate instruction: percent part-time faculty Name: I19c Label: Undergraduate instruction: percent teaching assistants Name: I19d Label: Undergraduate instruction: percent other Form Administered To: All institutions StemWording: What percentage of undergraduate student credit hours were assigned to the following staff during the 2003 Fall term? Student credit hours are defined as the number of course credits or contact hours multiplied by the number of students enrolled. * Percent of undergraduate instruction assigned to: * a. Full-time faculty or instructional staff * b. Part-time faculty or instructional staff, including adjuncts * c. Teaching assistants such as graduate students who teach classes * d. Others


Faculty Instrument


NSOPF:04 Faculty Instrument

Full-Scale Study Facsimile


►SECTION A: Nature of Employment Form: Q1 Label: Instructional duties, any Form Administered To: All faculty and instructional staff StemWording: During the 2003 Fall Term, did you have any instructional duties at [FILL INSTNAME], such as teaching students in one or more credit or noncredit courses, or advising or supervising students' academic activities? (By instructional duties, we mean teaching credit or noncredit courses, advising or supervising students' academic activities, serving on undergraduate or graduate thesis or dissertation committees, supervising independent study or one-on-one instruction, etc., during the 2003 Fall Term.) 0 = No 1 = Yes Form: Q2 Label: Instructional duties related to credit courses/activities Form Administered To: Faculty with instructional duties, Fall 2003 StemWording: Did any of your instructional duties include teaching students in credit courses, or advising students or supervising students' academic activities for which they received credit during the 2003 Fall Term? 0 = No 1 = Yes Form: Q3 Label: Faculty status Form Administered To: All faculty and instructional staff StemWording: During the 2003 Fall Term at [FILL INSTNAME], did you have faculty status as defined by that institution? 0 = No 1 = Yes


Form: Q3X Label: Confirm study ineligibility Form Administered To: Sample members without faculty status and with no instructional duties during the 2003 Fall term StemWording: Just to confirm, you did not have faculty status and you did not teach any classes, or advise or supervise any students at [FILL INSTNAME] during the 2003 Fall Term? 1 = Agree: NOT faculty and DID NOT have any instructional duties 2 = Disagree: Had faculty status and/or had instructional duties Form: Q4 Label: Principal activity Form Administered To: All faculty and instructional staff StemWording: Was your principal activity at [FILL INSTNAME] during the 2003 Fall Term. . . (If you had equal responsibilities, please select one.) 1 = Teaching 2 = Research 3 = Public service 4 = Clinical service 5 = Administration (e.g., Dean, Chair, Director, etc.) 6 = On sabbatical from this institution 7 = Other activity (e.g., technical activity such as programmer or technician; other institutional

activities such as library services; subsidized performer, artist-in-residence, etc.) Form: Q5 Label: Employed full or part time at this institution Form Administered To: All faculty and instructional staff StemWording: During the 2003 Fall Term, did [FILL INSTNAME] consider you to be employed full time or part time? 1 = Full time 2 = Part time


Form: Q6 Label: Part-time employment is primary employment Form Administered To: Part-time faculty and instructional staff StemWording: Do you consider your part-time position at [FILL INSTNAME] to be your primary employment? 0 = No 1 = Yes Form: Q8 Label: Part-time but preferred full-time position Form Administered To: Part-time faculty and instructional staff StemWording: Would you have preferred a full-time position for the 2003 Fall Term at [FILL INSTNAME]? 0 = No 1 = Yes Form: Q9 Label: Year began current job Form Administered To: All faculty and instructional staff StemWording: In what year did you start working at the job you held during the 2003 Fall Term at [FILL INSTNAME]? Consider promotions in rank as part of the same job. * Year: Form: Q10 Label: Rank

Form Administered To: All faculty and instructional staff

StemWording: During the 2003 Fall Term, was your academic rank, title, or position at [FILL INSTNAME] . . . (If no ranks are designated at your institution, select "Not applicable.") 0 = Not applicable (No formal ranks are designated at this institution) 1 = Professor 2 = Associate professor 3 = Assistant professor 4 = Instructor 5 = Lecturer 6 = Other title (e.g., Administrative, Adjunct, Emeritus, other)


Form: Q11 Label: Rank, year attained professor or associate professor Form Administered To: Faculty and instructional staff who hold the rank of professor or associate professor StemWording: In what year did you first achieve the rank of [FILL Q10] at any institution? * Year: Form: Q12 Label: Tenure status Form Administered To: All faculty and instructional staff StemWording: During the 2003 Fall Term at [FILL INSTNAME], were you . . . 1 = Tenured 2 = On tenure track but not tenured 3 = Not on tenure track 4 = Not tenured because institution had no tenure system Form: Q13 Label: Tenure, year attained at any postsecondary institution Form Administered To: Tenured faculty and instructional staff StemWording: In what year did you first achieve tenure at any postsecondary institution? * Year: Form: Q14 Label: Union status Form Administered To: All faculty and instructional staff StemWording: Are you a member of a union or other bargaining association that is legally recognized to represent the faculty at [FILL INSTNAME]? 0 = No 1 = Yes


Form: Q15 Label: Union status, reason not a member Form Administered To: Faculty and instructional staff who are not members of a union StemWording: Is that because a union is not available, you are not eligible to join, or you decided not to join? –1 = Don't know 1 = Union is not available 2 = Union is available, but I am not eligible 3 = I am eligible, but I decided not to join Form: Q16VS Label: Principal field of teaching-verbatim Form Administered To: All faculty and instructional staff StemWording: What is your principal field or discipline of teaching at [FILL INSTNAME]? (Enter the name of the principal field or discipline in the box below. This name will be used to match against a list of academic fields, so please be specific and do not use abbreviations or acronyms. If you have no principal field, select the "Not applicable" box.) * Name of principal field/discipline of teaching: * Not applicable (No principal teaching field or discipline) Form: Q16AC Label: Principal field of teaching-autocode Form Administered To: Faculty and instructional staff who provided a verbatim field of teaching StemWording: Please select the code below to confirm your field of teaching: [FILL Q16VS] If you do not agree with this code, select "None of these codes" to manually code the field.

Autocoding Explanation: Using the verbatim string of the respondent's teaching field (provided in Q16VS), item Q16AC matches the string to selected categories from the Classification of Instructional Programs (CIP), the federal statistical standard for classifying instructional programs. CIP descriptions that match the verbatim string appear on the screen, and the respondent selects the code that best describes the teaching field. (See pages C-28 through C-30 for a list of codes and descriptions.) Strings that do not match the CIP descriptions are routed to Q16CD for manual coding. The respondent can also modify the verbatim string and redo the match or manually code the teaching field in Q16CD. (Additional information on CIP can be found at http://nces.ed.gov/pubs2002/2002165.pdf.)
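
A heavily simplified sketch of this matching step is shown below; the tiny CIP subset and the use of Python's difflib are assumptions for illustration and do not reproduce the coding system actually used.

import difflib

# Hypothetical illustration: offer the closest CIP descriptions for a verbatim
# teaching-field string; unmatched strings would be routed to manual coding.
CIP_SUBSET = {
    "0803": "Computer science",
    "1511": "Nursing",
    "2503": "Chemistry",
    "3007": "History",
}

def autocode_candidates(verbatim, n=3, cutoff=0.6):
    matches = difflib.get_close_matches(verbatim.strip().title(),
                                        list(CIP_SUBSET.values()), n=n, cutoff=cutoff)
    return [(code, desc) for code, desc in CIP_SUBSET.items() if desc in matches]

# Example: a misspelled "chemisty" still surfaces ("2503", "Chemistry") for confirmation.
print(autocode_candidates("chemisty"))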


Form: Q16CD Name: Q16CD2 Label: Principal field of teaching-general code Name: Q16CD4 Label: Principal field of teaching-specific code Form Administered To: Faculty and instructional staff who provided a verbatim field of teaching, but whose results were not autocoded StemWording: Please help us to categorize "[FILL Q16VS]" using the drop-down list boxes. (Coding Directions: Please select a general area and then the specific discipline within the general area. Use the arrow at the right side of the first dropdown box to display the general areas. Click to select the desired general area, and then select the desired specific discipline within the area from the second dropdown box.) * General Area:

01 = Agriculture/natural resources/related 17 = Library science 02 = Architecture and related services 18 = Mathematics and statistics 03 = Area/ethnic/cultural/gender studies 19 = Mechanical/repair technologies/techs 04 = Arts--visual and performing 20 = Multi/interdisciplinary studies 05 = Biological and biomedical sciences 21 = Parks/recreation/leisure/fitness studies 06 = Business/management/marketing/ related 22 = Precision production 07 = Communication/journalism/comm. Tech 23 = Personal and culinary services 08 = Computer/info sciences/support tech 24 = Philosophy, religion & theology 09 = Construction trades 25 = Physical sciences 10 = Education 26 = Psychology 11 = Engineering technologies/technicians 27 = Public administration/social services 12 = English language and literature/letters 28 = Science technologies/technicians 13 = Family/consumer sciences, human sciences 29 = Security & protective services 14 = Foreign languages/literature/linguistics 30 = Social sciences (except psych) and history 15 = Health professions/clinical sciences 31 = Transportation & materials moving 16 = Legal professions and studies 32 = Other


* Specific Discipline:

0101 = Agriculture and related sciences
0102 = Natural resources and conservation
0201 = Architecture and related services
0301 = Area/ethnic/cultural/gender studies
0401 = Art history, criticism & conservation
0402 = Design & applied arts
0403 = Drama/theatre arts and stagecraft
0404 = Fine and studio art
0405 = Music, general
0406 = Music history, literature, and theory
0407 = Visual and performing arts, other
0408 = Commercial and advertising art
0409 = Dance
0410 = Film/video and photographic arts
0501 = Biochem/biophysics/molecular biology
0502 = Botany/plant biology
0503 = Genetics
0504 = Microbiological sciences & immunology
0505 = Physiology, pathology & related sciences
0506 = Zoology/animal biology
0507 = Biological & biomedical sciences, other
0601 = Accounting and related services
0602 = Business admin/management/operations
0603 = Business operations support/assistance
0604 = Finance/financial management services
0605 = Human resources management and svcs
0606 = Marketing
0607 = Business/mgt/marketing/related, other
0608 = Management information systems/services
0701 = Communication/journalism/related pgms
0702 = Communication technologies/technicians and support services
0801 = Computer/info tech administration/mgmt
0802 = Computer programming
0803 = Computer science
0804 = Computer software and media applications
0805 = Computer systems analysis
0806 = Computer systems networking/telecomm
0807 = Data entry/microcomputer applications
0808 = Data processing
0809 = Information science/studies
0810 = Computer/info sci/support svcs, other
0901 = Construction trades
1001 = Curriculum and instruction
1002 = Educational administration/supervision
1003 = Educational/instructional media design
1004 = Special education and teaching
1005 = Student counseling/personnel services
1006 = Education, other
1007 = Early childhood education and teaching
1008 = Elementary education and teaching
1009 = Secondary education and teaching
1010 = Adult and continuing education/teaching
1011 = Teacher ed: specific levels, other
1012 = Teacher ed: specific subject areas
1013 = Bilingual & multicultural education
1014 = Ed assessment
1015 = Higher education
1101 = Biomedical/medical engineering
1102 = Chemical engineering
1103 = Civil engineering
1104 = Computer engineering
1105 = Electrical/electronics/comms engineering
1106 = Engineering technologies/technicians
1107 = Environmental/environmental health eng
1108 = Mechanical engineering
1109 = Engineering, other
1201 = English language and literature/letters
1301 = Family/consumer sciences, human sciences
1401 = Foreign languages/literature/linguistics
1501 = Alternative/complementary medicine/sys
1502 = Chiropractic
1503 = Clinical/medical lab science/allied
1504 = Dental support services/allied
1505 = Dentistry
1506 = Health & medical administrative services
1507 = Allied health and medical assisting services
1508 = Allied health diagnostic, intervention, treatment professions
1509 = Medicine, including psychiatry
1510 = Mental/social health services and allied
1511 = Nursing
1512 = Optometry
1513 = Osteopathic medicine/osteopathy
1514 = Pharmacy/pharmaceutical sciences/admin
1515 = Podiatric medicine/podiatry
1516 = Public health
1517 = Rehabilitation & therapeutic professions
1518 = Veterinary medicine
1519 = Health/related clinical services, other
1601 = Law
1602 = Legal support services
1603 = Legal professions and studies, other
1701 = Library science
1801 = Mathematics
1802 = Statistics
1901 = Mechanical/repair technologies/techs
2001 = Multi/interdisciplinary studies
2101 = Parks, recreation and leisure studies
2102 = Health and physical education/fitness
2201 = Precision production
2301 = Culinary arts and related services
2302 = Personal and culinary services
2401 = Philosophy
2402 = Religion/religious studies
2403 = Theology and religious vocations
2501 = Astronomy & astrophysics
2502 = Atmospheric sciences and meteorology
2503 = Chemistry
2504 = Geological & earth sciences/geosciences
2505 = Physics
2506 = Physical sciences, other
2601 = Behavioral psychology
2602 = Clinical psychology
2603 = Education/school psychology
2604 = Psychology, other
2701 = Public administration
2702 = Social work
2703 = Public administration & social svcs other
2801 = Science technologies/technicians
2901 = Corrections
2902 = Criminal justice
2903 = Fire protection
2904 = Police science
2905 = Security and protective services, other
3001 = Anthropology (except psychology)
3002 = Archeology
3003 = Criminology
3004 = Demography & population studies
3005 = Economics
3006 = Geography & cartography
3007 = History
3008 = International relations & affairs
3009 = Political science and government
3010 = Sociology
3011 = Urban studies/affairs
3012 = Social sciences, other
3101 = Transportation & materials moving
3201 = Other

►SECTION B: Academic/Professional Background Form: Q17a1 Label: Highest degree Form Administered To: All faculty and instructional staff StemWording: What is the highest degree you have completed? Do not include honorary degrees. (If you have none of the degrees or awards, select "Not applicable.") 0 = Not applicable (Do not hold a degree) 1 = Doctoral degree (Ph.D., Ed.D., etc.) 2 = First-professional degree (M.D., D.O., D.D.S. or D.M.D., LL.B., J.D., D.C.

or D.C.M., Pharm.D., Pod.D. or D.P., D.V.M., O.D., M.Div. or H.H.L. or B.D.) 3 = Master of Fine Arts, Master of Social Work (M.F.A., M.S.W.) 4 = Other master's degree (M.A., M.S., M.B.A, M.Ed., etc.) 5 = Bachelor's degree (B.A., A.B., B.S., etc.) 6 = Associate's degree or equivalent (A.A., A.S., etc.) 7 = Certificate or diploma for completion of undergraduate program (other than associate's or

bachelor's) Form: Q17a1b Label: Hold PhD in addition to professional degree Form Administered To: Faculty and instructional staff whose highest degree is a first-professional degree StemWording: Do you also hold a Ph.D. or other doctorate? 0 = No 1 = Yes


Form: Q17a2 Label: Highest degree, date awarded Form Administered To: Faculty and instructional staff who hold a degree StemWording: In what year did you receive your [FILL Q17A1 or Q17A1B]? (If you have more than one degree at the same level, please select the most recent degree.) * Year received: Form: Q17a3VS Label: Highest degree field-verbatim Form Administered To: Faculty and instructional staff who hold a degree StemWording: In what field or discipline was your [FILL Q17A1 or Q17A1B]? (Enter the name of your degree field or discipline. This name will be used to match against a list of academic fields, so please be specific and do not use abbreviations or acronyms.) Form: Q17a3AC Label: Highest degree field-autocode Form Administered To: Faculty and instructional staff who provided a verbatim highest degree field StemWording: Please select the appropriate code for your [FILL Q17A1 or Q17A1B] field: [FILL Q17a3VS]. If you do not agree with these codes, select "None of these codes" to manually code the field.

Autocoding Explanation: Using the verbatim string of the respondent's highest degree field (provided in Q17A3VS), item Q17A3AC matches the string to selected CIP categories (see pages C-28 through C-30 for a list of codes and descriptions). Descriptions that match the verbatim string appear on the screen, and the respondent selects the code that best describes the degree field. Strings that do not match the CIP descriptions are routed to Q17A3CD for manual coding. (The respondent can also modify the verbatim string and redo the match or manually code the degree field in Q17A3CD.)
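To make the routing concrete, the following Python sketch illustrates the kind of keyword match the autocoding step performs against the code list on pages C-28 through C-30. It is illustrative only, not the production coding routine: the small CIP_CODES dictionary, the word-overlap matching rule, and the MANUAL_CODING sentinel are assumptions introduced for this example.

# Minimal sketch of the Q17A3AC autocoding step (illustrative only; the actual
# matcher and its full CIP list belong to the NSOPF:04 web/CATI instrument).
CIP_CODES = {
    "0803": "Computer science",
    "1801": "Mathematics",
    "2503": "Chemistry",
    "3005": "Economics",
}

MANUAL_CODING = None  # sentinel meaning "route the case to Q17A3CD"


def autocode(verbatim):
    """Return (code, description) pairs whose description shares a word
    with the respondent's verbatim degree field."""
    words = {w.strip(",.;").lower() for w in verbatim.split()}
    return [
        (code, desc)
        for code, desc in CIP_CODES.items()
        if words & {w.strip(",.;").lower() for w in desc.split()}
    ]


def route(verbatim):
    """Show candidate codes for the respondent to choose from, or fall back
    to manual coding in Q17A3CD when nothing matches."""
    candidates = autocode(verbatim)
    return candidates if candidates else MANUAL_CODING


print(route("computer science"))  # [('0803', 'Computer science')]
print(route("econometrics"))      # None -> manual coding in Q17A3CD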


Form: Q17a3CD Name: Q17a3C2 Label: Highest degree field-general code Name: Q17a3C4 Label: Highest degree field-specific code Form Administered To: Faculty and instructional staff who provided a verbatim highest degree field, but whose results were not autocoded StemWording: Please help us categorize "[FILL Q17a3VS]" using the drop–down list boxes below. [IF Q16CD ≥ 0] (Select one from the list of disciplines you've already told us about:) [ENDIF] (Coding Directions: Please select a general area and then the specific discipline within the general area. Use the arrow at the right side of the first dropdown box to display the general areas. Click to select the desired general area, and then select the desired specific discipline within the area from the second dropdown box.) * General Area: * Specific Discipline: Note: Please refer to the complete list of instructional program codes on pages C-28 through C-30. Form: Q17a4 Name: Q17a4ST Label: Highest degree institution-state Name: Q17a4C Label: Highest degree institution-city Name: Q17a4N Label: Highest degree institution-name Name: Q17a4I Label: Highest degree institution-IPEDS Form Administered To: Faculty and instructional staff who hold a degree StemWording: Please help us code the postsecondary institution that awarded your [FILL Q17A1 or Q17A1B] by providing the state and city in which it was located. (Steps: 1. Please select the state in which the school was located. If the school was located in another

country, select "foreign country." 2. Enter the name of the city in which the institution was located. You can also use the "Browse"

link to identify the city. 3. Select the "Continue" button to list the schools located in that state and city. 4. Select the desired school.


Problems? Try searching for the school by state without listing a city. If you still can't find the school, select the "Unable To Find School in List" button at the bottom of the search results.) * State/Foreign:

1 = Alabama
2 = Alaska
3 = Arizona
4 = Arkansas
5 = California
6 = Colorado
7 = Connecticut
8 = Delaware
9 = District of Columbia
10 = Florida
11 = Georgia
12 = Hawaii
13 = Idaho
14 = Illinois
15 = Indiana
16 = Iowa
17 = Kansas
18 = Kentucky
19 = Louisiana
20 = Maine
21 = Maryland
22 = Massachusetts
23 = Michigan
24 = Minnesota
25 = Mississippi
26 = Missouri
27 = Montana
28 = Nebraska
29 = Nevada
30 = New Hampshire
31 = New Jersey
32 = New Mexico
33 = New York
34 = North Carolina
35 = North Dakota
36 = Ohio
37 = Oklahoma
38 = Oregon
39 = Pennsylvania
40 = Rhode Island
41 = South Carolina
42 = South Dakota
43 = Tennessee
44 = Texas
45 = Utah
46 = Vermont
47 = Virginia
48 = Washington
49 = West Virginia
50 = Wisconsin
51 = Wyoming
52 = Puerto Rico
54 = American Samoa
55 = Guam
56 = Federated States of Micronesia
57 = Marshall Islands
58 = Northern Mariana Islands
59 = Palau
60 = U.S. Virgin Islands
63 = Foreign Country

* City: * School Name: Form: Q17d1 Label: Bachelor’s degree date awarded Form Administered To: Faculty and instructional staff who reported their highest degree as master’s level or above StemWording: In what year did you receive your bachelor’s degree? (If you have more than one degree at this level, please select the first degree.) * Year received: * Not applicable (Do not hold a bachelor's degree)


Form: Q18 Label: Other current jobs, number of jobs Form Administered To: All faculty and instructional staff StemWording: While you were employed at [FILL INSTNAME], how many other jobs did you hold during the 2003 Fall Term? Please do not consider any outside consulting jobs. (If none, select "0.") 0 = 0 1 = 1 2 = 2 3 = 3 4 = 4 5 = 5 or more Form: Q19a1 Label: Other current jobs, full-time employment Form Administered To: Faculty and instructional staff with other employment (excluding consulting) StemWording: [IF Q18>1] Were you employed full time at any of these other jobs during the 2003 Fall Term? [ELSE] Were you employed full time at this other job during the 2003 Fall Term? [ENDIF] 0 = No 1 = Yes Form: Q19b1 Label: Other current jobs, number in postsecondary instruction Form Administered To: Faculty and instructional staff with other employment (excluding consulting) StemWording: How many of these other jobs involved instruction at another postsecondary institution during the 2003 Fall Term? (If none, select "0.") 0 = 0 1 = 1 2 = 2 3 = 3 4 = 4 5 = 5 or more


Form: Q21 Label: First postsecondary job, current job is first Form Administered To: All faculty and instructional staff StemWording: Is the job you held at [FILL INSTNAME] during the 2003 Fall Term the first faculty or instructional staff position you have held at a postsecondary institution? Do not include teaching assistant or research assistant positions while you were working on your degree. 0 = No 1 = Yes Form: Q23 Label: First postsecondary job, year began Form Administered To: Faculty and instructional staff who have worked at another postsecondary institution StemWording: In what year did you begin your first faculty or instructional staff position at a postsecondary institution? (Do not include time when you were a teaching or research assistant.) * Year: Form: Q24 Label: First postsecondary job, part or full time Form Administered To: All faculty and instructional staff StemWording: [IF Q21=1] When you first started your job at [FILL INSTNAME], were you employed full time or part time? [ELSE] Were you employed full time or part time at your first faculty or instructional staff position? [ENDIF] (Do not consider teaching or research assistant positions.) 1 = Full time 2 = Part time


Form: Q26 Label: First postsecondary job, tenure status Form Administered To: Faculty and instructional staff whose first faculty position was full time, except those for whom the current position is their first postsecondary position and this institution has no tenure system StemWording: [IF Q21=1] When you began working at [FILL INSTNAME], was your tenure status. . . [ELSE] When you began working at your first faculty or instructional staff job at a postsecondary institution, was your tenure status. . . [ENDIF] 1 = Tenured 2 = On tenure track but not tenured 3 = Not on tenure track 4 = Not tenured because institution had no tenure system Form: Q27 Label: Other jobs, any outside postsecondary since degree Form Administered To: All faculty and instructional staff StemWording: Since receiving your highest degree, have you held any positions outside of postsecondary institutions? 0 = No 1 = Yes


Form: Q28 Label: Other jobs, sector of previous job Form Administered To: All faculty and instructional staff StemWording: Now we would like to know about the job you held prior to starting your current job at [FILL INSTNAME]. Was the job in a . . . (By "Current Job" we mean the position you held at [FILL INSTNAME] during the 2003 Fall Term.) 0 = Not applicable (No job immediately prior to this one) 1 = 4– or 2–year postsecondary institution 2 = Other educational institution 3 = Government (federal, state, local) or military organization 4 = Foundation or other nonprofit organization 5 = For profit business or industry 6 = Other

►SECTION C: Instructional Responsibilities and Workload Form: Q31 Name: Q31a Label: Hours per week on paid tasks at institution Name: Q31b Label: Hours per week on unpaid tasks at institution Name: Q31c Label: Hours per week on paid tasks outside of institution Name: Q31d Label: Hours per week on unpaid tasks outside of institution Form Administered To: All faculty and instructional staff StemWording: This next section of the questionnaire relates to your responsibilities on the job and your workload. On average, how many hours per week did you spend at each of the following work activities during the 2003 Fall Term? (Enter average number of hours. If not sure, give your best estimates. If none, enter "0." If less than one hour, enter “1.”)

* a. All paid activities at [FILL INSTNAME] (e.g., teaching, clinical service, class preparation, research, administration)

* b. All unpaid activities at [FILL INSTNAME] (e.g., club assistance, recruiting, attending institution events)

* c. Any other paid activities outside [FILL INSTNAME] including consulting, working at other jobs, teaching at other schools

* d. Unpaid professional service activities outside [FILL INSTNAME] related to your work. (Do not include volunteer work unrelated to your profession.)


Form: Q32 Name: Q32a Label: Percent time spent on instruction, undergraduate Name: Q32b Label: Percent time spent on instruction, graduate/first-professional Name: Q32c Label: Percent time spent on research activities Name: Q32d Label: Percent time spent on other unspecified activities Form Administered To: Faculty and instructional staff who worked at least one hour per week at the target institution StemWording: [IF Q31A AND Q31B AND Q31C AND Q31D = BLANK] For the hours you worked during the 2003 Fall Term at [FILL INSTNAME], [ELSE] For the [FILL Q31A + Q31B] hours per week you worked during the 2003 Fall Term at [FILL INSTNAME], [ENDIF] we would like you to allot this time—using percentages—into four broad categories: Instruction with undergraduates, Instruction with graduate and first-professional students, Research, and Other Activities. (If you are not sure, give your best estimate. The percentages should sum to 100%. If none for a category, enter "0".) What percentage of your time was spent on. . . * a. Instructional Activities with Undergraduates, including teaching and preparing for

classes, advising, and supervising students at this institution? * b. Instructional Activities with Graduate and First Professional students, including

teaching and preparing for classes, advising, and supervising students at this institution? * c. Research Activities, other forms of scholarship, or grants at this institution? * d. All Other Activities at this institution like administration, professional growth, service, and

other activities not related to teaching or research.


Form: Q35a Name: Q35a1 Label: Number of classes taught, credit Name: Q35a2 Label: Number of classes taught, noncredit Form Administered To: Faculty and instructional staff with instructional duties, Fall 2003 StemWording: Next, we would like to ask you about the classes or sections you taught during the 2003 Fall Term at [FILL INSTNAME]. Please do not include individualized instruction. Questions about independent study, intern supervision, and one-on-one instruction in performance, clinical, or research settings come later. (If none, select "no classes.") How many. . . * a. Classes/sections for credit towards degree did you teach? * b. Classes/sections not for credit towards degree did you teach? (Guidance on Counting Classes Count multiple sections of the same course separately. For example, Sociology 101 taught to

two different groups of students would count as two classes. Count lab or discussion sections as part of the same class unless they have separate credits

assigned to them. For example, a biology class with lectures, labs, and discussion sections each week counts as one class.)

0 = No classes 1 = 1 class . . . 19 = 19 classes 20 = 20 or more classes Form: Q35b Name: Q35b Label: Number of classes taught, remedial Name: Q35c Label: Number of classes taught, distance education Form Administered To: Faculty and instructional staff who taught at least one class StemWording: Of the [FILL Q35A] classes you taught at [FILL INSTNAME] in the 2003 Fall Term, (By remedial or developmental classes, we mean courses in reading, writing, math, or other courses for students lacking the skills necessary to perform college-level work at the level required by your institution. Some institutions refer to these courses as compensatory, basic skills, or some other term. By distance education, we mean classes where students and instructors are separated primarily or exclusively by distance or time.)


* a. How many were remedial or developmental classes? * b. How many were taught through distance education, either exclusively or primarily? 0 = No classes 1 = 1 class . . . 19 = 19 classes 20 = 20 or more classes Form: Q36 Label: Teaching assistant in any credit class Form Administered To: Faculty and instructional staff who taught at least one class for credit StemWording: [IF Q35A1=1] Did you have teaching assistants, readers, graders, or lab assistants for the credit class you taught during the 2003 Fall Term at [FILL INSTNAME]? [ELSE] Did you have teaching assistants, readers, graders, or lab assistants for any of the credit classes you taught during the 2003 Fall Term at [FILL INSTNAME]? [ENDIF] 0 = No 1 = Yes Form: Q37 (loops for up to 5 classes) Name: Q37ai (i = 1 to 5) Label: Number of weeks taught, i-th credit class Name: Q37bi (i = 1 to 5) Label: Number of credit hours, i-th class Name: Q37ci (i = 1 to 5) Label: Number of hours taught per week, i-th class Name: Q37di (i = 1 to 5) Label: Number of students, i-th class Name: Q37ei (i = 1 to 5) Label: Primary level of students, i-th class Name: Q37fi (i = 1 to 5) Label: Teaching assistant, i-th class Form Administered To: Faculty and instructional staff who taught at least one class for credit StemWording: [IF Q35A1>5] You reported earlier that you taught [FILL Q35A1] classes for credit during the 2003 Fall Term at [FILL INSTNAME]. We have space for you to describe 5 of these classes. Please describe the ones you feel are most relevant for your instructional activities. We will call them classes A to E.


[IF Q35A1 >1 AND Q35A1 ≤ 5] You reported earlier that you taught [FILL Q35A1] classes for credit during the 2003 Fall Term at [FILL INSTNAME]. Please answer the following questions for each of these classes, we will call A to [FILL B (IF Q35A1=2) OR C (IF Q35A1=3) OR D (IF Q35A1=4) OR E (IF Q35A1=5)]. [IF Q35A1=1] For the credit class that you reported teaching at [FILL INSTNAME] during the 2003 Fall Term, please answer the following questions. [ENDIF] * a. How many weeks did you teach the class? 0 0 weeks 1 1 week . . . . . . 24 24 weeks 25 25 weeks * b. How many credits were attached to the class? * c. How many hours did you teach the class per week?

(Do not include preparation time.) * d. How many students were enrolled in the class? * e. Were the students in this class primarily undergraduate, graduate, or first

professional (e.g., dental, medical, law, theology)? 1 = Undergraduate 2 = Graduate 3 = First professional * f. Did you have a teaching or lab assistant, reader, or grader assigned

to this class? 0 = No 1 = Yes


Form: Q38 Name: Q38a Label: Undergrad class, multiple choice midterm/final exams Name: Q38b Label: Undergrad class, essay midterm/final exams Name: Q38c Label: Undergrad class, short answer midterm/final exams Name: Q38d Label: Undergrad class, term/research papers Name: Q38e Label: Undergrad class, multiple drafts of written work Name: Q38f Label: Undergrad class, oral presentations Name: Q38g Label: Undergrad class, group projects Name: Q38h Label: Undergrad class, student evaluations of each other's work Name: Q38i Label: Undergrad class, laboratory/shop/studio assignments Name: Q38j Label: Undergrad class, service learn/co-op interactions with

business Form Administered To: Faculty and instructional staff who taught an undergraduate credit class StemWording: [IF Q37EI=1 FOR EXACTLY ONE OF THE Q37Ei, WHERE i=1 TO 5 OR (IF Q32A>0 AND Q32B=0 OR BLANK AND Q35A1=1)] For the undergraduate class you taught for credit during the 2003 Fall Term at [FILL INSTNAME], did you use any of the following? [ELSE] For the undergraduate classes you taught for credit during the 2003 Fall Term at [FILL INSTNAME], did you use any of the following? [ENDIF] Did you use. . . * a. Multiple-choice midterm or final exam? * b. Essay midterm or final exam? * c. Short-answer midterm or final exam? * d. Term/research papers and writing assignments? * e. Multiple drafts of written work? * f. Oral presentations by students? * g. Group and team projects producing a joint product? * h. Student evaluations of each other's work? * i. Laboratory, shop, or studio assignments?


* j. Service learning, co-op experiences or assignments requiring interactions with the community or business/industry?

1 = Used in all classes 2 = Used in some classes 3 = Not used Form: Q39 Label: Website for any instructional duties Form Administered To: Faculty and instructional staff who had instructional duties StemWording: During the 2003 Fall Term at [FILL INSTNAME], did you have one or more web sites for any of your teaching, advising, or other instructional duties? (Web sites used for instructional duties might include the syllabus, readings, assignments, and practice exams for classes; might enable communication with students via listservs or online forums; and might provide real-time computer-based instruction.) 0 = No 1 = Yes Form: Q41 Label: Hours per week, e-mailing students Form Administered To: Faculty and instructional staff who had instructional duties StemWording: During the 2003 Fall Term at [FILL INSTNAME], how many hours per week did you spend communicating by e-mail (electronic mail) with your students? (If none, enter "0.") * Hours per week: Form: Q46 Label: Individual instruction, any Form Administered To: All faculty and instructional staff StemWording: During the 2003 Fall Term, did you provide individual instruction for credit to any student at [FILL INSTNAME]? By individual instruction, we mean independent study, supervising student teachers or interns, and one-on-one instruction like working with students in a clinical or research setting. Do not include dissertation or thesis committee work. 0 = No 1 = Yes


Form: Q47 Name: Q47a1 Label: Individual instruction, number undergraduate students Name: Q47a2 Label: Individual instruction, number graduate students Name: Q47a3 Label: Individual instruction, number first-professional students Form Administered To: Faculty and instructional staff who provided individual instruction to students StemWording: [IF Q32A>0 AND Q32B=0 OR BLANK] How many undergraduate students received individual instruction for credit from you during the 2003 Fall Term? [ELSE] Of the students who received individual instruction for credit from you during the 2003 Fall Term, how many were . . . [ENDIF] (If none, enter "0.") * Undergraduate students * Graduate students * First-professional students (e.g., dental, medical, law, theology) Form: Q47b Name: Q47b1 Label: Individual instruction, hours with undergraduates Name: Q47b2 Label: Individual instruction, hours with graduate students Name: Q47b3 Label: Individual instruction, hours with first-professional students Form Administered To: Faculty and instructional staff who provided individual instruction to undergraduate, graduate, or first-professional students StemWording: Of the students who received individual instruction for credit from you during the 2003 Fall Term, what was the total number of hours you spent each week with your. . . (If less than one hour, enter “1.”) * Undergraduate students * Graduate students * First-professional students


Form: Q48 Name: Q48 Label: Hours per week, thesis/dissertation committees Name: Q49 Label: Hours per week, administrative committees Name: Q50 Label: Hours per week, with advisees Name: Q51 Label: Hours per week, office hours Form Administered To: All faculty and instructional staff StemWording: The next items ask about the average number of hours each week during the 2003 Fall Term at [FILL INSTNAME] that you did the following activities. (If none, enter "0." If less than one hour, enter "1." If not sure, give your best estimate.) How many hours per week did you spend. . . * On undergraduate and graduate thesis or dissertation committees, comprehensive exams or orals

committees, or examination or certification committees? * On administrative committee work? Please include curriculum, personnel, governance, and other

committees at the department, division, institution, and system levels. * With students you were assigned to advise? (Do not include hours spent working with students

on their theses, dissertations, or independent studies.) * In regularly scheduled office hours in person or online?

►SECTION D: Scholarly Activities Form: Q52a Name: Q52aa Label: Career articles, refereed journals Name: Q52ab Label: Career articles, nonrefereed journals Name: Q52ac Label: Career book reviews, chapters, creative works Name: Q52ad Label: Career books, textbooks, reports Name: Q52ae Label: Career presentations Name: Q52af Label: Career exhibitions, performances Name: Q52ag Label: Career patents, computer software Form Administered To: All faculty and instructional staff StemWording: Next, we would like to consider your scholarly activities. During your entire career, how many of the following have you completed? (If not sure, give your best estimates.) * Articles published in refereed professional or trade journals; or creative works published in juried

media? * Articles published in nonrefereed professional or trade journals; or creative works published in

nonjuried media or in-house newsletters?


* Published reviews of books, articles, or creative works; or chapters in edited volumes? * Textbooks, other books; monographs; research or technical reports disseminated internally or to

clients? * Presentations at conferences, workshops, etc.? * Exhibitions or performances in the fine or applied arts? * Patents and computer software products? (For publications, include only works that have been accepted for publication. Count multiple publications/presentations of the same work only once. Include electronic publications that are not published elsewhere in the appropriate categories.) Form: Q52b Name: Q52ba Label: Recent articles, refereed journals Name: Q52bb Label: Recent articles, nonrefereed journals Name: Q52bc Label: Recent book reviews, chapters, creative works Name: Q52bd Label: Recent books, textbooks, reports Name: Q52be Label: Recent presentations Name: Q52bf Label: Recent exhibitions, performances Name: Q52bg Label: Recent patents, computer software Form Administered To: Faculty and instructional staff who have presented or published during their career StemWording: We would like to consider the level of your scholarly activities during the last two years. * Of the [FILL Q52aa] articles or creative works published in refereed journals or juried media in

your career, how many were done in the last two years? * Of the [FILL Q52ab] articles or creative works published in nonrefereed journals or nonjuried

media in your career, how many were done in the last two years? * Of the [FILL Q52AC] reviews of books, articles, or creative works; chapters in edited volumes

published in your career, how many were in the last two years? * Of the [FILL Q52AD] textbooks, other books; monographs; and client reports you published

during your career, how many were done in the last two years? * Of the [FILL Q52ae] presentations you made at conferences or workshops in your career, how

many were made in the last two years? * Of your [FILL Q52af] career exhibitions or performances, how many were in the last two years? * Of your [FILL Q52ag] career patents, software products, or other works, how many were done in

the last two years?


Form: Q53 Label: Scholarly activity, any Form Administered To: All faculty and instructional staff StemWording: Do you have any scholarly activities such as research, proposal development, creative writing, or other creative works in the 2003–04 academic year? 0 = No 1 = Yes Form: Q54VS Label: Scholarly activity, principal field-verbatim Form Administered To: Faculty and instructional staff who have scholarly activities and did not provide principal field of teaching (Q16VS) StemWording: What is your principal field or discipline of scholarly activity? (Enter the name of your principal field/discipline of scholarly activity. This name will be used to match against a list of academic fields, so please be specific and do not use abbreviations or acronyms.) * Name of principal field/discipline of scholarly activity: Form: Q54AC Label: Principal field of scholarly activity-autocode Form Administered To: Faculty and instructional staff who provided a verbatim field of scholarly activity StemWording: Please select the appropriate code for your field of scholarly activity: [FILL Q54VS]. If you do not agree with these codes, select "None of these codes" to manually code the field.

Autocoding Explanation: Using the verbatim string of the respondent's field of scholarly activity (provided in Q54VS), item Q54AC matches the string to selected CIP categories (see pages C-28 through C-30 for a list of codes and descriptions). Descriptions that match the verbatim string appear on the screen, and the respondent selects the code that best describes the field. Strings that do not match the CIP descriptions are routed to Q54CD for manual coding. (The respondent can also modify the verbatim string and redo the match or manually code the scholarly field in Q54CD).
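The manual-coding fallback (Q54CD here, and Q17A3CD above) asks for a general area first and then a specific discipline within that area. In the code list on pages C-28 through C-30, the specific disciplines appear to be grouped so that the first two digits identify the general area; the sketch below builds the two-level drop-down structure under that assumption. The specific-discipline codes are taken from the list, while the general-area labels shown are illustrative placeholders rather than the labels used in the instrument.

# Sketch of the two-level General Area / Specific Discipline selection used in
# manual coding. Assumes the general area is the 2-digit prefix of the 4-digit
# specific code; the general-area labels below are placeholders.
from collections import defaultdict

SPECIFIC_DISCIPLINES = {
    "2601": "Behavioral psychology",
    "2602": "Clinical psychology",
    "2603": "Education/school psychology",
    "2604": "Psychology, other",
    "3005": "Economics",
    "3007": "History",
}

GENERAL_AREAS = {"26": "Psychology", "30": "Social sciences"}  # illustrative labels


def build_dropdowns(specific):
    """Group specific disciplines under their 2-digit general area, mirroring
    the first and second drop-down boxes."""
    grouped = defaultdict(list)
    for code, label in sorted(specific.items()):
        grouped[code[:2]].append((code, label))
    return dict(grouped)


for area, items in build_dropdowns(SPECIFIC_DISCIPLINES).items():
    print(GENERAL_AREAS.get(area, area))
    for code, label in items:
        print("   ", code, "=", label)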


Form: Q54CD Name: Q54CD2 Label: Principal research field-general code Name: Q54CD4 Label: Principal research field-specific code Form Administered To: Faculty and instructional staff who provided a verbatim field of scholarly activity, but whose results were not autocoded StemWording: Please help us to categorize "[FILL Q54VS]" using the drop-down list boxes below. [IF Q17A3AC ≥ 0] (Select one from the list of disciplines you've already told us about:) [ENDIF] Coding Directions: Please select a general area and then the specific discipline within the general area. Use the arrow at the right side of the first dropdown box to display the general areas. Click to select the desired general area, and then select the desired specific discipline within the area from the second dropdown box.) * General area: * Specific Discipline: Note: Please refer to the complete list of instructional program codes on pages C-28 through C-30. Form: Q56 Label: Scholarly activity, description Form Administered To: Faculty and instructional staff engaged in scholarly activity StemWording: How would you describe your principal scholarly activity during the 2003–04 academic year? Is it. . . 1 = Basic research 2 = Applied or policy-oriented research or analysis 3 = Literary, performance, or exhibitions 4 = Program and curriculum design and development 5 = Other Form: Q55 Label: Scholarly activity, any funded

Form Administered To: Faculty and instructional staff engaged in scholarly activity

StemWording: During the 2003–04 academic year, are any of your scholarly activities at [FILL INSTNAME] funded? Do not include consulting services and research included as part of your basic salary. 0 = No 1 = Yes


►SECTION E: Job Satisfaction Form: Q61 Name: Q61a Label: Satisfaction with authority to make decisions Name: Q61b Label: Satisfaction with technology-based activities Name: Q61c Label: Satisfaction with equipment/facilities Name: Q61d Label: Satisfaction with institutional support for teaching

improvement Name: Q62a Label: Satisfaction with workload Name: Q62b Label: Satisfaction with salary Name: Q62c Label: Satisfaction with benefits Name: Q62d Label: Satisfaction with job overall Form Administered To: All faculty and instructional staff with instructional responsibilities (Q61a–Q61d); All faculty and instructional staff (Q62a–Q62d) StemWording: [IF Q1=1 OR Q46=1 OR Q48>0 OR Q35A1>0 OR Q35A2>0] With regard to your job at [FILL INSTNAME] during the 2003 Fall Term, would you say you were very satisfied, somewhat satisfied, somewhat dissatisfied, or very dissatisfied with. . . [ELSE] With regard to your job at [FILL INSTNAME], would you say you are very satisfied, somewhat satisfied, somewhat dissatisfied, or very dissatisfied with. . . [ENDIF] * The authority you had to make decisions about content and methods in your instructional

activities * The institutional support for implementing technology-based instructional activities * Quality of equipment and facilities available for classroom instruction * Institutional support for teaching improvement (including grants, release time, and professional

development funds) * Your workload * Your salary * The benefits available to you * Your job at this institution, overall


Form: Q65 Name: Q64 Label: Retired from another position Name: Q65 Label: Retire from all paid employment, planned age Form Administered To: All faculty and instructional staff StemWording: * Have you retired from another position? 0 = No 1 = Yes * At what age do you think you are most likely to retire from all paid employment? (Enter age or select "Don't know.") Years of age/Don't know

►SECTION F: Compensation Form: Q66 Name: Q66a Label: Amount of income from basic salary from institution Name: Q66b Label: Amount of income from other income from institution Name: Q66c Label: Amount of income from other academic institution Name: Q66d Label: Amount of income from consulting or freelance work Name: Q66e Label: Amount of income from other employment Name: Q66f Label: Amount of income from other unspecified sources Form Administered To: All faculty and instructional staff StemWording: We are almost finished. The next questions will be about your compensation and about your background. Your responses to these items—as with all items on this instrument—are voluntary and strictly confidential. They will be used only in statistical summaries. For the 2003 calendar year, please estimate your gross compensation before taxes. Do not include non-monetary compensation. (Enter dollar amount. If not sure, give your best estimates. If not applicable, enter "0.") First, your compensation from [FILL INSTNAME]: a. What is your basic salary during the calendar year from this institution? b. How much compensation did you receive from other income from this institution not included in

basic salary (e.g., for summer session, overload courses, administration, research, coaching sports, etc.)?


Next, your compensation from other sources c. How much were you paid for employment at another postsecondary institution? d. How much were you paid for outside consulting or freelance work? e. How much were you compensated for any other employment besides consulting and another

postsecondary institution (e.g., speaking fees and honoraria, self-owned business, legal/medical/psychological services, professional performances/exhibitions)?

f. How much income did you receive from any other source (e.g., investment income,

royalties/commissions, pensions, real estate, loans, alimony, or child support)? Form: Q66b Label: Amount of total individual income (range) Form Administered To: Faculty and instructional staff who did not complete all compensation item amounts StemWording: The following ranges may make it easier for you to estimate your total income from all sources for the 2003 calendar year. (Your responses to these items are strictly confidential. They will be used only in statistical summaries.) 1 = $1–24,999 2 = $25,000–49,999 3 = $50,000–74,999 4 = $75,000–99,999 5 = $100,000–149,999 6 = $150,000–199,999 7 = $200,000–300,000 8 = More than $300,000 Form: Q67 Label: Type of contract, length of unit Form Administered To: All faculty and instructional staff StemWording: Is your basic salary at [FILL INSTNAME] this academic year based on a 9– or 10–month contract, an 11– or 12–month contract, or some other arrangement? (Please answer based on the length of your contract and how long you work rather than on the number of months you are paid.) 1 = 9– or 10–month contract 2 = 11– or 12–month contract 3 = Other, for example, by course or credit hour


Form: Q68 Label: Income paid per course/credit unit or term Form Administered To: Faculty and instructional staff paid on something other than a 9–, 10–, 11–, or 12–month contract StemWording: What was the basis of your pay? Was it by. . . 1 = Course 2 = Credit hour 3 = Academic term 4 = Other (e.g., per student, hourly rate) Form: Q69 Label: Amount of income paid per course/credit unit or term Form Administered To: Faculty and instructional staff paid by course, credit hour, or academic term StemWording: How much were you paid per [FILL Q68]? Form: Q70a Label: Amount of total household income Form Administered To: All faculty and instructional staff StemWording: [IF RESPONDED TO ALL PARTS OF Q66AA-Q66AF] You told us before that your income from all sources for the 2003 Calendar year was $[FILL Q66ASUM]. What was your total household income before taxes for that same year? [ELSE IF Q66B ≥ 1 and Q66B ≤ 8] You told us before that your income from all sources for the 2003 Calendar year was [FILL Q66B]. What was your total household income before taxes for that same year? [ELSE] For the 2003 calendar year, what was your total household income before taxes? [ENDIF] (By household income, we mean the total income received by all persons, including yourself, residing in the house during the 2003 calendar year, but excluding minors and full-time students. Please include income from employment and from other sources including your spouse or partner, self-employment, interest earnings, alimony or child support, insurance benefits, and pension payments.) * Enter amount:
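The bracketed fill logic in Q70a can be read as a small routing rule: if all six Q66 amounts were reported, the instrument fills their sum (Q66ASUM); if only the Q66b range was reported, it fills that range; otherwise it asks the unconditional version of the question. The Python sketch below is a minimal reading of that logic and is not the production routing code; it assumes that blank items are represented as None and that Q66ASUM is simply the sum of the reported amounts.

# Sketch of the Q70a fill logic as described in the facsimile (illustrative;
# variable handling is assumed, not taken from the production instrument).
def q70a_stem(q66_parts, q66b_range_label=None):
    """Choose the Q70a question wording from the prior income answers.

    q66_parts: the six Q66 amounts (None = item left blank).
    q66b_range_label: text of the Q66b range, if that fallback was answered.
    """
    if all(amount is not None for amount in q66_parts):
        total = sum(q66_parts)  # the Q66ASUM fill
        return (f"You told us before that your income from all sources for the "
                f"2003 calendar year was ${total:,}. What was your total household "
                f"income before taxes for that same year?")
    if q66b_range_label is not None:
        return (f"You told us before that your income from all sources for the "
                f"2003 calendar year was {q66b_range_label}. What was your total "
                f"household income before taxes for that same year?")
    return "For the 2003 calendar year, what was your total household income before taxes?"


print(q70a_stem([60000, 5000, 0, 2000, 0, 1500]))
print(q70a_stem([60000, None, 0, 2000, 0, 1500], "$25,000-49,999"))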


Form: Q70b Label: Amount of total household income (range) Form Administered To: Faculty and instructional staff who did not provide their household income StemWording: The following ranges may make it easier for you to report your total household income. Was your income between. . . (Your responses to these items are strictly confidential. They will be used only in statistical summaries.) –1 = Don't know 1 = $1–24,999 2 = $25,000–49,999 3 = $50,000–74,999 4 = $75,000–99,999 5 = $100,000–149,999 6 = $150,000–199,999 7 = $200,000–300,000 8 = More than $300,000

►SECTION G: Sociodemographic Characteristics Form: Q71 Label: Gender Form Administered To: All faculty and instructional staff StemWording: The last few questions ask you to describe yourself and your opinions about your job. Are you . . . 1 = Male 2 = Female Form: Q72 Label: Age, year of birth Form Administered To: All faculty and instructional staff StemWording: In what year were you born? * Enter year:


Form: Q73 Label: Race/ethnicity, Hispanic/Latino Form Administered To: All faculty and instructional staff StemWording: Are you Hispanic or Latino? 0 = No 1 = Yes Form: Q74 Name: Q74a Label: Race, American Indian or Alaska Native Name: Q74b Label: Race, Asian Name: Q74c Label: Race, Black or African American Name: Q74d Label: Race, Native Hawaiian or other Pacific Islander Name: Q74e Label: Race, White Form Administered To: All faculty and instructional staff StemWording: Please select one or more of the following choices to best describe your race. Are you. . . (Select all that apply.)

* American Indian or Alaska Native

* Asian

* Black or African American

* Native Hawaiian or Other Pacific Islander

* White

0 = No 1 = Yes Form: Q75 Label: Disability, any Form Administered To: All faculty and instructional staff StemWording: Do you have a long-lasting condition that substantially limits one or more of your major life activities? (By this we mean do you have a physical, visual, auditory, mental, emotional, or other disabling condition that limits your ability to see, hear, or speak; to learn, remember, or concentrate; to dress, bathe, or get around the house, or to get to school or around campus.) 0 = No 1 = Yes


Form: Q77 Label: Marital status, fall 2003 Form Administered To: All faculty and instructional staff StemWording: On November 1, 2003, were you . . . 1 = Single and never married 2 = Married 3 = Living with partner or significant other 4 = Separated, divorced, or widowed Form: Q79 Label: Dependent children, number Form Administered To: All faculty and instructional staff StemWording: How many dependent children do you support? (A dependent child is a person 24 years old or younger for whom you provide at least half of his/her financial support.) * Number of dependent children: 0 = None 1 = 1 2 = 2 . . . 9 = 9 10 = 10 or more dependents Form: Q80 Name: Q80 Label: Born in United States Name: Q81 Label: Citizenship status

Form Administered To: All faculty and instructional staff

StemWording: Were you born in the United States?

0 = No 1 = Yes Are you a United States citizen?

0 = No 1 = Yes


►SECTION H: Opinions Form: Q82 Name: Q82a Label: Opinion: teaching is rewarded Name: Q82b Label: Opinion: part-time faculty treated fairly Name: Q82c Label: Opinion: female faculty treated fairly Name: Q82d Label: Opinion: racial minorities treated fairly Form Administered To: All faculty and instructional staff StemWording: Do you strongly agree, somewhat agree, somewhat disagree, or strongly disagree that at [FILL INSTNAME]. . . * a. Good teaching is rewarded * b. Part-time faculty are treated fairly * c. Female faculty members are treated fairly * d. Faculty who are members of racial or ethnic minorities are treated fairly 1 = Strongly Agree 2 = Somewhat Agree 3 = Somewhat Disagree 4 = Strongly Disagree Form: Q83 Label: Opinion about choosing an academic career again Form Administered To: All faculty and instructional staff StemWording: Finally, if you had it to do over again, would you still choose an academic career? 0 = No 1 = Yes


Appendix D Item Crosswalks

Institution
Faculty


Institution Crosswalk


Institution Questionnaire Crosswalk

The crosswalk below links the NSOPF:04 questionnaire items with similar items from the three previous NSOPF institution questionnaires: NSOPF:88, NSOPF:93, and NSOPF:99. This crosswalk will facilitate analyses of trends among postsecondary institutions. Linked questions may be identical in content and format or may differ in one or more ways: the question, item, or response wording; the order in which response options were presented; the manner in which the data were collected (e.g., categorical response options versus open-ended response fields, instructions to mark one response versus all that apply); and the population to which the question applies may all have changed between administrations. It is strongly recommended that analysts review the documentation to determine whether linked questions are comparable for their purposes.
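For analysts who prefer to work with the crosswalk programmatically, the rows of table D-1 translate directly into a lookup from each NSOPF:04 item to its earlier-wave counterparts. The Python sketch below transcribes a few rows as an illustration only; the dictionary structure and the use of None for waves with no linked item are conveniences assumed for this example, not part of the NCES data files.

# A few rows of table D-1 expressed as a lookup from the NSOPF:04 institution
# item to its counterparts in earlier waves (None = no linked item in that wave).
# (Per the note to table D-1, the delivered NSOPF:04 institution variables
# carry an "I" as the starting character of the name.)
INSTITUTION_CROSSWALK = {
    # NSOPF:04 item: (NSOPF:88, NSOPF:93, NSOPF:99, NSOPF:04 variable label)
    "1A": ("4", "1a", "1a", "Number full-time faculty, fall 2003, reported"),
    "1B": ("19", "1b", "1b", "Number part-time faculty, fall 2003"),
    "3":  ("3", "5", "4", "Full-time tenure: has tenure system"),
    "12": ("13", "19", "15", "Full-time faculty: union representation"),
}


def earlier_items(item):
    """Return the earlier-wave question numbers linked to an NSOPF:04 item."""
    q88, q93, q99, _label = INSTITUTION_CROSSWALK[item]
    return {"NSOPF:88": q88, "NSOPF:93": q93, "NSOPF:99": q99}


print(earlier_items("1A"))  # {'NSOPF:88': '4', 'NSOPF:93': '1a', 'NSOPF:99': '1a'}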

Table D-1. Institution Questionnaire Crosswalk: 2004

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label 4 1a 1a 1A Number full-time faculty, fall 2003, reported 19 1b 1b 1B Number part-time faculty, fall 2003 6 2f 5aD 2A Full-time numbers: faculty, fall 2002 5bD 2B Full-time numbers: changed from part to full time, 2002-03 6 2b 5cD 2C Full-time numbers: hired, 2002-03 6 2c 5dD 2D Full-time numbers: retired, 2002-03 6 2e + 2d 5e 2E Full-time numbers: left for other reasons, 2002-03 2F Full-time numbers: changed from full to part time, 2002-03 4 2a 5f 2G Full-time numbers: faculty, fall 2003, calculated 3 5 4 3 Full-time tenure: has tenure system 7 8a 6a 4 Full-time tenure: number considered for tenure, 2002-03 7 8b 6b 5 Full-time tenure: number granted tenure, 2002-03 10 9a 7a 6 Full-time tenure: maximum years on tenure track 8a 7A Full-time tenure: changed tenure policy 12.5 10b 8b 7B Full-time tenure: more stringent tenure standards 7b 8c 7C Full-time tenure: downsized tenured faculty 12.4 10a 8d 7D Full-time tenure: replaced tenured with fixed term 12.1 11 8f 7E Full-time tenure: offered early retirement 11a 8g 7E2 Full-time tenure: number early retirees, last 5 years 8e 8 Full-time tenure: discontinued tenure system, last 5 years 3 10 9 Full-time faculty: positions sought to fill, fall 2003 13b 12a 10AA Full-time benefit: medical insurance 13c 12b 10AB Full-time benefit: dental insurance 13d 12c 10AC Full-time benefit: disability insurance 13e 12d 10AD Full-time benefit: life insurance 13h 12e 10AE Full-time benefit: child care 13n 12f 10AF Full-time benefit: retiree medical insurance 16 13o 12g 10AG Full-time benefit: cafeteria-style plan 14.04 13bA 12aa 10BA Full-time benefit: medical insurance subsidized See notes at end of table.


Table D-1. Institution Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label 14.05 13cA 12ba 10BB Full-time benefit: dental insurance subsidized 14.06 13dA 12ca 10BC Full-time benefit: disability insurance subsidized 14.07 13eA 12da 10BD Full-time benefit: life insurance subsidized 14.10 13hA 12ea 10BE Full-time benefit: child care subsidized 13nA 12fa 10BF Full-time benefit: retiree medical insurance subsidized 13oA 12ga 10BG Full-time benefit: cafeteria-style plan subsidized 14.01 13a 13a 11A Full-time benefit: wellness program 14.08 13f 13b 11B Full-time benefit: spouse tuition remission 14.09 13g 13c 11C Full-time benefit: children tuition remission 14.11 13i 13d 11D Full-time benefit: housing 13k 13e 11E Full-time benefit: transportation/parking 14.02 13l 13f 11F Full-time benefit: paid maternity leave 14.03 13m 13g 11G Full-time benefit: paid paternity leave 13h 11H Full-time benefit: paid sabbatical leave 13i 11I Full-time benefit: employee assistance program 13 19 15 12 Full-time faculty: union representation 18a 16a 13A Full-time assessment: student evaluations 18b 16b 13B Full-time assessment: student test scores 18c 16c 13C Full-time assessment: student career placement 18d 16d 13D Full-time assessment: other student performance 18e 16e 13E Full-time assessment: department chair evaluations 18f 16f 13F Full-time assessment: dean evaluations 18g 16g 13G Full-time assessment: peer evaluations 18h 16h 13H Full-time assessment: self-evaluations 23 34 17 14 Part-time benefit: retirement plan 37b 20a 15AA Part-time benefit: medical insurance 37c 20b 15AB Part-time benefit: dental insurance 37d 20c 15AC Part-time benefit: disability insurance 37e 20d 15AD Part-time benefit: life insurance 37h 20e 15AE Part-time benefit: child care 37n 20f 15AF Part-time benefit: retiree medical insurance 24 37o 20g 15AG Part-time benefit: cafeteria-style plan 37bA 20aa 15BA Part-time benefit: medical insurance subsidized 37cA 20ba 15BB Part-time benefit: dental insurance subsidized 37dA 20ca 15BC Part-time benefit: disability insurance subsidized 37eA 20da 15BD Part-time benefit: life insurance subsidized 37hA 20ea 15BE Part-time benefit: child care subsidized 37nA 20fa 15BF Part-time benefit: retiree medical insurance subsidized 37oA 20ga 15BG Part-time benefit: cafeteria-style plan subsidized 37a 21a 16A Part-time benefit: wellness program 37f 21b 16B Part-time benefit: spouse tuition remission See notes at end of table.


Table D-1. Institution Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label 37g 21c 16C Part-time benefit: children tuition remission 37i 21d 16D Part-time benefit: housing 37k 21e 16E Part-time benefit: transportation/parking 37l 21f 16F Part-time benefit: paid maternity leave 37m 21g 16G Part-time benefit: paid paternity leave 21h 16H Part-time benefit: paid sabbatical leave 21i 16I Part-time benefit: employee assistance program 22 43 24 17 Part-time faculty: union representation 42a 25a 18A Part-time assessment: student evaluations 42b 25b 18B Part-time assessment: student test scores 42c 25c 18C Part-time assessment: student career placement 42d 25d 18D Part-time assessment: other student performance 42e 25e 18E Part-time assessment: department chair evaluations 42f 25f 18F Part-time assessment: dean evaluations 42g 25g 18G Part-time assessment: peer evaluations 42h 25h 18H Part-time assessment: self-evaluations 17 26a 19A Undergraduate instruction: percent full-time faculty 41 26b 19B Undergraduate instruction: percent part-time faculty 26c 19C Undergraduate instruction: percent teaching assistants 26d 19D Undergraduate instruction: percent other NOTE: The actual name of each NSOPF:04 institution variable has an “I” as the starting character. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04), 1999 National Study of Postsecondary Faculty (NSOPF:99), 1993 National Study of Postsecondary Faculty (NSOPF:93), 1988 National Survey of Postsecondary Faculty (NSOPF:88).


Faculty Crosswalk


Faculty Questionnaire Crosswalk

The crosswalk below links the NSOPF:04 questionnaire items with similar items from the three previous NSOPF faculty questionnaires: NSOPF:88, NSOPF:93, and NSOPF:99. This crosswalk will facilitate analyses of trends among faculty at postsecondary institutions. Linked questions may be identical in content and format or may differ in one or more ways: the question, item, or response wording; the order in which response options were presented; the manner in which the data were collected (e.g., categorical response options versus open-ended response fields, instructions to mark one response versus all that apply); and the population to which the question applies may all have changed between administrations. It is strongly recommended that analysts review the documentation to determine whether linked questions are comparable for their purposes.
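Because many NSOPF:04 faculty items have no counterpart in one or more earlier waves, a simple programmatic check can flag which items support a full four-wave trend. The sketch below transcribes a few rows of table D-2 for illustration only; the dictionary layout and the None placeholders are assumptions of this example rather than features of the data files.

# A few rows of table D-2 as a lookup from the NSOPF:04 faculty item to its
# earlier-wave counterparts (None = no linked item in that wave).
FACULTY_CROSSWALK = {
    # NSOPF:04 item: (NSOPF:88, NSOPF:93, NSOPF:99)
    "Q1":  ("Q1", "Q1", "Q1"),    # Instructional duties, any
    "Q10": ("Q12", "Q9", "Q8"),   # Rank
    "Q12": ("Q9", "Q7", "Q10"),   # Tenure status
    "Q6":  (None, None, None),    # Part-time employment is primary employment (new in NSOPF:04)
    "Q36": (None, None, None),    # Teaching assistant in any credit class (new in NSOPF:04)
}


def trend_comparable(item):
    """True only if the item is linked to a question in all three earlier waves."""
    return all(q is not None for q in FACULTY_CROSSWALK[item])


for item in FACULTY_CROSSWALK:
    print(item, "->", "all four waves" if trend_comparable(item) else "not in every wave")

Even where an item appears in every wave, the cautions above about changes in wording, response options, and population still apply before the item is treated as trend-comparable.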

Table D-2. Faculty Questionnaire Crosswalk: 2004

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label
Q1 Q1 Q1 Q1 Instructional duties, any
Q2 Q1A Q2 Q2 Instructional duties related to credit courses/activities
Q3 Q4 Q3 Faculty status
Q2 Q3 Q4 Principal activity
Q4 Q4 Q5 Q5 Employed full or part time at this institution
Q6 Part-time employment is primary employment
Q8 Part-time but preferred full-time position
Q6 Q7 Q9 Year began current job
Q12 Q9 Q8 Q10 Rank
Q13 Q10 Q9 Q11 Rank, year attained professor or associate professor
Q9 Q7 Q10 Q12 Tenure status
Q10 Q7A Q10 Q13 Tenure, year attained at any postsecondary institution
Q18 Q38 Q64 Q14 Union status
Q38 Q64 Q15 Union status, reason not a member
Q12 Q14 Q16VS Principal field of teaching-verbatim
Q16CD2 Principal field of teaching-general code
Q16 Q12 Q14 Q16CD4 Principal field of teaching-specific code
Q26 Q16.1A Q16.1A Q17A1 Highest degree
Q26 Q16.1B Q16.1B Q17A2 Highest degree, date awarded
Q16.1D Q16.1C Q17A3VS Highest degree field-verbatim
Q17A3C2 Highest degree field-general code
Q26 Q16.1C Q16.1D Q17A3C4 Highest degree field-specific code
Q26 Q16.1Eb Q16.1Eb Q17A4ST Highest degree institution-state
Q26 Q16.1Eb Q16.1Eb Q17A4C Highest degree institution-city
Q26 Q16.1Ea Q16.1Ea Q17A4N Highest degree institution-name
Q17A4I Highest degree institution-IPEDS
Q17A4LEV Highest degree institution, level
Q17A4CN Highest degree institution, control
Q26 Q16 Q16 Q17D1 Bachelor’s degree date awarded
Q5 Q17A Q22 Q18 Other current jobs, number of jobs
See notes at end of table.


Table D-2. Faculty Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label
Q6 Q24B + Q28B Q19A1 Other current jobs, full-time employment
Q6 Q19B1 Other current jobs, number in postsecondary instruction
Q21 First postsecondary job, current job is first
Q29 Q24.1a Q23 First postsecondary job, year began
Q29 Q24.3a Q24 First postsecondary job, part or full time
Q24.6a Q26 First postsecondary job, tenure status
Q29 Q26 Q27 Other jobs, any outside postsecondary since degree
Q29 Q19.2 Q28.2b Q28 Other jobs, sector of previous job
Q36 Q36a Q30a Q31A Hours per week on paid tasks at institution
Q36 Q36b Q30b Q31B Hours per week on unpaid tasks at institution
Q36 Q36c Q30c Q31C Hours per week on paid tasks outside of institution
Q36 Q36d Q30d Q31D Hours per week on unpaid tasks outside of institution
Q37 Q37a Q31aA Q32A Percent time spent on instruction, undergraduate
Q37 Q37a Q31bA Q32B Percent time spent on instruction, graduate/first-professional
Q37 Q37b Q31cA Q32C Percent time spent on research activities
Q37 Q37c Q37d Q37e Q37f Q31dA Q31eA Q31fA Q32D Percent time spent on other unspecified activities
Q22A Q40 Q35A1 Number of classes taught, credit
Q35A2 Number of classes taught, noncredit
Q35 Q35B Number of classes taught, remedial
Q412i Q35C Number of classes taught, distance education
Q36 Teaching assistant in any credit class
Q23.2Aa Q41.2Aa Q37A1 Number of weeks taught, 1st credit class
Q23.2Ab Q41.2Ab Q37B1 Number of credit hours, 1st class
Q32 Q23.2Ac Q41.2Ag Q37C1 Number of hours taught per week, 1st class
Q32 Q23.2Ae Q41.2Ae Q37D1 Number of students, 1st class
Q32 Q23.3A Q41.3A Q37E1 Primary level of students, 1st class
Q32 Q23.2Ad Q41.2Ad Q37F1 Teaching assistant, 1st class
Q23.2Ba Q41.2Ba Q37A2 Number of weeks taught, 2nd credit class
Q23.2Bb Q41.2Bb Q37B2 Number of credit hours, 2nd class
Q32 Q23.2Bc Q41.2Bg Q37C2 Number of hours taught per week, 2nd class
Q32 Q23.2Be Q41.2Be Q37D2 Number of students, 2nd class
Q32 Q23.3B Q41.3B Q37E2 Primary level of students, 2nd class
Q32 Q23.2Bd Q41.2Bd Q37F2 Teaching assistant, 2nd class
Q23.2Ca Q41.2Ca Q37A3 Number of weeks taught, 3rd credit class
Q23.2Cb Q41.2Cb Q37B3 Number of credit hours, 3rd class
Q32 Q23.2Cc Q41.2Cg Q37C3 Number of hours taught per week, 3rd class
Q32 Q23.2Ce Q41.2Ce Q37D3 Number of students, 3rd class
Q32 Q23.3C Q41.3C Q37E3 Primary level of students, 3rd class
Q32 Q23.2Cd Q41.2Cd Q37F3 Teaching assistant, 3rd class
Q23.2Da Q41.2Da Q37A4 Number of weeks taught, 4th credit class
See notes at end of table.


Table D-2. Faculty Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label
Q23.2Db Q41.2Db Q37B4 Number of credit hours, 4th class
Q32 Q23.2Dc Q41.2Dg Q37C4 Number of hours taught per week, 4th class
Q32 Q23.2De Q41.2De Q37D4 Number of students, 4th class
Q32 Q23.3D Q41.3D Q37E4 Primary level of students, 4th class
Q32 Q23.2Dd Q41.2Dd Q37F4 Teaching assistant, 4th class
Q23.2Ea Q41.2Ea Q37A5 Number of weeks taught, 5th credit class
Q23.2Eb Q41.2Eb Q37B5 Number of credit hours, 5th class
Q32 Q23.2Ec Q41.2Eg Q37C5 Number of hours taught per week, 5th class
Q32 Q23.2Ee Q41.2Ee Q37D5 Number of students, 5th class
Q32 Q23.3E Q41.3E Q37E5 Primary level of students, 5th class
Q32 Q23.2Ed Q41.2Ed Q37F5 Teaching assistant, 5th class
Q24Ae Q42b Q38A Undergrad class, multiple choice midterm/final exams
Q24Af Q42c Q38B Undergrad class, essay midterm/final exams
Q24Ag Q42d Q38C Undergrad class, short answer midterm/final exams
Q24Ah Q42e Q38D Undergrad class, term/research papers
Q24Ai Q42f Q38E Undergrad class, multiple drafts of written work
Q24Ac Q38F Undergrad class, oral presentations
Q38G Undergrad class, group projects
24Ad 42a Q38H Undergrad class, student evaluations of each others’ work
Q38I Undergrad class, laboratory/shop/studio assignments
Q38J Undergrad class, service learn/co-op interactions with business
Q43 Q39 Website for any instructional duties
Q47 Q41 Hours per week, e-mailing students
Q49a-c Q46 Individual instruction, any
Q33 Q25.1A + Q25.2A Q49a Q47A1 Individual instruction, number undergraduate students
Q33 Q25.3A Q49b Q47A2 Individual instruction, number graduate students
Q33 Q25.3A Q49c Q47A3 Individual instruction, number first-professional students
Q33 Q25.1B + Q25.2B Q49a Q47B1 Individual instruction, hours with undergraduates
Q33 Q25.3B Q49b Q47B2 Individual instruction, hours with graduate students
Q33 Q25.3B Q49c Q47B3 Individual instruction, hours with first-professional students
Q32 Q48 Hours per week, thesis/dissertation committees
Q63 Q49 Hours per week, administrative committees
Q50 Q50 Hours per week, with advisees
Q26 Q51 Q51 Hours per week, office hours
Q30 Q20.1A + Q20.3A Q29.1 Q52AA Career articles, refereed journals
Q30 Q20.2A + Q20.4A Q29.2 Q52AB Career articles, nonrefereed journals
Q30 Q20.5A + Q20.6A Q29.3 Q52AC Career book reviews, chapters, creative works
Q30 Q20.8A + Q20.7A + Q20.9A + Q20.10A Q29.4 Q52AD Career books, textbooks, reports
See notes at end of table.


Table D-2. Faculty Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label
Q30 Q20.11A Q29.5 Q52AE Career presentations
Q30 Q20.12A Q29.5 Q52AF Career exhibitions, performances
Q30 Q20.13A + Q20.14A Q29.6 Q52AG Career patents, computer software
Q30 Q20.1B + Q20.3B Q29.1 Q52BA Recent articles, refereed journals
Q30 Q20.2B + Q20.4B Q29.2 Q52BB Recent articles, nonrefereed journals
Q30 Q20.5B + Q20.6B Q29.3 Q52BC Recent book reviews, chapters, creative works
Q30 Q20.8B + Q20.7B + Q20.9B + Q20.10B Q29.4 Q52BD Recent books, textbooks, reports
Q30 Q20.11B Q29.5 Q52BE Recent presentations
Q30 Q20.12B Q29.5 Q52BF Recent exhibitions, performances
Q30 Q20.13B + Q20.14B Q29.6 Q52BG Recent patents, computer software
Q28 Q52 Q53 Scholarly activity, any
Q13 Q15 Q54VS Scholarly activity, principal field-verbatim
Q52CD2 Principal research field-general code
Q13 Q15 Q54CD4 Principal research field-specific code
Q30 Q54 Q55 Scholarly activity, any funded
Q29 Q53 Q56 Scholarly activity, description
Q19 Q39a Q65a Q61A Satisfaction with authority to make decisions
Q61B Satisfaction with technology-based activities
Q61C Satisfaction with equipment/facilities
Q61D Satisfaction with institutional support for teaching improvement
Q19 Q40a Q66a Q62A Satisfaction with workload
Q19 Q40f Q66g Q62B Satisfaction with salary
Q19 Q40g Q66h Q62C Satisfaction with benefits
Q19 Q40i Q66j Q62D Satisfaction with job overall
Q25 Q46 Q72 Q64 Retired from another position
Q74 Q65 Retire from all paid employment, planned age
Q40 Q47a Q76a Q66A Amount of income from basic salary from institution
Q40 Q47c + Q47d + Q47f Q76b Q66B Amount of income from other income from institution
Q40 Q47g Q76d Q66C Amount of income from other academic institution
Q40 Q47i Q76g Q66D Amount of income from consulting or freelance work
Q40 Q47n Q76e Q66E Amount of income from other employment
Q40 Q47h + Q47j + Q47k + Q47l + Q47m + Q47p + Q47q Q76f + Q76h + Q76i + Q76j + Q76k + Q76m + Q76n Q66F Amount of income from other unspecified sources
See notes at end of table.


Table D-2. Faculty Questionnaire Crosswalk: 2004—Continued

Variable name NSOPF:88 NSOPF:93 NSOPF:99 NSOPF:04 NSOPF:04 variable label
Q66B2 Amount of total individual income (range)
Q47B Q75b Q67 Type of contract, length of unit
Q75b Q68 Income paid per course/credit unit or term
Q69 Amount of income paid per course/credit unit or term
Q49 Q79 Q70A Amount of total household income
Q70B Amount of total household income (range)
Q41 Q51 Q81 Q71 Gender
Q42 Q52 Q82 Q72 Age, year of birth
Q43 Q54 Q83 Q73 Race/ethnicity, Hispanic/Latino
Q44 Q53_1 Q84 Q74A Race, American Indian or Alaska Native
Q44 Q53_2 Q84 Q74B Race, Asian
Q44 Q53_3 Q84 Q74C Race, Black or African American
Q44 Q53_2 Q84 Q74D Race, Native Hawaiian or other Pacific Islander
Q44 Q53_4 Q84 Q74E Race, White
Q85 Q75 Disability, any
Q45 Q55 Q87 Q77 Marital status, fall 2003
Q79 Dependent children, number
Q56 Q89 Q80 Born in United States
Q46 Q57 Q90 Q81 Citizenship status
Q82A Opinion: teaching is rewarded
Q82B Opinion: part-time faculty treated fairly
Q48 Q59e Q92f Q82C Opinion: female faculty treated fairly
Q48 Q59f Q92g Q82D Opinion: racial minorities treated fairly
Q59g Q92h Q83 Opinion about choosing an academic career again
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04), 1999 National Study of Postsecondary Faculty (NSOPF:99), 1993 National Study of Postsecondary Faculty (NSOPF:93), 1988 National Survey of Postsecondary Faculty (NSOPF:88).


Appendix E Endorsements


NSOPF Endorsements

American Association for Higher Education
American Association of Collegiate Registrars and Admissions Officers
American Association of Community Colleges
American Association of State Colleges and Universities
American Association of University Professors
American Council on Education
American Federation of Teachers
Association for Institutional Research
Association of American Colleges and Universities
Association of Catholic Colleges and Universities
Career College Association
The Carnegie Foundation for the Advancement of Teaching
College and University Professional Association for Human Resources
The College Board
The College Fund/UNCF
Council of Graduate Schools
The Council of Independent Colleges
Hispanic Association of Colleges and Universities
National Association of College and University Business Officers
National Association for Equal Opportunity in Higher Education
National Association of Independent Colleges and Universities
National Association of State Universities and Land-Grant Colleges
National Association of Student Financial Aid Administrators
National Education Association


Appendix F Contacting Materials

Institution Contacting Letters and Inserts
Chief Administrator Letter (F-3)
Institution Coordinator Early Contacting Letter (F-5)
Institution Coordinator Binder Letter (F-7)
Institution Refusal Conversion Letter (F-9)
Coordinator Response Form (F-11)
Guidance for Preparing the List of Faculty and Instructional Staff (F-18)

Pamphlets
NSoFaS Pamphlet (F-25)
NSOPF Pamphlet (F-29)
NPSAS Pamphlet (F-31)

Faculty Contacting Letters and E-Mail
Lead Letter to Faculty (F-33)
Instructional Insert to Faculty (F-34)
Initial E-Mail (F-35)
Early Response Deadline Reminder Letter (F-36)
First Follow-up E-Mail (F-37)
Second Reminder E-Mail (F-38)
Third Reminder E-Mail (F-39)
Nonresponse Letter (F-40)
Nonresponse E-Mail (F-41)
Second Nonresponse Letter (F-42)
Second Nonresponse E-Mail (F-43)
Refusal Nonresponse Letter (F-44)
Refusal Nonresponse E-Mail (F-45)
Early September Nonrespondents Letter (F-46)
Early September Nonrespondents E-Mail (F-47)
Final Reminder E-Mail (F-48)
Refusals May Not be Eligible Letter (F-49)
Partials May Not be Eligible Letter (F-50)


NSoFaS Endorsed by American Association for Higher Education American Association of Collegiate Registrars and Admissions Officers American Association of Community Colleges American Association of State Colleges and Universities American Association of University Professors American Council on Education American Federation of Teachers Association for Institutional Research Association of American Colleges and Universities Association of Catholic Colleges and Universities Career College Association The Carnegie Foundation for the Advancement of Teaching College and University Professional Association for Human Resources The College Board The College Fund/UNCF Council of Graduate Schools The Council of Independent Colleges Hispanic Association of Colleges and Universities National Accrediting Commission of Cosmetology Arts and Sciences National Association of College and University Business Officers National Association for Equal Opportunity in Higher Education National Association of Independent Colleges and Universities National Association of State Universities and Land-Grant Colleges National Association of Student Financial Aid Administrators National Education Association

CHIEF ADMINISTRATOR LETTER

August 12, 2003

<CHIEF ADMIN NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Your IPEDS UNITID:
Your password:

Dear <NAME>:

<INSTITUTION NAME> has been selected to participate in the 2004 National Study of Faculty and Students. The Higher Education Act (Sec. 131 (d), as amended in 1998) authorizes the U.S. Department of Education, National Center for Education Statistics (NCES) to periodically gather information from students, faculty, and instructional staff on two pivotal areas of national concern:

• How do students and their families finance education after high school?
• Who teaches in our colleges and universities, and how do they conduct their work?

In response to the continuing need for these data, information was collected from students in 1987, 1990, 1993, 1996, and 2000 as part of the National Postsecondary Student Aid Study (NPSAS). Data on full- and part-time faculty and instructional staff were collected for the National Study of Postsecondary Faculty (NSOPF) in 1988, 1993, and 1999. NCES has contracted with RTI International (RTI) to conduct the next data collection cycle for both studies under the 2004 National Study of Faculty and Students (NSoFaS:04) in order to minimize the reporting burden to postsecondary institutions. Additional information about our plans for NSoFaS:04 is provided in the enclosed materials, which include an NSoFaS brochure and copies of the brochures that participating students or faculty will receive.

Your institution’s participation is crucial to the success of NSoFaS:04. I am writing to request that you appoint an NSoFaS coordinator to oversee the preparation of lists of faculty/instructional staff and students at your institution. The NSoFaS coordinator will also complete a brief questionnaire on the Internet about your institution’s policies and procedures related to faculty and instructional staff. We will use the lists prepared by your institution to draw samples of faculty/instructional staff and students for participation in the 2004 NSOPF and NPSAS data collection cycles, respectively. Sampled faculty and students will be asked to complete a questionnaire on the Internet.


The individual whom you designate as coordinator should be someone (such as the Director of Institutional Research) who is familiar with data and information sources at your institution. If you require assistance with selecting an appropriate coordinator, you may call the NSoFaS Help Desk at 1–866–NSOFAS4 (1–866–676–3274, toll-free).

We are aware that you and the staff at your institution are confronted with many competing demands for your time. Therefore, we are providing you—and the coordinator you designate—with this advance notice of the study to allow you adequate time to plan for this data collection effort and, if needed, to contact us for more information prior to the start of data collection in the fall 2003/2004 term. Once designated, an RTI representative will contact your coordinator to discuss the study timeline and procedures required for your institution. Your coordinator will also be provided with a complete summary of our data request for the NPSAS and NSOPF components of NSoFaS.

All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law. The enclosed pamphlets detail our data collection procedures and provide a full description of the laws and procedures safeguarding the confidentiality of questionnaire responses, contact information, and other data. Additional information, including reports based on data from previous NSOPF and NPSAS studies, is available on the NSoFaS web site: https://surveys.nces.ed.gov/nsofas2004

If you have any questions about the study or procedures involved, please contact the RTI Project Coordinator, Brian Kuhr, at 1–866–676–3274 or via e-mail at [email protected]. You may also direct questions to NCES by contacting James Griffith at 1–202–502–7387 (e-mail address: [email protected]) or Linda Zimbler at 1–202–502–7481 (e-mail address: [email protected]).

At your earliest convenience, please complete the NSoFaS Designate a Coordinator form online at the NSoFaS web site, using the IPEDS UNITID and password printed on the first page of this letter. We look forward to your participation in this important study. Thank you for your cooperation and prompt completion of the NSoFaS Designate a Coordinator form.

Sincerely,

C. Dennis Carroll, Ph.D.
Associate Commissioner
Postsecondary Studies Division

Enclosures

The NSoFaS Designate a Coordinator form may be completed online at https://surveys.nces.ed.gov/nsofas2004 To access the online form, enter the user name (which is your IPEDS UNITID) and password printed on the first page of this letter.


NSoFaS Endorsed by American Association for Higher Education American Association of Collegiate Registrars and Admissions Officers American Association of Community Colleges American Association of State Colleges and Universities American Association of University Professors American Council on Education American Federation of Teachers Association for Institutional Research Association of American Colleges and Universities Association of Catholic Colleges and Universities Career College Association The Carnegie Foundation for the Advancement of Teaching College and University Professional Association for Human Resources The College Board The College Fund/UNCF Council of Graduate Schools The Council of Independent Colleges Hispanic Association of Colleges and Universities National Accrediting Commission of Cosmetology Arts and Sciences National Association of College and University Business Officers National Association for Equal Opportunity in Higher Education National Association of Independent Colleges and Universities National Association of State Universities and Land-Grant Colleges National Association of Student Financial Aid Administrators National Education Association

INSTITUTION COORDINATOR EARLY CONTACTING LETTER

<DATE>

<COORD NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Your IPEDS UNITID:
Your PASSWORD:

Dear <NAME>:

<INSTITUTION NAME> has been selected to participate in the 2004 National Study of Faculty and Students. The Higher Education Act (Sec. 131 (d), as amended in 1998) authorizes the U.S. Department of Education, National Center for Education Statistics (NCES) to periodically gather information from students, faculty, and instructional staff on two pivotal areas of national concern:

• How do students and their families finance education beyond high school?

• Who teaches in our colleges and universities, and how do they conduct their work?

In response to the continuing need for these data, information was collected from students in 1987, 1990, 1993, 1996, and 2000 as part of the National Postsecondary Student Aid Study (NPSAS). Data on full- and part-time faculty and instructional staff were collected for the National Study of Postsecondary Faculty (NSOPF) in 1988, 1993, and 1999. NCES has contracted with RTI International (RTI) to conduct the next data collection cycle for both studies under the 2004 National Study of Faculty and Students (NSoFaS:04) in order to minimize the reporting burden to postsecondary institutions. Additional information about our plans for NSoFaS:04 is provided in the enclosed materials, which include an NSoFaS brochure and copies of the brochures that participating students or faculty will receive.

The chief administrative officer of your institution has selected you as your institution’s coordinator for NSoFaS:04. The enclosed materials detail your role and the role of your institution in this study and contain a timetable of major project activities. You will have four primary responsibilities for NSoFaS:04:

• Complete the Coordinator Response Form online at the NSoFaS web site, within the next few weeks, using the user name and password printed at the top of this letter. We will schedule data collection for your institution based on the information you provide. A facsimile of the Coordinator Response Form is included in the attached folder.

• Oversee the preparation of two data files: (1) a list of faculty and instructional staff and (2) an enrollment list of students at your institution. These data files will be used to draw samples of faculty/instructional staff and students for participation in NSoFaS:04. Sampled faculty and students will be asked to complete a questionnaire on the Internet.

• Complete a separate web-based computer-assisted data entry program that requires institution record information for a sample of students.

NSoFaS:04 will begin in September 2003. At that time, complete instructions for your institution’s participation will be sent directly to you. In the meantime, please review the enclosed materials at your earliest convenience.

We are aware that you and other staff at your institution are confronted with many competing demands for your time. We hope that giving you this advance notice of the study will provide you with ample time to plan for your school’s participation in NSoFaS:04. A project representative will call you in the next few days to ensure that you have received this notification and to answer any questions that you may have. You may also call the NSoFaS Help Desk directly at 1–866–NSOFAS4 (1–866–676–3274).

All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, unless otherwise compelled by law. The enclosed materials detail our data collection procedures and provide a detailed description of the laws and procedures safeguarding the confidentiality of questionnaire responses, contact information, and demographic data. Additional information, including reports based on data from previous NSOPF and NPSAS studies, is available on the NSoFaS web site: https://surveys.nces.ed.gov/nsofas2004

If you have questions about the study or procedures, please contact the RTI Project Coordinator, Brian Kuhr, at 1–866–676–3274 or via e-mail at [email protected]. You may also direct questions to NCES by contacting James Griffith at 1–202–502–7387 (e-mail address: [email protected]) or Linda Zimbler at 1–202–502–7481 (e-mail address: [email protected]).

At your earliest convenience, please complete the Coordinator Response Form online at the NSoFaS web site, using the IPEDS UNITID and password printed on the first page of this letter. We look forward to your participation in this important study. Thank you for your cooperation.

Sincerely,

C. Dennis Carroll, Ph.D.
Associate Commissioner
Postsecondary Studies Division

Enclosures

The NSoFaS Coordinator Response Form may be completed online at

https://surveys.nces.ed.gov/nsofas2004

To access the online form, enter the IPEDS UNITID and password printed on the first page of this letter.


NSoFaS Endorsed by American Association for Higher Education American Association of Collegiate Registrars and Admissions Officers American Association of Community Colleges American Association of State Colleges and Universities American Association of University Professors American Council on Education American Federation of Teachers Association for Institutional Research Association of American Colleges and Universities Association of Catholic Colleges and Universities Career College Association The Carnegie Foundation for the Advancement of Teaching College and University Professional Association for Human Resources The College Board The College Fund/UNCF Council of Graduate Schools The Council of Independent Colleges Hispanic Association of Colleges and Universities National Accrediting Commission of Cosmetology Arts and Sciences National Association of College and University Business Officers National Association for Equal Opportunity in Higher Education National Association of Independent Colleges and Universities National Association of State Universities and Land-Grant Colleges National Association of Student Financial Aid Administrators National Education Association

INSTITUTION COORDINATOR BINDER LETTER

<DATE>

<COORD NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Your IPEDS UNITID:
Your PASSWORD:

Dear <NAME>:

As the person designated to be the Institution Coordinator for the 2004 National Study of Faculty and Students (NSoFaS:04) at your institution, you are receiving detailed instructions (see enclosed binder) to ensure your full participation in both the study’s faculty and student components. We look forward to working with you on this important research effort, and are available to answer any questions you may have on how to carry out the coordination activities requested of you.

As described in materials provided during the early notification period of the study this past spring/summer, NSoFaS:04 is being conducted for the U.S. Department of Education’s National Center for Education Statistics (NCES) by RTI International (RTI). This ongoing study, designed to collect data from nationally representative samples of postsecondary students and faculty and instructional staff, provides vital information on changes over time in two pivotal areas of national concern:

• How students and their families finance education after high school, and
• Who teaches in our colleges and universities and how they conduct their work.

In response to the continuing need for the data provided by NSoFaS, Congress has authorized NCES to collect these data periodically. Data on full- and part-time faculty and instructional staff were collected through the faculty component—the National Study of Postsecondary Faculty (NSOPF)—in 1988, 1993, and 1999. Information on students and student financial aid was previously collected in 1987, 1990, 1993, 1996, and 2000 as part of the student component—the National Postsecondary Student Aid Study (NPSAS).

Your institution has been sampled for participation in both the faculty and student components of NSoFaS:04. As the Institution Coordinator, you are asked to oversee the completion of the following activities for NSoFaS:04:

• Completion of the Coordinator Response Form (CRF) online at the NSoFaS web site, https://surveys.nces.ed.gov/nsofas2004/, using the IPEDS UNITID and password printed at the top of this letter. If you have already completed this document, a copy of the form may be printed from the web site after log in. A data collection timeline for your institution has been scheduled based on the information you provided. If you have not completed the CRF online, please do so at your earliest convenience. For reference, a facsimile of the CRF is included in the enclosed binder.


• Preparation of a complete data file listing all full- and part-time faculty, adjunct faculty, and instructional staff (including available contact and demographic information). The file should be current as of November 1, 2003, or the date at your institution when faculty rosters for the fall academic term are complete. [FOR INST THAT COMPLETED THE CRF AND HAVE INDICATED DATE OTHER THAN DEC 6]: <Information provided on the CRF indicates that you will send your faculty list to RTI on <DATE>[FOR NULL/DEC 6: The NSoFaS help desk will call to confirm the date at which we can expect your institution’s list. It is important that we receive your institution’s list prior to the end of the fall term, if possible.]

• Completion of the Institution Questionnaire online at the NSoFaS web site. The questionnaire may be completed in multiple sessions; however, Question 1 (which asks for counts of full- and part-time faculty and instructional staff at your institution) should be answered at the time you send your list of faculty. A facsimile of the questionnaire is included in your binder. Please complete this questionnaire online by December 5, 2003, or by the date you submit your faculty list noted above if different.

• Preparation of a complete data file listing all students enrolled at your institution at any time between July 1, 2003, and April 30, 2004. Please refer to the enclosed NPSAS materials for a complete set of student eligibility criteria. Your list of students enrolled should be transmitted to RTI as early as possible. This data file will be used to draw a sample of students for participation in NPSAS. Sampled students will be asked to complete a questionnaire on our secured web site over the Internet. It is critical that we allow students ample time to respond before the end of the academic year. [FOR INST THAT COMPLETED A CRF: <Information provided on the CRF indicates that you will send the student list to RTI on <DATE>. [ NO CRF/ UNKNOWN AFTER DATE: The NSoFaS help desk will call to confirm the date at which we can expect your institution’s list.]

• Completion of a separate web-based computer-assisted data entry (webCADE) program that requires institution record information for those students who are sampled. This includes specific information on their enrollment status, financial assistance, and demographic characteristics. More details can be found in the enclosed binder.

All responses that relate to or describe identifiable characteristics of individuals may be used only for statistical purposes and may not be disclosed or used, in identifiable form, for any other purpose, unless otherwise compelled by law. The enclosed materials detail our data collection procedures and provide a detailed description of the laws and procedures safeguarding the confidentiality of individual questionnaire responses, contact information, and demographic data. Additional sources of information, including reports based on data from previous NSOPF and NPSAS studies, are available on the NSoFaS web site: https://surveys.nces.ed.gov/nsofas2004/.

If you have questions about the study purposes or procedures, please contact either of us or Brian Kuhr, Project Coordinator, at 1–866–NSOFAS4 (1–866–676–3274) or via e-mail at [email protected]. You may also direct questions to NCES by contacting either James Griffith at 1–202–502–7387 (e-mail address: [email protected]) or Linda Zimbler at 1–202–502–7481 (e-mail address: [email protected]).

We look forward to your participation in this important study. Thank you for your cooperation.

Sincerely,

John Riccobono, Ph.D., NPSAS Project Director
Margaret Cahalan, Ph.D., NSOPF Project Director

Enclosures

Your institution’s response to the National Study of Faculty and Students may be completed online at https://surveys.nces.ed.gov/nsofas2004/

To upload lists or other data collection forms, go to the login tab found on the home/login page. You will be prompted to enter the IPEDS UNITID and password printed on the first page of this letter.


NSoFaS Endorsed by American Association for Higher Education American Association of Collegiate Registrars and Admissions Officers American Association of Community Colleges American Association of State Colleges and Universities American Association of University Professors American Council on Education American Federation of Teachers Association for Institutional Research Association of American Colleges and Universities Association of Catholic Colleges and Universities Career College Association The Carnegie Foundation for the Advancement of Teaching College and University Professional Association for Human Resources The College Board The College Fund/UNCF Council of Graduate Schools The Council of Independent Colleges Hispanic Association of Colleges and Universities National Accrediting Commission of Cosmetology Arts and Sciences National Association of College and University Business Officers National Association for Equal Opportunity in Higher Education National Association of Independent Colleges and Universities National Association of State Universities and Land-Grant Colleges National Association of Student Financial Aid Administrators National Education Association

INSTITUTION REFUSAL CONVERSION LETTER

November 21, 2003

<NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Your IPEDS UNITID:
Your password:

Dear <NAME>:

I am writing to you again to urge your participation in the 2004 National Study of Faculty and Students (NSoFaS:04). Because your participation is so important to the success of NSoFaS:04, I have authorized assistance to your institution to facilitate its timely participation in NSoFaS:04.

The Higher Education Act (Sec. 131 (d)), as amended by the United States Congress in 1998, provides the U.S. Department of Education, National Center for Education Statistics (NCES) with the authority and a mandate to periodically gather data on the condition of postsecondary education in the United States. NSoFaS:04 plays an essential role in fulfilling this mandate.

NSoFaS:04 consists of two very important studies conducted by NCES: the 2004 National Study of Postsecondary Faculty (NSOPF:04) and the 2004 National Postsecondary Student Aid Study (NPSAS:04). The nationally representative sample for the two studies is selected from among all Title IV eligible institutions. To ensure representation of the entire range of postsecondary institutions in the nation, we count on cooperation from each of the sampled institutions. We are grateful for the outstanding cooperation that we have received in previous cycles of these studies. We urgently request your institution’s participation in NSoFaS:04.

We are well aware that, especially under difficult economic conditions, postsecondary institutions have limited staff and resources to devote to participating in research studies, regardless of their importance. That is why we have instructed RTI International, NCES’ contractor for NSoFaS:04, to provide your institution with the assistance necessary to accomplish the following:

• Provide a list of faculty and instructional staff employed by your institution as of November 1, 2003;

• Complete a brief Institution Questionnaire concerning your institution’s policies and procedures regarding faculty;

• Provide a list of postsecondary students enrolled at your institution between July 1, 2003 and April 30, 2004; and

• Complete a student record abstraction form for a small number of students selected from the enrollment list.

To assist your institution in participating in the study, NCES has authorized RTI International to provide compensation for the staff and resources required by your institution to compile lists of faculty and students and associated documentation.


Moreover, if necessary, RTI will also arrange for one of its specially trained staff to visit your institution and perform the record abstractions for sampled students.

Data collection for NSoFaS:04 is both authorized and protected by federal confidentiality laws, including the Family Educational Rights and Privacy Act (FERPA). The small number of faculty and students sampled from the lists provided by your institution will be asked to participate in NSoFaS:04 by completing a questionnaire online or by telephone in a confidential and secure manner.

We encourage you to review the additional information available about NSoFaS:04 at the following web site: https://surveys.nces.ed.gov/nsofas2004/ Both the Institution Questionnaire and secure uploads for faculty and student lists may be accessed at this site. The user name (IPEDS UNITID) and password required to access the forms and procedures for your institution are printed at the top of this letter.

Over the course of the next 2 weeks, a representative from RTI will be contacting you to discuss your needs and the best way to facilitate your institution’s participation in NSoFaS:04. You may also contact Brian Kuhr, the Project Coordinator, at 1-866-676-3274 or by e-mail at [email protected] to confirm your participation in the study and to request any necessary assistance in providing the data requested. You may direct questions to NCES by contacting James Griffith at 1-202-502-7387 (e-mail address: [email protected]) or Linda Zimbler at 1-202-502-7481 (e-mail address: [email protected]).

Once again, thank you for your consideration.

Sincerely,

C. Dennis Carroll, Ph.D.
Associate Commissioner
Postsecondary Education Division

The NSoFaS forms may be completed online at https://surveys.nces.ed.gov/nsofas2004/ To access the online form, enter the user name (which is your IPEDS UNITID) and password printed on the first page of this letter.


The 2004 National Study of Faculty and Students (NSoFaS:04)

COORDINATOR RESPONSE FORM (CRF)

FACSIMILE

If you completed the CRF in spring/summer 2003, a report can be viewed and/or printed from the web site with your responses—specifically, the due dates established for submitting your list of faculty and instructional staff and/or list of students enrolled. Follow the steps below to connect to the study’s secure web site.

Connect browser to: https://surveys.nces.ed.gov/nsofas2004/
At the Home/Login page: Enter your unique IPEDS UNITID and password.1
Select the option View Coordinator Response Form Report. (Click on link.)

If you did not complete the form in spring/summer 2003, please review this facsimile and complete the CRF online as soon as possible upon receipt of this binder. Follow the steps below to connect to the study’s secure web site.

Connect browser to: https://surveys.nces.ed.gov/nsofas2004/
At the Home/Login page: Enter your unique IPEDS UNITID and password.1
Select the option Coordinator Response Form. (Click on button.)

If you are unable to complete the CRF online, you may complete the form by telephone. Please call the 2004 National Study of Faculty and Students (NSoFaS:04) Help Desk at 1–866–NSOFAS4 (1–866–676–3274). Staff members are available Monday through Friday, from 9 a.m. to 7 p.m. (Eastern Time). You will be able to immediately complete the information with a staff member or schedule an appointment to complete it at a more convenient time.

1 Your unique and secure Integrated Postsecondary Education Data System (IPEDS) UNITID and password are printed on the letter accompanying this material or they may be obtained by contacting the Help Desk at 1–866–NSOFAS4 (1–866–676–3274).



Coordinator Response Form

Your response to these questions will allow RTI to customize some of the systems on the NSoFaS web site with characteristics unique to your institution. This will make it easier for you and your staff to move through the various study components.

1. Institutions use different methods to account for a student’s credits—that is, to track completion of required curricula, courses, or programs offered at that institution.

How are courses/programs measured at your institution?

Clock hours
Credit hours
Both

2. Institutions use a variety of structures to quantify the hours that are taken by a student during a calendar year or school year.

What calendar system is used at your institution?

Semesters
Quarters
Trimesters
4-1-4
Differs by program
Continuous/Open Enrollment
No standard terms



3. Identify the names of each of the terms/enrollment periods (sometimes referred to as payment periods) that a student may enroll in between July 1, 2003, and June 30, 2004. Please include all terms, even those that may apply to special types of students (e.g., medical or MBA students). NOTE: SOME PORTION OF THE TERM MUST OCCUR BETWEEN JULY 1, 2003, AND JUNE 30, 2004, BUT MAY START PRIOR TO JULY 1 OR END AFTER JUNE 30. After all the terms are added, please press the Continue button.

Add Term

Please add a term.

Please enter the name of the term and the associated start and end dates.

Term Name:

Start date:

Month Day Year

January

1

2003

End date:

January

1

2003

THIS IS AN EXAMPLE OF HOW QUESTION 3 MAY BE COMPLETED.

3. Identify the names of each of the terms/enrollment periods (sometimes referred to as payment periods) that a student may enroll in between July 1, 2003, and June 30, 2004. Please include all terms, even those that may apply to special types of students (e.g., medical or MBA students). NOTE: SOME PORTION OF THE TERM MUST OCCUR BETWEEN JULY 1, 2003, AND JUNE 30, 2004, BUT MAY START PRIOR TO JULY 1 OR END AFTER JUNE 30. After all the terms are added, please press the Continue button. Delete? Term Name Term start date Term end date

First Summer 6/6/2003 7/15/2003

Second Summer 7/21/2003 8/8/2003

Fall 2003 8/28/2003 12/6/2003

Spring 2004 2/10/2004 5/5/2004

First Summer 2004 5/6/2004 6/15/2004

Add Term

Delete selected Terms



4. Identify institution grants and scholarships. Include only those institutional grants and scholarships paid out of institutional revenue, including restricted funds that originate from private donations or endowments. Do not include grants or scholarships funded by state or federal sources, even if the award decisions are made by institution staff. State grant program funds that are allocated to and awarded by your institution (instead of a centralized state grant system that makes awards to students) should not be included as institutional aid.

Please list up to 12 names of the most prevalent institution grants and scholarships awarded and indicate whether “need,” “merit,” or “both” is considered when making these awards.

Check here if your institution does not award institution grants or scholarships. Then click on the Continue button below.

Add Award

THIS IS AN EXAMPLE OF HOW QUESTION 4 MAY BE COMPLETED.

4. Identify institution grants and scholarships. Include only those institutional grants and scholarships paid out of institutional revenue, including restricted funds that originate from private donations or endowments. Do not include grants or scholarships funded by state or federal sources, even if the award decisions are made by institution staff. State grant program funds that are allocated to and awarded by your institution (instead of a centralized state grant system that makes awards to students) should not be included as institutional aid.

Please list up to 12 names of the most prevalent institution grants and scholarships awarded and indicate whether “need,” “merit,” or “both” is considered when making these awards. Delete? Name of Award Basis of Award Decision

Future Teachers of North Carolina Scholarship BOTH

Add Award

Delete selected Awards



NPSAS (STUDENT COMPONENT ONLY) INSTITUTIONS WILL AUTOMATICALLY SKIP THIS QUESTION WHEN FORM IS COMPLETED ON WEB.

5. We would like to receive a list of faculty and instructional staff employed at your institution as of November 1, 2003. The table to the right depicts the data elements to be included on the list for each faculty and instructional staff member. We'd like to receive the list of faculty and instructional staff no later than December 5, 2003.

When will you be able to provide the list of faculty and instructional staff?

On or before December 5, 2003

After December 5, 2003. (A project staff member will call to establish a specific date.)

Faculty and Instructional Staff Data Elements

1. First Name

2. Middle Initial

3. Last Name

4. Name Suffix (e.g., Jr., Sr., III, etc.)

5. Employee ID

6. Race/Ethnicity

7. Gender

8. Employment Status

9. Academic Field

10. Campus Address 1

11. Campus Address 2

12. Campus City

13. Campus State

14. Campus Zip Code

15. Campus Telephone Number

16. Campus e-mail

17. Home Address 1

18. Home Address 2

19. Home City

20. Home State

21. Home Zip Code

22. Home Telephone Number

23. Home e-mail



THESE DATES ARE AN EXAMPLE OF HOW THE DATE FILLS IN BASED ON YOUR INSTITUTION’S RESPONSE TO QUESTION 3 (IF ANY TERMS WERE ENTERED).

6. Please provide a list of all students enrolled at your institution. The table to the right depicts the data elements to be included on the list for each student. We’d like to receive the enrollment list as soon as possible. Based on the dates you provided for terms during the 2003-04 academic year, February 24, 2004, is 2 weeks after the beginning of the "Spring 2004" term, which is the last term with a start date that is on or before April 30, 2004.

When will you be able to provide the list of all students enrolled?

On or before February 24, 2004

After February 24, 2004. (A project staff member will call to establish a specific date.)

Student Data Element

1. First Name

2. Middle Initial

3. Last Name

4. Name Suffix (e.g., Jr., Sr., III, etc.)

5. Student ID

6. Social Security Number

7. Educational Level

8. First Time Beginner

9. Local Address 1

10. Local Address 2

11. Local City

12. Local State

13. Local ZIP Code

14. Local Telephone Number

15. Campus e-mail

16. Permanent Address 1

17. Permanent Address 2

18. Permanent City

19. Permanent State

20. Permanent ZIP Code

21. Permanent Telephone Number

22. Permanent e-mail



7. When RTI receives your list of students enrolled, a random sample will be selected. During the final stage of the study, you will enter specific data from sampled students’ records pertaining to enrollment and financial aid status. NPSAS webCADE (a computer-assisted data entry Internet application) is the application developed to assist in your completing this stage. It will be available on the study web site once the sample has been selected. You will enter student data on this site using either Netscape 4.8 or higher or MS Internet Explorer 5.0 or higher with the following:

· 128-bit encryption. You may need to adjust your browser settings or download an update to activate 128-bit encryption.

· JavaScript enabled. JavaScript is the programming language of the interactive sections of our web site and must be enabled for many pages to work properly.

Will it be possible for you to use this software to provide the requested data?

Yes

No

Would like to discuss options with staff

OPTIONS AT END OF CRF

You have reached the end of this form. Please check the option that best describes how you would like us to proceed:

Close completed form: You have completed all the information, including all terms, awards, and dates when we can expect your faculty list and your list of students enrolled. Checking this option means that you are submitting this form as final. If you later determine that you need to make modifications, please call 1-866-NSOFAS4 (1-866-676-3274) or e-mail the changes to [email protected].

Keep form open for later completion: You have completed all or most of the information, including some terms, some awards, and dates when we can expect your faculty list and your list of students enrolled. Checking this option will allow you to continue accessing this form on the web until you are entirely satisfied that all information has been entered. NSoFaS staff may call you to offer their assistance.

Provide assistance: You would like NSoFaS staff to call you to schedule a time to complete the items. Checking this option forwards an auto e-mail to [email protected] and a staff person will call to set an appointment for completing the Response Form with you over the telephone. A facsimile of the form was provided with your early notification packet to assist with preparation of your responses at that time.


INSTRUCTIONS FOR COMPLETING THE FEDEX AIRBILL

ITEM 1: Fill in Date, Sender’s Name, Phone, Company, Address, City, State, and ZIP Code.

ITEM 2: Your Internal Billing Reference Information will be 08xxx.xxx.xxx for the list of faculty and instructional staff.

ITEM 3: ON AIRBILL TO BE ENTERED
Recipient’s Name: Linda Rattelade
Phone: (919) 541-8984
Company: RTI International
Address: 1000 Parliament Ct., Suite 100
City: Durham
State: NC
ZIP Code: 27703-8464

ITEM 4a: Please mark FedEx Priority Overnight.

ITEM 5: Indicate the type of package/letter you are sending.

ITEM 7: Please check Third Party and use FedEx Account No. 15xxxxxxx.


NSOPF:04 Endorsed by

American Association for Higher Education

American Association of Collegiate Registrars and Admissions Officers

American Association of Community Colleges

American Association of State Colleges and Universities

American Association of University Professors

American Council on Education

American Federation of Teachers

Association for Institutional Research

Association of American Colleges and Universities

Association of Catholic Colleges and Universities

Career College Association

The Carnegie Foundation for the Advancement of Teaching

College and University Professional Association for Human Resources

The College Board

The College Fund/UNCF

Council of Graduate Schools

The Council of Independent Colleges

Hispanic Association of Colleges and Universities

National Association of College and University Business Officers

National Association for Equal Opportunity in Higher Education

National Association of Independent Colleges and Universities

National Association of State Universities and Land-Grant Colleges

National Association of Student Financial Aid Administrators

National Education Association

LEAD LETTER TO FACULTY

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

I am writing to ask you to participate in an important study about postsecondary faculty and instructional staff in the United States. Specifically, I would like you to complete a questionnaire over the Internet about your background and work experiences at <INSTITUTION NAME>. You were selected as part of a nationally representative sample of faculty and instructional staff to take part in the fourth cycle of the National Study of Postsecondary Faculty (NSOPF). RTI International (RTI) of North Carolina is conducting this cycle of the study for the U.S. Department of Education. Your participation, while voluntary, is critical to the study’s success. On average, the questionnaire takes about 30 minutes to complete.

Your responses will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file for follow-up purposes only. Your responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, except as required by law. We have enclosed a pamphlet that answers common questions about the study and contains additional information on laws protecting your confidentiality.

To respond to the questionnaire over the Internet:

• Go to: https://surveys.nces.ed.gov/nsopf/
• Type the study ID and password (see below) on the Home/Login page, and
• Press “Enter” or click “Login” to begin the questionnaire.

To respond to the questionnaire by telephone with one of our trained interviewers, or ask questions about the study:

• Call 1–866–NSOPF04 (1–866–676–7304).

If you complete the questionnaire by <DATE>, you may choose to receive either a $30 check or gift certificate from Amazon.com as a token of our appreciation. If you have questions or comments regarding the study, you may contact the RTI Project Director, Dr. Maggie Cahalan, at 1–866–676–7304 (e-mail address: [email protected]) or the NCES Project Officer, Linda Zimbler, at 1–202–502–7481 (e-mail address: [email protected]).

Sincerely,

C. Dennis Carroll, Ph.D.
Associate Commissioner
Postsecondary Studies Division

Enclosures

Go to: https://surveys.nces.ed.gov/nsopf/
Your study ID:
Your password:


HOW TO COMPLETE THE NSOPF:04 QUESTIONNAIRE

As a thank you from the U.S. Department of Education, if you complete the National Study of Postsecondary Faculty 2004 (NSOPF:04) questionnaire by <DATE>, you will receive either a $30 check or gift certificate from Amazon.com (your choice). Your participation is very important to the success of NSOPF:04.

To complete the self-directed web questionnaire:

1. Go to: https://surveys.nces.ed.gov/nsopf/

2. At the login and password prompts, enter the study ID and password printed in the lower right of the attached letter.

3. Press “Enter” or click “Login” to begin the questionnaire.

You will need to use Internet Explorer or Netscape as your browser to complete the self-directed web version. If you need assistance in completing the web questionnaire or would like to complete the questionnaire over the phone, please call our Help Desk at 1–866–NSOPF04 (1–866–676–7304). While you may complete the NSOPF web questionnaire throughout the data collection period, we will begin calling sample members to complete the interview over the phone starting <DATE>.

For more information about this study visit the web site at:

https://surveys.nces.ed.gov/nsopf/

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this information collection is 1850–0608. The time required to complete this information collection is estimated to average 30 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202-4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Linda Zimbler, National Center for Education Statistics, U.S. Department of Education, 1990 K Street, NW, Room 8123, Washington, DC 20006-5652.


INITIAL E-MAIL

E-mail Subject line: U.S. Department of Education Study

Attention: <NAME>

Dear Colleague,

You have been randomly selected to participate in the fourth cycle of the National Study of Postsecondary Faculty (NSOPF:04) that is being conducted on behalf of the United States Department of Education. We are requesting that you complete a questionnaire over the Internet using the secure study ID and password listed below. Participation in the study is voluntary; however, to ensure that the study represents the range of postsecondary faculty and instructional staff in the United States, the participation of each person selected in the sample is critical to the study’s success.

To find out more about the study, click the link below. To respond to the questionnaire over the Internet, log in using your study ID and password:

https://surveys.nces.ed.gov/nsopf/ Study ID: Password:

You will need to use Internet Explorer or Netscape as your browser to complete the web version.

The U.S. Department of Education has contracted with RTI International, an independent non-profit research organization, to conduct the study. To respond to the questionnaire by telephone or ask questions about the study, please call the RTI help desk at:

1-866-NSOPF04 (1-866-676-7304)

As a small token of our appreciation, if you complete the questionnaire by <DATE>, you may choose to receive either a $30 check or a $30 gift certificate from Amazon.com.

Your responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, except as required by law. Your responses will be secured behind firewalls and will be encrypted during Internet transmission. On average, the questionnaire takes about 30 minutes to complete. To learn more about the study and the laws protecting your confidentiality, please click on the link above.

Thank you in advance for your participation in this important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

Note: To ensure that as many sample members as possible receive this message, we also have sent printed materials to you via U.S. mail. All the information in the printed materials also is available through the web site listed above.


EARLY RESPONSE DEADLINE REMINDER LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague:

We are writing to urge your completion of the questionnaire for the National Study of Postsecondary Faculty (NSOPF), sponsored by the U. S. Department of Education. As indicated previously, you were randomly selected for participation in this nationally representative sample of faculty and instructional staff.

At a time of rapid change in postsecondary education, NSOPF will provide critical updated information on the characteristics, workload and career paths of faculty and instructional staff in the United States. To adequately represent the full range of faculty and instructional staff throughout the nation, all persons having any full- or part-time instructional duties, or having faculty status in the fall of 2003, are eligible for inclusion. The participation of each individual selected is critical to the study’s success.

To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID:

Password:

You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help accessing the web or if you prefer to complete the questionnaire by telephone, please call the RTI Help Desk at 1-866-NSOPF04 (1-866-676-7304). If you do not wish to receive an additional reminder e-mail message regarding this early-response incentive, you may call the number listed above and request to be removed from the mailing list. The U.S. Department of Education has contracted with RTI International, an independent non-profit research organization, to conduct the study. Whether by web or telephone, we urge you to complete the questionnaire promptly. If you complete the questionnaire by <DATE>, you may choose to receive either a $30 check or a $30 gift certificate from Amazon.com.

On average, the questionnaire takes about 30 minutes to complete. Your responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, except as required by law. Your responses will be secured behind firewalls and will be encrypted during Internet transmission. To learn more about the study and the laws protecting your confidentiality, please go to the web address listed above.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


FIRST FOLLOW-UP E-MAIL

E-mail Subject line: U.S. Department of Education Study Reminder

Attention: <NAME>

Dear Colleague,

We are writing to urge your timely completion of the questionnaire for the National Study of Postsecondary Faculty (NSOPF). As indicated in our previous correspondence, you were selected as part of a nationally representative sample for this major U.S. Department of Education study.

We are keenly aware of how busy faculty and instructional staff are, which is why we developed a web version of the questionnaire as a convenient way to participate in this important study. If you complete the questionnaire by <DATE>, you may elect to receive either a $30 check or a gift certificate from Amazon.com as a token of our appreciation. To find out more about the study, click the link below. To respond to the questionnaire over the Internet, log in using your study ID and password:

https://surveys.nces.ed.gov/nsopf/ Study ID: Password:

You will need to use Internet Explorer or Netscape as your browser to complete the web version. The Department of Education has contracted with an independent non-profit research organization, RTI International, to conduct the study. If you need help completing the survey on the web or you prefer to complete the survey by telephone, please call the RTI Help Desk at 1-866-NSOPF04 (1-866-676-7304). Thank you again for your participation in this important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


SECOND REMINDER E-MAIL

Subject: U.S. Department of Education Study Follow-up Reminder

Attention: <NAME>

Dear Colleague:

We are writing to urge your completion of the questionnaire for the National Study of Postsecondary Faculty (NSOPF), sponsored by the U.S. Department of Education. As indicated previously, you were randomly selected for participation in this nationally representative sample of faculty and instructional staff.

At a time of rapid change in postsecondary education, NSOPF will provide critical updated information on the characteristics, workload and career paths of faculty and instructional staff in the United States. To adequately represent the full range of faculty and instructional staff throughout the nation, all persons having any full- or part-time instructional duties, or having faculty status in the fall of 2003, are eligible for inclusion. The participation of each individual selected is critical to the study’s success.

To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID: Password:

You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help accessing the web or if you prefer to complete the questionnaire by telephone, please call the RTI Help Desk at 1-866-NSOPF04 (1-866-676-7304). If you do not wish to receive an additional reminder e-mail message regarding this early-response incentive, you may call the number listed above and request to be removed from the mailing list.

The U.S. Department of Education has contracted with RTI International, an independent non-profit research organization, to conduct the study. Whether by web or telephone, we urge you to complete the questionnaire promptly. If you complete the questionnaire by <DATE>, you may choose to receive either a $30 check or a $30 gift certificate from Amazon.com. On average, the questionnaire takes about 30 minutes to complete.

Your responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose, except as required by law. Your responses will be secured behind firewalls and will be encrypted during Internet transmission. To learn more about the study and the laws protecting your confidentiality, please go to the web address listed above.

On behalf of the U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

Note: To ensure that as many sample members as possible receive this message, we also have sent printed materials to you via U.S. mail.


THIRD REMINDER E-MAIL

Subject: U.S. Dept. of Ed. Study Early-response Period Ends After <DATE>

Attention: <NAME>

Dear Colleague:

This message is intended only as a gentle reminder that the early-response period for the National Study of Postsecondary Faculty (NSOPF) is drawing to a close. We are pleased to report that about 50 percent of faculty and instructional staff invited to participate along with you have already completed the questionnaire online. However, to adequately represent the entire range of faculty and instructional staff in the nation, we need at least 80 percent of the sample to complete the survey. We hope you will find the time to participate in the study soon.

As a small token of our appreciation, if you complete the questionnaire by <DATE>, you may choose to receive either a $30 check or a $30 gift certificate from Amazon.com. To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID: Password:

You will need to use Internet Explorer or Netscape as your browser to complete the web version. Please be assured that your responses will be secured behind firewalls and will be encrypted during Internet transmission. If you need help accessing the web or if you prefer to complete the questionnaire by telephone, please call our Help Desk at 1-866-NSOPF04 (1-866-676-7304). If you do not wish to receive additional reminder e-mail messages, you may call the number listed above and request to be removed from the mailing list.

Thank you in advance for your participation in this very important study. Your participation is critical to its success.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


NONRESPONSE LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

The National Study of Postsecondary Faculty (NSOPF) needs your help in order to portray an accurate picture of the nation’s postsecondary educators. We hope that with the end of the school year, your schedule will allow time for you to complete the NSOPF questionnaire. As someone who plays a crucial role in education, you can surely appreciate the importance of having an adequate representation of the diversity of the nation’s faculty and instructional staff. This study, sponsored by the U.S. Department of Education, will provide critical information on the background characteristics, workloads, and career paths of faculty and instructors in postsecondary institutions. Your experiences and opinions are very important to the success of this study.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. Because we are keenly aware of how busy you are, we have developed a web version of the questionnaire as a convenient way for you to participate. You will need to use Internet Explorer or Netscape as your browser to complete the web version. To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID: Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file, and will never be linked to answers you provide. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304).

Your name was randomly selected from a list of fall 2003 faculty and instructional staff provided to us by <INSTITUTION NAME>. If you were not employed at this institution in the fall, we would greatly appreciate it if you would contact us at the above number and let us know.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


NONRESPONSE E-MAIL

E-mail Subject line: U.S. Department of Education Study – need your response

Attention: <NAME>

Dear Colleague,

The National Study of Postsecondary Faculty (NSOPF) needs your help in order to portray an accurate picture of the nation's postsecondary educators. We hope that with the end of the school year, your schedule will allow time for you to complete the NSOPF questionnaire. As someone who plays a crucial role in education, you can surely appreciate the importance of having an adequate representation of the diversity of the nation's faculty and instructional staff. This study, sponsored by the U.S. Department of Education, will provide critical information on the background characteristics, workloads, and career paths of faculty and instructors in postsecondary institutions. Your experiences and opinions are very important to the success of this study.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. Because we are keenly aware of how busy you are, we have developed a web version of the questionnaire as a convenient way for you to participate. You will need to use Internet Explorer or Netscape as your browser to complete the web version. To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID:
Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file, and will never be linked to answers you provide. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304).

Your name was randomly selected from a list of fall 2003 faculty and instructional staff provided to us by <INSTITUTION NAME>. If you were not employed at this institution in the fall, we would greatly appreciate it if you would contact us at the above number and let us know.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

Note: To ensure that as many sample members as possible receive this message, we also have sent this to you via U.S. mail.


SECOND NONRESPONSE LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

We are writing again to request your participation in the National Study of Postsecondary Faculty (NSOPF:04). We hope that with the end of the school year, your schedule will now allow time for you to participate in the study. The U.S. Department of Education needs your help in order to portray an accurate picture of the nation’s postsecondary educators. Your experiences and opinions are very important to the success of this study, and will provide critical information on the background characteristics, workloads, and career paths of faculty and instructors in postsecondary institutions.

Your name was randomly selected from a list of fall 2003 faculty and instructional staff at <INSTITUTION NAME>. If you were not employed in this capacity at this institution in the fall of 2003, we would greatly appreciate it if you would contact us at the number below so that we may correct our database.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. The web version of the questionnaire has been designed as a convenient way for you to participate in the study as your schedule allows. You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304). To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID: Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


SECOND NONRESPONSE E-MAIL

E-mail Subject line: U.S. Department of Education Study – picture of postsecondary educators

Attention: <NAME>

Dear Colleague,

We are writing again to request your participation in the National Study of Postsecondary Faculty (NSOPF:04). We hope that with the end of the school year, your schedule will now allow time for you to participate in the study. The U.S. Department of Education needs your help in order to portray an accurate picture of the nation’s postsecondary educators. Your experiences and opinions are very important to the success of this study, and will provide critical information on the background characteristics, workloads, and career paths of faculty and instructors in postsecondary institutions.

Your name was randomly selected from a list of fall 2003 faculty and instructional staff at <INSTITUTION NAME>. If you were not employed in this capacity at this institution in the fall of 2003, we would greatly appreciate it if you would contact us at the number below so that we may correct our database.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. The web version of the questionnaire has been designed as a convenient way for you to participate in the study as your schedule allows. You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304). To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf/ and log in using your

Study ID: Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you in advance for your participation in this very important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

Note: To ensure that as many sample members as possible receive this message, we also have sent this to you via U.S. mail.


REFUSAL NONRESPONSE LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

We are writing to request your participation in the 2004 National Study of Postsecondary Faculty (NSOPF). We understand you may have some reluctance to participate given the many demands that are placed on your time. However, we would like to share some key information about the study to ensure that you are aware of the importance of this research effort for the U.S. Department of Education and the impact of your participation on the success of the study.

NSOPF:04 is the third in a series of studies designed to capture the experiences of a wide variety of postsecondary faculty and instructional staff by providing critical information on their background characteristics, workloads, and career paths. Because you have been scientifically selected to represent thousands of other postsecondary staff, your experiences and opinions are key to the success of this study. Your answers will help researchers and policy makers respond to issues that directly affect the quality of education in postsecondary institutions.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. The web version of the questionnaire has been designed as a convenient way for you to participate in the study as your schedule allows. You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304). To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf and log in using your

Study ID:
Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide.

Your name was randomly selected from a list of fall 2003 faculty and instructional staff at <INSTITUTION NAME>. If you were not employed in this capacity at this institution in the fall, we would greatly appreciate it if you would contact us at the number above and let us know so that we may correct our database.

Thank you for considering this very important study. Your participation is critical to its ultimate success.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


REFUSAL NONRESPONSE E-MAIL

E-mail Subject line: U.S. Department of Education Study – requests your response

Attention: <NAME>

Dear Colleague,

We are writing to request your participation in the 2004 National Study of Postsecondary Faculty (NSOPF). We understand you may have some reluctance to participate given the many demands that are placed on your time. However, we would like to share some key information about the study to ensure that you are aware of the importance of this research effort for the U.S. Department of Education and the impact of your participation on the success of the study.

NSOPF:04 is the third in a series of studies designed to capture the experiences of a wide variety of postsecondary faculty and instructional staff by providing critical information on their background characteristics, workloads, and career paths. Because you have been scientifically selected to represent thousands of other postsecondary staff, your experiences and opinions are key to the success of this study. Your answers will help researchers and policy makers respond to issues that directly affect the quality of education in postsecondary institutions.

As a token of our appreciation for completing the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. The web version of the questionnaire has been designed as a convenient way for you to participate in the study as your schedule allows. You will need to use Internet Explorer or Netscape as your browser to complete the web version. If you need help completing the questionnaire on the web or if you prefer to participate by telephone, we have a staff of professional interviewers available to assist you at 1-866-NSOPF04 (1-866-676-7304). To access the questionnaire on the web or to obtain more information about the study, go to https://surveys.nces.ed.gov/nsopf/ and log in using your

Study ID:
Password:

All of your answers will be completely confidential and will not be released in any form that could lead to your identification. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide.

Your name was randomly selected from a list of fall 2003 faculty and instructional staff at <INSTITUTION NAME>. If you were not employed in this capacity at this institution in the fall, we would greatly appreciate it if you would contact us at the number above and let us know so that we may correct our database.

Thank you in advance for your participation in this very important study. Your participation is critical to its ultimate success.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

Note: To ensure that as many sample members as possible receive this message, we also have sent this to you via U.S. mail.


EARLY SEPTEMBER NONRESPONDENTS LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

As we approach the end of data collection for the 2004 National Study of Postsecondary Faculty (NSOPF), the U.S. Department of Education wants to ensure that all faculty and instructional staff employed in the 2003 Fall Term are well represented in the study. It is important that you are counted so that we have an accurate representation of the diversity and experience of our nation’s postsecondary educators. In fact, your participation is so critical to the success of the study that we have created an abbreviated version of the questionnaire. The current questionnaire takes about 10 minutes to complete. We hope that this reduced time requirement will allow those of you with more demanding schedules to participate in this important study and represent others with similar time constraints.

We still offer two convenient ways to participate. You can visit our web site to answer these questions online in the privacy of your home or office. If you prefer to respond by phone, you can call 1-866-NSOPF04, a toll-free number. To participate or to learn more about the study, log in with the ID and password provided below.

https://surveys.nces.ed.gov/nsopf Study ID: Password:

If you complete the questionnaire by September 30, 2004, we will send you either a $30 check or $30 gift certificate from Amazon.com, as a token of our appreciation. Your responses will be kept confidential, will be encrypted during Internet transmission, and will be secured behind firewalls. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you for your consideration of this important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International

EARLY SEPTEMBER NONRESPONDENTS E-MAIL


Subject: Abbreviated version of U.S. Department of Education Study

Attention: <NAME>

Dear Colleague,

As we approach the end of data collection for the 2004 National Study of Postsecondary Faculty (NSOPF), the U.S. Department of Education wants to ensure that all faculty and instructional staff employed in the 2003 Fall Term are well represented in the study. It is important that you are counted so that we have an accurate representation of the diversity and experience of our nation’s postsecondary educators. In fact, your participation is so critical to the success of the study that we extended the deadline and have created an abbreviated version of the questionnaire. The current questionnaire takes about 10 minutes to complete. We hope that this reduced time requirement will allow those of you with more demanding schedules to participate in this important study and represent others with similar time constraints.

We still offer two convenient ways to participate. You can visit our web site to answer these questions online in the privacy of your home or office. If you prefer to respond by phone, you can call 1-866-NSOPF04, a toll-free number. To participate or to learn more about the study, log in with the ID and password provided below.

https://surveys.nces.ed.gov/nsopf Study ID: Password:

If you complete the questionnaire by October 5, 2004, we will send you either a $30 check or $30 gift certificate from Amazon.com, as a token of our appreciation. Your responses will be kept confidential, will be encrypted during Internet transmission, and will be secured behind firewalls. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you for your consideration of this important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


FINAL REMINDER E-MAIL

Subject: Final Reminder and Deadline Extension for U.S. Department of Education Study

Attention: <NAME>

Dear Colleague,

This final communication is to let you know that we have extended the deadline for the 2004 National Study of Postsecondary Faculty (NSOPF) through Tuesday, October 5, 2004. We are extending data collection to achieve a higher response rate. While 75 percent of all eligible sample members have completed the questionnaire, we want to make sure that we have the most accurate representation of our nation’s postsecondary educators. Your participation is critical in helping us achieve this goal. Please take 10 minutes out of your busy schedule to complete the questionnaire.

As a reminder, we still offer two convenient ways to participate. You can visit our website to answer these questions online in the privacy of your home or office. If you prefer to respond by phone, you can call 1-866-NSOPF04, a toll-free number. To participate or to learn more about the study, log in with the Study ID and password provided below.

https://surveys.nces.ed.gov/nsopf/ Study ID: Password:

If you complete the 10-minute questionnaire, we will send you either a $30 check or $30 gift certificate from Amazon.com, as a token of our appreciation. Your responses will be kept confidential, will be encrypted during Internet transmission, and will be secured behind firewalls. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you for your consideration of this important study.

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


REFUSALS MAY NOT BE ELIGIBLE LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

We recently contacted you about an important study for the U.S. Department of Education called the National Study of Postsecondary Faculty (NSOPF). As the end of the study draws near, we are writing to request that you take a few minutes to help us accurately document the records for <INSTITUTION NAME>. While we respect your decision should you ultimately choose not to participate in this study, we have found that other faculty who initially decline to complete the interview do so because they believe they are not eligible to participate based on the sample design.

NSOPF is designed to be a comprehensive source of information, encompassing the experiences of many diverse postsecondary staff, including part-time and full-time faculty and instructional staff, and even adjunct staff who provide instruction to students. In order to ensure that the data for this study are as complete and accurate as possible, we would appreciate it if you could help us determine your eligibility for this study by answering just three short questions.

• Were you employed by <INSTITUTION NAME> during the 2003 Fall Term?

• During the 2003 Fall Term at <INSTITUTION NAME>, did you have faculty status as defined by that institution?

• Did you have any instructional duties at <INSTITUTION NAME> during the 2003 Fall Term (such as teaching students in one or more credit or noncredit courses, advising or supervising students' academic activities)?
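The three screener items above drive an eligibility determination. The sketch below is not part of the appendix materials; it is a minimal illustration that assumes, as these letters suggest, that a sample member is in scope if employed at the institution during the 2003 Fall Term and reporting either institution-defined faculty status or any instructional duties. The body of this methodology report, not the letter text, defines the rule actually applied.

```python
def screener_eligible(employed_fall_2003: bool,
                      faculty_status: bool,
                      instructional_duties: bool) -> bool:
    """Assumed screening rule, for illustration only: treat a sample member
    as eligible if employed during the 2003 Fall Term and either holding
    faculty status (as defined by the institution) or having any
    instructional duties in that term."""
    return employed_fall_2003 and (faculty_status or instructional_duties)

# Example: an adjunct with no faculty status but with teaching duties
# would screen in under this assumed rule.
print(screener_eligible(True, False, True))   # True
print(screener_eligible(False, True, True))   # False
```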

We understand that faculty and instructional staff lead very busy lives, so we offer two convenient ways to participate. You can visit our web site to answer these questions online in the privacy of your home or office. If you prefer to respond by phone, you can call 1-866-NSOPF04, a toll-free number. For more information about the study, log in with the ID and password provided below.

https://surveys.nces.ed.gov/nsopf Study ID: Password:

If you are eligible for the study and choose to complete the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com. Your responses are very important to the success of this study and will be kept completely confidential. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide.

On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you for your consideration of this important study.

Sincerely,

Linda Zimbler
NSOPF Project Officer
U.S. Department of Education

Margaret Cahalan, Ph.D.
NSOPF Project Director
RTI International


NSOPF:04 Endorsed by

American Association for Higher Education

American Association of Collegiate Registrars and Admissions Officers

American Association of Community Colleges

American Association of State Colleges and Universities

American Association of University Professors

American Council on Education

American Federation of Teachers

Association for Institutional Research

Association of American Colleges and Universities

Association of Catholic Colleges and Universities

Career College Association

The Carnegie Foundation for the Advancement of Teaching

College and University Professional Association for Human Resources

The College Board

The College Fund/UNCF

Council of Graduate Schools

The Council of Independent Colleges

Hispanic Association of Colleges and Universities

National Association of College and University Business Officers

National Association for Equal Opportunity in Higher Education

National Association of Independent Colleges and Universities

National Association of State Universities and Land-Grant Colleges

National Association of Student Financial Aid Administrators

National Education Association

PARTIALS MAY NOT BE ELIGIBLE LETTER

<DATE>

<FACULTY NAME>
<ADDR 1>
<ADDR 2>
<CITY STATE ZIP>

Dear Colleague,

In reviewing the data for the National Study of Postsecondary Faculty (NSOPF), we observed that some faculty and instructional staff only answered the first one or two questions. We have also received reports from sample members who could not participate because they had difficulty completing the study on the web, or who did not complete the study because they believed they were not eligible to participate based on the initial questions. As the end of the study draws near, we are writing to clarify the eligibility criteria for the study and to request that you help us accurately document the records for <INSTITUTION NAME>. To that end, we would appreciate it if you would take a few minutes to answer just three short questions that will help us determine your eligibility for this study.

•  Were you employed by <INSTITUTION NAME> during the 2003 Fall Term?

•  During the 2003 Fall Term at <INSTITUTION NAME>, did you have faculty status as defined by that institution?

•  Did you have any instructional duties at <INSTITUTION NAME> during the 2003 Fall Term (such as teaching students in one or more credit or noncredit courses, advising or supervising students' academic activities)?

We understand that faculty and instructional staff lead very busy lives, so we offer two convenient ways to participate. You can visit our web site to answer these questions online in the privacy of your home or office. If you prefer to respond by phone, you can call 1-866-NSOPF04, a toll-free number. For more information about the study, log in with the ID and password provided below.

https://surveys.nces.ed.gov/nsopf
Study ID:
Password:

If you are eligible for the study and choose to complete the questionnaire, we would like to send you either a $30 check or gift certificate from Amazon.com as a token of our appreciation. Your responses are very important to the success of this study and will be kept completely confidential. Your answers will be secured behind firewalls and will be encrypted during Internet transmission. All identifying information is maintained in a separate file and will never be linked to answers you provide. On behalf of the National Center for Education Statistics, U.S. Department of Education, we would like to thank you for your consideration of this important study.

Sincerely,

Linda Zimbler                          Margaret Cahalan, Ph.D.
NSOPF Project Officer                  NSOPF Project Director
U.S. Department of Education           RTI International


Appendix G Training Materials


NSOPF:04 Telephone Interviewer Training Agenda

Day 1 – Saturday, February 28, 2004

9:00a–9:30a    Welcome and Introduction (30 min)
9:30a–9:45a    Confidentiality (15 min)
9:45a–10:25a   Demonstration Mock (40 min)
10:25a–10:40a  Small group discussion of survey/FAQs (15 min)
10:40a–10:55a  Break (15 min)
10:55a–12:25p  Q x Q Review (90 min)
12:25p–1:00p   Lunch Break (35 min)
1:00p–2:00p    Round Robin Mock #1 (60 min)
2:00p–2:30p    Open-Ended Coding Practice (30 min)
2:30p–3:30p    Refusal Avoidance/Conversion (60 min)
3:30p–3:45p    Break (15 min)
3:45p–4:45p    Contacting/Locating/Front-End (60 min)
4:45p–5:45p    Round Robin Mock #2 (60 min)
5:45p–6:00p    FAQ Review

Day 2 – Sunday, February 29, 2004

1:00p–1:15p    FAQ Review (Oral Quiz) (15 min)
1:15p–2:00p    Written Exercises (45 min)
2:00p–2:20p    Written Open-Ended Coding Exercise (20 min)
2:20p–3:05p    Additional Contacting/Locating Practice for new TIs (45 min)
3:05p–3:20p    Break (15 min)
3:20p–3:50p    FAQ Certification (30 min)
3:50p–5:00p    Certification Interviews (70 min)


Additional Training (on interviewers’ first shift after training):

Finish coding exercise (20 min)
Individual Mock Interview (30 min)


Interviewer Training Manual Table of Contents

                                                                              Page

1.0 INTRODUCTION .......... 1-1
    1.1 What is the National Study of Postsecondary Faculty (NSOPF:04)? .......... 1-1
    1.2 What is the Purpose of NSOPF:04? .......... 1-2
    1.3 Who Will You Be Interviewing? .......... 1-3
    1.4 What is New for NSOPF:04? .......... 1-3
    1.5 Data Collection Schedule .......... 1-5
    1.6 Project Staff .......... 1-6
2.0 GENERAL INTERVIEWING TECHNIQUES .......... 2-1
    2.1 Overview .......... 2-1
    2.2 Best Practices in Conducting the Interview .......... 2-2
        2.2.1 Asking the Questions .......... 2-2
        2.2.2 Using Feedback .......... 2-5
        2.2.3 Recording Responses Accurately .......... 2-6
        2.2.4 Use of Judgment in Coding .......... 2-7
    2.3 Web-Based Instrumentation/Self-Administered Instrument .......... 2-8
        2.3.1 Using a Windows-Based System .......... 2-8
        2.3.2 Getting Around: Using a Mouse .......... 2-8
        2.3.3 Sizing/Moving Windows .......... 2-9
        2.3.4 Working with Web Questions .......... 2-10
    2.4 The Web/CATI Instrument .......... 2-11
    2.5 Sample Members’ Rights .......... 2-12
    2.6 Confidentiality .......... 2-14
    2.7 Obtaining Cooperation .......... 2-17
    2.8 Refusals .......... 2-19
    2.9 Answers to Questions .......... 2-19
3.0 LOCATING AND CONTACTING SAMPLE MEMBERS .......... 3-1
    3.1 Pre-CATI Tracing Activities .......... 3-1
    3.2 CATI Locating Procedures .......... 3-1
    3.3 E-mail Addresses and Dept. Staff .......... 3-2
    3.4 TOPS Tracing Procedures .......... 3-3
    3.5 Use of FTS Lines for Outbound Calling .......... 3-5
    3.6 Initial Contact .......... 3-4
    3.7 Scheduling a Callback .......... 3-5
    3.8 Telephone Answering Machine Message Protocol .......... 3-7
    3.9 Status Codes .......... 3-8
4.0 QUALITY CONTROL .......... 4-1
    4.1 Assuring Quality in the Interview .......... 4-1


        4.1.1 Performance Monitoring .......... 4-1
    4.2 Electronic Problem Reports .......... 4-8
    4.3 Quality Circle Meetings .......... 4-15
    4.4 Conclusion .......... 4-15

List of Exhibits

Exhibit 1-2. NSOPF:04 Calendar .......... 1-5
Exhibit 2-1. Glossary of Windows Terms .......... 2-9
Exhibit 2-2. Confidentiality Agreement .......... 2-15
Exhibit 2-3. Affidavit of Nondisclosure .......... 2-16
Exhibit 2-4. Answering Questions/Dealing with Reluctant NSOPF:04 Sample Members .......... 2-21
Exhibit 4-1. NSOPF:04 Productivity Report .......... 4-2
Exhibit 4-2. NSOPF:04 Monitoring Form .......... 4-4
Exhibit 4-3. Problem Reporting System (Opening Screen) .......... 4-10

List of Appendices

Appendix A – Initial Mailing Materials .......... A-1
Appendix B – Trace Review Procedures .......... B-1
Appendix C – Refusal Conversion Procedures .......... C-1
Appendix D – Event and Status Codes .......... D-1
Appendix E – CATI Queues and Menu Options .......... E-1
Appendix G – Glossary of Terms .......... G-1
Appendix H – List of Acronyms .......... H-1
Appendix I – Selected Help Text .......... I-1


Appendix H Quality of Lists


Tables H-1 through H-3 show the comparisons between institution questionnaire counts and tallied faculty list counts. Among the 900 institutions that provided faculty lists and responded to the questionnaire, 740 (83 percent) had list counts that were less than 10 percent discrepant. There were more discrepancies in part-time faculty counts than in full-time faculty counts; providing a complete and accurate list of part-time faculty is, for most institutions, the most difficult part of the NSOPF data request.
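For illustration only, the banding used in tables H-1 through H-3 can be sketched in a few lines of Python. The function names, the sign convention (the list count compared against the questionnaire count), and the example counts are assumptions made for this sketch; they are not part of the NSOPF:04 processing system.

def relative_pct_discrepancy(list_count, questionnaire_count):
    """Discrepancy of the tallied list count, as a percentage of the questionnaire count."""
    return 100.0 * (list_count - questionnaire_count) / questionnaire_count

def discrepancy_band(list_count, questionnaire_count):
    """Classify an institution into the bands reported in tables H-1 through H-3."""
    d = relative_pct_discrepancy(list_count, questionnaire_count)
    if d < -10:
        return "< -10"
    if d > 10:
        return "> 10"
    return "-10 to 10"

# Example: a faculty list of 1,050 against a questionnaire count of 1,000 is
# 5 percent discrepant and falls in the "-10 to 10" band.
print(discrepancy_band(1050, 1000))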

Table H-1. Relative percentage discrepancy between total faculty counts from the institution questionnaire and faculty list: 2004

                                                               Relative percentage discrepancy
Institution type                      Number of institutions     < –10    –10 to 10     > 10

    Total                                               900         60          740      100

Public doctor's                                         170         10          130       20
Public master's                                         100         10           90       10
Public bachelor's                                        30          #           20       10
Public associate's                                      280         20          230       40
Public other                                             10          #           10        #
Private not-for-profit doctor's                          90          #           80       10
Private not-for-profit master's                          70         10           50       10
Private not-for-profit bachelor's                       110         10          100       10
Private not-for-profit associate's                       10          #           10        #
Private not-for-profit other                             50          #           40       10

# Rounds to zero.
NOTE: Numbers are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table H-2. Relative percentage discrepancy between part-time faculty counts from the institution questionnaire and faculty list: 2004

                                                               Relative percentage discrepancy
Institution type                      Number of institutions     < –10    –10 to 10     > 10

    Total                                               900         90          680      120

Public doctor's                                         170         20          120       20
Public master's                                         100         10           80       10
Public bachelor's                                        30          #           20        #
Public associate's                                      280         20          220       40
Public other                                             10          #           10        #
Private not-for-profit doctor's                          90         10           70       10
Private not-for-profit master's                          70         10           50       10
Private not-for-profit bachelor's                       110         10           90       20
Private not-for-profit associate's                       10          #           10        #
Private not-for-profit other                             50         10           30       10

# Rounds to zero.
NOTE: Numbers are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table H-3. Relative percentage discrepancy between full-time faculty counts from the institution questionnaire and faculty list: 2004

                                                               Relative percentage discrepancy
Institution type                      Number of institutions     < –10    –10 to 10     > 10

    Total                                               900         80          770       60

Public doctor's                                         170         20          130       20
Public master's                                         100         10           90        #
Public bachelor's                                        30          #           20        #
Public associate's                                      280         20          240       20
Public other                                             10          #           10        #
Private not-for-profit doctor's                          90         10           80        #
Private not-for-profit master's                          70         10           60        #
Private not-for-profit bachelor's                       110         10          100        #
Private not-for-profit associate's                       10          #           10        #
Private not-for-profit other                             50         10           30        #

# Rounds to zero.
NOTE: Numbers are rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Appendix I Nonresponse Bias Analysis


I.1 Overview

The bias in an estimated mean based on respondents, $\bar{y}_R$, is the difference between this estimate and the target parameter, $\mu$, which is the mean that would result if a complete census of the target population was conducted and all units responded. This bias can be expressed as follows:

$$B(\bar{y}_R) = \bar{y}_R - \mu$$

However, for variables that are available from the frame, $\mu$ can be estimated by $\hat{\mu}$ without sampling error, in which case the bias in $\bar{y}_R$ can then be estimated by:

$$\hat{B}(\bar{y}_R) = \bar{y}_R - \hat{\mu}$$

Moreover, an estimate of the population mean based on respondents and nonrespondents can be obtained by:

$$\hat{\mu} = (1 - \hat{\eta})\,\bar{y}_R + \hat{\eta}\,\bar{y}_{NR}$$

where $\hat{\eta}$ is the weighted unit nonresponse rate, based on weights prior to nonresponse adjustment. Consequently, the bias in $\bar{y}_R$ can then be estimated by:

$$\hat{B}(\bar{y}_R) = \hat{\eta}\,(\bar{y}_R - \bar{y}_{NR})$$

That is, the estimate of the nonresponse bias is the difference between the mean for respondents and the mean for nonrespondents, multiplied by the weighted nonresponse rate, using the design weights prior to nonresponse adjustment.
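For illustration only, the following Python sketch computes this estimate from hypothetical inputs (frame values, design weights prior to nonresponse adjustment, and unit response indicators); the function and variable names are invented for the example and are not part of the NSOPF:04 weighting system.

def estimated_nonresponse_bias(y, weights, responded):
    """Estimate the nonresponse bias of the respondent mean of a frame variable y.

    y         -- value of the frame variable for every eligible sample member
    weights   -- design weights prior to nonresponse adjustment
    responded -- True for unit respondents, False for unit nonrespondents
    """
    w_resp = sum(w for w, r in zip(weights, responded) if r)
    w_nonresp = sum(w for w, r in zip(weights, responded) if not r)
    y_r = sum(w * v for v, w, r in zip(y, weights, responded) if r) / w_resp
    y_nr = sum(w * v for v, w, r in zip(y, weights, responded) if not r) / w_nonresp
    eta_hat = w_nonresp / (w_resp + w_nonresp)   # weighted unit nonresponse rate
    return eta_hat * (y_r - y_nr)                # eta_hat * (y_R - y_NR)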

I.2 Unit-level Nonresponse Bias Analysis

A faculty respondent was defined as any sample member who was determined to be eligible for the study and had valid data for the selected set of key analytical variables. As shown in section 3.2.1 (table 13) of the main body of this report, for the 34,330 eligible sample faculty members the unweighted and weighted response rates were both 76 percent. Since the faculty weighted response rate was below 85 percent for virtually all institution types, a nonresponse bias analysis was conducted for faculty members from all institution types. The nonresponse bias was estimated for the variables known for both respondents and nonrespondents within each institution type.

The nonresponse bias analysis involved three steps. First, the nonresponse bias was estimated and tested (adjusting for multiple comparisons) to determine whether the bias was significant at the 5 percent level. Second, nonresponse adjustment factors were computed, as detailed in section 6.2.3, to significantly reduce or eliminate nonresponse bias for the variables included in the corresponding models. Third, after the weights were computed, any remaining bias was estimated and statistical tests were performed to determine its significance.
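The adjustment factors themselves were model-based, as described in section 6.2.3; the simplified weighting-class sketch below illustrates only the general idea of the second step (inflating respondent weights so that they also account for nonrespondents). The cell structure and function names are hypothetical.

from collections import defaultdict

def weighting_class_adjustment(cells, weights, responded):
    """Return nonresponse-adjusted weights under a simple weighting-class scheme.

    cells     -- adjustment-cell label for each eligible sample member
    weights   -- design weights prior to nonresponse adjustment
    responded -- True for unit respondents, False for unit nonrespondents
    """
    total_weight = defaultdict(float)       # all eligible units in each cell
    respondent_weight = defaultdict(float)  # respondents only
    for c, w, r in zip(cells, weights, responded):
        total_weight[c] += w
        if r:
            respondent_weight[c] += w
    # Factor >= 1: respondents carry the weight of their cell's nonrespondents.
    factor = {c: total_weight[c] / respondent_weight[c]
              for c in total_weight if respondent_weight[c] > 0}
    return [w * factor[c] if r else 0.0
            for c, w, r in zip(cells, weights, responded)]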

As shown in table I-1, the faculty weighting adjustments reduced, and in some cases eliminated, bias for faculty members in all institution types. Significant bias was reduced for the variables known for most respondents and nonrespondents; these are considered key analytical variables and are correlated with many of the other variables for which bias was measured as a significant difference from zero using the National Center for Education Statistics (NCES) recommended method detailed in section I.1.

Table I-1. Summary of faculty nonresponse bias analysis, overall and by type of institution: 2004

                                                       Nonresponse bias statistics (relative bias)
                                                  Before weight adjustments          After weight adjustments
                                   Nonresponse                       Percent                          Percent
Institution type                          rate    Mean   Median  significant    Mean   Median    significant

All faculty                              24.40    0.09     0.05        26.60    0.07     0.02            8.1
Public doctor’s                          21.90    0.04     0.02        73.90    0.02     0.01           30.4
Public master’s                          21.50    0.06     0.06        13.00    0.02     0.01              #
Public bachelor’s                        32.60    0.09     0.07         8.70    0.16     0.10            4.3
Public associate’s                       26.30    0.06     0.05        21.70    0.04     0.01            4.3
Public other                             26.70    0.12     0.04       100.00    0.06     0.02              #
Private not-for-profit doctor’s          31.80    0.06     0.05        56.50    0.03     0.03           21.7
Private not-for-profit master’s          21.50    0.07     0.07        17.40    0.04     0.03            8.7
Private not-for-profit bachelor’s        21.30    0.07     0.06        18.20    0.03     0.03              #
Private not-for-profit associate’s        9.00    0.25     0.15        35.00    0.12     0.05            5.0
Private not-for-profit other             29.40    0.08     0.03        18.20    0.14     0.08            4.5

# Rounds to zero.
NOTE: The percent significant reflects the ratio of biased estimates to biased and unbiased estimates for the items involved in this analysis.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Analogous analyses were conducted for the institution survey, where a responding institution was defined as one that had responded to the institution questionnaire. The corresponding institution-level results are summarized in table I-2. Here too, the institution weighting adjustments reduced the percentage of significantly biased estimates overall and across institution types. Note that such analyses were not carried out when computing the institution component of the faculty analysis weights, since the weighted response rate for institutions providing lists of faculty and instructional staff exceeded 85 percent.

Table I-2. Summary of institution nonresponse bias analysis, overall and by type of institution: 2004

                                                          Nonresponse bias statistics (relative bias)
                                                     Before weight adjustments          After weight adjustments
                                      Nonresponse                       Percent                          Percent
Institution type                             rate    Mean   Median  significant    Mean   Median    significant

All faculty                                 15.80    0.06     0.04         7.69    0.20     0.12           7.69
Public, doctor’s                            15.30    0.06     0.05         6.25    0.09     0.07           6.25
Public, master’s                            10.40    0.09     0.05        18.75    0.17     0.12              #
Public, bachelor’s                           0.00    0.21     0.21       100.00    0.41     0.38           6.25
Public, associate’s                         16.40    0.12     0.09        18.75    0.18     0.11           12.5
Public, other/unknown                        1.10    0.20     0.00        38.46    0.52     0.22           7.69
Private not-for-profit doctor’s             16.30    0.08     0.06        12.50    0.15     0.13           6.25
Private not-for-profit master’s             20.20    0.08     0.07         6.25    0.29     0.25              #
Private not-for-profit bachelor’s           22.30    0.11     0.13        13.33    0.23     0.18           6.67
Private not-for-profit associate’s          14.00    0.12     0.00         6.67    0.92     0.64          13.33
Private not-for-profit other/unknown        23.80    0.12     0.09       100.00    0.53     0.42           12.5

# Rounds to zero.
NOTE: The percent significant reflects the ratio of biased estimates to biased and unbiased estimates.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


I.3 Item-Level Bias Analysis

For items with less than an 85 percent weighted response rate, respondents and nonrespondents are compared on the sampling frame and/or questionnaire variables when data on respondents and nonrespondents were available. For this purpose, the item response rate ($RRI_x$) was calculated as the ratio of the number of respondents for whom an in-scope response was obtained for item x ($I_x$) to the number of unit-level respondents ($I$) minus the number of respondents with a valid skip for item x ($V_x$), or:

$$RRI_x = \frac{I_x}{I - V_x}$$
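For illustration only, the following sketch computes this rate for a single item; the data structures (a list of item values for unit respondents, with None marking a missing response, and a parallel list of valid-skip flags) are assumptions made for the example.

def item_response_rate(item_values, valid_skip):
    """Compute RRI_x = I_x / (I - V_x) for one item among unit-level respondents.

    item_values -- the item's value for each unit respondent (None = no response)
    valid_skip  -- True where the respondent was legitimately skipped past the item
    """
    unit_respondents = len(item_values)                       # I
    valid_skips = sum(1 for s in valid_skip if s)             # V_x
    in_scope = sum(1 for v, s in zip(item_values, valid_skip)
                   if not s and v is not None)                # I_x
    return in_scope / (unit_respondents - valid_skips)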

A faculty member was defined to be an item respondent for an analytic variable if the given faculty member had data for that variable, observed or deduced via logical imputation. Table I-3 provides a summary of response rates for variables with a response rate less than 85 percent—overall or within an institution type. A nonresponse bias analysis was conducted for these items, results of which are summarized in table I-4. For these items, the nonresponse bias was estimated for variables known for both respondents and nonrespondents and tested (adjusting for multiple comparisons) to determine if the bias was significant at the 5 percent level.

With the exception of the income items, which were expected to have higher rates of refusal due to their sensitive nature, the primary reason item nonresponse exceeded 15 percent for these items is that each applies to a relatively small subset of respondents (i.e., small denominator) and none of these items were asked on the abbreviated instrument, which was administered to about 1,600 responding faculty members.

The Q37 items were presented as a matrix that asked 6 questions about each of the classes taught by the respondents. The rate of nonresponse increased for each subsequent class described, due primarily to the smaller number of respondents to whom the question applied, relative to the static number of respondents who completed the abbreviated instrument and were not asked this question.

The basis of pay (course, credit hour, or academic term; Q68) was missing for 26 percent of respondents to whom this item applied. Q68 was asked only of those who indicated that their salary was not based on a 9- or 10-month or an 11- or 12-month contract. For those who did provide the basis of their pay, a follow-up question (Q69) asked for the amount of income paid per course, credit unit, or term; this amount was missing for 36 percent of respondents to whom the item applied.


Table I-3. Faculty item response rates for items with < 85 percent response rate for any institution type before weight adjustments, overall and by institution type: 2004

Public Private, not-for-profit Variable Overall Doctor’s Master’s Bachelor’s Associate’s Other Doctor’s Master’s Bachelor’s Associate’s Other Q31A 88.7 90.5 90.9 81.9 88.3 89.9 87.8 87.8 86.4 98.9 80.7 Q31B 88.7 90.5 89.9 81.3 88.2 89.9 88.1 88.3 87.0 98.0 82.1 Q31C 90.0 91.3 90.8 83.4 90.0 93.3 89.3 90.3 87.7 98.1 84.3 Q31D 90.0 91.2 90.9 81.3 89.8 92.3 89.6 90.4 87.7 97.9 84.0 Q32A 91.2 92.4 92.2 83.5 91.1 92.5 91.0 92.1 89.0 99.2 84.3 Q32B 91.2 92.4 92.2 83.5 91.1 92.5 91.1 92.0 89.0 99.2 84.3 Q32C 91.3 92.4 92.2 83.5 91.2 92.5 91.0 92.0 89.0 99.2 84.3 Q32D 91.2 92.4 92.2 83.5 91.1 92.5 91.0 92.1 89.0 99.2 84.3 Q35A1 91.6 92.6 92.5 83.6 91.6 93.5 91.4 92.5 89.4 99.2 85.4 Q35A2 91.6 92.5 92.5 83.5 91.6 93.5 91.4 92.5 89.3 99.2 85.4 Q35B 90.4 91.6 91.7 81.9 90.1 93.0 90.4 91.8 88.8 94.4 84.2 Q35C 90.5 91.5 91.7 82.4 90.0 93.0 90.5 91.4 88.6 97.8 84.5 Q36 89.9 90.2 91.7 82.4 89.9 93.2 88.3 91.6 88.3 99.2 83.6 Q37A1 89.0 89.1 90.9 82.4 89.0 91.6 86.7 91.4 87.9 99.1 82.9 Q37A2 84.5 83.5 88.6 78.7 84.6 89.1 78.4 87.6 84.4 99.0 73.7 Q37A3 78.0 69.9 85.6 72.1 79.5 87.3 61.5 82.6 79.4 98.8 63.7 Q37A4 67.7 47.6 76.3 64.7 73.5 81.6 33.9 72.2 64.8 98.8 52.7 Q37A5 51.4 25.5 50.9 35.2 65.0 59.7 14.3 43.8 40.4 98.1 41.3 Q37B1 87.2 86.2 91.1 81.9 88.3 91.6 79.8 90.3 86.7 99.1 80.4 Q37B2 83.4 81.2 88.9 78.0 84.2 89.1 74.2 87.2 83.2 99.0 72.9 Q37B3 77.0 67.3 85.9 72.1 79.1 87.3 57.5 82.0 78.3 98.8 62.1 Q37B4 66.9 45.1 76.6 64.6 73.0 81.6 30.6 71.9 64.3 98.8 50.8 Q37B5 50.7 23.9 51.3 35.5 64.4 59.7 13.0 42.3 39.2 98.1 41.0 Q37C1 89.1 89.1 90.9 81.6 89.2 91.6 87.0 91.3 88.0 99.1 82.2 Q37C2 84.3 83.3 88.8 77.9 84.5 89.1 77.9 87.6 84.2 98.7 72.8 Q37C3 77.7 69.5 85.3 71.2 79.4 87.3 61.6 82.5 79.4 98.3 62.8 Q37C4 67.5 47.2 76.5 63.9 73.3 81.6 34.1 72.4 64.8 97.9 51.3 Q37C5 51.1 25.3 51.2 35.0 64.3 59.7 14.7 43.0 39.9 98.1 41.3 Q37D1 88.7 88.6 90.9 81.8 88.9 91.6 86.0 90.9 87.8 99.1 81.9 Q37D2 84.3 83.2 88.6 77.9 84.3 89.1 77.9 87.2 84.4 98.9 73.2 Q37D3 77.7 70.0 85.4 71.0 79.4 87.3 61.0 81.9 79.5 98.8 63.0 Q37D4 67.5 47.4 76.5 63.4 73.3 81.6 33.7 72.3 65.1 98.8 51.1 Q37D5 51.2 25.3 51.2 35.7 64.4 59.7 14.5 43.6 40.4 98.1 41.1 See notes at end of table.


Table I-3. Faculty item response rates for items with < 85 percent response rate for any institution type before weight adjustments, overall and by institution type—Continued

Public Private, not-for-profit Variable Overall Doctor’s Master’s Bachelor’s Associate’s Other Doctor’s Master’s Bachelor’s Associate’s Other Q37E1 89.5 89.5 91.6 82.4 89.9 93.2 87.2 91.6 88.1 99.2 81.8 Q37E2 84.3 82.4 87.8 77.0 85.8 91.1 77.0 86.8 83.9 99.0 73.1 Q37E3 78.3 69.7 85.3 71.0 80.8 89.7 62.4 82.3 78.9 98.9 63.1 Q37E4 68.1 47.6 76.7 63.4 74.5 84.9 34.2 71.9 65.1 98.8 51.4 Q37E5 52.0 25.4 51.0 35.1 66.3 67.1 15.2 43.4 40.3 98.1 39.9 Q37F1 89.7 90.0 91.6 82.0 89.8 93.2 87.9 91.5 88.0 99.2 83.3 Q37F2 85.2 84.2 89.2 78.8 85.7 91.1 78.8 87.6 84.6 97.9 74.7 Q37F3 78.7 71.5 86.0 72.5 80.3 89.7 62.6 82.2 79.3 98.9 65.1 Q37F4 68.6 49.7 77.0 64.8 74.0 84.9 35.6 72.7 65.3 98.8 53.9 Q37F5 52.6 28.4 51.1 35.9 65.6 67.1 17.4 45.0 40.4 98.1 43.3 Q38A 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38B 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38C 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38D 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38E 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38F 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38G 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38H 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38I 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q38J 87.0 84.1 90.3 81.6 89.5 92.8 77.4 89.0 87.4 99.2 72.6 Q39 92.2 93.5 92.8 84.5 92.0 94.6 92.0 92.9 89.4 99.2 86.5 Q41 92.0 93.2 92.5 84.7 91.9 94.6 91.9 92.7 89.3 99.2 86.1 Q46 91.6 92.7 92.5 83.9 91.7 93.5 91.5 92.5 88.9 99.2 85.3 Q47A1 91.5 92.6 92.4 83.7 91.6 93.5 91.2 92.5 88.9 99.2 85.3 Q47A2 91.6 92.6 92.4 83.5 91.7 93.5 91.1 92.6 89.3 99.2 85.2 Q47A3 91.6 92.5 92.3 83.5 91.7 93.5 91.2 92.5 89.3 99.2 85.3 Q47B1 91.4 92.5 92.0 83.6 91.5 93.5 91.2 92.4 88.7 99.2 84.8 Q47B2 91.5 92.4 92.2 83.5 91.7 93.5 90.9 92.5 89.3 99.2 84.9 Q47B3 91.5 92.4 92.3 83.5 91.7 93.5 91.1 92.5 89.3 99.2 85.2 Q48 91.2 92.1 92.2 83.4 91.3 93.5 90.6 92.2 88.8 99.1 84.4 Q49 91.1 92.0 92.2 83.1 91.3 93.5 90.6 92.2 88.8 99.1 84.5 Q50 91.1 92.0 92.2 83.1 91.3 93.5 90.4 92.1 88.8 99.1 84.4 Q51 91.1 92.0 92.2 83.1 91.2 93.5 90.5 92.0 88.7 99.1 84.4 See notes at end of table.


Table I-3. Faculty item response rates for items with < 85 percent response rate for any institution type before weight adjustments, overall and by institution type—Continued

Public Private, not-for-profit Variable Overall Doctor’s Master’s Bachelor’s Associate’s Other Doctor’s Master’s Bachelor’s Associate’s Other Q52AA 91.1 91.9 91.9 83.1 91.1 93.5 90.6 91.9 89.0 99.1 84.8 Q52AB 91.0 91.9 91.8 83.1 91.1 93.5 90.6 91.7 89.0 99.1 84.1 Q52AC 91.0 91.9 91.8 83.1 91.1 93.5 90.6 91.9 88.7 99.1 84.6 Q52AD 91.0 91.9 91.8 83.6 91.1 93.5 90.6 91.9 88.9 99.1 84.6 Q52AE 91.0 91.9 91.9 83.0 91.0 93.5 90.6 91.9 88.9 99.1 85.0 Q52AF 91.0 91.9 91.9 83.4 91.1 93.5 90.5 91.9 88.9 99.1 85.0 Q52AG 91.0 91.8 91.8 83.4 91.1 93.5 90.7 91.9 89.0 99.1 84.8 Q52BA 90.9 91.8 91.7 83.1 91.0 93.5 90.4 91.8 88.9 99.1 84.6 Q52BB 90.8 91.7 91.5 83.1 91.0 93.5 90.5 91.5 88.8 99.1 84.1 Q52BC 90.8 91.6 91.6 83.1 91.1 93.5 90.3 91.7 88.7 99.1 84.6 Q52BD 90.8 91.6 91.7 83.4 91.0 93.5 90.4 91.7 88.8 99.1 84.5 Q52BE 90.7 91.5 91.7 82.6 90.8 93.5 90.1 91.5 88.7 99.1 85.0 Q52BF 91.0 91.8 91.7 83.4 91.1 93.5 90.2 91.8 88.9 99.1 84.7 Q52BG 91.0 91.7 91.8 83.4 91.1 93.5 90.6 91.8 88.9 99.1 84.8 Q53 91.3 92.2 92.2 83.2 91.5 93.5 90.7 91.8 88.9 99.2 85.0 Q55 91.1 92.0 92.1 82.8 91.3 93.5 90.6 91.7 88.8 99.2 85.0 Q56 84.6 89.4 88.1 73.9 72.1 89.6 87.4 84.7 82.9 98.4 72.8 Q61A 89.2 89.2 91.0 82.0 89.9 94.5 87.0 89.3 87.1 99.1 83.6 Q61B 87.4 87.3 89.3 82.0 87.9 93.3 84.3 88.1 86.2 99.1 81.8 Q61C 88.4 88.0 89.9 81.1 89.3 94.5 85.9 88.9 87.0 99.1 82.8 Q61D 85.9 85.9 87.8 80.5 86.6 92.7 82.9 85.6 84.4 99.0 80.0 Q62A 90.8 91.5 92.0 82.8 91.0 91.7 90.3 91.5 88.6 99.1 84.5 Q62B 90.5 91.4 91.5 82.8 90.8 93.5 89.5 91.3 88.2 99.0 84.1 Q62C 85.7 89.7 88.3 76.1 83.0 90.1 87.1 83.1 83.7 97.3 74.6 Q62D 90.7 91.6 92.1 82.8 90.7 93.5 90.0 91.4 88.4 99.1 84.9 Q64 90.3 91.2 91.4 81.8 90.1 93.3 90.0 91.2 88.1 99.2 83.8 Q65 89.8 90.8 91.1 81.2 89.6 88.9 89.6 91.0 87.4 98.6 83.4 Q66A 82.0 83.7 84.8 79.3 82.5 84.2 77.6 83.7 79.4 90.7 73.1 Q66B 84.3 85.8 85.9 80.3 84.6 86.0 81.8 85.7 81.2 94.8 76.5 Q66C 79.3 81.8 80.2 75.7 78.0 74.0 78.0 80.9 78.0 91.5 71.5 Q66D 79.4 81.7 80.5 76.0 78.4 74.7 77.7 81.0 78.0 91.9 71.6 Q66E 79.1 81.6 80.2 75.4 77.9 74.5 77.6 80.5 77.8 91.7 70.4 Q66F 82.6 84.1 84.0 78.9 82.9 84.7 79.8 83.1 80.4 94.7 74.4 See notes at end of table.


Table I-3. Faculty item response rates for items with < 85 percent response rate for any institution type before weight adjustments, overall and by institution type—Continued

Public Private, not-for-profit Variable Overall Doctor’s Master’s Bachelor’s Associate’s Other Doctor’s Master’s Bachelor’s Associate’s Other Q67 90.4 91.5 91.7 82.7 90.6 92.5 89.1 91.4 88.3 99.1 83.1 Q68 74.3 56.3 71.2 59.7 83.4 76.4 60.0 81.4 64.6 96.2 69.8 Q69 64.5 46.1 65.5 45.8 72.2 64.7 47.8 74.9 58.0 91.4 61.8 Q70B 83.8 85.5 85.0 77.3 83.6 89.7 80.1 84.3 82.5 95.9 77.9 Q82A 89.2 89.9 90.1 81.9 89.7 90.9 87.6 89.6 87.4 98.9 82.7 Q82B 87.2 85.5 88.2 78.9 89.8 93.3 83.1 89.4 85.9 98.8 82.2 Q82C 86.9 88.2 88.0 78.5 86.9 87.5 85.3 87.0 86.1 98.6 80.1 Q82D 85.1 86.5 86.1 76.4 85.1 84.5 83.4 85.0 84.2 96.8 78.0 Q83 89.9 90.8 90.9 82.5 90.3 92.6 88.2 90.7 87.3 99.1 83.7 NOTE: None of the items with less than 85 percent response rate were included in the abbreviated questionnaire, which consisted of items 1-28 and 71-81. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004 Public Private not-for-profit

Variable Variable label All

faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other Q31A Hours per week on paid

tasks at institution

Mean estimated bias † † † 0.03 † † † † † † 0.04 Median estimated bias † † † 0.03 † † † † † † 0.02 Percent significant bias † † † 17.39 † † † † † † 47.83

Q31B Hours per week on unpaid tasks at institution

Mean estimated bias † † † 0.03 † † † † † † 0.04 Median estimated bias † † † 0.02 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 26.09

Q31C Hours per week on paid tasks outside of institution

Mean estimated bias † † † 0.02 † † † † † † 0.02 Median estimated bias † † † 0.02 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 26.09

Q31D Hours per week on unpaid tasks outside of institution

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.02 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 26.09

Q32A Percent time spent on instruction, undergraduate

Mean estimated bias † † † 0.02 † † † † † † 0.02 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

Q32B Percent time spent on instruction, graduate/first-professional

Mean estimated bias † † † 0.02 † † † † † † 0.02 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

Q32C Percent time spent on research activities

Mean estimated bias † † † 0.02 † † † † † † 0.02 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q32D Percent time spent on other unspecified activities

Mean estimated bias † † † 0.02 † † † † † † 0.02 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

Q35A1 Number of classes taught, credit

Mean estimated bias † † † 0.02 † † † † † † † Median estimated bias † † † 0.02 † † † † † † † Percent significant bias † † † 8.70 † † † † † † †

Q35A2 Number of classes taught, noncredit

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q35B Number of classes taught, remedial

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.05 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 13.04

Q35C Number of classes taught, distance education

Mean estimated bias † † † 0.07 † † † † † † 0.03 Median estimated bias † † † 0.06 † † † † † † 0.02 Percent significant bias † † † 17.39 † † † † † † 13.04

Q36 Teaching assistant in any credit class

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.04 Percent significant bias † † † 17.39 † † † † † † 13.04

Q37A1 Number of weeks taught, 1st credit class

Mean estimated bias † † † 0.07 † † † † † † 0.05 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 13.04

Q37A2 Number of weeks taught, 2nd credit class

Mean estimated bias 0.03 0.04 † 0.03 0.10 † 0.11 † 0.03 † 0.06 Median estimated bias 0.03 0.03 † 0.03 0.06 † 0.08 † 0.03 † 0.04 Percent significant bias 45.45 43.48 † 13.04 13.04 † 17.39 † 21.74 † 56.52 See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q37A3 Number of weeks taught, 3rd credit class

Mean estimated bias 0.06 0.08 † 0.05 0.14 † 0.18 0.08 0.06 † 0.10 Median estimated bias 0.05 0.05 † 0.04 0.16 † 0.14 0.07 0.05 † 0.06 Percent significant bias 57.58 47.83 † 21.74 8.70 † 17.39 43.48 21.74 † 60.87

Q37A4 Number of weeks taught, 4th credit class

Mean estimated bias 0.12 0.17 0.18 0.10 0.19 0.09 0.28 0.17 0.12 † 0.12 Median estimated bias 0.10 0.14 0.16 0.08 0.13 0.07 0.25 0.16 0.10 † 0.08 Percent significant bias 60.61 52.17 13.04 26.09 13.04 30.43 26.09 17.39 26.09 † 60.87

Q37A5 Number of weeks taught, 5th credit class

Mean estimated bias 0.21 0.27 0.24 0.23 0.39 0.13 0.58 0.26 0.26 † 0.21 Median estimated bias 0.19 0.26 0.22 0.20 0.30 0.11 0.60 0.25 0.26 † 0.20 Percent significant bias 48.48 30.43 4.35 30.43 8.70 30.43 34.78 17.39 21.74 † 65.22

Q37B1 Number of credit hours, 1st class

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.04 Percent significant bias † † † 17.39 † † † † † † 8.70

Q37B2 Number of credit hours, 2nd class

Mean estimated bias 0.04 0.05 † 0.03 0.10 † 0.12 † 0.04 † 0.05 Median estimated bias 0.03 0.03 † 0.03 0.06 † 0.09 † 0.04 † 0.04 Percent significant bias 51.52 47.83 † 13.04 13.04 † 17.39 † 26.09 † 56.52

Q37B3 Number of credit hours, 3rd class

Mean estimated bias 0.07 0.09 † 0.05 0.14 † 0.17 0.09 0.06 † 0.09 Median estimated bias 0.05 0.07 † 0.04 0.16 † 0.13 0.08 0.05 † 0.04 Percent significant bias 57.58 47.83 † 17.39 8.70 † 17.39 47.83 26.09 † 60.87

Q37B4 Number of credit hours, 4th class

Mean estimated bias 0.13 0.19 0.17 0.10 0.20 0.09 0.28 0.17 0.13 † 0.12 Median estimated bias 0.11 0.16 0.16 0.08 0.13 0.08 0.25 0.15 0.10 † 0.08 Percent significant bias 60.61 52.17 13.04 26.09 13.04 34.78 26.09 17.39 26.09 † 60.87

Q37B5 Number of credit hours, 5th class

Mean estimated bias 0.21 0.30 0.24 0.23 0.39 0.13 0.58 0.27 0.28 † 0.21 Median estimated bias 0.19 0.27 0.22 0.20 0.30 0.10 0.60 0.28 0.27 † 0.20 Percent significant bias 51.52 43.48 4.35 30.43 13.04 30.43 34.78 21.74 21.74 † 65.22

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q37C1 Number of hours taught per week, 1st class

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.02 † † † † † † 0.03 Percent significant bias † † † 13.04 † † † † † † 56.52

Q37C2 Number of hours taught per week, 2nd class

Mean estimated bias 0.03 0.04 † 0.03 0.10 † 0.11 † 0.04 † 0.06 Median estimated bias 0.03 0.03 † 0.03 0.06 † 0.08 † 0.03 † 0.04 Percent significant bias 45.45 43.48 † 13.04 13.04 † 17.39 † 26.09 † 43.48

Q37C3 Number of hours taught per week, 3rd class

Mean estimated bias 0.06 0.08 † 0.05 0.14 † 0.17 0.08 0.06 † 0.09 Median estimated bias 0.05 0.05 † 0.04 0.16 † 0.13 0.07 0.04 † 0.06 Percent significant bias 57.58 47.83 † 21.74 8.70 † 17.39 43.48 21.74 † 56.52

Q37C4 Number of hours taught per week, 4th class

Mean estimated bias 0.12 0.17 0.18 0.09 0.19 0.09 0.27 0.16 0.12 † 0.12 Median estimated bias 0.10 0.14 0.17 0.09 0.13 0.08 0.22 0.13 0.10 † 0.06 Percent significant bias 60.61 52.17 17.39 26.09 8.70 30.43 26.09 17.39 26.09 † 26.09

Q37C5 Number of hours taught per week, 5th class

Mean estimated bias 0.21 0.27 0.24 0.23 0.37 0.13 0.57 0.25 0.27 † 0.21 Median estimated bias 0.19 0.25 0.22 0.19 0.30 0.10 0.60 0.26 0.25 † 0.20 Percent significant bias 51.52 34.78 4.35 30.43 13.04 34.78 34.78 26.09 17.39 † 65.22

Q37D1 Number of students, 1st class

Mean estimated bias † † † 0.03 † † † † † † 0.04 Median estimated bias † † † 0.02 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

Q37D2 Number of students, 2nd class

Mean estimated bias 0.03 0.04 † 0.03 0.10 † 0.12 † 0.04 † 0.07 Median estimated bias 0.03 0.02 † 0.03 0.06 † 0.09 † 0.03 † 0.03 Percent significant bias 45.45 43.48 † 13.04 13.04 † 17.39 † 26.09 † 56.52

Q37D3 Number of students, 3rd class

Mean estimated bias 0.06 0.08 † 0.05 0.14 † 0.18 0.08 0.06 † 0.08 Median estimated bias 0.05 0.04 † 0.05 0.16 † 0.15 0.06 0.05 † 0.05 Percent significant bias 54.55 47.83 † 17.39 13.04 † 17.39 47.83 21.74 † 60.87 See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q37D4 Number of students, 4th class

Mean estimated bias 0.12 0.17 0.17 0.10 0.20 0.09 0.28 0.16 0.12 † 0.12 Median estimated bias 0.10 0.14 0.15 0.08 0.12 0.08 0.25 0.17 0.10 † 0.08 Percent significant bias 60.61 52.17 13.04 21.74 13.04 30.43 26.09 17.39 26.09 † 60.87

Q37D5 Number of students, 5th class

Mean estimated bias 0.21 0.28 0.24 0.23 0.38 0.13 0.57 0.27 0.27 † 0.21 Median estimated bias 0.19 0.26 0.22 0.20 0.31 0.10 0.60 0.26 0.26 † 0.20 Percent significant bias 51.52 39.13 4.35 30.43 13.04 30.43 34.78 21.74 21.74 † 65.22

Q37E1 Primary level of students, 1st class

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.04 Percent significant bias † † † 17.39 † † † † † † 13.04

Q37E2 Primary level of students, 2nd class

Mean estimated bias 0.03 0.04 † 0.03 † † 0.11 † 0.04 † 0.05 Median estimated bias 0.02 0.02 † 0.02 † † 0.08 † 0.03 † 0.02 Percent significant bias 45.45 43.48 † 17.39 † † 39.13 † 21.74 † 56.52

Q37E3 Primary level of students, 3rd class

Mean estimated bias 0.06 0.08 † 0.05 0.12 † 0.17 0.09 0.05 † 0.07 Median estimated bias 0.04 0.05 † 0.03 0.15 † 0.13 0.08 0.04 † 0.06 Percent significant bias 57.58 52.17 † 21.74 8.70 † 34.78 43.48 21.74 † 60.87

Q37E4 Primary level of students, 4th class

Mean estimated bias 0.12 0.17 0.16 0.10 0.18 0.09 0.27 0.17 0.11 † 0.11 Median estimated bias 0.09 0.12 0.15 0.07 0.12 0.07 0.20 0.18 0.08 † 0.07 Percent significant bias 63.64 43.48 8.70 26.09 13.04 34.78 39.13 13.04 26.09 † 60.87

Q37E5 Primary level of students, 5th class

Mean estimated bias 0.21 0.26 0.22 0.23 0.35 0.13 0.56 0.29 0.24 † 0.19 Median estimated bias 0.19 0.26 0.14 0.21 0.30 0.10 0.60 0.26 0.24 † 0.12 Percent significant bias 51.52 34.78 4.35 30.43 8.70 34.78 65.22 26.09 17.39 † 65.22

Q37F1 Teaching assistant, 1st class

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.04 Percent significant bias † † † 13.04 † † † † † † 13.04 See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q37F2 Teaching assistant, 2nd class

Mean estimated bias † 0.04 † 0.08 † † 0.04 † 0.05 † 0.07 Median estimated bias † 0.02 † 0.07 † † 0.04 † 0.03 † 0.06 Percent significant bias † 43.48 † 17.39 † † 39.13 † 17.39 † 13.04

Q37F3 Teaching assistant, 3rd class

Mean estimated bias 0.06 0.08 † 0.05 0.13 † 0.17 0.09 0.06 † 0.08 Median estimated bias 0.04 0.05 † 0.04 0.15 † 0.13 0.07 0.06 † 0.04 Percent significant bias 54.55 47.83 † 17.39 8.70 † 34.78 30.43 26.09 † 60.87

Q37F4 Teaching assistant, 4th class

Mean estimated bias 0.11 0.17 0.17 0.09 0.19 0.09 0.27 0.16 0.12 † 0.12 Median estimated bias 0.09 0.14 0.14 0.07 0.13 0.07 0.20 0.15 0.11 † 0.10 Percent significant bias 57.58 52.17 13.04 17.39 13.04 34.78 39.13 17.39 21.74 † 60.87

Q37F5 Teaching assistant, 5th class

Mean estimated bias 0.20 0.26 0.20 0.23 0.37 0.13 0.56 0.26 0.26 † 0.19 Median estimated bias 0.19 0.21 0.15 0.20 0.32 0.10 0.65 0.20 0.27 † 0.15 Percent significant bias 51.52 34.78 8.70 30.43 13.04 30.43 65.22 8.70 21.74 † 65.22

Q38A Undergrad class, multiple choice midterm/final exams

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38B Undergrad class, essay midterm/final exams

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38C Undergrad class, short answer midterm/final exams

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q38D Undergrad class, term/research papers

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38E Undergrad class, multiple drafts of written work

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38F Undergrad class, oral presentations

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38G Undergrad class, group projects

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38H Undergrad class, student evaluations of each others’ work

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38I Undergrad class, laboratory/shop/studio assignments

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

Q38J Undergrad class, service learn/co-op interactions with business

Mean estimated bias † 0.04 † 0.07 † † 0.04 † † † 0.10 Median estimated bias † 0.03 † 0.07 † † 0.02 † † † 0.08 Percent significant bias † 65.22 † 13.04 † † 47.83 † † † 8.70

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q39 Web site for any instructional duties

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.04 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q41 Hours per week, e-mailing students

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.05 † † † † † † † Percent significant bias † † † 8.70 † † † † † † †

Q46 Individual instruction, any Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q47A1 Individual instruction, number undergraduate students

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q47A2 Individual instruction, number graduate students

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q47A3 Individual instruction, number first-professional students

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q47B1 Individual instruction, hours with undergraduates

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.05 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q47B2 Individual instruction, hours with graduate students

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q47B3 Individual instruction, hours with first-professional students

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q48 Hours per week, thesis/ dissertation committees

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q49 Hours per week, administrative committees

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q50 Hours per week, with advisees

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q51 Hours per week, office hours

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q52AA Career articles, refereed journals

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

See notes at end of table.


Table I-4. Summary of faculty item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004—Continued

Public Private not-for-profit Variable Variable label

All faculty Doctor's Master's Bachelor's Associate's Other Doctor's Master's Bachelor's Associate's Other

Q52AB Career articles, nonrefereed journals

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q52AC Career book reviews, chapters, creative works

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q52AD Career books, textbooks, reports

Mean estimated bias † † † 0.06 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q52AE Career presentations

Mean estimated bias † † † 0.06 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q52AF Career exhibitions, performances

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q52AG Career patents, computer software

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q52BA Recent articles, refereed journals

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52

Q52BB Recent articles, nonrefereed journals

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 8.70 † † † † † † 56.52


Q52BC Recent book reviews, chapters, creative works

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 56.52

Q52BD Recent books, textbooks, reports

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.01 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 56.52

Q52BE Recent presentations

Mean estimated bias † † † 0.02 † † † † † † † Median estimated bias † † † 0.01 † † † † † † † Percent significant bias † † † 8.70 † † † † † † †

Q52BF Recent exhibitions, performances

Mean estimated bias † † † 0.02 † † † † † † 0.03 Median estimated bias † † † 0.02 † † † † † † 0.02 Percent significant bias † † † 13.04 † † † † † † 56.52

Q52BG Recent patents, computer software

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q53 Scholarly activity, any

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q55 Scholarly activity, any funded

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.06 † † † † † † † Percent significant bias † † † 17.39 † † † † † † †

Q56 Scholarly activity, description

Mean estimated bias 0.04 † † 0.03 0.11 † † 0.02 0.04 † 0.09 Median estimated bias 0.04 † † 0.03 0.07 † † 0.01 0.03 † 0.05 Percent significant bias 48.48 † † 17.39 8.70 † † 30.43 26.09 † 60.87


Q61A Satisfaction with authority to make decisions

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 13.04 † † † † † † 17.39

Q61B Satisfaction with technology-based activities

Mean estimated bias † † † 0.08 † † † † † † 0.04 Median estimated bias † † † 0.05 † † † † † † 0.04 Percent significant bias † † † 26.09 † † † † † † 4.35

Q61C Satisfaction with equipment/facilities

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.05 † † † † † † 0.03 Percent significant bias † † † 21.74 † † † † † † 17.39

Q61D Satisfaction with institutional support for teaching improvement

Mean estimated bias † † † 0.07 † † 0.03 † 0.04 † 0.05 Median estimated bias † † † 0.05 † † 0.02 † 0.04 † 0.04 Percent significant bias † † † 17.39 † † 34.78 † 13.04 † 4.35

Q62A Satisfaction with workload

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q62B Satisfaction with salary

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 13.04

Q62C Satisfaction with benefits

Mean estimated bias † † † 0.09 0.03 † † 0.04 0.04 † 0.06 Median estimated bias † † † 0.06 0.02 † † 0.03 0.03 † 0.04 Percent significant bias † † † 26.09 17.39 † † 21.74 21.74 † 17.39

Q62D Satisfaction with job overall

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 17.39

Q64 Retired from another position

Mean estimated bias † † † 0.08 † † † † † † 0.04 Median estimated bias † † † 0.04 † † † † † † 0.04 Percent significant bias † † † 13.04 † † † † † † 13.04

Q65 Retire from all paid employment, planned age

Mean estimated bias 0.04 0.05 0.11 0.07 0.12 0.04 0.18 0.08 0.06 0.08 0.14 Median estimated bias 0.03 0.03 0.10 0.05 0.10 0.04 0.14 0.05 0.05 0.07 0.08 Percent significant bias 18.18 34.78 17.39 4.35 8.70 17.39 21.74 13.04 13.04 21.74 17.39

Q66A Amount of income from basic salary from institution

Mean estimated bias 0.03 0.03 0.06 0.03 0.07 0.03 0.09 0.03 0.04 † 0.06 Median estimated bias 0.02 0.01 0.05 0.02 0.05 0.02 0.06 0.02 0.02 † 0.04 Percent significant bias 39.39 47.83 17.39 13.04 8.70 26.09 13.04 34.78 17.39 † 30.43

Q66B Amount of income from other income from institution

Mean estimated bias 0.02 † † 0.03 0.06 † 0.08 † 0.04 † 0.06 Median estimated bias 0.02 † † 0.02 0.05 † 0.06 † 0.02 † 0.04 Percent significant bias 42.42 † † 13.04 8.70 † 13.04 † 4.35 † 26.09

Q66C Amount of income from other academic institution

Mean estimated bias 0.03 0.03 0.06 0.03 0.07 0.03 0.09 0.04 0.04 † 0.07 Median estimated bias 0.02 0.02 0.05 0.03 0.03 0.02 0.07 0.03 0.02 † 0.03 Percent significant bias 30.30 39.13 8.70 13.04 13.04 26.09 13.04 17.39 17.39 † 21.74

Q66D Amount of income from consulting or freelance work

Mean estimated bias 0.03 0.03 0.06 0.03 0.07 0.03 0.08 0.04 0.04 † 0.06 Median estimated bias 0.02 0.01 0.06 0.02 0.04 0.02 0.04 0.03 0.03 † 0.05 Percent significant bias 33.33 34.78 8.70 13.04 13.04 21.74 13.04 17.39 8.70 † 21.74

Q66E Amount of income from other employment

Mean estimated bias 0.03 0.03 0.06 0.03 0.07 0.03 0.08 0.04 0.04 † 0.06 Median estimated bias 0.02 0.01 0.05 0.02 0.04 0.02 0.06 0.03 0.03 † 0.04 Percent significant bias 30.30 34.78 4.35 13.04 8.70 21.74 13.04 17.39 13.04 † 21.74


Q66F Amount of income from other unspecified sources

Mean estimated bias 0.03 0.02 0.06 0.03 0.07 0.03 0.07 0.04 0.04 † 0.06 Median estimated bias 0.02 0.02 0.05 0.02 0.05 0.02 0.06 0.03 0.03 † 0.04 Percent significant bias 48.48 47.83 13.04 8.70 8.70 21.74 13.04 34.78 4.35 † 26.09

Q67 Type of contract, length of unit

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.06 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 4.35

Q68 Income paid per course/ credit unit or term

Mean estimated bias 0.11 0.15 0.14 0.15 0.35 0.09 0.24 0.18 0.11 † 0.15 Median estimated bias 0.07 0.12 0.09 0.07 0.37 0.03 0.20 0.12 0.07 † 0.07 Percent significant bias 39.39 43.48 17.39 17.39 8.70 30.43 39.13 43.48 17.39 † 26.09

Q69 Amount of income paid per course/credit unit or term

Mean estimated bias 0.13 0.20 0.17 0.17 0.37 0.11 0.28 0.28 0.13 † 0.18 Median estimated bias 0.08 0.14 0.14 0.07 0.39 0.05 0.28 0.18 0.08 † 0.10 Percent significant bias 42.42 52.17 13.04 26.09 100.00 39.13 30.43 47.83 17.39 † 21.74

Q70B Amount of total household income (range)

Mean estimated bias 0.02 † † 0.03 0.08 † 0.08 0.03 0.03 † 0.06 Median estimated bias 0.02 † † 0.03 0.07 † 0.07 0.03 0.02 † 0.04 Percent significant bias 42.42 † † 13.04 17.39 † 26.09 17.39 13.04 † 21.74

Q82A Opinion: teaching is rewarded

Mean estimated bias † † † 0.07 † † † † † † 0.04 Median estimated bias † † † 0.05 † † † † † † 0.03 Percent significant bias † † † 13.04 † † † † † † 13.04

Q82B Opinion: part-time faculty treated fairly

Mean estimated bias † † † 0.08 † † 0.03 † † † 0.05 Median estimated bias † † † 0.05 † † 0.03 † † † 0.03 Percent significant bias † † † 13.04 † † 39.13 † † † 13.04


Q82C Opinion: female faculty treated fairly

Mean estimated bias † † † 0.08 † † † † † † 0.05 Median estimated bias † † † 0.05 † † † † † † 0.04 Percent significant bias † † † 13.04 † † † † † † 13.04

Q82D Opinion: racial minorities treated fairly

Mean estimated bias † † † 0.08 † 0.06 0.02 † 0.04 † 0.06 Median estimated bias † † † 0.07 † 0.07 0.02 † 0.04 † 0.05 Percent significant bias † † † 17.39 † 4.35 34.78 † 17.39 † 17.39

Q83 Opinion about choosing an academic career again

Mean estimated bias † † † 0.07 † † † † † † 0.03 Median estimated bias † † † 0.05 † † † † † † 0.03 Percent significant bias † † † 17.39 † † † † † † 8.70

† Not applicable.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Similarly, an institution was defined to be an item respondent for an analytic variable if the given institution had data for that variable, either observed or deduced via logical imputation. Table I-5 provides a summary of response rates, overall and by institution type, for institution items with a response rate below 85 percent for any institution type. For variables with a response rate of less than 85 percent—overall or within an institution type—a nonresponse bias analysis was conducted, the results of which are summarized in table I-6. As in the faculty item nonresponse bias analysis, for these items the nonresponse bias was estimated for variables known for both respondents and nonrespondents and tested (adjusting for multiple comparisons) to determine whether the bias was significant at the 5 percent level.
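
This kind of item nonresponse bias estimate can be expressed compactly. The sketch below is a simplified illustration only (the mean of a comparison variable among item respondents minus its mean over all eligible cases, with a Bonferroni-style adjustment for multiple comparisons); the function names, the weight column, and the unweighted two-sample t-test are assumptions made for illustration and do not reproduce the design-based procedures actually used for NSOPF:04.

```python
import numpy as np
from scipy import stats

def estimated_item_bias(y, item_respondent, weight):
    """Estimated nonresponse bias for one comparison variable y that is known
    for both item respondents and nonrespondents: the weighted mean of y among
    item respondents minus the weighted mean of y over all eligible cases."""
    y = np.asarray(y, dtype=float)
    r = np.asarray(item_respondent, dtype=bool)
    w = np.asarray(weight, dtype=float)
    return np.average(y[r], weights=w[r]) - np.average(y, weights=w)

def bias_is_significant(y, item_respondent, n_comparisons, alpha=0.05):
    """Simplified check: unweighted two-sample t-test of item respondents
    versus nonrespondents, Bonferroni-adjusted for the number of comparisons
    made for the item (the report's tests account for the survey design)."""
    y = np.asarray(y, dtype=float)
    r = np.asarray(item_respondent, dtype=bool)
    _, p_value = stats.ttest_ind(y[r], y[~r], equal_var=False)
    return p_value < alpha / n_comparisons
```

Under this sketch, the bias reported within a given institution type would be obtained by restricting y, item_respondent, and weight to institutions of that type before calling the functions.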

Two of the 90 items had an overall response rate of less than 85 percent. Item 7E2 of the institution questionnaire was asked only if the institution reported having offered early or phased retirement to any tenured full-time faculty or instructional staff (item 7E). The number of full-time faculty and instructional staff who took early or phased retirement in the previous 5 years was evidently difficult for some institutions to quantify; asking about a single academic year might have yielded a lower rate of nonresponse.

Item 15BG was asked only if the institution reported offering a cafeteria-style plan benefit to all or some part-time faculty and instructional staff (item 15AG). Only a small percentage of schools offer this benefit to part-time staff, and a relatively high percentage of schools were not sure whether the benefit was offered. Hence, the high nonresponse on this nested question, which asked whether the benefit was subsidized, is due almost entirely to nonresponse on the gate question, coupled with the small number of schools to which the nested item applied.
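
The arithmetic behind this explanation is simple. The following sketch, using purely hypothetical counts, shows how nonresponse on a gate item propagates to a nested item when only a small number of cases are routed into it; the accounting shown (gate nonrespondents counted in the nested item's denominator) is one plausible reading of the explanation above, not a statement of the exact NSOPF:04 editing rules.

```python
def nested_item_response_rate(n_gate_yes, n_gate_missing, n_nested_answered):
    """Response rate for a nested item such as 15BG: institutions that did not
    answer the gate item (15AG) could not be asked the nested item, so they
    are counted here in the nested item's denominator as nonrespondents."""
    denominator = n_gate_yes + n_gate_missing
    return 100.0 * n_nested_answered / denominator

# Hypothetical counts: 40 institutions answered "yes" to the gate item, 8 left
# the gate item blank, and all 40 eligible institutions answered the nested
# item -> 40 / 48 = 83.3 percent, below the 85 percent threshold.
print(round(nested_item_response_rate(40, 8, 40), 1))
```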


Table I-5. Institution item response rates for items with <85 percent response rate for any institution type before weight adjustment, overall and by institution type: 2004

Variable, All institutions, Public (Doctor's, Master's, Bachelor's, Associate's, Other), Private not-for-profit (Doctor's, Master's, Bachelor's, Associate's, Other)

I2A 96.7 96.0 94.6 67.3 99.3 100.0 94.7 97.5 91.8 100.0 100.0
I2B 97.2 96.0 94.6 67.3 99.3 100.0 94.7 97.5 95.3 100.0 100.0
I2C 96.8 96.0 94.6 67.3 99.4 100.0 94.7 97.5 92.1 100.0 100.0
I2D 97.3 96.5 94.6 67.3 99.3 100.0 94.7 97.5 95.3 100.0 100.0
I2E 97.2 96.0 94.6 67.3 99.3 100.0 94.7 97.5 95.3 100.0 100.0
I2F 97.2 96.0 94.6 67.3 99.3 100.0 94.7 97.5 95.3 100.0 100.0
I4 91.5 95.0 88.1 59.6 90.1 100.0 89.1 96.2 88.0 100.0 100.0
I5 91.6 94.3 88.1 66.8 90.0 100.0 91.2 95.5 90.6 100.0 100.0
I7A 96.3 97.6 94.8 70.6 95.9 100.0 98.4 98.4 95.9 100.0 100.0
I7B 96.3 97.6 94.8 70.6 95.9 100.0 98.4 98.4 95.9 100.0 100.0
I7C 96.3 97.6 94.8 70.6 95.9 100.0 98.4 98.4 95.9 100.0 100.0
I7D 96.1 97.6 94.8 70.6 95.9 100.0 98.4 98.4 94.7 100.0 100.0
I7E 96.3 97.6 94.8 70.6 95.9 100.0 98.4 98.4 95.9 100.0 100.0
I7E2 84.5 79.1 76.1 37.5 81.6 82.3 90.0 90.2 92.6 100.0 100.0
I9 91.8 82.5 86.0 58.9 94.2 92.3 88.5 97.6 95.1 100.0 90.5
I15BA 96.7 97.0 100.0 80.1 93.1 100.0 97.7 98.9 97.4 100.0 100.0
I15BB 96.4 96.7 100.0 78.6 90.7 100.0 97.5 98.9 99.0 100.0 100.0
I15BE 93.5 96.4 96.4 70.1 91.1 100.0 94.0 100.0 96.1 100.0 100.0
I15BF 95.8 98.6 100.0 78.3 90.8 100.0 96.5 100.0 100.0 100.0 100.0
I15BG 83.2 81.4 97.5 51.7 69.3 100.0 88.1 96.7 78.8 100.0 88.0
I19A 90.8 86.3 81.3 76.9 96.1 100.0 89.0 92.4 94.9 100.0 81.7
I19B 90.8 86.3 81.3 76.9 96.1 100.0 89.0 92.4 94.9 100.0 81.7
I19C 91.3 86.3 82.0 76.9 96.7 100.0 89.9 93.9 95.2 100.0 81.7
I19D 91.1 86.3 82.0 76.9 96.3 100.0 89.9 93.3 95.2 100.0 81.7
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table I-6. Summary of institution item nonresponse bias analysis before weight adjustments, overall and by institution type: 2004

Variable, Variable label, All institutions, Public (Doctor's, Master's, Bachelor's, Associate's, Other), Private not-for-profit (Doctor's, Master's, Bachelor's, Associate's, Other)

I2A Full-time numbers: faculty, fall 2002

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I2B Full-time numbers: changed from part to full time, 2002–03

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I2C Full-time numbers: hired, 2002–03

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I2D Full-time numbers: retired, 2002–03

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I2E Full-time numbers: left for other reasons, 2002–03

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I2F Full-time numbers: changed from full to part time, 2002–03

Mean estimated bias † † † 0.04 † † † † † † † Median estimated bias † † † 0.03 † † † † † † † Percent significant bias † † † 12.50 † † † † † † †

I4 Full-time tenure: number considered for tenure, 2002–03

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.07 † † † † † † † Percent significant bias † † † 6.25 † † † † † † †

I5 Full-time tenure: number granted tenure, 2002–03

Mean estimated bias † † † 0.07 † † † † † † † Median estimated bias † † † 0.07 † † † † † † † Percent significant bias † † † 6.25 † † † † † † †

I7A Full-time tenure: changed tenure policy

Mean estimated bias † † † 0.01 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 25.00 † † † † † † †

I7B Full-time tenure: more stringent tenure standards

Mean estimated bias † † † 0.01 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 25.00 † † † † † † †

I7C Full-time tenure: downsized tenured faculty

Mean estimated bias † † † 0.01 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 25.00 † † † † † † †

I7D Full-time tenure: replaced tenured with fixed term

Mean estimated bias † † † 0.01 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 25.00 † † † † † † †

I7E Full-time tenure: offered early retirement

Mean estimated bias † † † 0.01 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 25.00 † † † † † † †

I7E2 Full-time tenure: number early retirees, last 5 years

Mean estimated bias 0.13 0.09 0.01 0.17 0.50 0.14 † † † † † Median estimated bias 0.10 0.07 # 0.22 0.42 0.07 † † † † † Percent significant bias 7.69 6.25 # 6.25 12.50 12.50 † † † † †

I9 Full-time faculty: positions sought to fill, fall 2003

Mean estimated bias † 0.08 † † † † † † † † † Median estimated bias † 0.05 † † † † † † † † † Percent significant bias † 6.25 † † † † † † † † †

I15BA Part-time benefit: medical insurance subsidized

Mean estimated bias † † † # † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † # † † † † † † †

I15BB Part-time benefit: dental insurance subsidized

Mean estimated bias † † † # † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † # † † † † † † †

I15BE Part-time benefit: child care subsidized

Mean estimated bias † † † 0.02 † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † 56.25 † † † † † † †

I15BF Part-time benefit: retiree medical insurance subsidized

Mean estimated bias † † † # † † † † † † † Median estimated bias † † † # † † † † † † † Percent significant bias † † † # † † † † † † †

I15BG Part-time benefit: cafeteria-style plan subsidized

Mean estimated bias 0.11 0.09 † 0.08 0.35 † † † 0.11 † † Median estimated bias 0.08 0.04 † # # † † † # † † Percent significant bias # 18.75 † 25.00 # † † † 18.75 † †

I19A Undergraduate instruction: percent full-time faculty

Mean estimated bias † † 0.05 0.06 † † † † † † 0.02 Median estimated bias † † # 0.03 † † † † † † # Percent significant bias † † 12.50 12.50 † † † † † † #

I19B Undergraduate instruction: percent part-time faculty

Mean estimated bias † † 0.05 0.06 † † † † † † 0.02 Median estimated bias † † # 0.03 † † † † † † # Percent significant bias † † 12.50 12.50 † † † † † † #

I19C Undergraduate instruction: percent teaching assistants

Mean estimated bias † † 0.05 0.07 † † † † † † 0.02 Median estimated bias † † # 0.03 † † † † † † # Percent significant bias † † 12.50 12.50 † † † † † † #

I19D Undergraduate instruction: percent other

Mean estimated bias † † 0.05 0.07 † † † † † † 0.02 Median estimated bias † † # 0.03 † † † † † † # Percent significant bias † † 12.50 12.50 † † † † † † #

† Not applicable.
# Rounds to zero.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


I.4 Bias Reduction due to Imputation

Bias resulting from missing data can occur at the unit level due to differential nonresponse or undercoverage, while bias at the item level is often due to unanswered questions or inconsistent responses that are typically set to missing once they fail edit checks. Section I.2 described measures taken to reduce bias due to unit nonresponse, while this section evaluates how well the imputation succeeded in reducing bias for items with a weighted response rate of less than 85 percent (using weights prior to nonresponse adjustment) by estimating bias before and after imputation. For continuous variables, the estimated bias was calculated as the mean before imputation minus the mean after imputation. For categorical variables, the estimated bias was computed for each category as the percentage of faculty members in that category before imputation minus the corresponding percentage after imputation. The estimated bias was then tested (adjusting for multiple comparisons) to determine if the bias was significant at the 5 percent level. A categorical variable was considered significantly biased if the bias for any of its categories was significant. The results for faculty items are shown in table I-7 for continuous variables and table I-8 for categorical variables.
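
The before/after comparison can be written down compactly. The sketch below is a minimal, unweighted illustration of these two bias definitions; the function names are illustrative, NaN is assumed to mark a value that was missing before imputation, and the relative bias is taken as the estimated bias divided by the post-imputation mean, which matches the magnitudes shown in table I-7 but is not necessarily the report's exact formula. The multiple-comparison significance tests are omitted.

```python
import numpy as np

def continuous_imputation_bias(before, after):
    """Estimated bias: mean of the reported values (missing excluded) minus
    the mean after imputation (missing values filled in). Relative bias is
    taken here as the estimated bias divided by the post-imputation mean."""
    before = np.asarray(before, dtype=float)   # NaN marks a missing value
    after = np.asarray(after, dtype=float)     # no missing values remain
    estimated = np.nanmean(before) - after.mean()
    return estimated, estimated / after.mean()

def categorical_imputation_bias(before, after):
    """Per-category bias: percentage in each category among the reported
    values minus the percentage in that category after imputation."""
    before = np.asarray(before, dtype=float)
    after = np.asarray(after, dtype=float)
    observed = before[~np.isnan(before)]
    return {cat: 100.0 * np.mean(observed == cat) - 100.0 * np.mean(after == cat)
            for cat in np.unique(after)}
```

As an arithmetic check, applying the same logic to the published means for item Q66A (43,383.00 before and 42,799.04 after imputation) gives an estimated bias of roughly 584 and a relative bias of about 0.014, consistent with table I-7.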

Table I-7. Faculty item nonresponse bias analysis for continuous variables, before and after imputation: 2004

Item, Mean before imputation, Mean after imputation, Estimated bias, Relative bias, Significant

Q37A2 13.68 13.65 0.03 0.002 N
Q37A3 13.80 13.73 0.07 0.005 Y
Q37A4 13.73 13.64 0.10 0.007 Y
Q37A5 13.68 13.52 0.16 0.012 Y
Q37B2 3.05 3.06 0.00 -0.001 N
Q37B3 2.95 2.94 0.01 0.002 N
Q37B4 2.81 2.80 0.01 0.003 N
Q37B5 2.70 2.69 0.01 0.004 N
Q37C2 3.94 3.95 -0.01 -0.003 N
Q37C3 3.85 3.88 -0.03 -0.008 N
Q37C4 3.85 3.83 0.02 0.005 N
Q37C5 3.95 3.99 -0.04 -0.010 N
Q37D2 25.36 25.42 -0.06 -0.002 N
Q37D3 23.37 23.32 0.05 0.002 N
Q37D4 22.65 22.82 -0.17 -0.008 N
Q37D5 21.40 21.75 -0.36 -0.016 N
Q65 66.13 66.36 -0.23 -0.004 Y
Q66A 43,383.00 42,799.04 583.90 0.014 Y
Q66B 3,210.20 3,220.17 -9.93 -0.003 N
Q66C 2,631.30 2,819.71 -188.39 -0.067 Y
Q66D 2,602.80 2,509.66 93.17 0.037 Y
Q66E 11,247.00 10,760.89 486.00 0.045 Y
Q66F 6,480.60 6,499.25 -18.68 -0.003 N
Q69 1,921.30 1,978.28 -56.95 -0.029 N

NOTE: None of the items with less than 85 percent response rate were included in the abbreviated questionnaire, which consisted of items 1-28 and 71-81.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table I-8. Faculty item nonresponse bias analysis for categorical variables, before and after imputation: 2004

Item, Response category, Percent before imputation, Percent after imputation, Estimated bias, Relative bias, Significant

Q37E2 1 82.09 81.60 0.49 0.01 Y Q37E2 2 14.21 14.29 -0.08 -0.01 N Q37E2 3 3.70 4.11 -0.41 -0.10 Y

Q37E3 1 84.51 84.38 0.13 0.00 N Q37E3 2 12.58 12.42 0.16 0.01 N Q37E3 3 2.91 3.20 -0.29 -0.09 Y

Q37E4 1 88.25 87.78 0.47 0.01 N Q37E4 2 9.27 9.33 -0.06 -0.01 N Q37E4 3 2.48 2.89 -0.41 -0.14 Y

Q37E5 1 91.09 90.09 1.00 0.01 Y Q37E5 2 6.30 6.59 -0.29 -0.04 N Q37E5 3 2.61 3.32 -0.71 -0.21 Y

Q37F3 0 91.17 90.83 0.33 0.00 Y Q37F3 1 8.83 9.17 -0.33 -0.04 Y

Q37F4 0 92.03 91.94 0.08 0.00 N Q37F4 1 7.97 8.06 -0.08 -0.01 N

Q37F5 0 92.60 91.98 0.62 0.01 Y Q37F5 1 7.40 8.02 -0.62 -0.08 Y

Q56 1 42.74 42.78 -0.03 0.00 N Q56 2 21.82 21.50 0.32 0.01 N Q56 3 10.76 10.64 0.13 0.01 N Q56 4 15.42 15.56 -0.15 -0.01 N Q56 5 9.26 9.52 -0.26 -0.03 N

Q68 1 37.81 36.87 0.95 0.03 Y Q68 2 33.54 33.17 0.38 0.01 N Q68 3 8.92 9.36 -0.44 -0.05 Y Q68 4 19.73 20.61 -0.89 -0.04 Y

Q70B 1 3.01 2.71 0.30 0.11 Y Q70B 2 11.87 11.21 0.66 0.06 Y Q70B 3 22.09 21.54 0.55 0.03 Y Q70B 4 20.24 20.61 -0.37 -0.02 N Q70B 5 24.91 25.66 -0.75 -0.03 Y Q70B 6 9.55 9.91 -0.36 -0.04 Y Q70B 7 6.24 6.22 0.02 0.00 N Q70B 8 2.10 2.14 -0.04 -0.02 N
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

The above analyses were repeated for institution items as well, and the analogous results are summarized in tables I-9 and I-10. Again, for continuous variables, the estimated bias was calculated as the mean before imputation minus the mean after imputation, while for categorical variables, the estimated bias was computed for each category as the percentage of institutions in that category before imputation minus the corresponding percentage after imputation. The estimated bias was then tested (adjusting for multiple comparisons) to determine if the bias was significant at the 5 percent level.

Table I-9. Institution item nonresponse bias analysis for continuous variables, before and after imputation: 2004

Variable, Mean before imputation, Mean after imputation, Estimated bias, Relative bias, Significant

Q37A2 13.68 13.65 0.03 0.00 N
Q37A3 13.80 13.73 0.07 0.01 Y
Q37A4 13.73 13.64 0.10 0.01 Y
Q37A5 13.68 13.52 0.16 0.01 Y
Q37B2 3.05 3.06 0.00 0.00 N
Q37B3 2.95 2.94 0.01 0.00 N
Q37B4 2.81 2.80 0.01 0.00 N
Q37B5 2.70 2.69 0.01 0.00 N
Q37C2 3.94 3.95 -0.01 0.00 N
Q37C3 3.85 3.88 -0.03 -0.01 N
Q37C4 3.85 3.83 0.02 0.01 N
Q37C5 3.95 3.99 -0.04 -0.01 N
Q37D2 25.36 25.42 -0.06 0.00 N
Q37D3 23.37 23.32 0.05 0.00 N
Q37D4 22.65 22.82 -0.17 -0.01 N
Q37D5 21.40 21.75 -0.36 -0.02 N
Q65 66.13 66.36 -0.23 0.00 Y
Q66A 43383.00 42799.04 583.90 0.01 Y
Q66B 3210.20 3220.17 -9.93 0.00 N
Q66C 2631.30 2819.71 -188.39 -0.07 Y
Q66D 2602.80 2509.66 93.17 0.04 Y
Q66E 11247.00 10760.89 486.00 0.05 Y
Q66F 6480.60 6499.25 -18.68 0.00 N
Q69 1921.30 1978.28 -56.95 -0.03 N
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table I-10. Institution item nonresponse bias analysis for categorical variables, before and after imputation: 2004

Variable, Category, Percent before imputation, Percent after imputation, Estimated bias, Relative bias, Significant

Q15 1 70.23 68.96 1.27 0.02 Y Q15 2 10.17 11.67 -1.50 -0.13 Y Q15 3 19.60 19.37 0.23 0.01 N

Q37E2 1 82.09 81.60 0.49 0.01 Y Q37E2 2 14.21 14.29 -0.08 -0.01 N Q37E2 3 3.70 4.11 -0.41 -0.10 Y

Q37E3 1 84.51 84.38 0.13 0.00 N Q37E3 2 12.58 12.42 0.16 0.01 N Q37E3 3 2.91 3.20 -0.29 -0.09 Y

Q37E4 1 88.25 87.78 0.47 0.01 N Q37E4 2 9.27 9.33 -0.06 -0.01 N Q37E4 3 2.48 2.89 -0.41 -0.14 Y

Q37E5 1 91.09 90.09 1.00 0.01 Y Q37E5 2 6.30 6.59 -0.29 -0.04 N Q37E5 3 2.61 3.32 -0.71 -0.21 Y

Q37F2 0 88.48 88.04 0.44 0.01 Y Q37F2 1 11.52 11.96 -0.44 -0.04 Y

Q37F3 0 91.17 90.83 0.33 0.00 Y Q37F3 1 8.83 9.17 -0.33 -0.04 Y

Q37F4 0 92.03 91.94 0.08 0.00 N Q37F4 1 7.97 8.06 -0.08 -0.01 N

Q37F5 0 92.60 91.98 0.62 0.01 Y Q37F5 1 7.40 8.02 -0.62 -0.08 Y

Q56 1 42.74 42.78 -0.03 0.00 N Q56 2 21.82 21.50 0.32 0.01 N Q56 3 10.76 10.64 0.13 0.01 N Q56 4 15.42 15.56 -0.15 -0.01 N Q56 5 9.26 9.52 -0.26 -0.03 N

Q68 1 37.81 36.87 0.95 0.03 Y Q68 2 33.54 33.17 0.38 0.01 N Q68 3 8.92 9.36 -0.44 -0.05 Y Q68 4 19.73 20.61 -0.89 -0.04 Y

Q70B 1 3.01 2.71 0.30 0.11 Y Q70B 2 11.87 11.21 0.66 0.06 Y Q70B 3 22.09 21.54 0.55 0.03 Y Q70B 4 20.24 20.61 -0.37 -0.02 N Q70B 5 24.91 25.66 -0.75 -0.03 Y Q70B 6 9.55 9.91 -0.36 -0.04 Y Q70B 7 6.24 6.22 0.02 0.00 N Q70B 8 2.10 2.14 -0.04 -0.02 N

Q82D 1 58.09 57.76 0.33 0.01 N Q82D 2 33.87 34.09 -0.22 -0.01 N Q82D 3 6.06 6.16 -0.11 -0.02 N Q82D 4 1.99 1.99 0.00 0.00 N
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


I.5 Temporal Analysis

Additionally, potential bias due to nonresponse was assessed by comparing the data obtained from those who responded early in the field period with the data obtained from late respondents. Specifically, the pattern of mean response relative to the date on which surveys were secured was modeled for various faculty subdomains for each of the following institution sectors:

• two-year institutions;

• four-year doctoral institutions; and

• four-year non-doctoral institutions.

These response patterns were also modeled at the overall level, regardless of institution sector.

For this purpose, the field period was divided into the following nine mutually exclusive and exhaustive periods:

• before the early response incentive period ended—day 1 to day 28;

• day 29 to day 60;

• day 61 to day 90;

• day 91 to day 120;

• day 121 to day 150;

• day 151 to day 180;

• day 181 to day 210;

• day 211 to day 240; and

• beyond day 240—when the abbreviated questionnaire was administered after September 14, 2004.

Subsequently, the pattern of mean response was modeled for subdomains of faculty and instructional staff, by institution sector and overall, including:

• percentage of full-time faculty whose principal activity is teaching;

• percentage of full-time faculty whose highest degree is a Ph.D.;

• percentage of part-time faculty whose highest degree is a Ph.D.;

• mean age of full-time faculty;

• percentage of faculty responding via computer assisted telephone interview (CATI);

• percentage of faculty responding via the Web;

• percentage of male faculty; and

• percentage of female faculty.

The mean response rates remained essentially stable across time in virtually all cases; in particular, these rates displayed few or no fluctuations between the early and late field periods.
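
As a concrete illustration of this temporal analysis, the sketch below bins completed cases into the nine field-period intervals listed above and computes, for each cumulative cutoff, the unweighted percentage of responses received by that day that belong to a given subdomain, which approximates the cumulative quantities plotted in figures I-1 and I-2. The function and variable names are illustrative assumptions; the actual NSOPF:04 modeling and testing of these patterns was more elaborate.

```python
import numpy as np

# Upper day-count bounds of the nine field-period intervals listed above;
# np.inf stands in for the open-ended period after September 14, 2004.
MILESTONE_BOUNDS = [28, 60, 90, 120, 150, 180, 210, 240, np.inf]

def cumulative_subdomain_percent(days_to_response, in_subdomain):
    """For each cumulative cutoff, the unweighted percentage of completed
    cases received by that day that belong to the subdomain of interest."""
    days = np.asarray(days_to_response, dtype=float)
    flag = np.asarray(in_subdomain, dtype=bool)
    percents = []
    for bound in MILESTONE_BOUNDS:
        received = days <= bound
        percents.append(100.0 * flag[received].mean() if received.any() else np.nan)
    return percents
```

For instance, passing the number of days from the start of the field period to each completed interview together with an indicator such as "responded via CATI" would produce one cumulative series of the kind tabulated in table I-11.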


Table I-11 displays cumulative faculty response rates over time by faculty characteristics and institution type; table I-12 displays the mean age and income of responding faculty over time.

With the exception noted below, no significant differences were found at the nominal 5 percent level, suggesting that no discernible bias was introduced by securing survey responses late in the field period. The only significant differences were those comparing the percentages of faculty responding via CATI or the Web at the beginning of data collection with those at the last period. Figures I-1 and I-2 provide a visual summary of the results for CATI and Web responses; on each graph, the plotted lines reflect the cumulative mean response rates (unweighted) for the specific subdomain of faculty members, overall and by institution sector. Such differences are to be expected, however, because during the first part of the field period (day 1 to day 28) faculty members were offered incentives to complete the survey via the Web, and no outbound calls were made to secure responses by CATI.

Table I-11. Cumulative faculty response rates by faculty and institution characteristics, by date: 2004

Faculty/institution type; response time in days: <=28, <=60, <=90, <=120, <=150, <=180, <=210, <=240, SEP14+

Full-time faculty whose principal activity is teaching

2-year 82.1 84.8 82.2 83.9 87.5 93.8 92.3 100.0 88.6 Doctoral 51.8 53.1 50.3 47.9 48.4 39.7 47.6 22.2 52.1 Nondoctoral 80.0 78.3 78.4 83.5 70.4 61.1 100.0 100.0 82.9 Overall 63.5 65.4 62.6 62.2 61.1 53.3 71.4 63.2 67.1

Full-time faculty whose highest degree is a Ph.D.

2-year 53.9 44.2 34.5 28.2 26.5 12.9 24.5 20.0 34.9 Doctoral 83.1 76.3 70.1 67.8 61.5 52.7 53.8 52.9 68.0 Nondoctoral 74.5 60.9 55.0 55.2 45.4 32.7 34.8 60.0 57.7 Overall 74.8 63.4 56.0 52.8 45.5 31.8 36.5 40.4 55.6

Part-time faculty whose highest degree is a Ph.D.

2-year 46.1 55.8 65.5 71.8 73.5 87.1 75.5 80.0 65.1 Doctoral 16.9 23.7 29.9 32.2 38.5 47.3 46.2 47.1 32.0 Nondoctoral 25.5 39.1 45.0 44.8 54.6 67.3 65.2 40.0 42.3 Overall 25.2 36.6 44.0 47.2 54.5 68.2 63.5 59.6 44.4

Full-time faculty who teach undergraduates

2-year 97.4 97.0 95.3 99.1 98.2 100.0 100.0 100.0 96.4 Doctoral 67.7 67.0 65.7 64.9 60.6 62.1 71.4 55.6 74.7 Nondoctoral 93.9 93.2 95.3 93.7 92.6 88.9 100.0 100.0 85.8 Overall 78.8 79.2 78.0 77.3 75.1 73.9 85.7 78.9 81.7

Full-time faculty whose highest degree is a Ph.D.

2-year 18.6 18.3 14.2 14.3 16.1 12.5 15.4 25.0 17.5 Doctoral 71.2 67.6 65.3 65.1 62.6 70.7 57.1 33.3 65.6 Nondoctoral 64.7 61.7 58.2 62.7 55.6 61.1 87.5 83.3 57.1 Overall 61.6 56.5 54.3 56.1 51.3 58.7 50.0 47.4 54.4

Part-time faculty whose highest degree is a Ph.D.

2-year 9.8 8.4 6.2 9.8 11.0 13.0 10.0 18.8 9.7 Doctoral 30.0 31.5 27.4 28.0 21.6 15.4 27.8 25.0 35.1 Nondoctoral 21.5 19.9 20.7 22.7 24.6 16.2 26.7 75.0 25.6 Overall 20.1 18.4 16.4 18.3 17.0 14.2 17.8 28.6 21.8

Full-time faculty who are tenured

2-year 44.1 52.4 46.2 53.6 42.9 62.5 46.2 75.0 53.0 Doctoral 45.6 51.8 45.3 45.0 45.8 51.7 52.4 22.2 52.1 Nondoctoral 42.5 46.4 43.7 42.4 27.8 38.9 75.0 66.7 47.1 Overall 44.6 50.6 45.1 45.8 41.5 51.1 54.8 47.4 50.9

Full-time faculty who are Hispanic

2-year 9.6 12.2 15.4 10.7 14.3 12.5 7.7 9.6 12.2 Doctoral 5.1 4.6 4.6 7.6 5.8 6.9 0.0 5.1 4.6 Nondoctoral 4.1 4.8 7.0 6.3 3.7 0.0 0.0 4.1 4.8 Overall 5.5 6.2 7.2 7.8 7.2 6.5 2.4 5.5 6.2

Part-time faculty who are Hispanic

2-year 10.8 12.4 12.8 11.2 5.2 9.3 10.0 10.8 12.4 Doctoral 6.0 7.6 8.3 8.8 8.2 3.8 5.6 6.0 7.6 Nondoctoral 4.8 5.4 1.7 3.9 7.7 2.7 6.7 4.8 5.4 Overall 7.5 9.1 8.7 8.9 6.6 6.6 8.2 7.5 9.1

Full-time faculty who are Asian

2-year 5.7 4.9 6.5 3.6 10.7 6.3 7.7 25.0 4.2 Doctoral 8.0 7.5 8.5 9.6 9.0 10.3 14.3 0.0 11.2 Nondoctoral 5.6 4.3 4.7 6.3 7.4 0.0 12.5 0.0 5.0 Overall 7.0 6.3 7.3 7.8 9.1 7.6 11.9 5.3 8.2

Part-time faculty who are Asian

2-year 3.4 5.2 5.9 5.6 7.7 3.7 2.5 12.5 3.9 Doctoral 5.1 5.4 3.9 7.3 6.2 0.0 0.0 0.0 7.4 Nondoctoral 3.6 2.3 5.7 3.9 6.2 2.7 0.0 0.0 5.1 Overall 4.1 4.5 5.2 5.8 6.9 2.5 1.4 7.1 5.3

Full-time faculty who are Black

2-year 12.9 19.0 23.7 18.8 19.6 25.0 7.7 25.0 17.5 Doctoral 3.7 6.0 7.8 6.1 6.5 3.4 4.8 0.0 8.4 Nondoctoral 4.3 9.4 11.3 8.2 5.6 16.7 25.0 0.0 6.7 Overall 5.3 9.4 11.5 8.7 9.1 9.8 9.5 5.3 9.6

Part-time faculty who are Black

2-year 15.9 22.2 16.8 18.6 22.6 25.9 27.5 31.3 24.6 Doctoral 3.4 7.6 5.2 7.3 4.1 3.8 16.7 12.5 10.0 Nondoctoral 6.3 10.6 8.6 3.9 10.8 18.9 13.3 0.0 8.5 Overall 9.0 14.7 11.2 11.9 14.5 18.8 21.9 21.4 15.9

Full-time faculty holding a position outside of a postsecondary institution

2-year 18.6 15.7 18.9 27.7 14.3 6.3 7.7 0.0 25.9 Doctoral 7.5 8.2 9.3 7.1 9.7 15.5 4.8 11.1 9.6 Nondoctoral 12.2 13.9 15.5 13.9 9.3 33.3 25.0 0.0 14.6 Overall 10.4 11.1 12.5 12.1 10.6 17.4 9.5 5.3 13.9

Faculty responding via CATI

2-year 4.6 60.1 56.0 42.2 24.2 28.2 38.5 47.1 28.9 Doctoral 6.1 65.0 56.6 44.1 44.5 50.9 47.8 60.0 38.0 Nondoctoral 6.2 63.4 56.4 43.8 38.3 40.5 48.7 66.0 34.3 Overall 10.1 67.4 56.7 46.1 51.7 46.8 56.6 85.0 39.2

Faculty responding via web

2-year 89.9 32.6 43.3 53.9 48.3 53.2 43.4 15.0 60.8 Doctoral 95.4 39.9 44.0 57.8 75.8 71.8 61.5 52.9 71.1 Nondoctoral 93.9 35.0 43.4 55.9 55.5 49.1 52.2 40.0 62.0 Overall 93.8 36.6 43.6 56.2 61.7 59.5 51.3 34.0 65.7

Male faculty

2-year 50.8 50.4 50.0 47.6 48.3 49.2 47.2 60.0 41.9 Doctoral 59.3 63.5 57.9 56.7 60.7 57.3 35.9 52.9 59.1 Nondoctoral 54.4 55.8 51.2 53.1 53.8 52.7 43.5 40.0 55.5 Overall 56.3 57.9 53.9 53.1 54.8 52.9 42.6 53.2 53.1

Female faculty

2-year 49.2 49.6 50.0 52.4 51.7 50.8 52.8 40.0 58.1 Doctoral 40.7 36.5 42.1 43.3 39.3 42.7 64.1 47.1 40.9 Nondoctoral 45.6 44.2 48.8 46.9 46.2 47.3 56.5 60.0 44.5 Overall 43.7 42.1 46.1 46.9 45.2 47.1 57.4 46.8 46.9

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Table I-12. Mean age and income of faculty response, based on faculty and institution type, by date: 2004

Faculty/institution type; response time in days: <=28, <=60, <=90, <=120, <=150, <=180, <=210, <=240, SEP14+

Mean age of full-time faculty

2-year 49 51 50 49 49 53 53 51 49 Doctoral 49 51 50 50 49 48 49 48 51 Nondoctoral 49 51 50 49 49 53 54 51 51 Overall 49 51 50 50 49 50 51 50 51

Mean age of part-time faculty

2-year 48 49 48 49 48 48 46 48 48 Doctoral 49 51 48 49 48 48 51 43 51 Nondoctoral 49 50 50 50 48 51 48 59 49 Overall 49 50 49 49 48 49 48 48 49

Mean income of full-time faculty

2-year $50,755 $52,894 $53,010 $50,781 $50,922 $52,252 $59,068 $58,000 $54,615 Doctoral 73,917 76,051 76,958 79,292 84,715 81,778 77,522 106,238 68,487 Nondoctoral 51,697 53,476 53,365 51,533 55,477 56,922 57,645 68,975 62,256 Overall 64,825 66,070 67,107 68,097 71,616 71,780 68,024 84,315 64,253

Mean income of part-time faculty

2-year 9,047 8,539 9,005 10,171 8,107 8,786 10,139 15,681 10,610 Doctoral 15,800 13,325 15,330 15,484 12,566 7,656 8,196 11,175 13,440 Nondoctoral 9,493 8,448 9,852 11,472 7,352 9,828 15,847 9,375 12,679 Overall 11,598 9,961 11,215 12,138 9,316 8,683 10,833 13,493 12,032

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Figure I-1. Cumulative percentage of faculty responding via CATI, by selected types of institutions and response time

[Line graph: y-axis shows percent (0 to 100); x-axis shows response time in days (<=28, <=60, <=90, <=120, <=150, <=180, <=210, <=240, SEP14+); separate lines plot 2-year, doctoral, and nondoctoral institutions.]

NOTE: The overall percentage is not included in this figure because it is indistinguishable from the nondoctoral percentage. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Figure I-2. Cumulative percentage of faculty responding via Web, by selected types of institutions and response time: 2004

[Line graph: y-axis shows percent (0 to 100); x-axis shows response time in days (<=28, <=60, <=90, <=120, <=150, <=180, <=210, <=240, SEP14+); separate lines plot 2-year, doctoral, and nondoctoral institutions.]

NOTE: The overall percentage is not included in this figure because it is indistinguishable from the nondoctoral percentage. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Appendix J CIP Code Mapping


Table J-1. CIP code mapping

Label1, 1988 codes, 1993 codes, 1999 codes, 2004 general code, 2004 specific code, CIP

Agriculture, natural resources and related sciences

Agriculture and related sciences Natural resources and conservation

001,002004,003

101,102110,103

101,102110,103

1

101 102

01 03

Architecture and related services

Architecture and related services

005-009

121-130

121-130 2

201 04 04

Area, ethnic, cultural, and gender studies

Area, ethnic, cultural, and gender studies

113

544

544 3

301 05 05

Arts – visual and performing

Art history, criticism, and conservation Commercial and advertising art Dance Design and applied arts Drama/theatre arts and stagecraft Film/video and photographic arts Fine and studio art Music, general Music history, literature, and theory Visual and performing arts, other

010

012 013 014 015 016 017 018

011,019

141

143 144 145 146 147 148 149

142,150

141

143 144 145 146 147 148 149

142,150

4

401 408 409 402 403 410 404 405 406 407

50.0 50.0703 50.0402 50.03 50.04 50.05 50.06 50.0702 50.0901 50.0902 50.99

Biological and biomedical sciences

Biochemistry, biophysics and molecular biology Botany/plant biology Genetics Microbiological sciences and immunology Physiology, pathology, and related sciences Zoology/animal biology Biological and biomedical sciences, other

094

098 099

091,093,100

391 393 394

395,396 397 398

392,400

391 393 394

395,396 397 398

392,400

5

501 502 503 504 505 506 507

26 26.02 26.03 26.08 26.05 26.09 26.07 26.99

Business, management, marketing, and related

support services Accounting and related services Business administration, management, and

operations Business operations support and assistant

services Finance and financial management services Human resources management and services Management information systems and services Marketing Business, management, marketing, and related

support services, other

020 022

023

021

024,025

026 027

161 163

164

162

165,166

167 170

161 163

164

162

165,166

167 170

6

601 602

603

604 605 608 606 607

52

52.03 52.02

52.04

52.08 52.10 52.12 52.14 52.99

Communication, journalism, communication

technologies, and related programs Communication, journalism, and related

programs Communication technologies/technicians and

support services

028-030,032

031

181-183,190

184

181-183,190

184

7

701

702

09

10

Computer and information sciences and support

services Computer/information technology administration

and management Computer programming Computer science Computer software and media applications Computer systems analysis Computer systems networking and

telecommunications Data entry/microcomputer applications Data processing Information science/studies Computer and information sciences and support

services, other

033

034

036

035

037

201

202

204

203

210

201

202

204

203

210

8

801

802 803 804 805 806

807 808 809 810

11

11.10

11.02 11.07 11.08 11.05 11.09

11.06 11.03 11.04 11.99

Construction trades

Construction trades

122-125

601-610

601-610 9

901 46 46

Education Bilingual, multilingual, and multicultural education Curriculum and instruction Educational administration and supervision Educational assessment, evaluation, and research Educational/instructional media design Higher education/higher education administration Special education and teaching Student counseling and personnel services Education, other Teacher education: Early childhood education and

teaching Teacher education: Elementary education and

teaching Teacher education: Secondary education and

teaching Teacher education: Adult and continuing

education and teaching Teacher education: Specific levels, other Teacher education: Specific subject areas

040 041 042 043

045 046

038,039044,047

048

049

050

051

052 053

223 224 225 226

228 229

221,222227,230

241

242

243

244

245 250

223 224 225 226

228 229 230

221,222227,231

241

242

243

244

245 250

10

1013 1001 1002 1014 1003 1015 1004 1005 1006 1007

1008

1009

1010

1011 1012

13 13.02 13.03 13.04 13.06 13.05 13.0406 13.10 13.11 13.99 13.1210

13.1202

13.1205

13.1201

13.1299 13.13

Engineering, engineering technologies/technicians Biomedical/medical engineering Chemical engineering Civil engineering Computer engineering Electrical, electronics, and communications

engineering Engineering technologies/technicians Environmental/environmental health engineering Mechanical engineering Engineering, other

055

056

059

507 054,058

265 262

263

280

264

261,270

265 262

263

280

264

261,270

11

1101 1102 1103 1104 1105

1106 1107 1108 1109

14.05 14.07 14.08 14.09 14.10

15 14.14 14.19 14.99

English language and literature/letters English language and literature/letters

060-067

291-300

291-300

12

1201

23 23

Family and consumer sciences/human sciences Family and consumer sciences/human sciences

087

350

350

13

1301

19 19

Foreign languages, literatures, and linguistics

Foreign languages, literatures, and linguistics

068-077

311-320

311-320 14

1401 16 16

Health professions and related clinical sciences

Alternative and complementary medicine and medical systems

Chiropractic Clinical/medical laboratory science and allied

professions Dental support services and allied professions Dentistry Health and medical administrative services Allied health and medical assisting services Allied health diagnostic, intervention and

treatment professions Medicine, including psychiatry Mental and social health services and allied

professions Nursing Optometry Osteopathic medicine/osteopathy Pharmacy, pharmaceutical sciences, and

administration Podiatric medicine/podiatry Public health Rehabilitation and therapeutic professions Veterinary medicine Health professions and related clinical services,

other

079 080 078

081

082

083

084

085 086

332 333 331

334

335

336

337

338 340

332 333 331

334

335

336

337

338 340

15

1501

1502 1503

1504 1505 1506 1507 1508

1509 1510

1511 1512 1513 1514

1515 1516 1517 1518 1519

51 51.33

51.01 51.10

51.06 51.04 51.07 51.08 51.09

51.12 51.15

51.16 51.17 51.19 51.20

51.21 51.22 51.23 51.24 51.99

Legal professions and studies

Law Legal support services Legal professions and studies, other

089

370

370

16

1601 1602 1603

22 22.01 22.03 22.99

Library science

Library science

090

380

380 17

1701 25 25

Mathematics and statistics

Mathematics Statistics

101 430 440

390 18

1801 1802

27 27.01 27.05

Mechanical and repair technologies/technicians Mechanical and repair technologies/technicians

128-131

641-644

641-644

19

1901

47 47

Multi/interdisciplinary studies Multi/interdisciplinary studies

103

460

20

2001

30 30

Parks, recreation, leisure, and fitness studies Parks, recreation and leisure studies Health and physical education/fitness

104 470 430 470

21

2101 2102

31 31.01 31.05

Precision production Precision production

132-137

661-670

661-670

22

2201

48 48

Personal and culinary services Culinary arts and related services Personal services, other

126,127

621,630

621,630

23

2301 2302

12 12.05 12.99

Philosophy, religion, and theology

Philosophy Religion/religious studies Theology and religious vocations

105 480 480 490

440 441 442

24

2401 2402 2403

38.01 38.02 39

Physical sciences Astronomy and astrophysics Atmospheric sciences and meteorology Chemistry Geological and earth sciences/geosciences Physics Physical sciences, other

092

095 096 097 100

411

412 414 413 420

411

412 414 413 420

25

2501 2502 2503 2504 2505 2506

40 40.02 40.04 40.05 40.06 40.08 40.99

Psychology

Behavioral psychology Clinical psychology Education/school psychology Psychology, other

106 510 510 26

2601 2602 2603 2604

42 42.17 42.02 42.18 42.99

Public administration and social service professions

Public administration Social work Public administration and social service

professions, other

108 520 520 27

2701 2702 2703

44 44.04 44.07 44.99

Science technologies/technicians

Science technologies/technicians

109

530

530 28

2801 41 41

Security and Protective services

Corrections Criminal justice Fire protection Police science Security and protective services, other

107 500 500 29

2901 2902 2903 2904 2905

43 43.0102 43.0104 43.02 43.0107 43.99

Social sciences and history (except psychology)

Anthropology Archeology Criminology Demography and population studies Economics Geography and cartography History International relations and affairs Political science and government Sociology Urban studies/affairs Social sciences, other (except psychology)

111 112

114 115 116 117 118 119 120

110,121

542 543

545 546 547 548 549 550 551

541,560

542 543

545 546 547 548 549 550 551

541,560

30

3001 3002 3003 3004 3005 3006 3007 3008 3009 3010 3011 3012

45.02 45.03 45.04 45.05 45.06 45.07 54.01 45.09 45.10 45.11 45.12 45.99

Transportation and materials moving

Transportation and materials moving

138-141

681-690

681-690 31

3101 49 49

Other

Other

102,999

450,900

900 32

3201

99.99

1 The general categories used in the 2004 coding scheme are those on the left margin. The specific disciplines within each general category are indented.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Appendix K Analysis Variables

Institution
Faculty


Institution


Table K-1. Institution Variables

Variable name Prefix Label
I1A Employment_Employer Number full-time faculty, fall 2003, reported
I1B Employment_Employer Number part-time faculty, fall 2003
I2A Employment_Employer Full-time numbers: faculty, fall 2002
I2B Employment_Employer Full-time numbers: changed from part to full time, 2002-03
I2C Employment_Employer Full-time numbers: hired, 2002-03
I2D Employment_Employer Full-time numbers: retired, 2002-03
I2E Employment_Employer Full-time numbers: left for other reasons, 2002-03
I2F Employment_Employer Full-time numbers: changed from full to part time, 2002-03
I2G Employment_Employer Full-time numbers: faculty, fall 2003, calculated
I3 Employment_Employer Full-time tenure: has tenure system
I4 Employment_Employer Full-time tenure: number considered for tenure, 2002-03
I5 Employment_Employer Full-time tenure: number granted tenure, 2002-03
I6 Employment_Employer Full-time tenure: maximum years on tenure track
I7A Employment_Employer Full-time tenure: changed tenure policy
I7B Employment_Employer Full-time tenure: more stringent tenure standards
I7C Employment_Employer Full-time tenure: downsized tenured faculty
I7D Employment_Employer Full-time tenure: replaced tenured with fixed term
I7E Employment_Employer Full-time tenure: offered early retirement
I7E2 Employment_Employer Full-time tenure: number early retirees, last 5 years
I8 Employment_Employer Full-time tenure: discontinued tenure system, last 5 years
I9 Employment_Employer Full-time faculty: positions sought to fill, fall 2003
I10AA Employment_Benefits Full-time benefit: medical insurance
I10AB Employment_Benefits Full-time benefit: dental insurance
I10AC Employment_Benefits Full-time benefit: disability insurance
I10AD Employment_Benefits Full-time benefit: life insurance
I10AE Employment_Benefits Full-time benefit: child care
I10AF Employment_Benefits Full-time benefit: retiree medical insurance
I10AG Employment_Benefits Full-time benefit: cafeteria-style plan
I10BA Employment_Benefits Full-time benefit: medical insurance subsidized
I10BB Employment_Benefits Full-time benefit: dental insurance subsidized
I10BC Employment_Benefits Full-time benefit: disability insurance subsidized
I10BD Employment_Benefits Full-time benefit: life insurance subsidized
I10BE Employment_Benefits Full-time benefit: child care subsidized
I10BF Employment_Benefits Full-time benefit: retiree medical insurance subsidized
I10BG Employment_Benefits Full-time benefit: cafeteria-style plan subsidized
I11A Employment_Benefits Full-time benefit: wellness program
I11B Employment_Benefits Full-time benefit: spouse tuition remission
I11C Employment_Benefits Full-time benefit: children tuition remission
I11D Employment_Benefits Full-time benefit: housing
I11E Employment_Benefits Full-time benefit: transportation/parking
I11F Employment_Benefits Full-time benefit: paid maternity leave
I11G Employment_Benefits Full-time benefit: paid paternity leave
I11H Employment_Benefits Full-time benefit: paid sabbatical leave
I11I Employment_Benefits Full-time benefit: employee assistance program

See notes at end of table.

Page 344: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-6

Table K-1. Institution Variables—Continued

Variable name Prefix Label I12 Employment_Description Full-time faculty: union representation I13A Employment_Employer Full-time assessment: student evaluations I13B Employment_Employer Full-time assessment: student test scores I13C Employment_Employer Full-time assessment: student career placement I13D Employment_Employer Full-time assessment: other student performance I13E Employment_Employer Full-time assessment: department chair evaluations I13F Employment_Employer Full-time assessment: dean evaluations I13G Employment_Employer Full-time assessment: peer evaluations I13H Employment_Employer Full-time assessment: self-evaluations I14 Employment_Benefits Part-time benefit: retirement plan I15AA Employment_Benefits Part-time benefit: medical insurance I15AB Employment_Benefits Part-time benefit: dental insurance I15AC Employment_Benefits Part-time benefit: disability insurance I15AD Employment_Benefits Part-time benefit: life insurance I15AE Employment_Benefits Part-time benefit: child care I15AF Employment_Benefits Part-time benefit: retiree medical insurance I15AG Employment_Benefits Part-time benefit: cafeteria-style plan I15BA Employment_Benefits Part-time benefit: medical insurance subsidized I15BB Employment_Benefits Part-time benefit: dental insurance subsidized I15BC Employment_Benefits Part-time benefit: disability insurance subsidized I15BD Employment_Benefits Part-time benefit: life insurance subsidized I15BE Employment_Benefits Part-time benefit: child care subsidized I15BF Employment_Benefits Part-time benefit: retiree medical insurance subsidized I15BG Employment_Benefits Part-time benefit: cafeteria-style plan subsidized I16A Employment_Benefits Part-time benefit: wellness program I16B Employment_Benefits Part-time benefit: spouse tuition remission I16C Employment_Benefits Part-time benefit: children tuition remission I16D Employment_Benefits Part-time benefit: housing I16E Employment_Benefits Part-time benefit: transportation/parking I16F Employment_Benefits Part-time benefit: paid maternity leave I16G Employment_Benefits Part-time benefit: paid paternity leave I16H Employment_Benefits Part-time benefit: paid sabbatical leave I16I Employment_Benefits Part-time benefit: employee assistance program I17 Employment_Description Part-time faculty: union representation I18A Employment_Employer Part-time assessment: student evaluations I18B Employment_Employer Part-time assessment: student test scores I18C Employment_Employer Part-time assessment: student career placement I18D Employment_Employer Part-time assessment: other student performance I18E Employment_Employer Part-time assessment: department chair evaluations I18F Employment_Employer Part-time assessment: dean evaluations I18G Employment_Employer Part-time assessment: peer evaluations I18H Employment_Employer Part-time assessment: self-evaluations I19A Employment_Employer Undergraduate instruction: percent full-time faculty I19B Employment_Employer Undergraduate instruction: percent part-time faculty

See notes at end of table.

Page 345: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-7

Table K-1. Institution Variables—Continued

Variable name Prefix Label I19C Employment_Employer Undergraduate instruction: percent teaching assistants I19D Employment_Employer Undergraduate instruction: percent other X01I1 Employment_Employer Number total faculty, fall 2003 X01I2 Employment_Employer Full-time numbers: total left 2002-03 X02I2 Employment_Employer Full-time numbers: total new 2002-03 X01I7 Employment_Employer Full-time tenure: any action taken last 5 years X01I12 Employment_Description Any faculty represented by a union X01I13 Employment_Employer Full-time assessment: any student measure X02I13 Employment_Employer Full-time assessment: any administrative measure X03I13 Employment_Employer Full-time assessment, student evaluations, DK with No X04I13 Employment_Employer Full-time assessment, student test scores, DK with No X05I13 Employment_Employer Full-time assessment, student career placement, DK with No X06I13 Employment_Employer Full-time assessment, other measure of stud perf, DK with No X07I13 Employment_Employer Full-time assessment, dept/division chair eval, DK with No X08I13 Employment_Employer Full-time assessment, dean evaluations, DK with No X09I13 Employment_Employer Full-time assessment, peer evaluations, DK with No X10I13 Employment_Employer Full-time assessment, self evaluations, DK with No X11I13 Employment_Employer Full-time assessment, any student measure, DK with No X12I13 Employment_Employer Full-time assessment, any admin measure, DK with No X01I18 Employment_Employer Part-time assessment: any student measure X02I18 Employment_Employer Part-time assessment: any administrative measure X03I18 Employment_Employer Part-time assessment, student evaluations, DK with No X04I18 Employment_Employer Part-time assessment, student test scores, DK with No X05I18 Employment_Employer Part-time assessment, student career placement, DK with No X06I18 Employment_Employer Part-time assessment, other measure of stud perf, DK with No X07I18 Employment_Employer Part-time assessment, dept/division chair eval, DK with No X08I18 Employment_Employer Part-time assessment, dean evaluations, DK with No X09I18 Employment_Employer Part-time assessment, peer evaluations, DK with No X10I18 Employment_Employer Part-time assessment, self evaluations, DK with No X11I18 Employment_Employer Part-time assessment, any student measure, DK with No X12I18 Employment_Employer Part-time assessment, any admin measure, DK with No X01Q0 Institution_Type 1994 Carnegie (6 cat, all liberal arts) by control, selected cats X02Q0 Institution_Type 1994 Carnegie (6 cat, private liberal arts) by control, selected cats X03Q0 Institution_Type 1994 Carnegie (5 category) by control, selected categories X04Q0 Institution_Type 1994 Carnegie (8 category) by control X05Q0 Institution_Type 1994 Carnegie (10 category), separates I/II X06Q0 Institution_Type 1994 Carnegie, 4-year versus 2-year X08Q0 Institution_Type 1994 Carnegie, doctoral/nondoctoral/2-yr by control X09Q0 Institution_Other Degree of urbanization X10Q0 Institution_Other Ratio of FTE enrollment/FTE faculty X11Q0 Institution_Other Enrollment, undergraduate X12Q0 Institution_Other Enrollment, undergraduate, collapsed X13Q0 Institution_Other Enrollment FTE, undergraduate X14Q0 Institution_Other Enrollment FTE, undergraduate, collapsed

See notes at end of table.

Page 346: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-8

Table K-1. Institution Variables—Continued

Variable name Prefix Label X15Q0 Institution_Other Enrollment, first-professional X16Q0 Institution_Other Enrollment, first-professional, collapsed X17Q0 Institution_Other Enrollment FTE, first-professional X18Q0 Institution_Other Enrollment FTE, first-professional, collapsed X19Q0 Institution_Other Enrollment, graduate X20Q0 Institution_Other Enrollment, graduate, collapsed X21Q0 Institution_Other Enrollment FTE, graduate X22Q0 Institution_Other Enrollment FTE, graduate, collapsed X23Q0 Institution_Other Enrollment, total X24Q0 Institution_Other Enrollment, total, collapsed X25Q0 Institution_Other Enrollment FTE, total X26Q0 Institution_Other Enrollment FTE, total, collapsed X27Q0 Institution_Other Enrollment minority, American Indian/Alaska Native X28Q0 Institution_Other Enrollment minority, Asian/Pacific Islander X29Q0 Institution_Other Enrollment minority, Black non-Hispanic X30Q0 Institution_Other Enrollment minority, Hispanic X31Q0 Institution_Other Core expenses, instruction (in 1000’s) X32Q0 Institution_Other Core expenses, instruction, collapsed X33Q0 Institution_Other Core expenses, research (in 1000’s) X34Q0 Institution_Other Core expenses, research, collapsed X35Q0 Institution_Other Core expenses, total (in 1000’s) X36Q0 Institution_Other Core expenses, total, collapsed X37Q0 Institution_Other Region where institution located X38Q0 Institution_Type 1994 Carnegie, doctoral/nondoctoral/2-year X50Q0 Institution_Other Percent of full-time faculty covered for retirement X51Q0 Institution_Other Average expenditure per faculty member covered for retirement X100Q0 Institution_Type 2000 Carnegie Code, all categories X101Q0 Institution_Type Control, public versus private not-for-profit X102Q0 Institution_Type Level 4-year versus 2-year X103Q0 Institution_Type Control and level X104Q0 Institution_Type 2000 Carnegie code, 10 category X105Q0 Institution_Type 2000 Carnegie code, 9 category X106Q0 Institution_Type 2000 Carnegie code, 7 category X107Q0 Institution_Type 2000 Carnegie code, 5 category X109Q0 Institution_Type 2000 Carnegie, doctoral/nondoctoral X110Q0 Institution_Type 2000 Carnegie code (10 category) by control X111Q0 Institution_Type 2000 Carnegie, 2-year versus 4-year X112Q0 Institution_Type 2000 Carnegie, doctoral/nondoctoral by control X113Q0 Institution_Type 2000 Carnegie, 2-year/4-year by control X120Q0 Institution_Type 2000 Carnegie code (5 category) by control X121Q0 Institution_Type 2000 Carnegie code (5 category) by control, selected categories SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Page 347: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-9

Faculty

Page 348: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006
Page 349: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-11

Table K-2. Faculty Variables

Variable name Prefix Label Q1 Employment_Description Instructional duties, any Q2 Employment_Description Instructional duties related to credit courses/activities Q3 Employment_Description Faculty status Q4 Employment_Description Principal activity Q5 Employment_Description Employed full or part time at this institution Q6 Employment_Description Part-time employment is primary employment Q8 Employment_Description Part-time but preferred full-time position Q9 Employment_Description Year began current job Q10 Employment_Description Rank Q11 Employment_Description Rank, year attained professor or associate professor Q12 Employment_Description Tenure status Q13 Employment_Description Tenure, year attained at any postsecondary institution Q14 Employment_Description Union status Q15 Employment_Description Union status, reason not a member Q16CD2 Employment_Description Principal field of teaching-general code Q16CD4 Employment_Description Principal field of teaching-specific code Q17A1 Education_Attainment Highest degree Q17A2 Education_Attainment Highest degree, date awarded Q17A3C2 Education_Attainment Highest degree field-general code Q17A3C4 Education_Attainment Highest degree field-specific code Q17A4ST Education_Attainment Highest degree institution-state Q17A4I Education_Attainment Highest degree institution-IPEDS Q17A4LEV Education_Attainment Highest degree institution, level Q17A4CN Education_Attainment Highest degree institution, control Q17A4CRN Education_Attainment Highest degree institution, 2000 Carnegie (4 cat) by control,

selected Q17A4CC Education_Attainment Highest degree institution, 2000 Carnegie code, 17 category Q17D1 Education_Attainment Bachelor’s degree date awarded Q18 Employment_Description Other current jobs, number of jobs Q19A1 Employment_Description Other current jobs, full-time employment Q19B1 Employment_Description Other current jobs, number in postsecondary instruction Q21 Employment_History First postsecondary job, current job is first Q23 Employment_History First postsecondary job, year began Q24 Employment_History First postsecondary job, part or full time Q26 Employment_History First postsecondary job, tenure status Q27 Employment_History Other jobs, any outside postsecondary since degree Q28 Employment_History Other jobs, sector of previous job Q31A Employment_Time Allocation Hours per week on paid tasks at institution Q31B Employment_Time Allocation Hours per week on unpaid tasks at institution Q31C Employment_Time Allocation Hours per week on paid tasks outside of institution Q31D Employment_Time Allocation Hours per week on unpaid tasks outside of institution Q32A Employment_Time Allocation Percent time spent on instruction, undergraduate Q32B Employment_Time Allocation Percent time spent on instruction, graduate/first-professional

See notes at end of table.

Page 350: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-12

Table K-2. Faculty Variables—Continued

Variable name Prefix Label Q32C Employment_Time Allocation Percent time spent on research activities Q32D Employment_Time Allocation Percent time spent on other unspecified activities Q35A1 Instruction_Classroom_Classes Number of classes taught, credit Q35A2 Instruction_Classroom_Classes Number of classes taught, noncredit Q35B Instruction_Classroom_Classes Number of classes taught, remedial Q35C Instruction_Classroom_Classes Number of classes taught, distance education Q36 Instruction_Classroom_Assistants Teaching assistant in any credit class Q37A1 Instruction_Classroom_Weeks Number of weeks taught, 1st credit class Q37B1 Instruction_Classroom_Hours Number of credit hours, 1st class Q37C1 Instruction_Classroom_Hours Number of hours taught per week, 1st class Q37D1 Instruction_Classroom_Students Number of students, 1st class Q37E1 Instruction_Level Primary level of students, 1st class Q37F1 Instruction_Classroom_Assistants Teaching assistant, 1st class Q37A2 Instruction_Classroom_Weeks Number of weeks taught, 2nd credit class Q37B2 Instruction_Classroom_Hours Number of credit hours, 2nd class Q37C2 Instruction_Classroom_Hours Number of hours taught per week, 2nd class Q37D2 Instruction_Classroom_Students Number of students, 2nd class Q37E2 Instruction_Level Primary level of students, 2nd class Q37F2 Instruction_Classroom_Assistants Teaching assistant, 2nd class Q37A3 Instruction_Classroom_Weeks Number of weeks taught, 3rd credit class Q37B3 Instruction_Classroom_Hours Number of credit hours, 3rd class Q37C3 Instruction_Classroom_Hours Number of hours taught per week, 3rd class Q37D3 Instruction_Classroom_Students Number of students, 3rd class Q37E3 Instruction_Level Primary level of students, 3rd class Q37F3 Instruction_Classroom_Assistants Teaching assistant, 3rd class Q37A4 Instruction_Classroom_Weeks Number of weeks taught, 4th credit class Q37B4 Instruction_Classroom_Hours Number of credit hours, 4th class Q37C4 Instruction_Classroom_Hours Number of hours taught per week, 4th class Q37D4 Instruction_Classroom_Students Number of students, 4th class Q37E4 Instruction_Level Primary level of students, 4th class Q37F4 Instruction_Classroom_Assistants Teaching assistant, 4th class Q37A5 Instruction_Classroom_Weeks Number of weeks taught, 5th credit class Q37B5 Instruction_Classroom_Hours Number of credit hours, 5th class Q37C5 Instruction_Classroom_Hours Number of hours taught per week, 5th class Q37D5 Instruction_Classroom_Students Number of students, 5th class Q37E5 Instruction_Level Primary level of students, 5th class Q37F5 Instruction_Classroom_Assistants Teaching assistant, 5th class Q38A Instruction_Methods Undergrad class, multiple choice midterm/final exams Q38B Instruction_Methods Undergrad class, essay midterm/final exams Q38C Instruction_Methods Undergrad class, short answer midterm/final exams Q38D Instruction_Methods Undergrad class, term/research papers Q38E Instruction_Methods Undergrad class, multiple drafts of written work Q38F Instruction_Methods Undergrad class, oral presentations Q38G Instruction_Methods Undergrad class, group projects

See notes at end of table.

Page 351: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-13

Table K-2. Faculty Variables—Continued

Variable name Prefix Label Q38H Instruction_Methods Undergrad class, student evaluations of each others’ work Q38I Instruction_Methods Undergrad class, laboratory/shop/studio assignments Q38J Instruction_Methods Undergrad class, service learn/co-op interactions with

business Q39 Technology_Use Website for any instructional duties Q41 Technology_Use Hours per week, e-mailing students Q46 Instruction_Individual Individual instruction, any Q47A1 Instruction_Individual Individual instruction, number undergraduate students Q47A2 Instruction_Individual Individual instruction, number graduate students Q47A3 Instruction_Individual Individual instruction, number first-professional students Q47B1 Instruction_Individual Individual instruction, hours with undergraduates Q47B2 Instruction_Individual Individual instruction, hours with graduate students Q47B3 Instruction_Individual Individual instruction, hours with first-professional students Q48 Instruction_Individual Hours per week, thesis/dissertation committees Q49 Employment_Time Allocation Hours per week, administrative committees Q50 Employment_Time Allocation Hours per week, with advisees Q51 Employment_Time Allocation Hours per week, office hours Q52AA Scholarship_Publications Career articles, refereed journals Q52AB Scholarship_Publications Career articles, nonrefereed journals Q52AC Scholarship_Publications Career book reviews, chapters, creative works Q52AD Scholarship_Publications Career books, textbooks, reports Q52AE Scholarship_Publications Career presentations Q52AF Scholarship_Publications Career exhibitions, performances Q52AG Scholarship_Publications Career patents, computer software Q52BA Scholarship_Publications Recent articles, refereed journals Q52BB Scholarship_Publications Recent articles, nonrefereed journals Q52BC Scholarship_Publications Recent book reviews, chapters, creative works Q52BD Scholarship_Publications Recent books, textbooks, reports Q52BE Scholarship_Publications Recent presentations Q52BF Scholarship_Publications Recent exhibitions, performances Q52BG Scholarship_Publications Recent patents, computer software Q53 Scholarship_Research Scholarly activity, any Q54CD2 Scholarship_Research Principal research field-general code Q54CD4 Scholarship_Research Principal research field-specific code Q56 Scholarship_Research Scholarly activity, description Q55 Scholarship_Research Scholarly activity, any funded Q61A Employment_Satisfaction Satisfaction with authority to make decisions Q61B Employment_Satisfaction Satisfaction with technology-based activities Q61C Employment_Satisfaction Satisfaction with equipment/facilities Q61D Employment_Satisfaction Satisfaction with institutional support for teaching

improvement Q62A Employment_Satisfaction Satisfaction with workload Q62B Employment_Satisfaction Satisfaction with salary Q62C Employment_Satisfaction Satisfaction with benefits Q62D Employment_Satisfaction Satisfaction with job overall

See notes at end of table.

Page 352: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-14

Table K-2. Faculty Variables—Continued

Variable name Prefix Label Q64 Employment_Description Retired from another position Q65 Employment_Future Retire from all paid employment, planned age Q66A Finances_Income Amount of income from basic salary from institution Q66B Finances_Income Amount of income from other income from institution Q66C Finances_Income Amount of income from other academic institution Q66D Finances_Income Amount of income from consulting or freelance work Q66E Finances_Income Amount of income from other employment Q66F Finances_Income Amount of income from other unspecified sources Q66SUM Finances_Income Amount of total individual income Q66B2 Finances_Income Amount of total individual income (range) Q67 Finances_Income Type of contract, length of unit Q68 Finances_Income Income paid per course/credit unit or term Q69 Finances_Income Amount of income paid per course/credit unit or term Q70A Finances_Family Amount of total household income Q70B Finances_Family Amount of total household income (range) Q71 Background_Demographics Gender Q72 Background_Demographics Age, year of birth Q73 Background_Demographics Race/ethnicity, Hispanic/Latino Q74A Background_Demographics Race, American Indian or Alaska Native Q74B Background_Demographics Race, Asian Q74C Background_Demographics Race, Black or African American Q74D Background_Demographics Race, Native Hawaiian or other Pacific Islander Q74E Background_Demographics Race, White Q75 Background_Disabilities Disability, any Q77 Background_Demographics Marital status, fall 2003 Q79 Background_Demographics Dependent children, number Q80 Background_Demographics Born in United States Q81 Background_Demographics Citizenship status Q82A Institution_Climate Opinion: teaching is rewarded Q82B Institution_Climate Opinion: part-time faculty treated fairly Q82C Institution_Climate Opinion: female faculty treated fairly Q82D Institution_Climate Opinion: racial minorities treated fairly Q83 Employment_Satisfaction Opinion about choosing an academic career again X01Q1 Instruction_Overall Any instructional duties for credit X02Q1 Instruction_Overall Faculty status or instruction for credit X03Q1 Employment_Description Faculty status and duties X04Q1 Employment_Description Faculty status and credit/noncredit X05Q1 Employment_Description Credit instruction and teaching as principal activity X01Q3 Employment_Description Employment, principal activity, faculty status X01Q4 Employment_Description Principal activity, modified X01Q5 Employment_Description Only employment is part-time at this institution(exclude

consulting) X02Q5 Employment_Description Other employment and employment status at the institution in

Fall 2003 X03Q5 Employment_Description Employment status, gender

See notes at end of table.

Page 353: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-15

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X04Q5 Employment_History Employment status at first PSE job and current job X05Q5 Employment_Description Employment status at this institution and other jobs in Fall

2003 X06Q5 Employment_Description Part-time faculty this institution with other PSE jobs in Fall

2003 X01Q9 Employment_Description Years held current job X02Q9 Employment_Description Age when began current job X03Q9 Employment_Description Years held current job, collapsed X04Q9 Employment_Description Age when began current job, collapsed X01Q10 Background_Demographics Rank, gender X02Q10 Employment_Description Rank, employment status X03Q10 Employment_Description Rank, years since rank achieved X04Q10 Employment_Description Rank, age achieved rank of full professor X05Q10 Employment_Description Rank, years since full professor achieved X06Q10 Employment_Description Rank, years since rank achieved, collapsed X07Q10 Employment_Description Rank, age achieved rank of full professor, collapsed X08Q10 Employment_Description Rank, years since full professor achieved, collapsed X09Q10 Employment_Description Rank, age achieved rank of associate professor X10Q10 Employment_Description Rank, age achieved rank of associate professor, collapsed X11Q10 Employment_Description Rank, years since associate professor achieved X12Q10 Employment_Description Rank, years since associate professor achieved, collapsed X01Q12 Employment_Description Tenure status, collapsed further X02Q12 Background_Demographics Tenure status, gender X03Q12 Employment_Description Tenure status, years since tenure achieved X04Q12 Employment_Description Tenure status, age achieved tenure X05Q12 Employment_Description Tenure status, years since tenure achieved, collapsed X06Q12 Employment_Description Tenure status, age achieved tenure, collapsed X01Q14 Employment_Description Union status, combined X01Q15 Employment_Description Union status, reason not a member, with don’t know X01Q16 Employment_Description Principal field of teaching, NSOPF:88 expanded (26 category) X02Q16 Employment_Description Principal field of teaching, NSOPF:88 (10 category) X03Q16 Employment_Description Principal field of teaching, vocational included (7 category) X04Q16 Employment_Description Principal field of teaching, recoded (11 category) X05Q16 Employment_Description Teaching or research field, NSOPF:88 expanded (26

category) X06Q16 Employment_Description Teaching or research field, NSOPF:88 (10 category) X07Q16 Employment_Description Teaching or research field, vocational included (7 category) X08Q16 Employment_Description Teaching or research field, recoded (11 category) X09Q16 Employment_Description Principal field of teaching-specific code (contiguous values) X10Q16 Employment_Description Teaching or research field-general code X11Q16 Employment_Description Teaching or research field-specific code X01Q17 Education_Attainment Highest degree collapsed further X02Q17 Education_Attainment Highest degree collapsed X03Q17 Education_Attainment Highest degree either doctorate or first-professional

See notes at end of table.

Page 354: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-16

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X04Q17 Education_Attainment Highest degree field, NSOPF:88 expanded (26 category) X05Q17 Education_Attainment Highest degree field, NSOPF:88 (10 category) X06Q17 Education_Attainment Highest degree, years between bachelors and doctorate X07Q17 Education_Attainment Highest degree, age received X08Q17 Education_Attainment Highest degree, age received, collapsed X09Q17 Education_Attainment Highest degree, years since receiving X10Q17 Education_Attainment Highest degree, years since receiving collapsed X11Q17 Education_Attainment Highest degree field, vocational included (7 category) X12Q17 Education_Attainment Highest degree field, recoded (11 category) X13Q17 Education_Attainment Highest degree, years between bachelors and doctorate,

collapsed X14Q17 Education_Attainment Highest degree institution, 2000 Carnegie (10 cat) by control X15Q17 Education_Attainment Highest degree institution, 2000 Carnegie (5 cat) by control,

selected X16Q17 Education_Attainment Highest degree institution, 2000 Carnegie (7 cat) X17Q17 Education_Attainment Highest degree institution, 2000 Carnegie (5 cat) X18Q17 Education_Attainment Highest degree field-specific code (contiguous values) X19Q17 Education_Attainment Highest degree, 1994 Carnegie I/II X20Q17 Education_Attainment Highest degree, 1994 Carnegie matches NSOPF88 X21Q17 Education_Attainment Highest degree, 1994 Carnegie matches NSOPF93 X01Q18 Employment_Description Other employment in Fall 2003 X02Q18 Employment_Description Number of non-PSE-instructional jobs held in Fall 2003 X01Q21 Employment_Description Current PSE job is the first, and only current, PSE job X02Q21 Employment_Description Prior employment status, PSE and other X01Q23 Employment_History Year began first faculty or instructional staff job X02Q23 Employment_History Years since began first faculty or instructional staff job X03Q23 Employment_History Age when began first faculty or instructional staff job X04Q23 Education_Attainment Had doctorate when began first faculty or instructional staff

job X05Q23 Employment_History Years since began first faculty or instructional staff job,

collapsed X06Q23 Employment_History Age when began first faculty or instructional staff job,

collapsed X01Q31 Employment_Time Allocation Average total hours per week worked X02Q31 Employment_Time Allocation Work more than 40 hours per week X01Q32 Employment_Time Allocation Percent of time spent on instruction X01Q35 Instruction_Overall Any instruction for class, individual, or committees X02Q35 Instruction_Overall Any instruction, type X03Q35 Instruction_Overall Any instruction, combination X04Q35 Instruction_Overall Type of classes taught X05Q35 Instruction_Overall Total number of classes taught (for-credit and not-for-credit) X01Q36 Employment_Description Rank and teaching assistant for credit classes X02Q36 Employment_Description Tenure status and teaching assistant for credit classes X01Q37 Instruction_Classroom_Hours Total hours/week teaching credit classes X02Q37 Instruction_Overall Total student contact hours/week in credit classes X03Q37 Instruction_Classroom_Hours Total classroom credit hours in classes

See notes at end of table.

Page 355: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-17

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X04Q37 Instruction_Overall Total student credit hours in classes X05Q37 Instruction_Level Level of students in credit classes X06Q37 Instruction_Overall Number of credit classes reported in detail X07Q37 Instruction_Classroom_Classes Number of credit classes that were undergraduate X08Q37 Instruction_Classroom_Classes Number of credit classes that were graduate/first-professional X09Q37 Instruction_Overall Total students taught in credit classes X10Q37 Instruction_Classroom_Assistant Reported TA in at least one credit class any level X11Q37 Instruction_Classroom_Assistant Reported TA in at least one undergraduate credit class X12Q37 Instruction_Classroom_Assistant Reported TA in at least one graduate/first- professional credit

class X13Q37 Instruction_Classroom_Hours Total classroom credit hours in classes, undergraduate X14Q37 Instruction_Classroom_Hours Total classroom credit hours in classes, graduate/first-

professional X15Q37 Instruction_Classroom_Hours Total hours/week teaching credit classes, undergrad X16Q37 Instruction_Classroom_Hours Total hours/week teaching credit classes, graduate/first-

professional X17Q37 Instruction_Classroom_Hours Total student contact hours/week in credit classes,

undergraduate X18Q37 Instruction_Classroom_Hours Total student contact hours/week in credit classes,

graduate/first- professional X19Q37 Instruction_Classroom_Hours Total student credit hours in classes, undergraduate X20Q37 Instruction_Classroom_Hours Total student credit hours in classes, graduate/first-

professional X21Q37 Instruction_Classroom_Students Total students taught in credit classes, undergraduate X22Q37 Instruction_Classroom_Students Total students taught in credit classes, graduate/first-

professional X23Q37 Instruction_Classroom_Students Average for-credit class size X24Q37 Instruction_Classroom_Students Average undergraduate for-credit class size X25Q37 Instruction_Classroom_Students Average graduate/1st professional for-credit class size X26Q37 Instruction_Level Taught at least one undergraduate class for credit X01Q39 Technology_Use Technology index X01Q47 Instruction_Individual Level of student for individual instruction X02Q47 Instruction_Individual Individual instruction, number graduate/first-professional

students X03Q47 Instruction_Individual Individual instruction, number total students X04Q47 Instruction_Individual Individual instruction, hours with total students X05Q47 Instruction_Individual Individual instruction, hours with graduate/first-professional

students X01Q52 Scholarship_Publications Career total publications/scholarly works X02Q52 Scholarship_Publications Recent total publications/scholarly works X03Q52 Scholarship_Publications Recent total presentations, exhibitions, or performances X04Q52 Scholarship_Publications Career total presentations, exhibitions, or performances X01Q54 Scholarship_Research Principal research field, NSOPF:88 expanded (26 category) X02Q54 Scholarship_Research Principal research field, NSOPF:88 (10 category) X03Q54 Scholarship_Research Principal research field, vocational included (7 category)

See notes at end of table.

Page 356: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-18

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X04Q54 Scholarship_Research Principal research field, recoded (11 category) X05Q54 Scholarship_Research Principal research field-specific code (contiguous values) X01Q61 Employment_Satisfaction Satisfaction, index (sum) of instruction items X01Q62 Employment_Satisfaction Satisfaction, index (sum) of employment items X01Q65 Employment_Future Retire from all paid employment, planned age, collapsed X02Q65 Employment_Future Retire from all paid employment, years until, collapsed X03Q65 Employment_Future Retire from all paid employment, planned age, collapsed, with

DK X04Q65 Employment_Future Retire from all paid employment, years until X05Q65 Employment_Future Retire from all paid employment, years until, collapsed, with

DK X01Q66 Finances_Income Basic salary from institution, collapsed X02Q66 Finances_Income Institution total income except basic salary, collapsed X03Q66 Finances_Income Total income from the institution X04Q66 Finances_Income Total income from the institution, collapsed X05Q66 Finances_Income Outside income, consulting/freelance work, collapsed X06Q66 Finances_Income Outside income, total excluding consulting X07Q66 Finances_Income Outside income, total excluding consulting, collapsed X08Q66 Finances_Income Outside income, total (including consulting) X09Q66 Finances_Income Outside income, total (including consulting), collapsed X10Q66 Finances_Income Received compensation from outside consulting work X11Q66 Finances_Income Outside employment income, excluding consulting X12Q66 Finances_Income Outside employment income, excluding consulting, collapsed X13Q66 Finances_Income Outside employment income, including consulting X14Q66 Finances_Income Outside employment income, including consulting, collapsed X01Q70 Finances_Family Amount of total household income, collapsed, with don’t

knows X01Q72 Background_Demographics Age in 2004 X02Q72 Background_Demographics Age, matches NSOPF:88 distribution X03Q72 Background_Demographics Age, matches NSOPF:93 distribution X04Q72 Background_Demographics Age, below or above 55 years X01Q74 Background_Demographics Race recoded, no more than one race X02Q74 Background_Demographics Race including more than one X03Q74 Background_Demographics Race/ethnicity recoded X04Q74 Background_Demographics Race/ethnicity recoded multiple X05Q74 Background_Demographics Race recoded including multiple according to OMB X06Q74 Background_Demographics Race/ethnicity including multiple, non-Hispanic X01Q77 Background_Demographics Marital status and dependent children X02Q77 Background_Demographics Marital status and dependent children, single parent in 2004 X01Q81 Background_Demographics Citizenship status and birth X02Q81 Background_Demographics Citizenship status and minority status X03Q81 Background_Demographics Citizenship status and ethnicity

See notes at end of table.

Page 357: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-19

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X01Q0 Institution_Type 1994 Carnegie (6 cat, all liberal arts) by control, selected cats X02Q0 Institution_Type 1994 Carnegie (6 cat, private liberal arts) by control, selected

cats X03Q0 Institution_Type 1994 Carnegie (5 category) by control, selected categories X04Q0 Institution_Type 1994 Carnegie (8 category) by control X05Q0 Institution_Type 1994 Carnegie (10 category), separates I/II X06Q0 Institution_Type 1994 Carnegie, 4-year versus 2-year X08Q0 Institution_Type 1994 Carnegie, doctoral/nondoctoral/2-yr by control X09Q0 Institution_Other Degree of urbanization X10Q0 Institution_Other Ratio of FTE enrollment/FTE faculty X11Q0 Institution_Other Enrollment, undergraduate X12Q0 Institution_Other Enrollment, undergraduate, collapsed X13Q0 Institution_Other Enrollment FTE, undergraduate X14Q0 Institution_Other Enrollment FTE, undergraduate, collapsed X15Q0 Institution_Other Enrollment, first-professional X16Q0 Institution_Other Enrollment, first-professional, collapsed X17Q0 Institution_Other Enrollment FTE, first-professional X18Q0 Institution_Other Enrollment FTE, first-professional, collapsed X19Q0 Institution_Other Enrollment, graduate X20Q0 Institution_Other Enrollment, graduate, collapsed X21Q0 Institution_Other Enrollment FTE, graduate X22Q0 Institution_Other Enrollment FTE, graduate, collapsed X23Q0 Institution_Other Enrollment, total X24Q0 Institution_Other Enrollment, total, collapsed X25Q0 Institution_Other Enrollment FTE, total X26Q0 Institution_Other Enrollment FTE, total, collapsed X27Q0 Institution_Other Enrollment minority, American Indian/Alaska Native X28Q0 Institution_Other Enrollment minority, Asian/Pacific Islander X29Q0 Institution_Other Enrollment minority, Black non-Hispanic X30Q0 Institution_Other Enrollment minority, Hispanic X31Q0 Institution_Other Core expenses, instruction (in 1000’s) X32Q0 Institution_Other Core expenses, instruction, collapsed X33Q0 Institution_Other Core expenses, research (in 1000’s) X34Q0 Institution_Other Core expenses, research, collapsed X35Q0 Institution_Other Core expenses, total (in 1000’s) X36Q0 Institution_Other Core expenses, total, collapsed X37Q0 Institution_Other Region where institution located X38Q0 Institution_Type 1994 Carnegie, doctoral/nondoctoral/2-year X99Q0 Institution_Type Institution state X100Q0 Institution_Type 2000 Carnegie code, detailed X101Q0 Institution_Type Institution control X102Q0 Institution_Type Institution level X103Q0 Institution_Type Institution control and highest degree awarded, 4 category X104Q0 Institution_Type 2000 Carnegie code, 10 category

See notes at end of table.

Page 358: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix K. Analysis Variables

K-20

Table K-2. Faculty Variables—Continued

Variable name Prefix Label X105Q0 Institution_Type 2000 Carnegie code, 9 category X106Q0 Institution_Type 2000 Carnegie code, 7 category X107Q0 Institution_Type 2000 Carnegie code, 5 category X108Q0 Institution_Type Institution highest degree awarded, 2 category X109Q0 Institution_Type 2000 Carnegie, doctoral/nondoctoral X110Q0 Institution_Type 2000 Carnegie code (10 category) by control X111Q0 Institution_Type 2000 Carnegie, 2-year versus 4-year X112Q0 Institution_Type 2000 Carnegie, doctoral/nondoctoral by control X113Q0 Institution_Type 2000 Carnegie, 2-year/4-year by control X120Q0 Institution_Type 2000 Carnegie code (5 category) by control X121Q0 Institution_Type 2000 Carnegie code (5 category) by control, selected

categories X122Q0 Institution_Type Institution control and highest degree awarded, 6 category SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Page 359: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

L-1

Appendix L GEM Adjustment Procedure

Page 360: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006
Page 361: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix L. GEM Adjustment Procedure

L–3

Generalized Exponential Model (GEM) – an overview In survey practice, design weights are adjusted to correct the bias introduced by

differential nonresponse and undercoverage via nonresponse adjustment and post-stratification. Since these adjustments can increase variance of estimates by creating extreme weights, oftentimes, extreme weight adjustments are applied to reduce variance inflation due to weighting. The Generalized Exponential Model (GEM) program was developed at RTI (Folsom and Singh, 2000) to provide a unified method for all weight adjustments.

GEM is an expansion of the commonly used method of iterative proportional fitting (raking) based on a generalization of Deville and Särndal’s (1992) logit method, in that bounds on weights are not required to be uniform. For this purpose, GEM has a built-in extreme weight control feature that allows for different bounds on the adjusted weights for different sample units. This control feature can be used for a separate extreme weight adjustment after poststratification such that sample distribution of weights obtained after the initial poststratification is preserved.

The unadjusted initial weights were classified as extreme if they fell outside of the interval of median ± 3 × interquartile range (IQR) within specified domains, where domains were defined as functions of design strata with a minimum sample size requirement of 30. The goal of model fitting was to keep as many variables as possible without unduly increasing the unequal weighting effects (UWE) and the extreme weight proportion. A mixture of forward and backward selection schemes were used for fitting the GEM model. The model started with the main effects and then two-way and higher order interaction effects were added in a forward manner. Subsequently, the bounds were successively tightened until the model statistics and characteristics were satisfied.

During the modeling, a number of statistics were closely monitored to uncover any unusual impact of weight adjustment on the initial weights. These statistics included UWEs, extreme weight proportions, and distribution of large adjustment factors. GEM summary/ diagnostic statistics provide information on the distribution of the initial and adjusted weights, number of variables in the final model, and the extreme weight proportions.

Appendix L References Chen, P., Penne, M.A., & Singh, A.C. (2000). Experience with the Generalized Exponential

Model of Weight Calibration for the National Household Survey on Drug Abuse. American Statistical Association (ASA) Proceedings of the American Statistical Association, Section on Survey Research.

Deville, J.C., & Särndal, C.E. (1992). Calibration Estimators in Survey Sampling. Journal of the American Statistical Association (JASA), 87, 376-382.

Folsom, R.E., & Witt, M.B. (1994). Testing a New Attrition Nonresponse Method for SIPP. ASA Proceedings on Survey Research Methodology Section, 428-433.

Folsom, R.E. and Singh, A.C. (2000). The Generalized Exponential Model for Sampling Weight Calibration for Extreme Values, Nonresponse, and Poststratification. Proceedings of the American Statistical Association, Section on Survey Research.

Page 362: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix L. GEM Adjustment Procedure

L–4

Page 363: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

M-1

Appendix M Design Effects

Page 364: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006
Page 365: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix M. Design Effects

M-3

Researchers who do not have access to software packages such as SUDAAN for calculation of design-based standard errors can use a relevant estimate of design effect from this table to approximate the standard errors of statistics for the 2004 National Study of Postsecondary Faculty (NSOPF:04). For estimates of a proportion, p̂ , a design-based standard error can be approximated by:

DEFT)ˆ1(ˆ×

−×n

pp

Similarly, design-based standard errors for estimates of means can be approximated by the following formula, in which S2 represents the sample variance under simple random sampling.

DEFT2

×n

S

For instance as reported in table M-1, it is estimated that among the 7,460 sample faculty members at public doctoral institutions, 39.7 percent were with tenure as of fall 2003 (Q12). The design based (proper) standard error of this estimate is 0.67 percent, while under the simple random sampling assumption the corresponding (improper) standard error is 0.57 percent. However, a rough approximation of the designed based standard error of this estimate can be produced by multiplying the value of standard error obtained under the simple random estimate assumption (0.57 percent) by the estimate of root design effect. This technique is not recommended, however; there are many commercially available statistical software packages to do this.

Page 366: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix M. Design Effects

M-4

Table M-1. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by institution type

Standard error Item Number1 Estimate Design SRS DEFF DEFT

Institution Type2

Q1: Percent with instructional duties 7,460 94.15 0.31 0.27 1.32 1.15 1 Q2: Percent with some credit instruction 7,460 86.80 0.48 0.39 1.53 1.24 1 Q3: Percent who had faculty status 7,460 94.29 0.34 0.27 1.60 1.26 1 Q4: Percent whose principal activity was research 7,460 23.35 0.62 0.49 1.61 1.27 1 Q4: Percent whose principal activity was teaching 7,460 51.45 0.54 0.58 0.87 0.93 1 Q6: Percent part-time is primary employment 1,130 44.57 1.76 1.48 1.42 1.19 1 Q8: Percent part-time preferred full-time 1,130 27.44 1.36 1.33 1.05 1.02 1 Q10: Percent with academic rank of professor 7,460 25.92 0.64 0.51 1.61 1.27 1 Q12: Percent with tenure 7,460 39.69 0.67 0.57 1.40 1.18 1 Q15: Percent nonunion union not available 6,340 69.32 0.61 0.58 1.10 1.05 1 Q19A1: Percent with other job that is full-time 7,460 9.48 0.37 0.34 1.22 1.10 1 Q35A1: Percent teaching a single credit class 7,460 27.76 0.54 0.52 1.09 1.04 1 Q37C2: Percent meet > 3 hours for second class 3,700 23.29 0.75 0.69 1.16 1.08 1 Q37F1: Percent with no TA in first class 5,590 71.32 0.70 0.61 1.32 1.15 1 Q39: Percent with web site for instruction 7,460 43.53 0.60 0.57 1.08 1.04 1 Q62A: Percent not “very satisfied” workload 7,460 65.16 0.53 0.55 0.92 0.96 1 Q64: Percent retired from another position 7,460 6.58 0.37 0.29 1.63 1.28 1 Q68: Percent paid by the course 760 41.86 1.74 1.79 0.95 0.98 1 Q77: Percent marital status married 7,460 75.15 0.53 0.50 1.12 1.06 1 Q77: Percent marital status single 7,460 11.28 0.44 0.37 1.43 1.20 1 Q81: Percent United States citizen 7,460 88.86 0.42 0.36 1.35 1.16 1

Q1: Percent with instructional duties 2,620 98.66 0.26 0.22 1.34 1.16 2 Q2: Percent with some credit instruction 2,620 96.31 0.43 0.37 1.37 1.17 2 Q3: Percent who had faculty status 2,620 91.84 0.74 0.54 1.91 1.38 2 Q4: Percent whose principal activity was research 2,620 1.51 0.30 0.24 1.60 1.26 2 Q4: Percent whose principal activity was teaching 2,620 83.40 1.07 0.73 2.18 1.48 2 Q6: Percent part-time is primary employment 670 39.61 2.25 1.89 1.41 1.19 2 Q8: Percent part-time preferred full-time 670 32.77 1.74 1.82 0.92 0.96 2 Q10: Percent with academic rank of professor 2,620 21.73 1.00 0.81 1.55 1.25 2 Q12: Percent with tenure 2,620 35.77 1.11 0.94 1.39 1.18 2 Q15: Percent nonunion union not available 1,720 62.65 1.75 1.17 2.26 1.50 2 Q19A1: Percent with other job that is full-time 2,620 15.95 0.92 0.72 1.67 1.29 2 Q35A1: Percent teaching a single credit class 2,620 20.84 0.86 0.79 1.17 1.08 2 Q37C2: Percent meet > 3 hours for second class 1,920 22.06 1.06 0.95 1.25 1.12 2 Q37F1: Percent with no TA in first class 2,370 89.73 0.67 0.62 1.16 1.08 2 Q39: Percent with web site for instruction 2,620 46.02 1.26 0.97 1.67 1.29 2 Q62A: Percent not “very satisfied” workload 2,620 65.50 1.19 0.93 1.63 1.28 2 Q64: Percent retired from another position 2,620 11.42 0.83 0.62 1.79 1.34 2 Q68: Percent paid by the course 490 37.66 3.20 2.19 2.14 1.46 2 Q77: Percent marital status married 2,620 71.39 1.02 0.88 1.33 1.15 2 Q77: Percent marital status single 2,620 12.54 0.76 0.65 1.39 1.18 2 Q81: Percent United States citizen 2,620 94.53 0.49 0.44 1.20 1.10 2

Q1: Percent with instructional duties 510 98.09 1.54 0.60 6.49 2.55 3 Q2: Percent with some credit instruction 510 95.84 1.82 0.88 4.26 2.06 3 See notes at end of table.

Page 367: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix M. Design Effects

M-5

Table M-1. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by institution type—Continued

Standard error Item Number1 Estimate Design SRS DEFF DEFT

Institution Type2

Q3: Percent who had faculty status 510 93.06 1.74 1.12 2.40 1.55 3 Q4: Percent whose principal activity was research 510 0.48 0.24 0.31 0.61 0.78 3 Q4: Percent whose principal activity was teaching 510 86.61 2.32 1.50 2.38 1.54 3 Q6: Percent part-time is primary employment 140 33.70 5.85 4.02 2.11 1.45 3 Q8: Percent part-time preferred full-time 140 45.17 5.43 4.24 1.64 1.28 3 Q10: Percent with academic rank of professor 510 14.08 3.05 1.54 3.93 1.98 3 Q12: Percent with tenure 510 29.59 1.98 2.02 0.96 0.98 3 Q15: Percent nonunion union not available 340 56.38 8.12 2.68 9.16 3.03 3 Q19A1: Percent with other job that is full-time 510 17.62 2.39 1.68 2.02 1.42 3 Q35A1: Percent teaching a single credit class 510 22.41 2.56 1.84 1.93 1.39 3 Q37C2: Percent meet > 3 hours for second class 390 24.29 3.10 2.18 2.02 1.42 3 Q37F1: Percent with no TA in first class 470 93.19 1.86 1.16 2.56 1.60 3 Q39: Percent with web site for instruction 510 47.37 4.81 2.20 4.77 2.18 3 Q62A: Percent not “very satisfied” workload 510 64.03 2.81 2.12 1.76 1.33 3 Q64: Percent retired from another position 510 11.68 2.62 1.42 3.41 1.85 3 Q68: Percent paid by the course 110 31.91 10.51 4.42 5.64 2.38 3 Q77: Percent marital status married 510 69.71 2.52 2.03 1.54 1.24 3 Q77: Percent marital status single 510 10.76 2.13 1.37 2.43 1.56 3 Q81: Percent United States citizen 510 95.55 1.49 0.91 2.68 1.64 3 Q1: Percent with instructional duties 6,420 98.94 0.18 0.13 1.96 1.40 4 Q2: Percent with some credit instruction 6,420 88.51 0.72 0.40 3.31 1.82 4 Q3: Percent who had faculty status 6,420 89.74 0.52 0.38 1.87 1.37 4 Q4: Percent whose principal activity was research 6,420 0.04 0.03 0.03 0.97 0.99 4 Q4: Percent whose principal activity was teaching 6,420 89.33 0.53 0.39 1.88 1.37 4 Q6: Percent part-time is primary employment 3,500 34.18 0.99 0.80 1.53 1.24 4 Q8: Percent part-time preferred full-time 3,500 40.24 1.08 0.83 1.70 1.30 4 Q10: Percent with academic rank of professor 6,420 9.00 0.76 0.36 4.58 2.14 4 Q12: Percent with tenure 6,420 17.83 0.64 0.48 1.81 1.35 4 Q15: Percent nonunion union not available 3,980 50.28 1.46 0.79 3.38 1.84 4 Q19A1: Percent with other job that is full-time 6,420 32.58 0.84 0.59 2.06 1.43 4 Q35A1: Percent teaching a single credit class 6,420 27.52 0.69 0.56 1.55 1.25 4 Q37C2: Percent meet > 3 hours for second class 3,890 43.75 1.13 0.80 2.02 1.42 4 Q37F1: Percent with no TA in first class 5,390 93.99 0.52 0.32 2.54 1.59 4 Q39: Percent with web site for instruction 6,420 31.59 0.95 0.58 2.67 1.63 4 Q62A: Percent not “very satisfied” workload 6,420 49.26 0.82 0.62 1.72 1.31 4 Q64: Percent retired from another position 6,420 16.06 0.70 0.46 2.32 1.52 4 Q68: Percent paid by the course 3,000 26.74 1.39 0.81 2.97 1.72 4 Q77: Percent marital status married 6,420 71.69 0.95 0.56 2.87 1.69 4 Q77: Percent marital status single 6,420 10.71 0.55 0.39 2.06 1.44 4 Q81: Percent United States citizen 6,420 98.14 0.21 0.17 1.53 1.24 4 Q1: Percent with instructional duties 110 99.15 0.95 0.89 1.12 1.06 5 Q2: Percent with some credit instruction 110 92.09 8.46 2.62 10.41 3.23 5 Q3: Percent who had faculty status 110 79.54 9.38 3.92 5.73 2.39 5 Q4: Percent whose principal activity was research 110 2.02 1.09 1.37 0.64 0.80 5 See notes at end of table.

Page 368: 2004 National Study of Postsecondary Faculty …nces.ed.gov/pubs2006/2006179.pdf2004 National Study of Postsecondary Faculty (NSOPF:04) Methodology Report Technical Report May 2006

Appendix M. Design Effects

M-6

Table M-1. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by institution type—Continued

Standard error Item Number1 Estimate Design SRS DEFF DEFT

Institution Type2

Q4: Percent whose principal activity was teaching 110 87.80 4.78 3.18 2.26 1.50 5 Q6: Percent part-time is primary employment 50 32.82 5.85 6.85 0.73 0.85 5 Q8: Percent part-time preferred full-time 50 35.62 8.43 6.99 1.46 1.21 5 Q10: Percent with academic rank of professor 110 12.05 1.26 3.16 0.16 0.40 5 Q12: Percent with tenure 110 26.85 1.98 4.30 0.21 0.46 5 Q15: Percent nonunion union not available 80 68.09 15.72 5.35 8.65 2.94 5 Q19A1: Percent with other job that is full-time 110 20.84 5.90 3.94 2.24 1.50 5 Q35A1: Percent teaching a single credit class 110 31.12 10.61 4.50 5.57 2.36 5 Q37C2: Percent meet > 3 hours for second class 60 26.31 8.87 5.59 2.51 1.59 5 Q37F1: Percent with no TA in first class 100 92.51 2.59 2.66 0.95 0.97 5 Q39: Percent with web site for instruction 110 48.35 5.30 4.85 1.19 1.09 5 Q62A: Percent not “very satisfied” workload 110 55.72 7.05 4.82 2.14 1.46 5 Q64: Percent retired from another position 110 17.86 11.66 3.72 9.82 3.13 5 Q68: Percent paid by the course 40 33.77 16.58 7.88 4.42 2.10 5 Q77: Percent marital status married 110 66.41 5.07 4.59 1.22 1.10 5 Q77: Percent marital status single 110 11.90 5.80 3.15 3.40 1.84 5 Q81: Percent United States citizen 110 94.02 4.66 2.30 4.10 2.02 5 Q1: Percent with instructional duties 3,160 94.56 0.60 0.40 2.22 1.49 6 Q2: Percent with some credit instruction 3,160 86.28 0.55 0.61 0.80 0.89 6 Q3: Percent who had faculty status 3,160 95.71 0.54 0.36 2.27 1.51 6 Q4: Percent whose principal activity was research 3,160 20.98 0.74 0.72 1.03 1.02 6 Q4: Percent whose principal activity was teaching 3,160 52.90 1.08 0.89 1.49 1.22 6 Q6: Percent part-time is primary employment 710 31.72 1.84 1.75 1.11 1.05 6 Q8: Percent part-time preferred full-time 710 30.13 1.71 1.72 0.98 0.99 6 Q10: Percent with academic rank of professor 3,160 23.77 0.62 0.76 0.68 0.82 6 Q12: Percent with tenure 3,160 30.51 0.75 0.82 0.84 0.91 6 Q15: Percent nonunion union not available 3,040 86.37 0.69 0.62 1.24 1.12 6 Q19A1: Percent with other job that is full-time 3,160 16.26 0.55 0.66 0.70 0.84 6 Q35A1: Percent teaching a single credit class 3,160 30.37 0.98 0.82 1.42 1.19 6 Q37C2: Percent meet > 3 hours for second class 1,450 28.16 1.42 1.18 1.44 1.20 6 Q37F1: Percent with no TA in first class 2,300 72.08 1.09 0.94 1.37 1.17 6 Q39: Percent with web site for instruction 3,160 40.76 0.81 0.87 0.86 0.92 6 Q62A: Percent not “very satisfied” workload 3,160 58.08 1.03 0.88 1.37 1.17 6 Q64: Percent retired from another position 3,160 8.24 0.61 0.49 1.57 1.25 6 Q68: Percent paid by the course 540 46.27 2.04 2.14 0.91 0.96 6 Q77: Percent marital status married 3,160 74.67 0.90 0.77 1.34 1.16 6 Q77: Percent marital status single 3,160 12.20 0.62 0.58 1.13 1.06 6 Q81: Percent United States citizen 3,160 88.03 0.55 0.58 0.92 0.96 6 Q1: Percent with instructional duties 2,270 98.40 0.32 0.26 1.46 1.21 7 Q2: Percent with some credit instruction 2,270 96.21 0.42 0.40 1.08 1.04 7 Q3: Percent who had faculty status 2,270 92.58 0.72 0.55 1.70 1.30 7 Q4: Percent whose principal activity was research 2,270 0.34 0.15 0.12 1.53 1.24 7 Q4: Percent whose principal activity was teaching 2,270 86.88 0.86 0.71 1.47 1.21 7 Q6: Percent part-time is primary employment 1,000 23.27 2.42 1.34 3.28 1.81 7 See notes at end of table.


Table M-1. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by institution type—Continued

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Institution type2: 7
Q8: Percent part-time preferred full-time 1,000 27.91 1.44 1.42 1.03 1.02
Q10: Percent with academic rank of professor 2,270 12.51 0.67 0.69 0.93 0.96
Q12: Percent with tenure 2,270 19.19 1.00 0.83 1.46 1.21
Q15: Percent nonunion union not available 2,130 81.42 1.41 0.84 2.79 1.67
Q19A1: Percent with other job that is full-time 2,270 31.46 1.64 0.97 2.84 1.69
Q35A1: Percent teaching a single credit class 2,270 29.91 1.29 0.96 1.79 1.34
Q37C2: Percent meet > 3 hours for second class 1,440 37.87 2.86 1.28 4.99 2.23
Q37F1: Percent with no TA in first class 2,040 92.04 1.09 0.60 3.31 1.82
Q39: Percent with web site for instruction 2,270 43.29 1.64 1.04 2.47 1.57
Q62A: Percent not “very satisfied” workload 2,270 52.49 1.17 1.05 1.25 1.12
Q64: Percent retired from another position 2,270 15.94 1.27 0.77 2.75 1.66
Q68: Percent paid by the course 860 56.06 2.98 1.70 3.09 1.76
Q77: Percent marital status married 2,270 75.63 1.38 0.90 2.35 1.53
Q77: Percent marital status single 2,270 10.78 1.06 0.65 2.65 1.63
Q81: Percent United States citizen 2,270 96.69 0.53 0.38 2.00 1.41

Institution type2: 8
Q1: Percent with instructional duties 2,520 97.52 0.53 0.31 2.90 1.70
Q2: Percent with some credit instruction 2,520 95.55 0.86 0.41 4.39 2.10
Q3: Percent who had faculty status 2,520 91.72 1.01 0.55 3.39 1.84
Q4: Percent whose principal activity was research 2,520 1.25 0.37 0.22 2.85 1.69
Q4: Percent whose principal activity was teaching 2,520 83.00 1.20 0.75 2.59 1.61
Q6: Percent part-time is primary employment 680 30.33 3.25 1.76 3.40 1.84
Q8: Percent part-time preferred full-time 680 27.75 2.20 1.72 1.64 1.28
Q10: Percent with academic rank of professor 2,520 19.86 1.27 0.79 2.56 1.60
Q12: Percent with tenure 2,520 28.09 1.84 0.89 4.22 2.05
Q15: Percent nonunion union not available 2,300 81.27 2.07 0.81 6.49 2.55
Q19A1: Percent with other job that is full-time 2,520 18.46 1.27 0.77 2.71 1.65
Q35A1: Percent teaching a single credit class 2,520 23.85 1.48 0.85 3.03 1.74
Q37C2: Percent meet > 3 hours for second class 1,800 28.11 2.06 1.06 3.77 1.94
Q37F1: Percent with no TA in first class 2,290 85.87 1.06 0.73 2.12 1.46
Q39: Percent with web site for instruction 2,520 42.21 1.69 0.98 2.96 1.72
Q62A: Percent not “very satisfied” workload 2,520 56.44 1.55 0.99 2.46 1.57
Q64: Percent retired from another position 2,520 11.90 0.89 0.64 1.90 1.38
Q68: Percent paid by the course 540 56.68 3.57 2.13 2.81 1.68
Q77: Percent marital status married 2,520 72.43 1.20 0.89 1.80 1.34
Q77: Percent marital status single 2,520 13.69 0.91 0.68 1.75 1.32
Q81: Percent United States citizen 2,520 95.32 0.61 0.42 2.13 1.46

Institution type2: 9
Q1: Percent with instructional duties 190 98.64 1.00 0.84 1.40 1.18
Q2: Percent with some credit instruction 190 95.07 3.41 1.57 4.71 2.17
Q3: Percent who had faculty status 190 74.61 5.58 3.16 3.12 1.77
Q4: Percent whose principal activity was teaching 190 79.49 5.71 2.93 3.80 1.95
Q6: Percent part-time is primary employment 190 57.09 5.42 3.59 2.28 1.51
Q8: Percent part-time preferred full-time 60 40.26 3.59 6.18 0.34 0.58
Q10: Percent with academic rank of professor 190 0.00 0.00 0.00
See notes at end of table.


Table M-1. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by institution type—Continued

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Institution type2: 9
Q12: Percent with tenure 190 8.93 2.71 2.07 1.72 1.31
Q15: Percent nonunion union not available 60 39.47 4.22 6.16 0.47 0.69
Q19A1: Percent with other job that is full-time 190 25.10 1.15 3.15 0.13 0.36
Q35A1: Percent teaching a single credit class 190 98.43 1.02 0.90 1.28 1.13
Q37C2: Percent meet > 3 hours for second class 170 90.02 0.92 2.30 0.16 0.40
Q37F1: Percent with no TA in first class 190 21.35 3.31 2.97 1.24 1.11
Q39: Percent with web site for instruction 190 13.20 3.17 2.46 1.67 1.29
Q62A: Percent not “very satisfied” workload 140 18.05 4.83 3.31 2.13 1.46
Q64: Percent retired from another position 190 49.08 5.08 3.63 1.96 1.40
Q68: Percent paid by the course 170 92.59 5.46 1.99 7.53 2.74
Q77: Percent marital status married 190 7.48 1.60 1.91 0.70 0.84
Q77: Percent marital status single 190 13.98 4.92 2.52 3.83 1.96
Q81: Percent United States citizen 190 84.06 2.58 2.66 0.94 0.97

Institution type2: 10
Q1: Percent with instructional duties 50 27.05 5.70 6.10 0.87 0.93
Q2: Percent with some credit instruction 850 97.63 0.82 0.52 2.44 1.56
Q3: Percent who had faculty status 850 94.74 1.07 0.77 1.95 1.40
Q4: Percent whose principal activity was research 850 83.12 2.49 1.29 3.75 1.94
Q4: Percent whose principal activity was teaching 850 94.27 1.13 0.80 2.02 1.42
Q6: Percent part-time is primary employment 850 55.28 2.30 1.71 1.82 1.35
Q8: Percent part-time preferred full-time 420 22.07 3.06 2.02 2.29 1.51
Q10: Percent with academic rank of professor 850 1.33 0.64 0.39 2.61 1.62
Q12: Percent with tenure 850 19.68 2.15 1.36 2.49 1.58
Q15: Percent nonunion union not available 420 31.08 4.87 2.26 4.64 2.15
Q19A1: Percent with other job that is full-time 850 13.54 2.72 1.17 5.35 2.31
Q35A1: Percent teaching a single credit class 850 95.19 1.30 0.73 3.11 1.76
Q37C2: Percent meet > 3 hours for second class 740 88.65 2.36 1.16 4.11 2.03
Q37F1: Percent with no TA in first class 850 33.17 3.13 1.62 3.75 1.94
Q39: Percent with web site for instruction 850 30.57 2.14 1.58 1.83 1.35
Q62A: Percent not “very satisfied” workload 490 26.40 4.67 1.99 5.50 2.34
Q64: Percent retired from another position 850 35.00 5.48 1.64 11.20 3.35
Q68: Percent paid by the course 780 88.63 2.48 1.14 4.72 2.17
Q77: Percent marital status married 850 14.07 1.96 1.19 2.70 1.64
Q77: Percent marital status single 850 10.49 1.43 1.05 1.84 1.36
Q81: Percent United States citizen 850 69.94 3.04 1.57 3.73 1.93

1 Numbers rounded to the nearest 10.
2 Institution types are defined as follows: 1 = public doctor’s; 2 = public master’s; 3 = public bachelor’s; 4 = public associate’s; 5 = public other; 6 = private not-for-profit doctor’s; 7 = private not-for-profit master’s; 8 = private not-for-profit bachelor’s; 9 = private not-for-profit associate’s; and 10 = private not-for-profit other.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).
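The DEFF and DEFT columns in Tables M-1 through M-4 are tied directly to the two standard error columns. Assuming the conventional definitions (the design effect as the ratio of the design-based variance to the simple random sample variance, and DEFT as its square root), a minimal worked check using the institution type 5 row for Q4 (percent whose principal activity was teaching) is:

\[
\mathrm{DEFF} = \left(\frac{SE_{\mathrm{design}}}{SE_{\mathrm{SRS}}}\right)^{2} = \left(\frac{4.78}{3.18}\right)^{2} \approx 2.26,
\qquad
\mathrm{DEFT} = \sqrt{\mathrm{DEFF}} \approx 1.50.
\]

Because the tabled DEFF and DEFT values are computed from unrounded standard errors, recalculating them from the printed two-decimal figures can differ slightly in the second decimal for some rows.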


Table M-2. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by race/ethnicity

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Race/ethnicity2: White
Q1: Percent with instructional duties 20,390 97.12 0.15 0.12 1.62 1.27
Q2: Percent with some credit instruction 20,390 90.55 0.31 0.20 2.25 1.50
Q3: Percent who had faculty status 20,390 92.04 0.28 0.19 2.18 1.48
Q4: Percent whose principal activity was teaching 20,390 74.86 0.36 0.30 1.39 1.18
Q4: Percent whose principal activity was research 20,390 7.77 0.18 0.19 0.97 0.99
Q6: Percent part-time is primary employment 6,260 34.59 0.68 0.60 1.26 1.12
Q8: Percent part-time preferred full-time 6,260 33.12 0.68 0.59 1.29 1.14
Q10: Percent with academic rank of professor 20,390 18.65 0.35 0.27 1.62 1.27
Q12: Percent with tenure 20,390 28.36 0.41 0.32 1.71 1.31
Q15: Percent nonunion union not available 16,600 69.71 0.59 0.36 2.76 1.66
Q19A1: Percent with other job that is full-time 20,390 21.62 0.39 0.29 1.79 1.34
Q35A1: Percent teaching a single credit class 20,390 27.58 0.41 0.31 1.75 1.32
Q37F1: Percent with no TA in first class 16,860 85.52 0.34 0.27 1.56 1.25
Q37C2: Percent meet > 3 hours for second class 11,990 31.71 0.71 0.43 2.83 1.68
Q39: Percent with web site for instruction 20,390 39.91 0.47 0.34 1.92 1.38
Q62A: Percent not “very satisfied” workload 20,390 56.38 0.36 0.35 1.07 1.03
Q64: Percent retired from another position 20,390 12.33 0.33 0.23 2.00 1.42
Q68: Percent paid by the course 5,080 36.95 1.00 0.68 2.18 1.48
Q77: Percent marital status single 20,390 10.75 0.27 0.22 1.50 1.23
Q77: Percent marital status married 20,390 74.22 0.41 0.31 1.80 1.34
Q81: Percent United States citizen 20,390 96.10 0.19 0.14 2.00 1.41

Race/ethnicity2: Black
Q1: Percent with instructional duties 1,940 98.26 0.33 0.30 1.20 1.09
Q2: Percent with some credit instruction 1,940 91.08 0.87 0.65 1.82 1.35
Q3: Percent who had faculty status 1,940 91.07 0.99 0.65 2.32 1.52
Q4: Percent whose principal activity was teaching 1,940 75.88 1.42 0.97 2.13 1.46
Q4: Percent whose principal activity was research 1,940 5.38 0.75 0.51 2.12 1.46
Q6: Percent part-time is primary employment 890 23.42 2.16 1.42 2.33 1.53
Q8: Percent part-time preferred full-time 890 40.53 2.12 1.64 1.67 1.29
Q10: Percent with academic rank of professor 1,940 12.53 1.17 0.75 2.44 1.56
Q12: Percent with tenure 1,940 24.23 1.41 0.97 2.11 1.45
Q15: Percent nonunion union not available 1,440 63.61 2.07 1.27 2.65 1.63
Q19A1: Percent with other job that is full-time 1,940 27.90 1.06 1.02 1.08 1.04
Q35A1: Percent teaching a single credit class 1,940 25.65 1.52 0.99 2.35 1.53
Q37F1: Percent with no TA in first class 1,580 89.47 1.12 0.77 2.09 1.45
Q37C2: Percent meet > 3 hours for second class 1,120 31.18 1.83 1.38 1.76 1.33
Q39: Percent with web site for instruction 1,940 35.16 1.37 1.08 1.60 1.27
Q62A: Percent not “very satisfied” workload 1,940 58.79 1.66 1.12 2.21 1.49
Q64: Percent retired from another position 1,940 11.34 0.90 0.72 1.57 1.25
Q68: Percent paid by the course 730 38.67 2.85 1.80 2.51 1.58
Q77: Percent marital status single 1,940 18.79 1.06 0.89 1.44 1.20
Q77: Percent marital status married 1,940 58.11 1.40 1.12 1.56 1.25
Q81: Percent United States citizen 1,940 92.72 0.68 0.59 1.35 1.16
See notes at end of table.


Table M-2. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by race/ethnicity—Continued

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Race/ethnicity2: Asian
Q1: Percent with instructional duties 1,530 93.47 0.78 0.63 1.50 1.23
Q2: Percent with some credit instruction 1,530 85.75 0.95 0.89 1.13 1.06
Q3: Percent who had faculty status 1,530 95.18 0.79 0.55 2.08 1.44
Q4: Percent whose principal activity was teaching 1,530 54.23 1.22 1.28 0.92 0.96
Q4: Percent whose principal activity was research 1,530 27.06 1.30 1.14 1.31 1.15
Q6: Percent part-time is primary employment 350 43.28 4.10 2.65 2.39 1.54
Q8: Percent part-time preferred full-time 350 42.80 3.32 2.65 1.57 1.25
Q10: Percent with academic rank of professor 1,530 17.49 1.22 0.97 1.59 1.26
Q12: Percent with tenure 1,530 30.97 1.47 1.18 1.55 1.24
Q15: Percent nonunion union not available 1,160 69.95 1.83 1.35 1.85 1.36
Q19A1: Percent with other job that is full-time 1,530 9.60 0.87 0.75 1.32 1.15
Q35A1: Percent teaching a single credit class 1,530 24.34 1.48 1.10 1.81 1.35
Q37F1: Percent with no TA in first class 1,180 72.53 1.67 1.30 1.65 1.28
Q37C2: Percent meet > 3 hours for second class 830 29.47 1.86 1.58 1.38 1.18
Q39: Percent with web site for instruction 1,530 41.76 1.29 1.26 1.04 1.02
Q62A: Percent not “very satisfied” workload 1,530 71.18 1.27 1.16 1.21 1.10
Q64: Percent retired from another position 1,530 5.02 0.65 0.56 1.37 1.17
Q68: Percent paid by the course 260 24.30 3.38 2.66 1.61 1.27
Q77: Percent marital status single 1,530 12.58 1.30 0.85 2.34 1.53
Q77: Percent marital status married 1,530 79.22 1.47 1.04 2.01 1.42
Q81: Percent United States citizen 1,530 66.41 1.41 1.21 1.35 1.16

Race/ethnicity2: Hispanic
Q1: Percent with instructional duties 1,700 97.77 0.43 0.36 1.48 1.21
Q2: Percent with some credit instruction 1,700 88.37 1.12 0.78 2.07 1.44
Q3: Percent who had faculty status 1,700 90.88 0.84 0.70 1.46 1.21
Q4: Percent whose principal activity was teaching 1,700 77.18 1.39 1.02 1.87 1.37
Q4: Percent whose principal activity was research 1,700 7.36 0.58 0.63 0.84 0.92
Q6: Percent part-time is primary employment 660 29.53 2.73 1.78 2.37 1.54
Q8: Percent part-time preferred full-time 660 44.80 2.97 1.94 2.35 1.53
Q10: Percent with academic rank of professor 1,700 12.43 0.82 0.80 1.06 1.03
Q12: Percent with tenure 1,700 24.07 1.35 1.04 1.69 1.30
Q15: Percent nonunion union not available 1,270 59.94 2.20 1.37 2.56 1.60
Q19A1: Percent with other job that is full-time 1,700 24.05 1.74 1.04 2.81 1.68
Q35A1: Percent teaching a single credit class 1,700 26.96 1.31 1.08 1.49 1.22
Q37F1: Percent with no TA in first class 1,390 84.79 1.22 0.96 1.60 1.27
Q37C2: Percent meet > 3 hours for second class 1,000 30.66 1.96 1.46 1.80 1.34
Q39: Percent with web site for instruction 1,700 37.84 1.77 1.17 2.27 1.51
Q62A: Percent not “very satisfied” workload 1,700 57.72 1.75 1.20 2.15 1.47
Q64: Percent retired from another position 1,700 9.47 1.04 0.71 2.14 1.46
Q68: Percent paid by the course 520 39.54 3.38 2.15 2.46 1.57
Q77: Percent marital status single 1,700 15.98 1.24 0.89 1.95 1.40
Q77: Percent marital status married 1,700 66.47 1.70 1.14 2.20 1.48
Q81: Percent United States citizen 1,700 86.72 0.98 0.82 1.43 1.19
See notes at end of table.


Table M-2. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by race/ethnicity—Continued

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Race/ethnicity2: Other
Q1: Percent with instructional duties 550 97.76 1.23 0.63 3.79 1.95
Q2: Percent with some credit instruction 550 90.89 2.23 1.23 3.29 1.81
Q3: Percent who had faculty status 550 93.28 1.17 1.07 1.20 1.10
Q4: Percent whose principal activity was teaching 550 78.99 2.38 1.74 1.87 1.37
Q4: Percent whose principal activity was research 550 4.94 0.97 0.93 1.09 1.05
Q6: Percent part-time is primary employment 190 35.23 4.64 3.45 1.81 1.34
Q8: Percent part-time preferred full-time 190 54.59 4.77 3.59 1.76 1.33
Q10: Percent with academic rank of professor 550 15.83 1.69 1.56 1.17 1.08
Q12: Percent with tenure 550 23.00 1.92 1.80 1.13 1.07
Q15: Percent nonunion union not available 420 63.68 2.71 2.36 1.32 1.15
Q19A1: Percent with other job that is full-time 550 19.15 1.94 1.68 1.33 1.15
Q35A1: Percent teaching a single credit class 550 23.80 2.12 1.82 1.36 1.17
Q37F1: Percent with no TA in first class 450 84.18 2.03 1.71 1.40 1.19
Q37C2: Percent meet > 3 hours for second class 340 30.94 2.97 2.51 1.41 1.19
Q39: Percent with web site for instruction 550 42.88 3.47 2.12 2.68 1.64
Q62A: Percent not “very satisfied” workload 550 60.64 2.97 2.09 2.02 1.42
Q64: Percent retired from another position 550 10.47 1.61 1.31 1.52 1.23
Q68: Percent paid by the course 160 43.14 7.04 3.95 3.17 1.78
Q77: Percent marital status single 550 11.98 1.96 1.39 2.00 1.41
Q77: Percent marital status married 550 65.51 2.82 2.03 1.92 1.39
Q81: Percent United States citizen 550 96.35 0.93 0.80 1.33 1.16

1 Numbers rounded to the nearest 10.
2 Black includes African American; Asian/Pacific Islander includes Native Hawaiian; Hispanic includes Latino; and Other includes American Indian/Alaska Native and those who selected more than one race. Race categories exclude Hispanic origin unless specified.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table M-3. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by gender

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Gender: Male
Q1: Percent with instructional duties 14,600 96.91 0.19 0.14 1.80 1.34
Q2: Percent with some credit instruction 14,600 91.12 0.35 0.24 2.26 1.50
Q3: Percent who had faculty status 14,600 93.46 0.30 0.20 2.09 1.45
Q4: Percent whose principal activity was research 14,600 11.11 0.28 0.26 1.13 1.06
Q4: Percent whose principal activity was teaching 14,600 72.48 0.40 0.37 1.15 1.07
Q6: Percent part-time is primary employment 4,150 26.10 0.94 0.68 1.91 1.38
Q8: Percent part-time preferred full-time 4,150 34.95 0.69 0.74 0.87 0.93
Q10: Percent with academic rank of professor 14,600 23.66 0.43 0.35 1.52 1.23
Q12: Percent with tenure 14,600 33.61 0.52 0.39 1.78 1.33
Q15: Percent nonunion union not available 11,820 69.35 0.63 0.42 2.24 1.50
Q19A1: Percent with other job that is full-time 14,600 22.85 0.43 0.35 1.52 1.23
Q35A1: Percent teaching a single credit class 14,600 27.57 0.44 0.37 1.45 1.20
Q37C2: Percent meet > 3 hours for second class 8,720 31.78 0.78 0.50 2.42 1.56
Q37F1: Percent with no TA in first class 12,200 82.12 0.34 0.35 0.98 0.99
Q39: Percent with web site for instruction 14,600 40.84 0.52 0.41 1.65 1.29
Q62A: Percent not “very satisfied” workload 14,600 57.12 0.48 0.41 1.38 1.17
Q64: Percent retired from another position 14,600 13.31 0.39 0.28 1.95 1.39
Q68: Percent paid by the course 3,420 39.11 1.18 0.84 2.00 1.41
Q77: Percent marital status married 14,600 78.60 0.44 0.34 1.70 1.30
Q77: Percent marital status single 14,600 10.11 0.30 0.25 1.45 1.20
Q81: Percent United States citizen 14,600 92.62 0.25 0.22 1.32 1.15

Gender: Female
Q1: Percent with instructional duties 11,510 97.09 0.15 0.16 0.95 0.97
Q2: Percent with some credit instruction 11,510 88.96 0.47 0.29 2.54 1.59
Q3: Percent who had faculty status 11,510 90.43 0.40 0.27 2.08 1.44
Q4: Percent whose principal activity was research 11,510 5.71 0.20 0.22 0.90 0.95
Q4: Percent whose principal activity was teaching 11,510 75.48 0.51 0.40 1.62 1.27
Q6: Percent part-time is primary employment 4,210 42.82 1.01 0.76 1.76 1.33
Q8: Percent part-time preferred full-time 4,210 34.54 0.92 0.73 1.57 1.25
Q10: Percent with academic rank of professor 11,510 10.28 0.37 0.28 1.67 1.29
Q12: Percent with tenure 11,510 20.52 0.45 0.38 1.41 1.19
Q15: Percent nonunion union not available 9,070 68.43 0.75 0.49 2.35 1.53
Q19A1: Percent with other job that is full-time 11,510 19.04 0.56 0.37 2.37 1.54
Q35A1: Percent teaching a single credit class 11,510 26.62 0.53 0.41 1.63 1.28
Q37C2: Percent meet > 3 hours for second class 6,560 31.08 0.87 0.57 2.32 1.52
Q37F1: Percent with no TA in first class 9,260 88.90 0.45 0.33 1.86 1.36
Q39: Percent with web site for instruction 11,510 38.29 0.66 0.45 2.10 1.45
Q62A: Percent not “very satisfied” workload 11,510 58.24 0.51 0.46 1.25 1.12
Q64: Percent retired from another position 11,510 9.44 0.35 0.27 1.61 1.27
Q68: Percent paid by the course 3,330 34.33 1.19 0.82 2.10 1.45
Q77: Percent marital status married 11,510 65.90 0.56 0.44 1.62 1.27
Q77: Percent marital status single 11,510 13.43 0.36 0.32 1.30 1.14
Q81: Percent United States citizen 11,510 95.11 0.28 0.20 1.89 1.38

1 Numbers rounded to the nearest 10.

NOTE: Detail may not sum to totals because of rounding. SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).


Table M-4. Design effects (DEFFs) and root design effects (DEFT) for faculty statistics, by employment status

Item  Number1  Estimate  Design SE  SRS SE  DEFF  DEFT

Employment status: Full-time
Q1: Percent with instructional duties 17,750 96.24 0.19 0.14 1.72 1.31
Q2: Percent with some credit instruction 17,750 90.91 0.26 0.22 1.50 1.22
Q3: Percent who had faculty status 17,750 96.29 0.25 0.14 3.20 1.79
Q4: Percent whose principal activity was research 17,750 14.33 0.27 0.26 1.06 1.03
Q4: Percent whose principal activity was teaching 17,750 62.38 0.46 0.36 1.57 1.25
Q10: Percent with academic rank of professor 17,750 28.52 0.53 0.34 2.48 1.58
Q12: Percent with tenure 17,750 47.51 0.66 0.37 3.09 1.76
Q15: Percent nonunion union not available 13,830 78.71 0.58 0.35 2.76 1.66
Q19A1: Percent with other job that is full-time 17,750 1.85 0.12 0.10 1.41 1.19
Q35A1: Percent teaching a single credit class 17,750 15.84 0.29 0.27 1.13 1.06
Q37C2: Percent meet > 3 hours for second class 11,940 29.01 0.59 0.42 2.02 1.42
Q37F1: Percent with no TA in first class 14,640 79.52 0.45 0.33 1.80 1.34
Q39: Percent with web site for instruction 17,750 48.89 0.48 0.38 1.64 1.28
Q62A: Percent not “very satisfied” workload 17,750 67.73 0.44 0.35 1.60 1.26
Q64: Percent retired from another position 17,750 4.93 0.19 0.16 1.37 1.17
Q68: Percent paid by the course 590 12.41 1.75 1.36 1.67 1.29
Q77: Percent marital status married 17,750 73.71 0.41 0.33 1.56 1.25
Q77: Percent marital status single 17,750 11.41 0.29 0.24 1.51 1.23
Q81: Percent United States citizen 17,750 91.17 0.26 0.21 1.52 1.23

Employment status: Part-time
Q1: Percent with instructional duties 8,360 97.94 0.21 0.16 1.85 1.36
Q2: Percent with some credit instruction 8,360 89.28 0.59 0.34 3.08 1.76
Q3: Percent who had faculty status 8,360 86.88 0.52 0.37 1.95 1.40
Q4: Percent whose principal activity was research 8,360 1.71 0.19 0.14 1.75 1.32
Q4: Percent whose principal activity was teaching 8,360 88.39 0.49 0.35 1.97 1.40
Q6: Percent part-time is primary employment 8,360 34.12 0.62 0.52 1.43 1.20
Q8: Percent part-time preferred full-time 8,360 34.75 0.58 0.52 1.22 1.11
Q10: Percent with academic rank of professor 8,360 4.39 0.30 0.22 1.76 1.33
Q12: Percent with tenure 8,360 3.00 0.23 0.19 1.57 1.25
Q15: Percent nonunion union not available 7,050 57.52 0.78 0.59 1.76 1.33
Q19A1: Percent with other job that is full-time 8,360 46.15 0.71 0.55 1.68 1.29
Q35A1: Percent teaching a single credit class 8,360 41.73 0.65 0.54 1.46 1.21
Q37C2: Percent meet > 3 hours for second class 3,340 36.93 1.27 0.83 2.30 1.52
Q37F1: Percent with no TA in first class 6,810 92.00 0.42 0.33 1.67 1.29
Q39: Percent with web site for instruction 8,360 28.00 0.75 0.49 2.34 1.53
Q62A: Percent not “very satisfied” workload 8,360 44.56 0.59 0.54 1.17 1.08
Q64: Percent retired from another position 8,360 20.33 0.57 0.44 1.68 1.30
Q68: Percent paid by the course 6,150 38.52 1.02 0.62 2.72 1.65
Q77: Percent marital status married 8,360 72.54 0.68 0.49 1.95 1.40
Q77: Percent marital status single 8,360 11.66 0.51 0.35 2.13 1.46
Q81: Percent United States citizen 8,360 96.91 0.24 0.19 1.60 1.27

NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).
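A common use of these tables is to approximate a design-based standard error when only simple random sample formulas are at hand, by inflating the SRS standard error by the tabled DEFT. The sketch below is illustrative only and is not part of the NSOPF:04 estimation procedures; the function names are hypothetical, and it assumes the SRS standard error of a percentage follows the usual sqrt(p(100 - p)/n) formula, which closely reproduces the SRS column for the row used in the example.

import math

def srs_se_percent(p, n):
    # SRS standard error of a percentage p (0-100) based on n cases.
    return math.sqrt(p * (100.0 - p) / n)

def deft_adjusted_ci(p, n, deft, z=1.96):
    # Approximate 95 percent confidence interval that inflates the SRS
    # standard error by a DEFT taken from Tables M-1 through M-4.
    se = deft * srs_se_percent(p, n)
    return p - z * se, p + z * se

# Example values from the full-time Q12 (percent with tenure) row of Table M-4:
# estimate 47.51, number about 17,750, DEFT 1.76. The inflated standard error
# (about 0.66) matches the tabled design standard error for that row.
low, high = deft_adjusted_ci(47.51, 17750, 1.76)
print(f"approximate 95% CI: {low:.2f} to {high:.2f}")

Under this approximation the interval is roughly 46.2 to 48.8 percent; using the unadjusted SRS standard error alone would understate its width.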
