

Human Resource Management Review 16 (2006) 324–339
www.socscinet.com/bam/humres

The Presidential Management Fellows Program: Lessons learned during 27 years of program success☆

Bernard J. Nickels a,⁎, J. Patrick Sharpe b, Kim Bauhs b, Anne Holloway-Lundy b

a U.S. Office of Personnel Management, Richard B. Russell Federal Building, 75 Spring St, SW, Atlanta, GA 30303, United States
b U.S. Office of Personnel Management, 1900 E Street, NW, Washington, DC 20415, United States

Abstract

The Presidential Management Fellows (PMF) Program represents one of many examples of the Federal Government's ability to develop and implement long-term, wide-ranging human resources initiatives. Originally called the Presidential Management Intern Program, this program has been in existence for approximately 25 years with over 3500 alumni. Annually, PMF represents a major undertaking for the U.S. Office of Personnel Management (OPM), which closely coordinates with many other Federal agencies that assist in the assessment center portion of the selection process. PMF is intended to attract and select graduate-level students to the Federal Government. It uses a multiple-hurdle selection strategy that includes a nomination process, an accomplishment record, and an assessment center. The program is wide-reaching in that it addresses many aspects of the human resources management process, including recruitment, selection, career development and training, and succession planning. This article presents a brief overview of the evolution of the PMF Program and its legislative history, a description of Program recruitment and branding efforts, the selection process, and career development initiatives associated with the PMF Program. The logistical and operational realities of such a wide-scale decentralized selection process are addressed with an emphasis on program successes, lessons learned, and improvement opportunities. Several recent efforts to assess program effectiveness are described.
© 2006 Elsevier Inc. All rights reserved.

Keywords: Presidential Management Fellows Program; Federal Government; U.S. Office of Personnel Management

The Presidential Management Fellows (PMF) Program, originally called the Presidential Management Intern (PMI) Program, was established by Executive Order in 1977 to attract to the Federal service outstanding graduate students from a wide variety of academic disciplines who demonstrate an exceptional ability for, as well as a clear interest in and commitment to, leadership in the analysis and management of public policies and programs. The PMF Program has been attracting outstanding master's, JD, and doctoral-level students to the Federal service from diverse social and cultural backgrounds. The Program provides a continuing source of trained leaders to meet the future challenges of public service. PMF assignments may involve a wide range of topics and responsibilities, including domestic or international issues, technology, science, criminal justice, health, financial management, and many other fields in support of public service programs.

☆ The opinions expressed herein are solely those of the authors and do not necessarily reflect the opinions of the U.S. Office of Personnel Management or of any agency of the U.S. Government.
⁎ Corresponding author.
E-mail address: [email protected] (B.J. Nickels).

1053-4822/$ - see front matter © 2006 Elsevier Inc. All rights reserved.
doi:10.1016/j.hrmr.2006.05.005


Since the inception of the original PMI Program in 1977, over 3500 alumni have served in over 50 Federal agencies. Many are now high-ranking Federal officials who are changing policies and directing important programs. Because of the excellent reputation of these outstanding men and women, many Federal agencies are making the PMF Program a cornerstone of their succession planning, given the need to replace senior leadership as the Federal workforce ages.

1. Legislative history of the PMF program

1.1. Executive orders 12008, 12364, 12645, and 13318

In August 1977, President Carter established the Presidential Management Intern (PMI) Program through Executive Order 12008 to attract to the Federal service up to 500 men and women of exceptional management potential who had received special training in planning and managing public programs and policies. Outstanding individuals who had recently received, or were shortly to receive, an advanced degree oriented toward public management were eligible to apply for the Program. Upon selection, candidates were appointed to positions under Schedule A of the excepted service (a Federal hiring authority) for up to 2 years and were assigned responsibilities consistent with the public management purpose of the Program. Although interns were not assured further Federal employment at the end of the two-year period, they could be granted competitive civil service status if they satisfactorily completed their internships as prescribed by the Civil Service Commission, the predecessor to the Office of Personnel Management (OPM).

President Reagan issued two Executive Orders to modify the PMI Program. The first, Executive Order 12364 (issued in May 1982), expanded the Program's target group to include men and women from a variety of academic disciplines: not just those who had received special training in planning and managing public programs and policies, but also those who had a clear interest in, and commitment to, a career in the analysis and management of public policies and programs. It also limited the number of new interns to no more than 200 in any fiscal year. Executive Order 12645 (issued in July 1988) increased the maximum number of new interns to 400 per year.

President George W. Bush sought to expand and modernize the Program. Executive Order 13318 renamed the PMI Program the "Presidential Management Fellows Program" to better reflect its high standards, rigor, and prestige. This Executive Order fundamentally transformed the Program by eliminating the cap of 400 participants and expanding participation to all executive branch agencies, making it more competitive and strengthening its standards. The goal was to align the Program more strategically with today's workforce needs and to expand leadership development programs to address the Federal Government's human capital needs.

Since the inception of the original PMI Program in 1977, over 3500 alumni have committed to working for America in the Federal service. Many of these "best and brightest" leaders continue to serve in senior management positions throughout the Federal Government.

2. PMF program cycle

The PMF Program follows an annual cycle, starting with application and nomination, followed by assessment and appointment, through training and development, then graduation and conversion to permanent positions. A typical timeline for the PMF cycle can be found in Table 1, although the exact dates are subject to change annually.

2.1. Application, nomination, and selection

The PMF Program uses an online application system. Students interested in being considered for the PMF Program are required to submit to OPM an electronic application which includes an accomplishment record (Schmidt et al., 1979; Hough, 1984). Students submit applications through OPM's website before being nominated by their schools. The applicant must include the name and e-mail address of the PMF school nominating official. OPM contacts the nominating official directly with further instructions for submitting official nominations to OPM. The application cycle is described in detail each year on the PMF website.

After applying to the PMF Program, students must be nominated by their school's Nomination Official (dean, chairperson, or program director, or their designee, such as a nomination coordinator). Each school conducts a competitive screening process to evaluate its PMF applicants based on several eligibility criteria. Nomination criteria typically include demonstration of (1) breadth and quality of accomplishments, (2) capacity for leadership, and (3) commitment to a career in the analysis and management of public policies and programs. Eligibility requirements mandate that nominated students complete a graduate degree (master's or doctoral-level degree) from a formally accredited academic institution during the year in which they are nominated for the program. School officials use an online process to make their nominations.

Table 1
Typical PMF implementation schedule

Target dates                   Activity

PMF selection
September 1 to October 15      Typical open application period
October 16 to October 31       Typical nomination period of applicants by academia
November 1 to 30               OPM conducts initial assessment
By December 15                 OPM invites semi-finalists to an assessment center
January 1 to February 28       OPM conducts final assessment at centers nationwide
By March 15                    OPM announces the selection of PMF finalists
By October 1                   OPM determines the annual number of Fellows that may be appointed based on input from agency Chief Human Capital Officers and others
January to March               Agencies post PMF jobs in the Projected Positions System
By March 15                    OPM makes lists of PMF finalists and their resumes available to agencies
Mid-March to mid-April         OPM hosts the annual job fair for agencies and PMF finalists
Until mid-March next year      Agencies appoint selected finalists as Fellows

PMF career development opportunities
Upon appointment               Agencies approve Individual Development Plans (IDPs) designed to impart the competencies required of the target position
Fall and winter                OPM hosts the PMF Orientation and Training Program (sessions typically held in October, November, January, and March)
Years 1 and 2                  Agency provides opportunities for training, developmental assignments, rotations, and other activities in support of the Fellow's IDP
30 days prior to conversion    Agency Executive Resources Board evaluates the Fellow to determine successful completion of the Program no later than 30 calendar days before expiration of the appointment
By mid-December (year 2)       OPM hosts the PMF Graduation and Training Program

OPM selects PMF candidates using a two-phase structured assessment process. The first phase of this process is a review and evaluation by OPM assessors of each candidate's accomplishment record responses. Based on these evaluations, OPM designates semi-finalists, who are then invited to an in-person assessment center. OPM administers the assessment center concurrently at various sites throughout the United States. Candidates who participate in the assessment center complete individual, group, and written exercises. Each candidate's performance is evaluated by OPM-trained panels of assessors, and OPM designates PMF finalists. Each year, OPM seeks agency volunteers to serve as panel members for the assessment centers. Agency PMF Coordinators, human resources staff, or hiring officials are encouraged to participate. Upon completion of the assessment process, agencies have 12 months to hire Fellows from a large pool of finalists with diverse backgrounds and skills.

2.2. PMF career development opportunities

The PMF Program emphasizes leadership development by affording PMFs a myriad of challenging career opportunities. The PMF Program Office facilitates and provides structured orientation and graduation training programs for the PMF Program. Agencies provide at least 80 h of formal training a year, including training in core competencies targeted to a functional area into which their Fellows will most likely be converted. The Federal agencies also arrange for on-the-job training and other developmental opportunities such as seminars, briefings, and conferences. Moreover, agencies provide PMFs with at least one rotational assignment that will give the Fellows a broader perspective on the Federal Government. According to the program objectives, it is the joint responsibility of both Fellows and agency supervisors to negotiate the developmental activities and work responsibilities that will prepare the Fellow for a targeted position at the conclusion of the Program.


3. Logistical and operational issues

PMF selection occurs annually through a complex process involving multiple test sites, multiple Federal agencies, and several thousand applicants. From beginning to end, the selection process takes approximately 7 months. The logistical and operational issues associated with implementing such a large-scale process are discussed in the present section.

3.1. Logistics

In terms of logistics, the PMF selection process can be divided into several important phases: (1) Application, (2) School Nomination Process, (3) Accomplishment Record, (4) Assessment Center, (5) Job Fair and Placement, and (6) Review and Preparation for the Next Cycle. Several groups within OPM work together to manage and implement the selection process successfully each year. The PMF Program Office is the prime manager of all aspects of the program, including the selection process. A group of personnel psychologists with expertise in industrial/organizational psychology both developed the selection assessments and serve as consultants throughout the process. A group of human resource (HR) specialists provides the vast majority of assessors used in the various stages of selection. The HR specialist group also manages the assessment center sites located throughout the country. Finally, a group of information technology (IT) specialists provides technical support throughout the process, including maintenance of online application data and assessment data from various portions of the selection process. Each of the six phases of the process is discussed in more detail below. The leadership competencies assessed in various stages of the selection process are provided in Table 2.

3.1.1. Application
The PMF application is completed through an online system that requires applicants to respond to several demographic questions and to submit three accomplishment essays for the accomplishment record portion of the assessment process (discussed below). Students submit applications directly to OPM prior to the school nomination process. This allows OPM to track the number of applicants nominated and not nominated. The application is available through the PMF website beginning on September 1 of each year and remains posted through the middle of October. Applicant information is maintained by OPM's IT group, which is also responsible for all correspondence with applicants throughout the selection process.

3.1.2. School nomination process
The PMF Program is unique in that entities external to the Federal Government are responsible for a portion of the candidate selection process. More specifically, colleges and universities across the country are provided with guidelines for nominating students to be considered for the program. This makes communication with participating schools critical. To facilitate this communication, schools are asked to supply the name of a nominating official, who then becomes the point-of-contact for nomination decisions for all students from that institution. The PMF website contains a special section for inputting the final nomination decisions for each applicant from the program. Schools have the last 2 weeks of October to submit final nomination decisions for their applicants. Students who were not nominated are informed of the decision by the schools and by OPM. Those who are nominated receive further consideration by OPM in the next phase of the process—the accomplishment record.

Table 2
Current PMF competency dimension by assessment matrix

Competency dimension                    School      Accomplishment  Written        Assessment center        Assessment center
                                        nomination  record          demonstration  individual presentation  group discussion

Interest in government service          ✓           –               –              –                        –
Breadth and quality of accomplishments  ✓           –               –              –                        –
Problem solving                         –           ✓               –              ✓                        ✓
Resilience                              –           ✓               –              –                        –
Adaptability                            –           –               –              ✓                        ✓
Oral communication                      –           –               –              ✓                        ✓
Demonstrated leadership                 –           –               –              –                        ✓
Interpersonal skills                    –           ✓               –              –                        ✓
Written expression                      –           –               ✓              –                        –

3.1.3. Accomplishment record
As part of the online application, applicants give three examples of accomplishments that demonstrate three basic leadership competencies—Problem Solving, Interpersonal Skills, and Resilience. Trained assessors at OPM, drawn primarily from the HR specialist group, evaluate the accomplishment record responses during the month of November.

OPM personnel psychologists train the assessors and coordinate the accomplishment record process. All assessors must be trained each year regardless of past experience as an assessor. During the training, assessors receive guidelines for scoring, which include benchmark accomplishment examples that serve as anchors for five rating points. Assessors also complete practice ratings of accomplishments to make sure that they understand all rating procedures and apply benchmark examples correctly.

Assessors, located nationwide, have access to the accomplishment essays through a secure website on OPM's intranet. In teams of two, the assessors rate accomplishments related to one competency, with a total of 15 assessor teams completing the evaluation process within a one-month timeframe. The assessor teams rate the accomplishments individually on a scale of 1 ("Poor") to 5 ("Outstanding"). To encourage consistency in using the rating scales, assessor teams must review and compare their first 30 accomplishment ratings. Any discrepancy of two or more points on the rating scale for the first 30 accomplishments is discussed, and assessors revise their ratings accordingly. OPM personnel psychologists monitor assessor ratings throughout the process to ensure general consistency and to follow up when problems arise.
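To make the calibration step concrete, the following is a minimal sketch of flagging first-30 rating pairs that differ by two or more points. It is our illustration, not OPM's tool; the function name and data layout are hypothetical.

    def calibration_flags(team_ratings, window=30, threshold=2):
        """Return the indices of the first `window` accomplishments whose two
        assessor ratings (1-5 scale) differ by `threshold` or more points.
        Illustrative sketch of the consistency check described above."""
        return [i for i, (r1, r2) in enumerate(team_ratings[:window])
                if abs(r1 - r2) >= threshold]

    # Example: the third pair differs by 3 points and is flagged for discussion.
    print(calibration_flags([(4, 4), (3, 4), (2, 5)]))  # -> [2]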

Assessor accomplishment ratings are averaged to produce a final accomplishment score for each competency. The three competency scores are summed to produce a total accomplishment score for each applicant. A top-down approach is used for selecting semi-finalists for inclusion in the assessment center phase. The PMF Program Office notifies all nominated applicants of their status via e-mail by early December to allow semi-finalists adequate time to make travel arrangements for the next phase in the process—the assessment center.
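The scoring arithmetic just described reduces to a few lines of code. A minimal sketch under stated assumptions (two assessor ratings per competency, three competencies per applicant; the array layout and names are ours, not OPM's):

    import numpy as np

    def total_accomplishment_scores(ratings):
        """ratings: array of shape (n_applicants, 3, 2) holding the two
        assessors' 1-5 ratings on each of the three competencies.
        Average within each competency, then sum across competencies."""
        return ratings.mean(axis=2).sum(axis=1)

    def top_down_cut(totals, n):
        """Top-down selection: indices of the n highest total scores."""
        return np.argsort(totals)[::-1][:n]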

3.1.4. Assessment center
Semi-finalists participate in a one-day assessment center held during January and early February in 1 of 15 sites across the nation. The assessment center consists of three exercises: (1) an individual presentation, (2) a group exercise, and (3) a writing exercise. The exercises measure leadership competencies, including Problem Solving, Oral Communication, Adaptability, Interpersonal and Team Skills, Demonstrated Leadership, and Written Expression.

Each assessment center site has a designated coordinator to handle logistics, scheduling, and tracking of rating forms and other documentation associated with the assessment center. The coordinators also enter all candidate scores into a secure database created especially for the PMF assessment center.

Candidate responses to the assessment center exercises are evaluated by panels of three trained assessors—one assessor from OPM and two from other Federal agencies. Similar to the accomplishment record process, OPM assessors are predominantly drawn from the HR specialist group. The assessors from other agencies serve on a voluntary basis. All assessors are required to participate in formal training, facilitated by OPM personnel psychologists, with an emphasis on practice ratings.

The assessors evaluate the individual presentations and group exercises in person on the day of assessment. In the individual presentation, candidates review information on a current issue involving Federal Government legislation, regulations, or judicial findings, and then they make a brief presentation to the assessor panel stating their recommendations on the issue. In the group exercise, several candidates discuss the same issue addressed in the individual presentation, but the group must reach consensus on how to address the issue and present the recommendation to the assessor panel. The assessors observe the entire group discussion, including the presentation.

Assessors score the individual and group exercises immediately after observing candidate performance. The assessors rate candidates on the leadership competencies using five-point rating scales with behavioral anchors. In the individual presentation, the three assessors rate each candidate independently on multiple competencies and then come to consensus on a final score for each competency. For the group exercise, different combinations of two assessors rate each candidate on multiple competencies and reach consensus on final competency scores. OPM retains completed rating and note-taking forms from each assessor for record-keeping purposes and any potential future reference.

In addition to the live exercises, candidates complete a three-page written exercise on a specific topic. These exercises are sent to a central location to be distributed to and rated by a different set of trained assessors (OPM HR specialists). A team of two assessors evaluates each essay and must reach consensus on a final score. As with the live exercises, assessors use a five-point scale to evaluate candidates on the writing exercise.

Once the site coordinator enters scores for the written and live exercises, IT specialists check the data prior to sending it to OPM personnel psychologists. The personnel psychologists also check the data and work with the IT group to develop competency scores for each individual using a pre-determined weighting protocol. The six competency dimension scores are combined into a total score, and a top-down approach is used to select the finalists. The number of finalists is driven by agency demand for PMFs. In the past few years the demand for PMFs has been around 400; however, given attrition and other factors, OPM typically selects between 650 and 700 finalists for placement. All assessment center participants are notified of the results via e-mail, and feedback on assessment center performance is provided upon request.
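As a sketch of this scoring step: the actual pre-determined weighting protocol is not published in the article, so the weights below are placeholders; only the overall flow (weight the six dimension scores, sum, and make a top-down cut) follows the description above.

    import numpy as np

    DIMENSIONS = ["problem_solving", "oral_communication", "adaptability",
                  "interpersonal_team_skills", "demonstrated_leadership",
                  "written_expression"]
    WEIGHTS = np.ones(6)  # placeholder equal weights; OPM's protocol is not published

    def finalist_cut(scores, n_finalists):
        """scores: array of shape (n_candidates, 6) of consensus dimension
        scores, columns ordered as in DIMENSIONS. Returns the indices of the
        top n_finalists candidates by weighted total score."""
        totals = scores @ WEIGHTS
        return np.argsort(totals)[::-1][:n_finalists]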

3.1.5. Job fair and placement
Selection as a PMF finalist does not guarantee a Federal job; finalists must still be placed in a Federal agency after selection. Finalists take an active role in the placement process and follow a process similar to the one they would follow if competing in the open Federal job market. Factors that influence placement include:

• Personal preference for a specific agency, position, or program;
• Agency participation in the PMF Program;
• Desired and available geographic locations;
• Special expertise or skills required by an agency.

Finalists can approach agencies on their own; however, OPM hosts an annual job fair in late March/early April to assist in matching finalists with agencies. At the job fair, representatives from Federal agencies provide finalists with information, answer questions, and in some cases conduct interviews. The PMF Program Office tracks all PMFs placed with Federal agencies to ensure that they meet the requirements of the two-year program.

3.1.6. Review and preparation for the next cycle
Each year after the PMF selection process (i.e., after the job fair), OPM conducts internal meetings for the purpose of reviewing the just-completed process and planning for the next assessment cycle. Representatives from the PMF Program Office, the personnel psychologist group, the HR specialist group, and the IT group meet to address problems that emerged in the previous cycle and to make suggestions for improvement.

3.2. Operational challenges

The PMF selection process involves large-scale assessment in multiple locations across the country and thus presents OPM with a number of operational challenges that must be met to ensure success. Four of the biggest challenges are described in this section.

3.2.1. Management of nationwide test sites
The management of multiple test sites across the nation adds complexity to the selection process and presents a number of challenges. PMF test sites have been established in many regions of the United States, mostly near major urban areas. Because semi-finalists must travel to assessment center sites at their own expense, OPM must maintain a balance between convenience for candidates and the cost of additional test sites. In most cases, the test sites are set up in OPM or other Government facilities, but they are also set up in hotels and local colleges. The Washington, DC test site attracts by far the most candidates.

Scheduling of candidates at multiple sites is challenging because of candidates' rescheduling needs, candidate no-shows, and requests for changes in test sites. A central schedule is maintained in the PMF computerized database to facilitate schedule and test location changes. The assessment center is set up to handle 10 candidates per day. Some of the larger test sites must schedule multiple groups of 10 candidates per day to run all scheduled candidates through the assessment center within the six-week timeframe.

Another challenge is ensuring that all sites have the required resources to run the assessment center effectively and efficiently. To meet this challenge, a lead site coordinator is appointed with responsibility for making sure that all sites have the appropriate materials, space, and equipment. Site coordinators attend annual training to learn about issues that have occurred in the past cycle and about any new procedures for the upcoming cycle. A site coordinator handbook provides coordinators with all of the information they need to know about setting up and running the assessment center.

A final challenge is maintaining standardization of the assessment process. Candidates who participate in the assessment center in Washington, DC should have the same basic experiences as candidates who go through the assessment center in San Antonio or Boston. As discussed earlier, all assessors involved in the process must attend mandatory training on an annual basis before the assessment center begins nationwide. The training is specifically designed to foster standardization across test locations. Incidents that represent threats to standardization are documented each year, and steps are taken to address each threat and communicate it to all site coordinators and assessors.

3.2.2. Coordination with schools
Since colleges and universities play an important role in the selection of PMFs, efficient coordination with schools is critical to the overall success of the Program. The schools have a keen interest in the PMF Program's success, and many academic programs (particularly public policy and public administration) use their past record of success in placing PMFs for recruiting purposes. Some schools have developed extremely rigorous nomination procedures to identify the best-qualified students. At the other extreme, some schools are not even aware of the Program until a student from their institution applies. Because schools must make the official nominations, OPM reaches out to academic institutions to heighten awareness of the program and facilitate the nomination process. Even with this outreach, OPM spends much time each year following up with schools about nomination decisions.

3.2.3. Coordination with Federal agencies
One aim of the PMF Program is to identify the leaders for tomorrow's Federal Government. While OPM's main role in this process is to select individuals with the basic skills to be effective leaders, the Federal agencies that place PMF finalists are responsible for developing the leadership skills of the PMFs. The PMF Program Office coordinates with agencies to monitor the status of Fellows and to confirm that they are meeting basic requirements. Agencies pay a flat fee for placement of a PMF, which covers costs for running the program, including the selection process. OPM cannot monitor PMFs directly due to staffing and resource restrictions, and therefore relies on agencies to provide PMFs with appropriate developmental experiences. Agencies dedicate varying amounts of resources to the PMF Program, with some having formal PMF-related activities as well as PMF coordinators responsible for agency involvement in the Program.

In addition to coordinating on placement and development, agencies provide OPM with volunteers for the assessment center. In the Washington site alone, 50 to 60 agency assessors are needed to evaluate candidates. In other locations, agency assessors are recruited from field offices in close proximity to the test sites. A large degree of coordination is necessary to schedule these agency assessors for training and for service on the assessment center panels.

3.2.4. Communication and coordination within OPM
Perhaps the most critical challenge OPM faces in implementing the PMF selection process is internal communication and coordination. As mentioned earlier, four groups within OPM work closely to implement the selection process each year: (1) the PMF Program Office, (2) personnel psychologists, (3) the HR specialist group, and (4) the IT group. Given the high visibility of the Program, the OPM Director often reviews and approves various activities associated with program operations. In some cases, OPM's legal office and HR policy office also review various aspects of the Program.

3.3. Program successes and lessons learned

The PMF Program has an excellent reputation in the Federal community for providing agencies with high-quality, high-potential employees for 27 years. This section discusses four broad operational issues that highlight both successes and lessons learned.


3.3.1. Managing a program with Governmentwide impact
A clear success of the PMF Program is its Governmentwide impact in helping agencies identify high-quality employees with leadership potential. The Program is an example of the Federal Government's ability to pool resources for a common purpose and address the hiring needs of multiple agencies at the same time. In recent years, OPM has viewed the PMF Program as one option for agencies to consider in addressing the growing concern over leadership succession planning. In the PMF selection process, an emphasis is placed on the core competencies that provide a foundation for success as a leader (e.g., problem solving, writing, and interpersonal skills).

3.3.2. The prescreen dilemma
Sometimes, however, success breeds problems. One issue that OPM has faced over the past few years is an explosion in the number of applicants to the Program. In 1996, when the current assessment center was first implemented, the number of applicants was fewer than 500. Applicant numbers have increased dramatically in the past few years to well over 3000. Once the number of applicants surpassed the maximum number the assessment centers could handle efficiently (i.e., 1200), OPM personnel psychologists worked with the PMF Program Office to develop a prescreen assessment to reduce the number of candidates evaluated in the assessment center and increase the efficiency of the process.

The accomplishment record was selected as a viable, competency-based prescreen. In 1996, OPM piloted a self-report competency-based rating schedule as a potential prescreening tool. However, because the tool lacked face validity, applicants reacted negatively to the assessment. To remedy this, the accomplishment record was used beginning in 2003 to allow candidates to describe their own accomplishments in relation to the target competencies. This assessment approach has excellent face validity, and OPM has received little negative feedback about it.

Because the applicant numbers have continued to increase (currently over 3000 applicants), OPM now faces a new dilemma. The volume of applicants is growing beyond the expected efficiency of even the accomplishment record approach. This has caused OPM to examine other options for prescreening, with online assessment of some sort as a leading possibility.

3.3.3. Assessment gaming
The level of prestige associated with placement as a PMF increases the motivation of applicants to do well and of schools to have high placement rates. As a result, many academic programs provide their students with formal training and preparation for the PMF assessment center. Students who have participated in past assessment centers sometimes provide their school with information about the exercise content and instructions. To address this issue, OPM changes the specific content of the exercises on an annual basis and, in some cases, the exercise procedures as well. The biggest concern with assessment gaming is the imbalance that exists between schools that formally prepare students for the assessment center and those that do not. This disadvantages students who do not have the benefit of such preparation. OPM has instituted some procedures to partially counteract this imbalance, including not permitting students to discuss specifics about their school or academic program.

3.3.4. Leveraging technology
When feasible, OPM has attempted to leverage technology to facilitate assessment and selection of PMF finalists. Due to resource limitations and other barriers, it typically takes a few years to incorporate new procedures that take advantage of technological advances. Some of the recent advances implemented by the PMF Program Office are an online application; an automated applicant tracking, scheduling, and scoring system; and an automated assessor interface for the accomplishment record. The IT group was instrumental in developing and implementing these new tools.

3.3.5. Unmet opportunities—formal program evaluation
Although OPM commits a large amount of resources to the PMF Program, some unmet opportunities exist. One of the more pressing concerns facing the PMF Program Office is the lack of resources to formally evaluate the Program and the selection process on an ongoing basis. The Program Office and other stakeholder groups within OPM certainly make efforts to evaluate pieces of the Program. However, such evaluation is typically conducted as resources allow and when crises occur.

The U.S. Merit Systems Protection Board (MSPB) recently conducted an evaluation of the Program and found both positive and negative results in a number of areas (U.S. Merit Systems Protection Board, 2001). At the time of the MSPB study, the Program was still known as the Presidential Management Intern (PMI) Program. On the positive side, the study found that PMIs advance into management ranks at a higher rate than other comparable employees who have graduate degrees and entered the Federal service at a comparable grade level. In addition, 76% of supervisors responding to an MSPB survey indicated that the quality of the PMIs they hired in the last 3 years was better than the quality of hires for similar openings using other methods. Supervisors also rated their PMIs very favorably with regard to overall job performance and in specific areas including analytical ability, writing ability, leadership ability, and knowledge of public policies and programs. The MSPB report also indicated that the turnover rates for PMIs, often thought to be higher, were about the same as for comparable employees. The MSPB report pointed to areas for improvement for the Program, including a need to more clearly define its purpose and to ensure that the developmental components of the Program deliver results for participating individuals and agencies. MSPB also called for a formal validation study to evaluate the selection process.

More formal evaluation would enable OPM to better address some of the areas of concern discussed in the MSPB study. Some examples of research studies conducted by OPM for the purpose of evaluating the selection process are provided in the next section.

4. Research studies

Despite the very long and primarily successful history of this Program, it continues to face some operational and methodological challenges. One of the greatest obstacles faced by scientist/practitioners is the continual challenge of balancing good science with the organizational realities that come with limited resources (e.g., budget, time, people). Typically, there are insufficient resources available to accomplish all project objectives and, in this context, we are often faced with very difficult choices regarding program design and implementation. This is particularly true when it comes to the resources required to conduct a full-scale criterion-related validation study. When resources simply are not available, scientist/practitioners are often required to find creative (and often partial) solutions to this problem. Such is the case with the PMF Program: to date, there has yet to be a full-scale, systematic, criterion-related validation study.

Although resources are often limited, there have been several recent efforts to assess program effectiveness. These efforts include an interrater reliability study, an analysis of the nomination process, and an analysis of the accomplishment record. Although these are all relatively small-scale studies, collectively they represent our ongoing efforts to marshal validation evidence in the absence of the resources required to conduct a full-scale empirical validation study. Some of the described studies were conducted prior to Executive Order 13318, while the Program was still known as PMI.

4.1. Study 1: Interrater reliability in the Presidential Management Intern assessment center

At the time this study was conducted (2001), PMI finalists were selected based on their performance in an assessment center consisting of three exercises—an individual presentation, a group discussion, and a written demonstration. As shown in Table 3, assessors evaluated the candidates on several dimensions. Analytical Thinking, Policies and Programs, and Oral Communication were evaluated during the individual and group exercises. Interpersonal and Team Skills and Demonstrated Leadership were measured in the group exercise as well. The candidates were evaluated on Interest in Government Service, Breadth and Quality of Accomplishments, and Written Expression in the written demonstration. A final assessment center score was computed by summing the dimension scores. Dimensions that were measured in two exercises (i.e., Analytical Thinking, Policies and Programs, and Oral Communication) were averaged prior to computing the final score so that there was only one score for each assessment center dimension.

Table 3
PMI dimensions by exercises matrix for previous version of the assessment center

Competency assessed                     Individual presentation  Group discussion  Written demonstration

Analytical thinking                     ✓                        ✓                 –
Policies and programs                   ✓                        ✓                 –
Oral communication                      ✓                        ✓                 –
Demonstrated leadership                 –                        ✓                 –
Interpersonal and team skills           –                        ✓                 –
Interest in government service          –                        –                 ✓
Breadth and quality of accomplishments  –                        –                 ✓
Written expression                      –                        –                 ✓

An important aspect of evaluating any assessment instrument is an examination of the reliability of the instrument. A common method of evaluating reliability in an assessment center is the computation of interrater reliability, or the degree to which assessors provide the same assessment when exposed to the same stimulus information, i.e., to the same candidate.

4.1.1. Method

4.1.1.1. Sample. Pre-consensus ratings for the 1450 candidates who completed the PMI Assessment Center during the 2000–2001 program year were examined.

4.1.1.2. Data analysis. Intraclass correlation coefficients (ICCs) are commonly used as measures of interrater reliability. Following the guidance provided by Shrout and Fleiss (1979) and McGraw and Wong (1996), the one-way random effects model [ICC (1)] was selected for the present study. This model is appropriate because it was not possible to associate each rating with a particular rater, so all differences in scores between assessors must be treated as error. The ICC is interpreted as the amount of variance attributable to differences among the objects being rated, in this case, the amount of variance attributable to differences among PMI candidates.

There are two forms of ICC (1). The reliability of a single rater's scores is conveyed by ICC (1, 1). In contrast, ICC (1, k) represents the reliability of the mean of the ratings provided by k raters. Both ICC (1, 1) and ICC (1, k) are reported. Although the PMI Assessment Center uses a consensus process to arrive at the final dimension score rather than averaging the assessors' scores, as shown in Table 4, there is a strong correlation between the consensus scores and the average of the assessors' pre-consensus scores. Hence, it could be argued that the reliability of the consensus scores would be similar to the reliability of the average of the pre-consensus scores, estimated by ICC (1, k). This point will be addressed further in the Discussion.
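For readers who wish to reproduce this kind of analysis, the following is a minimal sketch of the one-way ICC computation under the design described above (ratings that cannot be linked to identified raters). It is our illustration, not OPM's code, and the function name, data layout, and example values are hypothetical.

    import numpy as np

    def icc_oneway(ratings):
        """One-way random-effects ICCs (Shrout & Fleiss, 1979, Case 1).

        ratings: 2-D array of shape (n_candidates, k), where the k columns are
        rater "slots" that cannot be linked to identified raters, so all
        between-assessor differences are treated as error.
        Returns (ICC(1,1), ICC(1,k)): single-rater and k-rater-mean reliability.
        """
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        row_means = ratings.mean(axis=1)
        # Mean squares from a one-way ANOVA with candidates as the grouping factor.
        bms = k * np.sum((row_means - ratings.mean()) ** 2) / (n - 1)       # between candidates
        wms = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))   # within candidates
        icc_1_1 = (bms - wms) / (bms + (k - 1) * wms)
        icc_1_k = (bms - wms) / bms
        return icc_1_1, icc_1_k

    # Example: three assessors' pre-consensus scores for five candidates.
    demo = np.array([[3, 4, 3], [5, 5, 4], [2, 2, 3], [4, 4, 4], [1, 2, 2]])
    print(icc_oneway(demo))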

In addition, Nunnally's (1978) formula for the reliability of a linear combination was used to compute an estimate of the reliability of the overall assessment center score, which is computed as the sum of the dimension scores. Nunnally's formula includes the reliability of each component of the linear combination. The ICC (1, 1) values for each dimension were used in this formula to compute the reliabilities of the composite scores for Analytical Thinking, Oral Communication, Policies and Programs, and, ultimately, for the total score.
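For reference (the article cites but does not reproduce the formula), Nunnally's (1978) reliability of a linear combination y = sum of components x_i can be written as

    r_{yy} = 1 - \frac{\sum_{i} \sigma_i^2 \, (1 - r_{ii})}{\sigma_y^2}

where sigma_i^2 is the variance of component i, r_ii is its reliability (here, the dimension's ICC (1, 1)), and sigma_y^2 is the variance of the composite.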

4.1.2. Results
Table 5 presents the ICC (1, 1) for each dimension measured in the PMI Assessment Center, as well as the upper and lower bounds of the 95% confidence interval. The confidence interval indicates that there is a 0.95 probability that the true ICC is within the stated interval. As shown, the ICCs range from 0.69 for Analytical Thinking as measured in the Group Discussion exercise to 0.88 for Interest in Government Service as measured in the Written Demonstration exercise.

Table 4
Correlations of consensus scores with average scores

Dimension                                        Correlation of consensus score with
                                                 the average of pre-consensus scores

Individual presentation analytical thinking      0.96
Individual presentation policies and programs    0.96
Individual presentation oral communication       0.95
Group discussion demonstrated leadership         0.95
Group discussion interpersonal and team skills   0.93
Group discussion analytical thinking             0.93
Group discussion policies and programs           0.94
Group discussion oral communication              0.93
Written expression                               0.95
Interest in government service                   0.97
Breadth and quality of accomplishments           0.96


Table 5
Reliability of ratings based on the use of a single rater

Variable                                  ICC   95% CI lower bound  95% CI upper bound

Individual presentation
Analytical thinking                       0.70  0.67                0.72
Policies and programs                     0.69  0.67                0.71
Oral communication                        0.70  0.68                0.72

Group discussion
Demonstrated leadership                   0.77  0.74                0.79
Interpersonal and team skills             0.72  0.69                0.74
Analytical thinking                       0.69  0.66                0.72
Policies and programs                     0.73  0.70                0.75
Oral communication                        0.71  0.69                0.74

Written demonstration
Written expression                        0.79  0.76                0.81
Interest in government service            0.88  0.87                0.89
Breadth and quality of accomplishments    0.78  0.76                0.80


Table 6 presents ICC (1, k) for each dimension and the corresponding 95% confidence intervals. The ICCs for the average of multiple raters range from 0.82 for Analytical Thinking in the Group Discussion exercise to 0.94 for Interest in Government Service in the Written Demonstration exercise.

As shown in Table 7, the estimated reliabilities of the composite scores for Analytical Thinking, Policies and Programs, and Oral Communication are 0.79, 0.81, and 0.81, respectively. The estimate of the reliability of the total score is 0.94.

4.1.3. Discussion
The present study examined the interrater reliability of PMI Assessment Center ratings. According to Thornton (1992), reliabilities between 0.80 and 1.00 are considered high, and reliabilities between 0.60 and 0.80 are considered moderate. Hence, the estimates of reliability based on ratings from a single assessor, ICC (1, 1), meet Thornton's criterion for moderate levels of reliability, with one dimension, Interest in Government Service, meeting the criterion for a high level of reliability. In addition, the estimate of the reliability of the total assessment center score, computed using the estimates of reliability for a single assessor, was 0.94, which is quite high.

Table 6
Reliability of ratings based on the use of multiple raters

Variable                                  ICC   95% CI lower bound  95% CI upper bound

Individual presentation a
Analytical thinking                       0.87  0.86                0.88
Policies and programs                     0.87  0.86                0.88
Oral communication                        0.87  0.86                0.88

Group discussion b
Demonstrated leadership                   0.87  0.85                0.88
Interpersonal and team skills             0.84  0.82                0.85
Analytical thinking                       0.82  0.80                0.83
Policies and programs                     0.84  0.83                0.86
Oral communication                        0.83  0.81                0.85

Written demonstration b
Written expression                        0.88  0.87                0.89
Interest in government service            0.94  0.93                0.94
Breadth and quality of accomplishments    0.88  0.87                0.89

a Three raters are assumed for the individual presentation.
b Two raters are assumed for the group discussion and written demonstration.


Table 7
Estimated reliabilities for composite scores

Composite                        Estimated reliability

Analytical thinking              0.79
Policies and programs            0.81
Oral communication               0.81
Total assessment center score    0.94

Analytical thinking, policies and programs, and oral communication are composite scores that are computed by averaging the scores in the individual presentation and the group discussion. The total assessment center score is the sum of the standardized scores for the eight assessment center dimensions. Estimates of reliability for a single rater, ICC (1, 1), were used with Nunnally's (1978) formula to compute the estimated reliabilities of the composite scores.


As mentioned above, reliability estimates were also computed for the score representing the average of the individual assessors' scores. Assessment centers use multiple raters to offset individual biases, errors of observation or interpretation, and unreliability of individual ratings (Cascio, 1991). To the extent that use of multiple raters achieves these goals, the reliability of the consensus scores should be higher than the reliability of the individual assessors' scores. Thus, the reliability estimates for a single assessor (ICC 1, 1) may represent an underestimate of the reliability of the dimension score arrived at through the consensus process. Given the strong correlations between the average of the assessors' scores and the consensus scores (reported in Table 4), the reliability of the consensus score may be similar to the reliabilities computed using ICC (1, k), reported in Table 6. As shown, all of the reliability estimates based on the use of multiple raters met Thornton's criterion for high levels of interrater reliability.

Numerous efforts have been made in the PMI Assessment Center process to enhance interrater reliability, including the use of a consistent rating standard and scale, the use of specific benchmarks for each dimension, and detailed assessor training to discuss these rating tools and practice evaluating candidate behavior. The practice rating exercises give the assessors an opportunity to apply the information provided in training and to engage in a discussion of why a candidate merits a particular score. The positive findings with regard to interrater reliability in the present study seem to indicate that these efforts have achieved a level of success.

Some caveats to these results are warranted, however. First, it is important to note that interrater reliability is only one method of estimating reliability, and this method is focused on identifying error in measurement resulting from the rater. Other sources may add erroneous variance to assessment center scores, such as temporary states of the candidate (e.g., illness, nervousness, fatigue) or temporary changes in the environment (e.g., noise, uncomfortable temperature, a disruptive group member), which were not addressed in the present study. Nonetheless, it is important to demonstrate, as the current study has, that there are not substantial differences in assessment center scores due to raters.

Second, in order to determine more precise estimates of the reliability of the consensus scores, it would be desirable to have multiple assessor teams rate a number of candidates. Obtaining these data would allow an assessment of the variance across teams of assessors, which is of interest because final ratings are determined using the consensus scores.

Finally, it is important to note that reliability of scores is only one important consideration in determining a test's usefulness and appropriateness. A test must provide reliable evaluations of applicants' skills, that is, evaluations with a minimal amount of error. An even more critical issue, however, is whether an applicant's performance on the test is predictive of performance on the job. Therefore, a valuable next step in evaluating the appropriateness of the PMI assessment center would be a criterion-related validation study, in which performance scores are collected for current PMIs, and the statistical relationship between test performance and job performance is examined.

4.2. Study 2: Presidential Management Fellows Program analysis of pre-screen effectiveness

In 2003, the PMF Program Office added an accomplishment record pre-screen to the PMF selection process to address steadily increasing applicant numbers. The number of PMF applicants had increased from under 500 in 1996 to well over 2000 in 2002. Prior to 2003, all PMF applicants nominated by their schools were invited to participate in the assessment center. Performance on the assessment center was used to identify a final pool of candidates eligible for placement as PMFs in Federal agencies. Based on guidance from OPM personnel psychologists, the PMF Program Office implemented an accomplishment record as part of the 2003 PMF application and selection process. The purpose of the pre-screen was to identify a subset of high-potential candidates from the pool of nominated applicants for further consideration in the nationwide assessment center. The goal was to reduce the number of candidates evaluated in the assessment center to a more manageable number (i.e., between 1000 and 1200).

The inclusion of the pre-screen resulted in demonstrated cost savings due to the lower number of candidates participating in the resource-intensive assessment center. Nonetheless, OPM personnel psychologists conducted an analysis of assessment center data to determine whether the accomplishment record pre-screen had an effect on the quality of candidates participating in the assessment center. To examine pre-screen effectiveness, assessment center performance was examined in the 2 years prior to implementation of the pre-screen (i.e., 2001 and 2002) and in the 2 years after implementation (i.e., 2003 and 2004). Assessment center scores for the two-year intervals were combined to ensure data stability. Table 8 provides the results of the analysis involving the five key competency dimensions included in all assessment cycles. (Note: The Policies and Programs dimension was dropped in 2004 and replaced with Adaptability.) As indicated in Table 8, all five mean assessment center dimension scores increased after pre-screen implementation. A multivariate analysis of variance (MANOVA) was used to determine whether differences between the two sets of mean scores (i.e., before and after pre-screen implementation) were statistically significant. Results of the MANOVA indicated a small but statistically significant effect (Wilks' lambda = 0.996, p < 0.0001). Follow-up analyses indicated that candidates scored significantly higher on all five assessment center dimensions in the 2 years after implementation of the pre-screen.

At face value, these results suggest that PMF candidates participating in the assessment center in the two years that included the accomplishment record pre-screen performed better than candidates in the previous two years without a pre-screen. However, two cautionary notes must be considered. First, the pool of applicants in any given year can vary in quality on the competencies assessed. Thus, it is difficult to state conclusively that the observed increase in performance on the assessment center dimensions is due solely to implementation of the pre-screen, although the fact that two years' worth of data were combined pre- and post-implementation does strengthen the results. Second, an anomaly was observed with the Written Expression dimension: almost all of the increase in this dimension was due to scores from the 2003 data set, and the mean Written Expression score for 2004 was actually lower than the scores observed in both 2001 and 2002. These results could be attributable to the actual skill level of the applicant pools in 2003 and 2004 or to procedural changes in the writing exercises that were instituted in those years. Because of this ambiguity, results related to this dimension should be interpreted with caution.

4.3. Study 3: analysis of accomplishment record performance by nominees and non-nominees

In 2004, OPM personnel psychologists conducted a study, at the request of the PMF Program Office, comparing applicants who were, and were not, nominated by their schools for further consideration in the PMF selection process. As previously described, applicants apply for the Program online and must be nominated by school officials who have appropriate knowledge of the applicants' abilities and achievements to certify that each nominee has a serious interest in obtaining a PMF position. The nominating official also attests that each applicant was selected using competitive campus nomination procedures. If the school does not nominate an applicant, the individual is not considered further.

School nomination procedures vary widely in terms of their rigor and content. This variability, together with the sheer number of schools involved, makes it difficult to gauge the validity of school nomination procedures. One avenue for evaluating the nomination procedures is to compare nominated applicants to those who were not nominated by their schools. In the 2003–2004 selection process, a total of 2503 applicants were nominated (“Nominees”) and 251 were not nominated (“Non-Nominees”). OPM personnel psychologists compared these groups in terms of performance on the accomplishment record. Because Nominees should generally be higher quality candidates than Non-Nominees, it was hypothesized that the Nominees would perform better than the Non-Nominees on the accomplishment record.

4.3.1. Design and analysis

Because the accomplishments submitted by Non-Nominees were not officially rated as part of the selection process, OPM personnel psychologists arranged to have them rated at a later date by the same trained assessors who evaluated the accomplishments of the Nominees. The assessors were all highly experienced in rating accomplishments, and all were blind to the purpose of the study.

The Non-Nominees were divided into three groups based on the reasons they were not nominated: (1) Not Nominated, (2) No Decision, and (3) Not Reviewed. The Not Nominated group (N=173) consisted of applicants who were officially not nominated by their schools as a result of school nomination procedures. The No Decision group (N=28) contained applicants for whom no decision was officially made by the school, even though school officials reviewed the applicant materials. The Not Reviewed group (N=50) consisted of applicants whose records were not reviewed by school officials. All three groups ultimately were not nominated.

Analysis of variance (ANOVA) was used to examine differences between the Nominees and the three Non-Nominee groups on accomplishment record scores. Separate analyses were conducted for the total accomplishment record score (i.e., the three competencies combined) and for each of the three individual competency scores.
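A sketch of this style of analysis, with simulated data standing in for the real accomplishment record scores (the group means and standard deviations are borrowed from Table 9 purely for illustration), might look as follows:

# Hypothetical sketch: one-way ANOVA across the four nomination groups,
# an eta-squared effect size, and Bonferroni-corrected pairwise t-tests.
# All scores are simulated; only the means/SDs echo Table 9.
from itertools import combinations
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = {
    "Nominated":     rng.normal(8.68, 1.62, 2503),
    "Not nominated": rng.normal(7.33, 1.66, 173),
    "No decision":   rng.normal(6.91, 1.95, 28),
    "Not reviewed":  rng.normal(6.46, 1.89, 50),
}

f, p = stats.f_oneway(*groups.values())
print(f"omnibus ANOVA: F = {f:.2f}, p = {p:.3g}")

# Eta-squared: proportion of total variance explained by group membership.
scores = np.concatenate(list(groups.values()))
grand_mean = scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
eta_sq = ss_between / ((scores - grand_mean) ** 2).sum()
print(f"eta-squared = {eta_sq:.3f}")

# Bonferroni post hoc: multiply each pairwise p-value by the number of
# comparisons (6 for four groups), capping the result at 1.0.
pairs = list(combinations(groups, 2))
for a, b in pairs:
    t, p_pair = stats.ttest_ind(groups[a], groups[b])
    print(f"{a} vs. {b}: adjusted p = {min(p_pair * len(pairs), 1.0):.3g}")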

Table 9
Mean accomplishment record scores for the four nomination groups

Nomination group             Mean    Standard deviation    Post hoc analysis (significant group differences)

Total accomplishment record score (F=74.41, p<0.001, η²=0.075)
(1) Nominated (N=2503)       8.68    1.62                  2, 3, 4
(2) Not nominated (N=173)    7.33    1.66                  1, 4
(3) No decision (N=28)       6.91    1.95                  1
(4) Not reviewed (N=50)      6.46    1.89                  1, 2

Analytical thinking score (F=40.59, p<0.001, η²=0.042)
(1) Nominated (N=2503)       3.02    0.73                  2, 3, 4
(2) Not nominated (N=173)    2.55    0.74                  1
(3) No decision (N=28)       2.34    0.79                  1
(4) Not reviewed (N=50)      2.39    0.71                  1

Demonstrated leadership score (F=16.50, p<0.001, η²=0.018)
(1) Nominated (N=2503)       2.74    0.79                  2, 3, 4
(2) Not nominated (N=173)    2.55    0.87                  1, 4
(3) No decision (N=28)       2.25    0.81                  1
(4) Not reviewed (N=50)      2.09    0.95                  1, 2

Interpersonal and team skills score (F=82.35, p<0.001, η²=0.082)
(1) Nominated (N=2503)       2.92    0.72                  2, 3, 4
(2) Not nominated (N=173)    2.23    0.64                  1
(3) No decision (N=28)       2.32    0.80                  1
(4) Not reviewed (N=50)      1.98    0.73                  1


4.3.2. Results

Results of the analyses are summarized in Table 9. Mean scores for the total accomplishment record are provided at the top of Table 9, along with standard deviations. The trend of the means indicates that the Nominee group scored higher than all three Non-Nominee groups. The ANOVA conducted to test for differences among the mean total accomplishment record scores indicated a statistically significant difference among the groups (F=74.41, p<0.001, η²=0.075). Follow-up Bonferroni post hoc tests revealed statistically significant differences between the Nominee group and each of the three Non-Nominee groups on the total accomplishment record score, as hypothesized. As shown in Table 9, this trend held for the three individual competency scores as well: in each case, the Nominee group scored significantly higher than the three Non-Nominee groups.

4.3.3. Conclusion

The results of the Non-Nominee study provide some evidence that school nomination procedures are generally capable of distinguishing higher quality candidates from lower quality candidates. Because the accomplishment record assessed applicants on only three fundamental leadership competencies, however, it would be difficult to generalize the results much beyond those competencies. More work needs to be done to fully understand the specific procedures schools use to nominate PMF applicants.

5. Discussion

The PMF Program is a very popular Federal hiring mechanism; most executive branch departments, as well as many smaller agencies, employ Presidential Management Fellows to some degree. The PMF Program Office reported that the State Department, the Department of Health and Human Services, the Department of Housing and Urban Development, the Social Security Administration, the Department of Justice, and the Department of Defense are typically among the top 10 employers of PMFs annually.

Agencies have reported using the Program for a variety of reasons. Agencies generally find that, as a Federal hiring mechanism, the PMF Program is fairly easy to use (especially when compared to competitive examining), because the Program conducts all of the upfront screening necessary to deliver highly competitive candidates. Essentially, the agency only has to select the individuals it wants from the final candidate list.

Additionally, agencies have reported repeated positive experiences with Fellows each year. One agency, for example, is expanding its annual hires to almost double what the top agency has hired in the past because it believes it cannot get “off the street” hires (i.e., hires obtained through traditional hiring mechanisms) who come close to the quality of PMFs.

Given the workforce challenges that most agencies face, they must consider a variety of sources to meet their recruitment needs, and in this regard the Fellows Program is considered an essential component of many agencies' workforce and succession plans. This is particularly true for some smaller agencies with limited resources, which may be unable to replicate in-house the nationwide search and extensive assessment process that they receive from the Fellows Program. The PMF Program allows these smaller agencies to participate to the extent their resources allow, simply by paying a fee per PMF appointment.

The PMF Program's popularity and long-term success notwithstanding, the Program still faces several challenges and limitations. Many of these limitations stem from the realities of having centralized administration but decentralized funding. That is, program-wide decisions pertaining to methodology (including research studies) are made by the PMF Program Office, but the funding to support those decisions comes from the agencies that hire PMFs (through a per-placement charge). With limited funding available, difficult choices must be made. For example, as previously noted, there has yet to be a full-scale systematic evaluation of the PMF Program's effectiveness. While there have been several smaller-scale efforts, we must continue to strive to find the resources to conduct a full-scale evaluation. An empirical validation study would be an excellent first step in this regard.

Additionally, implementing a strategy to more systematically track and monitor PMFs' performance and retention information would allow us to conduct return-on-investment (ROI) analyses, thus contributing another important piece of validation evidence. Studies comparing PMFs and non-PMFs with respect to their relative impact on, or contribution to, the Federal Government would also provide a rich source of validation evidence.
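To give a sense of what such an ROI analysis could look like, the sketch below applies the classic Brogden-Cronbach-Gleser utility model. Every figure in it is invented for illustration; none comes from PMF Program data.

# Hypothetical back-of-the-envelope ROI sketch using the classic
# Brogden-Cronbach-Gleser utility model. All figures are invented.
n_hired = 400             # assumed PMF placements per year
tenure_years = 5.0        # assumed average tenure of a PMF
validity = 0.30           # assumed validity of the selection process
sd_perf_dollars = 12_000  # assumed SD of job performance in dollars
mean_z_selected = 1.1     # assumed mean standardized score of selectees
cost_per_hire = 4_000     # assumed per-placement assessment cost

# Utility gain = N * T * r * SDy * z_bar - total selection cost
# (here simplified to a per-hire cost).
gain = (n_hired * tenure_years * validity * sd_perf_dollars
        * mean_z_selected) - n_hired * cost_per_hire
print(f"estimated utility gain: ${gain:,.0f}")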

Finally, while in many ways the PMF Program is unique in its charter, scope, and complexity, efforts should be undertaken to identify private-sector programs with similar characteristics for benchmarking purposes. The Program Office is continually striving to find the resources necessary to address these challenges and opportunities.

Over the years, the PMF Program has undergone continual evolution and has experienced its share of successes and setbacks. Most importantly, however, it has survived for over a quarter of a century. While we are certain to face an onslaught of new challenges, we look forward to the next 27 years of success for the Presidential Management Fellows Program.
