
DOCUMENT RESUME

ED 439 267    CE 079 895

AUTHOR       Carney, Thomas Q., Ed.; Luedtke, Jacqueline R., Ed.; Johnson, Jeffrey A., Ed.
TITLE        Collegiate Aviation Review, 1999.
INSTITUTION  University Aviation Association, Auburn, AL.
SPONS AGENCY Purdue Univ., Lafayette, IN.
ISSN         ISSN-1523-5955
PUB DATE     1999-10-00
NOTE         108p.; Papers presented at the 1998 Fall Education Conference of the University Aviation Association. Sponsored by Mechtronix Systems, Inc., Cessna Aircraft Company, and Morris Publishing. Published annually.
PUB TYPE     Collected Works - Serials (022) -- Reports - Research (143) -- Speeches/Meeting Papers (150)
JOURNAL CIT  Collegiate Aviation Review; v17 n1 Oct 1999
EDRS PRICE   MF01/PC05 Plus Postage.
DESCRIPTORS  Airports; *Aviation Education; *Aviation Technology; College Faculty; Computer Uses in Education; Curriculum Development; Economic Climate; *Educational Quality; Faculty Development; *Flight Training; Followup Studies; Graduate Surveys; Higher Education; Instructional Development; Internship Programs; *Labor Force Development; Models; Program Effectiveness; Public Policy; School Surveys; Simulation; Student Attitudes; Teacher Attitudes; Teacher Education; Teacher Role; Technical Education
IDENTIFIERS  Global Economy

ABSTRACT
This document, published annually, contains six papers devoted to aviation education. "Enhancing Global Competitiveness: Benchmarking Airline Operational Performance in Highly Regulated Environments" (Brent D. Bowen, Dean Headley, Karisa D. Kane, Rebecca K. Lutte) outlines a model to help policymakers and others evaluate the effects of airline size on economic competition. "Multiple Expert Evaluations of a PC [Personal Computer] Computer-Based Aviation Flight Training Device" (Jeffrey S. Forrest) examines the usability of PC-based aviation training devices. "An Examination of the U.S. Collegiate Aviation Workforce in Preparing the Next Generation of Aviation Faculty Members beyond 2000" (Jeffrey A. Johnson) explores whether the next generation of aviation faculty is being adequately prepared. "Assessing the Environment and Outcomes of Four-Year Aviation Programs: Does Program Quality Make a Difference?" (Paul D. Lindseth) reports on a study of students, faculty, and alumni of various four-year aviation programs that was designed to determine whether four-year aviation programs are emphasizing environment and outcome indicators of quality. "Airport Internships: Effectively Structuring a Departmental Rotation Internship" (C. Daniel Prather) presents airport managers' perceptions of airport internships. "An Evaluation of a Major Airline Flight Internship Program: Participant's Perceptions of Curricular Preparation for, and Components of, the Internship" (Jose R. Ruiz, David A. NewMyer, D. Scott Worrells) presents the results of a survey of 110 former university interns who served semester-long flight operations internships at United Airlines. Most papers contain substantial bibliographies. (MN)

Reproductions supplied by EDRS are the best that can be made from the original document.


UNIVERSITY AVIATION ASSOCIATION

SETTING THE STANDARD OF EXCELLENCE FOR COLLEGIATE AVIATION!


UNIVERSITY AVIATION ASSOCIATION

October 1999    Volume 17: Number 1


UNIVERSITY AVIATION ASSOCIATION

COLLEGIATE AVIATION REVIEW

Thomas Q. Carney, Ph.D., Editor
Jacqueline R. Luedtke, Ph.D., Associate Editor
Jeffrey A. Johnson, Ph.D., Associate Editor

Sponsored by Mechtronix Systems, Inc.; Cessna Aircraft Company; Morris Publishing; the University Aviation Association; and Purdue University

Cover design by Marc H. Luedtke

October 1999    Volume 17: Number 1


The Collegiate Aviation Review (CAR)
Fall 1999, Volume 17, Number 1
Copyright © 1999 University Aviation Association

All correspondence and inquiries should be directed to:

University Aviation Association
3410 Skyway Drive
Auburn, AL 36830
Telephone: (334) 844-2434
Email: [email protected]

Printed in the USA by

MORRIS PUBLISHING

3212 East Highway 30, Kearney, NE 68847, 1-800-650-7888


Editorial Board

Of the

Collegiate Aviation Review

Thomas Q. Carney, Purdue University
Editor

Jacqueline R. Luedtke, Utah State University
Associate Editor
Chair, UAA Publications Committee

Jeffrey A. Johnson, University of Nebraska at Omaha
Associate Editor

Ballard M. Barker, Florida Institute of Technology
Brent D. Bowen, University of Nebraska at Omaha
Larry G. Carstenson, University of Nebraska at Kearney
Gerald P. Chubb, The Ohio State University
Mavis F. Green, University of Illinois at Urbana-Champaign
Abe Harraf, Embry-Riddle Aeronautical University
David A. NewMyer, Southern Illinois University at Carbondale
Gary J. Northam, Parks College of Saint Louis University
Alexander T. Wells, Embry-Riddle Aeronautical University
Michael E. Wiggins, Embry-Riddle Aeronautical University
Frank G. Mitchell, President, University Aviation Association


ACKNOWLEDGEMENTS

As the University Aviation Association continues to mature as an organization of aviation educators and the institutions they serve, so too has the scholarly activity conducted by UAA members increased in both quality and quantity. This is evident in the increased number of submissions to the Collegiate Aviation Review, which included ten papers for this edition. As a long-time member of the UAA Publications Committee, it is particularly gratifying to the Editor to see this growth in scholarly activity, along with the opportunities provided by the Association for members to share their work with others.

This year the CAR has a new "look", the product of the vision and hard work of a number of individuals, along with the generosity of our sponsors, Mechtronix Systems, Inc.; Cessna Aircraft Company; Morris Publishing; the University Aviation Association; and Purdue University. Without the tireless efforts of Jacqueline Luedtke, Larry Carstenson, Frank Mitchell, and Gary Kiteley, the new and enhanced format of the CAR would not have been possible. My profound thanks to each of these colleagues, and to our sponsors.

No juried publication could exist, of course, unless experts in the field serve as anonymous reviewers. Indeed, the ultimate guarantors of quality and appropriateness of scholarly materials for a publication are the knowledge, integrity, and thoroughness of those who serve in this capacity. This year, a concerted effort was made to involve more of the professional members of UAA in the review process, with the goal of having four reviewers for each paper, and no individual reviewing more than one. The thoughtful, careful, and timely work of each of these professionals added substantively to the quality of the journal, and made my task as editor much easier. Thanks to each reviewer for performing this critically important work.

In addition to the members of the UAA Publications Committee, the reviewers include:

Herbert B. Armstrong, Dowling College
Tim Brady, Embry-Riddle Aeronautical University
James E. Crehan, University of Alaska
Alan C. Davis, Embry-Riddle Aeronautical University
Thomas L. Deckard, Western Michigan University
Gerry R. Fairbairn, Daniel Webster College
Ronald J. Ferrara, Middle Tennessee State University
Robert S. Finkelstein, North Shore Community College
Kent W. Lovelace, University of North Dakota
Rebecca K. Lutte, University of Nebraska at Omaha
Royce A. Martin, Ohio University
William K. McCurry, Arizona State University
Robert K. Mock, Metropolitan State College of Denver
Isaac R. Nettey, Texas Southern University
David A. NewMyer, Southern Illinois University at Carbondale
Gary J. Northam, Parks College of Engineering and Aviation


List of reviewers, continued

John C. Ogg, Dowling College
Donald A. Petrin, Purdue University
Stephen M. Quilty, Bowling Green State University
Jacqueline B. Sanders, Mercer Co. Community College
Guy M. Smith, Embry-Riddle Aeronautical University
Alan J. Stolzer, Parks College of Engineering and Aviation
Wilma J. Walker, Eastern Kentucky University
John P. Young, Purdue University

Thomas Q. Carney, Ph.D.
Editor
August 1999


STATEMENT OF OBJECTIVES

The Collegiate Aviation Review is published annually by the University Aviation Association, and is distributed to the members of the Association. Papers published in this volume were selected from submissions that were subjected to a blind peer review process, and were presented at the 1998 Fall Education Conference of the Association.

The University Aviation Association is the only professional organization representing all levels of the non-engineering/technology element in collegiate aviation education. Working through its officers, trustees, committees and professional staff, the University Aviation Association plays a vital role in collegiate aviation and in the aviation industry.

The University Aviation Association accomplishes its goals through several objectives. These objectives are:

To encourage and promote the attainment of the highest standards in aviation education at the college level.

To provide a means of developing a cadre of aviation experts who make themselves available for such activities as consultation, aviation program evaluation, speaking assignments, and other professional contributions that stimulate and develop aviation education.

To furnish a national vehicle for the dissemination of intelligence relative to aviation among institutions of higher education and governmental and industrial organizations in the aviation/aerospace field.

To permit the interchange of information among institutions that offer non-engineering oriented aviation programs including business technology, transportation, and education.

To actively support aviation/aerospace-oriented teacher education with particular emphasis on the presentation of educational workshops and the development of educational materials in the aviation and aerospace fields.

University Aviation Association
3410 Skyway Drive
Auburn, AL 36830
Telephone: (334) 844-2434
Email: [email protected]


Call for Papers

for the

2000 UAA Fall Education Conference

and the

Collegiate Aviation Review (CAR)

Both qualitative and quantitative research manuscripts are acceptable. All submissions must be accompanied by a statement that the manuscript has not been previously published and is not under consideration for publication elsewhere.

All authors will be required to sign a "Transfer of Copyright and Agreement to Present" statement in which (1) the copyright to any submitted paper which is subsequently published in the CAR will be assigned to the University Aviation Association (UAA) and in which (2) the authors agree to present any accepted paper to a UAA conference to be selected by the UAA, if requested.

Authors should submit five double-spaced copies of the manuscript, conforming to the guidelines contained in the Publication Manual of the American Psychological Association, 4th Ed. (APA). If the manuscript is accepted for publication, the author(s) will be required to submit the manuscript on 3 1/2 inch computer disk in either WordPerfect (6.0 or later version) or Microsoft Word format.

The UAA review process is a refereed process using "blind" peer reviewers. A list of all reviewers is available from the CAR editor and will be published in the CAR.

All manuscripts must be mailed to the CAR editor with a postmark date no later than March 1, 2000, and should be sent to:

Dr. Thomas Q. Carney
Department of Aviation Technology
Purdue University
1 Purdue Airport
West Lafayette, IN 47906

Questions regarding the submission or publication process may be directed to the editor at (765) 494-9954 or may be sent by email to: [email protected]

Full-time graduate students are encouraged to submit manuscripts to the CAR for review in the graduate student category. A travel stipend of up to $500 is available for successful graduate student submissions. Contact the editor or UAA for additional information.


TABLE OF CONTENTS

Enhancing Global Competitiveness: Benchmarking Airline Operational Performance in Highly Regulated Environments
    Brent D. Bowen, Dean Headley, Karisa D. Kane, and Rebecca K. Lutte    1

Multiple Expert Evaluations of a PC Computer-Based Aviation Flight Training Device
    Jeffrey S. Forrest    15

An Examination of the U.S. Collegiate Aviation Workforce in Preparing the Next Generation of Aviation Faculty Members Beyond 2000
    Jeffrey A. Johnson    31

Assessing the Environment and Outcomes of Four-Year Aviation Programs: Does Program Quality Make a Difference?
    Paul D. Lindseth    40

Airport Internships: Effectively Structuring a Departmental Rotation Internship
    C. Daniel Prather    53

An Evaluation of a Major Airline Flight Internship Program: Participant's Perceptions of Curricular Preparation for, and Components of, the Internship
    Jose R. Ruiz, David A. NewMyer, and D. Scott Worrells    74


Enhancing Global Competitiveness: Benchmarking Airline Operational Performance in Highly Regulated Environments

Brent D. Bowen
University of Nebraska at Omaha

Dean Headley
Wichita State University (Kansas)

Karisa D. Kane
University of Nebraska at Omaha

Rebecca K. Lutte
University of Nebraska at Omaha

ABSTRACT


Enhancing competitiveness in the global airline industry is at the forefront of attention with airlines, government, and the flying public. The seemingly unchecked growth of major airline alliances is heralded as an enhancement to global competition. However, like many mega-conglomerates, mega-airlines will face complications driven by size regardless of the many recitations of enhanced efficiency. Outlined herein is a conceptual model to serve as a decision tool for policy-makers, managers, and consumers of airline services. This model is developed using public data for the United States (U.S.) major airline industry available from the U.S. Department of Transportation, Federal Aviation Administration, the National Aeronautics and Space Administration, the National Transportation Safety Board, and other public and private sector sources. Looking at historical patterns of Airline Quality Rating results provides the basis for establishment of an industry benchmark for the purpose of enhancing airline operational performance. Applications from this example can be applied to the many competitive environments of the global industry and assist policy-makers faced with rapidly changing regulatory challenges.

INTRODUCTION

Looking at historical patterns of the Airline Quality Rating (AQR) may provide the basis for establishment of an industry benchmark for the purpose of enhancing airline operational performance. Benchmarking is a process that helps companies to find high performance levels in other organizations and to learn enough about how they are achieving those levels so the practice producing the high performance can be applied to one's own company (Keehley, Medlin, MacBride & Longmire, 1997). Enhancing competitiveness in the global airline industry is at the forefront of attention with airlines, government, and the flying public. The seemingly unchecked growth of major airline alliances is heralded as an enhancement to global competition. However, like many mega-conglomerates, mega-airlines will face complications driven by size regardless of the many recitations of enhanced efficiency.

Outlined herein is a conceptual model to serve as a decision tool for policy-makers, managers, and consumers of airline services. The AQR can serve as a model for other organizations on how to use data as a benchmark to help an organization or industry improve its performance. The AQR is a summary of month-by-month quality ratings for the major U.S. airlines during a one-year period. The AQR uses 19 data points such as pilot deviations, load factors, and the number of accidents (see Table 1). The AQR model uses publicly available data from the Department of Transportation, Federal Aviation Administration, National Aeronautics and Space Administration, National Transportation Safety Board, as well as other sources. Applications from the AQR can be applied to the many competitive environments of our global industry and assist policy-makers faced with rapidly changing regulatory challenges.

The AQR serves as an annually reported benchmark in the aviation industry. The ultimate benefit of benchmarking is enhanced competitiveness. An airline striving to improve its service identifies industry leaders and seeks to understand how the leaders achieve successful performance levels. The airline then adapts these strategies to its own organization.

Benchmarking can best be described as "the continuous process of measuring products, services, and practices against the company's toughest competitors or those companies renowned as industry leaders" (Camp, 1992, p. 3).

Benchmarking can also be described as the "Consumer Reports" of the public and private sectors. It provides consumers with accurate and reliable information with which they can set standards, make comparisons, judge performances, and consequently make a purchasing decision. The AQR is an innovative example of a benchmark in the airline industry and can serve as a framework for organizations in other competitive environments. Using the Airline Quality Rating system and monthly performance data for each airline for the calendar year, individual and comparative ratings are reported. The AQR uses data points from key public sources and provides a starting point for monitoring the quality of an individual airline. With all of the competitive forces at play in the global airline industry, a basic quality assessment tool would be useful to various governments, competitors, and international airline travelers. The AQR applied to major U.S. carriers can also be applied to international airlines provided that comparable data are available. Consumers can use this ranking system to make comparisons and judge the various performances of the airlines.

Benchmark Purposes and Rationale

Many reasons exist to benchmark the performance of an organization or industry. First of all, it simply works. To the surprise of many organizations, benchmarking reveals sizable performance gaps. Alaska Airlines had a high rate of mishandled baggage in 1997, which placed it well below the industry average. This performance gap is now identified and can be improved. In fact, the 1998 AQR results showed a slight improvement in Alaska Airlines' mishandled baggage rate. The airline moved from being ranked tenth worst to ninth worst in baggage handling (Bowen & Headley, 1999). Secondly, recognition is likely to follow. Besides the internal benefits, external benefits such as publicity are likely to occur. The AQR is nationally broadcast to more than 50 million consumers on the major news networks and in major newspapers. As competition for this achievement increases, the airlines will undoubtedly seek to be the best and implement innovative and successful practices. Finally, airlines cannot afford not to benchmark. Airline consumer complaints rose 20 percent from 1996 to 1997 (Bowen & Headley, 1998). Consumers are demanding a high-quality return for their money.

Benchmarking works because it illustrates improvements in quality and performance. A perfect example is Continental Airlines. Continental was the most improved airline from 1996 to 1997 as they moved from fifth to third position. They improved their mishandled baggage rate and denied boardings, and had consistently good performance in all areas rated. The AQR scores over the years show that Continental Airlines is clearly the most improved of the major carriers. Their consistent improvement since 1994 has moved them from last to third on the quality scale.

Benchmarking can be defined by the following criteria: it must be successful over time, have quantifiable results, be innovative, be repeatable, and must not be linked to unique demographics (Keehley et al., 1997). The AQR qualifies as a benchmark by meeting all of these criteria. The AQR has a comprehensive database of success dating to 1991. Seven consecutive years of data have been collected and analyzed. The AQR has quantitative results derived from a weighted average of 19 factors with relevance to consumers when judging the quality of airline services. "The Airline Quality Rating approach focuses on quantitative factors rather than qualitative factors in order to provide a more objective result in assessing service quality levels across all major domestic airlines. The use of quantifiable, readily available data provides an objective starting point for monitoring the quality of service an individual airline might be providing and allows it to be directly compared with other competitors" (Bowen & Headley, 1997, p. 58). The AQR uses an innovative approach by combining basic ideas and raw material with a specific purpose in mind. "The objective in developing the AQR was to better organize readily available data for the consumer and offer it in a more useful, understandable, and objective form" (Bowen & Headley, 1997, p. 57). Another criterion of a benchmark is that it should be repeatable with some modifications. The AQR has been successfully repeated from 1991 to 1998. Minor modifications were made when the number of carriers changed from year to year. Finally, a good benchmark is not linked to unique demographics. "The results of a benchmark study are just a snapshot, or a moment in time. But when you add data from your industry and your organization to your benchmark subject's database, trends invariably start to emerge and become clear" (Finnigan, 1996, p. 144).

Defining Performance Measurement: The Airline Quality Rating

The majority of quality ratings available rely on subjective surveys of consumer opinion which are completed infrequently. This subjective approach yields a quality rating that is essentially non-comparable from survey to survey for any specific airline. Timeliness of survey-based results can be problematic as well in the fast changing airline industry. Before the Airline Quality Rating, there was effectively no consistent method for monitoring the quality of airlines on a timely, objective, and comparable basis. With the introduction of the AQR, a multi-factor, weighted average approach became available. This approach had not been used before in the airline industry. The method relies on taking published, publicly available data that characterizes airline performance on critical quality factors important to consumers and combines them into a rating system. The final result is a rating for individual airlines with ratio scale properties comparable across airlines and across time.

The Airline Quality Rating is a weighted average of 19 factors that have importance to consumers when judging the quality of airline services. Factors included in the rating scale were taken from an initial list of over 80 potential factors. Factors were screened to meet two basic criteria: 1) a factor must be obtainable from published data sources for each airline; and 2) a factor must have relevance to consumer concerns regarding airline quality. Data used in calculating ratings represent performance aspects (i.e., safety, on-time performance, financial stability, lost baggage, denied boardings) of airlines that are important to consumers. Many of the factors used are part of the Air Travel Consumer Report prepared by the U.S. Department of Transportation.

Final factors and weights were established by surveying airline industry experts, consumers, and public agency personnel regarding their opinion as to what consumers would rate as important in judging airline quality. Also, each weight and factor were assigned a plus or minus sign to reflect the nature of impact for that factor on a consumer's perception of quality. For instance, the factor that includes on-time performance is included as a positive factor because it is reported in terms of on-time successes, suggesting that a higher number is favorable to consumers. The weight for this factor is high due to the importance most consumers place on this aspect of airline service. Conversely, the factor that includes accidents is included as a negative factor because it is reported in terms of accidents relative to the industry experience, suggesting that a higher number is unfavorable to consumers. Because safety is important to most consumers, the weight for this factor is also high. Weights and positive/negative signs are independent of each other. Weights reflect importance of the factor in consumer decision making, while signs reflect the direction of impact that the factor should have on the consumer's rating of airline quality. When all factors, weights, and impacts are combined for an airline and averaged, a single continuously scaled value is obtained. This value is comparable across airlines and across time periods.


Table 1
Airline Quality Rating Factors, Weights and Impact

FACTOR                          WEIGHT
1  Average Age of Fleet          5.85
2  Number of Aircraft            4.54
3  On-Time                       8.63
4  Load Factor                   6.98
5  Pilot Deviations              8.03
6  Number of Accidents           8.38
7  Frequent Flier Awards         7.35
8  Flight Problems(1)            8.05
9  Denied Boardings(1)           8.03
10 Mishandled Baggage(1)         7.92
11 Fares(1)                      7.60
12 Customer Service(1)           7.20
13 Refunds(1)                    7.32
14 Ticketing/Boarding(1)         7.08
15 Advertising(1)                6.82
16 Credit(1)                     5.94
17 Other(1)                      7.34
18 Financial Stability           6.52
19 Average Seat-Mile Cost        4.49

Note: (1) Data for these factors are drawn from the Department of Transportation's monthly Air Travel Consumer Report.


The Airline Quality Rating methodology allows comparison of major airline domestic operations on a regular basis (as often as monthly) using a standard set of quality factors. Unlike other consumer opinion approaches, which rely on consumer surveys and subjective opinion, the AQR uses a mathematical formula that takes multiple weighted objective factors into account in arriving at a single rating for an airline. The rating scale is useful because it provides consumers and industry watchers a means for looking at comparative quality for each airline on a timely basis using objective, performance-based data.

The equation, known as the national Airline Quality Rating (AQR), where Q is the quality rating, w is the weight, and F is the value of each factor, is stated Q = sum(wi x Fi) / sum(wi) for i = 1 to 19. Figure 1 presents the formula as a weighted average, which results in ratio scale numbers.

Figure 1
Weighted Average Formula for the AQR

AQR = (w1F1 + w2F2 + w3F3 + ... + w19F19) / (w1 + w2 + ... + w19)

Note. From "Airline Quality Report," by B. Bowen and D. Headley, 1991, NIAR Report 91-11. Wichita State University.
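To make the weighted-average computation concrete, the sketch below applies the Figure 1 formula to a single airline. The weights echo Table 1, but the factor values and the plus/minus impact signs are illustrative placeholders rather than published AQR inputs, so the output is not an actual AQR score.

```python
# Illustrative sketch of the AQR weighted-average computation (Figure 1).
# Weights follow Table 1; the factor values and impact signs below are
# made-up placeholders, not published AQR inputs.

def airline_quality_rating(factors, weights, impacts):
    """Weighted average of signed factor values: sum(w * s * F) / sum(w)."""
    weighted_sum = sum(w * s * f for w, s, f in zip(weights, impacts, factors))
    return weighted_sum / sum(weights)

# Three factors only, for brevity: on-time, accidents, mishandled baggage.
weights = [8.63, 8.38, 7.92]     # importance weights (Table 1)
impacts = [+1, -1, -1]           # assumed direction of impact on quality
factors = [0.81, 0.30, 0.52]     # hypothetical per-airline factor values

print(round(airline_quality_rating(factors, weights, impacts), 3))
```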

Framing a Benchmark Procedure

Benchmarking asks two fundamental questions: how well is the agency doing, and is the agency's performance improving or deteriorating? Only then should a third question be asked: is another company doing something better than this agency? The AQR can be used as a benchmark by the major airlines to answer these very questions. Northwest Airlines can look at the results and ask itself how their company is doing and if their performance is improving or deteriorating. Then they can look to see who is doing something better than they are. Northwest Airlines could look at the success of Southwest Airlines, the top ranked airline, to gain insight as to how they are successful. "Sharing experiences and learning from the experience of other organizations is the cheapest and most efficient, effective and compelling means for improving performance" (Keehley et al., 1997, p. 207).

Industry Week named the Xerox company best-in-class in benchmarking. Robert Camp from Xerox wrote a book on benchmarking in 1992, Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. Camp says the first step in benchmarking is to decide what to focus on. Select areas that are important to customers, critical success factors, areas for greatest improvement, competitive pressure points and problem areas (Richardson, 1992, p. 33). The air carriers realize that customer satisfaction is a key point to consumers and that consumers have a choice when selecting an air carrier. Using Northwest Airlines as an example, the company should seek out areas that are important to customers such as customer service, mishandled bags, fares, and denied boardings. They should choose problem areas such as on-time performance and focus on these areas for improvement.

Step two is to understand your company's own processes by clarifying, identifying, and prioritizing your own best practices. Benchmarking is best utilized where there is the opportunity for major payback, and it is not advised to benchmark an organization's strengths. Because of the expense involved, a company should not necessarily benchmark a process in which they know they are successful. Instead, focus on performance areas that could provide the most significant return. For the airline industry, these areas could include improving on-time performance and reducing consumer complaints. Next, use people with knowledge and experience in the function. Benchmarking should be conducted by teams with the appropriate skills such as a team facilitator, analytical skills, and information search capability. A company should train the teams in the essentials of benchmarking. The fourth step is to make sure the teams are focused on best practices. Oftentimes results or returns on assets are the focus of a company. The numbers, however, do not tell anything about the process. In benchmarking, the numbers are only 10% of the activity, while processes are 90%. The next step is to find a company that does it the best. Benchmarking with more than one company gives validation that you are finding the best practice. US Airways, for example, can use Southwest, Alaska, Continental, or another airline they feel is doing something superior. The final step is to update. As processes and competition change over time, industry best practices should change accordingly (Camp, 1992).

Benchmark Procedural Validity and Reliability

The AQR has accomplished numerous objectives accepted as key ingredients of benchmarking. It is based on objective criteria, thereby eliminating perception and opinion (Velocci, 1997). While based primarily on public sector data, realization and inclusion of private sector information provides substantial benefit. The AQR has spanned seven years, therefore encountering a changing business environment, public policy, and economic conditions. Metrics derived from publicly available data sources ensure accountability and validity through constant replication and constituent observation. As a methodology, AQR annual results have been subjected to peer review on numerous occasions. Widespread citation in academic literature, media reports, and airline reports continuously validates the mechanisms used to establish this industry benchmark. The details of this methodological approach and validation have been addressed in annual publication of results. A key test for data reliability is computation of Cronbach's Alpha. Reliability of the rating scale (see Table 2) was measured as extremely high (Bowen, Headley, & Lutte, 1993).

Reliability, defined as freedom from random error and the ability to yield consistent results, is established by Cronbach's Alpha (Bowen, Headley, & Luedtke, 1992). Cronbach's Alpha estimates the internal consistency reliability of a scale made up of a number of equally weighted items, with values between zero and one. Coefficients above 0.6 are desirable, and many would argue that values above 0.8 are needed for a developed scale. A reliability coefficient sets an upper limit for the (criterion) validity of a scale (Cronbach, 1951).
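For readers who wish to replicate this kind of reliability check, the sketch below computes Cronbach's Alpha directly from its definition. The small ratings matrix is made-up illustrative data, not the AQR rating scale itself.

```python
# A minimal sketch of the Cronbach's Alpha computation (Cronbach, 1951).
# The ratings below are illustrative only; rows are scale items, and each
# item holds the scores given across the same set of observations.

def cronbach_alpha(items):
    """items: list of equal-length score lists, one list per scale item."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]  # total score per observation
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

items = [
    [3, 4, 4, 5, 2, 4],   # item 1 scores across six observations
    [3, 5, 4, 4, 2, 5],   # item 2
    [2, 4, 5, 5, 3, 4],   # item 3
]
print(round(cronbach_alpha(items), 2))
```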


Table 2
Reliability Coefficient

Measure            Score   Scale   Result
Cronbach's Alpha   0.87    0-1.0   Extremely high validity

Controlling for Variability

Testing the AQR model involved basic concepts such as control limitation and standard deviation range comparisons to performance data and to model variability in the baseline year. Statistical process control charting established the upper control limitations and the lower control limitations. These limits represent a targeted range of variability based on one year of experience and are projected outward across the next year. Statistical process control testing for the AQR was calculated over 24 measurement periods to provide maximum representation of variability. This tool can be used with the AQR scores to set benchmark standards for individual airlines and for the airline industry. As a model, the AQR meets the prerequisites of accurate numerical data and chronologically recorded data (Bowen, Headley, & Lutte, 1993).

Common cause variability occurs when points are randomly distributed about the center line within the upper and lower control limits. Common cause variability involves more complicated factors that cannot be easily altered in the short term. In the AQR these factors are areas such as financial stability, age of the fleet, and number of accidents. Common cause variability represents the level of quality that the organization or industry is capable of producing. It is entirely possible that an organization may be within control limits and still be performing at an inadequate level of quality to compete. The second type of variability is called local faults. Local faults are factors that are easily identifiable and can generally be controlled by employees. In the AQR these factors would be such things as mishandled baggage or customer complaints about service. A local fault is indicative of a situation that is temporarily out of control. Local faults are typically short-term and are often corrected by employees actually responsible for performance (Fellers, 1992).
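As a rough illustration of the control-charting procedure described above, the sketch below derives a center line and control limits from one baseline year of monthly AQR scores (Southwest Airlines' 1991 values from Table 6). The three-sigma limits are a common statistical process control convention assumed here for illustration; the exact charting constants used in the published AQR studies are not specified in this paper.

```python
# Sketch of a control chart baseline: center line and control limits from
# one year of monthly AQR scores, projected onto the next year. The data
# are Southwest Airlines' 1991 monthly scores from Table 6; three-sigma
# limits are an assumed convention, not the published AQR constants.

baseline = [0.244, 0.254, 0.241, 0.245, 0.250, 0.254,
            0.203, 0.183, 0.202, 0.196, 0.190, 0.179]

n = len(baseline)
center = sum(baseline) / n
sigma = (sum((x - center) ** 2 for x in baseline) / (n - 1)) ** 0.5

ucl = center + 3 * sigma   # upper control limit
lcl = center - 3 * sigma   # lower control limit

# Points from the following year falling outside [lcl, ucl] would suggest a
# local fault rather than common cause variability.
print(f"center={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
```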

The AQR Benchmark: Results in Action

The Airline Quality Rating was developed and first announced in early 1991 as an objective method of comparing airline performance on combined multiple factors important to consumers. Over a span of seven years the Airline Quality Rating has provided a summary of month-by-month quality ratings for the ten major U.S. airlines operating during this period. Using the AQR system and monthly performance data for each airline for the multi-year period provides comparative data for a longer term view of quality in the industry. Since the Airline Quality Rating is comparable across airlines and across time, monthly rating results can be examined both individually and collectively. A composite industry average that combines the ten major airlines which are monitored each month on 19 criteria over the seven year span is represented in Table 3. Table 4 provides a summary of data.


Table 3
Benchmark Indicators 1991-1997

Year          1997      1996      1995      1994      1993      1992      1991
AQR Result    0.0001   -0.0762   -0.0948   -0.1103   -0.0706   -0.0309   -0.0167

Note. From "Airline Quality Report," by B. Bowen and D. Headley, 1991-1998, Aviation Monograph Reports. Wichita State University and University of Nebraska at Omaha.

Table 4
Summary of Data

Mean -0.05629

Standard Deviation 0.04072

Standard Error 0.01539

Minimum -0.1100

Maximum 0.000

Median -0.07000

Lower 95% CI -0.09395

Upper 95% CI -0.01863

Note. t = 3.675 with 6 degrees of freedom. The two-tailed P value is 0.0106, considered significant.
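The sketch below recomputes the Table 4 summary statistics from the rounded annual industry-average scores reported in Table 3. Because the published table worked from unrounded inputs, the results agree only approximately, and the 95% critical value for 6 degrees of freedom is hard-coded for simplicity.

```python
# Recompute the Table 4 summary statistics from the rounded annual
# industry-average AQR scores in Table 3 (results match only approximately).
import math

scores = [0.0001, -0.0762, -0.0948, -0.1103, -0.0706, -0.0309, -0.0167]

n = len(scores)
mean = sum(scores) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
se = sd / math.sqrt(n)

t_stat = mean / se                 # one-sample t against zero, df = n - 1
t_crit = 2.447                     # two-tailed 95% critical value for 6 df
ci_low, ci_high = mean - t_crit * se, mean + t_crit * se

print(f"mean={mean:.5f}  sd={sd:.5f}  se={se:.5f}")
print(f"t={t_stat:.3f}  95% CI=({ci_low:.5f}, {ci_high:.5f})")
```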

Continuing a trend started in 1994, the AQR industry average scores show an industry that is improving in quality. 1997 shows the largest change for industry average AQR scores of any of the past seven years. For 1997 the overall industry average AQR score was the highest of any of the seven years rated. The AQR score improvement was the most of any year-to-year score changes since 1991. While factors of on-time performance, involuntary denied boardings, and mishandled baggage are better, a 20% increase in the number of complaints filed with the Department of Transportation runs counter to a recovered industry. Financial performance has certainly improved along with some indicators of quality performance. Increased consumer dissatisfaction expressed by an increased volume of complaints seems to indicate that how things are done is just as important as what gets done.

The AQR was originally developed for the eventual purpose of benchmarking the U.S. major airline industry, which is highly competitive and highly regulated. The airlines clearly compete for the AQR rating. American Airlines launched a large marketing campaign when they were rated number one in airline quality in 1991, 1992, and 1994. Regulatory officials, consumers, financial analysts, and others are interested in monitoring overall industry performance and the resulting effects of situational environment changes. Airlines must monitor operational performance to maintain competitiveness. Each airline must monitor performance against the industry standard and previous case history for that air carrier. Thus each airline will have to know the effect of each operational performance indicator and act to effect change. Table 5 portrays each airline's results for the seven year span. The order is from high to low score for the calendar year of 1997.


Table 5
Industry Average AQR Scores for U.S. Major Airlines

                1997    1996    1995    1994    1993    1992    1991
Southwest       0.346   0.306   0.221   0.211   0.252   0.251   0.220
Alaska          0.112
Continental     0.069  -0.095  -0.340  -0.574  -0.540  -0.274  -0.266
American        0.050   0.033   0.164   0.225   0.231   0.290   0.323
United          0.041   0.031   0.058   0.123   0.176   0.214   0.168
Delta           0.000  -0.017  -0.024  -0.031   0.076   0.123   0.193
Northwest      -0.069  -0.100  -0.222  -0.210  -0.247  -0.193  -0.143
America West   -0.116  -0.275  -0.145  -0.282  -0.294  -0.267  -0.325
Trans World    -0.199  -0.302  -0.303  -0.307  -0.286  -0.398  -0.435
US Airways     -0.233  -0.267  -0.262  -0.148  -0.003  -0.024   0.115

        Mean      SD      SE       Min      Max
1997    0.0001   0.1678  0.0531  -0.2330   0.3460
1996   -0.0762   0.1939  0.0646  -0.3020   0.3060
1995   -0.0948   0.2077  0.0692  -0.3400   0.2210
1994   -0.1103   0.2671  0.0890  -0.5740   0.2250
1993   -0.0706   0.2805  0.0935  -0.5400   0.2520
1992   -0.0309   0.2603  0.0868  -0.3980   0.2900
1991   -0.0167   0.2773  0.0924  -0.4350   0.3230

Note. From "The 1998 Airline Quality Rating," by B. Bowen and D. Headley, 1998, Aviation Monograph Report 98-1. Wichita State University and University of Nebraska at Omaha.
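To illustrate how a carrier might read Table 5 as a benchmark, the short sketch below ranks the 1997 scores and reports each airline's gap to the year's top performer. The figures are taken directly from the 1997 column of Table 5.

```python
# Rank the 1997 AQR scores from Table 5 and show each carrier's performance
# gap to the industry leader (the benchmark discussed in the text).
scores_1997 = {
    "Southwest": 0.346, "Alaska": 0.112, "Continental": 0.069,
    "American": 0.050, "United": 0.041, "Delta": 0.000,
    "Northwest": -0.069, "America West": -0.116,
    "Trans World": -0.199, "US Airways": -0.233,
}

benchmark = max(scores_1997.values())              # leader's score
for airline, score in sorted(scores_1997.items(), key=lambda kv: -kv[1]):
    gap = benchmark - score                        # gap to the leader
    print(f"{airline:<13} AQR={score:+.3f}  gap to leader={gap:.3f}")
```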


Table 6
Monthly AQR Scores: Southwest Airlines

            1997    1996    1995    1994    1993    1992    1991
January     0.348   0.274   0.222   0.233   0.280   0.291   0.244
February    0.351   0.284   0.229   0.233   0.300   0.287   0.254
March       0.355   0.288   0.255   0.239   0.295   0.274   0.241
April       0.309   0.268   0.265   0.202   0.238   0.266   0.245
May         0.305   0.241   0.256   0.210   0.245   0.263   0.250
June        0.323   0.250   0.230   0.206   0.241   0.261   0.254
July        0.350   0.351   0.204   0.221   0.174   0.265   0.203
August      0.349   0.351   0.203   0.221   0.170   0.270   0.183
September   0.353   0.400   0.232   0.236   0.169   0.256   0.202
October     0.394   0.319   0.197   0.191   0.308   0.266   0.196
November    0.337   0.330   0.187   0.187   0.304   0.159   0.190
December    0.384   0.316   0.175   0.151   0.306   0.149   0.179

Average     0.346   0.306   0.221   0.211   0.252   0.251   0.220

            Mean     SD      SE      Min     Max
January     0.2703  0.0428  0.0162  0.2220  0.3480
February    0.2769  0.0427  0.0161  0.2290  0.3510
March       0.2781  0.0403  0.0152  0.2390  0.3550
April       0.2561  0.0329  0.0124  0.2020  0.3090
May         0.2529  0.0285  0.0108  0.2100  0.3050
June        0.2521  0.0362  0.0137  0.2060  0.3230
July        0.2526  0.0723  0.0273  0.1740  0.3510
August      0.2496  0.0757  0.0286  0.1700  0.3510
September   0.2640  0.0828  0.0313  0.1690  0.4000
October     0.2673  0.0777  0.0294  0.1910  0.3940
November    0.2420  0.0777  0.0294  0.1590  0.3370
December    0.2371  0.0957  0.0362  0.1490  0.3840

Note. From "The 1998 Airline Quality Rating," by B. Bowen and D. Headley, 1998, Aviation Monograph Report 98-1. Wichita State University and University of Nebraska at Omaha.


As an example, Table 6 conveys the performance of 1997's leader, Southwest Airlines. This chart visually presents 1997's performance and provides the historical trend data for one year. Additionally, Table 6 shows performance over the seven year span, which could set a higher benchmark for this individual carrier than use of the industry average as a benchmark. Identification of key benchmarks is available for any targeted point. Each airline will be able to analyze performance relative to the overall industry and its own past individual case.

Applications for the Benchmark Standard

In order for benchmarking to be successful, lasting performance improvements must be made. Sustaining the momentum is crucial to overcoming old practices and implementing new ones. New processes in organizations require constant attention and continual practice. Old practices must be unlearned. Three types of issues arise: ensuring the successful implementation and operation of best practices in the organization, institutionalizing benchmarking as the way to search for best practices throughout the agency, and clearly defining the future of benchmarking for best practices as a means for bringing better service to customers (Keehley et al., 1997).

The major airlines are realizing that it is important to attract and retain customers. "Companies are learning that it is important to monitor customers' needs and wants and then strive to meet those needs and wants. If an airline fails to provide quality/satisfaction in its services (i.e. passenger satisfaction), it will lose its customers to its competitors" (Bowen & Headley, 1997, p. 61). "It is essential for all business organizations to retain existing customers and attract new ones. Since the signs from the service provider (emitter) are interpreted by the customer they can either strengthen or weaken the persuasive influence of the company and thereby affect its image and the customer response. It would be interesting to research what these signs are in the area of service provision and their impact on the loss or gain of trade" (Malver, 1988, p. 223). Studies may indicate signs, whether they are positive or negative, and the impact on the customer. These impacts determine whether the customer will remain or leave. You can perform research to detect signs that have "a common international interpretation and the same impact irrespective of the nationality of the passenger" (Malver, 1988, p. 223). Findings from this study may help the "company to improve the delivery of service and to contribute the development of the discipline itself" (Malver, 1988, p. 224). The results from the AQR could most certainly help the major airlines to improve their delivery of service. Alaska Airlines could improve the number of mishandled bags and involuntary denied boardings, and American could improve its on-time performance. All of the major airlines can use the results to see how they compare against the competition and improve their respective services.

CONCLUSION

Benchmarking is not a solution to all of the problems an agency faces but "a powerful weapon in the performance improvement arsenal" (Keehley et al., 1997, p. 207). Benchmarking cannot solve all of the problems, but it allows an agency to look outward and provides the reason and methods that organizations need to seek out best practices and solve performance problems. The need for excellence will become even greater in the future as consumers become more demanding. "Budgets will shrink, the demand for accountability will increase, the need for demonstrable results will grow" (Keehley et al., 1997, p. 206). The use of the AQR as an industry benchmark can enhance airline operational performance.

Prior to the AQR, a consistent method for monitoring airline quality on a timely, objective, and comparable basis did not exist. For the first time in the airline industry, a rating was developed that used a multi-factor weighted average approach that resulted in a starting point for monitoring airline quality. The end result is a rating for individual airlines with ratio scale properties that can be compared across airlines and across time. Additionally, the rating turns data into a more useful and understandable form for consumers.

Because most airline operations are similar throughout the world, this approach can also be used by many countries to enhance the quality of their airlines. A global airline performance benchmark would be in the best interests of all the airlines and consumers. Such a benchmark could identify some basic performance factors that could be tracked internationally. The AQR offers a readily available blueprint of a benchmark that is applicable to global airline benchmarking and to other organizations and industries. It is envisioned that the AQR benchmark will provide a baseline for future comparative research. Such comparative research could include correlational studies. These studies could attempt to show a cause and effect relationship between the AQR and airline financial performance or the AQR and airline safety.


References

Bowen, B. D., & Headley, D. E. (1999). The airline quality rating 1999 (UNO Aviation Monograph Report 99-3). Omaha, NE: Aviation Institute.

Bowen, B. D., & Headley, D. E. (1998). The airline quality rating 1998 (UNO Aviation Monograph Report 98-1). Omaha, NE: Aviation Institute.

Bowen, B. D., & Headley, D. E. (1991). The airline quality rating 1991 (NIAR Report 91-11). Wichita, KS: Wichita State University.

Bowen, B. D., Headley, D. E., & Luedtke, J. (1992, Winter). A quantitative methodology for measuring airline quality. Journal of Aviation/Aerospace Education and Research, 2(2), 27-33.

Bowen, B. D., Headley, D. E., & Lutte, R. (1993, Fall). The airline quality rating: Developing an industry standard. Journal of Aviation/Aerospace Education and Research, 4(1), 33-39.

Camp, R. C. (1992). Learning from the best leads to superior performance. Journal of Business Strategy, 11, 55-65.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297-334.

Fellers, G. (1992). The Deming vision: SPC/TQM for administrators. Milwaukee, WI: ASQC Quality Press.

Finnigan, J. P. (1996). The manager's guide to benchmarking. San Francisco: Jossey-Bass.

Headley, D. E., & Bowen, B. D. (1997). International airline quality measurement. Journal of Air Transportation World Wide, 2, 55-63.

Keehley, P., Medlin, S., MacBride, S., & Longmire, L. (1997). Benchmarking for best practices in the public sector. San Francisco: Jossey-Bass.

Malver, H. (1988). Service in the airlines: Customer or competition oriented? Unpublished doctoral dissertation, Stockholm University, Sweden.

Richardson, H. L. (1992, October). Improve quality through benchmarking. Transportation and Distribution, 33, 32-37.

Velocci, A. L. (1997, June 9). Competitive index: Benchmarking tool retains basic focus. Aviation Week and Space Technology, 49.


Multiple Expert Evaluations of a PC Computer-Based Aviation Flight Training Device

Jeffrey S. Forrest
School of Computer and Information Sciences

Nova Southeastern University

ABSTRACT

The usability of a personal computer-based aviation training device (PCATD) was investigated by conducting multiple expert evaluations. One group of experts performed a heuristic evaluation of the PCATD system. Experts in a second group evaluated the PCATD by conducting a cognitive walkthrough analysis. An ethnographic analysis was also carried out by directly observing and interviewing the participating experts during the evaluations. Experts evaluated the usability of the PCATD as applied to various practical test standards used for instrument flight training. Strong consensus by the experts in both groups indicated that the PCATD was usable for fundamental flight training as required by the Federal Aviation Administration's Instrument Rating curriculum. Issues concerning various PCATD simulation fidelities and related inconsistencies in interface design were discovered. These issues caused concern over using the PCATD for training that could be applied to actual flight time.

STATEMENT OF THE PROBLEM

Aviation flight training and pilot certification within the US is administered by the US Department of Transportation's Federal Aviation Administration (FAA). In 1994, the FAA began to consider affordable innovations that might enhance the improvement of pilot performance (Beringer, 1996). The FAA focused on the use of off-the-shelf (OTS) flight training simulations that could be supported on personal computers (PCs). By 1997, the FAA had published its guidelines for approving personal computer-based aviation training devices (PCATD) for use in flight training ("Qualification and," 1997). At the time of this study, there were at least four commercial entities offering off-the-shelf PCATDs approved by the FAA (Chamberlain, 1998). Little is known about the usability characteristics for any of the currently approved PCATD systems.

The focus of this project was to conduct multiple expert usability evaluations of one selected PCATD system. Evaluations included expert usability (heuristic), cognitive walkthrough, and ethnographic analysis as applied to specific FAA training guidelines conducted on the PCATD. The identification and application of these techniques are discussed subsequently in this study.

Evaluation Goals

The first goal for this evaluation was to uncover new knowledge regarding the usability of PCATD systems by FAA Certified Flight Instructors (CFIs). CFIs selected as participating experts were highly experienced in the utilization of computer-generated flight simulation. The information gained from this project was also used to make recommendations toward the improvement of interface design, and application of the PCATD as a flight-training tool.

LITERATURE REVIEW

Historical Overview of Flight Training Device Evaluation and Related Theory

The birth of the modern flight training simulator is often attributed to Ed Link, who created the "Link Trainer" in 1929 (Gunston, Pyle, & Chemel, 1992). The Link Trainer was described as a ground-based device that pilots could use to learn the basic skills needed to fly before leaving the ground (Gunston et al., 1992). Link Trainers were designed to simulate the use of flight instruments typical of aircraft being produced during that time.

The realism or "fidelity" (Caro, 1988) of the Link Trainer was very low as compared to the flight simulators produced today. Simulation fidelity has been identified as a two-dimensional measurement of the realism associated with physical and functional characteristics (Hays & Singer, 1989). The ability of a simulator to accurately represent the visual, spatial, or kinesthetic characteristics of the flight environment is known as physical fidelity (Hays & Singer). Hays and Singer contrast the informational, or stimulus and response characteristics, as the functional fidelity of the simulation. Physical characteristics of the early Link Trainer emphasized attributes such as the location of flight controls and limited visual (spatial) training. Later models of the Link Trainer began to incorporate more accurate representations of information displayed on instruments in response to the pilot's actions in a training situation.

Since the advent of the Link Trainer, flight simulators have evolved to a state of technology that can completely duplicate the flight environment for a specific aircraft. It is now possible to competently train flight crewmembers (pilots) to fly a specific type of aircraft without ever using the actual aircraft as a part of the training program (see Footnote 1). This current level of high fidelity flight simulation was developed from a need to train crewmembers to perform tasks not previously possible or to a skill level previously unattainable (Caro, 1988). The evolution of flight simulation technology was motivated and built upon learning theories advocated by cognitive scientists such as Charles Osgood and Edward Thorndike (Caro, 1988). These theories stipulated that successful learning from simulation requires that the simulation have a one-to-one relationship to reality (Caro, 1988). This approach to simulation design seeks high levels of fidelity as the characteristic that will foster successful learning. As physical and informational simulation approaches reality, learning will be more effective. For this reason, modern simulators have reached a level of high physical and informational reality. Bill Siuru and John Busick (1994) describe today's flight simulation as computer-facilitated "virtual reality." They describe virtual reality as multi-sensory flight simulation that provides three-dimensional sight and sound along with feedback for touch and motion (Siuru & Busick).

Since the late 1960s, the effectiveness of high fidelity in flight simulation has been questioned by several researchers (Macfarlane, 1997; Caro, 1988; Prophet & Boyd, 1970; Grimsley, 1969). Macfarlane (1997) stated, "...the evolution of flight simulation, as a realistic representation of flight parameters, has often overshadowed the practical value of simulators and led to a number of false assumptions about their training value" (p. 59). According to Macfarlane (1997), simulation fidelity should be evaluated in terms of "task fidelity" and "instructional fidelity" (p. 63). He defines task fidelity as the degree to which simulation is able to recreate the actual parameters of a mission, in terms of training and practice. Instructional fidelity is defined by Macfarlane as the effectiveness of the simulation, as part of an instructional system, to transfer knowledge to the training crewmembers. Macfarlane's taxonomy for fidelity does not feature the importance of physical and knowledge realities as was stressed earlier for successful simulation design. Instead, proper simulation design is based upon first asking what is to be accomplished, then designing or selecting the simulation that best meets that need.

Macfarlane (1997) further emphasizes this strategy by stating, "Simulation should not be undertaken for simulation's sake but rather for some predetermined purpose...." (p. 73). Proper instructional design and instructional systems development are essential to Macfarlane's philosophy of simulation as applied to training. Simulations should be used to support the instructional design and the related systems necessary to meet the goals of the learning objective(s). Evaluation of simulator effectiveness should focus on the relationship between desired learning objectives and the simulation fidelity required to meet those specific objectives. High fidelity as a characteristic of simulation design does not ensure effective crewmember training.

Paul Caro (1988) also supports Macfarlane's reasoning by identifying the design characteristics that support effective, low fidelity flight simulators. Caro stated that fidelity should be designed around the elements of cues, discriminations, mediation, and generalizations. As an example of these criteria, consider a low fidelity computer-based training (CBT) simulator. Assume that the example CBT unit has a standard computer monitor, keyboard, and mouse. The graphical user interface (GUI) depicted on the monitor only shows a few elements of the actual flight environment. According to Caro, cues are meanings assigned by the pilot to stimuli represented on the GUI. If the simulation environment offers the pilot an opportunity to learn and assign the correct meaning to the stimulus provided, then effective simulation has taken place without the need for high fidelity. Discrimination is the ability of the pilot to differentiate between various stimuli and assign the proper meaning to each recognized stimulus. The CBT simulation need not offer realistic physical cues in order for the pilot to properly discriminate between various stimuli. As an example, the pilot could learn to discriminate and assign meaning to stimuli solely from on-screen text descriptions or audio explanations. Caro refers to simulation design elements that foster discrimination, such as on-screen text descriptions, as mediations. Mediations also include generalizations, which are low fidelity representations that allow a transfer of knowledge to occur. Generalizations are elements of low fidelity that are used in simulation when the pilot already has knowledge of the element being represented by the generalization. For example, it may not be necessary to simulate the ability, or fidelity, to adjust a flight instrument if the pilot is already aware of how to adjust and use the flight instrument.

Low Fidelity Flight Training Simulation

The value and use of simulation as a training device in aviation has been well documented over the past 30 years (Beringer, 1996). Over this period, the traditional emphasis on designing flight training simulators with high fidelity characteristics has significantly increased the cost of aviation simulation devices. This expense has created an industry demand for lower cost, OTS low fidelity training devices (Wilson, 1998). The advent of the personal computer (PC) has facilitated the design and implementation of lower cost, low fidelity training devices. PC-CBT devices that properly match fidelity with learning objectives are now in demand by commercial, military, and civilian aviation training facilities (Sutton, 1998).

Within the US, simulators must be approved by the FAA for use in FAA-required pilot or crew training programs. The FAA's responsibility regarding PC-based flight simulation is to certify that the level of fidelity is compatible with the learning objectives associated with specific FAA training objectives. In this way, the FAA "qualifies" the fidelity of the simulation and "approves" the use of the simulator for a specific training curriculum (Chamberlain, 1998).

In 1995, the FAA began to approve and qualify low fidelity PC-Based Aviation Training Devices (PCATD). The primary motivation for the FAA's support of PCATD was to potentially reduce the overall cost of flight training to the industry and improve pilot procedural training as related to specific FAA training guidelines (Beringer, 1996). The FAA's approval of PCATD applies to specific primary instrument training guidelines published by the FAA (Federal Aviation Administration, 1997).

Little is known about the usability of low fidelity PCATDs as applied to flight training required for the Instrument Rating. A study conducted by Dennis Beringer (1996) compared a PCATD to alternate forms of FAA-approved training. In this study, Beringer (p. 11) found that the examined PCATD had "...sufficient task fidelity to motivate generalizable behavior, producing outcomes that are comparable to those obtained in other simulation devices, in fact, aircraft." Beringer's study also incorporated a component of evaluation similar to a cognitive walkthrough. A cognitive walkthrough has been described by Miller and Jeffries (1992) as an evaluation that compares the ability of the interface to the user goals and expectations. In Beringer's (1996) study, the users (pilots) were asked to compare the PCATD fidelity to the "real world" aircraft. Overall, the users found the PCATD more sensitive than the actual aircraft and harder to fly (Beringer, 1996).

A more recent study conducted by Taylor, Lintern, Hulin, Talleur, Emanuel, and Phillips (1997) measured the effectiveness of PCATD training as compared to actual flight training. This study did not specifically evaluate the usability of the PCATD in the training environment. Methodology focused on the comparison of user performance indexes and FAA published guidelines using both the PCATD and actual aircraft training environments.

Simulation Evaluation

Shneiderman (1998) has stated that the primary goal for usability evaluations "...is to force as much as possible of the evolutionary development into the prerelease phase, when change is relatively easy and inexpensive to accomplish" (p. 144). This philosophy for evaluation applies during the design, or formative, stage of system development. Wilson (1998) describes how current CBT aviation simulation is designed with little opportunity for formative evaluation. Instead, most low fidelity aviation CBT simulators are being offered as a "proof of concept" product, whereby evaluation is primarily summative in the form of end-user feedback (p. 28). Literature offered by Beringer (1996) and Taylor et al. (1997) seems to support this conclusion in regard to the PCATD. Emphasis in the evaluation of the PCATD has been focused on the transfer of learning as a proof of concept, rather than on the evaluation of effective PCATD design.


METHODOLOGY

Evaluation Methods

The purpose of this evaluation was to assess the usability of a PCATD as applied to selected FAA Instrument Rating training guidelines. The evaluation concluded with recommendations for improving PCATD interface design and its application within the aviation training environment.

The proposed PCATD evaluation was summative and conducted within an actual training environment. Shneiderman (1998) suggests that expert reviews be conducted as summative evaluations. He offers several models for expert evaluations that are particularly viable for the PCATD. First, Shneiderman suggests the heuristic evaluation as a method to "...critique an interface to determine conformance with a short list of design heuristics" (p. 126). In this study, the design heuristics used were the "eight golden rules" for interface design suggested by Shneiderman (1998). Shneiderman's design rules were also supplemented with criteria for "checklist evaluations" as provided by Ravden and Johnson (1989). Miller and Jeffries (1992) found that heuristic evaluations are very successful for discovering most of the major problems inherent to the design of a user interface. The heuristic evaluation was conducted to provide evidence for specific improvements in the PCATD interface design.

Cognitive walkthroughs are also suggested by Shneiderman (1998) and Wharton, Bradford, Jeffries, and Franzke (1992) for evaluating the interface while conducting a specific task. This evaluation required experts to conduct training tasks as defined by FAA-approved guidelines. Cognitive walkthroughs are based upon the evaluation theory of "learning by doing" and focus on basic usability principles (Wharton et al., 1992). Miller and Jeffries (1992) compared the advantages of various structured evaluation processes. They determined that cognitive walkthroughs are well suited for discovering problems with the interface as related to the user's goals and assumptions. Therefore, the cognitive walkthrough should provide data leading to an assessment of the cues, discriminations, mediations, and generalizations (Caro, 1988) that will be experienced by the user of the PCATD interface.

It has been recommended that in addition to structured evaluations, the potential effects of culture or the social situation should also be factored into the evaluation (Sommerville, Bentley, Rodden, & Sawyer, 1994). Sommerville et al. (1994) and Shneiderman (1998) suggest using ethnographic observation as a complement to other forms of evaluation. An ethnographer for this evaluation was present for both the heuristic and cognitive walkthrough evaluations. Ethnographic observations and interpretations were made in order to help determine the factors not inherent to the PCATD that influence the evaluations conducted in the heuristic and cognitive walkthrough evaluations.

Subjects

Two groups, each consisting of three FAA Certified Flight Instructors (CFIs), were asked to participate in the evaluation. One group conducted the heuristic evaluation, while the other implemented the cognitive walkthrough. Experts were solicited from a population of CFIs having over ten years of experience in the application and evaluation of flight training simulators. Miller and Jeffries (1992) found that as the relative expertise of evaluators increases, fewer experts are required for the evaluation. In this evaluation, the same number of experts participated in both the heuristic and cognitive walkthrough evaluations as was used in previous successful studies conducted by Miller and Jeffries (1992).

Setting

The PCATD evaluations were conducted within the Aerospace Science Department (ASD) of a midwestern college. The CFIs participating in the evaluations were currently employed as faculty members of the ASD. The heuristic, cognitive walkthrough, and ethnographic evaluations were conducted within the aviation simulation lab that is used by the ASD to train pilots.

Apparatus

The PCATD selected for evaluation was fully functioning, commercially available, and FAA approved. The PCATD simulates the flight environment for a single engine aircraft used for primary instrument flight training. The ASD is certified by the FAA to administer approved instrument flight training using the selected PCATD.

Procedure

All CFIs employed by the ASD were invited to volunteer as expert evaluators. Each participating CFI was professionally trained in human factors analysis and simulation-based training. Under these circumstances, expert evaluations can be conducted within one to two days (Dumas & Redish, 1993). Each CFI was given thirty minutes to conduct their heuristic or cognitive walkthrough following a specified FAA training standard.

The FAA training standards followed were specified in the FAA's publication Instrument Rating for Airplane, Helicopter, and Airship Practical Test Standards ("Instrument Rating for," 1994) (PTS). The three CFIs conducting the heuristic evaluation were asked to select any three tasks, referred to as "areas of operation," defined by the FAA's PTS. Each expert then used the PCATD to apply the three chosen areas of operation in any manner they deemed suitable to primary instrument instruction.

The PTS areas of operation were considered adequate for evaluation. Each area can be quickly evaluated, is stated in the user's words, provides enough information to complete the task, and is linked directly to the goals of the proposed evaluation (Dumas & Redish, 1993). As the heuristic evaluators explored their selected areas of operation, they were asked to write comments on a survey addressing "areas of concern" (Dumas & Redish, 1993). These areas of concern were related to the "eight golden rules" for interface design as suggested by Shneiderman (1998).

The CFIs participating in the cognitive walkthrough were asked to "practice" three pre-identified areas of operation contained within the PTS (see Footnote 2) using the PCATD. They then answered a post-task survey qualifying the PTS operations in relation to the cues, discriminations, mediation, and generalizations (Caro, 1988) inherent to the PCATD interface (see Appendix B).

A single ethnographer was also present for each of the expert evaluations. The ethnographer was an Instrument Ground Instructor with over ten years of experience in flight simulator training and in human factors associated with student interaction and CBT. The ethnographer observed each expert evaluator in both groups. Ethnographic examination uncovered how the common cultural values of the experts influenced their perception of the characteristics inherent to the PCATD. The primary goal of the ethnography was to detect the effect of culture on the perception, or experience (see Footnote 3), of using the PCATD (Gall, Borg, & Gall, 1996). As suggested by Sommerville, Bentley, Rodden, and Sawyer (1994, p. 358), no specific set of instructions was provided to the experts concerning the ethnographic observation. However, the experts were encouraged to "think out loud" and discuss any aspect of the PCATD with the ethnographer. This strategy was successfully used by Karat, Campbell, and Fiegel (1992) in a study comparing techniques in user interface evaluation.

Analysis of the Data

Qualitative analysis was applied to the results obtained from the heuristic and ethnographic evaluations (see Appendix A). Specifically, a "hermeneutic circle" (Gall et al., 1996, p. 706) analysis was applied to the concerns and issues raised by each evaluator of the PCATD. In this analysis, meaning was interpreted from the concerns or comments made by each evaluator. Meaning was also applied to the overall concerns made by each evaluator and taken as a whole (Gall et al., 1996). Conclusions and recommendations were made regarding the usability of the PCATD, based upon the analysis. Recommendations for improving the usability of the PCATD were also made.

A questionnaire measuring each evaluator's attitude regarding aspects of the PCATD usability was provided to each member of the cognitive walkthrough (see Appendix B). The questionnaire contained an ordinal scale measuring ten (10) levels of agreement for each area of concern (Gall et al., 1996). The questionnaire was pre-tested for clarity and understanding by various CFIs within the ASD. The experts used within the evaluation did not represent a normal population. Therefore, a Kruskal-Wallis ANOVA analysis was applied to the ordinal results provided by each expert (Gall et al., 1996, p. 297). The Kruskal-Wallis ANOVA analysis provided quantitative results measuring the level of agreement between each expert's cognitive walkthrough evaluation. Qualitative conclusions were made based upon the quantitative analysis. Recommendations were made regarding the usability of the PCATD, based upon the cognitive walkthrough analysis. Recommendations for improving the usability of the PCATD were also made based upon a synthesis of all evaluations and analyses.
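As an illustration of this analysis step, the sketch below shows how ordinal agreement ranks from three evaluators could be compared with a Kruskal-Wallis test in Python using scipy. The rank values are hypothetical placeholders, not the study's raw responses (only the per-question means and the resulting H and p values are reported in the Results section), so the printed output is not expected to match the study's figures.

# Minimal sketch (Python), assuming hypothetical data: Kruskal-Wallis test on
# ordinal survey ranks (1 = complete agreement, 9 = complete disagreement).
from scipy.stats import kruskal

# One list of seven question ranks per evaluator; these are made-up values.
evaluator_a = [1, 1, 2, 2, 2, 2, 1]
evaluator_b = [2, 1, 3, 2, 2, 1, 1]
evaluator_c = [3, 2, 2, 2, 2, 2, 2]

# H statistic and p-value comparing the evaluators' rank distributions;
# a large p (e.g., p > 0.05) is read here as no significant disagreement.
h_stat, p_value = kruskal(evaluator_a, evaluator_b, evaluator_c)
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")

# Mean rank per question across the three evaluators, analogous to Table 1
# in the Results section.
question_means = [round(sum(ranks) / 3, 1)
                  for ranks in zip(evaluator_a, evaluator_b, evaluator_c)]
print(question_means)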

RESULTS

Results of the Heuristic Evaluation

Three CFIs participated in the heuristic evaluation of the PCATD. Each CFI was given approximately 30 minutes to conduct their evaluation. Each heuristic evaluator wrote comments regarding "areas of concern" as they used the PCATD to explore their selected areas of operation (see Appendix A). These comments were qualitatively evaluated and related to Shneiderman's "eight golden rules" (1998).


Issues of simulator fidelity were characterized in terms of cues, discrimination, and mediation (Caro, 1988). The following results provide each question asked on the survey along with a qualitative analysis of the comments made by all three evaluators. Relevant ethnographic analysis is also included for each question.

Heuristic and Ethnographic Results

What are your concerns regarding the clarity of objects, or information, displayed on the PCATD screen and control system?

Two of the three evaluators remarked that the icons presented in the PCATD user interface were "too small" and depicted images that were "unknown" in terms of implied meaning or utility. The third evaluator felt that all of the PCATD interface elements presented were "clear and easy to identify."

According to Shneiderman (1998), shortcuts such as icons are "appreciated by knowledgeable and frequent users" (p. 74). Although knowledgeable, the experts were not frequent users of the specific PCATD being evaluated. Evaluators expressing concern regarding the ambiguity of the icons felt that an adequate solution would be to place short, abbreviated textual descriptions under each icon. Offering this feature as a user option would take into account the experience level of the user, as suggested by Shneiderman (1998) when considering combining text with icon representations.

Shneiderman (1998) also suggests that each icon should be designed in a "familiar and recognizable manner." Since PCATD technology is relatively new, it was difficult for the evaluators to relate the icons presented to any pre-existing CBT interface designs. The icons represented in the evaluated PCATD may become familiar and recognizable standards in future PCATD systems. According to Caro (1988), the addition of textual descriptions would enhance the quality of mediation as supported by the PCATD interface.

Ethnographic observation revealed that one of the evaluators had limited prior experience in viewing the user interface of the evaluated PCATD. This probably accounts for his characterization of each element as being clear and easy to identify. However, upon further questioning by the ethnographer, the evaluator agreed that the novice user would benefit from textual descriptions related to each specific icon.

What are your concerns regarding the compatibility of objects, or information displayed by the PCATD, to similar attributes as experienced in actual flight?

All three evaluators agreed that the simulation fidelity of the PCATD as related to the actual flight environment was quite good. Ethnographic evaluation determined that the evaluators found certain flight maneuvers to be "jerky" and "too rapid" in response fidelity. As suggested by Caro (1988), the cues provided by these unrealistic fidelities might deter the student pilot from learning the correct meaning of the stimulus being provided by the PCATD. All three evaluators felt that the cues provided during these maneuvers would be of minor concern to the novice student. They also felt that although fidelities in these maneuvers were not realistic, the actual outcome of the simulation was accurate enough to provide proper understanding of the learning objective by the student pilot.


What are your concerns regarding the consistency of PCATD performance and display as applied to each PTS area of operation that was conducted?

The cues, discriminations, and mediations provided by the PCATD while simulating flight were considered adequate for successfully conducting all but one area of operation contained within the PTS. It was determined by two of the evaluators that the PCATD did not allow the student to perform a flight maneuver referred to as a "stall." This deficiency in simulation fidelity was considered by all three evaluators as a serious issue requiring attention in software redesign and upgrade by the manufacturer of the PCATD. Cues (Caro, 1988) provided during the stall maneuver were considered accurate. However, the simulation was not able to provide the correct "feedback" (Shneiderman, 1998) at the instant in time that the actual aerodynamic effect of the stall occurred.

What are your concerns regarding the ease of operating the PCATD?

Ethnographic observation determined that all three evaluators viewed the PCATD as "relatively easy to use." It was generally agreed that students having a very basic understanding of the personal computer would find the PCATD very easy to use. The rule of "consistency" (Shneiderman, 1998) as compared to other PC-based software was considered very strong as applied to the overall PCATD design.

However, one of the evaluators determined that it was not possible to "multi-task," or switch to other software applications, while using the PCATD. Shneiderman (1998) suggests that design elements that cause a loss of user control can build anxiety and dissatisfaction. All three evaluators agreed that this design flaw would cause potential aggravation for the instructor using the PCATD in a training environment. It was felt that the lack of multi-tasking would have minimal impact on the student's ability to use the PCATD.

What were the best aspects of the PCATD for the student pilot as a user?

All three evaluators felt that the overall fidelity of the PCATD provided a positive experience for learning and building competency in the skills required by the FAA's PTS. The ability to repeat flight-training exercises at the level of fidelity offered by the PCATD was considered its strongest attribute.

What were the worst aspects of the PCATD for the student pilot as a user?

Two of the three evaluators found the design and fidelity of the flight control hardware unsatisfactory. Confusion was observed when all three evaluators attempted to manually adjust the radio frequencies required to operate the instruments being displayed by the simulation. The ethnographer noted that negative comments were made regarding the cues, discriminations, and mediations offered by the radio hardware interface. The evaluators felt that this design would cause the students confusion over simulation consistency (Shneiderman, 1998) as compared to the actual operation of aircraft radios in the flight environment.

Is there anything else about the PCATD you would like to add?

Two of the three evaluators added comments to this question. One evaluator suggested that an additional display containing "approach chart" information be added to the screen. Approach charts are used by pilots during the arrival and landing phase of flight. The evaluator stated that this feature would simulate fidelity comparable to various flight information systems used in the flight deck of an actual aircraft. This suggestion would potentially reduce the memory load on the user while improving the capability to assimilate information (Shneiderman, 1998) while using the PCATD.

Of particular interest were the comments made by the second evaluator responding to this question. This evaluator felt that the PCATD offered excellent fidelities for practice and instructor-led demonstrations. However, he did not believe that the fidelities of the hardware (manual controls) were sufficient to use the PCATD as a training device that would meet certain experience requirements for FAA pilot certification. He stated that the PCATD was an excellent classroom training device, but should not be used to replace any of the FAA's regulatory flight experience.

Further questioning of this evaluator determined that he had extensive experience with flight training simulators offering the highest state-of-the-art fidelity. The prior experiences of this evaluator with very high fidelity simulation technology might have biased his judgment against the PCATD as sufficient for meeting the regulatory requirements of the FAA. It was also his opinion that the PCATD provided much more of a training process than a simulation.

Results of the Cognitive Walkthrough Evaluation

Three CFIs participated in the cognitive walkthrough evaluation of the PCATD. Each CFI was given approximately 30 minutes to conduct their evaluation. All of the CFIs were male. Several female CFIs were invited to participate, but were unable to do so.

Each evaluator participating in the cognitive walkthrough was asked to "practice" three pre-identified areas of operation contained within the PTS (see Footnote 2) using the PCATD. They then answered a post-task survey qualifying the PTS operations in relation to the cues, discriminations, mediation, and generalizations (Caro, 1988) inherent to the PCATD interface (see Appendix B). Ordinal responses measured the level of agreement with each statement asked. A rank of one (1) represented an attitude of complete agreement. A rank of nine (9) represented an attitude of complete disagreement. A rank of ten (10) was used to indicate that the question was not applicable to the characteristic being evaluated. A Kruskal-Wallis ANOVA analysis was also conducted on the ordinal responses submitted on the survey for the cognitive walkthrough. This analysis measured the overall level of agreement (H statistic) between each evaluator's cognitive walkthrough.

Cognitive Walkthrough Results

None of the evaluators for the cognitive walkthrough responded with a rank of ten (10, or not applicable) to any of the survey questions (see Appendix B). The lowest level of agreement indicated by one of the evaluators was a four (4). This rank was assigned to the question, "The objects, or information, displayed on the PCATD screen are identifiable to those same elements as experienced in the actual flight environment" (see Appendix B). Rank level responses for the evaluation ranged from one (1) to four (4). The overall average rank level of agreement across all questions and all evaluators was 1.8. Table 1 summarizes the average rank level (xq) response for each of the questions administered in the cognitive walkthrough evaluation.

The Kruskal-Wallis ANOVA produced a relatively small test statistic (H = 0.522 with 2 degrees of freedom), indicating a strong level of agreement among the rank level responses provided by each evaluator of the cognitive walkthrough. It was further assumed that a significant difference between the responses would be considered to exist if the Kruskal-Wallis probability was p < 0.05. The Kruskal-Wallis analysis of the rank responses produced a probability of p = 0.77, indicating no statistically significant difference between the responses made by each evaluator.

Table 1
Average Rank Level (xq) Response for Each Question Administered within the Cognitive Walkthrough Evaluation

Question (Q)    xq
(Q) 1           2
(Q) 2           1.3
(Q) 3           2.3
(Q) 4           2
(Q) 5           2
(Q) 6           1.7
(Q) 7           1.3

Note. A rank of one (1) equaled complete agreement, while a rank of nine (9) equaled complete disagreement. See the Appendix B section titled "Cognitive Walkthrough Evaluation Primary Questions for Levels of Agreement" for each specific question asked during the cognitive walkthrough evaluation.

Ethnographic observations conducted during the cognitive walkthrough resulted in conclusions similar to those made by the evaluators for the heuristic evaluation. The evaluators for the cognitive walkthrough felt that PCATD simulation for flights conducted at (a) "slow speeds," (b) "high pitch attitudes," and in (c) "turbulence" resulted in inconsistencies (Shneiderman, 1998) not found in actual flight. One evaluator remarked that the visual fidelity of the "natural horizon [earth-sky boundary] caused visual disorientation." All of the experts in this evaluation experienced the same frustration as those in the heuristic evaluation when attempting to control and adjust the simulated radio-navigation PCATD hardware. All evaluators stated that the consistency (Shneiderman, 1998) of cues, discriminations, and mediations (Caro, 1988) was very poor in terms of the radio-navigational hardware. Further questioning by the ethnographer revealed that all three evaluators questioned the decision of the FAA to certify the PCATD as a training device that can be applied to actual training flight time as required by regulation. This concern was based upon the poor fidelities associated with the hardware incorporated within the PCATD design.

CONCLUSION

Discussion

Consensus was strong among the evaluators for both the heuristic and cognitive walkthrough evaluations. Heuristic and ethnographic analysis of the PCATD confirmed similar areas of concern by each evaluator regarding the usability of the simulator as a training device for primary instrument students. The relatively small value of the Kruskal-Wallis test statistic H (H = 0.522; p = 0.733) indicated a strong level of agreement among the evaluators participating in the cognitive walkthrough.

Ethnographic analysis determined that all experts for the heuristic and cognitive walkthrough evaluations felt that the usability of the PCATD was sufficient for training primary instrument students. All evaluators expressed strong concern over the design and fidelity of the PCATD hardware used to simulate aircraft control. One evaluator from the heuristic evaluation felt that the PCATD should not have been approved by the FAA as training that applied to flight time required by regulation. Ethnographic examination during the cognitive walkthrough discovered that all three experts had similar concerns regarding the approval of the PCATD as an FAA-approved training device.

A shortcoming of this evaluation was that all of the CFIs who volunteered to participate were male. An approximately equal number of female and male CFIs from the ASD were invited to participate in either of the evaluations. It is known that two of the ASD female CFIs had scheduling conflicts. Vardaman (1997) stated that males tend to like computers more than females. It would have been beneficial to this evaluation to incorporate the possible effect of gender on the CFIs' qualification of the PCATD.

All of the experts from both evaluations felt that the PCATD was adequate for training each area of operation as described in the FAA's PTS. The only concern was the lack of consistency (Shneiderman, 1998) in fidelities for those areas of operation requiring slow flight speeds or extreme flight attitudes (position of aircraft). Results of the cognitive walkthrough evaluation support the conclusion that the PCATD is adequate for the training, and use by, student pilots pursuing primary instrument training.

Limitations of the Analysis

The primary objective of this study was to discover issues of usability regarding the PCATD design as applied to training required for the Instrument Rating. Conclusions made in this study were based upon the ethnographic observations and opinions made by expert evaluators. This study included six expert evaluators in addition to the ethnographer.

Jakob Nielsen (1993) has provided evidence that on average, three to five experts offer the greatest incremental advantage for discovering issues of usability during an evaluation. However, Nielsen also recommends that in the evaluation of mission critical systems, more evaluators should be used. Based upon the work by Nielsen, this study recommends that future evaluations of the PCATD system should employ between seven and 15 expert evaluators. Nielsen advises that on average, six evaluators will discover 80 percent of all relevant usability issues, while 15 evaluators will increase the probability to 90 percent.
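The figures quoted above are in the spirit of the cumulative problem-discovery model commonly associated with Nielsen's work, in which the proportion of usability problems found by n evaluators is roughly 1 - (1 - L)^n for some per-evaluator discovery rate L. The sketch below illustrates that model only; the discovery rate is an assumed value chosen for illustration, and the output is not intended to reproduce the exact percentages cited in the text.

# Minimal sketch (Python), assuming the cumulative problem-discovery model
# found(n) = 1 - (1 - L)**n, where L is a per-evaluator discovery rate.
# L = 0.31 is an assumed illustrative value, not a figure from this study.
def proportion_found(n_evaluators: int, discovery_rate: float = 0.31) -> float:
    """Approximate share of usability problems found by n evaluators."""
    return 1.0 - (1.0 - discovery_rate) ** n_evaluators

for n in (3, 5, 6, 7, 15):
    print(f"{n:2d} evaluators -> about {proportion_found(n):.0%} of problems found")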

A further limitation to the analysis of this evaluation was the length of time provided to conduct each evaluation. Each evaluator was provided with 30 minutes to conduct their review of the PCATD, exclusive of the time required to fill out each survey form. Nielsen (1993) recommends that from one to two hours be provided for each expert to conduct their evaluation.

As a final concern, it is important to note that this study does not offer statistically significant data that can support inferential conclusions. This study did use valid methodology for exploring the issues of PCATD usability. However, it is strongly recommended that the number of expert evaluators be significantly increased for future PCATD evaluations offering conclusions supported by inferential statistics.

Recommendations

Based upon the usability investigations conducted in this study, the following recommendations are made for further investigation and potential improvement in PCATD usability and design:

1. Flight control hardware should be redesigned for consistency (Shneiderman, 1998). Particular attention should be focused on improving the cues, discriminations, and mediations (Caro, 1988) of the radio-navigational hardware associated with the PCATD.

2. Improvements should be made in allowing the user to control the amount of interactions (Shneiderman, 1998) used to control the configuration of the PCATD simulation software. Icons, objects, and other information provided by the PCATD offered meaning that would only be understood by the experienced user of the PCATD.

3. Simulation of the fidelities experienced during slow airspeeds, unusual attitudes, or turbulence requires improvements in at least the visual cues (Caro, 1988) being displayed by the PCATD.

4. An option should be added allowing the PCATD user to display additional database information (such as approach charts) on a separate monitor consistent with actual flight deck configuration.

Further Recommendations for Study

Evaluators for this project expressed concern that the FAA approved the selected PCATD for use in meeting certain requirements of actual flight time required for pilot certification. This concern was based upon PCATD hardware-related fidelity problems discovered in this study. Further research is recommended to determine the fidelities required for hardware design that would improve the interface consistency of the PCATD as related to the actual flight environment.

This study conducted expert and ethnographic evaluations of a selected PCATD simulator. Further research focused on the design and fidelity of the PCATD, as evaluated by the student pilot, should be considered. New efforts in research should also consider evaluating the PCATD interface during actual training conditions.


REFERENCES

Beringer, D. B. (1996, April). Use of off-the-shelf PC-based flight simulators for aviation human factors research. Washington, DC: Office of Aviation Medicine. (NTIS No. DOT/FAA/AM-96/15)

Caro, P. W. (1988). Flight training and simulation. In Weiner, E., & Nagel, D. (Eds.), Human factors in aviation (pp. 229-261). New York, NY: Academic Press.

Chamberlain, D. H. (1998, July/August). Qualified PC computer-based training devices take off. FAA Aviation News, 37(5), 8-10.

Dumas, J., & Redish, J. (1993). A practical guide to usability testing. Norwood, NJ: Ablex Publishing.

Federal Aviation Administration. (1997, May 12). Qualification and approval of personal computer-based aviation training devices (FAA Advisory Circular No. AC 61-126). Washington, DC: United States Department of Transportation.

Federal Aviation Administration. (1994). Instrument rating for airplane, helicopter, and airship practical test standards. Newcastle, WA: Aviation Supplies & Academics.

Gall, M., Borg, W., & Gall, J. (1996). Educational research. New York, NY: Longman Publishers.

Grimsley, D. L. (1969). Acquisition, retention, and retraining: Group studies on using low-fidelity training devices (Tech. Rep. No. 69-4). Washington, DC: The George Washington University, Human Resources Research Office.

Gunston, B., Pyle, M., & Chemel, E. (Eds.). (1992). Chronicle of aviation. Liberty, MO: JL International Publishing.

Hays, R., & Singer, M. (1989). Simulator fidelity in training systems design: Bridging the gap between reality and training. New York, NY: Springer-Verlag.

Karat, C., Campbell, R., & Fiegel, T. (1992). Comparison of empirical testing and walkthrough methods in user interface evaluation. ACM CHI '92 (pp. 397-404). Monterey, CA: ACM.

Macfarlane, R. (1997). Simulation as an instructional procedure. In Hunt, G. (Ed.), Designing instruction for human factors training in aviation (pp. 59-93). Brookfield, USA: Avebury Aviation.

Miller, J., & Jeffries, R. (1992). Usability evaluation: Science of trade-offs. IEEE Software, 9(5), 97-102.

Nielsen, J. (1993). Usability engineering. San Francisco, CA: Morgan Kaufmann.

O'Neil, H. F., & Baker, E. L. (1994). Technology assessment in software applications. Hillsdale, NJ: Lawrence Erlbaum.

Prophet, W. W., & Boyd, H. A. (1970). Device-task fidelity and transfer of training: Aircraft cockpit procedures training (Tech. Rep. No. 70-10). Alexandria, VA: Human Resources Research Organization.

Ravden, S., & Johnson, G. (1989). Evaluating usability of human-computer interfaces. Chichester: Ellis Horwood Ltd.

Redmond-Pyle, D., & Moore, A. (1995). Graphical user interface design and evaluation. New York, NY: Prentice Hall.


Siegfried, T. (1994). User interface evaluation: A structured approach. New York, NY: Plenum Press.

Shneiderman, B. (1998). Designing the user interface (3rd ed.). Reading, MA: Addison-Wesley.

Sommerville, I., Bentley, R., Rodden, T., & Sawyer, P. (1994). Cooperative systems design. The Computer Journal, 37(5), 357-366.

Siuru, B., & Busick, J. (1994). Future flight. Blue Ridge Summit, PA: Tab Aero.

Sutton, O. (1998, April). Training business on a roll. Interavia, 63(619), 19-22.

Wharton, C., Bradford, J., Jeffries, R., & Franzke, M. (1992). Applying cognitive walkthroughs to more complex user interfaces: Experiences, issues, and recommendations. ACM CHI '92 (pp. 381-388). Monterey, CA: ACM.

Wilson, J. R. (1998, April). Wanted: Off-the-shelf solutions. Interavia, 63(619), 23-28.

FOOTNOTES

1. This is often referred to as "zero flight time" or ZFT. Under ZFT, a training program is conducted entirely within the flight simulator. The aircraft is not used until training using the simulator is complete.

2. Area IV: (A) straight and level flight; (C) rate climbs and descents; and (F) steep turns.

3. Gall, Borg, and Gall (1996, p. 608) refer to this characteristic of cultural perception as "emic" ethnography. Emic ethnography attempts to qualify the effect of culture on the human perception of reality.

APPENDIX A

Heuristic Evaluation - Primary Questions for Areas of Concern

1. What are your concerns regarding the clarity of objects, or information, displayed on the PCATD screen and control system?

2. What are your concerns regarding the compatibility of objects, or information displayed by the PCATD, to similar attributes as experienced in actual flight?

3. What are your concerns regarding the consistency of PCATD performance and display as applied to each PTS area of operation that was conducted?

4. What are your concerns regarding the ease of operating the PCATD?
5. What were the best aspects of the PCATD for the student pilot as a user?
6. What were the worst aspects of the PCATD for the student pilot as a user?
7. Is there anything else about the PCATD you would like to add?


APPENDIX B

Cognitive Walkthrough Evaluation Primary Questions for Levels of Agreement

Each statement will be answered using an ordinal scale measuring levels of agreement: 1 = "strongly agree" to 9 = "strongly disagree," with 10 representing not applicable (NA) (Shneiderman, 1998, p. 140). After conducting the three prescribed areas of operation, the expert will answer the following questions:

1. The student pilot will be able to interpret the objects, or information, displayed on the PCATD screen.

2. The student pilot will be able to relate the objects, or information, displayed on the screen to the required knowledge areas fundamental to primary instrument flight training.

3. The objects, or information, displayed on the PCATD screen are identifiable to those same elements as experienced in the actual flight environment.

4. Adequate information is provided on the PCATD screen for the student pilot to interpret the meaning of each object or action being simulated.

5. The overall simulation of the PCATD is adequate in terms of realism as applied to primary instrument training.

6. Response of the PCATD to user control input is adequate for primary instrument training.
7. As compared to other approved flight training devices, the PCATD is acceptable for primary instrument training.


An Examination of the U.S. Collegiate Aviation Workforce in Preparing the Next Generation of Aviation Faculty Members Beyond 2000

Jeffrey A. Johnson
Aviation Institute

University of Nebraska at Omaha

ABSTRACT

Aviation as an academic field of study has evolved in the span of a century. As the new millennium approaches, collegiate aviation will be called upon to prepare a new generation of highly skilled workers. These workers need to be educated by current and future generations of aviation faculty members. The purpose of this study was to examine the US collegiate aviation workforce to determine if the next generation of faculty members is being adequately prepared. A descriptive survey questionnaire, sent to US University Aviation Association (UAA) institutional members in order to ascertain their workforce needs, was used to collect data for this study. The study found that a significant amount of hiring of qualified aviation faculty members is already occurring. The survey results also indicated that a substantial number of retirements is either taking place or anticipated to take place by the year 2000. A very significant finding was that almost all of the respondents believe the public at large does not have an adequate understanding of collegiate aviation.

INTRODUCTION AND BACKGROUND

During the twentieth century, the entire field of aviation has advanced tremendously. From the historic flight at Kitty Hawk to routine daily transoceanic flights carrying a seemingly countless number of passengers from all walks of life, aviation still seems to be evolving at phenomenal rates. In the US alone, there are over 500 colleges and universities that offer some type of aviation-related program (Collegiate Aviation Guide, 1994). According to Fuller and Truitt (1997), the academic field of aviation has matured from a more historic technical/vocational orientation to a present-day contemporary study involving science, business and public administration, technology, and the social sciences found in modern-day colleges and universities. These changes in the academic field of aviation have necessitated changes in the aviation educator's role as well. As the aviation academic field continues to evolve, a new generation of aviation faculty members must be prepared to fulfill the personnel needs of the industrial, governmental, and academic sectors of aviation beyond the year 2000.

Educators have several formidable challenges in preparing a new generation of aviation faculty members. The first challenge lies in the area of minimum requirements for employment. Unlike many traditional academic fields of study in higher education (e.g., history and philosophy), where the minimum benchmark for prospective faculty members is an earned doctoral degree, the benchmark for the prospective aviation faculty member is often more demanding. Unlike history and philosophy, aviation's technical/vocational orientation started at the airfield and has evolved into a complex multidisciplinary academic field of study found in many colleges and universities (Fuller & Truitt, 1997). This evolution has precipitated a need for aviation faculty members to possess not only a graduate degree (with greater emphasis on the doctorate) and preferential teaching experience, but actual aviation practitioner-oriented field experience combined with professional certification credentials.

Another challenge that seems to plague collegiate aviation is residual negative public perception. This adverse perception of aviation as a legitimate field of academic study still creates hurdles for current aviation faculty members to overcome and may hinder new faculty member entrants. Fortunately, there is evidence that improvement in public perception is gaining momentum. During the 1970s, the Dean of the University of North Dakota (UND) School of Aerospace Sciences, John Odegard, stated, "The acceptability of aviation in the academic community has been painfully slow, but improvement appears to be rapidly on the upswing" (Matson, 1977, p. 178). By 1997, former president of Embry-Riddle Aeronautical University, Steven Sliwa, argued that aviation had still not reached general acceptance in higher education (University Aviation Association Newsletter, 1997). Specific areas of negative public perception may possibly include aviation's recent entrance to higher education in comparison to traditional fields of study, an absence of a longstanding record in research (Truitt & Kaps, 1995), and beliefs held by some traditional academicians that aviation belongs in technical schools and not colleges and universities.

Another area that has continually plagued aviation educators is a failure to reach a consensus on how educators collectively identify academic programs in the field. In a study conducted by Johnson (1997), 14 different terms or phrases were used by aviation educators to identify collegiate aviation.

METHODOLOGY

Subjects

The population for this study included all US UAA institutional members. The November 1997 UAA membership list indicated there were 100 US institutional members. Key assumptions made about the subjects during the study included: (a) University Aviation Association institutional members are representative experts in their field; (b) the data generated from the institutional members can be used to accurately assess how well aviation educators are doing in preparing a future generation of aviation faculty members; (c) the institutional members were current in academic matters concerning their hiring needs and could make reasonable assumptions about future hiring needs; and (d) the members responded to the questionnaire in a sincere manner using their professional, educational, and experiential expertise.

Research Instrument

The instrument used to collect the data was a survey questionnaire developed specifically for the study. The survey was distributed to all 100 US member institutions via US mail. A usable return of 56 surveys (56.0%) was received for the study. The survey was comprised of two sections. The first section incorporated a series of questions posed to the institutional members concerning their aviation faculty recruitment needs, hiring requirements, salary structures, experiential knowledge of their new hires, etc. In response to the survey questions, respondents were directed to choose from a series of short statements ranging from yes/no responses to minimum educational requirements. The second section of the survey instrument incorporated demographic questions. Responses left blank by institutional members were indicated by N/R (Not Reported). In evaluating the data presented in the following tables, rounding errors should be taken into consideration.

DATA ANALYSIS

Demographics

Data from the survey questionnaires were compiled using the software program Minitab (1998). Demographic characteristics included gender, highest degree held, position, institutional affiliation, employment status, and institutional longevity. Of the 56 respondents, 55 (98.2%) are male, 46 (82.1%) are employed at a public institution, 26 (46.4%) are tenured, and 15 (26.8%) have at least 16 years of employment experience at their present institution. Thirty UAA institutional members (53.4%) are 51 years of age or older, 35 (62.5%) have 10,000 students or less at their institutions, 26 (46.5%) are at the associate professor level or higher, and 22 (39.3%) are employed at doctoral-granting institutions of higher education. The percentage of female UAA institutional representatives in collegiate aviation has remained relatively unchanged from 1993 to 1998 (see Table 1). The percentages of female aviation faculty members depicted in Table 1 still remain far below the national average of 32.5 percent as indicated by the 1992 data from the US Department of Education (cited in The Chronicle of Higher Education: Almanac Issue, 1997).

Table 1
Gender by Year

Year      Female N (%)   Male N (%)    N/R N (%)   Total N (%)
1993*     5 (6.4)        74 (93.8)     0 (0.0)     79 (100.0)
1996**    1 (1.3)        74 (98.7)     0 (0.0)     75 (100.0)
1998      4 (7.1)        51 (91.1)     1 (1.8)     56 (100.0)

Note. The data in row 1 (*) are from The Feasibility of Developing a Professionally Accredited Non-Engineering Aeronautical/Aerospace Science Doctoral Degree Program in US Universities (p. 38) by J. A. Johnson, 1993, Ann Arbor, MI: Master's Abstracts International. The data in row 2 (**) are from An Analysis of Curriculum Design in Developing a Doctor of Philosophy Program in Aeronology (p. 59) by J. A. Johnson, 1997, Ann Arbor, MI: Dissertation Abstracts International.

In percentage terms, Table 2 illustrates a relatively stable trend in the highest degree held by UAA institutional member respondents in collegiate aviation during a recent five-year period. Note that approximately one-half of the respondents have a master's degree as the highest degree held, while only one-third of the respondents have a doctoral degree. Likewise, respondents in possession of an associate's degree as the highest degree held represent a very small percentage of all the respondents during the same time frame.


Table 2
Respondents' Highest Degree Held vs. Year

          Associate     Bachelor's    Master's      Doctorate     Total
          N    (%)      N    (%)      N    (%)      N    (%)      N    (%)
1993*     3    (3.8)    11   (13.9)   42   (53.2)   23   (29.1)   79   (100.0)
1996**    1    (1.3)    13   (17.3)   32   (42.7)   28   (37.3)   74   (100.0)
1998      1    (1.9)    6    (11.1)   29   (53.7)   18   (33.3)   54   (100.0)

Note. The data in row 1 (*) are from The Feasibility of Developing a Professionally Accredited Non-Engineering Aeronautical/Aerospace Science Doctoral Degree Program in US Universities (p. 39) by J. A. Johnson, 1993, Ann Arbor, MI: Master's Abstracts International. The data in row 2 (**) are from An Analysis of Curriculum Design in Developing a Doctor of Philosophy Program in Aeronology (p. 59) by J. A. Johnson, 1997, Ann Arbor, MI: Dissertation Abstracts International.

Data Tabulations

The data from the study were incorporated into a series of tables. Some of the data illustrated in this section have been cross-tabulated with demographic information to illustrate comparisons. In Table 3, almost one-half of the respondents (N=24, 42.8%) are recruiting faculty members, predominantly at public institutions (N=20, 35.7%). Four respondents (7.1%) at private institutions are reportedly hiring faculty members. This hiring activity underscores the importance for all of collegiate aviation to actively encourage careers in the field.
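A minimal sketch of such a cross-tabulation, assuming the same hypothetical coded file as in the earlier sketch, with columns for recruitment status and institutional affiliation:

    import pandas as pd

    responses = pd.read_csv("uaa_survey_1998.csv")  # hypothetical file

    # Cross-tabulate recruitment status against institutional affiliation,
    # keeping blanks as N/R and adding row/column totals (cf. Table 3).
    counts = pd.crosstab(
        responses["hiring"].fillna("N/R"),
        responses["affiliation"].fillna("N/R"),
        margins=True,
        margins_name="Total",
    )
    print(counts)
    print((counts / len(responses) * 100).round(1))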


Table 3
Respondents' Institutional Affiliation and Recruitment Status for Hiring Aviation Faculty Members at the Instructor, Assistant Professor, Associate Professor, or Professor Ranks

              Public        Private       N/R           Total
              N    (%)      N    (%)      N    (%)      N    (%)
Hiring        20   (35.7)   4    (7.1)    0    (0.0)    24   (42.8)
Not Hiring    25   (44.6)   5    (8.9)    1    (1.8)    31   (55.4)
N/R           1    (1.8)    0    (0.0)    0    (0.0)    1    (1.8)
Total         46   (82.1)   9    (16.0)   1    (1.8)    56   (100.0)

Of the 24 respondents (42.8%) hiring faculty members depicted in Table 4, most of the hiring is taking place at doctoral granting institutions of higher education (N=11, 19.6%), followed by community colleges (N=7, 12.5%). The least amount of hiring is taking place at bachelor degree granting institutions (N=1, 1.8%). Some hiring activity at master's degree granting institutions is also occurring (N=5, 8.9%). With respect to hiring inactivity, doctoral and associate degree granting institutions are evenly split at 11 members apiece (19.6%).


Table 4
Highest Degree Offered by Respondents' Institutions and Recruitment Status for Hiring Aviation Faculty Members at the Instructor, Assistant Professor, Associate Professor, or Professor Ranks

              Associate     Bachelor's    Master's      Doctorate     Total
              N    (%)      N    (%)      N    (%)      N    (%)      N    (%)
Hiring        7    (12.5)   1    (1.8)    5    (8.9)    11   (19.6)   24   (42.8)
Not Hiring    11   (19.6)   4    (7.1)    5    (8.9)    11   (19.6)   31   (55.4)
N/R           0    (0.0)    1    (1.8)    0    (0.0)    0    (0.0)    1    (1.8)
Total         18   (32.1)   6    (10.7)   10   (17.9)   22   (39.2)   56   (100.0)

The data in Table 5 show trends parallel to those in Table 4. Note that in Table 4, 11 UAA institutional members (19.6%) at doctoral granting institutions are currently hiring, compared to the same number of respondents who reported retirements or anticipated retirements in Table 5. The number of respondents reporting retirements drops substantially to four (7.1%) at master's degree granting institutions, two (3.6%) at four-year degree institutions, and three (5.4%) at community colleges. Slightly over one-quarter of all non-retirements are concentrated at the community colleges (N=15, 26.8%).


Table 5
Highest Degree Offered by Respondents' Institutions Versus Reported or Anticipated Retirements by the Year 2000

              Associate     Bachelor's    Master's      Doctorate     Total
              N    (%)      N    (%)      N    (%)      N    (%)      N    (%)
Retire        3    (5.4)    2    (3.6)    4    (7.1)    11   (19.6)   20   (35.7)
None          15   (26.8)   4    (7.1)    5    (8.9)    11   (19.6)   35   (62.5)
N/R           0    (0.0)    0    (0.0)    1    (1.8)    0    (0.0)    1    (1.8)
Total         18   (32.2)   6    (10.7)   10   (17.9)   22   (39.2)   56   (100.0)

Table 6 indicates that a substantial majority of the UAA respondents (N=35, 62.5%) require the master's degree as a minimum educational requirement for faculty new hires. Nine respondents (16.1%) report the bachelor's degree as a minimum prerequisite, followed by seven (12.5%) who require the doctorate. Four respondents (7.1%) in the Other category indicated specialized expertise in the form of a license or professional certification as a minimum educational requirement. Only one UAA institutional member (1.8%) did not respond.

Table 6
Minimum Educational Requirements for Aviation Faculty New Hires

Bachelor's    Master's      Doctorate     Other         N/R          Total
N    (%)      N    (%)      N    (%)      N    (%)      N    (%)     N    (%)
9    (16.1)   35   (62.5)   7    (12.5)   4    (7.1)    1    (1.8)   56   (100.0)

As shown in Table 7, 22 of the UAA respondents (39.3%) believe that collegiate aviation is average in its effectiveness at promoting and preparing a future generation of faculty members. Moreover, collegiate aviation's overall effectiveness does not seem to "make the grade": the combined Below Average and Poor responses equate to nearly one-half (N=25, 44.6%) of the institutional membership who responded to the survey questionnaire. No one rated collegiate aviation's effectiveness as Excellent, while slightly less than one-fifth (N=9, 16.1%) of the reporting membership chose the Good category.

Table 7
Collegiate Aviation's Effectiveness in Promoting and Preparing a Future Generation of Faculty Members

Excellent     Good          Average       Below Average   Poor         Total
N    (%)      N    (%)      N    (%)      N    (%)        N    (%)     N    (%)
0    (0.0)    9    (16.1)   22   (39.3)   16   (28.5)     9    (16.1)  56   (100.0)

Table 8 illustrates the UAA respondents' perceptions of their own salary structure in comparison to other aviation programs. Nearly one-half of the respondents (N=26, 46.4%) believe their salary structures are average in comparison to other programs. Only three respondents (5.4%) reported their salary structure as Poor. When combining Excellent and Good responses, slightly over one-quarter of the members (N=15, 26.8%) consider their salary structure better than average in comparison to other programs. Note that the number of respondents in the Excellent/Good category equals the number in the Below Average/Poor category (N=15, 26.8%); in essence, the responses are distributed roughly symmetrically around the concentration of Average responses.

Table 8
Respondents' Reported Salary Structure in Comparison to Other Aviation Programs

Excellent     Good          Average       Below Average   Poor         Total
N    (%)      N    (%)      N    (%)      N    (%)        N    (%)     N    (%)
7    (12.5)   8    (14.3)   26   (46.4)   12   (21.4)     3    (5.4)   56   (100.0)


The distribution of responses in Table 8 strongly resembles the distribution of responses in Table 9. Again, nearly one-half of the respondents (N=26, 46.4%) believe their salary with respect to cost of living is average. Only one individual reported an earned salary versus cost of living as Excellent. Collectively, though, exactly one-quarter of the respondents (N=14, 25.0%) report their salaries as Excellent/Good. Thirteen (23.2%) of the reporting UAA institutional membership report their salary as Below Average, and only three members (5.4%) report a response of Poor.

Table 9
Respondents' Reported Salaries With Respect to Cost of Living

Excellent     Good          Average       Below Average   Poor         Total
N    (%)      N    (%)      N    (%)      N    (%)        N    (%)     N    (%)
1    (1.8)    13   (23.2)   26   (46.4)   13   (23.2)     3    (5.4)   56   (100.0)

Table 10 illustrates the UAA institutional members' aviation program salary ranges. Reported salaries are clustered in the $20,000 - $59,000 range. The most frequently reported response was the $40,000 - $59,000 salary range, reported by 36 respondents (64.3%). According to the US Department of Education (as cited in The Chronicle of Higher Education: Almanac Issue, 1997), the average salaries for all institutions of higher education, adjusted for a nine-month academic year (except those without academic ranks), are as follows: Professor, $67,415; Associate Professor, $49,695; Assistant Professor, $41,041; Instructor, $31,756; Lecturer, $34,755; and No Rank, $36,502. The overall average salary was $52,556. Table 10 also depicts a very small percentage of salaries in the Less than $20,000 and the $100,000 or more category ranges, at one response (1.8%) per category. Respondents reporting in the $80,000 - $99,000 category also comprise a very small percentage (N=2, 3.6%).


Table 10
Reported Salary Ranges by UAA Institutional Members

Less than     $20,000-      $40,000-      $60,000-      $80,000-      $100,000
$20,000       $39,000       $59,000       $79,000       $99,000       or more
N    (%)      N    (%)      N    (%)      N    (%)      N    (%)      N    (%)
1    (1.8)    32   (57.1)   36   (64.3)   10   (17.9)   2    (3.6)    1    (1.8)

Note. Respondents were able to list more than one salary range.

In Table 11, 55 of the 56 responding members indicated that the public at large does not have an adequate understanding of collegiate aviation. None of the members indicated an adequate public understanding. Only one respondent (1.8%) did not answer.

Table 11
Does the Public At Large Have an Adequate Understanding of Collegiate Aviation?

Yes           No            N/R           Total
N    (%)      N    (%)      N    (%)      N    (%)
0    (0.0)    55   (98.2)   1    (1.8)    56   (100.0)

CONCLUSIONS AND RECOMMENDATIONS

Conclusions

Overall, 24 UAA respondents (42.8%) indicated they were actively hiring new aviation faculty members for their programs. Most of the hiring is taking place at doctoral granting institutions of higher education (N=11, 19.6%), followed by community colleges (N=7, 12.5%). The results also indicate that 11 doctoral institutions (19.6%) are either losing faculty members to retirements or will be losing faculty to retirements by the year 2000. This finding suggests that retirements and hiring may be numerically consistent at the doctoral level. The results also indicate that greater emphasis on preparing a future generation of aviation faculty members with earned doctorates is becoming increasingly important. In addition, the data indicate that preparing aviation faculty members for careers in community college settings is an area of increasing importance. The results of the study show that 62.5 percent (N=35) of all the UAA respondents require, at a minimum, the master's degree for aviation faculty new hires; significantly fewer (N=7, 12.5%) require the doctorate.

In the area of collegiate aviation's effectiveness in promoting and preparing a future generation of faculty members, 22 respondents (39.3%) reported Average. Below Average and Poor responses equated to nearly one-half of the respondents (N=25, 44.6%). In a related area, almost all of the UAA respondents (N=55, 98.2%) believe the public at large does not have an adequate understanding of collegiate aviation. Clearly, promoting aviation faculty careers and public awareness are two areas that collegiate aviation needs to improve upon.

The most commonly reported salary ranges are as follows: $40,000 - $59,000 (N=36, 64.3%) and $20,000 - $39,000 (N=32, 57.1%). The national average reported by the US Department of Education (cited in The Chronicle of Higher Education: Almanac Issue, 1997) for all professors (except those at institutions without academic ranks) is $52,556. Nearly one-half of the UAA members (N=26, 46.4%) believe their program salaries are average with respect to cost of living, while an additional 13 (23.2%) rate them as good.

Recommendations

Historically, other studies (Matson, 1977; Taylor, 1990; Truitt & Kaps, 1995; Johnson, 1997) have indicated that negative public perception of collegiate aviation has been problematic. Despite great strides in collegiate aviation in a relatively short period of time, inadequate public awareness still presents a problem, as indicated by the data collected for this study. The identity issue still plagues collegiate aviation: how aviation educators are collectively identified in the academic field remains obscure at best. Johnson (1997) found that 14 different terms or phrases were used by aviation educators to identify collegiate aviation. That study also noted a disturbing observation made by several traditional scholars with no previous aviation experience: if aviation educators and scholars cannot articulate who they are as a collective body, then how do they expect us and the general public to identify who they are?

One solution is to redefine or invent a collective term or phrase to describe what we do and how we identify ourselves. In 1997, Johnson (1997) coined the word aeronology, defined as "the study of the non-engineering aspects of aviation, aeronautics, and aerospace sciences and technologies" (p. 28). In the field of engineering, Narayanan (1999) has made reference to aeronology "as a subject bridging various disciplines in aerospace sciences" (p. 11) in a research proposal to the National Science Foundation (NSF). Regardless of what word or phrase is used, it is recommended that aviation as an academic field of study resolve the identity issue. Doing so should alleviate some internal and external perceptual problems.

Other recommendations include elevating the awareness of aviation related higher education opportunities among students and industry representatives. This can be accomplished at the institutional level or through involvement in national organizations and events such as UAA conferences. As the importance of aviation related research further escalates, the need to prepare a future generation of faculty members with earned doctorates becomes more imperative as well. By doing so, the academic vitality of aviation will be preserved through the preparation of a new generation of faculty members capable of addressing the demands of the twenty-first century and beyond.


REFERENCES

Collegiate Aviation Guide. (1994). Auburn, AL: University Aviation Association.

Fuller, M., & Truitt, L. (1997). Aviation education: Perceptions of airport consultants. Journal of Air Transportation World Wide, 2(1), 64-80.

Johnson, J. A. (1993). The feasibility of developing a professionally accredited non-engineering aeronautical/aerospace science doctoral degree program in US universities (Master's thesis, Embry-Riddle Aeronautical University, 1993). Master's Abstracts International, 32/01, 24.

Johnson, J. A. (1997). An analysis of curriculum design in developing a doctor of philosophy program in aeronology (Doctoral dissertation, Bowling Green State University, 1997). Dissertation Abstracts International, 58-08A, 3037.

Matson, R. (1977). The book of aerospace education. Washington, DC: American Society for Aerospace Education.

Minitab for Windows 12.2 [Computer software]. (1998). State College, PA: Minitab, Inc.

Narayanan, R. (1999). Establishment of the University of Nebraska airborne remote sensing facility (NSF research proposal). Lincoln, NE: Author.

Taylor, K. (1990). The marketability of an aviation-related doctoral degree. Unpublished manuscript, Embry-Riddle Aeronautical University, Daytona Beach, FL.

The Chronicle of Higher Education: Almanac Issue. (1997, August). Washington, DC: The Chronicle of Higher Education.

Truitt, L., & Kaps, R. (1995). Publishing aviation research: An interdisciplinary review of scholarly journals. Journal of Studies in Technical Careers, XV(4), 229-243.


Assessing the Environment and Outcomes of Four-Year Aviation Programs: Does Program Quality Make a Difference?

Paul D. Lindseth
University of North Dakota

ABSTRACT

The higher education literature concerning academic program quality offers differing opinions as to which indicators should determine program quality (Cameron, 1987; Tan, 1992). Recently, greater attention has been focused upon the environment and the outcomes of higher education academic programs (Astin, 1991). The purpose of this study was to determine to what extent the highest quality U.S. four-year aviation programs follow current literature trends and emphasize environment and outcome indicators of quality. Students (N=447), faculty (N=167), and alumni (N=577) from high, medium, and low quality four-year aviation programs, as determined in Lindseth's (1996) study, were surveyed using the Educational Testing Service's Program Self-Assessment tool. The instrument measures perceptions of students, faculty, and alumni toward 16 composite characteristics or indicators of academic program quality. Results showed that, except for the indicator internship experiences, the emphasis placed on environment and outcome indicators of academic program quality was not significantly different at the highest quality U.S. four-year aviation programs as compared to intermediate and low quality four-year aviation programs.

INTRODUCTION

The overriding theme in the literature concerning academic program quality is that scholars find it hard to agree on which indicators should be used to determine program quality (Cameron, 1987; Tan, 1992). As noted in reviews of the research literature (Conrad & Blackburn, 1985b; Kuh, 1981; Tan, 1992), authors list many indicators that could be classified as input variables to the academic program, such as facilities, equipment, and dollars. However, an increasing number of environment (process) and outcome variables are being identified as well. For example, Astin's (1985, pp. 60-61) "talent-development" concept of educational quality is that "true excellence lies in the institution's ability to affect its students and faculty favorably, to enhance their intellectual and scholarly development, and to make a positive difference in their lives." This view of quality, labeled the value-added view, does focus more on process (environment) and outcome indicators of quality.

Conrad and Pratt (1985) also present questions about processes, such as: how should an academic program commit resources to the academic processes involved in teaching, research, and service? Examples of these academic processes are faculty-student interactions and the development of students' critical thinking and problem solving ability. Kolb's (1984) learning theory and Pace's (1979, 1984, 1990) quality of student effort theory are not factors in most quality ratings. The knowledge gained from research on learning and thinking, and how academic programs may adapt curricula to reflect this knowledge, is seldom addressed. The processes (environment) taking place within the design of an academic program can be very important indicators of program quality. In addition, the "extracurriculum" needs to be considered in an evaluation of academic program quality, since the activities of students outside the classroom certainly may enhance or detract from the overall learning experience of each student (Conrad & Pratt, 1985; Kuh, Schuh, Whitt & Associates, 1991). The extracurriculum may include events such as professional group meetings that are held on or near the campus.

All of these considerations point to a multidimensional approach in defining indicators of quality academic programs. Furthermore, quality indicators should be examined at the program level as well as the institutional level (Fairweather & Brown, 1991). According to the higher education literature (Astin, 1985, 1991; Bogue & Saunders, 1992; Kuh, 1981; Kuh et al., 1991; Pace, 1990), focusing more on processes and outcomes will help gain a better perspective on the overall indicators of quality in academic programs. Thus, to further investigate quality in one specific higher education academic program, the following research study of four-year aviation programs is presented.

Research Design

The purpose of this study was to determine to what extent the highest quality U.S. four-year aviation programs follow current literature trends and emphasize environment and outcome indicators of quality. It was a quantitative study of six of the top ten highest quality U.S. four-year aviation programs identified in Lindseth's (1996) study of academic program quality, as well as six randomly selected intermediate quality programs and six randomly selected low quality programs. The overall ranking of all 68 U.S. four-year aviation programs (Lindseth, 1996) was used to select the 18 programs that were studied. From the overall rankings, the six highest quality programs were selected from the top ten programs identified in Lindseth's (1996) study. If one of the 18 aviation programs decided not to participate in the study, the seventh highest ranked program was selected for the top program sample, and so forth. For the intermediate and low quality program samples, another program was randomly selected from the applicable category. Intermediate quality programs were those rated in the middle one-third of programs, and low quality programs were those rated in the lower one-third (Lindseth, 1996). Regardless of what criteria emerged as indicators of quality in Lindseth's (1998) grounded theory study of four-year aviation programs, determining whether the highest quality U.S. four-year aviation programs are following current literature trends and emphasizing environment and outcome variables of program quality seemed essential in a study of academic program quality.

Sample Population

In this research, the 18 U.S. four-year aviation programs were studied through an Educational Testing Service (ETS) Program Self-Assessment Survey completed by faculty, students, and alumni from each of the 18 aviation programs. All undergraduate aviation students at each program classified academically as seniors comprised the student sample; the assumption was that senior students were better able to judge the program's quality than junior, sophomore, or freshman aviation students. All aviation faculty members at each program comprised the faculty sample. In addition, the 18 four-year aviation programs were asked to provide a list of alumni, and their addresses, who had graduated from the aviation program in the past ten years. A randomly selected sample (using a table of random numbers) of 50 alumni from each program was invited to participate in the study. The completed questionnaires from the faculty, students, and alumni were used to analyze to what extent the highest quality four-year aviation programs emphasize environment and outcome indicators of quality.

Instrumentation

The ETS Program Self-Assessment Surveys were used as the measurement instruments. Each ETS instrument addresses whether U.S. four-year aviation programs emphasize key environment and outcome indicators of quality as measured by the perceptions of students, faculty, and alumni. The instruments are Likert-scaled measurement instruments consisting of a 62-item program quality assessment questionnaire developed by Clark (1983) and the ETS. The instruments were initially developed for graduate programs but were modified for undergraduate academic programs. These instruments were chosen because they measure to what extent students, faculty, and alumni perceive that their aviation program emphasizes key environment and outcome variables of program quality. Furthermore, the literature review showed that quality academic programs are shifting their focus from input variables to environment and outcome variables. The 16 composite indicators of quality that the ETS instruments measure are as follows (ETS, 1996):

1. Environment for Learning

Extent to which members of the department work together to achieve program goals, provide a supportive environment characterized by mutual respect and concern between students and professors, and are open to new ideas and different scholarly points of view.

2. Scholarly Excellence

Perceived scholarly and professional competency of the department faculty, academic ability and efforts of students, and intellectual stimulation in the program.

3. Quality of Teaching

Assessment of faculty awareness of new developments in the field, teaching methods, grading procedures, preparation for classes, and interest in assisting students with their academic work.

4. Faculty Concern for Students

Extent to which faculty members are perceived to be accessible, interested in the welfare and academic development of students, and aware of student needs, concerns, and suggestions.

5. Curriculum

Ratings of the variety, depth, and availability of course and program offerings, program flexibility, opportunities for individual projects, and interactions with related departments.

6. Departmental Procedures

Ratings of departmental policies and procedures, such as student participation in departmental decisions, relevance and administration of degree requirements, evaluation of students' progress toward the degree, academic and career advisement of students, and helpfulness to graduates in finding appropriate employment.

7. Available Resources

Adequacy of available facilities, such as libraries and laboratories, and overall adequacy of physical, financial, and support staff resources; perception of the institution's commitment to the program.

8. Student Satisfaction with Program

Self-reported student and alumni satisfaction with the program as reflected in judgments about the amount that has been learned, preparation for intended career, and willingness to recommend the program to a friend.

9. Internship, Fieldwork, or Clinical Experiences

Ratings of departmental training, supervision, and assigned duties; contribution of the experiences to academic and professional development; and adequacy of office space and equipment.

10. Resource Accessibility

Self-reported student satisfaction with opportunities for intellectual and social interaction among persons in the program, with student services and financial assistance, and with campus services for nonresident students.

11. Employment Assistance

Alumni assessment of the employment assistance received through the department's formal or informal efforts, individual professors, placement office, listings of openings from professional associations, and unsolicited letters sent to employers.

12. Faculty Work Environment

Self-reported faculty satisfaction with departmental objectives and procedures, academic freedom, opportunities to influence decisions, and relationships with other faculty members; faculty judgments of departmental management in such areas as planning, administration, and career development of faculty.

13. Faculty Program Involvement

Extent to which faculty members report involvement in the program: teaching required courses, participating in policy and curriculum decisions and departmental examinations, directing independent studies, supervising field work or internships, serving as a faculty advisor, and arranging student contacts with nonacademic professionals.

14. Faculty Research Activities

Extent to which faculty members report receiving awards for outstanding research or scholarly writing, editing professional journals, refereeing articles submitted to professional journals, and receiving grants to support research or other scholarly or creative work.

15. Faculty Professional Activities

Extent to which faculty members report serving on national review or advisory councils, holding office in regional or national professional associations, and receiving awards for outstanding teaching or professional practice.

16. Student Accomplishments

Self-reported student accomplishments in several categories of activity and recognition, including attendance and presentations at professional meetings; writing of scholarly papers; planning and involvement in research projects; development of professional skills and knowledge; recognition through prizes, awards, fellowships, training grants, or scholarships; and participation in department or program planning.

ETS developed similar but separate instruments for students, faculty, and alumni. The reliability coefficient alpha for the instruments is α = .83 (Clark, 1983) for surveying graduate programs. A pilot test was conducted with aviation faculty and students to check the reliability of the instruments for undergraduate aviation programs. A test-retest procedure was conducted 14 days apart for both the faculty and the students. The faculty instrument yielded a test-retest correlation coefficient of .93 (p<.05); the student instrument yielded a test-retest correlation coefficient of .83 (p<.01). With these relatively high values, the instruments appear to be reliable for use in researching program quality in U.S. four-year aviation programs.
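A minimal sketch of the test-retest check described above, using Python's SciPy rather than the original analysis software; the eight paired scores are purely illustrative:

    from scipy.stats import pearsonr

    # Illustrative scores only: the same respondents complete the
    # instrument twice, 14 days apart, and the two vectors are correlated.
    test_scores   = [3.1, 2.8, 3.4, 2.9, 3.6, 2.7, 3.2, 3.0]
    retest_scores = [3.0, 2.9, 3.5, 2.8, 3.7, 2.6, 3.1, 3.1]

    r, p_value = pearsonr(test_scores, retest_scores)
    print(f"test-retest r = {r:.2f} (p = {p_value:.3f})")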

To ensure content and construct validity of the instruments, a group of five experts was randomly selected from an official list of Council on Aviation Accreditation (CAA) accreditors received from CAA headquarters. These experts provided feedback as to whether each of the three instruments (faculty, student, and alumni) is a valid measure of four-year aviation program quality. Four of the five experts agreed that all three instruments were valid measures of quality; the fifth expert was not able to respond due to other professional commitments. With four of the experts in agreement, it was concluded that the instruments would be valid for this particular type of research on four-year aviation programs. The instruments consist of a section on perceptions of program quality and a demographic section. Applicable demographic items, as well as items suggested by the CAA panel of experts, were added to a supplemental section of each instrument.

Data Collection

The program administrators of each of the 18 aviation programs were contacted by telephone, and the importance of the research study was explained along with the protocol procedures. An introductory consent letter was also sent to each administrator clarifying the research study. Two programs in each of the three groups (high, medium, and low quality programs) declined to participate for various reasons, ranging from time constraints on faculty and students to a perception that their input would be of little benefit given their particular circumstances (e.g., the program was going to close, or unionized faculty were on strike). Thus, two other programs in each group were selected and participation approval was obtained.

A research assistant to act on the researcher's behalf was obtained at each of the 18 programs. This was done to ensure minimum sampling error and expeditious data collection. The research assistant was either the aviation program administrator, an aviation faculty member, or, in one case, a graduate student. Each research assistant received training in the protocol of the research. The assistants were then sent the appropriate number of faculty and senior-level student ETS questionnaires for their respective program. The research assistant distributed the questionnaires, along with cover letters, to all faculty and senior-level students at a convenient time during the semester. The purpose of the study and directions for the questionnaire were explained in the cover letter. The assistant collected and returned the questionnaires to the researcher, maintaining respondent confidentiality. Some programs were not able to allow class time for the students to complete the questionnaire; in these cases the response rate suffered somewhat, although no program's response rate dropped below 40%. Also, some assistants were not as diligent as others in administering and collecting the questionnaires. As a result, it took four months to receive all the questionnaires.

The response rates for students and faculty were fairly similar between the two groups, but within each group the low quality program students and faculty responded at a much higher rate than both the intermediate and high quality program students and faculty. A possible explanation is that the research assistants had greater opportunity to contact faculty and student respondents at the low quality programs because of those programs' small size, even though in a previous study program size did not correlate with program quality (Lindseth, 1996). The overall student response rate was 59% (N=447). Students from the highest quality programs responded at a rate of 54% (N=268), while the student response rate from intermediate quality programs was 63% (N=135), and from low quality programs 77% (N=44). The overall faculty response rate was 54% (N=167). The highest quality program faculty responded at a 49% rate (N=119), the intermediate quality program faculty at a 55% rate (N=31), and the low quality program faculty at an 88% rate (N=17).

The alumni responses were obtained through a mail survey using the ETS alumni questionnaire. Each of the 18 aviation programs provided a listing of names and addresses of alumni who had graduated with an aviation degree during the past ten years. Fifty respondents were then randomly selected from each alumni list. Seven programs had not graduated a total of 50 alumni in the past ten years, so all graduates of these programs in the past ten years (the entire population) were surveyed. A cover letter explaining the purpose of the research was sent to each alumnus along with the ETS alumni questionnaire. The overall response rate for the alumni, after a postcard follow-up and an additional follow-up letter, was 42% (N=577). The response rate for alumni of the highest quality programs was 42% (N=286), for the intermediate group 40% (N=154), and for the lower quality group 43% (N=137).

Results and Analysis

The survey data gathered were analyzed using the Statistical Package for the Social Sciences (SPSS-X). Scores were analyzed separately for students, faculty, and alumni from each program. The response means, plus or minus the standard deviations, for students, faculty, and alumni from each program for each applicable indicator of quality from the ETS instruments are displayed in Tables 1, 2, and 3.

Testing for Differences

To test for significant differences among students (N=447), faculty (N=167), and alumni (N=577) from the highest quality, intermediate quality, and lower quality programs, ANOVAs (analyses of variance) were computed using SPSS-X. ANOVAs were computed for each of the applicable 16 composite indicators of quality on the ETS instrument.
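For readers wishing to replicate this step, a minimal sketch of a one-way ANOVA for a single composite indicator is given below. It uses Python's SciPy in place of SPSS-X, and the group scores are illustrative rather than the study data; the Scheffe post-hoc comparisons reported in the tables would be an additional step.

    from scipy.stats import f_oneway

    # Illustrative indicator scores for the three program-quality groups.
    high_quality   = [2.9, 3.0, 2.8, 3.1, 2.7, 3.0]
    medium_quality = [3.2, 3.1, 3.0, 3.3, 2.9, 3.2]
    low_quality    = [2.9, 3.0, 2.8, 3.1, 3.0, 2.9]

    f_value, p_value = f_oneway(high_quality, medium_quality, low_quality)
    print(f"F = {f_value:.2f}, p = {p_value:.4f}")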

ANOVAs on student data. Of the eleven applicable student indicators, ANOVA analysis found significant differences on five of the eleven scales. A Scheffe test was conducted to determine between which groups the means were significantly different. Table 1 displays these differences.

Interpreting the data from this table, three statistically significant determinations can be made. First, students attending intermediate quality aviation programs perceive a more conducive environment for learning and greater student accomplishment than students attending high quality programs. Second, students attending high quality programs perceive greater availability of resources and better internship experiences than students attending either intermediate or low quality aviation programs. Finally, students attending intermediate quality aviation programs perceive greater accessibility to resources than students attending low quality aviation programs.


Table 1.
Comparison of Means ± Standard Deviations and ANOVAs for Indicators of Quality as Rated by Students in High, Medium, and Low Quality Programs

Students (N=447)

                                     High Quality   Medium Quality  Low Quality
                                     Programs       Programs        Programs      F Value   p=
                                     (N=268)        (N=135)         (N=44)
Environment for Learning             2.93 ± .43a    3.10 ± .45a     2.97 ± .44    3.68      .0267
Scholarly Excellence                 2.98 ± .43     2.99 ± .63      2.85 ± .51    NS
Quality of Teaching                  2.97 ± .45     2.92 ± .65      2.92 ± .52    NS
Faculty Concern for Students         2.88 ± .48     3.04 ± .57      2.98 ± .57    NS
Curriculum                           2.63 ± .54     2.52 ± .71      2.37 ± .58    NS
Departmental Procedures              2.65 ± .46     2.69 ± .62      2.61 ± .58    NS
Available Resources                  2.78 ± .56b    2.41 ± .61b     2.12 ± .57b   22.46     .0000
Student Satisfaction with Program    3.32 ± .60     3.19 ± .68      3.16 ± .55    NS
Internships                          3.15 ± .37b    2.90 ± .32b     2.69 ± .48b   8.76      .0004
Resource Accessibility               2.37 ± .60     2.56 ± .77c     2.19 ± .68c   4.31      .0144
Student Accomplishments              .38 ± .17d     .49 ± .19d      .45 ± .18     9.44      .0001

NS = Non-significant difference
a Significant differences exist between high quality programs and medium quality programs
b Significant differences exist between high quality programs and both medium and low quality programs
c Significant differences exist between medium quality programs and low quality programs
d Significant differences exist between high quality programs and medium quality programs


ANOVAs on faculty data. Only one of the eleven ETS indicators of quality scales showed significant differences between high, medium, and low quality programs (see Table 2). The environment for learning scale was rated significantly higher by faculty at the intermediate quality programs than by faculty at the highest quality programs. All other scales showed no significant differences. Thus, the only statistically significant determination that can be made from the faculty questionnaires is that intermediate quality program faculty perceive the environment for learning of their respective programs at a higher level than faculty at the highest quality programs.

Table 2.
Comparison of Means ± Standard Deviations and ANOVAs for Indicators of Quality as Rated by Faculty in High, Medium, and Low Quality Programs

Faculty (N=167)

                                     High Quality   Medium Quality  Low Quality
                                     Programs       Programs        Programs      F Value   p=
                                     (N=119)        (N=31)          (N=17)
Environment for Learning             3.05 ± .39a    3.35 ± .38a     3.20 ± .45    4.04      .0211
Scholarly Excellence                 2.96 ± .44     2.95 ± .38      3.01 ± .39    NS
Quality of Teaching                  2.91 ± .55     3.10 ± .41      3.06 ± .59    NS
Faculty Concern for Students         3.06 ± .47     3.29 ± .39      3.29 ± .53    NS
Curriculum                           2.72 ± .59     2.68 ± .69      2.51 ± .70    NS
Departmental Procedures              2.80 ± .55     2.99 ± .46      2.81 ± .47    NS
Available Resources                  2.64 ± .65     2.36 ± .61      2.23 ± .77    NS
Faculty Work Environment             2.86 ± .46     3.06 ± .53      3.09 ± .46    NS
Faculty Program Involvement          1.84 ± .53     2.04 ± .90      1.98 ± .64    NS
Faculty Research Activities          1.81 ± .23     1.85 ± .23      1.87 ± .20    NS
Faculty Professional Activities      1.67 ± .32     1.67 ± .29      1.69 ± .23    NS

NS = Non-significant difference
a Significant differences exist between high quality programs and medium quality programs


ANOVAs on alumni data. For the ETS questionnaire sent to alumni, there were ten applicable indicators of quality. ANOVA analysis found significant differences in only two of the ten scales among the three groups of alumni: available resources and internships. A Scheffe test was done to determine between which groups the differences in means were statistically significant. Table 3 displays these differences. From the data in Table 3, two statistically significant determinations can be made. The first addresses the indicator available resources: alumni who attended the highest quality programs perceive they had greater availability of resources than alumni who attended either intermediate or low quality programs. Second, alumni graduating from high quality programs perceive greater benefit from their internship experiences than alumni graduating from low quality programs.

Table 3.
Comparison of Means ± Standard Deviations and ANOVAs for Indicators of Quality as Rated by Alumni in High, Medium, and Low Quality Programs

Alumni (N=577)

                                     High Quality   Medium Quality  Low Quality
                                     Programs       Programs        Programs      F Value   p=
                                     (N=286)        (N=154)         (N=137)
Environment for Learning             2.98 ± .41     2.94 ± .43      2.78 ± .57    NS
Scholarly Excellence                 2.88 ± .58     2.72 ± .59      2.82 ± .62    NS
Quality of Teaching                  2.95 ± .53     2.79 ± .59      2.85 ± .61    NS
Faculty Concern for Students         2.98 ± .54     2.87 ± .59      2.93 ± .70    NS
Curriculum                           2.42 ± .65     2.29 ± .70      2.38 ± .63    NS
Departmental Procedures              2.43 ± .58     2.37 ± .63      2.39 ± .60    NS
Available Resources                  2.90 ± .58a    2.25 ± .67a     2.26 ± .63a   30.06     .0000
Student Satisfaction with Program    3.03 ± .73     2.82 ± .74      3.00 ± .76    NS
Internships                          3.03 ± .60b    2.70 ± .54      2.56 ± .74b   5.95      .0035
Employment Assistance                1.32 ± .63     1.27 ± .79      1.40 ± .59    NS

NS = Non-significant difference
a Significant differences exist between high quality programs and both medium and low quality programs
b Significant differences exist between high quality programs and low quality programs


DISCUSSION OF RESULTS AND ANALYSIS

Similar Emphasis on Environment and Outcome Indicators of Quality

This study focused on the environment and outcomes of U.S. four-year aviation programs. All of the ETS instrument scales examined environment and outcome variables to some extent, except for the available resources scale. The environment and outcomes of four-year aviation programs are therefore central to what the ETS Program Self-Assessment Survey tool measures.

In examining the mean scores of students, faculty, and alumni for each indicator of quality scale, as well as the averaged student, faculty, and alumni mean scores, several intermediate quality program means were higher than the high quality program means. Some of the low quality program means were also higher than the high quality program means. But why did this happen? Do these data invalidate previous research findings? A reason for the different results is that program quality was measured in another way in this study (i.e., through the ETS survey instruments). When analyzing student, faculty, and alumni group means for significant differences, the results show that the highest quality aviation programs do not emphasize environment and outcome variables as indicators of quality to any greater extent than the intermediate and low quality programs. In a few cases, the highest quality programs actually emphasize environment and outcome variables to a lesser extent.

However, additional comparisons were made to show that the differences among students, faculty, and alumni at the three groups of U.S. four-year aviation programs (high, medium, and low) are not very extensive. For example, among students, only five of the eleven applicable ETS indicators of quality scales showed significant differences between the high, medium, and low quality program groups. Only two of these scales, available resources and internships, showed significant differences between students in the highest quality programs and students in both the medium and low quality programs. The internship scale measured an environment variable, whereas the available resources scale was an input variable. On the other hand, the environment for learning and the student accomplishments scales were rated significantly higher by the intermediate quality group than by the high quality group. Furthermore, although not significantly higher, the mean ratings for these scales by the low quality group were also higher than those of the high quality group. The last significantly different rating in the student sample was on the resource accessibility scale, which the medium quality program group rated significantly higher than the low quality program group. Thus, only one of the eleven student ETS indicators of quality scales measuring environment or outcome variables was rated significantly higher by the highest quality programs. According to the students, it could be concluded that the highest quality four-year aviation programs are not emphasizing environment and outcome variables that are indicators of quality to any greater extent than intermediate or low quality aviation programs.

The survey completed by faculty from high quality, medium quality, and low quality programs found that ten of the eleven indicators of quality scales showed no significant differences between groups. The only significant difference appeared on the environment for learning scale: the intermediate quality program faculty rated their environment for learning significantly higher than the high quality program faculty did. The low quality program faculty also rated this scale higher than the high quality program faculty, although not significantly so. These results indicate that, according to the faculty, the highest quality programs are not emphasizing environment and outcome indicators of quality to any greater extent than the intermediate or low quality programs.

When examining the results from the alumni sample, the conclusions are similar to those drawn from the student data. Of the ten applicable ETS instrument indicators of quality scales, two were found to have significant differences between alumni groups: available resources and internships. The available resources scale, an input variable scale, was rated significantly higher by the high quality program alumni as compared to both the medium quality and low quality program alumni. Similar to the student ratings, the internship scale was rated significantly higher by the high quality program alumni as compared to the low quality program alumni. Although the medium quality program alumni's mean rating on this scale was also lower than that of the high quality program alumni, the difference was not significant. Thus, just as in the student category, the only indicator of quality scale measuring environment or outcome variables that was rated significantly higher by the high quality programs was the internship scale.

CONCLUSION

Considering the opinions of students, faculty, and alumni, the highest quality U.S. four-year aviation programs do not emphasize environment and outcome indicators of quality to any greater extent than intermediate and low quality aviation programs. According to the surveys of students, faculty, and alumni, the only area where the highest quality programs place more emphasis on environment or outcome indicators of quality is internship experiences. This emphasis should be maintained at the highest quality programs, and the other programs should consider increasing emphasis in this area, since aviation industry experts placed a great deal of importance on the performance of graduates, and, generally speaking, these graduates had come from four-year aviation programs with very active internship programs. Thus, even though aviation education experts in previous research (Lindseth, 1996) generally agreed on which programs were of highest quality, only one of the environment and outcome indicators of quality examined in this study was emphasized to a greater extent at the highest quality programs when compared to the intermediate and lower quality programs. Based upon the results of this study, further research should be done on the importance of the internship experience as well as the importance of the environment for learning within collegiate aviation academic programs.


REFERENCES

Astin, A. W. (1985). Achieving educational excellence. San Francisco: Jossey-Bass.

Astin, A. W. (1991). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. New York: Macmillan.

Bogue, E. G., & Saunders, R. L. (1992). The evidence for quality. San Francisco: Jossey-Bass.

Cameron, K. S. (1987). Improving academic quality and effectiveness. In M. W. Peterson & L. A. Mets (Eds.), Key resources on higher education governance, management, and leadership (pp. 322-346). San Francisco: Jossey-Bass.

Clark, M. J. (1983). Graduate program self-assessment service: Handbook for users. Princeton, NJ: Educational Testing Service.

Conrad, C. F., & Blackburn, R. T. (1985b). Program quality in higher education: A review and critique of literature and research. In Higher education: Handbook of theory and research (Vol. 1, pp. 283-308). New York: Agathon Press.

Conrad, C. F., & Pratt, A. M. (1985). Designing for quality. Journal of Higher Education, 56(6), 601-621.

ETS. (1996). Program Self-Assessment Service. Princeton, NJ: Educational Testing Service.

Fairweather, J. S., & Brown, D. F. (1991). Dimensions of academic program quality. The Review of Higher Education, 14(2), 155-176.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.

Kuh, G. D. (1981). Indices of quality in the undergraduate experience. AAHE-ERIC Higher Education Research Report No. 4. Washington, DC: American Association for Higher Education.

Kuh, G. D., Schuh, J. H., Whitt, E. J., & Associates. (1991). Involving colleges. San Francisco: Jossey-Bass.

Lindseth, P. D. (1996). Identifying indicators of program quality in U.S. baccalaureate aviation programs. Dissertation. Ann Arbor, MI: The University of Michigan.

Lindseth, P. D. (1998). Developing a model of four-year aviation program quality: A grounded theory approach. Collegiate Aviation Review, Sept. 1998, 11-23.

Pace, C. (1979). Measuring outcomes of college. San Francisco: Jossey-Bass.

Pace, C. (1984). Measuring the quality of college student experiences. Los Angeles: University of California, Higher Education Research Institute.

Pace, C. (1990). The undergraduates: A report of their activities and progress in college in the 1980s. Los Angeles: University of California, Center for the Study of Evaluation.

Tan, D. L. (1992). A multivariate approach to the assessment of quality. Research in Higher Education, 33(2), 205-226.


Airport Internships:
Effectively Structuring a Departmental Rotation Internship

C. Daniel Prather
Tampa, Florida

Author Note

This study on the perceptions of airport managers regarding airport internships is one aspect of a paper entitled Airport internships: Combining formal education and practical experience for a successful airport management career, which was prepared as a requirement for the American Association of Airport Executives' (AAAE) Accreditation program. The remaining aspect of the AAAE paper, which was published in the 1998 Collegiate Aviation Review, is a study on the views of airport managers regarding post-secondary aviation education. Findings presented do not necessarily reflect the views of my employer, the Hillsborough County Aviation Authority.

ABSTRACT

Preparing for a career in airport management requires not only appropriate education, but formal training as well. Too often, college graduates with no experience face entry-level airport management positions that require some experience. Unless a recent graduate is able to secure an airport internship, progression in the airport management field may seem daunting, if not impossible. To this end, this article presents findings on the expert opinions of airport managers nationwide regarding the most important airport departments in which an intern should gain experience, the benefits of internship programs, and the recommended structure of a departmental rotation airport internship program. Utilizing the 1996-97 AAAE Membership Directory and Yellow Pages of Corporate Members (American Association of Airport Executives, 1997), a written mail survey was sent to a nationwide random sample of 200 airport managers in January 1998. Results, which are presented using percentage distribution tables and descriptive statistics, show that the majority of airport managers view their careers as challenging and interesting, consider Finance along with Planning & Development the most important airport departments in which an intern should gain experience, and feel that an internship program is extremely beneficial to the intern, while also being beneficial to the airport.


INTRODUCTION

On-the-job training was once sufficient for the individual entering the field of airport management. In one sense, it still is; today, however, this formal training takes the form of airport internships. Individuals may no longer be able to enter the field with education alone. Experience is now a necessity, and for those recently graduated, this experience may seem impossible to obtain. Airports offering internships allow individuals new to the field to apply knowledge gained in the classroom to practical airport business.

An article by Thiesse, NewMyer, and Widick (1992) details five traditional structures of an airport intern program. The first alternative is known as job shadowing: the intern becomes an administrative assistant to the airport manager, thereby learning through close managerial cooperation. Second, a departmental rotation internship allows the intern to rotate through the various departments at the airport, such as public relations, ground transportation, operations, and maintenance. At smaller airports without these separate departments, the job-shadowing role becomes more prevalent. The third alternative is a single department-based internship, which allows the student to gain experience in only one department. Although this is not advisable for airport management students, it is common for mechanical engineering students to intern solely in one area, such as the Facilities department at an airport. The academic internship seeks as its end goal a report, presentation, manual, or brochure/booklet; this arrangement begins with a short period of job shadowing, quickly followed by intense research on the task at hand. Finally, a specific task internship may be similar to the academic internship, but is designed to give interns experience that is task-oriented, not necessarily departmentally based.

The author conducted research to determine which of these five types of internships is most prevalent in the airport industry. Since the "Positions Open" section of AAAE's Airport Report appears to be the authoritative nationwide source of available airport internships, each issue of this newsletter was examined for the October 1992 through October 1997 time period. These 120 issues yielded 27 internship advertisements for a total of 35 available internship positions.

During these five years and across these 35 positions, 40 percent (14) had a title of Airport Management Intern, 34 percent (12) were recognized as Airport Interns, and 9 percent (3) were Administrative Interns. The majority of the appointments (51 percent) were for a term of two years. Nine percent advertised a 15-week (summer) training program; other terms included one year and six months. Finally, 94 percent (33) were rotational in structure, explained by Thiesse et al. (1992) as a departmental rotation internship. Due to the ubiquity of departmental rotation internships at airports offering internships, this paper focuses specifically on this arrangement. Appendix A is a listing, by airport, of the internship positions that were advertised during this time period.
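The percentage figures above are simple tallies; a sketch of the arithmetic is shown below, with the six positions whose titles are not itemized in the text grouped under a hypothetical "Other" label.

    from collections import Counter

    # The 35 advertised positions, tallied by title.
    titles = (["Airport Management Intern"] * 14
              + ["Airport Intern"] * 12
              + ["Administrative Intern"] * 3
              + ["Other"] * 6)  # remaining titles not broken out in the text

    for title, n in Counter(titles).items():
        print(f"{title}: {n} ({n / len(titles):.0%})")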

The airport business is in a constant state of change, and as airport managers respond to this change, so too should recent graduates. For this to be effectively accomplished, current airport managers, those aspiring to become such, and the universities assisting with this task should be aware of the perceptions of airport managers regarding four important areas: (a) descriptive words applicable to the airport management career, (b) the relative importance of airport departments in which to gain experience, (c) the benefits of airport internship programs, and (d) the recommended number of work days an airport intern should spend in each airport department.

As such, this paper details the findings of a survey sent to 200 randomly selected airport managers in a nationwide study. Their opinions on their airport management career, the relative importance of different airport departments in which an intern should gain experience, and the benefits of airport internship programs were sought. These responses allowed the author to devise formulae that may be adopted by airports to appropriately structure a departmental rotation internship at their airport. It is hoped that the data presented and recommendations made will assist airport managers, recent graduates, and university aviation programs in becoming properly educated about the needs of the airport manager of tomorrow.

Methodology

Participants

In selecting participants for this study, the 1996-97 AAAE Membership Directory and Yellow Pages of Corporate Members (AAAE, 1997) was utilized. This directory contains a comprehensive listing of airport managers nationwide. Indeed, AAAE's membership is comprised of airport managers employed at the primary air carrier airports, accounting for 99 percent of nationwide enplanements, as well as at many of the smaller commercial service, reliever, and general aviation airports. Each individual member airport was counted to arrive at a total population of 690 airports. Out of this total, the goal was to receive 150 (n) usable surveys; therefore, assuming a response rate of 75 percent (p), the selected sample size was 200 (N) [n / p = N]. Each airport in the Directory was numbered alphabetically, and a random numbers table [Table B-1 in the text by Alreck and Settle (1995)] was used to arrive at 200 randomly selected numbers. These numbers were then matched to the corresponding airports to arrive at a random sample of 200 airports to use for the study. This random sampling methodology was chosen for two reasons. First, it avoids the possibility of bias introduced by a nonrandom selection of sample elements. Second, it provides a probabilistic basis for the selection of a sample.
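As a rough illustration of the sample-size arithmetic and random selection described above, the short Python sketch below is an illustrative stand-in only: the study used a printed random numbers table rather than code, and the function names and seed are hypothetical. The figures 150, 0.75, and 690 come from the text.

    import math
    import random

    def required_mailing_size(target_usable, expected_response_rate):
        # N = n / p: surveys to mail so that roughly target_usable come back
        return math.ceil(target_usable / expected_response_rate)

    def draw_sample(population_size, sample_size, seed=None):
        # Simple random sample without replacement from airports numbered 1..population_size
        rng = random.Random(seed)
        return sorted(rng.sample(range(1, population_size + 1), sample_size))

    n_target, p_expected, population = 150, 0.75, 690
    N = required_mailing_size(n_target, p_expected)   # 150 / 0.75 = 200
    print("Surveys to mail:", N)
    print("First ten selected airport numbers:", draw_sample(population, N, seed=1998)[:10])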

Survey Instrument

Since viewpoints were the main end product desired in this study, it was decided that a survey instrument would be utilized. Surveys can be designed to "measure things as simple as respondents' physical or demographic characteristics or as complex as their attitudes, preferences, or lifestyle patterns" (Alreck & Settle, 1995, p. 5). Further, because survey research uses sampling, information about an extremely large population can be obtained from a relatively small sample of people. As a result, the author designed a four-page survey instrument specifically for this study (see Appendix B). All questions were closed-ended to allow for easier coding of data. Further, many questions were scaled on a five-point Likert scale. This was used to "obtain people's position on certain issues or conclusions" (Alreck & Settle, 1995, p. 116).

The survey begins with a definition of Airport Manager, defined as "the individual managing all facets of the day-to-day activities of the airport and known by such titles as Executive Director and Director of Aviation." Airport Intern is defined as "an individual working full- or part-time in a temporary status gaining experience in the airport management field." These definitions, which were developed by the author, were included to reduce any misunderstanding that might arise when these terms were encountered while completing the survey. The survey then progresses into Section A, which is composed of an adjective checklist. This section allowed some exploratory research into how airport managers view their career. This type of question was included for the benefit of current students who may be interested in knowing the percentage of survey respondents considering the career stressful or political, for example.

The next section focused on the departments at an airport to which an intern might be exposed. The section was scaled on a Likert scale to allow for responses of airport managers to be gauged on this five-point scale. Choices included 0 (Don't Know), 1 (Extremely Unimportant), 2 (Unimportant), 3 (Neutral), 4 (Important), and 5 (Extremely Important). Participants were instructed to circle the number that most closely corresponded to their opinion about the importance of each airport department.

The section focusing on airport departments presented fifteen airport departments, listed in alphabetical order. The actual departments, ranging from Aircraft Rescue and Fire Fighting to Records Management, are most common to medium and large hub airports. The functions represented, however, are common to airports of any size.

Participants were also instructed to register their opinion regarding the benefits of airport internship programs in general. This section also consisted of a Likert scale, with choices of 0 (Don't Know), 1 (Extremely Nonbeneficial), 2 (Nonbeneficial), 3 (Neutral), 4 (Beneficial), and 5 (Extremely Beneficial).

Procedure

In the cover letter accompanying each survey, participants were instructed on the reason for the research, how they were chosen, the importance of their participation, the estimated time required to complete the survey, and the fact that participation was voluntary. Further, they were told to skip any questions they did not want to answer. As discussed earlier, the sample chosen was a simple random sample without replacement, in that each participant had an equal probability of being selected and, once selected, would not be chosen again.

The section focusing on descriptive words stated, "Which of the following words describe your airport management career?" Participants were instructed to place a check next to any and all words that applied. Regarding airport departments, participants were asked, "For a successful career as an airport manager, how important do you feel practical experience is in each of the following airport departments?" Participants were instructed to circle the number corresponding to their feeling of department importance for each item. Finally, regarding the importance of internship programs, participants were asked, "In general, how beneficial do you feel an airport internship program is to the intern and the airport?" As with the previous section, participants were given a Likert scale and asked to circle the number most closely corresponding to their feeling on each item.

The 200 surveys were mailed on December 30, 1997. As of January 12, 1998, a response rate of 43 percent (86 surveys) had been received. Following the advice of Fowler (1993), a reminder postcard was mailed to all non-respondents, emphasizing the importance of the study and the benefit of a high rate of response. One hundred and three postcards were mailed to non-respondents on January 15, 1998. This reminder mailing gave recipients the opportunity to receive another survey by fax, but only one recipient made such a request. This second mailing resulted in a total survey response rate of 66 percent, with 132 usable surveys being returned by the established deadline.

Data Analysis

Once the surveys were returned, a statistical analysis program, SPSS for Windows, was utilized to analyze the survey results. Descriptive statistics were produced, including frequencies, means, and standard deviations. The results are reproduced in this article in a tabular format to allow for easy comparison among categories.
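To show the kind of descriptive statistics reported in the tables that follow, the brief Python sketch below is a hypothetical stand-in for the SPSS runs described above, not the author's actual analysis. It computes frequencies, the mean, and the standard deviation for a single Likert item, dropping "Don't Know" (0) responses as the table notes indicate; the sample ratings are invented.

    from collections import Counter
    from statistics import mean, stdev

    def likert_summary(responses):
        # Keep only the 1-5 ratings; "Don't Know" (0) answers are excluded
        valid = [r for r in responses if 1 <= r <= 5]
        freq = Counter(valid)
        return {
            "n": len(valid),
            "frequencies": {k: freq.get(k, 0) for k in range(1, 6)},
            "mean": round(mean(valid), 3),
            "sd": round(stdev(valid), 3),   # sample standard deviation
        }

    # Invented ratings for one department item (not survey data)
    print(likert_summary([0, 4, 5, 3, 4, 4, 5, 2, 4, 0, 5]))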

RESULTS

Demographics

Because 34 percent (68) of survey recipients did not respond, one may ask whether this introduced non-response bias into the results. The respondents to this survey very closely match the AAAE membership at large. This membership, according to AAAE, is "truly representative of airport management throughout the country" (AAAE, 1998). AAAE membership is composed of non-hub, other commercial service, and general aviation airports (75%), large hub (5%), medium hub (8%), and small hub (12%) airports (Susan Lausch, AAAE, personal fax, February 20, 1998). The survey respondents were composed of non-hub, other commercial service, and general aviation (72%), large hub (7%), medium hub (9%), and small hub (13%) airports. Therefore, given the random design and the close similarity between respondents and the membership at large, non-response bias appears limited, and the results may reasonably be generalized to the population of airport managers nationwide.
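The comparison above is made informally by inspecting percentages; one optional way to quantify it, not performed in the original study, is a chi-square goodness-of-fit test of the respondents' hub-size distribution against the AAAE membership distribution. The sketch below is illustrative only: the respondent counts are approximated from the reported percentages, and the usable n of 131 is an assumption.

    from scipy.stats import chisquare

    # Hub-size categories: non-hub/other commercial service/GA, large, medium, small
    membership_props = [0.75, 0.05, 0.08, 0.12]   # AAAE membership (from the text)
    respondent_props = [0.72, 0.07, 0.09, 0.13]   # survey respondents (from the text)

    n_respondents = 131                           # assumed usable n for this item
    observed = [round(p * n_respondents) for p in respondent_props]
    expected = [p * sum(observed) for p in membership_props]  # scaled to the same total

    stat, p_value = chisquare(f_obs=observed, f_exp=expected)
    print("chi-square = %.2f, p = %.2f" % (stat, p_value))
    # A large p-value would be consistent with the respondents mirroring the membership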

The respondents were 88 percent male and 12 percent female. Thirty-nine percent of participants were more than 50 years of age, with 34 percent and 23 percent being between 41 and 50 years of age and 30 to 40 years of age, respectively. Further, 45 percent of respondents are known as Airport Managers, with 20 percent being known as Airport Directors. It should be noted that although most airport managers are male, the number of females obtaining this position seems to be on the increase. In fact, according to a study in 1994 by Truitt, Hamman, and Palinkas, 94 percent of responding airport managers were males. Therefore, it appears that females are recognizing the opportunities in airport management and contributing to the diversity of this profession.

Descriptive Words

The first section of the survey listed 15 adjectives to allow airport managers to describe their airport management career. This exploratory research will prove helpful to students who are aspiring to enter the airport industry. Further, it gives current airport managers insight into how their peers view the career. Table 1 is a tabular display of those words and the numbers and percentages of respondents agreeing with each.


Table 1
Evaluation of Words Describing Airport Management Career

Word            Yes        No
Challenging     118 (90)   13 (10)
Competitive     53 (41)    78 (60)
Dangerous       6 (05)     125 (95)
Disappointing   12 (09)    119 (91)
Easy            6 (05)     125 (95)
Enjoyable       89 (68)    42 (32)
Exciting        76 (58)    55 (42)
Fulfilling      72 (55)    59 (45)
Important       88 (67)    43 (33)
Interesting     119 (91)   12 (09)
Low-Paying      23 (18)    108 (82)
Political       91 (70)    40 (31)
Rewarding       90 (69)    41 (31)
Secure          18 (14)    113 (86)
Stressful       83 (63)    48 (37)

Note 1: Number in parentheses represents percentages.
Note 2: Row percentages may not total 100 percent due to rounding.
Note 3: Words are listed in alphabetical order as they appeared on the survey instrument.
Note 4: N = 131 for all cases.

Ninety-one percent of respondents feel their career is interesting and ninety percent feel it is challenging. These two words claimed the majority of positive responses; however, the following words were also identified by respondents as describing their airport management career: political (70%), rewarding (69%), enjoyable (68%), important (67%), stressful (63%), exciting (58%), and fulfilling (55%). Words receiving very little agreement are dangerous (5%) and easy (5%). Therefore, for those aspiring to be airport managers and wishing for an interesting and challenging career, the field of airport management would appear to be a reasonable choice. These students should also realize, however, that the field is political, stressful, and not very easy, according to the survey results.

Table 2
Evaluation of Words Describing Airport Management Career: Ranking of Mean Ratings

Word            M      SD
Interesting     1.092  0.290
Challenging     1.099  0.300
Political       1.305  0.462
Rewarding       1.313  0.465
Enjoyable       1.321  0.469
Important       1.328  0.471
Stressful       1.366  0.484
Exciting        1.420  0.495
Fulfilling      1.450  0.499
Competitive     1.595  0.493
Low-paying      1.824  0.382
Secure          1.863  0.346
Disappointing   1.908  0.290
Dangerous       1.954  0.210
Easy            1.954  0.210

Note 1: Rating system utilized as follows: 1 = Yes (Agreed); 2 = No (Disagreed).
Note 2: Words are listed by ascending value of mean.
Note 3: M = mean; SD = standard deviation.

Table 2 is a listing of the descriptive statistics related to each word. Words are listed in ascending order by value of the mean; in other words, the lowest mean equates to the highest level of agreement. These data simply confirm the findings presented in the percentage distribution table (Table 1).


Airport Departments

In examining the importance of airport departments, the goal is to arrive at the most appropriate structure of a departmental rotation internship program. Specifically, which airport departments should the intern experience, and how much time should be spent in each department? The answers to these questions will obviously vary with each airport as staffing and time allow. Nonetheless, it is informative to know those departments which are seen as important by airport managers, so those airports interested in providing departmental rotation internships may structure their program accordingly. Further, this information gives students insight into which departments are most important as they begin seeking practical experience within the airport industry. (Please refer to Tables 3 and 4.)


Combining the important and extremely important categories, Finance and Planning and Development tied for first place, each receiving 92 percent of responses in these two categories. Other departments viewed as important or extremely important were Properties and Contracts (89%), Operations (88%), Information/Public Relations (83%), General Aviation (76%), Facilities/Maintenance (72%), Design and Construction (69%), and Human Resources (65%). At the other extreme, the two departments which received the most marks for extremely unimportant and unimportant are International Commerce (34%) and Aircraft Rescue/Fire Fighting (27%). Airport managers feel these last two departments should not receive much emphasis in an internship program.


Table 3
Evaluation of Airport Departments

                               Extremely
                               Unimportant  Unimportant  Neutral  Important  Extremely Important
Airport Departments            1            2            3        4          5                    n
Aircraft Rescue/Fire Fighting  10 (08)      24 (19)      32 (25)  42 (32)    22 (17)              130
Records Management             6 (05)       19 (15)      42 (33)  45 (35)    17 (13)              129
Design and Construction        1 (01)       9 (07)       31 (24)  68 (53)    20 (16)              129
Communications Center          6 (05)       20 (16)      55 (43)  38 (30)    9 (07)               128
Facilities/Maintenance         1 (01)       3 (02)       32 (24)  70 (53)    25 (19)              131
Finance                        1 (01)       1 (01)       9 (07)   61 (47)    59 (45)              131
General Aviation               0            8 (06)       23 (18)  71 (54)    29 (22)              131
Ground Transportation          2 (02)       25 (19)      53 (41)  45 (35)    5 (04)               130
Human Resources                0            5 (04)       39 (31)  53 (42)    29 (23)              126
Information/Public Relations   1 (01)       2 (02)       20 (16)  57 (45)    48 (38)              128
International Commerce         12 (11)      26 (23)      52 (46)  20 (18)    3 (03)               113
Operations                     1 (01)       2 (02)       13 (10)  60 (46)    54 (42)              130
Planning and Development       0            1 (01)       10 (08)  70 (54)    49 (38)              130
Police/Security                5 (04)       18 (14)      48 (37)  48 (37)    10 (08)              129
Properties and Contracts       0            4 (03)       11 (08)  73 (56)    43 (33)              131

Note 1: Number in parentheses represents percentages.
Note 2: Row percentages may not total 100 percent due to rounding.
Note 3: n reflects all valid cases, excepting "Don't Know" responses and nonresponses.


Table 4
Evaluation of Airport Departments: Ranking of Mean Ratings

Departments                    M      SD
International Commerce         2.788  0.949
Communications Center          3.187  0.945
Ground Transportation          3.200  0.848
Police/Security                3.310  0.942
Aircraft Rescue/Fire Fighting  3.323  1.183
Records Management             3.372  1.039
Design and Construction        3.752  0.829
Human Resources                3.841  0.824
Facilities/Maintenance         3.878  0.765
General Aviation               3.924  0.800
Information/Public Relations   4.164  0.801
Properties and Contracts       4.183  0.710
Operations                     4.262  0.763
Planning and Development       4.285  0.638
Finance                        4.344  0.710

Note 1: Rating system provided for evaluators was as follows: 0 = Don't Know; 1 = Extremely Unimportant; 2 = Unimportant; 3 = Neutral; 4 = Important; 5 = Extremely Important.
Note 2: Only responses 1-5 were used in calculating statistics.
Note 3: M = mean; SD = standard deviation.


In using these numbers to most efficiently structure a departmental rotation internship program, one must keep in mind that these data represent the expert opinion of airport managers only. They do not take into account the views of interns, who may wish to learn more about Aircraft Rescue/Fire Fighting, for example, even though it is considered of little importance to airport managers.

INTERNSHIP PROGRAM BENEFITS

For those airports and students still doubting the value of internships, it may prove helpful to analyze the extent of the benefits, as stated by airport managers, of airport internships in general to both the airport and the intern involved. These responses, therefore, do not necessarily focus solely on departmental rotation internships. It will first be insightful to examine the number of survey participants who have actually experienced an internship. Eighty-seven percent of respondents have never been employed as an airport intern. Of the 13 percent of respondents who have, only 5 individuals report this being a requirement for graduation. Furthermore, 53 percent of responding airports do not have an active internship program. Of those, 69 percent state they would not be willing and able to implement an internship program under sufficient guidance. Although the specific reasoning for this finding was not addressed in the survey, written comments indicated that lack of funding is a hindrance to implementing an internship program at some responding airports. However, eighteen airports are willing and able to implement such a program, and five airports responded with maybe. As such, it is prudent to discuss the benefits associated with implementing an internship program.

Table 5
Evaluation of Airport Internship Program Benefits

             Extremely
             Nonbeneficial  Nonbeneficial  Neutral  Beneficial  Extremely Beneficial
Beneficiary  1              2              3        4           5                     n
Intern       1 (01)         0              8 (07)   34 (29)     73 (63)               116
Airport      1 (01)         5 (05)         36 (36)  41 (41)     18 (18)               101

Note 1: Number in parentheses represents percentages.
Note 2: n reflects all valid cases.

Sixty-three percent of respondents feel that an internship program is extremely beneficial to the intern, with 29 percent feeling it is beneficial to the intern. Benefits lessen but are still quite high with regard to airports: fifty-nine percent of respondents feel internships are either beneficial or extremely beneficial to airports, while thirty-six percent of participants are neutral on this subject. It is obvious, therefore, that benefits exist to a high degree for both interns and the airports employing them (Tables 5 and 6).


Table 6
Evaluation of Airport Internship Program Benefits: Ranking of Mean Ratings

Beneficiary  M      SD
Intern       4.661  0.789
Airport      4.025  1.136

Note 1: Rating system provided for evaluators was as follows: 0 = Don't Know; 1 = Extremely Nonbeneficial; 2 = Nonbeneficial; 3 = Neutral; 4 = Beneficial; 5 = Extremely Beneficial.
Note 2: Only responses 1-5 were used in calculating statistics.
Note 3: M = mean; SD = standard deviation.

Recommended Structure of Departmental Rotation Airport Internship Program

As stated earlier, one of the purposes of this paper is to recommend to airport managers the most appropriate structure of a departmental rotation airport internship program. This section outlines the recommended structure for internship programs lasting two years, one and one-half years, and one year.

The procedure for structuring an internship program should occur in the following steps: (a) rank the departments at your airport using Tables 3 and 4, (b) determine the amount of total time the intern will work at your airport, and (c) use the formula outlined below to determine the number of days the intern should spend in each airport department.

The formulae, which were designed by the author as a result of this research effort, produce the number of days an intern should spend in each department according to the level of importance each department received in the survey. These formulae are guidelines only; however, the number of days suggested by the formulae correlates with the perceived importance of each department and is quite reasonable as a timeframe for each intern. If the intern program is for two years, the following formula should be used: 0.5(a + b) = y days. The first variable, a, is the percentage of important marks for that department, without the percent sign. Next, b is the percentage of extremely important marks for that department, without the percent sign. Finally, y is the actual number of workdays the intern should spend in that department on a continuous basis. If the intern program is for a length of one and one-half years, the following formula is suggested: 0.4(a + b) = y days. If the intern program is to last one year, use the following: 0.3(a + b) = y days.

The formulae are designed to give the airport manager a rough rule of thumb to use in determining the length of time an intern should spend in each airport department. The reader must remember that the numbers generated by the formulae yield the actual number of workdays suggested. Fourteen days, for instance, equates to almost three calendar weeks, rather than two calendar weeks. The totals are based on a standard of 260 workdays per year, which equates to 21.6 working days per month. Therefore, if a formula yields 21.6 for a department, the airport manager should assign the intern to that department for an entire calendar month.

These formulae assume (a) that there is no "home" department to which the intern must return after each departmental rotation, (b) that time spent in each department is continuous, and (c) that a minimum of one workweek (five days) is spent in each department regardless of the level of perceived importance or equation result. For the actual amount of suggested time that should be spent in each department, depending on whether the internship is two years, one and one-half years, or one year, refer to Appendix C.
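To make the formulae concrete, the following Python sketch, an illustration of the rules above rather than a tool from the original study, converts a department's Table 3 percentages into suggested workdays for a given program length, applying the five-day floor and the 21.6-workdays-per-month conversion just described. The two example departments and their percentages come from Table 3; rounding in Appendix C may differ slightly for a few departments.

    # Multipliers from the text: two years -> 0.5, one and one-half years -> 0.4, one year -> 0.3
    MULTIPLIERS = {"two_years": 0.5, "one_and_one_half_years": 0.4, "one_year": 0.3}
    MIN_DAYS = 5                # at least one workweek per department
    WORKDAYS_PER_MONTH = 21.6   # 260 workdays per year divided by 12 months

    def rotation_days(important_pct, extremely_important_pct, program="two_years"):
        # y = multiplier * (a + b), where a and b are the "important" and "extremely
        # important" percentages for the department, without the percent sign
        y = MULTIPLIERS[program] * (important_pct + extremely_important_pct)
        return max(MIN_DAYS, int(y + 0.5))   # round half up, then apply the five-day floor

    # Percentages (a, b) taken from Table 3
    for department, a, b in [("Finance", 47, 45), ("International Commerce", 18, 3)]:
        days = rotation_days(a, b, "two_years")
        print("%s: %d workdays (about %.1f months)" % (department, days, days / WORKDAYS_PER_MONTH))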

Extent of Experience Necessary

For those students aspiring to become airport managers, it will prove helpful to examine how airport managers feel regarding the number of years of experience required to obtain such a position. Of those working at large hub airports, 66 percent state that 10 to 15 years of experience are needed to obtain a position of airport manager at a large hub airport. Of those working at a medium hub, 73 percent report that 10 to 15 years are needed to obtain a position of airport manager at a medium hub airport. Small hub respondents (71%) consider that 15 years or less are necessary at a small hub airport. Non-hub respondents (97%) report that 15 years or less are necessary at a non-hub airport. Other commercial service respondents (92%) explain that less than 10 years is adequate at these airports. Finally, 98 percent of general aviation respondents report that less than fifteen years of experience are necessary to obtain a position of airport manager at a GA airport.

Fortunately, according to these numbers, airport management neophytes can reasonably expect to become an airport manager at a large hub airport by the age of 36. Assuming entry into the industry at the age of 21, the probability of obtaining a position of airport manager by the age of 36 increases as hub size decreases. In other words, it is much more feasible to obtain a position of airport manager at a general aviation airport by the age of 36 than at a large hub. Even so, these numbers serve to motivate young individuals in the field who have such high aspirations.


Recommendations

The main purpose of this research is to offer recommendations regarding departmental rotation airport internships to aviation management students, universities, and airport managers. These recommendations will hopefully assist these parties in responding to the complex challenges that are to be expected in the next century.

Aviation management students
1. Review Tables 1 and 2 to determine if an airport manager career is truly desired.
2. Review Tables 3 and 4 to understand the perceived importance by airport managers of different airport departments.
3. Review Tables 5 and 6 to realize the benefits of airport internship programs.
4. Review Appendix A for an idea of which airports have offered internship programs in the past.

Universities
1. Encourage students to search early and thoroughly for internships.
2. Build a relationship with local airports to encourage their use of interns.
3. Utilize Appendix A to inform students of airports that have offered internship programs in the past.

Airport Managers
1. Seriously consider implementing an internship program if your airport has not already done so.
2. Study Tables 3 and 4 and Appendix C for guidance in appropriately structuring a departmental rotation internship program at your airport.
3. If your airport already has a non-rotational internship program in place, consider implementing a departmental rotation internship program. This type of internship is most common at airports, and appears to be most beneficial for both the airport and the students involved.

These recommendations summarize the main findings of this research. They are based mainly on the viewpoints of airport managers. Even so, these viewpoints represent current, expert opinions in the airport industry and should not be taken lightly.

Conclusion

The airport industry is currently experiencing unprecedented levels of growth. With this growth, airport managers are being forced to rely on innovative methods to obtain capital, educate employees, encourage competitive forces, and continue ensuring the safety and security of the flying public. These areas are best learned by on-the-job experience, supplemented with education. To enable airports to continue meeting the challenges that lie ahead, therefore, aspiring airport managers need to be given adequate opportunities early in their careers to experience all sectors of the airport environment. Airport internships are an excellent choice for achieving this goal.


REFERENCES

Alreck, P. L., & Settle, R. B. (1995). The survey research handbook: Guidelines and strategies for conducting a survey (2nd ed.). Chicago: Irwin Professional Publishing.

American Association of Airport Executives. (1998). 1997-98 membership directory and yellow pages of corporate members. Alexandria, VA: Author.

American Association of Airport Executives. (1997). 1996-97 membership directory and yellow pages of corporate members. Alexandria, VA: Author.

American Association of Airport Executives. (October 1992 - October 1997). Airport Report. Alexandria, VA: Author.

Fowler, F. J., Jr. (1993). Survey research methods (2nd ed.). Newbury Park: SAGE Publications.

Thiesse, J. L., NewMyer, D. A., & Widick, L. L. (1992). FBO and airport internships for university aviation students: Benefits for students, universities, and the aviation industry. Journal of Studies in Technical Careers, XIV(4), 253-264.

Truitt, L. J., Hamman, J. A., & Palinkas, K. G. (1994, Winter). Graduate education in airport administration: Preparing airport managers for the 21st century. Journal of Aviation/Aerospace Education and Research, 4(2), 9-16.


Appendix A
Advertised Internships Appearing in AAAE Airport Report, October 1992 through October 1997

Airport/City                    Position Title                  Length of Position  Rotation  Date Advertised
Buchanan Field/Concorde, CA     Administrative Intern           2 years             Yes       5/15/92
DFW/Dallas, Ft. Worth, TX       Management Training Intern      n/a                 n/a       5/1/92
Falcon Field/Mesa, AZ           Airport Management Intern       2 years             Yes       10/15/93
Reno Tahoe Intl./Reno, NV       Management Intern               2 years             Yes       4/15/94
Buchanan Field/Concorde, CA     Administrative Intern           2 years             Yes       7/1/94
Paine Field/Everett, WA         Management Intern               2 years             Yes       10/1/94
Dane County/Madison, WI         Airport Management Intern       10 months           Yes       1/15/95
Grand Rapids, MI                Airport Intern                  15 weeks            Yes       2/15/95
Crites Field/Waukesha, WI       Airport Management Intern       6 months            Yes       4/15/95
Reno Tahoe Intl./Reno, NV       Airport Management Intern       2 years             Yes       4/15/95
Myrtle Beach, SC                Airport Intern                  1 year              Yes       9/1/95
Kansas City, MO                 Airport Management Trainee      2 years             Yes       10/1/95
Crites Field/Waukesha, WI       Airport Management Intern       6 months            Yes       11/15/95
Buchanan Field/Concorde, CA     Administrative Intern           2 years             Yes       2/1/96
Grand Rapids, MI                Airport Intern                  15 weeks            Yes       2/1/96
Tampa International/Tampa, FL   Airport Operations Specialist   2 years             Yes       2/1/96
Reno Tahoe Intl./Reno, NV       Airport Management Intern       2 years             Yes       4/1/96
Paine Field/Everett, WA         Management Intern               2 years             Yes       4/1/96
Dane County/Madison, WI         Airport Management Intern       1 year              Yes       4/15/96
Falcon Field/Mesa, AZ           Airport Management Intern       2 years             Yes       6/1/96
Grand Rapids, MI                Airport Intern                  15 weeks            Yes       1/15/97
Detroit, MI                     Airport Intern                  4 months            Yes       2/15/97
Paine Field/Everett, WA         Management Intern               2 years             Yes       4/15/97
Reno Tahoe Intl./Reno, NV       Airport Management Intern       2 years             Yes       6/1/97
Phoenix Intl./Phoenix, AZ       Airport Management Intern       2 years             Yes       7/1/97
Philadelphia, PA                Airport Administrative Trainee  n/a                 n/a       8/1/97
Bismark, ND                     Airport Intern                  1 year              Yes       8/1/97

Note 1: Positions appear in order by date advertised.
Note 2: Table displays 27 advertisements placed by 15 individual airports.
Source: AAAE Airport Report, October 1992 - October 1997.


Appendix B
Airport Manager Survey

*Note: The entire survey provides data on more subjects than were discussed in this paper. Sections A, B, and F-J are of particular interest to the readers of this paper.

Airport Manager Survey

Note: Airport Manager is defined as the individual managing all facets of the day-to-day activities of the airport and known by titles such as Executive Director and Director of Aviation. Airport Intern is defined as an individual working full- or part-time in a temporary status gaining experience in the airport management field.

A. Which of the following words describe your airport management career? Please place a check in the box next to any and all words below you feel accurately describe your airport management career.
   Challenging; Competitive; Dangerous; Disappointing; Easy; Enjoyable; Exciting; Fulfilling; Important; Interesting; Low-paying; Political; Rewarding; Secure; Stressful

B. For a successful career as an airport manager, how important do you feel practical experience is in each of the following airport departments? If you feel experience in the department is extremely important, pick a number from the right side of the scale and circle it beside the item. If you feel it is not important at all, pick a number from the left, and if you feel the importance is between these extremes, pick a number from the middle of the scale. If you don't know about a specific department, use the number zero (0).
   SCALE: 0 = Don't know; 1 = Extremely unimportant; 3 = Neutral; 5 = Extremely important
   Items: Aircraft Rescue/Fire Fighting; Records Management; Design and Construction; Communications Center; Facilities/Maintenance; Finance; General Aviation; Ground Transportation; Human Resources; Information/Public Relations; International Commerce; Operations; Planning and Development; Police/Security; Properties and Contracts; Other (specify)

C. In preparing students for a successful career as an airport manager, how important do you feel each of the following academic fields is as a major area of study? Utilize the scale as in the question above. If you don't know about a specific area of study, use the number zero (0).
   Items include: Accounting; Aviation Management; Applied Science/Technology; Computer Science; Engineering; Finance; Foreign Language; International Relations/Business; Law; Management; Marketing; Political Science; Psychology; Public Administration; Speech Communication

D. In preparing students for a successful career as an airport manager, how important do you feel each of the following aviation academic courses is? Utilize the scale as in the question above. If you don't know about a specific academic course, use the number zero (0).
   Items include: Air Cargo & Logistics; Air Traffic Administration; Air Transportation; Airport Administration; Airport Finance; Aviation Insurance; Aviation Labor Relations; Aviation Law/Regulation; Aviation Marketing; Aviation Safety; Aviation Policy & Planning; International Aviation; Principles of Transportation; Private Pilot Ground

E. What do you feel is the highest-level academic degree preferred by employers, when combined with experience, to attain a position of airport manager? Place a check in the box next to the one highest-level academic degree you feel is preferable for an airport manager position. Only check one.
   High school diploma; Associates degree; Bachelors degree; Masters degree; Doctorate degree; Other (specify)

F. In general, how beneficial do you feel an airport internship program is to the intern and the airport? If you don't know, use the number zero (0).
   SCALE: 0 = Don't know; 1 = Extremely nonbeneficial; 3 = Neutral; 5 = Extremely beneficial
   Items: Intern; Airport

G. Have you personally ever been employed as an airport intern?
   Yes / No [If not, go to Question I.]

H. If yes, was it a required aspect of your academic degree?
   Yes / No

I. Do you currently or have you previously worked at an airport that has an active airport internship program?
   Yes [If yes, go to Question K.] / No

J. If your airport does not have an internship program, assuming an intern salary of $20,000 per year, do you believe your airport would be willing and able to implement one under sufficient guidance?
   Yes / No

K. In which of the following categories does your airport belong? Only check one.
   Large hub (at least 1% of all annual enplanements); Medium hub (between .25% and 1% of all annual enplanements); Small hub; Non-hub (over 10,000 annual enplanements, but less than .05% of annual enplanements); Other commercial service; General Aviation (no scheduled commercial passenger service)

L. How important do you feel American Association of Airport Executives Accreditation (A.A.E.) status is in obtaining a position as airport manager? If you don't know, use the number zero (0).
   SCALE: 0 = Don't know; 1 = Extremely unimportant; 3 = Neutral; 5 = Extremely important

M. Do you currently have (or are you actively pursuing) AAAE Accreditation (A.A.E.) status?
   Yes / No

N. Of the following, what is the highest-level academic degree you have completed? Place a check in the one box next to the highest-level degree you've completed. Only check one.
   High school diploma; Associates degree; Bachelors degree; Masters degree; Doctorate degree; Other (specify)

O. What is your gender?
   Male / Female

P. What is your current age?
   Less than 30 years; 30-40 years; 41-50 years; More than 50 years

Q. How many years of experience in the airport business do you feel are necessary to obtain a position of airport manager at an airport of your airport's hub size?
   Less than 10 years; 10-15 years; 16-20 years; 21-25 years; More than 25 years

R. What is the title of your current position?
   Airport Director; Airport Manager; Director of Airports; Director of Aviation; Executive Director; Managing Director; Other (specify)

S. If you would like to receive an abstract of the completed survey results, please write your mailing address.

Please (1) use the enclosed self-addressed stamped envelope to mail your results to Daniel Prather, Hillsborough County Aviation Authority, P.O. Box 22287, Tampa, FL 33622 or (2) fax your completed survey to Daniel Prather at (813) 875-6670. Thank you very much for your time and effort!


Appendix C
Recommended Number of Workdays that Intern Should Spend in Each Airport Department

                                  Important  Extremely        Time allowed:  Time allowed:    Time allowed:
Airport Department                (4) a      important (5) b  Two years (y)  1 1/2 years (y)  1 year (y)
Aircraft Rescue and Firefighting  32         17               25             17               12
Records Management                35         13               24             15               12
Design and Construction           53         16               35             20               16
Communications Center             30         7                19             10               9
Facilities/Maintenance            53         19               36             22               17
Finance                           47         45               46             36               25
General Aviation                  54         22               38             24               19
Ground Transportation             35         4                20             9                9
Human Resources                   42         23               33             22               16
Information/Public Relations      45         38               42             32               22
International Commerce            18         3                11             5                5
Operations                        46         42               44             34               24
Planning and Development          54         38               46             34               24
Police/Security                   37         8                23             12               10
Properties and Contracts          56         33               45             31               23

TOTAL DAYS IN ALL DEPARTMENTS                                 483            324              242
TOTAL MONTHS IN ALL DEPARTMENTS                               22             15               11
TOTAL MONTHS INTERN HAS AVAILABLE                             24             18               12
NUMBER OF DAYS REMAINING AT END OF ROTATION                   36             65               17

Note 1: Formulas are as follows: Two years: 0.5(a + b) = y days; 1.5 years: 0.4(a + b) = y days; One year: 0.3(a + b) = y days.
Note 2: All days have been rounded.


An Evaluation of a Major Airline Flight Internship Program: Participant's Perceptions of Curricular Preparation for, and Components of, the Internship

Jose R. Ruiz, David A. NewMyer, and D. Scott Worrells
Southern Illinois University Carbondale

ABSTRACT

This article presents the results of a follow-up survey administered to 110 former university interns who served a semester-long flight operations internship at United Airlines. The intent of the survey was to obtain the participants' opinions concerning their academic preparation for the internship experience, as well as their overall assessment of the internship experience itself. Of the 78 respondents, 75.7% indicated that their university aviation curriculum had prepared them "very well" or "well" for the internship. Further, 80.7% of all respondents indicated that the semester-long internship had a "great impact" or "significant impact" in helping them achieve their career goals. Also, 96.2% of all respondents said that they would recommend a United Airlines internship to students seeking an aviation career.

INTRODUCTION

In July 1987 Southern Illinois University at Carbondale (SIUC) and United Airlines signed a formal agreement to work together in establishing a flight operations internship partnership. Essentially, SIUC agreed to supply a well-prepared and pre-selected group of internship candidates. These candidates would then be evaluated during a one-to-two week "short internship" during which they would be formally interviewed for a semester-long "long internship". Since this agreement was signed, over 200 "short interns" and over 110 "long interns" (largely selected from the "short intern" group) have participated in this SIUC program.

As the university prepared for the celebration of the tenth anniversary of the signing of the internship agreement in 1997, SIUC faculty determined that this would be an excellent time to evaluate the flight operations internship program with United Airlines. The demographic results of this survey (characteristics of the interns, their then-current employment, etc.) are the topic of an article published in JAAER entitled "A Pioneering University-Airline Flight Internship Program: A Follow-Up Study of Intern Participants" (1998). The purpose of this article is to present the results of survey data that address intern perceptions of curricular preparation for the United internship, and intern perceptions of United internship components.

Methodology

Survey Participants

Survey participants included all 110 SIUC aviation program students who participated in the United-SIUC "long internship" through August 1997. All 110 participants are also alumni of Southern Illinois University.


Survey Instrument Design

The survey instrument was a mail-in questionnaire. The instrument was composed of six sections and designed to collect two types of information. First, it collected data related to the respondent's personal and professional characteristics. For example, types of Federal Aviation Administration (FAA) Aeronautical Certificates possessed, flight time, and level of education were among the types of data collected. Data concerning personal and professional characteristics of the respondents have already been reported by NewMyer, Ruiz, and Worrells (1998). The second type of data collected is attitudinal in nature. Using a Likert scale, data concerning attitudes toward the internship experience, classes taught at SIUC, and other relevant topics were collected. The Likert scale was used to allow respondents to indicate the extent to which they agreed or disagreed with a statement. The Likert scale was selected because of its simplicity and ease of use. Attitudes were assessed along a 5-point scale, with points ranging from 1 to 5. The scoring of statements was dependent upon the particular scale. For example, Section IV of the survey asks respondents to rate the helpfulness of aviation classes taught at SIUC. A high response (5) represents the highest degree of helpfulness, while a low response (1) represents the least helpfulness.

Research Design

The survey instrument was mailed to all 110 participants in the United-SIUC "long internship" program. The Department of Aviation Management and Flight, in conjunction with the SIUC Alumni Association, developed a comprehensive list of alumni addresses for graduates who participated in the United Airlines "long internship". Three mailings were sent to these 110 alumni, resulting in 78 responses, a return rate of 70.9%. A 70.9% response rate represents an acceptable sample. Miller and Schumaker discuss questionnaire follow-ups and the impact they have on response rates: "The initial mailing of the letter of transmittal, questionnaire, and stamped return-addressed envelope will usually result in a response rate of from forty to sixty percent; that is, forty to sixty percent of the sample will typically return the questionnaires. The first follow-up correspondence usually brings ten to twenty percent more returns, and a second follow-up will add another five to ten percent to the return rate. If researchers can obtain a total return of seventy percent or better, they are doing very well. In many studies the return rate is closer to fifty or sixty percent."

The survey questionnaire addressed four specific areas. The first three areas attempted to gauge respondent attitudes toward SIUC Aviation Flight-Aviation Management coursework and specific components of the short and long internship. The fourth area allowed the respondent to provide overall evaluative comments on the United-SIUC internship program:

1. What Aviation Flight-Aviation Management coursework was most helpful in preparing intern candidates for the flight operations internship?

2. How valuable was the "short internship" (specifically, components of the short internship) in developing the respondent's understanding of the airline industry and their development as an aviation professional?

3. How valuable was the "long internship" (again, components of the long internship) in developing the respondent's understanding of the airline industry and their development as an aviation professional?

4. A series of overall evaluation questions addressing the value of the short and long internship to each respondent, how well the respondents believed they were prepared by SIUC for the internship, and whether or not they would recommend the United Airlines internship to others. Room was also left for essay responses from respondents who had additional questions.

Definitions

During the implementation of the United Airlines-Southern Illinois University Carbondale flight operations internship, two important learning opportunities were developed for SIUC students:

A. The Short Internship. This was initially a two-week-long internship experience that took place at the United Airlines Flight Training Center in Denver, CO. This internship included a ground school on one of United's aircraft (usually the 767, since it was the only computerized, non-instructor ground school available in the late 1980's), presentations by key United personnel, tours, a group problem-solving activity, and interviews for the long internship. In the early 1990's, the short internship was reduced to a one-week experience with the ground school portion eliminated, but all of the other components mentioned above remained in the short internship. Ten SIUC students were allowed to participate in the short internship per semester. This was and still is an unpaid experience, but United pays for the participants to fly to Denver and back from either St. Louis or Chicago. Students participating in the short internship were selected solely by United in the first years of the UA-SIUC relationship, but in later years they were selected by SIUC alone. Selection criteria for the short internship are: a minimum overall grade point average of 2.75 on a 4.0 scale; Federal Aviation Administration Commercial Pilot Certificate with Instrument and Multiengine Ratings required and Certified Flight Instructor (Airplane) preferred; a relatively "clean" driving record (Driving Under the Influence or Driving While Intoxicated arrests were considered "knockout" factors); and the applicant must be an Aviation Management major at SIUC, having flown a minimum amount with the SIUC Aviation Flight program.

B. The Long Internship. The long internship refers to the semester-long experience that students attending the "short" internship (see above) interviewed for during the one or two week short internship. This internship lasts for a full semester rather than one or two weeks, thus the label "long" internship. Students attending the long internship were initially assigned to either the Denver Flight Training Center or to the United Airlines World Headquarters in Chicago. This was expanded to include any United Airlines Flight Operations domicile (pilot base). So far SIUC interns have served at Honolulu, Miami, Chicago, Denver, Los Angeles, Dulles (Washington, D.C.), and San Francisco. Interns are assigned projects appropriate to their work locations. During their internship they are flown to San Francisco for a tour of United's Maintenance Operations Center and also to Everett, Washington, for a tour of the Boeing plant. This is an unpaid position; however, successful long interns graduating from SIUC are given "guaranteed processing privileges" (i.e., a guaranteed flight officer interview with United).

Evaluation of SIUC Aviation Coursework

Respondents were asked to report "...what Aviation Flight/Aviation Management course work (see Appendix) was most helpful to you in preparing for the United-SIUC internship?" Response categories were: MH = most helpful, VH = very helpful, H = helpful, NVH = not very helpful, LH = least helpful, and N/A = not applicable. Based on the responses to these questions, it was determined that more than half of the 78 respondents took eleven SIUC aviation courses, or groups of courses (see Table 1).

Table 2 reports combined data for the "most helpful/very helpful" responses on each course and the "least helpful/not very helpful" responses on each course. Table 2 also reports these data as the percent of those taking each course, so that the reader can see how the respondents reported their evaluation within each course subject area as well as how the overall totals compare across all courses.

As indicated in Table 3, three courses or groups of courses were ranked by more than half of all respondents as being either "Most Helpful" or "Very Helpful". These courses included: Flight Training at SIUC, AVM 373-Airline Management, and AVM 385/ATS 332-Air Transport Labor Relations. Among the 36 respondents reporting that they have been hired by United, Flight Training at SIUC was ranked first as the "Most Helpful" and "Very Helpful" course or group of courses, with 29 of 36 respondents rating it in this fashion. Flight Training at SIUC was followed by AVM 385/ATS 332-Air Transport Labor Relations and AVM 373-Airline Management, tied for second with 23 respondents each, reflecting the "All Respondents" group result. When comparing the data in Table 2 and Table 3, it can be seen that, within the responses for each course, there is a slightly different ranking of courses between the two tables. The following ranking is arrived at by looking only at the total number of respondents who reported taking a specific course and then calculating, among only those who took that course, the percentage who ranked it as "Most Helpful/Very Helpful":

Course Title                                               Percent of respondents taking the course who ranked it MH/VH
1. AVM 373-Airline Management                              82.1%
2. Flight Training at SIUC                                 80.8%
3. AMT 205-Cabin Environment and Jet Transport Systems     65.2%
4. AMT 405-Flight Management Systems                       64.7%
5. AVM 385/ATS 332-Air Transport Labor Relations           64.5%
6. AVM 402/ATS 421-Aviation Industry Career Development    58.8%

Using this method of ranking, all other courses were ranked at 42.6% or lower (of those taking the course who rated it "Most Helpful/Very Helpful").
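This ranking can be reproduced directly from the counts reported in Table 2. The short sketch below (written in Python for illustration; it is not part of the original study, and the counts are simply transcribed from Table 2) divides each course's combined MH/VH count by the number of respondents who took that course and sorts the results.

    # Reproduce the "percent of those taking the course who ranked it MH/VH"
    # figures, using counts transcribed by hand from Table 2.
    table2 = {
        # course: (number taking, combined MH/VH responses)
        "AVM 373-Airline Management": (67, 55),
        "Flight Training at SIUC": (73, 59),
        "AMT 205-Cabin Environment and Jet Transport Systems": (23, 15),
        "AMT 405-Flight Management Systems": (17, 11),
        "AVM 385/ATS 332-Air Transport Labor Relations": (62, 40),
        "AVM 402/ATS 421-Aviation Industry Career Development": (34, 20),
    }

    ranking = sorted(
        ((mh_vh / taking * 100, course) for course, (taking, mh_vh) in table2.items()),
        reverse=True,
    )
    for pct, course in ranking:
        print(f"{pct:.1f}%  {course}")   # 82.1% AVM 373-Airline Management, 80.8% Flight Training at SIUC, ...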

At the opposite end of the scale are those courses identified as "Least Helpful" or "Not Very Helpful" by all respondents. As noted in Table 4, AVM 370-Airport Planning was identified as the "Least Helpful" or "Not Very Helpful" course by 24 respondents, followed by AVM 372-Airport Management, ATS 383-Data Interpretation, and ATS 364-Work Center Management in a three-way tie with 19 such responses. Among those hired by United, AVM 370-Airport Planning and AVM 372-Airport Management were tied as the "Least Helpful" or "Not Very Helpful" course with 13 responses each.

Short Internship Evaluation

Since the second year of the United Airlines-SIUC Flight Operations Internship Program, a "short" (one- to two-week) internship program has been offered by United Airlines to SIUC student participants. Respondents to the survey were asked to evaluate eleven items related to their short internship experience at United, including their overall short internship experience. They were provided a Likert-type scale with the following response categories for each item: Most Valuable, Very Valuable, Valuable, Somewhat Valuable, Not Valuable, and Not Applicable. As shown in Table 5, six items were rated "Most Valuable" or "Very Valuable" by 42 (53.8%) or more of all "short intern" evaluation respondents.
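The tallies in Table 5 (and, later, Table 6) follow from a simple collapsing rule: each item's "Most Valuable" and "Very Valuable" responses are combined into a single MV/VV count, and its "Not Valuable" and "Somewhat Valuable" responses into a single NV/SV count. The sketch below is illustrative only; the response list is an invented placeholder, not survey data.

    from collections import Counter

    # Hypothetical raw Likert responses for one internship item (placeholder data).
    responses = ["MV", "VV", "V", "SV", "MV", "NV", "VV", "N/A", "V", "MV"]

    counts = Counter(responses)
    mv_vv = counts["MV"] + counts["VV"]    # combined "Most Valuable/Very Valuable"
    nv_sv = counts["NV"] + counts["SV"]    # combined "Not Valuable/Somewhat Valuable"
    share = mv_vv / len(responses) * 100   # MV/VV share of the whole respondent group

    print(f"MV/VV = {mv_vv}, NV/SV = {nv_sv}, MV/VV share = {share:.1f}%")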

The following group of "Most Valuable/Very Valuable" short internship components varied slightly when ranked among the 36 respondents hired by United:

Respondents Hired by United

Interaction with United Personnel            72.2%
Experiencing the Airline Work Environment    69.4%
Presentation by United Personnel             69.4%
Interview for the "Long" Internship          66.7%
Short Internship Overall Experience          63.9%
Simulator Time                               52.7%


When the top group of items was ranked by the 42 respondents not hired by United, there were a couple of major differences in responses. First, the "Short Internship Overall Experience" fell from a 63.9% MV/VV rating by those hired to a 45.2% MV/VV rating by those not hired. Also, the "767 (or other) Ground School" dropped from 44.4% MV/VV for those hired to 28.6% MV/VV for those not hired. The "Group Project with other Interns" was conversely ranked much higher by those not hired (a 52.4% MV/VV rating) than by those hired (38.9%). Similarly, "Tours of TK" were rated fairly high, at 50.0%, by those not hired and at 36.1% by those hired.

Overall, only two items were given ten or more total "Somewhat Valuable/Not Valuable" responses by all respondents who evaluated the "short" internship:

Group Project with other Interns    12
Tours of DEN/DIA                    14

Evaluation of the Long Internship

In evaluating the "long" internship, respondents were again given a range of attitudinal questions with a Likert-type scale for response. The response options were the same as those provided for the short internship evaluation questions: Most Valuable (MV), Very Valuable (VV), Valuable (V), Somewhat Valuable (SV), Not Valuable (NV), and Not Applicable (N/A). Table 6 provides a combined summary of the "Most Valuable/Very Valuable" responses and the "Not Valuable/Somewhat Valuable" responses.

The overall respondent group identified a very clear "Top 5" list of items in their responses, as all five items received over 80% respondent support:


Component                                                      Percentage of Respondents
1. Your internship experience                                  88.5%
2. Observer Member of Crew (OMC) Privileges*                   87.2%
3. "Long" Internship Overall Experience*                       87.2%
4. Interaction with United Personnel (other than supervisor)   85.9%
5. Your assigned work location at United                       80.8%

*Indicates a tie in ranking

Among those 36 respondents hired by United, this group of items stayed essentially the same, but in a slightly different order:

1. Your internship experience                                  86.1%
2. Interaction with United Personnel (other than supervisor)*  80.6%
3. "Long" Internship Overall Experience*                       80.6%
4. Your assigned work location at United*                      77.8%
5. Observer Member of Crew (OMC) Privileges*                   77.8%

*Indicates a tie in ranking

Among the 42 respondents not hired by United, OMC privileges jumped to the head of the list, with the top five items again remaining the same but in yet another order of respondent preference:

1. OMC Privileges                                              95.2%
2. "Long" Internship Overall Experience                        92.9%
3. Interaction with United Personnel (other than supervisor)*  90.5%
4. Your internship experience*                                 90.5%
5. Your assigned work location at United                       83.3%

*Indicates a tie in ranking

Respondents were also asked, via an open-ended question, to rank the three "Most Valuable" and three "Least Valuable" components of the "long" internship. Table 7 reports the combined "Most Valuable" responses, which indicate that "Interaction with United Personnel" received the most combined first, second, and third "Most Valuable" responses, followed by OMC privileges. These two items received approximately twice as many responses as the three items tied in third place.

As for the internship items receiving the most "Least Valuable" responses (see Table 8), leading the pack was a combined "Tours" (MOC, Boeing, etc.) response. While most respondents enjoyed these tours, they reported that they did not believe the tours were as valuable as other internship components. Also receiving significant numbers of responses were airline pass privileges (riding as a passenger, not to be confused with OMC privileges, which occur on the flight deck) and the intern project.

Overall Evaluation

At the end of the survey instrument, respondents were asked to respond to one overall evaluative question each about the "short" internship, the "long" internship, and the SIUC aviation curriculum.

Regarding the "short" internship, respondents were asked: "What impact has the 'short' internship experience had in achieving your aviation career goals?" As listed in Table 9, the response options given for this question were: Great Impact, Significant, Some, Little, No Impact, and N/A (not applicable). The "Hired by United" group is, for good reason, quite positive in its evaluation of the "short" internship, with 75% of those respondents saying that the "short" internship had either "Great Impact" or "Significant Impact" on achieving their career goals. This differs substantially from the "Not Hired" by United respondent group, which reported 38.1% of responses in the "Great Impact" or "Significant Impact" categories. Also, 23.8% of the "Not Hired" respondents said that the "short" internship had "Little" or "No Impact," while none of the "Hired" respondents responded in these two categories.

Regarding the "long" internship, respondents were asked: "What impact has the 'long' internship experience had in achieving your aviation career goals?" Again, respondents were given a Likert-type scale of six possible responses: Great Impact, Significant, Some, Little, No Impact, and N/A (not applicable). As shown in Table 10, the "Hired by United" group of respondents was clear in its response: nearly 92% said that the "long" internship had "Great Impact" on their career goal achievement. This compares to only 38.1% of the "Not Hired" group of respondents who answered "Great Impact". The two groups combined indicated 62.8% "Great Impact". At the other end of the scale, five respondents (6.4% of all respondents) indicated that the "long" internship had "No Impact" on their career goal achievement.
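The percentages in this comparison follow directly from the counts in Table 10. As a check, the sketch below (counts transcribed from Table 10; not part of the original analysis) recomputes the "Great Impact" figures for the hired, not-hired, and combined groups.

    # Recompute the "Great Impact" percentages for the long internship from
    # counts transcribed from Table 10: (Great Impact responses, group size).
    great_impact = {"Not Hired": (16, 42), "Hired": (33, 36)}

    for group, (count, total) in great_impact.items():
        print(f"{group}: {count / total * 100:.1f}%")                 # 38.1% and 91.7%

    combined_count = sum(count for count, _ in great_impact.values())
    combined_total = sum(total for _, total in great_impact.values())
    print(f"Combined: {combined_count / combined_total * 100:.1f}%")  # 62.8%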

With regard to an overall evaluation of aviation department coursework at SIUC (see Table 11), respondents were asked to answer the following question using a Likert-type scale: "How well did the coursework in the aviation department at SIUC prepare you for your internship?" A total of 78.5% of the "Not Hired" by United group responded that the coursework prepared them "Very Well" or "Well," while only 72.2% of the "Hired" group responded in this manner.

Overall, a total of 4 respondents (3 in the "Hired" group and 1 in the "Not Hired" group), or 5.1% of the overall respondent group, answered "Not Much". There were no responses in the "Did Not" category.

A final structured question asked, "Would you recommend a United Airlines internship to students seeking an aviation career?" A total of 75 respondents (96.2%) said that they "Would Recommend" the internship, while 2 said that they would not. One respondent did not respond to this particular question.

CONCLUSIONS

The following conclusions can be derived about the respondents' perceptions of curricular preparation for a major airline flight internship program, as well as their overall views about the internship program and its components:

1. The "Most Helpful/Very Helpful" (MH/VH) coursework in preparing intern candidates for the flight operations internship at United is (when looking only at the total number of respondents who reported taking a specific course and then calculating the percentage of those who took that course and marked it as MH/VH):

A. AVM 373-Airline Management
B. Flight Training at SIUC
C. AMT 205-Cabin Environment and Jet Transport Systems
D. AMT 405-Flight Management Systems
E. AVM 385/ATS 332-Air Transport Labor Relations
F. AVM 402/ATS 421-Aviation Industry Career Development

Also, when asked "How well did the coursework in the aviation department at SIUC prepare you for your internship?", a total of 75.7% of survey respondents indicated "Very Well" or "Well," with another 17.9% responding "Moderately".

2. When asked to evaluate the "short" internship, six "short" internship components were rated as Most Valuable/Very Valuable by a majority of all respondents:

A. Experiencing the airline work environment;
B. Interaction with United personnel;
C. Interview for the "long" internship;
D. Presentation by United personnel;
E. Short internship overall experience; and
F. Simulator time.


When the respondents were asked, "What impact has the 'short' internship experience had in achieving your aviation career goals?", 55.1% of all respondents said that it had had either "Great Impact" or "Significant Impact" on achieving their career goals. This percentage went up to 75.0% for the subset of all respondents who were hired by United.

3. When asked what impact the "long" internship had had in achieving their career goals, 80.7% of all respondents reported that it had had "Great Impact" or "Significant Impact". This percentage went up to 91.7% for the "Hired by United" subset of respondents. When asked to evaluate components of the "long" internship, the following five items were ranked "Most Valuable" or "Very Valuable" by 80% or more of all respondents:

A. Your internship experience;
B. Observer Member of Crew (OMC) privileges;
C. "Long" internship overall experience;
D. Interaction with United personnel; and
E. Your assigned work location with United.

4. In some areas of the survey there were significant differences in the responses between the "Hired by United" group of respondents and the "Not Hired by United" group. For example, in rating the impact of the short internship, 75% of the "Hired" group reported it as being of "great" or "significant" impact, while only 38.1% of the "Not Hired" group responded in this manner. Similarly, 91.7% of the "Hired" group said that the long internship had "great impact" on their career goal achievement; only 38.1% of the "Not Hired" group reported in this way. A positive rating of the United Airlines internship program thus appears to be closely associated with having been hired by United.

5. When asked, "Would you recommend a United Airlines internship to students seeking an aviation career?", 96.2% of respondents said that they "Would Recommend" the internship.

RECOMMENDATIONS

Research for this paper did not discover any comprehensive study on the success of an airline's flight operations internship program or on the curricula used in preparing students for such internships. One recommendation to airlines that conduct these types of internship programs is to establish a follow-up mechanism to determine to what degree these programs are successful in meeting the goals of both sets of internship participants: the airline and the students. A second recommendation is to include curricular preparation questions in such a follow-up study to establish the correlation between curricular preparation and success in the internship program. The final recommendation is that United Airlines, having been in the flight operations internship business for over ten years, follow up with all of its flight operations interns from each of the participating universities. Such research would provide invaluable insight into the effect of various aviation-related curricula, from a variety of institutions, on the success of flight operations internship programs, specifically for United Airlines.
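The second recommendation implies a straightforward analysis that such a follow-up study could run: correlate a curricular-preparation measure with an internship-outcome measure. The sketch below is purely illustrative; the data are invented placeholders, and a Pearson (point-biserial) coefficient is only one of several statistics a study of this kind might report.

    import math

    # Hypothetical follow-up data (placeholders, not results from this survey):
    # preparation = self-rated curricular preparation (1-5); outcome = hired (1) or not hired (0).
    preparation = [5, 4, 3, 5, 2, 4, 3, 5, 1, 4]
    outcome     = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
        sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
        return cov / (sd_x * sd_y)

    print(f"r = {pearson_r(preparation, outcome):.2f}")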




APPENDIX

Descriptions of SIUC Aviation Management courses cited in this study (SIUC Undergraduate Catalog, 1997):

ATS 364/Work Center Management. A study of the problems of managing a small working unit (division, department, work center, section, etc.) within a larger unit (agency, company, regional office, etc.). Included items will be work center goals identification, staffing needs, monitoring of work process reporting, work center communications, and interpersonal relations within the work center. (p. 105)

ATS 416/Applications of Technical Information. This course is designed to increase student competence in analyzing and utilizing the various types of technical information encountered by managers in technical fields. (p. 105)

ATS 383/Data Interpretation. A course designed for students beginning their major program of study to examine data use in their respective professions. Emphasis will be placed upon an understanding of the basic principles and techniques involved with analysis, synthesis, and utilization of data. (p. 105)

AMT 205/Cabin Environment and Jet Transport Systems. Students will understand the atmospheric variables at different altitudes and the basic equipment required to cope with malfunctions in the cabin pressurization and air-conditioning systems. Using the available information, jet transport aircraft, and simulated training panels, they will understand the operation of and be able to identify the components of flight control systems, landing gear, fuel, anti-icing, and fire detection systems. They will be able to compare and analyze aircraft systems of current jet transport aircraft and to diagnose and resolve malfunction problems. They will have knowledge of procedures for aircraft ground handling, APU operation, and system servicing. (p. 156)

AMT 405/Flight Management Systems. Using industry-type computer instruction and flight simulation trainers, the course will develop the knowledge for operation and management of autopilots, autothrottles, inertial reference systems, electronic instrument systems, and flight management computers on advanced technology aircraft, such as the Boeing 737-400 and 747-400 and the Douglas MD-81 and MD-11. (p. 157)

AVM 319/Aviation Occupational Internships. Each student will be assigned to a departmentally approved work site engaged in activities related to the student's academic program and career objectives. The student will be assigned to an unpaid internship position and will perform duties and services in an instructional setting as previously arranged with the sponsoring work site supervisor. (p. 159)

AVM 360/The Air Traffic Control System, Procedures and Rules. This course provides instruction in basic air traffic control procedures and phraseology used by personnel providing air traffic control services. Students will become familiar with the Federal Aviation Administration handbook and federal aviation regulations that pertain to the operational responsibilities of an air traffic controller. (p. 159)

AVM 370/Airport Planning. To acquaint the student with the basic concepts of airport planning and construction, as well as an investigation of various regulatory agencies in the industry and their functions. (p. 159)

AVM 371/Aviation Industry Regulations. A study of various regulatory agencies of the industry and their functions. (p. 159)

AVM 372/Airport Management. A study of the operation of an airport devoted to the phases of lighting, fuel systems, field marking, field buildings, hangars, and surrounding community. (p. 159)

AVM 373/Airline Management. A study of the administrative aspects of airline operation and management, including a detailed study of airline organizational structure. (p. 159)

AVM 374/General Aviation Operations. A study of general aviation operations including fixed base operations (fuel, sales, flight training, charter, etc.), corporate aviation (business aviation, corporate flight departments, executive air fleets, etc.), and the general aviation aircraft manufacturing industry. (p. 159)

AVM 375/Legal Aspects of Aviation. The student will develop an awareness of air transportation. The course will emphasize basic law as it relates to contracts, personnel, liabilities, and legal authority of governmental units and agencies. (p. 159)

AVM 376/Aviation Maintenance Management. To familiarize the student with the functions and responsibilities of the aviation maintenance manager. Maintenance management at the fixed base operator, commuter/regional airline, and national carrier levels will be studied. (p. 159)

AVM 377/Aviation Safety Management. This course will survey the various aspects of aviation flight and ground safety management. Weather, air traffic control, mechanical, and human factors in aviation safety management will be reviewed. (p. 159)

AVM 385/Air Transport Labor Relations. Students will gain a general understanding of the economic situation of which labor-management problems represent a subset. They will develop a perspective on the evolution of labor relations in the United States economy and on how the interaction of labor and management differs throughout the world. The collective bargaining section introduces the student to the techniques of bargaining used by labor and management in their ongoing interactions. (p. 159)

AVM 386/Fiscal Aspects of Aviation Management. An introduction to the fiscal problems encountered in the administration of aviation facilities and airline operations. (p. 159)

AVM 402/Aviation Industry Career Development. Introduces students to the various elements involved in obtaining a position in their chosen career field. Topics included are: personal inventories, placement services, employment agencies, interviewing techniques, resumes, letters of application, references, and employment tests. Each student will develop a portfolio including personal and professional information related to individual career goals. (p. 160)

AVM 460/National Airspace System. This course provides instruction on the national airspace system, its purpose, and its major components. It defines the Federal Aviation Administration's role in the operation, maintenance, and planning of the national airspace system. (p. 160)


Table 1
Courses Taken in the Aviation Major

Course Title (n = 78)                                      Number Taking      %
1. Flight Training at SIUC                                       73         93.6
2. AVM 371-Aviation Industry Regulation                          70         89.7
3. AVM 373-Airline Management                                    67         85.9
4. AVM 385/ATS 332-Air Transport Labor Relations                 62         79.5
5. AVM 370-Airport Planning                                      61         78.2
6. AVM 372-Airport Management                                    60         76.9
7. ATS 364-Work Center Management                                57         73.1
8. AVM 377-Aviation Safety Management                            56         71.8
9. AVM 360-Air Traffic Control                                   55         70.5
10. ATS 416-Applications of Technical Information                54         69.0
11. AVM 386-Fiscal Aspects of Aviation Management                47         60.3
12. AVM 402/ATS 421-Aviation Industry Career Devel.              34         43.6
13. ATS 383-Data Interpretation                                  32         41.0
14. AVM 460-National Airspace System                             28         35.9
15. AMT 205-Cabin Environment and Jet Transport Systems          23         29.5
16. AMT 405-Flight Management Systems                            17         21.8

Note. Courses are listed ordinally by the total number of respondents taking each course. See Appendix for course descriptions.


Table 2
Combined Numbers of "Most Helpful/Very Helpful" and "Least Helpful/Not Very Helpful" Responses by Course

Course Title (n = 78)      Total     MH/VH      %      LH/NVH      %
Flight Training              73        59      80.8       2       2.7
AVM 371                      70        25      35.7       8      11.4
AVM 373                      67        55      82.1       3       4.5
AVM 385/ATS 332*             62        40      64.5       3       4.8
AVM 370                      61         9      14.8      24      39.3
AVM 372                      60        15      25.0      19      31.7
ATS 364                      57         8      14.0      19      33.3
AVM 377                      56        20      35.7       7      12.5
AVM 360                      55        19      34.5      13      23.6
ATS 416                      54        11      20.4      12      22.2
AVM 386                      47        20      42.6      10      21.3
AVM 402/ATS 421*             34        20      58.8       2       5.9
ATS 383                      32         3       9.4      19      59.4
AVM 460                      28         7      25.0       5      17.9
AMT 205                      23        15      65.2       4      17.4
AMT 405                      17        11      64.7       1       5.9

Note. Course titles are listed ordinally by the total number of respondents taking each course. MH/VH = Most Helpful/Very Helpful (combined responses); LH/NVH = Least Helpful/Not Very Helpful (combined responses). *ATS 421 converted to AVM 402 in 1996; ATS 332 converted to AVM 385 in 1996. See Appendix for course descriptions.


Table 3
Top 5 Courses Ranked as "Most Helpful" (MH) and "Very Helpful" (VH) by Respondent Group

All Respondents (n = 78)
Course Title                                  MH    VH    Total
Flight Training at SIUC                       32    27     59
Airline Management                            28    27     55
Air Transport Labor Relations                 12    28     40
Aviation Industry Regulations                  4    21     25
Aviation Safety Management*                    5    15     20
Fiscal Aspects of Aviation Management*         6    14     20
Aviation Industry Career Development*         13     7     20

Hired by United (n = 36)
Flight Training at SIUC                       14    15     29
Air Transport Labor Relations*                 8    15     23
Airline Management*                           12    11     23
Aviation Industry Regulations                  4    12     16
Aviation Safety Management                     5     7     12

Not Hired by United (n = 42)
Airline Management                            16    16     32
Flight Training at SIUC                       18    12     30
Air Transport Labor Relations                  4    13     17
Fiscal Aspects of Aviation Management          4     9     13
Aviation Industry Career Development           7     5     12

Note. *Indicates a tie in ranking. See Appendix for course descriptions.


Table 4
Bottom 5 Courses Ranked as "Least Helpful" (LH) and "Not Very Helpful" (NVH) by Respondent Group

All Respondents (n = 78)
Course Title                                  LH    NVH   Total
Airport Planning                               8    16     24
Airport Management*                            5    14     19
Data Interpretation*                           7    12     19
Work Center Management*                        8    11     19
Air Traffic Control                            5     8     13

Hired by United (n = 36)
Airport Planning*                              3    10     13
Airport Management*                            3    10     13
Air Traffic Control                            2     5      7
Fiscal Aspects of Aviation Management*         1     6      7
Data Interpretation*                           0     4      4
Work Center Management*                        4     0      4

Not Hired by United (n = 42)
Data Interpretation*                           7     8     15
Work Center Management*                        4    11     15
Airport Planning                               5     6     11
Applications of Technical Information          4     6     10
Air Traffic Control*                           3     3      6
Airport Management*                            2     4      6

Note. *Indicates a tie in ranking. See Appendix for course descriptions.


Table 5
Evaluation of "Short" Internship Experience

Overall Respondent Group (n = 78)
Internship Component                          MV/VV   NV/SV
Experiencing the airline work environment       61      0
Interaction with United personnel               59      1
Interview for the "long" internship             52      3
Presentation by United personnel                50      3
"Short" internship overall experience*          42      4
Simulator time*                                 42      9
Group project with other interns                36     12
Tours of TK                                     34      7
Tours of EXO/WHQ                                31      5
767 (or other) ground school                    28      4
Tours of DEN/DIA                                24     14

Hired by United (n = 36)
Interaction with United personnel               26      0
Experiencing the airline work environment*      25      0
Presentation by United personnel*               25      1
Interview for the "long" internship             24      2
"Short" internship overall experience           23      0
Simulator time                                  19      7
767 (or other) ground school                    16      3
Group project with other interns                14      5
Tours of TK                                     13      5
Tours of EXO/WHQ                                12      2
Tours of DEN/DIA                                 9      6

Not Hired by United (n = 42)
Experiencing the airline work environment       36      1
Interaction with United personnel               33      1
Interview for the "long" internship             28      1
Presentation by United personnel                25      2
Simulator time                                  23      2
Group project with other interns                22      7
Tours of TK                                     21      2
Tours of EXO/WHQ*                               19      3
"Short" internship overall experience*          19      4
Tours of DEN/DIA                                15      8
767 (or other) ground school                    12      1

Note. Items are listed in order of their "Most Valuable/Very Valuable" responses. *Indicates a tie in ranking (MV/VV). MV/VV = Most Valuable/Very Valuable; NV/SV = Not Valuable/Somewhat Valuable.

Table 6
Evaluation of "Long" Internship Experience

Overall Respondent Group (n = 78)
Internship Component                                         MV/VV   NV/SV
Your internship experience                                     69      4
Observer Member of Crew privileges*                            68      2
"Long" internship overall experience*                          68      2
Interaction with United personnel (other than supervisor)      67      2
Your assigned work location at United                          63      5
Your internship supervisor                                     54     10
Simulator time                                                 47      7
Interaction with other interns*                                44      6
Tours of MOC at SFO*                                           44      4
Tours of the Boeing plant                                      42      4
Specific intern project                                        40     15
Airline pass privileges                                        28      9
Visits to TK (if assigned elsewhere)                           26      3

Hired by United (n = 36)
Your internship experience                                     31      1
Interaction with United personnel (other than supervisor)*     29      1
"Long" internship overall experience*                          29      0
Your assigned work location at United*                         28      2
Observer Member of Crew privileges*                            28      2
Your internship supervisor                                     27      3
Simulator time                                                 21      4
Tours of MOC at SFO                                            20      1
Tours of the Boeing plant                                      19      3
Interaction with other interns*                                18      1
Specific intern project*                                       18      1
Airline pass privileges                                        10      3
Visits to TK (if assigned elsewhere)                            9      2

Not Hired by United (n = 42)
Observer Member of Crew privileges                             40      0
"Long" internship overall experience                           39      2
Interaction with United personnel*                             38      1
Your internship experience*                                    38      3
Your assigned work location at United                          35      3
Your internship supervisor                                     27      7
Simulator time*                                                26      3
Interaction with other interns*                                26      5
Tours of MOC at SFO                                            24      3
Tours of the Boeing plant                                      23      1
Specific intern project                                        22     10
Airline pass privileges                                        18      6
Visits to TK (if assigned elsewhere)                           17      1

Note. Items are listed in order of their combined "Most Valuable/Very Valuable" responses. *Indicates a tie in ranking.

Table 7
Respondent Ranking of Most Valuable "Long" Internship Components

Component                                    Combined First, Second and Third Responses
Interaction with United Personnel                            46
OMC Privileges                                               40
Overall Experience*                                          23
Assigned Work Location*                                      23
Simulator Time/Experience*                                   23
Relationship with Internship Supervisor                      18
Your Intern Experience                                       17

Note. All other components received fewer than ten responses. *Indicates a tie in ranking.


Table 8
Respondent Ranking of Least Valuable "Long" Internship Components

Component                                    Combined First, Second and Third Responses
Tours (Boeing, MOC, etc.)                                    37
Airline Pass Privileges                                      28
Intern Project                                               21
Interaction with other Interns                               16
Simulator Time/Experience                                    13
Visits to TK                                                 11
Relationship with Supervisor                                 10

Note. All other components received fewer than ten responses.

Table 9
Responses to the Question: "What impact has the 'short' internship experience had in achieving your aviation career goals?"

                 Not Hired (n = 42)    Hired (n = 36)    Totals (n = 78)
Response            Number / %           Number / %         Number / %
Great Impact          5 / 11.9            15 / 41.7          20 / 25.6
Significant          11 / 26.2            12 / 33.3          23 / 29.5
Some                 10 / 23.8             2 /  5.6          12 / 15.4
Little                7 / 16.7             0 /  0.0           7 /  9.0
No Impact             3 /  7.1             0 /  0.0           3 /  3.8
N/A                   5 / 11.9             7 / 19.4          12 / 15.4
No Response           1 /  2.4             0 /  0.0           1 /  1.3
Total                42 / 100%            36 / 100%          78 / 100%


Table 10
Responses to the Question: "What impact has the 'long' internship had in achieving your aviation career goals?"

                 Not Hired (n = 42)    Hired (n = 36)    Totals (n = 78)
Response            Number / %           Number / %         Number / %
Great Impact         16 / 38.1            33 / 91.7          49 / 62.8
Significant          14 / 33.3             0 /  0.0          14 / 17.9
Some                  8 / 19.1             0 /  0.0           8 / 10.3
Little                0 /  0.0             0 /  0.0           0 /  0.0
No Impact             3 /  7.1             2 /  5.5           5 /  6.4
N/A                   0 /  0.0             1 /  2.8           1 /  1.3
No Response           1 /  2.4             0 /  0.0           1 /  1.3
Totals               42 / 100%            36 / 100%          78 / 100%

Table 11
Responses to the Question: "How well did the coursework in the aviation department at SIUC prepare you for your internship?"

                 Not Hired (n = 42)    Hired (n = 36)    Totals (n = 78)
Response            Number / %           Number / %         Number / %
Very Well            10 / 23.8             9 / 25.0          19 / 24.4
Well                 23 / 54.7            17 / 47.2          40 / 51.3
Moderately            7 / 16.7             7 / 19.5          14 / 17.9
Not Much              1 /  2.4             3 /  8.3           4 /  5.1
Did Not               0 /  0.0             0 /  0.0           0 /  0.0
N/A                   0 /  0.0             0 /  0.0           0 /  0.0
No Response           1 /  2.4             0 /  0.0           1 /  1.3
Totals               42 / 100%            36 / 100%          78 / 100%



Page 109: Reproductions supplied by EDRS are the best that can be madeBenchmarking Airline Operational Performance in Highly Regulated Environments" (Brent D. Bowen, Dean Headley, Karisa D

EIZI d pilot111 - .* 1.0 in itiv. uttcoC 4GVV

U.S. Department of EducationOffice of Educations, Research and Improvement (OERI)

National Library of Education (Nis)Educational Resources Information Center (ERIC)

REPRODUCTION RELEASE(Specific Document)

I. DOCUMENT. IDENTIFICATION:

f. Ue/ UL

Ci,r-Oic-1 Or-

ERIC1

Title;

Author(s):

C-CDLL.Q61-)-T-6 J,

Corpor.ate Source: Publication Date:

II. REPRODUCTION RELEASE:

In order to disseminate as widely as possible timely and significant materials of Interest to the educational community, documents announced in themonthly abstract journal of the ERIC eyetem. Resources in Education (RIE), ere usually made available to users In microfiche, reproduced paper copy,and electronic media, and sold through the ERIC Document Reproduction Service (EDRS). Credit Is given to the source of each document, and, ifreproduction releatea is granted, one of thelonowing notices Is affixed to the document.

if permission is granted to reproduce and disseminate the identified document, please CHECK ONE of the following three options and sign at the bottomof the page.

The sample Middy shwa) below wffi be The sample Wicker shown below will beaffixed Is all Level I documents Axed to NI Level 2A deamteNe

PERMISSION TO REPRODUCE ANDDISSEMINATE INS MATERIAL HAS

be8N GRANTED BY

nPERmISSiON TO REPRODUCE ANDUISSEMINATE: MATERIAL IN

MICROFICHE., WI IN ELECTRONIC MEDIAFOR ERIC COLLECTION SUBSCRIBERS ONLY,

HAS BEEN ($ urge) BY

The sample !Donor *own below will beaffixed W ell Level 2B dooWnerde

PERMISSION TO REPRODUCE ANDDISSEMINATE THIS MATERIAL IN

MICROFICHE ONLY HAS BEEN GRANTED BY

AZA

5 71

ID. THE EDUCATIONAL RESOURCES

_TO 'HE ':=DUCAT1CNAL RESOURCES TO THE EDUCATIONAL RESOURCES

INFORMATION CENTER (ERIC) INFORMATION CENTER (CRIC) INFORMATION CENTER (ERIC)

12A 28Level I novel 2A Level 28

Coed hags for level I release, pownIttinp reproductionand diaarardnallon In microfiche es other ERIC mei.]

mime (ec.. stecooluc) endpaper copy.

Signhere,4please

1

Chock hers for Level 2A Noon. PennUtfril felNegbagenend dlaaembiathan In mil:rack* and In electronic roadie

for ERIC anatival collection euesolbees only

Chock here for Leval 2B release, PenniNnenaproduchon aid dissemination In microllohe only

tWorrnmea MD be procerwed ea indicated provided repro dudhan quality permits.It perm Wen to roproduzo is grunted. but fie box la checked. 004anonts will be procrweed el Level 1.

I hereby grant to the Educational Resources intimation Center (ERIC) nonexclusive permission to reproduce and disseminate this documentas indicated allow. Reproduction horn the ERIC microfiche or electronic made by persons other than ERIC employees and its systemcontractors requires permissbn gym the copyright holder. Exception is made for no .profit reproduction by libraneS end Other sendce agenciaato satisfy information needs of educators In response to discrete Inquiries.

SIOnelleir.

Ornerifetion/Addreer Lati,"wegly of HE' ext Owat-e

LINO Avicifion Ivic4;44te, nki44 b°415,tna 14t- la ma

EO/ad TGLE VSS E017

PnnigidAain4PAlit7-,c4iai16"eri*-cs-Li- :27772 40'4-554- 3711

Deo IC{ 00000E-6T-Ndbi

Emil Aadieas:csvvrotNICe/J.rin

1SN I NAL oNn