FLOW Evaluation Design Technical Report (depts.washington.edu/trac/bulkdisk/pdf/466.2.pdf)


Technical Report
Research Project T9903, Task 62
FLOW Evaluation

FLOW EVALUATION DESIGN TECHNICAL REPORT

by

John M. Ishimaru, Senior Research Engineer
Mark E. Hallenbeck, Director

Washington State Transportation Center (TRAC)
University of Washington, Box 354802
University District Building
1107 NE 45th Street, Suite 535
Seattle, Washington 98105-4631

Washington State Department of Transportation Technical Monitor:
Leslie Jacobson, Assistant Regional Administrator for Traffic Systems

Prepared for
Washington State Transportation Commission
Department of Transportation
and in cooperation with
U.S. Department of Transportation
Federal Highway Administration

March 1999 (revised May 1999)


TECHNICAL REPORT STANDARD TITLE PAGE

1. REPORT NO.: WA-RD 466.2
2. GOVERNMENT ACCESSION NO.:
3. RECIPIENT'S CATALOG NO.:
4. TITLE AND SUBTITLE: FLOW Evaluation Design Technical Report
5. REPORT DATE: March 1999
6. PERFORMING ORGANIZATION CODE:
7. AUTHOR(S): John M. Ishimaru and Mark E. Hallenbeck
8. PERFORMING ORGANIZATION REPORT NO.:
9. PERFORMING ORGANIZATION NAME AND ADDRESS: Washington State Transportation Center (TRAC), University of Washington, Box 354802; University District Building, 1107 NE 45th Street, Suite 535; Seattle, Washington 98105-4631
10. WORK UNIT NO.:
11. CONTRACT OR GRANT NO.: Agreement T9903, Task 62
12. SPONSORING AGENCY NAME AND ADDRESS: Washington State Department of Transportation, Transportation Building, MS 7370; Olympia, Washington 98504-7370
13. TYPE OF REPORT AND PERIOD COVERED: Technical report
14. SPONSORING AGENCY CODE:
15. SUPPLEMENTARY NOTES: This study was conducted in cooperation with the U.S. Department of Transportation, Federal Highway Administration.
16. ABSTRACT:

This report describes an evaluation approach, process, and analytical tool set that were developed to analyze freeway usage and performance in the central Puget Sound region. It also functions as a user’s guide, describing the range of performance measures used and the methods and tools available to estimate them, along with step-by-step instructions. This report also serves the important purpose of documenting the analytical assumptions and limitations of the evaluation method.

This report is one of three products of a Washington State Department of Transportation (WSDOT) project to enhance the department’s ability to monitor and improve its traffic management efforts on Seattle-area highways, and to provide useful information to the public and decision makers about the status of the freeway system’s operational performance. This report is primarily intended to meet the first of these objectives. In addition to this report, this project produced a set of software tools to assist in freeway data analysis, as well as an interim report documenting the level of traveler usage and travel performance on the principal urban freeways in the central Puget Sound area for 1997. The freeways studied in this project are managed by WSDOT using its FLOW system, a coordinated network of traffic monitoring, measuring, information dissemination, and control devices that operates on urban state and Interstate highways in the central Puget Sound region.

The data analysis procedures described in this report are intended to facilitate a series of periodic evaluations of the central Puget Sound urban highway network and the WSDOT FLOW system. They were also designed to have general capabilities, so that they can be employed at any freeway location of interest, provided that the appropriate data have been collected. This evaluation process focuses on mainline (general purpose [GP] and high-occupancy vehicle [HOV]) performance measures; plans call for eventual expansion of the process to include other aspects of the WSDOT FLOW system.

17. KEY WORDS: Archived Data User Services (ADUS), congestion monitoring, freeway performance
18. DISTRIBUTION STATEMENT: No restrictions. This document is available to the public through the National Technical Information Service, Springfield, VA 22161.
19. SECURITY CLASSIF. (of this report): None
20. SECURITY CLASSIF. (of this page): None
21. NO. OF PAGES: 90
22. PRICE:


Disclaimer

The contents of this report reflect the views of the authors, who are responsible for the facts and the accuracy of the data presented herein. The contents do not necessarily reflect the official views or policies of the Washington State Transportation Commission, Department of Transportation, or the Federal Highway Administration. This report does not constitute a standard, specification, or regulation.


Table of Contents

Section

List of Figures and Tables ............................................................................................................iii

Purpose of this Report.................................................................................................................. 1

1. Introduction ............................................................................................................................ 3

2. Evaluation Methodology.......................................................................................................... 4

3. User’s Guide: FLOW Evaluation Analysis Procedures and Tools.............................................. 10

4. Using the FLOW Evaluation Analysis Process......................................................................... 36

Example A: Producing the Central Puget Sound Freeway Usage and Performance Report..... 36

Example B: Producing Data for the WSDOT Ramp and Roadway Traffic Volume Report ...... 45

Example C: System Operations Diagnostics........................................................................ 46

5. Analytical Assumptions and Limitations .................................................................................. 54

Appendix: CDR User’s Guide


List of Figures and Tables

Figures

1. CDR User Interface ................................................................................................................. 13
2. CDR Output ............................................................................................................................ 14
3. CDR Output, continued ........................................................................................................... 15
4. Estimated Weekday Volume, Speed, and Reliability Profiles ................................................... 17
5. Corridor Traffic Profile ............................................................................................................ 19
6. Estimated Average Weekday Travel Time ............................................................................... 20
7. Sensor Data Quality By Location and Time of Year ................................................................. 48
8. Estimated Weekday Volume Profile: GP and HOV Lanes ....................................................... 50
9. Per-Lane Volume Profile: GP and HOV Lanes ........................................................................ 51
10. Comparison of CDR Analyst and Ramp and Roadway Report Estimates ............................... 53

Tables

1. Reference Set of Measurement Locations ................................................................................ 37
2. Potential Routes for Travel Time Estimation and Monitoring .................................................. 40
3. Example of Supporting Data Location Table for Reference Measurement Sites ....................... 44


FLOW Evaluation Design Technical Report v2 1

Purpose of This Report

This report describes an evaluation approach, process, and analytical tool set that were developed for the analysis of freeway usage and performance in the central Puget Sound region. It also functions as a user’s guide, describing the range of performance measures and the methods and tools available to estimate them, along with step-by-step instructions. This report also serves the important purpose of documenting the analytical assumptions and limitations of the evaluation method; these considerations should be carefully taken into account by prospective users.

This report is one of three products of a Washington State Department of Transportation (WSDOT) project to enhance the department’s ability to monitor and improve its traffic management efforts on Seattle-area highways, and to provide useful information to the public and other decision makers about the status of the freeway system and state traffic management activities in the area. In addition to this report, this project has produced a set of software tools to assist in freeway data analysis, as well as an interim report documenting the level of traveler usage and travel performance on the principal urban freeways in the central Puget Sound area for 1997. The freeways studied in this project are managed by the Washington State Department of Transportation using its FLOW system, a coordinated network of traffic monitoring, measuring, information dissemination, and control devices that operates on urban state and Interstate highways in the central Puget Sound region.

The data analysis procedures described in this report are intended to facilitate a series of periodic evaluations of the central Puget Sound urban highway network and the WSDOT FLOW system. They were also designed to have general capabilities, so that they can be employed at any freeway location of interest, provided that the appropriate data have been collected. This evaluation process focuses on mainline (GP and HOV) performance measures; plans call for eventual expansion of the process to include other aspects of the WSDOT FLOW system, such as ramp meters, safety aspects, and traveler information systems.

What Is In This Report

This report summarizes the methodology, tools, and procedures developed for the evaluation of the Seattle area freeway network. The report is divided into five sections:

1. Introduction. The objectives of this report are described.

2. Evaluation Methodology. The overall evaluation process is described. Data collection, sampling, processing, and display issues are summarized.

3. User’s Guide: FLOW Evaluation Analysis Procedures and Software Tool Set. The process by which freeway data is collected and analyzed is described in more detail. The capabilities of the software tools CDR (Compact disc Data Retrieval), CDR Analyst, and CDR Auto are described, including a catalog of performance evaluation options, illustrations of types of output, and ancillary macros and template files. Performance measures and the software tools used to compute them are presented, including step-by-step instructions for their use as well as specific data requirements. CDR Analyst algorithms are also described.


4. Using the FLOW Evaluation Analysis Process. Three examples of the use of the FLOW evaluation process are described:

Example A: Producing the Central Puget Sound Freeway Usage and Performance Report. The process used to produce the “Central Puget Sound Freeway Usage and Performance Report” is summarized.

Example B: Producing Supporting Data for the WSDOT Ramp and Roadway Traffic Volume Report. Guidelines for the use of this evaluation method to produce data that can assist in producing the “Ramp and Roadway Traffic Volumes Report” for WSDOT Northwest Region are described. (The ramp and roadway report summarizes average peak hour and daily weekday volumes for locations throughout the regional freeway network.)

Example C: System Operations Diagnostics. Examples of the application of FLOW evaluation analysis tools to study system operations issues, diagnose data collection equipment status, and interpret the validity of summary statistics are described.

5. Analytical Assumptions and Limitations. The assumptions upon which performance measures, software, and analysis methods are based are described. Limitations and caveats associated with the software and algorithms are also summarized.

Appendix: CDR User’s Guide. A user’s guide has been developed for one of the three primary analytical tools described in this report. Because much of the discussion of analytical methods, data quality issues, and naming conventions in the CDR User’s Guide is also relevant to the other principal tools in the FLOW evaluation tool set, this appendix is included with this report.

About This Project

This report is a product of a WSDOT-sponsored project, FLOW Evaluation Framework Design. The overall objectives of this project are to 1) develop a methodology, framework, and detailed procedures for conducting an ongoing series of evaluations of the performance and effects of the FLOW traffic management system now in operation on Puget Sound area freeways; 2) create tools for performing those evaluations; and 3) supplement earlier evaluation data with updated results by using the developed framework to evaluate selected portions of the FLOW system. This report documents the results of work on the first two objectives. A separate report, “Central Puget Sound Freeway Usage and Performance: Interim Report”, addresses the third objective.


1. Introduction

This report describes the analytical methods and tools that were developed for a WSDOT-sponsored project, FLOW Evaluation Framework Design. These methods were designed to address the need for a cost-effective process by which timely evaluations of freeway usage and performance on the central Puget Sound freeway network could be prepared.

The analytical approach outlined in this report focuses on the usage and performance of mainline general purpose and high-occupancy freeway lanes in the central Puget Sound region. Potential future activities include the extension of this methodology to other aspects of the freeway system, such as freeway ramp traffic management and traveler information systems. Also under consideration are extensions of the methodology to data collection systems in other regions.

The algorithms used in this project were chosen in an effort to balance considerations of accuracy, calibration requirements, and usefulness of the resulting performance measures. These techniques involve analytical assumptions and limitations, which are noted in this document. The tool set developed for this project is a work in progress. It represents the initial software implementation of the analytical techniques used in this evaluation process; its principal purpose is to automate as much as possible the process by which an evaluation is performed. Potential future software development activities include an enhanced user interface, additional functionality, and greater integration with other tools and databases. All descriptions of tools are based on V 1.1 of the software (March 1999).


2. Evaluation Methodology

Overview

This section describes an evaluation approach that was developed to perform a systemwide evaluation of the usage and performance of the major corridors of the Seattle area freeway network. The discussion in this section begins with an overview of the guidelines that were used to determine the evaluation method, data collection approach, and analytical tool development process. This is followed by a discussion of the measurement sampling approach used, and the criteria that were used to select the locations that were included in the measurement sample.

Evaluation Design Guidelines

The design of this evaluation methodology was driven by three sets of issues: overall project objectives, data collection issues, and analytical tool issues. The overall project objectives were established in the original project proposal, based on the initial goals of the WSDOT. Data collection issues revolve principally around the pragmatic considerations associated with the tradeoff between comprehensive, high quality data and cost (in both dollars and time). Analytical tool issues involve the user-friendliness of the analysis process. The following is a discussion of the objectives and concerns that guided the development of the evaluation methodology, as well as the ways in which the methodology addressed these issues.

Overall Objectives

The initial proposal for the FLOW evaluation design project was based on a recognition that a systematic, cost-effective method of evaluating the usage and performance of a freeway network was desired. The principal goals at that time were the following:

1) Easier evaluations. Previous evaluations were of a “custom-made” nature; they did not have the benefit of already-developed methods or analytical tools. There was a need for a more automated approach that would make the evaluation process faster and easier to implement.

2) More frequent evaluations. There have only been two formal evaluations of the FLOW system and the associated freeway system since 1981. There was a need for an evaluation system that would facilitate more frequent evaluations without the need to re-invent procedures and analytical methods.

3) More effective data analysis methods. There was a desire for a set of performance measures that would allow the planner or engineer to extract additional insights about the potentially complex interactions that affect the freeway system.

4) More effective communication methods. There was a need for data presentation techniques that would more effectively and succinctly summarize the usage and performance of the freeway system in ways that would be useful for technical professionals, policy decision makers, and general audiences.

5) Comparable results. A useful element of an ongoing evaluation process is the ability to monitor trends over time. This requires that the evaluation method produce information that is consistent and comparable from one evaluation to the next. There was therefore a need for an evaluation process that used a consistent set of measures, so that comparisons over time could be made.

Project Response: The project focused on five principal ways in which the evaluation process could be made easier and more frequent. First, traffic data had to be made available in a convenient and timely fashion. This was addressed by the WSDOT in 1997 when it began to record detailed 5-minute traffic data for the central Puget Sound freeway network onto compact disks and update the CD set on a regular basis. Second, tools to facilitate the analysis of that data had to be developed to allow users to produce results more easily. Furthermore, given the significant quantity of data that needed to be processed, it was important that those tools offered the option for automated operation as much as possible. The development of such a tool set was thus included as an objective in this project. Third, a repeatable step-by-step process to perform an evaluation in a consistent way had to be developed, so that successive evaluation results could be compared with one another and trends could be monitored. The development of an evaluation methodology and associated user guide as part of this project addressed that objective.

The development of more effective analysis methods and communications techniques was also addressed in this project. The analysis methods eventually developed combined traditional traffic engineering measures such as average daily volume with more detailed measures that focused on the influence of factors such as time of day, lane type, and corridor location on traffic patterns. Also included were measures that more directly measured the effects felt by the individual traveler, such as travel time and speed; the variability or reliability of travel was another performance consideration that has significance for the traveler. These measures benefited from new methods of presentation, including the use of graphical formats and color to better convey the meaning of the analysis to a wider range of audiences. Finally, the ability to compare results was addressed through the development of standard sets of measures, evaluation tools, and measurement locations.

Data Collection Issues

The process of collecting raw traffic data for further analysis offers several challenges. First, there is the practical aspect of establishing a mechanism to collect traffic data in a timely and cost-effective manner. Second, there is the challenge of collecting measurements that are at a sufficient level of detail to accommodate a range of analyses without becoming overwhelmed with data. The following data collection issues were considered in the evaluation method design:

1) Data collection limits. An important issue for traffic data collection is one of tradeoffs, namely the challenge of collecting a sufficient quantity of high quality traffic data within the restrictions imposed by time, available human and equipment resources, and funds. Given these data collection trade-offs, the evaluation method needs to include criteria to guide the allocation of scarce resources to freeway locations of high utility and interest.

2) Readily available data. Because resources are limited, it is important that readily available data sources be used whenever feasible.

Project Response: The limitation of data collection resources was addressed by establishing a core set of representative measurement locations that could be used at each evaluation (the selection of locations was based on a set of criteria that will be discussed shortly). The analysis tools were designed to allow other locations to be studied as well (as long as data were available), but the core set of locations allowed for consistent measurements from one evaluation to the next. The project was also fortunate to have access to a substantial database of WSDOT traffic data that was converted to CD as this project was being developed; this data set is a key resource for the evaluation process. This data set is updated using a collection mechanism (a network of electronic sensors) that is already in existence, thus saving this project (and future evaluations) considerable time and money.

Analytical Tool Issues

It is also important to supplement an evaluation process with analytical tools and methods that facilitate the evaluation approach by providing fast, convenient results in a cost-effective manner. The following are issues associated with the design and operation of analysis tools that were considered for development in this project.

1) Emphasis on automation. The evaluation of a freeway network covering a region the size of the central Puget Sound area requires that analytical tools operate as quickly as possible to process the significant quantities of associated traffic data that are involved and to allow more frequent evaluations to take place. As part of this project, an interim evaluation report was produced using an initial evaluation procedure that was only partially automated. From this tedious and repetitive experience, it became abundantly clear that insufficient automation significantly inhibits the ability and desire to perform frequent or timely evaluations. Tools should therefore operate in an unattended mode as much as possible, especially when lengthy, highly repetitive tasks are involved.

2) Blending custom and off-the-shelf software. In the first phase of this project, prototypes of analytical tools were developed using commercial spreadsheet software and custom macros. While this method was useful as a proof-of-concept exercise, it became clear that software written in a high-level computer language would offer the performance (speed) needed to process the large quantities of data involved. At the same time, rather than write certain functions such as graphical output from scratch, it was noted that those functions could be provided more cost-effectively by using off-the-shelf software that already included such capabilities. As a result of these early project results, it was determined that the use of a blend of custom-built and off-the-shelf software would be the most cost-effective way to build analytical tools.

3) Minimal hardware and software requirements. To enhance the ease of use of the analytical tools, minimal hardware and software requirements were desired, with an emphasis on ubiquitous hardware and inexpensive, readily available off-the-shelf software.

Project Response: The approach taken to address these issues involved a blend of commercial packages, custom macros, and custom high-level code, all operating on a readily available computer platform. First, it was decided that certain data processing functions were already well developed in available commercial packages and therefore did not need to be reinvented. Specifically, some post-processing tasks and graphics output processes are performed within Microsoft Excel using its built-in macro language. Excel was chosen because of its built-in graphics capabilities, automation options via user-written macros, and widespread availability at WSDOT. Second, tasks that required significant data processing and/or the computation of new performance measures were custom-coded to increase speed of processing; this code was written in the industry-standard ANSI C language. These software analysis tools were designed to allow unattended operation for selected performance measures that involved significant processing time and/or repetitive tasks. By providing an operation mode that did not require constant user participation, yet was controllable by a user-specified batch file or command file, extended analysis could be performed in parallel with other activities, or even overnight as needed. Third, the standard platform requirement for these tools was an IBM PC-compatible computer operating with Windows 95, Windows 98, or Windows NT, and including a CD-ROM drive, color monitor, and optional color printer. Recommended specifications include a Pentium II 200 MHz processor (faster is preferred), 64 MB of RAM, and at least 1 GB of free disk space (more space is preferred).

As suggested above, the measurement approach taken for the FLOW evaluation is to measure the state of the system by sampling performance at a core set of locations, where each location is in some way representative of system performance in that geographic vicinity, for as many sampled time periods as is feasible. This approach addresses the reality that though a comprehensive evaluation would ideally include measurements performed at as many network locations as possible, at different times, and under a wide range of conditions, in practice the ability to perform a comprehensive review of this sort is limited by resource constraints as well as the fixed number of measurement sites for which usable measurement equipment is already in place. The next discussion describes the guidelines used to select the locations at which performance of the system will be sampled.

Performance Sampling Considerations

The selection of representative freeway locations requires a balance between the desire for highly detailed, comprehensive measurements of the system and the reality of limited time, staffing, and equipment resources. Ideally, one could select and evaluate an exhaustive collection of measurement points for a broad range of conditions, to determine in detail the variations in system performance as a function of both location and ambient conditions. A benefit of this approach is that it focuses on the accumulation of detailed data at many locations to minimize the possibility that important locations or conditions will be missed.

Clearly, though, time and resources impose limits on the extent to which a comprehensive road network evaluation of this type can be performed. At the other extreme, one could use an aggregate index, or set of indices, that somehow attempts to capture the essential elements of the highway system’s performance. This measure could be determined by combining data collected at a limited number of discrete locations and producing an overall index or metric of the system’s performance. The level of detail of that metric could vary depending on the nature of the measure of interest; a simple example (that is, simple to describe if not to calculate) would be a “congestion index” that uses a single value to “rate” the overall level of congestion on the system. While such an approach might be less data-intensive and less costly to calculate, there remains the problem of developing an aggregate measure that is accurate, meaningful, and intuitive. In addition, important variations between locations could be masked by the use of only one or a few summary measures that “average out” noteworthy differences.

As a practical matter, it was decided that the measurement of the performance of a freeway network would take place at a limited number of discrete points. At the same time, there is potentially significant analytical value to be gained from sensitivity analyses that study the variations in traffic as a function of changes in location and other conditions. Therefore, for the FLOW evaluation framework, a middle ground was chosen to provide an evaluation that was easier to implement, but comprehensive enough to be useful for overall system analysis. First, a core set of locations was selected to represent significant elements of the system. (The use of a fixed set of locations also allows trends over time to be monitored while holding location-dependent conditions as constant as possible.) Second, a range of analytical options is provided to make the most of the data that are collected by measuring traffic performance in various ways (traditional measures like daily volume, as well as other measures such as travel reliability) and exploring the relationship between those measures and the freeway’s environment; examples of the latter include variations in traffic performance by lane type, time of day, and direction of travel.

This approach, based on performance sampling at a representative core set of locations, tries toprovide an appropriate balance by avoiding inaccuracies that might result from evaluating a system withtoo few measurement points, while also avoiding an expensive and time-consuming, data-intensiveanalysis effort. It should be noted that this approach still allows the analytical tools to be used atadditional locations above and beyond the core set of locations, as resources and data availability allow,and analytic needs dictate.


FLOW Evaluation Design Technical Report v2 8

Sampling Location Selection Criteria

Measurements taken at a specific location in a road network provide information on typical traffic conditions in the immediate vicinity. The degree to which measurements at one site are representative of those at a nearby site depends on such issues as the location of intervening traffic sources and sinks (i.e., on-ramps and off-ramps), variations in road geometry (e.g., elevation changes, lane widths, number of lanes), and differences in physical conditions (roadway or facility changes, such as tunnels or bridges). Measurements also depend on the conditions under which the measurements are taken (weather; time of day, week, month, or year; how “typical” the traffic conditions were at the time of the measurement). Variations in those conditions can cause two measurement sites in close proximity to one another to produce very different measurement results. Therefore, in order to select individual locations that reflect nearby sites as much as possible, it is important that issues such as those mentioned above be considered to avoid errors of commission (picking a nonrepresentative site) or omission (not sampling road segments that are potentially significant). The following is a description of the criteria used to select sampling locations.

Picking Measurement Locations

Representative measurement locations for the FLOW system were chosen according to the following criteria:

• Measurement locations should be located at or near sites that are typically of interest from a traffic management and/or traveler viewpoint. This includes high volume locations, key facilities (e.g., bridges), and routes that lead to popular destinations such as major employment centers, universities, or central business districts.

• Measurement locations should be at or near sites that are expected to be of future interest from a traffic management viewpoint. Examples include perimeter locations that are not heavily traveled now, but are expected to be impacted by future residential or commercial growth, or growth management directives.

• Each major interstate and state freeway facility should be represented by at least one measurement site.

• If a major corridor has more than one major segment, each segment is represented by at least one measurement site. The determination of what constitutes a “segment” depends on traffic flow issues such as typical travel patterns (e.g., commute patterns) and major intersections between corridors, as well as practical matters such as available measurement installations. An example is the I-5 corridor, which can be divided into north, north central, central, and south central segments for the purposes of an evaluation; the endpoints of the segments could be determined by noting the location of major interchanges (I-5/I-405 Swamp Creek, I-5/SR 520, I-5/I-90, I-5/I-405 Tukwila). Each segment would contain at least one measurement location.

• Whenever feasible, measurement locations should be in corridor segments that are part of a trip route selected for travel time analysis. (The travel time analysis will be discussed in the next section.)

• Whenever feasible, measurement locations should be located in corridor segments that are designated by the Puget Sound Regional Council as congested segments that are the focus of its Congestion Management System.

Picking Measurement Sites

Once a measurement location (i.e., a general area description such as “I-5 near the University District”) has been selected, one or more specific measurement sites at that location are chosen from among those sites that have existing data collection sensors maintained by the WSDOT (e.g., mainline northbound lanes on I-5 at the NE 45th Street overpass). These sites use electronic sensors embedded in the pavement to record basic data on volume and lane occupancy percentage; some sites also collect speed and vehicle length data. The selection of specific measurement sites is based primarily on the presence of existing WSDOT data sensors; however, locations of supplementary measurement sites can also be taken into account. Examples of supplementary sites include locations where average vehicle or bus occupancy data are collected; such data are available from the WSDOT-sponsored HOV Lane Evaluation project or transit agencies. Whenever possible, primary and secondary measurement sites should be coincident, so that the volume, lane occupancy, and vehicle occupancy data for a measurement location are as compatible as possible. When this is not possible, primary and secondary measurements should be chosen to be as close to one another as practical.

• Measurement sites are restricted to those corridor locations for which the WSDOT has active inductive loops in place in each and every mainline general purpose lane as well as each HOV lane (if any exist there). These loops are identified with WSDOT loop identifiers that have the _Mx/MMx (general purpose) or _MxH (HOV) designations, where x refers to the direction (N, S, E, W, R). These data are accessible on CD via a set of analysis tools (described in the next section). The loop identifier system is described in detail in the CDR User’s Guide, included as an appendix to this report.

• Whenever possible, measurement sites are located at or near supplementary measurement sites used for vehicle occupancy data collection by the WSDOT HOV Evaluation project and/or transit agencies. These data are used to estimate person throughput.

• Whenever the vehicle sensor data, car occupancy data, and transit occupancy data cannot be at exactly the same location, the priority is to match the transit data location with the loop location, and then to use the nearest available car occupancy data.
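The loop designation convention above lends itself to simple programmatic screening. The following sketch is hypothetical code, not part of the FLOW tool set; the full identifier grammar is documented in the CDR User’s Guide, and the parsing details here (regular expression, return values) are illustrative assumptions based only on the _Mx/MMx and _MxH designations described above.

```python
# Hypothetical sketch: classify a WSDOT loop identifier as general purpose
# (_Mx / MMx) or HOV (_MxH) and extract its direction letter. The regex is
# an assumption based on the designations described in the text.
import re

def classify_loop(loop_id: str):
    """Return (lane_type, direction) for a mainline loop designation, or None."""
    # Direction letters per the report: N, S, E, W, R (reversible)
    m = re.search(r"_?M{1,2}([NSEWR])(H?)", loop_id)
    if m is None:
        return None  # not a mainline GP/HOV designation
    direction = m.group(1)
    lane_type = "HOV" if m.group(2) == "H" else "GP"
    return lane_type, direction

# Using the sample station ID shown later in Figure 2:
# classify_loop("ES-145D:_MS___1") -> ("GP", "S")
```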

Selecting a Sampling Time Period

In addition to selecting analysis sites, a time period for the analysis should be selected. Typically, annual average performance measures are used. However, some studies might require a time interval other than one calendar year. Examples of such studies include before-and-after performance changes relating to construction projects or operational changes, or an analysis of seasonal or special event-related variations of freeway performance. In addition, it may be appropriate to pick only weekday, only weekend, or seven-day-a-week data, depending on the analysis.

Performance Sampling Example: Central Puget Sound Freeway Usage and Performance Interim Report

The performance sampling approach described in this section was used to prepare the Central Puget Sound Freeway Usage and Performance Interim Report. This report includes 1) an introduction that describes the FLOW system and establishes the context for its implementation; 2) a description of freeway system usage; and 3) a range of performance analyses for selected aspects of the FLOW system. Information about the measurements and sites that were analyzed is presented later in this document.

3. User’s Guide: FLOW Evaluation Analysis Procedures and Tools

Introduction

The remainder of this document is a user’s guide to the capabilities, operating instructions, and underlying analytical bases for the software tools and algorithms used to analyze the usage and performance of the freeway system. In this section, the functionality of the analysis tool set and the procedures for its use are presented. In section 4, a series of example applications of the evaluation process is summarized, including 1) the process by which a “Central Puget Sound Freeway Usage and Performance” interim report was prepared; 2) the potential use of the evaluation tool set to provide supporting data for the WSDOT Northwest Region’s “Ramp and Roadway Traffic Volumes” summary report; and 3) the use of the tool set to perform various system operations diagnostics and interpret performance analyses. In section 5, known analytical assumptions, limitations, and caveats of the tool set are summarized.

The following topics are discussed in this section:

a) Data overview. The data set upon which the evaluation tool set is based is described, including contents, level of detail, and availability.

b) Tool set overview. Each analysis tool in the evaluation tool set is described, including functions and operating platform requirements.

c) CDR (Compact disc Data Retrieval) software. A catalog of data processing options available for evaluating the freeway network using the CDR data retrieval tool is outlined. Illustrations of sample output are shown. (A complete user’s guide to CDR is available in the Appendix of this report.)

d) CDR Analyst software. The range of analysis options available for evaluating the freeway network using the CDR Analyst post-processor program is presented, along with step-by-step operating instructions. Pre- and post-processor macros, templates, and related data sets are also described. Sample illustrations of output from each analysis option are included.

e) CDR Analyst algorithms. The algorithms used to implement the CDR Analyst analysis options are described.

f) CDR Auto software. The purpose and operating instructions for the CDR Auto pre-processor program are presented.

g) CDR Analyst utilities. The CDR Analyst Excel-based macros and template files are described.

h) Data Quality Mapping utility. The traffic data file data quality macro is described.

The analysis options developed to date focus specifically on the evaluation of the mainline general purpose (GP) and high-occupancy vehicle (HOV) lane network in the central Puget Sound region.

Data Overview

The data set used by the analysis tools described in this report is available on compact discs produced by the WSDOT. These CDs include traffic data collected from electronic inductance loop sensors installed at approximately 0.5-mile intervals on mainline lanes and ramps of freeways and state highways in the central Puget Sound region, including I-5, I-405, I-90, SR 520, SR 18, SR 522, and SR 99. Vehicle presence is detected by the sensors, and the resulting detection data are collected at 20-second intervals and transmitted to the WSDOT Transportation Systems Management Center for processing and archiving. The data are archived at 5-minute intervals (i.e., 15 consecutive 20-second values are combined to produce a single 5-minute value). CD archives are available for data starting from mid-1993, with 2 to 4 CDs required to hold a year of data from all sensor locations in the central Puget Sound freeway network.

The principal values recorded by the sensors are vehicle volumes and average lane occupancy percentage. Vehicle volumes at a particular roadway location are estimated by recording the number of times that an inductive loop embedded in an individual lane of a road or ramp is “triggered” by a passing vehicle. This count is recorded 24 hours a day, except when equipment is turned off, being serviced, or inoperative. Vehicle volumes can be a useful measure of facility usage; they can also be combined with information about per-vehicle person occupancy to estimate person throughput. Five-minute vehicle volumes can also be aggregated to produce data for other time intervals (e.g., hourly, daily, yearly averages).
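The aggregation chain described above (20-second counts into 5-minute values, and 5-minute values into longer intervals) can be sketched as follows; this is illustrative code, not taken from CDR or CDR Analyst:

```python
# Minimal sketch of the aggregation described in the text: 15 consecutive
# 20-second counts combine into one 5-minute volume, and twelve 5-minute
# volumes combine into one hourly volume. Names are illustrative.

def to_five_minute(counts_20s):
    """Combine consecutive 20-second counts into 5-minute volumes (15 each)."""
    return [sum(counts_20s[i:i + 15]) for i in range(0, len(counts_20s), 15)]

def to_hourly(vols_5min):
    """Combine consecutive 5-minute volumes into hourly volumes (12 each)."""
    return [sum(vols_5min[i:i + 12]) for i in range(0, len(vols_5min), 12)]

# 45 twenty-second counts of 10 vehicles each -> three 5-minute volumes of 150
# to_five_minute([10] * 45) -> [150, 150, 150]
```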

The loop sensors are also used to estimate lane occupancy. Lane occupancy refers to the percentage of time that a given loop is in a triggered (“on”) position, where the triggered state indicates a vehicle’s presence within the loop’s detection range. For example, if a loop recorded a lane occupancy of 10 percent for a 5-minute period, this would mean that vehicles were sensed within the loop’s detection range for a total of 30 seconds during the 5-minute interval (10% of 5 minutes = 30 seconds). Lane occupancy can be considered a surrogate measure for the density of vehicles on a roadway; it can also be used as a measure of congestion. Lane occupancy can also be combined with vehicle volume estimates to derive estimated vehicle speeds.
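The report does not state the exact speed-derivation algorithm at this point; the sketch below uses the standard single-loop traffic-engineering relation (speed proportional to flow times an effective vehicle length, divided by occupancy), with an assumed 25-foot effective length, purely as an illustration of how volume and occupancy combine into a speed estimate:

```python
# Illustrative sketch only; not necessarily the CDR Analyst algorithm.
# Standard single-loop estimate: speed ~ flow x effective length / occupancy.
# The 25 ft effective length (vehicle plus detection zone) is an assumption.

def estimated_speed_mph(vol_5min, occ_fraction, eff_length_ft=25.0):
    """Estimate speed from a 5-minute volume and a lane occupancy fraction."""
    if occ_fraction <= 0:
        return None  # no vehicles detected; speed undefined
    flow_vph = vol_5min * 12                             # 5-min count -> veh/hour
    speed_fph = flow_vph * eff_length_ft / occ_fraction  # feet per hour
    return speed_fph / 5280.0                            # feet/hour -> mph

# e.g., 150 vehicles in 5 minutes at 13% occupancy:
# 1800 veh/h x 25 ft / 0.13 / 5280 ft/mi ~ 65.6 mph
```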

The estimated validity of each 5-minute data value is also recorded on the CD archive, in the form of a code that summarizes the data quality of its fifteen constituent 20-second values. The 5-minute data validity codes are “good” (all 15 constituent values are considered valid), “bad” (all 15 values are considered invalid), “disabled” (all 15 values were collected when the data collection equipment at that sensor site was not operational), and “suspect” (all other combinations of 15 data point conditions). Additional information about the data set is available in section 5 (Analytical Assumptions and Limitations: Input Data Considerations) and the Appendix (CDR User’s Guide).
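The validity-code rule just described can be expressed directly; the flag names in this sketch are illustrative, not the archive’s actual encoding:

```python
# Sketch of the 5-minute validity rule described in the text. The string
# flag values ('valid', 'invalid', 'disabled') are illustrative names, not
# the actual codes stored on the CD archive.

def five_minute_validity(flags):
    """flags: the 15 constituent 20-second conditions for one 5-minute value."""
    if all(f == "valid" for f in flags):
        return "good"
    if all(f == "invalid" for f in flags):
        return "bad"
    if all(f == "disabled" for f in flags):
        return "disabled"
    return "suspect"  # any other mix of 20-second conditions
```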

At selected locations, average vehicle speed and vehicle length information are also estimated, using pairs of sensors operating together.

Tool Set Overview

The evaluation tool set includes software that performs the following functions: 1) summarize raw traffic data and compute performance measures (CDR and CDR Analyst); 2) present the analyzed data in text and graphical formats (CDR, CDR Analyst, and CDR Analyst utilities); and 3) reformat raw traffic data for use by CDR Analyst (CDR Auto). There are five categories of analysis options: 1) basic traffic statistics, 2) site performance measures, 3) corridor performance measures, 4) travel time performance measures, and 5) system operations diagnostics measures.

Three analysis tools will be discussed in the remainder of the report:

• CDR (Compact disc Data Retrieval): This program, developed by WSDOT Northwest Region, accesses 5-minute traffic data stored on compact disc, computes summary statistics based on those data, and produces text output files. Output options include volume, lane occupancy percentage, and speed and vehicle length category (at selected locations). Data can be summarized at various levels of detail, ranging from peak-hour to yearly statistics. Minimum data quality thresholds used to compute summary statistics are user-specified.

• CDR Analyst (and associated utilities): Like CDR, CDR Analyst uses traffic data stored on WSDOT-produced compact discs. CDR Analyst can produce additional performance measures, including 24-hour volume and speed profiles, congestion frequency statistics, corridor congestion summaries, travel time estimates, and travel time reliability measures. CDR Analyst output can be post-processed using Analyst utilities and templates to produce color graphics output.

• CDR Auto: This utility is a pre-processor for CDR Analyst that converts compact disc traffic data to a file format that can then be used by CDR Analyst.

There are several utilities and templates that post-process CDR and CDR Analyst output to produce graphical output that can be used for analysis and report preparation. These ancillary programs are written using the macro language of Microsoft Excel, and use the graphics capabilities of that program. They will be described later in this report.

CDR has been revised a number of times over the past two years, and now features a user-friendly interface and operating process. CDR Analyst, CDR Auto, and the associated utilities and templates are still in active development; for the most part they utilize a command line interface, and require some manual modifications of the data to produce graphics output. Tentative plans call for enhancements to the user interface of these command-line-based tools in a future phase of this project.

Please note that the evaluation techniques described here are independent of the measurement sites selected (data availability notwithstanding). Although they were initially used on a set of core locations in the freeway usage and performance interim report (described later in this document), the tools are designed to perform detailed analyses at any location and for any time period, provided that the necessary data is available in an appropriate format.

The following are summary descriptions of each tool. Most of the discussion centers on CDR Analyst. CDR is discussed in detail in the Appendix, while CDR Auto is a data re-formatting program that requires minimal discussion.

CDR (Compact disc Data Retrieval)

CDR is a program developed by WSDOT Northwest Region to access 5-minute traffic data stored on CD, and produce a summarized version in a text file format. Users can specify specific dates of data collected, specific locations (actually, specific lanes), and various levels of summarization. The output is in the form of a text file that can also be read directly into a spreadsheet program such as Microsoft Excel. Standard output from the program includes 5-minute raw traffic data that includes traffic volumes and average lane occupancy percentage as well as a data quality/validity indicator. Data can also be aggregated to a 15-minute, hourly, peak hour, peak period, daily, weekly, monthly, or yearly level. At some locations, average estimated speed and vehicle length information can be provided. Figure 1 shows the user interface for CDR, including the three general levels of output: raw data (left one-third), daily summaries (middle one-third), and multi-day or yearly summaries (right one-third). Figures 2 and 3 show typical output.

Figure 1. Main CDR Screen

***********************************
Filename:       5TO15.DAT
Creation Date:  02/2/98 (Wed)
Creation Time:  03:16:59
File Type:      SPREADSHEET
***********************************

ES-145D:_MS___1   I-5 Lake City Way   170.80
09/01/97 (Mon)

---Raw Loop Data Listing---

Time   Vol  Occ     Flg  nPds
0:00   49   3.80%   1    15
0:05   37   2.90%   1    15
0:10   38   3.50%   1    15
0:15   34   2.60%   1    15
0:20   48   4.40%   1    15
0:25   44   3.60%   1    15
0:30   35   2.80%   1    15
0:35   33   3.30%   1    15
0:40   28   2.50%   1    15
0:45   30   2.30%   1    15

Figure 2. Example of 5-minute Output (Volume and Occupancy)

***********************************
Filename:       AADT.MDS
Creation Date:  02/2/98 (Thu)
Creation Time:  10:54:09
File Type:      SPREADSHEET
***********************************

ES-145D:_MS___1   I-5 Lake City Way   170.80
Monthly Avg for 1996 Jan (Sun)

---Multi-Day Loop Summary Report---

Summary     Valid  Vol    Occ      G     S   B  D  Val  Inv  Mis
Daily       VAL    19392  7.50%    1133  18  1  0  4    0    0
AM Peak     VAL    1493   3.50%    142   2   0  0  4    0    0
PM Peak     VAL    5069   15.60%   190   2   0  0  4    0    0
AM Pk Hour  VAL    1381   10.00%   47    1   0  0  4    0    0   10:45 11:45
PM Pk Hour  VAL    1576   11.90%   48    0   0  0  4    0    0   13:45 14:45

Figure 3. Example of Multi-Day Output

Additional operating and background information on CDR can be found in the CDR User’s Guide, which has been installed in PDF format on each WSDOT traffic data CD since early 1998. It is also included as an appendix to this report.

CDR Analyst

The discussion of CDR Analyst is organized as follows. First, an overview of the program and the data that it uses is provided. Next, operating instructions to generate each of the major performance measures produced by CDR Analyst are presented. This is followed by a discussion of each of the primary algorithms used by CDR Analyst to compute those measures.

Overview

CDR Analyst is a program developed by TRAC to access 5-minute traffic data stored on CD, and produce performance measures that are not normally available from CDR. Because CDR produces statistics one lane (sensor) at a time, it does not directly produce statistics that reflect all the lanes in a given location traveling in a particular direction (e.g., the total average daily northbound volume on all GP mainline lanes on I-5 at University Street). CDR Analyst addresses this limitation by allowing the user to process all relevant lanes at a specific location when producing statistics. CDR Analyst also computes peak hour, peak period, and daily (e.g., Average Weekday Daily Traffic) values similar to those produced by CDR, but does so for all relevant lanes, e.g., all northbound GP lanes at a specific location, rather than only one lane at a time. In addition, CDR Analyst produces supplementary performance measures beyond those computed by CDR. These measures include the following:

• Average daily site profiles. CDR Analyst can process user-specified days of traffic data and produce three site-specific traffic profiles as a function of time of day. The first of these is an average 24-hour profile of volume per lane per hour at a selected location for a specified direction of travel (across all lanes), at 5-minute increments. The user can specify that only weekdays will be used, or all days of the week. Second, a corresponding 24-hour estimated speed profile is produced. Third, the program computes a 24-hour reliability profile that estimates the percentage of time that the location is congested, as a function of time of day. The definition of what constitutes congestion is based on a lane occupancy percentage threshold value; this threshold can be changed by the user.

The resulting output data can then be processed by a spreadsheet-based utility that will produce a graphic that is suitable for analysis or report preparation. As can be seen in Figure 4, the resulting graphic shows a line graph of vehicle volume (measured per lane per hour) as a function of time of day, at a particular site for a given travel direction and lane type (GP or HOV). This graph is then supplemented with the speed estimate profile by adding a color to each data point on the volume curve, where the color is based on the corresponding speed for that time interval. To determine the color for each data point, the speed information is mapped to several speed ranges; each range then corresponds to a different color. For example, if the calculated speed is estimated at above 55 mph at 9:00 am, the corresponding volume line segment is green at that time; if the speed is between 45 and 55 mph, the line is yellow; and if the speed is below 45 mph, the line is red. While the specific speed ranges can be changed, the green/yellow/red color system is intended to approximately represent free flow, restricted flow, and congested traffic conditions, respectively. By combining the volume and speed profiles in this way, a single graph can allow the viewer to distinguish between, for example, low traffic volumes associated with free-flow traffic, and low volumes that are the result of congested conditions.

[Figure 4 graphic: a 24-hour line graph for I-5 at University St, GP lanes, northbound, plotting Vehicles Per Lane Per Hour (VPLPH, 0 to 2500, left axis) against time of day (12 AM to 10 PM), with an overlaid Congestion Frequency histogram (0 to 100 percent, right axis).]

Figure 4. Estimated Weekday Volume, Speed, and Reliability Profiles

This curve represents the average weekday 24-hour traffic volume profile, in vehicles per lane per hour, as a function of time of day, for this location and direction of travel (I-5 at University Street, GP lanes, northbound). The color of the curve represents the approximate traffic conditions at different times of the day: green = 55 mph and above; yellow = 45 to 55 mph; red = less than 45 mph.

This histogram indicates the frequency with which traffic becomes congested during the course of an average weekday. In this example, northbound GP traffic on I-5 at University Street was congested at 5:30 PM on about 70 percent of all weekdays.

This graph is then supplemented with an overlay of the reliability profile, in histogram or column graph form. When displayed in tandem with the color-coded volume line graph, the reliability histogram can help to highlight the relationships between the averaged values of volume and speed, and the frequency of congestion. By measuring the frequency of congested conditions, the histogram indicates the extent to which there is significant day-to-day variability from the average values.
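As a rough illustration of the two displays discussed above, the following hypothetical sketch applies the default speed color bands and computes a congestion-frequency profile from per-day occupancy profiles; the 15 percent occupancy threshold is an assumed example (the actual threshold is user-configurable), and none of this code comes from the tool set itself:

```python
# Illustrative sketch only; CDR Analyst is an Excel/command-line tool set
# and its actual implementation may differ. The 15% occupancy threshold is
# an assumed example value.

def speed_color(speed_mph):
    """Map an estimated speed to the volume curve's color band."""
    if speed_mph > 55:
        return "green"   # approximately free flow
    if speed_mph >= 45:
        return "yellow"  # restricted flow
    return "red"         # congested

def congestion_frequency(daily_occ_profiles, threshold_pct=15.0):
    """Percentage of days on which each 5-minute slot exceeded the threshold.

    daily_occ_profiles: one list of occupancy percentages per day, all
    covering the same sequence of time slots.
    """
    n_days = len(daily_occ_profiles)
    return [
        100.0 * sum(day[slot] > threshold_pct for day in daily_occ_profiles) / n_days
        for slot in range(len(daily_occ_profiles[0]))
    ]

# Three days, two slots: slot 0 exceeds 15% occupancy on 2 of 3 days.
freq = congestion_frequency([[20.0, 5.0], [18.0, 6.0], [10.0, 4.0]])
```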

• Average daily corridor profile. CDR Analyst can also process a series of user-specified locations along a corridor and produce an average 24-hour profile of lane occupancy percentage at each of those locations for a specified direction of travel, at 5-minute increments. The user can specify that only weekdays will be used, or all days of the week.

The output file can then be displayed using a spreadsheet-based template that will produce a graphic that is suitable for analysis or report preparation. As can be seen in Figure 5, the resulting graph shows a contour map of the lane occupancy percentage information that is color-coded according to estimated congestion level. The result is a corridor overview of traffic conditions as a function of location along the corridor as well as time of day and direction of travel. The graphic is in a topographic map style, where elevation is replaced by average traffic conditions. Note: The template that is used to produce this graphic will produce one contour map for one direction of travel. In order to produce a two-direction graphic such as the one shown in Figure 5, each contour map and any descriptive map art must be produced separately, then brought together using a standard drawing program such as Corel Draw.

• Average travel time profile. CDR Analyst can process corridor information to produce three 24-hour profiles related to a specific trip. First, it can estimate the average travel time from one point to another on one corridor (i.e., a particular interstate freeway or state highway that has vehicle sensor installations) as a function of the time that the trip starts (in five-minute increments, throughout an average 24-hour day). Second, a 90th percentile travel time is computed as a function of trip start time. The 90th percentile travel time is a travel time N such that 90 percent of the time the trip will take less than or equal to N. For example, if the 90th percentile travel time is 15 minutes for a trip starting at 7 AM, this means that 90 percent of the time a trip that starts at 7 AM is estimated to take no more than 15 minutes, based on available data. Third, trip travel time reliability is estimated as a function of trip start time by computing the likelihood (as a percentage) that the overall trip speed is less than 45 mph.

The output file can then be displayed using a spreadsheet-based template that will produce a graphic that is suitable for analysis or report preparation. As can be seen in Figure 6, the resulting graph shows a line graph of average travel time, a line graph of 90th-percentile travel time, and a superimposed histogram (column graph) that shows the travel time reliability measure. All three measures are displayed as a function of trip start time for a specific origin and destination.
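The 90th percentile travel time defined above can be illustrated with a simple nearest-rank calculation over the observed trips for one start time; the actual CDR Analyst algorithm is not spelled out here and may differ:

```python
# Illustrative nearest-rank 90th percentile, matching the definition in the
# text: the value N such that 90 percent of trips take time <= N. Not
# necessarily the percentile method CDR Analyst uses.
import math

def percentile_90(travel_times_min):
    """Nearest-rank 90th percentile of observed travel times (minutes)."""
    ordered = sorted(travel_times_min)
    rank = math.ceil(0.9 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Ten observed trips at one start time: 9 of 10 (90%) take 15 minutes or less.
# percentile_90([9, 10, 10, 11, 12, 12, 13, 14, 15, 30]) -> 15
```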

Selecting Data to be Processed by CDR Analyst. CDR Analyst uses all the data contained in the input files specified by the user. The basic set of data files created by the CDR Auto pre-processor program for use by CDR Analyst contains seven-day-per-week data for an entire calendar year. Each data file contains one combination of location, travel direction, and lane type, e.g., cabinet 100, northbound, GP. The user has the option to extract two types of subsets from this collection of data files. First, the user can decide which location(s), direction of travel, and lane type (GP or HOV) will be analyzed by the program, by indicating the specific data files of interest, either individually or in a batch file. Second, the user can request that only weekday data be processed (Monday through Friday); otherwise, all seven days of the week are processed. In either case, data for an entire calendar year are processed.

[Figure 5 graphic: two color-coded contour maps (southbound and northbound) of average traffic conditions on I-5, with mileposts 166 to 182 on the vertical axis and time of day (12 AM through 12 PM to 12 AM) on the horizontal axis, separated by a locator map marking Olive Way, NE 45th, Northgate Way, NE 175th, SR 520, SR 522, I-405, and the Snohomish/King County line. Legend: uncongested, near speed limit; restricted movement but near speed limit; more heavily congested, 45 to 55 mph; extremely congested, unstable flow. Title: Interstate 5 North Traffic Profile, General Purpose Lanes, 1997 Weekday Average.]

Figure 5. Corridor Traffic Profile

These two contour maps indicate the average traffic conditions along a freeway corridor at different times of the day, for each direction of traffic. The horizontal scale represents an average weekday 24-hour period, while the vertical scale represents the milepost locations along the corridor. The center map provides a rough approximation of the location that corresponds to a given milepost.

Figure 6. Estimated Average Weekday Travel Time, Northbound Interstate 5 GP Lanes, Boeing Field to Lynnwood (25.4 mi) (1997)

[Figure 6 plots estimated average travel time (0:00 to 1:00, hour:min) and congestion frequency (0 to 100 percent, average trip speed less than 45 mph) against trip start time (12 AM to 10 PM), with an inset map of the Seattle area (I-5, I-90, I-405, SR 520; Seattle, Bellevue, Redmond, Renton, Issaquah, Edmonds) locating the trip.]

These graphs show 1997 weekday travel time characteristics for a specific trip (I-5 northbound from Boeing Field to Lynnwood on GP lanes), as a function of the start time of the trip.

The green curve is the average travel time for this trip. In this example, it takes about 60 percent longer to take this trip starting at 5 PM than it does starting at 9 PM (41 minutes vs. 26 minutes).

The red curve is the 90th percentile travel time, i.e., 90 percent of the time, the trip will be completed in less than this time. For example, 90 percent of the time, a trip that starts at 5 PM will be completed in 55 minutes or less.

The histogram shows the frequency with which trips will average less than 45 mph. In this example, a trip starting at 5 PM has an average speed under 45 mph about 75 percent of the time.



If the user wishes to pick some other subset of the year (e.g., one particular month, weekends only, Tuesdays through Thursdays only), these data files can be created manually by the user with the CDR program (not CDR Analyst). The user must run CDR once for each combination of location, travel direction, and lane type.

Operating Instructions

The following are step-by-step instructions on the use of CDR Analyst to produce each of the three profile types described above. Before using CDR Analyst, the user should edit the text file CDR_pref.txt, which allows the user to specify a number of parameters associated with data quality thresholds, days of data to be processed, etc. These are values that typically do not vary from one run to the next; by putting them in a preference file, the user does not need to answer the same questions repeatedly when processing, for example, a series of locations along a corridor or in a region. CDR_pref.txt can be edited with any text editor (e.g., WordPad) and should be saved in text format. See the CDR Analyst readme file for more information on preference file options.

To produce average daily site profiles:

(Note: This process can also be used to create ramp and roadway report statistics. See Section 4, Example B for more details.)

1. Double-click the program icon to start. The most recent version is Rev 1.1 (as of May 1999).

2. Do you want to extract weekday data? Answer Y (yes) or N (no).

All data files include both weekday and weekend data. If you want to produce statistics based only on weekdays, indicate your choice. Otherwise, all seven days of the week will be used to produce the desired statistic.

3. Do you want to run a batch job? Answer Y (yes) or N (no).

The batch job option is used to process multiple locations for corridor profiles and travel time profiles. It is not used for a single site profile, so answer N (no).

4. Do you want AWDT/AADT information? Answer Y (yes) or N (no).

If AWDT/AADT statistics are desired, indicate that here.

5. Do you want peak period information? Answer Y (yes) or N (no).

If peak hour and peak period statistics are desired, indicate that here.

6. Input the file name.

At this point, type the name of the file that contains the data for the site that you are interested in. The following is the (CDR Auto) file naming convention:

CCCCLDYY.dat

where

CCCC = cabinet number (i.e., the measurement site); leading zeroes are used if < 4 digits

L = lane type (G = GP, H = HOV)


D = direction of travel (N=northbound, S=southbound, E=eastbound, W=westbound)

YY = year (e.g., 97 = 1997)

NOTE: CDR Analyst assumes that all files associated with the analysis (the CDR Analyst program, the preference file CDR_pref.txt, the input data files, and batch files if any) are in the same directory.
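As an illustration of this naming convention, the following Python sketch builds a file name from its components (the helper function is hypothetical, not part of CDR Analyst):

```python
def cdr_filename(cabinet: int, lane_type: str, direction: str, year: int) -> str:
    """Build a CDR Auto data file name of the form CCCCLDYY.dat."""
    # CCCC = cabinet number, zero-padded to 4 digits; L = G (GP) or H (HOV);
    # D = N/S/E/W direction of travel; YY = two-digit year.
    return f"{cabinet:04d}{lane_type}{direction}{year % 100:02d}.dat"

# Cabinet 100, GP lanes, northbound, 1997:
print(cdr_filename(100, "G", "N", 1997))  # -> 0100GN97.dat
```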

7. Accept suspect data? Answer Y (yes) or N (no).

Each 5-minute data point has a data quality indicator flag associated with it. This data flag indicates whether the data point is considered “good”, “bad”, “suspect”, or “disabled” (i.e., the equipment is off-line). CDR Analyst uses all available good data, and attempts to replace bad, suspect, or disabled data points with nearby good data. (The replacement method is described later in this section; see “CDR Analyst Core Algorithm.”) However, the user has the option to accept suspect data as good data. This option is available because the “suspect” label is based on a conservative threshold; in many cases, suspect data is consistent with the good data that surrounds it. However, if there is some question about this, you can specify that suspect data be treated the same as bad data and therefore subject to replacement by nearby good data.

8. Speed estimation method? Answer N (normal) or K (Kalman).

There are two methods implemented in the program for estimating speeds. The so-called normal method uses the formula that is used by WSDOT in its WebFlow web-based system to estimate speeds. The Kalman method was developed by D. Dailey at the University of Washington. A preliminary version of the Kalman code has been implemented; however, the associated constants have not yet been calibrated. Therefore, users are advised to use the normal method in Version 1.1 of CDR Analyst.

9. Output file name.

The resulting statistics are sent to an output file of the user’s choice. The output file is assumed not to exist yet; if it does exist, the user will be alerted and given the option to overwrite the existing file. The output file is placed in the same directory with the input data and CDR Analyst.

10. Use utilities to create graphical output. This is described later in this section.

To produce average corridor profiles:

(NOTE: THIS FUNCTION IS NOT FULLY OPERATIONAL IN VERSION 1.1. See the CDR Analyst readme file for additional details.)

1. Double-click the program icon to start. The most recent version is Rev 1.2 (as of May 1999).

2. Do you want to extract weekday data? Answer Y (yes) or N (no).

All data files include both weekday and weekend data. If you want to produce statistics based only on weekdays, indicate your choice. Otherwise, all seven days of the week will be used to produce the desired statistic.

3. Do you want to run a batch job? Answer Y (yes) or N (no).


The batch job option is used to process multiple locations for corridor profiles and travel time profiles. It is used for a corridor profile, so answer Y (yes).

4. Do you want travel time information? Answer Y (yes) or N (no).

Answer N (no) to this question to get corridor profiles only.

5. Input the batch file name.

At this point, type the name of the batch file (a text file) that contains the data file names for the sites that you are interested in. There is no restriction on the batch file name; however, use the CDR Auto file naming convention for all files listed in the batch file. The list of files should be in ascending or descending order of milepost. The following is the file naming convention:

CCCCLDYY.dat

where

CCCC = cabinet number (i.e., the measurement site); leading zeroes are used if < 4 digits

L = lane type (G = GP, H = HOV)

D = direction of travel (N=northbound, S=southbound, E=eastbound, W=westbound)

YY = year (e.g., 97 = 1997)

NOTE: CDR Analyst assumes that all files associated with the analysis (the CDR Analyst program, the input data files, and batch files if any) are in the same directory.

6. Accept suspect data? Answer Y (yes) or N (no).

Each 5-minute data point has a data quality indicator flag associated with it. This data flag indicates whether the data point is considered “good”, “bad”, “suspect”, or “disabled” (i.e., the equipment is off-line). CDR Analyst uses all available good data, and attempts to replace bad, suspect, or disabled data points with nearby good data. (The replacement method is described later in this section; see “CDR Analyst Core Algorithm.”) However, the user has the option to accept suspect data as good data. This option is available because the “suspect” label is based on a conservative threshold; in many cases, suspect data is consistent with the good data that surrounds it. However, if there is some question about this, you can specify that suspect data be treated the same as bad data and therefore subject to replacement by nearby good data.

7. Speed estimation method? Answer N (normal) or K (Kalman).

There are two methods implemented in the program for estimating speeds. The so-called normal method uses the formula that is used by WSDOT in its WebFlow web-based system to estimate speeds. The Kalman method was developed by D. Dailey at the University of Washington. A preliminary version of the Kalman code has been implemented; however, the associated constants have not yet been calibrated. Therefore, users are advised to use the normal method in Version 1.1 of CDR Analyst.

8. Output file name.


The resulting data are sent to an output file of the user’s choice. The output file is assumed not to exist yet; if it does exist, the user will be alerted and given the option to overwrite the existing file. The output file is placed in the same directory with the input data and CDR Analyst.

9. Use utilities to create graphical output. This is described later in this section.

To produce average travel time profiles:

1. Double-click the program icon to start. The most recent version is Rev 1.06 (as of March 1999).

2. Do you want to extract weekday data? Answer Y (yes) or N (no).

All data files include both weekday and weekend data. If you want to produce statistics based only on weekdays, indicate your choice. Otherwise, all seven days of the week will be used to produce the desired statistic.

3. Do you want to run a batch job? Answer Y (yes) or N (no).

The batch job option is used to process multiple locations for corridor profiles and travel time profiles. It is used for a travel time profile, so answer Y (yes).

4. Do you want travel time information? Answer Y (yes) or N (no).

Answer Y (yes) to this question to get travel time profiles.

5. Do you want daily travel time information? Answer Y (yes) or N (no).

This question refers to the option either to calculate travel times for each individual day and compute statistics based on this collection of times, or to compute an overall 24-hour profile of traffic data first, then compute travel time based on this aggregate profile. Answer Y (yes) to this question to get all three travel time profiles (average, 90th percentile, travel time reliability). If you answer N (no), you will only get an average travel time that is based on the aggregate 24-hour profile of all days processed. Usually, you will answer Y.

6. Input the batch file name.

At this point, type the name of the batch file (a text file) that contains the data file names for the sites that you are interested in. There is no restriction on the batch file name; however, use the CDR Auto file naming convention for all files listed in the batch file. The list of files should be in ascending or descending order of milepost. The following is the file naming convention:

CCCCLDYY.dat

where

CCCC = cabinet number (i.e., the measurement site); leading zeroes are used if < 4 digits

L = lane type (G = GP, H = HOV)

D = direction of travel (N=northbound, S=southbound, E=eastbound, W=westbound)


YY = year (e.g., 97 = 1997)

NOTE: CDR Analyst assumes that all files associated with the analysis (the CDR Analyst program, the input data files, and batch files if any) are in the same directory.

7. Accept suspect data? Answer Y (yes) or N (no).

Each 5-minute data point has a data quality indicator flag associated with it. This data flag indicates whether the data point is considered “good”, “bad”, “suspect”, or “disabled” (i.e., the equipment is off-line). CDR Analyst uses all available good data, and attempts to replace bad, suspect, or disabled data points with nearby good data. (The replacement method is described later in this section; see “CDR Analyst Core Algorithm.”) However, the user has the option to accept suspect data as good data. This option is available because the “suspect” label is based on a conservative threshold; in many cases, suspect data is consistent with the good data that surrounds it. However, if there is some question about this, you can specify that suspect data be treated the same as bad data and therefore subject to replacement by nearby good data.

8. Speed estimation method? Answer N (normal) or K (Kalman).

There are two methods implemented in the program for estimating speeds. The so-called normal method uses the formula that is used by WSDOT in its WebFlow web-based system to estimate speeds. The Kalman method was developed by D. Dailey at the University of Washington. A preliminary version of the Kalman code has been implemented; however, the associated constants have not yet been calibrated. Therefore, users are advised to use the normal method in Version 1.1 of CDR Analyst.

9. Output file name.

The resulting data are sent to an output file of the user’s choice. The output file is assumed not to exist yet; if it does exist, the user will be alerted and given the option to overwrite the existing file. The output file is placed in the same directory with the input data and the CDR Analyst program.

10. Use utilities to create graphical output. This is described later in this section.

(See the CDR Analyst readme file for additional details on other travel time output types.)

Other Analysis Options

It is important to note that CDR Analyst produces measures based on the data given to it. While it is common to analyze one calendar year, and thus to process one-year data files, any other time period can also be analyzed by CDR Analyst. For example, if a construction project occurs during the year, it may be more appropriate to analyze only the data corresponding to the time period after construction was completed. It is also common to analyze all lanes of a particular type (GP or HOV) at a given site. However, individual lanes can also be analyzed. Examples of such studies include outside vs. inside HOV lane comparisons, or passing lane studies. As indicated previously, CDR should be used to produce input data files for time periods other than one calendar year.


CDR Analyst Algorithms

The following are descriptions of the primary algorithms used in CDR Analyst. The discussion begins with the core algorithm that processes 24-hour site profiles. This is followed by descriptions of the speed estimation method, the congestion frequency histogram method, the peak hour/peak period/daily volume estimation method, the contour computation method, the average travel time profile method, the 90th percentile travel time profile method, and the travel time reliability histogram method.

CDR Analyst Core Algorithm

CDR Analyst’s principal process is the conversion of multiple days of multi-lane traffic data into a single average 24-hour traffic profile. The process uses all available “good” data to produce the 24-hour average profile.

To process multiple days into a single average day, the following steps are taken:

• For each 5-minute interval, do the following:

1. For each day, do the following two steps:

a. For each lane of traffic to be processed, do the following:

Look at the data flag of a given 5-minute data value in a given lane in a given day. If the data point is labeled “good”, or if it is labeled “suspect” but the user specifies that suspect data is assumed to be good, then use the value as is. If the data is labeled “suspect” and the user requests data replacement, or if it is labeled “bad” or “disabled”, the program searches for a good value in the (temporal) vicinity (and within the same lane) by moving back 5 minutes, then forward 5 minutes, then back 10 minutes, then forward 10 minutes, to a maximum of ±15 minutes. If a good data point is located within that window, it is used as a replacement value. If no such value is found, the data point is not included in subsequent calculations.

b. Average the resulting values of each lane to get a per-lane average for that 5-minute interval for that day.

2. Average the resulting per-lane averages across all days to get an overall average for that 5-minute interval.

• Repeat the process for each 5-minute interval throughout a 24-hour day. Use this process for both volume information and lane occupancy data.

This process is used to produce an average 24-hour profile at one site. The process can then be used at a series of sites along a corridor, in batch mode, with the results used to produce corridor profiles and travel time profiles.
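The steps above can be sketched in Python. This is a paraphrase of the published description, not the actual CDR Analyst source; the data layout (one tuple of value and quality flag per lane per 5-minute bin) is an assumption for illustration:

```python
# Assumed layout: data[day][lane][interval] = (value, flag), where flag is
# "good", "bad", "suspect", or "disabled"; intervals are 5-minute bins.
SEARCH = [-1, 1, -2, 2, -3, 3]  # back 5 min, forward 5 min, ... to +/-15 min

def usable_value(series, i, accept_suspect):
    """Return the value at bin i, a nearby 'good' replacement, or None."""
    value, flag = series[i]
    if flag == "good" or (flag == "suspect" and accept_suspect):
        return value
    for off in SEARCH:                      # replacement search window
        j = i + off
        if 0 <= j < len(series) and series[j][1] == "good":
            return series[j][0]
    return None                             # drop from subsequent calculations

def average_profile(data, accept_suspect=False):
    """Average multiple days of multi-lane data into one 24-hour profile."""
    profile = []
    for i in range(len(data[0][0])):        # each 5-minute interval
        day_means = []
        for day in data:                    # per-lane average for each day
            vals = [v for lane in day
                    if (v := usable_value(lane, i, accept_suspect)) is not None]
            if vals:
                day_means.append(sum(vals) / len(vals))
        # average the per-day values across all days for this interval
        profile.append(sum(day_means) / len(day_means) if day_means else None)
    return profile
```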


CDR Analyst Speed Algorithm

In the “normal” algorithm, the average speed of vehicles at a site, across all lanes, for a given 5-minute interval, is determined by using the same core process described above, then using the resulting average per-lane volume and occupancy to estimate speed. The formula used is

v = q/(o*g),

where

v = estimated speed (miles per hour)

q = estimated per-lane vehicle volume (vehicles per lane per hour)

o = estimated per-lane lane occupancy (percentage)

g = a constant that incorporates site characteristics such as average vehicle length and loop detector field length.

Using five-minute volume and occupancy data and a constant of g = 2.4 (the value used by WSDOT), the formula becomes

Estimated speed, in mph = [(average 5-min. per-lane volume) * 12] / (average per-lane occupancy percentage * 2.4)

Because the formula is considered less accurate at low and high speeds, the following thresholds are used: if the resulting speed is greater than 60 mph or less than 10 mph, the speed is set to 60 mph or 10 mph, respectively; also, if the lane occupancy is less than 12 percent, the speed is set to 60 mph. These thresholds are the same as those used by WSDOT.
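In code form, the estimate and its thresholds look roughly like the following. This is a sketch of the published formula, not WSDOT's implementation:

```python
def estimate_speed(volume_5min, occupancy_pct, g=2.4):
    """'Normal' speed estimate (mph) from average 5-minute per-lane volume
    and per-lane occupancy percentage, with the WSDOT thresholds applied."""
    if occupancy_pct < 12.0:
        return 60.0                      # low occupancy: assume free flow
    q = volume_5min * 12.0               # convert to vehicles/lane/hour
    v = q / (occupancy_pct * g)
    return min(60.0, max(10.0, v))       # clamp to the 10-60 mph range

# 150 vehicles per lane in 5 minutes at 15 percent occupancy:
print(estimate_speed(150, 15.0))  # 1800 / (15 * 2.4) = 50.0 mph
```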

Caveats: The accuracy of this approach is of course dependent on the input data. Other studies have suggested that the accuracy of this formula is especially dependent on the constant g, and that g may depend not only on vehicle and site characteristics but on lane occupancy as well, suggesting that different g values should be used in low, medium, and high occupancy domains. For now, however, analyses of the FLOW system use g = 2.4, which is the value used for FLOW map online displays of speed (as of March 1996). This value can be changed by the user.

The Kalman method was developed by D. Dailey at the University of Washington.1 A preliminary version of the code has been implemented; however, the associated constants have not yet been calibrated. Therefore, users are advised to use the normal method in Version 1.1 of CDR Analyst.

1 D. J. Dailey, “A Statistical Algorithm for Estimating Speed from Single Loop Volume and Occupancy Measurements,” Transportation Research Part B: Methodological, 1999.


CDR Analyst Congestion Frequency Algorithm

The congestion frequency percentage that is computed for a specific site is based on the frequency with which the average lane occupancy percentage exceeds a user-specified level. This congestion measure is computed using the same core process described above, with the following differences:

• For each 5-minute interval, do the following:

1. For each day, do the following two steps:

a. For each lane of traffic to be processed, do the following:

Look at the data flag of a given 5-minute data value in a given lane in a given day, and perform data replacement if the user requests it and it is feasible (the same as described in the core algorithm).

b. If a sufficient number of lanes of valid values result from the data replacement process for that 5-minute interval, average the resulting values of each lane to get a per-lane average for that 5-minute interval for that day. Then, increment an “eligible day” counter, and determine whether the resulting average lane occupancy exceeds the user-specified threshold for determining congestion. If it does, increment a separate “congestion frequency” counter for this 5-minute interval.

2. Average the resulting per-lane averages across all days to get an overall average for that 5-minute interval. Divide the congestion frequency counter value by the eligible day counter, then multiply by 100 to get a percentage.

• Repeat the process for each 5-minute interval throughout a 24-hour day.
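For a single 5-minute interval, the two counters reduce to the following sketch (the input layout is an assumption: one per-day average occupancy, with None marking days that failed the validity checks):

```python
def congestion_frequency(day_occupancies, threshold_pct):
    """Percentage of eligible days on which the average lane occupancy
    exceeds the user-specified congestion threshold, for one interval."""
    # "eligible day" counter: days that produced a valid per-lane average
    eligible = [o for o in day_occupancies if o is not None]
    if not eligible:
        return None
    # "congestion frequency" counter: eligible days above the threshold
    congested = sum(1 for o in eligible if o > threshold_pct)
    return 100.0 * congested / len(eligible)

# Five days, one invalid; two of the four eligible days exceed 15 percent:
print(congestion_frequency([10.0, 20.0, None, 25.0, 5.0], 15.0))  # 50.0
```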

CDR Analyst Peak Hour, Peak Period, and Daily Volume Algorithms

CDR Analyst computes peak hour, peak period, and average annual daily traffic volumes using the AASHTO method, a process that uses minimum thresholds of data availability to produce an estimate of average annual daily traffic that samples across days of the week and months of the year. The process is similar to that used by CDR to produce analogous statistics, with the following exceptions: 1) CDR Analyst computes statistics for multi-lane sites (and therefore has additional data quality thresholds for multi-lane averaging), while CDR processes only one lane at a time; 2) CDR Analyst uses a different algorithm for user-specified data replacement of suspect values. The AASHTO procedure as implemented in CDR Analyst is as follows:

1. For each day, do the following steps:

• For each 5-minute interval, do the following steps:

a. For each lane of traffic to be processed, do the following:

Look at the data flag of a given 5-minute volume value in a given lane in a given day, and perform data replacement if the user requests it and it is feasible (the same as described in the core algorithm). (This is different from the CDR data replacement method, which only looks backward in time, and will go as far back as the previous midnight to get a valid replacement value.)


b. Average the resulting volumes of each lane to get a per-lane average for that 5-minute interval for that day. If less than 50 percent of the lanes have a good data value, flag that 5-minute average volume as “invalid.”

• If at least 90 percent of the average 5-minute per-lane volumes of that day are considered “valid”, sum up the 5-minute volumes for that day. Otherwise, flag the entire day as “invalid.”

2. For each month, sum up and average the daily volumes by day of the week, e.g., for January, produce a Monday average, Tuesday average, etc. Skip any “invalid” days. If there is more than one “invalid” day of the week (e.g., if an average day-of-week value cannot be calculated for the Mondays in that month), flag that month’s volume for that day of the week as “invalid.”

3. For each day of the week, average the 12 monthly day-of-week averages to produce a yearly day-of-week average. If there are more than two “invalid” monthly day-of-week values, flag that yearly day-of-week average as “invalid.”

4. Average the five (if weekday statistic) or seven (if seven-day statistic) yearly day-of-week averages to produce the average annual daily traffic value. If there is more than one invalid yearly day-of-week average, an annual daily traffic volume will not be computed.
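The four steps and their validity thresholds can be condensed into a sketch like the following (the month-by-day-of-week input layout is an assumption for illustration, not CDR Analyst's internal format):

```python
def aashto_aadt(daily, weekdays_only=False):
    """AASHTO-style annual average daily traffic, per the thresholds above.
    daily[(month, dow)] is the list of daily volume totals for that month and
    day of week (dow: 0 = Monday .. 6 = Sunday); None marks an invalid day."""
    dows = range(5) if weekdays_only else range(7)
    yearly = []
    for dow in dows:
        monthly = []
        for month in range(1, 13):
            vols = daily.get((month, dow), [])
            good = [v for v in vols if v is not None]
            # more than one invalid day of week in the month: month is invalid
            if not good or len(vols) - len(good) > 1:
                monthly.append(None)
            else:
                monthly.append(sum(good) / len(good))
        good_months = [m for m in monthly if m is not None]
        # more than two invalid monthly values: yearly day-of-week is invalid
        if len(monthly) - len(good_months) > 2:
            yearly.append(None)
        else:
            yearly.append(sum(good_months) / len(good_months))
    good_dows = [y for y in yearly if y is not None]
    # more than one invalid yearly day-of-week average: no AADT value
    if len(yearly) - len(good_dows) > 1:
        return None
    return sum(good_dows) / len(good_dows)
```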

CDR Analyst also computes a value based on a direct average of all days with at least 90 percent good data, which can be used if an AASHTO-based value cannot be computed because of a threshold. Note that the direct average method may introduce some bias in the resulting annual estimate if the days used to compute the average are skewed toward weekdays or weekends, or toward a particular time of the year (e.g., winter months).

AM and PM peak period values are computed the same way daily values are, except that they use two fixed subsets of data (6 to 9 AM and 3 to 7 PM, respectively). AM and PM peak hour values are determined by computing the one-hour period with the highest volume in their respective 12-hour periods (midnight to noon, noon to midnight), using a moving one-hour window that increments at 5-minute intervals (i.e., 12:00 midnight to 1:00 AM, then 12:05 AM to 1:05 AM, etc.). This is based on the 24-hour average produced by the core process.
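The moving-window peak hour search, sketched over a 288-bin average profile (the function and indices are illustrative assumptions):

```python
def peak_hour(profile, start, end):
    """Highest-volume one-hour window (12 five-minute bins) whose start lies
    in [start, end - 12], sliding in 5-minute steps over a 24-hour profile."""
    best_start, best_volume = None, float("-inf")
    for i in range(start, end - 12 + 1):
        volume = sum(profile[i:i + 12])   # one-hour moving window
        if volume > best_volume:
            best_start, best_volume = i, volume
    return best_start, best_volume        # window start bin and hourly volume

profile = [0.0] * 288                     # 288 five-minute bins in a day
profile[84:96] = [100.0] * 12             # a volume spike from 7:00 to 8:00 AM
am = peak_hour(profile, 0, 144)           # AM peak: midnight to noon
pm = peak_hour(profile, 144, 288)         # PM peak: noon to midnight
print(am)  # (84, 1200.0)
```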

CDR Analyst Contour Algorithm

The 24-hour traffic profile that is computed for one site by CDR Analyst’s core process (described earlier) can also be computed at a number of sites along a corridor. The resulting series of traffic site profile “slices” along the corridor can then be depicted graphically and simultaneously in the form of a topographic-style contour map that shows average traffic conditions as a function of both time of day and corridor location. To create this map, CDR Analyst performs the following steps:

1. CDR Analyst starts with a user-specified batch file that contains a list of data file names. Each data file listed includes traffic data about one site on a corridor. The data file names are listed sequentially in the batch file by milepost location. CDR Analyst uses this batch file to produce a series of 24-hour profiles of average lane occupancy percentage, one at each site listed in the batch file.

2. CDR Analyst then takes the resulting profiles and prepares them to be put into an output file in a matrix format. Because the resulting matrix is then used in Microsoft Excel to create the contour map, and Excel’s contour graphics option requires that the data points be equidistant (i.e., equal spacing along the corridor), it is first necessary for CDR Analyst to take the (usually) irregularly spaced profile slices and convert them to equal spacings. To do this, it performs a linear interpolation between profiles to produce regularly spaced (at 0.5-mile increments) interpolated values. Each interpolated value is based on the profile data from the sites that are closest to the location at which interpolation is being computed, using the formula

valueINT = valueA + [(milepostINT - milepostA)/(milepostB - milepostA)] * (valueB - valueA)

where

location A = location of closest data value on one side of the interpolated location

location B = location of closest data value on the other side of the interpolated location

valueA = data value (average lane occupancy percentage) at location A

valueB = data value (average lane occupancy percentage) at location B

valueINT = desired data value (average lane occupancy percentage) at the interpolation location (0.5-mile spacing)

milepostA = milepost at location A

milepostB = milepost at location B

milepostINT = milepost at the location of interpolation

For example, suppose that average lane occupancy profiles have been computed at a series of sites along a corridor at the following irregularly spaced mileposts:

Profiles at mileposts: 0.2 1.1 1.5 2.2 2.4 2.7 3.7

(irregularly spaced, actual data)

CDR Analyst computes linearly interpolated values at regular 0.5-mile spacings within this range from 0.2 miles to 3.7 miles, i.e., at 0.5, 1.0, ..., 3.5 miles. For each evenly spaced location, the closest actual data on each side is used to perform the linear interpolation. For example, to compute the interpolated value at milepost 0.5, the values at 0.2 and 1.1 are used (corresponding to points A and B in the formula above). The same two values are used to get the interpolated value at milepost 1.0, since 0.2 and 1.1 are still the closest data values. The results would be:

Profiles at mileposts: 0.2 1.1 1.5 2.2 2.4 2.7 3.7

(irregularly spaced, actual data)

Interpolation occurs at: 0.5 1.0 1.5 2.0 2.5 3.0 3.5

(regularly spaced, interpolated)
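The interpolation step can be sketched as follows. This is a paraphrase of the formula and example above; the function name and input layout are illustrative:

```python
import math

def interpolate_profiles(sites, spacing=0.5):
    """Linearly interpolate irregularly spaced (milepost, value) pairs onto a
    regular grid, as done before writing the contour matrix for Excel."""
    k = math.ceil(sites[0][0] / spacing)     # first grid point inside the range
    out, j = [], 0
    mp = k * spacing
    while mp <= sites[-1][0]:
        while sites[j + 1][0] < mp:          # advance to the bracketing pair A, B
            j += 1
        (mp_a, v_a), (mp_b, v_b) = sites[j], sites[j + 1]
        # valueINT = valueA + [(mpINT - mpA)/(mpB - mpA)] * (valueB - valueA)
        out.append((mp, v_a + (mp - mp_a) / (mp_b - mp_a) * (v_b - v_a)))
        k += 1
        mp = k * spacing
    return out

# Occupancy 2.0 at milepost 0.2 and 11.0 at milepost 1.1:
print(interpolate_profiles([(0.2, 2.0), (1.1, 11.0)]))
```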

3. The resulting evenly spaced interpolated (or extrapolated) average lane occupancy profiles are sent to an output file in a matrix form (time vs. milepost location), where they can be prepared as a contour graph. Note that all site locations are based on the milepost of the associated field data collection cabinet (the equipment installed at a freeway site to collect and process data from nearby sensors). Individual sensors do not have separate milepost values.

4. The resulting contour map graphic is color-coded by lane occupancy value. The range of values used for each color corresponds to different levels of traffic congestion, based on the level of service concept described in the Highway Capacity Manual for freeways with a free-flow speed of 65 mph. The following fixed ranges are used:

Color    Lane Occupancy %    LOS       General Traffic Description

green    0 to 10 percent     A, B, C   uninterrupted travel at the speed limit

yellow   10 to 13 percent    D         moderate traffic at or near the speed limit, with restricted movement (e.g., when changing lanes)

red      13 to 19 percent    E         traffic moving at or somewhat below the speed limit, with restricted movement

purple   above 19 percent    F         congested traffic with restricted movement

CDR Analyst Average Travel Time Profile Algorithm

A process similar to that used to develop contour maps can also be used to estimate average travel times from one point to another on a specific corridor, using the following process:

1. For each day of the year (or whatever time period is being analyzed), average volumes and lane occupancy percentages are computed as a function of time of day and location along the corridor segment of interest and the travel direction of interest, using the core algorithm. Multiple lanes are combined into an average per-lane statistic during this process. Data replacement uses a ±15 minute window to attempt to replace any data point that needs to be replaced (either because it is bad/disabled, or because it is suspect data and the user wishes to replace it).

2. These values are used to develop speed estimates, also as a function of time of day and location, for each day of interest. The speed algorithm (described earlier) is used to develop these estimates.

3. At this point in the process, we have estimated speeds as a function of time of day at a series of measurement locations along the corridor for each day, as well as known distances between those measurement locations (based on the milepost of the associated data collection cabinet). These speeds and distances are combined to estimate travel times as a function of the trip start time, for each day. For example, suppose we want to know the travel time from location A to location E, and there are three additional measurement locations between those two locations (B, C, and D). We therefore have the following information:

Measurement Locations:   A (origin)   B    C    D    E (destination)
Estimated Speeds:        sA           sB   sC   sD   sE
Known Distances:         AB           BC   CD   DE

Using this information, a travel time estimate can be developed for the trip by dividing the trip into segments, using each pair of adjacent measurement locations to define segment endpoints (A to B, B to C, etc.), computing the travel time for each segment, then adding the segment times together to get an overall trip time. The resulting travel time formula (developed by D. Dailey of the University of Washington2) assumes that speeds vary linearly from one measurement location to an adjacent measurement location; the approximate formula is:

Total Trip Travel Time = sum of individual segment travel times, where each segment is defined by successive sensor locations

Each segment time ≈ 2 * ∆x * { S + [(∆s)^2 / 3] * S^3 + [(∆s)^4 / 5] * S^5 }

where the segment goes from measurement point i to point (i+1), and

si = speed at location i (one end of the segment)

si+1 = speed at location i+1 (the other end of the segment)

∆x = segment length

∆s = si+1 - si

S = (si+1 + si)^(-1)

If a value is not available because it is determined to be invalid, then the program performs a linear interpolation across the missing data point, based on the two closest speed estimates on each side of the missing value, with the assumption that speeds vary linearly between the two points. In the previous example, if the speed at point C is missing, speeds at points B and D are used, with the assumption that speed between B and D varies linearly. If good data points are not available on both sides of the missing value (e.g., if an endpoint of the trip has a missing value), linear extrapolation is performed using the two closest values. If fewer than two valid data points are available along the route, then interpolation or extrapolation is not possible; therefore no travel time is computed for that trip start time on that day.

4. Because the speed estimates are computed as a function of time of day, travel times can thus be computed as a function of trip start time. For each trip start time (at 5-minute intervals throughout a 24-hour day), a corresponding travel time is computed.

5. Note that as the trip time is built up segment by segment, the most current speed estimate is used at each step along the way. For example, suppose that in the previous example (going from A to E), the trip starts at location A at 7 AM. The process of computing the travel time begins by estimating the travel time from A to B (the first segment), using the formula in step 3 along with the known distance from A to B and the speed estimates for 7 AM at points A and B. If that estimated segment travel time exceeds five minutes (say, seven minutes), the segment travel time from B to C is computed by using the speed estimates at B and C not for 7 AM, but for 7:05 AM, since that is the most current estimate available at the time the traveler reaches location B. This process continues, with each segment time computed by using the most appropriate speed estimates, based on the cumulative travel time up to that point.

2 D. J. Dailey, Travel Time Estimates Using a Series of Single Loop Volume and Occupancy Measurements, Paper 970378, Transportation Research Board, 76th Annual Meeting, January 12-16, 1997, Washington, D.C.

6. For each trip start time (at 5-minute intervals throughout a 24-hour day), the corresponding travel times for all the days in the analysis are averaged.

The result is a 24-hour travel time profile of the average travel time for a specific trip as a function of trip start time. Note that this process assumes that the trip occurs on only one corridor (i.e., the entire trip uses the same milepost numbering sequence, and there are no interchanges or ramps used during the trip).
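The steps above can be sketched in Python. This is a minimal illustration, not the actual CDR Analyst code; the function names and data layout (a list of speed estimates per milepost for each 5-minute slot) are assumptions:

```python
def segment_time_hours(s_a, s_b, dx):
    """Dailey's approximate segment travel time (speeds in mph, dx in
    miles), assuming speed varies linearly between the two endpoints."""
    ds = s_b - s_a
    S = 1.0 / (s_a + s_b)
    return 2.0 * dx * (S + (ds**2 / 3.0) * S**3 + (ds**4 / 5.0) * S**5)

def fill_missing(mileposts, speeds):
    """Linearly interpolate (or extrapolate at the ends) across invalid
    speed estimates, marked None; returns None if fewer than two valid
    points exist, in which case no travel time can be computed."""
    valid = [(x, s) for x, s in zip(mileposts, speeds) if s is not None]
    if len(valid) < 2:
        return None
    filled = []
    for x, s in zip(mileposts, speeds):
        if s is not None:
            filled.append(s)
            continue
        left = [(xv, sv) for xv, sv in valid if xv < x]
        right = [(xv, sv) for xv, sv in valid if xv > x]
        if left and right:
            (x0, s0), (x1, s1) = left[-1], right[0]   # interpolate
        elif right:
            (x0, s0), (x1, s1) = right[0], right[1]   # extrapolate at start
        else:
            (x0, s0), (x1, s1) = left[-2], left[-1]   # extrapolate at end
        filled.append(s0 + (s1 - s0) * (x - x0) / (x1 - x0))
    return filled

def trip_time_minutes(start_slot, mileposts, speeds_by_slot):
    """Travel time for a trip starting at 5-minute slot start_slot.
    speeds_by_slot[slot][i] is the (possibly None) speed estimate at
    milepost i for that slot. Per step 5, each segment uses the speed
    estimates for the slot in effect when the traveler reaches it."""
    elapsed_min = 0.0
    for i in range(len(mileposts) - 1):
        slot = min(start_slot + int(elapsed_min // 5), len(speeds_by_slot) - 1)
        filled = fill_missing(mileposts, speeds_by_slot[slot])
        if filled is None:
            return None                   # fewer than two valid points
        dx = mileposts[i + 1] - mileposts[i]
        elapsed_min += 60.0 * segment_time_hours(filled[i], filled[i + 1], dx)
    return elapsed_min
```

For example, with two 5-mile segments at a constant 60 mph, the trip takes 10 minutes, and the second segment is evaluated with the 7:05 (slot 1) estimates rather than the 7:00 estimates.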

CDR Analyst 90th Percentile Travel Time Profile Algorithm

The process used to develop average travel times is extended to compute the so-called 90th percentile travel time as a function of trip start time. After the travel times for individual days are computed, the number of travel time estimates corresponding to each 5-minute trip start time is counted, and the travel times are sorted in order of increasing duration. For each 5-minute time interval, starting from the shortest time, the sorted list is traversed to the entry that is 90 percent of the way to the longest trip time (i.e., 90 percent of the way to the end of the list); the corresponding travel time is the 90th percentile travel time. This is the time for which 90 percent of the trips with a given start time have a shorter trip travel time.

This value is computed for each 5-minute trip start time interval in a 24-hour time period. The result is a 24-hour travel time profile of 90th percentile travel time as a function of trip start time.
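The rank selection can be sketched as follows. The exact rounding convention for the 90-percent rank is an assumption, since the report does not specify it:

```python
def percentile90(travel_times):
    """90th percentile travel time for one 5-minute trip start slot:
    sort the daily travel times and take the value 90 percent of the
    way up the list (rank rounding is an assumption)."""
    if not travel_times:
        return None          # no trips computable for this start slot
    ordered = sorted(travel_times)
    rank = min(int(0.9 * len(ordered)), len(ordered) - 1)
    return ordered[rank]
```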

CDR Analyst Travel Time Reliability Algorithm

During the process of computing average travel times, a travel time is computed for each day; from this information, an overall average trip speed is also determined. These speeds are tabulated as a function of time of day to determine the percentage of trips that have average speeds less than 45 mph, as follows:

For each 5-minute interval, an “eligible day” counter is incremented once for each day in which a travel time can be computed for a trip starting at that interval. For each day whose trip starting at that interval has an average speed of less than 45 mph, a separate “slow trip” counter is incremented once. After all daily trips are processed, the “slow trip” counter value is divided by the “eligible day” counter, then multiplied by 100 to get the percentage of trips slower than 45 mph over the time period studied. (Alternatively, this can be thought of as the likelihood that a trip on a particular route starting at a particular time will have an average speed of less than 45 mph.)

This percentage is recorded for each trip start time, and is used for the superimposed histogram in the resulting travel time chart. The result is a 24-hour profile of travel time reliability as a function of trip start time.
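The counter logic reduces to a short function. This sketch assumes days without a computable travel time are passed as None (so they never enter the eligible count):

```python
def slow_trip_percent(daily_avg_speeds, threshold_mph=45.0):
    """Percentage of eligible days whose average trip speed, for one
    5-minute start slot, fell below the threshold. None marks a day
    with no computable trip; such days are not eligible."""
    eligible = [s for s in daily_avg_speeds if s is not None]
    if not eligible:
        return None
    slow = sum(1 for s in eligible if s < threshold_mph)
    return 100.0 * slow / len(eligible)
```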

CDR Auto

CDR Auto is a program developed by TRAC to re-format 5-minute traffic data stored on CD into files on a hard drive that can then be used by CDR Analyst. CDR Auto extracts information about mainline GP and HOV lanes for every day on each data CD.


Operating Instructions

This program is a pre-processor for CDR Analyst, and needs to be run only once for each year of data. The resulting output files (which can take up to 2 GB/year on a hard drive) are then used in all subsequent runs of CDR Analyst. To process one year of data (2 to 4 CDs), do the following:

1. Double-click the program icon to start. The most recent version is Rev 1.0 (as of May 1999).

2. Enter the year of the data set that is being processed.

3. Enter the number range of cabinets to be processed. (See the CDR User’s Guide in the Appendix of this report for more discussion of the cabinet numbers.) The entire CD will be processed for the cabinets selected.

4. Repeat this process for each CD in the year of interest. Data files will be appended. (This means that each CD does not generate brand new files; instead, data from subsequent CDs is appended to the files created when the first CD of the year was processed.)

Please read the CDR-Auto readme file for updated information on this utility.

CDR Analyst Utilities

CDR Analyst output is often processed to produce graphical output. The following utilities and templates are available to produce graphics.

Site Graphics

An Excel macro allows individual output files (.out extension) from the CDR Analyst site profile process to be displayed as a volume line graph and histogram. To access the file, open the macro file within Excel, then go to the Tools menu, select “Macro”, and select and open the “histobat” macro.

The macro first asks for the name of a user-written command file (a text file) that lists the CDR Analyst output files to be processed into graphical form. The command file should have a “.bat” extension in its filename. The files listed in the command file should leave off the extension, which is assumed to be “.out”. (The easiest way to produce the command file is to create a file in Excel, and enter the file names in column A.)

The macro then asks if histograms are to be produced. If the answer is yes, the graphs are produced. The macro color-codes the line graph based on the speed profile, using fixed color/speed ranges (red < 45 mph, yellow = 45 to 55 mph, green > 55 mph). The resulting output files have the same name as the input file, except with an “.hst” extension. Each file will contain the original input file in one sheet, and the resulting graph in another sheet.

The user can then specify whether graphs are to be printed out. If so, the graphs are printed to the default printer. The user can also specify whether the headers of the output data files (summary information) are to be printed as well. Note: If you answer No to the question about producing graphs, but Yes to the question about printing, the “.hst” version of each file in the command file will be printed out.

Corridor Graphics

An Excel template allows output from the CDR Analyst contour process to be displayed as a contour (topographic style) map. To use the template, copy the entire output data file from CDR Analyst into the “data” sheet of the template; the “graph” sheet shows the result. In some cases, the milepost range will need to be modified. If significant changes are needed, it may be easier to use the Excel Wizard to create a contour map. Note that the final version of the corridor graphic, with both directions of travel and corridor map art, must be created manually using a drawing program.

Travel Time Graphics

An Excel template allows output from the CDR Analyst travel time process to be displayed as a combination line graph/histogram figure. To use the template, copy the entire output data file from CDR Analyst into the “data” sheet of the template; the “graph” sheet shows the result. In some cases, the titles will need to be modified manually.

CD Data Quality Mapping Utility

CD-based traffic data can be pre-processed to evaluate the level of data quality at each site. An Excel macro accesses the traffic data files and tabulates the data quality on a cabinet-by-cabinet basis. This matrix of counts can then be analyzed to determine if sufficient “good” data exist. The macro processes a user-input command file that lists the data files to be processed.

The macro allows individual data files (.dat extension) created by CDR Auto to be analyzed. To access the macro, open the macro file within Excel, then go to the Tools menu, select “Macro”, and select and open the “flag0123” macro. The macro asks for the name of a user-written command file (a text file) that lists the data files to be processed. The command file should have a “.txt” extension in its filename. The files listed in the command file should include the extension. (The easiest way to produce the command file is to create a file in Excel, and enter the file names in column A.)

The macro will then process each file, extracting and tabulating the number of data points for each cabinet that are in each of the four data quality categories: 0, 1, 2, and 3 (bad, good, suspect, and disabled, respectively). The result will be a matrix of such values, which can be reviewed to determine if sites (cabinets) of interest have sufficient valid data. The resulting output file uses the first eight characters of the last input file name in the command file, with an “.flg” extension. The data files should be in the same folder as the mapping macro.

Note that looking at the amount of good data alone may not be a sufficient measure of valid data, since suspect data can often be good data. If you plan to use data replacement of suspect data, you should, of course, consider the suspect data count as well. Note also that if data is organized by quarters, the output will be similarly organized.
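The tabulation itself amounts to counting flags per cabinet. The sketch below works on already-parsed (cabinet, flag) pairs; the record layout of the actual CDR Auto .dat files is not specified here, and the cabinet identifiers are hypothetical:

```python
from collections import Counter

# Quality flag meanings from the report
FLAGS = {0: "bad", 1: "good", 2: "suspect", 3: "disabled"}

def flag_matrix(points):
    """Tally, per cabinet, how many 5-minute data points carry each of
    the four quality flags. `points` is an iterable of (cabinet, flag)
    pairs; parsing the raw .dat files into such pairs is not shown."""
    matrix = {}
    for cabinet, flag in points:
        matrix.setdefault(cabinet, Counter())[flag] += 1
    return matrix
```

The resulting dictionary of Counters is the "matrix of counts" described above; summing the good (1) and suspect (2) entries for a cabinet gives the usable-data count when suspect-data replacement is planned.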


4. Uses of the FLOW Evaluation Analysis Process

Introduction

In this section, three examples of the use of the evaluation process and tools described in this document are discussed. They include the production of a freeway usage and performance report that summarizes levels of use and performance for the central Puget Sound freeway network; a summary of traffic volumes at selected locations; and freeway system operations diagnostics. The following are descriptions of each example and the process by which the evaluation tool set described in section 3 can be used to produce the desired data.

Example A. Central Puget Sound Freeway Usage and Performance Report

In early 1999, the evaluation process and tool set described in this report were used to create the “Central Puget Sound Freeway Usage and Performance Interim Report” for WSDOT. This report summarized 1997 freeway usage at selected locations in the central Puget Sound freeway network, and also provided performance measures that included volume, speed, and congestion frequency at selected sites, as well as corridor-wide congestion patterns and travel time estimates. Variations in usage and performance as a function of year, lane type (GP or HOV), and weekdays vs. weekends were analyzed.

The following is an outline of that report, with comments about how the analysis was performed, which sections of this document are relevant to those processes, and implementation notes. (The reader is encouraged to refer to the usage and performance report while reading the following comments.)

System Usage

Analysis Types. Two measures of system usage were computed: average annual weekday vehicle volume, and average weekday peak period and peak hour vehicle volumes. Estimates of these measures were made at 13 selected locations in the central Puget Sound freeway network, on the major freeway corridors (see Table 1 for information about measurement locations). Locations were selected based on their traffic significance and the availability of usable data. The values used in this section of the interim report were computed using the CDR Analyst core algorithm described earlier. This was done by using CDR Analyst, with input data corresponding to the locations of interest for the traffic direction, lane type (GP/HOV), and time period (in this case 1997) of interest. The resulting output includes system usage statistics, as well as additional information used elsewhere in the interim report.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for computing these measures.

Note 1. The presence of reversible lanes at selected I-5 and I-90 locations used in the interim report requires particular care. Because the same sensors collect data 24 hours a day, it is necessary to separate the data corresponding to each direction of operation of the reversible lane(s). For example, even if the reversible lanes operate westbound from 1 AM to noon, and eastbound from 1 PM to midnight, the system usage statistics shown in the output from CDR Analyst will reflect the entire 24-hour period, i.e., both directions combined. It is therefore necessary to look at the detailed 24-hour profile information (which is included with the output), and separate the volumes into two categories corresponding to the two directions of operation, in order to get appropriate daily volumes by direction of operation.

Table 1. Reference Set of Measurement Locations

Corridor         Lane Type   Location        Traffic Considerations

I-5 (6 sites)    GP/HOV      S. 272nd        Measures traffic between Seattle and South King and Pierce Co.
                 GP/HOV      S. 170th        Measures traffic between Seattle and South King and Pierce Co.
                 GP/HOV      Albro           Measures traffic from south end to Seattle CBD
                 GP          NE 63rd         Measures traffic between north end and U-District/Seattle CBD
                 GP/HOV      NE 137th        Measures traffic in the north end and Snohomish Co.
                 GP/HOV      128th St. SW    Measures traffic in the north end and Snohomish Co.

I-90 (2 sites)   GP/HOV      Midspan         Measures bridge use
                 GP/HOV      161st Ave. SE   Measures traffic east of I-405 interchange (near Eastgate)

SR520 (3 sites)  GP          76th Ave NE     Measures bridge use
                 WB HOV      84th Ave NE     Measures bridge use
                 GP/HOV      NE 60th         Measures traffic east of I-405 interchange (near Redmond)

I-405 (4 sites)  GP/HOV      SE 52nd         Measures traffic between Bellevue CBD and South King Co.
                 GP/HOV      NE 14th St.     Measures traffic in vicinity of Bellevue CBD
                 GP/HOV      NE 85th         Measures traffic between Bellevue CBD and North King Co.
                 GP/HOV      Damson Rd.      Measures traffic in Bothell/Woodinville area

SR167 (1 site)   GP          TBD             Measures non-I-5 traffic between South King Co. and South Seattle/Eastside

Note 2. The daily system usage statistics are based on the AASHTO method (described in section 3). However, in many cases there may not be sufficient valid data to meet strict data quality standards. In those cases, daily usage based on a direct average of all days with sufficient “good” data is displayed.

Note 3. It is important to study the data to verify that there is sufficient “good” or valid data to develop a meaningful statistic at a given location. One way to do this is to look at the 24-hour profile data included with the statistics, and in particular at the number of good days that each lane contributed to volumes at each 5-minute interval. If the numbers are low relative to the total number of days analyzed (e.g., 261 weekdays in a year), the resulting statistic might not be accurate. A comparison of “good” day counts in each lane could also indicate whether an individual lane’s sensor is having equipment difficulties. Ideally, you will want to verify that the site has sufficient good data prior to analysis. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the site of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

System Performance I: Freeway Corridors (Contours)

Analysis Types. General purpose freeway corridor performance was summarized in this section of the interim report, using average traffic congestion levels by time of day and location (contour maps). Estimates of these measures were made on I-5, I-405, SR 520, and I-90 in the Seattle area. The corridor analyses performed in this section of the interim report were computed using the corridor algorithms described earlier, namely the contour map option. This is done by using CDR Analyst, with input data corresponding to a series of locations along each corridor for each traffic direction, lane type (GP only in this case), and time period (in this case 1997) of interest. The resulting output includes a matrix of estimated congestion level as a function of time of day and location along the corridor, which is converted into a topographic-style contour map.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for corridor profiles, describes the process for computing these measures.

Note 1. It is important to study the distribution of the input data to verify that there is a sufficient distribution of valid data along a corridor to develop a meaningful contour map. Because the contour graph option automatically performs interpolation to fill in areas of missing data, the gaps in the data will not be immediately obvious from the output. It is therefore important to look at the geographical distribution of the initial data points being used along the corridor to identify locations with significant data gaps, and qualify the resulting contour map accordingly. For example, in the 1997 interim report, the south part of I-5 from Tukwila to Boeing Field was left blank in the contour map because of the absence of any valid data. The interpolated results in the north and south ends of I-405 were kept in the report maps, but the sparseness of data in those areas was noted in the text.

Note 2. It is important to study the input data to verify that there is sufficient “good” or valid data at each site used to produce the corridor contour. Insufficient valid data will not be immediately obvious from the output. It is therefore important to look at the data at each location to verify its validity. One way to do this is to look at the 24-hour profile data included with the statistics, and in particular at the number of good days that each lane contributed to volumes at each 5-minute interval. If the numbers are low relative to the total number of days analyzed, the resulting statistic might not be accurate. A comparison of “good” day counts at each lane could also indicate whether an individual lane’s sensor is having equipment difficulties. Ideally, you will want to pre-filter the batch file used to create the contour map by verifying that each site listed in the batch file has sufficient good data. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the site of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

Note 3. Unlike some of the other graphics in the interim report, the contour map graphics were put together manually using several graphics from different sources. The two contours on each graph were produced separately using Excel templates; screen dumps of each contour were exported to a drawing program along with a site map. The resulting pieces were then merged together. Note that in some instances, corridor maps had to be “straightened out” or distorted to better match the linear milepost axis. In any case, the corridor map should be considered only a general guide to locations, and not a precisely calibrated scale.

System Performance I: Freeway Corridors (Travel Times)

Analysis Types. The same data used to produce contour maps can be used to estimate travel times. Three measures of trip-oriented general purpose freeway corridor performance were computed in this section of the interim report: average corridor travel times, 90th percentile corridor travel times, and average travel time reliability. Estimates of these measures were made on I-5, I-405, SR 520, and I-90 in the Seattle area. The travel time analyses performed in this section of the interim report were computed using the three travel time algorithms described earlier (average, 90th percentile, travel time reliability). This is done by using CDR Analyst, with input data corresponding to a series of locations along each corridor for each traffic direction, lane type (GP only in this case), and time period (in this case 1997) of interest. The resulting output is a function of trip start time.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for travel time profiles, describes the process for computing these measures.

Note 1. It is important to study the data to verify that there is sufficient “good” or valid data to develop a meaningful travel time along a corridor. Because the travel time option automatically performs interpolation or extrapolation to fill in areas of missing data, the gaps in the data will not be immediately obvious from the output. It is therefore important to look at the geographical distribution of the data points being used along the corridor to identify any areas with significant gaps, and qualify the resulting travel time profile accordingly. Ideally, you will want to pre-filter the batch file used to create the travel time profile by verifying that each site listed in the batch file has sufficient good data. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the site of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

Note 2. All travel time graphics in the interim report include a small map inset that was created in a separate drawing program, then pasted into the Excel graph.

Note 3. It is important that travel time measurements be based on routes that are significant from a traffic perspective. Routes should consider 1) major origin-destination patterns in the region, 2) major corridors or highways in the region, and 3) data collection locations (especially vehicle sensor locations).

Note 4. In the interim report, travel times were estimated for trips spanning the entire length of each of the major freeway corridors. In future evaluations, plans call for the use of specific trips. For example, the trips shown in Table 2 are a tentative list of such trips (subject to data availability).


Table 2. Potential Routes for Travel Time Estimation and Monitoring

Route #   Origin to Destination            Route Type         Primary Freeway Corridor(s)   Freeway Portion of Route: Start, End   Traffic Considerations/Comments

Route 1   Lynnwood to Seattle CBD          Suburb to Seattle  I-5                           164th SW, Mercer St.                   North-end traffic heading to Seattle on I-5

Route 1A  North Seattle to U-District      Suburb to Seattle  I-5                           NE 117th, NE 44th                      North-end traffic heading to U-District on I-5; historical data available from previous evaluations

Route 2   Federal Way to Seattle CBD       Suburb to Seattle  I-5                           S. 272nd*, Mercer St.                  South-end traffic heading to Seattle on I-5

Route 3   Mtlk. Terrace to Bellevue CBD    Suburb to Suburb   I-405                         I-5 interchg., NE 8th                  North-end traffic heading to Bellevue CBD on I-405

Route 4   Tukwila to Bellevue CBD          Suburb to Suburb   I-405                         I-5 interchg., NE 8th                  South-end traffic heading to Bellevue CBD on I-405

Route 5   Bellevue to Seattle              Suburb to Suburb   SR520/I-5                     NE 60th, Mercer St.                    Eastside-based traffic heading to Seattle on SR520

Route 6   Issaquah to Seattle              Suburb to Seattle  I-90/I-5                      Front St., Mercer St.                  Eastside-based traffic heading to Seattle on I-90

Route 7   Auburn to Renton                 Suburb to Suburb   SR167                         SR18**, I-405 interchg.                South-end traffic staying in south end on SR167

Route 8A  Central Bellevue to Seattle CBD  Suburb to Seattle  I-405/SR520/I-5               NE 8th NB, Mercer St.                  Central Bellevue resident commuting to Seattle; the SR520 option

Route 8B  Central Bellevue to Seattle CBD  Suburb to Seattle  I-405/I-90/I-5                NE 8th SB, Mercer St.                  Central Bellevue resident commuting to Seattle; the I-90 option

Route 9   Redmond to Bellevue CBD          Suburb to Suburb   SR520/I-405                   NE 60th, NE 8th                        Eastside (Redmond) traffic to Bellevue CBD

Route 10  Issaquah to Bellevue CBD         Suburb to Suburb   I-90/I-405                    Front St., NE 8th                      Eastside (Issaquah) traffic to Bellevue CBD

* The southernmost I-5 loop data that is available on-line is at S. 170th. Data from location(s) closer to S. 272nd will be used as they become available.

** The proposed starting location will be at or near the SR18 interchange. The exact starting location will be determined after evaluating recently-activated SR167 loop locations.


System Performance II: Selected Freeway Sites

Analysis Types. Three measures of freeway site performance were computed in this section of the interim report: average traffic volume profile at a site, by time of day; average speed profile at a site, by time of day; and average travel reliability at a site, by time of day. Estimates of these measures were made at a core set of four central freeway measurement locations in the Seattle-area “rectangle” bounded by I-5, I-405, SR 520, and I-90, with one measurement location on each corridor. This was done by using CDR Analyst, with input data corresponding to each location for each traffic direction, lane type (GP, HOV, reversible), and time period (in this case 1997) of interest. The resulting output is a 24-hour traffic performance profile of each site.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for computing these performance measures.

Note 1. The presence of reversible lanes at the I-5 and I-90 locations used in the interim report requires particular care. Because the vehicle sensors collect data 24 hours a day, it is necessary to separate the data corresponding to each direction of operation of the reversible lane(s). For example, if the reversible lanes operate westbound from 1 AM to noon, and eastbound from 1 PM to midnight, the resulting profile graph should note this fact. It is therefore necessary to look at the detailed 24-hour profile and separate the volume profiles into two sections corresponding to the two directions of operation, in order to get an appropriate traffic profile graph for each traffic direction.

Note 2. It is important to study the data to verify that there is sufficient “good” or valid data to develop a meaningful statistic at a given location. One way to do this is to look at the 24-hour profile data included with the statistics, and in particular at the number of good days that each lane contributed to volumes at each 5-minute interval. If the numbers are low relative to the total number of days analyzed (e.g., 261 weekdays in a year), the resulting statistic might not be accurate. A comparison of “good” day counts at each lane could also indicate whether an individual lane’s sensor is having equipment difficulties. Ideally, you will want to verify that the sites have sufficient good data prior to analysis. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the sites of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

System Performance III: Performance Variations

Analysis Types. The following measures of freeway site performance variations were computed in this section of the interim report: 1995 vs. 1997 daily vehicle volumes; 1995 and 1997 weekday vs. weekend daily volumes; and 1995 and 1997 weekday vs. weekend average traffic volume profiles, by time of day and lane type (GP, HOV). Estimates of these measures were made at a core set of four central freeway measurement locations in the Seattle area "rectangle" bounded by I-5, I-405, SR 520, and I-90, with one measurement location on each corridor. This was done by using CDR Analyst, with input data corresponding to each location for each traffic direction, lane type (GP, HOV), and time period (1995 and 1997) of interest. The resulting output is a 24-hour traffic performance profile of each site, as well as daily volume statistics for each combination of year, lane type, and traffic direction.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for computing these performance measures.

Note 1. The presence of reversible lanes at the I-5 and I-90 locations used in the interim report requires particular care. It is necessary to recognize each direction of operation of the reversible lane(s). For example, if the reversible lanes operate westbound from 1 AM to noon, and eastbound from 1 PM to midnight, the resulting profile graph should note this fact. It is therefore necessary to look at the detailed 24 hour profile and separate the volume profiles into two sections corresponding to the two directions of operation, in order to get an appropriate traffic profile graph for each traffic direction.

Also, because the sensors collect data 24 hours a day, it is necessary to separate the data corresponding to each direction of operation of the reversible lane(s) before computing summary statistics. For example, even if the reversible lanes operate westbound from 1 AM to noon, and eastbound from 1 PM to midnight, the daily system usage statistics shown in the output from CDR Analyst will reflect the entire 24 hour period, i.e., both directions combined. It is therefore necessary to look at the detailed 24 hour profile information (which is included with the output), and separate the volumes into two categories corresponding to the two directions of operation, in order to get appropriate daily volumes by direction of travel.
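The directional separation described above can be sketched as follows. This is an illustrative sketch, not the CDR Analyst implementation: the 288-interval profile layout and the operating-hour ranges (taken from the westbound 1 AM to noon, eastbound 1 PM to midnight example in the text) are assumptions.

```python
# Sketch: split a 24-hour reversible-lane volume profile into two
# directional daily totals using the lane's operating schedule.

def directional_daily_volumes(profile_5min, wb_hours=range(1, 12), eb_hours=range(13, 24)):
    """profile_5min: 288 five-minute volumes for one day (12 per hour)."""
    totals = {"WB": 0, "EB": 0}
    for i, v in enumerate(profile_5min):
        hour = i // 12
        if hour in wb_hours:
            totals["WB"] += v
        elif hour in eb_hours:
            totals["EB"] += v
        # intervals outside either operating period are excluded
    return totals
```

Summing each category separately yields the per-direction daily volumes that the combined 24-hour statistic obscures.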

Note 2. It is important to study the data to verify that there is sufficient "good" or valid data to develop a meaningful statistic at a given location. One way to do this is to look at the 24 hour profile data included with the statistics and, in particular, at the number of good days that each lane contributed to volumes at each 5-minute interval. If the numbers are low relative to the total number of days analyzed (e.g., 261 weekdays in a year), the resulting statistic might not be accurate. A comparison of "good" day counts at each lane can also indicate whether an individual lane's sensor is having equipment difficulties. Ideally, you will want to verify that the sites have sufficient good data prior to analysis. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the sites of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

Note 3. The daily system usage statistics are based on the AASHTO method (described in section 3). However, in some cases there may not be sufficient valid data to meet strict data quality standards. In those cases, daily usage based on a direct average of all days with sufficient "good" data is also displayed.

Note 4. The graph of 1997 vs. 1995 change in travel reliability was computed by taking each year's congestion frequency data and subtracting one from the other in a separate spreadsheet.
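The spreadsheet operation described in Note 4 is a simple element-wise difference, sketched below. The function name and list-based input format are illustrative; the original work performed this step in a spreadsheet.

```python
# Sketch: change in travel reliability as an element-wise difference of two
# years' congestion-frequency profiles (one value per time-of-day interval).

def reliability_change(freq_later_year, freq_earlier_year):
    return [a - b for a, b in zip(freq_later_year, freq_earlier_year)]
```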

HOV Lane Network

Analysis Types. The number of vehicles traveling on GP and HOV lanes at selected locations was combined with available data about the average number of persons per vehicle (vehicle occupancy) to compare the number of people using GP and HOV lanes at selected sites on major corridors during the peak period. Three vehicle categories were analyzed: passenger cars, vans, and transit buses. For the cars and vans, the number of vehicles of each type at the site of interest was determined by multiplying the overall vehicle volumes (from CDR Analyst) by the corresponding mode split data from the WSDOT HOV Lane Evaluation project (i.e., the percentage of all vehicles traveling by a site that are of a particular vehicle type). The number of persons traveling past the site in a car or van was then computed by multiplying the number of vehicles of that type by the average number of passengers per vehicle, as determined from research project or transit agency sources (see Note 3). Average bus ridership is obtained from transit agencies and does not need to be computed; however, the bus ridership is adjusted to reflect the actual percentage of buses traveling on that type of lane (GP or HOV). The three resulting person volumes by vehicle type are then added together to produce an overall person volume estimate.

Comparisons of GP and HOV volumes at selected locations on an example corridor were also presented. Average traffic volume profiles were computed at selected sites on I-405, by time of day and lane type (GP, HOV), during November 1998 (in September 1998 the HOV lane was moved from the outside to the inside lane; the November time period was chosen to reflect the new HOV lane placement). The traffic volume profiles were computed by using CDR Analyst, with input data corresponding to each location for each traffic direction and lane type (GP, HOV) of interest. The resulting output is a 24-hour traffic performance profile of each site, as well as daily volume statistics for each combination of lane type and traffic direction.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for computing the vehicle volumes. The peak period person volume formula is as follows:

Total persons carried = (# of persons in cars) + (# of transit riders) + (# of van riders)

                      = (Total Veh. Vol.) * (% Car) * (ACO) + (Bus Ridership) * (HOV/GP Bus Dist.)
                        + (Total Veh. Vol.) * (% Van) * (VanOcc)

where

Total Veh. Vol.      Total number of vehicles traveling at that site (for a given direction of travel and lane type)

%Car, %Van           Percentage of vehicles that are cars or vans traveling at that site (for a given direction of travel and lane type)

Bus Ridership        Total number of peak period riders at a site traveling in a given direction on a given lane type

ACO                  Average car occupancy at that site (weighted average occupancy of 1, 2, 3 and 4+ person cars)

HOV/GP Bus Dist.     Percentage of buses that travel in a given lane type at that site (GP or HOV)

VanOcc               Average van occupancy at that site (including driver)

The data sources are:

Total Vehicle Volume     CDR data (1997 yearly peak period average)

% Car, % Van             ACO data from HOV Lane Evaluation Phase IV Report (Q2 and Q3 1997 data)

Average Car Occupancy    HOV Lane Evaluation Phase IV Report (Q2 and Q3 1997 data)

Bus Ridership            Transit agencies (1995 peak period data)

Bus Dist                 ACO data from Phase IV Report (Q2 and Q3 1997 data)

Average Van Occupancy    King County Metro Rideshare Operations Performance (9.74 for 1996 including driver, 9.58 for 1997, 9.27 for 1998)
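The person volume formula above can be sketched directly as code. This is an illustrative sketch: the function and parameter names mirror the formula's terms, and the numeric inputs in the example are placeholders, not measured values from the report.

```python
# Sketch of the peak-period person-volume formula:
# persons = cars + transit riders + van riders.

def total_persons_carried(total_veh_vol, pct_car, aco,
                          bus_ridership, bus_dist, pct_van, van_occ):
    persons_in_cars = total_veh_vol * pct_car * aco
    transit_riders = bus_ridership * bus_dist
    van_riders = total_veh_vol * pct_van * van_occ
    return persons_in_cars + transit_riders + van_riders

# Hypothetical HOV-lane example: 1,000 vehicles, 85% cars at 2.2 persons each,
# 2,400 bus riders with 90% of buses in the HOV lane, 5% vans at 9.58 persons.
print(total_persons_carried(1000, 0.85, 2.2, 2400, 0.90, 0.05, 9.58))
```

Each lane type and direction would be evaluated separately with its own inputs, then compared.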

Note 1. In the interim report, person volumes were estimated only for selected locations in the central Puget Sound region. In the future, person volumes are tentatively scheduled to be estimated for as many of the usage sites listed in Table 1 as possible, subject to data availability. Table 3 is an example of potential data source information that was collected in 1996 to estimate person volumes. These data


Table 3. Example of Supporting Data Location Table for Reference Measurement Sites (1997)

Corridor  Lane Type   Volume, Lane Occupancy       Average Car Occupancy (ACO)**   Average Bus Occupancy   Comments
                      Location       Cabinet #     Location         Site #         Location

I-5       GP/HOV      S. 272nd       TBD           S. 216/70th E.   34 or 92       TBD                     no loops
          GP/HOV      S. 170th       059D          S. 216th         34             S. 170th
          GP/HOV      Albro          086D          Albro            25             Albro
          GP          NE 63rd        143D          NE Northgate     16             NE 63rd
          GP/HOV      NE 137th       165D          NE 145th         14             NE 137th
          GP/HOV      128th St. SW   213D          NE 145th         14             TBD                     no SB HOV

I-90      EB GP       Midspan        858D          Island Crest     54             Midspan
          WB GP       Midspan        857D          Island Crest     54             Midspan
          HOV         Midspan        857D          Lk Wa. Blvd.     52             Midspan
          GP/HOV      161st Ave SE   910D          Newport Wy.      57             TBD

SR520     EB GP       76th Ave NE    514D          92nd Ave NE      42             Midspan
          WB GP       76th Ave NE    514D          92nd Ave NE      42             Midspan
          WB HOV      84th Ave NE    516R          92nd Ave NE      42             Midspan
          GP/HOV      NE 60th        544D          148th Ave NE     45             TBD

I-405     GP/HOV      SE 52nd        662D          112th Ave. SE    65             SE 52nd
          GP/HOV      NE 14th St.    696D          NE 4th St.       73b            NE 14th St.
          GP/HOV      NE 85th        716/717       NE 85th          81             NE 85th
          GP/HOV      Damson Rd.     761D          NE 85th          81             TBD

SR167     GP          TBD            TBD           S. 208th         98             S. 208th                loops TBD

Note: Data sources in this list were based on information collected during 1996. This entire list should be updated prior to future evaluations.


locations are based on measurement data being collected as of 1996, and are likely to change by the time future evaluations take place. Nevertheless, this table illustrates the importance of evaluating data availability as well as the proximity of data collection locations associated with a particular site (i.e., are volume and vehicle occupancy measurements for a given site collected near enough to one another?).

Note 2. The daily system usage statistics are based on the AASHTO method (described in section 3). However, in some cases there may not be sufficient valid data to meet strict data quality standards. In those cases, daily usage based on a direct average of all days with sufficient "good" data is also displayed.

Note 3. The passenger vehicle occupancy data were obtained from the WSDOT HOV Lane Evaluation project's field measurements, which also include mode split data for major vehicle types (what percentage of all vehicles at a site are cars, buses, vans, etc.). The HOV Lane Evaluation project's web site is <http://www.wsdot.wa.gov/eesc/atb/atb/hov/Titlepg.html>. Transit and van per-vehicle ridership data were provided by local transit agencies.

Note 4. It is important to study the data to verify that there is sufficient "good" or valid data to develop a meaningful statistic at a given location. One way to do this is to look at the 24 hour profile data included with the statistics and, in particular, at the number of good days that each lane contributed to volumes at each 5-minute interval. If the numbers are low relative to the total number of days analyzed (e.g., 261 weekdays in a year), the resulting statistic might not be accurate. A comparison of "good" day counts at each lane can also indicate whether an individual lane's sensor is having equipment difficulties. Ideally, you will want to verify that the sites have sufficient good volume data prior to analysis. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the sites of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

Example B. WSDOT Northwest Region Ramp and Roadway Traffic Volume Report

The Traffic Systems Management Center (TSMC) of WSDOT Northwest Region produces a Ramp and Roadway Traffic Volume Report approximately every two years. This report includes average weekday volumes as well as AM and PM peak hour volumes at selected ramp and mainline (GP, HOV, and reversible) locations on freeways throughout the Northwest Region. The traffic counts are taken during several months of the year using a combination of electronic sensors (inductance loops) and portable tube counters. Weekday volumes are based on Tuesday through Thursday counts. Peak hour volumes represent the highest one-hour volume during the AM or PM. The report also lists the one-hour peak period if it falls outside the fixed peak periods of 6 AM to 9 AM and 2:30 PM to 6 PM.
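The peak hour definition above (the highest one-hour volume) can be sketched as a rolling sum over 5-minute counts. This is an illustrative sketch, not the TSMC's procedure: the function name and the assumption of 288 five-minute intervals per day are made here for demonstration.

```python
# Sketch: find the peak one-hour volume from 5-minute counts as the highest
# rolling 60-minute (12-interval) total.

def peak_hour(volumes_5min):
    """Return (start_interval, volume) of the highest rolling one-hour total."""
    best_start, best_vol = 0, 0
    for start in range(len(volumes_5min) - 11):
        vol = sum(volumes_5min[start:start + 12])
        if vol > best_vol:
            best_start, best_vol = start, vol
    return best_start, best_vol
```

The start interval can then be converted to a clock time and checked against the fixed 6 AM to 9 AM and 2:30 PM to 6 PM windows.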

The evaluation process and tool set described in this report can be used to produce some of the data for this report. The following describes the portions of the ramp and roadway traffic volume report analysis that could be performed using the tool set. (The reader is encouraged to refer to the ramp and roadway report while reading the following comments.)

Daily and Peak Hour Weekday Volumes

Analysis Types. Two measures of system usage are summarized in the ramp and roadway report: average weekday vehicle volume and average peak hour vehicle volume. Estimates of these two measures were made at locations throughout the freeway network included in the Northwest Region. Highways included in the report were I-5, SR 18, SR 167, I-405, SR 518, SR 520, and I-90. These two usage measures can be computed by the FLOW evaluation tool set, provided that electronic sensor data is available. This is done by using CDR Analyst, with input data corresponding to the locations of interest for the traffic direction, lane type (GP/HOV/reversible), and time period of interest. The resulting output includes summary usage statistics, as well as a more detailed 24-hour volume profile at 5-minute intervals throughout an average day.

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for computing these usage statistics.

Note 1. The tool set described in this document operates using archived electronic sensor data only. Therefore, it can only provide statistics for sites with sensor installations. Locations in the report that do not have corresponding or nearby sensors cannot be analyzed with this method.

Note 2. The sensor sites might not always correspond precisely with locations used in past ramp and roadway traffic volume reports.

Note 3. The presence of reversible lanes at selected locations requires particular care. Because the sensors collect data 24 hours a day, it is necessary to separate the data corresponding to each direction of operation of the reversible lane(s). For example, even if the reversible lanes operate westbound from 1 AM to noon, and eastbound from 1 PM to midnight, the system usage statistics shown in the output from CDR Analyst will reflect the entire 24 hour period, i.e., both directions combined. It is therefore necessary to look at the detailed 24 hour profile information (which is included with the summary statistic output), and separate the volumes into two categories corresponding to the two directions of operation, in order to get appropriate daily or peak hour volumes by direction of travel.

Note 4. In most cases, the normal hours of each direction of reversible lane operation correspond well with the midnight-to-noon and noon-to-midnight time periods used for AM and PM peak hour statistics, respectively, so the standard CDR Analyst peak hour statistics output will still be valid. Peak hour volumes reported by CDR Analyst should nevertheless be reviewed to verify that they take the previous caveat about reversible lane operation into account.

Note 5. The statistics used in the report are based on Tuesday through Thursday data for selected months (e.g., March, April, May). In contrast, CDR Analyst offers the option to choose any time period up to the entire year, if data is available. While the use of 12 months of data might be considered a more appropriate measure in some situations, it might also be worthwhile to use the same data collection periods as previous ramp and roadway reports to maintain year-to-year consistency of results over time. Reconstructing a previous set of results using CDR Analyst, then comparing the original and Analyst-based sets of values to determine seasonal variations, might be useful in this regard. The user has the option to create data files using CDR that contain only Tuesday through Thursday data (see section 3, Overview: Selecting data to be processed by CDR Analyst).
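Selecting the Tuesday through Thursday dates for chosen months, as described in Note 5, can be sketched with the standard library. This sketch is illustrative only (CDR performs this selection itself); the function name and default month list are assumptions.

```python
# Sketch: list the Tuesday-Thursday dates in selected months of a year,
# mirroring the ramp and roadway report's collection periods.
import datetime

def tue_thru_thu_dates(year, months=(3, 4, 5)):
    d = datetime.date(year, 1, 1)
    one_day = datetime.timedelta(days=1)
    out = []
    while d.year == year:
        # Monday is weekday 0, so Tuesday-Thursday are 1, 2, 3.
        if d.month in months and d.weekday() in (1, 2, 3):
            out.append(d)
        d += one_day
    return out

print(len(tue_thru_thu_dates(1998)))  # number of Tue-Thu days, Mar-May 1998
```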

Note 6. Ideally, you will want to verify that the sites of interest have sufficient good volume data prior to analysis. One way to do this is to use the data quality mapping option to get an overview of the quality of available data at the sites of interest (the mapping option is described later in this section as Example C, System Operations Diagnostics).

Example C. System Operations Diagnostics

In addition to estimating freeway usage and performance, the tool set described in this report can also provide diagnostic information about the operation of traffic systems management and data collection field installations, and can contribute to a better understanding of the validity of summary statistics. In the process of developing the usage and performance measures for the central Puget Sound Freeway Usage and Performance report, this diagnostic capability was used in three different situations. The following is a description of each situation and the methods used to diagnose the problem in each case.


Data Quality Mapping

Analysis Types. The analysis performed by CDR Analyst is of course highly dependent on the quality of the original sensor data being processed. The ability to evaluate the level of quality prior to an analysis can save time and enhance the efficiency of the analytical process. For example, in the process of developing site and corridor analyses in the freeway usage and performance report, it became clear that in some situations there were significant quantities of suspect or invalid data that affected the process of computing measures of performance. In some cases, data quality problems could be anticipated by referring to available information about construction projects that were likely to disrupt data collection devices during the time period of interest. In other cases, however, there was not always a prior indication that sufficient valid data might not be available. In those situations, time was spent setting up and running analyses, only to discover that the results were either of questionable quality because of insufficient "good" data, or were clearly unusable. Initially, this problem was dealt with by performing random "spot checks" of the data set prior to analysis. This was tedious and often inconclusive. To better address this issue, a standalone spreadsheet-based prototype tool was developed to analyze the traffic data files (output files from the CDR Auto program) and compute the percentages of "good", "suspect", "bad", and "disabled" data at each measurement site, on a quarter-by-quarter basis throughout the year. This information was then summarized in graphical form. This summary was useful in determining which locations were likely to be problematic from a statistics computation point of view, as well as which sites should be skipped altogether when developing contour maps or computing travel times. Figure 7 shows an example of the mapping output.
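The tallying step of the data quality mapping idea can be sketched as follows. This is an illustrative sketch, not the spreadsheet prototype itself: the input format (a list of per-interval records of site, quarter, and quality flag) is an assumption and does not reflect the actual CDR Auto file layout.

```python
# Sketch: compute the percentage of "good", "suspect", "bad", and "disabled"
# flags per measurement site, per quarter.
from collections import Counter, defaultdict

def quality_summary(records):
    """records: iterable of (site, quarter, flag) tuples, one per interval."""
    tallies = defaultdict(Counter)
    for site, quarter, flag in records:
        tallies[(site, quarter)][flag] += 1
    summary = {}
    for key, counts in tallies.items():
        total = sum(counts.values())
        summary[key] = {flag: 100.0 * n / total for flag, n in counts.items()}
    return summary

recs = [("ES-130", "Q1", "good")] * 3 + [("ES-130", "Q1", "bad")]
print(quality_summary(recs)[("ES-130", "Q1")])  # {'good': 75.0, 'bad': 25.0}
```

The resulting per-site, per-quarter percentages are what a mapping display such as Figure 7 would plot.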

Relevant Sections of this Document. Section 3, CDR Data Quality Mapping Utility operating instructions, describes the process for running this program.

Note 1. The tool set described in this document summarizes the amount of data that is tagged as "good." However, the user must still determine the minimum amount of "good" data that is considered acceptable for the type of analysis being performed. This minimum threshold should take into account not only the overall amount of valid data, but also its temporal distribution. For example, a minimum standard of at least 50 percent good data for the entire year could be met either by having 50 percent good data uniformly throughout the year, or by having 100 percent good data in two quarters of the year and 0 percent good data in the other two; in the latter situation, the resulting average yearly statistics might not fully reflect seasonal variations. The precise acceptance criteria are a user decision.
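One way to guard against the seasonal-gap problem in Note 1 is to require a minimum share of good data in every quarter as well as overall. This is an illustrative policy sketch, not part of the tool set; both threshold values are assumptions the user would set.

```python
# Sketch: a yearly threshold alone can hide seasonal gaps, so also require
# a per-quarter minimum share of good data.

def meets_quality_threshold(pct_good_by_quarter, yearly_min=50.0, quarterly_min=25.0):
    yearly = sum(pct_good_by_quarter) / len(pct_good_by_quarter)
    return yearly >= yearly_min and min(pct_good_by_quarter) >= quarterly_min

print(meets_quality_threshold([50, 50, 50, 50]))  # True: uniform coverage
print(meets_quality_threshold([100, 100, 0, 0]))  # False: two empty quarters
```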

Note 2. The data replacement option in CDR and CDR Analyst could also affect the resulting analysis. For example, if the user chooses to use data replacement, transient suspect or bad data points that are surrounded by good values will be replaced by nearby good data. Extended periods of suspect or bad data, however, might not be replaced if the data replacement window is small (e.g., default = ± 15 minutes).
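The windowed behavior described in Note 2 can be sketched as below. This is an illustrative guess at the mechanism, not the actual CDR replacement algorithm: in particular, the rule of averaging nearby good values is an assumption made here for demonstration.

```python
# Sketch: replace a transient suspect/bad 5-minute value only when good
# values exist within the replacement window (+/- 15 minutes = 3 intervals).

def replace_transients(values, flags, window=3):
    out = list(values)
    for i, flag in enumerate(flags):
        if flag == "good":
            continue
        lo, hi = max(0, i - window), min(len(values), i + window + 1)
        good = [values[j] for j in range(lo, hi) if flags[j] == "good"]
        if good:
            out[i] = sum(good) / len(good)
        # extended bad runs with no good neighbors stay unreplaced
    return out
```

This makes the trade-off in the note concrete: isolated bad points are repaired, while long outages longer than the window pass through untouched.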

Note 3. It might be appropriate to consider data quality categories besides "good" data. For example, the standard for data tagged as "suspect" is conservative. As noted earlier in this report, a given 5-minute data value is tagged as suspect if even one of the fifteen 20-second values that make up that 5-minute value is considered to be unrealistic. In a number of situations during the data analysis process for the usage and performance report, a value labeled as "suspect" was consistent with the "good" values that preceded and followed it, suggesting that the value was likely to be acceptable. This possibility is the reason why CDR and CDR Analyst both offer the option of accepting suspect values as valid data.


[Figure 7. Sensor Data Quality by Location and Time of Year: I-5 North of Seattle (1997). For each data cabinet along I-5 (mileposts 165 through 186, cabinets 111 through 213, with landmarks at Northgate Way, NE 175th, and the county line), the figure maps the percentage of "good" data by quarter (Q1 through Q4), southbound and northbound.]


This should be taken into account when determining the user's minimum threshold for valid data. For example, rather than base the threshold on the percentage of good data alone, it may be useful to consider a combination of good and suspect data in the threshold computations. NOTE: A user's decision to treat suspect data as valid when determining the data quality of a site should be linked with the user's response to the CDR/CDR Analyst option to accept suspect data as valid. If the user chooses to accept suspect data as valid, it would then be appropriate for the user to consider the combined amount of good and suspect data when evaluating data quality thresholds.

Note 4. The data quality tool provides statistics for the time intervals that are reflected in each data file. If there is one quarter of data per file, the data quality tool will compute quarter-by-quarter data quality information. If one file contains a year's worth of data, yearly data quality totals are provided.

Electronic Sensors I

Analysis Types. The ability to see detailed time-of-day volume patterns from individual lane sensors using CDR Analyst can be useful in spotting potential equipment problems. For example, in the process of developing GP vs. HOV analyses in the freeway usage and performance report, the data at one location on I-405 suggested that HOV volumes were about the same as those of the GP lanes (see Figure 8). Although HOV volumes on this corridor are significant, it seemed unlikely that the per-lane vehicle volume of an HOV lane would equal the average per-lane vehicle volume of the two GP lanes. After studying the data collected by each lane individually in a one-day sample at that site, and comparing the 24-hour profiles to those of nearby sites, it appeared that one of the GP lanes had been inadvertently wired as if it were an HOV lane, and the HOV lane sensor had been wired as a GP lane (see Figure 9). As a result, the HOV lane statistics were actually reporting GP lane performance, while the GP per-lane performance was being reduced because it was averaging in one lane of HOV volumes, thus accounting for the HOV lane's apparently high volumes relative to the average GP lane at that site. In this case, CDR Analyst was able to diagnose a potential equipment installation issue.
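The symptom that revealed the swapped wiring, an HOV lane whose per-lane volume matches the GP per-lane average, can be sketched as a simple screening check. This is an illustrative heuristic, not part of CDR Analyst; the function name and the 0.9 ratio threshold are assumptions.

```python
# Sketch: flag a site where the HOV lane's per-lane volume is suspiciously
# close to (or above) the GP per-lane average, suggesting possible miswiring.

def hov_gp_swap_suspected(hov_volume, gp_lane_volumes, ratio_threshold=0.9):
    gp_avg = sum(gp_lane_volumes) / len(gp_lane_volumes)
    return hov_volume >= ratio_threshold * gp_avg

print(hov_gp_swap_suspected(1800, [1850, 1900]))  # True: worth a closer look
print(hov_gp_swap_suspected(900, [1850, 1900]))   # False
```

A flagged site would then warrant the per-lane, one-day profile comparison described above before its statistics are trusted.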

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for obtaining 24-hour profiles.

Note 1. CDR Auto creates input files for CDR Analyst that include all the lanes of a given direction and type at each site. To obtain individual lane data, CDR should be run for each lane separately to produce the input data files for CDR Analyst.

Electronic Sensors II

Analysis Types. CDR Analyst can also help determine the validity of summary statistics. For example, in the process of developing peak hour and daily volumes in the freeway usage and performance report, data at the midspan of I-90 suggested that reversible lane volumes were significantly lower at the midspan than at other nearby reversible lane locations. This transient drop in volume did not appear in the volume estimates reported in the ramp and roadway volume report, suggesting that a small study should be performed to determine possible explanations for the difference in values. Several hypotheses were considered:

• Because the CDR Analyst results were for 1997, but the ramp and roadway results used for comparison were for 1996 and 1998 (1997 results were not available), there was a possibility that the 1997 drop in reversible lane midspan volume was real. While it seemed unlikely that there would be a transient dip in 1997 midspan volumes given the general upward trend in other years, this possibility was checked by re-running the CDR Analyst data for 1998 and comparing the results to the same year's ramp and roadway values. Furthermore, because the CDR Analyst results were based on a year of weekday data, while the ramp and roadway values were based on selected measurements from a four-month period (March through June 1998, Tuesdays through Thursdays), there was a possibility that seasonal variations might account for at least some of the difference. So, CDR Analyst data was collected for a comparable time period (March through June 1998, Wednesday values) to reduce this possibility.

[Figure 8. Estimated Weekday Volume Profile: GP and HOV Lanes. I-405 at SE 52nd St, southbound, November 1998; vehicles per lane per hour (VPLPH) by time of day.]


[Figure 9. Per-Lane Volume Profile: GP and HOV Lanes. I-405 SE 52nd St southbound (November 11, 1998); five-minute vehicle volume by time of day for the HOV lane, GP lane 1, and GP lane 2.]


• The CDR Analyst process might be systematically underestimating volumes, or the ramp and roadway process might be systematically overestimating volumes. Although measurements at other locations did not suggest that either was the case, the possibility was tested by using CDR Analyst not just at the midspan of I-90 but along a series of other locations along the I-90 corridor from Seattle east toward Bellevue, and comparing the results to ramp and roadway volumes.

• There was a possibility of an equipment problem at the midspan that was producing an underestimate at that one location. To study this issue, a data quality check was performed to study the quantity of valid data (see the earlier example, "Data Quality Mapping").

The results of a series of comparisons along the I-90 corridor show that, with the exception of the midspan location, the measurement sites show good agreement between the two sets of volumes, suggesting that the difference at the midspan is not due to seasonal or year-to-year fluctuations (see Figure 10). Furthermore, the agreement between the two sets of values did not support the idea that either process was universally underestimating or overestimating volumes (although since both used loop data, this was inconclusive). Looking specifically at the midspan values, the noticeable drop in volume at the midspan measurement location does not seem likely given that the measurement site just to the west, near the west highrise of the bridge, has a much higher volume even though there are apparently no on- or off-ramps between the two measurement sites. Furthermore, the overall trend of volumes at successive locations along the corridor also supports the notion that this sudden drop in volume is not correct. A data quality study of data cabinets along the corridor showed that although the results from the cabinet (857) associated with the midspan measurements had a high percentage of "good" data, adjacent cabinets 852, 854, and 855 all had moderate to significant quantities of "suspect" or bad data. Another clue is that cabinet 854, which has a smaller percentage of good data, is located at the same milepost as midspan cabinet 857. Finally, the ramp and roadway report indicates that midspan values are "unavailable" for 1998, suggesting that there might have been a data collection problem at that location. Taken together, these results, while not conclusive, all suggest that the CDR Analyst results should not be taken at face value at this site, and that a more detailed equipment study might be useful.
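The corridor-trend reasoning above (a site whose volume drops sharply relative to its neighbors is suspect) can be sketched as a screening pass. This is an illustrative heuristic, not part of the tool set; the 30 percent drop threshold and the example volumes are assumptions.

```python
# Sketch: flag a measurement site whose volume falls well below the average
# of its two corridor neighbors, as with the I-90 midspan anomaly.

def anomalous_sites(site_volumes, max_drop=0.30):
    """site_volumes: ordered (name, volume) pairs along the corridor."""
    flagged = []
    for i in range(1, len(site_volumes) - 1):
        name, vol = site_volumes[i]
        neighbor_avg = (site_volumes[i - 1][1] + site_volumes[i + 1][1]) / 2
        if vol < (1 - max_drop) * neighbor_avg:
            flagged.append(name)
    return flagged

corridor = [("W. highrise", 8000), ("Midspan", 4500), ("First Hill Lid", 7800)]
print(anomalous_sites(corridor))  # -> ['Midspan']
```

As the text cautions, such a flag is only a clue; it should be combined with data quality maps and independent counts before concluding that equipment is at fault.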

Relevant Sections of this Document. Section 3, CDR Analyst operating instructions for site profiles, describes the process for obtaining 24-hour profiles.

Note 1. As noted above, CDR Analyst and the ramp and roadway report both use loop data; agreement of their respective results does not necessarily mean that both sets of results are correct. However, the use of cumulative data from a number of sources (ramp and roadway data, data quality maps, detailed CDR Analyst results), combined with user experience, can help focus diagnostic activities and suggest likely possibilities.

Page 58: FLOW EVALUATION DESIGN TECHNICAL REPORTdepts.washington.edu/trac/bulkdisk/pdf/466.2.pdf · Technical Report Research Project T9903, Task 62 FLOW Evaluation FLOW EVALUATION DESIGN

FLOW Evaluation Design Technical Report v2 53

Figure 10. Comparison of CDR Analyst and Ramp and Roadway Report Estimates: Peak Hour and Daily Volumes on I-90 Reversible Lanes (Spring 1998)

[Bar chart: peak hour or daily vehicle volume (0 to 10,000) at measurement locations along the corridor (East of Rainier Ave S., E. of Mt. Baker Tunnels, W. part of bridge, Midspan, First Hill Lid, E. of 77th Ave SE, between 80th Ave SE and Island Crest Way, Luther Burbank Lid, Shorewood Drive, and E. of East Mercer Way), comparing ramp and roadway report (RR) and CDR Analyst (CDR) AM, PM, westbound, and eastbound values.]


5. Analytical Assumptions and Limitations

Overview

To develop an evaluation process that can synthesize a large data set in a timely manner and also produce performance measures that are succinct and understandable, some assumptions were made regarding the algorithms that are used, and limits were set on the capabilities and intended purpose of the evaluation method and the accompanying tool set. This section presents a summary checklist of considerations that the prospective user should keep in mind when producing, modifying, or interpreting analytical results based on the FLOW evaluation method and tool set.

The user is strongly encouraged to review the discussion in this section, as well as the comments in the preceding two sections of this document regarding the assumptions or limitations of specific aspects of the evaluation process or the analytical tools, before using the FLOW evaluation tools or interpreting their analytical results.

Input Data Considerations

The FLOW evaluation process starts with, and is dependent upon, the 5-minute CD-based traffic data. The following considerations should be kept in mind when designing an evaluation effort that uses this data set:

1. Sensor availability at the location(s) and time periods of interest.

The traffic data is recorded at approximately 0.5-mile intervals along the major freeway facilities. If the analysis must take place at a precise location, it is important to determine whether an available sensor is located close enough for the desired analysis.

Note also that a measurement site’s milepost value is based on the location of the associated data cabinet. Furthermore, a single field data cabinet collects information from multiple sensors. Therefore, the milepost value is only approximate; it is not necessarily the exact location of an individual vehicle sensor.

2. Data availability at the location(s) and time periods of interest.

Even when a sensor is available at a desired location, data availability can be disrupted by equipment problems or construction activity. It is important to evaluate the data that would be used in an analysis to determine whether sufficient valid data is available during the time periods of interest.


3. Data quality at the location(s) and time periods of interest.

Each archived 5-minute data value has an associated data quality flag that summarizes the data quality of its constituent fifteen 20-second values. The 5-minute data validity codes include “good” (all 15 constituent values are considered valid), “bad” (all 15 values are considered invalid), “disabled” (all 15 values were collected when the data collection equipment at that sensor site was disabled or not operational), and “suspect” (all other combinations of 15 data point conditions). Data validity for each 20-second value is determined by checking the equipment status as well as the nature of the values themselves. If a 20-second data point indicates that the loop is hung (i.e., the sensor is stuck in the “on” position for a longer than reasonable time period), or consists of a volume and lane occupancy combination that is outside an envelope of “reasonable” values (based on earlier research), the data point is considered bad or invalid. (The thinking behind these thresholds is that a prolonged “on” condition by the sensor or a highly unusual volume/occupancy value combination is symptomatic of equipment difficulties.) If the operator has disabled the operation of the sensor, the 20-second data point is labeled disabled. In all other cases, the data point is labeled good.
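The roll-up from fifteen 20-second flags to one 5-minute flag can be sketched as follows. This is an illustrative reading of the rules above; the function and flag names are not from the CDR implementation itself.

```python
def classify_5min(flags_20s):
    """Summarize fifteen 20-second validity flags into one 5-minute flag.

    "good"/"bad"/"disabled" require a unanimous set of 20-second flags;
    any other mix of conditions yields "suspect".
    """
    assert len(flags_20s) == 15
    if all(f == "good" for f in flags_20s):
        return "good"
    if all(f == "bad" for f in flags_20s):
        return "bad"
    if all(f == "disabled" for f in flags_20s):
        return "disabled"
    return "suspect"  # any other combination of 15 data point conditions
```

For example, fourteen good values and one bad value produce a "suspect" 5-minute flag, since none of the unanimous cases applies.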

The 5-minute data flags are indicators that should be used to evaluate the overall quality of the prospective data set. The user then has the option to use the data set as is, to reject a data set, or to use the data set but attempt to “fix” data that is labeled as other than good (see next item).

4. Data quality threshold for “good” and “suspect” data.

The user has the option to specify the minimum amount of “good” data that is required to complete a computation. There is no absolute guideline for the minimum amount of good data that is required to develop a performance measure that one can feel confident about. However, a data quality map of the prospective data set will indicate whether a significant quantity of good data is available. (See section 4, Example III: Data Quality Mapping for additional details.) The user also needs to specify whether “suspect” data is assumed to be good. The user’s choice for this option can also affect the level of confidence that the user has in the analysis results.

As an example, a minimum threshold of 90 percent good data was used for the central Puget Sound Freeway Usage and Performance report, and suspect data was replaced by CDR Analyst whenever feasible during the analysis for that report, with the data replacement window set to ± 15 minutes.
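The “fix” step can be illustrated as below. The report states only that suspect data was replaced within a ± 15-minute window (three 5-minute intervals on either side); the averaging rule used here is an assumed stand-in for CDR Analyst's actual replacement method.

```python
def repair_suspect(values, flags, window=3):
    """Replace suspect 5-minute values using nearby good values.

    window=3 corresponds to a +/- 15-minute replacement window of
    5-minute intervals. The averaging rule is an assumption; the report
    does not specify the exact replacement computation.
    """
    repaired = list(values)
    for i, flag in enumerate(flags):
        if flag != "suspect":
            continue
        lo, hi = max(0, i - window), min(len(values), i + window + 1)
        good = [values[j] for j in range(lo, hi) if flags[j] == "good"]
        if good:  # replace only when the window holds usable data
            repaired[i] = sum(good) / len(good)
    return repaired
```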

5. Data resolution for the analysis to be performed.

The data is archived at 5-minute intervals. This is sufficient for many types of analyses; however, for detailed studies that focus on short-term or highly variable traffic measurements, 5-minute data might not be appropriate. For example, some types of ramp analyses might fit this category.

6. Time period for the analysis to be performed.

It is important to select an appropriate time period for the analysis. This includes the day-of-week selections (weekday vs. weekend vs. seven day) as well as the overall time period (week, month, year).


7. Other scheduling considerations.

As noted above, other activities such as construction can have a significant impact on data availability. In addition, the nature of the construction project or other disruption can affect the selection of the time period of the analysis. For example, an HOV study should take into consideration a construction project that adds or disrupts an HOV lane (and possibly affects associated vehicle sensors in the pavement), or moves the HOV lane from an outside to an inside lane.

Summary Statistic Considerations

The computation of summary statistics such as average daily volumes and peak hour or peak period volumes is based on an AASHTO-based process that requires user specification of minimum data quality thresholds.

1. Data quality thresholds for “good” data at multiple-lane sites.

One of the differences between CDR and CDR Analyst is that CDR only looks at one lane at a time, while CDR Analyst is based on the idea of a site that can include multiple lanes. The latter capability is important because performance measures typically look at overall characteristics at a site (e.g., the GP volume) rather than individual lane performance (e.g., the Lane 1 volume). However, if the location of interest has more than one lane of interest (e.g., multiple GP lanes), individual lane characteristics must be combined. For example, to get a total volume at a site for a given 5-minute interval, the individual lane volumes must be added together. The same process needs to take place for the lane occupancy percentage, except that the values are generally averaged.

Because of this, the user must determine an appropriate minimum number of good lanes that is required to compute a meaningful multilane measure. Once a user specifies a threshold of good lanes (as a percentage of the total number of lanes), for each 5-minute interval the data flags of each individual lane value are checked to verify that the user-specified threshold of minimum “good” lanes is met. If it is not, the combined multi-lane statistic for that 5-minute interval is labeled “invalid.”

There are no absolute guidelines for setting the threshold. However, if the threshold is too relaxed, an atypical lane could produce a non-representative multilane summary value. If the threshold is too strict, measures might not be computed at all.

As an example, a minimum threshold of 50 percent good lanes was used for the central Puget Sound Freeway Usage and Performance report.
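The multilane combination rule described above can be sketched as follows. The decision to combine only the good lanes is an assumption, since the report does not say whether non-good lanes contribute to the combined value when the threshold is met.

```python
def combine_lanes(volumes, occupancies, flags, min_good_frac=0.5):
    """Combine per-lane 5-minute values into one site value.

    Volumes are summed and occupancies averaged across the good lanes
    (an assumed reading of the text); the interval is tagged invalid
    (None) unless the user-specified fraction of lanes is "good".
    min_good_frac=0.5 matches the Puget Sound report's 50 percent example.
    """
    good = [i for i, f in enumerate(flags) if f == "good"]
    if len(good) / len(flags) < min_good_frac:
        return None  # combined statistic labeled "invalid" for this interval
    site_volume = sum(volumes[i] for i in good)
    site_occupancy = sum(occupancies[i] for i in good) / len(good)
    return site_volume, site_occupancy
```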

2. Data quality thresholds for “valid” data points at each step in the summary statistics process.

The AASHTO process starts with the archived 5-minute data and aggregates the values up to daily, then monthly, then yearly statistics. At each stage, user-specified thresholds of data quality must be met. For example, if an average weekday daily volume for a year is desired, the process works as follows: First, for each weekday of the year a total daily volume is summed up from the archived 5-minute data points (after summing across all lanes). During that process, the corresponding data quality flags of each 5-minute value are tabulated; the results are checked to verify that a minimum user-specified threshold of good data is available (e.g., at least 90 percent of all 5-minute values for the day are good values). Each day that meets that threshold can then contribute to the next level of aggregation; all other days are tagged as “invalid” and are not included in subsequent aggregations. Next, for each month a day-of-week average is computed (i.e., average Monday daily volume, average Tuesday daily volume, etc., for each month). For each of these day-of-week average values, a minimum threshold of valid days must be met (e.g., no more than one invalid day per day-of-week average value). If the threshold is met, that day-of-week average can contribute to the next level of aggregation; otherwise, that day-of-week average for that month is tagged as “invalid” and is not included in subsequent aggregations.

Next, for the entire year a day-of-week average is computed (e.g., average 1997 Monday daily volume, average 1997 Tuesday daily volume, etc.). For each day of the week, a minimum threshold of valid days must be met, e.g., at least ten day-of-week averages must be valid (out of the 12 average values corresponding to each of the 12 months). If that threshold is met, that yearly day-of-week average can contribute to the next level of aggregation; otherwise, that yearly day-of-week average is tagged as “invalid” and is not included in subsequent aggregations. Finally, if the resulting yearly day-of-week averages (Monday through Friday) meet a yearly threshold (e.g., at least 4 of 5 yearly day-of-week averages must be valid) the values are averaged together. The resulting value is then considered the average daily volume for the year.

Users must specify minimum thresholds of valid or good data points at each step of this process. There is no absolute guideline for the selection of those thresholds. However, because the results can be sensitive to those thresholds, it is important to use the same threshold values when doing a series of analyses whose results will then be compared to one another.

As an example, the minimum thresholds used for the central Puget Sound Freeway Usage and Performance report were:

• Minimum number of good data points per day: 90 percent

• Minimum number of good day-of-week values per month: ≤ 1 invalid value per day-of-week

• Minimum number of good day-of-week values per year: ≥ 10 out of 12

• Minimum number of good yearly day-of-week values: ≥ 4 of 5 (weekday); 2 of 2 (weekend); ≥ 6 of 7 (seven day)

These values can be set by the user in the CDR preferences file.
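The staged roll-up described above reduces to one rule applied repeatedly: average the valid inputs if enough of them are valid, otherwise tag the result invalid and exclude it from later stages. A minimal sketch of that rule (names are illustrative, not from CDR):

```python
def valid_mean(values, min_valid):
    """Average the non-None entries if enough are valid, else None.

    Applied repeatedly, this mirrors the staged roll-up:
    5-minute -> daily -> monthly day-of-week -> yearly day-of-week -> annual.
    None marks an "invalid" value excluded from later aggregation stages.
    """
    valid = [v for v in values if v is not None]
    if len(valid) < min_valid or not valid:
        return None  # tagged "invalid"; excluded from subsequent aggregations
    return sum(valid) / len(valid)

# e.g., a yearly Monday average needs >= 10 of 12 valid monthly Monday averages:
#   yearly_monday = valid_mean(monthly_monday_averages, min_valid=10)
# and the annual weekday volume needs >= 4 of 5 valid yearly day-of-week averages:
#   annual = valid_mean([mon, tue, wed, thu, fri], min_valid=4)
```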

24-hour Site Profile Considerations

The time-of-day profile option in CDR Analyst produces per-lane volume, approximate speed, and approximate congestion frequency as a function of time of day for an average day. Interpretation of these measures should take into account the following:

1. Speed formula accuracy.

The speed estimation formula used in the tool set is a relatively simple function of volume and lane occupancy percentage. There are several assumptions built into this formula. First, previous studies have indicated that the formula is less accurate at the low and high speed ranges. Therefore the formula truncates the speeds at 10 mph and 60 mph to reduce the effect of this error. Second, the formula includes a coefficient that reflects the combined effect of the vehicle length and the sensor detection range on the resulting speed estimate. Vehicle length and sensor detection characteristics can both vary from site to site. However, CDR Analyst uses a single average value for all sites; this could introduce some site-to-site errors. Given these factors, it is important to consider the resulting speed estimates as general estimates that may be more useful as comparative rather than absolute values.
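The report does not publish the formula itself. The sketch below shows a common single-loop estimate of this general form: flow divided by a density inferred from occupancy and an assumed effective vehicle length (vehicle plus detection zone), with the 10/60 mph truncation applied. The 25 ft effective length is a placeholder, not CDR Analyst's actual coefficient.

```python
def estimate_speed_mph(volume_5min, occupancy_pct, effective_length_ft=25.0):
    """Illustrative single-loop speed estimate (not CDR Analyst's formula).

    speed = flow / density, where density is inferred from lane occupancy
    and an assumed effective vehicle length; results truncated to 10-60 mph.
    """
    if occupancy_pct <= 0:
        return 60.0  # assume free flow when the loop reports no occupancy
    flow_vph = volume_5min * 12                                     # veh/hour
    density_vpm = occupancy_pct / 100 * 5280 / effective_length_ft  # veh/mile
    speed = flow_vph / density_vpm
    return min(60.0, max(10.0, speed))  # truncate per the report
```

At 150 vehicles per 5 minutes and 15 percent occupancy this yields roughly 57 mph; at very low or very high occupancies the truncation limits dominate, reflecting the formula's reduced accuracy at the extremes.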

2. Definition of “congestion.”

The congestion frequency histogram uses a user-defined threshold of congestion, using the lane occupancy percentage as a surrogate measure for congestion. As an example, the threshold of congestion used for the central Puget Sound Freeway Usage and Performance report is a lane occupancy percentage that corresponds approximately to the transition from traffic condition Level of Service E to F, based on a freeway with 65 mph freeflow speed characteristics as described in the Highway Capacity Manual. This assumes, however, a fixed average vehicle length; this assumption may introduce some error depending on the site. As with speed estimates, it may be more useful to consider the resulting congestion frequency estimate as comparative and qualitative rather than an absolute value.
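A congestion frequency of this kind can be sketched as below: for each time-of-day bin, the fraction of observed days whose lane occupancy exceeds the user-defined threshold. The 18 percent default is a placeholder; the report does not give the exact occupancy percentage corresponding to the LOS E/F transition.

```python
def congestion_frequency(daily_occupancies, threshold_pct=18.0):
    """Fraction of days each time-of-day bin exceeds the occupancy threshold.

    daily_occupancies: one list per observed day, one occupancy value per
    time-of-day bin. threshold_pct is a user-defined, illustrative value.
    """
    n_days = len(daily_occupancies)
    n_bins = len(daily_occupancies[0])
    return [
        sum(day[b] > threshold_pct for day in daily_occupancies) / n_days
        for b in range(n_bins)
    ]
```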

Corridor Profile Considerations

The corridor profile option in CDR Analyst produces approximate congestion patterns as a function of time of day and corridor location for an average day. Interpretation of these measures should take into account the following:

1. Definition of “level of service.”

Similar to the congestion frequency histogram, the contour map uses the concept of level of service to define different traffic conditions. In the case of contours, four traffic condition levels are used (green = Level of Service A, B, and C; yellow = LOS D; red = LOS E; purple = LOS F). Each traffic condition level is color-coded on the map based on a corresponding range of lane occupancy percentage values. The levels of service are based on a freeway with 65 mph freeflow speed characteristics as described in the Highway Capacity Manual. As noted earlier, this assumes a fixed average vehicle length; this assumption may introduce some error. The resulting values are therefore most useful as comparative and qualitative measures.
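The occupancy-to-color mapping can be illustrated as follows. The occupancy break points are placeholders, since the report does not publish the ranges tied to each level of service.

```python
# Illustrative occupancy bands for the four contour levels described above.
# The break points (12/18/25 percent) are assumptions, not report values.
LOS_BANDS = [  # (upper occupancy %, map color, level of service)
    (12.0, "green", "A-C"),
    (18.0, "yellow", "D"),
    (25.0, "red", "E"),
    (float("inf"), "purple", "F"),
]

def los_color(occupancy_pct):
    """Return the contour color for a lane occupancy percentage."""
    for upper, color, _los in LOS_BANDS:
        if occupancy_pct < upper:
            return color
    return "purple"
```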

2. Interpolation between measurement locations.

The contour maps are produced by sampling 24-hour profiles at several discrete locations along the corridor, then interpolating between those measured locations to produce smoothly varying traffic condition contours. The interpolation is linear. This assumption’s validity will vary with the local conditions, and especially with the distance between measurement locations. This is an issue if there is a gap in measurements due to construction or equipment problems; the user must decide if a linear interpolation of traffic conditions is a reasonable representation of traffic patterns if a significant distance between measurement locations is being spanned. As with most of the user-specified values noted in this document, there are no absolute guidelines for determining the maximum gap size that can reasonably be spanned by interpolation; it will depend on such factors as traffic variability in that segment or the presence of nearby features such as major interchanges or significant sources/destinations of traffic (e.g., employment centers). Ultimately, this is a user judgment based on professional knowledge, experience, and common sense.
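The linear interpolation, with a user-judged maximum gap, can be sketched as below. The names and the 2-mile gap limit are illustrative, not values from the report.

```python
def interpolate_occupancy(mileposts, occupancies, mp, max_gap_mi=2.0):
    """Linearly interpolate occupancy at milepost mp between measured sites.

    mileposts must be sorted ascending. max_gap_mi stands in for the user's
    judgment about how wide a measurement gap can reasonably be spanned.
    """
    sites = list(zip(mileposts, occupancies))
    for (mp0, occ0), (mp1, occ1) in zip(sites, sites[1:]):
        if mp0 <= mp <= mp1:
            if mp1 - mp0 > max_gap_mi:
                raise ValueError("gap too wide to interpolate reliably")
            frac = (mp - mp0) / (mp1 - mp0)
            return occ0 + frac * (occ1 - occ0)
    raise ValueError("milepost outside measured range")
```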

3. Measurement location milepost values.

As noted earlier, a measurement site’s milepost value is based on the location of the associated data cabinet, and a single field data cabinet collects information from multiple sensors. Because the milepost value is therefore an approximation, the specified locations of data values used in the contour map interpolation are approximations as well. As a result, specific traffic patterns seen on the contour map cannot be precisely located based solely on the map display. For example, if the contour map suggests a sharp change in level of service at milepost 170, it should not be assumed that the transition occurs precisely at that corridor location, but rather in the vicinity. (Furthermore, the corridor map that accompanies the contour maps should only be used as an approximate reference. As noted earlier, the corridor map can only be approximately positioned relative to the milepost axes of the contours. This is especially true if the corridor map must be distorted or “straightened out” to better match a curving corridor with the linear milepost axis.)

Travel Time Profile Considerations

The travel time profile option in CDR Analyst produces approximate average and 90th percentile travel times for an average day as a function of the start time of the trip and the trip route. Interpretation of these measures should take into account the following:

1. Method of interpolation/extrapolation between measurement locations.

Travel times are computed by estimating spot speeds at a series of locations along the trip route. A travel time is estimated for each segment of the trip (where a segment’s endpoints are defined by adjacent measurement locations), using the speed information and the length of the segment. These segment times are then added up to produce an overall trip time. The trip time can also be a function of the start time of the trip, by using spot speeds based on data from different times of the day. As with the contour interpolation, the reasonableness of the resulting travel time depends in part on the frequency of sampled measurements, and the resulting length of the interpolated/extrapolated segments.

2. Speed formula accuracy.

As noted earlier, there are assumptions and simplifications built into the speed formula. These will of course affect the travel times. Given this factor, it is important to consider the resulting travel times as general estimates that may be more useful as comparative rather than absolute values.

3. Speed variation between measurement locations.

As noted above, travel times are computed by estimating a travel time for each segment of the trip (where a segment is defined by successive measurement locations), by combining the speed information and the length of the segment. These segment times are then added up to produce an overall trip time. In this process, it is assumed that speeds vary linearly from one measurement site to the next. The reasonableness of this assumption depends in part on the frequency of sampled measurements and the resulting length of those segments, as well as traffic characteristics on that corridor.
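Under the linear-variation assumption, a segment of length d traversed from speed v1 to v2 has an exact traversal time of d/(v2 − v1) · ln(v2/v1), which reduces to d/v when the two speeds match. A sketch of the segment-summing computation (names illustrative; the report does not state which integration rule CDR Analyst uses):

```python
import math

def segment_time_hr(length_mi, v1_mph, v2_mph):
    """Traversal time for a segment whose speed varies linearly v1 -> v2."""
    if abs(v2_mph - v1_mph) < 1e-9:
        return length_mi / v1_mph
    return length_mi / (v2_mph - v1_mph) * math.log(v2_mph / v1_mph)

def trip_time_hr(mileposts, speeds_mph):
    """Sum segment times over adjacent measurement locations."""
    return sum(
        segment_time_hr(m1 - m0, s0, s1)
        for m0, m1, s0, s1 in zip(
            mileposts, mileposts[1:], speeds_mph, speeds_mph[1:]
        )
    )
```

A 30-mile trip at a constant 60 mph comes out to half an hour; a segment whose speed climbs from 30 to 60 mph takes somewhat longer than the simple-average speed would suggest.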


4. Measurement location milepost values.

As with the contour maps, a measurement site’s milepost value is based on the location of the associated data cabinet, and a single field data cabinet collects information from multiple sensors. Because the milepost value is therefore an approximation, the specified locations of data values used in the travel time computation are approximations as well. The resulting travel times are best used as general estimates and may be more useful as comparative rather than absolute values.

Person Volume Considerations

Person volumes are based on a combination of CDR Analyst results (vehicle volumes) and mode split and vehicle occupancy values from other research projects and agencies.

1. Consistent measurement locations.

Because the data sources used for person volume computations vary, it is likely that measurement locations are not going to be coincident. In the interim report (section 4, Example I), sensor measurements, transit measurements, and vehicle occupancy measurements were not always at the same locations. In those cases, the user must decide if a site’s measurement locations are “close enough” to one another, based on nearby traffic patterns.

2. Consistent measurement times.

Because the data sources used for person volume computations vary, it is likely that measurement times are not going to be coincident. In the interim report (section 4, Example I), sensor measurements, transit measurements, and vehicle occupancy measurements were not always collected at the same times. In those cases, the user must decide if the times are “close enough,” based on seasonal variations or the timing of any construction projects that might affect the results. In some cases, data will be used because it is the only available data.


CDR User’s Guide

March 1998

John M. Ishimaru
Senior Research Engineer

Washington State Transportation Center (TRAC)
University of Washington

Seattle, WA

Version 2.52


CDR User’s Guide Version 2.52 i

Table of Contents

Introduction ........ 1
Information Sources and Acknowledgments ........ 1
CDR Software Installation ........ 2
FLOW Data CD Archives ........ 3
How to run CDR ........ 4
Notes about the Output Options ........ 13
How Summary Statistics are Computed ........ 19
Notes about Data Validity ........ 20
Notes about Loop Names ........ 27
CDR User’s Guide Version History ........ 30

Figures

1. Main CDR Screen ........ 8
2. CDR Options ........ 9
3. Date Selection Screen ........ 10
4. Location Selection Screen ........ 11
5. Excel Import Wizard Window ........ 12
6a. Example of 5-minute Output (Volume and Occupancy) ........ 14
6b. Example of 5-minute Output (Speed and Vehicle Length) ........ 15
7a. Example of 15-minute Summary Output ........ 16
7b. Example of Daily Summary Output ........ 17
8. Example of Multi-Day Output ........ 18
9. Data Acceptance Threshold Window ........ 25
10. Data Acceptance Threshold Window, continued ........ 26


Introduction

This guide describes the procedures for using CDR, the Compact disc Data Retrieval program. CDR is a PC-based software tool developed by the Washington State Department of Transportation (WSDOT) to extract and process data from freeway traffic sensors in the central Puget Sound region. CDR uses a CD-based archive of traffic volume data and other information that are collected using a network of sensors embedded in the roadway pavement. CDR offers the user the ability to save and analyze a desired subset of the data based on specific days, locations, and analysis options of the user’s choice.

This guide describes CDR’s user options, the underlying algorithms for the computation of summary statistics, and caveats associated with CDR’s output. The following topics are discussed:

• Installation instructions
• Data set description
• Operating instructions
• Output options
• Summary statistics
• Data validity considerations
• Loop naming conventions

Information Sources and Acknowledgments

The information in this guide was based on documentation provided with CDR and its DEC VAX- and modem-based predecessor, VDR; the author’s experiences with the program; and email communications with WSDOT staff. The author wishes to acknowledge the technical assistance of WSDOT Northwest Region staff, with particular thanks going to Les Jacobson, Mark Morse, Dongho Chang, Lanping Xu, Mark Leth, Mike Forbis, Dave Berg, and CDR’s programmer Alan Shen for their assistance with the use of CDR as well as the preparation of this document. This report was produced as part of documentation activities in support of the WSDOT FLOW Evaluation Design project.

If you have any comments about or corrections to this document, please contact John Ishimaru at the Washington State Transportation Center (TRAC) or via email at <[email protected]>.


CDR Software Installation

CDR requires a PC-compatible computer (preferably a 100 MHz Pentium or faster) running Windows 95 or Windows NT, and a CD-ROM drive. The following instructions assume CDR use on a Windows 95 machine.

(If the program has already been installed, you can skip this section.)

• Install the software files.

There are three software components of the CDR program:

CDR.EXE The main program. Version 2.52 is the latest official version as of March 1998. This version reads CD-based data from mid-1993 to the present.

CDR.INI A file used by the main program to store user preference values. Do not edit.

RMDC.LST A file used by the main program to match cross street locations to the cabinet list. This file should be included on the CD with the data.

To install the program on your computer, simply copy the first two files (CDR.EXE and CDR.INI) into a folder on your PC.

Note: For some earlier data sets, the RMDC.LST file is provided separately from the data CDs. In such cases, the RMDC.LST file should be copied into the same folder as the CDR program file and preference file.


FLOW Data CD Archives

CDR uses data stored on the FLOW data compact discs. These CDs hold traffic information collected on freeways and state highways in the central Puget Sound region, including I-5, I-405, I-90, SR 520, SR 18, SR 522, and SR 99. The traffic data are stored in 5-minute intervals, which are aggregated from 20-second data collected by a network of WSDOT-operated data collection devices known as inductance loop sensors. Inductance loops are embedded in the pavement within a particular freeway lane at approximately 0.5-mile intervals in the central Puget Sound region, as well as at key locations such as interchanges. Vehicles that pass over a loop are sensed and their presence is then recorded by field data collection cabinets. These field collection devices then transmit data to the WSDOT’s Transportation Systems Management Center at 20-second intervals, where it is processed and archived. In most cases, the loops collect two types of data: a) traffic volumes, i.e., the number of vehicles that pass over a loop in a given time interval, and b) average lane occupancy, the percentage of time that a loop detects a vehicle passing over it. Lane occupancy is useful as a general measure of congestion, and can also be used to estimate speeds. The CDs also store data validity information. (See Notes about Data Validity for more information.)

FLOW data CDs are available starting with 1993 data. It takes from 2 to 4 CDs to store the data for all locations for an entire year. CD-based data is available since approximately mid-1993 (the field devices were added to the computer-based archives between January and May of that year). Earlier data is available, though in less accessible form: volume data since 1981 is available on either tape or microfiche.

In addition to volume and lane occupancy, some loops are equipped to record other information. In particular, so-called speed traps (installations of two consecutive closely-spaced loops whose actions are coordinated) directly measure the speed of a passing vehicle and estimate its length, by noting differences in the time that each loop detects the same vehicle. (See Notes about Loop Names: Numbering Rules #4 for more information.) Speeds are computed for 5-minute intervals based on an aggregation of 20-second speeds. The vehicle length information includes the estimated average vehicle length as well as the approximate distribution of estimated lengths of the vehicles passing by during that time period.
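As background, the dual-loop ("speed trap") computation described above can be sketched as follows. This is an illustrative sketch only: the loop spacing, loop length, and detection times used below are hypothetical values chosen for the example, not WSDOT's actual configuration.

```python
# Sketch of a dual-loop speed-trap computation: two closely spaced loops
# detect the same vehicle at slightly different times, so speed follows from
# spacing / time difference, and vehicle length follows from how long the
# vehicle keeps one loop "on". All numeric values here are hypothetical.

def trap_speed_mph(loop_spacing_ft, t_loop1_s, t_loop2_s):
    """Speed in mph from the arrival-time difference at the two loops."""
    dt = t_loop2_s - t_loop1_s
    fps = loop_spacing_ft / dt          # feet per second
    return fps * 3600 / 5280            # convert ft/s to mi/h

def vehicle_length_ft(speed_mph, on_time_s, loop_length_ft):
    """Estimated length: distance traveled while the loop was occupied,
    minus the loop's own effective length."""
    fps = speed_mph * 5280 / 3600
    return fps * on_time_s - loop_length_ft

speed = trap_speed_mph(20.0, 10.000, 10.227)   # roughly 60 mph
length = vehicle_length_ft(speed, 0.25, 6.0)   # roughly 16 ft (passenger car)
```

The same idea scales to the archived data: 20-second speeds from many such measurements are averaged into the 5-minute values stored on the CDs.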


How to run CDR

1. Open up the Windows Explorer and navigate to the subdirectory with the CDR program in it. Double-click the CDR.EXE file to start the program.

CDR will display the main CDR screen, which is divided into three columns, one for each of the three primary analysis options. See Figure 1.

2. Set up initial CDR parameters (if needed).

Next, you will need to specify the location of the input data (the traffic data archive) and the desired location(s) of the output data produced by CDR. See Figure 2.

Specify the CD drive location.

CDR accesses traffic data that is stored on compact discs. To use the CDs, you will need to provide CDR with the drive letter (e.g., E is typical) that represents the CD drive on your computer. Enter this letter designation by selecting the Options menu, choosing the File Locations option, and then choosing the Data File suboption. Enter the CD drive letter designation in the appropriate space, and click OK. If you're not sure what drive letter to use, look for the drive's letter designation in Windows Explorer.

Note: CDR also offers the option of installing the data files on a network hard drive. If you want to access the files that way, you will need to copy the data directory (with the included data files) from the CD to a network directory with the same name. (CDR searches for the directory designation.) Then, follow the procedures above to tell CDR the location, using the "Network Drive" option rather than the CD option.

Specify the output data file location.

Next, you need to tell CDR where to place the output files that it creates. To specify the subdirectory where you want output files to be saved, select the Options menu, choose the File Locations option, and then choose the Output File suboption. You will then be given the opportunity to specify the subdirectory path for each of three types of analysis output that CDR can produce (raw data, daily summary, and multiday summary). Enter the full pathnames (e.g., C:\directory\subdirectory\) for each of the three output types. Unless you have a specific plan for organizing your output files, it's often easiest to specify the same pathname for all three options, which will result in all output files going into the same subdirectory; you can move the files around later. (The output file name is specified in the main CDR screen.)

This step only needs to be performed as needed; it does not need to be performed every time you run the program. The CD drive location and output file locations are saved after you enter them, and are used automatically for subsequent CDR runs until you change them.


3. Place a data CD of interest in the CD drive. If you want to produce yearly statistics, insert one of the CDs for that year in the drive.

4. Select the type of output you want.

The main CDR screen has three columns, representing three options for output:

1. Raw Data
2. Single Day Summaries
3. Multiple Day Summaries

Set the output switch to "ON" for the output option(s) that you want. (See Notes about the Output Options later in this guide for more information about these analysis options.)

5. Specify the output data file name.

For the output option that you picked, enter the desired output data file name in the space provided. Data produced by CDR will be stored using this file name. The default file extensions are .DAT (raw data), .SDS (single day summary), and .MDS (multiple day summary), although you can specify any file extension you want. You are limited to the standard DOS 8.3 file name format (up to 8 characters, followed by a period, followed by up to 3 characters in the extension). You also need to specify the file format. In most situations, the spreadsheet option is preferred, since it generates a tab-delimited file for use by your favorite spreadsheet application.

Note: Be sure to use unique file names each time you access and save data. CDR will overwrite the contents of an existing file without notification if you specify one as an output data file.

6. Specify the data that you want.

You will next need to describe the specific data that you want. This is done by indicating which days and locations you are interested in.

Specify Desired Dates

To indicate which days you're interested in, click on the "Change" button under the "Dates" label. The window that appears will offer you several ways of selecting the days that you are interested in. See Figure 3. Moving from left to right on the screen, you can first select the specific months or days that you're interested in, by selecting (with one click) the month(s) that you want or the range of calendar day(s) that you want. Moving further to the right, the next filter allows you to further narrow down the dates of interest by specifying particular days of the week that you are interested in. (Drag the mouse or shift-click to select multiple months or days of the week.) Finally, the list on the right will show the resulting dates you've selected based on your filter choices. After you've selected the days you want, click OK to exit the "Change" window. If you need to, you can later verify the choices you've made by clicking on the "View" button above the "Change" button. If you need to modify your choices after leaving the "Change" area, you will need to re-select "Change" and begin the process from the beginning; CDR does not save your previous selections once you re-enter the "Change" window. (Note that if you chose the multiple day summary option, you will only be asked for the year that you are interested in, since this option produces yearly statistics.)

Specify Desired Locations (Elements)

To indicate which locations you're interested in, click on the "Change" button under the "Element" label. CDR allows you to select specific elements (loops, stations, or speed traps), where each element is typically installed in a particular freeway lane. See Figure 4. (See Notes about Loop Names for the loop naming system.) You select the loops (lanes) of interest by first specifying the corridor that is of interest (left side of the screen), then selecting a specific location on that corridor (middle of the screen), then selecting which loop(s) at that location are of interest. To select a loop, double-click the loop name. Repeat this process as often as needed, keeping in mind that if you select multiple loops, all the data will be stored in a single output file. The list on the right will show the resulting loops you've selected. If you want to remove a loop you've selected, double-click the name on the selection list. After you've selected the loops you want, click OK to exit the "Change" window. If you need to, you can later verify the choices you've made by clicking on the "View" button above the "Change" button. If you need to modify your choices after leaving the "Change" area, you will need to re-select "Change" and begin the process from the beginning; CDR does not save your previous selections once you re-enter the "Change" window.

CDR only shows loops that exist on the start and end dates of the range of dates you specify. In some cases, you may be prompted for other CD(s); this is done to ensure that the loops that you select actually exist for the entire time period that you want to analyze.

Additional Choices

For the single-day summary and the multiple-day summary, you must choose one of the options listed on the main CDR screen. The single-day summary options are a) 15-minute and b) daily summaries, while the multiple-day summary options are a) monthly and b) yearly day-of-week averages, as well as c) yearly statistics (AADT and AWDT).

CDR places vehicle volume and lane occupancy data in the user's output file. (See FLOW Data CD Archives.) However, depending on which of the three analysis options you choose, and which location (loop) you select, you can also obtain other data, including speed, vehicle length data, and data validity flags. These other data options are presented on the main CDR screen as part of the appropriate analysis options.

7. Get the desired data.


When you have finished choosing your options, select the "GO!" button in the lower left part of the main CDR screen. The output will be prepared and saved to a file within 5-10 seconds if all the data is on the CD you're using and the request is simple (a few loops and locations). In the case of summary statistics, you might be prompted for other CDs, and the process will take longer. Status messages will be displayed at the bottom left of the main screen during this process. These messages will indicate when output files are being created, when data is missing, or if the output file is already open.

8. Run other cases.

Return to step 3 to create other output files.

9. Look at the output data.

Each output data file will be stored in the subdirectory that you specified earlier. The file can be read directly by a spreadsheet application such as Microsoft Excel if you previously specified the spreadsheet output format (as recommended above). To open an output data file using Excel, begin by starting up Excel, then open the file using the Open option from Excel's File menu, and accept the default data types that are offered by Excel's Text Import Wizard (which appears as the file is initially opened and translated) by clicking the Wizard's "Finish" button. See Figure 5.

CDR also offers a built-in read-only utility to quickly review output data. Under the File menu, select Open Output File, then choose the file you want. Note that this utility can only view small files; larger files will require the use of an external reader (e.g., a spreadsheet application).

10. End the program.

When you are done using CDR, select Exit from the File menu.


Figure 1. Main CDR Screen


Figure 2. CDR Options


Figure 3. Date Selection Screen


Figure 4. Location Selection Screen


Figure 5. Excel Import Wizard Window


Notes about the Output Options

As noted earlier, there are three types of CDR output data. The following is a more detailed description of the options:

Raw Data (.DAT file extension): This option produces 5-minute counts of traffic volumes and average lane occupancy for the specified lanes and times. These are the actual data counts that are stored on the CD. The raw data output also shows a validity flag for each data point, as well as the number of 20-second data points that make up the 5-minute count; normally this would be a count of 15 (15 x 20 seconds = 5 minutes). Anything less than a count of 15 would indicate one or more missing 20-second data values.

The output includes vehicle speed and length estimates if the selected loop supports such computations (i.e., if it is part of a speed trap).¹ When average speed information is available, it is reported in miles per hour for each 5-minute interval, while vehicle length is reported in feet. The vehicles are further classified in the form of four "bins" or vehicle length ranges. Bin1 is the volume count of all vehicles 26 feet long or less. Bin2 is the count of vehicles from 26 to 39 feet long. Bin3 is the count of vehicles from 39 to 65 feet long. Bin4 is the count of vehicles that are greater than 65 feet in length.

Note: Speeds are not measured for every vehicle crossing a speed trap, especially during periods of heavy congestion. The four bin counts only include those vehicles for which speed and length values were successfully measured, and therefore the sum of the four bin counts will in general not equal the total vehicle volume for that lane and time interval. However, the distribution of vehicle lengths in the four bins should be representative of vehicles on the freeway at that time.
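The binning rule above can be sketched as follows. The bin edges (26, 39, and 65 feet) come from the text; the guide does not state which bin a boundary-length vehicle (e.g., exactly 26 feet) falls in, so lower-inclusive boundaries are assumed here.

```python
# Tally measured vehicle lengths into the guide's four length bins.
# Bin edges are from the text; boundary handling (<= vs <) is an assumption,
# since the guide's ranges overlap at 26, 39, and 65 ft.

def bin_lengths(lengths_ft):
    bins = [0, 0, 0, 0]
    for length in lengths_ft:
        if length <= 26:
            bins[0] += 1          # Bin1: 26 ft or less
        elif length <= 39:
            bins[1] += 1          # Bin2: 26 to 39 ft
        elif length <= 65:
            bins[2] += 1          # Bin3: 39 to 65 ft
        else:
            bins[3] += 1          # Bin4: greater than 65 ft
    return bins

print(bin_lengths([15.7, 18.0, 33.5, 55.0, 72.0]))  # [2, 1, 1, 1]
```

Per the note above, only vehicles with a successful speed/length measurement would appear in the input list, so the bin totals need not match the lane's total volume.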

Daily Summary (.SDS): This option produces 15-minute and hourly counts of traffic volumes and average lane occupancy for the specified lanes and times. These are raw data counts aggregated from the 5-minute data on the CD. This option can also produce daily, peak hour, and peak period summaries. It also provides vehicle speed and length summaries if the loop supports such computations (i.e., if it is a speed loop).

Multi-Day Statistics (.MDS): This option produces monthly and yearly average daily traffic volumes and average lane occupancies for the specified lane, as well as peak hour and peak period averages. You can also specify weekday averages or averages by day-of-week. As noted earlier, when specifying the lane locations for MDS analysis you will be prompted for the other CD(s) that make up the specified year's data set. This is done in order to verify that the loops that you select exist for the entire time period that you want to analyze.

¹ The speed aggregation algorithm was modified starting in July 1996. Prior to that, a 20-second interval without a speed value (due to zero vehicle volume) contributed a 20-second speed of zero to the 5-minute speed average, thus pulling down the average. This was especially apparent late at night and to a lesser extent at other off-peak hours. These zero-volume values are no longer included in the 5-minute speed average starting July 1, 1996.
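The footnote's aggregation change can be illustrated as follows. This is a sketch under the assumption of a simple unweighted mean of the 20-second speeds; the exact WSDOT weighting is not described in the text.

```python
# Sketch of the pre- vs. post-July-1996 5-minute speed averaging: before the
# change, a zero-volume 20-second interval contributed a speed of zero to the
# average; afterward, such intervals are excluded. Unweighted mean assumed.

def avg_speed(samples, exclude_zero_volume=True):
    """samples: list of (volume, speed) pairs for the 20-second intervals."""
    if exclude_zero_volume:
        speeds = [spd for vol, spd in samples if vol > 0]
    else:
        speeds = [spd if vol > 0 else 0.0 for vol, spd in samples]
    return sum(speeds) / len(speeds) if speeds else 0.0

# Ten intervals at 60 mph plus five with no traffic (as late at night):
samples = [(3, 60.0)] * 10 + [(0, 0.0)] * 5
print(avg_speed(samples, exclude_zero_volume=False))  # 40.0 -- old rule
print(avg_speed(samples, exclude_zero_volume=True))   # 60.0 -- current rule
```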


Figures 6a, 6b, 7a, 7b, and 8 show examples of the 5-minute, 15-minute, Daily, and Monthly output types.

***********************************
Filename: 5TO15.DAT

Creation Date: 02/2/98 (Wed)
Creation Time: 03:16:59
File Type: SPREADSHEET
***********************************

ES-145D:_MS___1 I-5 Lake City Way 170.80
09/01/97 (Mon)

---Raw Loop Data Listing---

Time  Vol  Occ    Flg  nPds
0:00   49  3.80%   1    15
0:05   37  2.90%   1    15
0:10   38  3.50%   1    15
0:15   34  2.60%   1    15
0:20   48  4.40%   1    15
0:25   44  3.60%   1    15
0:30   35  2.80%   1    15
0:35   33  3.30%   1    15
0:40   28  2.50%   1    15
0:45   30  2.30%   1    15

Figure 6a. Example of 5-minute Output (Volume and Occupancy)


***********************************
Filename: FIG6B.DAT

Creation Date: 03/17/98 (Tue)
Creation Time: 09:30:36
File Type: SPREADSHEET
***********************************

ES-146R:_MN__T1 I-5 NE 70th St-NB 170.76
09/01/97 (Mon)

---Raw Speed Data Listing---

Time  Spd   Len   Bin1  Bin2  Bin3  Bin4  Flg  nPds
0:00  59.1  15.7   52    0     0     1     1    15
0:05  59.2  15.7   47    0     2     0     1    15
0:10  60.2  15.4   45    0     0     1     1    15
0:15  60.9  15.2   54    0     1     0     1    15
0:20  60.4  17.2   38    0     2     1     2    15
0:25  61.1  15.1   49    0     1     0     1    15
0:30  60.8  14.1   50    0     0     0     1    15
0:35  59.3  14.7   38    0     0     0     1    15
0:40  61.2  14.5   40    0     0     0     1    15
0:45  60.9  15.5   36    0     1     0     1    15
0:50  58.7  20.2   27    0     1     2     1    15
0:55  60    16.4   30    1     0     1     1    15
1:00  60    14.5   28    0     0     0     1    15
1:05  59.4  17.1   41    1     1     1     1    15
1:10  61.7  15.7   30    0     1     0     1    15
1:15  60.7  15.3   35    0     0     1     1    15
1:20  58.4  18.5   18    2     0     1     1    15
1:25  63.8  15     28    0     1     0     1    15
1:30  60.2  14.3   31    0     0     0     1    15
1:35  58.9  18.8   33    0     1     1     2    15
1:40  61.8  14.1   26    1     0     0     1    15
1:45  62.5  14     24    0     0     0     1    15

Figure 6b. Example of 5-minute Output (Speed and Vehicle Length)


ES-145D:_MS___1 I-5 Lake City Way 170.80
09/01/97 (Mon)

---15 Minute Loop Summary Report---

Time :00 :15 :30 :45 Hour Time :00 :15 :30 :45 Hour G S B D

0:00 124 126 96 88 434 1:00 62 68 56 61 247 24 0 0 0

3.40% 3.50% 2.80% 2.40% 3.00% 1.80% 1.90% 1.60% 1.80% 1.80%

2:00 73 62 41 48 224 3:00 31 38 32 34 135 24 0 0 0

2.00% 1.60% 1.10% 1.20% 1.50% 0.80% 1.10% 1.00% 0.90% 0.90%

4:00 35 45 51 60 191 5:00 49 73 93 90 305 24 0 0 0

1.20% 1.40% 1.60% 1.70% 1.40% 1.20% 1.90% 2.60% 2.50% 2.00%

6:00 96 129 160 161 546 7:00 132 133 154 167 586 24 0 0 0

2.70% 3.40% 4.30% 4.40% 3.70% 3.50% 3.80% 4.10% 4.60% 4.00%

8:00 123 189 186 201 699 9:00 192 222 262 299 975 24 0 0 0

3.40% 5.10% 5.20% 5.60% 4.80% 5.30% 6.10% 7.30% 8.00% 6.70%

10:00 300 336 318 371 1325 11:00 386 406 436 449 1677 23 1 0 0

8.10% 9.30% 8.50% 10.00% 9.00% 10.70% 11.10% 12.20% 12.50% 11.60%

Figure 7a. Example of 15-minute Summary Output


***********************************
Filename: 5TO15.SDS

Creation Date: 02/2/98 (Wed)
Creation Time: 03:19:11
File Type: SPREADSHEET
***********************************

ES-145D:_MS___1 I-5 Lake City Way 170.80
09/01/97 (Mon)

---Daily Loop Summary Report---

Summary Valid Vol Occ G S B D

Daily     INV     0   0.00%  286  2  0  0
AM Peak   VAL  1831   4.20%   36  0  0  0
PM Peak   VAL  6226  10.50%   47  1  0  0
AM Pk Hr  VAL  1599  11.00%   11  1  0  0  10:45 11:45
PM Pk Hr  VAL  1910  13.80%   12  0  0  0  12:45 13:45

Figure 7b. Example of Daily Summary Output


***********************************
Filename: AADT.MDS

Creation Date: 02/2/98 (Thu)
Creation Time: 10:54:09
File Type: SPREADSHEET
***********************************

ES-145D:_MS___1 I-5 Lake City Way 170.80
Monthly Avg for 1996 Jan (Sun)

---Multi-Day Loop Summary Report---

Summary Valid Vol Occ G S B D Val Inv Mis

Daily VAL 19392 7.50% 1133 18 1 0 4 0 0

AM Peak VAL 1493 3.50% 142 2 0 0 4 0 0

PM Peak VAL 5069 15.60% 190 2 0 0 4 0 0

AM Pk Hour VAL 1381 10.00% 47 1 0 0 4 0 0 10:45 11:45

PM Pk Hour VAL 1576 11.90% 48 0 0 0 4 0 0 13:45 14:45

Figure 8. Example of Multi-Day Output


How Summary Statistics are Computed

There are four categories of summary statistics: 15-minute/hourly, daily, monthly, and yearly. Each is built up from the 5-minute raw data on the CD(s).

15-minute summary: The 15-minute traffic summary is computed by adding up the traffic volumes for the day's 5-minute segments over 15-minute increments. Occupancy is averaged over the 15-minute increment. Hourly totals are also computed. The validity of the 5-minute constituent data points is not considered in the 15-minute sums.
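The 15-minute roll-up can be sketched as follows: sum three consecutive 5-minute volumes and average the three occupancies. The example input is taken from the Figure 6a raw data; as the text notes, validity flags are not considered at this level, so they are omitted.

```python
# Roll 5-minute (volume, occupancy %) rows up to 15-minute values:
# volumes are summed, occupancies averaged, per the summary rules above.

def aggregate_15min(five_min_rows):
    """five_min_rows: list of (volume, occupancy_pct) in time order."""
    out = []
    for i in range(0, len(five_min_rows) - len(five_min_rows) % 3, 3):
        chunk = five_min_rows[i:i + 3]
        vol = sum(v for v, _ in chunk)
        occ = sum(o for _, o in chunk) / 3
        out.append((vol, round(occ, 2)))
    return out

# First six 5-minute rows from the Figure 6a example:
rows = [(49, 3.8), (37, 2.9), (38, 3.5), (34, 2.6), (48, 4.4), (44, 3.6)]
print(aggregate_15min(rows))  # [(124, 3.4), (126, 3.53)]
```

The first two 15-minute volumes (124 and 126) match the 0:00 hour row of the Figure 7a 15-minute summary, which was produced from the same day's data.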

Daily summary: The daily traffic summary is computed by adding up the traffic volumes for the day's 5-minute segments to get the total daily volume. Peak hour and peak period volumes are computed analogously. The fixed peak periods used are 6-9 AM and 3-7 PM. Occupancy is averaged over each time period. The peak hour volumes are computed by using a moving one-hour window that advances in 15-minute steps. In contrast, peak hour speed data are computed for fixed one-hour periods starting at 7 AM and 5 PM. The daily, peak hour, and peak period summaries are subject to user-specified data validity thresholds. (See Notes about Data Validity for more information.)
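The moving-window peak hour search can be sketched as follows. The window and step sizes (twelve 5-minute values, stepping by three) follow the text; the tie-breaking rule is not specified in the guide, so the earliest qualifying window is kept here.

```python
# Find the peak hour: a one-hour window (twelve 5-minute volumes) slides in
# 15-minute steps (three values); the window with the largest volume wins.
# Earliest-window tie-breaking is an assumption, not stated in the guide.

def peak_hour(volumes_5min):
    """Return (start_index, volume) of the highest-volume one-hour window."""
    best_start, best_vol = 0, -1
    for start in range(0, len(volumes_5min) - 12 + 1, 3):
        vol = sum(volumes_5min[start:start + 12])
        if vol > best_vol:
            best_start, best_vol = start, vol
    return best_start, best_vol

# A flat day with a one-hour surge starting at index 12 (i.e., 1:00):
volumes = [10] * 12 + [50] * 12 + [10] * 12
print(peak_hour(volumes))  # (12, 600)
```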

Monthly summary: The monthly traffic summary is computed by adding up the daily summary volumes for that month. Peak hour and peak period volumes are computed analogously. The peak periods used are 6-9 AM and 3-7 PM. Note that peak hour volume averages over multiple days are computed based on the peak values for each day, even if they are at different times. Occupancy is averaged for each time period. The monthly summaries of daily, peak hour, and peak period statistics are subject to user-specified data validity thresholds. (See Notes about Data Validity for more information.)

Yearly summary: The annual traffic summary is computed by progressively summing up and averaging data from the daily level up to the monthly level up to the yearly level. First, the total volume of each day in a given month is computed (see daily summary above). Then, an average volume is computed for each day of the week for that month. This process is repeated for each month. Then, an average volume is computed for each day of the week for the entire year, by averaging all 12 Monday averages together, all 12 Tuesday averages, etc. Finally, the resulting seven annual day-of-week averages (or five weekday averages) are averaged together to produce the AADT (average annual daily traffic) or AWDT (average weekday daily traffic) for the year. The yearly summaries of daily, peak hour, and peak period statistics are subject to user-specified data validity thresholds. (See Notes about Data Validity for more information.)
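The yearly roll-up described above can be sketched as follows. The input format (month, then day-of-week, then a list of daily totals) is assumed for illustration, and validity thresholds are omitted; the averaging order matches the text: per-month day-of-week averages, then annual day-of-week averages, then their mean.

```python
# Sketch of the AADT roll-up: daily totals -> per-month day-of-week averages
# -> annual day-of-week averages -> AADT. Data layout is hypothetical.

def aadt(daily_totals):
    """daily_totals[month][dow] = list of daily volumes for that day-of-week."""
    # Average volume for each day of the week, per month:
    monthly = {m: {d: sum(v) / len(v) for d, v in dows.items()}
               for m, dows in daily_totals.items()}
    # Annual average for each day of the week (mean of the monthly values):
    months = list(monthly.values())
    annual_dow = {d: sum(m[d] for m in months) / len(months)
                  for d in months[0]}
    # AADT: mean of the annual day-of-week averages.
    return sum(annual_dow.values()) / len(annual_dow)

# Tiny two-month, two-day-of-week example (a real year has 12 months x 7 days):
daily = {1: {"Mon": [1000, 1100], "Tue": [900]},
         2: {"Mon": [1200], "Tue": [1000]}}
print(aadt(daily))  # 1037.5
```

Restricting the final mean to the five weekday averages would give the AWDT instead.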


Notes about Data Validity

CDR keeps track of the validity of individual data points as well as the validity of statistics based on those data points, where "validity" is based on the operational status of the traffic sensor (loop) as well as the values being produced by the sensor. CDR also allows the user to specify how much "bad" or questionable data can be tolerated in the computation of a summary statistic. The following describes the validity checks used to evaluate data points, as well as the mechanism by which users can specify their tolerance for potentially invalid data.

• Types of Data Validity Checks

Two types of validity checks are performed on data collected by the WSDOT loop network.

5-minute data check

The first type of check is performed on the 20-second data that make up each 5-minute raw data value that is stored on CDs. This validity check is performed by the data collection and archiving software prior to data storage on the CD. During this check, each of the 15 20-second data points that make up every 5-minute data value is checked to see if it meets one of three conditions:

Bad Data: If the loop is hung (locked in an "on" state for longer than a prescribed time period) or the loop's data is "outside the envelope" (i.e., the volume-occupancy combination is not reasonable),² the corresponding 20-second data value is labeled "Bad Data". The assumption is that a locked "on" state or highly atypical combinations of volume and occupancy values are symptomatic of erroneous data collecting conditions.

Disabled: If the loop has been disabled by a system operator, the corresponding 20-second data value is labeled "Disabled".

Good Data: If the loop does not meet the criteria for the first two conditions, the corresponding 20-second data value is considered "Good Data".

After each of the 15 20-second counts that make up a 5-minute value is evaluated in this way, the associated 5-minute value is given an overall data validity flag, according to the following rules:

² The approximate envelopes of acceptable volume-occupancy combinations (based on earlier TRAC research) have been encoded in the loop controller software as the following (for 20-second data):

If occupancy ≤ 1, then volume must be ≤ 2.
If 1 < occupancy ≤ 5, then 0 ≤ volume ≤ 7.
If 5 < occupancy ≤ 10, then 2 ≤ volume ≤ 11.
If 10 < occupancy ≤ 16, then 1 ≤ volume ≤ 17.
If occupancy > 16, then 0 ≤ volume ≤ 17.
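Reading the "=" signs in the extracted footnote as less-than-or-equal signs (a common PDF extraction artifact), the envelope can be expressed as a simple predicate:

```python
# The footnote's volume-occupancy envelope for 20-second data, assuming the
# extracted "=" signs stand for "<=". A 20-second sample outside the envelope
# would be labeled "Bad Data" by the collection software.

def in_envelope(occupancy, volume):
    if occupancy <= 1:
        return volume <= 2
    if occupancy <= 5:
        return 0 <= volume <= 7
    if occupancy <= 10:
        return 2 <= volume <= 11
    if occupancy <= 16:
        return 1 <= volume <= 17
    return 0 <= volume <= 17

print(in_envelope(3, 5))    # True  -- a plausible combination
print(in_envelope(1, 15))   # False -- high volume at near-zero occupancy
```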


If all 15 20-second values are labeled "Disabled", the associated 5-minute data value is considered "Disabled".

If all 15 20-second values are labeled "Good", the associated 5-minute data value is considered "Good".

If all 15 20-second values are labeled "Bad", the associated 5-minute data value is considered "Bad".

For all other combinations of 20-second validity flags, the associated 5-minute data value is considered "Suspect".
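The four flagging rules can be sketched as one function over the 15 constituent flags. The returned number codes (0 = bad, 1 = good, 2 = suspect, 3 = disabled) follow the data validity code table described later in this section.

```python
# Map the 15 20-second validity labels to the overall 5-minute flag,
# following the rules above. Number codes per the guide's validity table.

def five_minute_flag(flags_20s):
    """flags_20s: list of 15 strings: 'good', 'bad', or 'disabled'."""
    if all(f == "disabled" for f in flags_20s):
        return 3                      # Disabled
    if all(f == "good" for f in flags_20s):
        return 1                      # Good
    if all(f == "bad" for f in flags_20s):
        return 0                      # Bad
    return 2                          # Suspect: any mixture of labels

print(five_minute_flag(["good"] * 15))            # 1
print(five_minute_flag(["good"] * 14 + ["bad"]))  # 2
```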

The overall validity flags are placed next to each data value in the 5-minute output file, using a number code. (For information on the flag codes, see Data validity codes, later in this section.)

Summary statistic check

The second type of check is performed by CDR when computing summary statistics based on the 5-minute data. CDR has the ability to take raw 5-minute data from the original data files and compute statistics for longer time periods such as daily, monthly, and yearly averages. It does this by following a series of data aggregation steps that sum up and average the data. At each step of this process, CDR tries to determine whether the resulting summary statistic is valid, by looking to see how much of the data used to create the statistic is considered good data. Thus, the validity of each summary statistic (daily, monthly, yearly) is evaluated based on the validity of its constituent statistics.

Summary statistic validity checking begins with the initial step of the summing-and-averaging process, which is the computation of daily statistics using 5-minute data (daily sums, peak hours, peak periods). At this first step, CDR looks at the data validity flag associated with each 5-minute data point (the flags stored on the CDs with the volume and occupancy data) to see how much of the data that is being summed up to create the daily summary statistic is "good", and how much is otherwise questionable. This tabulation is then reported in the summary statistic output file, using letter codes. (For information on the flag codes, see Data validity codes, later in this section.) CDR then compares the number of "good" and "questionable" data values with user-specified threshold values of acceptability, to determine whether the resulting daily statistic is valid or not. (See User-specified data acceptance thresholds for more information.)

• Data validity codes

As noted above, the validity of 5-minute data is determined by the data collection system, then stored on the data CD, while the validity of a summary statistic is evaluated by tracking the validity of its constituent 5-minute data values, keeping a count of the number of data points that fit different categories of validity, and comparing the counts to user-specified acceptance levels.

Both the 5-minute validity information (determined by the data collection system) and the summary statistic validity tabulations (computed by CDR) can be included in CDR's output, using a code that indicates the nature of the data validity. There are four categories of data validity:

Data is:           5-minute data   Summary statistic
bad value                0                 B
good value               1                 G
suspect value            2                 S
loop is disabled         3                 D

Number codes are used for 5-minute data output, while letter codes are used for summary statistics. In the 5-minute output, a number code is provided for each data point, while in the summary statistic output, tabulated validity counts are provided which categorize the total number of B, G, S, and D data points per 2-hour block (in the 15-minute output) or per statistic (in the yearly, monthly, daily, peak hour, and peak period output). Each summary statistic's tabulation of B, G, S, and D counts is based on the sum of tabulations from its constituent data points. Speed loops show only B, G, and S data.

• Data replacement to improve bad data

When a summary statistic is calculated, CDR copes with any invalid data detected in the 5-minute data set used to compute that statistic by replacing the bad data with better data. Specifically, if a 5-minute data point is determined to be suspect, bad, or disabled, it is replaced with the most recent previous "good" data point. (The user also has the option to use suspect data unaltered.) The modified data is then used to compute the statistic. If all data for a day is bad, disabled, or suspect, the resulting summary statistic is set to zero.

In the following example, suspect data (flagged with a "2") is improved by replacing it with the most recent good value (volume = 35, at 10:05):

Time   Volume  Validity Flag  Modified Data
10:00    25          1             25
10:05    35          1             35
10:10    37          2             35
10:15    42          2             35
10:20    41          1             41
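The replacement rule can be sketched as a carry-forward of the last good value. The example input is the table above; behavior when no good value has been seen yet is not specified in the guide, so the original value is kept in that case.

```python
# Replace any 5-minute point flagged suspect/bad/disabled with the most
# recent preceding good (flag 1) value, per the data-replacement rule.
# Handling of a leading bad point (no prior good value) is an assumption.

def replace_bad(points):
    """points: list of (volume, flag) pairs; returns the modified volumes."""
    out, last_good = [], None
    for volume, flag in points:
        if flag == 1:
            last_good = volume
            out.append(volume)
        else:
            out.append(last_good if last_good is not None else volume)
    return out

pts = [(25, 1), (35, 1), (37, 2), (42, 2), (41, 1)]
print(replace_bad(pts))  # [25, 35, 35, 35, 41]
```

The result reproduces the "Modified Data" column of the example table.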

Note that this process is used only to facilitate computation of summary values, and only when there is mostly good data (i.e., user thresholds of data acceptability are met). In other words, data replacement is used to fix occasional bad data in an otherwise good data stream, to facilitate computation of a summary statistic. If an entire day's data are considered invalid because the day has too much bad data (i.e., doesn't meet user thresholds), the data are not improved in this manner; instead, that day's summary is set to zero (as noted earlier), and is not included in any subsequent summary statistic calculation (e.g., monthly, yearly).

• User-specified data acceptance thresholds

CDR gives the user the flexibility to set a number of data acceptance thresholds that determine whether or not to accept a summary statistic, based on the validity of the data used to compute the statistic. These thresholds are specified by indicating the minimum amount of "good" or valid data that one is willing to accept at each stage in the summary statistic calculation process, below which the resulting statistic is tagged as unreliable and is discarded or otherwise replaced. One can also specify the maximum amount of "questionable" data that one is willing to accept, above which the resulting statistic is tagged as unreliable and is discarded or otherwise replaced.

This flexibility is a double-edged sword. While it gives the user control over the summary statistics that are created, it also requires the user to make more decisions. The data acceptance thresholds provide the user with additional flexibility to cope with missing or questionable data, but there are no specific rules for determining what those threshold values should be, and the resulting statistic could be significantly affected by the user's choices. For example, overly strict thresholds could result in large quantities of data being thrown out, producing summary statistics that are based on a limited subset of the original data and are therefore of questionable validity. They could even result in no statistic at all, if so much data has been thrown out at lower levels (daily statistics) that thresholds at higher levels (monthly or yearly) cannot be met. Caution is therefore recommended when specifying these values. It might be desirable to perform a sensitivity analysis to determine good threshold values, or alternatively, to determine that it is better not to have any thresholds at all.

• How to set data acceptance thresholds

Data acceptance thresholds are specified in the Options menu, under the Preferences option. Thresholds are set up for six categories of summary statistics: Daily, AM Peak, PM Peak, AM Peak Hour, PM Peak Hour, and Multiple Day. (Note: These threshold values are retained during the program run, but are not stored once you exit CDR.) For the first five statistics categories (Daily, AM Peak, PM Peak, AM Peak Hour, PM Peak Hour), you have the following choices (see Figure 9):

Perform Running Flag Check
CDR scans the 5-minute data set (whether it's daily, peak, or peak hour) from the start time to the end time and checks whether consecutive blocks of data meet the user's thresholds. This is done by setting up a "window" of consecutive data points and moving the window along the data set from the start time to the end time. The user specifies the size of the window, the number of data points that the window moves before checking the next block, and the minimum/maximum acceptance thresholds within the window for good, suspect, bad, and disabled data.


As an example, in the case of a daily running flag check, suppose we have the following settings:

step size          1
width              12
minimum good       10
maximum suspect    1
maximum bad        1
maximum disabled   1

These values mean that the window is 1 hour wide (i.e., width = 12 x 5 minutes per data point = 1 hour) and moves along at 5-minute intervals (step size = 1 data point = 5 minutes). Within the moving window, there can be no more than 1 suspect, bad, or disabled 5-minute data value, and there must be a minimum of 10 good 5-minute data values. Otherwise, the resulting summary statistic is considered invalid.
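The moving-window logic above can be sketched as follows. This is illustrative code, not CDR's implementation; the numeric flag values for bad (3) and disabled (4) are assumptions, extending the 1 = good, 2 = suspect convention from the earlier example.

```python
def running_flag_check(flags, width=12, step=1,
                       min_good=10, max_suspect=1, max_bad=1, max_disabled=1):
    """Slide a window of `width` data points along `flags`, advancing by
    `step` points at a time; return False as soon as any window violates
    the thresholds, True if every window passes."""
    for start in range(0, len(flags) - width + 1, step):
        window = flags[start:start + width]
        if (window.count(1) < min_good or
                window.count(2) > max_suspect or
                window.count(3) > max_bad or
                window.count(4) > max_disabled):
            return False
    return True

# Two hours of all-good data passes; two suspect points falling inside
# the same one-hour window fail the check.
```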

Perform Total Flag Check
This option works similarly to the running flag check, except that the thresholds apply to the entire data set, not a moving subset.

Use Suspect Data
Data that are suspect, bad, or disabled are normally "patched" with the most recent good value. The user can select this option to instead use suspect data unaltered.

For the last statistics category (Multiple Day), you have the following choices (see Figure 10):

Days to Monthly Day-of-Week: The user specifies the maximum allowable number of invalid or missing days when computing valid day-of-week averages for a month. Invalid days are days that don't meet validity checks. Missing days are days when no data are available because of data corruption or other miscellaneous reasons.

Monthly Day-of-Week to Yearly Day-of-Week: The user specifies the maximum allowable number of invalid monthly day-of-week averages when computing valid day-of-week averages for a year.

Yearly Day-of-Week to AWDT or AADT: The user specifies the maximum allowable number of invalid yearly day-of-week averages when computing the AWDT or AADT.

Note that the thresholds for the first five categories of statistics operate independently of one another, but the thresholds for the last category depend on the threshold settings from the other five categories, since multi-day statistics are aggregated from the daily values.

The user thresholds described above are combined with the tabulated counts of valid and invalid data points that make up each summary statistic to determine whether the resulting statistic is valid. For example, an AADT is made up of the combination of 7 (day-of-week) values, each of which is determined to be either valid or invalid based on its constituent data. The user threshold is then used to determine whether the resulting AADT is valid. These results are shown in the output file, which indicates how many of the 7 values are considered valid or invalid, as well as the end result (i.e., whether the statistic is valid or not), reported as VAL or INV.
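The final AADT decision can be sketched as a small function. This is a hedged illustration of the rule described above (7 day-of-week values checked against the "Yearly Day-of-Week to AADT" threshold); the function name and the default threshold value are hypothetical, not CDR's.

```python
def aadt_validity(day_of_week_valid, max_invalid=1):
    """day_of_week_valid: 7 booleans, one per yearly day-of-week average,
    each already marked valid/invalid from its constituent data.
    Returns (invalid_count, tag), where tag is "VAL" or "INV", echoing
    the tags CDR writes to the output file."""
    assert len(day_of_week_valid) == 7, "one value per day of week"
    invalid = sum(1 for ok in day_of_week_valid if not ok)
    return invalid, ("VAL" if invalid <= max_invalid else "INV")
```

With all 7 day-of-week averages valid the AADT is tagged VAL; with two invalid averages and a threshold of one, it is tagged INV.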

(Note: In CDR version 2.52, speed loop data are not subject to validity checks.)

Figure 9. Data Acceptance Threshold Window


Figure 10. Data Acceptance Threshold Window, continued


Notes about Loop Names

The following information is adapted from the Loop Naming Rules documentation of the Ramp Metering Database.

Each loop name contains exactly 7 characters and is based on the loop's location and purpose. The code is created by concatenating the following codes from left to right:

Roadway Code            Direction Code     Lane Type Code      Lane # Code
(2 characters)          (1 character)      (2 characters)      (2 characters)

_M  mainline            S  southbound      _X  exit            _1 ... _9
_C  CD                  N  northbound      _O  on ramp         S1 ... S9  speed loop
_R  reversible          E  eastbound       RA  Rt Adv Q        T1 ... T9  "virtual" speed loop
AM  aux mainline        W  westbound       LA  Lt Adv Q
AC  aux CD                                 _Q  queue
AR  aux reversible                         _I  intermediate Q
MM  metered mainline                       _D  demand
MC  metered CD                             _P  passage
MR  metered reversible                     HX  HOV exit
                                           HO  HOV on
                                           HD  HOV demand
                                           HP  HOV passage
                                           H_  HOV mainline
                                           __  mainline

The underscores are considered characters and are required.

For example, the seven-character names

_MN___1 and _MNH__1

represent a mainline northbound general purpose lane and a mainline northbound HOV lane, respectively. Both are on the far right when facing downstream, as indicated by the lane 1 designation.
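The fixed field widths make these names easy to decode mechanically. The sketch below is illustrative only; the lookup tables are abbreviated to the entries used in the example, with codes taken from the table above.

```python
# Field widths: roadway = chars 1-2, direction = char 3,
# lane type = chars 4-5, lane number = chars 6-7.
ROADWAY = {"_M": "mainline", "_C": "CD", "_R": "reversible"}
DIRECTION = {"N": "northbound", "S": "southbound",
             "E": "eastbound", "W": "westbound"}
LANE_TYPE = {"__": "mainline", "H_": "HOV mainline",
             "_X": "exit", "_O": "on ramp"}

def parse_loop_name(name):
    """Split a 7-character loop name into its four fields, translating
    codes where the (abbreviated) tables above know them."""
    assert len(name) == 7, "loop names contain exactly 7 characters"
    return {
        "roadway": ROADWAY.get(name[0:2], name[0:2]),
        "direction": DIRECTION.get(name[2], name[2]),
        "lane_type": LANE_TYPE.get(name[3:5], name[3:5]),
        "lane": name[5:7],
    }

# parse_loop_name("_MN___1") -> mainline / northbound / mainline / lane _1
# parse_loop_name("_MNH__1") -> same, but lane type "HOV mainline"
```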

The following are additional naming rules:

Roadway Naming Rules

1) Each cabinet is assigned a principal roadway. For example, if a cabinet is assigned to principal roadway I-5 but also has loops on I-90, then the I-90 loops are considered auxiliary (A in column 1). In the case of on-ramps and exits, the principal name assignment prevails; in the previous example, if that cabinet had a loop whose lane could be considered both an exit from I-90 and an on-ramp to I-5, it would be assigned on-ramp status, since I-5 is the principal roadway designation for that loop's cabinet.

2) Loops that are part of a ramp metering station (i.e., the loop's data are used in the metering algorithm) are designated with an M in column 1. Speed loops and HOV lanes do not use the M in column 1.

3) All loops must have a roadway type. On-ramps and exits take on the roadway type of the typeof roadway that the ramp enters or leaves respectively.

Direction Rules

1) All loops must have a direction code: northbound, southbound, eastbound, westbound.

2) Reversible roadways will take on the direction of the increasing milepost. I-5 express lanes are considered northbound.

Lane Naming Rules

1) All loops will have one lane type.

2) Each metered lane can have two advance queue loops, one for the left movement (LA) and one for the right movement (RA). If no movement is associated with the advance queue loop, use RA.

3) Depending on ramp length, each metered ramp lane can have up to 2 queue loops: one intermediate (_I) and one queue (_Q). The queue loop and intermediate queue loop are evenly spaced, 200-500 feet apart, depending on ramp length. The demand (_D) and passage (_P) ramp loops are located just before and just after the meter signal, respectively.

4) Use _X for exit ramps, and _O for on-ramps that are not metered.

5) HOV lanes are specified by an H in column 4.

6) Use __ (double underscore) or H_ for mainline loops.

Numbering Rules

1) Loops of similar type are numbered from upstream to downstream.

2) All multi-lane roadway lanes are numbered from right to left looking downstream (in the direction of traffic flow). If one or more of the lanes are HOV lanes, they are numbered as if they were GP lanes.


3) Each HOV lane bypass will have the same lane number as the ramp lane it bypasses.

4) Speed loops are designated by an S in column 6. Where speed traps are installed, there are three relevant loops. The pair of mainline loops used to compute speeds consists of a mainline loop with an _ (underscore) in column 6 and a corresponding speed loop whose name is identical except for an S in column 6. The computed speeds and vehicle length information are stored in a "virtual" loop that has the same name, except with a T in column 6. The T loop does not actually exist in the field.

5) Ramp meter lanes are numbered right to left, upstream to downstream. All pre-1992 ramp meter lanes are numbered 2.
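Rule 4 implies that a speed trap's three loop names differ only in column 6: underscore for the physical mainline loop, S for the paired speed loop, and T for the virtual loop that stores the computed speeds. A hypothetical helper makes the relationship concrete:

```python
def speed_trap_names(mainline_name):
    """Given a mainline loop name with '_' in column 6, return the
    (mainline, speed, virtual) name triple for its speed trap."""
    assert len(mainline_name) == 7 and mainline_name[5] == "_", \
        "expected a 7-character name with '_' in column 6"
    return (mainline_name,
            mainline_name[:5] + "S" + mainline_name[6:],   # physical speed loop
            mainline_name[:5] + "T" + mainline_name[6:])   # virtual (computed) loop
```

For example, the mainline loop _MN___1 pairs with speed loop _MN__S1 and virtual loop _MN__T1.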

Station Naming

1) All station names contain the roadway and direction type, followed by _Stn or Hstn. (A station is typically a grouping of mainline general-purpose lanes that can be used as input data for ramp metering operations, although it can consist of up to 8 loops of any type. Data from a station reflect the sum of all volumes (and the average occupancies) associated with loops at that station.)

2) This naming scheme works only for mainline, CD, reversible, and HOV lanes on these roadway types.


CDR User’s Guide Version History

Version      Date     Description

unnumbered   1/30/98  Original version

2.0          2/4/98   More information on bin count, vol-occ envelope, nPds, and _Stn
                      Figures added
                      Minor wording changes
                      WSDOT review version

2.52         3/18/98  Incorporate WSDOT review comments
                      Update to reflect CDR version 2.52 feature set
                      Add Figure 6b
                      Add Table of Contents and Table of Figures
                      Correct descriptions of RMDC.LST file, network drive option,
                        vehicle length bin counts, and data validity flags
                      Minor wording and format changes
                      PDF (Adobe Acrobat) version created and placed on data CDs
                        produced starting 3/18/98