NATIONAL COOPERATIVE HIGHWAY RESEARCH PROGRAM
Frances D. Harrison
William Duke
Juliet Eldred
Spy Pond Partners, LLC
Arlington, MA
in association with
Michael Pack
Nikola Ivanov
MLP, LLC
College Park, MD
Joe Crosset
Larry Chan
High Street Consulting Group
Pittsburgh, PA
Subscriber Categories
Highways • Data and Information Technology
Research sponsored by the American Association of State Highway and Transportation Officials
in cooperation with the Federal Highway Administration
2019
The National Academy of Sciences was established in 1863 by an Act of Congress, signed by President Lincoln, as a private, nongovernmental institution to advise the nation on issues related to science and technology. Members are elected by their peers for outstanding contributions to research. Dr. Marcia McNutt is president.
The National Academy of Engineering was established in 1964 under the charter of the National Academy of Sciences to bring the
practices of engineering to advising the nation. Members are elected by their peers for extraordinary contributions to engineering.
Dr. John L. Anderson is president.
The National Academy of Medicine (formerly the Institute of Medicine) was established in 1970 under the charter of the National
Academy of Sciences to advise the nation on medical and health issues. Members are elected by their peers for distinguished contributions
to medicine and health. Dr. Victor J. Dzau is president.
The three Academies work together as the National Academies of Sciences, Engineering, and Medicine to provide independent,
objective analysis and advice to the nation and conduct other activities to solve complex problems and inform public policy decisions.
The National Academies also encourage education and research, recognize outstanding contributions to knowledge, and increase
public understanding in matters of science, engineering, and medicine.
Learn more about the National Academies of Sciences, Engineering, and Medicine at www.national-academies.org.
The Transportation Research Board is one of seven major programs of the National Academies of Sciences, Engineering, and Medicine.
The mission of the Transportation Research Board is to increase the benefits that transportation contributes to society by providing
leadership in transportation innovation and progress through research and information exchange, conducted within a setting that
is objective, interdisciplinary, and multimodal. The Board’s varied committees, task forces, and panels annually engage about 7,000
engineers, scientists, and other transportation researchers and practitioners from the public and private sectors and academia, all
of whom contribute their expertise in the public interest. The program is supported by state transportation departments, federal
agencies including the component administrations of the U.S. Department of Transportation, and other organizations and individuals
interested in the development of transportation.
FOREWORD
By Dianne S. Schwager
Staff Officer
Transportation Research Board
Recent federal legislation has established requirements for agencies to set performance targets and report on safety, pavement and bridge conditions, transit asset state of good repair, system performance, freight, and mobile source emissions. These requirements have resulted in increased visibility and attention to transportation performance management (TPM) and increased awareness of the importance of data within that process. Transportation agencies are recognizing that the value of performance management goes far beyond meeting federal requirements. NCHRP Research Report 920 will assist agencies in making visible progress in meeting their objectives.
Many transportation agencies collect data but need to improve their capabilities to trans-
form available data into useful information. This requires deliberate effort at all stages of
the data life cycle, from specification through analysis, to make sure that data is of sufficient
quality and that it can be integrated, visualized, and used to provide insights. Having people
with the right skills and experience to carry out these activities is essential.
Under NCHRP Project 08-108, a research team led by Spy Pond Partners, LLC was asked
to prepare guidance to improve data utilization in support of transportation performance
management. The research team conducted a literature review and a series of interviews to
identify current transportation agency practices for managing data supporting TPM. Based
on this practice review, they identified success factors and challenges related to efficient
and effective data utilization within the TPM processes. They created guidance organized
around six data life-cycle stages. The guidance includes a discussion of what is involved
in implementing each step and some of the critical choices to be made; a synthesis of key
points in the form of “Do’s and Don’ts”; checklists that can be used to assess agency capa-
bilities and identify opportunities for improvement; and illustrative examples.
While this guide draws many of its examples from the federally defined TPM areas (safety,
pavement, bridge, and system performance), it does not provide official guidance for MAP-21/
FAST Act target setting or reporting. It provides a framework for assessing current data
management practices and a source of ideas for practice improvement. Its purpose is to
promote practices that will enable agencies to go beyond meeting reporting requirements,
to get valuable insights from data that can be used to boost agency results.
The Guide for Practitioners is accompanied by a downloadable report, Developing
National Performance Management Data Strategies to Address Data Gaps, Standards, and
Quality: Final Research Report, available on the TRB website (www.trb.org) by searching
for “NCHRP Research Report 920.”
Table of Contents
Introduction
Foundation
Step 1: Specify & Define Data
Step 2: Obtain Data
Reporting
Step 3: Store & Manage Data
Step 4: Share Data
Insight
Step 5: Analyze & Use Data
Step 6: Present & Communicate Data
Cases
Case A: Arizona DOT Long-Range Plan Investment Trade-offs
Case B: Caltrans State Highway System Management Plan
Case C: Florida DOT Transportation Data Portal
Case D: I-95 Corridor Coalition Probe Vehicle Data Procurement
Case E: Maryland State Highway Administration's Incident After Action Reviews
Case F: MATOC Regional Operations Evaluation
Case G: Creating a Team of Data Experts to Support TPM at the Mid-America Regional Council
Case H: New Jersey DOT Project Assessment Reporting
Case I: Ohio DOT Winter Performance Management
Case J: Pennsylvania DOT's Statewide Transportation Operations Data Warehousing Business Plan
Case K: Virginia DOT's Pavement Monitoring Program
Appendix A: Capabilities Checklists
Introduction
This cycle begins with specifying and defining data requirements, and then
proceeds to obtaining the data, storing it in one or more repositories,
and processing it as needed to support use. Then, data are shared in
various forms, analyzed and used for decision making, and communicated
to different audiences.
Figure 3 shows the organizing framework for this guide. It extends this
cyclical model of data management to illustrate the process of utilizing
data for TPM.
The left side of Figure 3 shows how data can be used within TPM. This
provides the motivation and the requirements for the six data
management processes shown on the right side of the diagram.
Every investment in data—whether it is collecting new data or improving
reporting tools—should be evaluated based on how it helps the agency
to better identify needs and solutions, prioritize projects, allocate funding,
manage real-time performance, enhance accountability, and/or meet
reporting requirements.
The bottom row of Figure 3 identifies three fundamental processes for making use of data for TPM: Foundation, Reporting, and Insight:
• Establish a data Foundation—defining performance measures, identifying data requirements, selecting data sources, and obtaining the data.
• Strengthen Reporting—storing, managing, and sharing data so that consistent, quality-checked information reaches internal and external audiences.
• Build Insight—analyzing and using data to inform decisions, and presenting and communicating performance results.
Foundation
Step 1
Specify & Define Data
This step involves the up-front work to define data requirements for TPM.
"We don't have time to do it right, but we always have time to do it over."
– Anon
Step 1.1
Define Need, Vision, & Scope
Define business needs for data. Begin with the business objective(s) and concern(s) in mind and consider how your performance measures will be used to support them. The goal is to clearly articulate the business case for new data to agency managers and stakeholders. To do this, you need to answer three questions:
• What will new data tell us?
• How will we act on it?
• Is the cost of obtaining the data worth the value that will be added?
Define and document what information is needed to meet both internal agency decision-making requirements as well as external reporting and information-sharing requirements.
Case G
To support its TPM efforts and as part of a broader strategy for making effective use of data, the Mid-America Regional Council (MARC) data coordination committee compiled a top 10 list of high-priority data sets for automation. This top 10 created a road map for subsequent work activities to organize critical data sets at MARC. Priorities included pavement and bridge conditions, safety measures, and system performance.
• For agency decision support, document each decision: what is the decision, when is it made, who makes it, and who provides supporting information. Example decisions are which bridges to program for rehabilitation, which intersections to target for safety improvements, what percentage of available funding should be allocated to bridges versus pavements, and what strategies should be considered to address freight bottlenecks.
• For external performance reporting requirements, document what needs to be reported, when reports are due, and the required format for the information. Include references to any applicable regulations or guidance documents.
• For external performance information sharing, define what performance information the agency will share with the traveling public and with external partners. Document the intended uses of the information by each type of audience.
Tip
Don't limit your scope to the data needed to calculate performance measures. Also consider the data needed to understand trends or patterns, formulate strategies, and identify appropriate actions to improve performance.
Specify the data requirements. For each of the above business needs, identify the following:
• What data attributes are essential for calculating performance measures, and what additional attributes might be helpful for providing context and interpreting performance results?
• What scope of data coverage is needed?
Step 1.2
Define Performance Measures
Specify, test, and document performance measure calculations. For each performance measure, precisely document the data inputs and calculations needed. This documentation should have all of the information needed for a programmer/analyst to implement the calculations. Test the calculations with sample data and compare values and trends against other similar measures that may be available.
Case I
Ohio DOT established "regain time" as a winter performance measure and defined it as the elapsed time from the end of the snow or ice event to the time at which speeds recover to typical levels. Regain time is the type of measure that is easily communicated to decision makers and the general public, yet it ties well with operational actions that directly influence it.
Describe performance measures in plain English. Performance measures involving multiple data inputs and complex calculation logic should be documented in a manner that end users can understand. For example, the measure "Buffer Time Index" can be described as "the amount of extra buffer time a commuter needs to allow to avoid being late to work more than one day per month."
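To make this kind of documentation concrete, here is a minimal sketch (not from the guide) of how a Buffer Time Index calculation might be specified precisely enough for a programmer/analyst to implement. It uses the common definition of the measure (the 95th-percentile travel time's margin over the average travel time); the sample travel times are invented for illustration.

```python
import numpy as np

def buffer_time_index(travel_times_min):
    """Extra time cushion a traveler should budget, computed as the
    95th-percentile travel time's margin over the average travel time."""
    t95 = np.percentile(travel_times_min, 95)  # near-worst-case trip
    avg = np.mean(travel_times_min)            # typical trip
    return (t95 - avg) / avg

# Invented peak-period travel times (minutes) for one commute corridor.
sample = [22, 24, 23, 25, 27, 26, 31, 24, 23, 45, 25, 26]
print(f"Buffer Time Index = {buffer_time_index(sample):.2f}")  # ~0.39
```

A result of 0.39, for example, would be described to end users as "allow about 39 percent extra time over the average trip to be on time."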
Step 1.3
Identify Analysis & Reporting Requirements
Understand data user needs. Consider the information needs of different audiences and identify how they want to consume this information. Conduct interviews or focus groups to learn about analysis and reporting needs and desired improvements.
Specify and document data extracts and report formats. For some audiences, standard, static reports will be sufficient; others may need more flexible views of the data with the ability to drill down into details from a summary view or to obtain direct access to data via an application programming interface (API). Others may want to load detailed performance data into specialized analysis tools such as pavement management systems, safety analysis systems, or traffic simulation models. Identifying these different needs early can help to avoid unexpected data requests that may be difficult to satisfy once data systems and processes are established.
Case G
To ensure MARC's data developers understand how the data they manage is ultimately used, data personnel regularly participate in meetings with transportation planning staff, both to discuss high-level data needs and more focused detail-oriented breakout information.
Step 2
Obtain Data
This involves activities related to obtaining the data needed to support the entire TPM process, including
• data needed to calculate performance measures,
• data needed to provide context necessary to understand performance trends,
• data needed to understand root causes and factors contributing to performance results,
• data needed to set realistic targets, and
• data needed for selecting strategies to improve performance.
These data may be obtained from existing internal, external, and commercial sources. New data may also be gathered using in-house resources and/or via contract.
"Increasingly, data is gathered by information-sensing mobile devices, remote sensing, software logs, cameras, microphones, and wireless sensor networks. Global technological information per-capita capacity has approximately doubled every 40 months since the 1980s."
– Institute of Engineering and Technology
Step 2.1
Assess & Select Data Sources
Identify available data. Identify existing data sources that could be tapped to meet some or all of the requirements. Review sources within the agency, sources from external partners (federal agencies, state agencies, Metropolitan Planning Organizations (MPOs), local agencies, universities), and commercial sources. Obtain detailed information about each source, including data elements and their definitions, scope, date of last update, frequency of updates, available formats, costs, and use restrictions.
Case I
Ohio DOT wanted to track how quickly roads returned to normal speeds following a storm. They were able to leverage existing data sources, including their RWIS and commercial speed data.
Step 2.2
Acquire Data
It may be appropriate to launch a data acquisition effort if
• current agency data sources will not meet the requirements,
• there are no suitable commercial sources that meet the requirements (for an acceptable price),
• there is a business case for new data collection, and
• resources are available—both for initial collection and ongoing upkeep of the data.
Once you have decided to collect new data, you must determine whether to collect the data with in-house personnel or outsource data collection to a vendor. This will depend on the scale of the effort and in-house staff capacity.
Regardless of who will be collecting the data, it is essential to have a documented plan describing how it will be collected.
Create a data collection and quality management plan. Prepare a detailed plan to guide both data collection and quality management activities (see Table 1 for suggested elements of such a plan). Data quality management takes additional time and effort but should be integral to the data collection process. Without sufficient resourcing for data quality, there is a risk that the data collected will not be usable.
Remember that the people who collect the data are in the best position to ensure quality. Build in training activities so that they understand not only how to collect the data, but why the data are being collected and what the intended uses are. Check in with them during the data collection and see if they have suggestions for improving the process.
Case E
Maryland SHA captures a rich set of data about highway incidents, including the names of responders, the road surface conditions, lane closings/openings over the course of the incident, and operator notes. These data are then combined with data from ITS devices [Dynamic Message Signs (DMS), Closed-Circuit Television (CCTV) images, volume and speed detectors, signals] and probe-based speed data.
Reporting
Step 3
Store & Manage Data
This step includes validating, cleaning, normalizing, aggregating, and integrating data; storing the data in one or more repositories—either within the agency or "in the cloud"; producing documentation needed for both technical and business users of the data; and managing access to the data—both to protect it from unauthorized use and to ensure that it is accessible to those who need it. This step also includes activities to design, develop, and manage databases and technical infrastructure for data storage and data integration.
"Data is just like crude. It's valuable, but if unrefined it cannot really be used."
– Michael Palmer
Step 3.1
Establish Databases
Design databases to support analysis needs. Performance measures rely on a deep archive of data to develop an accurate baseline; understand multi-year, seasonal trends; and establish reasonable targets. Database design supporting performance measures should consider requirements for reporting, trend analysis, and root cause analysis. Design should also consider the possibility that requirements may change over time—for example, an agency may decide to calculate different metrics, drawing on the same raw data sources. Therefore, both raw and transformed data may need to be stored. When raw data is voluminous (for example, pavement images), processed data can be maintained in active storage and the raw data can be kept in lower-cost archive storage.
Case G
MARC repurposed two open positions, including a "GIS specialist" and a "demographer," into "data developer" positions capable of creating and managing systematic workflows for data gathering and organization. The developers have since created automated processes to obtain data sets and import them into SQL databases. The databases have front-end interfaces that greatly simplify the process of querying them to extract the information MARC needs.
Determine data retention policies. If retention policies are not modernized to reflect changes in storage costs, or if they are set without full understanding of business needs, there is a danger of loss of valuable data and TPM capability. Ten or more years ago, data storage hardware was both physically large and expensive. Therefore, agencies implemented data retention policies to better manage budgets and constrained physical space in data centers by limiting the amount of storage and the duration of the storage. Both the size and cost of storage have dropped dramatically over the years. With these dramatic cost savings and expanded storage options, agencies can re-examine their retention policies to make sure they align with business needs. For example:
• An agency may be required to report performance of the system to the federal or state government in 15-minute intervals. That agency may be tempted to aggregate raw data coming in at 1-minute intervals and retain only the aggregate information to save space. Later, the agency may identify a need to track incident management performance metrics—requiring the original 1-minute data that tracks growing and shrinking queue lengths, user delay, arterial signal performance, and the effects of secondary incidents. If the 1-minute data are gone, the agency may be unable to track those metrics accurately (if at all); the sketch after this list illustrates the information loss.
• A data set may have limited value by itself and would be considered unimportant for retention. However, when combined with other data sets, it may yield insights that make it well worth keeping.
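As a minimal sketch of the retention risk in the first example above, assuming a hypothetical 1-minute detector speed feed, the following shows how a 15-minute aggregate permanently hides the onset, duration, and depth of an incident queue:

```python
import pandas as pd

# Hypothetical 1-minute detector speeds (mph) over one peak hour.
idx = pd.date_range("2019-01-15 07:00", periods=60, freq="1min")
speeds = pd.Series(55.0, index=idx)
speeds["2019-01-15 07:20":"2019-01-15 07:32"] = 12.0  # 13-minute incident queue

# The 15-minute aggregates an agency might retain for reporting:
print(speeds.resample("15min").mean())
# The aggregates show only a modest dip across two intervals; the exact
# start, end, and severity of the queue cannot be recovered once the
# 1-minute rows are deleted.
```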
Plan for data security. Implement a sound data back-up strategy that
will allow you to restore data in the event of a hardware failure, cyber-
attack, or inability to physically access facilities. If your data contains
personally identifiable information (PII) or other sensitive elements, it
should be clearly categorized as sensitive and managed to prevent
unauthorized access. TPM-related data that may be sensitive include
crash reports, travel survey data, and data from mobile devices. Some
agencies have policies that do not allow sensitive data to be stored in the
cloud. However, many cloud providers have a robust security policy to
both prevent and recover from cybersecurity compromises. In contrast,
agencies may have limited funds and expertise to implement robust
security mechanisms.
Maintain metadata. While some data sets are considered “self-
explanatory,” metadata and documentation are critical. For example,
highway crashes may appear to be a straightforward data set. On closer
examination, you may find that data from one jurisdiction is gathered
using different definitions for serious injuries than another. Data may be
collected using a mixture of electronic and manual processes with
different quality assurance processes applied. Newer data sets may be
provisional and subject to further updates. Metadata and documentation
become even more important when data is used in calculations to
support TPM. Two individuals can use the same raw data and measure
definition, but execute calculations differently depending on the context
and interpret the results completely differently.
Metadata should be maintained at both the data set and data element
level. Data set metadata covers information such as source, spatial and
temporal scope, quality, and access classification. Data element metadata
covers meaning, origins, usage, value domain, and format. Standards for
data set level metadata can be found in International Organization for
Standardization (ISO) 19115 and the Office of Management and Budget’s
(OMB) Project Open Data (POD) Schema. Standards for data element
level metadata can be found in ISO/IEC 11179.
Metadata and documentation that are frequently updated and audited keep confusion and variation in interpretation to a minimum. Metadata and documentation must also be properly versioned so that data processing spanning different versions of metadata can be interpreted and processed properly.
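As an illustration only (the field values are invented), a data-set-level metadata record using several fields from the POD schema mentioned above might look like this:

```python
import json

# Illustrative data-set-level metadata record using several Project Open
# Data (POD) schema fields; all values are invented for this example.
crash_metadata = {
    "title": "Statewide Crash Records",
    "description": ("Police-reported crashes. Serious-injury definitions "
                    "varied by jurisdiction prior to 2016; see notes."),
    "modified": "2018-12-31",
    "spatial": "Statewide",
    "temporal": "2010-01-01/2018-12-31",
    "accessLevel": "restricted public",  # contains PII, so not fully open
    "publisher": {"name": "Example DOT, Office of Traffic Safety"},
    "distribution": [{"mediaType": "text/csv"}],
}
print(json.dumps(crash_metadata, indent=2))
```

Note how the description captures exactly the kind of jurisdiction-to-jurisdiction definitional difference the crash data example above warns about.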
Step 3.2
Load & Integrate Data
Establish repeatable data loading processes. Ad hoc data loading conducted in a rushed manner is a recipe for disaster. Repeatable processes need to be set up and, ideally, automated to load and transform raw data into a form suitable for use. When a problem occurs with a data load, procedures should be in place to roll back and then repeat the process once the issue is identified. Sometimes, a series of loads is needed to refresh data in various repositories. For example, new bridge inspection data may be loaded into a staging database for review and quality assurance. The data may then be transferred to the bridge management system database for analysis and to the agency's road inventory system. These data flows should be thoroughly tested, automated, and well-documented. Accurate and detailed documentation is essential, especially when data loads occur infrequently and there are multiple systems and staff from different business units involved.
Case G
At MARC, use of automated processes and commercial data integration tools for maintaining key data sets has greatly simplified the process of querying, which means MARC is able to dedicate more time to analyzing data, not just collecting it.
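A minimal sketch of the staging-and-rollback pattern described above, using Python's built-in sqlite3 module as a stand-in for an agency database; the table, columns, and QA rule are hypothetical:

```python
import sqlite3

def load_inspections(db_path, records):
    """Repeatable staging load: all rows land in one transaction, so a
    failed load rolls back cleanly and the job can simply be rerun."""
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("""CREATE TABLE IF NOT EXISTS staging_inspections
                            (bridge_id TEXT, inspected_on TEXT, rating INTEGER)""")
            conn.execute("DELETE FROM staging_inspections")  # idempotent rerun
            conn.executemany(
                "INSERT INTO staging_inspections VALUES (?, ?, ?)", records)
            # QA gate before data moves on to the bridge management system.
            bad = conn.execute("""SELECT COUNT(*) FROM staging_inspections
                                  WHERE rating NOT BETWEEN 0 AND 9""").fetchone()[0]
            if bad:
                raise ValueError(f"{bad} rows failed the rating range check")
    finally:
        conn.close()

load_inspections("tpm_staging.db", [("B-101", "2019-04-02", 6),
                                    ("B-214", "2019-04-03", 7)])
```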
Store both raw and processed data. Storing transformed performance data in addition to raw data can facilitate analysis and reporting.
Make use of data integration tools. There is a wide array of commercial and open source tools available supporting data integration processes. Some tools are geared to building extract-transform-load processes for data warehouse environments; others are geared to big data sets. Several excellent tools focus on integrating geospatial data. Use of these tools requires expertise and involves a learning curve, but it can save a great deal of time for data loading and integration tasks while also reducing the risk that errors are introduced through highly manual processes.
Case K
Virginia DOT integrated pavement condition data with data on planned paving projects to produce performance monitoring reports that tracked anticipated versus actual changes in condition and the likelihood of achieving performance targets. The data integration effort relied on standardization of several data elements across two databases.
Step 3.3
Assess & Improve Data Quality
Data quality assessment. Poor quality data may have significant impacts on calculated performance metrics and therefore impact TPM decisions. Step 2.2 discussed the importance of planning for data quality as part of data acquisition and outlined the contents of a data quality management plan. However, there may be already-existing data sets needed for TPM that are of unknown quality. A data quality assessment can be conducted to determine suitability of a data set for use in TPM. Quality assessment can consider multiple characteristics, including completeness, currency, accuracy, and consistency. Data accessibility and interoperability are also sometimes considered. Assessing data quality involves establishing data quality metrics and measurement methods. For example, a metric for crash data completeness might be the percentage of data records that are missing a location code. This could be measured through a simple data query. Accuracy is typically assessed through a combination of independent verification for a sample of the records and application of validation checks to make sure measured values are within expected ranges.
"Data that is loved tends to survive."
– Kurt Bollacker
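A minimal sketch of the two example checks just described: a completeness metric for crash location codes, and a range-based validation flag. All records are invented:

```python
import pandas as pd

# Invented crash records; None marks a missing location code.
crashes = pd.DataFrame({
    "crash_id": [101, 102, 103, 104, 105],
    "location_code": ["R12+0.4", None, "R07+1.2", None, "R12+3.1"],
})

# Completeness metric: percentage of records missing a location code.
pct_missing = crashes["location_code"].isna().mean() * 100
print(f"{pct_missing:.0f}% of crash records lack a location code")  # 40%

# Validation check on a companion probe-speed feed: readings outside the
# expected range are flagged for review rather than silently dropped.
speeds_mph = pd.Series([62.0, 58.5, 344.0, 61.2])
print(speeds_mph[speeds_mph > 150])  # one suspect reading
```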
Quality management. Quality management is a continuous process that starts prior to data acquisition and continues through the entire data life cycle. It should include analysis and flagging of data records that fail specific quality policies and thresholds. For example, pavement roughness measurements less than 30 inches/mile or travel speeds over 150 mph might be flagged as suspect.
Case D
I-95 Corridor Coalition data use agreements contain explicit data quality specifications that ensure 3rd-party-provided data meets required quality standards to support TPM.
It is important to find the right balance when planning for data quality improvement. All too often, agencies spend large amounts of resources attempting to clean, scrub, and validate data—only to find that there continue to be data issues regardless of how much time and energy is spent in cleaning. Perfection becomes the enemy of good, and agencies end up never fully using the data to inform decisions. Worse, the department (or person) responsible for the data hides it or prevents others from using it due to potential issues, fear, liability, etc. As soon as data (in any form) become available, it can and should be analyzed for data quality and consistency. The act of analyzing data, even when it has known imperfections, is itself one of the most effective ways to surface and correct quality problems.
Step 4
Share Data
This step includes sharing transportation performance data across business units within an agency, across agencies, or with the general public. This includes but is not limited to transmitting data and reports to meet reporting obligations.
"There's a digital revolution taking place both in and out of government in favor of open-sourced data, innovation, and collaboration."
Step 4.1
Establish Reporting & Presentation Infrastructure
Select and deploy analysis and reporting tools. Data analysis and reporting tools that are available to agency staff are a critical element in making effective use of data. These can include tools that fuse "siloed" data from disparate sources, tools that fill in gaps (missing data), and those that identify or screen data outliers. Other important tools support analytics and visualization that help the agencies "see" into the data—asking questions, identifying issues, deriving meaning from the data, and communicating those insights to others. Tools include commercial business intelligence packages that support both traditional reports and dashboards; GIS tools; statistical analysis packages; and specialized tools geared to particular types of performance data—for example, asset management systems and analytics platforms for congestion performance reporting.
Case E
Maryland State Highway Administration (SHA) uses Regional Integrated Transportation Information System (RITIS) visual analytics to combine disparate data sets and derive valuable information as part of after action reviews for operational improvements.
While it is unlikely that a single reporting and analysis tool can meet all of the agency's needs, it is important to keep in mind that every new tool requires support to bring on new releases, train users, and troubleshoot issues. It is best to follow a disciplined and coordinated process of defining needs and requirements and considering whether existing tools are sufficient prior to bringing on a new tool.
Make build versus buy decisions. Developing the appropriate analytics software and databases that make the data easier to analyze and accessible to end users can be a significant hurdle for agencies. For an agency to build successful tools independently, it will typically need to draw upon the expertise of software engineers, system architects, user interface and user experience design specialists, developers, and project managers. The tools will need to be maintained over time; therefore, ample documentation and knowledgeable staff are needed that can be called upon over the course of many years to keep the tools up to date. Building complex tools with extremely small teams can be risky and costly to an agency.
Because of the high barrier to entry and continuing maintenance costs of developing custom tools, many agencies are now choosing either to purchase off-the-shelf tools or to leverage tools that other agencies have already built and proven.
In-House Development
• Allocate ample time to defining requirements for usability and functionality, and recruit multiple user groups to build an understanding of expected usage.
• Find an experienced partner. Attempt to procure the services of a consultant who has performed similar work for other agencies. Analysis tools may need customization and tailoring, but a proven provider is often more reliable than a standard consultant.
• Recognize that initial startup will be costly. There are several private-sector and university providers that have excellent archiving, fusion, and analytics products. Some of these systems work across borders and across multiple agencies. Consider adopting similar technologies or products as neighboring jurisdictions when possible so that shared experiences, knowledge, and benefits from shared resources can be leveraged.
• Avoid "black box" solutions that do not explain the underlying technologies, algorithms, or methods used to calculate the performance measures. Ensure the chosen provider has documented procedures that can be shared with software engineers and data analysts. Some providers have multistate/agency steering committees that collectively drive the features of the archive products to ensure they are constantly meeting user needs.
Purchasing Tools
• More and more states and MPOs are starting to purchase probe-based speed data; however, not as many agencies are investing in tools to analyze the data in ways that enable better decisions. Probe data vendors, for example, have analytic tools that are sold at prices that are less expensive than the effort needed to reproduce those tools inside of the agency. These tools dramatically improve an agency's ability to turn the purchased data into usable information.
Purchasing Services
• For agencies that are not comfortable using analytic tools and are not interested in doing in-house data analysis, hiring outside consultants or university support may prove to be a viable option. Consultants and universities frequently have access to scientists, statisticians, database programmers, economists, and other analysts who would otherwise be difficult to hire at state and local agencies. When seeking out-of-agency services, it is wise to review product and project portfolios for examples of prior work to ensure an agency's needs match the skills of the consultant or university personnel being proposed on a project.
• When hiring outside support (consultants or universities), consider a phased approach to projects. Start small, and ensure the consultant is able to perform basic analysis and fusion tasks with the data available. If the consultants are successful, then work can progress to bigger analysis tasks—adding layers of complexity and building on prior work and available data sets. Initiating extremely large analysis tasks that are not easily broken down into smaller deliverables can be a recipe for confusion, cost overruns, disappointment, and waste.
• Regardless of who does the work, it is advisable to avoid mandating that consultants use specific tools, technologies, or techniques to deliver a solution. New technologies, methodologies, and tools are developed quickly and often. Requiring outdated technologies can unnecessarily limit the agency and the consultant in performing analytical tasks. Allow the consultants to drive these decisions based on what they perceive to be the most efficient and effective tools and methods.
Step 4.2
Establish Data Standards & Formats
Take advantage of data standards. There are a number of data standards that can be adopted for agency data sets and/or used when sharing transportation system performance data between agencies (see Table 2). Some data standards cover data dictionary information (data elements and their definitions); others are more comprehensive and specify data formats, message structures, and technical mechanisms and protocols for sharing.
"The wonderful thing about standards is that there are so many of them to choose from."
– Grace Murray Hopper
Select file formats. Certain file formats have advantages over others when it comes to sharing data between agencies. For example, exchanging PDF files containing detour plans may make sense on an individual case basis, but it significantly reduces the ability to automatically process information and incorporate it in TPM processes. Ideally, data should be formatted in a machine-readable format that provides the most flexibility for integration in TPM tools.
Common data file formats found in open data platforms include JSON, XML, CSV, and KML.
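For illustration (the record and file names are invented), the same performance record can be written in two of these machine-readable formats using only Python's standard library, which is what makes them easy for a partner agency's TPM tools to ingest automatically:

```python
import csv
import json

# One invented segment-level performance record.
record = {"segment_id": "I-70-EB-031", "period": "2018-Q4",
          "avg_speed_mph": 54.3, "reliability_index": 1.42}

# Machine-readable JSON that downstream tools can parse directly.
with open("segment_perf.json", "w") as f:
    json.dump([record], f, indent=2)

# The same record as CSV for spreadsheet-oriented users.
with open("segment_perf.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=record.keys())
    writer.writeheader()
    writer.writerow(record)
```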
Step 4.3
Publish Data
Designate authoritative data sources. Authoritative data sources for
performance measure calculation should have been established as part of
Step 1.3—Identify Analysis and Reporting Requirements. In preparation
for publication, it is also important to designate authoritative sources for
the computed performance measures and for any contextual data to be
provided in the reports. Only designated authoritative sources should be
used for reporting. Following this guideline will ensure that information
released to the public is consistent and quality-checked.
Determine what data to share. The growing “open data” movement
is creating the need for agencies to decide what data to proactively make
available to the public, what data to provide on request, and what data to
keep restricted. Several states have developed policy guidance on data
classification. For example, the District of Columbia defines five levels:
• Level 0—Open (the default classification)
• Level 1—Public, Not Proactively Released (e.g., due to potential
litigation risk or administrative burden)
• Level 2—For District Government Use (exempt from the
Freedom of Information Act but not confidential and of value
within the agency)
• Level 3—Confidential (sensitive or restricted from disclosure)
• Level 4—Restricted Confidential (unauthorized disclosure can
result in major damage or injury)
DC has adopted the philosophy that data should be open by default and
restricted only if there is a reason to do so.
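A minimal sketch of how "open by default" can be automated, assuming a hypothetical data catalog tagged with DC-style classification levels; only Level 0 data sets flow to the open data portal without further review:

```python
# Invented catalog entries tagged with DC-style classification levels.
catalog = [
    {"name": "pavement_condition", "level": 0},   # open by default
    {"name": "travel_survey_raw",  "level": 3},   # confidential (contains PII)
    {"name": "draft_crash_report", "level": 1},   # public, not proactively released
]

# "Open by default": only Level 0 data sets are published automatically.
to_publish = [d["name"] for d in catalog if d["level"] == 0]
print(to_publish)  # ['pavement_condition']
```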
Select data sharing methods. Sharing methods can vary from very
basic file transmission, such as FTP, to more complex asynchronous,
persistent transmission methods such as subscriptions, web services, and
others. Open data sharing platforms such as data.gov have been
established at the federal level and by many state agencies. While simple
methods may be quick and inexpensive to implement, they can, in some
situations, diminish the value of shared data. For example, files posted to
an FTP site once a day introduce unnecessary latency and reduce certain
TPM capabilities.
Partner with the private sector. Data sharing arrangements with private-sector data providers have become increasingly prevalent in recent years. Not only are agencies benefiting from obtaining new data sets from the private sector, but they are also benefiting from the private-sector value-add to existing agency data sets.
Agencies must be careful about negotiating data sharing contracts with private-sector entities. In particular, agencies should pay particular attention to data use restrictions and seek maximum flexibility in use of data. This includes the ability to share data with universities and partner agencies and the ability to generate and share reports and summaries with the general public. Agencies, in turn, should treat the private sector as equal partners who can assist in disseminating information to the public and providing valuable insight into customers' behavior and travel patterns.
Case D
The I-95 Corridor Coalition collaboratively developed a public–private partnership between member agencies and 3rd-party data providers to take advantage of the latest private sector data offerings. They created a liberal and flexible model data use agreement that has become the "gold standard" for agencies and consortiums across the country for over a decade.
Provide tools for easy data access. Data has little value if it is not easily accessible. With continued improvement in bandwidth capabilities, web-based tools and data portals are becoming the norm. These tools allow users to log in and access data from anywhere with an internet connection. In addition to web-based access, the user interface and efficiency of the applications are critical. Poor user interfaces can make it difficult to understand what data and capabilities are available. Similarly, executing a query on a data set and waiting several hours or even days to receive an answer is unacceptable. Users must be able to quickly define a question and receive a response to make data and information useful. This means that agencies need to go beyond establishing databases or big data platforms and ensure that appropriate tools exist to access, visualize, and manipulate data for TPM. In many cases, more than one type of tool will be required to meet the needs and skill sets of different types of users. For example, some agencies make available one reporting package for technical staff and "power users" and a second for more casual users.
Insight
Step 5
Analyze & Use Data
The Analyze & Use step begins once data are converted into information. Data consist of values and figures that on their own have limited value. Information is a result of data that have been processed, organized, and interpreted to provide insight. Analyzing and using data for TPM involves consumption of data by analysts, planners, managers, engineers, and operations personnel to inform decision making or direct real-time system management. In the Store & Manage step, reporting and analysis tools are selected and configured. In the Analyze & Use step, these tools are used to support decision making. Moving from installation of a tool to productive use of the tool requires, at a minimum, the following:
• Identification of the intended users and uses for the tool;
• Designation of one or more individuals to develop specialized expertise with the tool (or engagement of a consultant to play this role);
• Training and support for additional users of the tool; and
• Iterative application and adjustment to tool parameters and configuration.
"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge; the serenity to accept the things we cannot predict, the courage to predict the things we can, and the wisdom to know the difference."
– Nate Silver
Step 5.1
Analyze Trends
Assemble data. Assemble historical performance data for as many years as possible. If there have been changes to measurement methods, document when these occurred, but do not discard the older data. Even if there are discontinuities in the trend line, each section of the line can still be instructive for understanding how performance changed within each applicable time period.
Case E
Maryland SHA used an incident timeline tool and graphics showing queue buildups and delay costs to help convince the responder community to change its policies for blocking lanes.
Review and analyze the data. Plot the data to visualize variations over time. If appropriate, apply smoothing techniques such as moving averages to reduce noise. Use statistical techniques to distinguish the underlying trends in the data from seasonal variations and one-off variations due to events or other exogenous factors. Involve someone with expertise in statistics to be sure that the methods being applied are valid for the data being analyzed.
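A minimal sketch of the smoothing step, using an invented monthly performance series with a built-in seasonal swing; a 12-month centered moving average suppresses the seasonality and noise, leaving the underlying trend for baseline and target-setting work:

```python
import numpy as np
import pandas as pd

# Invented monthly performance series: slow improvement plus seasonality.
idx = pd.date_range("2014-01-01", periods=60, freq="MS")
rng = np.random.default_rng(0)
months = idx.month.to_numpy()
values = (np.linspace(82, 88, 60)                  # underlying trend
          + 3 * np.sin(2 * np.pi * months / 12)    # seasonal variation
          + rng.normal(0, 1, 60))                  # one-off noise
series = pd.Series(values, index=idx)

# Centered 12-month moving average reveals the trend behind the swings.
smoothed = series.rolling(window=12, center=True).mean()
print(smoothed.dropna().round(1).head())
```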
Step 5.2
Identify Patterns & Causes
Visualize data. Many people think of visualization as an end product—something produced after an analysis is complete to help communicate the results (or a story) to the public. While this is often the case, visualization can also be leveraged during the analysis life cycle as a way to better understand what is in your data, to identify outliers, and even to point out flaws that exist in your data. Interactive visual analytics can lead to insights earlier in the TPM process, sometimes more so than at the end.
Case A
Arizona found in its Long-Range Transportation Plan process that for some performance areas, good outcome-oriented performance curves can be established. Where this was not possible, however, ADOT relied on simple curves reflecting the percent of identified needs met at a given allocation level. The lesson was to "not let the perfect become the enemy of the good."
Interpret data. Involve a group of experienced analysts in interpreting the observed trends. Look for correlations between performance trends and factors such as changes in revenues or budget allocations, fuel prices, economic conditions, or legislation/regulation. Use statistical packages and available analytical tools to analyze correlations. Develop insights that can be communicated to stakeholders (see Step 6—Present & Communicate Data).
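As a small illustration with invented annual figures, a statistical package can screen for the kinds of correlations described above; correlation is a screening step for the analyst group, not proof of causation:

```python
import pandas as pd

# Invented annual figures: maintenance budget vs. pavement in good condition.
df = pd.DataFrame({
    "budget_millions": [210, 225, 190, 240, 260, 255, 270],
    "pct_good":        [61.0, 62.5, 59.8, 63.9, 66.2, 65.0, 67.1],
})

# Pearson correlation between budget and condition over the series.
print(round(df["budget_millions"].corr(df["pct_good"]), 2))
# A statistician should weigh in before anyone infers causation from a
# short annual series like this one.
```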
Step 5.3
Predict Future Performance
Create predictive models. Develop realistic assumptions about future work based on available revenues, program budgets, and improvement programs. Use available analytical tools to predict future performance based on funding levels and/or specification of planned improvements. These include pavement and bridge management systems, safety analysis tools, travel demand models, and other specialized simulation tools. In addition, predictive analytics tools are available that make use of a variety of statistical techniques, including machine learning, to predict future performance based on available data.
Case B
Caltrans developed a unified approach to presenting predictions of asset performance and need that combined results from mature pavement and bridge management system runs with analytical methods based on available data and expert judgment for other assets. The approach involved combining data from multiple disparate data sources that were at different levels of completeness and based on different analysis methodologies at varying levels of sophistication.
Applying analytical tools typically involves initial calibration—adjusting model parameters so that predictions are in line with observed conditions. This is followed by an iterative process of testing different assumptions and reviewing results for reasonableness. Every model has limitations; it is the role of an analyst to understand and explain these limitations.
Predictive models generally require specialized expertise to set up and use. Significant modeling tasks can be outsourced if this expertise is not available in-house. However, staff with analytical skills, patience, and interest in modeling can be trained to take on ownership and apply these tools as well as oversee work of contractors.
There are variations in available predictive tools for different performance areas, and it can take time to develop robust modeling capabilities. Agencies can start with basic approaches to prediction that rely on expert judgment and rules of thumb. As long as methods are clearly documented and caveats are stated, these approaches can provide value.
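A minimal sketch of the "basic approach" end of this spectrum: an invented condition history fit with a linear trend and extrapolated under a documented funding assumption. Real management systems model deterioration and treatments far more rigorously; this only illustrates the documented-assumption pattern:

```python
import numpy as np

# Invented history: percent of bridge deck area in good condition by year.
years = np.array([2013, 2014, 2015, 2016, 2017, 2018])
pct_good = np.array([47.2, 46.8, 46.1, 45.9, 45.2, 44.8])

# Basic approach: fit a linear trend and extrapolate, under the documented
# (and debatable) assumption that current funding levels continue.
slope, intercept = np.polyfit(years, pct_good, 1)
for year in (2019, 2020, 2021):
    print(year, round(slope * year + intercept, 1))
```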
Step 5.4
Establish/Update Targets
Integrate results of trend analysis and predictive analysis. Establish a baseline value based on the trend line. Integrate the results of trend analysis and performance predictions to set targets for future performance.
Case A
Arizona DOT's Long-Range Plan process developed performance curves for different investment categories, including preservation, modernization, and capacity. These curves were used to analyze the impacts of a change in investment on different performance outcomes.
Document the analysis. Documentation should include data sources, the steps taken to prepare and combine them, any key assumptions or parameters used (e.g., inflation rates), and observations about data anomalies or correlations. Good documentation will enable the analysis process to be repeated in the future by other staff members and will serve as a valuable resource if questions come up about the results.
Step 6
Present & Communicate Data
The Present & Communicate step involves developing effective ways of communicating the message and story behind the data.
The process of communicating performance results is likely to lead to questions about the data and analysis. Data analysts should anticipate that there will be iteration between the communication and analysis steps. The need for data improvement or augmentation may also be identified as new questions arise. Over time, these improvements will strengthen the agency's ability to make effective use of data to improve performance.
"The greatest value of a picture is when it forces us to notice what we never expected to see."
– John Tukey
Step 6.1
Develop & Communicate Performance Stories
Tell the story. Once data are successfully translated into information, it is important to provide context and the "so what" surrounding that information. One of the most effective ways to accomplish this is through a storytelling approach. Information consumers (the audience) must buy into the story for the information to be effective. The focus of the story must not be on data and information, but on the message that the information is supporting. For example, when the New Jersey Department of Transportation and the Delaware Valley Regional Planning Commission were trying to convey the importance of specific roadway projects to senior managers and public officials, they were challenged to communicate about complex performance measures related to reliability, safety, congestion trends, economic impacts, and more.
Case H
New Jersey DOT developed project assessment summary pamphlets that tell a compelling story about how investment in a project benefitted the general public.
After many unsuccessful attempts at producing thorough reports for decision makers, they tried an information visualization approach. They developed "elevator pitch" brochures that conveyed, primarily through graphics, the performance measures related to individual projects. The visualizations contained within the brochures could be easily interpreted by both engineers and the public. Accompanying narratives were short, and the brevity of the brochures meant that more people ultimately read and understood the message.
When making a new investment in sensor infrastructure to support TPM, the message should not be that 200 more sensors will provide more data about congestion. Instead, the message should be that the new sensors will allow operators to more quickly identify traffic slowdowns, which will enable signal timing changes to ensure that inbound commuters make it to work on time. The audience must be able to relate to the outcomes and understand how they are being affected by changes in performance of the system.
In 2012, an article in Governing magazine reported, "The [Gray] Notebook tells Washington citizens pretty much whatever they might want to know about how their transportation system is working.…The first Gray Notebook—as it came to be called because of the color of its cover—was published in 2001, and legislators loved it. Two years later, those legislators approved a 5-cent increase in the gas tax to fund new transportation projects."
Basic
o Managers and analysts meet to review and interpret performance results.
o Story lines for performance results are developed, reviewed, and communicated.
o Training is offered to internal staff to build skills in data presentation and communication.
o Staff have capabilities to present data in a variety of formats tailored to the needs of different audiences, including heat maps, thematic maps, timelines, and other infographics.
o A combination of narrative and graphical presentation is used to communicate performance information.
Advancing
o Feedback from data consumers is sought and used to improve communication of information to different target audiences.
o Individuals with expertise in data visualization and communication are available to support development of performance data products.
o Social media is used to communicate key results or draw people to more detailed communication products.
o Specialized visualization and analysis environments have been developed—e.g., virtual reality simulators.
Do:
• Leverage visualization tools for your data analysis and for communicating TPM to the public and decision makers.
• Employ "best practices" in visualization that aim to communicate with users, not deceive them.
• Leverage 3rd-party visualization tools and/or professionals to support your analysis.
• Use visualization to support your narrative.
Don't:
• Wait until the end of your project to begin to interpret the results.
• Use all the bells and whistles in a chart or visualization tool; clean and simple graphics tell compelling stories.
• Try to be too complicated with your visualizations; you're trying to tell a story, not confuse people.
• Expect visualization alone to tell your story; some supplemental explanatory text will be required.
Cases
Case A
Overview
Arizona's LRSTP four-step planning process relied equally on data, public engagement, and use of multi-objective decision analysis (MODA) software. First, it mined rich engineering data about capital and operating needs and revenues. Second, it used these data within a public engagement process to identify stakeholder goals and priorities. Third, it used the MODA software to enable stakeholders to explore performance projections of Arizona's future transportation safety, congestion, and infrastructure condition under various alternate transportation futures propelled by divergent investment strategies. Finally, it used stakeholder input to inform a recommended investment strategy.
Multi-Objective Decision Analysis (MODA)
Multi-objective decision analysis (MODA) is a tool for resolving resource allocation problems where choices involve trade-offs among competing objectives that feature sacrifice of one objective for the sake of another. MODA uses data about stakeholders' preferences and decision outcomes to guide selection of optimum choices. Initially, the relative importance ascribed by stakeholders to different choices is scored using weighting techniques. Subsequently, outcomes of different choices are evaluated in terms of their relative alignment with stakeholders' priorities to arrive at an optimum solution. (See the scoring sketch below.)
Foundation: Specify & Define Data
Estimates of capital and operating needs by transportation investment category. Arizona Department of Transportation oversees a statewide system of major highways and supports transit, rail, aviation, and non-motorized transportation facilities around the state. The 2040 Plan used established data sources and modeling tools such as FHWA's Highway Performance Monitoring System (HPMS), National Bridge Investment Analysis System (NBIAS), and Highway Economic Requirements System—State Version (HERS-ST) to document $89.5 billion in baseline 25-year needs for all of the state's major transportation investment categories:
investment categories:
• Preservation investment needs to maintain pavement and bridges in
good repair;
• Modernization investment needs for upgrades like safety
improvements and intelligent transportation systems;
• Expansion investment needs for added lanes, new roadway
alignments, or interchanges;
• Operations and maintenance investment needs for routine work,
like patching potholes, fixing guardrails, mowing, and snow removal;
and
• Non-highway investment needs for transit, rail, non-motorized, and
aviation modes.
These estimates were derived from segment-by-segment data and
analysis of all engineering work needed to achieve and maintain an
acceptable level of performance throughout the state for each major
investment category. In combination with funding information, they [...]
¹ Because annual operations spending levels are determined independently by the
Arizona legislature and ADOT does not have the ability to allocate these funds to
highway capital spending, Operation and Maintenance needs were excluded from the
scenarios.
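The weighted-sum scoring at the core of a MODA evaluation can be
illustrated with a short sketch. The objective weights, investment
strategies, and scores below are hypothetical placeholders, not values
from ADOT's analysis; this is a minimal sketch of the technique only.

```python
# Minimal weighted-sum MODA sketch. All weights and scores are
# hypothetical illustrations, not values from ADOT's 2040 Plan.

# Stakeholder-derived importance weights for each objective (sum to 1.0).
weights = {"safety": 0.40, "congestion": 0.35, "condition": 0.25}

# Projected performance of each investment strategy, scored 0-100
# against each objective (e.g., from modeling tools).
strategies = {
    "preservation_focus": {"safety": 55, "congestion": 40, "condition": 90},
    "expansion_focus":    {"safety": 60, "congestion": 85, "condition": 50},
    "balanced":           {"safety": 70, "congestion": 65, "condition": 70},
}

def moda_score(outcomes, weights):
    """Weighted sum of objective scores: higher means better
    alignment with stakeholder priorities."""
    return sum(weights[obj] * score for obj, score in outcomes.items())

ranked = sorted(strategies.items(),
                key=lambda kv: moda_score(kv[1], weights), reverse=True)
for name, outcomes in ranked:
    print(f"{name}: {moda_score(outcomes, weights):.1f}")
```

In practice the weights come from the stakeholder engagement step and
the scores from the performance projections, so the ranking changes as
either input changes.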
Success Factors
Using a MODA-based scenario approach for long-range planning helped
ADOT achieve a more informed recommendation for how to allocate
scarce transportation funding in the future. Success factors included:
[...]
Challenges & Lessons
[...] investment options and served as the basis for criteria weighting. This
was either because they were not comfortable with some of the
comparisons (e.g., how can you compare preservation and safety?) or
because they simply did not understand its purpose.
• The MODA approach did not enable users to consider synergies in
spending, such as the benefits to safety or mobility that might come
from increased preservation spending. Integrating consideration of
investment synergies across performance areas is an area identified for
future improvement.
For more information...
• Arizona Long Range Statewide Transportation Plan
https://www.azdot.gov/planning/transporation-programs/state-long-range-transportation-plan
• Arizona DOT Point of Contact: Statewide Planning Manager
Case B
Overview
At Caltrans, capital projects to preserve, rehabilitate, or replace existing
transportation assets are included in the State Highway Operations and
Protection Program (SHOPP). Caltrans manages a separate Highway
Maintenance (HM) program for smaller maintenance projects. Together
the annual budgets for these programs are projected to total over $4
billion per year over the next 10 years.
The California Streets and Highways Code requires Caltrans to prepare
periodic updates to its SHOPP and HM programs. Historically these
updates were made separately, in some cases drawing upon different data
sources. In 2017, Caltrans developed a new, integrated State Highway
System Management Plan (SHSMP) that incorporates state requirements
for preparation of both a ten-year plan for the SHOPP and a five-year
HM plan. The SHSMP (shown in Figure 5)
includes a Needs Assessment and Investment Plan to help guide the
management of the state highway system and related infrastructure. The
plan covers thirty-four different SHOPP subprograms and six
maintenance subprograms. These subprograms address physical assets
including but not limited to
• pavement
• bridges
• drainage systems
• lighting
• signage
• guardrail
• transportation management systems
• water/wastewater treatment
• rest areas
• facilities
Success Factors
• Alignment between analysis and available data. The investment
categories in Caltrans' SHOPP and maintenance programs are well
defined, and in many cases they are aligned with specific asset classes.
This simplified the process of determining what data were needed to
develop the SHSMP and helped address the organization of data in the
plan.
• Common performance measures across assets. Development
of a common approach for analyzing and summarizing asset/investment
data was a critical step in preparing an integrated plan. One key insight
was that a simplified data presentation (e.g., good/fair/poor asset
conditions) offered an effective way to summarize results where a
more complicated analysis approach was used, while also serving as a
basic analytical approach where only summary data were available on
the asset inventory and its condition (see the sketch following this list).
• Streamlined presentation. The resulting SHSMP uses standardized
graphics for communicating conditions and deterioration rates. Details
on each asset/investment area are included in an appendix to the
document. Use of a standard approach helped simplify the
presentation of the materials and streamlined document preparation.
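A minimal sketch of that simplification: detailed condition indices are
binned into good/fair/poor so that very different asset classes can be
summarized on a common scale. The thresholds and asset data below
are hypothetical, not Caltrans values.

```python
# Hypothetical sketch: collapse detailed condition indices (0-100)
# into a common good/fair/poor scale across asset classes.
# Thresholds and data are illustrative, not Caltrans values.
from collections import Counter

def rate(index, good_min=80, fair_min=50):
    """Bin a 0-100 condition index into good/fair/poor."""
    if index >= good_min:
        return "good"
    if index >= fair_min:
        return "fair"
    return "poor"

assets = {
    "pavement": [92, 78, 45, 88, 61],
    "bridges":  [85, 55, 90],
    "drainage": [40, 70],
}

for asset_class, indices in assets.items():
    counts = Counter(rate(i) for i in indices)
    total = len(indices)
    summary = ", ".join(f"{counts[c] / total:.0%} {c}"
                        for c in ("good", "fair", "poor"))
    print(f"{asset_class}: {summary}")
```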
Challenges & Lessons
Lack of integrated data system. Caltrans did not, as of the
development of the 2017 SHSMP, have an integrated system for collecting
and managing the data used to prepare the document. Caltrans is
exploring the feasibility of implementing an integrated asset management
system to help support development of the SHSMP and other related
documents and plans in the future.
For more information...
• Caltrans 2017 State Highway System Management Plan
http://www.dot.ca.gov/assetmgmt/documents/SHSMP.pdf
• Caltrans Point of Contact: State Transportation Asset Engineer
Case C
Overview
FDOT’s goal was to provide a resource to allow people “to explore and
download open geospatial data; analyze and combine open data sets using
maps; develop new web/mobile applications, and more." FDOT's available
transportation data include GIS shapefiles describing transportation
facilities, aerial photography, documents, manuals, real-time and historical
traffic counts, summary statistics, interactive web applications, assets,
software, and much more. FDOT created a data portal
that meets the needs of multiple stakeholders, including internal
employees, those doing business with FDOT, and the public.
Figure 9 shows a user exploring aerial photography for a part of the state.
Users can search for photos by year, specific date, location, format, etc.
Results for small queries are shown immediately, while larger queries may
require additional time or retrieval options that go beyond online access.
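The kind of filtered query such a portal supports can be sketched in a
few lines. The endpoint URL, parameter names, and response fields
below are hypothetical placeholders for illustration only, not FDOT's
actual API.

```python
# Hypothetical sketch of a filtered open-data-portal query.
# The endpoint URL, parameters, and response fields are illustrative
# placeholders, not FDOT's actual API.
import requests

BASE_URL = "https://example-portal.example.gov/api/aerial-photos"  # placeholder

params = {
    "year": 2017,          # restrict to photos flown in 2017
    "county": "Leon",      # location filter
    "format": "GeoTIFF",   # delivery format
    "limit": 25,           # small queries return immediately
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
for photo in response.json().get("results", []):
    print(photo.get("id"), photo.get("date"), photo.get("download_url"))
```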
Figure 10. In addition to raw data sets, FDOT makes many data exploration
websites available to the public. The image above is of FDOT's continuous
count stations.
Success Factors
• Executive action: In 2011, Florida's governor issued Executive
Order 11-03 establishing the Office of Open Government. This order
required the state to establish and maintain a website providing ready
access to accountability information and required each Florida agency
to establish an Open Government contact.
• Leading by example: Several business units within the DOT had
already started to post important data sets online. This was done for
several reasons, including trying to proactively keep consultants and
the public from flooding FDOT phone lines and inboxes with data
requests, thus freeing up employees to conduct other business.
Certain FDOT business units also believed that providing data to the
public could potentially spur innovative solutions to FDOT’s growing
transportation problems. The business units that were already
successfully sharing their data with the public could tout their success
and show the positive ROI to other business units that had not yet
begun to share data.
Case D
Overview
DOTs have been procuring traffic data and information services from the
private sector for many years. Data use agreements—documents that
state what can and cannot be done with private-sector data—are a
standard component of these public–private data procurements.
Unfortunately, many agencies end up with data use agreements that
heavily favor the private sector and severely limit the agency’s ability to
utilize the data in a way that benefits everyone.
Most agencies write RFPs for data from the private sector in a vacuum.
They may forget to talk to other stakeholders in their own agency—
procuring the data for a single use only. They may not think strategically
about future applications of the data. They may not seek out “lessons
learned” from other DOTs who have procured similar data. And worst
of all, they may neglect to specify acceptable use terms at all—leaving it
completely up to the data provider.
For example, many agencies deployed and owned closed-circuit
television (CCTV) infrastructure but contracted with third parties to
stream those videos internally and to their customers (television
stations, agency traveler information sites, etc.). In those contracts,
agencies ended up having to pay to view their own video or share it with
others. The third party monetized an asset that was not theirs by taking
advantage of agencies' inexperience in negotiating acceptable use
agreements.
Similarly, some third-party speed sensor data providers negotiated the
installation of private-sector sensors on the public right-of-way in
exchange for allowing the agency to view the data coming from those
sensors. While at a glance that seems to be a reasonable partnership, that
data exchange came with many strings attached, effectively preventing
agencies from doing useful things with the data (like posting travel times
on variable message signs or on the web) unless the agency paid
significant additional fees. In effect, agencies traded valuable right-of-way
for a data set of very limited value due to acceptable use agreements.
Success Factors
• Collaboration. Because agencies approached this procurement as
true collaborators, they were able to leverage their collective
knowledge and experience in contracting and procurement to create
a strong, public, agency-friendly data use agreement.
• Willingness to share. The I-95 Corridor Coalition emphasized the
need to share data across jurisdictional borders to support TPM
efforts across entire regions. The importance placed on cross-
jurisdictional sharing has led to innovation.
• Strong champion. The executive leadership of the Coalition was a
strong proponent of this project. This leadership helped to push for
the collaboration mentioned above and was extremely forceful in
demanding some of the more innovative terms and conditions that
had heretofore not been requested of data vendors.
• Governance. The Coalition established a steering committee made
up of leadership from each state. This ensured that no single interest
could dominate and kept all of the states actively involved.
• Focusing on the end result. This highly successful data use
agreement was possible because agencies focused on the end result
and allowed the private sector to innovate and meet the needs of
agencies in a mutually beneficial manner. Agencies worked together
to exchange knowledge and experience and create a common vision
for better service to the general public.
Case E
Overview
The most mature transportation operations agencies conduct weekly
after-action reviews (AARs) on all types and categories of events in
order to build teams,
enhance communication, and continually improve. Some agencies use a
manual tracking process—operators and responders fill out paper forms
to document AARs. These forms may be completed days or even weeks
following an incident and thus rely on foggy, imprecise memories. Then,
information from the forms is compiled and used to write reports or
discuss the incident in small groups. This manual process can be tedious,
leading to less complete data capture and a less-than-enthusiastic group
of AAR participants.
The Maryland State Highway Administration (MD SHA) has taken a
decidedly different approach to conducting AARs that involve automated,
electronic data capture and reporting. The result is a more effective AAR
that is engaging, easy to conduct, and informative. This approach
encourages more frequent AARs and allows the operational response of
the agency to be quantified and tracked over time.
Incident data are then combined with data from ITS devices [Dynamic
Message Signs (DMS), closed-circuit television (CCTV) images, volume
and speed detectors, signals] and probe-based speed data to derive
queue buildups and congestion levels.
Maryland SHA’s data is transmitted in real time to the Regional Integrated
Transportation Information System (RITIS) platform that supports
reporting and analysis for AARs and performance evaluation.
This timeline includes every event recorded during the incident, including
when responders were notified about the incident, when they arrived,
and when they departed. It includes communication logs, DMS
activations, queue buildups, photos and videos of the event, and
indications of which lanes were blocked over time.
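The performance value of such a timeline comes from the durations it
makes computable. A minimal sketch, assuming a simple list of
timestamped response events; the field names and times are illustrative,
not the actual RITIS schema.

```python
# Minimal sketch: compute response and clearance durations from a
# timestamped incident event log. Field names and times are
# illustrative, not the actual RITIS schema.
from datetime import datetime

events = {
    "incident_detected":  datetime(2018, 3, 6, 7, 42),
    "responder_notified": datetime(2018, 3, 6, 7, 45),
    "responder_arrived":  datetime(2018, 3, 6, 7, 58),
    "all_lanes_open":     datetime(2018, 3, 6, 9, 10),
    "responder_departed": datetime(2018, 3, 6, 9, 25),
}

def minutes_between(log, start, end):
    """Elapsed minutes between two logged events."""
    return (log[end] - log[start]).total_seconds() / 60

print("Notification:", minutes_between(events, "incident_detected", "responder_notified"), "min")
print("Response:    ", minutes_between(events, "responder_notified", "responder_arrived"), "min")
print("Clearance:   ", minutes_between(events, "incident_detected", "all_lanes_open"), "min")
```

Tracking these durations across many incidents is what allows the
operational response to be quantified and trended over time.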
Within the timeline, users can expand the list of operator notes and
communication logs to see the flow of information between the responders.
Figure 13 shows the communication log.
Users can also generate animated maps that show queues building (and
receding) over time (Figure 14). These maps typically show adjacent roads and
arterials so that the agency can better understand how their actions affect
others. Animated maps can also be placed side by side to showcase traffic
during a particular incident compared to normal traffic conditions.
Side-by-side congestion scan graphics (Figure 15) also show how queues
built up and subsided during the day of the event compared to similar
days of the week when no incidents occurred.
Heat maps then help the agency understand if this location is a high-crash
location (Figure 16).
Video images at different locations and time points (Figure 17) are available
to provide additional documentation of the incident.
Finally, user delay cost graphics (Figure 18) help the agency visualize the
financial cost of the delays to society. For example, the cost of typical user
delay on I-495 in Maryland (including the connecting arterials) would be
about $150k/weekday. However, during one particularly bad incident
(shown in images captured by MD SHA in RITIS), the cost of user delay
skyrocketed to over $1.2M. This conservative estimate did not account for
delays in the opposite direction of travel, excess fuel consumption,
emissions, or the cost of equipment damage.
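User delay cost is, at its core, vehicle-hours of delay multiplied by a
value of time. A minimal sketch under assumed inputs; the volumes,
speeds, and value-of-time figures below are placeholders, not MD SHA's
parameters.

```python
# Minimal user-delay-cost sketch. All inputs are assumed placeholder
# values, not MD SHA's actual parameters.

SEGMENT_MILES = 5.0        # length of the affected segment
FREEFLOW_MPH = 60.0        # normal travel speed
INCIDENT_MPH = 20.0        # average speed during the incident
VEHICLES_PER_HOUR = 4000   # demand through the segment
DURATION_HOURS = 2.0       # how long congestion lasted
VALUE_OF_TIME = 17.0       # $ per vehicle-hour (assumed)

# Extra time each vehicle spends traversing the segment (hours).
normal_time = SEGMENT_MILES / FREEFLOW_MPH
incident_time = SEGMENT_MILES / INCIDENT_MPH
delay_per_vehicle = incident_time - normal_time

vehicle_hours = delay_per_vehicle * VEHICLES_PER_HOUR * DURATION_HOURS
cost = vehicle_hours * VALUE_OF_TIME
print(f"Vehicle-hours of delay: {vehicle_hours:,.0f}")
print(f"User delay cost: ${cost:,.0f}")
```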
Success Factors
• Specifying and obtaining the right data. The performance
reports described above were only possible because MD SHA spent
considerable time over the last couple of decades defining data needs
and dedicating funding and operator training to ensure that all
necessary data is collected.
• Analysis tools tailored to decision maker needs. Analysis tools
provide quick access to data and show the benefits of quick-clearance
practices and the value of transportation systems management and
operations programs.
• Effective visualizations. The reports and visualizations provide the
agency with ammunition for requests for funding, positions, and
equipment. MD SHA's early investments in data and analytics are
paying off.
• Commitment to data-driven decision making. In the past, AARs
were more about "war stories" than data analysis. As a result, they
reinforced or justified existing behaviors rather than providing an
opportunity for new insights. Now, however, data, tools, and
processes are in place to conduct regular AARs, and those tools
provide data-backed conclusions. The agency can be more confident in
its decision making, and the tools assist MD SHA in making the case to
external (and internal) partners about improving current practices.
Over time, the agency will be able to analyze trends along individual
corridors and quantify the effects of actions taken based on the AARs.
Challenges & Lessons
Making the case for investing in data. Operators already face
demanding jobs, and asking them to collect more data was an uphill
battle. Early education and advocacy were needed at all levels. Senior
management had to be convinced that the extra workload would be
worth the effort.
Funding and implementation. Even after the agency made the
decision to collect more data, it took a great deal of time to raise funds
and enhance systems to add new data fields, train staff, and see a return
on the investment. It is important to keep in mind that implementation
takes time and to set appropriate expectations.
For more information...
• The Maryland DOT CHART Strategic Planning website can be found here:
https://chart.maryland.gov//readingroom/RR_StrategicPlanning.asp
• A video of the Statewide Operations Center (SOC) Operations Manager
for MDOT discussing their AAR reporting procedures using a fatal incident
example can be found here: https://vimeo.com/207690734#t=567s
• Maryland DOT Point of Contact: State Operations Center Manager
• RITIS Point of Contact: University of Maryland's CATT Lab Director
Case F
Overview
[...]
This is a conservative estimate and does not include the costs of secondary
incident reduction.
The positive ROI stemmed primarily from enhanced real-time data sharing
among agencies. This allowed agencies to more quickly become aware of
incidents, respond, clear the incident, and alert travelers, and to develop
standard operating procedures that account for impacts of regional and
cross-jurisdictional events. Using the RITIS as a data sharing, warehousing,
visualization, and dissemination platform, agencies had easy access to
regional performance measures data that included detailed incident and
incident response information, as well as flow information from traditional
sensors and probe vehicles. A sample RITIS incident information display is
shown in Figure 20. The byproducts of this data sharing and collaboration
were improvement of each agency’s data (since others were relying on it)
and the ability to provide better and more relevant traveler information.
Success Factors
• Interagency collaboration. The pooled operational data from
multiple agencies enabled the benefit-cost evaluators to look at
benefits and costs as they pertain to the entire region, not just a
single agency or jurisdiction.
• Exposing data as a quality improvement strategy. As data
were exposed to a larger audience, a virtuous cycle of quality
improvement and data utilization occurred.
• Benefit-cost analysis to sustain support. While 9/11 and other
major incidents provided the initial impetus for MATOC, conducting
a benefit-cost analysis was instrumental to sustaining support for the
program. Historical data enabled before-and-after comparisons of
incident response that provided the basis for the analysis.
Challenges & Lessons
Providing a baseline. One of the largest challenges when it comes to
evaluating benefits of a program such as MATOC is establishing a baseline
performance level. Unlike a capital improvement project that provides a
capacity increase, the value of quicker communication or cross-
jurisdictional coordination is more difficult to establish. However, the
availability of supporting data prior to establishment of MATOC and after
the program, and the ability to model incidents, allowed the independent
evaluators to calculate tangible benefits of the program.
For more information...
• MATOC Benefit Cost Analysis White Paper
http://www1.mwcog.org/uploads/committee-documents/Yl5ZVlZc20100607114406.pdf
• MATOC Website https://matoc.org/
• RITIS https://ritis.org/
• MATOC Point of Contact: MATOC Facilitator
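The before/after comparison described here reduces to estimating the
change in incident-related delay between the pre- and post-MATOC
periods and weighing the benefit against program cost. A minimal sketch
with assumed numbers; none of the figures below come from the
MATOC white paper.

```python
# Minimal before/after benefit-cost sketch for an incident management
# program. All numbers are assumed placeholders, not MATOC results.

before_minutes = 52.0        # average incident duration before the program
after_minutes = 44.0         # average incident duration after the program

INCIDENTS_PER_YEAR = 18000
DELAY_VH_PER_MINUTE = 3.0    # vehicle-hours of delay per incident-minute (assumed)
VALUE_OF_TIME = 17.0         # $ per vehicle-hour (assumed)
PROGRAM_COST = 2_000_000     # annual program cost in $ (assumed)

saved_minutes = (before_minutes - after_minutes) * INCIDENTS_PER_YEAR
annual_benefit = saved_minutes * DELAY_VH_PER_MINUTE * VALUE_OF_TIME
print(f"Annual delay savings benefit: ${annual_benefit:,.0f}")
print(f"Benefit-cost ratio: {annual_benefit / PROGRAM_COST:.1f}")
```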
Case G
After inventorying all data housed within MARC, the committee created
a “top ten” list of priority data sets for automation. The list was split
evenly between transportation- and census-related items. The
transportation-related items covered pavement and bridge conditions,
safety/crash statistics, National Performance Management Research Data
Set (NPMRDS) information, transit route information, and what MARC
calls “network attributes” (e.g., functional classification, National Highway
System designation).
Repurposing staff positions to adapt to changing needs. In order
to move forward with its plan to automate data processes, MARC [...]
Success Factors
• Focus. Developing a top ten list of data elements/sets for automation
helped MARC determine the right skill sets to look for, and also
helped the developers focus on high priority projects immediately.
• Communication. Having the data developers participate in the
performance management team meetings ensures that they have a [...]
Challenges & Lessons
Data analysis only gets you so far. MARC cited "institutional inertia"
as a challenge, specifically the difficulty in convincing stakeholders to
appropriately consider data analyses when making critical decisions. Much
of the data MARC is now able to process was not available ten years ago.
That lack of availability led people to make decisions that were slightly
more political in nature. Unfortunately, that practice carries on through
today even though the data is now readily accessible and can more easily
be analyzed.
A useful committee is often composed of people too busy to sit
on it. MARC staff understand the importance of the data coordination
committee and its meetings in ensuring the developers have proper
guidance, but finding staff bandwidth to keep the meetings going has been
a challenge. To address this, MARC is planning on restructuring certain
staff roles so that participating in the committee meetings becomes an
explicit responsibility.
For more information...
• MARC data webpage http://www.marc.org/Data-Economy
• MARC Point of Contact: MARC Principal Transportation Planner
Case H
Figure 23. Congestion scans depict congestion before and after the
completion of a project on the Garden State Parkway.
Most of these documents are posted online, and some are also
distributed in print form. The new reports and online publications have
been well received; the graphics are more engaging than prior reports,
and they tell a story that is relatable and understandable to a customer.
The agency is now able to update the documents more quickly and has
been able to conduct more before-and-after studies in less time than
before. As a result, the agency is perceived as more responsive to the
public and more capable.
The agency is also realizing significant cost savings from this approach
and the data analytics platform. Before its data were fused and made
available for analytics, the agency spent upwards of $20k for a before-
and-after study with a small consultant team. Now it can conduct the
analysis in-house in just a few hours. NJDOT estimates it is saving
$475k/year and 4,475 person-hours on conducting these studies annually.
Success Factors
• Ease of access to the data through graphical user interfaces [...]
Case I
Figure 28. Partial screenshot of ODOT's Snow and Ice Event Dashboard.
The figure shows that in April 2018, District 4 met its performance
objective of recovering routes within two hours on only 11 of the 13
routes in its district, thus receiving a recovered score of 85%.
As the recovery period became an established measure of performance,
the agency was able to implement operational strategies that improved
the overall customer satisfaction level. For example, ODOT was able to
focus their attention on storm-impacted areas with the slowest recovery
period and/or districts that struggled to recover all of their routes to
deploy additional roadway treatments for future storms. ODOT also [...]
Drilling down in the dashboard lets the user see how each event was
managed and whether it met or missed the two-hour recovery period.
Clicking on a specific location draws a diagram of the road depicting
which segments did not recover soon enough.
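The recovered score behind the dashboard is a simple proportion:
routes recovered within the two-hour objective divided by total routes
in the district. A minimal sketch with made-up route data; the route
names and recovery times are illustrative, not ODOT's.

```python
# Minimal sketch of a snow-and-ice "recovered score": the share of a
# district's routes recovered within the two-hour objective after a
# storm. Route names and recovery times are made up, not ODOT data.

RECOVERY_OBJECTIVE_HOURS = 2.0

# Hours each route took to return to normal speeds after the storm.
recovery_hours = {
    "SR-14": 1.4, "US-62": 1.8, "I-76": 1.1, "SR-5": 2.6,
    "SR-44": 1.9, "US-224": 1.5, "SR-88": 2.3, "SR-183": 1.7,
    "I-80": 1.2, "SR-7": 1.6, "US-422": 1.9, "SR-11": 1.3,
    "SR-45": 1.8,
}

recovered = [r for r, h in recovery_hours.items()
             if h <= RECOVERY_OBJECTIVE_HOURS]
score = len(recovered) / len(recovery_hours)
print(f"Recovered {len(recovered)} of {len(recovery_hours)} routes "
      f"-> score {score:.0%}")
```

With two of thirteen routes exceeding the objective, this example
reproduces the 11-of-13, 85% figure described above.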
The dashboard resource view (shown in Figure 30) displays which
resources were used: overtime hours, brine, equipment, etc. The district
can even include written feedback explaining why it believes it was not
able to meet the specified target.
Success Factors
• Customer-focused performance measure with connection to
operational actions. ODOT spent a good deal of time developing
their TSMO plan and identifying specific measures that would best
reflect customer expectations with respect to system performance
after a winter storm. Recovery time is the type of measure that is
easily communicated to decision makers and the general public, yet it also
ties well with operational actions that directly influence it.
• Use of available data and tools. ODOT was able to utilize
existing data sources and tools to calculate this new measure that
provided better insight into winter performance management. They
also leveraged internal staff to develop their own dashboard and
computational methods.
• Workforce capabilities. ODOT believes it has a large number of
younger, computer-savvy engineers who have thoroughly embraced
these measures. These staff resources have enabled ODOT to use
the measures to make real decisions within the agency.
Case J
Overview
Every agency generates transportation-related data and must store those
data in a way that enables easy access and management. All too often,
agencies collect data in silos. One department generates centerline
mapping files in a standalone GIS environment; a second department
collects speed and volume data for planning and federal reporting using
in-pavement sensors; a third department collects speed data from a mix
of probes and above-ground sensors; a fourth department collects and
manages toll collection data; and so on.
This approach is often organic and usually happens because agencies are
large and complex. However, in other agencies, an ad hoc approach can
be intentional. Business units fight for resources, become territorial over
their own data, or can be uncomfortable with others becoming aware of
their efforts. Ad hoc data management (or lack of management)
approaches increase agency costs, limit capabilities, and can lead to a
toxic culture.
The more mature agencies take a holistic view of data collection and
management—pooling resources to understand data needs, data assets,
data gaps, management, and accessibility.
Success Factors
• Stakeholder involvement. Through frequent stakeholder meetings and
considerable outreach, this project was much more successful than it otherwise
would have been. The stakeholder engagement identified data needs and existing
capabilities (and inabilities) with respect to data. These stakeholder meetings served
to galvanize the state’s data owners and data users and produced a much more
effective end result.
• Leadership buy-in. The Chief of Traffic Operations in PennDOT had a strong
desire to see this project succeed. He was an effective communicator who could
easily convey the intent and justification for the project in a way that secured buy-in
from others within the agency. Through his leadership, he was able to successfully
build a team that shared his vision.
• Alignment with existing agency goals. PennDOT’s State Transportation
Advisory Committee’s 2015 Transportation Performance report states that
“PennDOT is committed to accountability for results and transparency of
operations” and that “PennDOT must continue to provide leadership and
collaboration to its partners in continuing to modernize transportation products and
services.” This foundational data project directly aligned with this commitment and
therefore was easier to justify to agency funders.
Case K
Overview
VDOT has a well-established pavement management methodology that
includes annual pavement condition collection and needs assessment,
establishment of statewide pavement condition targets, and a
performance-based budgeting process. VDOT’s Central Office
Maintenance Division has responsibility for data collection and analysis;
districts have primary responsibility for pavement maintenance,
rehabilitation, and reconstruction project selection and development.
Success Factors
• Planning and data modeling to facilitate system integration.
Through iterative improvement of both PMS and PMSS data models
over the course of several years, automated processes to transfer
information between these two systems have been developed.
• Tapping into available data sources. Optimal selection, planning,
and execution of maintenance projects are the most direct ways for a
DOT to improve asset conditions on its network. VDOT
identified the information available during project development and
execution that could be used to predict the influence of planned
paving on network-level performance and made this information
available to PMS analytical tools (see the sketch following this list).
This allowed the department to integrate network analysis with
project decision making without additional data collection or
reporting burden on district staff.
• Linking paving schedules to performance targets. Arming
district pavement managers with the information needed to
understand network-level implications of project-level investment
decisions reinforces good pavement management practice. Field input
generated through elevated attention to the PMS analysis also
exposed previously unrecognized opportunities to improve the
decision-support tools.
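A minimal sketch of feeding planned paving into a network-level
condition forecast: segments scheduled for treatment are reset to a
like-new condition while untreated segments deteriorate, and the
resulting share in good condition is compared against a target. The
segment data, deterioration rate, and thresholds are hypothetical, not
VDOT's PMS models.

```python
# Hypothetical sketch: project planned paving onto a network-level
# pavement condition forecast. Segment data, the deterioration rate,
# and thresholds are illustrative, not VDOT's PMS models.

segments = {"A": 82, "B": 58, "C": 45, "D": 91, "E": 67}  # condition index 0-100
planned_paving = {"C", "E"}     # segments in next year's paving schedule

ANNUAL_DETERIORATION = 3.0      # index points lost per year (assumed)
GOOD_THRESHOLD = 70             # index at or above this counts as "good"
TARGET_PCT_GOOD = 0.60          # network performance target (assumed)

forecast = {
    seg: 95 if seg in planned_paving                   # treated: reset to like-new
    else max(condition - ANNUAL_DETERIORATION, 0)      # untreated: deteriorate
    for seg, condition in segments.items()
}

pct_good = sum(c >= GOOD_THRESHOLD for c in forecast.values()) / len(forecast)
print(f"Forecast percent good: {pct_good:.0%} "
      f"({'meets' if pct_good >= TARGET_PCT_GOOD else 'misses'} "
      f"the {TARGET_PCT_GOOD:.0%} target)")
```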
Appendix A: Capabilities Checklists
Basic
o The business need for data has been identified and
documentation of this need is available for future reference.
o An inventory of existing agency data sources has been
compiled.
o Managers of the units responsible for data collection can
describe the primary users and uses of that data.
o Data requirements to meet internal and external performance
reporting requirements are defined and documented—
including attributes, scope, and granularity.
o Location referencing methods for performance data are
established to enable linkages with other agency data sets.
o Update frequencies for new data are defined and
documented.
o Authoritative data sources have been designated for
performance measure calculations.
Advancing
o Discussions about data requirements are not constrained by
the status quo—they reflect what is important to know about
transportation performance in order to improve.
o Data needs are identified to support the entire TPM cycle
(beyond performance reporting), including root cause analysis,
identification and prioritization of improvements, and
evaluation of impacts.
o Minimum data quality standards are established considering
timeliness, accuracy, completeness, consistency, and
accessibility.
o Data requirements are defined collaboratively across business
units, including GIS and IT.
o Data communities of interest (or equivalent) have been
established to identify data improvements to support different
business needs.
Basic
o Data collection procedures and protocols are defined and documented.
o Data collection and processing workflows are mapped to clearly assigned
responsibilities and deadlines.
o Existing agency data sources are reviewed prior to collection of new
data.
o Available external (public and private) data sources are reviewed prior
to collection of new data.
o Quality management procedures are defined and documented, including
training and certification for data collection personnel.
o Requirements are in place that ensure new data collection adheres to
agency location referencing standards.
o Impacts of changes to existing data collection methods are assessed to
minimize loss of consistent trend data and disruption to existing reports.
o Data sources are assessed to understand usage restrictions that may
limit value.
Advancing
o The full cost of new data acquisition is estimated—considering initial
collection, ongoing updates, and supporting staff and technology
infrastructure.
o Funding for regular data updates (beyond the initial collection) is planned
and committed.
o There is regular communication with partner agencies to identify
opportunities for collaboration on data collection.
o Periodic scans are conducted to identify ways to improve data quality
and collection efficiency.
o Agency guidance and/or coordination protocols have been established to
assist business units wishing to purchase commercial data sources.
o Specialists with appropriate expertise (in-house or contractors) evaluate
use of emerging private data sources.
o Data requirements are defined with consideration of opportunities to
create valuable information through integration of multiple data sources.
Basic
o Data needed for TPM is stored in databases that are managed and
regularly backed up to provide protection from unauthorized
access and corruption.
o Back-ups are tested on a regular, established cycle (e.g., monthly).
o Quality control procedures are in place to flag records that do not
meet established validation criteria.
o Data dictionary information (metadata) is maintained and stored in
a standardized fashion.
o Annual data snapshots are created for coordinated reporting
across data programs.
Advancing
o Hardware and software requirements for data storage, updating,
integration, and access are understood.
o Central data repositories have been established to integrate data
from multiple sources and provide source data for reporting and
analysis.
o Cloud and hosted storage options are considered for larger and
more complex data sets.
o Data retention policies and archiving protocols have been updated
to reflect lower storage costs and analysis of TPM business data
needs.
o A range of data storage options are available to support databases
with high transaction volumes and memory-intensive calculations
as well as archived data retained for future use.
o Standards have been adopted to enable combining data from
different sources.
o Data from multiple sources are fused to assemble a more
complete and accurate data set than would be possible from any
single source (see the sketch following this list).
o Where appropriate, edge computing techniques are used,
involving data processing at the source (e.g., at the site of the
field sensor) rather than within a centralized repository.
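A minimal sketch of the fusion item above: two overlapping sources
(e.g., agency sensors and probe data) are merged per road segment,
preferring the more trusted source and falling back to the other to fill
gaps. The source names and values are illustrative assumptions.

```python
# Minimal data-fusion sketch: merge per-segment speeds from two
# overlapping sources, preferring agency sensors where available and
# falling back to probe data to fill gaps. Values are illustrative.

sensor_speeds = {"seg-1": 54.0, "seg-2": 23.5}                 # agency sensors
probe_speeds = {"seg-2": 25.0, "seg-3": 61.0, "seg-4": 48.5}   # probe data

def fuse(primary, secondary):
    """Primary source wins where both report; secondary fills gaps."""
    fused = dict(secondary)   # start with the fallback source
    fused.update(primary)     # overwrite with the trusted source
    return fused

fused_speeds = fuse(sensor_speeds, probe_speeds)
for segment in sorted(fused_speeds):
    source = "sensor" if segment in sensor_speeds else "probe"
    print(f"{segment}: {fused_speeds[segment]:.1f} mph ({source})")
```

Real fusion logic would weight by timeliness and accuracy rather than a
fixed preference, but the gap-filling benefit is the same.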
Basic
o Employees are aware of key performance data sources within
the agency.
o There are clear agency policies in place that data should be
shared unless the need to protect it is demonstrated.
o There are protocols defined for how to share data to meet
different needs that consider use of state and federal open data
portals and hosted or cloud solutions.
o Open data portals are used to share data.
o Data explanations are provided in “plain English” to help users
understand meaning, sources, and limitations.
Advancing
o Data governance and stewardship structures have been
established to facilitate communication about data sharing and
identify opportunities for synergies across business units for
collaborating or combining data sources.
o Data sharing agreements are used (internal to an agency and
between an agency and its partners) that specify what data will
be shared, when and how—and establish a clear understanding
of data limitations and expectations for use.
o Data are shared in formats that are designed to meet the needs
of different users, which may include standard reports, data
feeds, and dashboards.
o Data with sensitive elements are sanitized for public
distribution.
o Data contracts and sharing agreements are reviewed to ensure
that agency flexibility is retained.
Basic
o Analysts are aware of and taking advantage of existing commercial
off-the-shelf, open source, and publicly available tools for analysis,
visualization, forecasting, and scenario analysis.
o Analysts are trained in use of data analysis and visualization tools.
o Private-sector or university contractors are used to provide data
analysis services as alternatives to standing up analysis capabilities
in-house.
o Data are available that are sufficiently accurate to meet analysis
requirements.
o Visualization and analysis tools are used to explore and discover
data anomalies and limitations.
o Data preparation and analysis tasks are well defined and planned to
ensure sufficient calendar time and staff resources.
o Analysts are able to identify trends and causal factors.
o Data element meanings, data transformations, and analysis
assumptions are documented.
Advancing
o Predictive models for key transportation performance measures
are validated based on multiple cycles of application.
o Targets are established based on predictive analysis relating
revenues and programmed work to performance results.
o Data mining is conducted to support "back-casting"—which
involves starting with a future vision and analyzing current and
historical data to estimate changes required to move from the
current situation to the future vision (see the sketch following
this list).
o Cooperative arrangements across agencies have been established
to transform data into information (e.g., the state DOT performs
analysis of travel-time reliability, computes measures for each
facility, and provides the data for use by MPOs and local agencies).
o Predictive analytics and machine learning techniques are applied for
predicting asset failure probabilities and other performance
measures.
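A minimal sketch of the back-casting idea above: given a target value for
a measure and its historical trend, compute the annual change required
to reach the target and compare it with the trend. The measure, trend,
and target below are assumed placeholders, not agency data.

```python
# Minimal back-casting sketch: start from a future target and work
# backward to the annual change required. Numbers are illustrative
# placeholders, not from any agency's data.

current_value = 68.0      # e.g., percent of travel that is reliable today
target_value = 85.0       # future vision for the measure
years_to_target = 10

# Historical trend estimated from past observations (change per year).
historical = [64.5, 65.2, 66.1, 66.8, 68.0]
trend_per_year = (historical[-1] - historical[0]) / (len(historical) - 1)

required_per_year = (target_value - current_value) / years_to_target
print(f"Required change: {required_per_year:+.2f}/yr; "
      f"historical trend: {trend_per_year:+.2f}/yr")
if required_per_year > trend_per_year:
    print("Current trajectory falls short; additional action is needed.")
```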
Basic
o Managers and analysts meet to review and interpret performance
results.
o Story lines for performance results are developed, reviewed, and
communicated.
o Training is offered to internal staff to build skills in data
presentation and communication.
o Staff have capabilities to present data in a variety of formats
tailored to the needs of different audiences, including heat maps,
thematic maps, timelines, and other infographics.
o A combination of narrative and graphical presentation is used to
communicate performance information.
Advancing
o Feedback from data consumers is sought and used to improve
communication of information to different target audiences.
o Individuals with expertise in data visualization and communication
are available to support development of performance data
products.
o Social media is used to communicate key results or draw people to
more detailed communication products.
o Specialized visualization and analysis environments have been
developed—e.g., virtual reality simulators.