Understanding the Potential for Video Analytics to Support Traffic Management Functions

Understanding the Potential for Video Analytics to Support Traffic Management Functions
Project Summary Slides, November 2014

ENTERPRISE Program
Program Goals:
- Facilitate rapid progress in the development and deployment of ITS technologies
- Accelerate the systematic advancement of selected ITS projects
Members carry out ITS projects and activities including fundamental research, technology development, demonstration, standardization, and deployment.

ENTERPRISE Program Members: Arizona DOT, Georgia DOT, Idaho Transportation Department, Illinois DOT, Iowa DOT, Kansas DOT, Maricopa County (AZ), Michigan DOT, Minnesota DOT, Mississippi DOT, Oklahoma DOT, Pennsylvania DOT, Texas DOT, Washington State DOT, Ontario Ministry of Transportation, Transport Canada, Dutch Ministry of Transport, FHWA

What is Video Analytics?
Video Analytics systems process video streams from traffic cameras to:
- Collect traffic data: vehicle counts, speeds, vehicle classifications
- Detect incidents and create alerts: stopped vehicles, slow traffic, wrong-way vehicles, wildlife, pedestrians, debris
(Slide graphics: sample traffic data output in 15-minute increments; an incident detection alert.)
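
To make the concept concrete, the sketch below outlines the kind of processing such systems build on: background subtraction to separate moving vehicles from the static roadway, from which counts, speeds, classifications, and stopped-object alerts can be derived. This is an illustrative outline only, assuming an OpenCV-style pipeline; the stream URL and thresholds are hypothetical and this is not any participating vendor's implementation.

```python
# Illustrative only: a minimal background-subtraction pipeline of the kind video
# analytics products build on. STREAM_URL and MIN_BLOB_AREA are hypothetical.
import cv2

STREAM_URL = "rtsp://example-tmc-camera/stream"   # hypothetical camera feed
MIN_BLOB_AREA = 1500                              # pixels; tuning assumption

cap = cv2.VideoCapture(STREAM_URL)
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask separates moving objects from the static roadway background.
    mask = subtractor.apply(frame)
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)    # drop shadow pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)         # remove small noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    vehicles = [c for c in contours if cv2.contourArea(c) > MIN_BLOB_AREA]
    # A real system would track blobs across frames to produce counts, speeds,
    # classifications, and alerts (e.g., a blob that stops moving for N seconds).
    print(f"vehicle-sized blobs in this frame: {len(vehicles)}")

cap.release()
```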

Why Use Video Analytics?
Challenges:
- Difficult to monitor conditions in rural areas
- Challenging for TMC operators to monitor multiple camera views simultaneously
- Vehicles traveling the wrong way introduce a safety hazard
Opportunities:
- Utilize existing camera infrastructure
- Potential to use Video Analytics for multiple purposes (traffic data collection, incident detection)

Why Evaluate Video Analytics? Project Goals
Investigated the potential of Video Analytics as a tool for:
- Traffic data collection
- Incident detection
- Wrong-way vehicle detection
Proof-of-concept evaluation to understand the current state of the practice: How accurate? How effective? How useful?
Compared to traditional methods/technologies: loop detectors, radar, reported incidents, visual observation
Not a comparison of vendors' products

Virtual Test Bed Deployment Sites:
- Ames, IA: Wrong-Way Detection Test Bed
- Des Moines, IA: Incident Detection
- Cedar Rapids / Iowa City, IA: Traffic Data Collection, Incident Detection
- Ontario, Canada: Traffic Volumes
- Kansas City, MO/KS: Traffic Data Collection

Deployment Conditions
Tested in real-world conditions:
- Existing camera infrastructure
- Typical TMC practices and workflow
Conditions not controlled to ensure optimum performance:
- Camera settings and system configurations were not always ideal for video processing (changing them could affect viewing ability)
- Normal panning/zooming of cameras
- TMC operations did not allow for constant monitoring and re-configuring of Video Analytics; efforts were made to adjust systems as much as practical.

INCIDENT DETECTION

Incident Detection: Cedar Rapids Rural Deployment
7 cameras instrumented, 2 vendors:
- 2 cameras in Cedar Rapids
- 5 cameras on rural Interstate near Coralville

Incident Detection: Des Moines Deployment (Urban/Suburban)
15 cameras instrumented, 1 vendor
(Approximately 12% of the Des Moines freeway network covered with Video Analytics)

Incident Detection: Variation in Camera Views (examples)
Incident types detected by Video Analytics:
- Stopped Vehicle / Debris in Road
- Slow Traffic / Congestion
- Pedestrian
- Wrong-Way Vehicle

Incident Detection Analysis Approach:
1) Reviewed detection alerts (still images / video clips)
2) Classified alerts: Likely Detection (validated), Detection Not Likely (not validated), Unable to Determine
3) Calculated % validated, % not validated, and % unable to determine, as a share of the total number of alerts (see the sketch below)
4) Reported the highest level of performance
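
A minimal sketch of step 3, assuming each reviewed alert has already been classified into one of the three outcomes; the list of outcomes shown is hypothetical sample data, not project results.

```python
# Sketch of step 3: express each review outcome as a share of total alerts.
from collections import Counter

# Hypothetical review outcomes for a set of alerts.
alerts = ["validated", "validated", "not_validated", "unable_to_determine", "validated"]

counts = Counter(alerts)
total = len(alerts)
for outcome in ("validated", "not_validated", "unable_to_determine"):
    print(f"{outcome}: {100.0 * counts[outcome] / total:.0f}% of {total} alerts")
```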

Incident Detection Examples - Incident detection validated Stopped Vehicle

Incident Detection Examples - Incident detection validated Stopped Vehicle

Incident Detection Examples - Incident detection validated Slow Traffic / Congestion

Incident Detection Example - Incident detection validated Pedestrian detected as Stopped Vehicle

Incident Detection Examples - Incidents not validated (false alarms)

Incident Detection Examples Incidents not validated (False Alarms caused by Obstructions in View)

Incident Detection Examples - Unable to determine

Incident Detection Results: Highest Level of Performance
- Stopped Vehicle / Debris: 72% of alerts validated, 23% not validated, 5% unable to determine (81 alerts during a 44-day period)
- Stopped Vehicle / Debris, after removing false alarms caused by an object in the camera view: 0% false alarms (26 alerts during a 21-day period)
- Slow Vehicle / Congestion: 30% of alerts validated, 33% not validated, 37% unable to determine (1,111 alerts during a 44-day period)
- Pedestrian in Road: none observed
- Wrong-Way Vehicle Movements: none observed

Incident Detection Results
Factors that impacted performance:
- Objects in the field of view
- Weather events / moisture on the camera lens
- Headlight glare on the roadway during nighttime lighting conditions
Factors that did not appear to impact performance:
- Camera position (zoom level, angle to roadway)
- Inaccurate configuration of Video Analytics to roadway lanes (e.g., camera panning)

Incident Detection: Comparison of Detection Alerts to Agency-Reported Incidents
- Video Analytics likely detected a number of incidents that were not observed by agency staff, indicating that Video Analytics can be an effective tool for supplementing existing mechanisms that alert operators.
- Strategic selection of camera locations along a coverage area will optimize the usefulness of Video Analytics.

TRAFFIC DATA COLLECTION: Iowa/Kansas City Deployments

Traffic Data Collection
Traffic data types: Volumes (vehicle counts), Average Speeds, Vehicle Classifications
Classification categories from Video Analytics and corresponding FHWA classifications (see the mapping sketch below):
- Motorcycles: FHWA Classification 1
- Cars: FHWA Classifications 2-3
- Small Trucks: FHWA Classifications 4-7
- Large Trucks: FHWA Classifications 8-13
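
A small sketch of the mapping above, rolling per-class counts (FHWA classes 1-13) up into the four Video Analytics categories; the sample counts are hypothetical, not project data.

```python
# Map FHWA vehicle classes 1-13 onto the four Video Analytics categories.
FHWA_TO_CATEGORY = {1: "Motorcycles"}
FHWA_TO_CATEGORY.update({c: "Cars" for c in (2, 3)})
FHWA_TO_CATEGORY.update({c: "Small Trucks" for c in range(4, 8)})
FHWA_TO_CATEGORY.update({c: "Large Trucks" for c in range(8, 14)})

def aggregate(fhwa_counts):
    """Roll per-class counts up into the four reporting categories."""
    totals = {"Motorcycles": 0, "Cars": 0, "Small Trucks": 0, "Large Trucks": 0}
    for fhwa_class, count in fhwa_counts.items():
        totals[FHWA_TO_CATEGORY[fhwa_class]] += count
    return totals

print(aggregate({2: 120, 3: 15, 9: 8}))   # hypothetical 15-minute counts
```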

Traffic Data Collection Analysis Approach
- Data collected in 15-minute increments
- Video Analytics outputs compared to outputs from DOT detectors (loops and radar)
- Absolute Percent Difference (Abs % Diff) calculation (sketched below):
  o For each 15-minute period, calculate the difference from the DOT data
  o Convert it to an absolute difference (remove any minus sign)
  o Compute the percent difference
  o The result is the Abs % Diff
- Caveat: Night-time traffic volumes are often very low, so Abs % Diff is less meaningful at night.
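
A minimal sketch of the Abs % Diff calculation described above, assuming the DOT detector value is the reference (denominator); the 15-minute count pairs are hypothetical, not project data.

```python
# Abs % Diff per 15-minute period, averaged over the comparison week.
def abs_pct_diff(video_count, dot_count):
    """Absolute percent difference of the Video Analytics value from the DOT value."""
    return abs(video_count - dot_count) / dot_count * 100.0

# Hypothetical (video analytics, DOT loop/radar) volume pairs for 15-minute periods.
periods = [(412, 398), (389, 401), (57, 52)]
avg = sum(abs_pct_diff(v, d) for v, d in periods) / len(periods)
print(f"Avg. Abs % Diff: {avg:.1f}%")
```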

Traffic Data Collection Results: Highest Level of Performance (all results shown are average % Diff for one week)
Traffic Volumes:
- Day: 9% Avg. % Diff (carries a reasonable expectation of repeatability)
- Night: 17% Avg. % Diff (does not carry a reasonable expectation of repeatability)
Vehicle Speeds:
- Day: 2% Avg. % Diff (carries a reasonable expectation of repeatability)
- Night: 6% Avg. % Diff (carries a reasonable expectation of repeatability)
Vehicle Classifications:
- Motorcycles (FHWA Classification 1): 24% Avg. % Diff at night
- Cars (FHWA Classifications 2-3): 13% Avg. % Diff daytime
- Small Trucks (FHWA Classifications 4-7): 44% Avg. % Diff daytime
- Large Trucks (FHWA Classifications 8-13): 23% Avg. % Diff daytime

Traffic Data Collection Results
Factors that impacted performance:
- Low-light / dark conditions
- Camera position (proximity to traffic, zoomed out, angled to roadway)
- Weather events that reduce image quality
- Inaccurate configuration of Video Analytics to roadway lanes
- Camera settings (e.g., shutter speed, max gain)
Factors that did not appear to impact performance:
- Position of camera relative to direction of traffic (e.g., counting headlights vs. tail lights at night)

TRAFFIC DATA COLLECTION: Ontario Ministry of Transportation (MTO) Deployment

Traffic Data: MTO Deployment
- Focus on volumes
- 13 cameras instrumented at 4 locations
- Data collected in 15-minute periods
- Video recorded for 1 week at each camera, then sent to the Video Analytics vendor for processing
- Manual counts conducted for comparison
- Manual counts compared to Video Analytics data outputs to compute percent error

Traffic Data: MTO Deployment Results (% Error by type of comparison)
- Time of Day: Day* 9.1%, Night 7.9%
- Camera Angle: Side 9.4%, Overhead 6.5%
- Camera Type: Axis 7.5%, Cohu 9.6%
* Day analysis was the PM peak (16:30-17:30)

Traffic Data: MTO Deployment Results
1. A camera-based counting system is appropriate if:
- Overall accuracy within 10% is acceptable
- Vehicle classification is not critical
2. A camera-based counting system may not be suitable if:
- Counts are to be conducted in work zones or areas with heavy stop-and-go traffic
- Accuracy within 5% is required
- Vehicle classification is needed
- Night-time accuracy is important

Traffic Data: MTO Deployment Lessons Learned
1. Engage in discussions with camera vendors early
2. Standard-definition cameras were actually better for this purpose
3. Ambient light surrounding cameras should be taken into consideration when selecting camera locations
Next Steps: MTO will be undertaking additional data collection assignments utilizing Video Analytics beginning this fall (2014) and continuing through next summer.

WRONG-WAY VEHICLE DETECTION

Wrong-Way Vehicle Detection Controlled Test: November 2013 in Ames, IA
- 3 vendors/technologies at 3 separate freeway ramps
- Ramp closures to test various conditions
- Detections conveyed via email, web interface, or on-site computer interface
- Recorded detection or non-detection

Wrong-Way Vehicle Detection Deployment Site #1: US 30 at Dayton Ave. (off-ramp traffic; camera positioned for 90-degree detection)

Wrong-Way Vehicle Detection Deployment Site #2: US 30 at Duff Ave. (off-ramp traffic; camera positioned for 90-degree detection)

Wrong-Way Vehicle Detection Deployment Site #3: US 30 at University Blvd. (off-ramp traffic; camera positioned for head-on detection)

Wrong-Way Vehicle Detection: Highest Level of Performance Achieved
- Daytime test: 100% detection for 12 test drives
- Nighttime test: 83% detection for 12 test drives
Factors that impacted detection rate:
- Nighttime / low-light conditions
- Slow speeds
Factors that did not appear to impact detection rate:
- Color/size of vehicle
- Lane position (consistent position, shoulder, and/or weaving)

LESSONS LEARNED

Lessons Learned: Planning and Procurement
1. Determine Uses and Needs
- What will the system be used for?
- What are the most important uses (e.g., traffic data collection, incident detection)?
2. Understand Limitations of Multi-Purpose Capabilities
- Camera positions/settings may serve one application better than others
- Multiple uses may be difficult or impractical

Lessons Learned: Planning and Procurement (cont'd)
3. Recognize Investment Tradeoffs
- Potentially lower up-front investment with Video Analytics
- Consider continuing costs: staff training, setup, and ongoing monitoring/configuration
4. Utilize Fixed Cameras and/or Dedicated Cameras for Traffic Data
- Traffic data tends to be more accurate with cameras that remain stationary (fixed, dedicated)
- Consider installing temporary dedicated cameras where infrastructure does not allow optimized positioning

Lessons Learned: Planning and Procurement (cont'd)
5. Optimize Video Feed Quality and Communications
- Video feeds with minimal interruption are desired; choppy feeds/communications will not be accurately processed
- Ask vendors to provide feedback on feed quality
- Test video feeds in advance of procurement
6. Include Design and Testing Provisions in Procurement
- Add tasks for additional testing and tuning 6 months to 1 year after initial deployment

Lessons Learned: Planning and Procurement (cont'd)
7. Make Go/No-Go Decisions When Selecting Cameras
- Work with vendors to determine whether camera positions are suitable
8. Consider the Future Potential for Video Analytics when Installing New Cameras
- Even if Video Analytics deployments are not planned, consider potential future use when installing new camera infrastructure

Lessons Learned: Deployment
1. Dedicate Agency Resources to Deployment Activities
- Agency resources are needed during installation and for troubleshooting during set-up
- Schedule check-in visits with vendors
2. Commit to Learning and Understanding System Procedures
- Dedicate resources to learning system configurations, procedures, and performance impacts
- Operators should fully understand capabilities to ensure the system is as useful and accurate as possible

Lessons Learned: System Operation
1. Use Camera Presets and Auto-Return to Preset Positions
- Cameras should reset to optimal Video Analytics positions after being manually moved
2. Monitor Calibrations and Adjust as Needed
- Operators should ensure that cameras are returned to their optimal view settings (use presets, if possible)
3. Recognize the Strong Link Between Human Interaction and System Performance
- Success depends on the agency's level of commitment
- Resources are needed to monitor performance and to adjust and reconfigure when cameras pan/zoom

Lessons Learned: Evaluation
1. Establish Performance Parameters
- Develop subjective success parameters to determine whether a system performs to pre-determined standards
2. Compare/Contrast Video Analytics with Other Detection Mechanisms
- Compare performance outcomes of various technologies for specific uses
3. Extend Incident Detection Testing to Missed Incidents
- Determine the extent to which Video Analytics fails to detect actual incidents
- Utilize a closed test track or other controlled environment

EVALUATION FINDINGS

Evaluation Findings
- The state of the practice for Video Analytics is ready to meet many agency needs.
- Dedicated and/or fixed cameras may be warranted, especially for traffic data collection.
- Video Analytics may not serve all purposes simultaneously (e.g., a camera used for incident detection may not be optimal for traffic data collection).
- It is important to follow vendor guidelines for camera selection, position, zoom level, etc.
- Recognize the significant human component involved: operator resources are required to monitor system settings and re-configure as needed.

Acknowledgements & Project Contact
Final Report: published at www.enterprise.prog.org
Participating Agencies: Iowa DOT; KC Scout/Missouri DOT; Ministry of Transportation of Ontario
Participating Vendors: DRS Technologies, Inc.; Iteris, Inc.; Peek Traffic Corporation; TrafficVision; VideoIQ
Contact Information: Mike Barnet, Ontario Ministry of Transportation, mike.barnet@ontario.ca