Video-based traffic state estimation on smart cameras for traffic information systems


19th ITS World Congress, Vienna, Austria, 22/26 October 2012, EU-00654

Video-based traffic state estimation on smart cameras for traffic information systems

Felix Pletzer 1*, Roland Tusch 2, Thomas Mariacher 3, Manfred Harrer 3, Laszlo Böszörmenyi 2, Bernhard Rinner 1
1. Institute of Networked and Embedded Systems, Alpen-Adria-Universität Klagenfurt, Lakeside Park B02b, 9020 Klagenfurt, Austria. Phone: +43-463-2700-3671; Email: felix.pletzer@aau.at
2. Institute of Information Technology, Alpen-Adria-Universität Klagenfurt, Austria
3. ASFINAG Maut Service GmbH, 5020 Salzburg, Austria

Abstract

This paper presents an accurate and efficient traffic state detection system that is implemented on smart cameras. Our video analysis method employs feature tracking and edge information along with the camera calibration parameters for periodic level of service (LOS) measurements. After a calibration using the known lengths of lane markings, the smart cameras perform the LOS estimation onboard and report any change in the observed traffic state to a traffic information system. The architecture of the onboard system and the video detection methods are introduced in this paper. Evaluation results indicate a detection precision of more than 95% for stationary traffic.

Keywords: video analysis, embedded computer vision, smart camera, traffic speed and density estimation, level of service classification

1. Introduction

Modern traffic information systems provide reliable traffic messages in real-time. Today, these traffic messages are generated from different data sources, such as driver reports, traffic sensors, or travel times recorded by toll systems [2]. For many years, video surveillance has been used to recognize traffic jams, accidents, and other threats on the road.
In 2011, ASFINAG (the Austrian motorway and expressway operator) operated more than 4000 cameras to monitor the traffic state on the Austrian motorway and expressway network. In the current operational system of ASFINAG, the cameras are merely used for human inspection of videos. Automatic video analysis can be added to report interesting situations, such as traffic jams or weather-related road conditions, to human traffic operators. In this paper we present an accurate traffic state detection system for motorways, which is implemented on smart cameras. Smart cameras [1] integrate video sensing, video processing, and communication in a single device. In contrast to most other methods, which rely on object detection and background modeling, our method uses robust feature statistics to estimate the mean speed and traffic density on smart cameras. The presented system has been developed as part of the research project SOMA-LOOK2 [3][5], which aims at increasing the quality of traffic messages using multimedia data and multi-sensor fusion.

2. System description

2.1 Architecture and data flow

Figure 1: Architecture of the smart camera [6]

Figure 1 shows the architecture of the smart camera that is used as target platform. The smart camera includes a 1280x1024 (SXGA) color CCD sensor that is connected to an embedded processor board. The processor board is equipped with an Intel Atom Z530 1.6 GHz processor and 1024 MB RAM, connected to a 32 GB solid state disk (SSD) for internal storage. The processor runs a standard Linux distribution that provides flexible integration of additional hardware modules and external software libraries. The smart camera is connected to the video network through a Gigabit Ethernet interface. Every time the camera detects an event, it uses a web service interface to send an event description and the associated video sequence to the traffic information system.
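The event reporting step described above can be illustrated with a minimal sketch. All field names, identifiers, and URIs below are hypothetical; the paper only states that a SOAP web service carries an event description plus the associated video sequence:

```python
import json
from datetime import datetime, timezone

def build_event_description(camera_id, los_value, mean_speed_kmh,
                            density_veh_km, video_uri):
    """Assemble a traffic-state-change event. The schema is illustrative,
    not the actual LOOK2 message format."""
    return {
        "camera_id": camera_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "los": los_value,                  # level of service, 1..4
        "mean_speed_kmh": mean_speed_kmh,  # aggregated over one minute
        "density_veh_km": density_veh_km,
        "video_uri": video_uri,            # clip cut from the video ring buffer
    }

event = build_event_description("A12-042", 4, 22.5, 63.0,
                                "rtsp://cam042/events/20110718-0935")
payload = json.dumps(event)  # would be wrapped in a SOAP envelope on the wire
```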

Figure 2: Data flow of the traffic state detection

Figure 2 shows the data flow of the traffic state detection, which is performed on the smart camera. The image sensor acts as data source, providing the raw video data with a frame rate of 19 frames per second. The captured frames are written to the video ring buffer and passed to the video analysis module. For each frame, the video analysis module computes the estimated speed and traffic density values and stores them in the statistics buffer. At periodic intervals, e.g. every minute, the data aggregation module uses the data from the statistics buffer and computes the current speed and traffic density estimates. These speed and traffic density estimates are used to calculate the current LOS value as defined in Table 1.

Table 1: Level of service classes used on Austrian motorways [4]

LOS  Description  Speed [km/h]  Density [vehicles/km]
1    Free flow    >80           [0,20]
2    Heavy        >80           (20,50]
3    Queuing      [30,80)       [0,50]
4    Stationary   [0,30)        >50

The event detector compares the current LOS value to the previous LOS value to detect a change in traffic state. If the traffic state has changed, it creates a traffic state description. Furthermore, an encoded video sequence is generated from the content of the video ring buffer. The traffic state description and the video sequence are sent to the LOOK2 traffic information system in real-time for further processing [5].

2.2 Video-based traffic state estimation
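The Table 1 mapping and the change-driven event detector can be sketched in a few lines. This is a minimal illustration of the published thresholds, not the authors' implementation; behaviour for (speed, density) pairs outside the four listed classes is not specified in the paper, so the sketch returns None for them:

```python
def classify_los(speed_kmh, density_veh_km):
    """Map aggregated speed/density estimates to a LOS class per Table 1."""
    if speed_kmh > 80 and 0 <= density_veh_km <= 20:
        return 1  # free flow
    if speed_kmh > 80 and 20 < density_veh_km <= 50:
        return 2  # heavy
    if 30 <= speed_kmh < 80 and 0 <= density_veh_km <= 50:
        return 3  # queuing
    if 0 <= speed_kmh < 30 and density_veh_km > 50:
        return 4  # stationary
    return None  # combination not covered by Table 1

def detect_changes(los_sequence):
    """Yield (interval index, LOS) whenever the LOS differs from the
    previous interval, mimicking the event detector."""
    previous = None
    for i, los in enumerate(los_sequence):
        if los != previous:
            yield i, los
        previous = los
```

For example, a one-minute LOS series such as [1, 1, 3, 4, 4] would trigger traffic state events at minutes 0, 2, and 3.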

Figure 3: Calibration grid
Figure 4: Analysis area (orange) and motion vectors (blue)
Figure 5: Binary edge image
Figure 6: Statistical vehicle counting using the occupation state of stripes

The smart camera provides an automatic estimation of the observed traffic state. Offline calibration employs the known lengths of lane markers to calculate the intrinsic and extrinsic camera parameters for a deployed smart camera. Figure 3 shows an example 1m-by-1m calibration grid that was obtained from 10 manually selected calibration points. The calibration parameters are necessary to measure metric distances from the captured image data. The onboard video analysis uses motion vectors (Figure 4) and edge information (Figure 5) to calculate the mean speed and traffic density values for individual lanes. The video analysis method is only applied in predefined analysis areas, as illustrated in Figure 4 (orange rectangle). In contrast to other methods, such as [7], our method does not rely on vehicle tracking or background modeling. Instead, a robust cascaded outlier detection is used that facilitates accurate measurements even under difficult weather and illumination conditions. Using the calibration parameters and motion vectors (optical flow), our method periodically calculates the mean speed. Traffic density estimation is based on a robust vehicle counting method that uses binary edge images (illustrated in Figure 5) to obtain the occupancy state for different stripe sections. Figure 6 shows occupied (red) and non-occupied (white) stripe states.

3. Evaluation and Results

For testing the video-based traffic state detection, we used 37 hours of video data recorded in daylight in July 2011 by a smart camera on the Austrian motorway A12. The smart camera was mounted on a gantry as shown in Figure 7. The video sequences have VGA resolution and a frame rate of 16 frames per second. For estimating the speed from optical flow, we used the frame times that were obtained from the driver of the image sensor. The video sequences include traffic situations with different traffic states and different weather conditions (i.e., sunny, cloudy, light rain, and heavy rain).

Figure 7: Smart camera mounted on a gantry on Austrian motorway A12
Figure 8: Analysis area (red) used for the evaluation shown in Table 2

Mean speed and traffic density estimates were computed over one-minute intervals. For ground truth, we used one-minute aggregated measurements of overhead sensors that utilize triple technologies (Doppler radar, passive infrared, and ultrasound) for speed measurement and vehicle counting. To evaluate the quality of our traffic state detection method, we

calculated the precision, recall, and accuracy of the analysis values for the four levels of service (LOS) defined in Table 1 [4]. Furthermore, we calculated the mean absolute error (MAE) of the speed and traffic density with respect to the LOS categories. The results show a very high accuracy of the video-based traffic state detection method. The evaluation results are summarized in Table 2. A comparison between the result of the video-based LOS detector (blue line) and the triple-tech sensor (red line) on a two-hour test set, recorded on July 18, 2011 (rainy weather conditions), is shown in Figure 9. The video-based LOS detection method detects congestion and dissipation of congestion in a reliable way.

Table 2: Evaluation results (37-hour test set)

Level of Service  Precision  Recall  Accuracy  MAE Speed [km/h]  MAE Traffic Density [vehicles/km]
Free flow         0.9942     0.9879  0.9848    2.53              1.11
Heavy             0.6974     0.8281  0.9848    2.55              2.14
Queuing           0.8682     0.9412  0.9893    1.65              4.87
Stationary        0.9583     0.9848  0.9902    2.35              19.10

Figure 9: Direct comparison of video-based LOS detection (blue line) to LOS detection of triple-tech detectors (red line) on a 2-hour test set
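The two measurements evaluated above, mean speed from calibrated optical flow and stripe occupancy from binary edge images, can be sketched as follows. The planar ground-plane homography, the toy calibration values, and the 5% edge-ratio occupancy threshold are assumptions chosen for illustration; the paper does not publish its exact formulation or thresholds:

```python
import numpy as np

def pixels_to_metres(H, pts):
    """Project pixel coordinates onto the ground plane (in metres) using a
    homography H, as obtained from a lane-marking calibration."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return homog[:, :2] / homog[:, 2:3]  # dehomogenize

def mean_speed_kmh(H, starts, ends, dt_s):
    """Mean speed from motion vectors (pixel start/end points) spanning dt_s seconds."""
    dists = np.linalg.norm(pixels_to_metres(H, ends) - pixels_to_metres(H, starts), axis=1)
    return float(dists.mean() / dt_s * 3.6)  # m/s -> km/h

def occupied_stripes(edge_img, n_stripes, min_edge_ratio=0.05):
    """Split a binary edge image of one lane into horizontal stripes and mark a
    stripe occupied when its edge-pixel ratio exceeds the (assumed) threshold."""
    stripes = np.array_split(np.asarray(edge_img, dtype=bool), n_stripes, axis=0)
    return [bool(s.mean() > min_edge_ratio) for s in stripes]

# Toy calibration: 0.05 m/pixel ground resolution; one vector moving 100 px in 1 s
H = np.diag([0.05, 0.05, 1.0])
speed = mean_speed_kmh(H, [[0, 0]], [[100, 0]], dt_s=1.0)  # 5 m/s, i.e. 18 km/h
```

Counting occupied stripes per lane then gives the density estimate, and averaging the per-vector speeds over the one-minute aggregation window gives the mean speed fed to the LOS classification.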

4. Conclusion and future work

This paper presented an accurate, vision-based traffic state detection system. Live video data is processed on an embedded smart camera in real-time. Any changes in traffic state are automatically reported to the LOOK2 [5] traffic information system. The generated messages contain short video sequences as visual proof that can be used for verification by human operators. In general, video detection of traffic states provides a number of advantages. First, video-based solutions save costs, especially when existing traffic surveillance infrastructure is used. Second, video detection systems can provide images to human operators for easy verification of traffic messages. Third, in comparison to inductive loops and overhead sensors, cameras can be mounted more flexibly and measurements can cover a larger area. Today, most video analysis systems rely on centralized infrastructures, where video data is analyzed on a central server, leading to a potential bottleneck for large-scale video networks. Decentralized systems, as described in this paper, rely on local processing of the obtained image data, which leads to more scalable and more flexible video networks. Especially for temporary maintenance areas, video-based traffic state detection on smart cameras appears promising. Based on the described results, ASFINAG plans to evaluate the implemented system in 2012 at a maintenance area on the Austrian motorway A2 in the Vienna region. A future vision is a flexible, self-sustaining (solar-powered) smart camera that employs a wireless network connection to report the observed traffic state and video data to a traffic information system.

5. Acknowledgements

This work was supported by Lakeside Labs GmbH, Klagenfurt, Austria, by funding from the European Regional Development Fund, and by the Carinthian Economy Promotion Fund (KWF) under grant KWF-20214/17097/24774.

References

1. Bramberger, M., A. Doblander, A. Maier, B. Rinner, and H. Schwabach. Distributed embedded smart cameras for surveillance applications. Computer, 39(2):68-75, Feb. 2006.
2. Schneider, M., M. Linauer, N. Hainitz, and H. Koller. Traveller information service based on real-time toll data in Austria. Intelligent Transport Systems, IET, 3(2):124-137, 2009.
3. Tusch, R., A. Fuchs, H. Gutmann, M. Kogler, J. Köpke, L. Böszörmenyi, M. Harrer, and T. Mariacher. A multimedia-centric quality assurance system for traffic messages. Data and Mobility, pages 1-13, 2010.
4. Bundesanstalt für Straßenwesen (BASt): Merkblatt für die Ausstattung von Verkehrsrechnerzentralen und Unterzentralen. Ausgabe 1999; Bergisch Gladbach, 1999.
5. Tusch, R., F. Pletzer, T. Mariacher, A. Krätschmer, V. Mudunuri, K. Sabbavarapu, M. Kogler, M. Harrer, P. Hrassnig, L. Böszörményi, and B. Rinner. LOOK2: A video-centric system for real-time notification and presentation of relevant traffic situations in Austria. Proceedings of the 19th ITS World Congress, Vienna, 2012. Accepted for presentation.
6. Data sheet of SLR smart camera, SLR Engineering OG, available online at http://www.slr-engineering.at/smart_camera.
7. Sidla, O., M. Rosner, M. Ulm, and G. Schwingshackl. Traffic monitoring with distributed smart cameras. Proceedings of SPIE, San Francisco, 2012.