2 BAYESIAN NETWORK ESTIMATION
Naoki KAWASAKI

In this paper, a new architecture for sensor fusion for Advanced Driver Assistant Systems (ADAS) is proposed. This architecture is based on a Bayesian Network and serves as a platform for integrating various sensors, such as Lidar, Radar and Vision sensors, into sensor fusion systems. The architecture has the following three major advantages: (1) it makes the structure and signal flow of complicated fusion systems easy to understand; (2) it increases the reusability of the sensor algorithm modules; (3) it enables easy integration of various sensors with different specifications. These advantages are confirmed by vehicle tests.

Key words: Sensor fusion, Bayesian Network, Advanced Driver Assistant Systems, MMW radar, Vision sensor, Camera, LIDAR

Until now, Advanced Driver Assistant Systems (ADAS) such as ACC (Adaptive Cruise Control) and FCAAS (Frontal Collision Avoidance Assistance System) have been developed. These systems use frontal monitoring sensors, such as millimeter wave (MMW) radar, laser radar (LIDAR) or a vision sensor (video camera), to recognize preceding cars and other objects. It is very difficult to fulfill the very high accuracy requirements of those systems with only a single sensor. Therefore the sensor fusion method is a hot topic. It exploits the fact that different sensors have different strengths, by weighting each sensor depending on its performance. For example, the fusion of MMW radar and a vision sensor is an effective combination, 1) because the former has good accuracy in longitudinal distance measurement but poor lateral accuracy, while the properties of the latter are the exact opposite. Thus, this combination provides good accuracy in both longitudinal and lateral measurements. Many sensor fusion methods have been suggested in the literature, using averaging, majority rule, Kalman filtering or Bayesian estimation, 2) 3) and higher accuracy has already been achieved. But those conventional algorithms tend to be quite ad hoc.
They depend on the specific combination of sensor devices, algorithms or application systems they were designed for. In the future, there will be many new sensors and upgraded recognition algorithms, which will cause a huge number of combinations. Therefore, a modular and general fusion architecture is an important topic. In this paper, a sensor fusion architecture using a BN is presented. This general probabilistic method can incorporate all kinds of relations with statistical data into the estimation. In chapter 2, the general usage of BN estimation is explained. Chapter 3 introduces the BN into the application of MMW radar and vision sensor fusion. Then chapter 4 explains the new architectural structure we propose in this paper.

A Bayesian Network (BN), which is also called a belief net or causal network, is a visualized modeling technique for statistical dependencies between variables 4) 5) and also works as a probabilistic estimation machine. Fig. 1 shows an example of a BN model. In general, a BN model consists of nodes and arrows (DAG: Directed Acyclic Graph). A node is a probabilistic variable, which consists of exhaustive and mutually exclusive states. An arrow is the causal relationship between nodes, from cause to effect. In this example, all variables are implemented with two states, and it is assumed that the target object is either a vehicle or a pedestrian.
Every node has a database inside, which shows the strength of the dependency between its parent nodes and itself. In Fig. 1, those are expressed as the tables p(C), p(R|C) and p(W|C), and are called conditional probability tables (CPT). These databases are built from historical knowledge about how often each state occurred given a fixed condition of the parent nodes. In this paper, lower case p(x) means a database of the BN, such as a CPT.

Fig. 1 Sample BN model for classification estimation (nodes: Classification C with states vehicle/pedestrian, Radio reflection intensity R with states strong/weak, Width W with states wide/narrow, and their CPTs p(C), p(R|C), p(W|C))

Given a model structure and its conditional probability tables (CPT), probabilistic estimation is now available: probabilities are assigned to all states of the classification node. Namely, if P(Cpedestrian | data) > P(Cvehicle | data), the target is very likely a pedestrian (Cped). For example, if the sensor detects that the radio reflection intensity is weak, P(Cped | Rwea) must be calculated. To calculate it, the fundamental rule of probability calculus, also called Bayes' rule, can be used:

P(a, b) = P(a|b) p(b) = P(b|a) p(a),  hence  P(b|a) = P(a|b) p(b) / P(a)   (1)

It is expressed in the following general inference form:

P(Searched | Known) = Σ_Free P(Searched, Known, Free) / P(Known)   (2)

where P(Known) is a scalar normalization factor. Now we get

P(Cped | Rwea) = Σ_W P(Cped, Rwea, W) / P(Rwea)   (3)

Here we have to calculate the joint probability function P(C, R, W) with the three variables C, R and W. The BN offers a very useful theorem called the chain rule for a more compact representation of P(C, R, W):

P(all nodes) = Π_{all nodes} p(node | parent nodes of that node)   (4)

Using this rule, the joint probability function is described as

P(C, R, W) = p(C) p(R|C) p(W|C)   (5)

where the right-hand terms are the given conditional probability tables (CPT). Now (3) derives the following equation.
P(Cped | Rwea) = Σ_W p(Cped) p(Rwea|Cped) p(W|Cped) / P(Rwea)   (6)

From this equation the probability P(Cped | Rwea) can be calculated as 0.667, while P(Cveh | Rwea) is 0.333. Comparing them, the result P(Cped | Rwea) > P(Cveh | Rwea) is obtained. By introducing additional acquired data, the probability estimation may be improved. If the width is additionally observed as narrow, the answer can now be calculated as

P(Cped | Rwea, Wnar) = p(Cped) p(Rwea|Cped) p(Wnar|Cped) / P(Rwea, Wnar)   (7)

Now P(Cped | Rwea, Wnar) is 0.994, which is more accurate than before. If a sensor's performance changes, only the corresponding node's CPT has to be changed; the other CPTs and the model structure can be left untouched. In Fig. 1, the utilization of a different radar sensor might cause some change in the reflection intensity, resulting in a different CPT for that node. However, the structure of the model and the tables of the other nodes do not have to be modified. To treat continuous variables such as length or position, probability density functions (PDF) can be assigned to these probabilities. Instead of a CPT, a conditional probability density function (CPDF) is used. In this paper, PDFs and probability functions are expressed as upper case P(X), while the databases of the BN such as CPTs and CPDFs are expressed as lower case p(x).
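The classification example of this chapter can be sketched in a few lines of Python. The CPT numbers below are assumptions (the original Fig. 1 tables are only partially legible in this transcription), chosen so that the quoted posteriors 0.667 and 0.994 come out; only the chain-rule inference itself follows the paper.

```python
# Hypothetical CPTs for the Fig. 1 model (assumed values, not the originals).
p_c = {"vehicle": 0.6, "pedestrian": 0.4}
p_r_given_c = {  # radio reflection intensity R
    "vehicle":    {"strong": 0.8, "weak": 0.2},
    "pedestrian": {"strong": 0.4, "weak": 0.6},
}
p_w_given_c = {  # detected width W
    "vehicle":    {"wide": 0.99, "narrow": 0.01},
    "pedestrian": {"wide": 0.1,  "narrow": 0.9},
}

def posterior_class(evidence):
    """P(C | evidence) via the chain rule P(C,R,W) = p(C) p(R|C) p(W|C):
    sum out unobserved variables, then normalize (eqs. (2)-(5))."""
    joint = {}
    for c in p_c:
        total = 0.0
        for r in ("strong", "weak"):
            for w in ("wide", "narrow"):
                # Skip terms inconsistent with the observed evidence.
                if evidence.get("R", r) != r or evidence.get("W", w) != w:
                    continue
                total += p_c[c] * p_r_given_c[c][r] * p_w_given_c[c][w]
        joint[c] = total
    z = sum(joint.values())  # P(evidence), the scalar normalization factor
    return {c: v / z for c, v in joint.items()}

print(posterior_class({"R": "weak"}))                 # pedestrian ≈ 0.667
print(posterior_class({"R": "weak", "W": "narrow"}))  # pedestrian ≈ 0.994
```

Note how the second call simply adds one more observed variable: the same joint factorization serves both queries, which is the point of eqs. (2) and (3).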
Fig. 2 shows a typical example of a sensor fusion BN model structure. There are two sensors A and B, which both detect the continuous variable lateral position, Xa and Xb respectively, in order to estimate the true lateral position X. A CPDF is stored in each node. Examples of the CPDFs p(X) and p(Xa|X) are shown in Fig. 2. p(Xa|X) is the statistically acquired probability distribution database: if the true lateral position X is given, sensor A outputs the data Xa in such a distribution. A sensor CPDF can be expressed by a combination of a Gaussian and a uniform distribution. If the sensor detects objects with a reliability of 50%, half of the sensor outputs are wrong and have no relationship between the aimed target's true X and the acquired data Xa. This can be expressed by a uniform probability distribution.

Fig. 2 BN causality model for lateral position estimation ((a) the model with true position X and detected positions Xa, Xb; (b) the CPDF p(X); (c) the CPDF p(Xa|X))

Basic sensor fusion is done by a BN such as in Fig. 2. From this function of the BN model and its calculation, an important idea of the fusion can be derived, called the likelihood merger. The equation of the fusion calculation to acquire the PDF P(X | Xa=a, Xb=b) is shown below:

P(X | Xa=a, Xb=b) = P(X, Xa=a, Xb=b) / P(Xa=a, Xb=b)
                  = p(X) p(Xa=a|X) p(Xb=b|X) / ∫ p(X) p(Xa=a|X) p(Xb=b|X) dX
                  = (1/Z) p(Xa=a|X) p(Xb=b|X)   (8)

where p(X) is assumed to be constant and the denominator is also constant, so they are expressed together as a normalization factor Z. It can be said from (8) that the fusion result P(X | Xa=a, Xb=b) is calculated by merging p(Xa=a|X) and p(Xb=b|X), which are the database CPDFs p(Xa|X) and p(Xb|X) with Xa and Xb fixed to the detected values a and b. In other words, by fixing the variable Xa, the CPDF p(Xa=a|X) is turned into a likelihood of X: it expresses how likely each X is when sensor A detected a. In this paper, this kind of likelihood is also called a unitable CPDF depending on X. Fig. 3 shows the idea of this likelihood merger. There are two unitable CPDFs p(Xa=a|X) and p(Xb=b|X) in Fig. 3(a), which are calculated from the CPDFs and the detected data a and b.
They are simply multiplied to get the resulting PDF P(X | Xa=a, Xb=b), as in Fig. 3(b). The highest peak of that PDF is the most likely value of X. If a third sensor is added to this system, a third child node is added to the model in Fig. 2, while the existing nodes are untouched. A third unitable CPDF then also appears in Fig. 3, and the peak of the resulting PDF becomes steeper. This means that the estimation becomes more precise. This idea is the essence of the Bayes theorem, and it will be used in chapter 4.

Fig. 3 Example of likelihood merging ((a) the two unitable CPDFs p(Xa=a|X) and p(Xb=b|X); (b) the merged PDF P(X | Xa=a, Xb=b))
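The likelihood merger of eq. (8) can be sketched numerically on a grid of candidate true positions. The sensor readings and Gaussian widths below are illustrative assumptions, not values from the experiment.

```python
import numpy as np

# Candidate true lateral positions X [m] (illustrative grid).
X = np.linspace(-5.0, 5.0, 1001)

def gaussian(x, mu, sigma):
    """Normalized Gaussian density, the core of a sensor's unitable CPDF."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Unitable CPDFs: each sensor's CPDF evaluated at its detected value,
# seen as a likelihood of X (assumed readings and accuracies).
lik_a = gaussian(X, mu=0.8, sigma=0.6)  # sensor A detected Xa = 0.8 m
lik_b = gaussian(X, mu=1.2, sigma=0.4)  # sensor B detected Xb = 1.2 m

# Eq. (8): multiply the likelihoods (p(X) assumed uniform) and normalize.
fused = lik_a * lik_b
fused /= np.trapz(fused, X)             # the 1/Z normalization factor

x_map = X[np.argmax(fused)]             # highest peak = most likely X
```

The fused peak lands between the two readings, pulled toward the more accurate sensor B, and is sharper than either individual likelihood, which is exactly the "steeper peak" effect described above.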
This BN estimation technique was applied in a real sensor fusion practice. The experimental subject is a data fusion system of MMW radar plus vision sensor for tracking vehicles on a road.

Fig. 4 Data flow diagram of the experimental sensor system (inputs: vision sensor image and MMW radar; recognition IPs: Prediction, Symmetry, Shadow, Tail lamp, Tracking, Blue sky, Vertical edge, MMW radar; Total decision outputs: longitudinal/lateral position, size/width, speed, classification, environment)

Fig. 4 shows the data flow diagram of the experimental sensor system using two devices, an MMW radar and a vision sensor. There are many object-detection algorithms, which are called recognition IPs (Intellectual Properties). They output target object properties such as the lateral center position or width. All outputs of the recognition IPs are collected and fused at the Total decision IP, where the BN is implemented. Some image recognition IPs use the MMW IP output information, but only for preconditioning. Here are brief explanations of the recognition IPs, the inputs of the BN:
1) Symmetry IP: searches for symmetry in the object image, and detects the lateral position of the object.
2) Vertical edge IP: sums up the luminance gradient of the image in the vertical direction, and detects the width of the object.
3) Shadow IP: searches for the dark spot in the image, and detects the center position/width of the object.
4) Tail lamp IP: searches for a corresponding pair of white spots in the image, and detects the center position/width of the object.
5) Tracking IP: looks for the previously detected image of the object in the current image, and detects the center position of the object.
6) Blue sky IP: detects the environmental brightness by scanning the upper part of the image.
7) MMW radar IP: analyzes the radar wave reflection and detects the lateral and longitudinal position, and the longitudinal relative velocity of the object.
8) Prediction IP: predicts the object position, width, etc. using the previous estimation result.
Each IP also outputs a detection intensity value, which expresses how successful the recognition was. For example, the Symmetry IP outputs a value expressing how symmetrical the image was.

Fig. 5 Bayesian Network model for the experimental sensor fusion (layers, from top: target fundamentals; environmental and own fundamentals; effective recognition causes; detection algorithm validity (weighting factor); detection intensity value; detected value. The legend distinguishes cause-to-result and feedback relationships, and hypothesis variables (to be estimated) from information variables (observed).)

Fig. 5 shows the BN model for this experimental sensor fusion, implemented as the algorithm within the Total decision IP of Fig. 4. This model consists of several layers, which help categorize the nodes. The upper layers are the true positions and properties of the object, which are the targets of the estimation. The lower layers are the detected values that are caused by the upper layer nodes. The core part of this model is, for example, the Lateral position node and its children, the detected lateral position nodes. This structure means, just like Fig. 2, that several IPs detect the lateral position of the same object. On the other hand, each recognition IP has a Boolean node "detection algorithm validity" (weighting factor), which is the most characteristic point of this model. This factor expresses how adequate the detection algorithm is for the current situation and target object, i.e. how reliable this IP is at the moment. The higher the probability of the state "yes" of this factor, the more reliable the IP is, and the bigger the impact of the acquired data on the final result. If that probability is zero, there is no relationship between the true position and the acquired data. This weighting factor is determined by other related nodes. For example, in Fig. 5 the Symmetry IP's weighting factor depends on the Contrast of the target image, the Classification and the Detection intensity. This means the IP may not be trustworthy if it is night and the image is not clear, if the target object is something that is not necessarily symmetrical, such as a road structure, or if the symmetry of the image is weak. This part of the model represents and defines the reliability causality of each IP. The weighting factor node is also connected with the detected lateral position node, which is at the lowest layer in Fig. 5. This means it affects the unitable CPDF (likelihood), i.e. the impact of the detected value on the true value. If it is a night condition and the weighting factor is low, the unitable CPDF must be adapted according to the condition, as shown in Fig. 6. The lower the weighting factor is, the more the uniform distribution becomes dominant.

Fig. 6 Example of likelihood adaptation

The BN allows a clear description of the fusion algorithm strategies. It is based on the detection mechanism expressed as a causal model, and is therefore very easy to understand, to analyze for weak points, to extend with added or removed sensors, and to improve. This advantage is very important, as the models tend to become huge. The above BN sensor fusion algorithm achieves good estimation accuracy. However, it still has an architectural problem.
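The likelihood adaptation of Fig. 6 can be sketched as a mixture of a Gaussian around the detected value and a uniform "output unrelated to the target" component, with the weighting factor setting the mixture proportion. The exact functional form and the numbers below are assumptions; the paper only states that a lower weighting factor makes the uniform part dominant.

```python
import numpy as np

# Illustrative grid of candidate true lateral positions [m].
X_MIN, X_MAX, N = -5.0, 5.0, 1001
X = np.linspace(X_MIN, X_MAX, N)

def adapted_likelihood(detected, sigma, w):
    """Unitable CPDF adapted by the weighting factor w = P(validity = yes):
    w * Gaussian(detected, sigma) + (1 - w) * uniform over the grid range."""
    gauss = np.exp(-0.5 * ((X - detected) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    uniform = np.full_like(X, 1.0 / (X_MAX - X_MIN))
    return w * gauss + (1.0 - w) * uniform

# The same detection, under a reliable and an unreliable condition.
day   = adapted_likelihood(detected=1.0, sigma=0.5, w=0.9)  # e.g. daytime
night = adapted_likelihood(detected=1.0, sigma=0.5, w=0.2)  # e.g. night
```

With w = 0.2 the likelihood is nearly flat, so in the multiplication of eq. (8) this sensor barely moves the fused estimate, which is the intended effect of a low weighting factor.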
If only a single recognition IP algorithm is changed, we also have to change the Total decision IP. This is because the information concerning the recognition IP's performance is stored not in the respective IP but in the Total decision IP, as a CPDF. In order to reduce the development effort, a modular BN architecture is proposed. To eliminate the generation of unitable CPDFs from the Total decision IP, the BN model is separated into recognition IP parts and a Total decision IP part. Fig. 7(a) shows the separated model for the Symmetry IP as an example of the recognition IP parts. This model shows how to output the unitable CPDF explained in section 3.2. Each IP thus considers its own accuracy and reliability, using previously estimated and currently acquired information, and sends the unitable CPDF to the Total decision IP. Those functions are now inside the Symmetry block of Fig. 4, not in the Total decision block. Fig. 7(b) is the partial model extracted from the Total decision IP part of the BN model. This part is now simplified, because the unitable CPDFs are calculated in the recognition IP parts, which means that the function is distributed. Only the likelihood merger calculation, as explained in section 3.2, and the peak search are done in the Total decision IP. The Total decision IP block is now independent of the sensor devices. All its inputs are likelihoods of the estimation target variables, and the Total decision IP does not care what kinds of devices are used. This architecture makes the fusion system quite modular. When an IP algorithm changes, the IP developer must only change its CPDF database. This means that only the corresponding recognition IP block in Fig. 7 is changed, and the Total decision IP block does not have to be modified.
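The proposed modular split can be sketched as follows. The class and function names are illustrative, not from the paper: each recognition IP owns its own CPDF database and emits only a likelihood over a shared position grid, while the Total decision IP merely multiplies whatever likelihoods arrive and searches for the peak.

```python
import numpy as np

# Shared grid of candidate true lateral positions [m] (illustrative).
X = np.linspace(-5.0, 5.0, 1001)

class RecognitionIP:
    """A recognition IP: turns its raw detection into a unitable CPDF.
    sigma and reliability stand in for the IP's private CPDF database."""
    def __init__(self, sigma, reliability):
        self.sigma = sigma
        self.reliability = reliability

    def likelihood(self, detected):
        gauss = np.exp(-0.5 * ((X - detected) / self.sigma) ** 2)
        uniform = np.ones_like(X)
        return (self.reliability * gauss / gauss.sum()
                + (1.0 - self.reliability) * uniform / uniform.sum())

def total_decision(likelihoods):
    """Device-independent fusion: likelihood merger (eq. (8)) + peak search."""
    fused = np.ones_like(X)
    for lik in likelihoods:
        fused *= lik
    fused /= fused.sum()
    return X[np.argmax(fused)], fused

# Three IPs with assumed accuracies and readings.
ips = [RecognitionIP(0.6, 0.9), RecognitionIP(0.4, 0.8), RecognitionIP(1.0, 0.5)]
detections = [0.8, 1.2, 0.5]
x_hat, _ = total_decision([ip.likelihood(d) for ip, d in zip(ips, detections)])
# Removing an IP = dropping one likelihood from the list; adding one = appending.
# total_decision itself never changes, which is the modularity claimed above.
```

Note that `total_decision` never sees sensor types, units of raw measurements, or CPDF databases; its only interface is the likelihood array, mirroring the unitable CPDF as the standardized output format.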
Fig. 7 Examples of the separated BN models ((a) Symmetry IP part: Environmental condition (prev.), Longitudinal position (prev.), Contrast of the target condition, Classification (prev.) and Prediction feed the symmetry detection, whose output is the unitable CPDF of the lateral position; (b) Total decision IP part (partial): the likelihoods from the Symmetry, Shadow, Tracking, Tail lamp and MMW IPs are merged into the estimated PDF of the lateral position)

This modularity offers reusability. If a recognition IP is removed, only the corresponding node in Fig. 7(b) has to be removed. This means that one of the unitable CPDFs is deleted from equation (8), which is a very easy operation. The addition of a new sensor or recognition IP is also easily handled in this architecture, as long as it outputs its data in the unitable CPDF format. Such a platform function will be an important ability for a sensor fusion architecture, because it allows easy integration of various sensors from various sensor suppliers.

The proposed architecture and algorithm were validated on real data. Fig. 8(a) and (b) show the estimated lateral position PDF, expressed as a downward graph under the car image. Fig. 8(a) is the result using the output of the MMW radar IP alone, whereas Fig. 8(b) is the fusion result using every IP. In Fig. 8(b), the estimated width PDF is also shown as an upward peak, which indicates the right edge of the object. The lateral position PDF peak of Fig. 8(b) is steeper than that of Fig. 8(a), which means that the fusion result is more reliable than the MMW radar alone.

Fig. 8 Estimation result

Generally speaking, it was confirmed that the fusion result achieves better accuracy than the MMW radar alone, and almost the same accuracy as the conventional fusion algorithm. It was also confirmed that the addition or removal of IP results is easily handled by simply adding or removing the corresponding unitable CPDF from the multiplication in the Total decision IP part. Therefore this fusion algorithm is also quite modular.

This paper offered a new architecture for sensor fusion by use of a modular Bayesian Network.
The Bayesian Network describes the fusion system as a causality model, which makes the fusion algorithm easy to understand. This merit is important when the fusion algorithm is big and complicated. It also helps to modify the algorithm when the system is changed. The addition of new sensors and recognition algorithms will be easily
treated by adding new nodes to the existing BN model, while the existing nodes are untouched. This architecture defines the unitable CPDF as a standardized output format for all sensors and recognition algorithms, and synthesizes them in a mathematically well-defined, modular way. Therefore, the reusability of each recognition algorithm is greatly increased. Additionally, sensors from different suppliers can easily be combined on this standard platform. Validation was done using real data. The accuracy of the merged data was as good as the conventional fusion algorithm, while the modularity is drastically increased. Higher accuracy will be achieved mainly by adding new helpful recognition algorithms, or by improving existing ones. But a more detailed model and a more precise database for the BN will also make the estimation more powerful and accurate. A disadvantage of BNs is their need for calculation performance. In the above application, this problem was reduced by separating the model into the modular Bayesian Networks we proposed. Further research can be done on this topic.

1) A. Gern, U. Franke, and P. Levi, DaimlerChrysler Research, Stuttgart, "Robust Vehicle Tracking Fusing Radar and Vision," Intelligent Vehicle Symposium 2000, IEEE.
2) C. Coué, Th. Fraichard, P. Bessière, and E. Mayer, "Multi-Sensor Data Fusion Using Bayesian Programming: an Automotive Application," Intelligent Vehicle Symposium 2002, IEEE.
3) B. Steux, C. Laurgeau, L. Salesse, and D. Wautier, "Fade: a vehicle detection and tracking system featuring monocular color vision and radar data fusion," Intelligent Vehicle Symposium 2002, IEEE.
4) R. O. Duda, Pattern Classification, 2nd ed., John Wiley & Sons, 2001.
5) F. V. Jensen, Bayesian Networks and Decision Graphs, New York: Springer-Verlag, 2001.
More informationEvaluation of Moving Object Tracking Techniques for Video Surveillance Applications
International Journal of Current Engineering and Technology E-ISSN 2277 4106, P-ISSN 2347 5161 2015INPRESSCO, All Rights Reserved Available at http://inpressco.com/category/ijcet Research Article Evaluation
More informationVerification: is that a lamp? What do we mean by recognition? Recognition. Recognition
Recognition Recognition The Margaret Thatcher Illusion, by Peter Thompson The Margaret Thatcher Illusion, by Peter Thompson Readings C. Bishop, Neural Networks for Pattern Recognition, Oxford University
More informationMixture Models and EM
Mixture Models and EM Goal: Introduction to probabilistic mixture models and the expectationmaximization (EM) algorithm. Motivation: simultaneous fitting of multiple model instances unsupervised clustering
More informationBayes Classifiers and Generative Methods
Bayes Classifiers and Generative Methods CSE 4309 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington 1 The Stages of Supervised Learning To
More informationWhat do we mean by recognition?
Announcements Recognition Project 3 due today Project 4 out today (help session + photos end-of-class) The Margaret Thatcher Illusion, by Peter Thompson Readings Szeliski, Chapter 14 1 Recognition What
More informationSupplier Business Opportunities on ADAS and Autonomous Driving Technologies
AUTOMOTIVE Supplier Business Opportunities on ADAS and Autonomous Driving Technologies 19 October 2016 Tokyo, Japan Masanori Matsubara, Senior Analyst, +81 3 6262 1734, Masanori.Matsubara@ihsmarkit.com
More informationDirected Graphical Models (Bayes Nets) (9/4/13)
STA561: Probabilistic machine learning Directed Graphical Models (Bayes Nets) (9/4/13) Lecturer: Barbara Engelhardt Scribes: Richard (Fangjian) Guo, Yan Chen, Siyang Wang, Huayang Cui 1 Introduction For
More informationProbabilistic Sensor Models for Virtual Validation Use Cases and Benefits
Probabilistic Sensor Models for Virtual Validation Use Cases and Benefits Dr. Robin Schubert Co-Founder & CEO BASELABS GmbH 2 BASELABS enables data fusion results. 3 Who we are What we do We partner Data
More informationSensory Augmentation for Increased Awareness of Driving Environment
Sensory Augmentation for Increased Awareness of Driving Environment Pranay Agrawal John M. Dolan Dec. 12, 2014 Technologies for Safe and Efficient Transportation (T-SET) UTC The Robotics Institute Carnegie
More informationSpatio-Temporal Stereo Disparity Integration
Spatio-Temporal Stereo Disparity Integration Sandino Morales and Reinhard Klette The.enpeda.. Project, The University of Auckland Tamaki Innovation Campus, Auckland, New Zealand pmor085@aucklanduni.ac.nz
More informationOn Road Vehicle Detection using Shadows
On Road Vehicle Detection using Shadows Gilad Buchman Grasp Lab, Department of Computer and Information Science School of Engineering University of Pennsylvania, Philadelphia, PA buchmag@seas.upenn.edu
More informationBuilding Classifiers using Bayesian Networks
Building Classifiers using Bayesian Networks Nir Friedman and Moises Goldszmidt 1997 Presented by Brian Collins and Lukas Seitlinger Paper Summary The Naive Bayes classifier has reasonable performance
More informationDigital Image Processing. Prof. P.K. Biswas. Department of Electronics & Electrical Communication Engineering
Digital Image Processing Prof. P.K. Biswas Department of Electronics & Electrical Communication Engineering Indian Institute of Technology, Kharagpur Image Segmentation - III Lecture - 31 Hello, welcome
More informationComputer Vision Patch-based Object Recognition (2)
Computer Vision Patch-based Object Recognition (2) Contents Papers on patch-based object recognition Previous class: basic idea Bayes Theorem: probability background Papers in this class Hierarchy recognition
More informationK-Means and Gaussian Mixture Models
K-Means and Gaussian Mixture Models David Rosenberg New York University June 15, 2015 David Rosenberg (New York University) DS-GA 1003 June 15, 2015 1 / 43 K-Means Clustering Example: Old Faithful Geyser
More informationVehicle Detection Method using Haar-like Feature on Real Time System
Vehicle Detection Method using Haar-like Feature on Real Time System Sungji Han, Youngjoon Han and Hernsoo Hahn Abstract This paper presents a robust vehicle detection approach using Haar-like feature.
More informationCS 188: Artificial Intelligence
CS 188: Artificial Intelligence Bayes Nets: Inference Instructors: Dan Klein and Pieter Abbeel --- University of California, Berkeley [These slides were created by Dan Klein and Pieter Abbeel for CS188
More informationBayesian Networks Inference (continued) Learning
Learning BN tutorial: ftp://ftp.research.microsoft.com/pub/tr/tr-95-06.pdf TAN paper: http://www.cs.huji.ac.il/~nir/abstracts/frgg1.html Bayesian Networks Inference (continued) Learning Machine Learning
More informationA New Approach For Convert Multiply-Connected Trees in Bayesian networks
A New Approach For Convert Multiply-Connected Trees in Bayesian networks 1 Hussein Baloochian, Alireza khantimoory, 2 Saeed Balochian 1 Islamic Azad university branch of zanjan 2 Islamic Azad university
More informationCSCE 478/878 Lecture 6: Bayesian Learning and Graphical Models. Stephen Scott. Introduction. Outline. Bayes Theorem. Formulas
ian ian ian Might have reasons (domain information) to favor some hypotheses/predictions over others a priori ian methods work with probabilities, and have two main roles: Optimal Naïve Nets (Adapted from
More informationJo-Car2 Autonomous Mode. Path Planning (Cost Matrix Algorithm)
Chapter 8.2 Jo-Car2 Autonomous Mode Path Planning (Cost Matrix Algorithm) Introduction: In order to achieve its mission and reach the GPS goal safely; without crashing into obstacles or leaving the lane,
More informationGraphical Models. David M. Blei Columbia University. September 17, 2014
Graphical Models David M. Blei Columbia University September 17, 2014 These lecture notes follow the ideas in Chapter 2 of An Introduction to Probabilistic Graphical Models by Michael Jordan. In addition,
More informationEE795: Computer Vision and Intelligent Systems
EE795: Computer Vision and Intelligent Systems Spring 2013 TTh 17:30-18:45 FDH 204 Lecture 18 130404 http://www.ee.unlv.edu/~b1morris/ecg795/ 2 Outline Object Recognition Intro (Chapter 14) Slides from
More informationFace Detection on Similar Color Photographs
Face Detection on Similar Color Photographs Scott Leahy EE368: Digital Image Processing Professor: Bernd Girod Stanford University Spring 2003 Final Project: Face Detection Leahy, 2/2 Table of Contents
More informationFundamental Technologies Driving the Evolution of Autonomous Driving
426 Hitachi Review Vol. 65 (2016), No. 9 Featured Articles Fundamental Technologies Driving the Evolution of Autonomous Driving Takeshi Shima Takeshi Nagasaki Akira Kuriyama Kentaro Yoshimura, Ph.D. Tsuneo
More informationStereo Vision Based Advanced Driver Assistance System
Stereo Vision Based Advanced Driver Assistance System Ho Gi Jung, Yun Hee Lee, Dong Suk Kim, Pal Joo Yoon MANDO Corp. 413-5,Gomae-Ri, Yongin-Si, Kyongi-Do, 449-901, Korea Phone: (82)31-0-5253 Fax: (82)31-0-5496
More informationMachine Learning
Machine Learning 10-701 Tom M. Mitchell Machine Learning Department Carnegie Mellon University February 15, 2011 Today: Graphical models Inference Conditional independence and D-separation Learning from
More informationTightly-Coupled LIDAR and Computer Vision Integration for Vehicle Detection
Tightly-Coupled LIDAR and Computer Vision Integration for Vehicle Detection Lili Huang, Student Member, IEEE, and Matthew Barth, Senior Member, IEEE Abstract In many driver assistance systems and autonomous
More informationFog Detection System Based on Computer Vision Techniques
Fog Detection System Based on Computer Vision Techniques S. Bronte, L. M. Bergasa, P. F. Alcantarilla Department of Electronics University of Alcalá Alcalá de Henares, Spain sebastian.bronte, bergasa,
More informationIntroduction to Graphical Models
Robert Collins CSE586 Introduction to Graphical Models Readings in Prince textbook: Chapters 10 and 11 but mainly only on directed graphs at this time Credits: Several slides are from: Review: Probability
More informationMeasurement of Pedestrian Groups Using Subtraction Stereo
Measurement of Pedestrian Groups Using Subtraction Stereo Kenji Terabayashi, Yuki Hashimoto, and Kazunori Umeda Chuo University / CREST, JST, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan terabayashi@mech.chuo-u.ac.jp
More informationProbabilistic Graphical Models
Overview of Part One Probabilistic Graphical Models Part One: Graphs and Markov Properties Christopher M. Bishop Graphs and probabilities Directed graphs Markov properties Undirected graphs Examples Microsoft
More informationIntroduction to Bayesian networks
PhD seminar series Probabilistics in Engineering : Bayesian networks and Bayesian hierarchical analysis in engineering Conducted by Prof. Dr. Maes, Prof. Dr. Faber and Dr. Nishijima Introduction to Bayesian
More informationAdvanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module
Advanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module www.lnttechservices.com Table of Contents Abstract 03 Introduction 03 Solution Overview 03 Output
More informationAcoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios
Sensor Fusion for Car Tracking Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios, Daniel Goehring 1 Motivation Direction to Object-Detection: What is possible with costefficient microphone
More informationContextual priming for artificial visual perception
Contextual priming for artificial visual perception Hervé Guillaume 1, Nathalie Denquive 1, Philippe Tarroux 1,2 1 LIMSI-CNRS BP 133 F-91403 Orsay cedex France 2 ENS 45 rue d Ulm F-75230 Paris cedex 05
More informationChapter 10. Conclusion Discussion
Chapter 10 Conclusion 10.1 Discussion Question 1: Usually a dynamic system has delays and feedback. Can OMEGA handle systems with infinite delays, and with elastic delays? OMEGA handles those systems with
More informationBayesian Methods in Vision: MAP Estimation, MRFs, Optimization
Bayesian Methods in Vision: MAP Estimation, MRFs, Optimization CS 650: Computer Vision Bryan S. Morse Optimization Approaches to Vision / Image Processing Recurring theme: Cast vision problem as an optimization
More informationSHRP 2 Safety Research Symposium July 27, Site-Based Video System Design and Development: Research Plans and Issues
SHRP 2 Safety Research Symposium July 27, 2007 Site-Based Video System Design and Development: Research Plans and Issues S09 Objectives Support SHRP2 program research questions: Establish crash surrogates
More informationPreceding Vehicle Detection and Tracking Adaptive to Illumination Variation in Night Traffic Scenes Based on Relevance Analysis
Sensors 2014, 14, 15325-15347; doi:10.3390/s140815325 Article OPEN ACCESS sensors ISSN 1424-8220 www.mdpi.com/journal/sensors Preceding Vehicle Detection and Tracking Adaptive to Illumination Variation
More informationGraphical Models & HMMs
Graphical Models & HMMs Henrik I. Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I. Christensen (RIM@GT) Graphical Models
More informationData Mining Classification: Bayesian Decision Theory
Data Mining Classification: Bayesian Decision Theory Lecture Notes for Chapter 2 R. O. Duda, P. E. Hart, and D. G. Stork, Pattern classification, 2nd ed. New York: Wiley, 2001. Lecture Notes for Chapter
More informationAUTOMATIC PARKING OF SELF-DRIVING CAR BASED ON LIDAR
AUTOMATIC PARKING OF SELF-DRIVING CAR BASED ON LIDAR Bijun Lee a, Yang Wei a, I. Yuan Guo a a State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University,
More informationSTAT 598L Probabilistic Graphical Models. Instructor: Sergey Kirshner. Exact Inference
STAT 598L Probabilistic Graphical Models Instructor: Sergey Kirshner Exact Inference What To Do With Bayesian/Markov Network? Compact representation of a complex model, but Goal: efficient extraction of
More informationStructure Learning in Bayesian Networks with Parent Divorcing
Structure Learning in Bayesian Networks with Parent Divorcing Ulrich von Waldow (waldow@in.tum.de) Technische Universität München, Arcisstr. 21 80333 Munich, Germany Florian Röhrbein (florian.roehrbein@in.tum.de)
More informationInference. Inference: calculating some useful quantity from a joint probability distribution Examples: Posterior probability: Most likely explanation:
Inference Inference: calculating some useful quantity from a joint probability distribution Examples: Posterior probability: B A E J M Most likely explanation: This slide deck courtesy of Dan Klein at
More informationparco area delle Scienze, 181A via Ferrata, , Parma 27100, Pavia
Proceedings of the IEEE Intelligent Vehicles Symposium 2000 Dearbon (MI), USA October 3-5, 2000 Stereo Vision-based Vehicle Detection M. Bertozzi 1 A. Broggi 2 A. Fascioli 1 S. Nichele 2 1 Dipartimento
More informationProblem definition Image acquisition Image segmentation Connected component analysis. Machine vision systems - 1
Machine vision systems Problem definition Image acquisition Image segmentation Connected component analysis Machine vision systems - 1 Problem definition Design a vision system to see a flat world Page
More informationObject recognition (part 1)
Recognition Object recognition (part 1) CSE P 576 Larry Zitnick (larryz@microsoft.com) The Margaret Thatcher Illusion, by Peter Thompson Readings Szeliski Chapter 14 Recognition What do we mean by object
More informationVisible and Long-Wave Infrared Image Fusion Schemes for Situational. Awareness
Visible and Long-Wave Infrared Image Fusion Schemes for Situational Awareness Multi-Dimensional Digital Signal Processing Literature Survey Nathaniel Walker The University of Texas at Austin nathaniel.walker@baesystems.com
More informationEstimating the wavelength composition of scene illumination from image data is an
Chapter 3 The Principle and Improvement for AWB in DSC 3.1 Introduction Estimating the wavelength composition of scene illumination from image data is an important topics in color engineering. Solutions
More informationObjects Relationship Modeling for Improving Object Detection Using Bayesian Network Integration
Objects Relationship Modeling for Improving Object Detection Using Bayesian Network Integration Youn-Suk Song and Sung-Bae Cho Dept. of Computer Science, Yonsei University 134 Shinchon-dong, Sudaemun-ku,
More information