I see you, you see me: Cooperative Localization through Bearing-Only Mutually Observing Robots


2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 7-12, 2012, Vilamoura, Algarve, Portugal

I see you, you see me: Cooperative Localization through Bearing-Only Mutually Observing Robots

Philippe Giguère (1), Ioannis Rekleitis (2), and Maxime Latulippe (1)
(1) Département d'informatique et de génie logiciel, Université Laval, firstname.lastname@ulaval.ca
(2) School of Computer Science, McGill University, yiannis@cim.mcgill.ca

Abstract — Cooperative localization is one of the fundamental techniques in GPS-denied environments, such as underwater, indoors, or on other planets, where teams of robots use each other to improve their pose estimation. In this paper, we present a novel scheme for performing cooperative localization using bearing-only measurements. These measurements correspond to the angles of pairs of landmarks located on each robot, extracted from camera images. Thus, the only exteroceptive measurements used are the camera images taken by each robot, under the condition that both cameras are mutually visible. An analytical solution is derived, together with an analysis of uncertainty as a function of the relative pose of the robots. A theoretical comparison with a standard stereo camera pose reconstruction is also provided. Finally, the feasibility and performance of the proposed method were validated through simulations and experiments with a mobile robot setup.

I. INTRODUCTION

The methodology of Cooperative Localization (CL) is a popular localization approach for teams of robots in situations where external measurements of the environment are not reliable or even unavailable. The key concept is to utilize proprioceptive measurements together with measurements relative to other robots to get an accurate estimate of the pose of each robot. In this paradigm, no external positioning systems are available, such as GPS or the external camera setups frequently used in motion capture systems. Underwater and underground environments, indoor areas, as well as areas on other planets, are all GPS-denied areas where CL can improve the accuracy of state estimation.

The most important contribution of CL is the decoupling of uncertainty from the environment. When robots operate in an unknown environment, the statistical properties of the noise affecting their sensors are, at best, an educated guess. For example, poor reflectance of walls can give skewed results in range finders and distorted images in cameras. Similarly, spills on the floor can affect the odometric measurements. By carefully engineering robot tracking sensors, usually a sensor on one robot and a target on the other, the uncertainty increase is bounded by the known parameters of the system.

For robots moving in 2D, the relative information between any two robots can be measured as a triplet of one distance and two angles z = [l, θ, φ], where l is the distance between the two robots, θ is the bearing at which the observing robot sees the observed robot, and φ is the perceived orientation of the observed robot. When robustness and minimal uncertainty are valued more than efficiency [1], [2], a team of robots will always keep some robots stationary to act as fixed reference points, while others move. Consequently, the uncertainty accumulation results only from the noise of the robot tracker sensor.

Fig. 1. An iRobot Create with a USB camera and a four-marker target mounted on it. A Hokuyo laser range finder, which was used to collect ground truth measurements, can be seen behind.
When efficiency is important and all robots move at the same time [3], [4], the uncertainty reduction is proportional to the number of robots [5]. Most approaches up to now utilize direct measurements of the relative distance l between two robots. In this paper, we propose a novel scheme where only relative angles are used. As such, a vision-based sensor mounted on each robot suffices to achieve localization; see Fig. 1 for an early prototype (1).

(1) In this setup, we placed two sets of two different landmarks, in order to find the best type of LED markers.

Traditionally, different probabilistic reasoning algorithms have been employed to combine the sensor data into an accurate estimate of the collective pose of the robot team [6], [7]. In this paper, we present an analytical calculation of the robots' pose together with a study of the different factors that affect the uncertainty accumulation, such as the distance between the robots and their relative orientation; probabilistically fusing odometry measurements with the robot tracker data is beyond the scope of this paper.

In the following section, an overview of related work is provided. Section III describes the CL problem we are addressing, as well as its approximate solution. Next, in Section IV, we present an uncertainty study based on an omnidirectional camera model. This study includes a comparison with a stereo camera localization system. Experimental results from a prototype system are then presented. Finally, we present future directions for this work.

II. RELATED WORK

Cooperative localization was first used in the work of Kurazume et al. [1], under the term cooperative positioning. The authors studied the effect of CL on uncertainty reduction by keeping at least one member of the robot team stationary. Early work presented results from simulation; later work [8] used a laser scanner and a cube target to recover range and bearing between robots, and derived optimal motion strategies. A vision-based system with a helical pattern was presented in [9], recovering range, bearing, and the orientation of the observed robot, and was used in a follow-the-leader scenario. The term cooperative localization was used in [10], where the helical pattern facilitated mapping. The accuracy of that vision-based system was severely limited by discretization errors. Later work used a laser range finder and a three-plane target, producing estimates on the order of 0.01 m accuracy [11]. An alternative vision-based detection scheme [12] used colored cylinders. Using CL with a team of small robots, Grabowski et al. [13] employed range-only estimates from a single sonar transducer with centimeter accuracy. Ultrasound transducers were also used in [14] with similar accuracy. Alternative approaches used different sensors to provide the bearing of the observed robot, using omnidirectional cameras [15], [16], or both bearing and distance, with stereo vision and active lighting in [17] and by combining vision and laser range finders in [18]. Relative bearing and/or range has been employed in a constraint optimization framework [19]. An analysis of the different sensor modalities, with solutions for range-only measurements [20] in 2D [21] and in 3D [22], was presented. The first analytical evaluation of the uncertainty propagation during CL was presented in [5]. The initial formulation was based on the algorithm described in [23]; the main difference was that the robots, instead of measuring their relative orientations, had access to absolute orientation measurements. Further studies of performance were presented in [24].

III. DESCRIPTION OF THE PROBLEM

The problem of cooperative localization of two robots we are addressing consists of combining measurements relative to each other, in order to recover the individual robot poses in a common frame of reference. For simplicity, let us consider two robots (A and B) that are moving on a plane, their state described as follows:

x_i = [x_i, y_i, θ_i]^T   (1)

where [x, y] describes the position and θ the orientation. Each robot i is equipped with a number of identifiable co-linear visual markers; in our implementation, three landmarks were used (LED lights), placed at an equal distance d/2 from each other. In addition, each robot is equipped with a camera located directly below the central marker, as depicted in Fig. 2. Using only relative orientation measurements from the two cameras, the relative position and orientation of one robot with respect to the other can be derived.

In general, the two robots move on the plane using their odometry, or other sensors, to localize. When two robots meet, they orient themselves in such a way that each camera can see the markers on the other robot, and thus the other robot's camera. From this configuration, each robot takes a picture and transmits it to the other robot. From the pair of images I_A and I_B, each robot extracts the following 6 angular measurements: ϕ_1^A, ϕ_2^A, ϕ_3^A, ϕ_1^B, ϕ_2^B, and ϕ_3^B. These angles ϕ correspond to the three visual markers on the other robot, as depicted in Fig. 2.

Fig. 2. The three angles ϕ_1, ϕ_2 and ϕ_3 measured by a single camera. The angles ϕ_1 and ϕ_3 correspond to the angular positions, in the image, of the landmarks. The angle ϕ_2 corresponds to the angular position of the other camera.

Central to the proposed solution is identifying the orientation and location of the other camera, which will be used as the reference point for the robot location. The relative localization of B with respect to A will be deduced from:
- the angle α = ϕ_1^B − ϕ_3^B, corresponding to the angle between the outer markers of robot A, as seen by camera B;
- the angle β = ϕ_2^A, corresponding to the bearing of the location of camera B within the frame of reference of A.

These measurements, α and β, induce the following constraints on the location of camera B relative to camera A:
- camera B must be located on a circle of radius r = d/(2 sin α) passing through markers D and E (inscribed and central angle theorem);
- camera B must be located along a line of angle β with respect to camera A.

The intersection of these two constraints, illustrated in Fig. 3, uniquely locates camera B with respect to camera A. The relative position of A with respect to B can also be computed, using ϕ_1^A, ϕ_3^A and ϕ_2^B, in a similar manner. Next, we are going to derive the analytical expression of the pose of robot B in the frame of reference of robot A.

Fig. 3. Schema of the localization technique between two robots separated by an unknown distance l. The angle α between the two visual markers D and E, spaced by d, is measured from the image taken by camera B, yielding a circular constraint. Camera A is used to measure the relative angle β. Camera B is at the intersection of the circle and the line.
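The intersection of the two constraints can also be computed numerically, which is a convenient sanity check for the closed form derived in the next section. The Python sketch below (not from the paper; the example geometry and numerical values are assumptions for illustration only) places camera A at the origin with its optical axis along the y-axis, builds the circumscribing circle from α and d, and intersects it with the bearing ray defined by β.

```python
import numpy as np

def locate_camera_B(alpha, beta, d):
    """Numerically intersect the two constraints of Section III.

    alpha: angle (rad) subtended at camera B by the outer markers D, E of robot A.
    beta:  bearing (rad) of camera B, measured by camera A from its optical axis (+y).
    d:     distance between the outer markers, D = (-d/2, 0) and E = (d/2, 0).
    """
    # Circumscribing circle through D, E and camera B (inscribed angle theorem).
    y_c = d / (2.0 * np.tan(alpha))          # circle centre, on the optical axis
    r = d / (2.0 * np.sin(alpha))            # circle radius

    # Ray from camera A at bearing beta: points t * (sin(beta), cos(beta)), t > 0.
    # Camera A lies inside the circle, so exactly one intersection has t > 0:
    # keep the positive root of t^2 - 2 t (y_c cos(beta)) + y_c^2 - r^2 = 0.
    t = y_c * np.cos(beta) + np.sqrt(r**2 - (y_c * np.sin(beta))**2)
    return np.array([t * np.sin(beta), t * np.cos(beta)])

# Illustrative example: markers 1 m apart, robot B about 10 m away, 20 deg off-axis.
d, l_true, beta = 1.0, 10.0, np.deg2rad(20.0)
B_true = np.array([l_true * np.sin(beta), l_true * np.cos(beta)])
vD = np.array([-d / 2.0, 0.0]) - B_true
vE = np.array([ d / 2.0, 0.0]) - B_true
alpha = np.arccos(vD @ vE / (np.linalg.norm(vD) * np.linalg.norm(vE)))
print(locate_camera_B(alpha, beta, d))       # ~ B_true
```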

A. Analytical Solution for the position of robot B relative to robot A

Let us set the origin A = (0, 0) as the position of the camera on robot A. The outer landmarks D and E of robot A are located at L_l^A = (−d/2, 0) and L_r^A = (d/2, 0), respectively. The position of the camera on robot B is at B = (x_b, y_b), with the center of the circumscribing circle containing the position of robot B and the positions of the two markers D and E on robot A located at C = (x_c, y_c). All of these positions are depicted in Fig. 3. The values of α, β and d will then be used to estimate the unknown position of robot B relative to A.

Let us first calculate the center and radius r of the circumscribing circle. From the inscribed angle theorem, the angle DCE between the two markers at the center of the circle C is 2α. With the origin (0, 0) at A and the optical axis of the camera pointing along the y-axis, the circle must be located directly in front of camera A, because of the symmetric location of the markers D and E with respect to camera A. The center of the circle (0, y_c) and its radius r are

y_c = d / (2 tan α),    r = d / (2 sin α).   (2)

The distance l = AB between the two robots can be calculated from the triangle ACB, where AC = y_c, CB = r, and the angle BAC is β. From the construction of the circle, r² = y_c² + l² − 2 y_c l cos β, which gives

l = d (cos α cos β + sqrt(1 − cos²α sin²β)) / (2 sin α).   (3)

Given l, the position of B is:

B = [x_b; y_b] = [l cos(90° − β); l sin(90° − β)] = [l sin β; l cos β],   (4)

where l is calculated from Eq. 3.

B. Approximate Solution for the position of robot B relative to robot A

By assuming that d ≪ l, a number of approximations can be made. Most importantly, we will assume that camera A is on the circumscribing circle, at the point P shown in Fig. 4, as opposed to at (0, 0). This approximation is possible, since the distance between P and (0, 0) tends to 0 as l/d grows.

Fig. 4. Geometric relationships used in the approximate solution for the localization problem. The angle between the tangent at P and the line PB is equal to 90° − β.

Using the fact that the angle formed by a chord and a tangent that intersect on a circle is half the measure of the intercepted arc, we have that the angle γ in Fig. 4 is equal to:

γ = 2(90° − β).   (5)

The approximate position x_b from camera A is:

x_b ≈ r sin γ = r sin(2(90° − β)) = r sin 2β.   (6)

Combining Eqs. 2 and 6, we have:

x_b(α, β, d) ≈ d sin 2β / (2 sin α).   (7)

The approximate solution for y_b is:

y_b ≈ r(1 − cos γ) = r(1 + cos 2β).   (8)

Combining Eqs. 2 and 8, we have:

y_b(α, β, d) ≈ d (1 + cos 2β) / (2 sin α).   (9)

IV. UNCERTAINTY STUDY: JACOBIAN OF x_b AND y_b

The precision of the localization of robot B relative to A is a function of the Jacobian J of the measurement functions x_b(α, β, d) and y_b(α, β, d),

J = [ ∂x_b/∂α   ∂x_b/∂β   ∂x_b/∂d
      ∂y_b/∂α   ∂y_b/∂β   ∂y_b/∂d ],   (10)

and of the measurement errors on the various angles ϕ_j. In this study, we will assume that these errors are normally distributed, with a standard deviation σ_ϕ. Since the angle α is based on the difference ϕ_1 − ϕ_3, and assuming that the errors are uncorrelated (2), we have the following standard deviation on α: σ_α = √2 σ_ϕ. The angle β simply corresponds to the second angle, β = ϕ_2, leading to the following standard deviation on β: σ_β = σ_ϕ.

In order to better see the impact of the distance l on the precision of localization, we can reformulate Eqs. 7 and 9 in terms of (l, β, d) instead of (α, β, d). This can be done using the dependency between α and β, through the law of sines applied to the triangle of Fig. 4 b):

sin α / d = sin ω / l = sin(β + 90°) / l,   so   sin α ≈ (d/l) cos β.   (11)
Using this approximation and cos²β = (1 + cos 2β)/2, we get

∂y_b/∂α = −d (1 + cos 2β) cos α / (2 sin²α) ≈ −(l²/d) cos α.   (12)

(2) There is some amount of correlation, but the error due to neglecting it is smaller than the actual noise.

Finally, for the conditions where d ≪ l, we get α ≪ 1 and cos α ≈ 1:

∂y_b/∂α ≈ −l²/d.   (13)

The next partial derivative is approximated by using Eq. 11,

∂y_b/∂β = −d sin 2β / sin α ≈ −l sin 2β / cos β = −2l sin β,   (14)

by making use of the identity sin 2β = 2 sin β cos β. The last element for y_b, using Eq. 11, gives:

∂y_b/∂d = (1 + cos 2β) / (2 sin α) ≈ l (1 + cos 2β) / (2d cos β) = (l/d) cos β.   (15)

In like manner, we approximate the partial derivatives of x_b:

∂x_b/∂α = −d cos α sin 2β / (2 sin²α) ≈ −(l²/d) tan β,   (16)

∂x_b/∂β = d cos 2β / sin α ≈ l cos 2β / cos β,   (17)

and

∂x_b/∂d = sin 2β / (2 sin α) ≈ l sin 2β / (2d cos β) = (l/d) sin β.   (18)

Thus, the Jacobian J of the system is approximated as:

J ≈ [ −(l²/d) tan β    l cos 2β / cos β    (l/d) sin β
      −l²/d             −2l sin β           (l/d) cos β ].   (19)

Fig. 5 shows the distributions of the estimates of the position of robot B relative to A, for different angles β and l = 10 m, using a simulation in MATLAB. Robot A is located at the origin (0, 0), with its markers D and E located at (−1, 0) and (1, 0), respectively. Analysis of these simulation results confirmed the validity of Eq. 19.

Fig. 5. Distribution of pose estimates of camera B with a distance between the robots of l = 10 m, at different relative bearings β, in simulation. The samples are generated with Gaussian noise √2 σ_ϕ for α and σ_ϕ for β, with σ_ϕ = 1.745e-3 rad. As can be seen from the distributions at locations 1 and 2, the value of σ_y is relatively independent of β.

V. COMPARISON WITH OTHER TECHNIQUES

A. Theoretical Comparison with Stereo Camera Based Localization

Distance and relative heading can also be estimated if both cameras are located on a single robot, forming a stereo camera pair. However, our method presents a number of advantages compared to a stereo system:
- only one camera per robot is required, reducing the computing load, weight and cost;
- the perceived heading φ between the robots can be measured with higher precision;
- with more than two robots, a stereo-based solution requires all robots to be equipped with stereo pairs, in order to avoid the case where two robots carrying only targets meet and are thus unable to localize.

In this section, we demonstrate that the localization errors σ_x and σ_y are, for all purposes, similar for both systems. The reason for this similarity is that cameras and view points can be interchanged, as captured by the Carlsson-Weinshall duality principle [25]. The relative heading error, however, is much smaller for our method. For this comparison, we model a pair of stereo cameras with a baseline identical to the distance d between the landmarks in our configuration. Both cameras are oriented so as to see the marker L on robot B, as shown in Fig. 6 b). This baseline is the same as the distance d between the landmarks used in our proposed method, reproduced again in Fig. 6 a).

B. Derivation of Precision for Stereo Camera

Fig. 6. Comparison between landmark (L) and camera (C) configurations for our method a) and an equivalent stereo system b). The positions of the cameras and markers are interchanged.

For this equivalent stereo camera system, the variance in the location of the target in the space (x, y) is [26]:

σ²_x,stereo = (b² + x²) y² σ²_p / (2 b² f²),   (20)

and

σ²_y,stereo = y⁴ σ²_p / (2 b² f²),   (21)

where σ_p is the standard deviation of the landmark in the image plane, f is the focal length and b is the baseline of the system.
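The MATLAB simulation mentioned in the discussion of Fig. 5 is not listed in the paper. The Python sketch below is an assumed reconstruction of that kind of check (parameters l = 10 m, d = 2 m and σ_ϕ = 1.745e-3 rad chosen to mimic Fig. 5): it propagates the angular noise through the approximate Jacobian of Eq. 19 and compares the result with the sample covariance of noisy evaluations of the exact solution of Eqs. 3-4.

```python
import numpy as np

def exact_position(alpha, beta, d):
    """Exact position of camera B (Eqs. 3-4), camera A at the origin, optical axis +y."""
    l = d * (np.cos(alpha) * np.cos(beta)
             + np.sqrt(1.0 - np.cos(alpha)**2 * np.sin(beta)**2)) / (2.0 * np.sin(alpha))
    return np.array([l * np.sin(beta), l * np.cos(beta)])

def approx_covariance(l, beta, d, sigma_phi):
    """First-order covariance of (x_b, y_b) from the (alpha, beta) columns of Eq. 19,
    with sigma_alpha = sqrt(2)*sigma_phi, sigma_beta = sigma_phi and sigma_d = 0."""
    J = np.array([[-(l**2 / d) * np.tan(beta), l * np.cos(2.0 * beta) / np.cos(beta)],
                  [-(l**2 / d),               -2.0 * l * np.sin(beta)]])
    S = np.diag([2.0 * sigma_phi**2, sigma_phi**2])
    return J @ S @ J.T

# Assumed test conditions, roughly those of Fig. 5.
rng = np.random.default_rng(0)
d, l, beta, sigma_phi = 2.0, 10.0, np.deg2rad(15.0), 1.745e-3
B_true = np.array([l * np.sin(beta), l * np.cos(beta)])
vD = np.array([-d / 2.0, 0.0]) - B_true
vE = np.array([ d / 2.0, 0.0]) - B_true
alpha = np.arccos(vD @ vE / (np.linalg.norm(vD) * np.linalg.norm(vE)))

samples = np.array([exact_position(alpha + np.sqrt(2.0) * sigma_phi * rng.standard_normal(),
                                   beta + sigma_phi * rng.standard_normal(), d)
                    for _ in range(20000)])
print(np.cov(samples.T))                         # empirical covariance of the estimates
print(approx_covariance(l, beta, d, sigma_phi))  # first-order prediction from Eq. 19
```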

Since our system operates in 2D, we have changed the original notation so that z = y, to conform with our axes. The error functions for σ_x,stereo and σ_y,stereo in Eqs. 20 and 21 can be expressed in terms of β and σ_β, to compare directly with the errors in our system, based on angles α and β. First, we take the square root and use f = 1:

σ_x,stereo = sqrt(b² + x²) y σ_p / (√2 b f) = sqrt(b² + x²) y σ_p / (√2 b).   (22)

From the definition of β, we replace x by x = y tan β and move the denominator b inside the square root:

σ_x,stereo = sqrt(1 + (y tan β / b)²) y σ_p / √2.   (23)

From the definition of β and a focal distance f = 1, we get that the position on the image plane is p = tan β. This converts the pixel noise σ_p into angular noise σ_β = σ_ϕ:

σ_p = (dp/dβ) σ_ϕ = (d tan β / dβ) σ_ϕ = sec²β σ_ϕ.   (24)

Replacing σ_p in Eq. 23 by Eq. 24, and using y = l cos β from the problem definition, we get that the error on position σ_x,stereo, expressed as a function of β and σ_ϕ, is:

σ_x,stereo = sqrt(1 + ((l/b) sin β)²) l σ_ϕ / (√2 cos β).   (25)

In a similar manner, we can show that the error on position σ_y,stereo, expressed as a function of σ_ϕ, is equal to:

σ_y,stereo = l² σ_ϕ / (√2 b).   (26)

C. Comparison with our Method

We can estimate the error on the x position using partial derivatives:

σ²_x = (∂x_b/∂β · σ_β)² + (∂x_b/∂α · σ_α)²,   (27)

since we assume that σ_d = 0. We have σ_α = √2 σ_ϕ and σ_β = σ_ϕ. Taking the expressions from Eq. 19, we have:

σ²_x = (l cos 2β / cos β · σ_ϕ)² + ((l²/d) tan β · √2 σ_ϕ)².   (28)

Substituting d = 2b from the baseline definition in [26] and using standard trigonometric identities, we get:

σ_x = sqrt(1 + cos 4β + ((l/b) sin β)²) l σ_ϕ / (√2 cos β).   (29)

For a similar level of angular noise σ_ϕ, both systems perform nearly identically. In fact, Eq. 29 only differs from Eq. 25 by the extra cos 4β term. However, this term does not contribute significantly beyond 10° when l ≫ b, since it is dominated by ((l/b) sin β)².

Similarly for σ_y, using the partial derivatives in Eq. 19, we have:

σ²_y = (∂y_b/∂β · σ_β)² + (∂y_b/∂α · σ_α)² = (2 sin²β + l²/(4b²)) l² σ_ϕ².   (30)

If we only consider the cases where the robots are within their fields of view (β < 45°) and at a distance l ≫ b, then 2 sin²β ≪ l²/(4b²), and we can approximate Eq. 30:

σ_y ≈ l² σ_ϕ / (2b).   (31)

Thus, we have a depth evaluation better by a factor of 1/√2, compared to a stereo system with one landmark observed. If two landmarks are observed with the stereo system and the estimated depths averaged, then the error σ_y,stereo should be the same as in Eq. 31.

One must keep in mind that the analysis above was done for the case where the stereo baseline distance 2b is equal to d. However, most stereo systems on mobile robots use short baseline distances 2b, to facilitate image registration. Since our approach is not bound by such limitations, we can use an arbitrarily large separation d between our landmarks. In practice, this allows our system to significantly outperform standard stereo systems on mobile robots.
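As a quick numerical illustration of the claim that the two systems deliver nearly identical position errors, the sketch below (the values of σ_ϕ, b and l are assumptions chosen only to make the trend visible) evaluates Eqs. 25 and 29 for σ_x, and Eqs. 26 and 31 for σ_y, over a range of bearings β.

```python
import numpy as np

# Predicted 1-sigma position errors: our method (Eqs. 29, 31) vs. an equivalent
# stereo pair (Eqs. 25, 26). All numerical values below are assumptions.
sigma_phi = 1.745e-3        # angular noise (rad)
b = 0.5                     # half of the marker separation, d = 2b (m)
l = 10.0                    # distance between the robots (m)

for beta_deg in (0.0, 10.0, 20.0, 30.0, 45.0):
    beta = np.deg2rad(beta_deg)
    s = (l / b) * np.sin(beta)
    sx_stereo = np.sqrt(1.0 + s**2) * l * sigma_phi / (np.sqrt(2.0) * np.cos(beta))                     # Eq. 25
    sx_ours = np.sqrt(1.0 + np.cos(4.0 * beta) + s**2) * l * sigma_phi / (np.sqrt(2.0) * np.cos(beta))  # Eq. 29
    sy_stereo = l**2 * sigma_phi / (np.sqrt(2.0) * b)                                                   # Eq. 26
    sy_ours = l**2 * sigma_phi / (2.0 * b)                                                              # Eq. 31
    print(f"beta = {beta_deg:4.1f} deg   sigma_x: {sx_ours:.4f} vs {sx_stereo:.4f} m   "
          f"sigma_y: {sy_ours:.4f} vs {sy_stereo:.4f} m")
```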
D. Perceived Heading φ

Compared to a single stereo pair, we are able to recover the perceived heading φ between the two cameras with very high precision. The perceived heading estimate for our approach is equal to:

φ = β_2 − β_1,   (32)

with a noise equal to

σ_φ = √2 σ_ϕ.   (33)

Measuring this perceived heading using a single stereo pair would not capture an estimate of φ with this level of precision. The reason is that finding φ involves the following computation:

φ_stereo ≈ cos⁻¹( (ϕ_3 − ϕ_1) l_measured / d ).   (34)

With ϕ_δ = ϕ_3 − ϕ_1, the estimated noise on φ_stereo is

σ²_φ,stereo ≈ (∂φ_stereo/∂ϕ_δ · √2 σ_ϕ)² + (∂φ_stereo/∂l · σ_l)²   (35)

σ²_φ,stereo ≈ (2 l² σ_ϕ² + ϕ_δ² σ_l²) / ((d − l ϕ_δ)(d + l ϕ_δ)).   (36)

The best possible case is when ϕ_δ = 0 and the other robot is located straight ahead, which means that:

σ_φ,stereo ≈ √2 (l/d) σ_ϕ,   (37)

which is always greater than our perceived heading estimate noise in Eq. 33, for d ≪ l.
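To make this contrast concrete, the small sketch below evaluates the predicted heading noise of both approaches (Eq. 33 versus the best-case stereo noise of Eq. 37); the values of σ_ϕ and d are assumptions chosen only for illustration.

```python
import numpy as np

# Predicted perceived-heading noise: mutual observation (Eq. 33) vs. the best
# case of a single stereo pair (Eq. 37). Numerical values are assumptions.
sigma_phi = 1.745e-3                                   # angular noise (rad)
d = 1.0                                                # marker separation (m)

sigma_ours = np.sqrt(2.0) * sigma_phi                  # Eq. 33, independent of l
for l in (2.0, 5.0, 10.0, 20.0):
    sigma_stereo = np.sqrt(2.0) * (l / d) * sigma_phi  # Eq. 37, grows with l
    print(f"l = {l:5.1f} m   ours: {np.degrees(sigma_ours):.3f} deg   "
          f"stereo: {np.degrees(sigma_stereo):.3f} deg")
```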

E. Simulated Comparison against One Camera and 3 Non-colinear Landmarks

It is possible to retrieve both heading and distance information without exchanging images between robots, if the robots have at least 3 non-colinear markers on them. For comparison, we simulated the performance of such a system with one camera and 3 markers located on a square of side d = 1, with a distance l = 10 between the robots and σ_ϕ = 1.745e-3 rad. For our mutual camera approach, information gathered from both sides of the square was optimally combined. Simulation results, presented in Fig. 7, show that the uncertainty on the location of the robot is much larger for the 3-markers case.

Fig. 7. Simulation results for one camera and 3 non-colinear markers vs. our approach, for various locations of robot B at a distance l = 10. The markers on robot A are located at (d, 0), (0, 0) and (0, d) (outside the graph).

Fig. 8. Platform used for the data collection (top), with one picture used for localization (bottom).

VI. EXPERIMENTAL RESULTS

We evaluated the performance of our localization method on two real data sets. The experimental setup comprised one iRobot Create robot with an on-board computing module, two Logitech C… webcams (…×1200 pixels) and a number of LED marker glowsticks. The outer markers were separated by a distance of d = … m on the robot, with a camera and another marker exactly in the middle, as shown in Fig. 8. The other camera was mounted on a fixed setup, with an LED marker above it. Ground truth was established by using a Hokuyo URG-04LX-UG01 LIDAR mounted on the robot and by placing 10 boxes in the environment as landmarks. The robot moved forward by 5.25 cm between each step, along a nearly straight trajectory towards the fixed camera. Pictures were taken at each step forward and used as the test sets.

Fig. 9 shows the estimated robot position found by our method, using one α and one β per sample, compared with ground truth. The average absolute position error over the complete trajectory was 4.09 cm for experiment 1 (140 samples) and 4.40 cm for experiment 2 (110 samples). When combined with the range information obtained from the other α, β pair, the average absolute measurement error for experiment 1 dropped to 2.82 cm, a reduction by a factor of √2, as expected when averaging two uncorrelated noisy estimates. The distribution of absolute errors, shown in Fig. 10, is consistent with the noise σ_y predicted by Eq. 31.

Fig. 9. Comparison between the estimated robot position using our method and ground truth, for two trials. The fixed camera was located at (0, 0).

Fig. 10. Comparison between the absolute position error and the predicted error, as a function of the distance l. As can be seen, the predicted errors for an angular noise of σ_ϕ = … rad are consistent with the measured errors. With a focal distance of f = 1320 pixels for the webcam, this noise is equivalent to 0.4 pixel.
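The predicted-error curve of Fig. 10 follows directly from Eq. 31 and the camera parameters quoted in the caption. The sketch below shows one way to reproduce it; since the marker separation d is elided in the text above, the value D_SEP is a placeholder assumption, not the experimental value.

```python
import numpy as np

# Predicted sigma_y as a function of distance l (Eq. 31, with 2b = d).
# ASSUMPTIONS: sigma_phi is derived from the 0.4-pixel noise and the
# f = 1320-pixel focal length quoted in the Fig. 10 caption; D_SEP is a
# placeholder for the marker separation, whose value is elided above.
F_PIXELS = 1320.0
PIXEL_NOISE = 0.4
D_SEP = 0.4                                # placeholder marker separation (m)

sigma_phi = PIXEL_NOISE / F_PIXELS         # ~3.0e-4 rad of angular noise
for l in np.arange(1.0, 8.0, 1.0):         # robot-to-camera distance (m)
    sigma_y = l**2 * sigma_phi / D_SEP     # Eq. 31
    print(f"l = {l:.1f} m -> predicted sigma_y = {100.0 * sigma_y:.2f} cm")
```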

Histograms of step measurements, for three brackets of distance l, are shown in Fig. 11. These correspond to the estimated distance between the 5.25 cm steps of the robot. Thus, a perfect system would measure exactly this value, and noisy systems should have a distribution with a spread related to the measurement noise. The standard deviations of these distributions are, again, in agreement with our noise model.

Fig. 11. Robot step (5.25 cm) measurement distributions with our method (a-c) and LIDAR (d).

The perceived heading φ for experiment 1 is plotted in Fig. 12, along with the perceived heading derived from the Hokuyo laser data. One can see that the heading φ computed with our method is much less noisy than the one found with a LIDAR, which is the common method for evaluating the heading of a robot in its environment.

Fig. 12. Robot perceived heading estimates during experiment 1. As a comparison, laser scan alignments performed with ICP resulted in noisier estimates.

VII. CONCLUSION AND FUTURE WORK

In this paper, we presented a novel approach to cooperative localization, based on mutual observation of landmark bearing angles between two robots. We derived a mathematical solution along with a theoretical noise model that compared favorably against an equivalent stereo camera system. During experiments with a mobile robot, our system demonstrated good position estimation (average error around 4 cm over 250 samples), despite the use of off-the-shelf cameras and markers. This makes our solution particularly well suited for deployment on fleets of inexpensive robots.

The biggest challenge with the current physical implementation is to establish mutual observations without interfering with the normal operation of the vehicles. To eliminate this problem, omnidirectional cameras can be employed; our method is completely compatible with such cameras. We plan on extending this methodology to vehicles that move freely in three dimensions. Of particular interest are underactuated square blimps that move at slow speeds [27], since the hardware required for our solution is very light. Applications to underwater vehicles [28] are also considered, since they are generally deployed in unstructured environments without GPS. Another current direction of research is to employ Unscented Kalman Filtering (UKF) to improve the state estimation by exploiting odometry measurements.

REFERENCES

[1] R. Kurazume, S. Nagata, and S. Hirose, Cooperative positioning with multiple robots, in ICRA, vol. 2, 1994.
[2] I. Rekleitis, G. Dudek, and E. Milios, Multi-robot exploration of an unknown environment, efficiently reducing the odometry error, in IJCAI, vol. 2, 1997.
[3] S. Roumeliotis and G. Bekey, Collective localization: A distributed Kalman filter approach to localization of groups of mobile robots, in ICRA, 2000.
[4] D. Fox, W. Burgard, H. Kruppa, and S. Thrun, Collaborative multi-robot localization, in Proc. of the 23rd Annual German Conf. on Artificial Intelligence (KI), 1999.
[5] S. I. Roumeliotis and I. M. Rekleitis, Propagation of uncertainty in cooperative multirobot localization: Analysis and experimental results, Autonomous Robots, vol. 17, no. 1.
[6] K. Y. K. Leung, T. D. Barfoot, and H. H. T. Liu, Decentralized localization of sparsely-communicating robot networks: A centralized-equivalent approach, IEEE T-RO, vol. 26.
[7] I. M. Rekleitis, A particle filter tutorial for mobile robot localization, Centre for Intelligent Machines, McGill University, Tech. Rep. TR-CIM-04-02.
[8] R. Kurazume and S. Hirose, An experimental study of a cooperative positioning system, Autonomous Robots, vol. 8.
[9] G. Dudek, M. Jenkin, E. Milios, and D. Wilkes, A taxonomy for multi-agent robotics, Autonomous Robots, vol. 3.
[10] I. M. Rekleitis, G. Dudek, and E. E. Milios, On multiagent exploration, in Proceedings of Vision Interface 1998, Vancouver, Canada, June 1998.
[11] I. Rekleitis, G. Dudek, and E. Milios, Multi-robot collaboration for robust exploration, Annals of Mathematics and Artificial Intelligence, vol. 31, no. 1-4, pp. 7-40.
[12] D. Fox, W. Burgard, H. Kruppa, and S. Thrun, A probabilistic approach to collaborative multi-robot localization, Autonomous Robots, vol. 8, no. 3.
[13] R. Grabowski, L. E. Navarro-Serment, C. J. J. Paredis, and P. Khosla, Heterogeneous teams of modular robots for mapping and exploration, Autonomous Robots, vol. 8, no. 3.
[14] A. Sanderson, A distributed algorithm for cooperative navigation among multiple mobile robots, Advanced Robotics, vol. 12, no. 4.
[15] K. Kato, H. Ishiguro, and M. Barth, Identifying and localizing robots in a multi-robot system environment, in IROS, 1999.
[16] D. W. Gage, Minimum resource distributed navigation and mapping, in Proc. of SPIE Mobile Robots XV, vol. 4195, no. 12.
[17] A. Davison and N. Kita, Active visual localisation for cooperating inspection robots, in IROS, vol. 3, 2000.
[18] W. Burgard, D. Fox, M. Moors, R. Simmons, and S. Thrun, Collaborative multi-robot exploration, in ICRA, 2000.
[19] C. Taylor and J. Spletzer, A bounded uncertainty approach to multi-robot localization, in IROS, Oct. 2007.
[20] X. S. Zhou and S. I. Roumeliotis, Robot-to-robot relative pose estimation from range measurements, IEEE T-RO, vol. 24, no. 6.
[21] N. Trawny and S. I. Roumeliotis, On the global optimum of planar, range-based robot-to-robot relative pose estimation, in ICRA, Anchorage, AK, 2010.
[22] N. Trawny, X. S. Zhou, K. Zhou, and S. I. Roumeliotis, Inter-robot transformations in 3-D, IEEE T-RO, vol. 26, no. 2.
[23] S. Roumeliotis and G. Bekey, Distributed multirobot localization, IEEE Transactions on Robotics and Automation, vol. 18, no. 5.
[24] A. I. Mourikis and S. I. Roumeliotis, Performance analysis of multirobot cooperative localization, IEEE T-RO, vol. 22, no. 4.
[25] R. I. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, 2nd ed., Cambridge University Press.
[26] J. Ruiz-Alzola, C. Alberola-Lopez, and J. R. C. Corredera, Model-based stereo-visual tracking: Covariance analysis and tracking schemes, Signal Processing.
[27] D. St-Onge, N. Reeves, and C. Gosselin, [voiles sails]: A modular architecture for a fast parallel development in an international multidisciplinary project, in 15th ICAR, 2011.
[28] G. Dudek et al., A visually guided swimming robot, in IROS.


Design of Controller for Crawling to Sitting Behavior of Infants Design of Controller for Crawling to Sitting Behavior of Infants A Report submitte for the Semester Project To be accepte on: 29 June 2007 by Neha Priyaarshini Garg Supervisors: Luovic Righetti Prof. Auke

More information

Lesson 11 Interference of Light

Lesson 11 Interference of Light Physics 30 Lesson 11 Interference of Light I. Light Wave or Particle? The fact that light carries energy is obvious to anyone who has focuse the sun's rays with a magnifying glass on a piece of paper an

More information

Visual Bearing-Only Simultaneous Localization and Mapping with Improved Feature Matching

Visual Bearing-Only Simultaneous Localization and Mapping with Improved Feature Matching Visual Bearing-Only Simultaneous Localization and Mapping with Improved Feature Matching Hauke Strasdat, Cyrill Stachniss, Maren Bennewitz, and Wolfram Burgard Computer Science Institute, University of

More information

Holy Halved Heaquarters Riddler

Holy Halved Heaquarters Riddler Holy Halve Heaquarters Riler Anonymous Philosopher June 206 Laser Larry threatens to imminently zap Riler Heaquarters (which is of regular pentagonal shape with no courtyar or other funny business) with

More information

The Reconstruction of Graphs. Dhananjay P. Mehendale Sir Parashurambhau College, Tilak Road, Pune , India. Abstract

The Reconstruction of Graphs. Dhananjay P. Mehendale Sir Parashurambhau College, Tilak Road, Pune , India. Abstract The Reconstruction of Graphs Dhananay P. Mehenale Sir Parashurambhau College, Tila Roa, Pune-4030, Inia. Abstract In this paper we iscuss reconstruction problems for graphs. We evelop some new ieas lie

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the worl s leaing publisher of Open Access books Built by scientists, for scientists 3,800 116,000 120M Open access books available International authors an eitors Downloas Our authors

More information

An Investigation in the Use of Vehicle Reidentification for Deriving Travel Time and Travel Time Distributions

An Investigation in the Use of Vehicle Reidentification for Deriving Travel Time and Travel Time Distributions An Investigation in the Use of Vehicle Reientification for Deriving Travel Time an Travel Time Distributions Carlos Sun Department of Civil an Environmental Engineering, University of Missouri-Columbia,

More information

WATER MIST SPRAY MODELLING WITH FDS

WATER MIST SPRAY MODELLING WITH FDS PROCEEDINGS, Fire an Evacuation Moeling Technical Conference 0 Baltimore, Marylan, August 5-6, 0 WATER MIST SPRAY MODELLING WITH FDS Simo Hostikka, Jukka Vaari, Topi Sikanen, Antti Paajanen VTT Technical

More information

EXTRACTION OF MAIN AND SECONDARY ROADS IN VHR IMAGES USING A HIGHER-ORDER PHASE FIELD MODEL

EXTRACTION OF MAIN AND SECONDARY ROADS IN VHR IMAGES USING A HIGHER-ORDER PHASE FIELD MODEL EXTRACTION OF MAIN AND SECONDARY ROADS IN VHR IMAGES USING A HIGHER-ORDER PHASE FIELD MODEL Ting Peng a,b, Ian H. Jermyn b, Véronique Prinet a, Josiane Zerubia b a LIAMA & PR, Institute of Automation,

More information

A half-scan error reduction based algorithm for cone-beam CT

A half-scan error reduction based algorithm for cone-beam CT Journal of X-Ray Science an Technology 12 (2004) 73 82 73 IOS Press A half-scan error reuction base algorithm for cone-beam CT Kai Zeng a, Zhiqiang Chen a, Li Zhang a an Ge Wang a,b a Department of Engineering

More information

A multiple wavelength unwrapping algorithm for digital fringe profilometry based on spatial shift estimation

A multiple wavelength unwrapping algorithm for digital fringe profilometry based on spatial shift estimation University of Wollongong Research Online Faculty of Engineering an Information Sciences - Papers: Part A Faculty of Engineering an Information Sciences 214 A multiple wavelength unwrapping algorithm for

More information

Disjoint Multipath Routing in Dual Homing Networks using Colored Trees

Disjoint Multipath Routing in Dual Homing Networks using Colored Trees Disjoint Multipath Routing in Dual Homing Networks using Colore Trees Preetha Thulasiraman, Srinivasan Ramasubramanian, an Marwan Krunz Department of Electrical an Computer Engineering University of Arizona,

More information

SLAM in Dynamic Environments via ML-RANSAC

SLAM in Dynamic Environments via ML-RANSAC *Manuscript Clic here to view line References SLAM in Dynamic Environments via ML-RANSAC Masou S. Bahraini 1,2, Mohamma Bozorg 2, Ahma B. Ra 1* 1 School of Mechatronic Systems Engineering, Simon Fraser

More information

Bayesian localization microscopy reveals nanoscale podosome dynamics

Bayesian localization microscopy reveals nanoscale podosome dynamics Nature Methos Bayesian localization microscopy reveals nanoscale poosome ynamics Susan Cox, Ewar Rosten, James Monypenny, Tijana Jovanovic-Talisman, Dylan T Burnette, Jennifer Lippincott-Schwartz, Gareth

More information

A shortest path algorithm in multimodal networks: a case study with time varying costs

A shortest path algorithm in multimodal networks: a case study with time varying costs A shortest path algorithm in multimoal networks: a case stuy with time varying costs Daniela Ambrosino*, Anna Sciomachen* * Department of Economics an Quantitative Methos (DIEM), University of Genoa Via

More information

Robust PIM-SM Multicasting using Anycast RP in Wireless Ad Hoc Networks

Robust PIM-SM Multicasting using Anycast RP in Wireless Ad Hoc Networks Robust PIM-SM Multicasting using Anycast RP in Wireless A Hoc Networks Jaewon Kang, John Sucec, Vikram Kaul, Sunil Samtani an Mariusz A. Fecko Applie Research, Telcoria Technologies One Telcoria Drive,

More information

On the Role of Multiply Sectioned Bayesian Networks to Cooperative Multiagent Systems

On the Role of Multiply Sectioned Bayesian Networks to Cooperative Multiagent Systems On the Role of Multiply Sectione Bayesian Networks to Cooperative Multiagent Systems Y. Xiang University of Guelph, Canaa, yxiang@cis.uoguelph.ca V. Lesser University of Massachusetts at Amherst, USA,

More information

Exercises of PIV. incomplete draft, version 0.0. October 2009

Exercises of PIV. incomplete draft, version 0.0. October 2009 Exercises of PIV incomplete raft, version 0.0 October 2009 1 Images Images are signals efine in 2D or 3D omains. They can be vector value (e.g., color images), real (monocromatic images), complex or binary

More information

Parts Assembly by Throwing Manipulation with a One-Joint Arm

Parts Assembly by Throwing Manipulation with a One-Joint Arm 21 IEEE/RSJ International Conference on Intelligent Robots an Systems, Taipei, Taiwan, October, 21. Parts Assembly by Throwing Manipulation with a One-Joint Arm Hieyuki Miyashita, Tasuku Yamawaki an Masahito

More information

Estimation of large-amplitude motion and disparity fields: Application to intermediate view reconstruction

Estimation of large-amplitude motion and disparity fields: Application to intermediate view reconstruction c 2000 SPIE. Personal use of this material is permitte. However, permission to reprint/republish this material for avertising or promotional purposes or for creating new collective works for resale or

More information