StrokeTrack: Wireless Inertial Motion Tracking of Human Arms for Stroke Telerehabilitation
Jihyoung Kim, Sungwon Yang, Mario Gerla
Department of Computer Science, UCLA

Abstract

Stroke is a leading cause of disability in the United States, and yet little technology is currently available for individuals with stroke to practice rehabilitation therapy in their homes. This paper presents StrokeTrack, an efficient, wearable upper limb motion tracking method for stroke rehabilitation therapy at home. StrokeTrack consists of two inertial measurement units (IMUs) that are placed on the wrist and the elbow. Each IMU consists of a 3-axis accelerometer and a 3-axis gyroscope. In order to track the motion of the upper limb, StrokeTrack estimates the position of the forearm and upper arm using inertial tracking algorithms and kinematic models. In the next step, StrokeTrack corrects the joint positions for the inherent integration drift and updates them. Finally, dynamic time warping (DTW) is adopted in order to check the accuracy of the patient's motions by matching them to reference motions.

Keywords: Biomedical measurements, Telerehabilitation, ZigBee, Complementary filter, Dynamic time warping

1 Introduction

Stroke is the leading cause of serious, long-term disability in the United States. Each year, about 795,000 people suffer a stroke. About 600,000 of these are first attacks, and 185,000 are recurrent attacks [2]. More than 75% of these people require multi-disciplinary assessments and appropriate rehabilitative treatment after they are discharged from the hospital [5]. This places a large demand on community healthcare services, which often have quite limited therapy resources. As a result, there is considerable interest in training aids or intelligent systems that can support rehabilitation as complementary tools at the patients' homes rather than at the hospital.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

One of the goals of stroke rehabilitation is to help stroke patients return to normal life as much as possible by regaining the highest possible level of motor control. Motor learning is a set of processes, associated with practice or experience, that leads to relatively permanent changes in the capability for movement. Since stroke patients lose motor control in their limbs, they need to go through a process of recovering motor control: motor relearning. Motor relearning forces the patient to use the affected limb by restraining the unaffected one. As a result, patients engaging in repetitive exercises with the affected limb recover their capability for movement [1]. Recently, researchers have proposed several solutions that enable home rehabilitation therapy for stroke patients. Due to the characteristics of home rehabilitation therapy, these systems must be constrained in cost, system resources (including battery), and interface hardware. This paper presents StrokeTrack to address these challenges, focusing on tracking the motion of the upper limb. The goal of this paper is to propose an efficient upper limb motion tracking solution for post-stroke home rehabilitation therapy. StrokeTrack uses two inertial measurement units (IMUs), each consisting of a 3-axis accelerometer and a 3-axis gyroscope. The IMUs collect orientation and acceleration measurements of the two body segments, the forearm and the upper arm, respectively, and send the values to a PC via ZigBee (IEEE 802.15.4).
The orientation and position of the forearm and upper arm can be estimated by integrating the gyroscope data and double-integrating the accelerometer data, respectively. However, due to the inherent integration drift, these uncorrected estimates are only accurate for a few seconds [8]. Unlike other motion tracking algorithms that use Kalman filters [10] or extended Kalman filters [16], StrokeTrack uses an efficient complementary filter [11] to reduce the integration drift and update the orientation and position of each segment. Since IMUs are powered by small batteries, it is important to implement low-complexity algorithms in order to reduce power consumption and processing latency. StrokeTrack provides instant matching of the motion performed by the patient against a reference motion, which indicates the accuracy of the performed motion. StrokeTrack stores certain motions, or gestures, performed by healthy adults in its motion library. The positions of the forearm and upper arm for each motion are stored in angular degrees. When
a patient is performing a movement, StrokeTrack compares the positions of the forearm and upper arm of the motions in the library with the motions made by the patient using dynamic time warping (DTW) [12]. In summary, this paper makes the following contributions to the field:

- It presents StrokeTrack, an efficient upper limb motion detection solution based on inertial measurement units (IMUs), each consisting of a 3-axis accelerometer and a 3-axis gyroscope.
- It proposes an accurate method of tracking the position of the forearm and upper arm based on kinematic models.
- It provides an efficient kinematic correction filter model to correct the position and orientation of the forearm and upper arm.
- It shows that efficient and accurate motion matching is feasible via dynamic time warping (DTW).

2 Algorithm Design

In this section, the key technical components of StrokeTrack are presented: inertial tracking, the complementary filter, and dynamic time warping. Since this paper focuses on tracking the movement of the upper limb, only the positions of the wrist (midpoint between the radial and ulnar styloid processes), elbow (lying anterior to the olecranon process), and shoulder (center of the humeral head) joints are considered. Since the sensor signals and the biomechanical model can be described in a stochastic manner, they can be incorporated in a sensor fusion scheme with a prediction and a correction step. In the prediction step, all sensor signals are processed using an inertial tracking algorithm (Section 2.1). This is followed by the prediction of the segment kinematics using a known sensor-to-body alignment and a biomechanical model of the body (Section 2.2). Over time, integration of inertial sensor data leads to drift errors due to the presence of sensor noise, sensor signal offset, or sensor orientation errors. To correct the estimated quantities, such as orientation, velocity, and position, the sensor fusion scheme updates the estimates continuously.
The correction step includes updates based on biomechanical characteristics of the human body, especially the joints (Section 2.3). The estimated kinematics are fed back to the inertial tracking algorithm and the segment kinematics step to be used in the next time frame. After a certain motion is performed by the patient, it can be compared with the reference motion to check its accuracy via DTW. For motion recognition, StrokeTrack leverages a motion library that stores certain motions performed by healthy adults (Section 2.4).

2.1 Inertial Tracking

In order to estimate the position of an arm in a world (global) coordinate system, the inertial measurements from the sensor coordinate system must be transformed into the world coordinate system. Orientation estimation of the arm joint positions is typically performed by fusing estimates from two separate estimation methods: rate gyroscope integration and vector observation. A rate gyroscope measures the angular velocity $\omega$, and integrating the angular velocity over time provides the change in angle (or orientation) with respect to an initially known angle:

$${}^{wb}\dot{q}_t = \tfrac{1}{2}\, {}^{wb}q_t \otimes \Omega_t \quad (1)$$

where ${}^{wb}q_t$ is the quaternion describing the rotation from the sensor body frame (b) to the world (global) frame (w) at time t, $\Omega_t = (0, \omega_x, \omega_y, \omega_z)^T$ is the quaternion representation of the angular velocity $\omega_t$, and $\otimes$ is the quaternion multiplication operator. After each update, the estimated quaternion should be renormalized to minimize the effects of rounding errors in limited-precision implementations. The integration process has two significant disadvantages. Firstly, any bias in $\Omega_t$ results in an increasing cumulative error in the estimated orientation. Secondly, the initial orientation of the device must be known. Vector observation provides an estimate of the orientation relative to a fixed world coordinate frame.
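As a concrete illustration, the gyroscope integration step of equation (1), including the renormalization described above, can be sketched in Python. This is a minimal sketch using a simple Euler step; the function names are our own, not from the paper:

```python
import math

def quat_mult(p, q):
    # Hamilton product of two quaternions given as (w, x, y, z) tuples
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def integrate_gyro(q, omega, dt):
    """One Euler step of equation (1): q_dot = 0.5 * q (x) (0, wx, wy, wz)."""
    q_dot = quat_mult(q, (0.0,) + tuple(omega))
    q_new = tuple(qi + 0.5 * qd * dt for qi, qd in zip(q, q_dot))
    norm = math.sqrt(sum(c * c for c in q_new))  # renormalize after each update
    return tuple(c / norm for c in q_new)
```

Integrating a constant angular velocity of 1 rad/s about the z-axis for one second in 1 ms steps yields a quaternion close to (cos 0.5, 0, 0, sin 0.5), i.e., a 1 rad rotation about z.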
By measuring the positions of two or more vectors in the local coordinate frame of a device, and comparing these with the known positions of the vectors in the fixed coordinate frame, the rotation between the two frames can be calculated. Consider a rigid body moving in the earth frame. Let w denote the world frame and b the sensor body frame. $R^w_b$, a 3×3 matrix, describes the orientation transformation from the b-frame to the w-frame:

$$v^w = R^w_b v^b \quad (2)$$

where $v^w$ and $v^b$ denote the linear velocity vectors of the sensor in the w-frame and b-frame, respectively. The time derivative of the transformation matrix $R^w_b$ is

$$\dot{R}^w_b = R^w_b S(\omega^b) \quad (3)$$

where $S(\omega^b) \equiv [\omega^b]_\times$ is the skew-symmetric matrix formed from the cross-product operation of the angular velocity estimate $\omega^b$ [7]. The matrix cross-product operator is given by

$$[\omega^b]_\times = \begin{pmatrix} 0 & -\omega^b_z & \omega^b_y \\ \omega^b_z & 0 & -\omega^b_x \\ -\omega^b_y & \omega^b_x & 0 \end{pmatrix} \quad (4)$$

For the IMU, the reference vectors $v^w$ are the set of observed vectors in the world frame, and the direction of the acceleration due to gravity is used as the reference. Vector observation has the advantage of providing an absolute estimate of orientation. However, as observations of the acceleration due to gravity are corrupted by acceleration due to subject movement, it suffers from high-frequency noise. Linear accelerometers measure the vector of acceleration a and gravitational acceleration g in sensor coordinates. The sensor signals can be expressed in the world reference frame if the orientation of the sensor ${}^{wb}q_t$ is known:

$${}^w a_t - {}^w g = {}^{wb}q_t \otimes ({}^b a_t - {}^b g) \otimes {}^{wb}q_t^* \quad (5)$$
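The rotation of equation (5) into the world frame, followed by the double integration of the gravity-compensated acceleration described in the introduction, can be sketched as follows. This is a minimal sketch under assumed conventions (Hamilton quaternions, plain Euler integration, gravity along world z); the names are our own:

```python
def quat_mult(p, q):
    # Hamilton product of (w, x, y, z) quaternions
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def rotate_to_world(q, v):
    # q (x) (0, v) (x) q*: express a body-frame vector in the world frame
    qc = (q[0], -q[1], -q[2], -q[3])
    return quat_mult(quat_mult(q, (0.0,) + tuple(v)), qc)[1:]

def track_position(q_seq, acc_seq, dt, g=(0.0, 0.0, 9.81)):
    """Rotate each accelerometer sample to the world frame, remove gravity,
    and double-integrate (Euler) to obtain the world-frame position."""
    vel, pos = [0.0] * 3, [0.0] * 3
    for q, a_body in zip(q_seq, acc_seq):
        a_world = rotate_to_world(q, a_body)
        for i in range(3):
            vel[i] += (a_world[i] - g[i]) * dt  # free acceleration -> velocity
            pos[i] += vel[i] * dt               # velocity -> position
    return tuple(pos)
```

With a level, stationary orientation and a steady 1 m/s² of extra acceleration along x, one second of samples integrates to roughly 0.5 m of x displacement, as expected from p = ½at².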
Figure 1. Definition of segment axes and determination of segment lengths. Figure 2. Relation of a segment with the global frame. Figure 3. Relation of two connecting segments at t = 0.

where $^*$ denotes the complex conjugate of the quaternion and $\otimes$ is the quaternion multiplication operator. After removing the gravity component, the acceleration $a_t$ can be integrated once to the velocity $v_t$ and twice to the position $p_t$, all in the world frame:

$${}^w p_t = \iint {}^w a_t \, dt \, dt \quad (6)$$

2.2 Segment Kinematics

The inertial tracking kinematics are translated to body segment kinematics using a biomechanical model that assumes a subject's body consists of body segments linked by joints, with the sensors attached to the segments. Joint origins are determined by the anatomical frame and are defined at the center of the functional axes, with the directions of the axes related to functional movements (Figure 1). For example, flexion/extension of the elbow is described by the rotation of the forearm with respect to the upper arm about one body-fixed axis, abduction/adduction by the rotation about a second, and endo/exo rotation by the rotation about the third. Assuming a rigid segment, when the position $p_{U0}$ of the joint origin, the orientation ${}^{wb}q_U$, and the length $s_U$ of segment U are known, the position of the point $p_{U1}$ in the global frame can be calculated (Figure 2):

$${}^w p_{U1} = {}^w p_{U0} + {}^{wb}q_U \otimes {}^b s_U \otimes {}^{wb}q_U^* \quad (7)$$

At t = 0, the origin of segment L, point $p_{L0}$, is connected to point $p_{U1}$ of segment U (Figure 3).

2.3 Complementary Filter

After each inertial and segment kinematic prediction step, the uncertainty of the joint position and rotation grows due to sensor noise and movement-related errors (Figure 4).
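The segment chain of equation (7), from shoulder to elbow to wrist, can be sketched as below. The segment lengths and orientations are illustrative assumptions, not values from the paper:

```python
import math

def quat_mult(p, q):
    # Hamilton product of (w, x, y, z) quaternions
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def distal_joint(p_origin, q_segment, s_body):
    """Equation (7): p1 = p0 + q (x) s (x) q*, with s the segment vector
    expressed in body coordinates."""
    qc = (q_segment[0], -q_segment[1], -q_segment[2], -q_segment[3])
    rotated = quat_mult(quat_mult(q_segment, (0.0,) + tuple(s_body)), qc)[1:]
    return tuple(p + r for p, r in zip(p_origin, rotated))

# Illustrative chain: identity upper-arm orientation, forearm flexed 90 deg about z.
q_upper = (1.0, 0.0, 0.0, 0.0)
q_fore = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
shoulder = (0.0, 0.0, 0.0)
elbow = distal_joint(shoulder, q_upper, (0.30, 0.0, 0.0))  # 30 cm upper arm (assumed)
wrist = distal_joint(elbow, q_fore, (0.25, 0.0, 0.0))      # 25 cm forearm (assumed)
```

Chaining the two calls realizes the constraint that, at t = 0, the forearm's origin coincides with the upper arm's distal end.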
In order to provide accurate orientation estimates that preserve the high-frequency response of rate gyroscope integration and the absolute estimate provided by vector observation, data fusion must be performed. Complementary filters provide a means to fuse multiple independent noisy measurements of the same signal that have complementary spectral characteristics [3]. For example, consider two measurements $y_1 = x + \mu_1$ and $y_2 = x + \mu_2$ of a signal x, where $\mu_1$ is predominantly high-frequency noise and $\mu_2$ is predominantly a low-frequency disturbance. Choosing a pair of complementary transfer functions $F_1(s) + F_2(s) = 1$ with $F_1(s)$ low-pass and $F_2(s)$ high-pass, the filter estimate is given by:

$$\hat{X}(s) = F_1(s)Y_1(s) + F_2(s)Y_2(s) = X(s) + F_1(s)\mu_1(s) + F_2(s)\mu_2(s) \quad (8)$$

The signal X(s) is all-pass in the filter output, while the noise components are low- and high-pass filtered as desired. Complementary filters are particularly well suited to fusing low-bandwidth position measurements with high-bandwidth rate measurements in first-order kinematic systems. Recall that the rate gyroscope integration method suffers from low-frequency drift, while the vector observation method suffers from high-frequency movement errors. The complementary filter equation is defined as

$$\hat{q} = \begin{cases} \tilde{q}_t + \frac{1}{k}\,(q_t - \tilde{q}_t) & \text{if } \bigl|\,\lVert a \rVert - 1\,\bigr| < a_T \\ \tilde{q}_t & \text{otherwise} \end{cases} \quad (9)$$

where $\tilde{q}$ and $q$ are the rate gyroscope and vector observation estimates, respectively, k is a filter coefficient that controls the blending of the two estimates, and $a_T$ is a threshold for optionally gating the vector observation during linear accelerations [15]. The high-frequency response is maintained by the first term of equation (9), while low-frequency drift correction is provided by the second term. The filter coefficient k controls the rate at which drift correction is performed: the smaller k, the faster the drift correction.
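The piecewise update of equation (9), including the acceleration-magnitude gate, can be sketched as below. This is a minimal per-component sketch; renormalization of the blended quaternion is omitted for brevity, and the function name is our own:

```python
def complementary_update(q_gyro, q_obs, accel_mag_g, k=115.0, a_T=0.1):
    """Equation (9): blend the gyro-integrated estimate toward the vector
    observation by a factor 1/k, but only when |norm(a) - 1g| < a_T (gating)."""
    if abs(accel_mag_g - 1.0) < a_T:
        return tuple(qg + (qo - qg) / k for qg, qo in zip(q_gyro, q_obs))
    return q_gyro  # significant linear acceleration: skip the correction
```

The defaults k = 115 and a_T = 0.1g mirror the experimental settings reported later in Section 3.2; during a sample with significant linear acceleration the gyro estimate is passed through unchanged.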
However, more movement-induced noise is passed through the filter if k is small, while larger values result in less high-frequency noise but slower drift correction. A gating process is used to reduce movement noise further: drift correction is only performed when the magnitude of the measured acceleration vector is within a bound of 1g. By only performing corrections while the device is not experiencing significant linear accelerations, erroneous drift corrections are limited (Figure 5).

Figure 4. Integration of accelerations leads to an increased uncertainty on the joint position. Figure 5. After a joint update, the kinematics are corrected and the uncertainty is reduced.

The Kalman filter and its derivatives, such as the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF), are very commonly used in data fusion. The most expensive steps of Kalman filters are the matrix operations (multiplication, inversion). The asymptotically fastest algorithm known for matrix multiplication is the Coppersmith-Winograd algorithm [6] with a complexity of O(d^2.376), where d denotes the size of the square matrix. In the specific case of Kalman filters, this size is determined by the number of state variables estimated [14]. The complementary filter is based on digital filtering theory. The Butterworth filter, a type of low-pass filter, has a complexity proportional to the order n of the filter, i.e., O(n) [4]. The proposed filter estimates each of the variables of the state vector $\hat{q}$, which leads to a complexity of O(nd), where d denotes the number of variables of the state vector.

2.4 Dynamic Time Warping

Dynamic time warping (DTW) is a classical dynamic programming algorithm for measuring the similarity between two sequences that may vary in time or speed [12]. A time series is a list of samples taken from a signal, typically measured at successive times spaced at uniform intervals. A naive approach to calculating a matching distance between two time series would be to resample one of them and then compare the series sample by sample. The drawback of this method is that it does not produce intuitive results, as it compares samples that may not correspond well. DTW resolves this discrepancy between intuition and calculated matching distance by recovering optimal alignments between the sample points of the two time series.
The alignment is optimal in the sense that it minimizes a cumulative distance measure consisting of local distances between aligned samples. Let S[1, …, M] and T[1, …, N] denote the two time series. The DTW distance between the two time series is D(M, N), which can be calculated with dynamic programming using

$$D(i, j) = d(S_i, T_j) + \min \{\, D(i-1, j),\; D(i, j-1),\; D(i-1, j-1) \,\} \quad (10)$$

where d(·, ·) is the local distance function, which varies with the application. The matching cost and the corresponding optimal path can be calculated using equation (10). For example, the optimal path from point (1, 1) to point (i, j) can be obtained from the optimal paths from (1, 1) to the three predecessor candidates (i-1, j), (i, j-1), and (i-1, j-1). The matching cost from (1, 1) to (i, j) is therefore the local distance at (i, j) plus the smallest matching cost of the predecessor candidates. The time complexity and space complexity of DTW are both O(MN) [9]. After estimating the positions of the wrist, elbow, and shoulder joints, the positions are matched to the positions of each joint stored in the motion library. The positions with the smallest matching costs are the matches.

3 Experiments

3.1 Motion Library

Figure 6. Motions adopted from [13, 9]. The dot denotes the starting point and the arrow the end.

Figure 6 illustrates the motions that are stored in the motion library. In order to build the reference motion library, four healthy male adults performed the motions of Figure 6 with both arms, one at a time. Each man performed each movement, drawn on letter-size paper, 100 times with the left arm first and then with the right arm. Hence, each man performed a motion 200 times with both arms and all the motions 1600 times in total. The diameter of motions 7 and 8, the length of motions 3, 4, 5, and 6, and the side of motions 1 and 2 were each fixed across trials.
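The recurrence of equation (10) translates directly into an O(MN) dynamic program. Below is a minimal sketch with an absolute-difference local distance; the paper leaves the local distance application-dependent, so that choice is an assumption:

```python
def dtw_distance(S, T, d=lambda a, b: abs(a - b)):
    """DTW distance D(M, N) via equation (10):
    D(i, j) = d(S_i, T_j) + min(D(i-1, j), D(i, j-1), D(i-1, j-1))."""
    M, N = len(S), len(T)
    INF = float("inf")
    D = [[INF] * (N + 1) for _ in range(M + 1)]
    D[0][0] = 0.0
    for i in range(1, M + 1):
        for j in range(1, N + 1):
            D[i][j] = d(S[i - 1], T[j - 1]) + min(D[i - 1][j],
                                                  D[i][j - 1],
                                                  D[i - 1][j - 1])
    return D[M][N]
```

For example, dtw_distance([1, 2, 3], [1, 2, 2, 3]) is 0: the repeated sample is absorbed by the warping, whereas a sample-by-sample comparison after resampling would report a mismatch.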
3.2 Experimental Environment Setup

In this section, the performance of the proposed biomechanical model and the complementary filter is evaluated by comparing it to that of the Kalman filter-based motion tracking algorithm of [10]. The motion tracking algorithm proposed by H. J. Luinge et al. achieves accurate inclination estimates within 3° RMS (root mean square), and the gyroscope offset error is roughly 5% of the initial offset error. H. J. Luinge et al. obtained these results by comparing the performance of their algorithm with a laboratory-bound 3D human motion tracking system, Vicon [10].

Figures 7-9. x-, y-, and z-coordinates (cm) of the estimated wrist joint position of each algorithm. Figures 10-12. x-, y-, and z-coordinates (cm) of the estimated elbow joint position of each algorithm.

Due to the requirement of real-time operation, smoothing the estimates by removing spikes, jumps, etc., is a task beyond the scope of this paper. Instead, the focus is on maximally reducing the uncertainty of the joint position and rotation, so that the final estimate approximates that of the algorithm proposed by H. J. Luinge et al. [10]. In the experiments, the following are considered: the accuracy of motion estimation and the accuracy of motion matching. Since the efficiency of the complementary filter over the Kalman filter is shown in Section 2.3, the runtime performance of the two filter models is not evaluated further. An experimental environment needs to be set up before the evaluations start. Using Velcro straps, one of the IMUs is attached to the upper arm 5 cm above the elbow joint, and the other is tied to the forearm 3 cm above the wrist joint. Both IMUs face outward from the arm. In order to calibrate the IMUs, the forearm should be aligned parallel to the table on which the paper with the drawn motion is placed, regardless of the distance between the forearm and the table. By repeating the aforementioned process, the IMUs initialize the orientation of the forearm and upper arm.
The subject performs each motion illustrated in Figure 6 ten times, producing a confusion matrix each time. The confusion matrices of each subject are averaged, the averaged confusion matrices of all subjects are averaged again, and finally a single confusion matrix is created for each motion. For the complementary filter settings, the filter coefficient k is set to 115, and the accelerometer gating value $a_T$ is set to 0.1g to reduce gross movement noise.

3.3 Experimental Results

Accuracy of motion estimation. In this section, the accuracy of StrokeTrack's motion estimation algorithm is evaluated by comparing the x, y, z coordinates of the estimated positions of the wrist, elbow, and shoulder with those of the algorithm proposed in [10]. The x, y, z coordinates are converted into cm, and the signal matching is calculated by the following equation:

$$Diff(t) = Coord_1(t) - Coord_2(t) \quad (11)$$

where Diff(t) denotes the difference between the x, y, z coordinates of the two algorithms at each sample, while $Coord_1(t)$ and $Coord_2(t)$ denote the x, y, z coordinates estimated by StrokeTrack and by the algorithm of [10] at each sample, respectively. While performing motion no. 7 of Figure 6, the x, y, z coordinates of both algorithms match closely (Figures 7-12). According to the experimental results, the averaged signal matching percentage between StrokeTrack and the algorithm proposed in [10] is more than 88% for each motion in Figure 6.
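Equation (11), applied per axis and per sample, can be sketched as below. The paper does not specify how the per-sample differences are summarized into a matching percentage, so the threshold-based summary here is only an assumed illustration (the tolerance value is hypothetical):

```python
def coordinate_diff(coords1, coords2):
    """Equation (11): per-sample difference (cm) between two coordinate traces."""
    return [abs(c1 - c2) for c1, c2 in zip(coords1, coords2)]

def matching_percentage(coords1, coords2, tol_cm=1.0):
    """Assumed summary: percentage of samples whose difference is within tol_cm."""
    diffs = coordinate_diff(coords1, coords2)
    return 100.0 * sum(1 for d in diffs if d <= tol_cm) / len(diffs)
```
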
Figure 13. Signal matching (%) of each motion in Figure 6 between the performed motion and the reference motion, for the right and left arms.

Accuracy of motion matching. In this section, the accuracy of motion matching is evaluated by matching the motion performed by the subject to the reference motions stored in the motion library. The reference motion library contains the estimated x, y, z coordinates of each motion performed by the four subjects. To check the accuracy of motion matching, the subject performed all the motions with both arms. As a subject performs a motion of Figure 6, the DTW algorithm matches the x, y, z coordinates against each motion in Figure 6, and when it finds the best match, it tells the subject which motion was performed and how accurately. In Figure 13, the matching percentage between each reference motion and each performed motion is at least 88.5%. However, the motion matching percentages of the right arm were higher than those of the left arm. This may depend on whether the subject is right-handed or left-handed: all the subjects who participated in the experiments were right-handed, and they may be more familiar with using their right arms than their left arms.

4 Conclusion

An upper limb motion tracking system with two commercial wireless inertial sensors has been presented in this paper. Unlike other motion tracking systems, StrokeTrack is fully wearable and does not need external cameras, emitters, or markers. Thus, it can be used outdoors as well as indoors. There are no restrictions due to a lack of lighting and no problems of occlusion or missing markers, as exist in most vision-based systems. StrokeTrack is cost-efficient due to novel sensor fusion algorithms that relax the need to add more sensors to reduce sample imprecision (e.g., in measuring joint position and rotation). StrokeTrack proposes a novel biomechanical model and sensor fusion algorithms to accurately track the motion of the stroke patient.
It also provides an efficient way of checking the accuracy of the motion performed by patients via DTW. Preliminary results have been very promising, indicating that the proposed system can be integrated into a home-based rehabilitation system to report the upper limb motions of a patient. Since StrokeTrack uses a low-complexity complementary filter, our next step is to implement StrokeTrack on a smartphone platform.

5 References

[1] A Rehab Revolution. Stroke Connection Magazine, 2004.
[2] American Heart Association. Heart disease and stroke statistics, 2009 update.
[3] E. R. Bachmann, I. Duman, U. Y. Usta, R. B. McGhee, X. P. Yun, and M. J. Zyda. Orientation tracking for humans and robots using inertial sensors. In IEEE International Symposium on Computational Intelligence in Robotics and Automation, 1999.
[4] S. Butterworth. On the theory of filter amplifiers. Wireless Engineer, volume 7, 1930.
[5] J. Cauraugh and S. Kim. Two coupled motor recovery protocols are better than one: electromyogram-triggered neuromuscular stimulation and bilateral movements. Stroke, volume 33, 2002.
[6] D. Coppersmith and S. Winograd. Matrix multiplication via arithmetic progressions. In Proceedings of the Nineteenth Annual ACM Symposium on Theory of Computing, pages 1-6, 1987.
[7] J. J. Craig. Introduction to Robotics: Mechanics and Control. Prentice Hall, 3rd edition, 2004.
[8] D. Giansanti, V. Macellari, and G. Maccioni. The development and test of a device for the reconstruction of 3-D position and orientation by means of a kinematic sensor assembly with rate gyroscopes and accelerometers. IEEE Transactions on Biomedical Engineering, volume 52, 2005.
[9] J. Liu, Z. Wang, L. Zhong, J. Wickramasuriya, and V. Vasudevan. uWave: Accelerometer-based personalized gesture recognition and its applications. In IEEE International Conference on Pervasive Computing and Communications, 2009.
[10] H. J. Luinge and P. H. Veltink. Measuring orientation of human body segments using miniature gyroscopes and accelerometers. Medical & Biological Engineering & Computing, volume 43, 2005.
[11] R. Mahony, T. Hamel, and J. M. Pflimlin. Complementary filter design on the special orthogonal group SO(3). In IEEE Conference on Decision and Control, 2005.
[12] C. S. Myers and L. R. Rabiner. A comparative study of several dynamic time-warping algorithms for connected-word recognition. The Bell System Technical Journal, volume 60, 1981.
[13] L. R. Rabiner and B. H. Juang. An introduction to hidden Markov models. IEEE ASSP Magazine, pages 4-15, January 1986.
[14] S. Thrun, W. Burgard, and D. Fox. Probabilistic Robotics: Intelligent Robotics and Autonomous Agents. MIT Press, 2005.
[15] A. D. Young. Comparison of orientation filter algorithms for realtime wireless inertial posture tracking. In Body Sensor Networks, 2009.
[16] X. Yun and E. R. Bachmann. Design, implementation and experimental results of a quaternion-based Kalman filter for human body motion tracking. IEEE Transactions on Robotics, volume 22, 2006.
More informationEstimation of Angles from Upper and Lower Limbs for Recognition of Human Gestures using Wireless Inertial Sensors
Estimation of Angles from Upper and Lower Limbs for Recognition of Human Gestures using Wireless Inertial Sensors Irvin Hussein López Nava, Angélica Muñoz-Meléndez Technical Report No. CCC-15-004 December,
More informationProgramming-By-Example Gesture Recognition Kevin Gabayan, Steven Lansel December 15, 2006
Programming-By-Example Gesture Recognition Kevin Gabayan, Steven Lansel December 15, 6 Abstract Machine learning and hardware improvements to a programming-by-example rapid prototyping system are proposed.
More informationState-space models for 3D visual tracking
SA-1 State-space models for 3D visual tracking Doz. G. Bleser Prof. Stricker Computer Vision: Object and People Tracking Example applications Head tracking, e.g. for mobile seating buck Object tracking,
More informationREAL-TIME human motion tracking has many applications
IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, VOL. 12, NO. 2, JUNE 2004 295 A Real-Time Articulated Human Motion Tracking Using Tri-Axis Inertial/Magnetic Sensors Package Rong Zhu
More informationCHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS
CHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS ökçen Aslan 1,2, Afşar Saranlı 2 1 Defence Research and Development Institute (SAE), TÜBİTAK 2 Dept. of Electrical and Electronics Eng.,
More informationGesture Recognition Aplication based on Dynamic Time Warping (DTW) FOR Omni-Wheel Mobile Robot
Gesture Recognition Aplication based on Dynamic Time Warping (DTW) FOR Omni-Wheel Mobile Robot Indra Adji Sulistijono, Gama Indra Kristianto Indra Adji Sulistijono is with the Department of Mechatronics
More informationMotion Editing with Data Glove
Motion Editing with Data Glove Wai-Chun Lam City University of Hong Kong 83 Tat Chee Ave Kowloon, Hong Kong email:jerrylam@cityu.edu.hk Feng Zou City University of Hong Kong 83 Tat Chee Ave Kowloon, Hong
More informationProceedings of the 2013 SpaceVision Conference November 7-10 th, Tempe, AZ, USA ABSTRACT
Proceedings of the 2013 SpaceVision Conference November 7-10 th, Tempe, AZ, USA Development of arm controller for robotic satellite servicing demonstrations Kristina Monakhova University at Buffalo, the
More informationMotion Capture & Simulation
Motion Capture & Simulation Motion Capture Character Reconstructions Joint Angles Need 3 points to compute a rigid body coordinate frame 1 st point gives 3D translation, 2 nd point gives 2 angles, 3 rd
More informationReal Time Motion Authoring of a 3D Avatar
Vol.46 (Games and Graphics and 2014), pp.170-174 http://dx.doi.org/10.14257/astl.2014.46.38 Real Time Motion Authoring of a 3D Avatar Harinadha Reddy Chintalapalli and Young-Ho Chai Graduate School of
More informationVirtual Interaction System Based on Optical Capture
Sensors & Transducers 203 by IFSA http://www.sensorsportal.com Virtual Interaction System Based on Optical Capture Peng CHEN, 2 Xiaoyang ZHOU, 3 Jianguang LI, Peijun WANG School of Mechanical Engineering,
More informationSatellite and Inertial Navigation and Positioning System
Satellite and Inertial Navigation and Positioning System Project Proposal By: Luke Pfister Dan Monroe Project Advisors: Dr. In Soo Ahn Dr. Yufeng Lu EE 451 Senior Capstone Project December 10, 2009 PROJECT
More informationEXPERIMENTAL COMPARISON BETWEEN MAHONEY AND COMPLEMENTARY SENSOR FUSION ALGORITHM FOR ATTITUDE DETERMINATION BY RAW SENSOR DATA OF XSENS IMU ON BUOY
EXPERIMENTAL COMPARISON BETWEEN MAHONEY AND COMPLEMENTARY SENSOR FUSION ALGORITHM FOR ATTITUDE DETERMINATION BY RAW SENSOR DATA OF XSENS IMU ON BUOY A. Jouybari a *, A. A. Ardalan a, M-H. Rezvani b a University
More informationCalibration of Inertial Measurement Units Using Pendulum Motion
Technical Paper Int l J. of Aeronautical & Space Sci. 11(3), 234 239 (2010) DOI:10.5139/IJASS.2010.11.3.234 Calibration of Inertial Measurement Units Using Pendulum Motion Keeyoung Choi* and Se-ah Jang**
More informationUpper limb motion estimation from inertial measurements
International Journal of Information Technology Vol. 13 No. 1 2007 Upper limb motion estimation from inertial measurements Huiyu Zhou 1 and Huosheng Hu 2 1 Department of Electronic Engineering, Queen Mary,
More informationComputer Animation and Visualisation. Lecture 3. Motion capture and physically-based animation of characters
Computer Animation and Visualisation Lecture 3. Motion capture and physically-based animation of characters Character Animation There are three methods Create them manually Use real human / animal motions
More informationVisualization and Analysis of Inverse Kinematics Algorithms Using Performance Metric Maps
Visualization and Analysis of Inverse Kinematics Algorithms Using Performance Metric Maps Oliver Cardwell, Ramakrishnan Mukundan Department of Computer Science and Software Engineering University of Canterbury
More informationMPU Based 6050 for 7bot Robot Arm Control Pose Algorithm
Proc. 1 st International Conference on Machine Learning and Data Engineering (icmlde2017) 20-22 Nov 2017, Sydney, Australia ISBN: 978-0-6480147-3-7 MPU Based 6050 for 7bot Robot Arm Control Pose Algorithm
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Terry Payne Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Simon Parsons Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationHuman Motion Reconstruction by Direct Control of Marker Trajectories
Human Motion Reconstruction by Direct Control of Marker Trajectories Emel Demircan, Luis Sentis, Vincent De Sapio and Oussama Khatib Artificial Intelligence Laboratory, Stanford University, Stanford, CA
More informationTracking a Mobile Robot Position Using Vision and Inertial Sensor
Tracking a Mobile Robot Position Using Vision and Inertial Sensor Francisco Coito, António Eleutério, Stanimir Valtchev, and Fernando Coito Faculdade de Ciências e Tecnologia Universidade Nova de Lisboa,
More informationHand-Eye Calibration from Image Derivatives
Hand-Eye Calibration from Image Derivatives Abstract In this paper it is shown how to perform hand-eye calibration using only the normal flow field and knowledge about the motion of the hand. The proposed
More informationVINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem
VINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem Presented by: Justin Gorgen Yen-ting Chen Hao-en Sung Haifeng Huang University of California, San Diego May 23, 2017 Original
More informationVIBRATION ISOLATION USING A MULTI-AXIS ROBOTIC PLATFORM G.
VIBRATION ISOLATION USING A MULTI-AXIS ROBOTIC PLATFORM G. Satheesh Kumar, Y. G. Srinivasa and T. Nagarajan Precision Engineering and Instrumentation Laboratory Department of Mechanical Engineering Indian
More informationJoint Angle Tracking with Inertial Sensors
Portland State University PDXScholar Dissertations and Theses Dissertations and Theses Winter 2-22-213 Joint Angle Tracking with Inertial Sensors Mahmoud Ahmed El-Gohary Portland State University Let us
More informationInertial Navigation Systems
Inertial Navigation Systems Kiril Alexiev University of Pavia March 2017 1 /89 Navigation Estimate the position and orientation. Inertial navigation one of possible instruments. Newton law is used: F =
More informationTime-Invariant Strategies in Coordination of Human Reaching
Time-Invariant Strategies in Coordination of Human Reaching Satyajit Ambike and James P. Schmiedeler Department of Mechanical Engineering, The Ohio State University, Columbus, OH 43210, U.S.A., e-mail:
More informationComputationally Efficient Visual-inertial Sensor Fusion for GPS-denied Navigation on a Small Quadrotor
Computationally Efficient Visual-inertial Sensor Fusion for GPS-denied Navigation on a Small Quadrotor Chang Liu & Stephen D. Prior Faculty of Engineering and the Environment, University of Southampton,
More informationROBOTICS AND AUTONOMOUS SYSTEMS
ROBOTICS AND AUTONOMOUS SYSTEMS Simon Parsons Department of Computer Science University of Liverpool LECTURE 6 PERCEPTION/ODOMETRY comp329-2013-parsons-lect06 2/43 Today We ll talk about perception and
More informationDevelopment of an optomechanical measurement system for dynamic stability analysis
Development of an optomechanical measurement system for dynamic stability analysis Simone Pasinetti Dept. of Information Engineering (DII) University of Brescia Brescia, Italy simone.pasinetti@unibs.it
More informationOverview. EECS 124, UC Berkeley, Spring 2008 Lecture 23: Localization and Mapping. Statistical Models
Introduction ti to Embedded dsystems EECS 124, UC Berkeley, Spring 2008 Lecture 23: Localization and Mapping Gabe Hoffmann Ph.D. Candidate, Aero/Astro Engineering Stanford University Statistical Models
More informationStable Vision-Aided Navigation for Large-Area Augmented Reality
Stable Vision-Aided Navigation for Large-Area Augmented Reality Taragay Oskiper, Han-Pang Chiu, Zhiwei Zhu Supun Samarasekera, Rakesh Teddy Kumar Vision and Robotics Laboratory SRI-International Sarnoff,
More informationAutomatic Recognition of Postoperative Shoulder Surgery Physical Therapy Exercises from Depth Camera Images
Proceedings of The National Conference On Undergraduate Research (NCUR) 2015 Eastern Washington University, Cheney, WA April 16-18, 2015 Automatic Recognition of Postoperative Shoulder Surgery Physical
More informationMonitoring Human Wrist Rotation in Three Degrees of Freedom
Monitoring Human Wrist Rotation in Three Degrees of Freedom Fatemeh Abyarjoo 1, Armando Barreto 1, Somayeh Abyarjoo 2, Francisco R. Ortega 1, Jonathan Cofino 1 1 Florida International University. Miami,
More informationPhysics 101, Lab 1: LINEAR KINEMATICS PREDICTION SHEET
Physics 101, Lab 1: LINEAR KINEMATICS PREDICTION SHEET After reading through the Introduction, Purpose and Principles sections of the lab manual (and skimming through the procedures), answer the following
More informationTracking of Human Body using Multiple Predictors
Tracking of Human Body using Multiple Predictors Rui M Jesus 1, Arnaldo J Abrantes 1, and Jorge S Marques 2 1 Instituto Superior de Engenharia de Lisboa, Postfach 351-218317001, Rua Conselheiro Emído Navarro,
More informationInertial Measurement Units II!
! Inertial Measurement Units II! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 10! stanford.edu/class/ee267/!! wikipedia! Polynesian Migration! Lecture Overview! short review of
More informationExtended Hopfield Network for Sequence Learning: Application to Gesture Recognition
Extended Hopfield Network for Sequence Learning: Application to Gesture Recognition André Maurer, Micha Hersch and Aude G. Billard Ecole Polytechnique Fédérale de Lausanne (EPFL) Swiss Federal Institute
More information/$ IEEE
IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, VOL. 54, NO. 11, NOVEMBER 2007 1927 Probabilistic Inference of Multijoint Movements, Skeletal Parameters and Marker Attachments From Diverse Motion Capture
More informationAn Overview of a Probabilistic Tracker for Multiple Cooperative Tracking Agents
An Overview of a Probabilistic Tracker for Multiple Cooperative Tracking Agents Roozbeh Mottaghi and Shahram Payandeh School of Engineering Science Faculty of Applied Sciences Simon Fraser University Burnaby,
More informationModeling of Humanoid Systems Using Deductive Approach
INFOTEH-JAHORINA Vol. 12, March 2013. Modeling of Humanoid Systems Using Deductive Approach Miloš D Jovanović Robotics laboratory Mihailo Pupin Institute Belgrade, Serbia milos.jovanovic@pupin.rs Veljko
More informationChapter 1: Introduction
Chapter 1: Introduction This dissertation will describe the mathematical modeling and development of an innovative, three degree-of-freedom robotic manipulator. The new device, which has been named the
More informationLow Cost solution for Pose Estimation of Quadrotor
Low Cost solution for Pose Estimation of Quadrotor mangal@iitk.ac.in https://www.iitk.ac.in/aero/mangal/ Intelligent Guidance and Control Laboratory Indian Institute of Technology, Kanpur Mangal Kothari
More informationBeate Klinger and Torsten Mayer-Gürr Institute of Geodesy NAWI Graz, Graz University of Technology
and Torsten Mayer-Gürr Institute of Geodesy NAWI Graz, Graz University of Technology Outline Motivation GRACE Preprocessing GRACE sensor fusion Accelerometer simulation & calibration Impact on monthly
More informationInertial Navigation Static Calibration
INTL JOURNAL OF ELECTRONICS AND TELECOMMUNICATIONS, 2018, VOL. 64, NO. 2, PP. 243 248 Manuscript received December 2, 2017; revised April, 2018. DOI: 10.24425/119518 Inertial Navigation Static Calibration
More informationJoint Angle Estimation in Rehabilitation with Inertial Sensors and its Integration with Kinect
Joint Angle Estimation in Rehabilitation with Inertial Sensors and its Integration with Kinect Antonio Bo, Mitsuhiro Hayashibe, Philippe Poignet To cite this version: Antonio Bo, Mitsuhiro Hayashibe, Philippe
More informationGlobal Journal of Engineering Science and Research Management
A REVIEW PAPER ON INERTIAL SENSOR BASED ALPHABET RECOGNITION USING CLASSIFIERS Mrs. Jahidabegum K. Shaikh *1 Prof. N.A. Dawande 2 1* E & TC department, Dr. D.Y.Patil college Of Engineering, Talegaon Ambi
More informationA novel approach to motion tracking with wearable sensors based on Probabilistic Graphical Models
214 IEEE International Conference on Robotics & Automation (ICRA) Hong Kong Convention and Exhibition Center May 31 - June 7, 214. Hong Kong, China A novel approach to motion tracking with wearable sensors
More informationINPUT PARAMETERS FOR MODELS I
9A-1 INPUT PARAMETERS FOR MODELS I Lecture Overview Equations of motion Estimation of muscle forces Required model parameters Body segment inertial parameters Muscle moment arms and length Osteometric
More informationSelf-calibration of a pair of stereo cameras in general position
Self-calibration of a pair of stereo cameras in general position Raúl Rojas Institut für Informatik Freie Universität Berlin Takustr. 9, 14195 Berlin, Germany Abstract. This paper shows that it is possible
More informationOrientation Independent Activity/Gesture Recognition Using Wearable Motion Sensors
> REPLACE THIS LINE WITH YOUR PAPER IDENTIFICATION NUMBER (DOUBLE-CLICK HERE TO EDIT) < 1 Orientation Independent Activity/Gesture Recognition Using Wearable Motion Sensors Jian Wu, Student Member, IEEE
More informationMTRX4700 Experimental Robotics
MTRX 4700 : Experimental Robotics Lecture 2 Stefan B. Williams Slide 1 Course Outline Week Date Content Labs Due Dates 1 5 Mar Introduction, history & philosophy of robotics 2 12 Mar Robot kinematics &
More informationSensor fusion for motion processing and visualization
Sensor fusion for motion processing and visualization Ali Baharev, PhD TÁMOP 4.2.2 Szenzorhálózat alapú adatgyűjtés és információfeldolgozás workshop April 1, 2011 Budapest, Hungary What we have - Shimmer
More information1. INTRODUCTION ABSTRACT
Weighted Fusion of Depth and Inertial Data to Improve View Invariance for Human Action Recognition Chen Chen a, Huiyan Hao a,b, Roozbeh Jafari c, Nasser Kehtarnavaz a a Center for Research in Computer
More informationCAMERA POSE ESTIMATION OF RGB-D SENSORS USING PARTICLE FILTERING
CAMERA POSE ESTIMATION OF RGB-D SENSORS USING PARTICLE FILTERING By Michael Lowney Senior Thesis in Electrical Engineering University of Illinois at Urbana-Champaign Advisor: Professor Minh Do May 2015
More informationReal Time Skin Deformation with Bones Blending
Real Time Skin Deformation with Bones Blending Ladislav Kavan Charles University Faculty of Mathematics and Physics Malostranske nam. 25 118 00 Prague 1, Czech Republic lkav8604@ss1000.ms.mff.cuni.cz Jiří
More informationNovel applications for miniature IMU s
Novel applications for miniature IMU s WORKSHOP GEÏNTEGREERDE NAVIGATIE vrijdag 15 december 2006, NLR Xsens Technologies B.V. Per Slycke (CTO) www.xsens.com About Xsens Based in Enschede, The Netherlands
More informationManipulator State Estimation with Low Cost Accelerometers and Gyroscopes
Manipulator State Estimation with Low Cost Accelerometers and Gyroscopes Philip Roan, Nikhil Deshpande, Yizhou Wang, and Benjamin Pitzer Abstract Robot manipulator designs are increasingly focused on low
More informationDynamic Modelling for MEMS-IMU/Magnetometer Integrated Attitude and Heading Reference System
International Global Navigation Satellite Systems Society IGNSS Symposium 211 University of New South Wales, Sydney, NSW, Australia 15 17 November, 211 Dynamic Modelling for MEMS-IMU/Magnetometer Integrated
More informationINSTITUTE OF AERONAUTICAL ENGINEERING
Name Code Class Branch Page 1 INSTITUTE OF AERONAUTICAL ENGINEERING : ROBOTICS (Autonomous) Dundigal, Hyderabad - 500 0 MECHANICAL ENGINEERING TUTORIAL QUESTION BANK : A7055 : IV B. Tech I Semester : MECHANICAL
More informationMonitoring Human Body Motion Denis Hodgins
Monitoring Human Body Motion Denis Hodgins Lead Partner: ETB Other Partners: Salford IMIT Zarlink Finetech Medical Salisbury Hospital Project Aim To develop a wireless body motion sensing system Applications:
More informationUsing SensorTag as a Low-Cost Sensor Array for AutoCAD
Using SensorTag as a Low-Cost Sensor Array for AutoCAD Kean Walmsley Autodesk SD5013 In 2012 Texas Instruments Inc. launched the SensorTag, a $25 sensor array that communicates via Bluetooth Smart (also
More informationTai Chi Motion Recognition Using Wearable Sensors and Hidden Markov Model Method
Tai Chi Motion Recognition Using Wearable Sensors and Hidden Markov Model Method Dennis Majoe 1, Lars Widmer 1, Philip Tschiemer 1 and Jürg Gutknecht 1, 1 Computer Systems Institute, ETH Zurich, Switzerland
More informationAllison L. Hall. at the. June Allison Hall All Rights reserved. Signature of Author.../... Delartment of Mechmical Engineering May 23, 2005
METHOD FOR THE ACQUISITION OF ARM MOVEMENT DATA USING ACCELEROMETERS by Allison L. Hall Submitted to the Department of Mechanical Engineering in Partial Fulfillment of the Requirements for the Degree of
More informationArm Movement Recorder
Energy Research Journal 1 (2): 126-130, 2010 ISSN 1949-0151 2010 Science Publications Arm Movement Recorder Jakkrapun Chuanasa and Szathys Songschon Department of Mechanical Engineering, King Mongkut s
More informationAccelerometer Gesture Recognition
Accelerometer Gesture Recognition Michael Xie xie@cs.stanford.edu David Pan napdivad@stanford.edu December 12, 2014 Abstract Our goal is to make gesture-based input for smartphones and smartwatches accurate
More informationProducts Catalogue 2011
Products Catalogue 2011 Software : CAPTIV L2100 CAPTIV L7000 Acquisition systems : T-Log T-USB T-Sens wireless sensor modules : T-Sens Accelerometer T-Sens Heart rate T-Sens semg T-Sens FSR T-Sens Gonio
More informationSegmentation and Tracking of Partial Planar Templates
Segmentation and Tracking of Partial Planar Templates Abdelsalam Masoud William Hoff Colorado School of Mines Colorado School of Mines Golden, CO 800 Golden, CO 800 amasoud@mines.edu whoff@mines.edu Abstract
More informationAnimation. Computer Graphics COMP 770 (236) Spring Instructor: Brandon Lloyd 4/23/07 1
Animation Computer Graphics COMP 770 (236) Spring 2007 Instructor: Brandon Lloyd 4/23/07 1 Today s Topics Interpolation Forward and inverse kinematics Rigid body simulation Fluids Particle systems Behavioral
More informationForce Modeling, Quaternion PID, and Optimization
Cornell University Autonomous Underwater Vehicle Team Spring 2014 Force Modeling, Quaternion PID, and Optimization Technical Report Alex Spitzer (aes368) May 11, 2014 Contents 1 Abstract 2 2 Previous Designs
More informationRobots are built to accomplish complex and difficult tasks that require highly non-linear motions.
Path and Trajectory specification Robots are built to accomplish complex and difficult tasks that require highly non-linear motions. Specifying the desired motion to achieve a specified goal is often a
More informationNavigational Aids 1 st Semester/2007/TF 7:30 PM -9:00 PM
Glossary of Navigation Terms accelerometer. A device that senses inertial reaction to measure linear or angular acceleration. In its simplest form, it consists of a case-mounted spring and mass arrangement
More informationJacobians. 6.1 Linearized Kinematics. Y: = k2( e6)
Jacobians 6.1 Linearized Kinematics In previous chapters we have seen how kinematics relates the joint angles to the position and orientation of the robot's endeffector. This means that, for a serial robot,
More informationExploring unconstrained mobile sensor based human activity recognition
Exploring unconstrained mobile sensor based human activity recognition ABSTRACT Luis Gerardo Mojica de la Vega luis.mojica@utdallas.edu Arvind Balasubramanian arvind@utdallas.edu Human activity recognition
More informationLive Metric 3D Reconstruction on Mobile Phones ICCV 2013
Live Metric 3D Reconstruction on Mobile Phones ICCV 2013 Main Contents 1. Target & Related Work 2. Main Features of This System 3. System Overview & Workflow 4. Detail of This System 5. Experiments 6.
More information