Group EEG (Electroencephalogram)
Anthony Hampton, Tony Nuth, Miral Patel (portions credited to Jack Shelley-Tremblay and E. Keogh)
05/09/2008
Outline: Introduction, Goal, Methodology, Results, Discussion, Conclusion
Goals: The goal of this project is to evaluate EEG data from 19 subjects using various mathematical techniques.
EEG: A recording of the electrical waves that sweep over the brain's surface. It is measured by electrodes placed on the scalp.
Parts of the brain: We are interested in the primary motor cortex and the premotor cortex.
Understanding the Waves: In neurophysiology, an action potential (also known as a nerve impulse or spike) is a pulse-like wave of voltage that travels along several types of cell membranes.
Placement: The international 10-20 system; an electrode is placed on each point.
Event-Related Potentials: An event-related potential (ERP) is any stereotyped electrophysiological response to an internal or external stimulus. More simply, it is any measured brain response that is directly the result of a thought or perception. By collecting multiple trials of the same type of stimulus, we can enhance the signal and reduce the noise using simple math (this is what neuroscientists typically do).
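The trial-averaging idea can be sketched in a few lines. The signal shape, noise level, and trial count below are illustrative, not from our data:

```python
import numpy as np

# Illustrative sketch: averaging many epochs of the same stimulus type
# keeps the stimulus-locked ERP and averages out the (independent) noise.
rng = np.random.default_rng(0)
n_trials, n_samples = 100, 256
erp = np.sin(np.linspace(0, 2 * np.pi, n_samples))             # true event-locked response
trials = erp + rng.normal(0, 2.0, size=(n_trials, n_samples))  # noisy single trials

average = trials.mean(axis=0)  # noise shrinks roughly as 1/sqrt(n_trials)

# The average is far closer to the true ERP than any single trial.
single_err = np.abs(trials[0] - erp).mean()
avg_err = np.abs(average - erp).mean()
```

With 100 trials the residual noise in the average is roughly a tenth of the single-trial noise, which is why averaging is the standard first step in ERP analysis.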
Defining an Epoch: A series of time points locked to a significant point in time. The button press is our significant point in time.
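A minimal sketch of epoch extraction, assuming the button presses are given as sample indices into a continuous single-channel recording (the window lengths and function name are illustrative):

```python
import numpy as np

def extract_epochs(signal, event_samples, before=50, after=150):
    """Cut one (before + after)-sample epoch per event, time-locked to the event.

    Events whose window would run off either end of the recording are dropped.
    """
    epochs = []
    for t in event_samples:
        if t - before >= 0 and t + after <= len(signal):
            epochs.append(signal[t - before : t + after])
    return np.array(epochs)
```

For example, with a 1000-sample signal and button presses at samples 100 and 500, `extract_epochs(signal, [100, 500])` returns an array of shape (2, 200), and sample index 50 within each epoch is the event itself.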
A subject's data: an example of an epoch.
The same data at a different epoch.
Time Series: What is a time series? A time series is a sequence of data points, typically measured at successive points in time and spaced at (often uniform) time intervals.
What EM clustering does: How do we classify points and estimate the parameters of the models in a mixture at the same time? Adaptive soft clustering: EM. Data points are assigned to each group with a probability equal to the likelihood of that point belonging to that group.
What is EM (Expectation Maximization): A statistical method that makes use of finite Gaussian mixture models. A set of parameters is recomputed until a desired value is reached. The initial variables are randomly initialized.
The methods of EM. Initialization: pick starting values for the parameters (for us, making random models and setting a sigma). Then iterate until the parameters converge. Expectation (E) step: calculate weights for every data point; these weights feed the following step. Maximization (M) step: maximize the log-likelihood function with the weights given by the E step to update the parameters.
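The E and M steps above can be sketched for a two-component 1-D Gaussian mixture. To keep it short, this assumes a fixed shared sigma and equal mixing weights, and (unlike the random start described above) initializes the means from the data extremes so the example is deterministic:

```python
import math

def em_two_means(data, sigma=1.0, iters=50):
    """EM for the means of a two-component 1-D Gaussian mixture (sketch)."""
    mu = [min(data), max(data)]  # deterministic initialization for the example
    for _ in range(iters):
        # E step: soft weight (responsibility) of component 0 for each point
        w = []
        for x in data:
            p0 = math.exp(-((x - mu[0]) ** 2) / (2 * sigma ** 2))
            p1 = math.exp(-((x - mu[1]) ** 2) / (2 * sigma ** 2))
            w.append(p0 / (p0 + p1))
        # M step: weighted means maximize the log-likelihood given these weights
        mu[0] = sum(wi * x for wi, x in zip(w, data)) / sum(w)
        mu[1] = sum((1 - wi) * x for wi, x in zip(w, data)) / (len(data) - sum(w))
    return sorted(mu)
```

On two well-separated clusters, e.g. points near 0 and points near 10, the loop recovers the two cluster means after a few iterations.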
Evaluation: We initialized two models using the mean of the EEG over time, using the first half of the data for the first model and the second half for the second model. We then compared each model against the data, created a weight matrix, and normalized the data.
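A hedged reconstruction of this setup; the array shapes, the dissimilarity measure, and the normalization axis are my assumptions, since the slide does not specify them:

```python
import numpy as np

# Illustrative stand-in for the EEG epochs: 40 epochs x 200 time points.
rng = np.random.default_rng(1)
epochs = rng.normal(size=(40, 200))

model1 = epochs[:20].mean(axis=0)  # mean EEG over time, first half of the data
model2 = epochs[20:].mean(axis=0)  # mean EEG over time, second half of the data

# Weight matrix: one row per model, one column per epoch, each entry a
# model-vs-epoch dissimilarity (mean absolute difference here).
weights = np.array([
    [np.abs(e - model1).mean() for e in epochs],
    [np.abs(e - model2).mean() for e in epochs],
])
weights /= weights.sum(axis=0, keepdims=True)  # normalize per epoch
```

After normalization each column sums to 1, so the two entries per epoch can be read as relative affinities to the two models.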
OSB algorithm. OSB: Optimal Subsequence Bijection, an algorithm that determines the optimal subsequence bijection between two sequences of real numbers. We were given code (tsdagjump4) that worked for only one channel, and we modified it to work with more than one channel (it now works for 40 channels). Modification: we created a difference matrix with each entry containing the difference of the corresponding elements.
Why the OSB algorithm? OSB is efficient because it uses a DAG (Directed Acyclic Graph) and finds the cheapest path to the solution. Using the DAG in OSB, we get correct results on the time series dataset. The DAG lets us skip outlier elements and obtain a one-to-one correspondence between subsequences. Compared with DTW using a warping window, OSB shows that skipping elements improves the results.
Directed Acyclic Graph: a simple example of a DAG. By skipping over outlier elements we get a correct result.
OSB algorithm. This program is used to find the OSB between two sequences of real numbers. Notation: Ts, time series; DAG, Directed Acyclic Graph; D, the distance between two elements; C, the jump cost (the penalty for skipping an element); W, the weight of the edges.
OSB algorithm steps: find subsequences of the two sequences; create a dissimilarity matrix; run a shortest-path algorithm on the DAG; account for the jump cost. The nodes are index pairs of the matrix. The key step in the algorithm is finding the edge weights of the DAG.
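The steps above can be sketched as a dynamic program over the implicit DAG. This is a simplified variant, not the tsdagjump4 code: it uses squared differences as the dissimilarity, charges the jump cost for every skipped element of `a` (so `a` acts as the query) and for interior skips of `b`, and lets the ends of `b` be skipped for free:

```python
import numpy as np

def osb_distance(a, b, jump_cost=1.0):
    """Optimal Subsequence Bijection, sketched as shortest paths in a DAG.

    Nodes are index pairs (i, j), meaning a[i] is matched to b[j]; an edge
    (k, l) -> (i, j) with k < i, l < j costs the squared difference of the
    newly matched pair plus jump_cost per element skipped in between.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    m, n = len(a), len(b)
    match = np.subtract.outer(a, b) ** 2        # dissimilarity matrix
    cost = np.full((m, n), np.inf)              # cheapest path ending at (i, j)
    for i in range(m):
        for j in range(n):
            best = match[i, j] + jump_cost * i  # start here; skip a[0..i-1]
            for k in range(i):
                for l in range(j):
                    skipped = (i - k - 1) + (j - l - 1)
                    c = cost[k, l] + jump_cost * skipped + match[i, j]
                    best = min(best, c)
            cost[i, j] = best
    # Close the path: a's unmatched tail also costs jump_cost per element.
    return min(cost[i, j] + jump_cost * (m - 1 - i)
               for i in range(m) for j in range(n))
```

For example, `osb_distance([1, 2, 100, 3], [1, 2, 3], jump_cost=0.5)` skips the outlier 100 at the price of one jump and returns 0.5, whereas forcing 100 to match would cost thousands.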
Wavelets: Wavelets are mathematical functions that cut up data into different frequency components, and then study each component with a resolution matched to its scale. (IEEE Computational Science and Engineering, Summer 1995, vol. 2, no. 2, published by the IEEE Computer Society)
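As a concrete instance of "cutting data into frequency components", here is one level of the Haar wavelet transform, the simplest wavelet (the slide does not say which wavelet the project actually used):

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform (x must have even length)."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # smooth, low-frequency part
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-frequency fluctuations
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar step, recovering the original signal exactly."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x
```

Repeating `haar_step` on the approximation coefficients yields ever coarser scales, which is exactly the "resolution matched to its scale" idea in the definition above.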
Discussion of Results: We used OSB to obtain our results. Our results consist of two sequences a and b; we then find subsequences a′ of a and b′ of b so that a′ matches best with b′. The results are divided into two parts. Cluster precision: the percentage of the time that an epoch is clustered correctly. Cluster recall: the percentage of the time that a known left or right button press is clustered correctly.
Example: Discussion of Results
ltypeout1: [75 76 27 21 22] = a
rtypeout1: [77 72 24 22 24] = b
ltypeout2: [25 24 73 79 78] = a′
rtypeout2: [23 28 76 78 76] = b′
Cluster precision: left-button cluster 76, right-button cluster 72 (from a and b). Formula: 76 / (76 + 72) = 0.5135 = 51%.
Cluster recall: left-button cluster 76, right-button cluster 28 (from a and b′). Applying the formula: 76 / (76 + 28) = 0.7308 = 73%.
Note: apply the same formulas to find the right-button cluster precision and recall.
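The slide's arithmetic can be checked directly. Which vector entry corresponds to the "left button cluster" is my reading of the numbers quoted above (index 1 of each vector):

```python
# Vectors from the slide (left/right button epochs, two clustering outputs)
ltypeout1 = [75, 76, 27, 21, 22]   # = a
rtypeout1 = [77, 72, 24, 22, 24]   # = b
ltypeout2 = [25, 24, 73, 79, 78]   # = a'
rtypeout2 = [23, 28, 76, 78, 76]   # = b'

# Left-button cluster precision: 76 / (76 + 72)
left_precision = ltypeout1[1] / (ltypeout1[1] + rtypeout1[1])
# Left-button cluster recall: 76 / (76 + 28)
left_recall = ltypeout1[1] / (ltypeout1[1] + rtypeout2[1])
```

This reproduces the 51% precision and 73% recall quoted on the slide.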
Conclusion: We were given 19 subjects (EEG datasets) in total, but we derived correct results for only 10 subjects. The other 9 subjects gave us all zeros as a result: ltypeout1: [0 0 0 0 0], rtypeout1: [0 0 0 0 0], ltypeout2: [0 0 0 0 0], rtypeout2: [0 0 0 0 0] (error).
Extra References: Yang Ran, "Expectation Maximization: An Approach to Parameter Estimation", www.umiacs.umd.edu/~shaohua/enee698a_f03/em.ppt. Andrew Blake and Bill Freeman, "Learning and Vision: Generative Methods" (slides), ICCV 2003, October 12, 2003.
Thank You