Unsupervised seismic facies from mixture models to highlight channel features
Robert Hardisty* and Bradley C. Wallet, The University of Oklahoma
Summary

Unsupervised seismic facies are a convenient and efficient tool for interpretation. Expanding upon the study of Zhao et al. (2016), Gaussian mixture models are used to show how features can be generated automatically using machine learning. The conventional expectation-maximization (EM) algorithm is compared to the neighborhood expectation-maximization (NEM) algorithm to highlight the effect of including spatial relations in the data in addition to the measurements of seismic attributes. The survey used is a 3D seismic survey from the Canterbury Basin, New Zealand, called Waka-3D.

Introduction

Visual examination of seismic facies on large 3D seismic data sets where there is little a priori geologic information can be tedious and inaccurate. The process can be automated and improved using machine learning. By teaching a computer to recognize patterns, features can be picked automatically. This has the obvious benefit of quicker interpretation, but it can also highlight features that might otherwise go unnoticed. The Gaussian mixture model (GMM) provides a flexible framework by which to accomplish this.

Geologic setting

The seismic survey is located in the Canterbury Basin, offshore New Zealand (Figure 1). The area lies in the transition zone between the continental rise and continental slope.

Figure 1: Aerial view of the study area (modified from Zhao et al., 2016).

The data set contains many Cretaceous- and Tertiary-age paleocanyons and turbidite deposits. Sediments were deposited in a single transgressive-regressive cycle driven by tectonics (Zhao et al., 2016). A previously identified channel feature is analyzed using a Gaussian mixture model technique.
Theory

Gaussian mixture models for seismic attributes

Gaussian mixture models (GMMs) are a well-known semiparametric density estimation technique using a weighted sum of normal, or Gaussian, distributions (Figure 2). An inherent assumption of this technique is that the data come from different Gaussian distributions.

Figure 2: Example of a Gaussian mixture model with three mixture components. The overall density is estimated as the sum of the components. (Modified from Wallet et al., 2014)

A multivariate Gaussian distribution can be defined as

\varphi(x \mid \mu, C) = \frac{1}{(2\pi)^{d/2} |C|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T C^{-1} (x-\mu)\right),

where \mu is the mean, C is the covariance matrix, and d is the number of dimensions of x and \mu. For seismic attributes, x is a voxel with dimension equal to the number of attributes. A GMM can be expressed as

p(x \mid \psi) = \sum_{j=1}^{K} \pi_j \varphi(x \mid \mu_j, C_j),

where K is the number of Gaussian distributions, or components, and \pi_j is the weight of the j-th component such that \pi_j > 0 and \sum_{j=1}^{K} \pi_j = 1.
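The two densities above can be evaluated directly. The following is a minimal NumPy sketch (function names and the two-component toy values are ours for illustration, not from the paper):

```python
import numpy as np

def gaussian_pdf(x, mu, C):
    """Multivariate Gaussian density phi(x | mu, C)."""
    d = mu.shape[0]
    diff = x - mu
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(C))
    return norm * np.exp(-0.5 * diff @ np.linalg.solve(C, diff))

def gmm_pdf(x, weights, means, covs):
    """Mixture density p(x | psi) = sum_j pi_j * phi(x | mu_j, C_j)."""
    return sum(w * gaussian_pdf(x, m, C)
               for w, m, C in zip(weights, means, covs))

# Toy two-component mixture in 2D; x plays the role of one voxel
# carrying two seismic attributes.
weights = np.array([0.6, 0.4])          # pi_j, must sum to 1
means = [np.zeros(2), np.array([3.0, 3.0])]
covs = [np.eye(2), 2.0 * np.eye(2)]
x = np.array([0.0, 0.0])
p = gmm_pdf(x, weights, means, covs)
```

Here each voxel is a d-dimensional point whose dimension equals the number of input attributes, matching the definition in the text.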
The problem is to estimate (or learn) the parameters of the GMM, \{\pi_j, \mu_j, C_j\} for j = 1, \ldots, K. Common practice is to learn the parameters of a Gaussian mixture through the expectation-maximization (EM) algorithm developed by Dempster et al. (1977). Dynamic component allocation (DCA), as proposed by Vlassis and Likas (2002), is used to avoid user-defined initialization and to make the process more unsupervised. DCA starts with a single component and then alternates between optimization using the EM algorithm and allocation of a new component for the GMM. The first component is initialized using the population mean and covariance. DCA terminates when the maximum number of components is reached.

Neighborhood expectation-maximization (NEM) algorithm

Learning a GMM using the EM algorithm is a purely statistical construct and does not consider spatial correlations. In general, facies are expected to be at least somewhat laterally continuous. To account for spatial correlations in the latent space, the neighborhood expectation-maximization (NEM) algorithm is implemented and compared to the results of the conventional EM algorithm. The conventional EM algorithm can be viewed as a variant of coordinate descent on the objective function

D(W, \psi) = \sum_{j=1}^{K} \sum_{i=1}^{n} W_{ji} [\log W_{ji} - \log\{\pi_j \varphi(x_i \mid \mu_j, C_j)\}],

where W_{ji} are the elements of the responsibility matrix W (Hathaway, 1986). Ambroise et al. (1997) introduced a regularization term that takes the spatial arrangement of the data into account,

G(W) = \frac{1}{2} \sum_{j=1}^{K} \sum_{i=1}^{n} \sum_{p=1}^{n} W_{ij} W_{pj} V_{ip},

where V_{ip} are the elements of a neighborhood matrix V. The new objective function then becomes

U(W, \psi) = D(W, \psi) - \beta G(W),

where \beta \geq 0 determines the weight of the spatial term G(W), so that optimizing U favors responsibilities that are both likely under the mixture and spatially coherent. The neighborhood matrix V for this application is chosen as

V_{ip} = 1 if x_i and x_p are neighbors, 0 otherwise,

where x_i and x_p are neighbors if they both lie within a user-defined window.
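The regularized objective leads to a modified E-step in which each voxel's responsibilities are pulled toward those of its neighbors via a fixed-point iteration, as described by Ambroise et al. (1997). A minimal NumPy sketch (function name, iteration count, and toy values are ours, not from the paper):

```python
import numpy as np

def nem_estep(phi, V, beta, n_iter=50):
    """Fixed-point E-step of the neighborhood EM (NEM) algorithm.

    phi  : (n, K) array, phi[i, j] = pi_j * phi(x_i | mu_j, C_j)
    V    : (n, n) binary neighborhood matrix
    beta : spatial weight; beta = 0 reduces to the ordinary E-step
    """
    # Start from the ordinary (unregularized) responsibilities.
    W = phi / phi.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # Each responsibility is boosted by agreement with its neighbors.
        W = phi * np.exp(beta * (V @ W))
        W /= W.sum(axis=1, keepdims=True)
    return W

# Three voxels in a chain; the middle voxel is ambiguous between the
# two components, its neighbors strongly favor component 0.
phi = np.array([[0.9, 0.1],
                [0.5, 0.5],
                [0.9, 0.1]])
V = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
W_plain = nem_estep(phi, V, beta=0.0)   # identical to conventional EM
W_nem = nem_estep(phi, V, beta=0.5)     # spatially regularized
```

With beta = 0 the update leaves the ordinary responsibilities unchanged; with beta > 0 the ambiguous middle voxel is pulled toward the component its neighbors favor, which is the smoothing behavior discussed in the Application section.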
The benefit of the NEM algorithm is that the responsibilities of neighboring voxels are considered when deciding to which mixture component a voxel belongs.

Methods

Latent space modeling

Like all statistical classifiers, GMMs suffer from the curse of dimensionality. Latent space modeling is a powerful technique to project high-dimensional data into a lower-dimensional space. In this application, a two-dimensional latent space generated by Zhao et al. (2016) is considered. The latent space was generated using a distance-preserving SOM (DPSOM) technique with the attribute inputs being peak spectral frequency, peak spectral magnitude, coherent energy, and curvedness. The DPSOM algorithm mapped the 4D attribute input to a 2D SOM latent space, resulting in two seismic attribute volumes, SOM latent axis 1 and SOM latent axis 2 (Figure 3). Using a GMM as a classifier on these two axes produces a single partition volume and a number, K, of mixture decomposition volumes for unsupervised seismic facies analysis.

Figure 3: Horizon slice of (a) seismic amplitude, (b) SOM latent axis 1, (c) SOM latent axis 2. Purple lines and arrows show a feature previously interpreted as a muddy channel that cuts through a sandy channel (orange arrow).

Gaussian mixture models as a classifier

Each component of a GMM attempts to model an underlying process that generated the data. A GMM is a model-based clustering technique in that each underlying process is assumed to be Gaussian in shape. The objective of a classifier is to find which component is responsible for producing each voxel.
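Once the mixture is learned, turning it into a classifier is a matter of assigning each voxel to the component with the highest responsibility. A minimal sketch (function name and toy values are ours):

```python
import numpy as np

def classify(weighted_likelihoods):
    """Partition voxels by maximum responsibility.

    weighted_likelihoods : (n, K) array with entries pi_j * phi(x_i | mu_j, C_j)
    Returns hard component labels and the responsibility matrix W.
    """
    W = weighted_likelihoods / weighted_likelihoods.sum(axis=1, keepdims=True)
    return W.argmax(axis=1), W

# Three voxels, two components (illustrative values only)
lik = np.array([[0.80, 0.20],
                [0.30, 0.70],
                [0.45, 0.55]])
labels, W = classify(lik)
```

The hard labels form the single partition volume, while the columns of W correspond to the K mixture decomposition volumes mentioned above.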
Usually, finding the component responsible for each voxel is done by using the responsibility matrix, W, and assigning each voxel to the component with the highest responsibility. However, due to the large size of seismic data, a training set must be used because of memory and time constraints. The training set is used to learn the parameters of the GMM and is constructed by uniformly sampling every 125th voxel (one voxel for every 5th inline, crossline, and time sample). Once the parameters of a GMM are learned from the training set, the responsibility of each voxel can be calculated individually. For the conventional EM algorithm, this is done by implementing another E-step that includes the whole volume. The NEM algorithm proceeds in a similar manner, but uses the training set to approximate the total population when calculating the penalty term, G(W).

Application

The area of interest has been interpreted as a possible channel feature by Zhao et al. (2016). It consists of 456 crosslines x 576 inlines x 23 time samples. SOM latent axis 1 and SOM latent axis 2 are used as inputs for two different GMMs: one using the conventional EM algorithm and another using the NEM algorithm. The number of components is set to four because four prototype vectors were used in the construction of the latent space axes. For the conventional EM case, DCA is used to find a GMM with four components. For the NEM case, the parameters from the EM case are used for initialization, and the spatial weight, \beta, is set to 0.1. Two cross sections, A-A' and B-B', are made to show the channel feature in three dimensions. This feature was previously interpreted by Zhao et al. (2016) as a possible muddy channel cutting through a sandy channel (Figure 3). In both the EM and NEM cases the sandy channel is dominated by the 4th component of the mixture model and is colored tan.
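The uniform subsampling of the training set can be sketched with strided slicing (function name and toy volume dimensions are ours; only the every-5th-sample scheme comes from the paper):

```python
import numpy as np

def training_subsample(attrs, step=5):
    """Build a GMM training set by taking every step-th inline, crossline,
    and time sample, i.e. one voxel out of every step**3 (here 125).

    attrs : (n_inline, n_crossline, n_time, n_attr) attribute array
    """
    return attrs[::step, ::step, ::step].reshape(-1, attrs.shape[-1])

# Toy volume standing in for the two SOM latent-axis volumes
vol = np.random.default_rng(0).normal(size=(20, 15, 10, 2))
train = training_subsample(vol, step=5)
```

This regular grid of training voxels is also what introduces the blocky, footprint-like right angles noted in the NEM results below.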
Likewise, the muddy channel is dominated by the 2nd and 3rd components of the mixture model, colored red and green, respectively. The NEM algorithm successfully segments the image into more spatially continuous facies. However, there are hard right angles, similar in appearance to acquisition footprint, due to the uniform sampling of the training set. Cross section A-A' shows the high-amplitude channel delineated by the tan-colored facies and surrounded by the blue-colored facies. The NEM algorithm improves the segmentation by removing the anomalous red facies above the high-amplitude feature. In both the EM and NEM cases the red and green facies are not within the high-amplitude feature.

Figure 4: Horizon slice of (a) seismic amplitude for reference, (b) results from the EM algorithm, (c) results from the NEM algorithm. Blocky right angles can be noticed in (c) due to how the training data were sampled. A-A' cuts across the flow direction of both channels. B-B' runs along the flow direction of the tan-colored channel and cuts across the blue-green-colored channel at the western end of the profile.
Figure 5: Profile A-A' of (a) seismic amplitude, (b) the EM algorithm, and (c) the NEM algorithm. The profile cuts perpendicular to the flow direction of the tan-colored channel. The black arrow indicates a high-amplitude feature.

Cross section B-B' runs more or less along the flow direction of the tan-colored channel (Figure 6). The combination of red- and green-colored facies segments the channel well. The NEM algorithm removes many of the red-colored facies in the high-amplitude areas and replaces them with tan-colored facies.

Figure 6: Profile B-B' of (a) seismic amplitude, (b) the EM algorithm, and (c) the NEM algorithm. The channel outlined in purple is composed of all the facies. In the NEM result (c), the red facies is constrained mostly to the channel fill, unlike in the EM result.

Conclusions

Gaussian mixture models are a convenient way to characterize seismic attributes and generate unsupervised seismic facies, letting the data speak for themselves. The results may not correlate with all the geology, but they can highlight features of geological interest. The NEM algorithm can act like a smoothing operator in the spatial domain to ensure that facies have some spatial continuity. Different ways of defining the neighborhood matrix, along with different values of the spatial weight, \beta, should be investigated further. The unsupervised seismic facies in this paper use GMMs as a partitioning method like k-means; future work using GMMs as a fuzzy clustering method may reveal more complexity in the data.

Acknowledgements

We would like to thank New Zealand Petroleum and Minerals for making the Waka-3D seismic data public. We would also like to thank the sponsors of the Attribute-Assisted Seismic Processing and Interpretation (AASPI) Consortium at the University of Oklahoma. Horizon slices were generated using Petrel licenses courtesy of Schlumberger. Special thanks to Tao Zhao for the use of his latent space axes, and thanks to all our colleagues for their valuable insights.
EDITED REFERENCES

Note: This reference list is a copyedited version of the reference list submitted by the authors. Reference lists for the 2017 SEG Technical Program Expanded Abstracts have been copyedited so that references provided with the online metadata for each paper will achieve a high degree of linking to cited sources that appear on the Web.

REFERENCES

Ambroise, C., M. Dang, and G. Govaert, 1997, Clustering of spatial data by the EM algorithm, in geoENV I: Geostatistics for Environmental Applications, 9.
Dempster, A. P., N. M. Laird, and D. B. Rubin, 1977, Maximum likelihood from incomplete data via the EM algorithm: Journal of the Royal Statistical Society, Series B, 39, 1-38.
Hathaway, R. J., 1986, Another interpretation of the EM algorithm for mixture distributions: Statistics & Probability Letters, 4, 53-56.
Vlassis, N., and A. Likas, 2002, A greedy EM algorithm for Gaussian mixture learning: Neural Processing Letters, 15, 77-87.
Wallet, B. C., R. M. Slatt, and R. P. Altimar, 2014, Unsupervised classification of λρ and µρ attributes derived from well log data in the Barnett Shale: 84th Annual International Meeting, SEG, Expanded Abstracts.
Zhao, T., J. Zhang, F. Li, and K. J. Marfurt, 2016, Characterizing a turbidite system in Canterbury Basin, New Zealand, using seismic attributes and distance-preserving self-organizing maps: Interpretation, 4, SB79-SB89.
10. MLSP intro. (Clustering: K-means, EM, GMM, etc.) Rahil Mahdian 01.04.2016 LSV Lab, Saarland University, Germany What is clustering? Clustering is the classification of objects into different groups,
More informationUsing Machine Learning to Optimize Storage Systems
Using Machine Learning to Optimize Storage Systems Dr. Kiran Gunnam 1 Outline 1. Overview 2. Building Flash Models using Logistic Regression. 3. Storage Object classification 4. Storage Allocation recommendation
More informationk-means demo Administrative Machine learning: Unsupervised learning" Assignment 5 out
Machine learning: Unsupervised learning" David Kauchak cs Spring 0 adapted from: http://www.stanford.edu/class/cs76/handouts/lecture7-clustering.ppt http://www.youtube.com/watch?v=or_-y-eilqo Administrative
More informationMachine Learning. Nonparametric methods for Classification. Eric Xing , Fall Lecture 2, September 12, 2016
Machine Learning 10-701, Fall 2016 Nonparametric methods for Classification Eric Xing Lecture 2, September 12, 2016 Reading: 1 Classification Representing data: Hypothesis (classifier) 2 Clustering 3 Supervised
More informationUnsupervised Learning: Clustering
Unsupervised Learning: Clustering Vibhav Gogate The University of Texas at Dallas Slides adapted from Carlos Guestrin, Dan Klein & Luke Zettlemoyer Machine Learning Supervised Learning Unsupervised Learning
More information3-D vertical cable processing using EOM
Carlos Rodriguez-Suarez, John C. Bancroft, Yong Xu and Robert R. Stewart ABSTRACT Three-dimensional seismic data using vertical cables was modeled and processed using equivalent offset migration (EOM),
More informationMODULATING A POLYCHROMATIC IMAGE BY A 2 ND IMAGE PLOTTED AGAINST SATURATION AND A 3 RD IMAGE PLOTTED AGAINST LIGHTNESS: PROGRAM hlsplot
MODULATING A POLYCHROMATIC IMAGE BY A 2 ND IMAGE PLOTTED AGAINST SATURATION AND A 3 RD IMAGE PLOTTED AGAINST LIGHTNESS: PROGRAM hlsplot Plotting dip vs. azimuth vs. coherence Program hlsplot Earlier, we
More informationMachine Learning and Data Mining. Clustering (1): Basics. Kalev Kask
Machine Learning and Data Mining Clustering (1): Basics Kalev Kask Unsupervised learning Supervised learning Predict target value ( y ) given features ( x ) Unsupervised learning Understand patterns of
More informationSensor Tasking and Control
Sensor Tasking and Control Outline Task-Driven Sensing Roles of Sensor Nodes and Utilities Information-Based Sensor Tasking Joint Routing and Information Aggregation Summary Introduction To efficiently
More informationJoint seismic traveltime and TEM inversion for near surface imaging Jide Nosakare Ogunbo*, Jie Zhang, GeoTomo LLC
Jide Nosaare Ogunbo*, Jie Zhang, GeoTomo LLC Summary For a reliable interpretation of the subsurface structure, the joint geophysical inversion approach is becoming a viable tool. Seismic and EM methods
More informationExperimental Analysis of GTM
Experimental Analysis of GTM Elias Pampalk In the past years many different data mining techniques have been developed. The goal of the seminar Kosice-Vienna is to compare some of them to determine which
More informationProbabilistic Abstraction Lattices: A Computationally Efficient Model for Conditional Probability Estimation
Probabilistic Abstraction Lattices: A Computationally Efficient Model for Conditional Probability Estimation Daniel Lowd January 14, 2004 1 Introduction Probabilistic models have shown increasing popularity
More informationEffects of multi-scale velocity heterogeneities on wave-equation migration Yong Ma and Paul Sava, Center for Wave Phenomena, Colorado School of Mines
Effects of multi-scale velocity heterogeneities on wave-equation migration Yong Ma and Paul Sava, Center for Wave Phenomena, Colorado School of Mines SUMMARY Velocity models used for wavefield-based seismic
More informationCS 229 Midterm Review
CS 229 Midterm Review Course Staff Fall 2018 11/2/2018 Outline Today: SVMs Kernels Tree Ensembles EM Algorithm / Mixture Models [ Focus on building intuition, less so on solving specific problems. Ask
More informationDownloaded 09/01/14 to Redistribution subject to SEG license or copyright; see Terms of Use at
Random Noise Suppression Using Normalized Convolution Filter Fangyu Li*, Bo Zhang, Kurt J. Marfurt, The University of Oklahoma; Isaac Hall, Star Geophysics Inc.; Summary Random noise in seismic data hampers
More informationhttp://www.xkcd.com/233/ Text Clustering David Kauchak cs160 Fall 2009 adapted from: http://www.stanford.edu/class/cs276/handouts/lecture17-clustering.ppt Administrative 2 nd status reports Paper review
More informationPRESTACK STRUCTURE-ORIENTED FILTERING PROGRAM sof_prestack
PRESTAK STRUTURE-ORIENTED FILTERING PROGRAM sof_prestack omputation Flow hart Program sof_prestack is a generalization of program sof3d For this reason, the input parameters and workflow of the two algorithms
More informationAN IMPROVED HYBRIDIZED K- MEANS CLUSTERING ALGORITHM (IHKMCA) FOR HIGHDIMENSIONAL DATASET & IT S PERFORMANCE ANALYSIS
AN IMPROVED HYBRIDIZED K- MEANS CLUSTERING ALGORITHM (IHKMCA) FOR HIGHDIMENSIONAL DATASET & IT S PERFORMANCE ANALYSIS H.S Behera Department of Computer Science and Engineering, Veer Surendra Sai University
More informationA New Energy Model for the Hidden Markov Random Fields
A New Energy Model for the Hidden Markov Random Fields Jérémie Sublime 1,2, Antoine Cornuéjols 1, and Younès Bennani 2 1 AgroParisTech, INRA - UMR 518 MIA, F-75005 Paris, France {jeremie.sublime,antoine.cornuejols}@agroparistech.fr
More informationClustering & Dimensionality Reduction. 273A Intro Machine Learning
Clustering & Dimensionality Reduction 273A Intro Machine Learning What is Unsupervised Learning? In supervised learning we were given attributes & targets (e.g. class labels). In unsupervised learning
More informationMULTIATTRIBUTE DISPLAY: PROGRAMS hlplot, hsplot, hlsplot, rgbplot, and crossplot
MULTIATTRIBUTE DISPLAY: PROGRAMS hlplot, hsplot, hlsplot, rgbplot, and crossplot Multiattribute display of dip magnitude modulating dip azimuth Programs hlplot and hsplot In section 6, pages 12-13, we
More informationUnsupervised Learning and Clustering
Unsupervised Learning and Clustering Selim Aksoy Department of Computer Engineering Bilkent University saksoy@cs.bilkent.edu.tr CS 551, Spring 2009 CS 551, Spring 2009 c 2009, Selim Aksoy (Bilkent University)
More informationLecture 2 The k-means clustering problem
CSE 29: Unsupervised learning Spring 2008 Lecture 2 The -means clustering problem 2. The -means cost function Last time we saw the -center problem, in which the input is a set S of data points and the
More informationChapter 5: Outlier Detection
Ludwig-Maximilians-Universität München Institut für Informatik Lehr- und Forschungseinheit für Datenbanksysteme Knowledge Discovery in Databases SS 2016 Chapter 5: Outlier Detection Lecture: Prof. Dr.
More informationData Mining Chapter 3: Visualizing and Exploring Data Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University
Data Mining Chapter 3: Visualizing and Exploring Data Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Exploratory data analysis tasks Examine the data, in search of structures
More informationImage-guided 3D interpolation of borehole data Dave Hale, Center for Wave Phenomena, Colorado School of Mines
Image-guided 3D interpolation of borehole data Dave Hale, Center for Wave Phenomena, Colorado School of Mines SUMMARY A blended neighbor method for image-guided interpolation enables resampling of borehole
More informationUnsupervised Learning and Clustering
Unsupervised Learning and Clustering Selim Aksoy Department of Computer Engineering Bilkent University saksoy@cs.bilkent.edu.tr CS 551, Spring 2008 CS 551, Spring 2008 c 2008, Selim Aksoy (Bilkent University)
More informationCOMS 4771 Clustering. Nakul Verma
COMS 4771 Clustering Nakul Verma Supervised Learning Data: Supervised learning Assumption: there is a (relatively simple) function such that for most i Learning task: given n examples from the data, find
More information