Feature Selection for Classification of Remote Sensed Hyperspectral Images: A Filter approach using Genetic Algorithm and Cluster Validity


Feature Selection for Classification of Remote Sensed Hyperspectral Images: A Filter approach using Genetic Algorithm and Cluster Validity

A. B. Santos 1, C. S. F. de S. Celes 1, A. de A. Araújo 1, and D. Menotti 2
1 Computer Science Department, UFMG - Federal University of Minas Gerais, Belo Horizonte, MG, Brazil
2 Computing Department, UFOP - Federal University of Ouro Preto, Ouro Preto, MG, Brazil

Abstract: In this paper, we investigate the advantages of using feature selection approaches for the classification of remote sensed hyperspectral images. We propose a new filter feature selection approach based on genetic algorithms (GA) and cluster validity measures for finding the subset of features that maximizes inter-cluster distances and minimizes intra-cluster distances. Using the optimal, or sub-optimal, subset of features, classifiers can then build decision boundaries in an accurate way. Dunn's index is used to estimate, given a subset of features, how good the resulting clusters are. Experiments were carried out with two well-known datasets: AVIRIS - Indian Pines and ROSIS - Pavia University. Three different classifiers were used to evaluate our proposal: Support Vector Machines (SVM), Multi-layer Perceptron Neural Networks (MLP) and K-Nearest Neighbor (KNN). Moreover, we compare the performance of our proposal, in terms of accuracy, to two others: the traditional Pixelwise approach, without feature selection/extraction, and the widely used Singular Value Decomposition Band Subset Selection (SVDSS). Experiments show that classification methods using our feature selection approach rely on a small subset of features which easily achieves enough discriminative power, and that their results are similar to the ones obtained with SVDSS.

Keywords: filter feature selection, hyperspectral image, pattern classification, genetic algorithm, cluster validity

1. Introduction

One of the many tasks in remote sensing is land cover classification, which is concerned with the identification of areas with vegetation, hydrography, artificial cover (plantations, urban areas, reforestation areas, etc.) and all the different coverages of the Earth's surface [1], [2]. Hyperspectral images carry information about materials on the Earth's surface expressed in hundreds of bands/wavelengths [1]. This data allows us to identify and discriminate materials with more accuracy [1], [2]. With such a data representation, classifiers can improve their performance in terms of accuracy and precision. For instance, classification methods using Support Vector Machines (SVM) have recently shown greater accuracy on hyperspectral data than methods using Maximum Likelihood (ML), K-Nearest Neighbor (KNN), and other classifiers [3], [4], [2]. Although the high dimensionality of hyperspectral images provides great discriminative power, their classification is still a challenging task due to the large amount of spectral information and the small set of referenced data [1], [2], [5], [6]. This is also known as the Hughes phenomenon [7], or the curse of dimensionality. Another constraint mentioned in the literature for high-dimensional data is density estimation [2]: it is more difficult to compute than in a lower dimensional space, since the space is quite sparse [2]. In order to surmount such difficulties, some approaches apply feature extraction/selection/representation techniques [3], [8], [9].
Thus, feature dimension reduction approaches are still required in order to improve the generalization power of classifiers and reduce their overhead [3]. In [8], a wrapper approach using genetic algorithms (GA) and an SVM classifier tackles this issue. Wrapper approaches, however, have high computational costs [10]. For this reason, in this paper we propose a filter approach for feature selection. The search for a smaller subset of features is also based on a GA, in which cluster analysis measures are evolved, trying to achieve a minimal number of features without loss of discriminative power. Experiments were carried out using two well-known datasets, Indian Pines and Pavia University, obtained by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [11] and the Reflective Optics System Imaging Spectrometer (ROSIS) [12] sensors, respectively. Three different classification algorithms (i.e., KNN, Multi-layer Perceptron Neural Networks (MLP), and SVM) were used to compare the accuracies obtained by our approach to those of the Pixelwise approach, which does not use feature selection, and of SVDSS, which is widely used for feature selection in the remote sensing community [13], [14], [15].

The remainder of this paper is organized as follows. Section 2 describes the classification process and presents some well-known algorithms for this task. Section 3 introduces the problem of feature selection and the SVDSS approach. In Section 4, our proposed approach for feature selection is presented. Finally, the experiments and conclusions are presented in Sections 5 and 6, respectively.

2. Classification Algorithms

First, let us mathematically define the problem of classification of hyperspectral images. Let $\delta \equiv \{1, \ldots, n\}$ be an integer set which indexes the $n$ pixels of a hyperspectral image, let $\psi \equiv \{1, \ldots, K_c\}$ be a set of $K_c$ available classes, and let $X \equiv (X_1, \ldots, X_n) \in \mathbb{R}^{d \times n}$ be the pixels that compose the feature vectors in a $d$-dimensional space. Finally, let $y \equiv (y_1, \ldots, y_n) \in \psi^n$ represent a labeled image. The classification goal is, for every pixel $l \in \delta$, to infer a label $y_l \in \psi$ using its feature vector $x_l \in \mathbb{R}^d$. The so-called Pixelwise approach uses all $d$ responses/bands in the feature vector, without any spatial information, to assign a label to the pixel. In this traditional approach, any feature extraction/selection technique may be applied. Some classification algorithms are briefly described below.

2.1 K-Nearest Neighbor

The K-Nearest Neighbor (KNN) classification algorithm labels a sample according to its K closest samples. Since samples lie in some feature space, KNN uses a distance measure to define the closeness of one sample to another [16]. For example, let $X \equiv (X_1, \ldots, X_n)$ be a training set, where each sample $X_i$ is a tuple $(x_1, x_2, \ldots, x_d, c)$, with $c$ the class to which $X_i$ belongs and $x_i$, $1 \le i \le d$, its features or attributes. Let $T \equiv (T_1, \ldots, T_n)$ be a testing set, where each sample $T_i$ is an unlabeled sample with the same number $d$ of features as $X_i$. To assign a label, the KNN algorithm computes the distance of $T_i$ to every sample in the training set $X$; then the most frequent label $c$ among the $K$ closest training samples is assigned to $T_i$. Besides its simple operation, this algorithm has some drawbacks:
- high overhead to compute the distance between the sample to be labeled and all samples in the training set;
- low accuracy in high-dimensional spaces;
- the need to set the parameter K.

2.2 Multilayer Perceptron Neural Network

An Artificial Neural Network (ANN) of the Multilayer Perceptron type is an extension of the common Perceptron. The MLP is composed of a set of input units, or neurons, that represents the input layer, at least one hidden layer, and an output layer [16]. In pattern classification, an MLP separates the feature space using hyperplanes, by means of a supervised learning process. Regions of the feature space are thus associated with classes, and a new sample can be labeled according to the region in which it falls. As MLPs can have a greater number of layers, they are able to perform multiple separations of the feature space; hence, an MLP can build arbitrary shapes in feature space that represent different and complex classes [16]. The construction of an MLP involves some design issues, such as the number of hidden layers and the number of neurons in each layer, which should be set according to the problem.

2.3 Support Vector Machines

The Support Vector Machines (SVM) methodology is based on class separation through margins, in which samples are mapped to a feature space where they can be linearly separable [16]. The data is transformed to a new feature space, generally larger than the original, by a kernel function. Some popular kernels are the Linear, Polynomial, and Radial Basis Function (RBF) kernels. The ability to separate data with nonlinear distributions is related to the choice of this function, which should be made according to the problem domain [16]. Using an appropriate nonlinear mapping, samples of two classes can then be linearly separated by a hyperplane [16], [17] in this new, transformed, high-dimensional feature space [16], [4]. Thus, SVM training consists of finding the optimal hyperplane that maximizes the separating distance between the margins of each class [16], [17]. Samples located on the margins are called support vectors and are the most informative ones for building the classification decision boundary [16].
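As an illustration of the pixelwise setup described above, the sketch below trains the three classifiers on labeled pixel spectra. It is a minimal sketch using scikit-learn (an assumption of this example, not the Matlab/LIBSVM tooling used later in the paper), and the arrays, class counts, and hyperparameter values are illustrative placeholders.

```python
# Minimal pixelwise classification sketch (assumes scikit-learn is installed).
# X_train, X_test: pixel spectra with shape (n_pixels, d_bands); y_train: class labels.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 200))   # 300 labeled pixels, 200 bands (toy data)
y_train = rng.integers(0, 3, size=300)  # 3 classes
X_test = rng.normal(size=(50, 200))

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),                     # K chosen by the user
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),   # one hidden layer
    "SVM": SVC(kernel="rbf", C=10.0, gamma="scale"),                # RBF kernel, manually tuned
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)      # learn from labeled pixel spectra
    labels = clf.predict(X_test)   # assign one class per test pixel
    print(name, labels[:10])
```

Each pixel is treated independently, exactly as in the Pixelwise baseline: the only input to the classifier is the d-dimensional spectral vector.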
3. Feature Selection

There are many reasons to reduce the dimensionality of the data. As said previously, one of them is the Hughes effect. Another obvious reason is to reduce the computational complexity [17]: for example, more features imply more synaptic weights to be optimized in a neural network [17]. Feature selection is a preprocessing step which aims to minimize the number of features while keeping as much discriminant information as possible [17]. The selection of the $B$ optimal bands (features) is a combinatorial optimization problem in a very large search space. For example, to select $B$ bands from a set of $N$, the total number of possible combinations is

$$\binom{N}{B} = \frac{N!}{B!\,(N-B)!}.$$

For each combination of selected features, a separability criterion should be used to find the best subset. The computational load required to test all possible combinations is intractable; thus, a GA is a suitable tool to lead a search that optimizes a certain separability criterion.
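To give a sense of scale for this count, the snippet below evaluates the binomial coefficient for an AVIRIS-like problem; the particular values N = 200 and B = 20 are illustrative assumptions, not figures taken from the paper.

```python
# Number of ways to pick B bands out of N (why exhaustive search is intractable).
from math import comb

N, B = 200, 20     # an AVIRIS-like band count and a modest subset size (assumed values)
print(comb(N, B))  # roughly 1.6e27 candidate subsets for these values
```

Even at these modest settings the search space holds on the order of 10^27 candidate subsets, which is why a heuristic search such as a GA is used instead of exhaustive evaluation.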

In general, the selection process is divided into two different approaches: filter and wrapper [18]. The wrapper approach leads the search using information provided by a learning algorithm (classifier) [18]: at each step of the selection process, the chosen subset is evaluated by the learning algorithm, and the best subset is the one with the best evaluation [18]. Filter approaches, in contrast, are completely independent of information from the classifier, because feature selection is performed as a preprocessing step, before the classification process. When the number of features is high, the filter model is usually chosen due to its computational efficiency [10]. Figure 1 illustrates the wrapper approach and Figure 2 the filter model.

Fig. 1: Wrapper model. Modified from [18].

Fig. 2: Filter model. Modified from [18].

3.1 Singular Value Decomposition Band Subset Selection - SVDSS

The Singular Value Decomposition Band Subset Selection (SVDSS) is a heuristic based on the Singular Value Decomposition (SVD) and rank-revealing QR matrix factorization. The SVD of the hyperspectral image is computed, and then B bands are selected as the first B rows of the ranking built by the QR factorization [14], [15]. This unsupervised approach selects the most independent bands, looking for a subset of B bands whose total variability approximates that of the first B Principal Components of a Principal Component Analysis (PCA) [14], [19], [13]. This filter approach has been widely used in the field of remote sensed hyperspectral images [14], [15] and represents a strong technique for feature selection. Its advantage over combinatorial optimization approaches is that it can be performed in polynomial time [19]; however, it does not guarantee the optimal subset of features. More details on SVDSS can be seen in [14], [19], [13].

4. Proposal

Genetic Algorithms (GAs) are the most widespread techniques among evolutionary algorithms. They allow us to find potential solutions for optimization problems in an acceptable time, especially when the search space is very large [20]. This technique is a heuristic based on a population of individuals (e.g., chromosomes), in which each individual encodes a candidate solution for a problem [20] and can be represented as a bit string. Each individual is evaluated by a function called fitness, which establishes the quality of an individual as a solution. In a GA, the population is initialized randomly, or by some strategy chosen according to the problem at hand. The population then undergoes a determined number of evolutions; during this process, individuals are evolved and reproduced using genetic operations such as crossover, mutation and the selection process. The main goal is to find the individual with the best fitness [20].

We model the problem of feature selection as follows: each individual has a size of B bands/genes, as shown in Figure 3, and each gene represents the presence (bit 1) or the absence (bit 0) of a band; the subset of selected bands (the feature subset) is then composed of the bands marked as present.

Fig. 3: Chromosome representation and mapping of the hyperspectral bands onto the hypercube. Adapted from [6].

As previously stated, the quality of each candidate solution is evaluated according to a fitness function. The fitness function evaluates the selected features (the set of bands corresponding to bit 1 in the individual's chromosome) according to a metric of interest. Here, we are interested in a metric based on the homogeneity between samples in the same partition (class), as well as the dissimilarity between samples in different partitions. For this task, a cluster validity metric is used. One of the most popular metrics for cluster validity is Dunn's index [21]: the higher the index, the denser and the more distant from each other the clusters are. Let $U \equiv \{C_1, \ldots, C_c\}$ be a partition system built from a given subset of features. Dunn's index can then be calculated as

$$v_D(U) = \min_{1 \le i \le c} \left\{ \min_{\substack{1 \le j \le c \\ j \ne i}} \left\{ \frac{\delta(C_i, C_j)}{\max_{1 \le k \le c} \{\Phi(C_k)\}} \right\} \right\},$$

where $\Phi(C_k)$ is the diameter of cluster $C_k$ and $c$ is the number of clusters.
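A direct way to compute this index for labeled samples restricted to a candidate band subset is sketched below. This is a generic implementation of the formula above, assuming Euclidean distances, with δ taken as the minimum pairwise distance between clusters and Φ as the cluster diameter; other choices of δ (such as the centroid distance adopted in this work) can be substituted. All names are illustrative.

```python
# Dunn's index sketch: samples has shape (n_samples, n_selected_bands), labels are class ids.
import numpy as np
from scipy.spatial.distance import cdist, pdist

def dunn_index(samples, labels):
    clusters = [samples[labels == c] for c in np.unique(labels)]
    # Phi: diameter of each cluster (largest intra-cluster distance).
    diameters = [pdist(c).max() if len(c) > 1 else 0.0 for c in clusters]
    max_diameter = max(diameters)
    # delta: here, the smallest distance between points of two clusters;
    # the paper instead adopts the distance between cluster centroids.
    ratios = []
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            delta = cdist(clusters[i], clusters[j]).min()
            ratios.append(delta / max_diameter)
    return min(ratios)
```

Given a hypothetical 0/1 band mask `mask` and labeled samples `X`, `y`, the fitness of a chromosome would then amount to `dunn_index(X[:, mask == 1], y)` evaluated on the sampled training pixels.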
In our work, we adopted $\delta(C_i, C_j)$ as the distance between the centroids of clusters $C_i$ and $C_j$; however, any kind of inter-cluster distance measurement can be applied [21].

The similarity, or distance, metric used is the Euclidean distance, i.e.,

$$\delta(x, y) = \sqrt{(x_1 - y_1)^2 + \cdots + (x_n - y_n)^2}.$$

Thereby, the fitness function evaluates a partition system built from a subset of features. The aim is therefore to minimize the distances between samples belonging to the same cluster (intra-cluster) and to maximize the distances between samples of different clusters (inter-cluster). Thus, we expect to build clusters with a subset of features that maximizes Dunn's index, such that classifiers can build decision boundaries in an accurate way.

5. Experiments

In this section, we describe the experiments carried out on two well-known datasets for validating our proposal and comparing it with other feature selection/representation approaches (SVDSS and Pixelwise) using three classification algorithms (MLP, KNN, and SVM). After finding the most suitable features using the proposed scheme, and in order to perform a fair comparison, we use this same number of features as the parameter of the SVDSS approach. We use the SVDSS implementation from the Hyperspectral Image Analysis Toolbox (HIAT) for Matlab [22]. The MLP and KNN classifiers are native Matlab implementations, while the SVM is run from Matlab using the LIBSVM [23] implementation. Firstly, we present details regarding the GA and classifier setups used in our proposal.

5.1 GA and Classifiers Setups

An individual's chromosome, i.e., the features present in the individual, is initialized in a random way, and the parameters were set according to the results of preliminary experiments. Table 1 presents all parameters used in the GA; notice that all parameters are the same for both datasets. The samples are randomly chosen; however, the total number of samples has an important impact on performance: the higher the number of samples, the higher the time to calculate the fitness of each individual. In order to ensure high reliability of the results, ten runs of the GA were performed for each dataset, and we then selected the features that appeared most frequently. Note that we do not define the number of features to be found; we expect that, at the end of all runs, the numbers of features found in each run are close to each other.

Table 1: GA parameters for the Indian Pines and Pavia University datasets.

  Parameter                           Value
  Population (number of individuals)  100
  Number of ages (generations)        500
  Crossover probability               80%
  Mutation probability                0.9%
  K-tournament                        2
  K-elitism                           2
  Samples per class                   50

Three different classification algorithms were used: Support Vector Machines (SVM), Multilayer Perceptron Neural Network (MLP) and K-Nearest Neighbor (KNN). The kernel used in the SVM is the Radial Basis Function (RBF), and its parameters were manually adjusted. The configuration of the MLP was: a single hidden layer, with a number of neurons equal to the square root of the number of input patterns times the number of output patterns, a sigmoidal transfer function, and the backpropagation training algorithm. In the KNN, the parameter K was set manually.
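The sketch below shows how a GA with this configuration could be wired up. It is a simplified, hand-rolled loop under the Table 1 settings (population 100, 500 generations, 80% crossover, 0.9% mutation, 2-tournament selection, elitism of 2), with `dunn_fitness` standing in for the Dunn's-index evaluation described in Section 4; it is an assumption-laden sketch, not the authors' implementation.

```python
# Hand-rolled GA sketch for band selection; n_bands would be, e.g., 200 (AVIRIS) or 103 (ROSIS).
import numpy as np

rng = np.random.default_rng(42)

def evolve(dunn_fitness, n_bands, pop_size=100, generations=500,
           p_crossover=0.8, p_mutation=0.009, elite=2):
    pop = rng.integers(0, 2, size=(pop_size, n_bands))         # random 0/1 chromosomes
    for _ in range(generations):
        fit = np.array([dunn_fitness(ind) for ind in pop])
        order = np.argsort(-fit)
        new_pop = [pop[i].copy() for i in order[:elite]]        # elitism: keep the 2 best
        while len(new_pop) < pop_size:
            # 2-tournament selection of two parents
            a, b = (pop[max(rng.integers(0, pop_size, 2), key=lambda i: fit[i])]
                    for _ in range(2))
            child = a.copy()
            if rng.random() < p_crossover:                      # single-point crossover
                cut = rng.integers(1, n_bands)
                child[cut:] = b[cut:]
            flip = rng.random(n_bands) < p_mutation             # bit-flip mutation
            child[flip] ^= 1
            new_pop.append(child)
        pop = np.array(new_pop)
    fit = np.array([dunn_fitness(ind) for ind in pop])
    return pop[np.argmax(fit)]                                   # best chromosome found
```

In our modeling, `dunn_fitness(ind)` would restrict the sampled labeled pixels to the bands where `ind` is 1 and return the Dunn's index of the resulting class partition.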
5.2 Selected Features by the GA Scheme

It is well known that each channel of a hyperspectral image records a response/amplitude at a specific wavelength. Thus, since the indexes of these channels are correlated with their wavelengths, some closeness between responses with near indexes was expected. It is interesting to note that our approach always finds subsets of features lying in the same tracks; that is, for each GA execution new subsets are generated, but with indexes that are equal (same features) or very close (same tracks). Since we decided to let the GA find the optimal number of features, it is also important to note that each run yielded almost the same number of features. Figure 5 shows the average number of features of the best individuals of each age, over 10 runs. This indicates that there are tracks of spectral bands that are more discriminative than others, according to Dunn's index. Thereby, features that repeat in at least 50% of all 10 runs were selected to compose the final subset of features.

Fig. 5: Average number of features of the best individuals of each age. Pavia University and Indian Pines datasets in blue and green, respectively.

5.3 Classification Results

As can be seen in Table 2, the results of our proposal were very close to those of the widely used SVDSS for all the classifiers, but no improvements in terms of OA and AA over the Pixelwise approach were obtained by either SVDSS or our proposal.
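Overall accuracy (OA) and average accuracy (AA) are the two summary figures reported in Tables 2 and 3. As a reminder of how they are commonly computed (a generic sketch, not tied to the toolboxes used in the paper):

```python
# OA: fraction of correctly labeled test pixels; AA: mean of the per-class accuracies.
import numpy as np

def oa_aa(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    oa = np.mean(y_true == y_pred)
    per_class = [np.mean(y_pred[y_true == c] == c) for c in np.unique(y_true)]
    return oa, np.mean(per_class)
```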

However, the results are still close to the ones of the Pixelwise approach, with the advantage of reducing the computational overhead of the classification process, since the feature space was reduced. Figure 4b shows the classification map of the Indian Pines dataset obtained using the subset of features found by our approach and the SVM classifier, while Figure 4a shows the respective ground-truth.

Fig. 4: Results for the Indian Pines dataset. (a) Ground-truth. (b) Classification map - Proposal+SVM.

Table 2: Results for the Indian Pines dataset. OA (%), AA (%), and per-class accuracies for the KNN, MLP, and SVM classifiers under the Pixelwise, SVDSS, and Proposal approaches, together with the number of samples and the percentage of training samples, for the classes Alfalfa, Corn-notill, Corn-min, Corn, Grass/pasture, Grass/trees, Grass/pasture-mowed, Hay-windrowed, Oats, Soybeans-notill, Soybeans-min, Soybean-clean, Wheat, Woods, Bldg-grass-trees-drives, and Stone-steel towers.

In order to achieve consistency of results, a procedure similar to the one applied to the Indian Pines dataset was applied to the Pavia University dataset. Again, features that repeat in at least 50% of all executions were selected to compose the final subset of features. At the end of the 500 ages, over the 10 runs, the GA always found a mean of 38 out of the 103 features. The blue curve in Figure 5 shows the average number of features of the best individuals of each age, over 10 runs. Table 3 shows that, for the KNN and MLP classifiers, our approach achieved greater or similar results in terms of OA compared to SVDSS. However, with more robust classifiers, such as the SVM, the Pixelwise approach holds the highest OA values; nevertheless, the SVM results obtained with SVDSS and with our subset remain close to it. Figure 6b shows the classification map obtained using the subset of features found by our approach and the SVM classifier, while Figure 6a shows the respective ground-truth.

A very relevant limitation of Dunn's index is its sensitivity to a few outliers [21]. Due to that, it is possible that this cluster validity measure sometimes provides an inappropriate fitness for our purpose. In order to overcome this problem, generalizations of Dunn's index have been proposed [21] and may provide a better estimate for our problem.
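One such generalization, in the spirit of the generalized Dunn indices of [21], replaces the minimum point-to-point separation and the diameter with statistics that are less sensitive to a single outlier. The sketch below is an illustrative variant using average pairwise separation between clusters and twice the mean distance to the centroid as the cluster spread; it is not the specific formulation adopted in this paper.

```python
# An outlier-tolerant Dunn-like index: average linkage between clusters,
# and twice the mean distance to the centroid as the cluster "diameter".
import numpy as np
from scipy.spatial.distance import cdist

def generalized_dunn(samples, labels):
    clusters = [samples[labels == c] for c in np.unique(labels)]
    centroids = [c.mean(axis=0) for c in clusters]
    spreads = [2.0 * np.linalg.norm(c - m, axis=1).mean()
               for c, m in zip(clusters, centroids)]
    max_spread = max(spreads)
    ratios = []
    for i in range(len(clusters)):
        for j in range(i + 1, len(clusters)):
            delta = cdist(clusters[i], clusters[j]).mean()   # average linkage
            ratios.append(delta / max_spread)
    return min(ratios)
```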

Fig. 6: Results for the Pavia University dataset. (a) Ground-truth. (b) Classification map - Proposal+SVM.

Table 3: Results for the Pavia University dataset. OA (%), AA (%), and per-class accuracies for the KNN, MLP, and SVM classifiers under the Pixelwise, SVDSS, and Proposal approaches, together with the number of samples and the percentage of training samples, for the classes Asphalt, Meadow, Gravel, Trees, Metal Sheets, Bare Soil, Bitumen, Bricks, and Shadow.

6. Conclusions

Due to the high dimensionality of remote sensed hyperspectral images, their rich content, and the drawbacks this poses for data discrimination, we have investigated the benefits of using feature selection approaches for the problem of classifying this type of data. A new filter feature selection approach was proposed. The main idea of this proposal is that smaller subsets of features which generate clusters with high values of Dunn's index may provide enough discriminant information for the classification task, and hence make it easier to build decision boundaries with good generalization power. We used the advantages of genetic algorithms to lead the search for subsets of features that yield better clusters. The use of genetic algorithms and a cluster validity measure (Dunn's index) has proven suitable for the problem of feature selection. However, it is noticeable that Dunn's index may be affected by a few outliers, and for this reason other cluster validity measures should be explored. Nevertheless, our proposal shows that a smaller number of features with enough discriminative power is easily achievable.

Acknowledgements

The authors would like to thank FAPEMIG, CAPES and CNPq for the financial support.

References

[1] C. Chang, Hyperspectral Data Exploitation: Theory and Applications. Wiley-Blackwell.
[2] A. Plaza et al., Recent advances in techniques for hyperspectral image processing, Remote Sensing of Environment, vol. 113, no. 1, 2009.

[3] B.-C. Kuo, C.-H. Li, and J.-M. Yang, Kernel nonparametric weighted feature extraction for hyperspectral image classification, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 47, no. 4.
[4] G. Camps-Valls and L. Bruzzone, Kernel-based methods for hyperspectral image classification, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 43, no. 6.
[5] J. Benediktsson, J. Palmason, and J. Sveinsson, Classification of hyperspectral data from urban areas based on extended morphological profiles, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 43, no. 3.
[6] Y. Tarabalka, Classification of hyperspectral data using spectral-spatial approaches, Ph.D. dissertation, University of Iceland and Grenoble Institute of Technology.
[7] G. Hughes, On the mean accuracy of statistical pattern recognizers, IEEE Transactions on Information Theory, vol. 14, no. 1.
[8] Y. Bazi and F. Melgani, Toward an optimal SVM classification system for hyperspectral remote sensing images, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 44, no. 11.
[9] S. Serpico and L. Bruzzone, A new search algorithm for feature selection in hyperspectral remote sensing images, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 39, no. 7.
[10] L. Yu and H. Liu, Feature selection for high-dimensional data: A fast correlation-based filter solution, in International Workshop on Machine Learning, 2003.
[11] R. Green et al., Imaging spectroscopy and the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), Remote Sensing of Environment, vol. 65, no. 3.
[12] P. Gege et al., System analysis and performance of the new version of the imaging spectrometer ROSIS, in EARSeL Workshop on Imaging Spectroscopy, 1998.
[13] M. Velez-Reyes and L. Jimenez, Subset selection analysis for the reduction of hyperspectral imagery, in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), vol. 3, 1998.
[14] L. Jimenez-Rodriguez, E. Arzuaga-Cruz, and M. Vélez-Reyes, Unsupervised linear feature-extraction methods and their effects in the classification of high-dimensional data, IEEE Trans. on Geoscience and Remote Sensing (TGARS), vol. 45, no. 2.
[15] G. Bilgin, S. Ertürk, and T. Yıldırım, Segmentation of hyperspectral images via subtractive clustering and cluster validation using one-class support vector machines, IEEE Trans. on Geoscience and Remote Sensing (TGARS), no. 99, pp. 1-9.
[16] R. Duda, P. Hart, and D. Stork, Pattern Classification and Scene Analysis, 2nd ed. John Wiley & Sons.
[17] C. Bishop, Pattern Recognition and Machine Learning. Springer New York, 2006, vol. 4.
[18] R. Kohavi and G. John, Wrappers for feature subset selection, Artificial Intelligence, vol. 97, no. 1-2.
[19] M. Velez-Reyes, L. Jimenez, D. Linares, and H. Velazquez, Comparison of matrix factorization algorithms for band selection in hyperspectral imagery, in SPIE Proceedings Series, 2000.
[20] D. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
[21] J. Bezdek and N. Pal, Cluster validation with generalized Dunn's indices, in International Two-Stream Conference on Artificial Neural Networks and Expert Systems, 1995.
[22] E. Arzuaga-Cruz, L. Jimenez-Rodriguez, M. Velez-Reyes, D. Kaeli, E. Rodriguez-Diaz, H. Velazquez-Santana, A. Castrodad-Carrau, L. Santos-Campis, and C. Santiago, A MATLAB toolbox for hyperspectral image analysis, in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), vol. 7, 2004.
[23] C.-C. Chang and C.-J. Lin, LIBSVM: A library for support vector machines, ACM Transactions on Intelligent Systems and Technology, vol. 2, no. 3, pp. 27:1-27:27, 2011. Software available at cjlin/libsvm.


More information

Support Vector Machines

Support Vector Machines Support Vector Machines About the Name... A Support Vector A training sample used to define classification boundaries in SVMs located near class boundaries Support Vector Machines Binary classifiers whose

More information

Classification of Hyper spectral Image Using Support Vector Machine and Marker-Controlled Watershed

Classification of Hyper spectral Image Using Support Vector Machine and Marker-Controlled Watershed Classification of Hyper spectral Image Using Support Vector Machine and Marker-Controlled Watershed Murinto #1, Nur Rochmah DPA #2 # Department of Informatics Engineering, Faculty of Industrial Technology,

More information

IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, VOL. 12, NO. 2, FEBRUARY

IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, VOL. 12, NO. 2, FEBRUARY IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, VOL. 12, NO. 2, FEBRUARY 2015 349 Subspace-Based Support Vector Machines for Hyperspectral Image Classification Lianru Gao, Jun Li, Member, IEEE, Mahdi Khodadadzadeh,

More information

Distributed Optimization of Feature Mining Using Evolutionary Techniques

Distributed Optimization of Feature Mining Using Evolutionary Techniques Distributed Optimization of Feature Mining Using Evolutionary Techniques Karthik Ganesan Pillai University of Dayton Computer Science 300 College Park Dayton, OH 45469-2160 Dale Emery Courte University

More information

A Vector Agent-Based Unsupervised Image Classification for High Spatial Resolution Satellite Imagery

A Vector Agent-Based Unsupervised Image Classification for High Spatial Resolution Satellite Imagery A Vector Agent-Based Unsupervised Image Classification for High Spatial Resolution Satellite Imagery K. Borna 1, A. B. Moore 2, P. Sirguey 3 School of Surveying University of Otago PO Box 56, Dunedin,

More information

Color-Based Classification of Natural Rock Images Using Classifier Combinations

Color-Based Classification of Natural Rock Images Using Classifier Combinations Color-Based Classification of Natural Rock Images Using Classifier Combinations Leena Lepistö, Iivari Kunttu, and Ari Visa Tampere University of Technology, Institute of Signal Processing, P.O. Box 553,

More information

Introduction to digital image classification

Introduction to digital image classification Introduction to digital image classification Dr. Norman Kerle, Wan Bakx MSc a.o. INTERNATIONAL INSTITUTE FOR GEO-INFORMATION SCIENCE AND EARTH OBSERVATION Purpose of lecture Main lecture topics Review

More information

Classification of hyperspectral images by using extended morphological attribute profiles and independent component analysis

Classification of hyperspectral images by using extended morphological attribute profiles and independent component analysis Classification of hyperspectral images by using extended morphological attribute profiles and independent component analysis Mauro Dalla Mura, Alberto Villa, Jon Atli Benediktsson, Jocelyn Chanussot, Lorenzo

More information

High Resolution Remote Sensing Image Classification based on SVM and FCM Qin LI a, Wenxing BAO b, Xing LI c, Bin LI d

High Resolution Remote Sensing Image Classification based on SVM and FCM Qin LI a, Wenxing BAO b, Xing LI c, Bin LI d 2nd International Conference on Electrical, Computer Engineering and Electronics (ICECEE 2015) High Resolution Remote Sensing Image Classification based on SVM and FCM Qin LI a, Wenxing BAO b, Xing LI

More information

URBAN IMPERVIOUS SURFACE EXTRACTION FROM VERY HIGH RESOLUTION IMAGERY BY ONE-CLASS SUPPORT VECTOR MACHINE

URBAN IMPERVIOUS SURFACE EXTRACTION FROM VERY HIGH RESOLUTION IMAGERY BY ONE-CLASS SUPPORT VECTOR MACHINE URBAN IMPERVIOUS SURFACE EXTRACTION FROM VERY HIGH RESOLUTION IMAGERY BY ONE-CLASS SUPPORT VECTOR MACHINE P. Li, H. Xu, S. Li Institute of Remote Sensing and GIS, School of Earth and Space Sciences, Peking

More information

A Robust Band Compression Technique for Hyperspectral Image Classification

A Robust Band Compression Technique for Hyperspectral Image Classification A Robust Band Compression Technique for Hyperspectral Image Classification Qazi Sami ul Haq,Lixin Shi,Linmi Tao,Shiqiang Yang Key Laboratory of Pervasive Computing, Ministry of Education Department of

More information

Choosing the kernel parameters for SVMs by the inter-cluster distance in the feature space Authors: Kuo-Ping Wu, Sheng-De Wang Published 2008

Choosing the kernel parameters for SVMs by the inter-cluster distance in the feature space Authors: Kuo-Ping Wu, Sheng-De Wang Published 2008 Choosing the kernel parameters for SVMs by the inter-cluster distance in the feature space Authors: Kuo-Ping Wu, Sheng-De Wang Published 2008 Presented by: Nandini Deka UH Mathematics Spring 2014 Workshop

More information