Please use this identifier to cite or link to this item: http://localhost:8081/xmlui/handle/123456789/5341
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Mukherjee, Kriti
dc.date.accessioned: 2014-10-09T06:46:03Z
dc.date.available: 2014-10-09T06:46:03Z
dc.date.issued: 2011
dc.identifier: Ph.D (en_US)
dc.identifier.uri: http://hdl.handle.net/123456789/5341
dc.guide: Mittal, R. C.
dc.guide: Ghosh, J. K.
dc.description.abstract: Spectral imaging of the earth using hyperspectral sensors designed on the principle of charge-coupled devices is a recent advance in remote sensing. These sensors are superior in the sense that they are capable of collecting a large number of spectral bands for each instantaneous field of view of the sensor. Compared to multispectral data, which have two to ten bands per image, hyperspectral data carry detailed spectral information about the ground surface in tens to thousands of bands. This makes hyperspectral data useful for studying subtly different classes and for applications such as target recognition, anomaly detection and background characterization. Because of the huge data volume associated with each scene, hyperspectral data demand specific attention to the complexity of receiving, storing, transforming and processing the data. In particular, the high dimensionality of the data makes analysis of the images a complex problem. Several researchers have studied the characteristics of high-dimensional space and their implications for hyperspectral data analysis. It has been shown that the volume of a hypercube tends to concentrate in its corners, and that of a hyperellipsoid in an outer shell; consequently, high-dimensional space is mostly empty. Furthermore, with a limited number of training samples, classifier performance initially improves with dimensionality and then declines. Moreover, under given circumstances, the required number of training samples grows linearly with dimensionality for a linear classifier and with the square of the dimensionality for a quadratic classifier, and the situation is even worse for nonparametric classifiers such as neural networks. Under these circumstances, a large number of classes of interest and a large number of available spectral bands demand a large number of training samples, which are unfortunately expensive and tedious to acquire. As a result, either the class statistics must be estimated from the limited training sample set, or feature selection/extraction based dimensionality reduction methods are required in order to obtain good classification accuracy. A large number of dimensionality reduction approaches have been, and continue to be, developed for hyperspectral data to deal with these limitations. Feature selection based approaches reduce the dimensionality of the data by selecting a representative subset of the original features. Since only a few of the available bands are chosen, this approach always suffers from loss of original information. Feature extraction methods, on the other hand, preserve most of the desired original information. However, many feature extraction algorithms project the data onto a new coordinate system, and the physical meaning of the original hyperspectral data is lost after the transformation. Spectral space based feature extraction methods, which operate on the spectral response curve of each pixel of hyperspectral data, are another alternative. Such dimensionality reduction algorithms take the physical meaning of the data, i.e., the pattern of the spectral response curve (SRC), into consideration, and at the same time no information from the original data is ignored, since the features are generated from the whole SRC.
Because of these advantages of spectral space based methods, this research work has concentrated on developing a new methodology for spectral space based dimensionality reduction. Fractal mathematics introduces the concept of fractional dimension, which assigns a unique dimension value to each real, that is irregular, object on the earth. Unlike Euclidean geometry, where dimension can only be 1, 2 or 3, fractal geometry treats dimension as a continuum that can take fractional values. According to the fractal concept, each SRC of hyperspectral data, being irregular, can be represented by a unique fractal dimension greater than one and less than two. Moreover, the spectral signature, in the form of an SRC, differs from one land cover class to another: the SRC corresponding to each pixel of hyperspectral data uniquely characterises the land cover class occupied by that pixel. Therefore the fractal dimension, which uniquely characterises an SRC, is also unique to the land cover class under consideration. Keeping this significance of fractal dimension in mind, the proposed methodology of dimensionality reduction has been applied to each SRC to extract fractal features that represent the original SRC at a reduced dimensionality. The fractal dimension of the spectral response curves has been computed using four methods, viz. Sevcik's method, the power spectrum method, the variogram method and the divider method. The features have been generated after smoothing, interpolating and segmenting the spectral response curves: the fractal dimension of each segment has been multiplied by the spectral energy associated with that segment, so that one new feature is generated for each segment of the SRC. The new features so obtained have then been used to classify three sets of real hyperspectral data having an increasing number of subtly different classes to be distinguished from each other. Finally, the optimum number of reduced dimensions has been identified using the criterion of maximum classification accuracy. To test its validity, the proposed method of dimensionality reduction has been compared with conventional dimensionality reduction methods: principal component analysis (PCA), minimum noise fraction (MNF), independent component analysis (ICA), segmented principal component transform (SPCT) and decision boundary feature extraction (DBFE). Validation has been performed in two ways: by comparing the classification results of the features extracted by the methods under comparison, and by comparing their algorithmic complexity. It has been observed that the proposed method, with lower computational complexity than the conventional methods, is able to provide classification accuracy statistically equivalent to that of the conventional methods. The proposed method has thus been shown to be a viable option for dimensionality reduction of hyperspectral data and, because of its lower computational complexity, a better alternative to the conventional methods discussed here. (en_US)
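The abstract's point that high-dimensional space is mostly empty can be checked numerically: the fraction of a unit hypercube's volume occupied by its inscribed ball collapses as the number of dimensions grows, leaving nearly all of the volume near the corners. The following is a minimal illustrative sketch, not part of the thesis:

```python
from math import gamma, pi

def inscribed_ball_fraction(d: int) -> float:
    """Volume of the ball of radius 1/2 inscribed in the unit
    d-dimensional hypercube, i.e. the fraction of the cube it fills."""
    return pi ** (d / 2) * 0.5 ** d / gamma(d / 2 + 1)

for d in (2, 3, 10, 20, 50):
    print(f"d = {d:2d}  fraction = {inscribed_ball_fraction(d):.3e}")
# The fraction falls from about 7.85e-01 at d = 2 to about 2.46e-08
# at d = 20: almost all of the volume lies near the corners.
```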
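The feature-generation step summarised in the abstract (segment each spectral response curve, estimate a fractal dimension per segment, and weight it by the segment's spectral energy) can likewise be sketched. The snippet below is a hypothetical reconstruction, assuming one common formulation of Sevcik's fractal dimension estimate and an arbitrary segment count of 10; the thesis's actual smoothing and interpolation steps, and its power spectrum, variogram and divider estimators, are omitted here.

```python
import numpy as np

def sevcik_fd(y: np.ndarray) -> float:
    """Fractal dimension of a 1-D curve via Sevcik's estimate:
    normalise the curve into the unit square, measure its length L,
    and take D ~ 1 + ln(L) / ln(2 * (N - 1))."""
    n = len(y)
    x = np.linspace(0.0, 1.0, n)
    span = np.ptp(y)
    ys = (y - np.min(y)) / span if span > 0 else np.zeros(n)
    length = np.sum(np.hypot(np.diff(x), np.diff(ys)))
    return 1.0 + np.log(length) / np.log(2.0 * (n - 1))

def fractal_features(src: np.ndarray, n_segments: int = 10) -> np.ndarray:
    """Reduce one spectral response curve to n_segments features:
    per-segment fractal dimension weighted by per-segment energy."""
    segments = np.array_split(np.asarray(src, dtype=float), n_segments)
    return np.array([sevcik_fd(seg) * np.sum(seg ** 2) for seg in segments])

# Hypothetical cube of shape (rows, cols, bands); real data would be loaded instead.
cube = np.random.default_rng(0).random((5, 5, 224))
reduced = np.apply_along_axis(fractal_features, axis=2, arr=cube)
print(reduced.shape)   # (5, 5, 10): 224 bands reduced to 10 features per pixel
```

In this form each pixel is reduced from hundreds of bands to a handful of features while the shape of its SRC, captured by the per-segment fractal dimension, is retained.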
dc.language.iso: en (en_US)
dc.subject: CIVIL ENGINEERING (en_US)
dc.subject: FRACTAL ANALYSIS (en_US)
dc.subject: DIMENSIONALITY (en_US)
dc.subject: HYPERSPECTRAL DATA (en_US)
dc.title: FRACTAL ANALYSIS TO REDUCE THE DIMENSIONALITY OF HYPERSPECTRAL DATA (en_US)
dc.type: Doctoral Thesis (en_US)
dc.accession.number: G21502 (en_US)
Appears in Collections: DOCTORAL THESES (Civil Engg)

Files in This Item:
File: CEDG21502.pdf (24.75 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.