

Hyperspectral Imaging Seminar: HSI Sensor Fusion
Noa Privman Horesh, December 2012
Slide 1
Noa Privman Horesh, December 2012

Slide 2
Many uses for fusion:
- Visualization - fusion between bands
- Sharpening - fusion between a hyperspectral image and a panchromatic image
- Detection and classification - fusion between a hyperspectral image and panchromatic / FOPEN SAR / LIDAR images

Slide 3
Visualization
In lecture 3, "Displaying of Hyperspectral Images on RGB Displays," we saw several algorithms:
- 1BT-based band selection
- Principal Components Analysis (PCA)
- Spectral weighting envelopes

Slide 4
Visualization (cont.)
Another method for obtaining a better visualization: hierarchical fusion based on vector quantization and bilateral filtering.

Slide 5
Hierarchical Fusion Using Vector Quantization for Visualization of Hyperspectral Images
A typical hyperspectral data set in remote sensing contains a few hundred band images to be fused into a single image (for grayscale display) or three images (for RGB display). Fusing all the bands together at once requires heavy computation and a lot of memory.

Slide 6
Visualization - Hierarchical Fusion
For a hyperspectral image cube of dimensions X × Y × N, vector quantization (VQ) based fusion is applied across contiguous subsets of dimensions X × Y × P to generate B = N/P fused images at the first stage of the hierarchy. In the subsequent levels of the hierarchy, contiguous images are grouped into smaller subsets and fused using bilateral filtering.

Slide 7
Visualization - Hierarchical Fusion (cont.)
Images I_1 to I_N from the N contiguous bands are organized into Group 1 to Group B using uniform grouping, so each group has P = N/B images, each of size X × Y. First stage: each group is fused individually using vector quantization.

Slide 8
Fusion using Vector Quantization
Vector quantization is used to compress the information while preserving the most important features. Each image I_k is divided into sub-blocks of size m × m, giving (X × Y)/m^2 image blocks. In a given group there are IVn = (X × Y × P)/m^2 image sub-blocks in total.
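The band grouping and sub-block bookkeeping above can be sketched in a few lines of NumPy. This is a minimal illustration: the function names and the toy cube sizes are not from the paper.

```python
import numpy as np

def group_bands(cube, P):
    """Split an (X, Y, N) hyperspectral cube into B = N // P
    contiguous groups of P bands each (uniform grouping)."""
    X, Y, N = cube.shape
    B = N // P
    return [cube[:, :, g * P:(g + 1) * P] for g in range(B)]

def image_to_block_vectors(img, m):
    """Divide an (X, Y) image into m x m sub-blocks and flatten each
    block to a vector of length m^2, giving (X*Y)/m^2 image vectors."""
    X, Y = img.shape
    return (img.reshape(X // m, m, Y // m, m)
               .swapaxes(1, 2)
               .reshape(-1, m * m))

cube = np.random.rand(8, 8, 12)                      # toy cube: X = Y = 8, N = 12
groups = group_bands(cube, P=4)                      # B = 12 / 4 = 3 groups
vectors = image_to_block_vectors(cube[:, :, 0], 2)   # (8*8)/2^2 = 16 vectors
```

Stacking the block vectors of all P images in a group produces the IVn × m^2 cluster matrix S used in the next step.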
Slide 9
Generate the first code-vector
Convert the image blocks to one-dimensional vectors, each of size m^2, and stack them into a cluster matrix S of size IVn × m^2. The first code-vector, CV(1,1), of the code-book of size 1 is computed as the column-wise average of the entire cluster.

Slide 10
Generate the code-book
The code-vector CV(1,1) is then split into two code-vectors by adding and subtracting a tolerance ε, in order to double the code-book size.

Slide 11
Generate the code-book (cont.)
The original cluster S is divided into two clusters, S1 and S2, based on the distortions D1(2,1) and D1(2,2) with respect to the two code-vectors: comparing D1(2,1)(k) and D1(2,2)(k) for each image vector k, the vectors of S are grouped into the sub-cluster whose code-vector gives the smaller distortion.

Slide 12
Generate the code-book (cont.)
The quality of the code-book is enhanced by updating the existing code-vectors: each code-vector is replaced by the mean of the image-vectors in its sub-cluster (S1 or S2). The distortions are then recalculated over the complete image-vector set S to obtain updated sub-clusters S1 and S2.

Slide 13
Generate the code-book (cont.)
The update is repeated until the total distortion at the current iteration is no longer significantly less than at the previous iteration. The result is n code-vectors, each of size m^2, in a code-book of size n × m^2.

Slide 14
Fusion using Vector Quantization
Each image I_i is rearranged into a matrix of size ((X × Y)/m^2) × m^2. The rearranged image is compared with all n code-vectors with respect to MSE, and the MSE values of all P images for a given sub-block position are added per code-vector.
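The split-and-refine code-book construction described above is essentially the LBG (Linde-Buzo-Gray) algorithm. A minimal sketch follows; ε is the split tolerance, and a relative-improvement threshold stands in for the slides' distortion-based stopping rule.

```python
import numpy as np

def lbg_codebook(S, n_vectors, eps=0.01, tol=1e-4):
    """Build a code-book of n_vectors code-vectors from the cluster
    matrix S (IVn x m^2) by repeated splitting and mean updates."""
    codebook = S.mean(axis=0, keepdims=True)       # CV(1,1): column-wise average
    while codebook.shape[0] < n_vectors:
        # double the code-book by adding / subtracting the tolerance eps
        codebook = np.vstack([codebook + eps, codebook - eps])
        prev = np.inf
        while True:
            # distortion of every image-vector w.r.t. every code-vector
            d = ((S[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            assign = d.argmin(axis=1)              # split S into sub-clusters
            total = d.min(axis=1).sum()
            # update each code-vector to the mean of its sub-cluster
            for j in range(codebook.shape[0]):
                if np.any(assign == j):
                    codebook[j] = S[assign == j].mean(axis=0)
            if prev - total < tol * max(total, 1e-12):
                break                              # no significant improvement
            prev = total
    return codebook

S = np.random.rand(50, 4)          # toy cluster: 50 image-vectors, m = 2
cb = lbg_codebook(S, n_vectors=4)
```

For the fusion step on slide 14, each sub-block position then selects the code-vector minimizing the summed MSE over the P images of the group.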
The code-vector CV_i that gives the minimum sum of MSE values is selected as the i-th sub-block of the fused image I_F.

Slide 15
Hierarchical Fusion - Vector Quantization
At the end of the first-stage fusion there are B fused images (I_{1,1} to I_{1,B}), which become the input images for the second level of the hierarchy.

Slide 16
Fusion using Bilateral Filtering
Bilateral filtering is used only from the second hierarchical level onward, after the redundancy removal achieved in the first stage through vector quantization.

Slide 17
A bilateral filter
A bilateral filter is an edge-preserving, noise-reducing smoothing filter. The intensity value at each pixel is replaced by a weighted average of the intensity values of nearby pixels, with the weights drawn from Gaussian distributions in both space and intensity. Sharp edges are preserved because pixels on the far side of an edge differ strongly in intensity from the center pixel and therefore receive little weight.

Slide 18
Fusion using Bilateral Filtering
Compute the bilateral-filtered image, then calculate the weight at each pixel (x, y) for each image.

Slide 19
Fusion using Bilateral Filtering
The fused image I_F of the hyperspectral cube subset is given by the weighted sum of the input images.

Slides 20-22
Results: the 1st and the 81st image of the urban image cube (Palo Alto) from the Hyperion dataset, and the fused results (a), (b), (c) (figures).

Slide 23
Sharpening
Combine the high spatial and the high spectral resolutions in order to obtain a complete and accurate description of the observed scene. The method described next is unmixing-based constrained nonnegative matrix factorization (UCNMF).

Slide 24
Unmixing-based constrained nonnegative matrix factorization

Slide 25
Nonnegative matrix factorization (NMF) for hyperspectral unmixing
The hyperspectral data form a 3D array; a matrix V ∈ R^(L×K) stores the original hyperspectral data (L spectral bands, K pixels).
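The bilateral-filter fusion stage (slides 16-19 above) can be sketched as follows. The detail-strength weight |I - BF(I)| is an assumption standing in for the exact weight formula on the slides, which was not reproduced in this transcript.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Edge-preserving smoothing: each pixel becomes a weighted average of
    its neighbours, with Gaussian weights in both space and intensity."""
    X, Y = img.shape
    out = np.empty_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    pad = np.pad(img, radius, mode='edge')
    for i in range(X):
        for j in range(Y):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # intensity (range) weights relative to the centre pixel
            range_w = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r ** 2))
            w = spatial * range_w
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def fuse_bilateral(images, **kw):
    """Weight each image by the detail it contributes (|I - BF(I)|,
    an assumed weight) and return the normalised weighted sum."""
    weights = [np.abs(im - bilateral_filter(im, **kw)) + 1e-9 for im in images]
    total = np.sum(weights, axis=0)
    return np.sum([w * im for w, im in zip(weights, images)], axis=0) / total

a, b = np.random.rand(8, 8), np.random.rand(8, 8)   # two toy first-stage images
fused = fuse_bilateral([a, b])
```

Because the weights are normalized per pixel, each fused pixel is a convex combination of the corresponding input pixels.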
V = WH + N
where W ∈ R^(L×S) is the spectral signature matrix, H ∈ R^(S×K) is the abundance matrix (S endmembers), and N is noise.

Slide 26
NMF for hyperspectral unmixing (cont.)
To unmix the hyperspectral data, NMF can be conducted by minimizing the square of the Euclidean distance between V and WH.

Slide 27
Unmixing-based constrained NMF (UCNMF) for image fusion
After creating the abundance matrix, a weighted fusion method is adopted, giving the fused data V_f:
V_f = W(αH + (1 - α)P)
where P carries the panchromatic detail and 0 < α < 1 is the fusion weight.

Slide 28
Preserve the spectral information of the original hyperspectral image
The fused image does not hold the same spectral quality as the original hyperspectral image, which causes spectral distortion. A constraint function S(V_f) is therefore added to limit the distortion.

Slide 29
Final fusion model
min J(W, H) = F(W, H) + S(V_f)
s.t. W ≥ 0, H ≥ 0, V_f = W(αH + (1 - α)P)
This is an optimization problem.

Slide 30
Algorithm (outline: Lin-PG for UCNMF)
1. Given parameters 0 < α < 1, 0 < β < 1, 0 < σ < 1, set the initial step size λ_0 = 1, initialize the matrices W_0, H_0, and calculate the initial objective value.
2. For k = 1, 2, ...:
(a) Assign λ_k ← λ_{k-1}.
(b) If λ_k satisfies the sufficient-decrease condition (1), repeatedly increase it by λ_k ← λ_k/β until either λ_k no longer satisfies (1) or W, H remain the same before and after the change of λ_k; otherwise, repeatedly decrease it by λ_k ← λ_k·β until λ_k satisfies (1).
(c) Update W by (2) and H by (3).
(d) Calculate the objective value.
3. Repeat step 2 until the stopping condition given in (4) is satisfied.
4. Obtain the fused image V_f = W(αH + (1 - α)P).

Slide 31: (figure)

Slide 32
Results
The proposed method, UCNMF, has the advantage that it can improve the spatial resolution of the hyperspectral image without losing much of its spectral (color) information.

Slides 33-38: result images (figures only).

Slide 39
Detection and classification
Fusing data from hyperspectral imaging (HSI) sensors with data from other sensors can enhance overall detection and classification performance.
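The NMF unmixing and weighted fusion described above can be sketched as follows. Lee-Seung multiplicative updates are used here only as a simple stand-in for the paper's projected-gradient (Lin-PG) solver, and the fusion weight alpha and the toy matrix P are illustrative assumptions.

```python
import numpy as np

def nmf(V, S, iters=200, seed=0):
    """Minimise ||V - WH||_F^2 with W (L x S) and H (S x K) nonnegative,
    via Lee-Seung multiplicative updates (a stand-in for Lin-PG)."""
    rng = np.random.default_rng(seed)
    L, K = V.shape
    W = rng.random((L, S)) + 1e-3
    H = rng.random((S, K)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # multiplicative H update
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # multiplicative W update
    return W, H

def fuse_ucnmf(W, H, P, alpha=0.7):
    """Weighted fusion V_f = W(alpha*H + (1 - alpha)*P), with P carrying
    the panchromatic spatial detail on the abundance grid (assumed)."""
    return W @ (alpha * H + (1 - alpha) * P)

rng = np.random.default_rng(1)
V = rng.random((10, 3)) @ rng.random((3, 40))    # toy rank-3 data: L=10, K=40
W, H = nmf(V, S=3)
P = rng.random((3, 40))                          # stand-in panchromatic detail
Vf = fuse_ucnmf(W, H, P)
```

The multiplicative updates keep W and H nonnegative by construction, which is why no explicit projection step is needed in this simplified version.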
- Fusing HSI data with foliage-penetration synthetic aperture radar (FOPEN SAR) data: feature-level fusion
- Fusing HSI data with high-resolution imaging (HRI) data: data- and feature-level fusion

Slide 40
HSI and FOPEN SAR Data Fusion
The detection capabilities of FOPEN SAR and HSI sensors complement each other. FOPEN SAR typically operates at 20 to 700 MHz; it penetrates foliage and detects targets under tree canopy, but suffers significant clutter returns from trees. HSI is capable of subpixel detection and material identification.

Slide 41
HSI and FOPEN SAR Data Fusion (cont.)
Both SAR and HSI systems may suffer substantial false-alarm and missed-detection rates because of their respective background clutter. Spectral dimensionality reduction is applied to the HSI data cube in order to extract the spectral features.

Slide 42
HSI and FOPEN SAR Data Fusion (cont.)
PCA is used to decorrelate the data and maximize the information content in a reduced number of features. A matched-filtering algorithm with thresholding is then applied to the HSI data to detect all pixels of fabric nets.

Slide 43
HSI fabric-net detection with a matched-filtering algorithm (left) and terrain classification map (right). The map shows background classes for roads, grass, trees, and shadow regions; these classes result from an unsupervised data-clustering operation that uses the first five principal components.

Slide 44
Combined FOPEN SAR-HSI Analysis and Fusion
The SAR data are processed with pixel grouping and thresholding. The combined analysis retains only SAR detections from either open areas or around fabric nets indicated in the HSI data. SAR detections that correspond to trees, far-from-open areas, or nets in the HSI data are considered false alarms.

Slide 45: (figure)

Slide 46
SAR detection confirmed using HSI material identification
There are several strong SAR detections on the left side of the open area.
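The PCA reduction and matched filtering used above (slides 41-42) can be sketched as below. The covariance-whitened matched filter is a standard formulation, and the toy spectra and target signature are illustrative assumptions.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Decorrelate pixel spectra (rows of X) and keep the leading
    principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def matched_filter(X, target):
    """Score each pixel spectrum against a target signature, whitening
    by the background covariance; threshold the scores to detect."""
    mu = X.mean(axis=0)
    C = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])   # regularised
    Cinv = np.linalg.inv(C)
    d = target - mu
    return (X - mu) @ Cinv @ d / (d @ Cinv @ d)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # toy background spectra, 5 bands
target = np.full(5, 3.0)             # hypothetical material signature
X[0] = target                        # plant one target pixel
Z = pca_reduce(X, 2)                 # reduced features for clustering
scores = matched_filter(X, target)   # near 1 at target pixels, near 0 elsewhere
```

Thresholding `scores` then yields the detection mask; the reduced features `Z` are what an unsupervised clustering step (as on slide 43) would operate on.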
Three pixels match well with military gray-tan paint, indicating the presence of a vehicle, possibly military; this match confirms the SAR detection.

Slide 47
HSI and HRI Data Fusion
Sharpening the HSI data enables a combined spatial-spectral analysis. Background classification and anomaly detection are first obtained from the HSI data; applying these results to the sharpened HSI data provides enhanced background classification and target detection.

Slide 48
HSI and HRI Data Fusion (cont.)
The HRI data provide target and background boundaries through spatial edge detection. These edges, combined with results from the sharpened HSI data, spatially enhance the definition of targets and backgrounds. Finally, spectral matched filtering for target detection is applied to the sharpened HSI data.

Slides 49-51: result images (figures only).

Slide 52
References
- Shah, P.; Jayalakshmi, M.; Merchant, S. N.; Desai, U. B., "Hierarchical fusion using vector quantization for visualization of hyperspectral images," Proceedings of the 14th International Conference on Information Fusion (FUSION), pp. 1-8, 5-8 July 2011.
- Z. Zhang et al., "Hyperspectral and panchromatic image fusion using unmixing-based constrained nonnegative matrix factorization," Optik - Int. J. Light Electron Opt. (2012), http://dx.doi.org/10.1016/j.ijleo.2012.04.022.
- S. M. Hsu and H. K. Burke, "Multisensor Fusion with Hyperspectral Imaging Data: Detection and Classification."

