Local Geometry and Sparsity Preserving Embedding for Hyperspectral Image Classification
Abstract: Many dimensionality reduction (DR) studies have shown that preserving the sparse characteristics of data together with its geometric structure yields more discriminative features. To this end, a DR method that jointly preserves the local geometric neighbor structure and the local sparse manifold structure is proposed. The method first reconstructs each sample from its neighbors, as in locally linear embedding, to preserve the local linear relationships of the data, and simultaneously computes the local sparse manifold structure within each neighborhood. On this basis, both the local geometric neighbor structure and the sparse structure are preserved under the graph embedding framework. Finally, intra-class data are compacted as much as possible in the low-dimensional embedding space, so that discriminative low-dimensional features are extracted and land-cover classification performance is improved. Experimental results on the Indian Pines and PaviaU hyperspectral datasets show that the proposed method significantly improves classification performance compared with traditional DR methods, reaching overall accuracies of 83.02% and 91.20%, respectively, which is beneficial for practical applications.
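The first step described above — reconstructing each sample from its neighbors to preserve local linear relationships — follows the classical locally linear embedding recipe. The following is a minimal NumPy sketch of that step only; the function name, neighborhood size `k`, and regularization `reg` are illustrative choices, not the paper's exact settings:

```python
import numpy as np

def lle_reconstruction_weights(X, k=5, reg=1e-3):
    """For each sample, compute the weights that best reconstruct it
    from its k nearest neighbors (weights constrained to sum to 1),
    as in locally linear embedding."""
    n = X.shape[0]
    W = np.zeros((n, n))
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        # k nearest neighbors, excluding the sample itself
        nbrs = np.argsort(d2[i])[1:k + 1]
        Z = X[nbrs] - X[i]                   # center neighbors on x_i
        G = Z @ Z.T                          # local Gram matrix (k x k)
        G += reg * np.trace(G) * np.eye(k)   # regularize for numerical stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()             # enforce the sum-to-one constraint
    return W
```

Each row of `W` holds the reconstruction weights over the k nearest neighbors of the corresponding sample; in graph-embedding methods, such weights define the adjacency graph whose structure is carried over to the low-dimensional space.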
Table 1 Classification results with different methods on Indian Pines dataset (OA ± Std (%) (p, h))

Algorithm | 2% | 4% | 6% | 8% | 10%
RAW | 58.08±1.01 (1.5×10−08, 1) | 63.17±0.95 (5.8×10−12, 1) | 64.97±0.65 (1.2×10−14, 1) | 66.11±0.57 (2.2×10−17, 1) | 67.88±0.55 (3.9×10−20, 1)
PCA | 58.02±1.03 (1.1×10−08, 1) | 62.99±1.04 (8.7×10−12, 1) | 64.83±0.53 (4.6×10−15, 1) | 66.09±0.70 (7.2×10−17, 1) | 67.80±0.53 (2.3×10−20, 1)
LDA | 47.39±1.49 (5.9×10−16, 1) | 63.15±1.47 (1.5×10−09, 1) | 68.52±1.19 (1.3×10−09, 1) | 71.05±1.03 (7.5×10−11, 1) | 73.53±0.63 (2.2×10−14, 1)
LFDA | 57.68±0.80 (2.3×10−10, 1) | 63.00±1.26 (2.1×10−11, 1) | 65.17±0.84 (9.3×10−14, 1) | 67.25±0.63 (3.5×10−16, 1) | 68.78±0.79 (1.5×10−17, 1)
SNPE | 47.29±1.82 (1.5×10−14, 1) | 65.04±1.29 (2.1×10−08, 1) | 68.72±1.28 (2.3×10−09, 1) | 70.19±0.73 (6.2×10−13, 1) | 71.67±0.44 (1.7×10−17, 1)
SPP | 51.95±1.04 (8.0×10−14, 1) | 60.00±0.80 (1.6×10−15, 1) | 64.29±0.96 (6.2×10−15, 1) | 65.98±0.59 (2.8×10−17, 1) | 67.97±0.55 (5.2×10−20, 1)
DLSP | 57.47±1.36 (2.1×10−06, 1) | 63.63±1.18 (1.5×10−10, 1) | 66.05±0.58 (7.6×10−14, 1) | 67.65±0.74 (1.4×10−15, 1) | 69.35±0.94 (1.6×10−16, 1)
SDL | 57.56±1.08 (8.8×10−09, 1) | 64.04±0.52 (2.3×10−13, 1) | 69.05±0.97 (2.4×10−11, 1) | 71.27±0.88 (9.0×10−11, 1) | 74.03±0.72 (3.3×10−13, 1)
DSPE | 61.66±1.80 (2.5×10−01, 0) | 65.63±1.89 (5.5×10−06, 1) | 69.00±1.37 (3.8×10−09, 1) | 69.25±1.28 (1.7×10−11, 1) | 70.84±1.15 (3.9×10−13, 1)
MSME | 51.66±1.34 (2.4×10−13, 1) | 57.76±1.38 (1.3×10−14, 1) | 61.65±0.84 (1.7×10−16, 1) | 64.36±0.57 (8.0×10−19, 1) | 66.57±0.99 (4.5×10−18, 1)
LGSFA | 58.44±1.64 (3.7×10−06, 1) | 68.17±1.69 (6.0×10−03, 1) | 72.77±1.13 (3.4×10−03, 1) | 75.45±0.90 (4.6×10−03, 1) | 77.24±0.59 (2.4×10−05, 1)
SDME | 58.99±1.41 (3.8×10−06, 1) | 66.84±1.34 (4.0×10−06, 1) | 72.71±1.09 (1.5×10−03, 1) | 76.05±0.55 (7.2×10−03, 1) | 78.30±0.30 (2.0×10−03, 1)
LGSPE | 62.79±1.25 | 70.34±1.05 | 74.64±1.25 | 77.04±0.86 | 79.13±0.53
Table 2 Classification results for each class with different methods on Indian Pines dataset
Class | RAW | PCA | LDA | LFDA | SNPE | SPP | DLSP | SDL | DSPE | MSME | LGSFA | SDME | LGSPE
1 | 58.33 | 66.67 | 72.22 | 61.11 | 47.22 | 63.89 | 58.33 | 58.33 | 55.56 | 52.78 | 75.00 | 77.78 | 77.78
2 | 55.68 | 55.93 | 67.13 | 60.38 | 62.36 | 59.06 | 58.90 | 70.76 | 66.97 | 58.07 | 77.35 | 77.18 | 79.98
3 | 58.01 | 58.16 | 63.55 | 59.86 | 63.26 | 58.01 | 58.16 | 64.40 | 60.57 | 59.43 | 67.52 | 68.65 | 70.35
4 | 38.31 | 37.31 | 51.74 | 36.32 | 40.80 | 32.84 | 43.28 | 51.74 | 39.80 | 32.84 | 52.24 | 54.73 | 61.19
5 | 83.70 | 84.18 | 87.83 | 84.18 | 83.94 | 83.94 | 86.62 | 88.56 | 86.62 | 82.00 | 89.29 | 90.51 | 91.24
6 | 93.23 | 93.71 | 96.77 | 94.68 | 93.71 | 94.68 | 94.52 | 96.94 | 95.16 | 92.74 | 97.58 | 97.26 | 97.90
7 | 77.78 | 77.78 | 72.22 | 55.56 | 88.89 | 55.56 | 83.33 | 83.33 | 88.89 | 66.67 | 88.89 | 88.89 | 88.89
8 | 94.83 | 95.57 | 99.01 | 93.60 | 94.83 | 94.09 | 96.06 | 94.83 | 95.57 | 92.36 | 96.06 | 96.31 | 96.80
9 | 50.00 | 60.00 | 90.00 | 70.00 | 70.00 | 40.00 | 60.00 | 100.00 | 90.00 | 60.00 | 100.00 | 80.00 | 70.00
10 | 66.59 | 65.38 | 61.74 | 68.77 | 69.61 | 69.85 | 65.98 | 74.21 | 67.43 | 67.43 | 74.33 | 77.24 | 77.60
11 | 70.82 | 70.96 | 73.50 | 72.83 | 75.13 | 73.41 | 72.21 | 81.74 | 74.75 | 74.70 | 80.88 | 81.60 | 82.89
12 | 43.25 | 43.85 | 70.63 | 46.63 | 53.77 | 46.43 | 49.80 | 63.10 | 62.30 | 43.45 | 69.64 | 72.42 | 75.99
13 | 89.08 | 86.78 | 97.13 | 86.78 | 87.93 | 93.68 | 85.63 | 95.98 | 94.83 | 82.76 | 94.83 | 95.98 | 95.98
14 | 88.93 | 88.56 | 94.05 | 90.23 | 90.88 | 89.58 | 90.33 | 96.65 | 91.53 | 90.33 | 94.60 | 94.33 | 95.26
15 | 35.37 | 34.76 | 59.15 | 39.94 | 38.11 | 39.02 | 35.06 | 46.04 | 42.99 | 33.54 | 50.91 | 50.00 | 56.71
16 | 92.41 | 92.41 | 88.61 | 91.14 | 89.87 | 88.61 | 92.41 | 91.14 | 86.08 | 86.08 | 86.08 | 88.61 | 89.87
OA (%) | 69.65 | 69.65 | 76.17 | 71.62 | 73.34 | 71.43 | 71.28 | 79.12 | 74.63 | 70.65 | 80.57 | 81.32 | 83.02
AA (%) | 68.52 | 69.50 | 77.83 | 69.50 | 71.89 | 67.67 | 70.66 | 78.61 | 74.94 | 67.20 | 80.95 | 80.72 | 81.78
Kappa | 0.654 | 0.654 | 0.728 | 0.676 | 0.695 | 0.674 | 0.673 | 0.761 | 0.710 | 0.664 | 0.778 | 0.787 | 0.806
DR time (s) | — | 0.01 | 0.01 | 0.06 | 0.13 | 21.94 | 39.76 | 3.67 | 2.07 | 6.18 | 5.52 | 36.64 | 12.28

Table 3 Classification results with different methods on PaviaU dataset (OA ± Std (%) (p, h))

Algorithm | 0.4% | 0.8% | 1.2% | 1.6% | 2%
RAW | 76.16±1.28 (1.8×10−09, 1) | 78.13±0.69 (2.4×10−15, 1) | 80.50±0.53 (5.2×10−17, 1) | 81.11±0.65 (4.7×10−16, 1) | 82.20±0.54 (3.0×10−17, 1)
PCA | 76.15±1.27 (1.7×10−09, 1) | 78.12±0.69 (2.4×10−15, 1) | 80.49±0.54 (6.7×10−17, 1) | 81.10±0.64 (4.1×10−16, 1) | 82.18±0.54 (2.5×10−17, 1)
LDA | 68.06±1.49 (1.2×10−14, 1) | 76.24±1.65 (1.1×10−12, 1) | 78.73±1.21 (4.2×10−14, 1) | 80.36±0.64 (7.0×10−17, 1) | 81.88±0.64 (1.1×10−16, 1)
LFDA | 79.10±1.16 (3.8×10−06, 1) | 81.60±0.57 (1.0×10−11, 1) | 83.41±0.26 (6.6×10−15, 1) | 84.25±0.68 (1.7×10−11, 1) | 85.25±0.48 (3.7×10−13, 1)
SNPE | 75.17±1.00 (4.5×10−11, 1) | 78.32±1.23 (1.2×10−12, 1) | 80.48±0.56 (9.0×10−17, 1) | 82.15±1.20 (1.9×10−11, 1) | 83.09±0.68 (7.7×10−15, 1)
SPP | 67.23±0.44 (1.9×10−17, 1) | 73.63±1.27 (6.1×10−16, 1) | 76.24±0.96 (1.6×10−17, 1) | 78.13±0.48 (5.7×10−20, 1) | 79.19±0.92 (5.4×10−17, 1)
DLSP | 74.32±1.80 (6.8×10−10, 1) | 77.73±1.10 (1.0×10−13, 1) | 80.47±0.80 (5.7×10−15, 1) | 79.69±1.39 (4.9×10−13, 1) | 81.52±1.02 (3.9×10−14, 1)
SDL | 57.30±2.31 (9.4×10−17, 1) | 60.73±4.15 (2.2×10−13, 1) | 62.80±2.30 (1.4×10−17, 1) | 65.46±1.89 (2.2×10−18, 1) | 67.28±2.63 (1.3×10−15, 1)
DSPE | 79.50±1.52 (6.0×10−05, 1) | 80.50±2.31 (6.6×10−07, 1) | 80.75±1.56 (1.8×10−10, 1) | 82.09±1.42 (1.7×10−10, 1) | 83.62±1.67 (2.1×10−08, 1)
MSME | 75.24±1.76 (3.2×10−09, 1) | 79.32±1.90 (3.3×10−09, 1) | 82.94±1.01 (2.6×10−10, 1) | 83.94±1.12 (1.9×10−09, 1) | 85.41±0.64 (2.8×10−11, 1)
LGSFA | 73.90±1.05 (4.7×10−12, 1) | 78.26±1.80 (1.6×10−10, 1) | 81.40±0.93 (5.3×10−13, 1) | 82.59±0.65 (2.7×10−14, 1) | 83.28±0.78 (9.5×10−14, 1)
SDME | 79.67±1.84 (3.2×10−04, 1) | 82.32±0.74 (9.0×10−10, 1) | 83.57±0.68 (2.2×10−11, 1) | 84.67±0.78 (4.6×10−10, 1) | 85.35±0.60 (9.7×10−12, 1)
LGSPE | 82.96±1.46 | 86.24±0.78 | 87.34±0.47 | 88.23±0.52 | 88.74±0.36
Table 4 Classification results for each class with different methods on PaviaU dataset
Class | RAW | PCA | LDA | LFDA | SNPE | SPP | DLSP | SDL | DSPE | MSME | LGSFA | SDME | LGSPE
1 | 85.76 | 85.74 | 87.59 | 86.28 | 86.17 | 83.60 | 86.25 | 66.85 | 76.28 | 85.33 | 89.17 | 85.24 | 89.08
2 | 93.48 | 93.33 | 93.76 | 95.49 | 94.65 | 93.19 | 93.28 | 94.19 | 93.52 | 96.14 | 97.27 | 93.50 | 97.92
3 | 63.09 | 62.99 | 62.24 | 69.26 | 64.69 | 57.37 | 62.69 | 49.00 | 58.48 | 64.54 | 66.10 | 63.29 | 74.12
4 | 83.10 | 83.24 | 84.40 | 84.40 | 84.95 | 80.87 | 83.37 | 75.78 | 79.66 | 83.72 | 90.45 | 87.29 | 88.94
5 | 98.75 | 98.75 | 99.84 | 99.30 | 98.67 | 99.22 | 98.75 | 99.69 | 99.61 | 99.77 | 99.77 | 99.69 | 99.61
6 | 63.02 | 63.16 | 68.84 | 72.35 | 68.04 | 67.06 | 62.89 | 34.72 | 73.73 | 76.81 | 58.04 | 72.48 | 82.52
7 | 83.93 | 83.69 | 74.51 | 86.22 | 85.59 | 76.80 | 83.45 | 54.87 | 71.81 | 82.34 | 78.62 | 84.40 | 88.28
8 | 80.70 | 80.93 | 76.62 | 83.16 | 81.68 | 76.53 | 80.85 | 75.61 | 70.10 | 78.90 | 75.04 | 74.30 | 83.59
9 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 99.67 | 99.67 | 87.00 | 97.00 | 99.33 | 99.00 | 98.89 | 100.00
OA (%) | 85.37 | 85.34 | 85.90 | 88.13 | 86.88 | 84.38 | 85.35 | 76.63 | 83.40 | 88.03 | 87.29 | 86.20 | 91.20
AA (%) | 83.50 | 83.50 | 83.04 | 86.25 | 84.90 | 81.59 | 83.47 | 70.91 | 80.07 | 85.21 | 83.79 | 84.34 | 89.21
Kappa | 0.804 | 0.804 | 0.812 | 0.841 | 0.824 | 0.791 | 0.804 | 0.682 | 0.778 | 0.840 | 0.828 | 0.816 | 0.883
DR time (s) | — | 0.009 | 0.01 | 0.10 | 0.28 | 23.72 | 71.13 | 7.30 | 6.54 | 8.26 | 6.31 | 33.18 | 12.17
Table 5 Classification results with different methods on Indian Pines and PaviaU datasets (OA ± Std (%) (Kappa))
Dataset | Condition | Algorithm | 5 | 10 | 20 | 40 | 60
Indian Pines | Unfiltered | RAW | 44.02±3.74 (0.376) | 48.69±1.92 (0.430) | 55.25±2.36 (0.499) | 58.78±0.91 (0.538) | 61.10±0.33 (0.562)
Indian Pines | Unfiltered | SSRSHE | 43.65±5.79 (0.375) | 49.16±5.34 (0.436) | 56.58±4.33 (0.515) | 59.91±2.70 (0.550) | 66.90±1.69 (0.626)
Indian Pines | Unfiltered | LGSPE | 45.58±2.97 (0.342) | 52.80±2.47 (0.470) | 61.82±2.88 (0.572) | 70.26±1.42 (0.664) | 73.29±1.09 (0.697)
Indian Pines | Filtered | RAW | 48.97±2.10 (0.432) | 58.53±1.92 (0.535) | 65.27±1.66 (0.609) | 70.14±1.65 (0.663) | 74.30±1.04 (0.709)
Indian Pines | Filtered | SSRSHE | 65.04±4.15 (0.610) | 71.50±2.87 (0.680) | 79.30±1.63 (0.767) | 86.64±2.85 (0.848) | 87.75±1.15 (0.860)
Indian Pines | Filtered | LGSPE | 69.17±2.72 (0.657) | 78.96±1.82 (0.763) | 85.48±1.48 (0.836) | 90.68±1.00 (0.894) | 93.92±0.50 (0.930)
PaviaU | Unfiltered | RAW | 56.85±7.57 (0.474) | 65.12±3.46 (0.564) | 68.82±2.72 (0.609) | 71.78±0.79 (0.644) | 74.58±0.59 (0.676)
PaviaU | Unfiltered | SSRSHE | 62.49±3.39 (0.538) | 64.29±1.73 (0.559) | 67.81±3.25 (0.599) | 71.21±1.71 (0.638) | 75.36±2.44 (0.685)
PaviaU | Unfiltered | LGSPE | 65.58±6.08 (0.567) | 69.91±6.06 (0.622) | 75.70±1.98 (0.690) | 81.76±1.71 (0.765) | 81.45±1.30 (0.760)
PaviaU | Filtered | RAW | 60.41±2.85 (0.514) | 67.39±2.46 (0.590) | 71.27±3.06 (0.639) | 76.11±1.14 (0.696) | 77.86±2.17 (0.718)
PaviaU | Filtered | SSRSHE | 71.18±4.85 (0.639) | 75.17±2.96 (0.686) | 82.86±2.07 (0.730) | 84.24±0.71 (0.795) | 87.02±0.97 (0.803)
PaviaU | Filtered | LGSPE | 76.10±3.53 (0.697) | 80.21±3.05 (0.700) | 86.70±2.18 (0.828) | 91.02±1.95 (0.883) | 93.72±0.93 (0.917)
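The tables report overall accuracy (OA), average accuracy (AA), and the Kappa coefficient. For reference, all three can be computed from a confusion matrix as follows (a generic sketch, not code from the paper):

```python
import numpy as np

def classification_metrics(conf):
    """Compute OA, AA and Kappa from a confusion matrix whose rows
    are true classes and columns are predicted classes."""
    conf = np.asarray(conf, dtype=float)
    total = conf.sum()
    diag = np.diag(conf)
    oa = diag.sum() / total                    # overall accuracy
    aa = (diag / conf.sum(axis=1)).mean()      # mean of per-class accuracies
    # expected agreement under chance, from row/column marginals
    pe = (conf.sum(axis=1) * conf.sum(axis=0)).sum() / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, aa, kappa
```

For example, the confusion matrix `[[45, 5], [10, 40]]` gives OA = 0.85, AA = 0.85, and Kappa = 0.70.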