Prediction of Aeroengine Remaining Life by Combining Multi-scale Local Features and Transformer Global Learning
Abstract: Accurate prediction of the remaining useful life (RUL) of aeroengines is crucial for ensuring their safety and reliability. When RUL is predicted from multi-sensor monitoring data, two problems must be addressed: extracting local features that capture the equipment's degradation trends at different time scales, and globally learning the long-term dependencies among elements of the time series. We therefore propose a predictive model, named MS_Transformer, that combines a multi-scale local feature enhancement unit (MSLFU_BLOCK) with a Transformer encoder. The MSLFU_BLOCK uses stacked causal convolutional layers to progressively extract multi-scale local information from the time series while avoiding the future-data leakage inherent in conventional convolutions. The Transformer encoder then captures both short-term and long-term dependencies in the series through its self-attention mechanism. By integrating the MSLFU_BLOCK with the Transformer encoder, the proposed MS_Transformer captures both local and global patterns in the time series data. Ablation and prediction experiments on the widely used C-MAPSS benchmark dataset validate the rationality and effectiveness of the model. Comparisons with thirteen state-of-the-art prediction models show that MS_Transformer achieves better RMSE and Score than the other models on FD002 and FD004, the datasets with more complex operating conditions, and the best average performance across all four datasets. The study provides a more reliable solution for predicting the RUL of aeroengines.
Key words:
- Remaining life prediction
- aeroengine
- Transformer
- multi-scale features
- local features
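The abstract describes the architecture only at a high level. The following PyTorch sketch is purely illustrative, assuming one plausible reading of the described design: stacked causal convolutions as the multi-scale local feature unit, followed by a standard Transformer encoder and a regression head. The class names (CausalConv1d, MSLFUBlock, MSTransformer) and all hyperparameters are placeholders, not the authors' released code; positional encoding and other implementation details are omitted.

```python
# Illustrative sketch only: causal-convolution local features followed by a
# Transformer encoder, mirroring the MSLFU_BLOCK + encoder structure described
# in the abstract. Names and hyperparameters are assumptions, not the paper's code.
import torch
import torch.nn as nn


class CausalConv1d(nn.Module):
    """1-D convolution that left-pads the input so position t never sees t+1, t+2, ..."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # pad only on the left
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                                # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))          # no right padding -> no future leakage
        return self.conv(x)


class MSLFUBlock(nn.Module):
    """Stacked causal convolutions; each layer widens the receptive field (multi-scale)."""
    def __init__(self, in_ch, hidden_ch, n_layers=2, kernel_size=3):
        super().__init__()
        layers, ch = [], in_ch
        for i in range(n_layers):
            layers += [CausalConv1d(ch, hidden_ch, kernel_size, dilation=2 ** i), nn.ReLU()]
            ch = hidden_ch
        self.net = nn.Sequential(*layers)

    def forward(self, x):                                # x: (batch, time, features)
        return self.net(x.transpose(1, 2)).transpose(1, 2)


class MSTransformer(nn.Module):
    """Local features from MSLFUBlock -> global dependencies via a Transformer encoder -> RUL."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_encoder_layers=1):
        super().__init__()
        self.local = MSLFUBlock(n_features, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_encoder_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                                # x: (batch, window_length, n_features)
        h = self.encoder(self.local(x))
        return self.head(h[:, -1])                       # predict RUL from the last time step
```

The n_layers and n_encoder_layers arguments map onto the hyperparameters varied in Table 5 and Table 6 below.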
Table 1 Attributes of the C-MAPSS dataset
Parameter                     FD001    FD002    FD003    FD004
Engines in training set       100      260      100      249
Engines in test set           100      259      100      248
Operating conditions          1        6        1        6
Fault modes                   1        1        2        2
Training set size             20632    53760    24721    61250
Test set size                 13097    33992    16597    41215
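Each C-MAPSS sub-dataset in Table 1 is distributed as plain-text files (for example train_FD001.txt) with 26 space-separated columns: unit number, cycle index, three operational settings, and 21 sensor readings. The sketch below shows one common way to load a training file and attach a linear RUL label (remaining cycles until the unit's last recorded cycle); it is an assumption for illustration, not this paper's preprocessing.

```python
import pandas as pd

# Column layout of the C-MAPSS text files: unit id, cycle, 3 settings, 21 sensors.
COLS = (["unit", "cycle"]
        + [f"setting_{i}" for i in range(1, 4)]
        + [f"sensor_{i}" for i in range(1, 22)])

def load_train(path="train_FD001.txt"):
    """Load one training file and add a linear RUL label per unit."""
    df = pd.read_csv(path, sep=r"\s+", header=None).iloc[:, :26]  # drop any empty trailing columns
    df.columns = COLS
    # Training trajectories run to failure, so RUL = unit's last cycle - current cycle.
    df["RUL"] = df.groupby("unit")["cycle"].transform("max") - df["cycle"]
    return df
```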
Table 2 Comparison with state-of-the-art methods
Method                         FD001            FD002            FD003            FD004            Average
                               RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score
LSTM (2017)[7]                 16.14   338.00   24.49   1718.00  16.18   852.00   28.17   2238.00  21.25   1286.50
DCNN (2018)[11]                12.61   274.00   22.36   4020.00  12.64   284.00   23.31   5027.00  17.73   2401.25
HDNN (2019)[14]                13.02   245.00   15.24   1282.42  12.22   287.72   18.16   1527.42  14.66   835.64
AGCNN (2021)[18]               12.42   225.51   19.43   1492.00  13.39   227.09   21.50   3392.00  16.68   1334.15
GCU_Transformer (2021)[32]     11.27   —        22.81   —        11.42   —        24.86   —        17.59   —
BiGRU-TSAM (2022)[20]          12.56   213.35   18.94   2264.13  12.45   232.86   20.47   3610.34  16.10   1580.17
IDMFFN (2022)[13]              12.18   204.69   19.17   1819.42  11.89   205.54   21.72   3338.84  16.24   1392.12
MTSTAN (2023)[24]              10.97   175.36   16.81   1154.36  10.90   188.22   18.85   1446.29  14.38   741.06
Encoder-Attention (2023)[21]   10.35   183.75   15.82   1008.08  11.34   219.63   17.35   1751.23  13.72   790.67
MSIDSN (2023)[23]              11.74   205.55   18.26   2046.65  12.04   196.42   22.48   2910.73  16.13   1339.83
ATCN (2024)[26]                11.48   194.25   15.82   1210.57  11.34   249.19   17.80   1934.86  14.11   897.22
MHT (2024)[33]                 11.92   215.20   13.70   746.70   10.63   150.50   17.73   1572.00  13.50   671.10
MachNet (2024)[34]             11.04   176.82   24.52   3326.00  10.59   161.26   28.86   5916.00  18.75   2395.02
Ours                           11.79   224.36   11.98   608.88   11.95   225.05   14.47   1072.38  12.55   532.67
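For reference, the RMSE and Score columns in Table 2 follow the usual C-MAPSS evaluation protocol: RMSE is the root-mean-square prediction error, and Score is the asymmetric exponential penalty of Saxena et al. [15], which punishes late predictions (over-estimated RUL) more heavily than early ones. A minimal NumPy sketch, assuming the error is defined as predicted RUL minus true RUL in cycles:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error between true and predicted RUL."""
    d = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def cmapss_score(y_true, y_pred):
    """Asymmetric scoring function from the C-MAPSS challenge [15]:
    exp(-d/13) - 1 for early predictions (d < 0), exp(d/10) - 1 for late ones;
    the per-engine penalties are summed, and lower is better."""
    d = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return float(np.sum(np.where(d < 0, np.exp(-d / 13.0), np.exp(d / 10.0)) - 1.0))
```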
Table 3 Results of ablation experiment
Method                         FD001            FD002            FD003            FD004
                               RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score
MS_Transformer                 11.79   224.36   11.98   608.88   11.95   225.05   14.47   1072.38
MS(CNN)_Transformer            12.82   254.36   13.72   1098.09  13.80   325.05   15.93   1372.87
MS_Transformer (w/o MS)        13.20   275.59   15.78   1430.90  14.45   445.51   18.48   1754.22
MS_Transformer (w/o s & MS)    13.91   298.18   15.91   1497.41  16.10   552.61   19.03   1992.69
Table 4 Predictive metric values corresponding to different window lengths
Sliding window length          FD001            FD002            FD003            FD004
                               RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score
$L=30$                         12.89   264.78   14.38   1011.04  13.73   279.99   17.20   1858.13
$L=40$                         12.67   268.07   13.42   854.88   12.21   213.13   16.74   1676.82
$L=50$                         11.93   212.96   12.94   724.12   12.31   255.20   15.56   1375.81
$L=60$                         11.79   224.36   11.98   608.88   11.95   225.05   14.47   1072.38
$L=70$                         12.23   242.86   11.75   587.67   12.59   266.96   14.26   1093.49
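Table 4 varies the sliding-window length L used to cut each engine's multi-sensor sequence into fixed-length samples. As a hedged illustration (not the paper's preprocessing code), a window ending at cycle t collects the L most recent cycles and takes the RUL at t as its label:

```python
import numpy as np

def make_windows(sensors, rul, L=60):
    """sensors: (T, n_features) readings of one engine; rul: (T,) labels.
    Returns (T - L + 1) windows of shape (L, n_features) and their RUL targets."""
    X, y = [], []
    for t in range(L, len(sensors) + 1):
        X.append(sensors[t - L:t])   # the L most recent cycles up to cycle t
        y.append(rul[t - 1])         # RUL at the window's last cycle
    return np.stack(X), np.asarray(y)
```

Engines with fewer than L recorded cycles would need padding or exclusion; how such cases are handled is not specified here.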
Table 5 Predictive metric values corresponding to different numbers of causal convolution layers
Causal convolution layers      FD001            FD002            FD003            FD004
                               RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score
1                              12.28   270.33   12.64   749.42   12.60   278.99   16.31   1887.34
2                              11.79   224.36   11.98   608.88   11.95   225.05   14.47   1072.38
3                              13.02   270.33   14.98   1225.98  14.19   367.16   17.30   2185.46
Table 6 Predictive metric values corresponding to different numbers of encoder layers
Encoder layers                 FD001            FD002            FD003            FD004
                               RMSE    Score    RMSE    Score    RMSE    Score    RMSE    Score
1                              11.79   224.36   11.98   608.88   11.95   225.05   14.47   1072.38
2                              11.35   210.25   12.78   785.32   11.56   230.32   16.72   1785.03
3                              11.95   223.25   12.58   735.32   11.86   235.46   15.72   1685.03
[1] Qiao W, Zhang P, Chow M Y. Condition monitoring, diagnosis, prognosis, and health management for wind energy conversion systems. IEEE Transactions on Industrial Electronics, 2015, 62(10): 6533−6535 doi: 10.1109/TIE.2015.2464785
[2] Moghaddass R, Zuo M J. An integrated framework for online diagnostic and prognostic health monitoring using a multistate deterioration process. Reliability Engineering & System Safety, 2014, 124: 92−104
[3] Pei Hong, Hu Chang-Hua, Si Xiao-Sheng, Zhang Jian-Xun, Pang Zhe-Nan, Zhang Peng. Review of machine learning based remaining useful life prediction methods for equipment. Journal of Mechanical Engineering, 2019, 55(8): 1−13 (in Chinese) doi: 10.3901/JME.2019.08.001
[4] Javed K, Gouriveau R, Zerhouni N. A new multivariate approach for prognostics based on extreme learning machine and fuzzy clustering. IEEE Transactions on Cybernetics, 2015, 45(12): 2626−2639 doi: 10.1109/TCYB.2014.2378056
[5] Khelif R, Chebel-Morello B, Malinowski S, Laajili E, Fnaiech F, Zerhouni N. Direct remaining useful life estimation based on support vector regression. IEEE Transactions on Industrial Electronics, 2017, 64(3): 2276−2285 doi: 10.1109/TIE.2016.2623260
[6] Wu D, Jennings C, Terpenny J, Gao R X, Kumara S. A comparative study on machine learning algorithms for smart manufacturing: Tool wear prediction using random forests. Journal of Manufacturing Science and Engineering, 2017, 139(7): Article No. 071018
[7] Zheng S, Ristovski K, Farahat A, Gupta C. Long short-term memory network for remaining useful life estimation. In: Proceedings of the IEEE International Conference on Prognostics and Health Management. Dallas, TX, USA: IEEE, 2017. 88−95
[8] Huang C G, Huang H Z, Li Y F. A bidirectional LSTM prognostics method under multiple operational conditions. IEEE Transactions on Industrial Electronics, 2019, 66(11): 8792−8802 doi: 10.1109/TIE.2019.2891463
[9] Yu W, Kim I Y, Mechefske C. An improved similarity-based prognostic algorithm for RUL estimation using an RNN autoencoder scheme. Reliability Engineering & System Safety, 2020, 199: Article No. 106926
[10] Liu J, Lei F, Pan C, Hu D, Zuo H. Prediction of remaining useful life of multi-stage aero-engine based on clustering and LSTM fusion. Reliability Engineering & System Safety, 2021, 214: Article No. 107807
[11] Li X, Ding Q, Sun J Q. Remaining useful life estimation in prognostics using deep convolution neural networks. Reliability Engineering & System Safety, 2018, 172: 1−11
[12] Yang B, Liu R, Zio E. Remaining useful life prediction based on a double-convolutional neural network architecture. IEEE Transactions on Industrial Electronics, 2019, 66(12): 9521−9530 doi: 10.1109/TIE.2019.2924605
[13] Li X, Jiang H, Liu Y, Wang T, Li Z. An integrated deep multiscale feature fusion network for aeroengine remaining useful life prediction with multi-sensor data. Knowledge-Based Systems, 2022, 235: Article No. 107652 doi: 10.1016/j.knosys.2021.107652
[14] Al-Dulaimi A, Zabihi S, Asif A, Mohammadi A. A multi-modal and hybrid deep neural network model for remaining useful life estimation. Computers in Industry, 2019, 108: 186−196 doi: 10.1016/j.compind.2019.02.004
[15] Saxena A, Goebel K, Simon D, Eklund N. Damage propagation modeling for aircraft engine run-to-failure simulation. In: Proceedings of the 2008 International Conference on Prognostics and Health Management. Denver, CO, USA: IEEE, 2008. 1−9
[16] Ayodeji A, Wang Z, Wang W, Qin W, Yang C, Xu S, et al. Causal augmented ConvNet: A temporal memory dilated convolution model for long-sequence time series prediction. ISA Transactions, 2021, 123: 200−217
[17] Li H, Wang Z, Li Z. An enhanced CNN-LSTM remaining useful life prediction model for aircraft engine with attention mechanism. PeerJ Computer Science, 2022, 8: Article No. 1084 doi: 10.7717/peerj-cs.1084
[18] Liu H, Liu Z, Jia W, Lin X. Remaining useful life prediction using a novel feature-attention-based end-to-end approach. IEEE Transactions on Industrial Informatics, 2021, 17(2): 1197−1207 doi: 10.1109/TII.2020.2983760
[19] Xu X, Li X, Ming W, Chen M. A novel multi-scale CNN and attention mechanism method with multi-sensor signal for remaining useful life prediction. Computers & Industrial Engineering, 2022, 169: Article No. 108204
[20] Zhang J, Jiang Y, Wu S, Li X, Luo H, Yin S. Prediction of remaining useful life based on bidirectional gated recurrent unit with temporal self-attention mechanism. Reliability Engineering & System Safety, 2022, 221: Article No. 108297
[21] Wang X, Li Y, Xu Y, Liu X, Zheng T, Zheng B. Remaining useful life prediction for aero-engines using a time-enhanced multi-head self-attention model. Aerospace, 2023, 10(1): Article No. 80 doi: 10.3390/aerospace10010080
[22] Xu Z, Zhang Y, Miao J, Miao Q. Global attention mechanism based deep learning for remaining useful life prediction of aero-engine. Measurement, 2023, 217: Article No. 113098 doi: 10.1016/j.measurement.2023.113098
[23] Zhao K, Jia Z, Jia F, Shao H. Multi-scale integrated deep self-attention network for predicting remaining useful life of aero-engine. Engineering Applications of Artificial Intelligence, 2023, 120: Article No. 105860 doi: 10.1016/j.engappai.2023.105860
[24] Zhu J, Jiang Q, Shen Y, Xu F, Zhu Q. Res-HSA: Residual hybrid network with self-attention mechanism for RUL prediction of rotating machinery. Engineering Applications of Artificial Intelligence, 2023, 124: Article No. 106491 doi: 10.1016/j.engappai.2023.106491
[25] Li H, Cao P, Wang X, Yi B, Huang M, Sun Q, et al. Multi-task spatio-temporal augmented net for industry equipment remaining useful life prediction. Advanced Engineering Informatics, 2023, 55: Article No. 101898 doi: 10.1016/j.aei.2023.101898
[26] Zhang Q, Liu Q, Ye Q. An attention-based temporal convolutional network method for predicting remaining useful life of aero-engine. Engineering Applications of Artificial Intelligence, 2024, 127: Article No. 107241 doi: 10.1016/j.engappai.2023.107241
[27] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez A N, et al. Attention is all you need. Advances in Neural Information Processing Systems, 2017, 30: 5998−6008
[28] Xu J, Wu H, Wang J, Long M. Anomaly transformer: Time series anomaly detection with association discrepancy. In: Proceedings of the 10th International Conference on Learning Representations. Virtual Event: ICLR, 2022.
[29] Tuli S, Casale G, Jennings N R. TranAD: Deep transformer networks for anomaly detection in multivariate time series data. arXiv: 2201.07284, 2022.
[30] Zerveas G, Jayaraman S, Patel D, Bhamidipaty A, Eickhoff C. A transformer-based framework for multivariate time series representation learning. In: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. Virtual Event: ACM, 2021. 2114−2124
[31] Li S, Jin X, Xuan Y, Zhou X, Chen W, Wang Y, et al. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting. arXiv: 2331.8422, 2019.
[32] Mo Y, Wu Q, Li X, Huang B. Remaining useful life estimation via transformer encoder enhanced by a gated convolutional unit. Journal of Intelligent Manufacturing, 2021, 32(7): 1997−2006 doi: 10.1007/s10845-021-01750-x
[33] Guo J, Lei S, Du B. MHT: A multiscale hourglass-transformer for remaining useful life prediction of aircraft engine. Engineering Applications of Artificial Intelligence, 2024, 128: Article No. 107519 doi: 10.1016/j.engappai.2023.107519
[34] Jaenal A, Ruiz-Sarmiento J R, Gonzalez-Jimenez J. MachNet, a general deep learning architecture for predictive maintenance within the Industry 4.0 paradigm. Engineering Applications of Artificial Intelligence, 2024, 127: Article No. 107365 doi: 10.1016/j.engappai.2023.107365