References:
[1] LU X, WEI G, YANG S, et al. Properties analysis and optimization of primary air volume in power station[C]//2011 International Conference on Electronics, Communications and Control (ICECC). IEEE, 2011:3848-3851.
[2] LUCCARINI L, PORRA E, SPAGNI A, et al. Soft sensors for control of nitrogen and phosphorus removal from wastewater by neural networks[J]. Water Science and Technology, 2002, 45(4-5):101-107.
[3] WANG Xiaokai, HUA Lin, WANG Xiaoxuan. Soft measurement model of ring's dimensions for vertical hot ring rolling process using neural networks optimized by genetic algorithm[J]. Journal of Central South University, 2017, 24(1):17-29.
[4] GONZAGA J C B, MELEIRO L A C, KIANG C, et al. ANN-based soft-sensor for real-time process monitoring and control of an industrial polymerization process[J]. Computers and Chemical Engineering, 2009, 33(1):43-49.
[5] XU C, DAI F, XU X, et al. GIS-based support vector machine modeling of earthquake-triggered landslide susceptibility in the Jianjiang River watershed, China[J]. Geomorphology, 2012, 145-146:70-80.
[6] WANG Y, LIN B, DONG Y, et al. Mechanism modeling and validation in ultrasonic vibration assisted drilling with variable cross section drilling tool of brittle materials[J]. International Journal of Advanced Manufacturing Technology, 2019:1-13.
[7] GE Z. Supervised latent factor analysis for process data regression modeling and soft sensor application[J]. IEEE Transactions on Control Systems Technology, 2016, 24(3):1004-1011.
[8] HAO X, ZHANG G, MA S. Deep learning[J]. International Journal of Semantic Computing, 2016, 10(3):417-439.
[9] LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015, 521(7553):436-444.
[10] KANEKO H, FUNATSU K. Classification of the degradation of soft sensor models and discussion on adaptive models[J]. AIChE Journal, 2013, 59(7):2339-2347.
[11] GERS F A, SCHMIDHUBER J, CUMMINS F. Learning to forget: continual prediction with LSTM[J]. Neural Computation, 2000, 12(10):2451-2471.
[12] GREFF K, SRIVASTAVA R K, KOUTNIK J, et al. LSTM: a search space odyssey[J]. IEEE Transactions on Neural Networks and Learning Systems, 2017, 28(10):2222-2232.
[13] DONG Shuting, et al. LSTM based reserve prediction for bank outlets[J]. Journal of Tsinghua University (Science and Technology), 2019, 24(1):9.
[14] ALTCHE F, FORTELLE A D L. An LSTM network for highway trajectory prediction[C]//2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC). IEEE, 2017.
[15] KARIM F, MAJUMDAR S, DARABI H, et al. LSTM fully convolutional networks for time series classification[J]. IEEE Access, 2017, 6:1662-1669.
[16] SCHMIDHUBER J. Deep learning in neural networks: an overview[J]. Neural Networks, 2015, 61:85-117.
[17] HOCHREITER S, BENGIO Y, FRASCONI P, et al. Gradient flow in recurrent nets: the difficulty of learning long-term dependencies[M]//KOLEN J F, KREMER S C. A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press, 2001:1-15.
[18] TAN Y H, CHAN C S. Phi-LSTM: a phrase-based hierarchical LSTM model for image captioning[C]//Asian Conference on Computer Vision, 2016.
[19] SOLTAU H, LIAO H, SAK H. Neural speech recognizer: acoustic-to-word LSTM model for large vocabulary speech recognition[C]//Interspeech, 2017:3707-3711.
[20] BLUCHE T, LOURADOUR J, MESSINA R. Scan, attend and read: end-to-end handwritten paragraph recognition with MDLSTM attention[C]//2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR). IEEE, 2017.
[21] JOZEFOWICZ R, ZAREMBA W, SUTSKEVER I. An empirical exploration of recurrent network architectures[C]//Proceedings of the 32nd International Conference on Machine Learning. JMLR.org, 2015, 37.
[22] SRIVASTAVA N, MANSIMOV E, SALAKHUTDINOV R. Unsupervised learning of video representations using LSTMs[C]//International Conference on Machine Learning, 2015.
[23] LI Jiulong, ZHOU Lingke. Detecting and identifying gross errors based on "rule"[J]. Computer and Modernization, 2012.