Adaptive Multi-Scale Representation Learning for System Indicator Prediction in Complex Operational Environments


Alaric Voss

Abstract

This paper proposes an adaptive multi-scale representation learning method for system indicator prediction, addressing the non-stationarity, dynamic dependencies, and scale heterogeneity of multidimensional indicator sequences in complex systems. A multi-scale convolutional decomposition module extracts features at different temporal granularities, capturing both short-term fluctuations and long-term trends in system state. An adaptive feature fusion mechanism then dynamically weights and constrains cross-scale features, enabling joint modeling and a balanced representation across temporal levels. Structurally, the model integrates hierarchical normalization and gated update units to stabilize feature flow and preserve the continuity of temporal dependencies, mitigating prediction degradation under high-frequency disturbances and distribution shift. In addition, a residual-propagation-based dynamic feature transformation layer couples local information with global semantics, further improving the model's representational capacity and generalization on complex time-varying signals. Experiments on representative system indicator datasets show that the proposed approach outperforms mainstream baselines on key metrics including MSE, MAE, MAPE, and RMSE, maintaining accuracy and stability under complex dynamic conditions and providing an effective solution for intelligent prediction and robust modeling of multidimensional system indicators.
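The two core ideas in the abstract, multi-scale feature extraction and adaptive cross-scale fusion, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the moving-average kernels, the variance-based fusion scores, and all function names are illustrative assumptions standing in for the learned convolutional decomposition and weighting described above.

```python
import numpy as np

def multi_scale_features(x, scales=(2, 4, 8)):
    """Extract features at several temporal granularities via moving-average
    convolution (a hand-crafted stand-in for the paper's learned
    multi-scale convolutional decomposition)."""
    feats = []
    for w in scales:
        kernel = np.ones(w) / w
        # 'same' mode keeps every scale aligned with the input length
        feats.append(np.convolve(x, kernel, mode="same"))
    return np.stack(feats)  # shape: (n_scales, T)

def adaptive_fusion(feats):
    """Softmax-weighted fusion across scales; here each scale's variance
    serves as a simple data-dependent score (the paper instead learns
    these weights)."""
    scores = feats.var(axis=1)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return (weights[:, None] * feats).sum(axis=0)  # fused (T,) series

# Toy signal: long-term trend (sine) plus short-term noise
t = np.linspace(0, 4 * np.pi, 128)
x = np.sin(t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
fused = adaptive_fusion(multi_scale_features(x))
print(fused.shape)  # (128,)
```

Larger windows emphasize the long-term trend while small windows retain short-term fluctuations; the data-dependent weights decide how much each granularity contributes to the fused representation.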

Article Details

How to Cite
Voss, A. (2023). Adaptive Multi-Scale Representation Learning for System Indicator Prediction in Complex Operational Environments. Journal of Computer Science and Software Applications, 3(5). Retrieved from https://mfacademia.org/index.php/jcssa/article/view/264
Section
Articles

References

B. Lim and S. Zohren, “Time-series forecasting with deep learning: A survey,” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 379, no. 2194, 2021.

A. van den Oord, S. Dieleman, H. Zen, K. Simonyan, O. Vinyals, A. Graves et al., “WaveNet: A generative model for raw audio,” arXiv:1609.03499, 2016.

S. Bai, J. Z. Kolter and V. Koltun, “An empirical evaluation of generic convolutional and recurrent networks for sequence modeling,” arXiv:1803.01271, 2018.

G. Lai, W. C. Chang, Y. Yang and H. Liu, “Modeling long- and short-term temporal patterns with deep neural networks,” Proceedings of the 41st International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR), pp. 95-104, 2018.

D. Salinas, V. Flunkert, J. Gasthaus and T. Januschowski, “DeepAR: Probabilistic forecasting with autoregressive recurrent networks,” International Journal of Forecasting, vol. 36, no. 3, pp. 1181-1191, 2020.

B. N. Oreshkin, D. Carpov, N. Chapados and Y. Bengio, “N-BEATS: Neural basis expansion analysis for interpretable time series forecasting,” arXiv:1905.10437, 2019.

C. Challu, K. G. Olivares, B. N. Oreshkin, F. G. Ramirez, M. M. Canseco and A. Dubrawski, “N-HiTS: Neural hierarchical interpolation for time series forecasting,” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 6, pp. 6989-6997, 2023.

M. Liu, A. Zeng, M. Chen, Z. Xu, Q. Lai, L. Ma and Q. Xu, “SCINet: Time series modeling and forecasting with sample convolution and interaction,” Advances in Neural Information Processing Systems, vol. 35, pp. 5816-5828, 2022.

H. Zhou, S. Zhang, J. Peng, S. Zhang, J. Li, H. Xiong and W. Zhang, “Informer: Beyond efficient transformer for long sequence time-series forecasting,” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 35, no. 12, pp. 11106-11115, 2021.

H. Wu, J. Xu, J. Wang and M. Long, “Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting,” Advances in Neural Information Processing Systems, vol. 34, pp. 22419-22430, 2021.

B. Lim, S. Ö. Arık, N. Loeff and T. Pfister, “Temporal fusion transformers for interpretable multi-horizon time series forecasting,” International Journal of Forecasting, vol. 37, no. 4, pp. 1748-1764, 2021.

A. Zeng, M. Chen, L. Zhang and Q. Xu, “Are transformers effective for time series forecasting?” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 9, pp. 11121-11128, 2023.

Z. Zhu, Y. Yan, R. Xu, Y. Zi and J. Wang, “Attention-UNet: A deep learning approach for fast and accurate segmentation in medical imaging,” Journal of Computer Science and Software Applications, vol. 2, no. 4, pp. 24-31, 2022.

M. Wang, “Multi-level attention and sequence modeling for dynamic user interest representation,” Transactions on Computational and Scientific Methods, vol. 3, no. 2, 2023.

B. Barlocker and X. Yan, “Contrastive representation learning for anomaly detection in cloud-based backend services,” 2021.

Z. Qiu, “A multi-scale deep learning and uncertainty estimation framework for comprehensive anomaly detection,” 2023.

H. Liu, “Structural regularization and bias mitigation in low-rank fine-tuning of LLMs,” 2023.

Y. Xing, “Enhancing advertising recommendation performance via integrated causal inference and exposure bias correction,” Journal of Computer Technology and Software, vol. 2, no. 3, 2023.

S. Pan, T. Hu, S. Sun, J. Yuan and J. Luo, “Help oneself in helping the others: The ecology of online support groups,” Proceedings of the IEEE International Conference on Big Data, pp. 2418-2427, 2019.

A. M. Jones, G. Sahin, Z. W. Murdock, Y. Ge, A. Xu, Y. Li et al., “USC-DCT: A collection of diverse classification tasks,” Data, vol. 8, no. 10, p. 153, 2023.

J. F. Torres, D. Hadjout, A. Sebaa et al., “Deep learning for time series forecasting: A survey,” Big Data, vol. 9, no. 1, pp. 3-21, 2021.