Forecasting Realized Volatility of Crude Oil: Can MLP-Based Models, CNN-Based Models and Transformer-Based Models Help?

Authors

  • Guoyu Gu School of Economics and Management, Nanjing University of Science and Technology, Nanjing, China

DOI:

https://doi.org/10.62051/zqhcta04

Keywords:

Volatility forecasting; Shanghai crude oil futures; Deep learning.

Abstract

The Shanghai crude oil futures market exhibits a pronounced speculative character, which underscores the importance of volatility forecasting. This study therefore applies improved MLP-based, CNN-based and Transformer-based time-series forecasting models to predict the realized volatility of Shanghai crude oil futures. The results show that the MLP-based, CNN-based and Transformer-based models outperform the original baseline model in three settings: expanding-window, rolling-window and long-sequence prediction.
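The target variable and evaluation schemes named in the abstract can be sketched in a few lines. The snippet below shows a standard realized-volatility estimator (square root of summed squared intraday log returns) and the two backtesting splits; the sampling frequency, window sizes and function names are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def realized_volatility(intraday_prices):
    """Daily realized volatility: sqrt of the sum of squared intraday
    log returns (a common estimator; the sampling frequency used in
    the paper is an assumption here)."""
    log_ret = np.diff(np.log(intraday_prices))
    return np.sqrt(np.sum(log_ret ** 2))

def expanding_window_splits(n, start):
    """Expanding window: the training set grows by one observation per
    step, always starting at index 0. Yields ((train_start, train_end),
    test_index) pairs."""
    for t in range(start, n):
        yield (0, t), t

def rolling_window_splits(n, window):
    """Rolling window: a fixed-size training window slides forward one
    observation per step. Yields ((train_start, train_end), test_index)."""
    for t in range(window, n):
        yield (t - window, t), t
```

In both schemes the model is refit (or re-applied) on each training slice and evaluated on the single next observation, which is how out-of-sample volatility forecasts are typically scored.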


References

[1] Kilian L, Vigfusson R J. Do oil prices help forecast U.S. real GDP? The role of nonlinearities and asymmetries. Journal of Business and Economic Statistics, 2013, 31(1): 78-93.

[2] Aloui R, Gupta R, Miller S M. Uncertainty and crude oil returns. Energy Economics, 2016, 55: 92-100.

[3] Krane J, Medlock K B. Geopolitical dimensions of US oil security. Energy Policy, 2018, 114: 558-565.

[4] Zhang Y J, Pan X. Does the risk aversion of crude oil market investors have directional predictability for the precious metal and agricultural markets? China Agricultural Economic Review, 2021, 13(4): 894-911.

[5] Li Y, Jiang S, Li X, et al. The role of news sentiment in oil futures returns and volatility forecasting: Data-decomposition based deep learning approach. Energy Economics, 2021, 95.

[6] Brandt M W, Gao L. Macro fundamentals or geopolitical events? A textual analysis of news events for crude oil. Journal of Empirical Finance, 2019, 51.

[7] Zhang Y J, Yan X X. The impact of US economic policy uncertainty on WTI crude oil returns in different time and frequency domains. International Review of Economics and Finance, 2020, 69.

[8] Jackson J K, Weiss M A, Schwarzenberg A B, et al. Global economic effects of COVID-19//The Effects of COVID-19 on the Global and Domestic Economy. 2021.

[9] Qureshi A, Rizwan M S, Ahmad G, et al. Russia–Ukraine war and systemic risk: Who is taking the heat? Finance Research Letters, 2022, 48: 103036.

[10] Qin X. Oil shocks and financial systemic stress: International evidence. Energy Economics, 2020, 92: 104945.

[11] Orhan E. The effects of the Russia-Ukraine war on global trade//Journal of International Trade, Logistics and Law: Vol. 8. 2022.

[12] Vaswani A, Shazeer N, Parmar N, et al. Attention is All you Need//GUYON I, VON LUXBURG U, BENGIO S, et al. Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, December 4-9, 2017, Long Beach, CA, USA. 2017: 5998-6008.

[13] Brown T B, Mann B, Ryder N, et al. Language Models are Few-Shot Learners//LAROCHELLE H, RANZATO M, HADSELL R, et al. Advances in Neural Information Processing Systems 33: Annual Conference on Neural Information Processing Systems 2020, NeurIPS 2020, December 6-12, 2020, virtual. 2020: 1877-1901.

[14] Dosovitskiy A, Beyer L, Kolesnikov A, et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale//9th International Conference on Learning Representations, ICLR 2021, Virtual Event, Austria, May 3-7, 2021. OpenReview.net, 2021.

[15] Zhou H, Zhang S, Peng J, et al. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting//35th AAAI Conference on Artificial Intelligence, AAAI 2021: Vol. 12B. 2021: 11106-11115.

[16] Wu H, Xu J, Wang J, et al. Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting//RANZATO M, BEYGELZIMER A, DAUPHIN Y N, et al. Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, NeurIPS 2021, December 6-14, 2021, virtual. 2021: 22419-22430.

[17] Zeng A, Chen M, Zhang L, et al. Are Transformers Effective for Time Series Forecasting?//WILLIAMS B, CHEN Y, NEVILLE J. Thirty-Seventh AAAI Conference on Artificial Intelligence, AAAI 2023, Washington, DC, USA, February 7-14, 2023: Vol. 37. AAAI Press, 2023: 11121-11128.

[18] Liu Y, Hu T, Zhang H, et al. iTransformer: Inverted Transformers Are Effective for Time Series Forecasting//The Twelfth International Conference on Learning Representations, ICLR 2024, Vienna, Austria, May 7-11, 2024. OpenReview.net, 2024.

[19] Chen S A, Li C L, Yoder N, et al. TSMixer: An all-MLP Architecture for Time Series Forecasting. CoRR, 2023, abs/2303.06053.

[20] Wang S, Wu H, Shi X, et al. TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting//The Twelfth International Conference on Learning Representations, ICLR 2024, Vienna, Austria, May 7-11, 2024. OpenReview.net, 2024.

[21] Kitaev N, Kaiser L, Levskaya A. Reformer: The Efficient Transformer//8th International Conference on Learning Representations, ICLR 2020, Addis Ababa, Ethiopia, April 26-30, 2020. OpenReview.net, 2020.

[22] Woo G, Liu C, Sahoo D, et al. ETSformer: Exponential Smoothing Transformers for Time-series Forecasting. CoRR, 2022, abs/2202.01381.

[23] Zhou T, Ma Z, Wen Q, et al. FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting//CHAUDHURI K, JEGELKA S, SONG L, et al. International Conference on Machine Learning, ICML 2022, 17-23 July 2022, Baltimore, Maryland, USA: Vol. 162. PMLR, 2022: 27268-27286.

[24] Wu H, Wu J, Xu J, et al. Flowformer: Linearizing Transformers with Conservation Flows//CHAUDHURI K, JEGELKA S, SONG L, et al. International Conference on Machine Learning, ICML 2022, 17-23 July 2022, Baltimore, Maryland, USA: Vol. 162. PMLR, 2022: 24226-24242.

[25] Zhang Y, Yan J. Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting//The Eleventh International Conference on Learning Representations, ICLR 2023, Kigali, Rwanda, May 1-5, 2023. OpenReview.net, 2023.

[26] Nie Y, Nguyen N H, Sinthong P, et al. A Time Series is Worth 64 Words: Long-term Forecasting with Transformers//The Eleventh International Conference on Learning Representations, ICLR 2023, Kigali, Rwanda, May 1-5, 2023. OpenReview.net, 2023.

[27] Wu H, Hu T, Liu Y, et al. TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis//The Eleventh International Conference on Learning Representations, ICLR 2023, Kigali, Rwanda, May 1-5, 2023. OpenReview.net, 2023.


Published

31-12-2025

How to Cite

Gu, G. (2025). Forecasting Realized Volatility of Crude Oil: Can MLP-Based Models, CNN-Based Models and Transformer-Based Models Help? Transactions on Economics, Business and Management Research, 16, 252-270. https://doi.org/10.62051/zqhcta04