2025 International Conference on Computer and Knowledge Engineering, 2025-05-28

Title: A Synergistic Hybrid Architecture with Residual Attention and Mixture-of-Experts for Robust Hour-Ahead Forex Forecasting

Authors: Alireza Abbaszadeh, Seyed Abed Hosseini, Mohammad Reza Akbarzadeh Totonchi

Access to full-text not allowed by authors

Citation: BibTeX | EndNote

Abstract

Forecasting highly volatile exchange rates such as EUR/USD is a critical challenge: traditional models fail to capture the complex, volatile, nonlinear dynamics of these rates. While modern deep learning architectures have shown considerable promise in recent years, their performance varies substantially, and achieving optimal synergy among their advanced components, such as attention and mixture-of-experts, remains an open research question. This paper introduces a novel, synergistic hybrid architecture engineered to address this gap. Our model (V8) strategically integrates a Residual Block with Multi-Head Attention (RB-MHA) for robust feature extraction, a Bidirectional LSTM for temporal modeling, and a Mixture-of-Experts (MoE) module for adaptive prediction under varying market conditions. Evaluated on over 15 years of hourly EUR/USD data using a rigorous, leak-free methodology, our model sets a new state of the art with a Root Mean Squared Error (RMSE) of 0.001863 and an R² of 0.985467 on the unseen test set. This result constitutes a 44.1% reduction in RMSE over a strong deep learning baseline (V1), demonstrating the significant impact of our synergistic design and establishing a new benchmark for hour-ahead forex forecasting.
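To make the Mixture-of-Experts idea from the abstract concrete, the following is a minimal NumPy sketch of an MoE prediction head: a softmax gate weights the outputs of several expert heads, yielding a convex combination that can adapt to different regimes. This is only an illustration of the general MoE mechanism, not the authors' implementation; all names (`moe_predict`, `expert_weights`, `gate_weights`) and the choice of linear experts over a single feature vector are assumptions for the sketch.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(h, expert_weights, gate_weights):
    """Combine per-expert linear predictions with a learned softmax gate.

    h              : (d,) feature vector (e.g. a recurrent layer's final state)
    expert_weights : (n_experts, d), one linear head per expert
    gate_weights   : (d, n_experts), gating-network parameters
    """
    expert_outs = expert_weights @ h    # (n_experts,) one scalar prediction per expert
    gate = softmax(h @ gate_weights)    # (n_experts,) mixture weights, sum to 1
    return float(gate @ expert_outs)    # convex combination of expert outputs

# Toy usage: 3 hypothetical experts over a 4-dimensional feature vector
rng = np.random.default_rng(0)
h = rng.normal(size=4)
prediction = moe_predict(h, rng.normal(size=(3, 4)), rng.normal(size=(4, 3)))
```

Because the gate is a softmax, the final prediction always lies between the smallest and largest expert output, which is what lets different experts specialize in different market conditions while the gate arbitrates.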

Keywords

Foreign Exchange Forecasting; Deep Learning; Residual Networks; Attention Mechanism; Mixture of Experts (MoE).

@inproceedings{paperid:1106419,
author = {Abbaszadeh, Alireza and Hosseini, Seyed Abed and Akbarzadeh Totonchi, Mohammad Reza},
title = {A Synergistic Hybrid Architecture with Residual Attention and Mixture-of-Experts for Robust Hour-Ahead Forex Forecasting},
booktitle = {2025 International Conference on Computer and Knowledge Engineering},
year = {2025},
location = {Mashhad, Iran},
keywords = {Foreign Exchange Forecasting; Deep Learning; Residual Networks; Attention Mechanism; Mixture of Experts (MoE).},
}


%0 Conference Proceedings
%T A Synergistic Hybrid Architecture with Residual Attention and Mixture-of-Experts for Robust Hour-Ahead Forex Forecasting
%A Abbaszadeh, Alireza
%A Hosseini, Seyed Abed
%A Akbarzadeh Totonchi, Mohammad Reza
%J 2025 International Conference on Computer and Knowledge Engineering
%D 2025
