DGAT: Dynamic Graph Attention-Transformer network for battery state of health multi-step prediction
State of health (SOH) prediction is crucial for battery health monitoring and relies on accurate measurement and analysis of charge–discharge data. Existing studies have shown that AI models that account for spatiotemporal dependencies contribute to accurate SOH prediction. However, their performance in multi-step prediction, in which SOH is forecast across multiple future cycles, remains limited. To enhance model expressiveness and improve multi-step SOH prediction, we propose the Dynamic Graph Attention-Transformer (DGAT), which integrates spatial feature extraction via graph attention networks, temporal modeling via Transformers, and adaptive fusion of the two. Battery health data are represented as a dynamic graph in which the time windows within each cycle are treated as individual nodes, preserving intra-cycle degradation patterns while capturing inter-cycle dependencies. Evaluated on the MIT battery dataset of 124 LiFePO4 cells, DGAT achieved RMSEs of 0.572 %, 0.601 %, and 0.677 % for SOH prediction at horizons of 10, 15, and 20 cycles, respectively, outperforming existing benchmarks. Ablation studies validate the effectiveness of the spatial, temporal, and fusion components in enhancing multi-step prediction reliability. These results demonstrate the capability of DGAT to process battery sensor data and highlight its potential as a robust solution for SOH prediction in battery management systems.
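To make the described pipeline concrete, the following is a minimal sketch of a DGAT-style model: graph attention over time-window nodes within each cycle (spatial branch), a Transformer encoder over the cycle sequence (temporal branch), and a learned gate that fuses the two before a multi-step SOH head. This is an assumption-laden illustration in PyTorch; all class names, shapes, and hyperparameters (e.g. `GraphAttentionLayer`, `hidden=64`, `horizon=10`) are illustrative and not taken from the paper's implementation.

```python
# Illustrative DGAT-style sketch (not the authors' code): GAT over intra-cycle
# time-window nodes + Transformer over cycles + gated fusion + multi-step head.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over the nodes (time windows) of one cycle."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, in_dim); adj: (nodes, nodes) with 1 = edge
        h = self.proj(x)                                         # (B, N, D)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)                # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)                # (B, N, N, D)
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))               # restrict to graph edges
        alpha = torch.softmax(e, dim=-1)                         # attention weights
        return F.elu(torch.matmul(alpha, h))                     # (B, N, D)


class DGATSketch(nn.Module):
    """Spatial GAT branch + temporal Transformer branch + gated fusion."""

    def __init__(self, feat_dim: int, hidden: int = 64, horizon: int = 10):
        super().__init__()
        self.gat = GraphAttentionLayer(feat_dim, hidden)
        self.temporal = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.embed = nn.Linear(feat_dim, hidden)
        self.gate = nn.Linear(2 * hidden, hidden)
        self.head = nn.Linear(hidden, horizon)  # SOH for the next `horizon` cycles

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (batch, cycles, windows, feat_dim) -- windows are graph nodes per cycle
        b, c, w, f = x.shape
        spatial = self.gat(x.reshape(b * c, w, f), adj)          # per-cycle node features
        spatial = spatial.mean(dim=1).reshape(b, c, -1)          # pool nodes -> cycle embedding
        temporal = self.temporal(self.embed(x.mean(dim=2)))      # inter-cycle dependencies
        g = torch.sigmoid(self.gate(torch.cat([spatial, temporal], dim=-1)))
        fused = g * spatial + (1 - g) * temporal                 # adaptive fusion of branches
        return self.head(fused[:, -1])                           # multi-step SOH forecast


if __name__ == "__main__":
    windows, feats = 8, 6
    adj = torch.ones(windows, windows)                           # fully connected window graph
    model = DGATSketch(feat_dim=feats, horizon=10)
    x = torch.randn(4, 30, windows, feats)                       # 4 batteries, 30 past cycles
    print(model(x, adj).shape)                                   # -> torch.Size([4, 10])
```

The multi-output head mirrors the abstract's evaluation setup, in which one forward pass yields SOH estimates for 10, 15, or 20 future cycles depending on the chosen horizon; how the paper constructs the intra-cycle adjacency and the exact fusion mechanism are not specified here, so the fully connected graph and sigmoid gate above are placeholders.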