Powerformer: A Section-adaptive Transformer for Power Flow Adjustment

Abstract

In this paper, we present a novel transformer architecture tailored for learning robust power system state representations, aiming to optimize power dispatch for power flow adjustment across different transmission sections. Specifically, our proposed approach, named Powerformer, develops a dedicated section-adaptive attention mechanism, departing from the self-attention employed in conventional transformers. This mechanism effectively integrates power system states with transmission section information, which facilitates the development of robust state representations. Furthermore, by considering the graph topology of the power system and the electrical attributes of bus nodes, we introduce two customized strategies to further enhance expressiveness: graph neural network propagation and a multi-factor attention mechanism. Extensive evaluations are conducted on three power system scenarios, including the IEEE 118-bus system, a realistic China 300-bus system, and a large-scale European system with 9241 buses, where Powerformer demonstrates superior performance over several popular baseline methods. The code is available at: https://github.com/Cra2yDavid/Powerformer
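To make the core idea more concrete, the following is a minimal PyTorch sketch of one plausible reading of a section-adaptive attention layer: transmission-section features supply the queries while bus-state embeddings supply the keys and values, replacing standard self-attention. This is an illustrative sketch only; the class and parameter names (SectionAdaptiveAttention, state_dim, section_dim, n_heads) are hypothetical and are not taken from the released Powerformer code.

```python
# Illustrative cross-attention sketch in the spirit of section-adaptive attention.
# All names and dimensions below are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class SectionAdaptiveAttention(nn.Module):
    """Queries from transmission-section features; keys/values from bus states."""

    def __init__(self, state_dim: int, section_dim: int, embed_dim: int = 64, n_heads: int = 4):
        super().__init__()
        self.state_proj = nn.Linear(state_dim, embed_dim)      # bus-state embeddings
        self.section_proj = nn.Linear(section_dim, embed_dim)  # section embeddings
        self.attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)

    def forward(self, bus_states: torch.Tensor, section_feats: torch.Tensor) -> torch.Tensor:
        # bus_states:    (batch, n_buses, state_dim)
        # section_feats: (batch, n_sections, section_dim)
        q = self.section_proj(section_feats)  # section-conditioned queries
        kv = self.state_proj(bus_states)      # power-system state keys/values
        out, _ = self.attn(q, kv, kv)         # cross-attention instead of self-attention
        return out                            # (batch, n_sections, embed_dim)


if __name__ == "__main__":
    layer = SectionAdaptiveAttention(state_dim=8, section_dim=5)
    states = torch.randn(2, 118, 8)        # e.g. states of an IEEE 118-bus system
    sections = torch.randn(2, 3, 5)        # three monitored transmission sections
    print(layer(states, sections).shape)   # torch.Size([2, 3, 64])
```

In this sketch the section features act as a learned query bank, so the resulting representations are conditioned on the transmission section of interest; graph propagation and multi-factor attention would be layered on top of such a module in the full architecture.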

Publication
Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.1
张权
Ph.D. Student

张权 is a Ph.D. student (enrolled in 2021) at the College of Electrical Engineering, Zhejiang University, and an IEEE Student Member.