Power Systems Computation Conference 2026

Learning Power Flow With Confidence: A Probabilistic Guarantee Framework For Voltage Risk

The absence of formal performance guarantees in machine learning (ML) has limited its adoption for safety-critical power system applications, where confidence and interpretability are as vital as accuracy. In this work, we present a probabilistic guarantee for power flow learning and voltage risk estimation, derived through the framework of Gaussian Process (GP) regression. Specifically, we establish a bound on the expected estimation error that connects the GP’s predictive variance to confidence in voltage risk estimates, ensuring statistical equivalence with Monte Carlo–based ACPF risk quantification. To enhance model learnability in the low-data regime, we first design the Vertex-Degree Kernel (VDK), a topology-aware additive kernel that decomposes voltage–load interactions into local neighborhoods for efficient large-scale learning. Building on this, we introduce a network-swipe active learning (AL) algorithm that adaptively samples informative operating points and provides a principled stopping criterion without requiring out-of-sample validation. Together, these developments mitigate the principal bottleneck of ML-based power flow, namely its lack of guaranteed reliability, by combining data efficiency with analytical assurance. Empirical evaluations across IEEE 118-, 500-, and 1354-bus systems confirm that the proposed VDK-GP achieves mean absolute voltage errors below 1e-3 p.u., reproduces Monte Carlo–level voltage risk estimates with 15× fewer ACPF computations, and achieves over 120× reduction in evaluation time while conservatively bounding violation probabilities.
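To make the additive-kernel idea concrete, the following is a minimal sketch (not the authors' implementation) of a topology-aware additive GP in the spirit of the Vertex-Degree Kernel: the kernel over full load vectors is a sum of local RBF kernels, each acting only on the loads of a bus and its graph neighbors, and the GP posterior supplies both a voltage prediction and the predictive variance that the guarantee framework ties to confidence. The toy 4-bus network, neighborhood sets, length scale, and synthetic voltage function below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-bus line network: each bus's local kernel sees only
# itself and its immediate graph neighbors (the VDK-style decomposition).
neighborhoods = {
    0: [0, 1],
    1: [0, 1, 2],
    2: [1, 2, 3],
    3: [2, 3],
}

def local_rbf(Xa, Xb, ls=0.2):
    """RBF kernel evaluated on a sub-vector of the load dimensions."""
    d2 = np.sum((Xa[:, None, :] - Xb[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ls**2)

def vdk(Xa, Xb):
    """Additive kernel: sum of local RBF kernels over bus neighborhoods."""
    K = np.zeros((Xa.shape[0], Xb.shape[0]))
    for nbrs in neighborhoods.values():
        K += local_rbf(Xa[:, nbrs], Xb[:, nbrs])
    return K

# Synthetic training data: load vectors -> one bus's voltage magnitude.
# The linear-plus-noise voltage model is purely for illustration.
X = rng.uniform(0.8, 1.2, size=(30, 4))
y = 1.0 - 0.05 * X.sum(axis=1) + 0.01 * rng.standard_normal(30)

noise = 1e-3
K = vdk(X, X) + noise * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# GP posterior at new operating points: mean and predictive variance.
Xs = rng.uniform(0.8, 1.2, size=(5, 4))
Ks = vdk(Xs, X)
mean = Ks @ alpha
v = np.linalg.solve(L, Ks.T)
var = np.diag(vdk(Xs, Xs)) - np.sum(v ** 2, axis=0)
```

In an active-learning loop of the kind the abstract describes, `var` is the quantity one would both sample against (query operating points where it is large) and monitor for the stopping criterion, since the error bound is driven by the predictive variance rather than by out-of-sample validation.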

Parikshit Pareek
Indian Institute of Technology Roorkee
India

Deepjyoti Deka
MIT Energy Initiative
United States

Sidhant Misra
Los Alamos National Laboratory
United States

 

