Power system resilience analysis methods aim to assess the ability of a power system to withstand and recover from high-impact, low-probability events, such as natural disasters. These methods often entail performing AC Optimal Power Flows (OPF) using nonlinear optimization algorithms over numerous sequential Monte Carlo (MC) iterations, which requires considerable computational resources. This may restrict the number of feasible simulations, with the risk of reducing the reliability of the analysis outcomes and limiting the scope of the resilience assessment. This paper introduces the use of Artificial Intelligence (AI) to enhance the computational performance of power system resilience analysis. It proposes using artificial neural networks (ANNs) as a surrogate model to approximate the OPF solution for different scenarios and contingencies. The case study demonstrates that ANNs provide an attractive solution, reducing the computational burden to a fraction of that of classical OPF approaches while maintaining remarkable accuracy in the estimation of resiliency KPIs, such as energy not served (ENS) and survivability. A case study of resiliency analysis with the ANN application is presented. The analysis is conducted with reXplan, a novel tool for climate-related power system resilience analysis, and an ANN built on the PyTorch framework. SimBench, an open dataset for network analysis, provides the grid data, generation, and demand profiles. A comparison between the KPIs calculated with the ANN and those obtained from OPF simulations is provided, showing the effectiveness and efficiency of the proposed approach.
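To illustrate the surrogate idea, the sketch below shows a minimal PyTorch regression network that maps scenario and contingency features to OPF outputs such as served load; the architecture, feature set, and layer sizes are assumptions chosen for illustration, not the model actually used in the paper.

```python
import torch
import torch.nn as nn

class OPFSurrogate(nn.Module):
    """Minimal feed-forward surrogate for AC-OPF results (illustrative only;
    the architecture and sizes are assumptions, not the paper's model)."""
    def __init__(self, n_features: int, n_outputs: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_outputs),  # e.g. served load per bus
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def train(model, loader, epochs: int = 100, lr: float = 1e-3):
    """Supervised training on (scenario features, OPF result) pairs
    pre-computed with a conventional AC-OPF solver."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model
```

Once trained on such pairs, the network can replace the OPF call inside the Monte Carlo loop, so each iteration reduces to a single forward pass.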
The paper presents a mixed-method approach for the analysis of power systems under increased uncertainty, related to the growing penetration of variable renewable energy and to the country-specific constraints found in fragile states. In the formulated methodology, deterministic and probabilistic load flow each play a specific, necessary, and interacting role. To establish the soundness of the methodology, the analysis is conducted for a real case study, along with wind speed measurements (eleven months in duration), visual model validations, and statistical and load flow analysis. The probabilistic simulations are based on Monte Carlo (MC) analysis. Synthetic data are generated from probability distribution functions (PDFs) fitted to the original measured samples, operational constraints, and load uncertainties. These data are processed by load flow simulations, and the results are consolidated and analyzed. To facilitate the implementation of the proposed methods, scripts have been developed in the Python programming language for statistical data analysis, sample generation, post-processing, data visualization, and interaction with conventional load flow software. The scripts are publicly available for download. The proposed methodology, conceptualized for developing and fragile states, may also serve as a basis for power system planning wherever the number of uncertainties is no longer negligible and deterministic methods alone would provide inadequate results.
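As a minimal sketch of the sample-generation step, the code below fits a distribution to measured wind speeds and draws synthetic samples for the Monte Carlo load flow runs; the Weibull choice, file name, and variable names are assumptions for illustration, not necessarily those adopted in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical measurement file with one wind speed value per row.
measured_wind = np.loadtxt("wind_speed_measurements.csv")

# Fit a two-parameter Weibull distribution (location fixed at zero),
# a common choice for wind speed data.
shape, loc, scale = stats.weibull_min.fit(measured_wind, floc=0)

# Draw one synthetic wind speed per Monte Carlo iteration.
n_iterations = 10_000
rng = np.random.default_rng(seed=0)
synthetic_wind = stats.weibull_min.rvs(shape, loc=loc, scale=scale,
                                       size=n_iterations, random_state=rng)

# Each synthetic sample would then be converted to wind power and passed,
# together with load uncertainties, to the load flow engine.
```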
The growing influence of digitalization and the widespread availability of data from electrical systems have unlocked significant potential for data-driven analysis. This has led to substantial opportunities for enhancing power system performance, reliability, and overall operational efficiency. One concrete illustration of such data-driven analyses is the assessment and prediction of harmonic distortion using statistical learning models, notably linear regression. The implementation of linear regression models may present various challenges, especially when dealing with time series data, which is also the case for harmonic measurements. One of the main issues is the presence of autocorrelation in the error terms. In this situation, the conventional linear regression approach (ordinary least squares) tends to lose reliability, as it yields biased standard errors that affect the accuracy of the analysis. Statisticians have developed various methods to handle situations where the error terms exhibit autocorrelation. Interestingly, these methods have been largely overlooked, or at best sparsely discussed, by power system engineers. The primary objective of this paper is to contribute to narrowing this gap by presenting the application of the Cochrane-Orcutt method in the context of harmonic data analysis. The Cochrane-Orcutt method is a well-known econometric technique used to account for serial correlation in the error terms of a linear model, and it can also be applied in the harmonic analysis domain. The paper is first supported by an illustrative example based on simulated data, designed to provide clear insight into the subject matter. Additionally, a case study with real-world measurement data is presented to further enhance the understanding.
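As a minimal sketch of the procedure, the code below iteratively estimates the AR(1) coefficient of the residuals and re-fits the model on quasi-differenced data; the data layout and iteration count are placeholders, not the paper's case study.

```python
import numpy as np
import statsmodels.api as sm

def cochrane_orcutt(y, X, n_iter: int = 10):
    """Iterative Cochrane-Orcutt estimation (illustrative sketch).

    y is a 1-D numpy array, X a 2-D array of regressors; a constant
    column is added to X before fitting.
    """
    X = sm.add_constant(X)
    beta = sm.OLS(y, X).fit().params
    rho, res = 0.0, None
    for _ in range(n_iter):
        # Residuals of the original model with the current coefficients.
        e = y - X @ beta
        # AR(1) coefficient of the residuals (regression without intercept).
        rho = np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)
        # Quasi-difference the data and re-estimate the coefficients by OLS.
        y_star = y[1:] - rho * y[:-1]
        X_star = X[1:] - rho * X[:-1]
        res = sm.OLS(y_star, X_star).fit()
        beta = res.params
    return res, rho
```

Because the constant column of X is quasi-differenced as well, the transformed regression returns the original intercept and slopes directly, and the reported standard errors account for the AR(1) error structure. A closely related feasible-GLS iteration is also available in statsmodels through the GLSAR class.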
The rapid growth of renewable generation, driven by increasing concerns about climate change, sustainable energy sources, and energy independence, has presented significant challenges for distribution system operators (DSOs). Integrating intermittent sources while ensuring grid stability and reliability demands a robust evaluation of hosting capacity (HC) in power distribution networks. However, prevailing HC analysis methodologies predominantly rely on conservative assumptions or worst-case scenarios, often leading to impractical and unreliable outcomes. Commercial tools, though offering readily available solutions for hosting capacity analysis, suffer from limitations, as they typically assess the HC of individual nodes independently or focus on only a few distributed scenarios. More elaborate methods have been proposed to cope with the complexity and stochastic nature of the HC problem; however, little consideration has been given to their scalability to large distribution systems. In response, this paper introduces an efficient methodology based on stochastic power system simulations and statistical analysis. The proposed approach undergoes comprehensive assessments to substantiate its efficacy, with a key focus on validating its accuracy for reliable HC estimates. Using the bootstrap method, a resampling technique, multiple samples are generated to mimic possible Distributed Energy Resources (DER) deployments, enabling the estimation of confidence intervals for HC metrics. Applying statistical analysis to the power system simulation results provides insights into the expected variability and uncertainty of the problem. These insights guide the selection of the minimum number of DER deployment scenarios necessary to ensure an efficient and well-informed HC assessment process. The proposed methodology effectively addresses the challenges of HC analysis, offering scalability to large distribution networks. Its efficiency and enhanced accuracy make it a valuable tool for DSOs in facilitating the integration of renewable generation in a dependable and practical manner.
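As an illustration of the resampling step, the sketch below computes a percentile bootstrap confidence interval for a hosting-capacity metric from a small set of simulated DER deployment scenarios; the numbers are hypothetical, not results from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hosting capacity (e.g., in MW) obtained from power system simulations
# of a limited number of random DER deployment scenarios (hypothetical values).
hc_samples = np.array([3.2, 2.8, 3.5, 3.0, 2.6, 3.3, 2.9, 3.1, 2.7, 3.4])

# Resample with replacement to mimic alternative DER deployment draws.
n_boot = 10_000
boot_means = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(hc_samples, size=hc_samples.size, replace=True)
    boot_means[b] = resample.mean()

# Percentile bootstrap 95% confidence interval for the mean hosting capacity.
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"Mean HC: {hc_samples.mean():.2f} MW, "
      f"95% CI: [{ci_low:.2f}, {ci_high:.2f}] MW")
```

The width of such intervals can indicate whether additional DER deployment scenarios need to be simulated before the HC estimate is considered sufficiently tight.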

The increase in extreme weather events induced by climate change, and their impact on power systems, has created a need for tools that can assess and improve system resilience. To meet this need, the R&D team of the Energy System Consulting segment of Tractebel Engineering GmbH has developed a novel tool called reXplan. ReXplan is a Python library for resilient electrical system planning under extreme hazard events, such as windstorms, earthquakes, floods, and wildfires. It is designed to help power system operators and planners make better-informed decisions to create more resilient and secure power grids. This paper provides an overview of reXplan's main features, architecture, methodology of analysis, and metrics. ReXplan is capable of modeling both spatiotemporal extreme events and electrical power systems. It leverages technologies and techniques such as the Julia/JuMP package and sequential Monte Carlo analysis with multivariate stratified sampling to achieve high accuracy in the results while reducing the computational load. By quantifying resiliency metrics, comparing different planning strategies, and validating technical solutions, reXplan can help reduce the risks of severe outages in the grid. The software can be easily integrated into common data science environments and is available as a Python library. For computationally intensive tasks, such as optimal power flow, reXplan exploits the speed of the Julia programming language, using PowerModels.jl as a backend.
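As a generic illustration of stratified scenario sampling, the sketch below uses Latin hypercube sampling, one common multivariate stratification scheme, to draw Monte Carlo scenarios over two hypothetical hazard variables; it does not reproduce reXplan's internal implementation.

```python
from scipy.stats import qmc

# Latin hypercube sampler over two stratified dimensions; each row of the
# output is one Monte Carlo scenario on the unit hypercube.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit_samples = sampler.random(n=200)

# Scale the unit samples to assumed physical ranges, e.g. peak wind speed
# in m/s and outage duration in hours (hypothetical variables and bounds).
lower, upper = [20.0, 1.0], [60.0, 48.0]
scenarios = qmc.scale(unit_samples, lower, upper)
```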