Power System Realtime Loss Analysis
Challenge
Loss studies are arduous, tedious processes that typically take 18-24 months at an estimated cost of $300,000 to $500,000. The traditional approach leverages load research meters and class average load profiles (CALPs) to extrapolate and allocate hourly system loading across scalar-metered service points. This long-standing methodology identifies annual and seasonal peak days and then disaggregates customer class contributions to the peak day, and the associated customer class losses, based upon extrapolated CALPs. Because loss studies are so resource intensive and take so long to complete, their results can be out of date shortly after they are published.
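The CALP-based allocation described above can be sketched in a few lines. This is a minimal illustration, not the actual study methodology: the profiles, energy figures, and function names below are all hypothetical, and a real study would work with 8,760 hours and measured load research data rather than a single 24-hour day.

```python
# Hedged sketch: allocate hourly system load to customer classes using
# class average load profiles (CALPs). All profiles and energy values
# are illustrative assumptions, not data from any actual study.

# Hypothetical per-unit CALPs for one peak day (24 hourly weights).
calps = {
    "residential": [0.6] * 7 + [0.8] * 4 + [0.9] * 6 + [1.0] * 4 + [0.7] * 3,
    "commercial":  [0.4] * 6 + [1.0] * 10 + [0.6] * 8,
}

# Hypothetical scalar (billed) energy per class for the day, in kWh.
class_energy_kwh = {"residential": 5_000_000, "commercial": 3_000_000}

def allocate_hourly(calps, class_energy_kwh):
    """Spread each class's scalar energy over the day in proportion to
    its CALP weights, then sum the classes into a system profile."""
    hourly_by_class = {}
    for cls, profile in calps.items():
        total_weight = sum(profile)
        hourly_by_class[cls] = [
            class_energy_kwh[cls] * w / total_weight for w in profile
        ]
    system = [sum(h[i] for h in hourly_by_class.values()) for i in range(24)]
    return hourly_by_class, system

by_class, system_profile = allocate_hourly(calps, class_energy_kwh)

# Identify the peak hour and each class's contribution to it, the
# disaggregation step a loss study performs on peak days.
peak_hour = max(range(24), key=lambda i: system_profile[i])
peak_contributions = {cls: h[peak_hour] for cls, h in by_class.items()}
```

Each class's hourly values sum back to its scalar metered energy, which is the constraint that lets a study reconcile extrapolated profiles with billing data.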
Predicting Distribution Transformer Failures
Today many utilities have deployed advanced metering infrastructure (AMI) meters that report voltage and load readings at 15-minute intervals. This unprecedented volume of data not only enables a deeper understanding of the operation of the system, but it also provides the utility with a means to perform predictive analytics, the exploration of data with machine learning methods, to predict distribution transformer outages. The same machine learning methods may also be applied for predictive analytics on any device for which downstream data is collected.
Tens of thousands of distribution transformers fail each year. For many transformers, the point of failure is preceded by an abnormally high output voltage resulting from fusion of the high-side windings, which changes the effective ratio of the transformer. A similar failure mode exists in which the low-side windings fuse, resulting in an abnormally low output voltage. By examining the voltage profiles of transformers, PowerRunner can accurately and proactively identify distribution transformers that require immediate decommissioning and replacement.