Grid edge analytics is essential for energy efficiency

Operating the grid without leveraging analytics is a missed opportunity, claims Charlie Nobles – vice president of sales for smart tech developer Ubicquia – who believes that analytics is a key yet often overlooked element in utilities’ workflow.

As such, claims Nobles, consumers and utilities alike are missing out on the untapped potential that opens up when holistic analytics is given the spotlight.

By Yusuf Latief

How can predictive data enable utilities to improve energy savings?

Utilities typically have plenty of energy; it's during peak hours, or when serving commercial and industrial loads, that they become concerned.

By monitoring assets, energy consumption can be more readily controlled. Monitoring can extend to voltage sags and swells on the circuit to reveal circuit issues. Momentary outages caused by reclosers can be quickly detected and resolved. Alerts can be set to flag a power failure, or a fixture component that has burned out.
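As a rough illustration of the kind of alerting described above, the sketch below classifies per-unit voltage readings from a monitored asset. The 0.9/1.1 pu sag and swell thresholds follow the common IEEE 1159 conventions; a production detector would also track event duration, which this simplified example omits.

```python
# Hedged sketch: classifying RMS voltage samples from a monitored asset.
# Thresholds follow the common IEEE 1159 convention (interruption < 0.1 pu,
# sag < 0.9 pu, swell > 1.1 pu); real detectors also consider how long
# each event lasts, which this example leaves out.

NOMINAL_V = 120.0  # assumed nominal service voltage (illustrative)

def classify_sample(voltage: float) -> str:
    """Label a single RMS voltage reading in per-unit terms."""
    pu = voltage / NOMINAL_V
    if pu < 0.1:
        return "interruption"  # likely outage or recloser operation
    if pu < 0.9:
        return "sag"
    if pu > 1.1:
        return "swell"
    return "normal"

def alert_events(samples):
    """Return (index, label) for every abnormal reading in a series."""
    return [(i, classify_sample(v))
            for i, v in enumerate(samples)
            if classify_sample(v) != "normal"]

readings = [120.1, 119.8, 104.0, 0.0, 0.0, 118.9, 133.5]
print(alert_events(readings))
```

A two-sample run of zero volts, for instance, would surface as back-to-back interruptions, which is the signature of a recloser operating, rather than waiting for a customer to call.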

With all this information at hand, the utility can then promptly get to work. These issues need to be addressed without waiting for customers to call and complain. Quicker, proactive responses mean that maintenance spending can be surgically targeted.

Charlie Nobles – vice president of sales for Ubicquia

From a utility standpoint, that's what they're looking to do. However, for the data that comes from monitoring distribution transformers, it's a different story. The grid is only smart in places. And this is a bit of a fallacy that tags along when we talk about the ubiquitous 'smart grid'.

Inside the substation, no doubt, it's smart. Utilities usually have switches and regulators for different assets. But they're disparate and there are very few. Across the miles and miles of the grid, vast parts go unmonitored.

There’s a need for cost-effective, scalable monitoring solutions across distribution transformers. Not station transformers inside the substation that are already smart, but rather the millions of distribution transformers. They’re not monitored. And real time data on asset health is needed.


What level of insight can be gained by monitoring asset health?

Today the grid is far less resilient than it was 20 years ago. The grid was not designed for edge-attached technologies and distributed energy resources. It was not designed for a preponderance of electric vehicle plugins. And it was not designed for an extensive feed of rooftop solar or generation at the edge.

Secondly, the grid has been sectionalised. In order to reduce the number of people affected by any one outage, feeders and laterals have been chopped up into pieces so that if there’s an outage the affected area is much smaller and the number of moving parts is much greater.

Maintenance and refurbishment of grid assets have not kept up with the ageing of the grid. LADWP (the Los Angeles Department of Water and Power) conducted a study and found that over a quarter of its utility poles were more than 60 years old. This means that transformers, poles and other assets well beyond their design life are still in use.

The increasing number of environmental challenges – ice storms, wind storms, hurricanes, extreme heat, extreme cold – all seem to be accelerating. When that factor is added to an ageing grid, the many elements operating beyond their design life compound the grid's complexity.

And with the challenge of devices at the edge back-feeding the grid, there's a hot mess. In this situation, there's an incredible need to monitor parts of the grid that have never before been looked at. But while there's a lot of work in predictive analytics, the analytics is only as good as the data being fed back.

If the only data fed is that from the substation or from the meter, and utilities are trying to infer what’s happening in the part of the grid between those two extremes, then the predictive analytics will be fairly thin.

But with real time data from devices in that gap – and this does focus on distribution transformers – the analytics would be much more accurate, because it's being fed more real time data about what's going on between the substation and the meter.


How do you see the grid edge evolving?

Imagine that a utility has 100,000 distribution transformers.

Maybe 5% or 10% are critical to monitor: those serving critical loads, those that are 'hard to get to', those 'out in the middle of nowhere', those near waterways, etc. Imagine now that we have solutions that can monitor those assets and, by feeding the data back, we can get alarms.

The data itself provides tremendous insight. But it also provides context. This data needs to be coupled with other data – by looking at voltages across all transformers on the same circuit, circuit dynamics can be inferred. One can then look at a capacitor bank or the voltage regulators on that same circuit and see full circuit telemetry.

The data from these transformers or other devices by itself has limited value. What's going to happen is utilities are going to start seeing real time data from thousands of devices never before monitored.

But then the place of analytics comes into question. One can't look at raw data alone; context around the data is needed. That data needs to be coupled with data from other devices to create new software-defined alerts – new analytics that give circuit condition, not just asset condition. And this is going to be a huge data analytics lift for data scientists.
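One way to picture such a software-defined, circuit-level alert – all device IDs and thresholds here are illustrative assumptions, not a description of any vendor's product – is a rule that judges each transformer's reading against its neighbours on the same circuit, so the alert distinguishes an asset fault from an upstream circuit problem:

```python
# Hedged sketch of a "software-defined alert" that adds circuit context:
# a transformer's abnormal voltage points to an asset problem only if its
# neighbours on the same circuit read normally; if the whole circuit is
# off-nominal, the issue is upstream (e.g. a regulator or capacitor bank).
# Device IDs and the 3% deviation threshold are illustrative.
from statistics import median

def circuit_alerts(readings: dict[str, float], threshold: float = 0.03):
    """readings maps transformer id -> per-unit voltage on one circuit.

    Returns [("circuit", median_pu)] if the whole circuit is off-nominal,
    otherwise a list of ("asset", id) alerts for transformers that deviate
    from the circuit median by more than `threshold`.
    """
    circuit_pu = median(readings.values())
    if abs(circuit_pu - 1.0) > threshold:
        return [("circuit", circuit_pu)]  # upstream regulation issue
    return [("asset", tid)
            for tid, pu in readings.items()
            if abs(pu - circuit_pu) > threshold]

# One sagging transformer on an otherwise healthy circuit:
print(circuit_alerts({"tx1": 1.00, "tx2": 0.99, "tx3": 0.92}))
```

The design choice is the point: the same raw voltage reading produces a different alert depending on what the rest of the circuit is doing, which is the circuit-condition context described above.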

A lot of smaller utilities don’t have the manpower to really do that. They’re trying to operate the grid, but they don’t necessarily have the luxury of analysing the grid.

I think that vendors like ourselves – those developing these new devices that can scale to gather this data – also have to provide the context: the metadata. We have to help codify what we're seeing so that utilities can do what they do best, which is make it operational.