In an electric utility industry that is over 100 years old, the smart meter is in its infancy in terms of deployment and usage. Because the meter is the front-end cash register for the utility, it must be accurate.
Plenty of requirements, specifications and regulations exist to ensure the accuracy of meters and the utilities that deploy them, but in practice, once a meter design is certified, manufactured and deployed, its accuracy is known only at the point of production testing. How a given meter fares in the field until it is decommissioned is inferred only through statistical sample testing.
Moreover, meter tampering is a major source of revenue loss for utilities. Although often considered a larger problem in developing economies, it is significant and on the rise in many developed regions. For example, a report titled “Tackling Electricity Theft – Consultation” released in July 2013 from the UK-based Office of Gas and Electricity Markets (Ofgem) estimates that over £200 million worth of electricity is stolen each year in the UK and an additional £25 million is spent by utilities investigating theft and repairing or replacing tampered equipment.
Knowledge of Accuracy for Complete Meter is Key
The major advantage of smart meters over electronic and mechanical meters is their connectivity. Connected smart meters can remotely report the amount of electricity used, implement outage management, collect time-of-use data and guard against some types of tampering. Is it possible, however, to perform more sophisticated diagnostics on the key measurement function of the meter itself?
Other industries with mission-critical functions, such as automotive and industrial, introduce the concept of “functional safety” into their diagnostics requirements, which in essence checks the equipment to determine if it is functioning correctly before, during and after it is needed. One such function for the utility meter industry is meter accuracy during its deployed life.
The industry currently performs field sample testing and relies on implied precision of components inside meters remaining in calibration while in the field, but this approach comes with risk. Field accuracy monitoring is important because accuracy is influenced by the sensor, which is exposed to high current, voltage events and harsh environments. It is important, therefore, that diagnostics include monitoring the complete meter, including sensors and all electronics.
Meter accuracy checks usually require human intervention, disconnection and dedicated equipment and, therefore, cannot be completed in the field without significant cost or disruption. Non-invasive monitoring technology on every meter can change this, however.
Opportunities for Big Data Analytics
A good question for information system architects is: “What would you do if you could periodically obtain the accuracy of each and every meter deployed in the field?” This capability could weed out failures and outliers, but the greater opportunity comes from gathering and analyzing information about the whole meter population.
Monitoring accuracy does not contravene any regulation, but it could give you an advantage in how you manage your meters. Even gathered hourly or daily, the volume of data is not huge, yet the analytical possibilities are broad.
Figure 1 shows a scenario for monitoring the accuracy of a meter population at fine granularity, which allows differences across the population over its usage lifetime to be extracted. This might yield insights about differences between manufacturing lots, suppliers, deployment regions or electrical grid topologies.
That data could also be correlated with other metrics, such as seasonality, temperature, humidity and power usage, to determine whether trends exist that would drive the specifications of future meters toward more repeatable field measurements.
In addition, knowing how a whole population is performing gives insight into what to expect from the sample testing that is required by the regulating agencies. The implementation of big data analytics for entire populations of meters would allow better handling of the liability risk that is assumed by the utilities.
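As a toy illustration of the population-level screening described above, the following sketch groups hypothetical per-meter accuracy errors by manufacturing lot and flags lots whose mean error has drifted outside a tolerance band. The lot names, readings and the 0.5% threshold are all invented for illustration and do not come from any real deployment.

```python
from statistics import mean

# Hypothetical accuracy errors (%) reported by deployed meters, keyed by
# manufacturing lot. All values are invented for illustration.
readings = {
    "lot-A": [0.05, -0.02, 0.08, 0.01, -0.04],
    "lot-B": [0.61, 0.55, 0.72, 0.58, 0.66],   # a lot drifting out of band
    "lot-C": [-0.03, 0.02, 0.00, 0.04, -0.01],
}

TOLERANCE = 0.5  # assumed accuracy band, % of reading

def flag_drifting_lots(readings, tolerance):
    """Return the lots whose mean error exceeds the tolerance band."""
    return sorted(
        lot for lot, errors in readings.items()
        if abs(mean(errors)) > tolerance
    )

print(flag_drifting_lots(readings, TOLERANCE))  # ['lot-B']
```

In a real deployment, the same grouping could be applied across suppliers, regions or grid topologies rather than lots, as the article suggests.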
To date, no test has existed that covers the complete meter, operates in situ and self-checks accuracy. Consequently, no mechanism exists that can identify and report a change in the accuracy of the equipment. This gap highlights the need for a new monitoring technique that can continuously check in situ meter accuracy, providing a built-in self-test facility for verifying meter performance throughout the meter’s life. Such a technique must monitor meter accuracy while the meter is running, without impacting the metrology function.
To meet this challenge, Analog Devices Inc. developed mSure technology (shown in the green blocks of Figure 2), which is designed to continually monitor the response of the complete meter by injecting a known reference signal into the sensor. By superposition, the sensor senses both the reference signal and the load signal at the same time. Because this combined signal is acquired through the same path, a digital representation of it is available at the end of the electronics chain.
The detection circuit then extracts the unique reference-signal component from the combined signal; once it has done so, the system knows the transfer function of the complete meter, from the sensor to the digital representation.
The same transfer function applies to the load signal through to its digital representation, so utilities can detect changes in accuracy. To preserve the energy data, the monitoring signal is digitally removed from the signal path to the metrology.
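The inject-and-detect scheme described above can be sketched as a simple lock-in (correlation) measurement. The following Python sketch is purely illustrative and does not reflect the actual mSure implementation: it synthesizes a combined signal from a 50 Hz load plus a small known reference tone, then correlates against the reference frequency to recover the end-to-end channel gain, which by superposition applies equally to the load signal. All frequencies, amplitudes and the gain value are assumptions.

```python
import math

FS = 4000      # sample rate, Hz (assumed)
N = 4000       # one second of samples
F_REF = 487.0  # reference tone frequency, Hz (assumed, away from grid harmonics)
A_REF = 0.01   # known injected reference amplitude

# Synthetic "combined" signal: 50 Hz load plus the known reference tone,
# both scaled by the unknown end-to-end channel gain we want to estimate.
channel_gain = 0.97
samples = [
    channel_gain * (math.sin(2 * math.pi * 50 * n / FS)
                    + A_REF * math.sin(2 * math.pi * F_REF * n / FS))
    for n in range(N)
]

# Lock-in detection: correlate against in-phase and quadrature copies of the
# reference to extract the amplitude of the reference component alone.
i_sum = sum(s * math.sin(2 * math.pi * F_REF * n / FS) for n, s in enumerate(samples))
q_sum = sum(s * math.cos(2 * math.pi * F_REF * n / FS) for n, s in enumerate(samples))
measured_amp = 2 * math.hypot(i_sum, q_sum) / N

# The ratio of measured to injected amplitude is the channel's transfer gain,
# which applies equally to the load signal.
estimated_gain = measured_amp / A_REF
print(f"estimated gain: {estimated_gain:.4f}")  # ≈ 0.97
```

Because the reference tone is at a known frequency with an integer number of cycles in the window, the 50 Hz load component correlates to zero and the estimate recovers the channel gain cleanly; a real system must also contend with noise, interference and harmonics.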
Because it monitors both the sensor and the electronics, this technique can also detect a number of tampering approaches that present-day “tamper-proof” meters fail to identify. Even if utilities prevent only a small percentage of all tamper events, the resulting revenue savings can deliver a significant benefit to the bottom line.
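One way such periodic gain estimates could feed a tamper check, sketched here with invented numbers: normal ageing produces slow drift, whereas a bypassed shunt or shorted sensor shows up as an abrupt step in the measured channel gain. The daily gain values and the 5% step threshold below are assumptions for illustration only.

```python
# Hypothetical daily channel-gain estimates from in-situ monitoring.
# Day 5 shows an abrupt drop, e.g. a partially bypassed current sensor.
gains = [1.000, 0.999, 1.001, 0.998, 1.000, 0.620, 0.610, 0.630]

STEP_THRESHOLD = 0.05  # assumed: flag day-over-day changes above 5%

def first_step_change(gains, threshold):
    """Return the index of the first abrupt day-over-day gain change, or None."""
    for i in range(1, len(gains)):
        if abs(gains[i] - gains[i - 1]) > threshold:
            return i
    return None

print(first_step_change(gains, STEP_THRESHOLD))  # 5
```

A slow-drift check (e.g. a trend fit over weeks) would complement this step detector, distinguishing ageing from tampering.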
Today’s electricity meters are certified, calibrated and tested in the factory before being supplied and installed, ensuring that they meet the accuracy and performance criteria laid down in the standards prevailing in different geographies. After that, however, it is mainly a matter of faith, resting on component quality and statistical sample testing, that the meters will all remain accurate.
By adopting a rigorous approach to meter monitoring, utilities can leverage the native connectivity of smart meters to allow non-invasive in-situ accuracy testing in the field, better apply big data analytics to understand the accuracy of the entire meter deployment and reduce incidents of tampering.
Author: Jed Hurwitz (BEng) is a technologist in the Energy Management Products group of Analog Devices Inc. (ADI). Hurwitz helped design the mSure monitoring technology for use with industry-standard sensors, including shunts, current transformers, potential dividers and Rogowski coils. Prior to ADI, he pioneered CMOS imagers at Vision Group and co-founded Gigle Semiconductor and Metroic (acquired by Analog Devices in 2014). He has 18 granted and approximately 50 pending patents.