Accuracy in Field Data: How Much is Enough?

By Tom Petrik, Osmose Utilities Services Inc.

When it comes to taking inventory of a utility’s field assets–poles, transformers and other T&D equipment–and assessing customer-to-transformer connectivity, conventional wisdom tells us that the more accurate the data is, the better. On the surface this is true. If you have the choice between facility data that is 95 percent accurate and data that is 98 percent accurate, it’s a no-brainer which one is “better.”

The issue is cost.

With advanced data-collection technology and experienced field-survey crews, it’s neither difficult nor expensive to achieve data accuracy of 90 percent to 95 percent. But every percentage point above that range comes with an exponential increase in cost. For example, moving from 95 percent accuracy to 98 percent accuracy–a mere three percentage points–will likely increase data-delivery costs dramatically.
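
To make that cost curve concrete, consider a toy model in which the effort needed for each additional point of accuracy grows in proportion to 1/(1 - accuracy), since each remaining error gets harder to find. The formula and its numbers are purely illustrative assumptions, not actual survey pricing:

```python
# Illustrative only: a toy cost model in which effort grows in
# proportion to 1/(1 - accuracy). The formula is an assumption for
# illustration, not vendor pricing.

def relative_cost(accuracy: float, baseline: float = 0.95) -> float:
    """Cost of reaching `accuracy`, relative to the cost at `baseline`."""
    return (1 - baseline) / (1 - accuracy)

for a in (0.95, 0.97, 0.98, 0.99):
    print(f"{a:.0%} accurate -> {relative_cost(a):.1f}x the 95% cost")
# 95% -> 1.0x, 97% -> 1.7x, 98% -> 2.5x, 99% -> 5.0x
```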

The question now is whether the additional cost of highly accurate data is worth it. To answer that question, one must consider the main factors that affect data quality and price.

Achieving Different Accuracy Levels

As noted, accuracy in the 90 percent to 95 percent range should be easily achievable given a trained work force, sophisticated data-collection software, and the established organization needed to support the field survey on the required scale. Data quality in this range will include a standard level of in-the-field resampling to verify accuracy and a standard level of automated validations before the data is formatted for delivery.

Nudging accuracy up to 97 percent requires a bigger effort. Beyond a skilled field-survey staff, more extensive training is necessary–for example, increased practice on sample areas before collecting live data. Ramped-up quality control and in-the-field rework to verify accuracy also become more critical. More extensive automated processing is required, as are more painstaking quality-assurance measures in the post-field production environment.

Moving up to 99-plus percent accuracy involves the same kinds of activities but more of them–more up-front planning and analysis, more training and retraining, more sampling, more rework, more process improvements, more automated validation processing, more manual verification and correction of individual problems. It requires more time, more people, more effort. To achieve this level of accuracy for all attributes collected in a full equipment inventory would require a multiple-pass process. Two field technicians would individually collect data for a specific area or circuit. The results of the two passes would be compared, and any mismatches would be returned to the field for resolution. With very little margin for error and with expanded demands on time and resources, 99 percent data accuracy will cost significantly more to deliver than 95 percent data accuracy.
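
A minimal sketch of that multiple-pass comparison might look like the following. The record layout and field names are hypothetical, and real collection software will be considerably more involved:

```python
# Hypothetical sketch: compare two independent survey passes over the
# same area, keyed by asset ID, and flag mismatches for field rework.

def find_mismatches(pass_a: dict, pass_b: dict) -> list:
    """pass_a, pass_b: {asset_id: {attribute: value}} from two technicians."""
    rework = []
    for asset_id in pass_a.keys() | pass_b.keys():
        rec_a, rec_b = pass_a.get(asset_id), pass_b.get(asset_id)
        if rec_a is None or rec_b is None:
            rework.append((asset_id, "captured in only one pass"))
        else:
            diffs = [k for k in rec_a.keys() | rec_b.keys()
                     if rec_a.get(k) != rec_b.get(k)]
            if diffs:
                rework.append((asset_id, f"attribute mismatch: {diffs}"))
    return rework
```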

None of these accuracy levels–95, 97 or 99 percent–is either good or bad in itself. The real issue is determining the accuracy requirements of the target system or operation, and then structuring the data-collection and data-production efforts to meet those requirements. This will produce data that lets the target system perform as required without unnecessary extra expense.

When Greater Accuracy is Needed–and When it Isn’t

Some utility systems may indeed call for the highest data accuracy possible. Engineering analysis and outage management systems (OMS) are good examples. For an OMS to correctly identify an outage source, the data driving the system must reflect a highly accurate model of what is installed in the field and how customers are connected to the network.

Industry experts who have implemented outage management systems say that 95 percent of customers must be correctly connected to the network for the OMS to function properly. But when asked, “Does that mean that 95 percent of customers are connected to the correct transformer?” the answers have ranged from “yes” to “as long as they are within one transformer of the correct transformer on a circuit” to “as long as they are on the right circuit.” Ensuring strict accuracy of the customer-to-transformer link has a significant impact on the cost of data collection and verification. Yet strict accuracy of this data is not always mandatory, and less accurate data does not necessarily hurt responsiveness or reliability.
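
Those looser criteria can be expressed directly. The sketch below assumes each circuit’s transformers can be listed in feeder order, which is a simplification of a real connectivity model:

```python
# Hypothetical check of a customer's recorded transformer against the
# looser OMS criteria quoted above. `circuit_xfmrs` is assumed to be
# a list of transformer IDs in feeder order for one circuit.

def connection_ok(recorded, actual, circuit_xfmrs, strict=False):
    if strict:
        return recorded == actual            # exact transformer match
    if recorded not in circuit_xfmrs or actual not in circuit_xfmrs:
        return False                         # not even the right circuit
    # "within one transformer of the correct transformer on a circuit"
    return abs(circuit_xfmrs.index(recorded) -
               circuit_xfmrs.index(actual)) <= 1
```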

On the other hand, when these same industry experts were asked about the accuracy of the connected-facilities model, the answer was nearly unanimous. Knowing how the circuit is built from the substation out, they said, is critical. The network model should have no loops or unfed spans and should be traceable from the substation out to the transformer. Here, achieving an accuracy level of about 98 percent is critical.
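
Those two requirements–no loops or unfed spans, traceable from the substation out–translate naturally into a graph check. Below is a minimal sketch, assuming the model is available as a set of conductor spans between node IDs:

```python
# Minimal sketch: verify that a circuit model is radial (tree-shaped).
# Input format is assumed: spans given as (from_node, to_node) pairs.
from collections import deque

def validate_radial_model(spans, substation):
    """Return (loops_found, unfed_nodes) for one circuit."""
    adjacency, nodes = {}, set()
    for a, b in spans:
        adjacency.setdefault(a, []).append(b)
        adjacency.setdefault(b, []).append(a)
        nodes.update((a, b))
    seen, parent = {substation}, {substation: None}
    loops, queue = False, deque([substation])
    while queue:                              # breadth-first trace
        node = queue.popleft()
        for nxt in adjacency.get(node, []):
            if nxt == parent[node]:
                continue
            if nxt in seen:
                loops = True                  # second path found: a loop
            else:
                seen.add(nxt)
                parent[nxt] = node
                queue.append(nxt)
    return loops, nodes - seen                # unreachable nodes are unfed
```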

Although this kind of network connection data should be highly accurate, minor inaccuracies in pole, conductor or device attributes often have little or no impact on the operation of the OMS.

While it makes practical sense to combine a facility inventory project with a network connectivity survey, a common mistake is to set the same target accuracy for both efforts–which forces everything to the higher requirement of the network-connectivity collection. Quality levels should be established separately: typically higher for network connectivity (about 98 percent) and lower for equipment attribute data (about 95 percent).

Some field audits don’t demand high accuracy to begin with. For example, first-pass accuracy of 95 percent is generally acceptable in joint-use surveys, at least as far as correct assignment of attaching companies is concerned. The reason is that incorrect assignments tend to balance each other out, with attachers A, B and C receiving a roughly equal share of the errors. In such a case, 95 percent accuracy does the job: it provides a reasonably accurate, verifiable record of attaching companies without penalizing any single attacher.
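
A quick simulation illustrates why random errors wash out. The attachment counts and error rate here are invented purely for illustration:

```python
# Illustrative simulation: random misassignments at a 5% error rate
# leave each attacher's count close to its true value. All numbers
# are hypothetical.
import random

attachers = ["A", "B", "C"]
true_counts = {a: 10_000 for a in attachers}
observed = {a: 0 for a in attachers}

for company, n in true_counts.items():
    for _ in range(n):
        if random.random() < 0.95:
            observed[company] += 1                    # correct assignment
        else:
            observed[random.choice(attachers)] += 1   # random misassignment

print(observed)   # each attacher lands near its true 10,000
```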

So accuracy requirements are anything but static, and setting too high a quality standard can raise the cost of a field survey unnecessarily. As one utility executive commented recently, unless there’s a compelling reason to do so, “why fill the tank with premium when regular does the job?”

Planning for Data Quality

How should you determine quality standards for your upcoming field survey? Here are five strategic steps:

Determine the accuracy needed by the target systems. As noted previously, the accuracy of customer connections is important for an OMS–but even here, absolute precision may not be required as long as the circuit is correct and the “within one transformer” rule is met. Furthermore, in congested urban settings, customer-to-transformer connections will usually be more critical to establish than in rural, wide-open environments where such connections are easily observable. Effective operation of an engineering analysis system relies on accurate, complete information about equipment attributes, transformer sizes, customer-to-network connectivity, conductor sizes, and other data that is critical to working effectively with phasing and load. But in many joint-use initiatives, first-pass accuracy will suffice. Just be sure the accuracy level of the data meets the requirements of the destination system.

Identify existing information sources that can help promote accuracy. Most utilities have established data resources that can help ensure accuracy during a field survey. These data sources may include equipment databases, GIS source information, third-party landbases, meter-reading databases and other data stores that are known to be reasonably accurate. Having an accurate third-party landbase, for example, can help field technicians compare addresses in current data against address ranges shown on the landbase–and can help orient technicians in the field. Similarly, having access to an equipment database that distinguishes between overhead and underground facilities can help isolate overhead facilities that were missed during the field survey–rather than lumping them together with all of the underground features that would not appear in the field-captured data.
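
As a hedged illustration of that last cross-check, the sketch below assumes an equipment database that records an overhead/underground flag per facility ID; the record layout and the “OH”/“UG” codes are assumptions:

```python
# Hypothetical reconciliation: facilities flagged overhead ("OH") in
# the equipment database should appear in the field-captured data;
# any that don't are candidates for a missed-facility review.

def missed_overhead(equipment_db, field_captured_ids):
    """equipment_db: iterable of (facility_id, "OH" or "UG") records."""
    captured = set(field_captured_ids)
    return [fid for fid, oh_ug in equipment_db
            if oh_ug == "OH" and fid not in captured]
```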

Set separate quality levels when appropriate. You may decide that you absolutely need network-connectivity accuracy of 98 percent to respond effectively to outage and crisis situations and to avoid regulatory penalties pegged to outage duration. But the equipment data being gathered in the same pass as the circuit connectivity information may be adequate if it’s 95 percent accurate. In that case, the equipment data won’t have to be resampled and inspected at the same level as the connectivity information. The different accuracy levels of the two data types are consistent with the end-use accuracy requirements and are more economical to produce than across-the-board 98 percent accuracy.

Apply ANSI or other sampling standards. It’s important to define delivery sizes and sampling methods, and to specify how random samples will be created (by attribute, by pole, by customer, etc.). The utility and the field-survey vendor need to use the same quality-assurance process. The sampling methods must be effectively integrated with the inspection level used to monitor accuracy and completeness and to achieve the established quality standards. Clearly identifying the sampling standards and their connection to required accuracy levels puts everyone on the same page–in the contract, in the field, and in the project’s data-acceptance phase.
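
For illustration, a bare-bones acceptance check in the spirit of an ANSI/ASQ Z1.4-style plan might look like this. The sample size and acceptance number below are placeholders–a real plan derives them from lot size and the agreed AQL tables:

```python
# Simplified acceptance-sampling sketch. Real ANSI/ASQ Z1.4 plans look
# up sample size and acceptance number from lot size and AQL tables;
# the values here are placeholders.
import random

def accept_delivery(records, is_correct, sample_size=200, max_defects=5):
    """Randomly sample a delivery (e.g., by pole), inspect each record
    against source documents, and accept if defects stay within limit."""
    sample = random.sample(records, min(sample_size, len(records)))
    defects = sum(1 for rec in sample if not is_correct(rec))
    return defects <= max_defects
```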

Establish an effective data-maintenance strategy. Field data does nothing but change. The instant a field survey is completed, a transformer may get upgraded or a pole may get changed out with one of a higher class. No matter how exacting the survey’s quality standards, inaccuracies can creep in. To combat this, you need to implement a data-maintenance plan that lets you keep your data at its target accuracy on an ongoing basis. This may involve regular in-field data verification by an outside vendor or by your own internal staff. Whoever provides the trained personnel, it’s important to use a technology-based means of auditing data, logging corrections, and posting them to the data repository in question. Ideally, the technology used by the field-survey contractor for the initial data collection can be easily transitioned to the data-maintenance function, which typically saves on costs.
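
A minimal sketch of the correction-logging piece, with hypothetical field names, might look like the following:

```python
# Hypothetical correction log: each in-field fix is recorded so it can
# be posted to the repository and counted toward accuracy tracking.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Correction:
    asset_id: str
    attribute: str            # e.g., "pole_class", "transformer_kva"
    old_value: str
    new_value: str
    verified_by: str
    logged_at: datetime = field(default_factory=datetime.now)

audit_log: list[Correction] = []

def log_correction(asset_id, attribute, old, new, technician):
    audit_log.append(Correction(asset_id, attribute, old, new, technician))
```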

All these steps are geared toward producing high-quality data, doing so at an economical price, and ensuring that the field-survey vendor and the utility are working with the same set of accuracy standards. This helps ensure not only data that meets the utility’s specific requirements but a constructive vendor-client relationship that both parties value.

Tom Petrik is vice president of technology at Osmose Utilities Services Inc. He is responsible for directing the development of utility-related software, including FastGate Mobile software, which is used in the collection and verification of field data for North American utilities. He can be reached at tpetrik@osmose.com.
