Five Deadly Data Mistakes and How to Avoid Them

Tom Petrik, Osmose Utilities Services Inc.

You’re planning a major system implementation: a new outage management system, geographic information system (GIS), graphic design tool, or maybe all three. It’s a whirlwind of technology, personnel, deadlines, test runs and nitpicking details. In such a volatile setting, it’s not easy to keep your priorities straight. One thing you don’t want to lose sight of, however, is the importance of the data that will drive the new system. All the technical wizardry you can muster will be meaningless if the data isn’t up to par.

From a performance and reliability standpoint, the new technology is important, but the data driving it is even more important. Here are five pitfalls you’ll want to avoid.

Mistake #1. Forgetting about the data.

To most IT people, getting different systems to work together smoothly is dynamic and exhilarating work. Data migration, on the other hand, is a gigantic yawn. But, in many ways, getting the data properly migrated, integrated and validated is just as important as the high-profile system work. If the data is bad (inaccurate, incomplete, inconsistent or otherwise incorrect), then the system implementation will fall short of full success. Installing the new system without ensuring that the data is good is like buying a high-performance race car and filling it with low-octane fuel. The counterbalance to this is:

Mistake #2. Over-emphasizing data quality.

While accurate data is clearly a desirable thing, your data can be too accurate if the cost of that accuracy exceeds the target system’s requirements. For example, accuracy levels of 90 percent to 95 percent are an average, expected level for a field survey. Achieving 96 percent, 97 percent or 98 percent accuracy raises the bar, not just for precision, but also for cost. The real question is how accurate the data really needs to be. Network connectivity data must be highly accurate (on the order of 98 percent) if it will feed an outage and trouble-call system. That high accuracy level enables a high level of responsiveness and reliability, and it justifies the expense required to achieve it. But 98 percent accuracy is not generally required for an overhead equipment or joint-use audit. Trying to get 98 percent accuracy on projects like these can double the data acquisition cost without providing any significant gain in performance.

Data accuracy is not a static, fixed goal. It’s a moving target that changes according to system requirements, intended uses, budget and other factors. Aim for the level of data accuracy and completeness you need to run the target system reliably. This might dictate a comprehensive, painstaking field audit, or it might only require a targeted survey that verifies accuracy and corrects obvious problems.

Mistake #3. Assuming your data is clean.

The quality of your data is like the quality of your health: It’s easy to take for granted if you’re not seeing any obvious problems. But problems can nonetheless exist. Before you can say that your data is accurate, and before you can ensure the system it drives will perform as desired, you have to ask some hard questions:

  • When was the last time you conducted a field survey to verify the data is accurate and complete?
  • Do you have a reliable, day-to-day system for updating and correcting inaccuracies when they’re discovered?
  • How rigorous is your process for maintaining field information once it’s housed in a data repository?
  • Aside from the fact that no crisis has yet spotlighted data issues, how exactly do you know the data is equal to the task?

Even if your data was clean three or four years ago, errors and inaccuracies inevitably creep in. Every day something, maybe something small, but something, goes unrecorded. A Class 3 pole gets changed out for a Class 1. A transformer gets upgraded. A customer gets switched to a different transformer “temporarily” during a storm repair operation but never gets switched back. The only way to guarantee high-quality data is to validate existing information in the field.

Mistake #4. Thinking about data short-term.

The immediate goal is to get the new GIS or trouble-call system up and running, and the natural focus is the data you need to accomplish that. But it’s important to consider alternative uses of that data. Is it possible, for example, that the data now being verified and collected will be accessed by some other system, or used for some additional purpose? If you’re conducting a field survey whose main purpose is to establish traceable electrical connectivity from substation to customer, getting the phasing right is critical. But keep in mind that during a field audit of phasing, it doesn’t take much more effort or expense to also collect customer-to-transformer connections. Beyond that, it’s not much more effort or expense to perform a reliability and safety inspection at the same time. The point is to think beyond the current project to consider long-term needs.

Mistake #5. Upgrading your data, then stopping.

It’s not enough to get your data accurate and complete. You also need a means of keeping it that way. Field data is like a garden. It needs ongoing care and attention. From the design phase to the “as-built” in the field, getting new data and updating old data should follow a well-defined and well-monitored process. But even when those processes are monitored closely, things happen to the network every day that are out of your control. Mother Nature, attaching companies, reckless drivers, frustrated hunters, teenagers with too much time on their hands—all have ways of changing your facilities without your knowledge. The only way to capture this type of information is by observing it and recording it first-hand in the field. Gathering this type of information in a cyclic field survey is becoming a more common practice. Some states have even started to mandate these types of inspections. (California’s General Order 165 is a prime example.)

Another method, made popular by recent advances in mobile technology, is to leverage existing field activities by adding data collection to the effort. An example is expanding your pole inspection and treatment program to also include collection of data on equipment, joint-use attachments, NESC violations and so on. Getting crews out there is the most expensive part of the equation. Having them collect an additional type of field data can give you valuable information at a very modest increase in cost.

The main point is that your data (like your garden) needs regular nurturing and care to be at its best. Upgrade your data, by all means. Just don’t forget about it.

POWERGRID International, Volume 9, Issue 6