by Peter Batty, SSP Innovations
Maintaining an accurate and up-to-date model of a utility’s network assets is an essential foundation for many important aspects of the business, including network reliability, safety and customer service. The system of record used to maintain this data is typically a GIS (Geographic Information System), and the data is usually synced from the GIS to various other major enterprise systems, including Outage Management Systems (OMS) and Advanced Distribution Management Systems (ADMS). Today most utilities face significant challenges with their GIS data quality. Maintaining the data is a manually intensive, expensive, and complex process. The growth in ADMS capabilities and usage in recent years has been a strong driver for improving GIS data quality, as many advanced applications will not give useful results without extremely accurate network data.
Aspects of Data Quality
There are various aspects of data quality to consider when looking at making improvements. The spatial accuracy of the data is one aspect. Some examples of where this is important include using GPS to find equipment in the field; sharing data with other organizations, for example in locating underground assets for excavation purposes; and overlaying data with either aerial or street level imagery, which is increasingly useful for many purposes. The accuracy of the network connectivity is another aspect – are customers fed from the correct transformer, is the phasing correct, and more. The network connectivity is more important than location for applications like OMS and ADMS. It is more challenging to improve both of these aspects of data quality for underground assets than for overhead assets, since they cannot be directly seen. Technology for scanning underground assets is rapidly improving, but still has a way to go.
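To make the connectivity aspect concrete, here is a minimal sketch of one common data-quality check: verifying that every customer record can trace a path back to a source through the modeled network. The network below, and names like `substation_1` and `meter_999`, are invented for illustration; real utility models are far richer, but the underlying idea of a reachability check over the connectivity graph is the same.

```python
from collections import deque

# Hypothetical, simplified network: nodes are asset IDs, directed edges run
# from each asset to the assets it feeds. A customer meter with no upstream
# path to the substation is a connectivity error in the data.
network = {
    "substation_1": ["fuse_7"],
    "fuse_7": ["transformer_12"],
    "transformer_12": ["meter_450", "meter_451"],
    "meter_450": [],
    "meter_451": [],
    "meter_999": [],  # orphaned record: nothing in the model feeds it
}

def reachable_from(source, adjacency):
    """Breadth-first search: every node fed (directly or indirectly) by source."""
    seen = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

fed = reachable_from("substation_1", network)
orphans = [node for node in network if node not in fed]
print(orphans)  # → ['meter_999']
```

A check like this finds meters that are disconnected in the data even when they are energized in the field, which is exactly the kind of error that degrades OMS and ADMS results.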
Data currency is an aspect of data quality that cuts across all the aspects just mentioned. Currently the as-built update backlog — the time between installing equipment in the field and having a record of it appearing in the company’s GIS — is measured in months at most utilities. In many ways this is the most fundamental problem to fix. Data inaccuracies that are due to historical processes or systems can be improved with a one-time effort, but the network is constantly changing, and fixing the as-built update problem requires significant changes to ongoing business processes.
The root cause of the as-built update backlog is that changes to the network are generally recorded by different people from those who actually made the change to the network. This is because it is too complex and time-consuming, in general, for field workers themselves to record all the data that is needed about the changes they just made to the network. This is the case even with modern, simple-to-use mobile applications: they typically require some manual sketching of changes to the network, which makes the process too complicated and time-consuming for a typical field worker in many situations.
Automated Field Sourcing
In order to address the as-built update problems of utilities, we need to find a way to “crowd source” the data updates from the field workers who make the changes to the network, where and when the work is done – we call this field sourcing. For this to be a viable approach, the data update process needs to be both fast and very simple to use. It needs to use equipment that is inexpensive and readily available – ideally using a smart phone, which almost everyone already carries with them.
Various new technologies are now available, and continuing to advance quickly, that enable relevant data to be captured in a highly automated way – and ultimately in a fully automated way. The remainder of this article discusses some of these technologies and how they can be applied.
Self-driving Cars, Computer Vision and Machine Learning
One of the interesting things about the technologies in question is that they are being developed and applied in a wide range of industries, outside the traditional geospatial / GIS industry. One major industry of interest is autonomous driving.
Tesla in particular is doing a lot of relevant work. If you look at tesla.com/autopilot you can see a range of interesting information about the company’s ability to automate various aspects of driving, aiming ultimately towards fully autonomous driving. In the first video on that page, you can see how Tesla’s machine learning based computer vision software is identifying and spatially locating various objects of interest from the car’s camera streams, including stop signs, traffic lights, street edges, other cars, and more. This has strong similarities with capturing asset data for utilities.
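The utility analogue of that video is running an object detector over a field worker’s camera stream and keeping only the detections that matter. The sketch below assumes the detector itself (a neural network running on the device) already exists and shows only the post-processing step; the class names, scores, and threshold are illustrative, not taken from any real product.

```python
# Each detection from the (assumed) on-device detector carries a class label,
# a confidence score, and a bounding box in pixel coordinates.
UTILITY_CLASSES = {"pole", "transformer", "crossarm", "insulator"}

def keep_utility_assets(detections, min_score=0.6):
    """Discard low-confidence hits and object classes we don't care about."""
    return [
        d for d in detections
        if d["label"] in UTILITY_CLASSES and d["score"] >= min_score
    ]

frame = [
    {"label": "pole", "score": 0.91, "box": (120, 40, 180, 460)},
    {"label": "car", "score": 0.88, "box": (300, 380, 520, 470)},
    {"label": "transformer", "score": 0.42, "box": (150, 90, 200, 140)},
]
print(keep_utility_assets(frame))
# keeps only the pole: the car is out of scope, and the transformer's
# confidence score falls below the threshold
```

Just as Tesla’s system distinguishes stop signs from other roadside objects, a utility-focused model would distinguish poles and transformers from everything else in the scene.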
There are some other things we can learn from Tesla’s approach to this problem. One is that they are not immediately trying to solve the most complex problem of fully automating the driving process. Their first iteration was to provide “Autopilot” capability, which keeps the car in its lane on a highway and maintains a safe distance from the car in front. Their next iteration was “Autopilot with Navigation,” which added the ability to change lanes and pass slower cars on a highway, and to navigate exits onto other highways. Whenever the cars are driving, Tesla records extensive information that is fed back into their machine learning models, to improve existing capabilities and add new ones in the future. This includes not just the information from cameras and other external sensors about the environment, but also the way the driver interacted with the pedals and steering wheel in each situation.
There are parallels here with how we can automate the process of utility data capture: don’t try to fully automate everything up front; instead, automate some elements of the process at first and have it guided by users. Over time, as you capture more information to feed into machine learning models and learn more about the process, you can automate additional elements and ultimately make the process fully automatic.
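One simple way to structure this user-guided middle stage is to triage machine output by confidence: accept the surest detections automatically, show the uncertain ones to the field worker to confirm, and discard the rest. The thresholds and record structure below are assumptions made for the sketch, not values from any real system.

```python
# Illustrative confidence-based triage. Items the field worker confirms or
# rejects become labeled training data, which is the feedback loop that lets
# the automated share grow over time.
AUTO_ACCEPT = 0.95   # at or above: write straight to the database
DISCARD = 0.30       # below: drop entirely

def triage(detections):
    accepted, review = [], []
    for d in detections:
        if d["score"] >= AUTO_ACCEPT:
            accepted.append(d)      # no user interaction needed
        elif d["score"] >= DISCARD:
            review.append(d)        # shown to the field worker to confirm
        # anything below DISCARD is silently dropped
    return accepted, review

frame = [
    {"label": "pole", "score": 0.97},
    {"label": "transformer", "score": 0.55},
    {"label": "pole", "score": 0.12},
]
accepted, review = triage(frame)
print(len(accepted), len(review))  # → 1 1
```

As the model improves on the confirmed examples, the `AUTO_ACCEPT` band widens and the review queue shrinks – mirroring Tesla’s progression from driver-supervised Autopilot toward fuller automation.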
While Tesla cars include some specialized hardware like radar and sonar sensors, the computer vision capabilities that recognize objects in the video stream are also available in modern smart phones.
Another technology that is relevant to capturing data about the real world using smart phones is augmented reality. The gaming industry is one of several that are driving development in this area. Niantic, the company behind Pokémon Go (and run, incidentally, by several of the key people who originally built Google Earth), is doing a lot of impressive work in this area. Using a combination of phone cameras and built-in movement sensors, augmented reality software can build an accurate 3D model of physical space in the area around the device. The latest high-end iPhones and iPads even have built-in LiDAR sensors that enhance these capabilities further. The iPhone Measure app is a simple example of how this technology can be used to accurately measure distances. Augmented reality combined with computer vision technologies can build a very accurate model of physical objects in the real world.
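The measurement step itself is simple once the AR framework has done the hard part of locating points in real-world coordinates. This toy sketch assumes the framework has already returned two 3D points in metres (the coordinates here are made up) and just computes the straight-line distance between them, as a measuring app does.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two 3D points (in metres)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Hypothetical AR-anchored points: the base and top of a utility pole.
pole_base = (0.0, 0.0, 0.0)
pole_top = (0.3, 11.9, 0.4)

print(round(distance(pole_base, pole_top), 2))  # → 11.91
```

The same calculation, applied to many points at once, is how AR software assembles the 3D model of the surrounding space described above.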
How Will This Impact Utility Data Capture?
These technologies will transform the way that utilities capture and maintain data about their network assets over the next few years. Today’s technology already has the capabilities to provide fast and easy-to-use semi-automated data capture that is a substantial improvement over previous approaches. This will evolve over the coming years to the point where data about physical changes made to the network will be automatically pushed to enterprise databases in real time as the changes are made, without requiring any interaction from field workers – using wearable devices like helmet-mounted cameras, phones, or smart glasses. This will be a key foundational element for the idea of a digital twin of the physical network – an idea that is widely talked about these days – that will enable utilities to significantly improve network reliability, customer service, safety, and more. In my opinion, based on working on innovations in the utility GIS space for 35 years, this will be the most transformational change we have ever seen in this industry – these are very exciting times!
About the Author
Peter Batty is Chief Research Officer at SSP Innovations. His focus is on researching new technologies and building out innovative new product offerings, as well as providing guidance to the Executive Leadership Team on the overall strategic direction of the company. He has been an innovator in the GIS industry for utilities and telecommunications companies for 35 years. He has served as Chief Technology Officer at GE Smallworld, Intergraph (now Hexagon), and IQGeo. He has created multiple successful products including the IQGeo myWorld platform and the GE PowerOn electric outage management application.