An Innovative Approach to GIS Database Maintenance

By Mickey A. Fain, Serion International

Data maintenance is not new. Spatial data had to be maintained when it was stored on Mylar in flat files. Data management and products have changed dramatically as manual spatial data have evolved into digital data. It is much faster and easier to update a screen image than it is to redraw an entire paper map. Ironically, this very fact may be part of the problem. It is so easy that it is tempting to assign maintenance to a technician with little or no training and to apply few guidelines or standards. In fact, GIS maintenance seems so easy that it is often overlooked during the design of a GIS and, more importantly, during the allocation of resources.

There are many reasons to pay very close attention to spatial database maintenance. As one of the greatest expenses over the life of a GIS (you keep paying and paying for it …), maintenance represents one of the most logical areas in which to cut costs, especially because there are ways this can be done without sacrificing quality. The maintenance process offers excellent opportunities to improve the accuracy and quality of the data in an incremental and cost-effective manner. And if maintenance is neglected, you can lose your entire investment in the data.

A GIS and the many services it provides simply accelerate and intensify the need to maintain the database in a systematic and efficient manner. Some new services to users depend on data being current to the latest transaction or within 24 hours. Merely mimicking manual data maintenance methods is inefficient and expensive.

A good analogy for this common practice is the initial use of computers to do the same tasks in the same ways that adding machines and typewriters had always done them. It soon became apparent to some Bill Gates types that the tool at hand was far more powerful than the old manual devices once used to perform these functions. New processes–even new paradigms–were in order.

A paradigm shift in the design of the maintenance processes used to keep spatial databases current is also needed. Maintenance should not just keep the data current, but also provide an opportunity to continuously improve the quality and the accuracy of the database. And it should do so at a cost lower than the organization currently incurs.

The Conversion-Maintenance Connection

The GIS processes which most closely parallel the maintenance process–and it is a process, not just an onerous task–are conversion and integration. Most data do not initially exist in electronic form; they start out as a work order, survey report, diagram, legal document or other piece of paper. If the data are not digital, they must be put into electronic form. Digital data then must be integrated into the GIS. Maintenance is essentially conversion and integration on a small scale, happening incrementally throughout the life of the GIS.

When a GIS manager contracts for conversion or integration work, specifications are written, accuracy standards are established and so on. This occurs because it is recognized that the quality and the accuracy of the data are two of the most important factors in determining the overall quality of a GIS. Conversion and integration are thus seen as critical steps in the GIS creation process. Moreover, the creation of a high-quality spatial database can constitute the single most expensive aspect of the GIS, often costing more than hardware and software combined.

This is equally true of the maintenance process. Streamlining this process can save money and, if done properly, can simultaneously lead to quality that improves rather than deteriorates. Process engineering techniques provide the keys to achieving both cost savings and quality improvements.

Attaining Zero Defect

Total quality management (TQM) is an innovative approach to ensuring quality. It was developed during the 1950s by W. Edwards Deming, who is considered by many to be the “father of quality control (QC).” At the heart of this paradigm shift in quality control is fixing the process when an error occurs, rather than merely correcting the error.

It is, after all, cheaper to do something once than it is to do it twice. Processes designed so that errors cannot occur are known as zero-defect processes. Their use allows statistical process control to replace the expensive and time-consuming 100 percent QC step that has traditionally formed the last lap in data processing. This approach can ensure that the quality and accuracy standards established for the conversion process are adhered to during the maintenance process. This is done at a cost significantly lower than traditional methods.
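
As a rough illustration of how statistical process control can replace 100 percent inspection, the Python sketch below computes three-sigma control limits for a p-chart of error rates found in periodic samples of maintenance transactions. The sample counts, sample size and limits are hypothetical, not drawn from any particular GIS operation.

```python
import math

def p_chart_limits(error_counts, sample_size):
    """Three-sigma control limits for the fraction of defective records.

    error_counts: errors found in each inspected sample of maintenance
    transactions; sample_size: transactions inspected per sample.
    """
    p_bar = sum(error_counts) / (len(error_counts) * sample_size)  # mean error rate
    sigma = math.sqrt(p_bar * (1 - p_bar) / sample_size)
    ucl = min(1.0, p_bar + 3 * sigma)  # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)  # lower limit, floored at zero
    return lcl, p_bar, ucl

# Hypothetical data: errors found in ten samples of 200 transactions each.
errors = [3, 1, 2, 4, 2, 3, 1, 2, 14, 2]
lcl, center, ucl = p_chart_limits(errors, 200)
for i, e in enumerate(errors):
    rate = e / 200
    status = "investigate the process" if rate > ucl else "in control"
    print(f"sample {i + 1}: error rate {rate:.3f} ({status})")
```

A sample that falls above the upper control limit signals that the process itself needs attention, which is exactly the TQM response: fix the process, not just the error.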

Making Maintenance Work

There is an entire palette of ways to improve the maintenance process once the underlying structure has been addressed by applying TQM principles and techniques. Three of the most cost-effective involve establishing a complete set of clearly spelled-out specifications for all data in the system, using process engineering to redesign the entire data entry process, and building QA Metadata files.

Clearly Defined Specifications

Specifications always start with the end user who may be external or internal to the organization. Because quality and accuracy are expensive, it is important to establish accuracy specifications that meet the needs of the most demanding user–not more, not less. Conversion specifications are a good place to start when developing maintenance specifications. TQM and zero-defect processes will ensure that these specifications are met in a consistent, timely and cost-effective manner. These clearly defined specifications also allow automated checking for data integrity–an important part of statistical process control.
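
As an illustration of such automated checking, the minimal sketch below assumes each feature carries a feature-class code and a recorded positional accuracy; the class names, tolerances and records are hypothetical placeholders for an organization's actual specifications.

```python
# Hypothetical accuracy specifications per feature class, in feet.
SPECS = {"water_main": 1.0, "parcel": 2.5, "street_centerline": 5.0}

def check_integrity(features):
    """Flag features whose recorded positional accuracy exceeds the spec.

    Each feature is a dict carrying its id, feature class, and the
    positional accuracy (feet) recorded when it was entered or updated.
    """
    violations = []
    for f in features:
        tolerance = SPECS.get(f["class"])
        if tolerance is None:
            violations.append((f["id"], "no specification for class " + f["class"]))
        elif f["accuracy_ft"] > tolerance:
            violations.append(
                (f["id"], f"accuracy {f['accuracy_ft']} ft exceeds spec of {tolerance} ft"))
    return violations

# Made-up features standing in for a maintenance batch.
batch = [
    {"id": "WM-101", "class": "water_main", "accuracy_ft": 0.8},
    {"id": "WM-102", "class": "water_main", "accuracy_ft": 3.2},
]
for fid, problem in check_integrity(batch):
    print(fid, "->", problem)
```

Run against every maintenance batch, a check like this turns the written specification into an enforceable gate rather than a shelf document.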

The Maintenance Plan

Draw a flow chart that maps the route each type of data currently follows, from inception to GIS, then answer three questions:

1. At what points can data be lost or altered? Design processes that foresee all the ways in which this can happen and then embed preventive measures into the process itself. For instance, rather than having an engineering technician hand a GIS technician a piece of paper with data that needs to be entered, could the engineering technician submit the data in such a way that it does not have to be re-entered? At this stage, data is like food; the less it is processed, the better.

2. How can new (or old) technologies streamline this process? Consider the Internet, handheld computer units, or even Post-it Notes to improve the process and avoid opportunities for error. Also search carefully for ways in which technology may be producing, rather than reducing, error.

3. What level of service is needed? This is the basis for the maintenance plan. It is a written document that addresses each type of data in the system in terms of the following (a sketch of one way to record these parameters appears after the list):

how frequently the data must be updated;

how current the data must be;

how easy the access must be;

how accurate the data must be;

what risks are run if the data is inaccurate or out of date; and

how quickly users must have access to each type of data.

Once these questions have been answered for every type of user and every kind of data, a cost analysis can be performed to establish the least expensive way to meet these parameters. In-house, partial outsourcing and full outsourcing are all options.
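
One way to make the maintenance plan auditable is to record its level-of-service parameters in machine-readable form, one entry per data type. The sketch below is a minimal, hypothetical structure; the field names and example values are illustrative only, not a standard.

```python
from dataclasses import dataclass

@dataclass
class MaintenancePlanEntry:
    """Level-of-service parameters for one type of data in the GIS."""
    data_type: str
    update_frequency_days: int  # how often the data must be updated
    max_currency_days: int      # how current the data must be
    access: str                 # how easy access must be, e.g. "desktop" or "field"
    accuracy_ft: float          # how accurate the data must be
    risk_if_stale: str          # what is risked if inaccurate or out of date
    max_retrieval_hours: float  # how quickly users must get the data

# Illustrative entries only; real values come from the user needs analysis.
plan = [
    MaintenancePlanEntry("water_main", 1, 1, "field", 1.0,
                         "crew excavates at the wrong location", 0.5),
    MaintenancePlanEntry("zoning", 30, 60, "desktop", 10.0,
                         "permit issued in error", 24.0),
]
```

With the plan in this form, the cost analysis can iterate over the same entries, and automated checks can compare actual update latency against the stated targets.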

QA Metadata Files

QA Metadata files are a method of identifying the spatial accuracy of each segment or node, as well as its source. Having a QA Metadata file allows a GIS manager to quickly and easily prioritize quality improvement efforts, starting with the data taken from the least accurate engineering drawings. The development of these standards and the identification of sources are jobs for the engineering department. Think carefully about the most efficient way to attach this information to the corresponding spatial data. QA Metadata files can be constructed wholesale during the conversion process; they then serve as a guide for integrating data accuracy improvement work into the maintenance process. But the construction of the metadata file itself can also be part of the maintenance process, done on a sector-by-sector, incremental basis. Well-defined, consistently applied metadata standards also permit automated verification of data quality and automated feedback: if the metadata parameters are exceeded, there is a quality problem.
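
A minimal sketch of how such automated verification might look, assuming each segment or node carries a QA metadata record with its source and recorded accuracy; the records, field names and specification value here are hypothetical.

```python
# Hypothetical QA metadata: one record per segment or node.
qa_metadata = [
    {"feature_id": "SEG-001", "source": "field survey", "accuracy_ft": 0.5},
    {"feature_id": "SEG-002", "source": "as-built drawing", "accuracy_ft": 8.0},
    {"feature_id": "SEG-003", "source": "digitized 1:2400 map", "accuracy_ft": 15.0},
]

SPEC_FT = 5.0  # illustrative accuracy specification

# Automated verification: any record exceeding the spec signals a quality problem.
problems = [m for m in qa_metadata if m["accuracy_ft"] > SPEC_FT]

# Worst-first ordering gives the prioritized improvement list described above.
for m in sorted(problems, key=lambda m: m["accuracy_ft"], reverse=True):
    print(f"{m['feature_id']}: {m['accuracy_ft']} ft from {m['source']} "
          f"exceeds the {SPEC_FT} ft spec")
```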

GIS, being an infrastructure tool used to offer many different types of services, can often bridge numerous departments or sections of an organization. For instance, a water utility may superimpose its data over the city's planimetric base map. Of course, this means that when the assessor's office performs its routine maintenance of the base map, water main information will probably need to be updated, too. This type of interdependency requires the development of excellent data processes and well-established communication channels. Some organizations outsource all data maintenance for several layers from all user departments to a third party. This is being done more and more, particularly to ensure data integrity and to guarantee that all layers are adjusted to the same standards.

Maintenance is not a chore that can be handled ad hoc by low-level technicians. Over a period of months or years, that approach can easily pollute the data quality that was so carefully established during startup. This is particularly true in organizations plagued by understaffing, high turnover and too few training dollars. It can also come to pass simply by virtue of the success of a GIS. Unanticipated demand by unforeseen users can hamper the ability of GIS staff to provide even basic services; maintenance is put off until someone has time to get to it. Maintenance may seem like a crisis sometimes, and lack of it can certainly cause crises. But it is often said that the Chinese word for “crisis” also contains “opportunity.” Innovative maintenance represents just such an opportunity to save money and improve GIS data quality. Don't pass it up.

Author Bio

Mickey Fain founded Serion International (formerly SMARTSCAN Inc.) in 1981. Serion is one of the leading full-service spatial database management firms in the United States. Fain is the co-author of a white paper entitled “GIS Database Maintenance–Part I.”
