Distributed knowledge management can enhance utility security

B.K. Gogia, InferX Corp.

Information systems (IS) are vital components of the control and management of power from generation to end use. Supervisory control and data acquisition (SCADA) systems, which allow a control center to collect electrical system data from nodes placed throughout the power system, would not be possible without modern IS infrastructure. IS are also increasingly used to manage other aspects of the power distribution business more efficiently (e.g., supplier communications, customer management, and buying from and selling to third parties).

The thigh bone’s connected to the knee bone

The expansion of information systems in the power industry vastly increases the security vulnerabilities of power companies' networks. In particular, the accessibility of SCADA systems through non-SCADA networks, deregulation-driven pressure to implement open-access capabilities, and the expansion of corporate IT boundaries through mergers and acquisitions have left many information systems extremely vulnerable.

Information/knowledge management and control in SCADA systems is based on a centralized access mechanism. However, in many vulnerability analysis/assessment situations, collecting data from distributed hosts to generate profiles (e.g., vulnerability profiles) may not be technically feasible because of the volume of data involved and the immense size of modern power grids. This is especially true of data-intensive intrusion profile generation (e.g., inducing intrusion profiles via data mining techniques), where transmitting huge amounts of data to a centralized computing location may be technically infeasible or prohibitively expensive.

Intrusion detection is an important component of protecting a modern computer network system from unauthorized and hostile use. The main purpose of a typical intrusion detection system is to detect both outside intruders breaking into the system and insiders misusing it, based on observed deviations from normal system usage patterns. This is achieved by building a statistical, pattern-based profile of the monitored system. In typical intrusion detection scenarios, this profile is built from data collected from multiple hosts.
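The profile-based detection idea can be sketched in a few lines. This is an illustrative example only, not the article's actual system: the features (logins per hour, megabytes transferred) and the three-standard-deviation threshold are invented for demonstration.

```python
# Illustrative sketch: build a statistical profile of "normal" usage,
# then flag records that deviate sharply from it.
import statistics

def build_profile(records):
    """Compute a per-feature (mean, stdev) profile from normal-usage records."""
    features = zip(*records)
    return [(statistics.mean(f), statistics.stdev(f)) for f in features]

def is_anomalous(record, profile, threshold=3.0):
    """Flag a record if any feature deviates more than `threshold` stdevs."""
    for value, (mean, stdev) in zip(record, profile):
        if stdev > 0 and abs(value - mean) / stdev > threshold:
            return True
    return False

# Hypothetical normal usage: (logins per hour, MB transferred)
normal = [(5, 12), (6, 10), (4, 11), (5, 13), (6, 12)]
profile = build_profile(normal)
print(is_anomalous((5, 11), profile))    # typical record → False
print(is_anomalous((40, 500), profile))  # strong deviation → True
```

A production system would use far richer features and models, but the principle is the same: deviations from the learned profile, not fixed signatures, trigger the alarm.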

Keep out prying eyes

Most existing data-intensive profile-building tools (e.g., those using knowledge discovery/data mining techniques) assume that all the data can be collected on a single host machine and represented in a homogeneous, relational structure. This assumption is not realistic in highly distributed SCADA computing systems, and a number of research efforts have therefore been directed at data analysis for intrusion detection (e.g., applying distributed data mining techniques). Unfortunately, although most of these efforts allow the databases to be distributed over a network, they assume that the data in all of the databases is defined over the same set of features. To take full advantage of all the available data, analysis tools must have a mechanism for integrating data from a wide variety of sources and must be able to handle data characterized by spatial (or logical) distribution, complexity and heterogeneity.

A distributed data mining (DDM) tool offers such a mechanism. Distributed mining and global profile generation are accomplished through the synchronized collaboration of data analysis agents. Collaborating agents generate partial models and communicate intermediate results among themselves in the form of indices to the data records. The process terminates when a final model is induced. Since the communication mechanism does not involve any data transfers, local data ownership is maintained and security control is not compromised. In addition, a compression approach reduces the communication bandwidth consumed by data index transfers.
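A minimal sketch can make the index-exchange idea concrete. This is an assumed illustration, not InferAgent's actual protocol: the thresholds, data and the delta-encoding compression scheme are invented here to show how agents can collaborate by exchanging only record indices.

```python
# Illustrative sketch: agents mine locally and exchange only record
# indices (compressed), never the underlying data.

def local_candidates(local_data, threshold):
    """Each agent flags suspicious record indices; raw data stays local."""
    return {i for i, v in enumerate(local_data) if v > threshold}

def delta_encode(indices):
    """Compress a sorted index list as first value plus gaps."""
    s = sorted(indices)
    return [s[0]] + [b - a for a, b in zip(s, s[1:])] if s else []

def delta_decode(deltas):
    """Recover the original index list from its gap encoding."""
    out, total = [], 0
    for d in deltas:
        total += d
        out.append(total)
    return out

# Two agents observe different features of the same record IDs
agent_a = local_candidates([0.1, 9.5, 0.2, 8.8, 0.3], threshold=5)  # {1, 3}
agent_b = local_candidates([0.0, 7.1, 6.9, 7.4, 0.1], threshold=5)  # {1, 2, 3}
wire = delta_encode(agent_a)                      # A sends [1, 2], not its data
global_model = set(delta_decode(wire)) & agent_b  # records both agents flag
print(sorted(global_model))                       # → [1, 3]
```

The design point is that only small index lists cross the network, so each site retains ownership of its data while still contributing to the global profile.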

Some DDM tools, like InferAgent, utilize independent software agents installed at each distributed database location. Each "intelligent" knowledge discovery agent has access only to its own local database and is responsible for obtaining information from that local data source. InferAgent has a Java-based mediator tool to synchronize the collection of information from each location and keep the data private.

InferX Corp. markets InferAgent, a DDM tool developed by Datamat under the aegis of the Missile Defense Agency.

Gogia, chairman and CEO of InferX, can be reached at 703-917-0880 X225.

ELP Volume 80 Issue 10