Integration of disparate systems has become a major priority for utilities. Mergers and acquisitions, e-commerce, globalization, competition and re-regulation have created a need for utilities to make real-time operational and engineering data accessible throughout the company. Furthermore, these business conditions have increased the need for business-related analysis and planning. Much of the data contained in traditional back office systems needs to be shared with front office systems, such as billing and customer service. This is a case where the whole can be greater than the sum of the parts. Utilities can realize additional benefits by not only linking applications, but also by constructing an enterprise-wide view of their business. In this way, the data captured in existing applications can be used to create shared information. This article discusses how off-the-shelf technology can reduce the cost and time required to both integrate utility applications and create an infrastructure where shared information can be analyzed.
The Need for Application Integration
Today’s utility IT environment is truly heterogeneous. Some of the more significant features of this mix include:
- Many computing hardware platforms,
- Many operating systems,
- Mainframe/client-server/web-based systems, and
- Many component technologies (CORBA, DCOM, Enterprise Java Beans).
Furthermore, the utility of today has several sources of “real-time” process-oriented information that allows key devices, measurements and subsystems to be controlled and monitored. These include: supervisory control and data acquisition (SCADA) systems, energy management systems (EMS), distribution facilities management systems (DFMS), automatic meter reading (AMR) systems, and other sources of “real-time” information. Several of these systems may need to exchange information with business applications and desktops or with other process-oriented systems.
Availability of process information outside the control room would allow operational data to be integrated into business applications such as work management systems (WMS). WMS and other systems like it typically employ databases that have been configured separately from customer and SCADA system information. However, any or all of this information can be valuable to process-oriented or business applications.
While utilities need to integrate many applications (Figure 1), most likely each application structures its data in ways that are incompatible with other applications’ data structures. However, this issue can be overcome by the use of a single model for all data exchange.
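The payoff of a single exchange model can be sketched in a few lines of code. With N applications translating pairwise, up to N×(N−1) one-way translators are needed; with a common model, each application needs only one translator in and one out. A minimal illustration, in which the application names and field names are hypothetical:

```python
# Hypothetical record formats for two applications. Each application is
# interfaced only to the shared model, never directly to another application.

def wms_to_common(rec):
    """Translate a (hypothetical) work-management record to the shared model."""
    return {"equipmentID": rec["asset_no"], "name": rec["descr"]}

def common_to_scada(rec):
    """Translate the shared model to a (hypothetical) SCADA point record."""
    return {"point": rec["equipmentID"], "label": rec["name"]}

wms_record = {"asset_no": "XFMR-1042", "descr": "Substation 7 transformer"}
scada_record = common_to_scada(wms_to_common(wms_record))
print(scada_record)  # {'point': 'XFMR-1042', 'label': 'Substation 7 transformer'}
```

Adding a new application means writing one pair of translators against the shared model, not one translator per existing application.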
Common Data Models
Efforts are under way in the U.S. and in the international arena to create a standard way of integrating utility industry applications. One effort, a volunteer organization sponsored by EPRI, is called the Control Center Application Program Interface (CCAPI) Task Force. To date, the CCAPI Task Force's most significant technology is the Common Information Model (CIM). The CIM contains models for most of the software objects generally found in an electric utility's enterprise. The CIM is presently under consideration for adoption by the International Electrotechnical Commission (IEC) as an international standard.
As described by the IEC CIM documentation, the CIM is composed of the following packages:
- Core. This package models base types, such as power system resource and conducting equipment entities. Most of the other packages extend the types defined in this package.
- Topology. This package models how equipment is physically connected to other equipment. In addition, it models topology, which is the logical definition of how equipment is connected electrically via open or closed switch positions.
- Wires. This package models the electrical characteristics of transmission and distribution networks.
- Outage. This package models information on current and planned network configuration.
- Protection. This package models information for protection equipment, such as relays.
- SCADA. This package models information used by SCADA and alarming applications.
- Meas. This measurement package models dynamic measurement data.
- Load Model. This package models energy consumer and system loads as curves and associated curve data.
- Production. This package contains higher-level models of generators with production costing information.
- Generation Dynamics. This package contains lower-level models of prime movers, such as turbines and boilers.
- Domain. This package contains a data dictionary of quantities and units, defining data types for attributes (properties) that may be used by any class in any other package.
- Energy Scheduling. This package models transaction schedules and account information for electricity exchange between companies.
- Reservation. This package models transaction scheduling for energy, generation capacity, transmission and ancillary services.
- Financial. This package models settlement and billing as well as the legal entities that participate in formal or informal agreements.
- Asset. This package models utility equipment from a business asset perspective.
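The Core package's role as the root that other packages extend implies a class inheritance hierarchy. The following is a much-simplified sketch of that idea; the class names are drawn from the CIM's style, but the attributes and hierarchy shown here are abbreviated for illustration and are not the normative model:

```python
class PowerSystemResource:
    """Core package: base type for anything modeled in the power system."""
    def __init__(self, name):
        self.name = name

class ConductingEquipment(PowerSystemResource):
    """Core package: equipment that carries current; extends the base type."""

class Switch(ConductingEquipment):
    """Wires package: a switching device with a normal open/closed state.
    The Topology package uses such states to derive which equipment is
    electrically connected at any moment."""
    def __init__(self, name, normal_open=False):
        super().__init__(name)
        self.normal_open = normal_open

breaker = Switch("Feeder-12 breaker", normal_open=False)
print(isinstance(breaker, PowerSystemResource))  # True
```

Because every package extends Core's base types, any application that understands the base types can at least identify and name equipment modeled by packages it does not otherwise know about.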
More and more, organizations are accepting and implementing the CIM as an ideal model for building new utility operations applications and integrating legacy applications. For example, recently the North American Electric Reliability Council’s (NERC’s) Security Coordinators adopted a resolution mandating that CIM be used to exchange power system models between regional areas. This model exchange will allow neighboring utilities as well as national entities to create inter-regional network models based on current operational information instead of older, less accurate planning information.
Applications that have been integrated or are planned for integration with the CIM include but are not limited to the following:
- Automatic generation control and economic dispatch.
- Interchange scheduling.
- State estimation and power flow.
- Operator training simulator.
- Asset management.
The CIM's current scope goes far beyond its original application of modeling a utility's EMS data. The CCAPI Task Force break-out teams are working on extending the CIM to such topics as water networks, distribution networks, asset management and a common schematic display model. The CCAPI Task Force is also working on two application interface specifications: the Generic Interface Definition, which deals with creating a plan for a cohesive set of utility application interfaces, and the Data Access from Industrial Systems interface, which deals specifically with SCADA data access. The CCAPI Task Force's vision is that these APIs will lead to "plug-and-play" applications within the utility control center. Today, CCAPI's mission can best be defined by paraphrasing the objectives of the Object Management Group Utilities Task Force: to provide standard objects for the interoperation of systems and applications used for the production, transmission, distribution, marketing and retailing functions of electric, water and gas utilities. For information on CCAPI activities, see the ftp site hosted at ftp://ftp.kemaconsulting.com/epriapi
The CIM and Application Integration
It is important to realize that the CIM is used only to model data exchange. It is not used to model data inside applications. Some of the benefits of a standard data model for data exchange are:
- Each application needs to be interfaced only to a single model, instead of requiring multiple point-to-point translations.
- Although data may be distributed, a virtual database built upon a common schema helps reduce data duplication.
- Enterprise-wide data updates are available simultaneously.
- A flexible platform for future growth is provided.
- A platform is provided that makes it easier to present enterprise data in a coherent way, typically through a web page or so-called "information portal."
- A platform is available on which enterprise-wide analysis can take place.
There are two methodologies for ensuring that all exchanged data conforms to the CIM. The first is to accumulate all shared data into one or more centralized databases. The second is to recognize and accommodate the individuality of today's data stores, and to establish rules and communications protocols for exchange among the distributed data.
There are many reasons not to store all data in a single database. First, not all data are stored efficiently in the same kind of database: process (real-time/temporal) data are not stored efficiently in a transactional (relational) database, and model/configuration information is not stored efficiently in a temporal database. Second, the users of different data reside in a variety of groups around the organization. Users interested in outage management are not the same users who update drawings, and it does not make technical sense to have these groups working in the same database. However, it is important for the two groups to have access to one another's data when needed.
An alternative to putting all common data into centralized databases is to align existing data information sources within a utility into a cogent virtual information system (Figure 2). For a data model such as the CIM to be used by multiple applications, its meaning and context must be understood and managed. The commonly accepted way to manage data semantics is by describing what, where and how data is used in a metadata repository. Metadata is simply information about data. A metadata repository provides a utility with the infrastructure necessary to create a single cohesive model for all data exchanged. A metadata repository looks like a single all-encompassing data structure that supports technical efforts within the utilities. However, it maintains each data source’s individuality.
In addition to a data exchange model, a utility also needs a communication infrastructure to reliably link legacy applications. Recently a new type of middleware has emerged whose sole function is to link legacy applications via the exchange of messages. This software requires building wrappers for existing applications and can provide a standardized way to perform integration.
In general, wrappers are responsible for exposing the wrapped application's functionality in a way that is compatible with a common methodology. Wrappers also perform translation between object models. This approach's most significant benefits are that it is technology-neutral and that legacy applications do not need to be rewritten.
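In code, a wrapper is a thin layer that owns the translation between a legacy application's native structures and the common exchange model. A sketch, assuming a hypothetical legacy outage application and field names:

```python
class LegacyOutageSystem:
    """Stand-in for an existing application with its own native format:
    here, (circuit, outage-flag) tuples, where flag 1 means out of service."""
    def fetch(self):
        return [("CKT-7", 1), ("CKT-9", 0)]

class OutageWrapper:
    """Exposes the legacy system through the common model, so other
    applications never see its native format. The legacy code is not
    rewritten; only this translation layer is added."""
    def __init__(self, legacy):
        self._legacy = legacy

    def outages(self):
        return [{"equipmentID": ckt, "inService": flag == 0}
                for ckt, flag in self._legacy.fetch()]

print(OutageWrapper(LegacyOutageSystem()).outages())
```

If the legacy application is ever replaced, only the wrapper changes; every consumer of the common model is untouched.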
Because existing legacy applications cannot be assumed to be running at the same time, they usually require a looser coupling. Message brokers have proved to be the best way to link loosely coupled applications. With message broker technology, an application can send messages to a broker message queue for later delivery. The message broker can also dispatch messages to other internal or external applications based on predefined criteria. To use a post office analogy: no one waits at the front door for the letter carrier to arrive before mailing a package. Mailboxes provide a convenient place to hold letters until a mail truck comes along to pick up the outgoing mail and deposit the incoming mail. A message broker performs similar functions.
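The mailbox behavior can be sketched with a few lines of Python. This is an in-process illustration of the store-and-forward idea only, not a substitute for commercial message broker products; the topic name and message contents are hypothetical:

```python
import queue
from collections import defaultdict

class MessageBroker:
    """Store-and-forward dispatch: publishers post to a topic and move on;
    each subscriber has a mailbox queue it drains whenever it runs."""
    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of mailbox queues

    def subscribe(self, topic):
        mailbox = queue.Queue()
        self._subs[topic].append(mailbox)
        return mailbox

    def publish(self, topic, message):
        # Delivered to every mailbox now; read by subscribers later.
        for mailbox in self._subs[topic]:
            mailbox.put(message)

broker = MessageBroker()
billing = broker.subscribe("outage")  # the billing application's mailbox
broker.publish("outage", {"equipmentID": "CKT-7", "inService": False})
# The subscriber was not waiting when the message arrived; it sits in
# the mailbox until the subscriber reads it:
print(billing.get_nowait())  # {'equipmentID': 'CKT-7', 'inService': False}
```

Neither side blocks on the other: the publisher returns immediately after `publish`, and the subscriber reads on its own schedule, which is exactly the loose coupling legacy applications need.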
The creation of a cohesive set of APIs for all data exchange across the entire electric utility can be a very large undertaking. It is likely that, unless existing technology is utilized, many utilities will be unable to afford such a comprehensive solution. Fortunately, much of the required engineering is complete. Furthermore, since the CCAPI technology is non-proprietary, vendors are free to build products that leverage this prior work. In fact, this is happening already.
CIM-based applications and integration tools are available today. It is critical for IT and operations managers to understand the issues at hand as they adopt integration strategies and evaluate what products are appropriate to their situation. A CIM-based integration bus is one solution that has been created to leverage existing standards and products while providing a solution to address the unique needs of utilities.
John Gillerman of SISCO Inc. has over 16 years of experience working in the power engineering field. Since 1988, he has concentrated on software system design for power system monitoring and control. At SISCO, Gillerman has co-led the design and prototyping of the Utility Integration Bus, a utility application integration tool based on open standards. Gillerman has served as the Integrated Security Network Facilitator for NERC's Data Exchange Working Group and is an active member of EPRI's Control Center API Task Force and OMG's Utility Domain Task Force. He graduated with a BSEE from Rutgers University.