SCADA/EMS Developments Mirror Market Changes

By Robert Crosley and Bob Fesmire, ABB Inc.

For many years, SCADA/EMS was the system for the vertically integrated utility, and, in many ways, these systems are still vital to the continuous operation of power delivery systems. Changes in both technology and the business environment, however, have wrought significant changes in the IT infrastructure that supports utility operations. Over the last decade or so, there has even been a kind of give-and-take between these two forces, one enabling the other. This overall trend, the interaction between technology and the business environment, appears likely to continue well into the future.

Of course, power systems functioned for decades without the benefit of computerized control systems. Operational adjustments were made by people. Utilities had to employ large technical staffs to man the equipment in various locations. The whole process was extremely labor-intensive and depended almost exclusively on the experience and judgment of the operators.

In the 1960s, computing technology began to move from government and university research to more “real world” applications, and electric power control systems were among these. Early SCADA systems did allow a greater degree of centralized control, but their analytic capabilities were limited by the time and processing resources required. Also, power systems were still being designed for one utility operating more or less independently from its neighbors, save for a small amount of cooperative energy trading, and they remained vulnerable to widespread outages.

The need to model more complex distribution systems and gain a better understanding of their behavior drove further IT development, and with the computer revolution of the early ’80s, the technology was soon available to do just that. In addition to offline modeling, utilities could now implement automated controls that were themselves based on analyses previously impossible to perform in a timely manner. As the PC made its way into all aspects of business, the falling cost of processing power, memory and data storage, along with faster communications, fueled even more SCADA/EMS development.

One obvious by-product of this shift was the continued automation and centralization of utility operations. Engineering expertise and judgment were still important, of course, but they were turned to the task of developing the automated systems that would carry out much of the work previously done by operators. As the 1990s approached, a new day was dawning on the power industry. The reduced staffing demand precipitated by advances in SCADA/EMS was just one of several factors that paved the way for a tectonic shift in how utilities operated, and even in how they viewed themselves as organizations.

Throughout the ’80s, one of the prime movers of change in the business world was the tide of deregulation. Airlines, telephone service and other industries were being opened up. The natural gas business followed suit, and it wasn’t long before policymakers began to seriously consider doing the same for electricity. In this same span of a few years, computing and communications technologies were exploding. By the time the 1992 Energy Policy Act set the stage for power industry deregulation, the new generation of SCADA/EMS was ready to support the shift. There is a certain chicken-and-egg aspect to the timing, but competitive energy markets had the good fortune to come into being in parallel with the IT systems needed to make them possible. A differential of even five to eight years between application development and (de)regulatory ambition would have rendered the discussion moot.

One element of SCADA/EMS development that was especially influential didn’t really have that much to do with technology per se. It wasn’t a better mousetrap, but a better way of building really complicated mousetraps: the advent of open-architecture systems such as Linux and standardized commercial products like Oracle. Up to that point, SCADA/EMS was the exclusive province of proprietary software, and a given utility was tied by necessity to its vendor for most if not all of the applications that fell under the SCADA/EMS umbrella. Under an open-source paradigm, the utility had greater flexibility. The company could opt for a turnkey system or could parse out some of the components to be developed by different vendors, because they were all based on a common platform whose operating parameters were public information.

This new level of choice had a predictable impact on SCADA/EMS suppliers as well. On one hand, they no longer had the dubious luxury of a “captive” customer, but, on the other, they were relieved of the task of building the database systems and other structures that supported their systems. They could now focus on the applications themselves, on improving functionality, and on delivering value-added service.

With the advance of SCADA/EMS technologies, operational data has become more readily available to a wider community of users within the utility organization. Witness the current emphasis in the industry on asset management: much of the data required to support a successful asset management program comes out of the utility’s monitoring and control system. As in any number of other industries, access to data tends to foster more applications of that data, and the same is true in the utility world. In the power business, though, there is an added dimension due to the separation of the formerly integrated functions of generation, transmission and distribution.

Over the last few years, SCADA systems have begun to be applied to the specific needs of now-independent generation businesses and local distribution companies, in addition to the traditional transmission operations. Though these businesses differ considerably, they all rely heavily on high-quality and often extremely timely information just to run their operations, let alone build or improve them. As we’ve already noted, SCADA provides the “raw material” in the form of operational data, but it also provides the analytic applications to turn data into useful information. Today’s energy markets rely on the market operator’s ability to manage congestion and ensure grid security to arrive at market clearing prices. These objectives are supported by applications like state estimation and security analysis that have long been under the SCADA/EMS umbrella. As computing technologies continue to advance, these systems will provide even more robust analytic capabilities.
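To make the state estimation piece concrete, the following is a minimal sketch of a weighted least-squares estimator on a linearized (DC) network model. The three-bus topology, line susceptances, meter noise level and NumPy implementation are illustrative assumptions made for this article, not a description of any particular SCADA/EMS product.

```python
# Minimal sketch of weighted least-squares (WLS) state estimation on a
# linearized (DC) three-bus network. All numbers are illustrative.
import numpy as np

# State: voltage angles at buses 1 and 2 (bus 0 is the angle reference).
# Assumed line susceptances (per unit): b01 = 10, b02 = 5, b12 = 8.
# Measurement rows: flow 0->1, flow 0->2, flow 1->2,
# injection at bus 1, injection at bus 2.
H = np.array([
    [-10.0,   0.0],
    [  0.0,  -5.0],
    [  8.0,  -8.0],
    [ 18.0,  -8.0],
    [ -8.0,  13.0],
])

x_true = np.array([-0.05, -0.10])            # "actual" bus angles (rad)
sigma = 0.01                                 # assumed meter noise (pu)
rng = np.random.default_rng(0)
z = H @ x_true + rng.normal(0.0, sigma, 5)   # noisy telemetered values

# WLS estimate: x_hat = (H' W H)^-1 H' W z
W = np.eye(5) / sigma**2                     # weights = 1 / variance
G = H.T @ W @ H                              # gain matrix
x_hat = np.linalg.solve(G, H.T @ W @ z)

residuals = z - H @ x_hat                    # basis for bad-data checks
print("estimated angles (rad):", x_hat)
print("measurement residuals :", residuals)
```

In a production EMS the measurement model is nonlinear in voltage magnitudes and angles and is solved iteratively, and the residuals feed bad-data detection, but the basic idea of reconciling redundant, noisy telemetry into a single consistent system state is the same.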

In the electric utility world, the term “security” is often applied in two ways. The first, as just mentioned, refers to the physical integrity of grid equipment in the face of operating practices and unforeseen events. The second refers to cyber-security, or the integrity of the “virtual” systems that monitor and control the physical power grid. The proliferation of open systems, standard commercial products, and especially Internet-enabled systems has opened SCADA/EMS up to a new universe of users, but also to greater security threats. Old proprietary systems were isolated, both physically and in terms of restricted access and proprietary code. Today’s systems are simultaneously more useful and more vulnerable to attack. The last few years have seen increased attention paid to all forms of security, but cyber-security in particular is likely to see major advances in the years ahead.

The dual forces of open markets and advancing technology will continue to drive changes in all aspects of the industry, especially with regard to SCADA/EMS. Greater computing capabilities will enable the implementation of better analytic algorithms; the proliferation of Ethernet, fiber optic cable, and technologies not yet commercialized will make communications even faster; open-source software and greater integration will continue to make operational data more widely available to a greater range of applications. Driving all of these movements are the imperatives of open markets. As higher demands are placed on our power infrastructure, particularly with regard to bulk “shipment” of power over long distances, we will face a simultaneous challenge in the form of environmental concerns, old-fashioned NIMBY-ism, and other obstacles to physical expansion of the transmission infrastructure.

What all this means is that the power industry, and in particular transmission operators, will be forced to do more with the “iron in the ground” we currently have. That, in turn, points to advances in monitoring and control technologies. Both the Department of Energy and the Department of Homeland Security have called for modernization of the grid, but in most cases that will translate into enhancements to the IT infrastructure that underlies the physical power system.

In this environment, more intelligence will need to be built into SCADA/EMS systems, and an even more seamless integration of monitoring, analysis and control functions will facilitate the process. New technologies such as wide-area monitoring based on phasor measurements, today in their commercial infancy, will likely find their way into tomorrow’s baseline systems. And as technology advances, driven by the needs of energy markets, a certain irony begins to take shape: in another decade or two we may find that our power system control schemes have come full circle, back to a decentralized structure, only this time the machines will largely run themselves.
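As a rough illustration of what a phasor measurement involves, the sketch below extracts a voltage phasor from one cycle of waveform samples using a single-frequency DFT. The sampling rate, test signal and Python/NumPy framing are assumptions made for illustration; real phasor measurement units add precise GPS time-stamping, filtering and frequency tracking defined by industry standards.

```python
# Minimal sketch of a PMU-style phasor calculation from sampled waveforms.
# Sample counts, amplitudes and angles are illustrative assumptions.
import numpy as np

F_NOMINAL = 60.0          # nominal system frequency (Hz)
SAMPLES_PER_CYCLE = 48    # assumed sampling rate of 48 samples per cycle

def estimate_phasor(samples: np.ndarray) -> complex:
    """Return the RMS phasor of one nominal cycle of samples."""
    n = np.arange(len(samples))
    # Correlate the window with a unit rotating vector at nominal frequency.
    rotator = np.exp(-2j * np.pi * n / len(samples))
    return np.sqrt(2.0) / len(samples) * np.sum(samples * rotator)

# Synthetic 60 Hz waveform: 1.0 pu peak amplitude, -20 degree phase angle.
t = np.arange(SAMPLES_PER_CYCLE) / (F_NOMINAL * SAMPLES_PER_CYCLE)
v = 1.0 * np.cos(2 * np.pi * F_NOMINAL * t - np.radians(20.0))

phasor = estimate_phasor(v)
print(f"magnitude (RMS pu): {abs(phasor):.4f}")                    # ~0.7071
print(f"angle (degrees)   : {np.degrees(np.angle(phasor)):.2f}")   # ~ -20.00
```

Time-synchronized phasors of this kind, collected from many substations at once, are what give wide-area monitoring its system-wide, real-time view of grid conditions.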

Bob Fesmire is a communications manager in ABB’s Power Technologies division. He writes periodically on IT systems, transmission and distribution issues, and other power industry topics.

Bob Crosley is director of SCADA/EMS Operations for Network Management in Santa Clara, California. Prior to coming to ABB in 1996, he worked on SCADA/EMS projects for Florida Power Corporation.
