by Bryon Turcotte, Contributing Editor
As electric utilities deploy smart meters and record terabytes of usage data that must be stored and accessed frequently, they look left, they look right. There’s not a room big enough to house the data on-site, so utilities need another option. It might be cloud computing.
“All of my clients are looking at how to store meter consumption data externally,” said Steve Rogers, vice president and North American chief architect at consulting firm Capgemini.
External storage isn’t the only reason cloud computing is attractive to utilities. The cloud offers a scalable solution, unlimited data access, computation systems to analyze raw data and cost savings, in addition to the reduced need for data storage and server configurations. And combined with an online interface, cloud computing provides utilities and customers access to accurate consumption data at any time through smart phones, tablets, desktops and laptops.
“In the last year and a half, utilities have developed a much higher expectation that any application they buy should have the possibility to be hosted outside their environment,” Rogers said.
Those applications, he said, are going beyond typical office apps such as email to smart meter as a service or claims management.
The cloud also frees information technology resources to focus on business solutions, said Ben Butera of Clearion Software in March during TechAdvantage Conference & Expo in San Diego. Butera touted the cloud’s flexibility, deployment speed and superior hardware and bandwidth.
John Ramsey, chief technical officer and Dell Fellow of Dell SecureWorks, joined Butera on the panel “Practical Uses of Cloud Computing in Utility Operations.” Security and privacy are the major issues with cloud adoption, Ramsey said. Regarding multitenancy, Ramsey said that in the cloud, anyone can move in next door on shared infrastructure, which leaves all involved feeling insecure.
Technology Evangelist Brad Molander of the National Information Solutions Cooperative (NISC) also spoke about cloud strengths and weaknesses in a session called “Modern Architecture for the Modern Smart Grid.” The cloud, Molander said, is a strong solution for backups, resource sharing and hardware operations, but it’s limited by server constraints and does not scale well vertically.
Together, Butera, Ramsey and Molander provided enough information for a list of the cloud’s pros and cons.
Cost-effectiveness. Cloud technology costs less than a server-based system required to manage the same amount of data on-site, and it eliminates on-site servers, storage devices and software applications. Providers fully manage services, which are sold on demand per minute or hour, giving users as much or as little service as they need. Cloud users also wipe out the costs of hardware purchases and replacements because necessary processing resides within providers’ cloud environments. This relieves internal operations when systems experience downtime and require immediate, around-the-clock support. The highly automated cloud allows information technology personnel to shift focus from updating software to dealing with other in-house issues.
Better processing, storage. The cloud’s multiple processing points, including multicore processing capabilities and resilient load distribution, help break down and process tasks efficiently and keep customers’ usage data readily available.
Quick deployment, flexibility. Cloud services can be provisioned quickly and scaled up or down as needs change, without waiting on hardware purchases.
Ability to monitor, access data. Users may access their information from any Internet browser on any computer.
Security, privacy. Companies retain control of their data behind the firewalls of their own network infrastructure, but when using the cloud, they hand the keys to their kingdoms to a provider that also holds the keys to many other doors. Physical security becomes a stress point because data may be saved at several locations within the cloud, and those locations may not share the same protection levels.
System downtime. The cloud is limited by its server configuration and network capabilities. What happens if the cloud’s server infrastructure fails, or the cloud network experiences latency issues or a shutdown? Because applications demand intense use of large data volumes, data transfer raises the possibility of bottlenecks. Redundant servers, fail-safes and backup storage usually catch the data if it begins to fall, but if the cloud’s network fails, companies and customers lose data access. A company that requires instant processing of large amounts of data generated by two applications (one retrieving measurements from field sensors, the other compiling billing data) might not see the cloud as an option after considering the consequences of failure. The lack of scalable storage, the potential absence of service-level agreements (SLAs) and concerns about interoperability and lock-in can also affect the decision to adopt the cloud computing model. Adopters might face severe constraints when moving data from one provider to another and find themselves locked in to a single vendor.
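The redundancy described above, falling back to backup servers when the primary path fails, can be sketched in a few lines. This is a minimal illustration, not any provider's API; the endpoint names and the fake fetch function are hypothetical.

```python
# Hypothetical sketch of failover across redundant cloud endpoints.
# Endpoint names and fetch logic are illustrative only.

def fetch_with_failover(endpoints, fetch):
    """Try each endpoint in order; return the first successful result.

    `fetch` is any callable that takes an endpoint and either returns
    data or raises an exception (timeout, connection error, etc.).
    """
    last_error = None
    for endpoint in endpoints:
        try:
            return fetch(endpoint)
        except Exception as err:  # in practice, catch specific network errors
            last_error = err      # remember the failure, try the next replica
    raise ConnectionError(f"all endpoints failed; last error: {last_error}")


# Simulated usage: the primary "fails", the backup serves the data.
def fake_fetch(endpoint):
    if endpoint == "primary.example.net":
        raise TimeoutError("primary unreachable")
    return {"endpoint": endpoint, "kwh": 42.7}

result = fetch_with_failover(["primary.example.net", "backup.example.net"], fake_fetch)
print(result["endpoint"])  # backup.example.net
```

The point of the pattern is that failover protects against a failed server, but, as noted above, nothing in it helps when the network path to the whole cloud is down.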
No matter a technology’s perks, utility adoption remains conservative and slow, Rogers said. He attributes the slow adoption rate to utilities’ frequently being regulated and their having security regulations that differ from those of other companies because utilities control the grid. Still, some utilities will adopt cloud computing sooner than others, he said.
“Right now it’s so expensive, and they’re nervous because of privacy and security, that I’m not sure they’ll go beyond that, at least the big IOUs (investor-owned utilities),” Rogers said. “The smaller municipal utilities may.”
Small, independent munis often own little of their infrastructure, which makes them good candidates for outsourcing their functions to the cloud, he said. But for smaller utilities to move to the cloud, they must get permission from city halls, whereas large IOUs seek permission from public utility commissions, Rogers said.
“I see city agencies moving their email platforms to Google and Microsoft,” he said.
Rogers said larger utilities mainly use external cloud providers such as Amazon for test environments.
“Many of the larger utilities are moving to the private cloud, primarily to provision platforms faster using tools like NetApp’s FlexPod that lets you create a repeatable virtual environment,” Rogers said.
Utilities, the Cloud and the Future
A Google ad in the January 2011 issue of The Economist quotes David Girouard, president of Google’s Enterprise business.
“We believe that 100 percent Web is the future of the cloud computing model, and brings substantial benefits for companies that no other IT model can provide in simplicity, cost, security, flexibility, and pace of innovation,” the ad quotes Girouard.
According to the ad, 3,000 businesses are moving to the cloud each day, and more than 3 million have moved since the cloud’s 2007 debut.
Rogers isn’t so sure cloud computing will be welcomed as warmly and as soon in the utility industry. “Is it the future? It certainly will be a significant part, but integration between applications is still a challenge,” Rogers said. “There are integration platforms like Dell Boomi, which indicates there’s a market. The next question will become: Does it integrate with multiple providers? The utility world faces significant challenges like performance. Data volumes are very high when you’re talking about reads. With smart grid and sensors, there can be almost no latency. Asset management is also becoming a big issue, so I’d expect to see that in the cloud soon.
“Many utilities are already doing visualization so they can see their assets on a map in real time using Google Earth to see if weather will impact their field assets. Almost everything I mentioned is being done by my current clients, like the Green Button initiative and the use of FlexPod to internally provision test environments.”
The Green Button initiative seeks to build an energy cloud to allow consumers to download their data to manage their appliances more efficiently, encourage utilities to build applications and make the data anonymous yet available to third parties for research and marketing.
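As a rough illustration of what downloaded Green Button data looks like in practice, the sketch below parses a simplified interval-data snippet. The XML here is a cut-down stand-in for a real ESPI (Energy Services Provider Interface) Atom feed; the namespace and element names follow the NAESB ESPI schema, but actual Green Button files carry much more structure than this.

```python
# Illustrative sketch: reading interval consumption data from a
# Green Button-style XML payload. The embedded snippet is a simplified
# stand-in for a real ESPI feed, assumed here for demonstration.
import xml.etree.ElementTree as ET

SAMPLE = """\
<IntervalBlock xmlns="http://naesb.org/espi">
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1325376000</start></timePeriod>
    <value>974</value>
  </IntervalReading>
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1325379600</start></timePeriod>
    <value>1013</value>
  </IntervalReading>
</IntervalBlock>
"""

NS = {"espi": "http://naesb.org/espi"}

def interval_readings(xml_text):
    """Return (start_epoch, value) pairs from an ESPI-style IntervalBlock."""
    root = ET.fromstring(xml_text)
    readings = []
    for reading in root.findall("espi:IntervalReading", NS):
        start = int(reading.find("espi:timePeriod/espi:start", NS).text)
        value = int(reading.find("espi:value", NS).text)
        readings.append((start, value))
    return readings

print(interval_readings(SAMPLE))  # [(1325376000, 974), (1325379600, 1013)]
```

Once parsed into (timestamp, value) pairs like this, the data is ready for the appliance-management and third-party analysis uses the initiative envisions.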
Green Button initiative site: http://greenbuttondata.org
FlexPod site: http://netapp.com/us/technology/flexpod
What is Cloud Computing?
Cloud computing can be daunting to anyone. Tangible hardware and software that can be manipulated and managed easily within arm’s reach are becoming the worn-out security blanket. Comparing that comfortable storage habit with the invisible cloud, many people perceive the new space as too unfamiliar, unfriendly and out there.
The change prompts questions from those with little or no understanding of the cloud: How can we store all information, processes and tasks on a cloud? How can we trust the cloud?
And electric utility executives approaching cloud integrations ask: How could this technology ever apply to our business? What are the benefits? How will it benefit customers? How does it work? What are the pros and cons?
First, decision-makers must understand what the cloud is and why it exists.
During a session at TechAdvantage 2012 in San Diego, Technology Evangelist Brad Molander of the National Information Solutions Cooperative (NISC) explained cloud computing, listed its characteristics and outlined its three service and four deployment models. Molander, borrowing a definition from the National Institute of Standards and Technology (NIST), said cloud computing is “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
The NIST definition is clear to some and confusing to others. Simply, cloud computing uses a network of remote servers hosted on the Internet to store, manage and process data, rather than using a personal computer, local server or servers at one location. It promotes convenient access, availability and serviceability.
A cloud computing offering, according to a 2008 International Data Corp. (IDC) report, must meet the following eight key cloud service attributes:
1. Off-site, third-party provider
2. Accessed via Internet
3. Minimum or no information technology skills required to implement
4. Provisioning = self-service requesting; near-real-time deployment; dynamic and fine-grained scaling
5. Pricing model = fine-grained, usage-based (at least available as an option)
6. UI = browser and successors
7. System interface = Web services APIs
8. Shared resources/common versions (customization around the shared services)
NIST also specifies five essential characteristics of a cloud computing system:
1. On-demand self-service: allows clients to adjust their levels of computing service, including the amount of server time and network storage. This is accomplished automatically without human interaction with the service provider.
2. Broad network access: extends the user’s network capabilities through mobile phones, laptops and other personal computing devices.
3. Resource pooling: serves multiple consumers using a multitenant model based on customers’ demands, consisting of specific physical and virtual resources that include storage, processing, memory, network bandwidth and virtual machines.
4. Rapid elasticity: allows capabilities to be provisioned and released automatically, so the service can scale out quickly with demand and scale back in when demand falls.
5. Measured service: allows the cloud system to automatically control and optimize resource use by leveraging storage, processing, bandwidth and active user accounts. This service allows usage to be monitored, reported and controlled while providing transparency for cloud providers and clients.
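The measured-service characteristic above, paired with the fine-grained, usage-based pricing in the IDC list, boils down to metering consumption per resource and multiplying by a unit rate. The sketch below is a toy model with made-up resource names and prices, not any provider's actual billing logic.

```python
# Toy model of "measured service": meter resource consumption, then
# charge only for what was used. Rates and resource names are invented
# for illustration; real providers meter far more dimensions.

RATES = {  # hypothetical unit prices, in dollars
    "cpu_hours": 0.05,
    "storage_gb_month": 0.02,
    "bandwidth_gb": 0.08,
}

class UsageMeter:
    def __init__(self):
        self.usage = {resource: 0.0 for resource in RATES}

    def record(self, resource, amount):
        """Accumulate metered consumption of one resource."""
        self.usage[resource] += amount

    def bill(self):
        """Usage-based pricing: sum of consumption times unit rate."""
        return sum(self.usage[r] * RATES[r] for r in RATES)

meter = UsageMeter()
meter.record("cpu_hours", 100)        # 100 CPU-hours  -> $5.00
meter.record("storage_gb_month", 50)  # 50 GB-months   -> $1.00
meter.record("bandwidth_gb", 10)      # 10 GB          -> $0.80
print(f"${meter.bill():.2f}")         # $6.80
```

The same metering data that drives the bill also provides the monitoring and transparency for provider and client that the characteristic describes.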
Three Service Models (See Table)
1. Software as a Service (SaaS): runs the cloud provider’s applications on a cloud infrastructure. Each application is accessible from various client devices through an interface such as a Web browser. Clients do not manage or control the network, servers, operating systems, storage or individual application capabilities, with the possible exception of limited user-specific application configuration settings.
2. Platform as a Service (PaaS): provides clients the ability to deploy cloud provider-supported applications onto the cloud infrastructure. Clients have no management or control permission on the cloud infrastructure but control the deployed applications and possibly application hosting environment configurations.
3. Infrastructure as a Service (IaaS): gives clients access to processing, storage, networks and other fundamental computing resources, on which they can deploy operating systems and other applications. Clients do not manage or control the underlying cloud infrastructure but do control operating systems, storage and deployed applications, with possibly limited control of select networking components such as host firewalls.
Four Deployment Models
1. Private cloud: operated for a single client; may be managed by that client or a third-party administrator and may exist on-site or off-site.
2. Community cloud: shared by several clients and supports a specific community with common concerns regarding mission, security requirements, policy and compliance. It may be managed by the clients or a third party and may exist on-site or off-site.
3. Public cloud: available for use by the general public and owned by a cloud service provider.
4. Hybrid cloud: a composition of two or more clouds (private, community or public) that remain separate entities but are bound by standardized technology enabling data and application portability and load balancing between clouds.