How Real-Time Analytics Help Transform Utility Operations

The Importance of a Reliable Approach for Turning Raw Operational Data into User-Friendly, Interactive, Context-Rich Business Intelligence

The right operational analytics strategy can empower utilities to make better decisions, from long-term strategy to urgent outage response. All too often, however, potentially valuable data sits underutilized. Nonintegrated source systems wall off data from easy access while isolating it from essential context. In some cases, operations managers have to send a request to IT for even routine data questions. In others, detailed operational analytics and reporting can only be delivered weeks or even months after the fact.

For some industries, collecting and processing data is important only for the generation of long-term strategic insights. Utilities, however, face a slew of challenges that are best addressed with real-time, data-driven decision-making. Customer communication, outage management, asset management for vital infrastructure/equipment, and field service all benefit directly from streamlined access to detailed real-time operational analytics.

“Utilities are sitting on a wealth of opportunity from data analytics, with more information than ever before flowing from smart meters and other sensors, along with traditional sources of data about their operations...[Executives] know these discoveries will eventually generate tremendous value for their organizations, but they tend to think this will be years away. The good news is, they don’t need to wait.”

– Bain & Company Report on Utility Analytics

For most utilities, the sheer volume of potentially valuable data is not the issue. Utilities manage complex infrastructure that generates large amounts of operational data across a variety of source systems. More and more utility companies have begun capturing and storing this data. But transforming this raw operational data into real-time business intelligence (while saving valuable nuggets for later analysis) is a genuine technology management challenge.

In this report, we look at how the right real-time data approach, paired with user interfaces featuring the right tools, like interactive dashboards, drag-and-drop and drill-down capabilities, and real-time outage maps, can meet this challenge. An effective real-time analytics system can have a direct impact on operations and function as an integrated ecosystem that ties together essential data from a variety of source systems.

An Ideal Utility Analytics Implementation Offers Intuitive Access to Interactive Real-Time Intelligence

Utilities are collecting more data than ever. In its raw form, however, the value of this data is limited. To realize its full potential, data needs to be easily accessible, rich with granular operational context, and ready for analysis the moment it is generated.

The ideal utility analytics solution will:

• Provide valuable operational insights that are conveniently accessible for decision-makers across the organization.
• Be available through an intuitive, interactive user interface that makes detailed information fully accessible to non-technical decision-makers.
• Collect and integrate data from multiple source systems.
• Support interfaces like dashboards with real-time data streaming.
• Curate data to support long-term exploration and analytics while avoiding the unnecessary storage of data with no value.

Fulfilling all of these requirements can be a daunting technical challenge for utilities. In our experience, many utilities are leaving as much as 80% of the data they collect “on the table.” In most cases, the underlying infrastructure for collecting this data is in place. But most of this data goes unexploited, with the primary source systems collecting far more than is effectively analyzed.

 

Common Limitations That Can Be Addressed with the Right Utility Analytics Capabilities

Leaving reams of data underutilized can impose a thick operational fog; having the right analytics capabilities can provide clarity and focus. Issues like inaccessible data silos are not only a lost opportunity, but a sharp limitation on a utility’s knowledge of its own operations. 

When data goes completely un-analyzed, a utility cannot even know what questions could be answered with that data. Without a strategy for aggregating data and transforming it into actionable analytics, the utility “doesn’t know what it doesn’t know.”

Data Silos Limit Analytics and Hamper Operational Awareness

A data silo isolates information, disconnecting it from the context and the personnel needed to realize its full business value. A silo can develop because a given source system is only available to a specific business unit, is not integrated with day-to-day operational analytics, or is only accessible by highly trained technical personnel.

Data silos limit analytical transparency between different parts of the organization. Utilities are complex operations that rely on a variety of systems, all of which may contain valuable nuggets of data. 

The inability to effectively pull data across these systems can create a variety of issues. 

1. It can impose sharp limits on the awareness of utility professionals, from field technicians all the way to upper management. For instance, if an executive has access to a breakdown of active outages but no real-time intelligence on crew status, she will have a sharply reduced view of the utility’s service restoration operations. 

2. The value of data is often dramatically enhanced by integration with other data that provides essential context. For example, if an operations manager can see a breakdown of current network outages but cannot easily access details from the associated customer accounts, it may be hard to evaluate the business impact of these outages in real time.

3. Data silos reduce a utility’s ability to harness useful insights in a timely manner. Unintegrated source systems may be amenable to longer-term analysis by manually pulling data and exhaustively compiling reports that integrate information from across the organization. But without analytics that integrate data streams from these source systems in near real-time, these reports will never be available to address fast-moving business questions. This issue is closely related to the next limitation: a broader inability to translate data into a real-time business asset.

Outdated Architectures Prevent Effective Real-Time Data Use

Many utilities rely on a traditional data warehousing approach built on a row-oriented relational database, a longtime business data workhorse that is optimized for transactions rather than analytics. Legacy data warehousing is adequate for historical analysis, but effective real-time transformation and analysis require a database designed for analytical workloads. Modern columnar databases can scan and aggregate large volumes of operational data quickly enough to support timely decisions, making them far better suited to analytics platforms than traditional row stores.
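
As a rough illustration, the sketch below runs a dashboard-style aggregation against an embedded columnar engine (DuckDB is assumed here; the outage_events table, its columns, and its values are hypothetical stand-ins for data a utility's own systems might produce).

```python
# Minimal sketch: a dashboard-style aggregation against an embedded columnar
# engine (DuckDB). The outage_events table and its columns are hypothetical.
import duckdb

con = duckdb.connect()  # in-memory analytical database

con.execute("""
    CREATE TABLE outage_events (
        region TEXT,
        customers_affected INTEGER,
        restored_at TIMESTAMP
    )
""")
con.execute("""
    INSERT INTO outage_events VALUES
        ('North', 340, NULL),
        ('North', 120, '2024-01-01 10:15:00'),
        ('South', 980, NULL)
""")

# Columnar engines scan and aggregate wide event tables quickly, which is
# exactly what real-time dashboard queries need.
active_by_region = con.execute("""
    SELECT region,
           COUNT(*)                AS active_outages,
           SUM(customers_affected) AS customers_affected
    FROM outage_events
    WHERE restored_at IS NULL
    GROUP BY region
    ORDER BY customers_affected DESC
""").fetchall()

print(active_by_region)
```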

ETL tools are evolving, and it is easier than ever to quickly pull and transform data from varied source systems. Visualization tools are also evolving rapidly and make it easier for users to conduct fairly sophisticated data exploration on their own, without IT intervention. Still, real-time data remains an essential foundation for opening up self-service exploration of current operations, and true data streaming is extremely important for taking a utility’s operational awareness to the next level.
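
To make the extract-transform-load step concrete, here is a minimal, hedged sketch using pandas: it joins outage records with customer context and derives a duration. In practice the extracts would come from source systems such as an OMS or CIS; the inline frames, column names, and values here are illustrative assumptions rather than a prescribed schema.

```python
# Minimal ETL sketch with pandas. In practice the extracts would come from
# source systems via APIs, database pulls, or flat files; small inline frames
# with hypothetical columns stand in here.
import pandas as pd

# Extract: outage records from the outage management system (illustrative).
outages = pd.DataFrame({
    "outage_id": [101, 102],
    "account_id": ["A-1", "A-2"],
    "started_at": pd.to_datetime(["2024-01-01 09:00", "2024-01-01 09:40"], utc=True),
})

# Extract: customer context from the customer information system (illustrative).
accounts = pd.DataFrame({
    "account_id": ["A-1", "A-2"],
    "customer_class": ["residential", "critical"],
    "region": ["North", "South"],
})

# Transform: enrich each outage with customer context and a derived duration.
enriched = outages.merge(accounts, on="account_id", how="left")
enriched["minutes_out"] = (
    pd.Timestamp.now(tz="UTC") - enriched["started_at"]
).dt.total_seconds() / 60

# Load: hand the result to the analytics layer, e.g. a warehouse table or a
# Parquet file in the data lake (the path below is hypothetical).
# enriched.to_parquet("lake/outages_enriched.parquet", index=False)
print(enriched)
```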

Databases Fail to Provide User-Friendly Access for Non-Technical Stakeholders

The need to query databases to answer basic operational questions can put sand in the gears of day-to-day utility operations. Although writing a SQL query might be a relatively quick and easy operation for a database expert, this requirement effectively restricts access to key data for non-technical personnel.

Constant data requests can be a nuisance for IT or other system administrators. The far greater issue, however, is this unnecessary roadblock to organizational knowledge sharing. Personnel ranging from executives, to field technician crews, to operations managers can benefit immensely from the ability to quickly answer their own questions without the delay of issuing a request to IT. In this case, user interface limitations create data silos which directly prevent data from delivering its maximum possible value to the organization.


Important Attributes of an Impactful Real-Time Analytics Strategy

Modern business intelligence tools make it easy to pull some data into a usable dashboard with a sleek interface. Implementing a solution that weaves together real-time data streams from multiple source systems while maintaining a single source of truth, however, requires a more complex solution.

The capabilities discussed in the section below are vital for developing an analytics implementation that can truly transform the way an organization uses data. A transformational solution should offer data-driven decision support for customer service representatives, field crews, operations managers, executives, and beyond.

Real-Time Utility Analytics Directly Address Traditional Limitations

The right analytics capabilities can directly address the limitations discussed above and enable more data-driven decision-making throughout the organization. The ideal goal of a utility analytics implementation should be an integrated ecosystem where relevant data can be streamed from all of the important operational systems in real time.

A high-quality, interactive utility dashboard, built on a quality data warehouse that ties together real-time data feeds, is a great example of a practical solution that can help a utility move directly toward a more integrated, impactful analytics ecosystem. HEXstream has extensive experience delivering dashboards for some of the biggest utility companies in North America. In the remainder of this article, we draw on this experience to take a look at important lessons for delivering the most impactful analytics possible.

A Strategic Approach to Data Curation and Consumption: Supporting Real-Time Decisions and Long-Term Strategic Analysis

A utility’s data can provide analytical value on two different time frames:

1. Near-term and real-time operational data helps support business decisions with timely analytics.

2. Long-term data analysis can help uncover new efficiencies, analyze continuous improvement initiatives, and compare historical trends.

A utility’s approach to analytics needs to remain sensitive to both needs. Collecting huge volumes of data across myriad source systems, utilities will always face decisions concerning which data to store and which data to discard. It can be easy for a utility to err on either side of these decisions.

If a utility only keeps the data it already knows to be valuable, it excludes the possibility of using analytics to uncover new value down the road. If too little data is stored for later exploration and analysis, a utility sharply limits its ability to develop new use cases for operational data streams. At the same time, a utility that simply saves all of the data it collects will inevitably be paying to store data that has little value. A data lake can be a cheaper alternative, but most of the time dumping all types of data in a data lake without proper design and planning creates a nightmare for data retrieval.

In the face of this balancing act, HEXstream’s experience suggests that “big” versus “small” data distinctions are largely overblown with respect to utilities’ most pressing business needs. “Big data” is one of the most hotly discussed analytics topics in the business press for a reason: managing and analyzing voluminous data requires methods that are beyond the scope of traditional data analysis techniques. From the ground-level view of putting valuable real-time insights in the hands of decision-makers, however, big data analysis concerns are less pressing. No matter its size, data is information, and we prefer to see it collectively, as oneData.

Once data is generated, the utility should be able to conduct relevant analyses immediately. Access to quality data presentation should never be delayed, especially during storms and other high-urgency situations. From there, the data management challenge shifts to effective curation.

First, data streams from various source systems need to be provided to fuel real-time analytics interfaces (common examples are dashboards and outage maps). At this stage, exploratory analysis to determine which data is important is not required; the ground-level needs of operations personnel indicate which data is valuable for real-time decision-making.

Next, operational data streams should be channeled into a “data lake.” This term descriptively captures the fact that, at this point, all data should be included. By leaning towards inclusivity at this early stage, all potentially valuable operational data is available for exploration. Effective exploration identifies which data has long-term analytics value. Data that is determined to have no value can be discarded to help limit storage needs and avoid the development of a costly, unnecessary graveyard for old operational data (in the figure below, this discarded data is shown as “exhaust”).

Valuable data can subsequently be treated and containerized for more systematic long-term data analytics. This data remains stored and available for more granular exploration and analysis.
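
A rough sketch of this flow appears below, under the assumption that events arrive one at a time from a streaming layer: every event fuels the real-time interfaces and lands in the data lake, and only records judged to have long-term value are promoted to curated storage. The sink functions and the value rule are hypothetical placeholders, not a prescribed design.

```python
# Illustrative sketch of the curation flow described above: every incoming
# event feeds the real-time layer and lands in the data lake; only records
# judged valuable for long-term analysis are promoted to curated storage,
# and the rest is treated as exhaust. All names and rules are hypothetical.
from typing import Callable

def route_event(event: dict,
                publish_realtime: Callable[[dict], None],
                write_to_lake: Callable[[dict], None],
                write_to_curated: Callable[[dict], None],
                is_long_term_valuable: Callable[[dict], bool]) -> None:
    # 1. Real-time layer: dashboards and outage maps see the event immediately.
    publish_realtime(event)

    # 2. Data lake: lean toward inclusivity so later exploration stays possible.
    write_to_lake(event)

    # 3. Curated store: keep only data with identified long-term value;
    #    everything else is "exhaust" and is eventually discarded.
    if is_long_term_valuable(event):
        write_to_curated(event)

# Example wiring with placeholder sinks and a simple (hypothetical) value rule.
if __name__ == "__main__":
    event = {"type": "outage", "feeder": "F-102", "customers_affected": 340}
    route_event(
        event,
        publish_realtime=lambda e: print("realtime:", e),
        write_to_lake=lambda e: print("lake:", e),
        write_to_curated=lambda e: print("curated:", e),
        is_long_term_valuable=lambda e: e.get("customers_affected", 0) > 0,
    )
```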

HEXstream's "One Data" Approach


Seamless Integration Removes Data Silos around Utility Source Systems

Analytics capabilities are not just a convenient UI for accessing data; they are a natural nexus for integrating the most useful data from systems across the organization. Integrated, intuitive, interactive interfaces, fueled by real-time data streaming, effectively overcome several problems associated with silo-bound data.

• Non-technical personnel can instantly check important KPIs. Because resources like dashboards can be configured to include nearly any operational metric, they can be implemented to reflect the specific needs of any business unit, regardless of its technical expertise.

• Stakeholders across the organization have dramatically increased transparency. Source systems no longer impose arbitrary boundaries on data-driven decision-making and collaboration, dramatically enhancing overall operational awareness.

• Integration ensures that data is given the rich context it needs to provide the most value possible. A single interface puts all the data relevant to a particular operations area in one place.

• Valuable insights that draw on data from across the organization no longer need to wait for quarterly data pulls or reports. Data can be harnessed the moment it is generated.

 

An Interactive Interface Makes it Easy to Drill Down to Deeper Details

Real-time data offers the most value when it can be accessed through an interactive interface. Specifically, this interface should offer a high-level view of key data that can be quickly focused to examine more granular details. This interactivity is the most streamlined way to give utility professionals the ability to see the big picture alongside the freedom to see specific details relevant to the challenge at hand.

Estimated Restoration Time (ETR) data provide a useful example. Dashboards should provide a high-level overview of crucial data, like ETRs expiring in the next 15 minutes. From there, more specific details should be available within a few clicks in the same interface. In the ETR example, a manager might need a quick filter for crew- versus system-generated ETR estimates, or a quick comparison of current ETR performance with past performance (or performance goals). For a deeper look at important ETR reports for outage dashboards, please see our blog here.
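
As a hedged illustration of that drill-down, the pandas sketch below starts from a small table of active ETRs (the columns and values are hypothetical), surfaces the ETRs expiring within 15 minutes, and then splits them by crew- versus system-generated estimates.

```python
# Minimal drill-down sketch with pandas. The DataFrame columns (feeder, etr,
# source) are hypothetical stand-ins for fields an outage management system
# might expose to the dashboard layer.
import pandas as pd

now = pd.Timestamp.now(tz="UTC")
etrs = pd.DataFrame({
    "feeder": ["F-12", "F-07", "F-31"],
    "etr": [now + pd.Timedelta(minutes=m) for m in (10, 45, 5)],
    "source": ["crew", "system", "system"],
})

# High-level view: ETRs expiring within the next 15 minutes.
expiring_soon = etrs[etrs["etr"] <= now + pd.Timedelta(minutes=15)]

# Drill-down: split the expiring ETRs by crew- vs. system-generated estimates.
by_source = expiring_soon.groupby("source").size()

print(expiring_soon)
print(by_source)
```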

Interactivity helps put all of this information at the disposal of decision-makers on a flexible, as-needed basis. When confronting an urgent question that can only be answered with analytics, personnel will not need to pore over a spreadsheet or submit a data request to IT.

Customer-Facing Integration Enables Real-Time Data Streaming to Support a Better Utility Customer Experience

The data streaming architecture necessary to support real-time analytics should also be ready to push data out publicly. Effective real-time analytics already help customers by helping utilities:

• Operate more efficiently.

• Resolve outages more quickly.

• Manage assets more proactively by leveraging predictive maintenance.

• Provide granular field service insights to technicians and managers.

Integration with customer-facing channels adds the additional benefit of giving customers direct access to the most timely possible information on their utility service. For instance, integrating the outage analytics portal with the website and social media channels can help quickly publicize outage updates.
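
A minimal sketch of that push, assuming a generic JSON endpoint: the function below posts an outage summary assembled from the real-time analytics layer. The endpoint URL, payload fields, and values are hypothetical; an actual integration would target the utility's website CMS or social media APIs.

```python
# Hedged sketch: pushing an outage summary from the analytics layer to a
# customer-facing channel. The endpoint and payload shape are hypothetical.
import json
import urllib.request

def publish_outage_update(summary: dict, endpoint: str) -> int:
    """POST an outage summary as JSON and return the HTTP status code."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example payload assembled from real-time outage analytics (illustrative values).
update = {
    "region": "Northside",
    "active_outages": 7,
    "customers_affected": 2140,
    "estimated_restoration": "16:30 local time",
}
# publish_outage_update(update, "https://example.com/api/outage-updates")  # hypothetical endpoint
```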

Internal capabilities like dashboards and outage maps are readily adaptable to customer-facing use cases. Customer-facing outage maps, for instance, allow users to quickly see if they are encountering an isolated incident or if their outage is part of a larger regional issue. Interactive service dashboards allow users to get more detailed information on their own account and service area (such as whether the utility is aware of their outage). Proactively providing this information not only improves customer communication but can directly reduce costs by limiting the volume of customer support calls.

Clear and accurate communication can be just as important to customers as resolving outages quickly.

Prototypical Types of Critical Real-Time Data

Customer: utilities should have a quick method for examining crucial account details, service status, and more. Customer dashboards can also include a snapshot of critical and priority customers, with linked account details.

Detailed real-time data can help customer service representatives provide more detailed information while providing vital context when managing outages. Integration with customer communication channels could even allow updates to be sent from the same interface.

Crew: full transparency for key crew management variables like availability and on/off shift status makes it easy to quickly assign and prioritize service restoration work, monitor performance, and assess storm-readiness.

Concurrently, implementing relevant mobile dashboards for field technicians can help give them direct access to detailed outage information that might otherwise require a time-consuming call (or visit back to the office facility).

ETR: estimated restoration times (ETRs) can be viewed through multiple lenses, like crew- vs. system-generated times, impending ETRs, and critical-customer ETRs.

Outage: provide the most crucial information for planning a response in the wake of an outage event, like projected equipment failures and customers affected.

Asset: performance data from physical equipment assets supports more precise predictive maintenance strategies and enhanced awareness of the health of a utility’s critical infrastructure.

Field Service: deeper field service analytics can help do everything from optimizing technician travel routes to forecasting lulls and spikes in work demands.

AMI: review key data from advanced metering infrastructure (AMI) to understand costs and customer usage patterns more precisely.

A Flexible Hosting Model Enables Scalable Utility Analytics While Limiting Unnecessary Service Costs

The cloud offers a new degree of flexibility and scaling for a utility’s analytics platform. This flexibility is extremely valuable because utilities can typically expect large variations in the size of their analytics user base. On a given day during normal operations, 1,000 users might make use of the analytics platform. During periods of heightened outage risk (like a storm for an electric utility), this usage might rapidly balloon to many thousands of users. With the cloud options available today, utilities can seek out a solution that offers highly elastic, on-demand scaling. This elasticity helps ensure that analytics are ready for urgent scenarios like outage response without paying for peak capacity around the clock.

For some utilities, however, security, regulatory, or other concerns require on-premises data storage. HEXstream’s experience suggests that a hybrid model is an effective way to thread the needle, enabling scalable analytics while keeping sensitive data onsite. In this model, analytics are based in the cloud, pulling internal data from sources kept behind an onsite firewall.


Learn More About How HEXstream Can Help Implement Real-Time Analytics Tailored to Your Utility’s Business Requirements

HEXstream has delivered utility analytics solutions for some of the largest utilities in North America. Our solutions include real-time data capabilities, the interactive features needed to get the most out of them, and the underlying architecture needed to keep those capabilities reliably fueled with real-time data.

Our work is rooted in deep technical knowledge of implementing enterprise-scale data pipelines and extensive ground-level experience using utility data to address challenges unique to this industry. 

If you’re interested in learning more about how HEXstream can help your organization transform its data into real-time business intelligence, you can reach out to our team using the form below.