
Next Generation Enterprise Architecture for Utilities

Like most other economic sectors, companies in the utilities sector are coping with an increasing amount of data. Such Big Data arrives with ever greater velocity, volume, variety and veracity. One reason is the exponentially growing number of data points retrieved from an increasingly smart grid, combined with smart appliances and sensors connected to a network, often referred to as the Internet of Things (IoT).

Companies need to evolve into data-driven organizations and undergo a profound digital transformation in order to improve their customers’ experience, develop new business models and streamline their operational processes. They have to manage a dense IT landscape and embed fact-based decision making at all levels of the organization, while retaining the ability to explore and develop new digital initiatives. Supporting such complex information flows and extracting more value from data requires a next generation Enterprise Architecture.

An enterprise architecture (EA) describes the organization and logic of data and information processes that support the company’s operating model. The enterprise architecture as described by The Open Group Architecture Framework (TOGAF) comprises the business, application and technology architectures, all three of which are highlighted in this article. Here we define a conceptual blueprint that describes such a structure at a relatively high level. In contrast to the past, a next generation EA is defined by at least the following criteria:

  • Ability to capture and process all kinds of Big Data
  • Ability to bring analytics (closer) to the data and not the other way around, reducing pressure on the network and thereby latency
  • Offering a supporting backbone for 3rd [1] platform technologies, such as cloud, social media, mobile and big data analytics
  • Allowing for experimentation (in so-called sand boxes) by data discovery and rapid development of proofs of concepts (POCs)

An architectural blueprint in the context of a utility company

A conceptual model of an example architecture is shown in Figure 1. When this model is applied in the context of a utility company, the most important elements can be explained and illustrated:

  • Data sources

In recent years, the available dataset of utility companies has grown substantially. In the past, it was largely limited to consumption data from meter readings and, later, meter transmissions. Today it extends to, among others:

  • Smart meter data and sensor data from the grid
  • Supervisory Control and Data Acquisition (SCADA systems)
  • Social media and web statistics of (potential) customers
  • Grid and asset outage management
  • Wholesale and financial market data
  • Asset tracking for consumers, power generation and electric grids (incl. GIS data)
  • Weather data


  • Data management

Data management as part of the firm’s existing operating model consists of three main components. Relational database management systems (RDBMS) and data warehouses have been established for years. To cope with the increased diversity of data flows, however, these data warehouses need to be supplemented with data reservoirs and data factories. Data reservoirs hold data without stringent requirements on formalization or modelling and allow for parallel processing. Typically, unstructured data such as social media posts, click data, photos and files can be gathered in data reservoirs. A well-known technology for capturing and processing these types of data is the Hadoop Distributed File System (HDFS), but other technologies such as Not Only SQL (NoSQL) databases are also gaining market share. The data factory represents the management and orchestration of data into and between the data discovery and development lab, as well as the provision of data to other systems for operational use.
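To make the data reservoir idea concrete, the sketch below stores raw, schema-less records as JSON lines, partitioned by source and ingestion date, a common layout on HDFS-style stores. The function name, directory layout and example records are illustrative assumptions, not part of any specific product; a real reservoir would sit on HDFS or object storage rather than a local folder.

```python
import json
import os
from datetime import date

def ingest_to_reservoir(records, base_dir="reservoir", source="social_media"):
    """Append raw records as JSON lines, partitioned by source and
    ingestion date. No upfront schema or modelling is imposed."""
    partition = os.path.join(base_dir, source, date.today().isoformat())
    os.makedirs(partition, exist_ok=True)
    path = os.path.join(partition, "part-0000.jsonl")
    with open(path, "a", encoding="utf-8") as f:
        for record in records:
            # Each record is stored as-is; structure is applied later,
            # at read time ("schema on read").
            f.write(json.dumps(record) + "\n")
    return path

# Heterogeneous records land in the same reservoir without remodelling:
path = ingest_to_reservoir([
    {"user": "u123", "post": "power outage in my street?", "likes": 4},
    {"page": "/tariffs", "clicks": 17},
])
```

The point of the sketch is the contrast with a warehouse: nothing is rejected for failing a schema check, and differently shaped records coexist in one partition.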


  • Fast data streams

Fast data comprises data flows that require ‘in-flight’ processing and pre-determined analytics to identify actionable events and next-best-actions that support fast decision making. An example is sensor data from grid components that flags unusual behavior as an indicator of grid failure. When such an event is identified, a chain of actions can be initiated automatically so that the maintenance department corrects the issue.
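A minimal sketch of such in-flight processing, assuming a stream of timestamped voltage readings from one grid component: each new value is compared against a rolling mean, and strong deviations are flagged as events. The window size and threshold are illustrative assumptions; a production pipeline would run this on a stream processor, not in a loop.

```python
from collections import deque

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from a rolling mean,
    as a stand-in for pre-determined 'in-flight' analytics."""
    recent = deque(maxlen=window)
    events = []
    for t, value in readings:
        if len(recent) == window:
            mean = sum(recent) / window
            if abs(value - mean) > threshold:
                # In a real pipeline this event would trigger a
                # next-best-action, e.g. a maintenance work order.
                events.append((t, value))
        recent.append(value)
    return events

# Voltage readings: the sudden dip stands out against the rolling mean.
stream = [(0, 230.1), (1, 229.8), (2, 230.3), (3, 230.0), (4, 229.9),
          (5, 221.5), (6, 230.2)]
print(detect_anomalies(stream))  # the dip at t=5 is flagged
```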


  • Data discovery and development lab

Besides using data sources for operational purposes, a company with a focus on innovation should include a ‘playground’ for new, data-driven initiatives and experimentation in its enterprise architecture. This playground, defined here as a data discovery and development lab, should offer data scientists and developers the opportunity to work on new POCs. The resulting insights, products and services can later be embedded into the operating model of the organization. For example, a utility company can create a POC for a new trading platform for peer-to-peer energy exchange and settlement, or develop a new algorithm for electricity consumption forecasting.
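The forecasting example could start in the lab as something as simple as the seasonal-naive baseline below: predict each hour of tomorrow from the same hour today. The function and the synthetic load profile are illustrative assumptions, not an actual utility model; the value of the lab is that such a baseline can be iterated on quickly before anything is embedded in operations.

```python
def forecast_consumption(history, horizon=24, season=24):
    """Seasonal-naive forecast: predict each future hour from the
    same hour one season (here: one 24-hour day) earlier. A simple
    baseline a data scientist might start a POC from."""
    if len(history) < season:
        raise ValueError("need at least one full season of history")
    return [history[-season + (h % season)] for h in range(horizon)]

# 48 hours of hourly load (MWh), higher during daytime (08:00-20:00);
# forecast the next day from the pattern of the last day.
load = [40 + 10 * (8 <= h % 24 <= 20) for h in range(48)]
prediction = forecast_consumption(load, horizon=24)
```

In a discovery setting, this baseline would then be benchmarked against richer models (weather-driven regressions, machine learning) before any POC is promoted.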


  • Analytics

The data processed through the operational chain of events can be used for business analytics. Business analytics mainly relies on tools for standard BI activities, such as reporting and creating visuals and dashboards. The data sourced for these activities is therefore offered in a more structured format than the raw data it originates from. For example, the dispatch department of a utility company can monitor its imbalance position on the intra-day market via a dashboard.
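The figure behind such a dashboard reduces to a simple, structured calculation. The sketch below, with assumed quarter-hourly values, computes a per-interval imbalance as metered delivery minus nominated volume; the sign convention (positive = long, negative = short) is an illustrative choice, not a market rule.

```python
def imbalance_position(forecast_mwh, actual_mwh):
    """Per-interval imbalance in MWh: actual minus forecast.
    Positive = long (surplus), negative = short (deficit)."""
    return [round(a - f, 3) for f, a in zip(forecast_mwh, actual_mwh)]

# Quarter-hourly nominations vs. metered delivery over one hour.
forecast = [12.0, 12.5, 13.0, 12.8]
actual   = [11.6, 12.9, 13.0, 12.1]
print(imbalance_position(forecast, actual))  # [-0.4, 0.4, 0.0, -0.7]
```

A BI tool would aggregate exactly this kind of structured series into the drill-down views the dispatch department monitors.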


Due to the size and multidimensionality of the data available in the company, the ability to interact with it quickly through visual drill-down capabilities and dashboards has become increasingly important. Over the past years, the technology to visualize data and create business reports and dashboards has rapidly improved, and new players (e.g. Qlikview, Tableau) have gained market share over the more traditional BI software suppliers. They stand out in terms of performance thanks to in-memory processing, and they reduce the dependency on the IT department for creating insightful reports and dashboards.


  • Results and discovery output

A feedback loop that feeds the results from the analytics and the data discovery lab back into the data sources allows for evaluation and continuous improvement of both the initiatives and the operation.


Figure 1. Conceptual model of an example of next generation Enterprise Architecture as an enabler for digital transformation


Since the rise of 3rd platform technologies, many firms need to review their Enterprise Architecture, utility companies included. New technologies such as Hadoop, NoSQL databases, and numerical and development platforms for R&D need to be integrated seamlessly into the system landscape. Although technology is merely an enabler rather than the driving force behind digital transformation, it can easily be overlooked. At Sia Partners we believe that most utility companies are lagging behind in these developments compared to other industries such as the financial or telecom sector. We therefore urge utilities to step up and show more leadership in this field. Companies at the forefront of such developments have proven to be more agile in adapting to changing market circumstances.


[1] The third computing platform is described in the literature as the successor of the mainframe computer system (first platform) and the client/server system (second platform)
