Utilities Benefit from Smarter IT Architectures

The leaders responsible for public utilities have a difficult job. As the physical infrastructure of the utility ages, they’re pressured to contain costs and operate sustainably. They make hard decisions every day about how to allocate limited human and financial resources in service of their missions.

Often, their challenge is compounded by an IT architecture that limits access to the data they need to make informed decisions. Their organizations simply lack the skill sets and tools to compile and analyze data in a meaningful way.

It’s one thing to know that several pipes need to be replaced and how much they’ll cost. It’s another to know which pipes are most likely to burst and when, as well as the potential costs if they’re not replaced. With this type of information, leaders are in a better position to prioritize and strategize.

What’s stopping them?

The Problem with Discrete Systems

Utilities already rely on SCADA and other monitoring systems, along with billing and customer feedback systems, to collect large amounts of data about the quality of their processes, the health of their infrastructure, and the operation and maintenance costs of their systems. That’s the good news.

The problem is that all that data sits in discrete systems, making it a challenge to compile efficiently and in ways that unlock opportunities for deep analysis.

When reports are necessary, whether for stakeholders or regulators, the data is often compiled manually. Given the time involved, the data extracted for those reports is a fraction of what’s available and is used only for limited purposes.

To address the problem in the past, utilities built entirely new IT architectures to stream disparate data sets into a data warehouse. Setup could cost millions of dollars and take months to deploy.

Turning to the Cloud for Big Data

The future of smart infrastructure is scalable analytic capability, and leveraging cloud technologies is key. Rather than requiring utilities to build new infrastructure, cloud platforms make it possible to integrate existing systems and draw disparate data streams into a common ‘data lake.’
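To make that concrete, here is a minimal sketch of a first ingestion step, assuming nightly CSV exports from an existing system and a Google Cloud Storage bucket serving as the data lake’s landing zone; the bucket, folder, and file names are hypothetical:

```python
# A minimal sketch: copy nightly CSV exports from an existing system
# into a cloud storage bucket that serves as the data lake.
# Assumes the google-cloud-storage package and credentials are set up;
# the bucket name and paths below are hypothetical.
from pathlib import Path

from google.cloud import storage


def load_to_data_lake(export_dir: str, bucket_name: str = "utility-data-lake") -> None:
    """Upload each CSV export into the lake, prefixed by its source system."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    for csv_file in Path(export_dir).glob("*.csv"):
        blob = bucket.blob(f"scada/raw/{csv_file.name}")
        blob.upload_from_filename(str(csv_file))


load_to_data_lake("/exports/scada/nightly")
```

The point of the design is that the SCADA, billing, and feedback systems keep running unchanged; only their exports are copied into one shared store where they can be analyzed together.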

By leveraging cloud technology to compile, store and analyze data, utilities can bring data-based decision-making to their organizations without needing to overhaul their entire systems. Utilities can start right away with pilot projects, then scale as needed, an approach that helps reduce operating costs while allowing processes to evolve. An open cloud environment also ensures that utilities aren’t locked into using specific vendors. They retain the flexibility to choose the most relevant, cost-effective solutions available.

With the right framework, another benefit is the ability to introduce data sets from public sources that are necessary for accurate forecasting. For example, utilities could introduce U.S. Geological Survey and National Weather Service data as they look for correlations and trends related to public health, air quality, environmental impact, and more.
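As a simple illustration, the sketch below joins a utility’s daily pressure readings with a National Weather Service temperature export by date; the file and column names are hypothetical, and the correlation is only a first-pass check, not a full forecasting model:

```python
# A minimal sketch: join utility readings with public weather data.
# Assumes both data sets have landed in the lake as daily CSVs;
# file and column names here are hypothetical.
import pandas as pd

pressure = pd.read_csv("pipe_pressure_daily.csv", parse_dates=["date"])
weather = pd.read_csv("nws_temperature_daily.csv", parse_dates=["date"])

# Align the two sources on date so trends can be compared side by side.
combined = pressure.merge(weather, on="date", how="inner")

# First-pass question: do pressure anomalies track freezing spells?
print(combined["pressure_psi"].corr(combined["min_temp_f"]))
```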

To take advantage of this data now, many utilities depend on cumbersome spreadsheets with manually entered data. Cloud technologies can streamline and automate these processes, allowing utilities to benefit from real-time analytics and geospatial overlay tools that help executives make informed decisions.
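For instance, a manual weekly spreadsheet might be replaced by a scheduled query against the lake. The sketch below assumes the data is queryable through BigQuery and that a table named `utility.pipe_pressure` exists; both are assumptions for illustration:

```python
# A minimal sketch: replace a manual spreadsheet step with a query
# over the data lake. Assumes google-cloud-bigquery is installed;
# the dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT pipe_id, AVG(pressure_psi) AS avg_psi
    FROM `utility.pipe_pressure`
    WHERE reading_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    GROUP BY pipe_id
    ORDER BY avg_psi DESC
"""

# Results arrive as a DataFrame ready for a dashboard or geospatial overlay.
report = client.query(sql).to_dataframe()
print(report.head())
```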

Join Google Cloud and AEEC for a complimentary webinar on June 18 to learn more about piloting, prototyping and scaling a data analytics solution for water and wastewater utilities.

