Visibility in the Data Center: Showing Some Love for Operational Technology

Recently, our CEO Craig Compiano sat down with Mike Andrea, the CEO of Oper8 Global, our key partner in Australia, for an insightful webinar. The conversation covered a wide range of hot topics faced by today’s data center operators. In this series, we’d like to highlight some of the conversations we thought would be most relevant. The link to the entire webinar can be found here.

It’s no secret that data centers do not just store and process a lot of data; they also generate a great deal of it themselves. This has led to a kind of data disparity between the different information systems used to monitor a data center’s health. Most data centers contain equipment from multiple vendors, in various makes and models, running various versions of firmware. They have a building management system (BMS) and an electrical power management system (EPMS), and each system tends to focus on a particular subset of equipment within the data center. Collecting and harmonizing the data across all those pieces of equipment is a challenge.

When you dive into most data centers today, they lack the visibility and the data to make informed decisions. Decision making is therefore more painful than it needs to be, because operators may still rely on spreadsheets, point solutions, or silos of data for each function within the data center.

For example, we don’t think there’s a big distinction between a power system and a cooling system failing without proper redundancy; frankly, it’s just a matter of how long before the facility goes down. Both deserve equal treatment for visibility and management. You need the ability to analyze both in real time, with automation and machine learning tools that can diagnose anomalies and flag potential failures or opportunities for preventive maintenance.
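To make the anomaly-detection idea concrete, here is a minimal sketch of one common approach: flagging a telemetry reading when it deviates sharply from its recent baseline (a rolling z-score). The function name, window size, and threshold are illustrative assumptions, not part of any particular DCIM product.

```python
# Minimal sketch: flag anomalous sensor readings with a rolling z-score.
# All names (readings, window, threshold) are illustrative.

from statistics import mean, stdev

def rolling_zscore_anomalies(readings, window=10, threshold=3.0):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the mean of the preceding `window` values."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu = mean(baseline)
        sigma = stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: a stable temperature trace (in Celsius) with one spike at index 15.
temps = [22.0, 22.1, 21.9, 22.0, 22.2, 21.8, 22.0, 22.1,
         21.9, 22.0, 22.1, 22.0, 21.9, 22.1, 22.0, 30.0]
print(rolling_zscore_anomalies(temps))  # → [15]
```

Production systems would, of course, use more robust models, but the point stands: this kind of analysis only works once the data from every system is collected in one place.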

The classic building management system or EPMS was not designed at inception to do that kind of analytics at scale. While these systems handle control functions well, they don’t analyze patterns and are not able to forecast well; their data structures don’t lend themselves to that. So, again, DCIM provides a value-add across both kinds of systems.

What DCIM does is bring together all that data in real time, normalized and standardized, so that people can easily ingest it and make use of it.
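As a rough illustration of what "normalized and standardized" means in practice, the sketch below maps two invented vendor-specific payload shapes into one common record format. The payload field names and the `normalize` helper are assumptions for illustration only; a real DCIM would integrate via protocols such as SNMP, Modbus, or BACnet.

```python
# Sketch: normalize heterogeneous telemetry into one common schema.
# Payload shapes and field names are invented for illustration.

def normalize(source, payload):
    """Map a vendor-specific reading to a common
    (device, metric, value, unit) record."""
    if source == "bms":   # e.g. cooling telemetry reported in Celsius
        return {"device": payload["unit_id"],
                "metric": "temperature",
                "value": payload["temp_c"],
                "unit": "C"}
    if source == "epms":  # e.g. power telemetry reported in kilowatts
        return {"device": payload["meter"],
                "metric": "power",
                "value": payload["kw"] * 1000.0,  # standardize on watts
                "unit": "W"}
    raise ValueError(f"unknown source: {source}")

records = [
    normalize("bms", {"unit_id": "crac-01", "temp_c": 22.5}),
    normalize("epms", {"meter": "pdu-07", "kw": 4.2}),
]
print(records)
```

Once every reading shares one schema and one set of units, dashboards, alerting, and analytics can treat power and cooling data uniformly.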

If you are looking for a next-generation DCIM solution that can help you better understand your data center’s status and efficiency opportunities, consider Modius® OpenData®. OpenData provides integrated tools to manage the assets and performance of colocation facilities, enterprise data centers, and critical infrastructure.

OpenData is a ready-to-deploy DCIM featuring an enterprise-class architecture that scales incredibly well. In addition, OpenData gives you real-time, normalized, actionable data accessible through a single sign-on and a single pane of glass.

We are passionate about helping clients run more profitable data centers and providing operators with the best possible view into a managed facility’s data. We have been delivering DCIM solutions since 2007. We are based in San Francisco and are proudly a Veteran Owned Small Business (VOSB Certified). You can reach us at 1-(888) 323.0066.
