Every facilities manager is acutely aware of the power of data. They are surrounded by it: submeter dashboards, BMS trending and alarms, work orders with comfort complaints, asset management systems, even automated fault detection systems. Yet more often than not, instead of yielding clear, actionable knowledge, the path forward is obscured by the noise of hundreds of unacknowledged BMS alarms, long lists of maintenance and retrofit projects, piles of site audits, and commissioning reports collecting dust on shelves. To add to the problem, variable subcontractor quality and the lack of valid, low-cost measurement and verification (M&V) capability cloud issue resolution, preventing the accountability that justifies future investment in system and equipment improvement. The problem is that data is not enough. Data needs to drive action, whether that is prioritizing issues or justifying investment in a retrofit opportunity. And that action depends on a host of variables: Is the fault visible in an important public area, where it could damage the organization's reputation? Is the vice chancellor in that office? How loudly has that department been complaining? Are you going to replace that asset in a year anyway? There are hundreds of quantitative and qualitative factors, along with deep-seated industry barriers, that must be considered when applying any technology or analytics to a problem in the building operations space.
Shiny technology alone is not the answer
Over the past five years, I have seen a rapid expansion of building software and services offering every buzzword the software industry has: “big data”, “machine learning”, “smart data analytics”, “IoT”. First and foremost, this is thrilling. The building and construction industry has been slow to embrace these technologies, and it feels like the beginning of a revolution against the conservative, monopoly-driven market of the past 20 years. The building industry is finally entering the 21st century!
Unfortunately, I have seen, time and time again, companies without extensive on-the-ground industry experience believing that technology is enough: that somehow applying the newest technology, providing ever more data and a better UI, will overcome the deep-seated barriers plaguing the building and construction industry. The result is slick dashboards and advanced analytics that cannot provide clear answers because they do not address systemic data quality problems. One recent client complained about hundreds of alarms every day, so many, in fact, that they were all ignored. When we offered a better dashboard to manage the alarms, their response was, “No. You manage them and prioritize them. We’ll dispatch them and catch anything that wasn’t important enough to ticket at the end of the month.” It was not about a better UI or better analytics; it was about changing their process and understanding their resource limitations and priorities. These solutions are all small pieces of the puzzle, but without understanding these barriers they cannot solve the core issues driving the industry.
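To make the process change concrete, here is a minimal sketch of the kind of triage logic that engagement called for: score each alarm, ticket only what the team can realistically dispatch, and batch the remainder for monthly review rather than ignoring it. The alarm fields and scoring weights are hypothetical illustrations, not BuildingIQ's actual rules.

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    point: str           # BMS point name
    severity: int        # 1 (low) to 3 (high); hypothetical scale
    comfort_impact: bool # does it affect an occupied space?
    repeat_count: int    # times raised this week

def triage(alarms, ticket_budget):
    """Rank alarms and ticket only the top few; the rest go into a
    monthly batch review instead of being silently ignored."""
    def score(a):
        # Illustrative weighting: severity, occupant impact, persistence.
        return a.severity * 2 + (3 if a.comfort_impact else 0) + min(a.repeat_count, 5)
    ranked = sorted(alarms, key=score, reverse=True)
    return ranked[:ticket_budget], ranked[ticket_budget:]
```

The point is not the particular weights; it is that the prioritization reflects the client's resource limits (the `ticket_budget`) rather than presenting every alarm as equally urgent.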
This statement may be surprising coming from a company that has sold optimization software for the past six years. However, during this time, we found that, more often than not, issues are rooted in people and processes, not data and analytics. This knowledge drove our development of the 5i Platform to support clients throughout the process. If any part of the picture is missing, from a robust process to reliable data and fully trained and supported people, then the data cannot drive sustained, impactful action.
It also doesn’t help that the world of building operations, and the fundamental definitions of those critical components of process, data, and people, have changed drastically in the past 20 years. Instead of fixing mechanical systems like valves and boilers, facilities technicians now have to be programmers, able to understand and manage complex sequences and control system interactions. I have personally seen large portfolios so desperate for skilled HVAC technicians that they have created their own “super tech” programs at local technical schools to feed their apprentice programs. Often, hiring constraints force a hard choice between the technician who can do mechanical work and the one who understands networks. This shortage of skilled labor, and the difficulty of retaining it, drives resource constraints and quality-of-work issues that trickle through the entire system in many forms: too few technicians with limited experience monitoring too many buildings from the head end, leading to reactive firefighting and an inability to troubleshoot more complex issues.
These quality-of-work issues, from initial installation by subcontractors and commissioning through operation, create poor data quality, which in turn leads to manual “quick fixes”. The need for an owner to be informed and involved enough to ensure high-quality work is greater than ever. In fact, most of the submeter systems for large portfolios I have used have so many data quality issues, including zero values, unrealistic readings, and meters reading completely different zones or equipment than indicated, all stemming from a lack of proper point-to-point verification and from operational issues, that the data has been almost unusable for deeper analytics. Even with these advanced technologies, or perhaps because of them, building and portfolio owners need to manage their data, people, and processes, and the relationships between them, more holistically than ever before.
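The failure modes named above can be caught automatically before they poison deeper analytics. The sketch below screens submeter readings for stuck-at-zero values and physically implausible magnitudes; the thresholds are placeholder assumptions for illustration, since real bounds would come from each meter's rated capacity and the point-to-point verification record.

```python
def screen_readings(readings, lo=0.1, hi=500.0):
    """Flag submeter readings that are likely data-quality faults.

    readings: iterable of (timestamp, kW) tuples.
    lo/hi: illustrative plausibility bounds in kW, not a standard;
    in practice they should be derived from the meter's capacity.
    """
    flags = []
    for ts, kw in readings:
        if kw == 0:
            flags.append((ts, kw, "zero value"))
        elif not (lo <= kw <= hi):
            flags.append((ts, kw, "out of plausible range"))
    return flags
```

A screen like this cannot catch a meter wired to the wrong zone, which is why the article stresses point-to-point verification as well, but it keeps obviously bad values out of downstream analytics.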
Balancing industry barriers with the power of data analytics
Don’t get me wrong, technical solutions will form the core building blocks enabling buildings to overcome these barriers. They just have to be built on that foundation of people and process. In a world where buildings are becoming exponentially more complex, occupants are demanding more comfortable, intuitive, and beautiful spaces, and facilities teams are being continually squeezed by budget cuts and growing maintenance backlogs, the industry needs technology to become faster and smarter.
The market for net-zero buildings alone is projected to grow to $1.4 trillion by 2035, driven by state requirements like CA Title 24 and organizational carbon commitments. These buildings must operate in an infinitely more complex world of data, weather and occupancy prediction, grid interactivity, and system integration. Recent studies have shown that 75 percent of high-performance buildings are not meeting their energy goals and are in fact using two to five times more energy than their models predicted. One study found that of 28 high-performance buildings, none of those that integrated multiple low-carbon technologies met their design targets. Meanwhile, these complex, integrated systems are continually driven by the competing goals of enhanced occupant comfort and reduced energy use. Moving to distributed systems means owners can no longer rely on a simple, oversized AHU-based system that can be turned to “11” to compensate for poor design and integration. Without that factor of safety, controls and system interactions cannot tolerate bad sequences or sloppy installation, and every fault directly affects occupants. Building systems must “work” more than ever before.
To add to this complexity, buildings are becoming a critical resource for modern grids, incorporating EVs, battery storage, and load-shedding capabilities in response to complex time-of-use and demand-based rate structures designed to manage distributed renewables. These highly complex buildings can no longer be operated through simple, rules-based control strategies and manual tuning. The industry needs machine learning approaches to balance these multi-variable optimizations, paired with data analytics that translate the massive volume of data in these disparate systems into actionable insights.
All of these factors are creating an arena where the rapidly expanding technical needs of buildings are clashing with the traditional barriers and market structures. Emerging tech companies and building owners must work together to bridge this gap and apply technology within the context of the resource constrained and highly risk-averse buildings industry.
BuildingIQ’s approach with Outcome-based Fault Detection
This focus on people and process is why we, at BuildingIQ, designed a fault detection system that combines the data from smarter, machine-learning analytics and M&V with the judgment and experience of our facilities engineers. This combination of advanced quantitative and deep qualitative analysis is supported by a holistic process that drives action from initial anomaly identification through resolution and verification of energy savings. This end-to-end process reduces the risk of hand-off issues and ensures no critical issues are lost in the noise.
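The end-to-end flow described here, statistical anomaly flagging feeding engineer review and then savings verification, might be sketched as follows. The z-score detector and the unnormalized before/after comparison are deliberately simplified stand-ins, not the actual OFD implementation: real M&V would normalize for weather and occupancy.

```python
import statistics

def flag_anomalies(daily_kwh, z_thresh=3.0):
    """Stage 1 (illustrative): flag days whose consumption deviates
    strongly from the baseline. A simple z-score stands in for the
    machine-learning analytics; flagged days go to an engineer's
    review queue rather than straight to a ticket."""
    mu = statistics.mean(daily_kwh)
    sd = statistics.pstdev(daily_kwh) or 1.0
    return [i for i, kwh in enumerate(daily_kwh) if abs(kwh - mu) / sd > z_thresh]

def verify_savings(baseline_kwh, post_fix_kwh):
    """Stage 3 (illustrative): a crude M&V check comparing average
    consumption before and after resolution, to close the loop on
    whether the fix actually delivered savings."""
    return statistics.mean(baseline_kwh) - statistics.mean(post_fix_kwh)
```

The middle stage, engineer review and client prioritization, is deliberately not code: it is the human judgment the article argues cannot be automated away.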
We use our software to track issue status in addition to regular one-on-one conversations between the client and our facility engineers to support troubleshooting and internal prioritization and justification. These regular conversations also allow us to further explore the client’s priorities and barriers to continually refine the support we can provide.
To achieve this, Outcome-based Fault Detection (OFD) will:
What does the future hold?
The beauty of BuildingIQ’s cloud-based architecture is our ability to constantly develop new, more powerful analytics and services to support our clients’ operations and facilities teams. As the buildings industry shifts to an IoT approach and complexity continues to grow, having a single integrated platform that incorporates this incoming information will be extremely powerful. Bits and pieces exist today, but when we begin to integrate asset data, lighting controls, occupant comfort feedback, storage and renewables systems, knowledge management platforms, and occupancy data, the deluge of today’s data will be dwarfed by a tsunami.
Our response is to actively develop machine learning approaches that look for correlations across these data sets to predictively identify faults before they happen, simplify the integration of disparate data sources to inform ROI, persistence, and asset planning, and quantify and incorporate intangibles like reputation and risk. However, each of these is a means to the end of driving action. They will be applied within the context of our integrated approach, supported by our human capital and end-to-end processes.
Chris McClurg is Product Manager of Services at BuildingIQ and a mechanical engineer focused on energy efficiency in large portfolios and net-zero developments. Chris has worked on deep retrofits, integrated design, integrated project delivery, and buildings as a grid asset. She is a licensed PE and holds CEM and LEED AP credentials.