Why Bad Data Is Silently Draining Value from Commercial Real Estate Portfolios

The Data Quality Crisis in Commercial Real Estate
The Commercial Real Estate (CRE) industry is saturated with data but starved for insight.
Global data creation is doubling every two to three years, but the quality control surrounding that data has lagged. CRE companies now collect data from a multitude of operational systems: Property Management Systems tracking leasing and vacancies, Building Management Systems monitoring energy consumption, and financial platforms storing valuations and property specifications. Yet much of it remains inconsistent and poorly validated. Commercial developers are aware of the problem: 41% report dissatisfaction with data quality, according to Warwick Business School.
The Big Picture
This data quality issue is not confined to a single property sector. As AI becomes more deeply embedded and more sophisticated in daily operations, the quality of the data feeding those systems is struggling to keep up. When that data is flawed, the algorithms learn the wrong patterns, amplifying mistakes rather than correcting them. AI can accelerate CRE operations, but when the underlying data is unreliable, it undermines the efficiencies that AI promises to deliver.
Understanding Bad Data in Commercial Real Estate
What Is “Bad Data”?
Data, by nature, is neutral. “Bad data” refers to information that is inaccurate, incomplete, inconsistent, or duplicated. In short, it is data that is not fit for the purpose for which it was collected. Each form of bad data presents challenges and, ultimately, a cost to data-driven processes. Inaccurate data can inflate asset values, for example when outdated floor areas or rent rolls slip into a valuation model. Duplicate records are just as damaging: a ghost lease logged in a system can distort occupancy metrics and confuse reporting analysts. Incomplete data, such as a missing lease expiry date, hides upcoming vacancy risk. And inconsistent records carry their own cost: the same tenant entered under different names, or an inconsistently formatted status field that breaks dashboards, forces teams into manual reconciliation.
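The four failure modes above are simple enough to check for programmatically. The sketch below flags them in a list of lease records; the field names (`tenant`, `unit`, `status`, `lease_expiry`) are hypothetical, not taken from any specific CRE system.

```python
# Minimal sketch: flagging common bad-data patterns in lease records.
# Field names ("tenant", "unit", "status", "lease_expiry") are
# illustrative assumptions, not any real system's schema.

def flag_issues(records):
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        # Incomplete: a missing lease expiry hides upcoming vacancy risk.
        if not rec.get("lease_expiry"):
            issues.append((i, "missing lease_expiry"))
        # Inconsistent: the same status spelled differently breaks dashboards.
        status = (rec.get("status") or "").strip().lower()
        if status not in {"occupied", "vacant"}:
            issues.append((i, f"unrecognised status: {rec.get('status')!r}"))
        # Duplicate: the same tenant/unit pair logged twice (a "ghost lease").
        key = ((rec.get("tenant") or "").strip().lower(), rec.get("unit"))
        if key in seen:
            issues.append((i, "possible duplicate record"))
        seen.add(key)
    return issues

records = [
    {"tenant": "Acme Ltd", "unit": "A-101", "status": "Occupied",
     "lease_expiry": "2026-03-31"},
    {"tenant": "acme ltd", "unit": "A-101", "status": "OCCUPIED",
     "lease_expiry": "2026-03-31"},  # ghost lease: same tenant and unit
    {"tenant": "Beta Co", "unit": "B-201", "status": "vacant ",
     "lease_expiry": None},          # incomplete: no expiry date
]
for idx, problem in flag_issues(records):
    print(idx, problem)
```

Even a check this basic catches the duplicate and the missing expiry before they reach a dashboard; real validation pipelines apply the same idea at far greater scale.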
These errors break the flow of otherwise automated processes. Teams are forced to pause, check sources manually, and reconcile datasets that should already align. What should be instant access to insight becomes a chain of manual interventions to validate figures and rebuild dashboards.
Individually, data quality issues appear minor, so the cost is never dramatic; but it is persistent. Time spent checking is time lost analyzing and acting. That continual friction is what erodes the value of a portfolio.
How Bad Data Drains CRE Value
The CRE Value Drain
Improving data quality, unfortunately, has no single-source solution. CRE data flows through many disconnected systems, so fixing quality issues means tackling everything from inconsistent entry practices to fragmented data governance and the way data is integrated and consumed. Increasingly, AI-driven tools can help by automating data validation and flagging anomalies. That said, quality still begins at the source.
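Automated anomaly flagging need not be elaborate to add value. As an illustration, the sketch below uses a plain z-score test to surface a likely typo in reported rents; the threshold and the rent figures are assumptions for the example, and production tools would use far more sophisticated models on the same principle.

```python
# Minimal sketch of automated anomaly flagging at the point of entry.
# A simple z-score check stands in for more sophisticated AI-driven
# validation; the threshold and rent figures are illustrative only.
from statistics import mean, stdev

def flag_outliers(values, threshold=3.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Annual rent per sqm across comparable units; one entry is a likely typo.
rents = [310, 295, 305, 320, 300, 3050, 315]
print(flag_outliers(rents, threshold=2.0))  # → [5]
```

Flagging the suspect entry at index 5 at capture time is far cheaper than unwinding a valuation built on it months later, which is exactly why quality begins at the source.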
Operational Impact
The operational drain is ultimately a cost in time and resources spent making a usable dataset available. Colleagues spend hours cleaning, validating, and reconciling data, often across multiple systems, which delays tasks such as reporting. That delay translates into higher costs through lost productivity.
Strategic Impact
The strategic drain is where misguided decisions creep in. Inconsistent data distorts KPIs and forecasts. At the extreme, it opens the door to misallocated capital or overlooked risks.
Reputational Impact
This is the outward-facing drain. Errors in investor or appraisal reports erode confidence. Now more than ever, stakeholders expect transparency, and poor data governance can be read as an indicator of risk.
Across operations, strategy, and reputation, bad data compounds quietly, bleeding value long before it appears on a report.
What We Have Seen
Inconsistent data is a familiar frustration. Often, multiple systems record information about the same property, each with a different perspective in mind. Valuers, Property Managers, and Asset Managers all work in isolation, using their own referencing conventions to identify properties and the units that form them. This data, though about the same property, is locked within silos, creating a monumental task to link it together and gain a fuller picture of the portfolio being managed.
Solutions: Improving Data Quality in Commercial Real Estate
Improving Data and the Value It Creates
Recognizing the cost of bad data is only the first step to reclaiming value. Most improvements to poor-quality data are symptomatic fixes that keep projects moving, such as quickly amending a record so a calculation works as intended. Truly improving data quality requires a systemic strategy that addresses root causes. There are three levers to pull: governance, technology, and culture.
Data governance has long been the core of data management disciplines. It is the end-to-end practice of understanding your data, why it is collected, and how it is used. Technology typically follows governance: once you understand your data, the platforms used to store, integrate, and automate data processes come next, allowing you to tackle the growing data challenges the CRE industry faces. Finally, culture adds the “human layer”: data literacy and, ultimately, ownership of the data being used. Pulled together, these three levers improve data quality and, with it, efficiency, decision-making, and stakeholder confidence.
The silent drain of poor-quality data can be reversed. It is no small task, but the rewards far outweigh the effort of improving the quality of your data for all parties involved. Ultimately, in an industry where data underpins the AI wave of real estate, the foundational question is not how much data you have, but rather, how good it is.
Frequently Asked Questions About Data Quality in Commercial Real Estate
What is bad data in commercial real estate?
Bad data refers to information that’s inaccurate, incomplete, inconsistent, or duplicated. In CRE, this includes outdated floor areas, missing lease expiry dates, duplicate tenant records, or inconsistently formatted property information that undermines decision-making and operational efficiency.
How does bad data impact CRE portfolio performance?
Bad data creates three types of drains: operational (wasted time on manual data cleaning and reconciliation), strategic (misguided decisions from distorted KPIs and forecasts), and reputational (eroded stakeholder confidence from errors in reports).
Why is data quality particularly important for AI in commercial real estate?
AI algorithms learn patterns from the data they’re fed. When underlying data is flawed, AI amplifies those mistakes rather than correcting them, undermining the efficiency gains that AI promises to deliver to CRE operations.
What are the most common data quality issues in commercial real estate?
Common issues include inaccurate asset valuations from outdated information, duplicate records distorting occupancy metrics, incomplete lease data hiding vacancy risks, and inconsistent data entry across multiple systems creating reconciliation challenges.
How can commercial real estate firms improve data quality?
Improving data quality requires addressing three key areas: governance (understanding what data is collected and why), technology (implementing platforms to store, integrate, and automate data processes), and culture (building data literacy and ownership across teams).
What role does AI play in improving CRE data quality?
AI-driven tools can help automate data validation and flag anomalies, making it easier to identify and correct data quality issues. However, quality still begins at the source with proper data governance and entry practices.
Why do CRE companies struggle with data silos?
Multiple systems record information about the same property from different perspectives. Valuers, Property Managers, and Asset Managers work in isolation using their own referencing conventions, locking data within silos and making it difficult to gain a comprehensive portfolio view.
What percentage of commercial developers are dissatisfied with their data quality?
According to Warwick Business School research, 41% of commercial developers report dissatisfaction with data quality, highlighting the widespread nature of this challenge across the industry.
References
McCausland, T. (2021). The Bad Data Problem. Research-Technology Management, 64(1), 68–71. Taylor & Francis Online.
Secoda. (2025). What Is Bad Data? Retrieved from https://www.secoda.co/glossary/bad-data
Warwick Business School. (2025). Proptech Revolution Hampered by Lack of High Quality Data. Warwick Business School.
