For almost forty years, when contemplating the strategic benefits of computer automation across a business enterprise, we have been guided by a simple acronym: GIGO, Garbage In, Garbage Out. As every aspect of business races toward AI applications, it is even more important to recognize that the most critical element of a successful implementation is the quality of the underlying data.
The sheer volume of information across the globe grows exponentially. AI is already transforming research, medical diagnostics and predictive analytics by reviewing terabytes of data and documents in seconds. As businesses apply AI to their own specific use cases, however, constructing a reliable data set can become the larger challenge.
Using corporate real estate as an example, much of the information is proprietary, unstructured or highly variable across regions and jurisdictions. Without critical mass, analysis and predictive modeling are inherently limited by the quality and completeness of the data. The best approach to solidifying data quality on any project is to work from the inside out.
Within every corporate portfolio there is a broad range of detailed data associated with each property. Unfortunately, because the PropTech industry is fragmented, many companies deploy siloed solutions for Facilities Management, Lease Administration, Procurement, Building Automation, Sustainability and Financial Management. There is simply no “one size fits all” solution for these different applications. The common thread, however, is the data.
Within each silo there is critical data that could be vital to other functional areas, yet the controlling department often develops and maintains only the information related to its own function. As a result, different groups end up maintaining and relying on data from multiple, inconsistent sources. Taking an enterprise-wide, holistic approach to vital data requires a higher-level, cross-functional view.
Regardless of the application, the elements of data quality are rooted in a consistent process:
• Engage cross-functional stakeholders in defining critical data needs.
• Design as much structured data as possible rather than assembling bundles of free text (see the sketch following this list).
• Enforce discipline and rigorous quality review in all elements of data capture.
• Utilize technology wherever possible to capture data electronically and reduce opportunities for manual error.
• Maintain a consistent and auditable process for timely and accurate data capture and ongoing maintenance.
• Incorporate regular data quality and gap analysis reviews on all systems.
• Build stakeholder engagement by delivering useful cross-system reports and analytics.
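As a minimal illustration of the structured-data and gap-analysis points above, consider how a single structured lease record and a simple completeness report might look in practice. The field names, sample figures and Python sketch below are illustrative assumptions only, not a prescribed schema or system of record.

```python
from dataclasses import dataclass, fields
from datetime import date
from typing import Optional

# Illustrative structured lease record; field names are assumptions,
# not a prescribed schema.
@dataclass
class LeaseRecord:
    property_id: str
    region: Optional[str] = None
    lease_start: Optional[date] = None
    lease_end: Optional[date] = None
    annual_rent: Optional[float] = None

def gap_report(records: list[LeaseRecord]) -> dict[str, int]:
    """Count missing values per field across the portfolio."""
    gaps = {f.name: 0 for f in fields(LeaseRecord)}
    for rec in records:
        for f in fields(LeaseRecord):
            if getattr(rec, f.name) is None:
                gaps[f.name] += 1
    return gaps

# Hypothetical portfolio with deliberate gaps
portfolio = [
    LeaseRecord("PA-001", region="Mid-Atlantic", lease_start=date(2022, 1, 1)),
    LeaseRecord("NJ-014", annual_rent=185000.0),
]
print(gap_report(portfolio))
# {'property_id': 0, 'region': 1, 'lease_start': 1, 'lease_end': 2, 'annual_rent': 1}
```

Capturing attributes as discrete, typed fields rather than free text is what makes a report like this possible; the same structure is what downstream AI tools need in order to query and compare properties reliably.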
If structured correctly, AI technology can provide significant insights and benefits: greater visibility across the organization, improved decision-making and meaningful improvements in overall cost management. The formation of a reliable and accurate data set, along with the construction of valuable queries, dashboards and reports, can be accelerated with an experienced data partner. Jackson Cross Partners has an exceptionally skilled, cross-functional team to assist in the journey.
For more information on how to improve your property information, contact Zach Forrest or Brendan Quinn.