Closing the Loop

In this case, the DDMS starts by helping the owner or facility management build a well-defined data requirement (using all of the semantic analysis and Artificial Intelligence (AI) recommendation functionalities described in the previous article). In this way, the facility owner understands systematically what data to expect from upstream.

After the data dictionary has been developed, the DDMS further refines the data requirement, assembles the data dictionary into data requirement (parameter) packages, and shares them with each corresponding stage (design, construction, commissioning, etc.). Stakeholders in each delivery stage are informed about the types of data they are expected to collect or populate, avoiding the expensive task of collecting and coordinating asset data after the building has been occupied.

Lastly, the DDMS needs to be capable of analyzing submitted data against the hosted data dictionary to perform a compliance check. This process tests not only data completeness but also data validity and uniqueness based on each attribute/parameter requirement. The DDMS also performs data analytics and reporting on the analyzed data so that the data or BIM manager can refine their data collection.

In short, the DDMS helps clients determine what needs to be collected, as well as when and by whom. The entire process is centered around the defined data dictionary to eliminate data miscoordination. Figure 3 illustrates the loop of data collection and validation driven by the DDMS.

The gaps described above vary and occur throughout project delivery. Data glitches here and there add up to the failure of the entire data management plan, regardless of how good it was originally.
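The compliance check described above can be sketched in a few lines. The following is a minimal, hypothetical illustration, not any vendor's actual DDMS: each submitted asset record is tested against a data dictionary for completeness (required attributes present), validity (values match an expected type), and uniqueness (no duplicate values for attributes flagged unique). The dictionary layout and names such as `check_compliance` and `AssetTag` are assumptions for illustration.

```python
# Hypothetical data dictionary: each attribute carries its own
# completeness, validity, and uniqueness rules.
DATA_DICTIONARY = {
    "AssetTag":     {"required": True,  "type": str, "unique": True},
    "Manufacturer": {"required": True,  "type": str, "unique": False},
    "InstallDate":  {"required": False, "type": str, "unique": False},
}

def check_compliance(records):
    """Return a list of (record index, attribute, issue) tuples."""
    issues = []
    seen = {name: set() for name, rule in DATA_DICTIONARY.items() if rule["unique"]}
    for i, rec in enumerate(records):
        for name, rule in DATA_DICTIONARY.items():
            value = rec.get(name)
            if value is None:
                # Completeness: required attribute must be populated.
                if rule["required"]:
                    issues.append((i, name, "missing required attribute"))
                continue
            # Validity: value must match the expected type.
            if not isinstance(value, rule["type"]):
                issues.append((i, name, "invalid type"))
                continue
            # Uniqueness: flagged attributes must not repeat.
            if rule["unique"]:
                if value in seen[name]:
                    issues.append((i, name, "duplicate value"))
                seen[name].add(value)
    return issues

records = [
    {"AssetTag": "AHU-01", "Manufacturer": "Acme"},
    {"AssetTag": "AHU-01", "Manufacturer": "Acme"},  # duplicate tag
    {"Manufacturer": "Acme"},                        # missing tag
]
print(check_compliance(records))
```

A real DDMS would layer analytics and reporting on top of such a result set, but the core idea is the same: every rule lives in the dictionary, so data submissions from every delivery stage are judged against a single source of truth.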
The combination of a well-defined data dictionary, a solid business process that ensures the successful implementation of the dictionary, and effective data management tools can help clients overcome these gaps and eventually get on the right track to close the loop.

The Call of Digital Twins

The Phrase Speaks for Itself

The concept of a Digital Twin has been around since 2002 and has quickly spread into manufacturing, healthcare, automotive, aerospace, and other industries. The basic definition of a Digital Twin is a digital replica of a physical entity that connects the physical and virtual worlds to enable data synchronization between them. In other words, a Digital Twin is a living model that reflects and keeps up with its real physical counterpart, including not only the representative geometry but also current conditions and data.

In the Architecture, Engineering, Construction, Owner/Operator (AECO) industry, a Digital Twin does not require a specific level of development, detailed modeling requirements, or particular technology/process utilizations. From the owner's view, what they receive at the end of the delivery phase is a digital replica of what has been built onsite, which, most of the time, includes everything the owner wants to know. Once a project has gone through the delivery phase, onsite commissioning, and field/asset data collection, population, and validation, a Digital Twin can be developed and handed over to the owner. The Digital Twin will fully connect and sync with the owner's computerized maintenance management system (CMMS), updating itself
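The "living model" idea above can be sketched as a twin record that mirrors its physical asset by applying updates pushed from a CMMS or field sensors. This is a minimal illustration under assumed names (`AssetTwin`, `apply_update`), not a real twin platform's API.

```python
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """A digital replica that keeps pace with its physical counterpart."""
    asset_tag: str
    attributes: dict = field(default_factory=dict)

    def apply_update(self, source, changes):
        # Merge the latest readings so the twin stays synchronized,
        # and note where the update came from (e.g., CMMS, field sensor).
        self.attributes.update(changes)
        self.attributes["last_source"] = source

twin = AssetTwin("AHU-01", {"status": "running", "supply_temp_c": 13.0})
twin.apply_update("CMMS", {"status": "maintenance"})
print(twin.attributes["status"])  # maintenance
```

The same update path works in both directions in a full twin: condition data flows from the building into the model, and validated asset data flows from the model into the CMMS.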
Figure 1: BIM Lifecycle Implementation needs to be regulated by a proper business process
Therefore, we need to elevate the topic from the data or technology application level to the business process level, which includes but is not limited to the BIM Standard, BIM Execution Plan, Contractual Language, and Quality Assurance/Quality Control (QA/QC) Compliance Validation. This is where a DDMS can help bridge all data gaps. In the following content, we will look closely at a data gap through real-world scenarios and describe how a DDMS can benefit the project delivery and operational loop.

Data Gaps

Data gaps are the most common problem in BIM lifecycle implementation, yet they remain difficult to resolve. A data management plan can break down on just one or more data gaps occurring during project delivery and operations. One example is the miscoordination of data collection and validation between the submitted BIM model and onsite data collection. In an ideal scenario, data from these two channels should complement each other in a synchronized way. However, this rarely occurs as planned. Figure 2 illustrates ideal and real scenarios of asset data collection for facility management purposes. During the project delivery phase, data is collected from all channels (onsite, cut sheets, manuals), but some of it is not required by facility management. Moreover, what facility management really cares about is missed. Possible root causes are listed below:
• There is no concrete FM-oriented data requirement.
• If there is a solid data requirement, it is not included in the Integrated Project Delivery (IPD).
• If the requirement is included in the IPD, there is a lack of a QA/QC session to regulate the data submission.
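The ideal-versus-real mismatch described above is, at its core, a set-difference problem: parameters FM needs but nobody collected, and parameters collected but never required. A minimal sketch, with hypothetical parameter names chosen only for illustration:

```python
def data_gaps(fm_required, collected):
    """Compare an FM-oriented data requirement against collected parameters."""
    missing = fm_required - collected   # FM cares about it, but it was missed
    surplus = collected - fm_required   # collected, but not required by FM
    return missing, surplus

# Hypothetical example: WarrantyEnd matters to FM but was never captured,
# while PaintColor was captured but FM never asked for it.
fm_required = {"AssetTag", "WarrantyEnd", "SerialNumber"}
collected   = {"AssetTag", "PaintColor", "SerialNumber"}
print(data_gaps(fm_required, collected))  # ({'WarrantyEnd'}, {'PaintColor'})
```

Running this comparison at each QA/QC session, rather than after occupancy, is exactly the kind of check the bullet points above call for.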
Figure 2: Ideal scenario vs. real scenario
csengineermag.com
november 2020