C+S May 2021 Vol. 7 Issue 5 (web)

around for about two decades. Yet, within the horizontal infrastructure design space, BIM is not as widely adopted as one would expect. Here I discuss three potential avenues to increase BIM adoption.

Connect Engineering and Information Modeling Education

Civil Engineering education encompasses a wide variety of topics, ranging from understanding biological processes in environmental engineering to understanding the tensile strength of steel in structural engineering. Most of this education, however, is theoretically oriented. That theory is crucial; without it there is no science. But there is also a need to implement and connect Information Modeling and Engineering at the university level.

Big data is a snazzy term these days, particularly in the software realm. However, let us pause for a moment to think about it. Civil Engineering has dealt with monumental amounts of data to design and construct most of the current infrastructure for hundreds of years. Imagine the data processed by the engineers who built the magnificent US Interstate Highway System. In that sense, Civil Engineers were pioneers in data management.

A cursory survey of contemporary syllabi from engineering colleges across the U.S. shows minimal education on information modeling and data management. With increased digitization, data management has become a discipline in itself. Some engineers and technicians become proficient through real-world experience, acquiring this knowledge through the trials and tribulations of project delivery. Often, though, the knowledge gained lacks an adequate theoretical grounding in data management.

BIM, as both a process and an end product, is essentially the interconnection of various data. To successfully adopt and implement BIM in horizontal design, one needs to understand how data are created and consumed. In other words, engineers and technicians should become proficient in Information Modeling.
Educational institutions, practicing engineers, and professional associations should lay the groundwork for an improved understanding of data-driven design and construction. This does not merely mean showing students how to use software tools such as Revit, Civil 3D, or OpenRoads Designer. Collectively, we must move beyond the understanding of tools and encourage students to connect the theories they learn with data representing the built environment. Not all data are helpful. Engineering education paired with information modeling will give students opportunities to process data, and they will undoubtedly learn how to use modern tools along the way. Creating, managing, and processing data will become second nature to students; data will become like words, and BIM tools will be akin to word processors. Combining Information Modeling education with engineering training would leverage our resources by putting infrastructure data to work.

Expand Standardization of Data Exchange Formats

Data is often said to have overtaken oil as the most valuable commodity, and businesses and society at large increasingly see it as such. Classical economics informs us that when a commodity's supply is greater than the demand, the resulting surplus drives down that commodity's price. It follows that most businesses delivering data-centric software would closely guard their data. In contrast to other commodities, however, when data are exchangeable, their value increases: data grow more valuable the more they are shared. The data generated by me punching my keyboard pass through multiple layers before being displayed on your reading device, and they get their value only because of the standardization of data exchange. The standardization of the QWERTY keyboard (read: typewriter) in the 1870s did not stifle creativity. On the contrary, it led to the proliferation of other technological enhancements.

As an industry, we currently use LandXML and GeoJSON to exchange data and collaborate between different platforms. Key software vendors increasingly recognize and adopt Industry Foundation Classes (IFC) as a platform-agnostic descriptive data model. Industry standards such as LandXML and GeoJSON are just the tip of the iceberg in the standardization of civil infrastructure digital data.

If we look back into the not-too-distant past, there are parallels with the banking industry. The financial sector in the 1970s and 1980s was undergoing seismic shifts with the adoption of computer technologies, and with the adoption of the internet in the late 1990s these shifts gathered unstoppable momentum. At the onset, however, data were not as readily interoperable between different banks as they are today. Consider this as a thought experiment: you decide to move your money to another bank, and you then lose access to your money because the data are not interoperable between the two banks. The inaccessibility of your money in this thought experiment is unfathomable.
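To make the idea of a platform-agnostic exchange format concrete, here is a minimal sketch that parses a GeoJSON feature with Python's standard library. The point name, coordinates, and elevation are invented for illustration; the value of the format is that any tool implementing the GeoJSON standard (RFC 7946) can consume the same text, regardless of which software produced it.

```python
import json

# A minimal GeoJSON Feature describing a hypothetical survey control point.
# GeoJSON is one of the open exchange formats mentioned above: the producing
# and consuming software only need to agree on the standard, not on each
# other's internal data models.
feature_text = """
{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [-77.0365, 38.8977]},
  "properties": {"name": "Control Point A", "elevation_m": 23.5}
}
"""

feature = json.loads(feature_text)

# Longitude comes first in GeoJSON coordinate pairs, then latitude.
lon, lat = feature["geometry"]["coordinates"]
name = feature["properties"]["name"]

print(name, lon, lat)
```

Because the structure is defined by an open specification rather than a vendor, the same file could round-trip between a survey package, a GIS platform, and a design tool without loss of meaning.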
As an industry, we should start expecting interoperability of our design data, just as we expect interoperability of our financial data. The financial sector did not standardize data exchange without external pressure. Today, the Open Financial Exchange (OFX) is an open standard for exchanging financial data and performing financial transactions between financial institutions and applications. OFX started in 1997 and has continually evolved; the financial industry developed and improved the standard because of demands from various stakeholders, including consumers.

Engineering design firms, academics, and governmental agencies should demand from software vendors increased open standards for data exchange between different software platforms. Ultimately, the creators and consumers of data should be able to move data seamlessly between platforms. What is the benefit for software vendors, one might ask? Supporting a comprehensive data exchange standard will increase software adoption, which will increase the vendors' bottom line; here again there are parallels with the financial industry. Significantly, an open data standard increases the reliability of, and confidence in, the data.


