Users create data and store it on a local disk, usually in proprietary DWG or RVT formats. These master files are used to create hundreds of other documents (such as PDFs), the exchange of which also needs to be controlled.
But as data storage increasingly moves to the cloud, it's time to structure data differently. It should become "streamable" and atomic, making it easier to move dynamically. That way, users receive only the data relevant to a specific task, rather than everything.
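One way to picture "atomic, streamable" design data is as individually addressable records that can be filtered and streamed on demand, instead of a monolithic file. The sketch below is purely illustrative; the `ModelStore` class and its query shape are hypothetical and do not represent any vendor's actual API.

```python
# Sketch: design data as atomic, individually addressable records,
# so a client can stream only what a task needs rather than a whole file.
# ModelStore and its fields are hypothetical, not a real vendor API.

from dataclasses import dataclass
from typing import Iterator, Optional


@dataclass
class Element:
    element_id: str
    category: str      # e.g. "wall", "door", "duct"
    level: str         # building level the element belongs to
    properties: dict


class ModelStore:
    """In-memory stand-in for a cloud-hosted, queryable model database."""

    def __init__(self, elements: list):
        self._elements = elements

    def stream(self, category: Optional[str] = None,
               level: Optional[str] = None) -> Iterator[Element]:
        # Yield matching elements one at a time ("streamable"),
        # instead of returning the entire model at once.
        for el in self._elements:
            if category and el.category != category:
                continue
            if level and el.level != level:
                continue
            yield el


store = ModelStore([
    Element("w1", "wall", "L1", {"fire_rating": "2h"}),
    Element("w2", "wall", "L2", {"fire_rating": "1h"}),
    Element("d1", "door", "L1", {"width_mm": 900}),
])

# A task that only needs Level 1 walls pulls exactly one element:
l1_walls = list(store.stream(category="wall", level="L1"))
```

The point of the sketch is the access pattern: the consumer asks a question and receives only matching records, which is what makes the data movable per-task rather than per-file.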
BIM files in particular can become large and unwieldy in a very short period of time. Design data needs to be shared, not siloed in application-specific formats; silos are a pitfall that hinders collaboration. Separate pools of data need to be brought together, and this is already happening.
Autodesk and Bentley Systems are each pushing unified database structures: Autodesk Docs and iTwin, respectively. Autodesk's approach is proprietary, storing data in the company's cloud. Bentley's iTwin, by contrast, keeps the data open and portable.
Other BIM vendors, like Vectorworks, are adding database connectivity to their solutions to improve file workflows. Graphisoft is taking a unique hybrid approach where data can be stored both locally and in the cloud. It seems that BIM file formats as we know them will either become transactional or legacy. As Autodesk’s Anagnost puts it, “Files are the walking dead.”
However, the industry is also making efforts to create open information systems, so that construction-industry participants can retain control over their own data rather than being locked into cloud platforms (more on this in the "Openness" section).
API access
In the cloud future, "forwarding data" will be a last resort. Instead, design applications will "come" to where the data is stored. Users will access it through an API (application programming interface), whose tools will let them perform only the tasks they are permitted to perform on that data.
Historically, developers wrote a custom app that would sit on top of desktop Revit to access BIM data. In the future, new startups will write apps that sit in the cloud and simply connect to the client’s data warehouses to perform tasks (some startups are already working on this). If the data is stored locally, plugins extract it and send it to the cloud for processing. The cloud-based approach ensures seamless interaction between apps.
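The shift described above can be sketched as an app that never downloads a model file at all, but works entirely through API calls against the client's data store. Everything here is an assumption for illustration: `BimApi`, `FakeCloudStore`, and `review_app` are invented names standing in for a real vendor API and a real cloud application.

```python
# Sketch of the cloud access path described above: an app that calls the
# client's data store through an API instead of opening a local file.
# BimApi, FakeCloudStore, and review_app are all hypothetical names.

from typing import Protocol


class BimApi(Protocol):
    """The contract a cloud app codes against, not a concrete store."""
    def list_elements(self, project: str) -> list: ...
    def update_element(self, project: str, element_id: str, changes: dict) -> None: ...


class FakeCloudStore:
    """Stands in for a vendor's cloud data warehouse behind an API."""

    def __init__(self):
        self._data = {"tower-a": [{"id": "w1", "category": "wall", "reviewed": False}]}

    def list_elements(self, project):
        # Return copies so the app can only change data via the API.
        return [dict(el) for el in self._data[project]]

    def update_element(self, project, element_id, changes):
        for el in self._data[project]:
            if el["id"] == element_id:
                el.update(changes)


def review_app(api: BimApi, project: str) -> int:
    """A cloud app: it fetches only the elements it is allowed to work
    with and writes results back through the same API."""
    reviewed = 0
    for el in api.list_elements(project):
        api.update_element(project, el["id"], {"reviewed": True})
        reviewed += 1
    return reviewed


store = FakeCloudStore()
processed = review_app(store, "tower-a")
```

Because the app depends only on the `BimApi` interface, the same code could run against a local plugin that extracts data and uploads it, or directly against a cloud store, which is exactly the interoperability the cloud-based approach promises.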
However, one of the problems with cloud computing is that it runs on someone else's computer. You often have to pay for data hosting, as well as for microtransactions tied to API calls and data transfers between cloud servers. On top of tool subscriptions, a token-based payment system may emerge, perhaps metered by the number of API calls consumed.
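A per-call metering scheme like the one just described could look roughly like this. The `MeteredApi` class and the token prices are invented for illustration; no vendor's actual billing model is being shown.

```python
# Sketch of pay-per-call metering: each API call draws down a token
# balance, mirroring the token-based payment model described above.
# The class and the token prices are hypothetical.

class MeteredApi:
    COST_PER_CALL = {"read": 1, "write": 3}  # invented token prices

    def __init__(self, tokens: int):
        self.tokens = tokens
        self.calls = 0

    def _charge(self, kind: str) -> None:
        cost = self.COST_PER_CALL[kind]
        if self.tokens < cost:
            raise RuntimeError("token balance exhausted; top up the subscription")
        self.tokens -= cost
        self.calls += 1

    def read(self, element_id: str) -> dict:
        self._charge("read")
        return {"id": element_id}

    def write(self, element_id: str, changes: dict) -> None:
        self._charge("write")


api = MeteredApi(tokens=10)
api.read("w1")             # draws 1 token
api.write("w1", {"x": 1})  # draws 3 tokens
# balance is now 6 tokens after 2 metered calls
```

Writes costing more than reads reflects a common cloud pricing pattern, but the ratio here is arbitrary.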
Openness
We live in exciting times. For most of BIM's history, the only open data-transfer standard was IFC (Industry Foundation Classes), often treated as a lowest common denominator. To be fair, it has also suffered from poor export implementations by software vendors.