Is this the right place for questions about the above?
Hi, I would recommend dropping a short mail to this:
Bentley should speak to these MEFISTO guys in Germany. They should be able to firm up which workflows would benefit most from i-model 2.0's strengths. A lot of their ideas seem to parallel Bentley's longstanding approaches like federated models, version control, etc... including construction scheduling and data analytics.
Another interesting outfit looking at version tracking: 3D Repo. They seem to have garnered some momentum in the UK BIM space.
Lots of parallels with what Bentley has been doing for years, and more recently:
1. Change tracking: Hey, Bentley has had Design History, ISM, i-models for years, right?
2. Database used to track changes (file-based tracking is problematic): Hey, Bentley has had Model Servers for Bentley Map and OpenPlant for years, right? I guess 3D Repo are more hip because they are using BSON for object storage and MongoDB for tracking changes. 3D Repo mentions some of the problems with using DBs to track and query/traverse spatial objects. Bentley seems to have made some progress here with the SQLite-based i-model formats.
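To make the DB-tracking idea concrete, here is a minimal sketch of how a SQLite-based change log could sit alongside the "live" object table. All table and column names are my own invention for illustration, not Bentley's or 3D Repo's actual schema:

```python
import sqlite3

# Hypothetical sketch: a 'live' element table plus an append-only change log,
# in the spirit of a SQLite-based i-model. Schema names are invented.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE element (
    guid  TEXT PRIMARY KEY,
    props TEXT NOT NULL            -- JSON blob of attributes
);
CREATE TABLE change_log (
    seq   INTEGER PRIMARY KEY AUTOINCREMENT,
    guid  TEXT NOT NULL,
    op    TEXT NOT NULL CHECK (op IN ('insert','update','delete')),
    props TEXT                     -- new state (NULL for delete)
);
""")

def apply_change(guid, op, props=None):
    """Record the change and update the live table in one transaction."""
    with db:
        db.execute("INSERT INTO change_log (guid, op, props) VALUES (?,?,?)",
                   (guid, op, props))
        if op == "delete":
            db.execute("DELETE FROM element WHERE guid = ?", (guid,))
        else:
            db.execute("INSERT OR REPLACE INTO element (guid, props) VALUES (?,?)",
                       (guid, props))

apply_change("valve-001", "insert", '{"size": "DN50"}')
apply_change("valve-001", "update", '{"size": "DN80"}')

# Latest state comes from 'element'; the full history stays in 'change_log'.
latest = db.execute("SELECT props FROM element WHERE guid='valve-001'").fetchone()[0]
print(latest)  # {"size": "DN80"}
```

The point of the split is that queries against the current model never have to traverse history, while a full timeline is still replayable from the log.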
3. Using a game engine (Unity) as a platform to view large models: Hey, this sounds familiar as well. Muto was started by ex-Bentley guy Matt Gooding and was designed to convert i-models to Unity. I guess that this is now taken up by LumenRT. And Bentley can also fall back on the new 1.6 SQLite-based i-model format, which is much faster and more scalable.
4. WebGL/web-browser-based interface: Sounds very much like the new Navigator Web. I suppose if all you are after is viewing models, then a web/cloud-processed approach makes a lot of sense.
5. Mapping: streamed tiles. Again, Bentley should be way ahead of these guys, with its new Cesium buddies.
6. DAGs as a scene graph as a means to track history: Sounds right up GenerativeComponents' and DesignPower ICM's street. The rules-based engine that Design++ uses sounds like just the type of KBE tool that would be needed to sort and merge changes.
'Interesting' case where the algo is trying to filter out permissible conflicts by differentiating between changes to the object itself and changes to its position in 3D space. Seems to have come from the author's initial focus on meshes produced in CG/animation settings. Design History, ISM/Design Sync, PW Component Indexing etc. would already determine the latest version of each published object.
Having said that, I suppose that this kind of thing might be good for handling 'dumb' info from others, or dgns that did not come in over a 'bridge'.
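The geometry-vs-placement distinction above can be sketched very simply: hash the mesh in local coordinates so a pure move does not register as a geometry edit. The data shapes here are invented for illustration, not anyone's actual format:

```python
import hashlib
import json

# Hypothetical sketch: decide whether two versions of an object differ in the
# object itself (its mesh) or only in its position in 3D space, so that a
# pure move need not be flagged as a conflicting edit.

def mesh_hash(vertices):
    """Hash the geometry in local coordinates, ignoring placement."""
    return hashlib.sha256(json.dumps(vertices, sort_keys=True).encode()).hexdigest()

def classify_change(old, new):
    geom_changed = mesh_hash(old["vertices"]) != mesh_hash(new["vertices"])
    moved = old["transform"] != new["transform"]
    if geom_changed and moved:
        return "geometry+placement"
    if geom_changed:
        return "geometry"
    if moved:
        return "placement-only"   # the 'permissible conflict' case
    return "unchanged"

a = {"vertices": [[0, 0, 0], [1, 0, 0], [0, 1, 0]], "transform": [0, 0, 0]}
b = {"vertices": [[0, 0, 0], [1, 0, 0], [0, 1, 0]], "transform": [5, 0, 0]}
print(classify_change(a, b))  # placement-only
```

A merge rule could then auto-accept "placement-only" changes and escalate only real geometry edits for review.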
7. 3D diff and merge: I think that something like what Onshape has would be more useful. Useful for setting up CATIA Knowledge Templates-style variants or parametric configurations as well.
I am amazed how much press these guys have got. I guess the underdog start-up crewed by a gaggle of under-40's will always be feted.
Interesting write-up about Digital Threads, which are apparently closely related to Digital Twins... Industry 4.0, the SMS test bed.
Bentley's ContextCapture, BIM and APM tools are already into Digital Twins, with AXSYS and WaterGEMS SCADA providing some simulation tools. Siemens is probably much more comprehensive with COMOS etc.
iModel 2.0 looks like it could be a good fit for enabling Digital Threads. Not sure if Siemens already has a 'twin' product here.
"The digital thread refers to the communication framework that allows a connected data flow and integrated view of the asset’s data throughout its lifecycle across traditionally siloed functional perspectives. The digital thread concept raises the bar for delivering “the right information to the right place at the right time”."
... sounds a bit generic.
"The future scenario, using the digital thread and digital twin, would look something like the following:
Conformance tracking: something to sic SpecWave onto? iModel 2.0 already mentions the need for semantic alignment.
"An iModel Bridge aligns information from an application’s native format into the iModelHub‘s registry of semantics, structure, units, and coordinates."
I think that this means formatting the data associated with every object in a Semantic Web, machine-readable way that allows it to participate in a dynamic simulation model. A digital valve 'behaves' like a real valve and has all the required attributes in the right units, structure and relationships. Bentley Class Editor will need to play a larger, more mainstream role?
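The "alignment" step the bridge quote describes could look something like the following sketch: map a native attribute name onto a registry name and normalise its units. The registry contents and attribute names here are entirely invented, just to show the shape of the idea:

```python
# Hypothetical sketch of the 'alignment' an iModel Bridge might perform:
# resolve a native attribute name against a semantic registry and convert
# its value to the registry's canonical unit. All names/values are invented.
REGISTRY = {
    "NominalDiameter": {"unit": "mm",  "aliases": {"DN", "Nom_Dia", "size_in"}},
    "DesignPressure":  {"unit": "kPa", "aliases": {"press_psi", "DesignPress"}},
}
UNIT_FACTORS = {("in", "mm"): 25.4, ("psi", "kPa"): 6.894757, ("mm", "mm"): 1.0}

def align(native_name, value, native_unit):
    """Return (canonical_name, converted_value, canonical_unit)."""
    for canonical, spec in REGISTRY.items():
        if native_name == canonical or native_name in spec["aliases"]:
            factor = UNIT_FACTORS[(native_unit, spec["unit"])]
            return canonical, round(value * factor, 3), spec["unit"]
    raise KeyError(f"no semantic mapping for {native_name!r}")

print(align("size_in", 2.0, "in"))  # ('NominalDiameter', 50.8, 'mm')
```

Once every publisher's "size_in" / "DN" / "Nom_Dia" lands as the same registry attribute in the same unit, a downstream simulation can treat the valve generically.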
Big digital models will require lots of attribute information that will be progressively added, re-classified/formatted and verified as the project progresses. A lot of objects or components will be 're-tagged' multiple times. i-model 2.0 would be a useful means to provide a centralised 'staging area' where the incoming data can be massaged, tested etc. before being added to the 'live' data model. I get this image of all these little BIM model managers trying desperately to speed up on the on-ramp to the ole Information Superhighway... at every data drop / model share :-)
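That staging-area idea reduces to a simple pattern: validate each incoming object and only promote the ones that pass into the live model. The validation rules and object shapes below are made up for illustration:

```python
# Hypothetical sketch of the 'staging area': each data drop is validated
# before being promoted to the live model. Rules and classes are invented.
def validate(obj):
    errors = []
    if not obj.get("guid"):
        errors.append("missing GUID")
    if obj.get("class") not in {"Valve", "Pipe", "Pump"}:
        errors.append(f"unknown class {obj.get('class')!r}")
    return errors

def promote(staging, live):
    """Promote valid objects into the live model; return the rejects."""
    rejected = []
    for obj in staging:
        errs = validate(obj)
        if errs:
            rejected.append((obj, errs))
        else:
            live[obj["guid"]] = obj
    return rejected

live = {}
staging = [
    {"guid": "v-1", "class": "Valve", "size": "DN50"},
    {"guid": "",    "class": "Widget"},   # fails both checks
]
bad = promote(staging, live)
print(len(live), len(bad))  # 1 1
```

The rejects (with their error lists) are exactly what a model manager would work through at each data drop before the next share.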
SpecWave should be able to link the employer's requirements document or performance/functional requirements with the progressive conversion of those high level requirements into prescriptive materials/workmanship specifications and design documents... in a continuous versioned timeline. Part of those requirements would be the inspection/validation/handover... and I suppose manufacturing requirements.
iModelHub... I wonder if this is a natural next-step for Bentley products which are heavily based on the 'federated' file based working that is enabled by Reference Files, Mstn's longstanding ace productivity tool.
Take Aecosim, which stores all its attribute or engineering data in the dgn files... that are assembled on loading to form a 'federated' model. Great flexibility, but very limited cross-file connections due to the file-based 'dark when not loaded' storage limitations. Not very different from PDFs in this respect.
OTOH, OpenPlant / Bentley Map has taken the other approach and is based on a centralised Model Server that stores everything in a SQL Server / Oracle Spatial database. The problem is the overhead of checking data in/out, which needs conversion and slows things down quite a lot.
I wonder if iModelHub can help bridge the gap between the two approaches.
For the older file-based apps like Aecosim (prone to fragmentation, hard to synchronise, with no relationships between files), a centralised 'hub' database would be long overdue. Maybe something like the revived Bentley Facilities Connect could provide the 'hub space' for things like GUID management and the progressive tag/attribute info adding and management mentioned above.
For the newer Model Server-type apps, I think that iModelHub could replace the SQL Server setup with something that tracks and merges a cluster of ISM-type dgns (or the new SQLite-based .imodels?). This would bypass or minimise the need to convert between the server data format and the dgn geometry format/schema?
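The track-and-merge part can be sketched as a hub replaying changesets pushed from several working files into one timeline, with the later push winning per GUID. The data shapes and the last-writer-wins policy are my own simplifying assumptions, not how iModelHub actually resolves conflicts:

```python
# Hypothetical sketch: a 'hub' merging changesets pushed from several working
# files into one model, ordered by push sequence. Shapes/policy are invented.
def merge_timeline(changesets):
    """changesets: list of (push_order, {guid: props-or-None}) tuples.
    None means the object was deleted in that changeset."""
    model = {}
    for _, changes in sorted(changesets, key=lambda c: c[0]):
        for guid, props in changes.items():
            if props is None:
                model.pop(guid, None)   # deletion
            else:
                model[guid] = props     # later pushes overwrite earlier ones
    return model

cs_a = (1, {"wall-1": {"h": 3.0}, "door-1": {"w": 0.9}})
cs_b = (2, {"wall-1": {"h": 3.2}, "door-1": None})  # later push wins; door deleted
print(merge_timeline([cs_b, cs_a]))  # {'wall-1': {'h': 3.2}}
```

Because the hub only ever sees ordered changesets, each dgn/.imodel can keep working in its native format between pushes, which is where the conversion saving would come from.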