Time doth fly. Looking back over 15+ years, it is interesting to see which concepts and issues still apply today.
1. Open data formats:
Not really asserted, but I think the underlying assumption behind ADSK's vendor format lock-in strategy is that they would like their users to stay in the ADSK camp when exchanging information. Object Enablers? The problem is that they did not, and still do not, have a unified platform (Project Quantum in future?) for multiple apps, leaving them with a menagerie of formats (dwg, rvt, nwc etc.) packaged into expensive suites.
Not really asserted, but I also think that maintaining compatibility with dwg was an 'uncomfortable truth' for Bentley that was a big drain on resources... and a huge drag on Mstn's evolution. There was a selfish reason for attempting to brow-beat ADSK into being more open. Albeit, a reason that has industry-wide communal benefits.
Looking back, I wonder if there were already plans then to reverse engineer .rvt... or the parametric 'integration' (over mere 'aggregation') aspects of R*vit? Probably got side-tracked by GC which grew out of CustomObjects.
2. Stable Formats:
ACAD and R*vit format changes slowed to every three years? Will format changes slow even further now that ADSK is moving to a subscription model? More Object Enabler-type shims in future versions of R*vit?
3. Emphasis on the Drawing Production process:
Interesting that Phil B kept coming back to the central importance of producing reliable, coordinated drawings, and of automation to minimise tedious tasks like sheet numbering. I suspect this was something Keith B missed, and it has proven to be a key differentiator and selling point, especially for R*vit... over ADT and OBD... even today. Maybe Bentley has been adversely insulated from the huge importance of, and need for, good automation tools because of the large engineering projects / clients it commonly has.
Interestingly, the 'database' side, or the 'I' in BIM, was not really seen as a central goal. 'Model-based solutions' (as opposed to 'drafting engines') are seen as a better means of producing coordinated, 'accurate' drawings 'out the door' and 'into the field'. The architect's 'brain' was the database. He just needed better tools around him. Blimey!
PB does get it right when he mentions that the BIM model will be more instrumental in the long term, for non-paper-based processes like direct-to-fabrication.
OTOH, KB does point to the owner-operator's interest in the information in the model, but does not elaborate. Hints of the Digital Twin stuff here?
4. Drafting v. Modeling:
I think KB got this right. Modeling is not 3d only; 2d is just as important. Fast forward 15 years: R*vit has provided a lot more drafting tools... but is still pretty bad at handling drafting / CAD tasks. In other words, it does not have a fully-fledged platform for information modeling.
The overheads of running a parallel CAD app in conjunction with R*vit are still real today. In fact, you will often find Rhino being used in lieu of ACAD, so the poor R*vit firm needs two or three modeling apps instead of one.
Interesting to note that KB acknowledges that there is a lot of R*vit functionality that can and should be integrated into Mstn. Hypermodeling must feature large here. Constraint solving is also now in Mstn. A Family Editor equivalent coming soon?
5. Concurrent Working / Change Tracking:
PB does not really have an understanding of the way large projects work. I suppose that is to be expected. PB's view is that the manufacturing world needs 'semi-concurrent working', not concurrent working. Covering for R*vit's small-project / central-model roots? He mistakes the 'Common Data Environment' that is ubiquitous today for 'project management' software.
In fact, the manufacturing world has gone even more concurrent, especially in Catia/Enovia's case. Interesting is KB's view that a central 'manufacturing'-style, 'Integrated' (in PB's terms) database would not work in AEC, where a distributed file-based (dgn is mentioned) workflow is needed. I suppose he would say that, having invested so much in the new V8 format. Fast forward to iModelHub 2.0 today, which is designed to aggregate multiple files.
I think it is accepted today that a lot of the database-like synchronisation of information would be much easier in a centralised app like R*vit, compared with file-based apps like Mstn. ProjectBank, with its component-level granularity, never really found much traction, although it would seem that apps like OpenPlant might be able to make this work with the new iModelHub 2.0 and the SQLite-based .imodel 1.6 format.
Design History was about change tracking... but it stopped at just highlighting the changes, awaiting human transaction management. I don't think KB grasped the power and utility of the centralised 'atomic' change management that R*vit is built around... especially when the model is broken up into separate files. Again, maybe solved with iModelHub 2.0 and the SQLite-based .imodel 1.6 format?
Delta Transfers: ironically, R*vit has had to build this, and transaction management, in... something that vanilla Mstn users without ProjectWise's Local Document Organiser do not benefit from, making workflows that span separate parts (files) of the federated model slower and less productive. It would be good to compare and analyse this in detail.
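As an aside, the element-level delta idea is easy to sketch. The following is a toy illustration only: a hypothetical two-column element table in SQLite, not the actual .imodel / iModelHub schema. Two versions of a model are diffed row by row, and only the changed rows travel and get applied as one atomic transaction.

```python
import sqlite3

def make_model(rows):
    """An in-memory 'model' as a SQLite table of elements.
    Hypothetical schema for illustration, not the real .imodel format."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE element (id INTEGER PRIMARY KEY, props TEXT)")
    db.executemany("INSERT INTO element VALUES (?, ?)", rows)
    db.commit()
    return db

def delta(old, new):
    """Element-level delta: rows to upsert and ids to delete."""
    old_rows = dict(old.execute("SELECT id, props FROM element"))
    new_rows = dict(new.execute("SELECT id, props FROM element"))
    upserts = [(i, p) for i, p in new_rows.items() if old_rows.get(i) != p]
    deletes = [i for i in old_rows if i not in new_rows]
    return upserts, deletes

def apply_delta(db, upserts, deletes):
    """Apply the delta as a single atomic transaction."""
    with db:  # commits on success, rolls back on error
        db.executemany(
            "INSERT INTO element VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET props = excluded.props", upserts)
        db.executemany("DELETE FROM element WHERE id = ?",
                       [(i,) for i in deletes])

v1 = make_model([(1, "wall A"), (2, "door B"), (3, "beam C")])
v2 = make_model([(1, "wall A"), (2, "door B2"), (4, "slab D")])

ups, dels = delta(v1, v2)   # only the changed/added rows and removed ids
apply_delta(v1, ups, dels)  # v1 now matches v2
```

The point of the sketch is the granularity: with component-level rows and transactional apply, a sync only needs to move the handful of elements that changed, rather than whole files.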
6. Central vs Federated Models:
Well, as we know, federated models are the norm today. Again, KB was right that there would be a multitude of different databases/types. OTOH, PB vaguely mentions the use of a 'pedflow' model-type model that incorporates a lot of the client's information... and is 'self-aware'. By that, I think he means that BIM objects like walls, beams etc. have inbuilt 'behaviours' that mimic their real-life 'twins'. I don't think PB addressed the scalability issues, and as we know, R*vit's pervasive parametric synchronisation / constraint solving does bog things down and relies on breaking the model up to keep working after a certain size (approx. 200MB).
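To make the 'self-aware behaviours' and scaling point concrete, here is a deliberately tiny sketch (nothing like R*vit's actual engine): each element eagerly propagates any change to its dependents, so a single edit can cascade through a large fraction of the model, which is one reason pervasive parametric synchronisation gets expensive at scale.

```python
class Element:
    """Toy 'self-aware' BIM object: it knows which elements depend on it
    and eagerly pushes changes to them (transitively)."""

    def __init__(self, name, x):
        self.name, self.x = name, x
        self.dependents = []  # elements whose position is derived from ours

    def move(self, dx):
        self.x += dx
        for dep in self.dependents:  # eager, transitive propagation
            dep.move(dx)

wall = Element("wall", 0.0)
door = Element("door", 1.5)      # hosted on the wall
tag  = Element("door tag", 1.5)  # annotates the door
wall.dependents.append(door)
door.dependents.append(tag)

wall.move(2.0)  # door and tag follow the wall automatically
```

With a handful of elements this is instant; with millions of relationships, every edit becomes a graph traversal, hence the pressure to split big models into separately-synchronised files.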
7. Layered Development / dotNET:
KB is very supportive of dotNET. Ironically, ADSK seems to have published and supported more dotNET APIs than Bentley in the intervening years. Understandably, there was no anticipation of the rise of open source, driven by outfits like Google who started giving away high-quality APIs, or of cloud computing, AI, social media, mobile etc.
8. Goldmine due to the additional work required for a BIM model:
Neither got this right. Clients tend not to want to pay. In fact, it is fair to say that most clients can't even define what they want. More accurately, if clients do pay for more BIM functionality, they are more likely to pay other kinds of vendors, not the traditional authoring-app vendors like ADSK or Bentley. They seem to go straight to IBM or Oracle or SAP or Salesforce or FME etc. to leverage that data. I wonder why...?
Hi Dominic, thanks for your observations. Sharing and manipulation of data outside silos is obviously the key, and underlies our current digital twin and iModel.js initiatives.