BIM Workflows - Cost Planning Tools

More damning criticism of Crossrail, which is now going to overrun on cost and programme... so much so that a lot of rail jobs are being put on hold or delayed.

On the programming side, Bentley can point to Synchro. I wonder if the cost estimation / take-off side now needs some serious attention.

Any rail or other infra jobbie in the UK will now be under the spotlight cost-wise. I would imagine that all those BEPs will be dusted off and more BIM-assisted cost checks will be demanded earlier and earlier on.

Connection or bridge to RIB iTwo... via Speedikon tools? CostX? 

Parents
  • Hi Dominic,

    Thank you for the suggestion, we will definitely discuss it in our internal calls.

    Best Regards
    Aditya

  • Interesting report on the NAO website.

    You could argue that a lot of the other costs follow on from the design changes and new scope.

  • Previous discussion on BIM Workflows: Cost Item Tracking... going back 7 years.

    Reviewing how the cost estimation packages and their use have evolved since then is pretty educational. Also, how the market has been consolidating lately. It would seem that, given the maturity of the CAD market (where growth in licenses/subscriptions for one vendor almost always means a loss for another), the only way to break into the market is buying up the second-tier vendors and consolidating.

    Looking at the acquirers, Trimble and RIB, it is obvious that there is a big market beyond CAD that includes real estate, space planning, facilities/site/tendering management, business analytics and estimating. Trimble, especially, has amalgamated a large portfolio of apps that allows it to sell itself as a one-stop shop/platform for both owners and contractors... kettling Bentley, with its design/analysis portfolio, in the middle?

    Keith Bentley mentioned once that we will reach an inflection point where the non-graphic data will outweigh the traditional CAD geometry data. I think that he said this well before getting into SQLite, aka i-model 1.6(?). But have the interface and tools in Mstn changed to reflect this?

    Mstn has always been strong with 3D. Synchro would cover 4D. 5D cost estimation would be the logical next step. If you look at the current estimating apps on the market (CostX, RIB iTwo, Vico Office, Sigma Estimates, Destini Estimator, CATO etc), all of them have very strong table/spreadsheet tools that are linked to a 2d/3d take-off interface. It is shocking to see how similar or familiar the workflows and tools are.

    Mstn CE started to add some table-manipulation tools into its interface. Verticals like OBD, OPD, WaterCAD and STAAD have always had to deal with table/spreadsheet-type information side-by-side with CAD info. The ur-tool that estimators know intimately is Excel. The junior estimators probably also know a PDF editor like Revu, because they would have to do the take-off... manually... eventually stepping up to use one of the above.

    It would be good to enhance the table tools to provide the deep Excel/Access-type editing that these standalone estimating packages provide. I think that convergence is inevitable. ConstructSIM would be a great vehicle for this.

    Some observations:

    Table interface:

    1. All estimating packages are based on cost 'assemblies' that are built up from nested 'components'. CostX allows for nine levels in a spreadsheet-type interface. Some of the other packages take a more database-style UI and present the info in dialog boxes.

    2. All the usual Excel tools like filtering, sorting etc are heavily used, including drag & drop between cells, and between cells and CAD elements.

    3. Since most of the cells will be referenced, it is important to be able to 'suppress' values from 'rolling up' into the overlying cost position (Vico, Destini).

    4. Transitioning from one cell to the next level down/up should be as easy and intuitive as possible. Double-click in CostX seems to work very well. iTwo displays the nested table in a separate 'fixed' dialog. Please no floating, Russian doll dialog overload.

    5. Variations, comparisons, templates etc will need to be stored. I find iTwo's folder UI, which simulates a file-system folder, the best; it aligns with and allows normal MS Explorer-based management. A lot of the apps bury the 'files' in dialog boxes or tabs.

    6. Comparison is a bread & butter task, and should be reflected in the table set-up. In Vico, the line item will have not one cell for the link to the measured 2d/3d object but multiple cells, one for each revision. Both Vico and iTwo integrate and dock table and CAD view windows seamlessly, avoiding floating windows. All apps allow for multiple tables or table views on screen, like Excel.

    7. Indicating whether a cell's contents have been manually input, or are a reference or link to an element in a 2d/3d measure ('Dimension' in CostX-speak), is pretty helpful. A number of the apps do this.

    8. Cost estimation functions: All of the apps have estimating/bookkeeping-specific formulae. Easy.

    9. Autocomplete: Vico provides some sensible tools to manage libraries for this important time-saving functionality.

    10. ??

    Bentley has made a big investment in SQLite at a low level. Time to expose and celebrate this with some UI?
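    To make points 1 and 3 above concrete, here is a minimal Python sketch of nested cost assemblies with a suppressible roll-up. All names, codes and figures are illustrative; this is not how any of the packages mentioned is actually implemented.

```python
from dataclasses import dataclass, field


@dataclass
class CostItem:
    """A cost position: either a leaf (own value) or an assembly of children."""
    name: str
    value: float = 0.0            # leaf value (quantity x rate); ignored for assemblies
    suppressed: bool = False      # exclude from the parent's roll-up (point 3)
    children: list["CostItem"] = field(default_factory=list)

    def rolled_up(self) -> float:
        """Total of this item and its nested children, skipping suppressed items."""
        if self.children:
            return sum(c.rolled_up() for c in self.children if not c.suppressed)
        return self.value


# A suppressed line stays visible in the table but does not roll up.
pilecap = CostItem("Pile cap", children=[
    CostItem("Concrete C40", 1800.0),
    CostItem("Formwork", 420.0),
    CostItem("Rebar (provisional)", 650.0, suppressed=True),
])
print(pilecap.rolled_up())  # 2220.0
```

    Nesting depth falls out of the recursion for free; a CostX-style nine-level limit would just be a validation rule on top.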

    Model-based Estimation:

    A lot of this is already in Mstn and its authoring verticals - easy meat on the table...

    1. Importing from authoring apps like Revit, either using IFC or a plugin. All of the apps suffer from this barrier in various forms. Typically, the estimating app will have either a 2d or 3d take-off tool. Only Vico demonstrated the ability to ref in a 2d drawing into the 3d model. But even Vico could not (AFAIK) link the 3d element to the 2d take-off element, leading to potential discrepancies/double-counting. Mstn's Hypermodeling and CVE tools would stand out here. Scalability.

    2. The ability to work in the authoring tool to edit and add classification info is invaluable. Also, it is not uncommon, and even expected, for the estimator to include placeholders for items not modeled. Stairwell missing balustrades? Model them and get the quants directly. I can see a Navigator-style overlay file being used by the estimator, leveraging the Issues Resolution Service. When the balustrade is modeled properly, the IRS is used to remind the estimator to remove his placeholder.

    3. Measuring MEP elements: some attempts in CostX. Not easy to select and classify. Another advantage if the estimating tool has access to the authoring environment.

    4. 2d shape and line markup tools: old hat in Mstn with its huge list of formats that can be referenced/imported. Accudraw. One innovation that CostX has is the ability to 'blockify' a raster symbol. Should be something already in Descartes OCR tools that would be able to replicate this.

    5. iTwo has the most impressive rules-based classification tools. Sigma relies on tools in Revit to get this right; not great, as this means the estimator needs to have a Revit subscription. Mstn/OBD has long had its SelectByAttributes, and now Display Rules, for this kind of thing. iTwo's rules-based query and assign is very powerful as it does not rely on stable GUIDs and can better deal with radical changes.

    6. Importing and classifying/linking the elements to the correct cost component/line item is never a watertight affair. A lot of verification is needed. Vico and iTwo provide visual highlighting, re-symbolisation, isolation and selection, as well as duplication/clash detection tools. These tools are already in place in Mstn.

    7. Comparison: Mstn has long had its Design History and symbology tools to highlight changes. Vico has a very nice before/after slider tool. Easy to duplicate in Mstn using the Ref Clip tool?

    8. Location-based estimating (Vico): this is already possible in OBD, and I believe ConstructSIM has tools to segment slabs 'non-destructively' for scheduling purposes.

    9. Scheduling and space planning: Destini, Vico and iTwo stray naturally into these areas: ConstructSIM, Synchro and Bentley Facilities Planner.

    10. Excavation: RIB's civils app is pretty basic. MineCycle/OpenRoads would have a lot of this down already.

    11. ??
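    A toy sketch of the GUID-independent, rules-based query-and-assign idea from point 5. The element schema, cost codes and rules are invented for illustration; real packages work against the live model database, not dicts.

```python
def classify(elements, rules):
    """Assign cost codes by attribute predicates (first match wins), not by
    element GUIDs, so the mapping survives a radical model rebuild."""
    assigned = {code: [] for code, _ in rules}
    unmatched = []
    for el in elements:
        for code, predicate in rules:
            if predicate(el):
                assigned[code].append(el)
                break
        else:
            unmatched.append(el)  # flag for the manual verification pass (point 6)
    return assigned, unmatched


elements = [
    {"category": "Wall", "material": "Concrete", "thickness": 200},
    {"category": "Wall", "material": "Blockwork", "thickness": 100},
    {"category": "Slab", "material": "Concrete", "thickness": 250},
]
rules = [
    ("E05/110", lambda e: e["category"] == "Wall" and e["material"] == "Concrete"),
    ("F10/210", lambda e: e["category"] == "Wall" and e["material"] == "Blockwork"),
]
assigned, unmatched = classify(elements, rules)
print(len(assigned["E05/110"]))  # 1 (the 200 mm concrete wall)
print(len(unmatched))            # 1 (the slab has no rule yet)
```

    The unmatched bucket is the important part: it is exactly what the highlighting/isolation verification tools in point 6 exist to surface.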

    The one gap in Mstn is the lack of tools to access quantification information at the sub-element level. For quantification purposes, the estimator needs to be able to highlight and differentiate between, for example, the top and bottom surfaces of a pile cap.
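    For illustration only, here is what sub-element quantification would need to return for that pile cap, hand-computed from assumed rectangular dimensions rather than extracted from a model, since each face attracts a different rate:

```python
def pile_cap_faces(length_m, width_m, depth_m):
    """Per-face areas (m2) of a rectangular pile cap, so each surface can be
    priced separately (e.g. formwork to sides/soffit vs. finish to top)."""
    plan_area = length_m * width_m
    return {
        "top": plan_area,
        "bottom": plan_area,
        "sides": 2 * depth_m * (length_m + width_m),
    }


print(pile_cap_faces(2.0, 2.0, 1.0))
# {'top': 4.0, 'bottom': 4.0, 'sides': 8.0}
```

    A single "surface area" quantity on the element is useless here; the tooling gap is exposing these faces individually for selection and take-off.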

  • Another report pointing to how bad the civils / infrastructure sector is at cost planning. Useful summary on the usual reasons for overruns.

    I like the suggestion that there is an inherent problem with 'going with the lowest bidder' approach, which is the natural tendency. I can see a lot of tenders taking this route, which is also often compulsory, mainly to avoid accusations of unfair competition, political interference, collusion etc.

    Given that the big factors that drive costs up post tender, from an engineering design standpoint, include:

    Early Cost Escalation

    1. Poor asset information
    2. Late stakeholder and consent issues
    3. Poorly defined owner requirements
    4. Inadequate design

    Later Cost Escalation

    1. Poor systems integration (I would include scheduling here)
    2. Impact of the ECE items above manifesting themselves

    ... there should be more emphasis on evaluating (and reinforcing) the technical capabilities of the bidding teams to 'design' and manage costs post tender...?

    Typical contract bids are dominated by non-technical specialist estimators who will only price what is in the Works Information, saddle their subbies with super-onerous terms, exclude or guess at a risk contingency, package it up and expect all parties to 'work to cost' etc => lowest bid... These are not necessarily the type of people that have the skills or computational tools to read and solve the complex problems ahead.

    OTOH, you can't have only airy-fairy, PhD-only teams with all kinds of unproven tools still riding the Hype Cycle curve.

    What does this mean for BIM tools and their users?

    Cost modeling will need to be more integrated into the design tools. In the same way that the key requirement for Level 2 BIM is that the 3d model needs to be the basis for all 2d drawings -via extractions- to ensure maximum coordination, I think that the same should apply to the cost take-off. The common situation we have today is that estimates are still largely a paper-based manual exercise with little or no link to a 'single source of truth'. Never mind clash detection... that problem is a small sideshow, impact-wise.

    If you are lucky, the estimator will have an analytical tool that will be used to add and attach cost info to either 2d or 3d models provided by the designers. But these tools and their workflows need to be closer to -and ideally embedded in- the design tools. It is recognised that a clash is much cheaper to deal with at source (and there is no guarantee that they will not spiral out of control when left too late). Similarly, cost issues or 'avoidance' are ideally dealt with by the designers, rather than waiting for those laborious and boring 'cost reporting' sessions.

    At the take-off level, Cost = Quantities x Spec x Rate. Specifications are a design item... one that requires intensive collaboration between estimator and designer. The all-too-common dysfunctional workflow is that the estimator gets a model that is missing most of the spec info and has to guess... based on past projects which are often not verified, not being the actual outcome spec (substitutions are common) or costs (confidential contractor info).

    At the next level, Rate = Material Supply costs x Installation Rate, which will need 3d scheduling and simulation: Synchro. Again, getting this optimised will need the design team to be a responsive partner in a bi-directional relationship.
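    A minimal sketch of those two levels in Python. The rate book and all figures are invented, and I am reading the rate build-up loosely as material supply plus installation; the point is only that the Spec is the key that selects the rate, which is exactly the data the dysfunctional workflow above forces the estimator to guess.

```python
# Hypothetical rate book keyed by specification; all figures illustrative.
RATES = {
    "C30/37 concrete": {"material": 110.0, "install": 45.0},   # per m3
    "C40/50 concrete": {"material": 125.0, "install": 45.0},
}


def line_item_cost(quantity, spec, rate_book=RATES):
    """Cost = Quantity x Rate, where the Spec selects the rate and the rate
    is built up from material supply + installation (the installation side
    is where 4D scheduling/simulation would feed in)."""
    rate = rate_book[spec]
    return quantity * (rate["material"] + rate["install"])


print(line_item_cost(36.0, "C40/50 concrete"))  # 6120.0
```

    A missing spec shows up here as a KeyError rather than a silent guess, which is arguably the honest behaviour.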

    Cost modeling needs to go beyond just static cost reporting (typically a ponderous manual-only or BIM Level -1 process), which is a huge blocker to the type of iterative cross-disciplinary exploration needed. We will probably see a lot of clever KBE/Rules-based, GA, ML, data mining tools like Modelogix, Alice Technologies, nPlan etc etc trying to get their spaghetti to stick in the near future. Interop link tools.

    Apps like Testfit clearly show the importance of being able to integrate design information parametrically into the overall 'deal' or cost model. There is a lot of talk of how good we are at providing information for stakeholder engagement using visual information using desktop publishing, renderings, AR/MR/VR tools... but what about the quantitative, actionable, PowerBI-type information?

    As mentioned above, Trimble et al realise that there is a significant demand for informed reporting tools that can work with and synthesize 3d modeling in(to) the procurement and facilities management segments of the lifecycle.

    Cost modeling is also a continual process that is part of the procurement and construction process. As the Model progresses and is handed over to the contractor, coordination and product substitutions are inherent tasks that will need to be managed centrally in a model. And progress on site tracked, independently?


Children
  • Costs, costs, costs... seems like you can't go a day without hearing about it.

    Just as you thought you knew what NRM was, you find out about ICMS.

    Interesting news that Synchro is now integrating with STR Vision CPM. Not surprised to see third party interest in integrating with the leading / dominant scheduling app. Good entry point for Bentley to extend its ecosystem, especially if the same API hook up liaison work can be extended to other apps like OBD.

    Looking at the CPM app, Bentley would be well placed to offer a better and more scalable display system and interoperability platform. It would also provide quick means to extend CPM to mobile and web devices. There might even be some synergies with ProcureWare.

    All at mutually beneficial costs, of course!

  • National Audit Office (NAO) HS2 Progress Report January 2020 Update

    Super interesting read about management consulting firms expanding into the infrastructure engineer / project manager's world.

    "Consultants can use expertise in data analytics to inform construction and maintenance decisions, says Mr Threlfall. In the past, a decision to maintain or resurface a road would be made by engineers, but now that assessment is increasingly being driven by big data — for example, an IT dashboard that records the condition of assets in real time and provides a warning that the asphalt is wearing." <FT>

    Seems like the big engineers are also sexying up in response. Even if it is just posturing that doesn't have much financial backing, it does sound like just the ticket for Bentley's Digital Twin 'playbook'?

    Meanwhile, down in the trenches... looking at the NAO Exec Summary, lots of problems looking for solutions:

    19. Detailed design vs ongoing cost / programme estimates were way off, including 'undertakings and assurances' made as part of the consents process (Item 17), and optimistic savings assumptions that were not followed through (Item 18).

    20. Even the time estimates needed to generate the cost / programme estimates were way off and repeatedly overtaken by events, leading to expenditure on the hoof, without the benefit of reliable estimates / contingencies before 'pressing the button'. ProcureWare?

    21. Risk management: sounds like there were a lot of 'assumptions' (50%). Existing / ground conditions? Should start that super ground-conditions DT good and early (see also Items 15, 16)? A change in law was required to allow surveys to be conducted in advance.

    22. Tender cost vs risk transfer: not sure how these performance indicators will be generated, but it sounds like a lot of additional monitoring will be required by the government. More drones and continual surveying?

    23. Signalling and testing: Siemens should have some knowledge gained from its Crossrail experience? Should start on a digital twin simulation model well in advance, fully parametric to accommodate changes in the network design as it develops over the next two decades? Can a fully functioning network be simulated? Aimsun?

    24. Sounds like a vast interconnected P6-and-a-half programme needs to be maintained... in the cloud?

    25. Current programme and cost overruns lead to political uncertainty for later phases due to cost-benefit questions. Modeling payback is always problematic, vulnerable to disruption. Citilabs to help monitor and optimise economic impacts?

    26. Not recognising risks and over-optimistic estimates from the outset (I presume they mean inaccurate, or 'we can't quantify, so it's a gut-feel, who-dares-wins type political decision'), followed by retroactive containment measures (echoes of Crossrail?), has resulted in poor risk management / value for money.

    27. More monitoring... no surprises there.

    Recommendations

    28. Programme: More top-down oversight by DoT. Ulp! Common KPIs: huge opportunity to develop fine-grained 'evidence', IoT-enabled indices -not just indicators- that are more 4D-oriented? Costs: parametric / digital reporting that is also parametrically linked to the design?

    29. Limits of the benchmarking approach to cost estimating (as opposed to design:bid:tender returns?): Item 16 highlights the problem of using international benchmarking. Recommendation: use multiple sources of info and, presumably, approaches. Not sure this will work either, but it sounds like a big opportunity for all kinds of voodoo econometric apps to be hatched by those fast-talking management consultants mentioned above. Bentley's DTs could provide the evidence base, aka 'data lake'?

  • Interesting developments in the Synchro cluster of apps. They now have a Cost module based on iTwin? I think Cost is designed to work with Synchro Modeler, which allows the model to be sliced up and quants extracted phasing-wise.

    I wonder whether OBD will be able to leverage this as means to provide early cost modeling for buildings. BOMA, RICS SMM7, DIN 276/7 etc.

    I suppose, since it is iTwin-based, you should already be able to upload OBD models using the iTwin Connector. And use all those fancy scriptable Cost / Modeler tools to assign cost codes and extract quantities for calcs and reporting in Excel?

    Phase 2: cost reporting is an important and under-represented BIM workflow for OBD. There are already tools for clash detection, energy modeling, structural analysis, sequencing (via Synchro), thematic visualisation, and spec classification (via the new NBS Chorus tool) that provide quick, tight feedback loops. Phase 2 direct writing to iTwins could add cost reporting to the mix?