Bentley Expedition: Dogger Bank?

Offshore wind seems to be kicking into gear in the UK. I wonder whether Bentley is, or should be, putting together a 'tool chain' for offshore wind turbines.

1. PLAXIS MoDeTo for the turbine towers.

2. gINT, SoilVision, Keynetix for marine geology analysis?

3. SACS/MOSES for any offshore rig stuff

4. Undersea cable: Siemens??

5. Specialist vessel design: MaxSurf

6. Onshore battery / converter etc facility: the usual suspects

7. Artificial Island: OpenSite Designer with Plaxis?

  • Hi Dominic

    Wind farms are taking off all over the world. To support offshore geotech users of Datgel's gINT Add-Ins, Datgel has developed support for import/export of Orsted AGS4+, and for import and presentation of thermal conductivity cone data.
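
    As an aside for anyone unfamiliar with it, AGS4 is a plain-text, CSV-like geotechnical exchange format built from GROUP / HEADING / UNIT / TYPE / DATA rows, which is part of why import/export support is tractable to add. Below is a minimal sketch of reading AGS4 groups, assuming standard line descriptors; the sample data and field handling are illustrative only, not Datgel's implementation:

```python
import csv
import io

def parse_ags4(text):
    """Parse AGS4 text into {group_name: list of row dicts}.

    Minimal sketch: handles GROUP/HEADING/DATA descriptors and
    skips UNIT/TYPE rows. Real AGS4 files need more validation
    (continuation lines, data types, rule checks).
    """
    groups, headings, current = {}, [], None
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        descriptor = row[0]
        if descriptor == "GROUP":
            current = row[1]
            groups[current] = []
        elif descriptor == "HEADING":
            headings = row[1:]
        elif descriptor == "DATA" and current is not None:
            groups[current].append(dict(zip(headings, row[1:])))
    return groups

sample = '''"GROUP","LOCA"
"HEADING","LOCA_ID","LOCA_TYPE"
"UNIT","",""
"TYPE","ID","PA"
"DATA","BH01","CP"
'''
print(parse_ags4(sample)["LOCA"][0]["LOCA_ID"])  # BH01
```

    The quoted-CSV structure means a stock CSV reader does most of the work; the effort in real tooling goes into validating the format rules, not tokenising.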

    Your list of products would be used by a wide range of engineers in different specialities; I don't think they would be sold as a pack. Bentley Systems offers the ELS (Enterprise License Subscription) to cover the multi-product needs of bigger engineering companies. Smaller companies can buy perpetual or term licenses of specific products.

    cheers

    Phil Wade

    MD, Datgel

    Bentley Channel Partner and Developer Partner
    E: phil.wade@datgel.com | T: +61 2 8202 8600 & +65 6631 9780

    Get the most out of gINT with Datgel Tools.

  • Interesting!

    I imagine you are right. North Sea undersea geotech modeling has been around for a while, led by petrochem players who developed their own tools and databases.

    Attractive proposition engineering-wise, as the design of the turbine structures could easily be standardised in a combined design/analytics tool like OpenBridge Designer etc. One tool, thousands of towers.

    The turbine towers are manufactured by Vestas etc., so there would presumably already be some in-house app with all the PLM / ERP support?

    All the electrical transmission stuff provided by someone like Siemens? 

    Etc etc

    How would you provide a Digital Twin in this context? 

    Fragmented tools will mean interop friction and waste, so there must be some pressure somewhere to consolidate around an info-tool pipeline?

    There is probably more appetite for a consolidated one-stop-shop offer in the Asian context? Sunda Shelf looks interesting...

  • Hi, Dominic,

    Indeed, offshore wind is really taking off: from a research-driven curiosity, it is growing into one of the most competitive forms of energy in terms of price per MWh, well on track to become even cheaper than onshore wind.

    Coincidentally, the synergies that you referred to in your first post were recently covered in an article in the Digital Energy Journal, including the use of SACS, MOSES, and PLAXIS for the structural and geotechnical analysis of towers and foundations. The article also mentions the collaboration between Bentley and Siemens, the reality modelling of existing plant with ContextCapture, and asset performance management with AssetWise. It does not mention the use of gINT, SoilVision, and Keynetix for subsurface modelling, but this is certainly a possible use case.

    About one vs. multiple tools, bear in mind that offshore wind is made up of people coming from very diverse backgrounds, who bring to the table their different toolkits and ways of doing things. The industry owes its success precisely to this multi-disciplinary collaboration, and this is something we want to continue supporting.

    Thus, the digital twin is being built progressively, by removing these frictions on the basis of technologies such as our Open Modeling Environment and Connected Data Environment. And with every new release we make our tools converge even more. Ultimately, the debate between using one or multiple tools should be purely commercial, because there will not be any difference in the data that will be generated. This is the true promise of the digital twin.

    Best regards,

    Miquel Lahoz

  • I don't think they would be sold as a pack.

    Yes, this is the existing situation. The engineers come to the vendor and purchase what they want based on what their engineers are familiar with. The result is often a hodgepodge of different apps with duplication, interop friction, and elevated training overheads. The cost of re-training etc. is prohibitive, so the situation becomes entrenched... even though the long-term cost is high.

    Worse: You find a lot of friction due to the need for manual handholding whenever a change is required, so design iterations / quality suffer.

    Over time, the more capable outfits start to accumulate a multitude of homebrew tools of varying quality that do not get updated very often, or that become unusable when the author leaves the company. These tools also raise checkability / assurance questions, forcing more manual overheads.

    Having multiple tools also generates problems when it comes to working with a Common Data Environment (CDE). Can the tool live on ProjectWise etc.? Or SharePoint? User permissions? Etc. The default is to return to the old server and rely on manual document control.

    What about web based or mobile device access? More incompatibility and more costs. Result is low uptake and consequent loss of productivity.

    Cloud-based MDO workflows?

    Client-brew app compatibility?

    Smaller companies can buy perpetual or term licenses of specific products.

    Yes. But in the infrastructure world, the smaller fish usually work in a tiered supply chain. They may not be able to justify ProjectWise etc. but will need to be able to plug into a suite of tools. I imagine that there is probably more of a demand for setup and training services than for specific products, here.

    Looking at the geotech world, there seems to be a plethora of data formats.

    Bentley Systems offers the ELS to cover multi product needs of bigger engineering companies. 

    ELS? Not sure how many companies have them or how successful they are in your neck of the woods. I would say that you get what you pay for even with an 'ELS' contract... having seen them at other companies.

    I recently saw a presentation by a large Australian structural engineer that was brought in late on a large job in the ME. They were forced to put together a 'tool chain' of different apps to automate the analysis and detailed design of a large steel structure. The tool chain was centred around their analysis package (Strand7) and Tekla for detailed design and documentation.

    How would an ELS subscription help here? I cannot think of one that has provided this level of assistance. Maybe in one of the US DoTs? Or in the petrochem world?

    I bet that there was a lot of re-inventing of the wheel, struggling with vendor helplines etc in the example above.

    Silo-perpetuation? Channel Partners probably have a closer relationship to the engineering domain side of things and are better placed to help engineers string together their tools. Hybrid multi-channel partner + Bentley project-specific contracts?

    There is probably more appetite for a consolidated one-stop-shop offer in the Asian context?

    Looking at previous YII presentations, there seem to be a number of very impressive presentations from large Chinese engineering firms... where a consolidated suite of tools has been implemented at project level. I suppose that it should not be a surprise that the Chinese are predisposed to this kind of discipline and forward investment, given the scale and pace of their projects.

    In any case, wind farms look quite lean at the moment in terms of the type of apps required. As mentioned above, it's one tower multiplied thousands of times. The 'secret sauce' expertise required is probably in geotech, wind modelling, installation etc., but a lot of that has already been figured out by the offshore petrochem industry. Western firms still have a chance? :-)

  • Ultimately, the debate between using one or multiple tools should be purely commercial, because there will not be any difference in the data that will be generated. This is the true promise of the digital twin.

    Not sure, but I think that your vision of a digital twin is a little too static. Even if the data incorporated into the DT is relatively static and does not change, there are still the cost and quality challenges that the information producers feeding into the DT will need to deal with. Don't forget that the productivity blockers engineers and designers face pre-date DTs and will still be there after a DT is adopted. In many ways, DTs do not look very relevant to most daily workflows... currently.

    To use your words, what does the Digital Twin promise the information producers, exactly? On the face of it, it promises to provide a smarter exchange format, as the current i.dgn or .ism formats do. iModel.js is the SDK for it and is open-sourced to maximise adoption.

    But, like IFC in the building sector, you do not work in this format. It is mainly for review, and for information storage for facilities management post-design.

    One potential 'promise' would be to enhance cross-application interoperability. A good example of this is the structural discipline, which has multiple analysis and design apps that need to speak to each other.

    But looking at the slow progress of .ism within the Bentley family, I think that the main problem is exposing and defining the dynamic behaviour of all the information that is converted into the ISM / Digital Twin so that there can be bi-directional translation. That takes a long time, and probably means that the developers / domain experts will need to change the way they code from the outset.

    Without this work, the information hosted by the DT will be relatively static and not very useful. You may say that the process of building a bridge app for the DT will be the means of forcing the separate apps to 'expose and document' their info, and that this will lead to better interop... but this does not avoid the fact that it is a lot of work. The result is that a lot of bridge apps only cover the basics, and you are back to wishing everyone was on the same platform.
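
    A toy illustration of why bridging tends to be lossy: each app carries fields the neutral model has no concept of, so a round trip drops or defaults information unless the neutral schema is extended for every discipline. All class and field names here are hypothetical, not real product schemas:

```python
from dataclasses import dataclass

# Hypothetical schemas for two structural apps; names are
# illustrative only, not real Bentley or Tekla APIs.
@dataclass
class AnalysisBeam:        # analysis app: cares about stiffness
    section: str
    length_m: float
    end_releases: str      # e.g. "pinned-pinned"; no detailing equivalent

@dataclass
class DetailingBeam:       # detailing app: cares about fabrication
    section: str
    length_m: float
    coating: str           # no analysis equivalent

@dataclass
class NeutralBeam:         # the 'bridge' model: only the shared subset
    section: str
    length_m: float

def to_neutral(beam):
    # Both directions drop app-specific fields here.
    return NeutralBeam(beam.section, beam.length_m)

def to_detailing(neutral, coating="unknown"):
    # App-specific data must be re-supplied or defaulted on the way back.
    return DetailingBeam(neutral.section, neutral.length_m, coating)

a = AnalysisBeam("UB305x165x40", 6.0, "pinned-pinned")
d = to_detailing(to_neutral(a))
print(d.coating)  # "unknown": the end releases never made it across either
```

    Extending NeutralBeam to carry every discipline's fields is exactly the "expose and document the dynamic behaviour" work described above, which is why most bridge apps stop at the shared basics.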

  • Indeed, there are several trade-offs around interoperability, and many pages have been filled with dilemmas such as flexibility vs. control, or standardisation vs. innovation.

    For sure, it is a complex problem, which is the reason BIM never (yet?) realised the promise of true interoperability. There’s much focus on formats, but formats change all the time, or worse: they become stale and constrain future development. On the other hand, the issue with cross-application interoperability is the sheer number of 1-to-1 links that need to be updated and maintained (which, by simple combinatorics, grows quadratically with the number of applications), unless some common ground is established.
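
    To put rough numbers on that maintenance burden: point-to-point integration needs one link per pair of tools, so the link count grows quadratically, while a shared hub (a common data environment) needs only one link per tool. A quick sketch, with illustrative function names:

```python
def point_to_point_links(n):
    # Every pair of tools needs its own maintained link: n choose 2.
    return n * (n - 1) // 2

def hub_links(n):
    # With common ground, each tool links once to the hub.
    return n

# 20 applications means 190 pairwise links, but only 20 hub links.
for n in (3, 5, 10, 20):
    print(f"{n} tools: {point_to_point_links(n)} pairwise vs {hub_links(n)} hub")
```

    The gap widens fast, which is the quantitative case for establishing common ground rather than maintaining bespoke bridges.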

    Personally, I prefer to think of the digital twin as an open (and, yes, dynamic) modelling environment, where data rather than formats are at the centre; a truly CONNECTED data environment, flexible and accessible for many purposes from multiple applications (by Bentley or otherwise), locations, and disciplines, rather than a theoretical blueprint, rigid data structure, closed solution, or exchange format. I agree some level of data 'bridging' will still be necessary, but the objective is minimising the number of points where this needs to happen, and, when it does, making it as open and transparent as possible.

    This is similar to the vision evoked by Bentley's open-source initiative, iModel.js. Coming from past experiences, I can understand how you’d think “how will it be different this time?”, but I believe we're getting closer to a (if not 'the') solution. Given that this is clearly a topic dear to you, I would also recommend keeping an eye out for the announcements at Year in Infrastructure. I'd be curious as to whether, after watching them, you also think we're getting closer, or rather the opposite.
