Offshore wind seems to be kicking into gear in the UK. I wonder whether Bentley is, or should be, putting together a 'tool chain' for offshore wind turbines:
1. Plaxis Modeto for the turbine towers.
2. gINT, SoilVision, Keynetix for marine geology analysis?
3. SACS/MOSES for any offshore rig stuff
4. Undersea cable: Siemens??
5. Specialist vessel design: MaxSurf
6. Onshore battery / converter etc facility: the usual suspects
7. Artificial Island: OpenSite Designer with Plaxis?
Wind farms are taking off all over the world. Datgel has developed support for import/export of Orsted AGS4+, and for the import and presentation of thermal conductivity cones, to support offshore geotechnical users of Datgel's gINT Add-Ins.
Your list of products would be used by a wide range of engineers in different specialities, so I don't think they would be sold as a pack. Bentley Systems offers the ELS to cover the multi-product needs of bigger engineering companies; smaller companies can buy perpetual or term licenses of specific products.
Phil Wade | Datgel | Bentley Channel Partner and Developer Partner | E: firstname.lastname@example.org | T: +61 2 8202 8600 & +65 6631 9780
Get the most out of gINT with Datgel Tools.
I imagine that you are right. North Sea undersea geotechnical modelling has been around for a while, led by petrochemical players who developed their own tools and databases.
An attractive proposition engineering-wise, as the design of the turbine structures could easily be standardised in a combined design/analysis tool like OpenBridge Designer etc. One tool, thousands of towers.
The turbine towers are manufactured by Vestas etc., so there would presumably already be some in-house app with all the PLM/ERP support?
All the electrical transmission stuff provided by someone like Siemens?
How would you provide a Digital Twin in this context?
Fragmented tools will mean interop friction and waste, so there must be pressure somewhere to consolidate around an information tool pipeline?
There is probably more appetite for a consolidated one-stop-shop offer in the Asian context? Sunda Shelf looks interesting...
Indeed, offshore wind is really taking off: from a research-driven curiosity, it is growing into one of the most competitive forms of energy in terms of price per MWh, and is well on track to become even cheaper than onshore wind.
Coincidentally, the synergies you refer to in your first post were recently covered in an article in the Digital Energy Journal, including the use of SACS, MOSES, and PLAXIS for the structural and geotechnical analysis of towers and foundations. The article also mentions the collaboration between Bentley and Siemens, the reality modelling of existing plant with ContextCapture, and asset performance management with AssetWise. It does not mention the use of gINT, SoilVision, and Keynetix for subsurface modelling, but this is certainly a possible use case.
About one vs. multiple tools, bear in mind that offshore wind is made up of people from very diverse backgrounds, who bring to the table their different toolkits and ways of doing things. The industry owes its success precisely to this multi-disciplinary collaboration, and this is something we want to continue supporting.
Thus, the digital twin is being built progressively, by removing these frictions on the basis of technologies such as our Open Modeling Environment and Connected Data Environment. And with every new release we make our tools converge even more. Ultimately, the debate between using one or multiple tools should be purely commercial, because there will not be any difference in the data that will be generated. This is the true promise of the digital twin.
Miquel Lahoz said: "Ultimately, the debate between using one or multiple tools should be purely commercial, because there will not be any difference in the data that will be generated. This is the true promise of the digital twin."
I'm not sure, but I think your vision of a digital twin is a little too static. Even if the data incorporated into the DT is relatively static and does not change, there are still the cost and quality challenges that the information producers feeding the DT will need to deal with. Don't forget that the productivity blockers engineers and designers face precede DTs and will still be there after a DT is adopted. In many ways, DTs do not look very relevant to most daily workflows... currently.
To use your words, what does the Digital Twin promise the information producers, exactly? On the face of it, it promises a smarter exchange format, like the current i.dgn or .ism formats do. iModel.js is the SDK for it and is open-sourced to maximise adoption.
But, like IFC in the building sector, you do not work in this format. It is mainly for review, and for information storage for facilities management post-design.
One potential 'promise' would be to enhance cross-application interop. A good example is the structural discipline, which has multiple analysis and design apps that need to speak to each other.
But looking at the slow progress of .ism within the Bentley family, I think the main problem is exposing and defining the dynamic behaviour of all the information that is converted into the ISM / digital twin, so that bi-directional translation is possible. That takes a long time, and probably means that the developers / domain experts will need to change the way they code from the outset.
Without this work, the information hosted by the DT will be relatively static and not very useful. You may say that the process of building a bridge app for the DT will be the means of forcing the separate apps to 'expose and document' their info, and that this will lead to better interop... but this does not avoid the fact that it is a lot of work. The result is that many bridge apps only cover the basics, and you are back to wishing everyone was on the same platform.
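To make the bi-directional translation point concrete, here is a minimal, entirely hypothetical sketch (the schema, class, and field names are invented for illustration, not taken from ISM or any Bentley API). The key burden it shows: a bridge app must implement both directions and round-trip every field faithfully, or the shared model degrades into the static snapshot described above.

```python
from dataclasses import dataclass

# Hypothetical neutral schema: the shared representation every app maps to.
@dataclass
class NeutralBeam:
    section: str
    length_m: float  # the neutral schema standardises on metres

class AnalysisAppBridge:
    """Illustrative bridge for one hypothetical analysis app whose
    native records use millimetres and abbreviated keys. Both directions
    must be maintained, and every field must survive the round trip."""

    def export_beam(self, native: dict) -> NeutralBeam:
        # Native -> neutral: rename fields and convert mm to m.
        return NeutralBeam(section=native["sec"], length_m=native["len"] / 1000)

    def import_beam(self, beam: NeutralBeam) -> dict:
        # Neutral -> native: the exact inverse mapping.
        return {"sec": beam.section, "len": int(beam.length_m * 1000)}
```

Multiply this by every element type, property, and behaviour an app exposes, and the "lot of work" becomes clear: the round trip must hold for all of it, in both directions, release after release.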
Indeed, there are several trade-offs around interoperability, and many pages have been filled with dilemmas such as flexibility vs. control, or standardisation vs. innovation.
For sure, it is a complex problem, which is the reason BIM has never (yet?) realised the promise of true interoperability. There's much focus on formats, but formats change all the time, or worse: they become stale and constrain future development. On the other hand, the issue with cross-application interoperability is the sheer number of 1-to-1 links that need to be updated and maintained (which grows quadratically with the number of applications), unless some common ground is established.
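The scaling argument is just counting: with n applications, point-to-point integration needs one translator per pair, n(n-1)/2 of them, while a shared data model needs only one adapter per application. A quick sketch of the arithmetic (function names are my own, purely for illustration):

```python
from math import comb

def pairwise_links(n_apps: int) -> int:
    """1-to-1 translators needed to connect every pair of apps directly.
    comb(n, 2) = n * (n - 1) / 2, i.e. quadratic growth."""
    return comb(n_apps, 2)

def hub_links(n_apps: int) -> int:
    """Adapters needed when every app maps to one shared data model:
    one per app, i.e. linear growth."""
    return n_apps

for n in (5, 10, 20):
    print(f"{n} apps: {pairwise_links(n)} pairwise links vs {hub_links(n)} hub adapters")
```

So at 20 applications the direct approach already implies maintaining 190 translators against 20 adapters, which is the pressure toward common ground described above.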
Personally, I prefer to think of the digital twin as an open (and, yes, dynamic) modelling environment, where data rather than formats are at the centre; a truly CONNECTED data environment, flexible and accessible for many purposes from multiple applications (by Bentley or otherwise), locations, and disciplines, rather than a theoretical blueprint, rigid data structure, closed solution, or exchange format. I agree some level of data 'bridging' will still be necessary, but the objective is minimising the number of points where this needs to happen, and, when it does, making it as open and transparent as possible.
This is similar to the vision evoked by Bentley's open-source initiative, iModel.js. Coming from past experiences, I can understand how you'd think "how will it be different this time?", but I believe we're getting closer to a (if not 'the') solution. Given that this is clearly a topic dear to you, I would also recommend keeping an eye out for the announcements at Year in Infrastructure. I'd be curious whether, after watching them, you also think we're getting closer, or rather the opposite.
Miquel Lahoz said: "I would also recommend keeping an eye out for the announcements at Year in Infrastructure."
... that was quick!
More details to come during YII.