The Wall?

The recent struggles with DG compound walls and floor junctions are more signs that BA is running out of headroom.

http://communities.bentley.com/products/building/building_analysis___design/f/5917/t/40550.aspx

http://communities.bentley.com/products/building/building_analysis___design/f/5917/t/42316.aspx

Bentley needs to look carefully at the current context that basic tools like BA’s wall tool need to operate in. There are many ways to look at this, but I would encourage Bentley to emphasize the ‘horizontal’ dimension of the BIM world as described by Chuck Eastman at GA Tech.

BIM Confusion / Myths:

I think that most productivity gains attributed to BIM are really in what Eastman calls the vertical integration markets, not in horizontal integration. The best example of vertical integration is structural ‘hot rolled’ steel, where shared 3D models, analysis package integration, and automated detailing and fabrication have delivered compressed programmes. Similar productivity gains will be more difficult to repeat in a lot of the other ‘trades’.

The problem is the lack of integration in the AEC market; the integration that does exist is normally ‘vertical’ in nature. It’s a no-brainer for big integrated industries like the petro-chemical sector to invest in defining data dictionaries / ontologies and to capture domain know-how in CAD add-ons for automated re-use; they reap the benefits directly. AEC is more fragmented, and there is little incentive for the big ‘social investments’ in standardisation, interoperability, and knowledge capture / automation. The last item is probably the biggest problem and has the most relevance for horizontal BIM.

Horizontal BIM:

Vertical BIM is about disambiguation and avoiding islands / silos of automation. Commonly cited goals are data re-use by avoiding re-entry, ensuring completeness, and constraining or informing the design with the manufacturing processes.

Horizontal BIM, on the other hand, is about using representation / heuristics for developing, testing, communicating and coordinating designs. To do this, it needs to provide a scalable computational design platform that handles data-centric, propagating models. The platform also needs to support KBE by small and medium-sized enterprises that don’t have big coding resources. It’s curious that Hollywood CGI houses increasingly tend to have a ‘technical director’ to ‘creative’ ratio of 1:1 or greater. I don’t think this will happen to the AEC sector anytime soon.

Progressive Representation:

Designs / information models are developed incrementally, almost in layers. H-BIM must allow the designer to start with reduced data (primarily geometric, aka conceptual modelling?) and use the initial constructs to drive ever more detailed or foreign constructs downstream. Enabling paradigms should include associative / history-based parametrics (GC-ish?), rule-based selection / processing (Design++, JSpace?), constraint-based modelling, etc. (PowerCivil geometry platform?). While these techniques are not new, there does not appear to be an overarching framework that ties them together and thereby allows design intent to be franchised / made pervasive.
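To make the propagation idea concrete, here is a toy Python sketch of an associative parameter graph, where a schematic driver feeds downstream constructs and an edit ripples through automatically. The classes and names are invented for illustration only; this is not GC’s (or anyone else’s) actual API.

    # Toy sketch of an associative, propagating parameter graph.
    class Param:
        """A named value that knows which derived params depend on it."""
        def __init__(self, name, value=None):
            self.name = name
            self._value = value
            self._rule = None          # optional function of upstream params
            self._inputs = []          # upstream params this one is derived from
            self._dependents = []      # downstream params to re-evaluate on change

        def derive(self, rule, *inputs):
            """Define this param as a rule over upstream params (design intent)."""
            self._rule = rule
            self._inputs = list(inputs)
            for p in inputs:
                p._dependents.append(self)
            self._recompute()
            return self

        def set(self, value):
            """Edit a driving value; changes propagate downstream automatically."""
            self._value = value
            for d in self._dependents:
                d._recompute()

        def _recompute(self):
            if self._rule:
                self._value = self._rule(*(p._value for p in self._inputs))
                for d in self._dependents:
                    d._recompute()

        @property
        def value(self):
            return self._value

    # 'Layered' definition: a schematic grid drives a wall, which drives a panel count.
    bay = Param("bay_spacing", 6000)                  # mm, a driving parameter
    bays = Param("bay_count", 5)
    wall_len = Param("wall_length").derive(lambda s, n: s * n, bay, bays)
    panels = Param("panel_count").derive(lambda length: length // 1200, wall_len)

    bay.set(7500)                                     # edit the driver...
    print(wall_len.value, panels.value)               # ...downstream constructs update

The point is the layering: the grid is defined first with reduced data, and the wall and panel constructs are derived from it rather than drawn.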

Working in this layered way also emphasizes the need for intelligent features like persistent arrays / patterns, ACSs, connectors, workplanes, axes, and points that can be imbued with user-guided ‘semantic / ontological’ information. These key / orienting / structuring objects need to be able to propagate intelligently in a hierarchy or network structure.

Revit 2010 supports the nesting of reference points, axes, curves and planes, which allows the user to quickly build a ‘rig’ to drive a massing model. The massing model can then be cut up and ‘wallified’. This workflow reminds me that one area where ‘BIM’ has demonstrated ROI is automated area / volume calculation, which is important for the client sign-off and cost control process. Massing, programme and room scheduling are often a 3D affair (see DProfiler + Affinity), where the overall envelope or cumulative scalar information like areas can drive walls / floors.
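As a crude illustration of how that kind of cumulative scalar information can be pulled out of a massing model, here is a small Python snippet that totals gross floor areas from per-level footprints. The footprints and level names are made up.

    def polygon_area(pts):
        """Shoelace formula; pts is a list of (x, y) vertices in metres."""
        n = len(pts)
        s = 0.0
        for i in range(n):
            x1, y1 = pts[i]
            x2, y2 = pts[(i + 1) % n]
            s += x1 * y2 - x2 * y1
        return abs(s) / 2.0

    # One footprint per level of the massing envelope (a stepped block here).
    footprints = {
        "L1": [(0, 0), (40, 0), (40, 25), (0, 25)],
        "L2": [(0, 0), (40, 0), (40, 25), (0, 25)],
        "L3": [(0, 0), (30, 0), (30, 25), (0, 25)],   # upper floor set back
    }

    areas = {level: polygon_area(pts) for level, pts in footprints.items()}
    gfa = sum(areas.values())
    print(areas)              # per-level areas for sign-off / cost checks
    print("GFA:", gfa, "m2")  # the cumulative scalar that can drive the brief / budget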

Compared to the ’90s, this is a far more complex context, where bi-directional associativity and design intent in the form of driving dimensions / constraints need to be dealt with. BA’s Space Planner tool is based on the new EC frameworks. Perhaps walls should be rewritten as EC components? EC looks more dynamic and could provide the common ‘federating’ ground to tie all the building / structural / civil etc. verticals together.

Revit also has some basic functions to ‘glue’ walls to floors and vice versa. As discussed in one of the threads listed above, the functionality provided is minimal. But why is this? Revit is supposed to be a ‘context driven parametric change engine’ that avoids the large constraint sets and long regens that come with history-based systems.

http://www.cadalyst.com/aec/1-2-3-revit-not-all-bim-parametric-2929

It has families of components, so it could use a rules-based inference framework to propagate changes that maintain relationships between components. For example, if the user puts a kink in an external wall, a corresponding kink could be propagated to the floor slab. The problem is how to give the user control of this hidden, ‘hardcoded’ behaviour. I can also see the propagation getting out of hand: how does the user restrict the kink to the ‘work’ floor only? Also, since there is no ‘history tree’ to prioritise things, and there can be concurrent edits merging / propagating through the ‘central file’, won’t Revit be forced into bigger and bigger combinatorial problems?
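One way to picture giving the user that control is to make the propagation rule itself an explicit, scoped object rather than hardcoded behaviour. The following toy Python sketch is not Revit’s actual mechanism, and all of the classes and names are invented; it only shows how a user-defined scope could stop the kink from propagating beyond the ‘work’ floor.

    from dataclasses import dataclass, field

    @dataclass
    class Element:
        kind: str                                     # e.g. "wall", "slab"
        level: str                                    # e.g. "07"
        profile: list = field(default_factory=list)   # plan vertices, mm

    @dataclass
    class Rule:
        description: str
        applies_to: callable                          # predicate: (changed, candidate) -> bool
        action: callable                              # how the candidate reacts

    def propagate(changed, model, rules):
        """Fire every rule whose scope predicate matches; nothing is implicit."""
        for other in model:
            if other is changed:
                continue
            for rule in rules:
                if rule.applies_to(changed, other):
                    rule.action(changed, other)

    def follow_wall_line(wall, slab):
        # The user-supplied production rule: the slab edge copies the wall line.
        slab.profile[:] = wall.profile

    # The user restricts the propagation to slabs on the same ('work') floor only.
    kink_rule = Rule(
        description="slab edge follows an external wall kink on the work floor",
        applies_to=lambda w, e: (w.kind == "wall" and e.kind == "slab"
                                 and e.level == w.level),
        action=follow_wall_line,
    )

    wall_7 = Element("wall", "07", [(0, 0), (10000, 0)])
    slab_7 = Element("slab", "07", [(0, 0), (10000, 0)])
    slab_8 = Element("slab", "08", [(0, 0), (10000, 0)])

    wall_7.profile = [(0, 0), (6000, 0), (6000, 500), (10000, 500)]   # put in a kink
    propagate(wall_7, [wall_7, slab_7, slab_8], [kink_rule])

    print(slab_7.profile)   # follows the kink
    print(slab_8.profile)   # untouched: the rule's scope stopped the chain

The scope predicate is the bit that stays hidden in Revit; exposing something like it to the user is what would keep the propagation from getting out of hand.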

Digital Project takes a more transparent, MCAD-based approach: it makes use of support planes or surfaces offset / constrained from a system of grid lines. Propagation is history-based but is supplemented with constraint solving in discrete ‘sketches’. CATIA elements have rich ‘update’ methods that include ‘delimiters’ which look for suitable ‘cut-off’ elements, so a slab can be constrained to stop, say, 250 mm back from the building envelope plane / surface. Internal walls can be dimensionally constrained to other walls or grids.
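Stripped of the CATIA machinery, the delimiter idea is simply that the slab edge is derived from the envelope with a setback rather than drawn, so the intent survives edits to the envelope. A deliberately trivial Python sketch with an axis-aligned rectangle (real envelopes obviously won’t be this simple):

    def slab_from_envelope(envelope, setback=250.0):
        """Inset an axis-aligned rectangular envelope (mm) by a constant setback."""
        (x0, y0), (x1, y1) = envelope
        return ((x0 + setback, y0 + setback), (x1 - setback, y1 - setback))

    envelope = ((0.0, 0.0), (30000.0, 18000.0))    # building envelope plan extents
    print(slab_from_envelope(envelope))            # slab stops 250 mm back

    envelope = ((0.0, 0.0), (32000.0, 18000.0))    # the envelope moves out...
    print(slab_from_envelope(envelope))            # ...the slab re-derives, intent kept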

GC is also a powerful history-based tool, but it needs to be able to embed its intelligence in whatever it throws ‘over the wall’ to BA / Mstn. At a minimum, it needs to be able to include ‘ontological’ info like ‘I am a wall on the 7FL, south, ACS: A-7S, etc.’, so that when it is interacting outside GC, other intelligent objects like pipes can react to it. Speedikon has an intelligent ‘services penetration’ object that automatically recognises pipes that penetrate its walls. PDMS 12 stores this kind of design intent as ‘attachment points’ associated with the interacting elements, so when a pipe is moved, the attachment points are queried for the relationships that need to be maintained and the associated elements are modified accordingly. I suppose PDMS already uses something like Design++ to capture the domain production rules that govern how the modifications are made. Unlike in Revit, these relationships would be exposed to the user, as the attachment points are editable, I think.
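To sketch the attachment-point idea in the abstract: the relationship is stored as data on the interacting elements and is queried when one of them moves. The toy Python below only shows the shape of the idea; it is not PDMS’s or Speedikon’s actual data model, and every name in it is invented.

    class Pipe:
        def __init__(self, name, x):
            self.name = name
            self.x = x                 # centreline position along the wall, mm
            self.attachments = []      # list of (other_element, maintain_fn)

        def move_to(self, x):
            self.x = x
            # Query the attachment points for relationships to maintain.
            for other, maintain in self.attachments:
                maintain(self, other)

    class Wall:
        def __init__(self, name):
            self.name = name
            self.penetrations = {}     # pipe name -> opening position, mm

    def keep_penetration(pipe, wall):
        """Production rule: the wall opening tracks the pipe centreline."""
        wall.penetrations[pipe.name] = pipe.x

    wall = Wall("W-7S")
    pipe = Pipe("P-101", x=3200)
    pipe.attachments.append((wall, keep_penetration))
    keep_penetration(pipe, wall)       # establish the initial opening

    pipe.move_to(4100)                 # move the pipe...
    print(wall.penetrations)           # ...the wall opening follows: {'P-101': 4100}

Because the attachment is plain data on the elements rather than hidden behaviour, it is the sort of thing a user could inspect and edit.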

The ‘ontological info’ should be detailed enough that follow-on objects / tools like Sketchup’s ProfileBuilder can rebuild and react to changes in the reference geometry. So the wall object should be able to store which edges are top / bottom, internal / external, core / cladding / cavity, plus other domain-specific semantic clues.
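Purely as an illustration of what such semantic clues might look like when a wall is handed over to another tool, here is one possible serialisation in Python; every field name is invented.

    wall_export = {
        "id": "A-7S-W03",
        "role": "external wall",
        "level": "07FL",
        "orientation": "south",
        "acs": "A-7S",
        "edges": {
            "top":    {"z": 24300, "bound_by": "07FL slab soffit"},
            "bottom": {"z": 21300, "bound_by": "06FL slab top"},
        },
        "faces": {
            "external": {"normal": (0, -1, 0), "layer": "cladding"},
            "internal": {"normal": (0,  1, 0), "layer": "lining"},
        },
        "core": {"material": "concrete", "thickness_mm": 200},
        "cavity": {"thickness_mm": 50},
    }

    # A follow-on tool (a profile builder, a services clash rule, ...) can then ask
    # targeted questions instead of re-deriving everything from dumb geometry:
    def external_face(wall):
        return wall["faces"]["external"]

    print(external_face(wall_export))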

But the clues will vary with the particular interfacing discipline / vertical. Nevertheless, H-BIM needs to provide the linking thread / common denominator, a mapping structure between datasets... insert your metaphor here... What is, or should be, the underlying unifying information? Smart geometry must be high on the list, as ambiguous as it is.

Specifically for walls, ArchiCAD 12’s CW tool is a good prototype to leapfrog, I think. Allplan 2009’s facade tool has matched a lot of the AC12 CW’s functionality, but the way AC can separate out the schematic grid prefigures a layered, associative, rules-based way of working that is promising.

Parents
  • Really impressive article, though I have to admit that I only understood half of it.

    One thing I wonder about is how intelligence is preserved across different suppliers. Things are easy if you are a big company and can do everything in-house, as in the oil and gas industry. The drawback of this approach is that development is very structured, conservative and boring. There are standards for this and that, and being creative is not easy.

    We have also seen quite a few examples from the aircraft industry showing that things are not quite as smooth as software developers promise their clients. Both the A380 and the B787 are struggling with tremendous problems in their design.

    Do you know how constraints and logical connections are preserved between, e.g., Revit or ArchiCAD models from different suppliers?

  • Andreas,

    not sure what you mean by models from different suppliers, but in Revit, families are created so that they are either parametric - during creation you define the dimensions and the relationships among them, and they can then be changed once the family is placed into the model (think of a family as a compound cell or PAZ file in MS) - or fixed, in which case you cannot change the dimensions of the placed family in a dialog, but have to open it in the family editor and change it there (kind of like editing cells in MS)

    there is one thing: when you start to create a new family, you have to choose from the delivered templates what type of construction it is, so the logic of how the family behaves is strictly defined by the type of construction. there is no way to tell a wall in Revit that it is a slab (as you can in BA, by just assigning a different part). in other words, Revit works like a construction kit with a certain number of system element types; you just put them together based on the functionality of the types. on one side this is a big advantage, on the other it is also a big limitation. of course, there is a massing tool you can use to create almost anything, but these mass elements are "dumb": you have to "assign" or "convert" them to a system type, which is sometimes rather difficult.

    constraints are defined either based on the element type, or you can constrain elements to work planes, grids (which are visible in all views, not just the top view), floor levels ...

    don't know about ArchiCAD, don't use it

    btw, very interesting post, indeed

    p.

    /pt

  • What I mean by different suppliers is that you would probably have the architect, the HVAC engineer and the structural engineer coming from different companies and sitting in different places.

    Do you know in which way they can exchange their models? Will there be any parametric relationships between their models? For example, when the architect changes a wall, will the structural and HVAC engineers get their models changed as well?

    Or is it more or less as we have it now, with references etc.? 

  • Hi Andreas,

    The increasing amount of info that BIM entails implies that propagation is needed. So I guess all CAD models would act like Excel spreadsheets, where changes in one cell can trigger changes in other cells. But, as anyone who works with multiple linked Excel files can tell you, it's not always easy to manage the propagation. This problem is given as the main justification for the single-file setup that Revit uses. This single-model orientation is being eroded as AD starts to realise that stalling CPU speeds are a big problem.

    Bentley has gone with a 'federated' approach, which breaks data storage up into separate files. The problem is how to glue the appropriate changes between elements together. These changes could be asynchronous transactions, so big, database-style change tracking is essential. This is already built in to some Bentley apps that use Oracle Spatial etc. to store big 'inclusive' databases, out of which a small selection of elements or components is checked out for editing. Triforma / JSpace's MCS works in the same way. The big problem is the need to convert the data from the DB format to/from the CAD format, which is a speed overhead. The old ProjectBank way of working had a lot of access speed issues and did not get much traction in user land. I guess Dist Dgn's are a reduced, provisional version of ProjBank.

    The new Integrated Structural Model looks like a smart Dist Dgn which acts as a transaction-managed repository for storing the different types of data generated by disparate design, detailing and analysis apps. The amazing thing is the way changes made in one app can be understood by the others. Maybe this is the best place to find what you are looking for. The problem is that this single-file way of working looks like today's Revit, which has big speed / scalability problems. But ArchiCAD 13 has succeeded in re-jigging its file format to allow 'delta transfer' style synchronisation (a rough sketch of the idea is at the end of this reply), so the tech is there for Revit / ISM to ameliorate their sync / access speed problems. It would be interesting to see if Bentley will produce a similar integrated model format for the architectural and services apps, which hopefully will be able to talk to each other. I suppose the new i-models look like read-only versions of ISM which do not have the 'schema' / interoperability stuff figured out.

    The new OpenPlant app is interesting, as the 'database' info seems to be packaged into the dgn file. Valves and pipe parameters can be stored as icells / assemblies and inserted into dgns without the external DB overhead. Working with OpenPlant means selecting component groups to be checked out, so that a session-specific database can be cobbled together dynamically. I guess components are larger in granularity than the lines, curves, etc. elements that ProjBank was aimed at, so speed is less of a problem with this way of working. Nevertheless, I think AEC-type work needs to be able to mix dumb and copious CAD elements, stored primarily in files, with intelligent, propagating 'components' that are better off in databases. It's interesting to note that CATIA V6 is now DB / central-store based.

    PDMS's hierarchical databases have been around for a long time, but I think it is still transaction based. The big advantage of granularity is that it allows the user to adjust the extent of the working data set dynamically, thereby reducing the reliance on propagation, as the user can call in the interfacing elements for simultaneous editing. The problem is that there will be a need to break the propagation chain at some point. Maybe rules-based healing needs to be integrated.

    Cross-file propagation has been looked at for a long time in the MCAD world; CATIA calls it Relational Modeling. Once again, it is a powerful way of working, but it requires an overarching, server / DB type management system to prevent circular references, spaghetti-style dependencies and other problems (a small sketch of such a check is at the end of this reply). Wildfire 5 has introduced a lot of changes to prevent problematic propagation from causing regeneration failures. In fact, all of the major MCAD vendors are looking at history-free techniques, where the relationships are inferred / generated just in time, in an ad hoc way. So parametric relationships become context dependent, and the relationships between elements can be multiple and dynamic (not fixed). More complexity!?

    I think there is no real magic bullet here. Propagation / parametric relationships are needed to improve productivity, but they create a lot of messy problems as well. The biggest limiting factor is the understandability and manageability of these complex constructs. Partly as a result, I think there is a growing need for a layered way of defining relationships, and for proceduralism to deal with LOD problems. While there is a range of techniques for managing propagation, maybe Bentley should also ensure that there are sufficient measures available to the user to minimise the need for cross-file propagation. For starters, while the big brains are pondering the solution, could we have multiple active models in the same window, please?
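    To put the 'delta transfer' idea above into more concrete terms, here is a rough Python sketch of change-set style synchronisation between two copies of a federated model: only the changed components travel, not the whole file. The element keys, the transaction format and the two-model workflow are all invented for illustration; this is not ArchiCAD's, ISM's or anyone else's actual mechanism.

        import copy

        architect_model = {
            "W-01": {"type": "wall", "level": "02", "length": 9000},
            "S-02": {"type": "slab", "level": "02", "area": 450},
        }
        engineer_model = copy.deepcopy(architect_model)   # downstream federated copy

        def make_changeset(before, after):
            """Diff two model states into a small transaction."""
            cs = {"modified": {}, "added": {}, "deleted": []}
            for key, element in after.items():
                if key not in before:
                    cs["added"][key] = element
                elif element != before[key]:
                    cs["modified"][key] = element
            for key in before:
                if key not in after:
                    cs["deleted"].append(key)
            return cs

        def apply_changeset(model, cs):
            """Replay the transaction on another copy of the model."""
            model.update(cs["added"])
            model.update(cs["modified"])
            for key in cs["deleted"]:
                model.pop(key, None)

        before = copy.deepcopy(architect_model)
        architect_model["W-01"]["length"] = 10500         # the architect edits a wall
        architect_model["W-05"] = {"type": "wall", "level": "02", "length": 4000}

        delta = make_changeset(before, architect_model)   # only the delta is exchanged
        apply_changeset(engineer_model, delta)
        print(delta)
        print(engineer_model["W-01"]["length"])           # 10500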
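    And here is a small sketch of the circular-reference check that an overarching manager for cross-file propagation would need, again with invented file names rather than any product's real behaviour.

        def find_cycle(depends_on):
            """depends_on: {model: [models it references]}. Returns a cycle or None."""
            WHITE, GREY, BLACK = 0, 1, 2
            colour = {m: WHITE for m in depends_on}
            stack = []

            def visit(m):
                colour[m] = GREY
                stack.append(m)
                for ref in depends_on.get(m, []):
                    if colour.get(ref, WHITE) == GREY:        # back edge = a cycle
                        return stack[stack.index(ref):] + [ref]
                    if colour.get(ref, WHITE) == WHITE:
                        found = visit(ref)
                        if found:
                            return found
                stack.pop()
                colour[m] = BLACK
                return None

            for m in depends_on:
                if colour[m] == WHITE:
                    found = visit(m)
                    if found:
                        return found
            return None

        refs = {
            "arch.dgn": ["struct.dgn"],
            "struct.dgn": ["services.dgn"],
            "services.dgn": ["arch.dgn"],    # services drives arch drives struct...
        }
        print(find_cycle(refs))
        # ['arch.dgn', 'struct.dgn', 'services.dgn', 'arch.dgn']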
