The recent struggles with DG compound walls and floor junctions are more signs that BA is running out of headroom.
http://communities.bentley.com/products/building/building_analysis___design/f/5917/t/40550.aspx
http://communities.bentley.com/products/building/building_analysis___design/f/5917/t/42316.aspx
Bentley needs to look carefully at the current context that basic tools like BA's wall tool need to operate in. There are many ways to look at this, but I would encourage Bentley to emphasize the 'horizontal' dimension of the BIM world as described by Chuck Eastman at GA Tech.
BIM Confusion / Myths:
I think that most productivity gains attributed to BIM are really in what Eastman calls the vertical integration markets, not in horizontal integration. The best example of vertical integration is structural 'hot rolled' steel: shared 3d models, analysis package integration, and automated detailing and fabrication have delivered compressed programmes. Similar productivity gains will be more difficult to repeat in a lot of the other 'trades'.
The problem is the lack of integration in the AEC market; where integration does exist, it is normally 'vertical' in nature. It's a no-brainer for big integrated industries like the petro-chemical sector to invest in defining data dictionaries / ontologies and to capture domain know-how in CAD add-ons for automated re-use: they reap the benefits directly. AEC is more fragmented, and there is little incentive for the big 'social investments' in standardisation, interoperability, and knowledge capture / automation. The last item is probably the biggest problem and has the most relevance for horizontal BIM.
Horizontal BIM:
Vertical BIM is about disambiguation and avoiding islands / silos of automation. Data re-use by avoiding re-entry, ensuring completeness, and constraining or informing the design by the manufacturing processes are commonly cited goals.
Horizontal BIM, on the other hand, is about using representation / heuristics for developing, testing, communicating and coordinating designs. To do this, it needs to provide a scalable computational design platform that handles data-centric, propagating models. The platform also needs to support KBE (knowledge-based engineering) by small and medium sized enterprises that don't have big coding resources. It's curious that Hollywood CGI houses increasingly tend to have a 'technical director' to 'creative' ratio that is 1:1 or greater. I don't think this will happen to the AEC sector anytime soon.
Progressive Representation:
Designs / Information models are developed incrementally, almost in layers. H-BIM must be able to allow the designer to start with reduced data (primarily geometric aka conceptual modeling?) and use the initial constructs to drive ever more detailed or foreign constructs downstream. Enabling paradigms should include associative / history-based parametrics (GC-ish?), rule based selection / processing (Design++, JSpace?), constraints based modelling etc (PowerCivil geometry platform?). While these techniques are not new, there does not appear to be an overarching framework that ties them together, thereby allowing design intent to be franchised / pervasive.
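To make the associative / propagating idea concrete, here is a minimal sketch (in Python, with entirely invented names, not any vendor's API) of how an upstream 'driver' can re-evaluate the more detailed constructs downstream of it when edited:

```python
# Minimal sketch of associative, history-ordered propagation.
# Node names and the grid/wall/slab chain are illustrative only.

class Node:
    def __init__(self, name, compute, inputs=()):
        self.name = name
        self.compute = compute      # function of input values -> value
        self.inputs = list(inputs)  # upstream Nodes this one depends on
        self.dependents = []        # downstream Nodes to re-evaluate
        for n in self.inputs:
            n.dependents.append(self)
        self.value = None

    def evaluate(self):
        self.value = self.compute(*(n.value for n in self.inputs))
        for d in self.dependents:   # propagate the change downstream
            d.evaluate()

# A 'driving dimension' feeding two downstream constructs:
grid_spacing = Node("grid", lambda: 6.0)
wall_length  = Node("wall", lambda g: g * 4, (grid_spacing,))
slab_area    = Node("slab", lambda w: w * w, (wall_length,))

grid_spacing.evaluate()             # initial build: wall 24.0, slab 576.0
grid_spacing.compute = lambda: 7.5  # edit the driving dimension...
grid_spacing.evaluate()             # ...and downstream updates: 30.0, 900.0
```

The point is only that the design intent (wall = 4 grid bays, slab sized from the wall) survives the edit; a real framework would add rule-based selection and constraint solving on top of this.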
Working in this layered way also emphasizes the need for intelligent features like persistent arrays / patterns, ACS's, connectors, workplanes, axes, and points that can be imbued with user-guided 'semantic / ontological' information. These key / orienting / structuring objects need to be able to propagate intelligently through a hierarchy or network structure.
Revit 2010 supports the nesting of reference points, axes, curves, planes, which allows the user to quickly build a ‘rig’ to drive a massing model. The massing model can then be cut up and ‘wallified’. This workflow reminds me that one area that ‘BIM’ has demonstrated ROI is automated area/volume calculations, which is important for the client signoff and cost control process. Massing, programme, room scheduling is often a 3d affair (See DProfiler + Affinity) where the overall envelope or cumulative scalar information like areas can drive walls / floors.
Compared to the 90's, this is a far more complex context, where bi-directional associativity and design intent in the form of driving dimensions / constraints need to be dealt with. BA's Space Planner tool is based on the new EC frameworks. Perhaps walls should be rewritten as EC components? EC looks more dynamic and could provide the common 'federating' ground to tie all the building / structural / civil etc verticals together.
Revit also has some basic functions to ‘glue’ walls to floors and vice versa. As discussed in one of the threads listed above, the functionality provided is minimal. But, why is this? Revit is supposed to be a ‘context driven parametric change engine’ that avoids large constraints sets and long regens that come with history based systems.
http://www.cadalyst.com/aec/1-2-3-revit-not-all-bim-parametric-2929
It has families of components, so it could use a rules-based inference framework to propagate changes that maintain relationships between components. For example, if the user puts a kink in an external wall, a corresponding kink could be propagated to the floor slab. The problem is how to give the user control of this hidden 'hardcoded' behaviour. I can also see the propagation getting out of hand: how does the user restrict the kink to the 'work' floor only? Also, there is no 'history tree' to prioritise things, and there can be concurrent edits merging / propagating through the 'central file', so won't Revit be forced into bigger and bigger combinatorial problems?
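A crude sketch of what such a rules-based propagation (with the scope control the user would need) might look like; every name here is hypothetical, not Revit's actual machinery:

```python
# Rule-based change propagation with an explicit scope: a rule fires on
# an edit event and patches related components, but its match predicate
# confines the effect (here, to the edited wall's own floor).

def propagate(event, components, rules):
    """Apply every rule whose predicate matches the edit event."""
    changed = []
    for rule in rules:
        for comp in components:
            if rule["match"](event, comp):
                rule["apply"](event, comp)
                changed.append(comp["id"])
    return changed

components = [
    {"id": "slab-7", "type": "slab", "floor": 7, "edge_offset": 0.0},
    {"id": "slab-8", "type": "slab", "floor": 8, "edge_offset": 0.0},
]

# Rule: a kink in an external wall pulls the slab edge on the SAME floor only.
kink_rule = {
    "match": lambda ev, c: c["type"] == "slab" and c["floor"] == ev["floor"],
    "apply": lambda ev, c: c.update(edge_offset=ev["kink_depth"]),
}

event = {"type": "wall_kink", "floor": 7, "kink_depth": 0.25}
propagate(event, components, [kink_rule])   # only slab-7 is modified
```

Exposing the match predicate to the user is exactly the control that is missing when the behaviour is hardcoded; it is also where the combinatorics start to bite as rules and concurrent edits multiply.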
Digital Project takes a more transparent MCAD-based approach: it makes use of support planes or surfaces offset / constrained from a system of grid lines. Propagation is history-based but is supplemented with constraints solving in discrete 'sketches'. CATIA elements have rich 'update' methods that include 'delimiters' that look for suitable 'cut-off' elements. So, the slab can be constrained to stop, say, 250mm back from the building envelope plane/surface. Internal walls can be dimensionally constrained to other walls or grids.
GC is also a powerful history-based tool, but it needs to be able to embed its intelligence in whatever it throws 'over the wall' to BA / Mstn. At the minimum, it needs to be able to include 'ontological' info like 'I am a wall on the 7FL, south, ACS: A-7S etc', so that when it is interacting outside GC, other intelligent objects like pipes can react to it. Speedikon has an intelligent 'services penetration' object that automatically recognises pipes that penetrate its walls. PDMS12 stores this kind of design intent as 'attachment points' that are associated with the interacting elements. So when a pipe is moved, the 'attachment points' are queried for the relationships that need to be maintained so that the associated elements can be modified. I suppose PDMS already uses something like Design++ to capture domain production rules that would govern how the modifications are made. Unlike Revit, these relationships would be exposed to the user, as the attachment points are editable, I think.
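A toy illustration of the attachment-point idea (the data structure is my guess at the pattern, not PDMS's actual model): the relationship lives as editable data on the interacting elements, so a pipe move can be resolved by querying it:

```python
# 'Attachment point' pattern: the relationship to maintain is stored as
# plain, user-visible data, not hardcoded behaviour. All names invented.

pipe = {"id": "pipe-1", "route_z": 3.20}
wall = {"id": "wall-S7", "penetrations": []}

# The attachment point records WHICH relationship must be maintained.
attach = {"pipe": pipe, "wall": wall, "rule": "sleeve_follows_pipe"}
wall["penetrations"].append(attach)

def move_pipe(pipe, new_z, attachments):
    pipe["route_z"] = new_z
    # Query the attachment points for relationships involving this pipe,
    # then let the associated element update itself accordingly.
    for a in attachments:
        if a["pipe"] is pipe and a["rule"] == "sleeve_follows_pipe":
            a["sleeve_z"] = new_z   # penetration sleeve tracks the pipe

move_pipe(pipe, 3.55, wall["penetrations"])
```

Because the attachment record is ordinary data, it can be inspected and edited by the user, which is the transparency Revit's hidden behaviour lacks.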
The ‘ontological info’ should be detailed enough so that follow-on objects / tools like Sketchup’s ProfileBuilder can rebuild and react to changes in the reference geometry. So, the wall object should be able to store which edges are top / bottom, internal /external, core / cladding / cavity and other domain specific semantic clues.
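What that payload might look like as data, very roughly (field names are invented, not from Sketchup, GC or any schema):

```python
# Sketch of the 'ontological' payload a wall could carry so that
# follow-on tools can rebuild against it by semantic role, not raw geometry.

wall = {
    "id": "W-7S-01",
    "floor": 7, "orientation": "south", "acs": "A-7S",
    "edges": {
        "e1": {"role": "top",    "face": "external"},
        "e2": {"role": "bottom", "face": "external"},
    },
    "layers": ["cladding", "cavity", "core"],
}

def edges_with_role(wall, role):
    """Let a follow-on tool (a ProfileBuilder-like profiler, say)
    select the edges it needs by semantic role."""
    return [k for k, e in wall["edges"].items() if e["role"] == role]

edges_with_role(wall, "top")   # picks out the top edge(s) by role
```

A profiler that sweeps a coping along whatever edge is tagged 'top' survives a regeneration of the wall; one that remembered edge 17 of a b-rep does not.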
But the clues will vary with the particular interfacing discipline / vertical. Nevertheless, H-BIM needs to provide the linking thread / common denominator: a mapping structure between datasets (insert your metaphor here). What is, or should be, the underlying unifying information? Smart geometry must be high on the list, as ambiguous as it is.
Specifically for walls, ArchiCAD 12's CW tool is a good prototype to leapfrog, I think. Allplan 2009's facade tool has matched a lot of AC12CW's functionality, but the way AC can separate out the schematic grid prefigures a layered, associative, rules-based way of working that is promising.