Rob Snyder for Bentley hero?

Do we have heroes in the Bentley world?

The inventor of AccuDraw (c.1994) - Rob Brown - was one, without doubt, apparently with much input from MicroStation-teacher hero Keith Little - but all that happened a bit before I entered.

Robert Aish seemed the next star, but somehow, 18 years on, his GC is still unimplemented potential, which maybe Adesk will do more with, since Robert jumped ship. In 2008 Volker Mueller's brief to develop on-the-fly or what-if assessment tools (similar to IES's building energy tools, maybe) seemed to me an ideal 'generative' use of a GC extended to script non-graphical attributes as well as physical elements. But Bentley seems to have dropped that ball too. Acquisition of Hevacomp is NOT it!

Now, I'd draw attention to Rob Snyder, if everyone doesn't already know; see Rob Snyder's Blog, and http://dagsljus.wordpress.com, and just look at http://youtu.be/kQPxPF-lf5I! This is hot stuff, which, if implemented, would put Bentley way ahead and answer many of the present grumbles about the multiple fussy, undocumented, geek's-delight, unintegrated bolt-ons that have lately made MS into a fragmented, overweight dinosaur, while competitors race ahead in the usability stakes. For example:

Unknown said:
You mean like Revit? Revit DBLink? Revit Ideate BIMLink? etc etc

Mstn has tags, fields, Feature Solids / DDD's variable table, GC; AECOsim has its DGS, F+P, PCS, PFB.... and none of them talk to each other. To echo Emil, Bentley needs to round up all those little coders working in isolation and get them working on ONE 'parametric' platform for this stuff. They will probably need to knock up something like Revit's user-parameter-centred change management system, so that when something is changed somewhere, it gets 'revised instantly' everywhere else. The change management system tests and marshals the updates centrally, and does not rely on each coder to 'manually' dictate the steps from their little module's perspective, which is the same as pre-ordaining that none of the modules will 'talk' to any other module or tool.

Revit even has a Dynamic Update scheme that organises how third-party tools propagate their changes amongst themselves and the host platform.
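
Purely as an illustration of what such a central change manager - the 'traffic cop' - amounts to (the names below are invented, nothing to do with Revit's or Bentley's real APIs): modules register what depends on what, and one manager owns the dependency graph and decides propagation order, so two modules that have never heard of each other still end up in the same update walk.

    # Hypothetical sketch of a centralised change manager.
    from collections import defaultdict, deque

    class ChangeManager:
        def __init__(self):
            self.dependents = defaultdict(set)   # element -> elements that depend on it
            self.updaters = {}                   # element -> callback that refreshes it

        def register(self, element, updater, depends_on=()):
            self.updaters[element] = updater
            for source in depends_on:
                self.dependents[source].add(element)

        def element_changed(self, element):
            # Breadth-first walk so every affected element is refreshed exactly once.
            # A production system would also order the affected set topologically,
            # so nothing updates before its prerequisites have.
            queue, seen = deque([element]), {element}
            while queue:
                current = queue.popleft()
                for dependent in self.dependents[current]:
                    if dependent not in seen:
                        seen.add(dependent)
                        self.updaters[dependent](current)   # one central call site
                        queue.append(dependent)

    # Usage: the tag module never calls the wall module, and vice versa.
    mgr = ChangeManager()
    mgr.register("wall", updater=lambda src: None)
    mgr.register("door_tag", updater=lambda src: print("re-tag after", src), depends_on=["wall"])
    mgr.element_changed("wall")    # -> re-tag after wall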

Back to http://youtu.be/kQPxPF-lf5I: about 14 minutes in there is fantastically wonderful dynamic clipping, such as SpaceClaim and others already have.

Rob writes dense but clear stuff that crystallises what I've been longingly groping towards, or somehow hoped granddaddy MS/AECOsim would already have when I re-bought in a year ago.

Bentley pioneers amazing research vision but in the end delivers so little, too late.

Parents
  • FWIW, native 64bit would be the first hurdle to clear before making the type of changes you're referring to.  There's no sense in developing 32bit features that will only have to be replaced by 64bit versions in the near future.  Cart before the horse, and all that.     ;-)

    Also, Parametric Content Modeler is the first step towards that single parametric 3D modeling tool.  It's obviously too soon to know exactly how it will pan out, but AFAIK that's the direction we're heading.

    Lastly..   Rob's blog articles are great!   I may work at Bentley, but I still get a lot out of what Rob has been posting.  



  • Tom,

    " it didn’t do anything to contribute to the definition of parametric building design that wasn’t already documented"

    I don't think that they would have succeeded in getting a patent if the concept wasn't novel in some way.

    I don't think BOB and BIM were around, or had much mindshare, when Irwin Jungreis et al. conceived Revit. They came from an MCAD Pro/E background, apparently.

    What I find interesting about Revit's 'contextual change engine' is that it provides a centralised change management system. The patent describes a convoluted but interesting way of marshalling forward-chaining propagations in advance of committing those changes.

    Why is this interesting? It's very computationally expensive... so why do it? Revit's founders must have seen something that compelled them to provide such a system. It reminds me of all the dynamic typing, reflection, managed code, garbage collection and compiler-as-a-service stuff that the software platform guys provide.

    If everything can be linked to everything else, and the user or third-party developers are allowed to define their own dependencies, then at some point you need an overarching 'traffic cop' to manage the way things propagate... so that everything can 'revise instantly'.

    I wonder if the new PCM will have to do the same thing at some point.
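
    To make that forward-chaining point concrete, a toy sketch (hypothetical names, not what the patent actually specifies): the distinctive part is the two-phase shape - chain through the dependents to plan and validate every consequential change first, and commit only when nothing conflicts, so a failed propagation never leaves the model half-updated.

        # Toy two-phase propagation: plan every consequence first, write once at the end.
        class ConflictError(Exception):
            pass

        def propagate(initial_changes, dependents_of, commit):
            # initial_changes: {element: proposed_state}
            # dependents_of(element) -> [(dependent, rule)], where rule(planned) -> state
            # commit(planned) writes the whole batch in one transaction
            planned = dict(initial_changes)
            frontier = list(initial_changes)
            while frontier:                              # forward-chain through dependents
                element = frontier.pop()
                for dependent, rule in dependents_of(element):
                    proposed = rule(planned)             # consequence of the upstream edits
                    if planned.get(dependent, proposed) != proposed:
                        raise ConflictError(dependent)   # abort - nothing has been written yet
                    if dependent not in planned:
                        planned[dependent] = proposed
                        frontier.append(dependent)
            commit(planned)                              # the model is only touched here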

  • Yea... but even Bentley does them.

    They don't make easy reading either :-)

    Maybe it's about obfuscating the idea enough so that your ideas are protected but you don't give your implementation away for those guys in Outer Mongolia to copy.

  • Dominic,

    Understood. And I was just commenting on the ambiguity of the paper.

    On your comments: “It's very computationally expensive... so why do it?” and “If everything can be linked to everything else, and the user or third-party developers are allowed to define their own dependencies, then at some point you need an overarching 'traffic cop' to manage the way things propagate... so that everything can 'revise instantly'.”

    You are absolutely correct.

    I’d like to add…

    If you try and parametrically connect everything to everything else, at some point you move something or delete something and everything goes sproing, like someone opened a box of compressed springs, and your design becomes some cubist expression!

    Even if you have the ability to parametrically control things off of any other object, it becomes the designer’s responsibility to control what is connected to what, and how. So it’s easy to imagine that the task of assigning parametric controllers could overwhelm a designer. It is better to offer sets of tools, such as a curtain wall object, in which a given set of methods is exposed for the designer to control. I’m not a big believer in the goal of everything being parametric, even if it were achievable today. And so often when systems offer automatic parametrics, such as the relationship of one wall to another, they don’t represent how a designer would build the relationship.

    Don’t get me wrong, I like parametric objects. I just don’t like it when they are arbitrarily assigned, as they are in Revit (or even in AECOsim at times, although in AECOsim we have a tendency to ignore them because we have other ways of manipulating the objects). When I copy a set of beams 2’-0” (600mm) o.c. I don’t want them individually tied to each other with that spacing; I want them spaced at an on-center spacing (a toy sketch of the difference follows at the end of this post). So parametric objects, or methods which can be applied to objects in ways that mimic the real world, make more sense than senseless parametrics for parametrics’ sake. An all-encompassing, all-knowing parametric god isn’t likely to be able to discern the heuristics, at least with today’s state of computational capability.

    And if this begins to hijack the intent of giving Rob kudos, then please accept my apology!
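
    To picture that beam example as a toy sketch (invented names, not AECOsim or Revit data structures): copying the beams with pairwise ties duplicates the 2'-0" intent nine times, while a single on-center rule keeps it in one editable place.

        # Hypothetical contrast for a row of ten copied beams at 600 mm o.c.

        # What is objected to above: each beam individually tied to its neighbour,
        # so the spacing intent is duplicated nine times.
        pairwise_ties = [
            {"beam": i, "relative_to": i - 1, "offset_mm": 600} for i in range(1, 10)
        ]

        # The alternative: one rule carrying the intent for the whole set.
        class OnCenterRule:
            def __init__(self, reference_x, spacing_mm, count):
                self.reference_x = reference_x
                self.spacing_mm = spacing_mm
                self.count = count

            def positions(self):
                return [self.reference_x + i * self.spacing_mm for i in range(self.count)]

        row = OnCenterRule(reference_x=0, spacing_mm=600, count=10)
        row.spacing_mm = 450           # one edit re-spaces the row; no nine ties to chase down
        print(row.positions()[:3])     # [0, 450, 900]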


Children
  • Tom,

    Horses for courses. I think there is a lot of room for parametric versus dataflow (GC) versus constraint solving (DDD, Civil Cells, PCS, PCM) versus rules-based (D++, iLogic or CityEngine) versus ArchiCAD's Priority Based Connections versus T-Splines' Sub-D cage modeling versus Siemens/LEDAS/CoCreate's Direct Modeling, etc., to co-exist.

    The problem is that we are faced with the need to manage, control, sculpt more and more information. A lot will need to be given over to 'fly-by-wire'. Yes, the more you automate, the more can go sideways... but do you really have a choice long term?

    Every approach or paradigm will have its limitations. At some point you will need to change gears. You may find that using GC to quickly script something in the initial stages is exactly what you need. Or you may very likely find that the pre-packaged, compartmentalised parametric objects that PCS creates are good enough for most things... or you may find that bi-directional constraint solving is the only way to go. You may even find your tasks worth investing in a KBE expert system like D++, and fully automate stuff like Robertson Ceco did with its prefab metal buildings.

    You may not be a "big believer", but that is beside the point if you are developing a 'platform' tech like Mstn. Revit and others have taken parametrics very far, and few Revit users will go without parametrics even if they moan constantly about its faults... and they even learn to use constraints sparingly to limit conflicts and speed problems. With Dynamo, they are now adding the GC-style 'dataflow' paradigm to their portfolio of tools.
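
    As a toy illustration of how two of those paradigms differ (nothing to do with GC's or PCM's real internals): in dataflow, values travel one way through the graph, so you edit inputs and the outputs recompute; in constraint solving, the same relation can be driven from whichever end is unknown.

        # Toy contrast: one-way dataflow vs. a bidirectional relation.

        # Dataflow (GC-style): inputs drive outputs, never the reverse.
        def bay_width(total_width, bay_count):
            return total_width / bay_count

        print(bay_width(30.0, 5))                           # 6.0

        # Constraint-style: total_width = bay_count * bay_width, solved for
        # whichever single variable is passed as None.
        def solve_bays(total_width=None, bay_count=None, bay_width=None):
            if total_width is None:
                return bay_count * bay_width, bay_count, bay_width
            if bay_count is None:
                return total_width, total_width / bay_width, bay_width
            return total_width, bay_count, total_width / bay_count

        print(solve_bays(total_width=30.0, bay_width=7.5))  # (30.0, 4.0, 7.5)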

    There always seems to be a divide between the CAD drafting perspective and the engineer's / designer's perspective.

    CAD drafties like to keep things simple and manageable. They are often at the end of the line of the design / documentation process and don't like surprises.

    Engineers and designers, on the other hand, need to automate as much as possible because they need to 'form-find' and generate and evaluate a lot of potential solutions. They need a lot of automation, propagative relationships, etc.

    They need to be able to change all storey heights using parameters, for example... especially during the front-end design phase. But as the elements pile up, it gets harder and harder to manage... that's when the parametric 'smarts' need to be suppressed or superseded by a different means of holding everything together.

    Ideally, we need to identify the types of smart tools or objects that are, or should be, 'persistent'. Grids, for example? Or storey levels.

    We also need to ensure the smarts that we use in one stage can be handled by the following stage. A plant design using PlantWise or GC will ideally need each to 'hand over' to the other, or coexist, without having to deal with too much 'zombie-ness'.

    As a design develops, more and more detail will need to be added... which means more parametrics to keep things productive. For example, Structural Modeler doesn't have much in terms of connection-design smarts or productivity tools compared to ProStructures or Power Rebar.

    The schema for the connections used in SM will need to be expandable by PS. And where PS has smarts like its 3D Workframe, then SM should be using the same tool up front... as it is 'persistent' and common to both stages of design and documentation.
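
    One way to read that 'expandable schema' point, purely as an illustrative sketch (the class names are made up, not the actual SM or ProStructures data model): the upstream tool defines the connection record, and the detailing tool extends it without breaking what the upstream tool can still read.

        # Hypothetical shared schema: authored upstream, extended by the detailing tool.
        from dataclasses import dataclass

        @dataclass
        class Connection:                        # what an SM-like tool authors
            node_id: str
            members: list
            connection_type: str = "pinned"

        @dataclass
        class DetailedConnection(Connection):    # what a PS-like tool adds later
            bolt_grade: str = "8.8"
            plate_thickness_mm: float = 10.0

        # The upstream tool keeps reading the fields it knows about; the detailing
        # data rides along in the same object instead of living in a separate model.
        conn = DetailedConnection(node_id="N12", members=["B1", "B2"], plate_thickness_mm=12.0)
        print(conn.connection_type, conn.plate_thickness_mm)   # pinned 12.0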

    You hit the nail on the head when separating a Drafter and a Designer.

    We are somewhere in the middle

    We have not received any drafting tools in probably 15 years.

    With V8i we started toward the 3D aspects of the Designer, but have not gone far enough there either.

    Two distinct and different tasks sharing only 50% of the same tools.

    Ustn since 1988
    SS4 - i7-3.45Ghz-16 Gb-250/1Tb/1Tb-Win8.1-64b

    Eric D. Milberger
    Architect + Master Planner + BIM

    Senior  Master Planner NASA - Marshall Space Flight Center

    The Milberger Architectural Group, llc