Software Quality Tsar?

Who is responsible for software quality at Bentley?

  • It will be interesting to see if there is any answer from Bentley.

    But it also leads to the question: what do you mean by "software quality"? There are different opinions about what software quality is, and it can be identified at different places in the software development chain. So there is probably no single person responsible for quality; depending on the development methodology used (waterfall, some agile principle, etc.) there will be more such people.

    From a user's perspective, I can imagine quality is (as I think ISO also defines it) compliance between the product specification and the functionality actually delivered. That generates a requirement to test the developed product against defined scenarios.
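    A minimal sketch of what I mean, where place_line and the small types are made up for illustration (not any real Bentley API); the pattern is just specification, scenario, assertion:

        // Hypothetical scenario test; names and types are illustrative only.
        #include <cassert>
        #include <cmath>

        struct Point { double x, y; };
        struct Line  { Point start, end; };

        // Stand-in for the product functionality under test.
        Line place_line(Point a, Point b) { return Line{a, b}; }

        double length(const Line& l) {
            return std::hypot(l.end.x - l.start.x, l.end.y - l.start.y);
        }

        int main() {
            // Scenario from the spec: "placing a line between two points
            // yields a line of the expected length".
            Line l = place_line({0.0, 0.0}, {3.0, 4.0});
            assert(std::abs(length(l) - 5.0) < 1e-9);  // spec expects 5.0
            return 0;  // scenario passes
        }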

    With regards,

    Jan
  • Unknown said:
    It will be interesting to see if there is any answer from Bentley.

    Yes... will the real tsar stand up, please.

    Unknown said:
    But it also leads to the question: what do you mean by "software quality"?

    Yes... I was hoping the tsar would clarify this.

  • Dominic,

    These forums are attended to by many Bentley colleagues who are responsible for and contribute to software quality -- developers, analysts, support, documentation, and many others. Viewed as a whole product, software quality encompasses so much. If there are specific things you have in mind, it would be helpful to list them.

    Regards,

    |\/| /\ |< /\ |

    Makai Smith
    Director of Product Management
    MicroStation Product Line
    Bentley Systems, Inc.



  • Sounds like there is no central figure responsible for software quality at Bentley, which is very worrying.

    Is there a 'manifesto' or a spiritual guide for the devs that defines what software quality means to Bentley? At Arup, every engineer is responsible for quality; what is the ethos at Bentley?

    Or should there be a Bentley equivalent of Moore's Law? Bentley has a track record of making very good file-format design decisions that give developers long periods of file-format stability, which is a good thing. But at the moment there seem to be a lot of hiccups in what is being delivered and the pace at which it is being developed.

    1. Development cycles: they seem super slow. Why was GC in 'beta' for so long? Why did it take so long to get a visual UI, apparently going through three different versions? Who makes these decisions? Are they cost driven/bound?

    Doesn't Bentley need a 10-year+ roadmap for its products to enable it to properly budget and resource its teams? A Moore's law-type predictive resource curve would be helpful. Most of Bentley's apps are big and 'old'. To keep up with the competition they need to be constantly improved, on top of the bread-and-butter work and debugging:

    - Windows update every 3 years.
    - DWG etc. update every year?
    - Parasolid update every year?
    - Major platform (Mstn) update every year?
    - Visual Studio / .NET update every 3 years?
    - UI framework every 10-15 years?
    - GPU/multiprocessing every 10 years?
    - 64/128-bit ports every 15 years?
    - Additional hardware platform changes like cloud, handheld (new OS) and AR/VR HoloLens?
    - Programming API shift every 10 years? MDL (C) => MstnJ (Java) => MstnAPI (C++)?

    TR bug resolution and general debugging. Waiting for external framework providers like Intel, Parasolid, Telerik, MS etc. to fix their bugs.

    It doesn't seem like there is much time or resource left for either new or follow-on features if you are not careful. This is a recipe for slow death. There was a fairly recent controversy at Apple about software quality that prompted a 'feature freeze', but it seems that even with no new features there is still a lot of work for the devs just to stand still.

    The 'curve' should also recognise that the longer in the tooth a vertical gets, the larger the effort/team required just to maintain the status quo. If there is no corresponding revenue growth... get rid of, spin off, or consolidate some apps?
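    To put numbers on that 'curve', here is a toy model; every figure is a made-up assumption, not anything Bentley publishes. If upkeep compounds with product age, the headroom left for new features shrinks unless the team grows:

        // Toy resource-curve model; all figures are illustrative assumptions.
        #include <cmath>
        #include <cstdio>

        int main() {
            const double team_size   = 20.0;  // devs on the vertical
            const double base_maint  = 4.0;   // devs needed for upkeep at year 0
            const double growth_rate = 0.10;  // assumed 10%/year compounding upkeep

            for (int year = 0; year <= 15; year += 5) {
                double maint    = base_maint * std::pow(1.0 + growth_rate, year);
                double headroom = team_size - maint;  // capacity left for new features
                std::printf("year %2d: upkeep %4.1f devs, feature headroom %4.1f\n",
                            year, maint, headroom);
            }
            return 0;
        }

    At 10% compounding, upkeep roughly quadruples over 15 years (4.0 to about 16.7 devs); that is the 'slow death' squeeze in numbers.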

    2. Development Dependencies:

    It looks like GC is waiting for Aecosim to port to 64-bit, and probably for some bits at platform level including the Parametric Solids, .NET, scripting etc. to firm up, before it can generate its wrappers. These kinds of logjams are inevitable but very damaging to adoption. The hip crowd is not going to be very receptive if the impression is that things are not improving apace. Select subscribers feel cheated: what does that ContextCapture stuff do for me? I don't have to use it, the bugs I reported years ago are still there, and the new features I want are nowhere to be seen... blah blah.

    By the time a dev gets an opportunity to implement a new feature, the ground under him has already moved and he has to work on something more forbidding like the 64-bit port, Connect features, Windows 10 certification, cloud, ProjectWise etc.

    3. Scope: Parametric solids seem to be missing a lot of fundamentals. The impression is that the dev team does not have first-hand experience of a commercial, production-ready system. Buggy and underwhelming, especially to old-time users used to a different level of 'Bentley' quality.

    I hear that Agile and Scrum are popular at Bentley. This worries me, as I don't think they are very good for producing innovative software. I also suspect that the first thing to suffer is the UI. There needs to be a minimum standard set to prevent the devs taking too many shortcuts.

    Composability: Bentley now has so many verticals and platform/middleware-type apps that composability problems are guaranteed, and they will demand more time/resource allocation. Dev for Aecosim or Prostructures or Maxsurf etc.... lag... PW minimum compatibility... lag... ISM / i-model compatibility... lag... common (structural) components with OpenPlant... lag... or never... GC/Scenario Services compatibility. Never mind linking an analysis app to the modeling app or something major like that.

    This kind of interoperability should be high on Bentley's 'DNA' list, but there doesn't seem to be a 'Moore's Law' curve or roadmap to ramp up to this level. By roadmap, I am thinking of dev kits or SDKs that make it easier for the vertical devs to plug into the functionality.
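    By way of illustration only (this is not a real Bentley interface, just the shape I mean): the platform publishes a small stable surface, and vertical code builds against that instead of platform internals.

        // Hypothetical SDK shape; no real Bentley API is being described here.
        #include <cstdio>
        #include <string>

        // Stable abstract surface published by the platform.
        class IModelStore {
        public:
            virtual ~IModelStore() = default;
            virtual bool attach(const std::string& path) = 0;
            virtual int  elementCount() const = 0;
        };

        // Stub platform implementation; the real one could churn freely
        // underneath without breaking verticals built on the interface.
        class StubStore : public IModelStore {
        public:
            bool attach(const std::string&) override { count_ = 3; return true; }
            int  elementCount() const override { return count_; }
        private:
            int count_ = 0;
        };

        // Vertical code depends only on the interface above.
        int report_size(IModelStore& store, const std::string& path) {
            return store.attach(path) ? store.elementCount() : -1;
        }

        int main() {
            StubStore store;
            std::printf("elements: %d\n", report_size(store, "demo.dgn"));
            return 0;
        }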

    4. Ecosystem:

    So much of the Bentley portfolio of verticals started outside of Bentley. I get the impression that a lot of these apps were poorly written and suffer from bad architectural decisions... which now present huge barriers to rapid development inside Bentley. Why did it take so long to consolidate Geopak, Inroads and MX into OpenRoads? Why is so much of Aecosim's MEP verticals written in VBA? Why are Aecosim's Triforma solids so buggy with large SWAs?

    I suppose in the past there was an IP barrier between the vendor and third-party devs, and not all devs are equally experienced/capable. But there should be minimum quality standards set for all/future developers to ensure that they don't make the same mistakes. If OO is good, then maybe MstnAPI should be required over the old MDL stuff (a sketch of the contrast follows below). Parasolid has a long learning curve; maybe there should be a Bentley 'starter kit' for new devs.

    Visual programming tools: a lot of the perceived UX, and by extension the software's quality, is driven by the ease and predictability of the tools. Dialogs-only doesn't cut it any more. The same goes for the lack of good table-oriented tools, and for scripting. There needs to be more done to help third-party devs onto this ladder. If not, the perception is one of low, stale quality.
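    On the MDL vs. OO point above: both halves of this are caricatures I made up, not the real MDL or MstnAPI surface, but they show the kind of shortcut an OO API can rule out by construction.

        // Illustrative contrast only; neither side is a real Bentley API.
        #include <cstdio>
        #include <memory>

        // Old C style (sketched in comments): the caller owns the cleanup,
        // so a forgotten free is a leak and a double free is a crash.
        //   ElementHandle* eh = element_create(42);
        //   element_modify(eh);
        //   element_free(eh);   // easy to forget

        // OO style: lifetime and invariants live in the type itself.
        class Element {
        public:
            explicit Element(int id) : id_(id) {}
            void modify() { std::printf("modifying element %d\n", id_); }
        private:
            int id_;
        };

        int main() {
            auto eh = std::make_unique<Element>(42);  // freed automatically at scope exit
            eh->modify();
            return 0;
        }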

    5. ...?

  • Unknown said:
    A Moore's law-type predictive resource curve would be helpful

    Moore's Law was an observation, not a prediction, made by Gordon Moore at Intel. In fact, these days it is increasingly inaccurate because of fundamental issues in semiconductor device physics.

     
    Regards, Jon Summers
    LA Solutions
