Creating content for Virtual Reality is one thing, but creating that content whilst in VR is another. This month's Visualisation SIG talked about Bentley's VR direction with regard to LumenRT, but I was wondering whether Bentley has any plans to create a VR interface for MicroStation itself. Both the Unity and Unreal game engines have their own VR editor initiatives, which you can see here:

Unity EditorVR: https://www.youtube.com/watch?v=XYAfXQ0yqls
Unreal VR Editor: https://www.youtube.com/watch?v=1VVr2vMVdjc

Even though both of these solutions are breaking ground with novel approaches to VR input, I don't think either of them will be particularly good at precision modelling. This is where an engineering package like MicroStation might be ideally positioned: implementing something like AccuDraw in VR would give us powerful precision input rather than the crude eyeballing the other approaches settle for.

I know that ergonomics and long design sessions will be an issue, but VR headset comfort and resolution will improve quickly enough that I don't see that as too much of an impediment. Even the current generation of hand controllers will be replaced with much more efficient hand and finger tracking, not to mention pupil tracking and the benefits that will bring to communicating our intent.

All in all, I think the future for VR input looks very exciting, and I'm hoping that Bentley embraces this brave new world :-)
Is there any news from Bentley on this question?
It would make workflows much easier if there were a direct VR interface out of MicroStation.
What we have done so far is to publish a model from MicroStation to LumenRT Update 3. This will recognise the presence of Vive or Oculus hardware and allow you to navigate around the model in LumenRT.
Bentley Systems, Manchester UK