Will MicroStation have a VR interface?

Creating content for Virtual Reality is one thing, but creating that content whilst in VR is another. This month’s Visualisation SIG talked about Bentley’s VR direction with regard to LumenRT, but I was wondering whether Bentley has any plans to create a VR interface for MicroStation itself?

Both the Unity and Unreal game engines have their own VR editor initiatives, which you can see here...

Unity EditorVR: https://www.youtube.com/watch?v=XYAfXQ0yqls

Unreal VR Editor: https://www.youtube.com/watch?v=1VVr2vMVdjc

Even though both of these solutions are breaking ground with novel approaches to VR input, I don’t think either of them will be particularly good at precision modelling. This is where an engineering package like MicroStation might be ideally positioned to implement something like AccuDraw in VR and get us going with powerful precision input rather than the crude eyeballing that the other approaches settle on.
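Just to illustrate what I mean by precision input, here is a purely hypothetical sketch (my own Python pseudo-example, not anything from Bentley’s APIs): instead of accepting the hand position verbatim, the raw controller reading could be run through an AccuDraw-style filter that locks to the nearest axis of a compass anchored at the last data point and rounds the distance to a working increment.

```python
# Hypothetical sketch only - not a Bentley API. Shows how AccuDraw-style
# constraints could turn a wobbly, hand-tracked controller position into
# precise coordinate input instead of free-hand eyeballing.

from dataclasses import dataclass

@dataclass
class Point3d:
    x: float
    y: float
    z: float

def snap_to_compass(origin: Point3d, raw: Point3d, increment: float = 0.25) -> Point3d:
    """Lock the input to the nearest principal axis of a compass at `origin`
    and round the offset along that axis to the nearest `increment`."""
    dx, dy, dz = raw.x - origin.x, raw.y - origin.y, raw.z - origin.z
    # Pick the dominant axis, mimicking AccuDraw's axis indexing.
    axis = max((abs(dx), 'x'), (abs(dy), 'y'), (abs(dz), 'z'))[1]
    # Round the offset along that axis to the working increment; zero the rest.
    offset = {'x': dx, 'y': dy, 'z': dz}[axis]
    snapped = {'x': 0.0, 'y': 0.0, 'z': 0.0}
    snapped[axis] = round(offset / increment) * increment
    return Point3d(origin.x + snapped['x'],
                   origin.y + snapped['y'],
                   origin.z + snapped['z'])

# Example: a shaky reading near the X axis snaps to a clean 1.25 unit offset.
last_point = Point3d(0.0, 0.0, 0.0)
controller = Point3d(1.23, 0.04, -0.02)
print(snap_to_compass(last_point, controller))  # Point3d(x=1.25, y=0.0, z=0.0)
```

Obviously the real thing would need rotated compasses, keyboard shortcuts, snaps and so on, but even something this simple would beat the eyeballing the game-engine editors rely on.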

I know that ergonomics will be an issue during long design sessions, but VR headset comfort and resolution will improve quickly enough that I don’t see that as too much of an impediment. Even the current generation of hand controllers will eventually be replaced with much more efficient hand and finger tracking, not to mention pupil tracking and the benefits that will bring to communicating our intent.

All in all, I think the future for VR input looks very exciting, and I’m hoping that Bentley embrace this brave new world :-)



Cheers,

Andrew.