Augmented reality (AR) technology has been around for several years now, and has already been applied to various areas of engineering.
In those applications, AR is mostly used to display information, and presenting virtual data in its physical-world context has great value. But another important component of engineering work is design.
Engineering design and AR/VR
Engineering design is typically done on a desktop computer, using CAD software. Models can be viewed on a monitor, but may also be experienced in augmented or virtual reality (VR). Changes to designs are usually done on the computer, and the modified models are then loaded again in AR/VR apps for visualization.
Such a dichotomy is understandable: CAD drafting tools are designed for accuracy and rely on stable input devices such as a keyboard and mouse, both resting on a flat table surface. AR and VR, on the other hand, rely on imprecise and unstable hand gestures. Consequently, the desktop is often preferred for accurate, engineering-quality design, and AR/VR for visualization.
A design/visualization continuum
Perhaps the design/visualization process should instead be a continuum, where each platform loads and displays the same design files, and all platforms enable design, markup, and visualization. All we need is to provide the appropriate tools on each platform. Two years ago, our team worked on the problem and came up with a simple set of drawing tools, along with tools that enhance drafting accuracy, such as angle, length, and element snapping, and snapping to physical-world objects. This way, one can draft based on existing physical assets by snapping design elements to them.
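To make the snapping idea concrete, here is a minimal sketch of what such accuracy helpers might look like. The function names, tolerances, and increments below are illustrative assumptions, not our actual implementation: angles snap to a fixed increment, lengths snap to a grid, and a point snaps to the nearest candidate feature (an element endpoint or a detected physical-world point) when it is close enough.

```python
import math

def snap_angle(angle_deg, increment=15.0):
    # Snap an angle to the nearest multiple of `increment` degrees.
    return round(angle_deg / increment) * increment

def snap_length(length, grid=0.05):
    # Snap a length to the nearest multiple of `grid` (e.g. 5 cm).
    return round(length / grid) * grid

def snap_to_point(p, candidates, tolerance=0.02):
    # Snap point `p` to the nearest candidate point (an endpoint of an
    # existing design element, or a feature detected on a physical
    # object) if one lies within `tolerance`; otherwise keep `p`.
    best = min(candidates, key=lambda c: math.dist(p, c), default=None)
    if best is not None and math.dist(p, best) <= tolerance:
        return best
    return p
```

The same pattern generalizes: each gesture-derived value is passed through a snapping filter before it becomes part of the design, which is how imprecise hand input can still yield engineering-quality geometry.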
Such a technology could be used, for instance, when an engineer needs to design a bypass between two existing pipes. He would normally have to first take measurements of the pipes, or scan them with a laser scanner, take those measurements back to his office, import them into CAD software, do his design based on them, and possibly return to the site for more measurements. What if he could do all that directly on-site, using AR? Not only might this be much quicker, but he could see right away whether the proposed design fits the environment. Here is an example for the design of a small table extension.
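As a rough sketch of the geometry behind the bypass scenario, suppose each measured pipe is modeled as a straight centerline (a point on the axis plus a direction vector); the shortest straight bypass then runs between the closest points of the two axes. The function below is a hypothetical illustration using the standard closest-points-between-two-lines formula, not code from any actual product.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def bypass_endpoints(p1, d1, p2, d2):
    # Closest points between two pipe centerlines, each given as a
    # point `p` on the axis and a direction `d`. A straight bypass
    # would attach at the two returned points.
    w = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:          # parallel axes: drop a foot from p1
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    q1 = tuple(pi + s * di for pi, di in zip(p1, d1))
    q2 = tuple(pi + t * di for pi, di in zip(p2, d2))
    return q1, q2
```

In an AR workflow, `p` and `d` for each pipe would come from on-site measurement or recognition rather than from a manually entered survey, which is precisely what removes the round trip to the office.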
We envision a future where the AR drafting system would not only snap virtual design elements to physical objects, but also recognize and measure them. It could then propose compatible components, such as pipe connectors, to assist the user in his design task, and, why not, prepare and send the purchase order for approval...