Using Augmented Reality to Facilitate the Interpretation of 2D Construction Drawings

Construction is a complex process aimed at the development of physical infrastructure. Designers propose a 3D building concept, and ultimately builders create the 3D object that corresponds to the designer’s idea. However, even though designers may have produced a 3D model of their design, drawings, because of the jurisdiction-specific review and certification they undergo, remain the only form of visual design communication that satisfies the legal framework required for construction. Because the process forces designers, architects and engineers to remove one dimension from their 3D design, drafting is a complex set of tasks aimed at accurately representing a 3D object with 2D representations.

For large infrastructure projects, the process may lead to the production of a very large number of spatially inter-referenced drawings. Faced with this complexity, builders nowadays often use digital tablets to display 3D models on site. However, even when displayed interactively on a computer on site, the model is still different from the physical world: displayed elements may not have been built yet, or the corresponding built objects may have different visual properties, which makes the visual correspondence between the model and the building site difficult for the construction worker to establish.

Although the use of 3D models on site is appealing, builders are used to working with 2D drawings. However, going from 2D drawings to the 3D building is often a difficult mental step, and it does not seem to be made any easier by having an electronic version of the 2D drawings or the 3D model on site. Based on our discussions with users in the construction world, some of the common questions they ask when looking at drawings are: “What does that line represent in 3D?” or “How does this line relate to those lines in that other drawing?” 2D drawings or 3D models alone cannot easily answer those questions.

We wanted to investigate this question and see how augmented reality could help the construction process by facilitating the interpretation of 2D construction drawings. In a short study, we proposed a system that combines the use of 2D drawings and 3D models on site in an augmented reality context. We first built our own augmented reality headset using an Oculus Rift device and two wide-field-of-view webcams. Tracking was provided by an OptiTrack system installed in our lab. We then tested our augmentation system on the task of building a new wall and cabinet in an existing room. Our augmented reality prototype was developed in-house in C++, on top of the Ogre3D rendering engine.
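To give a sense of how the pieces fit together, here is a minimal sketch (not our actual implementation) of how a tracked head pose can drive the two eye cameras of such a prototype using the Ogre3D 1.x API. The HeadPose structure, the updateCameras function and the metre-based lab coordinate frame are assumptions made for illustration; in practice the pose would be filled from the tracking system's data stream every frame.

#include <OgreCamera.h>
#include <OgreVector3.h>
#include <OgreQuaternion.h>

// Head pose reported by the tracking system (hypothetical structure).
struct HeadPose {
    Ogre::Vector3    position;     // lab coordinate frame, metres (assumed)
    Ogre::Quaternion orientation;
};

// Apply the tracked head pose to the left/right eye cameras, offsetting
// each eye laterally by half the eye separation to produce stereo views.
void updateCameras(const HeadPose& pose,
                   Ogre::Camera* leftEye,
                   Ogre::Camera* rightEye,
                   Ogre::Real eyeSeparation)
{
    Ogre::Vector3 right = pose.orientation * Ogre::Vector3::UNIT_X;
    leftEye->setPosition(pose.position - right * (eyeSeparation * 0.5f));
    rightEye->setPosition(pose.position + right * (eyeSeparation * 0.5f));
    leftEye->setOrientation(pose.orientation);
    rightEye->setOrientation(pose.orientation);
}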

The first problem we tried to solve was showing the correspondence between a 2D element on a drawing and the corresponding 3D element in the model. Our system shows the 2D drawing to the user on a computer or tablet screen (figure below, top left). The arrow indicates the location of a 2D cabinet element. The corresponding 3D model is displayed in the augmented reality headset (top right). When the user selects an element on the 2D drawing (e.g. the cabinet, bottom left), the corresponding 3D cabinet element is automatically highlighted in the augmented environment (bottom right). The user can select any element on the drawing and see the corresponding 3D element become highlighted in the 3D augmented environment surrounding them. When we used the system, this immediately made the correspondence clear and unambiguous. Of course, being in an augmented environment, a user can walk around the highlighted cabinet, see it from various angles, and therefore get a better sense of its size and position. Conversely, a user could also click on an element in the augmented 3D model and see the corresponding 2D element(s) get highlighted in the 2D drawing.
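Conceptually, this feature only requires a two-way lookup between drawing elements and model entities. The sketch below (again, not our actual implementation) illustrates the idea in C++ with Ogre3D; the shared element ID, the CorrespondenceMap class and the "Highlight/Yellow" material name are hypothetical.

#include <OgreEntity.h>
#include <string>
#include <unordered_map>

// Two-way lookup between 2D drawing elements and 3D model entities,
// keyed by an element ID assumed to be shared by both representations.
class CorrespondenceMap {
public:
    void link(const std::string& elementId, Ogre::Entity* entity) {
        idToEntity_[elementId] = entity;
        entityToId_[entity]    = elementId;
    }

    // Called when the user picks an element on the 2D drawing:
    // highlight its 3D counterpart by swapping its material.
    void highlightFrom2D(const std::string& elementId) {
        auto it = idToEntity_.find(elementId);
        if (it != idToEntity_.end())
            it->second->setMaterialName("Highlight/Yellow");  // material name assumed
    }

    // Called when the user picks an entity in the augmented view:
    // return the drawing element ID so the 2D viewer can highlight it.
    std::string lookupFrom3D(Ogre::Entity* entity) const {
        auto it = entityToId_.find(entity);
        return it != entityToId_.end() ? it->second : std::string();
    }

private:
    std::unordered_map<std::string, Ogre::Entity*> idToEntity_;
    std::unordered_map<Ogre::Entity*, std::string> entityToId_;
};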

That feature proved to work and to be useful. However, when using it, we realized it was equally important to properly display the 3D model in the augmented environment in order to get good 3D perception. This issue became clear when we displayed the 3D cabinet alone, floating in the air (below). The element was displayed at its true 3D location, but a floating cabinet appeared strange and of little use for construction. Indeed, builders need to know where to build the cabinet (and the wall to support it), and a representation of the cabinet floating alone appeared insufficient, as it showed no connection to any of the surrounding physical objects.

To improve 3D perception of the cabinet's location, we proposed adding context to the scene in three different ways: adding neighbor contextual elements (the wall supporting the cabinet, figure below, top left), projection lines (top right, bottom left), and dimensions on the projection lines (bottom right). Initial informal tests with three subjects revealed that although adding neighbor elements was useful to enhance 3D perception of the position of the cabinet, adding projection lines was even more useful, as it helped subjects identify the actual 3D location of the floating elements with respect to existing physical objects. The addition of dimensions seemed to make things even clearer. We hypothesize that this is caused by analogy with 2D drawings, in which users are used to seeing such projection lines. We further hypothesize that dimension lines would be even more useful to builders than simple augmentation of 3D elements, as they might save them from having to repeatedly go back to the drawings to take measurements.
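To make the projection-line idea concrete, here is a simplified sketch (not our production code) that drops vertical lines from the corners of an element's bounding box down to the floor plane with an Ogre3D ManualObject; the function name, the floorY parameter and the "ProjectionLineMaterial" material are assumptions made for illustration.

#include <OgreSceneManager.h>
#include <OgreManualObject.h>
#include <OgreEntity.h>
#include <OgreAxisAlignedBox.h>
#include <OgreVector3.h>

// Drop a vertical projection line from each corner of the element's world
// bounding box down to the floor plane (y = floorY), so the user can see
// where the floating element sits relative to the room.
void addProjectionLines(Ogre::SceneManager* sceneMgr,
                        Ogre::Entity* element,
                        Ogre::Real floorY)
{
    const Ogre::AxisAlignedBox& box = element->getWorldBoundingBox(true);

    Ogre::ManualObject* lines = sceneMgr->createManualObject("ProjectionLines");
    lines->begin("ProjectionLineMaterial",  // assumed line material
                 Ogre::RenderOperation::OT_LINE_LIST);

    const Ogre::Vector3* corners = box.getAllCorners();  // 8 box corners
    for (int i = 0; i < 8; ++i) {
        lines->position(corners[i]);                          // corner
        lines->position(corners[i].x, floorY, corners[i].z);  // its floor projection
    }
    lines->end();

    sceneMgr->getRootSceneNode()->attachObject(lines);
}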

Those images are actually best viewed live - as shown in our video demo below.

In conclusion, our prototype worked well!  Our initial hypothesis about the importance of establishing a correspondence between the 2D drawing and the 3D model turned out to be true: displaying the relationship between a 2D element and its corresponding 3D representation in an augmented environment proved to be visually clear and useful.  In addition, our pilot experiment on representing a virtual cabinet in a 3D physical environment showed clear differences depending on whether the cabinet was displayed alone, with a context wall, or with projection lines.  This experiment points to future research that could be done to better understand the problem.

We can easily imagine a future where construction workers would wear augmented reality headsets displaying all sorts of useful data: the 3D model at full scale and at the exact location where it should be built, but also assembly instructions and schedules, delays in the construction process, material and tool locations, time before the next material delivery by the crane, etc. Possibilities are numerous and fascinating...  Of course, a lot of work needs to be done before we can achieve such augmentation.  Measuring the worker's head position accurately is the first prerequisite, as it is what enables accurate augmentation, and this will most likely be quite challenging to achieve in a dynamic environment such as a construction site.  Displaying the information to the user in a clear and unambiguous way and ensuring good 3D perception will also be very important, as will getting access to that data in real time.  But most importantly, we will need to make sure such augmentation and devices are rugged and safe in a construction environment.  Those are interesting challenges for future investigations!

You can read the full investigation in: Côté S., Beauvais M., Girard-Vallée A., Snyder R., 2014.  A Live Augmented Reality Tool for Facilitating Interpretation of 2D Construction Drawings.  Proceedings of Salento AVR 2014, 1st International Conference on Augmented and Virtual Reality, Lecce, Italy.