<?xml version="1.0" encoding="UTF-8" ?>
<?xml-stylesheet type="text/xsl" href="https://communities.bentley.com/cfs-file/__key/system/syndication/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"><channel><title>Stéphane Côté's Blog</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog</link><description /><dc:language>en-US</dc:language><generator>Telligent Community 12</generator><item><title>On-site Engineering Design using Augmented Reality</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/on-site-engineering-design-using-augmented-reality</link><pubDate>Mon, 20 Apr 2020 18:19:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:374495f1-d341-40e3-ad27-9447e2658ec5</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=273496</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/on-site-engineering-design-using-augmented-reality#comments</comments><description>This is a proof of concept that highlights tools for accurate design in augmented reality.  
It features hand-based line drafting, along with accuracy features such as angle, length and element snapping, as well as snapping to physical-world elements, so that one can design virtual assets around existing physical assets by snapping design elements to them. (&lt;a href="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/on-site-engineering-design-using-augmented-reality"&gt;read more&lt;/a&gt;)&lt;img src="https://communities.bentley.com/aggbug?PostID=273496&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>Using the HoloLens for accurate subsurface utility pipes augmented reality</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-the-hololens-for-accurate-subsurface-utility-pipes-augmented-reality</link><pubDate>Wed, 07 Mar 2018 17:27:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:74fad2d5-e04a-48f1-97d4-9e0351c8386b</guid><dc:creator>StephaneCote</dc:creator><slash:comments>2</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=272464</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-the-hololens-for-accurate-subsurface-utility-pipes-augmented-reality#comments</comments><description>&lt;p&gt;The 3D rendering of subsurface utility pipes inside a virtual excavation is cool. But if you don&amp;rsquo;t have 3D data to support it (e.g. if you don&amp;rsquo;t know the depth of the pipes), that sort of rendering could actually be misleading.&lt;/p&gt;
&lt;p&gt;&lt;img alt=" " src="/resized-image/__size/600x240/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/2364.VirtualExcavation.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;Users told us that, in spite of limited 3D depth data, 2D pipe maps are very useful for excavation planning, and that 2D renderings of those maps in the physical world would be quite handy, even without depth info.&lt;/p&gt;
&lt;p&gt;We prototyped that concept using the Microsoft HoloLens and Bentley ContextCapture.&amp;nbsp;We simulated a workflow where a user wants to find a service pipe connection and a valve, and mark their location. Our results are shown in this video:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://youtu.be/nFSAxMKHNMY"&gt;https://youtu.be/nFSAxMKHNMY&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;The video also shows that our method enables accurate augmentation of surface topography with significant altitude differences, such as staircases.&lt;/p&gt;
&lt;p&gt;Interestingly, our concept could also be extended from visualization to editing. By projecting existing pipe map data onto the ground surface, our proposed system could let the user visually spot discrepancies between the augmentation and the actual location of surface assets, and propose modifications to the pipe database by &amp;ldquo;adjusting&amp;rdquo; virtual pipe locations directly on-site&amp;hellip;&lt;/p&gt;
&lt;p&gt;&lt;img alt=" " src="/resized-image/__size/600x240/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/4848.Discrepancy.jpg" /&gt;&lt;/p&gt;
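&lt;p&gt;The core of that projection step, draping 2D pipe map vertices onto the captured ground surface, can be approximated with a heightfield lookup. The sketch below is our own illustration, not the actual prototype code, and assumes the ground has already been resampled onto a regular elevation grid:&lt;/p&gt;

```python
import numpy as np

def drape_on_heightfield(points_xy, heights, cell_size):
    """Assign each 2D pipe-map vertex the ground elevation at its
    horizontal position, by bilinear interpolation in a regular
    heightfield (interior points only).
    points_xy: (N, 2) array of map coordinates
    heights:   (rows, cols) grid of ground elevations
    cell_size: grid spacing, in the same units as points_xy
    """
    out = np.empty((len(points_xy), 3))
    for k, (x, y) in enumerate(points_xy):
        gx, gy = x / cell_size, y / cell_size
        i0, j0 = int(gy), int(gx)            # cell indices
        fy, fx = gy - i0, gx - j0            # fractional offsets in the cell
        h = (heights[i0, j0] * (1 - fx) * (1 - fy)
             + heights[i0, j0 + 1] * fx * (1 - fy)
             + heights[i0 + 1, j0] * (1 - fx) * fy
             + heights[i0 + 1, j0 + 1] * fx * fy)
        out[k] = (x, y, h)
    return out
```

&lt;p&gt;A real implementation would ray-cast against the reconstructed mesh itself, but the idea is the same: every pipe-map vertex inherits the elevation of the ground at its horizontal position, so the drawn map follows the terrain.&lt;/p&gt;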
&lt;p&gt;By facilitating their use on site, Augmented Reality will likely increase the value of existing subsurface pipe maps, and also facilitate their maintenance over the years!&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=272464&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/ContextCapture">ContextCapture</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/subsurface%2butilities">subsurface utilities</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/HoloLens">HoloLens</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category></item><item><title>Are pipe maps sufficiently accurate for augmented reality?</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/are-pipe-maps-sufficiently-accurate-for-augmented-reality</link><pubDate>Mon, 27 Nov 2017 18:09:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:a7fa9e5b-4725-440d-badf-293c52fb5ab6</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=272348</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/are-pipe-maps-sufficiently-accurate-for-augmented-reality#comments</comments><description>Using 
cameras for capturing 3D meshes showing subsurface pipes. (&lt;a href="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/are-pipe-maps-sufficiently-accurate-for-augmented-reality"&gt;read more&lt;/a&gt;)&lt;img src="https://communities.bentley.com/aggbug?PostID=272348&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/surveying">surveying</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/ContextCapture">ContextCapture</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/subsurface%2butilities">subsurface utilities</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category></item><item><title>How the HoloLens could facilitate issue investigation and 3D pipe design in industrial facilities</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/how-the-hololens-could-facilitate-issue-investigation-and-3d-pipe-design-in-industrial-facilities</link><pubDate>Mon, 28 Nov 2016 19:44:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:1570d0f1-75db-4c46-972f-bcb154406743</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss 
xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=271867</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/how-the-hololens-could-facilitate-issue-investigation-and-3d-pipe-design-in-industrial-facilities#comments</comments><description>&lt;p&gt;Facility maintenance work may be repetitive: sets of instructions need to be executed on a regular basis, to ensure continued operation of the facility. As explained in my &lt;a href="/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2016/11/17/using-the-hololens-to-facilitate-plant-maintenance"&gt;previous post&lt;/a&gt;, an Augmented Reality (AR) tutor could help new employees familiarize themselves with the task and complete it, by teaching them how to proceed step by step. Experienced employees could also benefit from the tutor, using it as a checklist. However, one potential danger is that facility employees could end up following instructions blindly, ignoring their training. While AR maintenance tutors would do an excellent job in most cases, they would dramatically fail in unexpected situations, as they could give users instructions that are inappropriate for the situation.&lt;/p&gt;
&lt;p&gt;Virtual tutors make sense when the steps necessary to execute the task are known in advance.&amp;nbsp; But what if something unexpected happens?&amp;nbsp; For instance, a pipe bursts, or an instrument stops working properly. A tutor app would not be of any use here, because it has not been programmed to deal with such situations.&amp;nbsp; In such cases the user is on his own&amp;hellip;&lt;/p&gt;
&lt;p&gt;Unfortunately, such critical decisions are often constrained by time and by the availability of information.&amp;nbsp; Instrument reading history, history of past events, notes left by colleagues, drawings and&amp;nbsp;device specifications could all be useful to the user when trying to decide on how to solve the problem. But the time that would be required to collect and analyze all that information might make the situation even more critical.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;As a solution to this, we proposed the concept of an AR assistant. It would take the form of a voice-operated Augmented Reality system that the user could ask questions of, and that would provide him with all the data he needs to resolve the issue he is facing.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://youtu.be/lYptlaO9rKw"&gt;https://youtu.be/lYptlaO9rKw&lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;/p&gt;
&lt;p&gt;Such an assistant would not only save the user a lot of time, but it would enable him to base his decision on relevant contextual data that is delivered to him where and when he needs it. Our demo highlights several innovative services offered by the assistant:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A connection between a P&amp;amp;ID drawing and the physical world, saving the user from having to find the object that corresponds to a specific P&amp;amp;ID element, and facilitating his understanding of the drawing, as shown in a &lt;a href="/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2016/09/26/experimenting-with-the-hololens-for-infrastructure-engineering"&gt;previous post&lt;/a&gt;;&lt;/li&gt;
&lt;li&gt;A method of viewing and browsing pre-recorded data on a holographic 2D graph;&lt;/li&gt;
&lt;li&gt;A way of designing pipe layouts on site, enabling the user to readily see and avoid clashes with existing pipes;&lt;/li&gt;
&lt;li&gt;A method of viewing the hologram of a colleague at exactly the same physical location as during recording &amp;ndash; this will be the subject of a future post.&lt;/li&gt;
&lt;/ul&gt;
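&lt;p&gt;The clash avoidance mentioned in the pipe-layout item can, for straight pipe runs, be reduced to a distance test between centerline segments: two pipes interfere when their centerlines come closer than the sum of their radii. A rough sketch (our own illustration, assuming non-degenerate straight segments, not the demo&amp;rsquo;s actual code):&lt;/p&gt;

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Smallest distance between segments p1-q1 and p2-q2
    (assumes neither segment degenerates to a point)."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1.dot(d1), d2.dot(d2)
    b, c, f = d1.dot(d2), d1.dot(r), d2.dot(r)
    denom = a * e - b * b                   # zero when segments are parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = np.clip((b * s + f) / e, 0.0, 1.0)
    s = np.clip((b * t - c) / a, 0.0, 1.0)  # re-clamp s against the clamped t
    return np.linalg.norm((p1 + s * d1) - (p2 + t * d2))

def pipes_clash(p1, q1, rad1, p2, q2, rad2, clearance=0.0):
    """True when two straight pipe runs come closer than the sum of
    their radii plus a required clearance."""
    return (rad1 + rad2 + clearance) > segment_distance(p1, q1, p2, q2)
```

&lt;p&gt;Running such a test between a proposed pipe run and every existing run is enough to flag, on the spot, the clashes the user should see and avoid.&lt;/p&gt;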
&lt;p&gt;Such an AR assistant application would of course depend on the existence and online availability of context data. The Internet of Things would offer a part of the solution, by providing online access to all connected instruments in a plant.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;One can easily envision the facility of the future where employees could inspect, operate, maintain and repair assets highly efficiently, by having timely access to all sorts of contextual data related to those assets. Augmented Reality would give them some sort of superpowers, enabling them to achieve more, and more accurately, in less time. Such a future is closer than we think...&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=271867&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Pipes">Pipes</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/P_2600_amp_3B00_ID">P&amp;amp;ID</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/HoloLens">HoloLens</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/3D">3D</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/visualization">visualization</category></item><item><title>Using the HoloLens to facilitate plant maintenance</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-the-hololens-to-facilitate-plant-maintenance</link><pubDate>Thu, 17 Nov 2016 21:04:48 GMT</pubDate><guid 
isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:4fdef803-f7e0-4c2c-a61b-e56069e81dd5</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=271863</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-the-hololens-to-facilitate-plant-maintenance#comments</comments><description>&lt;p&gt;We have been using the HoloLens for 8 months now, yet I am still amazed by the quality of the tracking it provides, enabling truly stable and quite robust hologram displays. So far, many of the use cases demonstrated for the device have shown holograms unrelated to their physical environment: whether they be building models, TV screens or Minecraft games, such holograms might be displayed on your coffee table, on your bed, in your garage, or at school; it makes no difference. But the aspect that, in my opinion, gives the HoloLens such great potential in infrastructure engineering is its capacity to augment reality, that is, to display digital information directly related to the physical world.&lt;/p&gt;
&lt;p&gt;Let&amp;rsquo;s say, for instance, that you operate a plant that requires regular maintenance. The procedure is straightforward; it is always done by the same employee, who knows it so well he could do the job with his eyes closed. But one day that employee is ill, and no one else has been trained to do the work. What do you do?&lt;/p&gt;
&lt;p&gt;You could send someone else with a maintenance handbook (assuming one exists), who would try to follow the procedure by reading the instructions and executing them. Not only would this likely be slow; there is also a risk of mistakes, as establishing the correspondence between written text and the physical handles to operate is error-prone.&lt;/p&gt;
&lt;p&gt;A quicker and safer solution would be to show him exactly what he has to do, at the right location, directly on the hand valves and instruments that he has to operate or check: some sort of Augmented Reality tutor that would guide him through the work, step by step. We tried achieving that last summer:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://youtu.be/QTuKcm8s4QQ"&gt;https://youtu.be/QTuKcm8s4QQ&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Of course, setting up such AR tutor systems would take time, and the AR procedure would need to be updated from time to time. So they would likely make sense only where such procedures have to be repeated on a regular basis; unless, of course, their creation could be automated through some sort of analysis process based on the system&amp;rsquo;s P&amp;amp;ID and the maintenance task goal.&lt;/p&gt;
&lt;p&gt;But there is a risk in providing such detailed step-by-step instructions. Let&amp;rsquo;s say, for instance, that the task you want to teach is how to drive a screw into a piece of wood.&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Tournevis.jpg"&gt;&lt;img alt=" " src="/resized-image/__size/940x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Tournevis.jpg" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Of course, an augmented reality tutor could simply say: &amp;ldquo;Drive the screw at this location&amp;rdquo;.&lt;/p&gt;
&lt;p&gt;Alternatively, it could also say: &amp;ldquo;Place the screw on the driver tip and hold both screw and tip together with the fingers of one hand. Apply very little pressure on the driver while turning in a clockwise direction until the screw engages the wood.&amp;rdquo; &lt;span style="font-size:75%;"&gt;(source: http://www.artofmanliness.com/2010/02/18/toolmanship-how-to-use-a-screwdriver/ )&lt;/span&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Workers know how to use a screwdriver&amp;hellip; Such highly detailed instructions would be way too much information. Actually, this could be risky; the danger is the same as with a GPS device: you might end up following the instructions blindly and stop thinking. Then you run your car into a lake&amp;hellip;&lt;/p&gt;
&lt;p&gt;Using AR, we wish to give superpowers to users, not stupefy them. That is: we want to give users just the right amount of information to enable them to do what they are good at: making decisions and acting.&lt;br /&gt;This will be the subject of my next post&amp;hellip; Stay tuned!&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=271863&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>Augmenting Drone Videos could Facilitate Construction Monitoring</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/augmenting-drone-videos-could-facilitate-construction-monitoring</link><pubDate>Mon, 22 Feb 2016 14:42:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:6b4c0ddf-b12e-47d4-88c9-8d12984d6b35</guid><dc:creator>StephaneCote</dc:creator><slash:comments>5</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=271395</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/augmenting-drone-videos-could-facilitate-construction-monitoring#comments</comments><description>&lt;p&gt;Augmented reality is a fascinating technology that could change the way we live and interact with the world that surrounds us. Unfortunately, we don&amp;rsquo;t see many good applications of AR in engineering yet, partly because achieving good visual integration of digital objects with reality is very challenging.&lt;/p&gt;
&lt;p&gt;To display the augmented elements at the right location on a tablet display at every instant, the AR app must know the position of your tablet in real time; that is fundamental to AR. That problem is hard to solve on a handheld device, because it may require multiple sensors and heavy computation, which small devices with limited battery capacity cannot always afford.&lt;/p&gt;
&lt;p&gt;Initial tracking solutions made this easier through the use of markers (QR-code-like patterns). The tablet&amp;rsquo;s camera captures live video of the scene that includes a marker, and based on the shape and orientation of the marker seen on the video, the AR app can calculate the tablet&amp;rsquo;s position with reasonable accuracy. Markers work quite well, but to be of any use they must be visible in your camera view, so depending on the situation you may need to install many of them.&lt;/p&gt;
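&lt;p&gt;For the curious, that marker calculation can be sketched with a plane-to-image homography: for a flat marker of known size, the homography H mapping marker coordinates to pixels factors as K [r1 r2 t], from which the camera pose falls out. A minimal numpy illustration (our own sketch, not any particular AR toolkit):&lt;/p&gt;

```python
import numpy as np

def homography_dlt(obj_xy, img_uv):
    """Direct Linear Transform: the 3x3 homography mapping marker-plane
    coordinates (X, Y) to observed pixel coordinates (u, v)."""
    rows = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)           # null-space vector = flattened H

def marker_pose(obj_xy, img_uv, K):
    """Camera pose (R, t) from 4 or more coplanar marker corners.
    Since the marker lies in the Z=0 plane, H = K [r1 r2 t] up to scale."""
    M = np.linalg.inv(K).dot(homography_dlt(obj_xy, img_uv))
    lam = 1.0 / np.linalg.norm(M[:, 0])   # remove the arbitrary scale
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    s = 1.0 if t[2] > 0 else -1.0         # marker must be in front of camera
    r1, r2, t = s * r1, s * r2, s * t
    # Re-orthonormalize the rotation (noise makes the columns drift)
    U, _, Vt = np.linalg.svd(np.column_stack([r1, r2, np.cross(r1, r2)]))
    return U.dot(Vt), t
```

&lt;p&gt;Production trackers add lens-distortion handling, corner refinement and filtering over time, but this factorization is the geometric core of marker-based pose estimation.&lt;/p&gt;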
&lt;p&gt;Later, more advanced computer vision techniques like simultaneous localization and mapping (&amp;ldquo;SLAM&amp;rdquo;) made tracking without markers possible. To calculate the camera position, SLAM relies on visible features in the physical environment, such as edges, corners, and other striking features that appear on video. Using SLAM saves you from installing markers, but since all those techniques are based on video, they rely heavily on the environment: poor lighting or low contrast sometimes results in augmentations that appear &amp;ldquo;shaky&amp;rdquo; or are displayed at the wrong location. Science has not yet solved the camera position tracking problem robustly enough for &amp;quot;anywhere&amp;quot; augmentation.&lt;/p&gt;
&lt;p&gt;The knowledge of the camera position is not only useful for AR. Take for instance the technology that converts photos into meshes, such as Bentley Systems&amp;rsquo; ContextCapture technology. It is very simple to use: you take a set of photos of a scene following some basic rules, and the program will generate a 3D mesh based on what appears on the photos. Results generally amaze me...&lt;/p&gt;
&lt;p&gt;&lt;img width="640" height="408" class="center" alt=" " src="https://media.licdn.com/mpr/mpr/AAEAAQAAAAAAAAb-AAAAJDE4MGNjMWVkLTAyOGUtNGExOS05NDYzLTJjYzc2Yzc3NzI5Mg.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;To produce such meshes from photos, the ContextCapture process must first go through a step called &amp;ldquo;aerotriangulation&amp;rdquo; (AT), a process that matches photo features and accurately establishes the relative position and orientation of each photo. The process runs offline and can take several minutes to complete, but it results in very accurate measurements of... the camera position! So we thought: if ContextCapture provides the camera position of each photo for free, then perhaps we could use those calculated positions to augment the corresponding photos?&lt;/p&gt;
&lt;p&gt;To test our hypothesis, here is roughly what we did:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;We flew a drone around our local Bentley office, capturing video of the building, during the construction of an extension to the second floor.&lt;/li&gt;
&lt;li&gt;We extracted all the frames from that video, and used them to create a mesh of the building scene using our ContextCapture technology.&lt;/li&gt;
&lt;li&gt;We then aligned the resulting mesh with a BIM model of our building.&lt;/li&gt;
&lt;li&gt;Finally, we used the calculated positions &amp;amp; orientations of each frame to augment them with the BIM model.&lt;/li&gt;
&lt;/ul&gt;
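&lt;p&gt;The last step above boils down to a plain pinhole projection: once aerotriangulation has produced an intrinsics matrix K and a pose (R, t) for a frame, every BIM vertex can be mapped to a pixel in that frame. A minimal sketch (the conventions here are ours; ContextCapture&amp;rsquo;s actual export formats and conventions may differ):&lt;/p&gt;

```python
import numpy as np

def overlay_points(K, R, t, model_points):
    """Project 3D model vertices into one video frame, given the frame's
    intrinsics K and a world-to-camera pose (R, t) from aerotriangulation."""
    pts = np.asarray(model_points, dtype=float)
    cam = pts.dot(R.T) + t               # world coordinates to camera frame
    pix = cam.dot(K.T)                   # pinhole projection
    return pix[:, :2] / pix[:, 2:3]      # divide by depth to get pixels
```

&lt;p&gt;Repeating this for every frame, with that frame&amp;rsquo;s own (R, t), is what yields the steady frame-to-frame overlay shown in the video.&lt;/p&gt;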
&lt;p&gt;Results show a very steady augmentation, well aligned from frame to frame:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://youtu.be/aPNwXIyu0ZY"&gt;https://youtu.be/aPNwXIyu0ZY&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Of course, this can&amp;rsquo;t exactly be called augmented reality &amp;ndash; because it is not &lt;em&gt;live&lt;/em&gt;. The augmentation took several minutes to compute, because of the AT process.&amp;nbsp; On the other hand, the calculated camera positions are very accurate, which resulted in very steady augmentation.&lt;/p&gt;
&lt;p&gt;In spite of that, such offline augmentation can be very useful. Think for instance of monitoring your building site on a daily basis, trying to identify delays or mistakes in the construction process. Using standard handheld augmented reality, you could walk around the site with your tablet and, assuming the AR app can calculate your position accurately in such a dynamic environment, view live augmentation of the site, which would facilitate the identification of those mistakes and delays.&lt;/p&gt;
&lt;p&gt;Alternatively, you could have a drone frequently flying around your site, taking photos and uploading them to a cloud server, which would generate a mesh, align it with the BIM model, and augment the photos with the model. A few minutes after photo capture, you could review those augmented photos from your office, identify delays or mistakes in the construction process, and raise a flag when you notice something that deserves immediate attention. Not only would this technique save you several visits to the site; it would enable more frequent monitoring, the augmentation would likely be much steadier and more accurate than handheld augmentation on site, and it would be available from a multitude of vantage points that you could not reach while walking around the site.&lt;/p&gt;
&lt;p&gt;Such a solution would do a great job for large infrastructure projects, at least as far as the outer shell of the asset is concerned. Some drones are now equipped with range-sensing technology that can make them fly &lt;em&gt;inside&lt;/em&gt; buildings, &amp;quot;seeing&amp;quot; and avoiding obstacles and finding their way by creating a map in the process. I am sure you too can imagine what lies ahead...&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=271395&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>Using Augmented Reality to Facilitate the Interpretation of 2D Construction Drawings</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-augmented-reality-tool-for-facilitating-the-interpretation-of-2d-construction-drawings</link><pubDate>Wed, 09 Dec 2015 19:27:42 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:097e603b-065a-42e0-9166-4e109d13fc57</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=271298</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/using-augmented-reality-tool-for-facilitating-the-interpretation-of-2d-construction-drawings#comments</comments><description>&lt;p&gt;Construction is a complex process aimed at the development of physical infrastructure. Designers propose a 3D building concept, and ultimately builders create the 3D object that corresponds to the designer&amp;rsquo;s idea. 
However, even though designers may have produced a 3D model of their design, drawings, because of their location-specific review and certification, remain the only form of visual design communication that satisfies the legal framework required for construction. Since the process forces designers, architects and engineers to take one dimension out of their 3D design, drafting consists of a complex set of tasks aimed at accurately representing a 3D object through 2D representations.&lt;/p&gt;
&lt;p&gt;For large infrastructure projects, the process may lead to the production of a very large number of spatially inter-referenced drawings. Faced with such increasing complexity, nowadays builders often use digital tablets to display 3D models on site. However, even when interactively displayed on a computer system on site, the model is still different from the physical world: displayed elements may not have been built yet, or the corresponding built objects may have different visual properties, making the visual correspondence between the model and the building site difficult to establish for the construction worker.&lt;/p&gt;
&lt;p&gt;Although the use of 3D models on-site is appealing, builders are used to working with 2D drawings. However, going from 2D drawings to a 3D building is often a difficult mental step that does not seem to be made easier by having an electronic version of the 2D drawings or 3D model on site. Based on our discussions with users in the construction world, some of the common questions they ask when looking at drawings are: &amp;ldquo;What does that line represent in 3D?&amp;rdquo; or &amp;ldquo;How does this line refer to those lines in that other drawing?&amp;rdquo; 2D drawings or 3D models &lt;em&gt;alone&lt;/em&gt; cannot easily answer those questions.&lt;/p&gt;
&lt;p&gt;We wanted to investigate the question, and see how augmented reality could help the construction process by facilitating the interpretation of 2D construction drawings. In a short study, we proposed a system that combines the use of 2D drawings and 3D models on-site in an augmented reality context. We first built our own augmented reality headset using an Oculus Rift device and two wide-field-of-view webcams. Tracking was obtained using an OptiTrack tracking system installed in our lab. We then tested our augmentation system on the task of building a new wall and cabinet in an existing room. Our augmented reality prototype was developed in-house in C++, based on the Ogre3D rendering engine.&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Oculus.jpg"&gt;&lt;img src="/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Oculus.jpg" alt=" " height="187" width="281" /&gt;&lt;/a&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; &lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Model.JPG"&gt;&lt;img src="/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Model.JPG" alt=" " height="184" width="320" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;The first problem we tried to solve is showing the correspondence between a 2D element on a drawing and the corresponding 3D element in the model. Our system shows the 2D drawing to the user on a computer or tablet screen (figure below, top left). The arrow indicates the location of a 2D cabinet element. The corresponding 3D model is displayed in the augmented reality headset (top right). When the user selects an element on the 2D drawing (e.g. the cabinet, figure below, bottom left), the corresponding 3D cabinet element automatically gets highlighted in the augmented environment (bottom right). The user can select any element on the drawing and see the corresponding 3D element become highlighted in the 3D augmented environment surrounding him. When using the system, this immediately made the correspondence very clear and unambiguous to us. Of course, being in an augmented environment, a user can walk around the highlighted cabinet, see it from various angles, and therefore get a better sense of its size and position. Conversely, a user could also click on an element in the augmented 3D model, and see the corresponding 2D element(s) get highlighted in the 2D drawing.&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Select_5F00_elem.jpg"&gt;&lt;img src="/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/Select_5F00_elem.jpg" alt=" " height="394" width="636" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;That feature proved to work well and to be useful.&amp;nbsp; However, when using it, we realized it was equally important to properly display the 3D model in the augmented environment, to get good 3D perception.&amp;nbsp; This issue became clear when we displayed the 3D cabinet alone, floating in the air (below).&amp;nbsp; The element was displayed at its true 3D location, but the display of a floating cabinet appeared strange and of little use for construction. Indeed, builders need to know where to build the cabinet (and the wall to support it), and a cabinet representation floating alone appeared insufficient, as it showed no connection to any of the surrounding 3D physical objects.&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetAlone.jpg"&gt;&lt;img src="/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetAlone.jpg" alt=" " height="336" width="221" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;To improve 3D perception of the cabinet&amp;#39;s location, we proposed adding context to the scene in three different ways: adding neighboring contextual elements (the wall supporting the cabinet, figure below, top left), projection lines (top right, bottom left) and projection line dimensions (bottom right). Initial informal tests done with 3 subjects revealed that although adding neighboring elements was useful to enhance 3D perception of the position of the cabinet, the addition of projection lines was even more useful, as it helped subjects identify the actual 3D location of those floating elements with respect to existing physical objects. The addition of dimensions seemed to make things even clearer. We hypothesize that this is caused by analogy with 2D drawings, in which users are used to seeing such projection lines. We further hypothesize that dimension lines would be even more useful to builders than simple augmentation of 3D elements, as they might save builders from having to repeatedly look at drawings to take measurements.&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetInContext.jpg"&gt;&lt;img src="/resized-image/__size/1880x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetInContext.jpg" alt=" " height="317" width="205" /&gt;&lt;/a&gt;&amp;nbsp; &lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWithBotProjLines.jpg"&gt;&lt;img src="/resized-image/__size/940x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWithBotProjLines.jpg" alt=" " height="317" width="207" /&gt;&lt;/a&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWith3ProjLines.jpg"&gt;&lt;img src="/resized-image/__size/940x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWith3ProjLines.jpg" alt=" " height="312" width="205" /&gt;&lt;/a&gt;&amp;nbsp; &lt;a href="/cfs-file/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWithMeasurements.jpg"&gt;&lt;img src="/resized-image/__size/940x0/__key/communityserver-blogs-components-weblogfiles/00-00-00-50-35/CabinetWithMeasurements.jpg" alt=" " height="304" width="209" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;These images are best viewed live &amp;ndash; as shown in our video demo below.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://youtu.be/1SocE7yhbhs"&gt;https://youtu.be/1SocE7yhbhs&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;In conclusion, our prototype worked well!&amp;nbsp; Our initial hypothesis about the importance of establishing a correspondence between the 2D drawing and the 3D model turned out to be true. Displaying the relationship between a 2D element and its corresponding 3D representation in an augmented environment turned out to be visually very clear and useful.&amp;nbsp; In addition, our pilot experiment on representing a virtual cabinet in a 3D physical environment showed large differences depending on whether the cabinet was displayed alone, with a context wall, or with projection lines. This experiment clearly points to future research that could be done to better understand the problem.&lt;/p&gt;
&lt;p&gt;We can easily imagine a future where construction workers would wear augmented reality headsets, which would display all sorts of useful data such as the 3D model at full scale and at the exact location where it should be built, but also assembly instructions and schedule, delays in the construction process, material and tool location, time before next material delivery by the crane, etc. Possibilities are numerous and fascinating...&amp;nbsp; Of course, a lot of work needs to be done before we can achieve such augmentation.&amp;nbsp; Measuring the worker&amp;#39;s head position accurately is the first prerequisite, as it would enable accurate augmentation, and this will most likely be quite challenging to achieve in a dynamic environment such as a construction site.&amp;nbsp; Displaying the information to the user in a clear and unambiguous way and ensuring good 3D perception will also be very important, as will getting access to that data in real time.&amp;nbsp; But most importantly, we will need to make sure such augmentation and devices are rugged and safe in a construction environment.&amp;nbsp; Those are interesting challenges for future investigations!&lt;/p&gt;
&lt;p&gt;&lt;em&gt;You can read the full investigation in: C&amp;ocirc;t&amp;eacute; S., Beauvais M., Girard-Vall&amp;eacute;e A., Snyder R., 2014.&amp;nbsp; A live Augmented Reality Tool for Facilitating Interpretation of 2D Construction Drawings.&amp;nbsp; Proceedings of the Salento AVR 2014, 1st international conference on augmented and virtual reality, Lecce, Italy.&lt;/em&gt;&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=271298&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/3D%2bvisualization">3D visualization</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/construction">construction</category></item><item><title>Extending the virtual excavation</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/extending-the-virtual-excavation</link><pubDate>Mon, 24 Mar 2014 21:46:56 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:eadcde64-6a73-4f12-9ff6-84fc770fc327</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=269811</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/extending-the-virtual-excavation#comments</comments><description>&lt;p&gt;Complex models can be a little difficult to 
visualize, as element clutter may impair legibility.&amp;nbsp; Transparency, wireframe rendering, selective display, clipping, etc. can all be used to make that task easier, but we thought of another way: what if we created a virtual hole in the surface of the walls of a complex building model, and selectively clipped some of the elements it intersects?&lt;/p&gt;
&lt;p&gt;We explored the idea, by extending the virtual excavation technique that we used for&lt;a href="/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2012/06/18/augmented-reality-for-subsurface-utilities-further-improving-perception.aspx"&gt; subsurface utility model visualization&lt;/a&gt;.&amp;nbsp; Take a look at our results:&lt;/p&gt;
&lt;p&gt;&lt;a href="http://youtu.be/gL5FRyCHqJU"&gt;http://youtu.be/gL5FRyCHqJU&lt;/a&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;As you can see, we selectively clip everything that intersects the box, except a set of pipes.&amp;nbsp; We can therefore view the pipes without the clutter, while the remaining context (the rest of the model) makes it easier to understand how the pipes relate to the rest of the model.&lt;/p&gt;
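The selective clipping logic can be sketched in a few lines. This is illustrative Python, not our actual implementation (which would do the cut on the GPU); the element IDs and the axis-aligned box representation are hypothetical: every element whose bounding box intersects the virtual box is hidden, unless it is on the keep list.

```python
def boxes_intersect(a, b):
    """Axis-aligned box overlap test.

    Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)); boxes overlap
    when their extents overlap on every axis.
    """
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    return all(al <= bh and bl <= ah
               for al, ah, bl, bh in zip(a_lo, a_hi, b_lo, b_hi))

def visible_elements(elements, excavation_box, keep_ids):
    """Hide everything the virtual excavation box intersects, except keep_ids.

    elements: dict mapping element id -> bounding box.
    Returns the ids of elements that remain visible.
    """
    return [eid for eid, bbox in elements.items()
            if eid in keep_ids or not boxes_intersect(bbox, excavation_box)]
```

A real renderer would also clip the kept elements' surroundings smoothly, but the visibility decision itself is this simple.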
&lt;p&gt;What do you think?&amp;nbsp; Would such a tool be useful in your work?&amp;nbsp; Would you use it for model exploration, easier navigation, or even to facilitate clash visualization?&lt;/p&gt;
&lt;p&gt;Please write to me, or leave a comment below.&amp;nbsp; We love to read your comments!&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=269811&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>Offline spatial panoramic video augmentation</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/offline-spatial-panoramic-video-augmentation</link><pubDate>Sun, 24 Nov 2013 20:00:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:45109f42-e514-4a0c-be93-5410b9713ed0</guid><dc:creator>StephaneCote</dc:creator><slash:comments>0</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=267698</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/offline-spatial-panoramic-video-augmentation#comments</comments><description>&lt;p&gt;The term &amp;ldquo;Augmented Reality&amp;rdquo; is interpreted in various ways.&amp;nbsp; But the concept of &amp;ldquo;live&amp;rdquo; augmented reality usually means that the physical world is augmented from your current location, at the moment you are viewing the world &amp;ndash; that means &lt;i&gt;here&lt;/i&gt; and &lt;i&gt;now&lt;/i&gt;.&amp;nbsp; That is what most of us have heard of &amp;ndash; you take your smart phone, aim at something, its image is displayed on screen, and the augmentation is overlaid.&amp;nbsp; That is very cool.&lt;/p&gt;
&lt;p&gt;But regardless of how nice and useful such a technology could be in the engineering world, it is not always something to be wished for.&amp;nbsp; For instance, live augmented reality requires the user to be on site for augmentation &amp;ndash; that can be inefficient if the site to be augmented is far away, or if it is a dangerous place like the inside of a nuclear reactor.&amp;nbsp; Since live augmentation has to be done &lt;em&gt;now&lt;/em&gt;, it precludes the review of &amp;ldquo;past&amp;rdquo; augmentations (which could be useful for documenting a site visit).&amp;nbsp; Live augmentation accuracy is also limited by the user&amp;rsquo;s movements, since the user has to be tracked in real time, which is hard to do accurately.&amp;nbsp; Finally, live augmentation may not always be compatible with &lt;i&gt;authored&lt;/i&gt; communications within environments, which can be used to guide, clarify, instruct, and affirm specifically what people should see and do. &amp;nbsp;In live augmentation, the environment cannot be guaranteed (as we never know where the user will be augmenting from).&amp;nbsp; So we thought we needed something more &amp;ndash; an augmentation solution that would provide an answer to those problems.&lt;/p&gt;
&lt;p&gt;Instead of augmenting live, we have proposed augmenting the world on pre-recorded media.&amp;nbsp; To provide realistic augmentations, we chose a medium that is as realistic as possible: panoramic images, captured along a path.&amp;nbsp; We visited a building, used a panoramic video camera to capture video along a path inside and outside that building, and aligned the 3D model of that building with each frame of the video.&amp;nbsp; Consequently, the augmentation is no longer constrained to specific locations (as in &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2011/06/28/augmented-reality-for-infrastructure-a-first-step.aspx"&gt;Ref1&lt;/a&gt;, &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2012/06/18/augmented-reality-for-subsurface-utilities-further-improving-perception.aspx"&gt;Ref2&lt;/a&gt;) but can be done anywhere along the path the camera was moved along.&amp;nbsp; The panoramic video, 3D model and path location were stored in a distributable &amp;ldquo;pre-recorded augmentation package&amp;rdquo;.&amp;nbsp; Since no tracking is required, the resulting augmentation is stable, repeatable, and authorable (actionable clarifying instructions can be authored within it), and uses only a small amount of CPU power.&lt;/p&gt;
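Such a package could be organized roughly as follows. This is an illustrative Python sketch with assumed field names, not our actual file format: each captured panoramic frame stores the camera pose used to align the 3D model, indexed by distance along the capture path, and playback simply looks up the frame nearest to the requested path position.

```python
from dataclasses import dataclass
import bisect

@dataclass
class FramePose:
    t: float            # distance along the capture path, in metres
    frame_index: int    # index of the panoramic frame in the video
    position: tuple     # camera position used to align the 3D model

def nearest_frame(poses, t):
    """Return the captured frame whose path position is closest to t.

    poses must be sorted by t, which they naturally are since the camera
    was moved along a path during capture.
    """
    ts = [p.t for p in poses]
    i = bisect.bisect_left(ts, t)
    # The nearest frame is either just before or just after the insertion point.
    candidates = poses[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda p: abs(p.t - t))
```

Because the pose is stored rather than tracked, playback is deterministic and jitter free by construction.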
&lt;p&gt;We implemented our system, and tested it in the Paddy Wagon Irish Pub in Richmond, Kentucky, using a Ladybug panoramic camera installed on a tripod and dolly, and aligned each captured video frame with a detailed 3D model of the building.&amp;nbsp; The package viewer offers a view split in 2 parts: the top part shows the augmented scene, the bottom part shows the 3D map, the camera path(s) (white line) and current camera position and orientation.&amp;nbsp; The result is shown in the following video:&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.youtube.com/watch?v=xlGxtNX6fwQ"&gt;www.youtube.com/watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Augmentation of pre-recorded media will never replace live augmentation.&amp;nbsp; Such pre-recorded packages offer no live augmentation, and any recorded environment is, by definition, out of date.&amp;nbsp; The two types of augmentation are different, but they are also complementary: augmenting pre-recorded media offers jitter-free augmentation (since no camera tracking is required), it is 100% deterministic, and it offers some navigation freedom in spite of the fact that it runs on a pre-recorded environment.&amp;nbsp; The resulting augmentation package is compatible with authorship, offsite augmentation, and the review of past augmentations.&amp;nbsp; The system could be used in operations (for identifying locations of hidden assets), renovations (locating structure), design (showing a model in its physical world context), and in general as a nice way of accessing information related to a building (the interface being the physical world itself).&amp;nbsp; And although the system can be used away from the augmented area, it could also be used on site, in a similar fashion to typical live augmented reality apps, but this time offering jitter-free augmentation.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Exploring the world of AR, and talking with users in the field, we realize the best solution for them is not always what we would expect.&amp;nbsp; Technology and the market are taking us somewhere, but that is not necessarily always compatible with the needs of infrastructure professionals: accuracy, authorship, reliability &amp;ndash; those have to be taken into account in application design.&amp;nbsp; Our goal is to make sure the AR tools we develop for them are not just toys, but useful and totally reliable.&amp;nbsp; So far our results show we are making some progress in that direction&amp;hellip;&lt;/p&gt;
&lt;p&gt;Want to read more?&amp;nbsp; Check our paper:&lt;/p&gt;
&lt;p&gt;C&amp;ocirc;t&amp;eacute; S., Barnard J., Snyder R., Gervais R., 2013.&amp;nbsp; Offline&amp;nbsp; spatial&amp;nbsp; panoramic&amp;nbsp; video&amp;nbsp; augmentation&amp;nbsp; for&amp;nbsp; visual communication in the aec industry.&amp;nbsp; Proceedings of the 13&lt;sup&gt;th&lt;/sup&gt; International Conference on Construction Applications of Virtual Reality, London, November 2013.&amp;nbsp; &lt;a href="http://communities.bentley.com/cfs-file.ashx/__key/CommunityServer-Blogs-Components-WeblogFiles/00-00-00-50-35-Papers/7840.OfflineAR-_2D00_-2013.pdf"&gt;PDF&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Many thanks to Chuck Fields, owner of the Paddy Wagon Irish Pub in Richmond, Kentucky, for giving us access to his building and permission to share our results!&lt;/em&gt;&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=267698&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Offline">Offline</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/3d%2bmodel">3d model</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/panorama">panorama</category></item><item><title>Exploring Augmented Reality for Construction</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/exploring-augmented-reality-for-construction</link><pubDate>Wed, 12 Jun 2013 19:56:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:c4a25d56-224e-4ac2-b3ff-fc369e596de3</guid><dc:creator>StephaneCote</dc:creator><slash:comments>3</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=247898</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/exploring-augmented-reality-for-construction#comments</comments><description>&lt;p&gt;Infrastructure is inherently 3-dimensional. 
Designers propose a 3D building concept, and ultimately builders create the 3D object that corresponds to the designer&amp;rsquo;s idea. Yet, the only design document that is legally approved for construction is the 2D drawing. Of course,&amp;nbsp;2D drawings are&amp;nbsp;essential documents for construction, as they represent an efficient way of looking at and understanding the complex 3D model information. But the process forces designers, architects and engineers to take one dimension out of their 3D design.&amp;nbsp; Therefore, drafting&amp;nbsp;consists of a complex set of tasks aimed at accurately representing 3D objects with 2D representations. The process is complex and must be done with great care, to ensure that when the builders read the 2D drawings, they will be in a position to build the 3D building exactly as it was designed.&lt;/p&gt;
&lt;p&gt;Plans must be followed carefully. Unfortunately, construction workers cannot work with drawings in their hands &amp;ndash; as they need their hands to do the construction work. They cannot constantly look at the drawings&amp;nbsp;either. So drawings are often put on a table on the site, and workers frequently come to look at them, understand them, and take measurements. In the process, errors may be made. A builder may take the wrong measurement. Or he may be looking at the wrong drawing. Or he may not be looking at the drawing at all, basing his work on what he remembers from what he saw previously on the drawing. This happened when our family house was built: the builder had put the fireplace at the wrong depth with respect to the surface of the wall. When I showed him that the drawing clearly indicated it should be off the wall by 6 inches, he said: &amp;ldquo;You&amp;#39;re right!... I had not seen it I guess...&amp;rdquo;&amp;nbsp; We really need to find better ways to look at drawings. Actually, we need to find better ways to look at the information conveyed by drawings.&lt;/p&gt;
&lt;p&gt;Drawings contain lots of information about a building &amp;ndash; one problem is that workers do not have the drawing constantly before their eyes. The fact that they sometimes have to manually measure distances on the drawings, make calculations, and hold figures in memory, may lead to error. They should carry the drawing with them, all the time. Even better: the drawing should be displayed on a tablet, which would display only the parts of the drawing that are appropriate for a given task (for instance: it would display just one of the 4 sections that appear on a given sheet, the one that matters to the worker at that specific time). But the drawing, on its own, is not the solution. A drawing is a representation of what should be built in the physical world - however that representation is very abstract... To fully understand it, one could lower the level of abstraction by displaying the drawing within a context.&amp;nbsp; Although drawings&amp;nbsp;are 2D, they actually each represent a specific location in the 3D building. For instance, a section drawing represents a section of the wall at a specific (authored) location. Then why not display it right there? I mean display the drawing in the physical building, at the exact location it represents? Displaying the drawing in combination with the physical world may provide the context required to help understand the drawing better. Such a combined display could be achieved using Augmented Reality.&lt;/p&gt;
&lt;p&gt;Our team has done quite some work in the Augmented Reality (AR) field. AR displays digital data in the context of the physical world. This helps interpret the digital data (as it is then displayed in a context). It also helps interpret the physical world, as it then comes with supplementary data. With augmented reality, the whole&amp;nbsp;really is&amp;nbsp;greater than the sum of its parts.&amp;nbsp; We have already explored the possibility of displaying 2D drawings in the context of the physical world, as represented by pre-recorded panoramic images (see our blog &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2012/09/11/augmented-reality-for-building-construction-and-maintenance-augmenting-with-2d-drawings.aspx"&gt;post&lt;/a&gt;), in a static augmentation experience.&amp;nbsp; We wanted to go one step further and make the augmentation more dynamic.&lt;/p&gt;
&lt;p&gt;In this project, we wanted to see what it would be like to build a building using a &lt;em&gt;live&lt;/em&gt; AR system. How could we render (or present) the digital data in such a way that would be useful for the worker? Would the builder really benefit from it? What issues would need to be solved, to be in a position to make such a system operational?&lt;/p&gt;
&lt;p&gt;So we developed a basic AR system for construction, which consists of:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;a set of 3D video eyewear equipped with video cameras,&amp;nbsp;&lt;/li&gt;
&lt;li&gt;an orientation sensor (for measuring the head orientation),&lt;/li&gt;
&lt;li&gt;a set of 3D game controllers (for measuring the head position).&lt;/li&gt;
&lt;/ul&gt;
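Combining those sensors amounts to composing a rigid pose: the orientation sensor gives the head's rotation, the game controllers give its position, and the two are assembled into a single transform each frame. Here is a minimal illustrative sketch in Python (our prototype was not implemented this way) that builds a 4x4 pose matrix from a unit quaternion and a tracked position:

```python
def quat_to_matrix(q):
    """3x3 rotation matrix (row-major) from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def head_pose(orientation_q, position):
    """4x4 pose matrix: rotation from the orientation sensor, translation
    from the position tracker. The renderer inverts this to get the view."""
    r = quat_to_matrix(orientation_q)
    return [r[0] + [position[0]],
            r[1] + [position[1]],
            r[2] + [position[2]],
            [0.0, 0.0, 0.0, 1.0]]
```

Any drift or noise in either sensor shows up directly as misregistration of the model, which is why tracking quality dominates the perceived accuracy of the augmentation.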
&lt;p&gt;Our results are shown in the video below.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.youtube.com/watch?v=-U77ALkOj28&amp;amp;amp;feature=youtu.be"&gt;www.youtube.com/watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;We chose to develop our system using a head mounted display, as we started with the assumption that construction workers need their hands to do the work &amp;ndash; so we needed to design a system that&amp;nbsp;would keep their hands free. But other systems based for instance on ruggedized tablets could also be proposed, and would have other advantages.&lt;/p&gt;
&lt;p&gt;You can probably see many flaws in the video, such as bad tracking (which makes the drawings &amp;ldquo;float&amp;rdquo; above the floor) and no support for occlusion (which causes perception difficulties). In spite of those, the demo leads us to believe there is a future for AR in construction. If implemented properly and accepted by workers, such a system could probably help avoid mistakes and prevent accidents, and consequently speed up the construction process and lower costs. There are, however, many issues to resolve to make that possible. Those include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Enhance the capacity to measure the head position and orientation accurately (inaccurate tracking caused the &amp;quot;floating&amp;quot; effect of the model in the video);&lt;/li&gt;
&lt;li&gt;Find ways to render the model with support for occlusion by physical objects (e.g. tools and studs in the way should occlude the model for it to look more &amp;ldquo;realistic&amp;rdquo;);&lt;/li&gt;
&lt;li&gt;Make sure the system is safe to use &amp;ndash; anything that is displayed before the user&amp;rsquo;s eyes has a potential for creating safety issues as it would prevent the user from seeing fully what is happening around him &amp;ndash; and we all know there are many potential dangers on a building site;&lt;/li&gt;
&lt;li&gt;Develop a system aimed at working in coordination with the worker, not at replacing him. Such coordination would likely lead to the highest outcome in terms of efficiency.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;For sure, it appears that displaying 2D drawings in the context of the physical world helps a lot in conveying the design information. When displayed in a physical world context, drawings become easier to understand, and the physical world becomes easier to interpret.&lt;/p&gt;
&lt;p&gt;One day, we may see all construction workers equipped with high end smart phones integrated into their clothes, with wireless display on their digital contact lenses, following detailed visual and verbal instructions from a construction app that is carefully coordinating the work of all the workers on the site.&lt;/p&gt;
&lt;p&gt;This all seems far-fetched to you? Recently, I was talking with a user about the potential of augmented reality in the AEC industry. When I told him about that vision of AR for construction, he said: &amp;ldquo;&lt;em&gt;Construction workers would never accept using such systems&amp;hellip;&lt;/em&gt;&amp;rdquo;. Then he paused, and continued, more slowly: &amp;ldquo;&lt;em&gt;&amp;hellip; well, back in the old days, we were building roads using levels and long wood studs to check our slopes. Now grader blades are GPS located and electronically controlled&amp;hellip; and that is now widely accepted!&lt;/em&gt;&amp;rdquo;&lt;/p&gt;
&lt;p&gt;Someone famous once wrote: &amp;ldquo;&lt;em&gt;The best way to predict the future is to invent it&lt;/em&gt;&amp;rdquo;&amp;hellip; &lt;br /&gt;Well, the future is ahead of us, and still largely unknown. Let&amp;rsquo;s invent it&amp;hellip;&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=247898&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>Augmented reality for building construction and maintenance: augmenting with 2D drawings</title><link>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/augmented-reality-for-building-construction-and-maintenance-augmenting-with-2d-drawings</link><pubDate>Tue, 11 Sep 2012 20:22:00 GMT</pubDate><guid isPermaLink="false">6dad98f5-dbc9-4c4d-a9ba-e9da8dc6aa8e:bb6b8005-5ba9-4027-8383-124d9de9a91f</guid><dc:creator>StephaneCote</dc:creator><slash:comments>5</slash:comments><wfw:commentRss xmlns:wfw="http://wellformedweb.org/CommentAPI/">https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/rsscomments?WeblogPostID=220874</wfw:commentRss><comments>https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/posts/augmented-reality-for-building-construction-and-maintenance-augmenting-with-2d-drawings#comments</comments><description>&lt;p&gt;Augmented reality finds new applications every week. 
So far, those that have been proposed for the infrastructure world are generally either related to displaying hidden data (such as &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2012/06/18/augmented-reality-for-subsurface-utilities-further-improving-perception.aspx"&gt;underground infrastructure models&lt;/a&gt;),&amp;nbsp;or to &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2011/06/28/augmented-reality-for-infrastructure-a-first-step.aspx"&gt;displaying attributes&lt;/a&gt; related to visible elements. If implemented on portable devices, such applications could be very useful for infrastructure operation and maintenance. There is, however, another aspect of infrastructure that could also benefit from augmented reality: construction. Recently we took some time to explore that possibility.&lt;/p&gt;
&lt;p&gt;Construction drawings are the most important communication documents between the designer and the builder. For large infrastructure projects, there can be a very large number of them (sometimes thousands), each one referencing several other drawings through symbols and codes that indicate their relative spatial relationship. Because of that complexity, it may be hard to find the location a given drawing corresponds to in the physical world, especially on a building site. Builders may also miss important elements that the designer was trying to bring to their attention. Users tell us that drawings are a constant source of misunderstanding, and that they spend a significant amount of time trying to understand what the drawings mean. The use of drawings causes significant losses in time and may cause interpretation errors, which in turn may cause delays and errors in the construction work.&lt;/p&gt;
&lt;p&gt;Yet, drawings are essential &amp;ndash; they are still the only design document that is legally approved for construction. Consequently, even though designers may have produced detailed 3D models, drawings are the only form of design communication certified as reliable for construction.&amp;nbsp; Therefore designers, architects and engineers must take one dimension out of their 3D design and create 2D drawings.&amp;nbsp; Consequently, drafting essentially consists of a complex set of tasks aimed at accurately representing a 3D object with 2D representations.&lt;/p&gt;
&lt;p&gt;Drawings are often misunderstood because they are abstractions. It is like working on a jigsaw puzzle, but trying to interpret any single piece of the puzzle by itself. Moreover, there are lots of drawings &amp;ndash; so it is a non-trivial mental activity to reconstruct a set of abstractions into a coherent 3D representation of what is actually to be built. Paradoxically, although drawings are simplified representations of models, they are sometimes harder to work with&amp;hellip;&lt;/p&gt;
&lt;p&gt;In our most recent research work, we explored the potential of augmented reality for making 2D drawings easier to use on site. In particular, we investigated the problem of finding the physical location that a 2D section drawing represents. It would be interesting, for instance, to display drawings at 1:1 scale, inside the physical world, at the exact position they represent. This is an interesting problem to study, as section drawings often represent wall sections, which are quickly hidden (by the wall surface) as the construction progresses. Therefore, if we were to put the drawing at exactly the right scale and location in the physical world, it would actually be hidden by the wall itself&amp;hellip;&lt;/p&gt;
&lt;p&gt;This problem is reminiscent of the &lt;a href="http://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/2011/12/10/augmentation-of-subsurface-utilities-the-problem-of-spatial-perception.aspx"&gt;subsurface utilities augmentation problem&lt;/a&gt;.&amp;nbsp; Underground pipes are hidden by the ground surface &amp;ndash; displaying them overlaid to the road surface is confusing. However, displaying them underground makes them invisible (hidden by the ground surface). To solve that problem, we proposed a compromise: to display them inside a virtual excavation. Therefore, we proposed a similar solution for section drawings.&lt;/p&gt;
&lt;p&gt;In the following video, you will see two techniques that we propose for making 2D section drawings easier to visualize in an augmented context. The first is a &amp;ldquo;sliding plane&amp;rdquo; technique: the drawing&amp;rsquo;s plane slides to its exact location, drawing the user&amp;rsquo;s attention to the spot in the physical world that the drawing represents. The second technique is similar to excavation: we cut through the wall, revealing the inside of the building (the model) and showing the 2D section drawing at exactly the location it represents &amp;ndash; a representation that retains full context whilst leaving the 2D drawing fully visible.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.youtube.com/watch?v=kANBWhbHaEU"&gt;www.youtube.com/watch&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Clearly, in this visual representation, the result is greater than the sum of its parts: not only can the 2D drawings be interpreted more easily, but so can the 3D model and the physical world!&amp;nbsp; Displaying data this way is like taking a few puzzle pieces which, taken alone, do not mean much, and putting them together, in context, with the other puzzle pieces.&amp;nbsp; Together, they reveal the whole story&amp;hellip;&lt;/p&gt;
&lt;p&gt;This is our first baby step in that direction. Still, as you can see, the techniques seem promising.&lt;/p&gt;
&lt;p&gt;Want to read more?&amp;nbsp; Check our paper:&amp;nbsp; C&amp;ocirc;t&amp;eacute; S., Trudel P., Snyder R., Gervais R., 2013. An augmented reality tool for facilitating on-site interpretation of 2D construction drawings.&amp;nbsp; Proceedings of the conference on Construction Applications of Virtual Reality (CONVR 2013), London, November 2013.&amp;nbsp; &lt;a href="http://communities.bentley.com/cfs-file.ashx/__key/CommunityServer-Blogs-Components-WeblogFiles/00-00-00-50-35-Papers/8270.ConstructionDrawings-_2D00_-2013.pdf"&gt;PDF&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;More to come! Stay tuned&amp;hellip;&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;&lt;img src="https://communities.bentley.com/aggbug?PostID=220874&amp;AppID=5035&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/2D%2bdrawings">2D drawings</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Infrastructure">Infrastructure</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/Augmented%2bReality">Augmented Reality</category><category domain="https://communities.bentley.com/other/old_site_member_blogs/bentley_employees/b/stephanecotes_blog/archive/tags/construction">construction</category></item></channel></rss>