360-degree photo with exact location, match with model

Hi all,

I have some 360-degree photos (similar to Street View photos), and I know the exact coordinates of each photo's location. I'd like to load these photos into MicroStation in such a way that each photo aligns with an existing surface (for example, an existing drawing of the road). I've tried a few things with backgrounds and images, and with placing a camera at the photo's location, but without success. Does anyone have experience with this?
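Whatever tool ends up doing the alignment, the underlying geometry is the same: a 360-degree photo is usually stored as an equirectangular image, and each pixel corresponds to a view direction from the camera position. A minimal sketch of that mapping (assuming the standard equirectangular layout, with the image's left edge at yaw 0 and the top row at the zenith; the `heading_deg` parameter is a hypothetical per-photo heading you would get from the capture metadata):

```python
import math

def equirect_pixel_to_ray(u, v, width, height, heading_deg=0.0):
    """Convert an equirectangular panorama pixel (u, v) to a unit 3D
    view direction from the camera position.

    Assumes the standard equirectangular mapping: u spans 360 degrees
    of yaw, v spans 180 degrees of pitch with v = 0 at the zenith.
    heading_deg rotates the panorama to its surveyed compass heading.
    """
    yaw = (u / width) * 2.0 * math.pi + math.radians(heading_deg)
    pitch = math.pi / 2.0 - (v / height) * math.pi  # +pi/2 up, -pi/2 down
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    return (x, y, z)

# A pixel on the horizon row at u = 0 looks straight along yaw 0:
dx, dy, dz = equirect_pixel_to_ray(0, 1024, 4096, 2048)  # → (1.0, 0.0, 0.0)
```

Casting such a ray from the known photo coordinates and intersecting it with the road surface is one way to check whether the photo and the drawing actually line up.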

Reply
  • Photos + Point Clouds is a good idea.

    Maybe Bentley should look at Calabi Yau's method of fusing images to meshes generated from point clouds.

    "Mesh rendering occurs automatically from the RGB and intensity information inherent within the scan data. Optionally, high resolution spherical images can be imported and fused to a decimated polygonal mesh. The combination of high resolution imagery with a decimated polygonal mesh relieves the system from the burden of scan resolution overkill. This option is very useful for many uses, including virtual survey. This feature provides the benefit of high visual acuity, with just enough mesh geometry to support highly accurate surveys."

    Pointools may be fast with point clouds, but often that's not what is needed. Maybe the mesh tools and STM teams should meet and brainstorm with the Pointools team? There are probably some GPU texturing capabilities available already.
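The fusion step the quote describes boils down to the inverse of the panorama-to-ray mapping: each mesh vertex is projected back into the spherical image to find its texture coordinates. A minimal sketch, assuming an equirectangular image whose zero-yaw column faces +X and whose top row is the zenith (all function and parameter names here are illustrative, not from any Bentley API):

```python
import math

def vertex_to_equirect_uv(px, py, pz, cx, cy, cz, width, height):
    """Project a mesh vertex (px, py, pz) into equirectangular texture
    coordinates (u, v) for a spherical image captured at (cx, cy, cz).

    Assumes zero yaw along +X, v = 0 at the zenith. Occlusion testing
    (is the vertex actually visible from the camera?) is omitted.
    """
    dx, dy, dz = px - cx, py - cy, pz - cz
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    yaw = math.atan2(dy, dx) % (2.0 * math.pi)
    pitch = math.asin(dz / r)  # +pi/2 up, -pi/2 down
    u = (yaw / (2.0 * math.pi)) * width
    v = (0.5 - pitch / math.pi) * height
    return (u, v)

# A vertex straight ahead of the camera lands on the horizon row at u = 0:
u, v = vertex_to_equirect_uv(10.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4096, 2048)  # → (0.0, 1024.0)
```

Because the imagery carries the visual detail, the mesh itself can be heavily decimated, which is exactly the trade-off the quoted passage is advocating.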
