[CE 16] How are multiple GPUs supported by MicroStation?

Hi,

Just want to find out: if a system has 2 GPUs, does MicroStation have a say in which GPU it will use, or is that decided by Windows?

For example, I have the MicroStation window displayed on monitor 1, which is connected to GPU 1. Does that mean only GPU 1 is used by MicroStation? I'm aware that Windows 10 allows setting the specific GPU used per app, but I just want to understand if that means what I think it means.

Best regards,

Tuan Le

  • Hi Tuan,

    Some comments in addition to Marco's answer, because you do not share a complete picture of your HW configuration, and I also think one scenario is missing from Marco's explanation.

    In my opinion, there are three possible scenarios:

    • 2 (or more) discrete GPUs are installed, serving different monitors: It's not a usual configuration, but it is possible (e.g. when anybody used SLI in the past). In that case, both GPUs are used when MicroStation is displayed on both monitors.
    • Integrated + discrete GPUs are installed: This is typical in notebooks, which combine an integrated GPU with discrete mobile graphics. As Marco wrote, in this case it depends on the Windows configuration: typically the integrated GPU is used in low-performance mode (on battery) and the discrete GPU in high-performance mode (on mains power). The vendor control applications usually allow this default rule to be changed individually for every application.
    • Not mentioned yet is VUE rendering: It tries to utilize "as much as possible", so when both a CPU and a GPU (supporting OpenCL) are found, rendering is computed on both. I have had no chance to try a CPU + GPU + GPU configuration, but I guess everything would be used in that case. Also, when an Nvidia card specifically is available, it can be used for the denoiser.
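    As a side note on the per-application rule mentioned in the second scenario: in Windows 10 (version 1803 and later) the per-app GPU preference is set under Settings > System > Display > Graphics settings, and Windows stores the choice in the registry under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`. A minimal sketch of setting it from the command line follows; the MicroStation executable path below is an assumption, so point it at your actual install:

    ```shell
    :: Force the high-performance GPU for one application (Windows 10 1803+).
    :: GpuPreference=1 -> power-saving GPU, GpuPreference=2 -> high-performance GPU.
    :: The value name is the full path of the .exe; the path here is an assumption.
    reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
        /v "C:\Program Files\Bentley\MicroStation\microstation.exe" ^
        /t REG_SZ /d "GpuPreference=2;" /f
    ```

    This is exactly what the Settings page writes, so editing it there and inspecting the key is a good way to verify the value format on your machine.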

    When you are not sure whether a GPU is used (or which GPU, when more than one is installed), use Windows Task Manager, where all the basic statistics are displayed. Also, both AMD and Nvidia provide their own control panels, which allow monitoring of GPU parameters and utilization.
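    The same per-engine GPU statistics that Task Manager shows can also be read from PowerShell, which is handy for logging which physical GPU an application is actually loading. A sketch, assuming the Windows 10 GPU performance counters are present on your system:

    ```shell
    # Sample the 3D-engine GPU utilization counters once (Windows 10+).
    # Each instance name encodes the process id and the GPU adapter (luid),
    # so non-zero samples show which GPU a given application is using.
    Get-Counter '\GPU Engine(*engtype_3D)\Utilization Percentage' |
        Select-Object -ExpandProperty CounterSamples |
        Where-Object { $_.CookedValue -gt 0 } |
        Sort-Object CookedValue -Descending |
        Format-Table InstanceName, CookedValue -AutoSize
    ```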

    With regards,

      Jan

  • Hi Jan, 

    Thanks for the insight. SLI isn't on our radar. A question was raised in our office: "is there any benefit if we have more GPUs?" For day-to-day 3D and 2D work, rendering aside, I think for the most simplistic use case the answer seems to be "yes". Just want to make sure that is the case.

  • I think for the most simplistic use case the answer seems to be "yes".

    In my opinion, the answer is "no":

    Even average mid-range cards, especially DirectX-optimized ones (targeting the gaming market), are able to handle MicroStation on multiple monitors at high resolution. I have to admit I have no experience with gigabyte-scale complex 3D datasets, but every time MicroStation looks slow (large point clouds), I feel it is because of optimization, not because of insufficient GPU power.

    An exception to this opinion is frequent VUE rendering, where having more of the best available GPUs makes sense. And even though I prefer AMD (my feeling is that they offer a better cost / power consumption / DirectX performance ratio), an Nvidia GPU would be better here, because of the availability of the Nvidia denoiser.

    But what configuration and GPU are best depends a lot on local specifics and project requirements. My customers often use mid-range (or even standard) PCs for normal drawing and modeling (for 2D and average 3D, even an integrated GPU can provide enough power), with dedicated high-performance workstation(s) used by the "company visualization guru" ;-)

    With regards,

      Jan

  • About denoising, we support both the Nvidia and Intel denoisers. The Nvidia denoiser requires an Nvidia card (Maxwell generation or newer, i.e. roughly four years old or less), while the Intel denoiser will run on anything, including AMD video cards. The results are comparable, and the speed is also great in both cases, as the denoising generally takes only a fraction of a second.

    We plan to also implement the AMD denoiser, which should run on any hardware, in 2022.

  • Thanks for the details.

    We plan to also implement the AMD denoiser, which should run on any hardware, in 2022.

    It sounds promising, I am looking forward to it! :-)

    Regards,

      Jan
