#1 2017-07-04 20:40:43

damo
....moderator....
Registered: 2015-08-20
Posts: 6,734

VirtualGL running on "server's" X display - questions

Wikipedia wrote:

VirtualGL is an open source program that redirects the 3D rendering commands from Unix and Linux OpenGL applications to 3D accelerator hardware in a dedicated server and displays the rendered output interactively to a thin client located elsewhere on the network.

I have an optimus laptop, and when running glxgears I get these outputs:

  • Using on-board graphics

    damo@helium-dev2:~$ glxgears -info
    Running synchronized to the vertical refresh.  The framerate should be
    approximately the same as the monitor refresh rate.
    GL_RENDERER   = Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 
    GL_VERSION    = 3.0 Mesa 13.0.6
    GL_VENDOR     = Intel Open Source Technology Center
    .
    .
    .
    460 frames in 5.0 seconds = 91.909 FPS
    301 frames in 5.0 seconds = 60.023 FPS
    301 frames in 5.0 seconds = 60.019 FPS
    301 frames in 5.0 seconds = 60.021 FPS
    301 frames in 5.0 seconds = 60.021 FPS
    301 frames in 5.0 seconds = 60.019 FPS
  • With bumblebee enabling the graphics card

    damo@helium-dev2:~$ optirun glxgears -info
    GL_RENDERER   = GeForce GTX 970M/PCIe/SSE2
    GL_VERSION    = 4.5.0 NVIDIA 375.66
    GL_VENDOR     = NVIDIA Corporation
    .
    .
    .
    295 frames in 5.0 seconds = 58.977 FPS
    301 frames in 5.0 seconds = 60.020 FPS
    301 frames in 5.0 seconds = 60.023 FPS
    301 frames in 5.0 seconds = 60.018 FPS
    301 frames in 5.0 seconds = 60.022 FPS
  • With bumblebee, and VirtualGL installed:

    damo@helium-dev2:~$ optirun glxgears -info
    GL_RENDERER   = GeForce GTX 970M/PCIe/SSE2
    GL_VERSION    = 4.5.0 NVIDIA 375.66
    GL_VENDOR     = NVIDIA Corporation
    .
    .
    .
    14606 frames in 5.0 seconds = 2921.047 FPS
    14532 frames in 5.0 seconds = 2906.259 FPS
    14803 frames in 5.0 seconds = 2960.591 FPS
    15092 frames in 5.0 seconds = 3018.390 FPS
    15162 frames in 5.0 seconds = 3032.399 FPS

But if the machine isn't serving a client, and is rendering on the server's display, are these results informative or not? Is it really rendering 3k frames per second?
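To compare runs like the ones above, the FPS lines can be averaged with a small helper. This is just a sketch: it assumes glxgears's usual output format, lines like `301 frames in 5.0 seconds = 60.021 FPS`, where the FPS value is the next-to-last field.

```shell
# Average the FPS column from glxgears-style output on stdin.
# Lines look like: "301 frames in 5.0 seconds = 60.021 FPS";
# $(NF-1) is the next-to-last field, i.e. the FPS value.
avg_fps() {
    awk '/frames in/ { sum += $(NF-1); n++ }
         END { if (n) printf "%.1f\n", sum/n }'
}

# Example usage: pipe a benchmark run through it, e.g.
#   glxgears -info | avg_fps
printf '301 frames in 5.0 seconds = 60.021 FPS\n299 frames in 5.0 seconds = 59.799 FPS\n' | avg_fps
# prints: 59.9
```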


Be Excellent to Each Other...
The Bunsenlabs Lithium Desktop » Here
FORUM RULES and posting guidelines «» Help page for forum post formatting
Artwork on DeviantArt  «» BunsenLabs on DeviantArt


#2 2017-07-04 22:04:32

Head_on_a_Stick
Member
From: London
Registered: 2015-09-29
Posts: 9,093
Website

Re: VirtualGL running on "server's" X display - questions

damo wrote:

Is it really rendering 3k frames per second?

Not sure.

What are the real framerate values for the non-VirtualGL setup on your (local) hardware?

vblank_mode=0 glxgears -info
vblank_mode=0 optirun glxgears -info

EDIT: the second one may actually be:

optirun vblank_mode=0 glxgears -info
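Both orderings should generally behave the same, because `VAR=value cmd` puts the variable into `cmd`'s environment, and a wrapper like optirun passes its environment on to the program it launches. A runnable sketch of that propagation, using `env` as a stand-in wrapper since optirun needs Optimus hardware:

```shell
# `VAR=value wrapper cmd` puts VAR in the wrapper's environment; any
# well-behaved wrapper (env here, optirun on an Optimus box) passes it
# through to the child process it spawns.
vblank_mode=0 env sh -c 'echo "child sees vblank_mode=$vblank_mode"'
# prints: child sees vblank_mode=0
```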

Last edited by Head_on_a_Stick (2017-07-04 22:09:15)


#3 2017-07-04 22:36:58

damo
....moderator....
Registered: 2015-08-20
Posts: 6,734

Re: VirtualGL running on "server's" X display - questions

damo@helium:~$ vblank_mode=0 glxgears -info
ATTENTION: default value of option vblank_mode overridden by environment.
ATTENTION: default value of option vblank_mode overridden by environment.
GL_RENDERER   = Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2) 
GL_VERSION    = 3.0 Mesa 13.0.6
GL_VENDOR     = Intel Open Source Technology Center
.
.
.
40851 frames in 5.0 seconds = 8170.163 FPS
40660 frames in 5.0 seconds = 8131.974 FPS
41028 frames in 5.0 seconds = 8205.537 FPS
40916 frames in 5.0 seconds = 8183.041 FPS
damo@helium:~$ vblank_mode=0 optirun glxgears -info
GL_RENDERER   = GeForce GTX 970M/PCIe/SSE2
GL_VERSION    = 4.5.0 NVIDIA 375.66
GL_VENDOR     = NVIDIA Corporation
.
.
.
14792 frames in 5.0 seconds = 2958.338 FPS
14561 frames in 5.0 seconds = 2912.061 FPS
14707 frames in 5.0 seconds = 2941.379 FPS
15183 frames in 5.0 seconds = 3036.573 FPS
14527 frames in 5.0 seconds = 2905.228 FPS

I think this is going down the rabbit-hole! I was just curious whether installing VirtualGL would improve graphics performance on a machine which never acts as a server. I have been getting nowhere asking Mr. Google - thanks for investigating smile

EDIT: https://askubuntu.com/questions/285342/ … benchmarks

"vblank_mode" was the clue. I guess VirtualGL is pointless in my use case.

Last edited by damo (2017-07-04 22:39:51)




#4 2017-07-05 05:51:57

Head_on_a_Stick
Member
From: London
Registered: 2015-09-29
Posts: 9,093
Website

Re: VirtualGL running on "server's" X display - questions

^ Looks like you should be disabling the discrete NVIDIA card for best performance; that Skylake integrated chip is sweet  8)

Shame it can't do CUDA...

EDIT: my poor old X201 can't really compete:

1449 frames in 5.0 seconds = 289.781 FPS

sad

Last edited by Head_on_a_Stick (2017-07-05 06:20:21)


#5 2017-07-05 10:32:58

damo
....moderator....
Registered: 2015-08-20
Posts: 6,734

Re: VirtualGL running on "server's" X display - questions

I only got the nvidia card + CUDA for Cycles rendering smile




#6 2017-07-05 17:05:10

stevep
MX Linux Developer
Registered: 2016-08-08
Posts: 381

Re: VirtualGL running on "server's" X display - questions

VirtualGL includes what is IMO a somewhat better benchmark applet: glxspheres for 32-bit or glxspheres64 for 64-bit.

But I think the best benchmark is to run something like Unigine's Heaven or Valley demo and compare the framerates; that seems to give accurate results with my Optimus setup.


#7 2017-07-05 19:09:06

Head_on_a_Stick
Member
From: London
Registered: 2015-09-29
Posts: 9,093
Website

Re: VirtualGL running on "server's" X display - questions

I like to use CS:GO to benchmark all of my changes; an XM1014 head shot is far more satisfying than watching Unigine  ]:D


#8 2017-07-05 19:51:52

Sector11
Mod Squid Tpyo Knig
From: Upstairs
Registered: 2015-08-20
Posts: 8,030

Re: VirtualGL running on "server's" X display - questions

^ & ^^ & ^^^

Just so you guys really feel good:

 05 Jul 17 @ 16:38:58 ~
   $ vblank_mode=0 glxgears -info
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
GL_RENDERER   = GeForce 210/PCIe/SSE2
GL_VERSION    = 3.3.0 NVIDIA 340.102
GL_VENDOR     = NVIDIA Corporation
{snip} ... 
{snip} ... 
{snip} ... 
VisualID 39, 0x27
302 frames in 5.0 seconds = 60.374 FPS
300 frames in 5.0 seconds = 59.998 FPS
300 frames in 5.0 seconds = 59.995 FPS
301 frames in 5.0 seconds = 59.928 FPS
301 frames in 5.0 seconds = 60.072 FPS
300 frames in 5.0 seconds = 59.983 FPS
301 frames in 5.0 seconds = 60.010 FPS
299 frames in 5.0 seconds = 59.799 FPS

It's called: slow


Debian 12 Beardog, SoxDog and still a Conky 1.9er


#9 2017-07-05 19:55:11

Head_on_a_Stick
Member
From: London
Registered: 2015-09-29
Posts: 9,093
Website

Re: VirtualGL running on "server's" X display - questions

Sector11 wrote:
 05 Jul 17 @ 16:38:58 ~
   $ vblank_mode=0 glxgears -info
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.

The statement is not consistent with the command: the `glxgears` output is still showing your monitor refresh rate for some reason.


#10 2017-07-05 20:04:48

stevep
MX Linux Developer
Registered: 2016-08-08
Posts: 381

Re: VirtualGL running on "server's" X display - questions

I believe the "vblank_mode=0" environment variable only works with the free Xorg drivers.  For the NVIDIA proprietary driver, try

__GL_SYNC_TO_VBLANK=1 glxgears

or you can temporarily turn off the sync in nvidia-settings.
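For reference, the NVIDIA variable follows the same convention as Mesa's: 0 disables sync-to-vblank and 1 enables it, so an uncapped benchmark run would look like the sketch below (both commands assume a running X display and the matching driver, so they won't do anything useful elsewhere):

```shell
# Hedged sketch: per-driver vsync toggles for benchmarking (0 = sync off).
vblank_mode=0 glxgears -info          # Mesa/DRI (Intel, free AMD/NVIDIA drivers)
__GL_SYNC_TO_VBLANK=0 glxgears -info  # NVIDIA proprietary driver
```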

Last edited by stevep (2017-07-05 20:07:01)


#11 2017-07-05 20:44:48

Sector11
Mod Squid Tpyo Knig
From: Upstairs
Registered: 2015-08-20
Posts: 8,030

Re: VirtualGL running on "server's" X display - questions

^  but damo's output is with NVIDIA:

damo@helium:~$ vblank_mode=0 optirun glxgears -info
GL_RENDERER   = GeForce GTX 970M/PCIe/SSE2
GL_VERSION    = 4.5.0 NVIDIA 375.66
GL_VENDOR     = NVIDIA Corporation
.
14792 frames in 5.0 seconds = 2958.338 FPS
14561 frames in 5.0 seconds = 2912.061 FPS

sooooo

 05 Jul 17 @ 17:29:32 ~
   $ __GL_SYNC_TO_VBLANK=1 glxgears
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
303 frames in 5.0 seconds = 60.504 FPS
301 frames in 5.0 seconds = 60.001 FPS
300 frames in 5.0 seconds = 59.998 FPS

Doesn't seem to make a difference (cheap onboard card).

But I'm hijacking the thread --- just realized it says "virtual", and I have no virtual machine




#12 2017-07-06 01:33:33

johnraff
nullglob
From: Nagoya, Japan
Registered: 2015-09-09
Posts: 12,673
Website

Re: VirtualGL running on "server's" X display - questions

@Sector11 if it makes you feel any better, my results are identical to yours! (Hence I didn't bother to post.)
Except for:

GL_RENDERER   = GeForce GT 220/PCIe/SSE2

...elevator in the Brain Hotel, broken down but just as well...
( a boring Japan blog (currently paused), now on Bluesky, there's also some GitStuff )

Introduction to the Bunsenlabs Boron Desktop


#13 2017-07-06 02:07:27

Sector11
Mod Squid Tpyo Knig
From: Upstairs
Registered: 2015-08-20
Posts: 8,030

Re: VirtualGL running on "server's" X display - questions

Hey, they work; we should be happy.  Besides, we're over the hill, and we accept that all things are slower these days. big_smile



