<div dir="ltr"><div><div><div>Often with 3D applications there is a bottleneck between the application and the GPU. The SL viewer uses an older OpenGL API that requires the driver and hardware to move many small pieces of data from the CPU to the GPU every frame, and each transfer carries per-call overhead. If the hardware or the driver is not designed to optimize this mode of operation, you will see reduced frame rates. This does *not* mean that a particular GPU is slow; rather, it means that the CPU and GPU cannot communicate quickly in the way the application asks them to. This per-call overhead is typically referred to as the "small batch problem", and newer versions of OpenGL offer ways to reduce it. Cards like the Quadro are not designed for gaming-style applications and probably do not optimize for these older batching modes.<br><br></div>See: <a href="http://community.secondlife.com/t5/Technical/NVIDIA-Quadro-does-it-work-in-SL/qaq-p/1065183">http://community.secondlife.com/t5/Technical/NVIDIA-Quadro-does-it-work-in-SL/qaq-p/1065183</a><br></div>and: <a href="http://www.nvidia.com/docs/IO/8228/BatchBatchBatch.pdf">http://www.nvidia.com/docs/IO/8228/BatchBatchBatch.pdf</a><br><br></div>Bottom line: a Quadro is probably a poor choice for running the SL viewer.<br></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Sep 8, 2014 at 3:53 AM, Ai Austin <span dir="ltr"><<a href="mailto:ai.ai.austin@gmail.com" target="_blank">ai.ai.austin@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I hope no one minds me raising this here... but it may be a useful place to ask, since we have experienced and knowledgeable developers and hardware designers on this discussion list...<br>
<br>
I know the LL viewers, and hence most of the other viewers, say that the Nvidia cards that "report as Quadro" don't have the same support or testing as the consumer GTX cards. But what is it about these cards or their drivers that does not run well?<br>
<br>
On looking at the stats bar, the only differences I can see relate to KTris per Frame and KTris per Sec, which are significantly lower on the Quadro K4000 compared to a GTX 580 or GTX 680.<br>
<br>
I have been monitoring my fps more closely than usual recently, and I see significantly higher frame rates on a much less powerful, older Nvidia GTX 580 on one rig (with only 8GB of memory and rotating disks) than on the system I thought would give the best performance, with an Nvidia Kepler-based Quadro K4000, 32GB of memory and SSD drives, where I seem to get only about half the frame rate. I set these up as seemingly identical tests: 1920×1080 display, identical ultra video settings, the same view distance, shadows, anti-aliasing, cache size and network bandwidth, and everything else, with my avatar at the same time of day facing the same direction on an otherwise unoccupied region.<br>
<br>
I wonder if anyone knows of a specific issue that may be the root cause of the difference between the consumer-level Nvidia GTX GPUs, which apparently deliver better fps, and the seemingly more powerful but worse-performing Nvidia Quadro Kepler GPUs?<br>
<br>
_______________________________________________<br>
Opensim-users mailing list<br>
<a href="mailto:Opensim-users@opensimulator.org" target="_blank">Opensim-users@opensimulator.org</a><br>
<a href="http://opensimulator.org/cgi-bin/mailman/listinfo/opensim-users" target="_blank">http://opensimulator.org/cgi-bin/mailman/listinfo/opensim-users</a><br>
</blockquote></div><br></div>
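<div dir="ltr">The batch-overhead argument above can be put as back-of-envelope arithmetic. This is only an illustrative sketch: the rough figure of 25,000 batches per second per GHz of CPU (with the CPU doing nothing but submitting draw calls) comes from the linked "Batch, Batch, Batch" slides, and the function name and parameters here are made up for the example.<br><br>
```python
# Back-of-envelope model of the "small batch problem" discussed above.
# Assumed figure (from the linked BatchBatchBatch.pdf slides, illustrative
# only): a CPU can issue roughly 25,000 draw calls ("batches") per second
# per GHz when fully devoted to submission.
BATCHES_PER_SEC_PER_GHZ = 25_000

def max_batches_per_frame(cpu_ghz: float, fps: float,
                          cpu_fraction: float = 1.0) -> float:
    """Rough upper bound on draw calls per frame the CPU side can sustain.

    cpu_fraction is the share of CPU time spent issuing draw calls; the
    rest goes to the rest of the viewer's work.
    """
    return cpu_fraction * cpu_ghz * BATCHES_PER_SEC_PER_GHZ / fps

# Example: a 3 GHz core spending 20% of its time on draw-call submission,
# targeting 60 fps, can afford only about 250 small batches per frame.
print(round(max_batches_per_frame(3.0, 60, 0.2)))  # 250
```
<br>The point of the sketch is that the budget is set by per-call CPU/driver overhead, not by GPU horsepower, which is why a driver not tuned for this submission pattern can halve frame rates on an otherwise stronger card.<br></div>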