[Paraview] Sorry for the 2 previous posts (this is the one)

Amy Henderson amy.henderson at kitware.com
Mon Sep 27 11:31:33 EDT 2004


>
>Yes they do if I run the client on the same machine as the server.  If I
>run the client on a windows machine (or any other machine for that
>matter) in this mode I get a whole bunch of VTK socket communicator
>errors  (if I spent half a day problem solving this I could probably
>work it out but for now I am happy with a quick fix).  When I run in
>just server/client mode (no render server) I can put the client on any
>machine I want.

Is the server behind a firewall where the client would have a hard time 
connecting to it?  Are you passing the host name of the server to the 
client using the --host argument on the client command line?  (The default 
is "localhost", which would explain the behavior you are seeing.)

>If the server/client mode also supports distributed rendering then it
>seems that the only use for separating out the render server is to be
>able to use different machines for the different tasks.  Is this correct
>or am I missing something?

Yes, the reason for using a separate render server is that the machines 
of the render server have better rendering capabilities than the machines 
of the data server.
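
To spell out the split, the three pieces and their typical placement are 
roughly:

  client        - the ParaView GUI, running on your desktop
  render server - machines with good graphics hardware, doing the rendering
  data server   - machines reading and processing the data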


>Also I got the impression that it was difficult to use the GPUs to do
>rendering.  But it seems that the tiled windows that are popping up on
>my server are in fact using OpenGL, if I am to believe their titles,
>which read something like "opengl window".  Does this mean that the
>rendering is being done by OpenGL (and therefore probably the GPU) and
>that if I had fast graphics cards in all my nodes the rendering would go
>much faster?

Unless you have told ParaView to use software rendering (i.e., 
--use-software-rendering or -s), it is trying to use the graphics card 
in each machine.  So yes, faster graphics cards would certainly help in 
this situation.
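
Conversely, if you ever need to force software rendering (for example, 
on nodes with no usable graphics hardware), you would pass that flag when 
starting the server, roughly like this (the --server spelling is an 
assumption, as above; --use-software-rendering/-s is as described):

  paraview --server --use-software-rendering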

- Amy 




