[Paraview] PV 3.12.0 coprocessing problem

Andy Bauer andy.bauer at kitware.com
Mon Jan 2 18:09:16 EST 2012


I have some upcoming changes in the script generator plugin that may help
out with some of your problems when outputting images.  It will give you
options to fit the view to screen (i.e. the reset button in the gui),
rescale to the data range, and magnify the outputted image.  It will also
give you separate output frequencies and better control of the file names
as well.
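[Editor's note: these options correspond roughly to the following paraview.simple calls in a hand-written script. This is a hedged sketch only; `save_fitted_image` is a hypothetical helper and the plugin's actual generated code will differ.]

```python
def save_fitted_image(view, filename, magnification=2):
    """Fit the view, rescale colors to the data range, write a magnified PNG.

    Assumes paraview.simple semantics inside pvbatch; `view` is a render
    view proxy.  WriteImage(filename, writerName, magnification) matches the
    vtkSMViewProxy::WriteImage signature seen in the stack traces below.
    """
    from paraview import simple
    simple.SetActiveView(view)
    simple.ResetCamera()              # "fit view to screen" (the reset button)
    rep = simple.GetDisplayProperties()
    rep.RescaleTransferFunctionToDataRange()   # rescale to the data range
    view.WriteImage(filename, "vtkPNGWriter", magnification)
```
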

Andy

On Mon, Jan 2, 2012 at 5:54 PM, Biddiscombe, John A. <biddisco at cscs.ch> wrote:

> Andy
>
> Thanks for the replies, I've been out drinking a bit so this reply may be
> hazy, but I did an update after Utkarsh's last gatekeeper review (which
> mentioned some pvbatch fixes) and found that your test works fine now. Two
> things: one is that I did the update (on my master from kitware/master);
> the other is that I looked at the test and saw you'd left
> /my/ssd/drive/path/stuff in the output filenames, so it would never have
> worked for me, and I didn't check it before (this would explain the hang).
> However, I got an error message (which I hadn't before) telling me it was
> unable to write to file XXX (that's why I checked the path names in the
> script). Your test now runs just fine for me.
>
> My own coprocessing test still has problems. It no longer hangs, but
> nothing is written out (no png, I mean), so I'll try your suggestion of
> adding writers to the pipeline to see what geometry is present.
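[Editor's note: the "add a writer" debugging step can be sketched as below. This is a hedged illustration only; `add_debug_writer`, the filter handle, and the filename pattern are hypothetical and not from JB's attached script, which was not inlined in this thread.]

```python
def add_debug_writer(contour, filename="contour_debug.pvtp"):
    """Attach a parallel polydata writer to a coprocessing pipeline filter.

    Assumes paraview.simple is importable (as it is inside pvbatch/pvpython).
    Dumping the contour output to disk shows whether any geometry exists at
    all before suspecting the image-writing stage.
    """
    from paraview import simple  # only available inside a ParaView Python
    writer = simple.XMLPPolyDataWriter(Input=contour, FileName=filename)
    return writer
```
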
>
> I agree that it is almost certainly not the coprocessing that's now
> the issue.
>
> Thanks for delving into this.
>
> JB
>
> One small wrinkle: on Linux, my python print statements produce
> output I can see, but on Windows I see nothing from them. No idea why, but
> it'd help debugging if I knew.
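[Editor's note: one frequent culprit for invisible print output under mpirun/pvbatch is stdout buffering. The sketch below forces a flush after each message; this is a hedged guess at the cause, since the Windows behaviour above was never diagnosed in the thread, and the `log` helper is illustrative.]

```python
import sys

def log(msg):
    """Print a debug message and flush immediately.

    Buffered stdout is a common reason print output never appears when a
    launcher redirects the streams; flushing rules that cause in or out.
    """
    sys.stdout.write(msg + "\n")
    sys.stdout.flush()

log("coprocessing: RequestDataDescription called")
```
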
>
> *From:* Andy Bauer [mailto:andy.bauer at kitware.com]
> *Sent:* 02 January 2012 22:35
>
> *To:* Biddiscombe, John A.
> *Cc:* paraview at paraview.org
> *Subject:* Re: [Paraview] PV 3.12.0 coprocessing problem
>
> I made a mistake.  That reduced-1.py script doesn't work with that sha but
> does work with an older version.  For that sha you'll need to comment out
> the lines with HeadPose and WandPose.
>
> Andy
>
> On Mon, Jan 2, 2012 at 4:16 PM, Andy Bauer <andy.bauer at kitware.com> wrote:
>
> Happy New Year John,
>
> That branch that you used should be fine.  I did my testing below with
> 803fb9bda66b479949556c14f9777384995188a6.
>
> I got a chance to play around with your reduced-1.py script and didn't see
> anything wrong with it.  I modified it to work with my driver script
> and to write out the results (I didn't want to mess with the view settings
> to get a "nice" picture).  I also removed the part where the proxies are
> deleted because that was changed about a week ago.  I'm attaching the
> files I used to run.  I ran it both in serial (bin/pvpython pkfdriver.py
> reduced-1.py 3) and parallel (mpirun -np 2 bin/pvbatch --symmetric
> pkfdriver.py reduced-1.py 3) without any problems.
>
> I'm not sure how to go any further in solving this problem myself since
> I'm quite unfamiliar with the rendering parts of VTK and ParaView.  It does
> seem odd to me though that vtkRenderWindowInteractor::Initialize() is in
> the stack of the single process.  Is your input a topologically structured
> grid?  If it is, did you properly set the whole extent for your
> vtkCPInputDataDescription object?  The last thing I can think of is to add
> in a writer to the script to see the actual output from the contour
> filter.  Maybe someone else on the list has an idea of what's going wrong
> since I don't think it's a problem with the coprocessing libraries.  If you
> can demonstrate the problem in some way that I can repeat it here though
> I'll keep trying to track it down.  Maybe if you could send me a file for
> your data set I can try to replicate it here.
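[Editor's note: the whole-extent question above is normally answered in the adaptor's RequestDataDescription step. Below is a minimal sketch assuming the usual vtkCPProcessor/vtkCPInputDataDescription adaptor layout; the function name, the "input" channel name, and the grid dimensions are illustrative assumptions, not code from this thread.]

```python
def describe_input(datadescription, grid, dims):
    """Attach a structured grid to the coprocessor and set its whole extent.

    `datadescription` is a vtkCPDataDescription, `grid` a vtkImageData, and
    `dims` the *global* point dimensions across all processes (all names
    here are assumptions about the caller's adaptor).
    """
    input_description = datadescription.GetInputDescriptionByName("input")
    input_description.SetGrid(grid)
    # For topologically structured data the global whole extent must be set;
    # otherwise downstream filters may see an empty or partial domain.
    input_description.SetWholeExtent(0, dims[0] - 1,
                                     0, dims[1] - 1,
                                     0, dims[2] - 1)
```
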
>
> Andy
>
> On Mon, Dec 26, 2011 at 4:57 PM, Biddiscombe, John A. <biddisco at cscs.ch>
> wrote:
>
> Andy,
>
> Merry Christmas …
>
> > Any chance you could test with paraview's current master branch? <
>
>
> Using kitware/master ref: aef70217488706ecd124e4ad5cb25e47f57d1478
>
> Using the pvbatch test you sent, it hangs the same … I have N-1 in here
>
> >  vtkParallel.dll!vtkMPICommunicator::BroadcastVoidArray(void * data, __int64 length, int type, int root)  Line 1132 + 0x3b bytes  C++
>    vtkParallel.dll!vtkCommunicator::Broadcast(int * data, __int64 length, int srcProcessId)  Line 256  C++
>    vtkParallel.dll!vtkMultiProcessController::Broadcast(int * data, __int64 length, int srcProcessId)  Line 402  C++
>    vtkPVServerManager.dll!vtkSMUtilities::SaveImageOnProcessZero(vtkImageData * image, const char * filename, const char * writerName)  Line 139  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char * filename, const char * writerName, int magnification)  Line 311 + 0x1f bytes  C++
>    vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object * self, _object * args)  Line 367 + 0x1f bytes  C++
>
> and 1 in here.
>
> >  vtkParallel.dll!vtkMPICommunicatorReduceData(const void * sendBuffer, void * recvBuffer, __int64 length, int type, int operation, int destProcessId, int * comm)  Line 317  C++
>    vtkParallel.dll!vtkMPICommunicator::ReduceVoidArray(const void * sendBuffer, void * recvBuffer, __int64 length, int type, int operation, int destProcessId)  Line 1395 + 0x4c bytes  C++
>    vtkParallel.dll!vtkCommunicator::Reduce(const double * sendBuffer, double * recvBuffer, __int64 length, int operation, int destProcessId)  Line 633  C++
>    vtkParallel.dll!vtkMultiProcessController::Reduce(const double * sendBuffer, double * recvBuffer, __int64 length, int operation, int destProcessId)  Line 811  C++
>    vtkPVClientServerCore.dll!vtkPVSynchronizedRenderWindows::SynchronizeBounds(double * bounds)  Line 1381  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::GatherBoundsInformation(bool using_distributed_rendering)  Line 598  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::Render(bool interactive, bool skip_rendering)  Line 882  C++
>    vtkPVClientServerCore.dll!vtkPVRenderView::StillRender()  Line 745  C++
>    vtkPVClientServerCoreCS.dll!vtkPVRenderViewCommand(vtkClientServerInterpreter * arlu, vtkObjectBase * ob, const char * method, const vtkClientServerStream & msg, vtkClientServerStream & resultStream)  Line 258  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessCommandInvoke(const vtkClientServerStream & css, int midx)  Line 379 + 0x2f bytes  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessOneMessage(const vtkClientServerStream & css, int message)  Line 214 + 0x1d bytes  C++
>    vtkClientServer.dll!vtkClientServerInterpreter::ProcessStream(const vtkClientServerStream & css)  Line 183 + 0x14 bytes  C++
>    vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStreamInternal(const vtkClientServerStream & stream, bool ignore_errors)  Line 636  C++
>    vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStream(unsigned int location, const vtkClientServerStream & stream, bool ignore_errors)  Line 606  C++
>    vtkPVServerImplementation.dll!vtkPVSessionBase::ExecuteStream(unsigned int location, const vtkClientServerStream & stream, bool ignore_errors)  Line 157  C++
>    vtkPVServerManager.dll!vtkSMProxy::ExecuteStream(const vtkClientServerStream & stream, bool ignore_errors, unsigned int location)  Line 2092  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::StillRender()  Line 137 + 0x18 bytes  C++
>    vtkPVServerManager.dll!`anonymous namespace'::vtkRenderHelper::EventuallyRender()  Line 86  C++
>    vtkPVVTKExtensions.dll!vtkPVGenericRenderWindowInteractor::Render()  Line 302  C++
>    vtkRendering.dll!vtkRenderWindowInteractor::Initialize()  Line 632  C++
>    vtkRendering.dll!vtkRenderWindowInteractor::ReInitialize()  Line 76 + 0x13 bytes  C++
>    vtkRendering.dll!vtkWin32OpenGLRenderWindow::SetOffScreenRendering(int offscreen)  Line 1417  C++
>    vtkPVServerManager.dll!vtkSMRenderViewProxy::CaptureWindowInternal(int magnification)  Line 875  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::CaptureWindow(int magnification)  Line 268 + 0x20 bytes  C++
>    vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char * filename, const char * writerName, int magnification)  Line 307 + 0x11 bytes  C++
>    vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object * self, _object * args)  Line 367 + 0x1f bytes  C++
>
> I'll look and see if some commits mentioned in earlier emails are missing
> from master.
>
> JB
>
> *From:* Andy Bauer [mailto:andy.bauer at kitware.com]
> *Sent:* 21 December 2011 21:59
>
>
> *To:* Biddiscombe, John A.
> *Cc:* paraview at paraview.org
> *Subject:* Re: [Paraview] PV 3.12.0 coprocessing problem
>
> I'm attaching a couple of files that you can use to test if you have the
> proper fix for your branch.  Run the python script with "mpirun -np 8
> bin/pvbatch -sym parallelpythontest.py".  If you get the same results then
> you don't have the fix yet for the image issue.  Note that
> parallelpythontest2.png isn't getting colored properly while
> parallelpythontest.png is.
>
> Andy
>
> On Wed, Dec 21, 2011 at 3:48 PM, Biddiscombe, John A. <biddisco at cscs.ch>
> wrote:
>
> I'll give it a try using master; a very simple python script is attached.
>
> The original contained a lot more filters; this has most of them stripped
> out and just a contour left. And yes, it is possible that some processes
> have no points.
>
> JB
>
> *From:* Andy Bauer [mailto:andy.bauer at kitware.com]
> *Sent:* 21 December 2011 19:10
> *To:* Biddiscombe, John A.
> *Cc:* paraview at paraview.org
> *Subject:* Re: [Paraview] PV 3.12.0 coprocessing problem
>
> Hi John,
>
> There were a couple of issues when saving images.  One was for saving
> charts and maybe 2d views.  The other one was for when some processes
> didn't have any points or cells.  Looking at your stack traces I don't
> think it's the latter since that would fail in the python script and give a
> warning in there.  Any chance you could test with paraview's current
> master branch?
>
> Are you using a python script to drive the coprocessing?  If yes, can you
> share it?
>
> Andy
>
> On Wed, Dec 21, 2011 at 12:15 PM, Biddiscombe, John A. <biddisco at cscs.ch>
> wrote:
>
> I'm getting lock ups when saving images using the coprocessing. It looks a
> lot like a bug that was fixed many moons ago, but maybe the fix got lost in
> a merge ...
>
> 1  process makes it to here and waits for MPI traffic
>
> >       vtkParallel.dll!vtkMPICommunicatorReduceData(const void *
> sendBuffer=0x000000000012c318, void * recvBuffer=0x000000000012c378,
> __int64 length=3, int type=11, int operation=1476395010, int
> destProcessId=0, int * comm=0x0000000005d77670)  Line 317   C++
>        vtkParallel.dll!vtkMPICommunicator::ReduceVoidArray(const void *
> sendBuffer=0x000000000012c318, void * recvBuffer=0x000000000012c378,
> __int64 length=3, int type=11, int operation=1, int destProcessId=0)  Line
> 1422 + 0x4c bytes      C++
>        vtkParallel.dll!vtkCommunicator::Reduce(const double *
> sendBuffer=0x000000000012c318, double * recvBuffer=0x000000000012c378,
> __int64 length=3, int operation=1, int destProcessId=0)  Line 633 C++
>        vtkParallel.dll!vtkMultiProcessController::Reduce(const double *
> sendBuffer=0x000000000012c318, double * recvBuffer=0x000000000012c378,
> __int64 length=3, int operation=1, int destProcessId=0)  Line 811       C++
>
>  vtkPVClientServerCore.dll!vtkPVSynchronizedRenderWindows::SynchronizeBounds(double
> * bounds=0x000000000d18a8a8)  Line 1381      C++
>
>  vtkPVClientServerCore.dll!vtkPVRenderView::GatherBoundsInformation(bool
> using_distributed_rendering=true)  Line 598     C++
>        vtkPVClientServerCore.dll!vtkPVRenderView::Render(bool
> interactive=false, bool skip_rendering=false)  Line 882  C++
>        vtkPVClientServerCore.dll!vtkPVRenderView::StillRender()  Line 745
>      C++
>
>  vtkPVClientServerCoreCS.dll!vtkPVRenderViewCommand(vtkClientServerInterpreter
> * arlu=0x0000000005d6be10, vtkObjectBase * ob=0x000000000d18a770, const
> char * method=0x000000000e1f7ee9, const vtkClientServerStream & msg={...},
> vtkClientServerStream & resultStream={...})  Line 258  C++
>
>  vtkClientServer.dll!vtkClientServerInterpreter::ProcessCommandInvoke(const
> vtkClientServerStream & css={...}, int midx=0)  Line 379 + 0x2f bytes
>  C++
>
>  vtkClientServer.dll!vtkClientServerInterpreter::ProcessOneMessage(const
> vtkClientServerStream & css={...}, int message=0)  Line 214 + 0x1d bytes
>      C++
>        vtkClientServer.dll!vtkClientServerInterpreter::ProcessStream(const
> vtkClientServerStream & css={...})  Line 183 + 0x14 bytes   C++
>
>  vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStreamInternal(const
> vtkClientServerStream & stream={...}, bool ignore_errors=false)  Line 636
> C++
>
>  vtkPVServerImplementation.dll!vtkPVSessionCore::ExecuteStream(unsigned int
> location=21, const vtkClientServerStream & stream={...}, bool
> ignore_errors=false)  Line 606 C++
>
>  vtkPVServerImplementation.dll!vtkPVSessionBase::ExecuteStream(unsigned int
> location=21, const vtkClientServerStream & stream={...}, bool
> ignore_errors=false)  Line 157 C++
>        vtkPVServerManager.dll!vtkSMProxy::ExecuteStream(const
> vtkClientServerStream & stream={...}, bool ignore_errors=false, unsigned
> int location=21)  Line 2092     C++
>        vtkPVServerManager.dll!vtkSMViewProxy::StillRender()  Line 137 +
> 0x18 bytes     C++
>        vtkPVServerManager.dll!`anonymous
> namespace'::vtkRenderHelper::EventuallyRender()  Line 86      C++
>        vtkPVVTKExtensions.dll!vtkPVGenericRenderWindowInteractor::Render()
>  Line 302   C++
>        vtkRendering.dll!vtkRenderWindowInteractor::Initialize()  Line 632
>      C++
>        vtkRendering.dll!vtkRenderWindowInteractor::ReInitialize()  Line 76
> + 0x13 bytes        C++
>
>  vtkRendering.dll!vtkWin32OpenGLRenderWindow::SetOffScreenRendering(int
> offscreen=0)  Line 1268  C++
>
>  vtkPVServerManager.dll!vtkSMRenderViewProxy::CaptureWindowInternal(int
> magnification=1)  Line 875       C++
>        vtkPVServerManager.dll!vtkSMViewProxy::CaptureWindow(int
> magnification=1)  Line 268 + 0x20 bytes        C++
>        vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char *
> filename=0x000000000b619f50, const char * writerName=0x000000000ac750e0,
> int magnification=1)  Line 307 + 0x11 bytes     C++
>        vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object *
> self=0x000000000b62f6d8, _object * args=0x000000000b62b438)  Line 367 +
> 0x1f bytes  C++
>
> ------------------------------
> but the other N-1 processes end up in here - waiting
>
> >       vtkParallel.dll!vtkMPICommunicator::BroadcastVoidArray(void *
> data=0x000000000012dd24, __int64 length=1, int type=6, int root=0)  Line
> 1159 + 0x31 bytes        C++
>        vtkParallel.dll!vtkCommunicator::Broadcast(int *
> data=0x000000000012dd24, __int64 length=1, int srcProcessId=0)  Line 256
>     C++
>        vtkParallel.dll!vtkMultiProcessController::Broadcast(int *
> data=0x000000000012dd24, __int64 length=1, int srcProcessId=0)  Line 402
>   C++
>
>  vtkPVServerManager.dll!vtkSMUtilities::SaveImageOnProcessZero(vtkImageData
> * image=0x000000000e2c79c0, const char * filename=0x000000000b729f50, const
> char * writerName=0x000000000ac850e0)  Line 139  C++
>        vtkPVServerManager.dll!vtkSMViewProxy::WriteImage(const char *
> filename=0x000000000b729f50, const char * writerName=0x000000000ac850e0,
> int magnification=1)  Line 311 + 0x1f bytes     C++
>        vtkPVServerManagerPythonD.dll!PyvtkSMViewProxy_WriteImage(_object *
> self=0x000000000b73f6d8, _object * args=0x000000000b73b438)  Line 367 +
> 0x1f bytes  C++
>
> There was a problem with SaveImageOnProcessZero which caused exactly
> this some time ago. I'm using a branch of mine derived from the v3.12.0
> tag, but I cherry-picked Andy's patch from kitware/master (or wherever that
> was mentioned a few days ago). Are there other fixes on master/next that I
> might be needing?
>
> Any ideas? The pipeline is a contour of some image data on N processes.
> Pretty simple.
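[Editor's note: the pipeline described can be sketched with paraview.simple as below. The source, array name, and isovalue are placeholders (Wavelet stands in for the real simulation data), since JB's actual script was attached rather than inlined here.]

```python
def build_pipeline():
    """Minimal contour-of-image-data pipeline analogous to the one in this
    thread.  Assumes execution inside pvbatch/pvpython, where the
    paraview.simple module is importable."""
    from paraview import simple
    source = simple.Wavelet()            # placeholder image-data source
    contour = simple.Contour(Input=source,
                             ContourBy=['POINTS', 'RTData'],
                             Isosurfaces=[157.0])
    return contour
```
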
>
> Any help appreciated.
>
> thanks
>
> JB
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://www.paraview.org/pipermail/paraview/attachments/20120102/caca60fa/attachment-0001.htm>

