| View Issue Details |
| ID | Project | Category | View Status | Date Submitted | Last Update |
| 0013953 | ParaView | (No Category) | public | 2013-03-21 10:09 | 2016-08-12 09:59 |
| Reporter | mexas@bristol.ac.uk |
| Assigned To | Kitware Robot |
| Priority | normal | Severity | minor | Reproducibility | have not tried |
| Status | closed | Resolution | moved |
| Platform | | OS | | OS Version | |
| Product Version | 3.12 |
| Target Version | | Fixed in Version | |
| Summary | 0013953: MPI error: MPI_Type_create_subarray(124): Argument array_of_starts has value 120 but must be within [0, 116] |
| Description |

The following Python script works with pvpython, but fails with pvbatch:

    try: paraview.simple
    except: from paraview.simple import *
    paraview.simple._DisableFirstRenderCameraReset()

    z_raw = ImageReader( FilePrefix='z.raw' )
    z_raw.DataExtent = [1, 40, 1, 40, 1, 40]
    z_raw.DataByteOrder = 'LittleEndian'
    z_raw.DataScalarType = 'int'

    RenderView1 = GetRenderView()
    DataRepresentation1 = Show()
    DataRepresentation1.ScalarOpacityUnitDistance = 1.7320508075688776
    DataRepresentation1.Representation = 'Outline'
    DataRepresentation1.EdgeColor = [0.0, 0.0, 0.5000076295109483]

    RenderView1.CenterOfRotation = [19.5, 19.5, 19.5]

    a1_ImageFile_PVLookupTable = GetLookupTableForArray( "ImageFile", 1,
        RGBPoints=[1.0, 0.0, 0.0, 0.0,
                   26.200000000000003, 0.9019607843137255, 0.0, 0.0,
                   51.400000000000006, 0.9019607843137255, 0.9019607843137255, 0.0,
                   64.0, 1.0, 1.0, 1.0],
        VectorMode='Magnitude',
        NanColor=[0.0, 0.4980392156862745, 1.0],
        NumberOfTableValues=64, ColorSpace='RGB', ScalarRangeInitialized=1.0 )

    a1_ImageFile_PiecewiseFunction = CreatePiecewiseFunction()

    ScalarBarWidgetRepresentation1 = CreateScalarBar( Title='Image',
        LabelFontSize=12, Enabled=1,
        LookupTable=a1_ImageFile_PVLookupTable, TitleFontSize=12 )
    GetRenderView().Representations.append(ScalarBarWidgetRepresentation1)

    RenderView1.CameraViewUp = [-0.8528543278199172, 0.017957183620535777, -0.5218400474046194]
    RenderView1.CameraPosition = [-43.07616287436094, 67.5962248918674, 123.42461126225268]
    RenderView1.CameraClippingRange = [65.37811338222085, 212.781995926041]
    RenderView1.CameraFocalPoint = [19.499999999999975, 19.49999999999995, 19.499999999999993]
    RenderView1.CameraParallelScale = 33.77499074759311
    RenderView1.Background = [0.3, 0.3, 0.4]
    RenderView1.ViewSize = [800, 800]

    DataRepresentation1.Representation = 'Surface'
    DataRepresentation1.ScalarOpacityFunction = a1_ImageFile_PiecewiseFunction
    DataRepresentation1.ColorArrayName = 'ImageFile'
    DataRepresentation1.LookupTable = a1_ImageFile_PVLookupTable

    WriteImage('z.png')
    Render()

The machine is HECToR (www.hector.ac.uk). The MPI used to run pvbatch is xt-mpich2 (Cray MPICH2 Message Passing Interface).

The data file is in raw binary format: a 40x40x40 array of 4-byte integers, written with the Fortran 2003 "stream" access qualifier. The file size is 40*40*40*4 = 256000 bytes. The compressed file (344 bytes) is at: http://seis.bris.ac.uk/~mexas/z.raw.xz

The PBS script:

    #!/bin/bash --login
    #PBS -N pvbatch
    #PBS -l mppwidth=32
    #PBS -l mppnppn=32
    #PBS -l walltime=00:10:00
    #PBS -j oe

    export PBS_O_WORKDIR=$(readlink -f $PBS_O_WORKDIR)
    cd $PBS_O_WORKDIR

    export XDG_CONFIG_HOME=${HOME}/work/ParaViewIniConfig

    module load paraview-servers/3.12.0
    if [ -z ${PARAVIEW_SERVER_DIR} ] ; then
        echo "Error: PARAVIEW_SERVER_DIR not set. Exiting"
        exit 4
    fi

    echo "DEBUG: PBS_O_WORKDIR is" $PBS_O_WORKDIR
    echo "DEBUG: PARAVIEW_SERVER_DIR is" ${PARAVIEW_SERVER_DIR}
    echo "DEBUG: XDG_CONFIG_HOME is" ${XDG_CONFIG_HOME}

    export MPPWIDTH=`qstat -f $PBS_JOBID | awk '/mppwidth/ {print $3}'`
    export MPPNPPN=`qstat -f $PBS_JOBID | awk '/mppnppn/ {print $3}'`

    aprun -n ${MPPWIDTH} -N ${MPPNPPN} ${PARAVIEW_SERVER_DIR}/bin/pvbatch \
        --use-offscreen-rendering \
        pv2.py

The error:

    DEBUG: PBS_O_WORKDIR is /esfs2/e277/e277/mexas/cgca/branches/coarray/tests
    DEBUG: PARAVIEW_SERVER_DIR is /work/y07/y07/nag/ParaViewServers-3.12.0-Linux-x86_64
    DEBUG: XDG_CONFIG_HOME is /work/e277/e277/mexas/ParaViewIniConfig
    Generic Warning: In /work/z03/z03/markr/xe6il/pv_3.12.0/ParaView-3.12.0/VTK/Parallel/vtkMPICommunicator.cxx, line 72
    MPI had an error
    ------------------------------------------------
    Invalid argument, error stack:
    MPI_Type_create_subarray(340): MPI_Type_create_subarray(ndims=3, array_of_sizes=0x7fffffff9f20, array_of_subsizes=0x7fffffff9f30, array_of_starts=0x7fffffff9f40, order=57, MPI_BYTE, newtype=0x7fffffff9f54) failed
    MPI_Type_create_subarray(124): Argument array_of_starts has value 120 but must be within [0,116]
    ------------------------------------------------

An identical warning follows from a second rank; a third rank reports a different offending value:

    Generic Warning: In /work/z03/z03/markr/xe6il/pv_3.12.0/ParaView-3.12.0/VTK/Parallel/vtkMPICommunicator.cxx, line 72
    MPI had an error
    ------------------------------------------------
    Invalid argument, error stack:
    MPI_Type_create_subarray(340): MPI_Type_create_subarray(ndims=3, array_of_sizes=0x7fffffff9f20, array_of_subsizes=0x7fffffff9f30, array_of_starts=0x7fffffff9f40, order=57, MPI_BYTE, newtype=0x7fffffff9f54) failed
    MPI_Type_create_subarray(124): Argument array_of_starts has value 30 but must be within [0,29]
    ------------------------------------------------

and so on, from each MPI process.

I was told by the HECToR support staff that the problem might be in this snippet:

    void vtkMPIImageReader::SetupFileView(vtkMPIOpaqueFileHandle &file,
                                          const int extent[6])
    {
      int arrayOfSizes[3];
      int arrayOfSubSizes[3];
      int arrayOfStarts[3];

      for (int i = 0; i < this->GetFileDimensionality(); i++)
        {
        arrayOfSizes[i] = this->DataExtent[i*2+1] - this->DataExtent[i*2] + 1;
        arrayOfSubSizes[i] = extent[i*2+1] - extent[i*2] + 1;
        arrayOfStarts[i] = extent[i*2];
        }

      // Adjust for base size of data type and tuple size.
      int baseSize = this->GetDataScalarTypeSize() * this->NumberOfScalarComponents;
      arrayOfSizes[0] *= baseSize;
      arrayOfSubSizes[0] *= baseSize;
      arrayOfStarts[0] *= baseSize;

      // Create a view in MPIIO.
      MPI_Datatype view;
      MPICall(MPI_Type_create_subarray(this->GetFileDimensionality(),
                                       arrayOfSizes, arrayOfSubSizes,
                                       arrayOfStarts, MPI_ORDER_FORTRAN,
                                       MPI_BYTE, &view));
      MPICall(MPI_Type_commit(&view));
      MPICall(MPI_File_set_view(file.Handle, this->GetHeaderSize(file),
                                MPI_BYTE, view,
                                const_cast<char *>("native"),
                                MPI_INFO_NULL));
      MPICall(MPI_Type_free(&view));
    }
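The reported bounds are consistent with an off-by-one caused by the 1-based DataExtent: SetupFileView takes arrayOfStarts[i] directly from extent[i*2] without subtracting DataExtent[i*2], so an extent numbered from 1 overshoots the valid range [0, size - subsize] by exactly one sample (120 versus the limit 116 is one 4-byte voxel in the byte-scaled x dimension; 30 versus 29 in an unscaled dimension). If that reading is correct, declaring the same volume with a zero-based extent should avoid the failure. A minimal sketch of this workaround (an untested assumption, not a confirmed fix):

    # Hypothetical workaround sketch (untested): declare the 40x40x40 volume
    # with a zero-based extent, so the per-piece start indices handed to
    # MPI_Type_create_subarray fall inside [0, size - subsize].  Only the
    # index origin changes; the file layout on disk is the same.
    from paraview.simple import *

    z_raw = ImageReader( FilePrefix='z.raw' )
    z_raw.DataExtent = [0, 39, 0, 39, 0, 39]   # zero-based, same 40^3 samples
    z_raw.DataByteOrder = 'LittleEndian'
    z_raw.DataScalarType = 'int'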
| Tags | No tags attached. |
| Project | TBD |
| Topic Name | |
| Type | crash |
| Attached Files | |
| Relationships |
| Notes |
(0030455) mexas@bristol.ac.uk (reporter) 2013-03-22 08:00 |
I get a similar error with ParaView 3.98 on another Linux box with an even simpler script:

    from paraview.simple import *
    #Connect()

    # parameters
    ext1 = 80
    ext2 = ext1
    ext3 = 640
    minval = 1
    maxval = 40

    # files
    infile = "z9end.raw"
    outfile = "z.png"

    reader = ImageReader( FilePrefix=infile )
    reader.DataByteOrder = 1
    reader.DataExtent = [1, ext1, 1, ext2, 1, ext3]
    reader.DataScalarType = 6

    view = GetActiveView()
    if not view:
        view = CreateRenderView()
    view.ViewSize = [800, 800]
    view.Background = [0.3249412, 0.34902, 0.427451]

    Show()
    dp = GetDisplayProperties(reader)
    dp.LookupTable = MakeBlueToRedLT(minval, maxval)
    dp.ColorAttributeType = 'POINT_DATA'
    dp.ColorArrayName = 'ImageFile'
    dp.Representation = "Surface"

    bar = CreateScalarBar(LookupTable=dp.LookupTable, Title="grain")
    #bar.Position=[0.80,0.15]
    GetRenderView().Representations.append(bar)

    camera = GetActiveCamera()
    camera.SetViewUp(-1, 0, 0)
    camera.Azimuth(30)
    camera.Elevation(30)
    #camera.SetPosition(0,0,100)
    #camera.Roll(-90)

    Render()
    WriteImage( outfile )

The error:

    bigblue2> pvbatch --use-offscreen-rendering pv1.py
    Generic Warning: In /home/utkarsh/Dashboards/MyTests/NightlyMaster/ParaViewSuperbuild-Release/paraview/src/paraview/VTK/Parallel/MPI/vtkMPICommunicator.cxx, line 72
    MPI had an error
    ------------------------------------------------
    Invalid argument, error stack:
    MPI_Type_create_subarray(334): MPI_Type_create_subarray(ndims=3, array_of_sizes=0x7fff1acb1ad0, array_of_subsizes=0x7fff1acb1ac0, array_of_starts=0x7fff1acb1ab0, order=57, MPI_BYTE, newtype=0x7fff1acb1ae8) failed
    MPI_Type_create_subarray(124): Argument array_of_starts has value 4 but must be within [0,0]
    ------------------------------------------------
    application called MPI_Abort(MPI_COMM_WORLD, 269077004) - process 0
    bigblue2>

Again, it works fine with pvpython but fails with pvbatch.
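The single-rank failure here points the same way: with DataExtent = [1, 80, 1, 80, 1, 640], one process reads the whole x extent, so the subarray size equals the full size and the only valid start is 0, yet the reader passes the 1-based start scaled by the 4-byte element size. A small arithmetic check, assuming the SetupFileView logic quoted in the description:

    # Reconstruct the reported bound from the SetupFileView logic quoted in
    # the description (assumption: arrayOfStarts[0] = extent[0] * baseSize,
    # with the 1-based extent[0] not shifted to zero).
    base = 4                     # DataScalarType 6 is a 4-byte int, 1 component
    size = (80 - 1 + 1) * base   # arrayOfSizes[0] = 320 bytes
    subsize = size               # a single rank reads the full x extent
    start = 1 * base             # extent[0] == 1, scaled by baseSize
    print("start %d must be within [0, %d]" % (start, size - subsize))
    # prints: start 4 must be within [0, 0] -- matching the reported error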
(0038368) Kitware Robot (administrator) 2016-08-12 09:59 |
Resolving issue as `moved`. This issue tracker is no longer used. Further discussion of this issue may take place in the current ParaView Issues page linked in the banner at the top of this page. |
| Issue History |
| Date Modified | Username | Field | Change |
| 2013-03-21 10:09 | mexas@bristol.ac.uk | New Issue | |
| 2013-03-22 08:00 | mexas@bristol.ac.uk | Note Added: 0030455 | |
| 2016-08-12 09:59 | Kitware Robot | Note Added: 0038368 | |
| 2016-08-12 09:59 | Kitware Robot | Status | backlog => closed |
| 2016-08-12 09:59 | Kitware Robot | Resolution | open => moved |
| 2016-08-12 09:59 | Kitware Robot | Assigned To | => Kitware Robot |