[Paraview] vtkDistributedDataFilter and ghost cells

Andrew Parker andy.john.parker at googlemail.com
Wed Nov 7 04:38:19 EST 2012


So the cast is working now; I had forgotten to pass GetOutput() into the
SafeDownCast!  The returned vtkUnstructuredGrid is now fully populated and
contains sensible information.

However, both GetPointData()->GetGlobalIds() and
GetCellData()->GetGlobalIds() return null pointers.  Any thoughts?

Also, should I be using the cell data here, since (at the moment at least) I
want the global-to-local mapping for the cells rather than for the nodes?
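
In case it's useful, here is a minimal sketch of what I'm doing (dd is the
vtkDistributedDataFilter from my earlier mail); the downcast succeeds, but
GetGlobalIds() comes back null, so I also dump the attached array names to
see what the filter actually produced:

  // Needs vtkUnstructuredGrid.h, vtkPointData.h, vtkCellData.h, <iostream>.
  vtkUnstructuredGrid *ug =
    vtkUnstructuredGrid::SafeDownCast(dd->GetOutput());
  if (ug)
    {
    vtkPointData *pd = ug->GetPointData();
    vtkCellData *cd = ug->GetCellData();
    // GetGlobalIds() is null unless an array has been flagged as global ids.
    std::cout << "point global ids: "
              << (pd->GetGlobalIds() ? "present" : "null")
              << ", cell global ids: "
              << (cd->GetGlobalIds() ? "present" : "null") << std::endl;
    // List whatever arrays the filter did attach.
    for (int i = 0; i < pd->GetNumberOfArrays(); ++i)
      {
      const char *name = pd->GetArrayName(i);
      std::cout << "point array: " << (name ? name : "(unnamed)") << std::endl;
      }
    for (int i = 0; i < cd->GetNumberOfArrays(); ++i)
      {
      const char *name = cd->GetArrayName(i);
      std::cout << "cell array: " << (name ? name : "(unnamed)") << std::endl;
      }
    }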

On 6 November 2012 19:05, Moreland, Kenneth <kmorel at sandia.gov> wrote:

>   Perhaps it is outputting a composite data set of some type?  Try
> running GetClassName() to see what type of data object it really is.
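>
> For example, something like this (just a sketch, assuming dd is your
> distributed data filter):
>
>   // Print the concrete class of whatever the filter is producing.
>   vtkDataObject *output = dd->GetOutput();
>   std::cout << "D3 output class: " << output->GetClassName() << std::endl;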
>
>  -Ken
>
>   From: Andrew Parker <andy.john.parker at googlemail.com>
> Date: Tuesday, November 6, 2012 9:50 AM
>
> To: Kenneth Moreland <kmorel at sandia.gov>
> Cc: "vtkusers at vtk.org" <vtkusers at vtk.org>, "paraview at paraview.org" <
> paraview at paraview.org>
> Subject: [EXTERNAL] Re: [Paraview] vtkDistributedDataFilter and ghost
> cells
>
>   Thanks on both counts.  Any thoughts on why the downcast, called after
> dd->Update() on the distributed data filter, gives a null pointer?  That
> is, dd itself is working properly, but I don't seem to be able to extract
> a valid vtkUnstructuredGrid from it.  As a follow-up question, I assume
> GetPointData()->GetGlobalIds() gives the local-to-global mapping for the
> mesh nodes, and that I should use GetCellData()->GetGlobalIds() to get the
> local-to-global mapping for the cells?  Once I get a valid pointer, that
> is....
>
>  Cheers again,
> Andy
>
> On 6 November 2012 16:40, Moreland, Kenneth <kmorel at sandia.gov> wrote:
>
>>   You should be able to do a vtkUnstructuredGrid::SafeDownCast() to the
>> data to get the unstructured mesh (and access to the point data).
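>>
>>  Something along these lines (sketch only):
>>
>>    vtkUnstructuredGrid *ugrid =
>>      vtkUnstructuredGrid::SafeDownCast(dd->GetOutput());
>>    if (ugrid)
>>      {
>>      // Point data (and cell data) are reachable once the cast succeeds.
>>      vtkDataArray *globalPointIds = ugrid->GetPointData()->GetGlobalIds();
>>      }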
>>
>>  -Ken
>>
>>   From: Andrew Parker <andy.john.parker at googlemail.com>
>> Date: Tuesday, November 6, 2012 9:32 AM
>> To: Kenneth Moreland <kmorel at sandia.gov>
>> Cc: "vtkusers at vtk.org" <vtkusers at vtk.org>, "paraview at paraview.org" <
>> paraview at paraview.org>
>> Subject: [EXTERNAL] Re: [Paraview] vtkDistributedDataFilter and ghost
>> cells
>>
>>   Another question which I'd forgotten: how do I get a
>> vtkUnstructuredGrid per processor out of the vtkDistributedDataFilter?
>>
>>  For instance, dd->GetOutput()->GetPointData()->GetGlobalIds() doesn't
>> work, as the output is a vtkDataObject.
>>
>>  Stupid question, I'm sure, but the doxygen notes say this filter
>> produces an unstructured mesh, yet I can't seem to get it out?
>>
>>  Also, why exactly do I need the vtkPieceScalars and
>> vtkDataSetSurfaceFilter again? If the above can be made to work and
>> return the mapping, what are they adding in terms of information?
>>
>>  Thanks again,
>> Andy
>>
>> On 6 November 2012 16:00, Moreland, Kenneth <kmorel at sandia.gov> wrote:
>>
>>>   I believe vtkDistributedDataFilter will always produce global
>>> point ids (a mapping from local point ids to global point ids), although
>>> it might simply pass them through if they already exist.  So
>>> dd->GetOutput()->GetPointData()->GetGlobalIds() should return the array
>>> that gives this mapping.
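>>>
>>>  For instance (sketch only, where ugrid is the downcast output and
>>> localPointId is a local point index on this process; I would expect the
>>> array to come back as a vtkIdTypeArray, but check the concrete type):
>>>
>>>    vtkIdTypeArray *gids = vtkIdTypeArray::SafeDownCast(
>>>      ugrid->GetPointData()->GetGlobalIds());
>>>    if (gids)
>>>      {
>>>      // Map a local point index to its global id.
>>>      vtkIdType globalId = gids->GetValue(localPointId);
>>>      }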
>>>
>>>  Ghost cells are only created on demand, and this is usually done by
>>> pipeline convention.  If you have a filter that needs a layer of ghost
>>> cells, it should override the RequestUpdateExtent method to increment the
>>> UPDATE_NUMBER_OF_GHOST_LEVELS value as it is copied from the output
>>> information to the input information.  That method would look something
>>> like this:
>>>
>>>   int vtkDistributedDataFilter::RequestUpdateExtent(
>>>     vtkInformation *vtkNotUsed(request),
>>>     vtkInformationVector **inputVector,
>>>     vtkInformationVector *outputVector)
>>> {
>>>   // Get the info objects.
>>>   vtkInformation *inInfo = inputVector[0]->GetInformationObject(0);
>>>   vtkInformation *outInfo = outputVector->GetInformationObject(0);
>>>
>>>   int piece = outInfo->Get(
>>>     vtkStreamingDemandDrivenPipeline::UPDATE_PIECE_NUMBER());
>>>   int numPieces = outInfo->Get(
>>>     vtkStreamingDemandDrivenPipeline::UPDATE_NUMBER_OF_PIECES());
>>>   int ghostLevels = outInfo->Get(
>>>     vtkStreamingDemandDrivenPipeline::UPDATE_NUMBER_OF_GHOST_LEVELS());
>>>
>>>   // We require an extra layer of ghost cells from upstream, so ask the
>>>   // input for one more ghost level than was requested of the output.
>>>   ghostLevels = ghostLevels + 1;
>>>
>>>   inInfo->Set(vtkStreamingDemandDrivenPipeline::UPDATE_PIECE_NUMBER(),
>>>               piece);
>>>   inInfo->Set(vtkStreamingDemandDrivenPipeline::UPDATE_NUMBER_OF_PIECES(),
>>>               numPieces);
>>>   inInfo->Set(vtkStreamingDemandDrivenPipeline::UPDATE_NUMBER_OF_GHOST_LEVELS(),
>>>               ghostLevels);
>>>
>>>   return 1;
>>> }
>>>
>>>
>>>  The operation of the RequestData method should also strip off this
>>> layer of ghost cells.  It might be possible to request a layer of ghost
>>> cells by setting UPDATE_NUMBER_OF_GHOST_LEVELS at the bottom of the
>>> pipeline, but I'm not totally sure how to make that work.  It's probably
>>> easier (or at least cleaner) to do it from within a filter.
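>>>
>>>  As a minimal sketch of that stripping step (assuming the ghost levels
>>> end up in the conventional "vtkGhostLevels" unsigned char cell array, and
>>> where ghostedGrid, output, and outputGhostLevels are placeholders for
>>> your own variables):
>>>
>>>    // Keep only cells whose ghost level is at or below what the output
>>>    // actually asked for, dropping the extra layer requested upstream.
>>>    vtkSmartPointer<vtkThreshold> stripGhosts =
>>>      vtkSmartPointer<vtkThreshold>::New();
>>>    stripGhosts->SetInput(ghostedGrid);
>>>    stripGhosts->SetInputArrayToProcess(0, 0, 0,
>>>      vtkDataObject::FIELD_ASSOCIATION_CELLS, "vtkGhostLevels");
>>>    stripGhosts->ThresholdByLower(outputGhostLevels);
>>>    stripGhosts->Update();
>>>    output->ShallowCopy(stripGhosts->GetOutput());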
>>>
>>>  -Ken
>>>
>>>   From: Andrew Parker <andy.john.parker at googlemail.com>
>>> Date: Tuesday, November 6, 2012 8:25 AM
>>> To: "vtkusers at vtk.org" <vtkusers at vtk.org>, "paraview at paraview.org" <
>>> paraview at paraview.org>
>>> Subject: [EXTERNAL] [Paraview] vtkDistributedDataFilter and ghost cells
>>>
>>>   Hi,
>>>
>>>  Hope you can help.  I have some code running in parallel in which, by
>>> other means, I have constructed nprocs worth of vtkRectilinearGrids, one
>>> per process.  Each of these is a valid piece of the whole serial mesh;
>>> I've checked this and I am happy that it is partitioned properly and
>>> nothing is missing.  I need the following information to process my data
>>> in parallel:
>>>
>>>  1) I would like the local-to-global cell mapping between the local
>>> rgrid and the corresponding single global mesh.
>>> 2) I would like to know which cells are on processor boundaries, for
>>> parallel exchange purposes.
>>> 3) I would like all the double arrays on each processor to be "expanded"
>>> by one level of ghost cells, so that I can properly do the computations I
>>> want and exchange only those additional cells, given the local-to-global
>>> mapping.
>>>
>>>  Starting from the examples, I have tried the following code, which I
>>> call on every process, each of which has its own local rgrid as I said:
>>>
>>>   vtkSmartPointer<vtkDistributedDataFilter> dd =
>>>     vtkSmartPointer<vtkDistributedDataFilter>::New();
>>>   dd->SetInput(rgrid);
>>>   dd->SetController(getVtkController());
>>>   dd->SetBoundaryModeToSplitBoundaryCells();
>>>   //dd->SetBoundaryModeToAssignToOneRegion();
>>>   //dd->SetBoundaryModeToAssignToAllIntersectingRegions();
>>>   dd->UseMinimalMemoryOff();
>>>   dd->Update();
>>>
>>>   vtkPieceScalars *ps = vtkPieceScalars::New();
>>>   ps->SetInputConnection(dd->GetOutputPort());
>>>   ps->SetScalarModeToCellData();
>>>
>>>   vtkDataSetSurfaceFilter *dss = vtkDataSetSurfaceFilter::New();
>>>   dss->SetInputConnection(ps->GetOutputPort());
>>>
>>>  The dd object works fine: writing its contents out on each processor
>>> gives nprocs worth of meshes, each of which looks slightly different from
>>> the way I partitioned them, but they sum to the same serial mesh, so I am
>>> happy that part is working correctly.  But I can't for the life of me
>>> figure out how to obtain the local-to-global cell mappings, allocate ghost
>>> cells, or work out how to exchange data given the above partition info and
>>> comms....
>>>
>>>  Note that I have not provided any additional global cell information
>>> to "dd", so as per the doxygen notes I assume it went away and computed it;
>>> I can't figure out how to extract it, however.  I also have no idea how to
>>> modify each local processor's rgrid to include the ghost cells for that
>>> processor.  Finally, given that information, I could exchange between
>>> processors to write into each local processor's ghost cells the
>>> corresponding "real" cell data from the neighbouring meshes, and continue
>>> with the code.
>>>
>>>  Any help really appreciated!
>>>
>>> Cheers,
>>> Andy
>>>
>>>
>>
>>
>


-- 

__________________________________

   Dr Andrew Parker

   Email:  andrew.parker at cantab.net