[Paraview] [Xdmf] ParaView/Xdmf bad memory usage

Renato N. Elias rnelias at nacad.ufrj.br
Wed Nov 5 09:47:55 EST 2008

Hi Jerry,

We've tracked the memory on one of the nodes (each node has 8 cores 
and 2GB/core = 16GB/node):

- Temporal collection (all time steps) of spatial collections. Just the 
geometry:
Mem:  16,342,696k total,  1,877,788k used, 14,464,908k free

- Spatial collection for the *latest time step*. Just the geometry:
Mem:  16,342,696k total,  2,256,284k used, 14,086,412k free,

- Spatial collection for the *latest time step*. Geometry plus results 
(velocity vector, pressure, temperature):
Mem:  16,342,696k total,  2,554,156k used, 13,788,540k free,

- Temporal collection (all time steps) of spatial collections. Geometry 
plus results (velocity vector, pressure, temperature):
Mem:  16,342,696k total,  3,550,232k used, 12,792,464k free. Not so bad, 
BUT, if I hit the button to skip to the next time step, the memory 
starts to disappear and the system hangs. Catastrophically so, since our 
Altix-ICE nodes are diskless: when they run out of memory they simply 
shut down, freezing the ParaView session as well as (I'm afraid to 
say....) the whole machine. Yes, the entire set of compute nodes of our 
Altix-ICE system goes down because of ParaView :o(. This case made us 
call SGI to understand how a system that should be robust could crash so 
badly just because of one application.
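For clarity, the file layout under discussion is a temporal collection 
whose children are spatial collections, roughly like the sketch below 
(a hedged Xdmf 2.x example; grid names, the time value, and the number 
of partitions/steps are placeholders, not taken from the actual dataset):

```xml
<Xdmf Version="2.0">
  <Domain>
    <!-- Outer collection: one child Grid per time step -->
    <Grid Name="TimeSeries" GridType="Collection" CollectionType="Temporal">
      <!-- Inner collection: one child Grid per MPI partition -->
      <Grid Name="Step0" GridType="Collection" CollectionType="Spatial">
        <Time Value="0.0"/>
        <Grid Name="Partition0" GridType="Uniform">
          <!-- Topology, Geometry and Attribute elements per partition -->
        </Grid>
        <!-- ... remaining partitions ... -->
      </Grid>
      <!-- ... remaining time steps ... -->
    </Grid>
  </Domain>
</Xdmf>
```

The question in this thread is whether the reader allocates the inner 
grids for every time step at once or only for the step being displayed.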

NOTE: I loaded and unloaded files in the same ParaView session.

This was the latest memory screen that we were able to get before 
freezing the system:
Mem:  16,342,696k total, 16,261,440k used,    81,256k free,        0k 
Swap:    49,144k total,    49,144k used,        0k free,    22,604k cached
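For anyone wanting to reproduce this tracking, a minimal sketch follows 
(assumptions: a Linux-style /proc/meminfo and "used" computed as total 
minus free, matching the Mem: lines above; on a live node you would read 
/proc/meminfo in a loop instead of the sample string shown here):

```python
# Minimal per-node memory tracker sketch: parse /proc/meminfo-style
# "Key:  value kB" lines and report used = MemTotal - MemFree.

def parse_meminfo(text):
    """Parse '/proc/meminfo'-style lines into a dict of kilobyte counts."""
    fields = {}
    for line in text.splitlines():
        key, sep, rest = line.partition(":")
        parts = rest.split()
        if sep and parts and parts[0].isdigit():
            fields[key.strip()] = int(parts[0])
    return fields

def used_kb(fields):
    """Used memory as reported in the readings above: total minus free."""
    return fields["MemTotal"] - fields["MemFree"]

if __name__ == "__main__":
    # Sample figures from the last reading before the freeze:
    sample = "MemTotal: 16342696 kB\nMemFree: 81256 kB\n"
    m = parse_meminfo(sample)
    print("Mem: %dk total, %dk used, %dk free"
          % (m["MemTotal"], used_kb(m), m["MemFree"]))
```

Logging this once per second while stepping through time in ParaView 
would show whether memory is released between steps or only grows.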

Would you like the dataset for testing? Do you have the patience to 
download 12GB? Maybe 2 time steps would be enough...



Jerry Clarke wrote:
> Renato,
> You are correct, it should only load one iteration, not all at once.
> Could you please try to load the geometry only (no scalars) and see
> what happens.
> Thanks
> Jerry
> Renato N. Elias wrote:
>> There's something wrong with the memory allocation employed by ParaView 
>> (3.4.0) OR the Xdmf library. When I load a temporal collection of spatial 
>> collections, it seems that ParaView (or Xdmf) is trying to allocate 
>> memory for all time steps at once, as if it were a multiblock of 
>> multiblocks. My guess is that each time step should be loaded only when 
>> required, AND after unloading the previous time step, in order to 
>> release some memory. Is there any workaround for this problem, or was 
>> it really designed to work this way?
>> Regards
>> Renato.
>> p.s.: I can provide a large dataset (40M tets/128 cpus/15 time steps) 
>> for debugging purposes.
>> ------------------------------------------------------------------------
>> Subject:
>> Rendering large datasets using offscreen rendering...
>> From:
>> "Renato N. Elias" <rnelias at nacad.ufrj.br>
>> Date:
>> Tue, 04 Nov 2008 14:21:37 -0200
>> To:
>> "paraview at paraview.org" <paraview at paraview.org>
>> Folks, I need some tips for rendering a large dataset using offscreen 
>> rendering on an SGI Altix-ICE.
>> Basically, I have a 40 million tetrahedra mesh partitioned across 128 
>> CPUs. The results are stored in Xdmf format (15 time steps) and I can 
>> successfully load the datasets in ParaView 3.4.0 and do basic things 
>> like plotting isosurfaces or colorizing the model with a scalar field. 
>> The problem begins when I try to change the model opacity: PV takes 
>> forever before crashing MPI.
>> I guess the problem with opacity is due to the compositing order, but 
>> I can't figure out how to control it. I already tried disabling 
>> ordered compositing but it hasn't helped much.
>> Any ideas would be appreciated. Thanks,
>> Renato.
>> ------------------------------------------------------------------------
>> _______________________________________________
>> Xdmf mailing list
>> Xdmf at lists.kitware.com
>> http://www.kitware.com/cgi-bin/mailman/listinfo/xdmf
