



HDF5 VDS: Unable to create a VDS linking large number of datasets

















I have 14,430 HDF5 files. Each file contains a single dataset of type float with size 49152 x 21 x 1. I am trying to create a virtual dataset (VDS) of size 49152 x 21 x 14430 that maps the dataset from every one of the 14,430 files. The code runs without errors, and I can see that it creates a VDS of size 49152 x 21 x 14430 and writes it to an HDF5 file. However, if I open the VDS or run h5dump on it, the data is correct only up to 49152 x 21 x 1024; the remaining values are all zeros. It looks as though only the first 1024 files are mapped and the rest are not. Yet if I run "h5dump -p" on the HDF5 file containing the VDS, it shows all 14,430 files as virtually mapped. Is there a limit on memory, or on the number of files that can be mapped, that I am not aware of? Am I missing something? I am attaching the code below for reference. I am using HDF5 1.10.3. Thanks in advance.




#define REAL float
#define H5T_NATIVE_RL H5T_NATIVE_FLOAT
#define MPI_RL MPI_FLOAT

#define RANK 2
#define PAR_RANK 3
#define LENLINE 4096
#define MAXLEN 256
#define DOMAINDATA 6
#define GRIDDATA 6
#define MAXFILE 20000

/* my_id, status, i, frameIndex, iStartFrame, iEndFrame, file_id, parGrp_id,
 * hdf5FileName, parDSetName, and probName are declared elsewhere in the program. */
hid_t vdcpl_id;
hid_t vds_space_id;
hid_t src_dataspace_id;
hid_t vdset_id;
hsize_t vdims[PAR_RANK];
hsize_t src_dims[PAR_RANK];
hsize_t offset[PAR_RANK];
hsize_t count[PAR_RANK];
hsize_t stride[PAR_RANK];
hsize_t block[PAR_RANK];
hsize_t voffset[PAR_RANK];
hsize_t vcount[PAR_RANK];
hsize_t vstride[PAR_RANK];
hsize_t vblock[PAR_RANK];

vdcpl_id = H5Pcreate(H5P_DATASET_CREATE);

/* The VDS spans all frames along the third dimension. */
vdims[0] = 49152;
vdims[1] = 21;
vdims[2] = iEndFrame - iStartFrame + 1;

vds_space_id = H5Screate_simple(PAR_RANK, vdims, NULL);

/* Source selection: the entire 49152 x 21 x 1 dataset in each file. */
offset[0] = 0;     offset[1] = 0;   offset[2] = 0;
count[0]  = 1;     count[1]  = 1;   count[2]  = 1;
stride[0] = 1;     stride[1] = 1;   stride[2] = 1;
block[0]  = 49152; block[1]  = 21;  block[2]  = 1;

src_dims[0] = 49152;
src_dims[1] = 21;
src_dims[2] = 1;

for (i = iStartFrame; i <= iEndFrame; i++)
{
    frameIndex = i - iStartFrame;

    /* VDS selection: one 49152 x 21 x 1 slab at depth frameIndex. */
    voffset[0] = 0;     voffset[1] = 0;  voffset[2] = frameIndex;
    vstride[0] = 1;     vstride[1] = 1;  vstride[2] = 1;
    vblock[0]  = 49152; vblock[1]  = 21; vblock[2]  = 1;
    vcount[0]  = 1;     vcount[1]  = 1;  vcount[2]  = 1;

    src_dataspace_id = H5Screate_simple(PAR_RANK, src_dims, NULL);

    /* H5Sselect_hyperslab takes (start, stride, count, block) in that order. */
    H5Sselect_hyperslab(src_dataspace_id, H5S_SELECT_SET, offset, stride, count, block);
    H5Sselect_hyperslab(vds_space_id, H5S_SELECT_SET, voffset, vstride, vcount, vblock);

    sprintf(hdf5FileName, "ParData.%s.%d.h5", probName, frameIndex);
    sprintf(parDSetName, "/Particles/particle_%d", frameIndex);

    status = H5Pset_virtual(vdcpl_id, vds_space_id, hdf5FileName, parDSetName, src_dataspace_id);
    if (status < 0)
    {
        printf("Proc:%04d ** Failed creating virtual mapping %s **\n\n",
               my_id, hdf5FileName);
        return 1;
    }

    printf("Proc:%04d ** Creating virtual mapping %s **\n\n",
           my_id, hdf5FileName);

    H5Sclose(src_dataspace_id);
}

/* Create the VDS in the 0th frame's HDF5 file inside the /Particles group.
 * (H5Gopen1 is the deprecated 1.6 API; H5Gopen2 is the current form.) */
sprintf(hdf5FileName, "ParData.%s.0.h5", probName);

file_id = H5Fopen(hdf5FileName, H5F_ACC_RDWR, H5P_DEFAULT);

parGrp_id = H5Gopen1(file_id, "/Particles");

vdset_id = H5Dcreate(parGrp_id, "VDS", H5T_NATIVE_RL, vds_space_id, H5P_DEFAULT,
                     vdcpl_id, H5P_DEFAULT);

H5Sclose(vds_space_id);
H5Gclose(parGrp_id);
H5Dclose(vdset_id);
H5Pclose(vdcpl_id);
H5Fclose(file_id);
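One thing worth ruling out (an assumption on my part, not something confirmed by the HDF5 documentation quoted here): reading a VDS may require HDF5 to open each source file, so the per-process open-file limit could cap how many of the 14,430 mappings are actually resolved. On many Linux systems the soft limit defaults to 1024, which matches the point where the data turns to zeros:

```shell
# Print the soft limit on open file descriptors for this shell;
# a common Linux default is 1024.
ulimit -n
```

If this prints 1024, raising the limit (e.g. `ulimit -n 16384`, subject to the hard limit) before running h5dump or the reading program would test this hypothesis.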









      hdf5






      asked Mar 27 at 2:40









Sai Sandeep Dammati

      12 bronze badges



