Dimensions swapped with ND2 files, loci_tools 5.0.4
Posted: Tue Oct 07, 2014 2:55 pm
Hi all,
I am working with loci_tools.jar 5.0.4 to open my ND2 files. These are typically TCZ stacks.
Mostly this works perfectly fine, but sometimes the dimensions get scrambled. I already wrote a workaround for this, but it would be great if this could be solved in Bio-Formats. The problem is as follows:
The T dimension is saved as multipoints (separate series) and Z is recognized as T. For example, a 512x512x2x100x40 XYCTZ stack gets recognized as a collection of 512x512x2x40x1 series; the image count of the collection is 100 in this example. I use the loci.formats.meta.MetadataRetrieve functions to get these sizes.
Moreover, the frames are not distributed the way you would expect. Normally you would expect the first 3D frame to be in the first 80 planes (2x40), with channel 0 in the even planes and channel 1 in the odd planes. If you read out loci.formats.IFormatReader.getIndex(z, c, t), treating Z as T and T as multipoint, that is indeed the behavior it reports, and the MetadataRetrieve functions agree with it.
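To make concrete what I am reading out, here is a minimal sketch of the kind of code I use (the class and file names are made up; the commented values are what I see for the 512x512x2x100x40 example):
Code:
import loci.common.services.ServiceFactory;
import loci.formats.ImageReader;
import loci.formats.meta.IMetadata;
import loci.formats.services.OMEXMLService;

public class PrintScrambledSizes {
  public static void main(String[] args) throws Exception {
    // Attach an OME-XML metadata store so the MetadataRetrieve getters are populated.
    ServiceFactory factory = new ServiceFactory();
    OMEXMLService service = factory.getInstance(OMEXMLService.class);
    IMetadata meta = service.createOMEXMLMetadata();

    ImageReader reader = new ImageReader();
    reader.setMetadataStore(meta);
    reader.setId("scrambled.nd2"); // made-up file name

    // Values reported for the 512x512x2x100x40 XYCTZ example:
    System.out.println("image count = " + meta.getImageCount());              // 100 (really T)
    System.out.println("sizeC = " + meta.getPixelsSizeC(0).getValue());       // 2
    System.out.println("sizeT = " + meta.getPixelsSizeT(0).getValue());       // 40 (really Z)
    System.out.println("sizeZ = " + meta.getPixelsSizeZ(0).getValue());       // 1
    reader.close();
  }
}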
But actually, for 3D Frame 0, channel 0 sits in the first plane of each of the first 40 multipoints, and channel 1 sits in the second plane of each of those multipoints. I denote this (0-39,0) and (0-39,1), where the first number is the image (multipoint) index and the second the frame index within it:
- 3D Frame 0 has (0-39,0) and (0-39,1)
- 3D Frame 1 has (40-79,0) and (40-79,1)
- 3D Frame 2 has (80-99,0) + (0-19,2) and (80-99,1) + (0-19,3)
- 3D Frame 3 has (20-59,2) and (20-59,3)
- etc.
So if you think about this, you can do:
Code:
multipoint  = (t*sizeT + z) % imageCount
frame index = ((t*sizeT + z) // imageCount) * sizeC + c
Here % is the remainder operator, // is floor division, and sizeC is the number of channels (2 in this example). This is how I work at the moment, but because you keep switching between multipoints, the loading times get longer.
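For what it's worth, this is roughly what the workaround looks like in Java. It is only a sketch: the class and file names are made up and the (z, c, t) indices are chosen arbitrarily; the reported sizeT is used as the real Z depth, as described above.
Code:
import loci.formats.ImageReader;

public class ScrambledNd2Workaround {
  public static void main(String[] args) throws Exception {
    ImageReader reader = new ImageReader();
    reader.setId("scrambled.nd2");              // made-up file name

    int imageCount = reader.getSeriesCount();   // 100 "multipoints", i.e. the real T
    reader.setSeries(0);
    int sizeT = reader.getSizeT();              // reported as 40, which is really Z
    int sizeC = reader.getSizeC();              // 2

    // Fetch plane (z, c, t) of the real XYCTZ stack via the remapping above.
    int z = 5, c = 1, t = 42;
    int multipoint = (t * sizeT + z) % imageCount;
    int frame = ((t * sizeT + z) / imageCount) * sizeC + c;

    reader.setSeries(multipoint);               // switching series is what slows things down
    byte[] plane = reader.openBytes(frame);
    System.out.println("(z,c,t) = (" + z + "," + c + "," + t + ") -> series "
        + multipoint + ", plane " + frame + ", " + plane.length + " bytes");
    reader.close();
  }
}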
My smallest file with this error is 1.7 GB; I can upload it if you want. If I crop it, the error goes away... I will also e-mail Nikon about this issue, but for now I wonder whether other people are struggling with this too?