Submitted by dominic on Thu, 11/05/2009 - 07:39
Dear DPARSF developers,
I am trying to run a test set of participants through DPARSF and receive the following error:
Removing the linear trend:
Read 3D EPI functional images: "/data/kodos/work/struct/dominic/dparsf_v1.0beta/ads20/FunImgNormalizedSmoothed/1005037".
??? Maximum variable size allowed by the program is exceeded.
I read a previous post that mentioned this may be a MATLAB version issue. I have since upgraded our MATLAB to 7.9.0 (R2009b), but I still get the error message. Any help would be much appreciated.
thanks,
dominic
Submitted by YAN Chao-Gan on Thu, 11/05/2009 - 09:41
Re
Hi!
What's the resolution of your data? How many voxels in each image? (e.g. 61X73X61).
How many time points do you have?
This error may be caused by a huge dataset.
More information is in "Be aware of the number of elements limit" at http://www.mathworks.com/support/tech-notes/1100/1107.html#_Toc170182654
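If you are not sure, you can check one volume with spm_vol (the file name below is just a placeholder):

    V = spm_vol('FunImgNormalizedSmoothed/1005037/your_volume.img'); % placeholder file name
    V.dim                          % matrix size, e.g. [61 73 61]
    sqrt(sum(V.mat(1:3,1:3).^2))   % voxel size in mm, e.g. [3 3 3]
    numel(dir('FunImgNormalizedSmoothed/1005037/*.img')) % number of time points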
Best wishes!
Submitted by dominic on Thu, 11/05/2009 - 10:41
Hi, Thanks for your
Hi,
Thanks for your response. The dimensions are 91X109X91 (I think this is the default from DPARSF) with 510 images. We're running a 32-bit Debian Linux system with MATLAB 2009. Are these specifications likely to cause the problem?
thanks,
dom
Submitted by YAN Chao-Gan on Thu, 11/05/2009 - 12:30
Re
Hi!
The default setting for DPARSF is 61X73X61 (voxel size: 3X3X3).
For your data, the dataset is 91*109*91*510*8 bytes, i.e. 91*109*91*510*8/1024/1024/1024 = 3.4298 GB.
According to http://www.mathworks.com/support/tech-notes/1100/1107.html#_Toc170182654, the processing limit for 32-bit Linux is about 3 GB.
Thus your dataset exceeds the maximum.
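You can verify this in MATLAB:

    nBytes = 91 * 109 * 91 * 510 * 8; % 8 bytes per double-precision value
    nBytes / 1024^3                   % = 3.4298, i.e. about 3.43 GB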
Two suggestions:
1) Install 64-bit Linux, or install a hugemem kernel (e.g., kernel-hugemem for CentOS 5).
2) Wait for the upcoming release of REST. It will process data in single precision if your dataset is too big (thus needing only about 1.7 GB of RAM); see the sketch below.
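As a rough sketch of the single-precision idea (this is not the actual REST code, and fileList is a hypothetical list of your volume file names):

    nTime = 510;
    allVols = zeros(91, 109, 91, nTime, 'single'); % ~1.71 GB instead of ~3.43 GB
    for t = 1:nTime
        % read each volume with SPM and cast it to single right away
        allVols(:,:,:,t) = single(spm_read_vols(spm_vol(fileList{t})));
    end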
Best wishes!
Submitted by dominic on Fri, 11/06/2009 - 08:52
Index exceeds matrix dimensions
Hi again,
To get around this problem I used a test image that had been preprocessed in SPM5 to 3 mm. The detrending and filtering now work, but when I attempt to run the functional connectivity with regression of nuisance covariates I get the following error:
Index exceeds matrix dimensions.
Thanks for your continued help.
dominic
Submitted by YAN Chao-Gan on Fri, 11/06/2009 - 11:58
Re
I think the dimensions of your data are not 61*73*61, even though the voxel size is 3*3*3.
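That would also explain the "Index exceeds matrix dimensions" message: a 61*73*61 mask produces voxel indices that do not exist in data of a different matrix size. A toy illustration (the sizes below are made up):

    mask = true(61, 73, 61);  % mask at the template's matrix size
    idx  = find(mask);        % linear indices valid for a 61x73x61 volume
    data = rand(53, 63, 46);  % data at a different matrix size
    data(idx(end))            % error: Index exceeds matrix dimensions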
Best wishes!
Submitted by dominic on Sun, 11/08/2009 - 08:53
meet error while writing the data
Hi,
Thanks again for your help. The images had different dimensions from your templates, and now I am able to preprocess a single subject through DPARSF from beginning to end. However, when I tried to process a group of 20, I get a read/write error at the filtering stage. I have not changed the settings between running the single subject and running the group, and I have applied the REST_Fix_Read_Write_Error.m patch, but I still get the error.
Strangely, I have run this analysis twice and the error occurs at two different points: in the first run it happened after the 11th subject, and now it happens at the first subject. In both cases it occurs while the hdr/img files are being written. I am using MATLAB 7.4.0 (R2007a) with SPM8 because of a previous post indicating that the 2009 version does not work. This is the error:
Meet error while writing the data
dominic
Submitted by YAN Chao-Gan on Sun, 11/08/2009 - 15:20
Re
Hi!
This error is really weird.
I just wonder if there is a permission or disk-space problem?
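You could check both from within MATLAB (the path below is just an example taken from your first post):

    outDir = '/data/kodos/work/struct/dominic/dparsf_v1.0beta/ads20'; % example path
    [ok, attr] = fileattrib(outDir);   % query file/directory permissions
    if ok && ~attr.UserWrite
        warning('No write permission on %s', outDir);
    end
    system(['df -h ' outDir]);         % show free disk space (Linux)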
Best wishes!