MemoryError #1740

Closed · albamrt opened this issue Aug 21, 2019 · 1 comment

albamrt commented Aug 21, 2019

Hello, I am writing because, after running fMRIPrep on several subjects without trouble, I am now hitting MemoryErrors. I suspect this is because the fMRI data for the earlier subjects were split into two runs, while for the subjects that crash there is a single run containing all the volumes. That file is in .nii.gz format and is roughly 400 MB. The pipeline runs on a host with 8 CPUs and 32 GB of RAM, with the following command:

docker run --rm -it -e DOCKER_VERSION_8395080871=18.09.2 -v /archive/albamrt/MRI/license.txt:/opt/freesurfer/license.txt:ro -v /archive/albamrt/MRI/BIDS:/data:ro -v /archive/albamrt/MRI/preprocess:/out -v /archive/albamrt/MRI/work:/scratch -u $UID poldracklab/fmriprep:latest /data /out participant --participant_label E04 -t wm --ignore slicetiming --bold2t1w-dof 6 --no-submm-recon --fs-no-reconall --write-graph -v --output-spaces MNI152NLin6Asym:res-2 --n_cpus 6 --nthreads 1 --mem-mb 28000 -w /scratch --low-mem
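
For scale, the in-memory footprint of such a file can be estimated from the header alone, without reading any data (a minimal sketch; the path is the file from this report, and the float64 figure corresponds to the scaling shown in the traceback below):

import numpy as np
import nibabel as nib

# nib.load is lazy: only the header is read at this point.
img = nib.load("sub-E04_ses-01_task-wm_bold.nii.gz")

n_voxels = np.prod(img.shape)
stored = n_voxels * img.get_data_dtype().itemsize  # decompressed, on-disk dtype
scaled = n_voxels * np.dtype(np.float64).itemsize  # after slope/intercept scaling
print(f"decompressed ({img.get_data_dtype()}): {stored / 2**30:.1f} GiB")
print(f"scaled to float64: {scaled / 2**30:.1f} GiB")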

And the crash log written after the failure is the following:

Node: fmriprep_wf.single_subject_E04_wf.func_preproc_ses_01_task_wm_wf.func_derivatives_wf.ds_bold_std
Working directory: /scratch/fmriprep_wf/single_subject_E04_wf/func_preproc_ses_01_task_wm_wf/func_derivatives_wf/_key_MNI152NLin6Asym/ds_bold_std

Node inputs:

base_directory = /out
check_hdr = True
compress = True
desc = preproc
extra_values = <undefined>
in_file = ['/scratch/fmriprep_wf/single_subject_E04_wf/func_preproc_ses_01_task_wm_wf/bold_std_trans_wf/_key_MNI152NLin6Asym/merge/vol0000_xform-00000_merged.nii.gz']
keep_dtype = True
meta_dict = <undefined>
source_file = /data/sub-E04/ses-01/func/sub-E04_ses-01_task-wm_bold.nii.gz
space = MNI152NLin6Asym
suffix = 

Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 316, in _send_procs_to_workers
    self.procs[jobid].run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 472, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 563, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 643, in _run_command
    result = self._interface.run(cwd=outdir)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 375, in run
    runtime = self._run_interface(runtime)
  File "/usr/local/miniconda/lib/python3.7/site-packages/niworkflows/interfaces/bids.py", line 493, in _run_interface
    nii.__class__(np.array(nii.dataobj), nii.affine, hdr).to_filename(
  File "/usr/local/miniconda/lib/python3.7/site-packages/nibabel/arrayproxy.py", line 356, in __array__
    return apply_read_scaling(raw_data, self._slope, self._inter)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nibabel/volumeutils.py", line 967, in apply_read_scaling
    arr = arr + inter
MemoryError
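
What the traceback shows: np.array(nii.dataobj) decompresses the entire 4-D series, and apply_read_scaling then promotes it to float64 because scl_slope/scl_inter are set, multiplying the footprint of a typical int16 series by four. When the series does have to be read, nibabel's proxy slicing caps the peak at one volume (a minimal sketch, not what niworkflows does here; the path is hypothetical):

import nibabel as nib

# Loading is lazy; no data is read yet.
img = nib.load("bold.nii.gz")  # hypothetical path

# np.array(img.dataobj) would materialize the whole series as float64.
# Proxy slicing decompresses and scales one volume at a time instead:
for t in range(img.shape[-1]):
    vol = img.dataobj[..., t]  # a single scaled 3-D volume
    # ... process vol, then let it be garbage-collected ...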

I have read some related issues such as #1097 and #886 and tried changing some parameters, but I haven't been able to solve it, so I would really appreciate some help.

Thanks in advance!

effigies (Member) commented

So a 400 MB .nii.gz is going to consume a lot of RAM once it is decompressed and scaled to float64, which is what is happening here. I think we're going to need to consider alternative ways of updating the headers for cases like this, where there's no reason to be touching the data array.
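
One possible direction (a minimal sketch, assuming a single-file NIfTI-1 .nii.gz whose patched header, extensions included, is no larger than the original; rewrite_header and patch_hdr are hypothetical names): write the updated header, then stream the raw data bytes through verbatim, so the array is never materialized and the on-disk scaling is untouched:

import gzip
import shutil
import nibabel as nib

def rewrite_header(in_path, out_path, patch_hdr):
    """Copy a .nii.gz with a patched header without materializing the data."""
    with gzip.open(in_path, "rb") as src, gzip.open(out_path, "wb") as dst:
        # from_fileobj reads the binary header plus any extensions.
        hdr = nib.Nifti1Header.from_fileobj(src, check=False)
        offset = int(hdr["vox_offset"])  # where the data block starts
        patch_hdr(hdr)                   # caller edits header fields in place
        hdr.write_to(dst)                # binary header + extensions
        # Pad to the original data offset (assumes the header did not grow).
        dst.write(b"\x00" * (offset - dst.tell()))
        src.seek(offset)                 # jump to the start of the data block
        shutil.copyfileobj(src, dst, length=16 << 20)  # stream raw bytes

For example, rewrite_header(in_file, out_file, lambda hdr: hdr.set_xyzt_units("mm", "sec")) would fix the units while leaving scl_slope/scl_inter and every data byte exactly as they were.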
