[color=#000000]Hi Adam,[/color]
[color=#000000]Regarding (1), yes, that is the default behavior. If computing run- or condition-specific measures is more appropriate for your study, you may do so from the [i]Setup.Covariates (1st level)[/i] tab in the menu labeled "[b]compute new/derived 2nd-level covariates[/b]". See, for example, the following post for additional details: https://www.nitrc.org/forum/message.php?msg_id=26100[/color]
Regarding (2), for the Friston-36 model simply enter the "realignment" and "scrubbing" covariates in the denoising step (in your case the "realignment" covariate simply imports the 6 translation/rotation parameters from fMRIPrep), then select the "realignment" covariate and change the "[i]no temporal expansion[/i]" option to "[i]add 2nd-order derivative[/i]", and change the "[i]no polynomial expansion[/i]" option to "[i]add quadratic effects[/i]". I would also recommend adding aCompCor parameters during denoising (e.g. add "White matter" and "CSF" covariates, and set the "confound dimensions" field to 5 for both). See https://web.conn-toolbox.org/fmri-methods/denoising-pipeline for other options.
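For intuition about what those expansion options compute (this is an illustrative sketch, not CONN's internal code): the 36-parameter set from Model #8 is built from nine base regressors (6 motion parameters plus WM/CSF/GM signals) by adding their temporal derivatives, their squares, and the squares of the derivatives.

```python
import numpy as np

def expand_36p(base):
    """Expand base confound regressors (time x 9 matrix: 6 motion
    parameters + WM/CSF/GM signals) into the 36-parameter set:
    originals, temporal derivatives, squares, squared derivatives."""
    base = np.asarray(base, dtype=float)
    # backward-difference derivative; first row set to zero because
    # the derivative is undefined at the first timepoint
    deriv = np.vstack([np.zeros((1, base.shape[1])),
                       np.diff(base, axis=0)])
    return np.hstack([base, deriv, base**2, deriv**2])

# 200 timepoints, 9 base regressors -> 36 confound columns
conf = expand_36p(np.random.randn(200, 9))
print(conf.shape)  # (200, 36)
```

Letting CONN generate these expansions internally (as described above) sidesteps the n/a entries that appear in fMRIPrep's precomputed derivative columns.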
Hope this helps
Alfonso
[i]Originally posted by Adam Raikes:[/i][quote][color=#000000]Hi Alfonso,[/color]
[color=#000000]Yes it finished, actually about 30 minutes later after I posted that. I guess my expectation was that the log file would show all of the images loaded for processing rather than just the top 2 and the bottom 2.[/color]
Two further questions, one of which is specific to the fMRIPrep import:
1. For the QA plots, particularly the violin plots including valid scans, it appears that CONN is collapsing across sessions (e.g., each subject has only one data point in the violins). Is this true and is this supposed to be the case?
2. I was hoping to approach denoising/motion correction using the 36-parameter + scrubbing strategy described in https://www.sciencedirect.com/science/article/pii/S1053811917302288 (see Model #8). I believe fMRIPrep currently outputs all of the necessary confound fields to do so. Any recommendation on implementing this particular strategy in CONN? My assumption would be CSF+WM+GM, x,y,z trans|rot + derivatives, squares, and squared derivatives of these. These are all available in the fMRIPrep confounds, but there are n/a's in the first row for the derivatives and squared derivatives, and so CONN gets mad.
[color=#000000]Thanks again,[/color]
[color=#000000]Adam[/color]
[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote][color=#000000]Hi Adam,[/color]
[color=#000000]That looks fine; smoothing is done in a single step for all subjects (without any progress bar giving you feedback about how it is going), so it can take a while until this step finishes. For reference, on my computer it takes around 2.5 minutes per subject to smooth MNI152NLin2009cAsym functional data (my example has 2mm resolution and a total of 420 timepoints/acquisitions), so that would translate to about an hour for all 23 subjects (or two hours if you are smoothing two sessions per subject). [/color]The total time scales roughly linearly with scan length and number of subjects (note that it also depends strongly on your hard-drive speed; this example was run on a local SSD, which is the fastest option, and running the same analysis on a network drive will take considerably longer).
One option, to be able to continue working while this runs, is simply to run smoothing locally in the background (using the option labeled '[i]Distributed processing (run on background process)[/i]'); that will launch a new windowless Matlab session that runs the smoothing step. See also [url=https://web.conn-toolbox.org/resources/cluster-configuration]parallelization settings[/url] for other options.
[color=#000000]Hope this helps[/color]
[color=#000000]Alfonso [/color]
[i]Originally posted by Adam Raikes:[/i][quote][color=#000000]Hi Alfonso,[/color]
[color=#000000]The patch appears to work as far as detecting the files. And yes, my dataset was partial at the time I was testing the import functionality but thanks for catching that.[/color]
[color=#000000]Everything appears to have imported OK. However, on the smoothing step it looks like it has stalled (honestly, I can't tell). I set it to smooth all participants and sessions (all 23 as a test; local processing). Here's the current log; it's been sitting like this for 30 minutes:[/color]
[color=#000000]2020-Feb-28 10:08 : Preparing functional Smoothing (spatial convolution with Gaussian kernel)
Performing functional Smoothing (spatial convolution with Gaussian kernel)
SPM preprocessing job
.spm.spatial.smooth.fwhm = [6 6 6]
.spm.spatial.smooth.data(1) = D:\Documents\GitHub\bright_light_mtbi\data\neuroimaging_data\derivatives\fmriprep-20.0.0\sub-BLMTBI104\ses-bline\func\sub-BLMTBI104_ses-bline_task-rest_run-001_space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii
.spm.spatial.smooth.data(2) = D:\Documents\GitHub\bright_light_mtbi\data\neuroimaging_data\derivatives\fmriprep-20.0.0\sub-BLMTBI104\ses-ptx\func\sub-BLMTBI104_ses-ptx_task-rest_run-001_space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii
.spm.spatial.smooth.data(43) = D:\Documents\GitHub\bright_light_mtbi\data\neuroimaging_data\derivatives\fmriprep-20.0.0\sub-BLMTBI160\ses-bline\func\sub-BLMTBI160_ses-bline_task-rest_run-001_space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii
.spm.spatial.smooth.data(44) = D:\Documents\GitHub\bright_light_mtbi\data\neuroimaging_data\derivatives\fmriprep-20.0.0\sub-BLMTBI160\ses-ptx\func\sub-BLMTBI160_ses-ptx_task-rest_run-001_space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii
[/color]
[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote][color=#000000]Hi Adam,[/color]
[color=#000000]Thanks for the additional information. Please try the attached patch (to install it, first unzip the attached file, which should create a folder named patch20200229 containing three *.m files; then simply copy these three files (not the folder itself) into your CONN distribution folder, overwriting the files with the same name there).[/color]
[color=#000000]If everything is working fine in your dataset CONN should find anatomical and "ses-bline" functional data for 23 subjects, and "ses-ptx" functional data for 21 subjects. The remaining 11 subjects (BLMTBI118-BLMTBI134) do not seem to contain any functional or structural data (check the fmriprep log files in these directories, as that may indicate that fmriprep was not able to finish preprocessing for some reason). [/color]
[color=#000000]Let me know if you run into any issues[/color]
[color=#000000]Best[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by Adam Raikes:[/i][quote]Hi Alfonso,
The patch works for detecting the functional images in 20.0.0. However, the anatomicals are still not detected. The structure is attached.
[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote][color=#000000]Hi Adam,[/color]
[color=#000000]Thanks for the detailed error report. I believe you are right regarding (2): CONN is just having trouble with the new 'res-*' entry here, and the attached patch should fix that. Something else might be confusing CONN in (1), and perhaps here as well; if the patch does not fix the issue, please run the matlab command below and send me the resulting testdatafile.mat file so I can take a closer look at your folder structure. (Even if the patch works, I would be curious to see what was causing (1), so if you don't mind sending me that info for the 1.5.2 dataset as well, that would be very helpful.)[/color]
a = conn_bidsdir('/data/fmriprep'); % change /data/fmriprep to your fmriprep output root folder
save testdatafile.mat a;
[color=#000000]Thanks[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by Adam Raikes:[/i][quote]Greetings,
I'm hoping I can avail myself of some help. I was hoping to take advantage of the new "fMRIPrep" import option in 19.b. I have two versions of the same longitudinal dataset, processed with fMRIPrep v1.5.2 and v20.0.0 (this week's release). I have the following issues:
1. For the 1.5.2 dataset, CONN detects the functional images but does not detect any structural images. Accordingly, the import process unzips the MNI152NLin2009cAsym preprocessed files but not the anatomicals. My suspicion (from eyeballing the .m files) is that CONN expects the fMRIPrep'd T1s to be located in `fmriprep/sub-xxxx/ses-xxxx/anat`; however, these files are located in `fmriprep/sub-xxxx/anat`.
2. For the 20.0.0 dataset, CONN finds the subject IDs but no files at all. The folder structure is exactly the same; the only difference is that the 20.0.0 filenames contain the resolution field (i.e., `sub-BLMTBI104_ses-bline_task-rest_run-001_space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii.gz`), consistent with the new TemplateFlow recommendations for specifying output resolution.
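(For context: `res-2` is just another BIDS-style key-value entity, so a parser that splits entities generically would tolerate it even when the key is new. A minimal, hypothetical Python sketch of that idea, not CONN's actual code:)

```python
def parse_bids_entities(filename):
    """Split a BIDS-style filename into its key-value entities,
    its suffix (e.g. 'bold'), and its extension."""
    stem, _, ext = filename.partition(".")
    parts = stem.split("_")
    # every 'key-value' chunk becomes an entity; unknown keys like
    # 'res' are handled the same way as 'sub', 'ses', 'task', etc.
    entities = dict(p.split("-", 1) for p in parts if "-" in p)
    suffix = parts[-1] if "-" not in parts[-1] else None
    return entities, suffix, ext

name = ("sub-BLMTBI104_ses-bline_task-rest_run-001_"
        "space-MNI152NLin2009cAsym_res-2_desc-preproc_bold.nii.gz")
ents, suffix, ext = parse_bids_entities(name)
print(ents["res"], suffix, ext)  # 2 bold nii.gz
```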
Any recommendations on how to overcome these issues?
Thanks[/quote][/quote][/quote][/quote][/quote][/quote][/quote]