Channel: NITRC CONN : functional connectivity toolbox Forum: help

RE: Multi-session pre-post design QC_ValidScans for each separate session

[i]If you prefer to compute QC_ValidScans directly, you may do so by using the following options in step 2 above:[/i]

[i]consider covariate scrubbing: raw values[/i]
[color=#ff0000][i]summarize across timepoints: user-defined (and when prompted enter "sum(1-x,1)" -without quotes- as your user-defined function)[/i][/color]
[i]summarize across dimensions: sum[/i]
[i]summarize across dimensions first: checked[/i]
[i]condition-specific measures: checked[/i]
[i]and that will create two new second-level covariates, each summing the number of valid/non-outlier scans for one condition (i.e. like QC_ValidScans but separately for each condition).[/i]
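For reference, the recipe quoted above can be illustrated with a Python/numpy sketch (this is not CONN code; the scan count and outlier positions are invented):

```python
import numpy as np

# Hypothetical scrubbing covariate for one condition: one column per
# flagged outlier scan, with a 1 at the flagged timepoint (ART convention).
n_scans = 10
outlier_scans = [2, 7]                      # invented outlier timepoints
x = np.zeros((n_scans, len(outlier_scans)))
for col, t in enumerate(outlier_scans):
    x[t, col] = 1

# "summarize across dimensions: sum" (applied first, per the checked
# "summarize across dimensions first" option) collapses the columns
# into a single per-scan outlier indicator...
outlier_indicator = x.sum(axis=1)

# ...then "summarize across timepoints: sum(1-x,1)" counts the
# non-outlier scans, i.e. a QC_ValidScans-style value.
qc_valid_scans = int(np.sum(1 - outlier_indicator))
print(qc_valid_scans)  # 8 valid scans out of 10
```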

Hi Alfonso,

I have CONN 18.a and am trying to compute QC_ValidScans for 3 different sessions, but when I try to use those parameters, I cannot select "user-defined" under "summarize across timepoints". Is this a newer addition to CONN? How can I do this in 18.a?

Best,
Omar

Second-level Seed-to-voxel analysis interpretation

Greetings, 

I am currently conducting a seed-to-voxel analysis of connectivity between six seeds (left ACC (x,y,z = -6, 18, 21), left orbitofrontal cortex (OFC; x,y,z = -42, 24, -6), right insula (x,y,z = 36, 0, -21), left DMPFC (x,y,z = -3, 45, 24), right ventral striatum (x,y,z = 6, 12, -6), and left amygdala (x,y,z = -39, 0, -21)) and whole-brain voxels. I have two groups: Dependent and Non-dependent (on cocaine). I am looking at functional connectivity within one (of two) runs, averaged across the two conditions in that run. At the moment, I just want the functional connectivity between my DMPFC seed and whole-brain voxels among the Dependent group when observing DRUG-related videos (two video conditions). As such, my between-subjects contrast weights are [0 1], to select the Dependent group of the two groups; my between-conditions contrast weights are [0.5 0.5 0 0], to average the two video conditions within the first (DRUG-related) run; and my between-sources contrast weights are [0 1 0 0 0 0], to select the DMPFC seed out of my six seeds. When I view the results explorer, I end up with six massive clusters. I have attached a screenshot of the results output. Notably, the first cluster peaks almost exactly at the center of my ROI seed, and when I click on SPM (to display effect sizes and values), that first cluster entirely covers my ROI.
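As an aside for other readers, the way the between-conditions and between-sources contrast vectors combine can be sketched in Python/numpy (illustrative numbers only; this is not CONN's implementation):

```python
import numpy as np

# Hypothetical first-level effect sizes at one voxel for one subject:
# rows = 4 conditions, columns = 6 seeds (values invented).
rng = np.random.default_rng(0)
beta = rng.normal(size=(4, 6))

w_conditions = np.array([0.5, 0.5, 0, 0])    # average the two DRUG-video conditions
w_sources    = np.array([0, 1, 0, 0, 0, 0])  # select the DMPFC seed (2nd of 6)

# The tested effect is the weighted combination across both dimensions
# (the between-subjects contrast [0 1] then selects the Dependent group
# at the group level):
effect = w_conditions @ beta @ w_sources

# equivalent to the plain average of the DMPFC column over conditions 1-2
assert np.isclose(effect, beta[:2, 1].mean())
```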

I am currently at a loss. I have read throughout this forum that I may need to extract subthreshold clusters to be input into the second-level GUI. Could there be a step I missed in the preprocessing/denoising/first-level analysis pipeline? Is it normal to have such large clusters encompassing the entire brain, including the ROI?

I would appreciate any recommendations. 

Regards, 

Will

QA as a variable in the second-level analysis

Hello, 

After all preprocessing steps (including the SPM ART toolbox) I have 6 additional quality assurance variables as second-level covariates (QA: white matter, grey matter, and CSF: vol and eroded). Should I, or could I, include those 6 variables in my model (and model them as "0")?

Thank you for your help,

Agnieszka Debska

Adding subjects

Dear Alfonso,

When I add new subjects to an existing dataset, the GUI does not show an empty subject for the newly created one but instead copies the last subject from the previous dataset. Is this only a graphical issue, or is it really cloning the last subject? And how can I get rid of scans in the primary functional dataset if I want to leave some sessions blank for the new subject?

Thanks,
Boris

Error while importing ROI data

ERROR DESCRIPTION:

Error using rex>rex_do (line 741)
mismatched number of scans and covariate datapoints ([2 1] vs. [800 16])
Error in rex (line 180)
[params.ROIdata,params.ROInames,params.ROIinfo.basis,params.ROIinfo.voxels,params.ROIinfo.files,params.ROIinfo.select,params.ROIinfo.trans]=rex_do(params,1);
Error in conn_rex (line 8)
[varargout{1:nargout}]=rex(varargin{:});
Error in conn_process (line 858)
else [data{nroi1},namesroi{nroi},params]=conn_rex(Vsourcethis,Vmask{nroi}{min(nses,nsesstemp)},'summary_measure','eigenvariate','dims',CONN_x.Setup.rois.dimensions{nroi},'conjunction_mask',mask,'level',level,'scaling',scalinglevel,'select_clusters',0,'covariates',entercovariates,'fsanatomical',fsanatomical,'output_type',outputtype,'output_rex',filenamerex,'output_folder',filepath);
Error in conn_process (line 16)
case 'setup', conn_disp(['CONN: RUNNING SETUP STEP']); conn_process([0:4,4.5,5]);
Error in conn (line 5262)
else conn_process(processname); ispending=false;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN18.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2017b
project: CONN18.a
storage: 327.4Gb available
spm @ /home/demenzbild/Schreibtisch/spm12
conn @ /home/demenzbild/Schreibtisch/conn

I encountered this error message and, after much searching, I still wasn't able to figure out its cause. I would be really grateful if someone could help me with this!

Thank you in advance,
Best regards,

Boris

Error importing results to CONN project

2019-Mar-07 17:50 : Preparing functional Label current functional files as part of list of Secondary Datasets ("original data" label)
Performing functional Label current functional files as part of list of Secondary Datasets ("original data" label)
Importing results to CONN project
ERROR DESCRIPTION:

Error using cellstr (line 49)
Conversion to cellstr from double is not possible.
Error in conn_datasetcopy (line 44)
existf=conn_existfile(cellstr(f1));
Error in conn_setup_preproc (line 2727)
conn_datasetcopy(sets,'original data',subjects);
Error in conn (line 1081)
ok=conn_setup_preproc('',varargin{2:end});
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN18.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2017b
project: CONN18.a
storage: 326.0Gb available
spm @ /home/demenzbild/Schreibtisch/spm12
conn @ /home/demenzbild/Schreibtisch/conn

RE: Extremely high values of contrast estimates in SPM after using CONN preprocessed and denoised data

[color=#000000]Hi Alfonso,[/color]

[color=#000000]The Global Normalization is already set to "None" in spm first level. [/color]
[color=#000000]Attached is the image of the same. [/color]

[color=#000000]Do you mean something else is supposed to be changed?[/color]

[color=#000000]Avantika[/color]
[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote][color=#000000]Hi Avantika,[/color]

[color=#000000]I would suggest trying to set the "[i]grand mean scaling[/i]" option in SPM first-level estimation off, since that looks like a possible culprit for this behavior (after band-pass filtering, the mean functional data is zero at every voxel, so global signal scaling -and similarly any other default mechanism that relies on the average BOLD signal containing anatomical information/features- is likely to fail in rather unexpected ways). [/color]Let me know if that works.

[color=#000000]Best[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by avantika mathur:[/i][quote]Hi Conn users,

After following this post,
https://www.nitrc.org/forum/message.php?msg_id=12021

I used the alternative method to import conn preprocessed data in SPM which is the following :
Entering the preprocessed/denoised timeseries into SPM to perform the first-level analyses.

The data I am analyzing are from children; thus, ART was used at a liberal threshold in preprocessing [global signal z-value threshold 10, subject motion 5 mm]. I did not enter the "Effect of Condition X" terms as confounding effects during denoising.
I used the file generated after CONN preprocessing and denoising (niftiDATA_Subject001_Condition000), defined the first-level design matrices within SPM, and set the masking threshold to -Inf in the first-level analysis [https://www.nitrc.org/forum/message.php?msg_id=14852].

After running the first-level analysis and the group-level analysis [10 subjects], I get weird beta estimates, which are extremely high. Attached are the bar plots for the same [1st bar chart: single subject; 2nd bar chart: group of 10 subjects]. Beta values should not be this high.

Can someone direct me where I am going wrong?

Avantika[/quote][/quote]
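A Python/numpy sketch of the failure mode described in the reply above (all numbers invented; this mimics a generic mean-based rescaling step, not SPM's actual code):

```python
import numpy as np

# A BOLD-like timeseries with a large baseline, and the same series
# after band-pass filtering (which removes the mean at every voxel).
t = np.arange(200)
raw = 1000 + 5 * np.sin(2 * np.pi * t / 50)
filtered = raw - raw.mean()

def grand_mean_factor(y, target=100.0):
    # mean-based rescaling: multiply the data by target / mean(signal)
    return target / np.mean(y)

print(grand_mean_factor(raw))  # ~0.1, a benign scaling factor
# On the filtered data the mean is ~0, so the same factor explodes,
# which is one way beta estimates can become extreme:
assert abs(filtered.mean()) < 1e-9
```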

Cerebellum Cut Off after Preprocessing

Dear Experts, 

I imported both structural and functional images for 30 subjects and preprocessed them in CONN using the default pipeline. When I checked the images after preprocessing, I noticed that a lower portion of the brain (it looks like the cerebellum) was cut off from the functional image. The structural image looks fine. The cut-off in the functional image appeared for all subjects. I checked the original functional images for several subjects, and they look fine and complete (they include the cerebellum). In some patients the cut-offs are severe and may have affected the temporal lobes. Can anyone help me understand what's happening with the cut-off? Why does it appear only in the functional image? (I attached screenshots for clarity.)

In addition, I'm trying to calculate global efficiency and local efficiency for ROIs in the frontoparietal network; would this cut-off influence the calculation of global/local efficiency?

Thank you so much for the help!!

CONN 18b frozen when setting the preprocessing steps

Dear Conn Users,

I used CONN version 17a, then 17f, as SPM12 toolboxes running under Matlab R2017b on a Linux machine.
Everything ran properly for a while, but recently every ROI-to-ROI analysis I process in 17f freezes at the results explorer stage. Initially I thought it was a memory issue, because I had 700 participants.
Recently I looked at a small group of 28 participants and the same issue appeared again. After I select a couple of options in the results explorer, CONN freezes (especially after I manually define a set of ROIs and select them). It is no longer possible to see the 3D representation or to analyse the table as before.
I then decided to do everything in CONN 18b as an SPM12 (7487) toolbox running under Matlab R2017b on Linux. Unfortunately, the software freezes even earlier, when I have to enter the smoothing kernel in the preprocessing steps.

I should mention that I have all three versions of CONN (conn17a, 17f and 18b) saved in the "toolbox" folder of SPM12.

Please find below the error description:

"ERROR DESCRIPTION:

Error using figure
First argument must be a figure object or a positive Integer
Error in conn_menumanager (line 289)
figure(CONN_MM.gcf);
Error in conn (line 1465)
conn_menumanager clf;
Error in conn (line 1085)
conn gui_setup;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN18.b
SPM12 + BrainNet DEM FieldMap LST MEEGtools conn conn17a conn17f wfupickatlas
Matlab v.2017b
project: CONN18.b
storage: 1705.8Gb available
spm @ /usr/local/spm/12.0
conn @ /usr/local/spm/12.0/toolbox/conn"

Do you have any suggestions? I would be happy to see 18b working.
Thank you very much in advance.

Larra

denoising (compcor) is shifting all correlation histogram means to zero

We are preprocessing data using other methods, but then importing the timeseries into CONN so we can run CompCor on them and do subsequent analyses.
I'm attaching a screenshot from the QA of the correlation histograms before and after denoising.
Also attaching a screenshot showing the denoising parameters used:
in particular, WM and CSF selected for CompCor,
the filter set to 0-100 (essentially to turn it off),
and linear detrending also selected.

As you can see, there are around 60 correlation distributions here. Prior to denoising, the means are on the order of 0.1. After denoising they ALL shift to zero.
We would expect this from GSR, but GSR is exactly what we are trying to avoid; that is why we are using CompCor.
Please advise; this is unexpected.
Thank you.
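For context, here is a synthetic Python/numpy sketch of how regressing out shared WM/CSF components re-centers a correlation histogram (this says nothing about whether the shift is appropriate for these data; all signals are simulated):

```python
import numpy as np

rng = np.random.default_rng(1)
n_t = 200
noise = rng.normal(size=(n_t, 1))                 # shared physiological noise
gm    = rng.normal(size=(n_t, 50)) + 1.0 * noise  # gray-matter voxels
wmcsf = rng.normal(size=(n_t, 100)) + 2.0 * noise # WM/CSF voxels

def mean_offdiag_corr(x):
    r = np.corrcoef(x.T)
    return r[np.triu_indices_from(r, k=1)].mean()

# top principal components of the WM/CSF signal (CompCor-style regressors)
u, s, vt = np.linalg.svd(wmcsf - wmcsf.mean(0), full_matrices=False)
comps = u[:, :5]

# regress the components out of every gray-matter voxel
coef = np.linalg.lstsq(comps, gm, rcond=None)[0]
gm_clean = gm - comps @ coef

print(mean_offdiag_corr(gm))        # clearly positive before denoising
print(mean_offdiag_corr(gm_clean))  # near zero after
```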

Extracting fMRI data after denoising step

Hello everybody,

I'm using CONN to preprocess rs-fMRI data and am especially interested in the denoising functionality. Is it possible to save the denoised fMRI data so that I can use them outside of CONN?

Thanks in advance.

Different numbers of runs in fMRI task

Dear Alfonso,

After preprocessing our fMRI task with a standard pipeline, we also applied ICA AROMA to each run of the task (6 runs in total).
For some subjects, we had to remove some runs, meaning that we now have different numbers of runs across subjects.
Is this a problem in CONN? How should we deal with this?

Thank you
all the best
Heidi

Problems preprocessing imported data unwarped with fsl topup

Hello,

I am trying to preprocess resting-state data (with CONN 18b) after performing motion correction and unwarping with FSL's MCFLIRT and topup, respectively. I have imported the motion-corrected/unwarped data into CONN as the primary dataset. I have also imported the motion parameters as a first-level covariate labelled 'realignment'.

These are the preprocessing steps I am trying to run (basically the default pipeline starting at the functional Outlier detection step):

functional Outlier detection (ART-based identification of outlier scans for scrubbing)
functional Direct Segmentation & Normalization (simultaneous Grey/White/CSF segmentation and MNI normalization)
functional Label current functional files as part of list of Secondary Datasets ("mni-space data" label)
structural Segmentation & Normalization (simultaneous Grey/White/CSF segmentation and MNI normalization)
functional Smoothing (spatial convolution with Gaussian kernel)
functional Label current functional files as part of list of Secondary Datasets ("smoothed data" label)

However, I am getting the following error during preprocessing (at functional Direct Segmentation & Normalization):
[code]SPM preprocessing job
(1).spm.spatial.preproc.channel.vols = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/art_mean_fmri.nii
(1).spm.spatial.preproc.warp.write = [1 1]
(2).spm.spatial.normalise.write.woptions.bb = [-90 -126 -72;90 90 108]
(2).spm.spatial.normalise.write.woptions.vox = [2 2 2]
(2).spm.spatial.normalise.write.subj.def = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/y_art_mean_fmri.nii
(2).spm.spatial.normalise.write.subj.resample(1) = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/art_mean_fmri.nii
(2).spm.spatial.normalise.write.subj.resample(2) = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/c1art_mean_fmri.nii
(2).spm.spatial.normalise.write.subj.resample(453) = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/fmri.nii,449
(2).spm.spatial.normalise.write.subj.resample(454) = /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/fmri.nii,450[/code][code]------------------------------------------------------------------------
Running job #1
------------------------------------------------------------------------
Running 'Segment'[/code][code]SPM12: spm_preproc_run (v6365) 23:36:48 - 09/03/2019
========================================================================
Segment /Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/art_mean_fmri.nii
Failed 'Segment'
Index exceeds matrix dimensions.
In file "/usr/local/spm12/spm_maff8.m" (v7202), function "loadbuf" at line 94.
In file "/usr/local/spm12/spm_maff8.m" (v7202), function "spm_maff8" at line 25.
In file "/usr/local/spm12/spm_preproc_run.m" (v6365), function "run_job" at line 107.
In file "/usr/local/spm12/spm_preproc_run.m" (v6365), function "spm_preproc_run" at line 41.
In file "/usr/local/spm12/config/spm_cfg_preproc8.m" (v6952), function "spm_local_preproc_run" at line 450.[/code][code]Running 'Normalise: Write'
Failed 'Normalise: Write'
Error using read_hdr (line 32)
Error reading header file "/Volumes/My_Book_Pro/BDD-OCD/Analysis/fMRI/REST/spm/115a/y_art_mean_fmri.nii".
In file "/usr/local/spm12/@nifti/private/read_hdr.m" (v7147), function "read_hdr" at line 32.
In file "/usr/local/spm12/@nifti/nifti.m" (v7147), function "nifti" at line 26.
In file "/usr/local/spm12/@nifti/nifti.m" (v7147), function "nifti" at line 81.
In file "/usr/local/spm12/config/spm_run_norm.m" (v6578), function "write_norm" at line 98.
In file "/usr/local/spm12/config/spm_run_norm.m" (v6578), function "spm_run_norm" at line 29.[/code][code]The following modules did not run:
Failed: Segment
Failed: Normalise: Write[/code][code]ERROR DESCRIPTION:[/code][code]Error using MATLABbatch system
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line showing the exact #job as displayed in this error message)
------------------
Running job #1
------------------
CONN18.b
SPM12 + ALI Anatomy DEM FieldMap MEEGtools Masking com conn
Matlab v.2017b
project: CONN18.b
storage: 14296.5Gb available[/code][code]spm @ /usr/local/spm12
conn @ /usr/local/spm12/toolbox/conn[/code]

The image sizes and resolution are the same across images, including the unwarped fmri data and the art_mean_fmri (96 96 66, 2 x 2 x 2). 

In addition, the scrubbing covariate appears with a diagonal of dots (see attachment) and the functional data are being reoriented by tilting the head up (see attachment).

I appreciate any help solving these issues.

Regards,

Juan

RE: CONN 18b frozen when setting the preprocessing steps

[color=#000000]Dear Larra,[/color]

[color=#000000]Could you please clarify what you mean by "frozen"? (Is the window unresponsive? Does it throw error messages when you attempt to interact with it? Can you press ctrl-C, and do you see any error messages when you do?)[/color]

[color=#000000]The error message that you attach is quite strange, in my impression it is perhaps due to either the CONN figure having been deleted (e.g. issuing a "close force all" command in Matlab), or perhaps the Matlab workspace having been cleared (e.g. issuing a "clear all" command in Matlab). After you launch CONN, is there any other Matlab program that you are launching/running that perhaps may be interacting with CONN in unexpected ways?[/color]

Any additional information may be helpful here. Thanks
[color=#000000]Alfonso[/color]
[i]Originally posted by Larra San:[/i][quote]Dear Conn Users,

I used CONN version 17a, then 17f as SPM12 toolboxes running under Matlab R2017b on a Linux machine.
Everything was running properly for a while but recently, everything I process in 17f as ROI-to-ROI is frozen in a stage of explorer results. Initially, I thought that's a memory issue because I had 700 participants.
Recently I looked at a small group of 28 participants and the same issue appears again. After I select a couple of options in the explorer results, CONN becomes frozen (especially after I manually defined a set of ROI and selected them). It's not possible anymore to see the 3D representation or to analyse the table as before.
Then I decided to do everything in CONN 18b as SPM12(7487) toolboxes running under Matlab R2017b on a Linux. But unfortunately, the software is frozen even earlier, when I have to enter the smoothing kernel in the preprocesing steps.

I need to mention that I have all three versions of CONN (conn17a, 17f and 18b) saved in the "toolbox" of SPM12.

Please find below the error description:

"ERROR DESCRIPTION:

Error using figure
First argument must be a figure object or a positive Integer
Error in conn_menumanager (line 289)
figure(CONN_MM.gcf);
Error in conn (line 1465)
conn_menumanager clf;
Error in conn (line 1085)
conn gui_setup;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN18.b
SPM12 + BrainNet DEM FieldMap LST MEEGtools conn conn17a conn17f wfupickatlas
Matlab v.2017b
project: CONN18.b
storage: 1705.8Gb available
spm @ /usr/local/spm/12.0
conn @ /usr/local/spm/12.0/toolbox/conn"

Do you have any suggestions? I would be happy to see 18b working.
Thank you very much in advance.

Larra[/quote]

Denoising: dimensions for WM and CSF

Dear Alfonso, 

1. I am using CONN 17. In the Denoising tab, I entered the number of dimensions for White Matter (5) and CSF (5) under the 'Confounds' field. Am I correct to say that, by default, 16 dimensions are extracted for each of White Matter and CSF, and that in my case only the first 5 dimensions of each are used in the denoising step?

2. My second question: as I mouse over the confound timeseries of white matter, for example, I see several values (in my case, 5 values). Can I presume that these values refer to the amplitude?
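Regarding question 1, the dimension selection can be sketched as follows in Python/numpy (an assumed simplification of CompCor with synthetic data; CONN's actual extraction differs in detail):

```python
import numpy as np

rng = np.random.default_rng(0)
wm = rng.normal(size=(200, 300))  # timepoints x WM voxels (synthetic)

# principal-component timeseries of the WM signal, ordered by variance
u, s, vt = np.linalg.svd(wm - wm.mean(0), full_matrices=False)
components = u * s

# "dimensions = 5" keeps only the first 5 components as confound regressors
confounds = components[:, :5]
print(confounds.shape)  # (200, 5)
```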

Thank you very much. 
GK

ROIs for ALFF

Dear all,


I want to analyze simple resting state (ALFF/fALFF). Is it possible to replace the CONN internal brain template by a template containing only the brain regions of interest? Can there be several different ROIs or only a single one that contains all of the ROIs?

Thank you in advance

Joachim

RE: BATCH Processing Help

Thanks Alfonso!
The Matlab version I'm using is '9.2.0.556344 (R2017a)' and I'm now on CONN 18b. I'm still getting this error and can't seem to figure it out:

Error using cell/strmatch (line 19)
Requires character array or cell array of character vectors as inputs.

Error in conn_batch (line 1466)
idx=strmatch(batch.Results.between_subjects.effect_names{neffect},CONN_x.Setup.l2covariates.names,'exact');
Error in CONNbatch (line 77)
conn_batch(BATCH);

clear BATCH;
BATCH.filename= '/media/nir/SharedDrive/Ben/GSPmar4.mat';
BATCH.Setup.isnew=1;
BATCH.Setup.preprocessing.steps={'default_mni'};
BATCH.Setup.functionals{1}{1}='/media/nir/SharedDrive/Ben/GSP/Sub0052_Ses1/Sub0052_Ses1_Scan_02_BOLD1.nii';
BATCH.Setup.functionals{1}{2}='/media/nir/SharedDrive/Ben/GSP/Sub0052_Ses1/Sub0052_Ses1_Scan_03_BOLD2.nii';
BATCH.Setup.functionals{2}{1}='/media/nir/SharedDrive/Ben/GSP/Sub0053_Ses1/Sub0053_Ses1_Scan_02_BOLD1.nii';
BATCH.Setup.functionals{2}{2}='/media/nir/SharedDrive/Ben/GSP/Sub0053_Ses1/Sub0053_Ses1_Scan_03_BOLD2.nii';
BATCH.Setup.structurals{1}='/media/nir/SharedDrive/Ben/GSP/Sub0052_Ses1/Sub0052_Ses1_Scan_01_ANAT1.nii';
BATCH.Setup.structurals{2}='/media/nir/SharedDrive/Ben/GSP/Sub0053_Ses1/Sub0053_Ses1_Scan_01_ANAT1.nii';
BATCH.Setup.preprocessing.fwhm=8;
BATCH.Setup.preprocessing.voxelsize_func=2;
BATCH.Setup.preprocessing.sliceorder=[1:2:47,2:2:47];
BATCH.Setup.RT=3.0;
BATCH.Setup.nsubjects=2;
BATCH.Setup.analyses=[1,2];
BATCH.Setup.voxelmask=1;
BATCH.Setup.voxelresolution=1;
BATCH.Setup.outputfiles=[0,1,0];
BATCH.Setup.roi.names={'LeftVentralStriatum','RightVentralStriatum', 'LeftDorsalPutamen', 'RightDorsalPutamen', 'LeftMedialDorsalThalamus', 'RightMedialDorsalThalamus', 'LeftVentralPutamen', 'RightVentralPutamen', 'LeftDorsalStriatum', 'RightDorsalStriatum'};

%works this far
BATCH.Setup.rois.names={'LeftVentralStriatum','RightVentralStriatum', 'LeftDorsalPutamen', 'RightDorsalPutamen', 'LeftMedialDorsalThalamus', 'RightMedialDorsalThalamus', 'LeftVentralPutamen', 'RightVentralPutamen', 'LeftDorsalStriatum', 'RightDorsalStriatum'};
BATCH.Setup.rois.dimensions={1,1,1,1,1,1,1,1,1,1};
BATCH.Setup.rois.files{1}='/media/nir/SharedDrive/Ben/ROIs/binLeftVentralStriatum.nii';
BATCH.Setup.rois.files{2}='/media/nir/SharedDrive/Ben/ROIs/binRightVentralStriatum.nii';
BATCH.Setup.rois.files{3}='/media/nir/SharedDrive/Ben/ROIs/binLeftDorsalPutamen.nii';
BATCH.Setup.rois.files{4}='/media/nir/SharedDrive/Ben/ROIs/binRightDorsalPutamen.nii';
BATCH.Setup.rois.files{5}='/media/nir/SharedDrive/Ben/ROIs/binLeftMedialDorsalThalamus.nii';
BATCH.Setup.rois.files{6}='/media/nir/SharedDrive/Ben/ROIs/binRightMedialDorsalThalamus.nii';
BATCH.Setup.rois.files{7}='/media/nir/SharedDrive/Ben/ROIs/binLeftVentralPutamen.nii';
BATCH.Setup.rois.files{8}='/media/nir/SharedDrive/Ben/ROIs/binRightVentralPutamen.nii';
BATCH.Setup.rois.files{9}='/media/nir/SharedDrive/Ben/ROIs/binLeftDorsalStriatum.nii';
BATCH.Setup.rois.files{10}='/media/nir/SharedDrive/Ben/ROIs/binRightDorsalStriatum.nii';
BATCH.Setup.conditions.names={'preop'};
BATCH.Setup.conditions.onsets{1}{1}{1}=[6];
BATCH.Setup.conditions.onsets{1}{1}{2}=[6];
BATCH.Setup.conditions.onsets{1}{2}{1}=[6];
BATCH.Setup.conditions.onsets{1}{2}{2}=[6];
BATCH.Setup.conditions.durations{1}{1}{1}=[inf];
BATCH.Setup.conditions.durations{1}{1}{2}=[inf];
BATCH.Setup.conditions.durations{1}{2}{1}=[inf];
BATCH.Setup.conditions.durations{1}{2}{2}=[inf];
%BATCH.Setup.covariates.names={'realignment'};
%BATCH.Setup.covariates.files{1}{1}{1}='/media/nir/SharedDrive/Ben/GSP/Sub0052_Ses1/rp_aSub0052_Ses1_Scan_02_BOLD1.txt';
%BATCH.Setup.covariates.files{1}{1}{2}='/media/nir/SharedDrive/Ben/GSP/Sub0052_Ses1/rp_aSub0052_Ses1_Scan_03_BOLD2.txt';
%BATCH.Setup.covariates.files{1}{2}{1}='/media/nir/SharedDrive/Ben/GSP/Sub0053_Ses1/rp_aSub0053_Ses1_Scan_02_BOLD1.txt';
%BATCH.Setup.covariates.files{1}{2}{2}='/media/nir/SharedDrive/Ben/GSP/Sub0053_Ses1/rp_aSub0053_Ses1_Scan_03_BOLD2.txt';
BATCH.Setup.subjects.effect_names{1}={'ANYresponse'};
BATCH.Setup.subjects.effects{1}=[-1;1];
BATCH.Setup.done=1;
BATCH.Setup.overwrite='No';
BATCH.Preprocessing.filter=[.01,.1];
%BATCH.Preprocessing.confounds.names={'White','CSF','realignment'};
%BATCH.Preprocessing.confounds.dimensions={3,3,6};
%BATCH.Preprocessing.confounds.deriv={0,0,1};
BATCH.Preprocessing.done=1;
BATCH.Preprocessing.overwrite='No';
BATCH.Analysis.type=1;
BATCH.Analysis.measure=1;
BATCH.Analysis.weight=2;
BATCH.Analysis.sources.names={'LeftVentralStriatum','RightVentralStriatum', 'LeftDorsalPutamen', 'RightDorsalPutamen', 'LeftMedialDorsalThalamus', 'RightMedialDorsalThalamus', 'LeftVentralPutamen', 'RightVentralPutamen', 'LeftDorsalStriatum', 'RightDorsalStriatum'};
BATCH.Analysis.sources.dimensions={1,1,1,1,1,1,1,1,1,1};
BATCH.Analysis.sources.deriv={0,0,0,0,0,0,0,0,0,0};
BATCH.Analysis.done=1;
BATCH.Analysis.overwrite='No';
BATCH.Results.between_subjects.effect_names={'AllSubjects'};
BATCH.Results.between_subjects.contrast=[1];
%BATCH.Results.between_conditions.effect_names={'task1','task2'};
%BATCH.Results.between_conditions.contrast=[1,-1];
BATCH.Results.between_sources.effect_names={'RightVentralStriatum'};
BATCH.Results.between_sources.contrast=[1];
%BATCH.Results.analysis_number=2;
BATCH.Results.done=1;
conn_batch(BATCH);

I'll see about re-working the script based on the conn_batch_humanconnectomeproject.m

RE: denoising (compcor) is shifting all correlation histogram means to zero

Alfonso,
Thank you for your reply.
The WM and CSF masks look OK to us. In any case, we are importing data into CONN that are already in MNI space, and the WM/CSF masks are CONN's own (also in MNI space).
Regarding the QA-GCOR plots, what exactly are we looking at?  Is each point on the plot the center value (mean) of the correlation histogram for a particular subject?
Thank you.

ICA networks processing does not run

Dear colleagues,

I am using the CONN GUI and everything runs except the ICA networks part (left icon in RESULTS (2nd-level)). When I try to run it I get the attached message. Basically it asks me to run the first-level voxel-to-voxel step, which I have already done. Any suggestion is welcome. Thanks in advance.

Regards,

Radu

Problem with non-parametric cluster-level statistics (permutation tests)

Hi Alfonso,
I am running a surface-based functional connectivity analysis with CONN and it goes well until the permutation tests for the second-level analysis. It always freezes on the "updating non-parametric statistics, please wait" screen and cannot proceed to the results. I have tried using another version of CONN as well as another computer, but I still cannot solve the problem. Do you know why, and how I could get the results of the non-parametric cluster-level statistics?
Thanks,
Huan