Channel: NITRC CONN : functional connectivity toolbox Forum: help

CONN Pipeline: ART Implicit Mask Error?

I was re-processing a dataset with the newest CONN version and ran across this systematic (potential?) error for all subjects:

"Warning. Error encountered during ART implicit-mask file creation. Skipping this step"

Everything else appears to work fine during the ART outlier detection process. 

Is this of concern?

Thanks,
Jeff

Want to use my own ROIs for seed-based connectivity analysis

Dear Conn experts,

Sorry to bother you with such a basic question. I did a VBM analysis on my two subject groups and found significant gray-matter differences in a small region covering the insula and inferior frontal gyrus. Now I want to use this cluster as a seed region for functional connectivity analysis. I have already built the mask using Marsbar in SPM, but I don't know how to load this mask into CONN as my seed region. Can you give me some guidelines for doing this?
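For reference, I imagine the scripted route might look something like the sketch below (just my guess from the conn_batch documentation; the field names are taken from there, and the project and mask file paths are placeholders):

```matlab
% Hypothetical sketch: add a Marsbar-exported NIfTI mask as a new CONN ROI.
% Paths are placeholders; field names follow the conn_batch documentation.
clear batch;
batch.filename = '/path/to/conn_project.mat';     % existing CONN project
batch.Setup.rois.add = 1;                         % append to the existing ROI list
batch.Setup.rois.names = {'VBM_cluster'};         % label for the new seed
batch.Setup.rois.files = {'/path/to/insula_IFG_mask.nii'};
batch.Setup.done = 1;
batch.Setup.overwrite = 'No';                     % do not redo previous Setup steps
conn_batch(batch);
```

Is this roughly right, or is the Setup > ROIs tab in the GUI the intended way?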

Looking forward to your reply,

Yours sincerely,

Hua

How to erase sessions

Hello conn experts,

My issue is that I want to erase sessions for every patient, though I do not want to erase certain specific sessions.

That is, I have plenty of sessions, as the experiment required them, but I need the results of comparing certain sessions as soon as possible, and I would otherwise need to enter the first-level covariate data for all of the sessions.

The conditions I defined are not related to the sessions I want to erase.

In short, there are only some sessions I want to keep, but I cannot erase the rest without also erasing those.

Is there any way I can erase only some specific sessions?

Thank you very much. This might sound trickier than it actually is.

Nikolas

Restrict between-group comparisons within ICA networks

Dear experts,

I'm doing ICA analysis with CONN.
My problem is that I would like to look at differences [u]within networks[/u]. I know how to create the network mask from the whole-group analysis, but I cannot figure out how to continue. Should I go back and apply this group mask in order to redo the individual maps? Is there any option to directly use the all-subjects maps to limit the group comparisons?

Thanks in advance,

Noys

RE: Error in conn preprocessing ROI

Hello!

I've had the same problem as you, Sarah.

ERROR DESCRIPTION:

Undefined function or method 'eq' for input arguments of type 'struct'.
Error in ==> conn_menumanager>@(x)any(x==varargin{1}) at 327
idx=find(cellfun(@(x)any(x==varargin{1}),CONN_MM.onregionhandle));
Error in ==> conn_menumanager at 327
idx=find(cellfun(@(x)any(x==varargin{1}),CONN_MM.onregionhandle));
Error in ==> conn_menu at 535
conn_menumanager('onregionremove',position.h5);
Error in ==> conn at 4872
conn_menu('update',CONN_h.menus.m_preproc_00{3},[]);
CONN v.18.a
SPM12 + Anatomy DEM FieldMap LI MEEGtools marsbar rwls suit vbm8
Matlab v.2009b
storage: 2690.0Gb available

Has anyone discovered what the problem is here? In my case I have been using SPM12 and CONN 18.a.

Question permutation testing with repeated measures design

Dear Conn users,

I am using conn 18a for resting-state fMRI analysis with the following design:

test group: 3 repeated measures (pre, post, follow-up)
control group: 2 repeated measures (pre, post)

I would like to perform non-parametric analysis using permutation testing, for which I have two questions:

1) A general question on permutation testing in CONN: I noticed that you can only choose to run additional permutations, rather than choosing the exact number. So, for example, if I run 5000 permutations, I can only add more permutations afterwards and cannot then run a test using 1000 permutations.
It also seems impossible to switch between results obtained with different numbers of permutations. For example, after running 1000 permutations and then 1000 additional permutations, I can only see the results of 2000 permutations, and no longer those of 1000. Is there a way to keep the results of different runs using different numbers of permutations?

2) Concerning the repeated-measures design: if I understand permutation testing correctly, one can only permute the labels within a subject. So subject1_pre can be swapped with subject1_post, but not with subject2_post. Does CONN take this into account? When I perform a paired t-test of the pre and post measurements of the test group (n=12), there should only be 2^12 (=4096) possible permutations, yet CONN allows me to run 5000 permutations. Could someone please clarify how CONN does permutation testing with repeated measures?
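To make my question concrete, here is a small MATLAB sketch of what I understand the valid relabelings to be in a paired design (general statistics, not CONN internals; the variable names are mine):

```matlab
% In a paired pre/post design, valid permutations are within-subject sign
% flips of the pre-post differences: 2^n distinct relabelings for n subjects.
n = 12;                                       % subjects in the test group
nExact = 2^n;                                 % 4096, including the identity
% Exhaustive enumeration of all sign-flip patterns (+1/-1 per subject):
signs = 1 - 2*(dec2bin(0:nExact-1) - '0');    % 4096-by-12 matrix of +/-1
% A Monte Carlo sampler asked for >2^n permutations must repeat some patterns:
signsMC = 1 - 2*(rand(5000, n) > 0.5);        % 5000 random draws from the 4096
```

So I do not understand what 5000 "permutations" would mean here if only 4096 distinct ones exist.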

Thanks a lot in advance.

All the best,
Steven

CSF signal contamination in seed timeseries

Hi Alfonso,  

We are performing a seed-to-voxel analysis using a seed that is adjacent to the third ventricle. 

I just wanted to verify whether we would need to be concerned about CSF signal contamination in the voxels of our seed, given that smoothing of this area may theoretically include voxels that are part of the third ventricle. If not, how does CONN correct for this?

Thanks in advance for your help.

Ana Maria

difference between Effect of condition in confounds and first level covariate

Dear CONN experts,

I'd like to know why one would want to copy task-related conditions to first level covariates.
Aren't these regressed out during the temporal preprocessing (denoising) anyway?
The original CONN paper (2012) suggests these effects are indeed removed in the Denoising step. It appears so since the conditions are imported as confounds with the name 'Effect of...'. I guess this is done to obtain "resting state" task-independent FC measures, as in Fair et al (2007).
However, the CONN User Manual states that, in order to achieve this, the conditions must be copied to the first-level covariate list. Is this correct?
This is confusing. In a nutshell, what is the purpose of this first-level covariate list, other than to provide regressors that are not HRF-convolved (as in SPM)?
On the other hand, is the HRF-free regression used to remove task effects, or only the HRF-convolved conditions?
Looking forward any comment on this.

Pedro

How to analyze the data preprocessed from other software?

Hi,

I would like to analyze the data preprocessed by Freesurfer and AFNI with CONN.
However, we do not know how to enter each parameter under "Covariates 1st-level", so this step could not be executed. For example, I do not know the format of the "art_regression_*.mat" files used for QA_timeseries and scrubbing.

For these reasons, I do not know how to register the data obtained from previous preprocessing and proceed to the next analysis.

Can you tell me how to analyze preprocessed data from other software with CONN?

Best regards,
Masaki

First Level Not Displaying Data

Dear Conn Users,

I am working in version 17f and have data that was fully processed in CONN for N=28 [2 sessions (pre- and post-intervention), two task conditions (2-Back and 0-Back) in each session]. I am running gPPI ROI-to-ROI analyses (regression - bivariate). I have data fully processed and am able to view it at the second level. However, the first level will not display data for any subject, condition, or ROI. When viewing the source timeseries for each subject, timeseries display does not change when clicking on different conditions (screenshot example). It is not clear to me how the timeseries is being treated across the different conditions at the first level, and I would like to check over the individual subject data. Because of this, I am not clear on what values are being used in the second level. I have not received any error during processing the data. Re-analyzing and overwriting the data at first level has not resolved the display issue.

If anyone has insight on how to resolve this, please let me know. Thank you in advance!


Best regards,
Nicole

Coding Main Effect/Interactions for Symptom Covariates of Multiple Groups

Hello CONN Experts,

I've looked over the manual and past message-board questions but still have some doubts about whether I'm properly coding my 2nd-level contrasts.

I am looking at the relationship of RS connectivity to symptom severity, in a number of ROIs, in a multi-site study of multiple patient groups.
I would like primarily to look at main effects of symptom across all the patients, and also at group × symptom interactions.

Examples of the subject effects I currently have coded:
QCSubj: subset of subjects of acceptable quality for analysis, e.g. [ 1 1 1 NaN 1 1 1 NaN 1 1 1 NaN]
OnlyPts: only patients, excluding healthy controls, e.g. [ 1 1 1 1 1 1 1 1 1 NaN NaN NaN]
Group A: e.g.  [ 1 0 0 1 0 0 1 0 0]
Group B: e.g. [  0 1 0 0 1 0 0 1 0]
Group C: e.g. [  0 0 1 0 0 1 0 0 1]
DxGroup: e.g. [ 1 2 3 1 2 3 1 2 3]
SymA: e.g. [ 1 0 0 4 0 0 7 0 0] - has NaN for healthy controls
SymB: e.g. [ 0 2 0 0 5 0 0 8 0]- has NaN for healthy controls
SymC: e.g. [ 0 0 3 0 0 6 0 0 9] - has NaN for healthy controls
Sym:  e.g. [1 2 3 4 5 6 7 8 9]
Age: e.g. [ 21 22 23 24 25 26 27 28 29 30]
Sex: e.g [1 0 1 0 1 0 1 0 1 0]
Site1: e.g. [ 1 1 1 0 0 0 0 0 0]
Site2: e.g. [ 0 0 0 1 1 1  0 0 0]
Site3: e.g. [ 0 0 0 0 0 0 1 1  1]
OnlyGroupAB e.g. [ 1 1 NaN 1 1 NaN 1 1 NaN]
OnlyGroupAC e.g. [ 1 NaN 1  1 NaN 1 1 NaN 1]
OnlyGroupBC e.g. [ NaN 1 1  NaN 1 1 NaN 1 1] 

Also each of the 3 patient groups has ~90-120 subjects.

[b]1. Is this the appropriate F test to look at the symptom effect of any group?[/b]

	QC OnlyPts Age Sex S1 S2 S3 GrA GrB GrC SymA SymB SymC

	[ 0 0 0 0 0 0 0 0 0 0 1 0 0 ];
	[ 0 0 0 0 0 0 0 0 0 0 0 1 0 ];
	[ 0 0 0 0 0 0 0 0 0 0 0 0 1 ];

	1b) If so, I wonder whether this doesn't inflate the degrees of freedom. I tried using OnlyGroupA/OnlyGroupB/OnlyGroupC with NaNs as group vectors but ended up with n=0 subjects for the design.
	1c) Another concern: my symptom-group covariates are currently coded with 0 for non-group members, and I worry that in this F-test those zero values are being included in the regression, versus if there were somehow NaNs to prevent them from being incorporated in the test. Is this not an issue?
	1d) In addition, I have also done separate subgroup analyses for each group with symptom, and while the results are fairly similar, I wonder why they are not theoretically identical. I test each group separately like this: QC Age Sex S1 S2 S3 Sym OnlyGroupA - [0 0 0 0 0 0 1 0], which I would think to be ultimately identical to one of the F-test conjunction vectors if they were working as I presumed. I would expect all my individual subgroup results to show up in the F-test, but not all do. Is it a df or a coding issue?

[b]2. Then, to test for group × symptom interactions, would I just do pairwise contrasts like this for each group? Is there a way to test all 3 groups at once, like an F-test?[/b]
	QC Age Sex S1 S2 S3 GrA GrB SymA SymB OnlyGrAB

	[ 0 0 0 0 0 0 0 0 1 -1 0]; Interaction between Group A and B for symptom

	QC Age Sex S1 S2 S3 GrB GrC SymB SymC OnlyGrBC

	[ 0 0 0 0 0 0 0 0 1 -1 0]; Interaction between Group B and C for symptom

...and similarly for the Group A and C symptom interaction.

[b]3. What is the difference in interpretation for these tests? Which would be appropriate to test for common connectivity association with symptom?[/b]
	QC OnlyPts Age Sex S1 S2 S3 Sym

	[0 0 0 0 0 0 0 0 1]

	QC OnlyPts Age Sex S1 S2 S3 Sym

	[0 0 0 0 0 0 0 0 0 1]

4. [b]I also saw in an old post that Alfonso mentioned I could do the F-test, mask for the significant regions, and then do the post-hoc interaction tests within the masks. Sorry if this is a basic concept, but why isn't this double dipping or p-hacking?[/b]
[url=message.php?msg_id=12044]https://www.nitrc.org/forum/message.php?msg_id=12044[/url]

Thanks in advance for the help!
Victoria Okunye

Preprocessing

How does changing preprocessing parameters such as the motion-outlier detection threshold (95th vs. 97th percentile) or the structural and functional target resolution (1 mm vs. 2 mm) change the resulting ROI matrix?
Which settings are recommended for optimal results?
Thanks
Jason

RE: Restrict between-group comparisons within ICA networks

In order to look at group differences within networks, would it be valid to:

1. Do the ICA analysis.
2. Make a mask of the group networks I'm interested in: ICA networks / Spatial properties / Results explorer (selecting AllSubjects and the ICA network of interest) / export mask.
3. Select the between-subjects contrast of interest (e.g., patients > controls) and the network, click on Results explorer, click on SPM (in the results display), and then select the network mask as inclusive.

Would that be correct?

Thanks in advance,

Diana




[i]Originally posted by Noys Lambent:[/i][quote]Dear experts,

I'm doing ICA analysis with CONN.
My problem is that I would like to look at differences [u]within networks[/u]. I know how to create the network mask from the whole-group analysis, but I cannot figure out how to continue. Should I go back and apply this group mask in order to redo the individual maps? Is there any option to directly use the all-subjects maps to limit the group comparisons?

Thanks in advance,

Noys[/quote]

Realign and Unwarp Issue

Running 'Realign & Unwarp'

SPM12: spm_realign (v6070) 17:09:55 - 09/01/2019
========================================================================
Failed 'Realign & Unwarp'
Error using save
Unable to write file C:\Program Files\MATLAB\R2016b\data\functional\NYU_51159\rest\rp_rest.txt: permission denied.
In file "C:\Users\Sonzogo\Desktop\Good\spm12\spm_realign.m" (v6070), function "save_parameters" at line 541.
In file "C:\Users\Sonzogo\Desktop\Good\spm12\spm_realign.m" (v6070), function "spm_realign" at line 162.
In file "C:\Users\Sonzogo\Desktop\Good\spm12\config\spm_run_realignunwarp.m" (v6554), function "spm_run_realignunwarp" at line 78.
The following modules did not run:
Failed: Realign & Unwarp
Failed: Realign & Unwarp
Failed: Realign & Unwarp
Failed: Realign & Unwarp



I would appreciate any help.

gPPI after denoising the Effect of the conditions

Hi CONN experts,
Suppose I have a set of conditions, say rest, stim1 and stim2.
I've been wondering whether it is correct to do gPPI using the task conditions after having used Effect of stim1 and Effect of stim2 as confounds in the denoising step.
The way I see it, the effects of these confounds are regressed out from the BOLD signal in the i-th region/voxel by estimating:
yi = yi' + beta1i*conv(hrf,stim1) + beta2i*conv(hrf,stim2)
where yi' is the denoised signal.

On the other hand, gPPI estimates the betas of the following model, given the target and seed regions/voxels i and k, respectively:
yi' = beta1ik*conv(hrf,stim1)*yk' + beta2ik*conv(hrf,stim2)*yk' +   (PPI interactions)
        beta1i*conv(hrf,stim1) + beta2i*conv(hrf,stim2) +   (main effects of conditions)
        betak*yk'   (main effect of seed)
which may seem to control for the effect of the conditions a second time.
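To be explicit, the design I have in mind for one target would look something like this (an illustrative sketch, not CONN's actual code; s1 and s2 are the condition boxcars sampled at the TR, hrf an HRF kernel, yk the denoised seed timeseries and yi the denoised target timeseries, all assumed given as column vectors):

```matlab
% Illustrative gPPI design matrix for one target region/voxel.
% Assumed inputs: s1, s2 (condition boxcars), hrf (HRF kernel),
% yk (denoised seed), yi (denoised target), all column vectors.
c1 = conv(s1, hrf); c1 = c1(1:numel(yk));     % HRF-convolved task regressors
c2 = conv(s2, hrf); c2 = c2(1:numel(yk));
X  = [c1.*yk, c2.*yk, ...                     % PPI interaction terms
      c1, c2, ...                             % main effects of the conditions
      yk, ones(numel(yk),1)];                 % seed main effect + constant
beta = X \ yi;                                % least-squares estimates
```

The columns c1 and c2 are the same regressors that denoising has already removed from both yi and yk, which is exactly why I wonder whether including them again is redundant or harmful.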

Is this correct? If so, is it acceptable? Is it irrelevant, i.e., after denoising, will the main effects of the conditions in the gPPI model simply not be significant (estimates beta1i = beta2i = 0)? Or should I denoise without including the Effect of the conditions if gPPI is intended?

Thank you!

second level cov not used until second level analysis stage, correct?

Hi Experts,

I ran through setup, denoising, and first-level analysis in a large dataset and just realized I made a mistake when specifying one of my second-level covariates (a questionnaire score). If I fix that in the Setup area, can I still proceed straight to second-level analysis?
I am trying to confirm that these second-level covariates (unlike first-level covariates) are not used anywhere in the processing stream up until that point, and that I hopefully do not have to go back and re-run everything.

Thanks for any input!
Emily

RE: Denoising Error, Updating Analysis variables

Hello,

I am suffering from the same problem here. I am getting the same error as William mentioned and have tried running through older versions of CONN, to no avail. Strangely, I can run the denoising step if I do not select the voxel-to-voxel options, but I need these, and when I include them it breaks down at the final stage. I was wondering whether anyone could shed any further light on how to resolve this denoising issue?

Many thanks,

Nick

ERROR DESCRIPTION:

Index exceeds array bounds.
Error in conn_process (line 1830)
else CONN_x.vvAnalyses(ianalysis).regressors.(optionsnames{n2}){n1}=initial.(optionsnames{n2}){n1};
Error in conn_process (line 28)
case {'preprocessing_gui','denoising_gui'}, disp(['CONN: RUNNING DENOISING STEP']); conn_process([1.5,2,6:9],varargin{:});
Error in conn (line 5086)
else conn_process('denoising_gui'); ispending=false;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.17.b
SPM12 + DEM FieldMap MEEGtools wfupickatlas
Matlab v.2018a
storage: 1725.8Gb available
spm @ C:\programs\SPM12
conn @ C:\programs\conn18a\conn

RE: dmn seed definition rationale?

Hi Alfonso,

I was wondering if there is an update on this: does CONN still use the coordinates from the Fox et al. PNAS 2005 paper?

Thank you for all your hard work putting this toolbox together.

Best wishes,

Ed

RE: How to make it faster?

[color=#000000]Dear Alfonso, [/color]
thank you for keeping the CONN toolbox updated and helping the CONN community every day!

I'm using a windows PC for our CONN analyses. Would option C (from your answer to Haleh) work also for a windows computer with multiple cores?

Thanks for your help and support
Davide

[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote]Dear Haleh,

While usb3 is relatively fast, depending on the length of your data you are still probably looking at something around 1 hour per subject (you can easily test that by simply running a single subject first, which is not a bad idea because it will help you debug any issues before committing to waiting a week for your results, and it will give you a good estimate of how long it will take to run all subjects). I do not believe that either storing the conn project locally or running preprocessing separately in SPM would significantly reduce computation time. The main thing that in my experience significantly speeds things up is parallelizing your analyses. Some options would be:

a) HPC / cluster. Many institutions offer simple access to cluster computing resources, and CONN works with many standard cluster configurations right out of the box, so that might be worth investigating (see www.conn-toolbox.org/resources/cluster-configuration for additional info). If you follow this route, you would connect to your institution network remotely, run CONN and then simply select the parallelization option that reads "[i]distributed processing[/i]" when running your analyses, and CONN will automatically distribute the analyses across your choice of nodes in the network (e.g as much as one node per subject)

b) if you have a shared remote storage at home/office (e.g. a network drive) and several computers all connected to the same shared storage, you may also use this setup as a sort of DIY cluster. If you follow this route, then you would simply select in CONN the parallelization option that reads "[i]queue / script it (save as scripts to be run later)[/i]" to break down your processing pipeline into N blocks (here N is the number of different computers you have set up). That will create N Matlab scripts (or command-line shell scripts) that you can then manually run one on each of your computers. After all are finished, simply opening your CONN project from the GUI will merge the results from all of these different computers

and c) if you do not have access to multiple computers but your computer has a reasonable number of cores, you may also simply parallelize your analyses across the different cores. if you follow this route, you would first go to "[i]Tools. HPC options. Configuration[/i]" and make sure there that the default profile (e.g. named "[i]background process (Unix, Mac)[/i]" if you are on a Mac computer) works fine (e.g. just select 'Test profile'), and then, as in option (a) above, simply select in CONN the parallelization option that reads "[i]distributed processing[/i]" when running your analyses to have CONN automatically distribute the analyses across your choice of cores (e.g. one process per core)

Hope this helps and good luck!
Alfonso[/quote]
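In case it helps clarify my question, this is the kind of batch script I had in mind for option (c) on Windows (field names as I understand them from the conn_batch documentation; the profile name is only an example, and I would check Tools > HPC options for the profiles actually available on my system):

```matlab
% Hypothetical sketch: distribute processing across local cores via conn_batch.
% Field names per the conn_batch documentation; profile name is an example.
clear batch;
batch.filename = '/path/to/conn_project.mat';
batch.parallel.N = 4;                                     % e.g. one process per core
batch.parallel.profile = 'Background process (Windows)';  % example profile name
batch.Setup.done = 1;
batch.Setup.overwrite = 'No';
conn_batch(batch);
```

Would something like this work on a multi-core Windows PC?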

How to get the sharp output time series after Preprocessing step

Dear CONN experts, 

I'm a new user of the CONN toolbox.
I would like to remove noise from the EPI images (denoise) and then use them as input files to SPM for analysis.
So I selected 'create confound corrected time series' and got the 'niftiDATA_Subject*_Condition*.nii' files. But these images look strange: low sharpness and very noisy (like static on a screen) compared to the original EPI files.

This is my batch file:

...
batch.Setup.outputfiles= [0,1,0,0,0,0] ;
batch.Setup.preprocessing.steps= 'structural_segment';
batch.Setup.voxelresolution=3;
batch.Setup.done=1;
batch.Setup.overwrite='Yes';
%% DENOISING step
batch.Denoising.filter=[0.01, 1];
batch.Denoising.detrending=1;
batch.Denoising.despiking=1;
batch.Denoising.confounds.names={'White Matter','CSF','head_movement'};
batch.Denoising.done=1;
batch.Denoising.overwrite='Yes';
%% FIRST-LEVEL ANALYSIS step
batch.Analysis.done=0;
%% Run all analyses
conn_batch(batch);

Because I only chose structural segmentation, I think the functional images were never smoothed or resliced.
Does anyone know how I can obtain sharp output time series?
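In case it is relevant, I am now wondering whether I should instead have requested the full default pipeline, something like the lines below (the step name is from the conn_batch documentation; the smoothing kernel is just an example value):

```matlab
% Hypothetical alternative: run the full default MNI-space pipeline
% (realignment, normalization, smoothing, etc.) instead of segmentation only.
batch.Setup.preprocessing.steps = 'default_mni';
batch.Setup.preprocessing.fwhm  = 8;    % smoothing kernel in mm (example value)
```

Would that give sharper denoised output images?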

Thanks.

Toshi

