Channel: NITRC CONN : functional connectivity toolbox Forum: help

RE: Correction for global signal

Thanks for further clarification on GSR concerns, Alfonso.

When using aCompCor, have there been documented recommendations for *how* to choose the number of principal components to be regressed from the WM and CSF regions? The Chai et al. (2012) methods paper reports results for regression of 1, 3, 5, and 10 principal components from each of the WM and CSF regions and eventually settles on 5 for its main analyses; however, the choice of 5 seems fairly arbitrary. There are bound to be differences in the PC structure across research subjects' data, and I worry that an arbitrary cutoff would result in either insufficient removal of "noise" or distortion of "neural signal." The latter seems increasingly likely the more components are removed (e.g., 5 each for CSF and WM); is anything >1 safe, and if not, why not just use the average?

A method to determine on a single-subject basis the optimal number of artifact components to regress using aCompCor would be greatly appreciated. Are you aware if anyone out there in the CONN-world has implemented this?
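
For concreteness, this is roughly the kind of per-subject criterion I have in mind (a plain MATLAB sketch, not a CONN function; wm_signals stands for a hypothetical timepoints-by-voxels matrix extracted from one subject's eroded WM mask, and zscore/pca need the Statistics Toolbox):

% Plain MATLAB sketch (not a CONN function): pick, per subject, the smallest
% number of noise components explaining a fixed fraction of the variance
% inside the WM mask (the same idea would apply to the CSF mask).
var_threshold = 0.50;                          % fraction of mask variance to capture
X = detrend(wm_signals);                       % remove per-voxel linear trends (timepoints x voxels)
X = zscore(X);                                 % variance-normalize each voxel time series
[~, score, latent] = pca(X);                   % PCA across voxels; component time courses in 'score'
explained = cumsum(latent) / sum(latent);      % cumulative variance explained
ncomp = find(explained >= var_threshold, 1);   % smallest number of PCs reaching the threshold
noise_regressors = score(:, 1:ncomp);          % timepoints x ncomp regressors to remove in denoising
fprintf('Keeping %d WM components (%.0f%% of mask variance)\n', ncomp, 100*explained(ncomp));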

RE: Longitudinal Procedure for Multiple Subjects

[color=#000000]Hi Deniz,[/color]

[color=#000000]Yes, simply enter 2 in the 'number of sessions' field in [i]Setup.Basic[/i], then enter the TP1 data in session-1 and TP2 data in session-2, and then create two conditions (TP1 and TP2) each associated with a single session (see for example this post http://www.nitrc.org/forum/message.php?msg_id=12664 for additional details). You will then be able to look at the difference between sessions directly within Conn by entering a between-condition contrast [-1 1] in your second-level analyses.[/color]
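
In case a scripted version helps, a rough conn_batch sketch of this setup might look as follows (treat it as a sketch rather than a verified script; the subject count, TR and all file names below are placeholders):

% Rough conn_batch sketch (not a verified script): two sessions per subject,
% each mapped to its own condition, so that a between-condition contrast
% [-1 1] at the second level compares TP2 vs TP1.
nsubjects = 20;                                                    % placeholder
func_tp1 = arrayfun(@(n) sprintf('/data/sub%02d/tp1/func.nii',n), 1:nsubjects, 'UniformOutput',false);
func_tp2 = arrayfun(@(n) sprintf('/data/sub%02d/tp2/func.nii',n), 1:nsubjects, 'UniformOutput',false);
anat     = arrayfun(@(n) sprintf('/data/sub%02d/anat.nii',n),     1:nsubjects, 'UniformOutput',false);
clear batch;
batch.filename = fullfile(pwd,'conn_longitudinal.mat');            % new project file
batch.Setup.isnew = 1;
batch.Setup.nsubjects = nsubjects;
batch.Setup.RT = 2;                                                % repetition time (s), placeholder
batch.Setup.conditions.names = {'TP1','TP2'};
for nsub = 1:nsubjects
    batch.Setup.functionals{nsub}{1} = func_tp1{nsub};             % session 1 = time point 1
    batch.Setup.functionals{nsub}{2} = func_tp2{nsub};             % session 2 = time point 2
    batch.Setup.structurals{nsub}    = anat{nsub};
    % TP1 spans all of session 1 and is absent from session 2 (and vice versa)
    batch.Setup.conditions.onsets{1}{nsub}{1} = 0;   batch.Setup.conditions.durations{1}{nsub}{1} = inf;
    batch.Setup.conditions.onsets{1}{nsub}{2} = [];  batch.Setup.conditions.durations{1}{nsub}{2} = [];
    batch.Setup.conditions.onsets{2}{nsub}{1} = [];  batch.Setup.conditions.durations{2}{nsub}{1} = [];
    batch.Setup.conditions.onsets{2}{nsub}{2} = 0;   batch.Setup.conditions.durations{2}{nsub}{2} = inf;
end
conn_batch(batch);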

Hope this helps
Alfonso

[i]Originally posted by Deniz Gürsel:[/i][quote]Hi everyone,

It's the first time I'm using this toolbox and doing a resting-state analysis, so I need your input. I have functional and anatomical data from 2 groups and from 2 time points (TP1, TP2). I need to do a seed-to-whole-brain analysis for the hippocampus.

So far I ran the whole CONN steps (setup, denoising, 1st level and 2nd level) for all the subjects for the TP1. I am about to run all the steps now for TP2, but I got a bit confused.

Since this is a longitudinal study, should I have run both time points at the same time in the preprocessing phase (setting the session parameter to 2)?

If it's possible to continue as I've done, i.e., separate analyses for the two time points, what statistical procedure should I follow? Should I somehow use the "BETA_Subject1_Condition001_Source*.nii" files and compute a difference between the maps for the two time points? Is it possible to perform such analyses in the toolbox, or should I use SPM etc.?

Thank you so much for your help.

Best,[/quote]

RE: Starting new project using SPM.mat files

[color=#000000]Dear Maria,[/color]

[color=#000000]Did you get any error messages in the [i]Project.Import[/i] gui when pressing the 'Done' button there or did it report that the SPM.mat info was imported correctly?[/color]

[color=#000000]Thanks[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by Maria Hakonen:[/i][quote]Dear all,

I have analysed functional and anatomical MRI data with SPM and have entered the SPM.mat files in CONN (i.e., Project -> Import). However, the data and experiment information don't seem to be loaded in the Setup window (i.e., the Basic, Structural, Functional, and Conditions sections). Could someone please let me know what might be wrong?

Thanks,
Maria[/quote]

ICA - extracting time series from components

Hello,

I have run the full ICA pipeline in the newest release of CONN. I would like to use the resulting parcellations to create average time series for all subjects - is there a function in CONN for this, or is there an output file with these values? If not, can anyone advise me on the prefixes and abbreviations added to the preprocessed functional data files, so I know which ones I can mask using a different technique?
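
To illustrate what I'm after, here is the kind of parcel-averaged extraction I would otherwise do by hand with SPM (a rough sketch; the file names are placeholders and it assumes the parcellation and the functional data share the same voxel grid):

% Rough SPM-based sketch (file names are placeholders): for each label in a
% parcellation image, average the preprocessed functional signal over the
% voxels carrying that label.
atlas_file = 'parcellation.nii';              % placeholder: integer-labelled parcellation image
func_file  = 'preprocessed_func_4D.nii';      % placeholder: preprocessed 4D functional file
Vatlas = spm_vol(atlas_file);
atlas  = round(spm_read_vols(Vatlas));        % integer labels, 0 = background
Vfunc  = spm_vol(func_file);                  % one handle per volume of the 4D file
func   = spm_read_vols(Vfunc);                % x-by-y-by-z-by-time array
labels = setdiff(unique(atlas(:)), 0);        % parcel labels present in the image
ntime  = size(func, 4);
func2d = reshape(func, [], ntime);            % voxels-by-time
ts     = zeros(ntime, numel(labels));         % time-by-parcel average time series
for k = 1:numel(labels)
    ts(:, k) = mean(func2d(atlas(:) == labels(k), :), 1)';
end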

Thanks!

Preprocessing error for pre/post analysis

Hello Alfonso,
I'm trying to run a paired t-test pre/post analysis for 9 subjects with resting-state fMRI. I read many useful threads about the pre/post (paired t-test) "Conditions" step, thank you for that! I am in the Setup step and would like to run preprocessing. However, after completing Setup, preprocessing gives me an error at the very beginning of the process:

[i]Error using cellstr (line 33)[/i]
[i]Input must be a string.[/i]

So, let me explain the steps I took in Setup. I would appreciate it if you could let me know whether I'm on the right track, and why I am getting this error:
1) Basic: #subjects: 9, #sessions: 2, RT: [2 2 2 2 2 2 2 2 2]

2) Structural and functional: I uploaded the corresponding files

3) ROIs: default values

4) Conditions: I created 2 conditions, "pre" and "post". For the "pre" condition: all subjects, session 1 => pre 0 inf; all subjects, session 2 => pre [] []. For the "post" condition: all subjects, session 1 => post [] []; all subjects, session 2 => post 0 inf.
I also checked "Allow missing data", because some of the subjects have only 1 session and some have 2 sessions.

5) First-level covariates: default (I don't really know what should be defined in the covariates. I know it should be the realignment/motion parameters, but I do not have any text files, or at least I don't know how to get them, so I prefer to use the defaults. I read in another thread on this forum (https://www.nitrc.org/forum/forum.php?thread_id=6855&forum_id=1144) that the parameter values will be set automatically during the preprocessing step, so I just manually typed "realignment" without uploading any text file.)

6) Second-level covariates: default
 
Then I get the error when preprocessing starts. I would appreciate it if you could help me with this. The full error is as follows:

ERROR DESCRIPTION:

Error using cellstr (line 33)
Input must be a string.
Error in conn_setup_preproc (line 748)
temp=cellstr(CONN_x.Setup.functional{nsubject}{nses}{1});
Error in conn (line 773)
ok=conn_setup_preproc('',varargin{2:end});
Error in conn_menumanager (line 119)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.16.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2015b
storage: 22.4Gb available



Hi again, I found that the error is caused by "Allow missing data": after I excluded the subjects that do not have 2 sessions, preprocessing runs without issue. Could you please help me figure out why "Allow missing data" gives an error when subjects do not all have the same number of sessions?
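
For reference, this is roughly the check I now run on my own file lists before filling in Setup (plain MATLAB, not CONN code; func_files is a hypothetical subjects-by-sessions cell array of the file names I intend to enter):

% Plain MATLAB sketch (not CONN code): flag subjects/sessions whose functional
% entry is empty before filling in Setup, since the cellstr() call in
% conn_setup_preproc seems to fail on an empty entry for a missing session.
func_files = { {'/data/s01/run1.nii','/data/s01/run2.nii'}, {'/data/s02/run1.nii'} };  % placeholder
nsessions  = 2;
for nsub = 1:numel(func_files)
    for nses = 1:nsessions
        if numel(func_files{nsub}) < nses || isempty(func_files{nsub}{nses})
            fprintf('Subject %d, session %d: no functional data entered\n', nsub, nses);
        end
    end
end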

Thanks

ICC

Hello, 

For the voxel-to-voxel ICC, do the maps' contrast estimates still represent z values?

Thanks!

Athena

compare pre and post for single subjects

Hi,
I have pre and post resting-state fMRI data and, in the first-level analysis, I am not sure how I can compare pre and post for single subjects. I found one related post in the forum where Alfonso responded to another user, but unfortunately the link he provided no longer works. Here is Alfonso's response:[quote]
In your case, to compare two individual conditions on a single subject, one potential approach would be to use the analyses described in this post (http://www.nitrc.org/forum/message.php?m... that will take the first-level estimates for each condition-specific seed-to-voxel connectivity maps and compute a Fisher test comparing those within-subject estimates). 
 [/quote]The page that link points to has been removed. Would you please provide some information on how we can compare pre and post in the first-level analysis?
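
In the meantime, here is my rough guess at what the suggested Fisher test might look like (a sketch, not CONN code; it assumes the BETA maps contain Fisher-transformed correlations from the bivariate correlation measure, that the two sessions can be treated as independent, and the file names and scan counts are placeholders), corrections welcome:

% Sketch only (not CONN code): voxelwise Fisher comparison of two condition-
% specific seed-to-voxel maps from a single subject.
map_pre  = 'BETA_Subject001_Condition001_Source001.nii';    % placeholder
map_post = 'BETA_Subject001_Condition002_Source001.nii';    % placeholder
n_pre  = 150;                                                % scans in the pre session (placeholder)
n_post = 150;                                                % scans in the post session (placeholder)
V1 = spm_vol(map_pre);   z1 = spm_read_vols(V1);
V2 = spm_vol(map_post);  z2 = spm_read_vols(V2);
se    = sqrt(1/(n_pre-3) + 1/(n_post-3));                    % SE of the difference of two Fisher z's
zdiff = (z2 - z1) / se;                                      % voxelwise z statistic (post minus pre)
p     = erfc(abs(zdiff) / sqrt(2));                          % two-sided, uncorrected p-values
Vout = V1; Vout.fname = 'post_minus_pre_zdiff.nii';
Vout.dt = [spm_type('float32') 0];  Vout.pinfo = [1;0;0];
spm_write_vol(Vout, zdiff);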
 
Thank you!

Undefined function or variable 'DOGICA3'.

Dear all,

I have tried to run a voxel-to-voxel analysis via the GUI but got the error message pasted below. Could someone please let me know what might be wrong?

ERROR DESCRIPTION:

Undefined function or variable 'DOGICA3'.
Error in conn_process (line 2714)
if ~DOGICA3, % (GICA1) wqc'*y2=y1
Error in conn_process (line 39)
case 'analyses_gui_vv', disp(['CONN: RUNNING ANALYSIS STEP (voxel-to-voxel analyses)']); conn_process([13],varargin{:});
Error in conn (line 4548)
else conn_process('analyses_gui_vv');
Error in conn_menumanager (line 119)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.16.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2016a

Thanks already in advance!

Best Regards,
Maria

How to get beta values in 2nd-level results?

Dear experts,

Could you tell me how to get beta values in the 2nd-level results with CONN 16.b? Currently only T values can be exported.

Thank you!

Hayaku

RE: How to remove the grey matter from roi list ?

Dear Alfonso,

I know this question has already been addressed numerous times (e.g. here: https://www.nitrc.org/forum/forum.php?thread_id=4006&forum_id=1144) but for clarity and for future visitors of this thread:

You said that adding the grey matter to the confounds list is "equivalent to 'global signal regression'". As far as I understand, that's not quite true, since CONN uses the CompCor method whereas actual GSR regresses out the average signal over the whole brain. Thus adding the grey matter to the confounds list is only an approximation to GSR, not exactly the same thing.

Please correct me if I'm wrong!


Cheers,

Sascha


[i]Originally posted by Alfonso Nieto-Castanon:[/i][quote][color=#000000]Dear Liu[/color]

[color=#000000]Entering the grey-matter in the 'confounds' list will regress out the average BOLD signal extracted from the gray matter mask. This is equivalent to 'global signal regression', and it is not recommended because it risks biasing the voxel-to-voxel statistics (see for example Murphy K, Birn RM, Handwerker DA, Jones TB, Bandettini PA. 2009. The impact of global signal regression on resting state correlations: are anti-correlated networks introduced? Neuroimage 44:893–905). [/color]

[color=#000000]If you simply remove the 'grey matter' ROI in the 'first-level analysis' tab from the list of 'sources', the resultsROI_Condition*.mat files should contain a matrix of size 90 (only 'source' ROIs) x 91 (all ROIs). The columns of this matrix are sorted in the same way as the rows, so you may simply disregard the last column to get your proper 90 x 90 matrix of connectivity among AAL areas. If you are in the 'ROI-to-ROI results explorer' window, you can also disregard there the 'grey matter' ROI (and any other ROI that was not selected in the 'sources' list) by selecting in the top-right corner the option 'network of source ROIs' (instead of the default 'network of all ROIs'). [/color]

Let me know if this clarifies.
Best
Alfonso


[i]Originally posted by chen liu:[/i][quote]Dear conn expert:

In the first-level analysis I'm trying to load the 90 AAL ROIs, but the default "grey matter" ROI is loaded automatically and I can't remove it. I tried right-clicking on the grey matter entry in the ROI list and selecting 'remove selected ROIs', but it doesn't take effect; the grey matter ROI still remains in the ROI list at the first-level analysis step.

If I don't remove the grey matter from the ROI list in the first-level analysis, I find that the ROI-to-ROI matrix in resultsROI_Condition*.mat is 91*90. When I instead check the grey matter as a covariate, the matrix is 90*90.

Question 1: if I want to get a 90*90 matrix based on the AAL template and I use the grey matter as a covariate to regress it out, how does that affect my results?

Question 2: what other ways are there to remove the grey matter from the ROI list?
 
                     
    many thanks

         liuchen[/quote][/quote]
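
(As an aside, for anyone landing here for the matrix question quoted above, a minimal sketch of keeping only the source-ROI columns; it assumes the resultsROI file stores the matrix in a variable Z with row labels in names and column labels in names2, which is worth confirming with whos('-file',...) first.)

% Minimal sketch for the matrix question quoted above: load a resultsROI file
% and keep only the columns whose ROI is also a row (source) ROI, dropping
% e.g. the trailing 'Grey Matter' column.
load('resultsROI_Condition001.mat', 'Z', 'names', 'names2');   % adjust the path to your first-level results folder
keep = ismember(names2, names);        % columns that correspond to source ROIs
Z90  = Z(:, keep, :);                  % e.g. 90 x 90 (x subjects, if a 3rd dimension is present)
col_labels = names2(keep);             % labels of the retained columns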

RE: Error in Batching Results

Hi Alfonso,
Here is my script. Thank you for your help!
Marlene

Single-subject voxel-to-voxel repeated measures

Hello all, 

is it possible in CONN to perform a test-retest statistical analysis with n=1 in a hypothesis-free way?

Thank you in advance, 

Athena

Files created by conn during preprocessing

Dear all,

After preprocessing my data, CONN created numerous files and stored them in the folders containing my original structural and functional files. Can anyone tell me what exactly these files are? I didn't find anything online, and the manual doesn't specify them either.

I'm mostly wondering about the files created by conn in the "structural" folder of my data:

c1ct1.nii
c2ct1.nii
c3ct1.nii
c4ct1.nii
c5ct1.nii
centering_t1.mat
ct1.nii
ct1_seg8.mat
...
and so on...


If you know what any of these files are, I'd be very happy about your help!

Specifically: Are any of the newly created files the estimated GM/WM/CSF masks?

I use the default preprocessing pipeline.


Thanks!

Cheers,
Sascha

Error in batch processing the NYU dataset

Hello,

I am trying to run the conn_batch_workshop_nyudataset script from conn 15g, which just happens to be the last one I downloaded. I am getting the following error after the downloading and extraction portions of the code:

Reference to non-existent field 'Setup'.

Error in conn_batch (line 364)
else SUBJECTS=1:CONN_x.Setup.nsubjects;
Error in conn_batch_workshop_nyudataset (line 104)
conn_batch('Setup.RT',2)

I looked at those lines and attempted to trace back through the code, but everything looks like it should be fine. I even went into the setup wizard code to verify that a default value is being set for Setup.nsubjects.

I'm unsure what else it could be.

Thanks in advance for the help,
Kenneth

RE: registration problem

Dear Alfonso,

I have been trying to follow your suggestions. It took a while (59 7T datasets take some time :)), but now I noticed that I do not have swar files, but swau files.
I have been trying to figure out the SPM prefix system, and the 'r' seems to indicate "registration".
Do you know why I don't have swar files?

I have chosen the default preprocessing with the extra "functional coregistration to structural" step.
I checked the folder with the data and there is also no swar file....

Many thanks
Heidi

Controls vs patients contrast

Hi,

I have a set of 10 controls, 10 patients in category 1, and 10 patients in category 2. I am interested in comparing differences in functional connectivity between these groups using CONN.

I defined 4 second-level covariates in the Setup step as follows:
controls as 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
category1 as 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 
category2 as 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1
category12 as 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1

Now, in the second-level analysis step, I am getting very different results for the following two comparisons:

(1) For controls > category12, I selected controls and category12 and defined the contrast as 1 -1
(2) For controls > category1 and category2, I selected controls, category1 and category2 and defined the contrast as 1 -1 -1

Shouldn't both of these contrasts give me identical results, since in (2) I am averaging over category1 and category2, which should be equivalent to category12?
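
To make my confusion concrete, here is a small simulated example of what I think each contrast computes with these 0/1 covariates (my understanding only, not verified against CONN):

% Small simulated example (my understanding only): with 0/1 group covariates,
% each beta is the corresponding group mean, so the two contrasts below test
% different quantities.
rng(0);
y = [randn(10,1)+1; randn(10,1); randn(10,1)];          % 10 controls, 10 cat1, 10 cat2
ctrl  = [ones(10,1);  zeros(20,1)];
cat1  = [zeros(10,1); ones(10,1);  zeros(10,1)];
cat2  = [zeros(20,1); ones(10,1)];
cat12 = cat1 + cat2;
bA = [ctrl cat12]     \ y;     % betas: [mean(controls); mean(all 20 patients)]
bB = [ctrl cat1 cat2] \ y;     % betas: [mean(controls); mean(cat1); mean(cat2)]
[1 -1]        * bA             % controls minus the pooled patient mean
[1 -1 -1]     * bB             % controls minus the SUM of the two patient means
[1 -0.5 -0.5] * bB             % controls minus the AVERAGE of the two patient means (matches the first value here)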

I would really appreciate any help.

Thanks.

colour tables for imported atlas

Dear Alfonso,

I have right and left colour tables for an atlas I am using to do native-space ROI analyses (right atlas attached). How do I ensure that the ROI parcellation image is recognised as containing 72 individual parcels?

Is there anything I need to do with regard to my batch script?
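
For reference, this is how I currently guess the atlas would be declared in a batch script (field names as I understand them from the conn_batch help; file names are placeholders, and I assume the ROI names are picked up from a sidecar text file with the same base name, e.g. myatlas.txt with one label per line):

% Rough conn_batch sketch (placeholder file names): declare the parcellation
% as a single ROI file containing multiple labels, so that each of the 72
% integer values becomes its own ROI.
clear batch;
batch.filename = 'conn_project.mat';                   % existing project (placeholder)
batch.Setup.rois.names{1} = 'myatlas';
batch.Setup.rois.files{1} = '/path/to/myatlas.nii';    % placeholder parcellation image
batch.Setup.rois.multiplelabels(1) = 1;                % treat the image as an atlas with many parcels
conn_batch(batch);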

Thanks so much for your help.

bw

Peter

Incompatible Condition Names

When trying to complete the second-level analysis, several subjects within the group generate an error about incompatible condition names. In the MATLAB files there are some invisible condition names, apparently deleted names whose slots have been kept. Is there a way to reconcile the condition names so that the files can be merged for the second-level analysis? Thanks!

Error during Preprocessing

Dear all,

I am working with conn 16.b in Matlab R2015b. Rather late into the preprocessing (of a large number of subjects), I received the following error message:



ERROR DESCRIPTION:

Error using MATLABbatch system
Job execution failed . The full log of this run can be found in MATLAB command window, starting with the lines (look for the line showing exact #job as displayed in this error message)
---------------
Running job #1
---------------

CONN v.16.b
SPM12 + ArtRepair DEM FieldMap MEEGtools wfupickatlas
Matlab v.2015b
storage: 8939.1Gb available


In the Matlab command window:

------------------------------------------------------------------------
Running job #1
------------------------------------------------------------------------
Running 'Normalise: Estimate & Write'

SPM12: spm_preproc_run (v6217) 20:59:39 - 15/09/2016
========================================================================
Completed : 21:03:11 - 15/09/2016

SPM12: spm_preproc_run (v6217) 21:05:01 - 15/09/2016
========================================================================
Failed 'Normalise: Estimate & Write'
Error using fclose
Invalid file identifier. Use fopen to generate a valid file identifier.
In file "/Path_to_SoftwareFolder/BrainNetViewer_20150807/BrainNet_spm8_files/@nifti/private/write_hdr_raw.m" (v2237), function "write_hdr_raw" at line 71.
In file "/Path_to_SoftwareFolder/BrainNetViewer_20150807/BrainNet_spm8_files/@nifti/create.m" (v1143), function "create_each" at line 27.
In file "/Path_to_SoftwareFolder/BrainNetViewer_20150807/BrainNet_spm8_files/@nifti/create.m" (v1143), function "create" at line 15.
In file "/Path_to_SoftwareFolder/spm12/spm_preproc_write8.m" (v6137), function "spm_preproc_write8" at line 448.
In file "/Path_to_SoftwareFolder/spm12/spm_preproc_run.m" (v6217), function "run_job" at line 143.
In file "/Path_to_SoftwareFolder/spm12/spm_preproc_run.m" (v6217), function "spm_preproc_run" at line 41.
In file "/Path_to_SoftwareFolder/spm12/config/spm_run_norm.m" (v5700), function "normalise" at line 72.
In file "/Path_to_SoftwareFolder/spm12/config/spm_run_norm.m" (v5700), function "spm_run_norm" at line 23.
The following modules did not run:
Failed: Normalise: Estimate & Write




Now it seems to be stuck at "Performing functional Normalization Please Wait..." (or, if it's not stuck, it has become very slow; in any case, nothing seems to have continued after the error).

Any ideas as to what happened here and what I can do about it?


Cheers,

Sascha

Error in Results Explorer with Monkey atlas

Dear CONN community,

I am doing seed-to-voxel and ROI-to-ROI connectivity analyses with macaque data.

I followed some other threads on this help forum and changed the background anatomical image and background atlas image to the monkey template that I am using. 

There are two main errors:

1. The first seems to be just with the display. In the GUI one can hover over points on the brain and it will give the coordinates and name of that atlas section. This only works for one side of the brain. When I try to look at the other side, it will sometimes give the information, and sometimes does not. It always gives this error in my matlab window:

Error while evaluating Figure WindowButtonMotionFcn
Index exceeds matrix dimensions.
Error in conn_menu>conn_menubuttonmtnfcn (line 902)
if v>0, txt=CONN_gui.refs.rois.labels{v}; else txt=''; end
Error in conn_menu>@(varargin)conn_menubuttonmtnfcn('volume',gcf,h.h1,h.h2,h.h6a,h.h2c) (line 292)
conn_menumanager('onregion',h.h6a,1,position,h.h2,@(varargin)conn_menubuttonmtnfcn('volume',gcf,h.h1,h.h2,h.h6a,h.h2c));
Error in conn_menumanager>conn_menumanager_mousemove (line 644)
if nargin>0, feval(varargin{:}); end
Error in conn_menumanager (line 192)
if cond, anymtn=true; conn_menumanager_mousemove(CONN_MM.onregionmotioncallback{n0}); end
Error while evaluating Figure WindowButtonMotionFcn

2. The second is that I cannot use the results explorer. CONN successfully calculated all of the results I wanted, but I cannot view them there. I can view them in the 'results preview' (although problem 1 is present, I can at least see the results). The results explorer gives this error:

ERROR DESCRIPTION:

Index exceeds matrix dimensions.
Error in conn_vproject>vproject_display (line 1161)
clusternames{n1}.uvd={refsrois.labels{1+uv}};
Error in conn_vproject (line 570)
[bnew,txt,xyzpeak,clusters,clusternames,ithres,cluster_stats,doneparam,DATA.SIM]=vproject_display(bnew,threshold,thres,mat,DATA.side,DATA.d,DATA.peakFDR,DATA.parametric,DATA.SIM,DATA.spmfile,init);
Error in conn_display (line 128)
conn_vproject(param,nonparam,[],[],{.001,1,.05,3},[],[],[],.50,[],SPMfilename,voxeltovoxel);
Error in conn_process (line 3953)
varargout{1}=conn_display('SPM.mat',1);
Error in conn_process (line 46)
case 'results_voxel', [varargout{1:nargout}]=conn_process(16,varargin{:});
Error in conn (line 5969)
conn_process('results_voxel','readsingle','seed-to-voxel');
Error in conn_menumanager (line 119)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.16.b
SPM12b + Anatomy DEM FieldMap MEEGtools deface
Matlab v.2014b
storage: 1703.3Gb available
Warning: Contents.m overloaded by version in folder /Applications/spm12b/external/fieldtrip
Warning: spm_get_data.m overloaded by version in folder /Applications/spm12b/toolbox/Anatomy
Warning: spm_mip_ui(Backup).m overloaded by version in current folder (/Volumes/Data2/Caitlin/Master Thesis/Analysis/6_monkeys_batch/results/secondlevel/bivariate_correlation/AllSubjects/rest/L_vAIC)
Warning: spm_platform.m overloaded by version in folder /Applications/spm12b/toolbox/Anatomy
SUGGESTIONS:
Potential missing Setup information. Try revising Setup fields for potential missing information.

I have attached the atlas file.

Any help would be greatly appreciated!

Sincerely,

Caitlin