Channel: NITRC CONN : functional connectivity toolbox Forum: help

Missing parallelization option?

Hello Alfonso and CONN experts,

I recently switched from the 17.f standalone to the 18.b standalone and I've noticed some great improvements, especially in the interface speed, so thank you for all the work you have put into this!

I have a question about the HPC setup in 18.b. Back in 17.f I had a setup (with PBS/Torque) that worked both in the configuration tests and for processing steps. In 18.b I noticed that while the PBS/Torque option still shows in the configuration menu (Tools > HPC Options > Configuration) and can be tested successfully, it no longer appears as an option in the dropdown for processing steps (see screenshot). Since no one else has mentioned this, I presume it is a glitch with my specific installation. Do you know if there is a way to restore this option? Parallelization has been a life-saver for the dozens of subjects I've had to process, and it would be great to have it back.

Thanks!

RE: How to erase sessions

Hi Nicolas,

This is not part of the GUI but there is a small function (conn_removesession) that allows you to manually remove specific sessions from specific subjects. The syntax is:

  conn_removesession( sessions, subjects )

For example, if you want to remove sessions 2 and 3 from subjects 1, 2, and 3, you would use the syntax:

  conn_removesession([2 3], [1 2 3])

If the changes look OK to you in the GUI, simply save the project to keep them. Also note that this function will update the lists of functional files, structural files, session-specific ROI files, condition onset information, and first-level covariate files to remove the excluded sessions from there, but you will still need to propagate those changes (i.e. re-run the Setup/Denoising/First-level Analysis steps) if those steps have already been run.
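For reference, a minimal MATLAB sketch of this workflow (assuming a CONN project is already loaded in the GUI; the session/subject indices are just the example values from above):

  % Remove sessions 2 and 3 from subjects 1, 2, and 3 in the currently
  % loaded CONN project; then review the changes in the GUI and save the
  % project from the GUI to keep them.
  conn_removesession([2 3], [1 2 3]);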

Hope this helps
Alfonso
[i]Originally posted by Nicolas Echeverria:[/i][quote]Hello conn experts,

My issue is that I want to erase sessions from every patient, though I do not want to erase certain specific sessions.

That is, I have plenty of sessions, as the experiment I was running required them, but I need the results of comparing some of those sessions as soon as possible, and I need to enter the first-level covariate data for all of the sessions.

The conditions I defined are not related to the sessions I want to erase.

In short, there are only some sessions I want to keep, but I cannot erase the rest without also erasing those.

Is there any way I can erase only some specific sessions?

Thank you very much. This might sound trickier than it actually is.

Nikolas[/quote]

RE: First Level Not Displaying Data

Dear Nicole,

In gPPI analyses the seed/source timeseries (physiological effects) are the same irrespective of the choice of conditions; the multiple conditions appear instead in the multiple psychological-effect timeseries as well as in the multiple psychophysiological-interaction timeseries. To display these in CONN, simply click in the first-level analysis tab (in the same screenshot that you attached) on the option that reads '[i]Source timeseries[/i]' and change it to '[i]First-level analysis design matrix[/i]'. If you have 4 conditions, for example, you should now see 9 timeseries / rows in the displayed design matrix: the first will be the source timeseries (physiological term), the next 4 will be the condition timeseries (psychological terms), and the last 4 will be the interaction timeseries (PPI terms). As you click on the different conditions you will see, highlighted in red, the individual PPI timeseries that will be used for modeling that selected condition-specific effect.
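For illustration, a conceptual MATLAB sketch of how such a 1 + 4 + 4 = 9-column gPPI design matrix is assembled (this is not CONN's internal code; phys and psych below are placeholder data standing in for the denoised seed timeseries and the hrf-convolved condition regressors):

  T = 200;                             % number of timepoints (example)
  phys  = randn(T,1);                  % physiological term: seed timeseries
  psych = randn(T,4);                  % psychological terms: 4 condition regressors
  ppi   = repmat(phys,1,4) .* psych;   % PPI terms: element-wise interactions
  X     = [phys, psych, ppi];          % [T x 9] first-level design matrix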

Regarding the absence of a preview display in that tab, one possible reason would be that you have run only an ROI-specific pipeline, since that preview display is only available when voxel-level data are available (sorry about that). If this is the case, simply re-running the Setup and/or Denoising steps for the voxel-level data should fix the issue (i.e. make sure that the [i]seed-to-voxel[/i] line is checked in the 'enabled analyses' section of the [i]Setup.Options[/i] tab, and that the [i]seed-to-voxel[/i] line is also checked when running the Setup and Denoising steps).

Hope this helps
Alfonso
[i]Originally posted by Nicole Nissim:[/i][quote]Dear Conn Users,

I am working in version 17f and have data that was fully processed in CONN for N=28 [2 sessions (pre- and post-intervention), two task conditions (2-Back and 0-Back) in each session]. I am running gPPI ROI-to-ROI analyses (regression - bivariate). I have the data fully processed and am able to view it at the second level. However, the first level will not display data for any subject, condition, or ROI. When viewing the source timeseries for each subject, the timeseries display does not change when clicking on different conditions (screenshot example). It is not clear to me how the timeseries is being treated across the different conditions at the first level, and I would like to check over the individual subject data. Because of this, I am not clear on what values are being used in the second level. I have not received any error while processing the data. Re-analyzing and overwriting the data at the first level has not resolved the display issue.

If anyone has insight on how to resolve this, please let me know. Thank you in advance!


Best regards,
Nicole[/quote]

RE: CONN Pipeline: ART Implicit Mask Error?

Hi Jeff,

This means that ART ran into a problem when creating either the art_mask_*.nii or the art_mean_*.nii file. The former is an implicit mask of brain voxels that is not influenced by potential outlier scans (it is computed as the conjunction of the implicit masks across all non-outlier timepoints), and the latter is a mean-functional volume that is likewise not influenced by potential outlier scans (it is computed by averaging only across non-outlier timepoints); see the conceptual sketch after the steps below. In general CONN does not use the former at all, and it only uses the latter if available (otherwise CONN defaults to using the meanfunctional.nii image that is created during the realignment preprocessing step). So this error is not terribly concerning (I do not believe the added robustness of using the art_mean_*.nii image over the meanfunctional.nii image makes a big difference in general, although it can really help when there are dramatic outliers), but I would really like to know the source of the error, even if just for debugging purposes. If you do not mind, it would be helpful if you could re-run ART for a single subject, this time using the ART gui, so that we can see what is causing this error. You may do so by:

1) in the [i]Setup.Functional [/i]tab, select the '[i]Primary dataset[/i]' and click on '[i]functional tools. Apply individual-preprocessing step[/i]' (we are going to run ART on the already preprocessed data just to avoid any of your original files being overwritten)

2) in the new preprocessing window, unselect the '[i]All subjects[/i]' checkbox and select a single subject there, select the [i]'functional Outlier detection (ART-based identification of outlier scans for scrubbing)[/i]' option, and change the option that reads '[i]run process and import results to CONN project[/i]' to '[i]run process only (do not import results)[/i]' (we are doing this also to avoid CONN changing any of your current project info), and then click 'Start'

3) in the new '[i]functional outlier detection settings[/i]' window, select the option that reads 'edit settings interactively (ART gui)' and click on 'Ok'

and 4) in the main ART gui, click on 'Save' (in the bottom right corner), and then select 'Analysis mask' and 'Mean functional' sequentially. One of these should produce an error; if you could copy and paste the entire error message and send it to me, that would be very helpful.
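For context, a conceptual MATLAB sketch of what the art_mean_*.nii and art_mask_*.nii volumes represent (an illustration only, not ART's actual code; the placeholder data, outlier indices, and the 0.8-of-global-mean implicit-mask threshold are all assumptions):

  Y = rand(4,4,4,10);                        % placeholder 4D functional data [X x Y x Z x T]
  outliers = false(1,10); outliers([3 7]) = true;  % example outlier timepoints
  valid = ~outliers;                         % non-outlier timepoints
  artMean = mean(Y(:,:,:,valid), 4);         % mean functional over non-outlier scans only
  thr = 0.8*mean(Y(:));                      % assumed implicit-mask threshold
  artMask = all(Y(:,:,:,valid) > thr, 4);    % conjunction of per-scan masks across non-outlier scans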

Thanks
Alfonso

[i]Originally posted by Jeff Browndyke:[/i][quote]I was re-processing a dataset with the newest CONN version and ran across this systematic (potential?) error for all subjects:

"Warning. Error encountered during ART implicit-mask file creation. Skipping this step"

Everything else appears to work fine during the ART outlier detection process. 

Is this of concern?

Thanks,
Jeff[/quote]

RE: Leave one out

RE: How to make it faster?

Thank you Pravesh! I've just noticed that the new version of CONN (18b) has a "[i]background process (Windows)[/i]" configuration option. I was using 17f, which didn't offer that option. I think the best approach would be to update to the newest release. Would my conn project still be usable?
Many thanks!

Results: Why we get different "resultsROI_Condition001" for the same batch file including (or not) a STRUCTURAL_FILE

When processing the same data (sub1.zip) with conn18a, one batch file includes a STRUCTURAL_FILE (see sub1t.m) and the other does not (see sub2t.m); this is the only difference between the two batch files, yet we get different "resultsROI_Condition001" files (in ...\sub1test\results\firstlevel\SBC_01) for the two cases. I would like to know how the structural file affects the result. Thank you very much for your help!

ALFF

Dear CONN experts: 

I was wondering if it is possible to get raw ALFF values for each voxel/ROI. Currently, I only see test statistic values and p-values.

Thank you,
Vivek

Non-parametric stats and cluster size correction

Hello everyone,

I have a question about the non-parametric statistics: I understand these are not exactly permutation tests, so do they correct for multiple comparisons? My understanding is that they do, but since there are cluster-size corrections designed specifically for non-parametric statistics, I am wondering whether leaving the p-value uncorrected can be a valid approach, as it is for permutation tests.
On my data, if I use parametric statistics with an FDR correction for cluster size, I get a result similar to using non-parametric statistics with an uncorrected p-value (with clusters a bit bigger and sometimes observed in both hemispheres when they are not with parametric statistics).

I hope that is clear.
Thanks in advance for any response,

Lena

RE: Error in conn preprocessing ROI

Hi everyone!

I recently encountered the same problem you've all described. I found the answer in another thread:

https://www.nitrc.org/forum/forum.php?thread_id=9483&forum_id=1144

The software patch works very well. For reference, I'm using CONN 18a, SPM 12, and Matlab 14a.

Hope this helps!
Ariel

RE: Coding Main Effect/Interactions for Symptom Covariates of Multiple Groups

Just following up because I'd really appreciate any advice on this.
I'm at a bit of a standstill with analyses until I'm sure I'm using proper contrast coding.

Thanks,
Victoria

RE: Atlas.nii ROI's

[i]Originally posted by Greg Overbeek:[/i][quote]Hello,

Does anyone know if there is any information on the size, shape, and exact locations (MNI coordinates) of the ROIs in the atlas.nii file within conn? I have tried to separate the ROIs in this image into different files, but I have been unsuccessful. There is a .mat file within the results/firstlevel/[specific analysis] folder that contains a variable xyz that gives MNI coordinates for each ROI. Plotting these points gives a region that appears to be the center of the ROI. Can anyone confirm this?

Thank you in advance for your help!

Best,

Greg[/quote]
Hi Greg,

I have the same question and am wondering if you were able to confirm the answer to this.

Thanks!
Megan

RE: difference between Effect of condition in confounds and first level covariate

Dear Pedro,

Sorry, that was confusing. You are right that there might not be obvious scenarios where you would want to use the 'copy task-related conditions to covariate list' option since, as you state, those hrf-convolved condition regressors will already appear by default, simply encoded as 'effect of [condition]', during Denoising and subsequent steps, so also copying them there as first-level covariates does not seem to add much, if anything at all.

In general, anything that is defined as a [b]condition[/b] will be hrf-convolved (assuming you have specified a continuous acquisition) and the resulting timeseries will: a) be shown in the default list of potential confounding effects in the [i]Denoising[/i] tab; and b) be used as weights in the [i]first-level[/i] tab for weighted-GLM or gPPI analyses when estimating condition-specific connectivity measures. In contrast, anything that is defined as a [b]first-level covariate[/b] will: a) be shown (as is, with no hrf-convolution) in the default list of potential confounding effects in the [i]Denoising[/i] tab; b) appear as a potential seed timeseries in the [i]first-level analysis[/i] tab (only those covariates not selected as confounding effects during the denoising step); and c) appear as a potential interaction term, both in the [i]Setup.Conditions[/i] tab and in the [i]first-level analysis[/i] tab for the "other temporal-modulation effects" analysis type.
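To make the "hrf-convolved" part concrete, a minimal MATLAB sketch using SPM's canonical HRF (the TR, onsets, and duration are made-up example values, and CONN's internal implementation may differ in detail):

  TR = 2;                                  % repetition time in seconds (example)
  nscans = 100;                            % session length in scans (example)
  onsets = [10 60 110]; dur = 20;          % condition onsets/duration in seconds
  box = zeros(nscans,1);                   % condition boxcar at scan resolution
  for o = onsets
      box(round(o/TR)+1 : round((o+dur)/TR)) = 1;
  end
  hrf = spm_hrf(TR);                       % canonical hemodynamic response (SPM)
  reg = conv(box, hrf);                    % hrf-convolved condition regressor...
  reg = reg(1:nscans);                     % ...trimmed back to the session length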

There are, of course, some somewhat-convoluted scenarios where you would genuinely want to use the 'copy task-related conditions...' option, such as: a) for display purposes (for example, having the hrf-convolved conditions included in the list of first-level covariates allows you to include them in some plots, such as the QA plots, which would otherwise need to be created manually; similarly, if you want to use the 'covariate tools' gui to compute some summary measure of your conditions); or b) for more complex interaction analyses (e.g. CONN allows you to define condition*covariate interactions, so sometimes it is useful to copy some subset of conditions into first-level covariates just to be able to use the resulting timeseries as interaction terms).

In practice, though, the most common use of this '[b]copy[/b] task-related conditions...' option is as a soft way to perform the '[b]move[/b] task-related conditions to covariate list' option in two steps (i.e. first use the 'copy...' option and then, if everything looks fine, simply delete the original conditions). The 'move task-related conditions...' option is useful, as stated in the manual, when you want to perform Fair et al.-style analyses, where you still want to regress out anything that correlates with your conditions from the BOLD signal but do not want to obtain condition-specific connectivity measures. In that case, moving a condition into a first-level covariate does exactly that: it still shows you the appropriate timeseries during the [i]Denoising[/i] step, so that timeseries can still be included as a confounding effect, but it is no longer treated as a condition, so CONN does not estimate condition-specific connectivity measures.

Let me know if that clarifies
Alfonso

[i]Originally posted by Pedro Valdes-Hernandez:[/i][quote]Dear CONN experts,

I'd like to know why one would want to copy task-related conditions to first level covariates.
Aren't these regressed out during the temporal preprocessing (denoising) anyway?
The original CONN paper (2012) suggests these effects are indeed removed in the Denoising step. It appears so since the conditions are imported as confounds with the name 'Effect of...'. I guess this is done to obtain "resting state" task-independent FC measures, as in Fair et al (2007).
However, the CONN User Manual states that, in order to achieve this, the conditions must be copied to the first-level covariate list. Is this correct?
This is confusing. In a nutshell, what is the purpose of this first-level covariate list, other than to provide regressors that are not to be HRF-convolved (as in SPM)?
On the other hand, is the HRF-free regression used to remove task effects, or just the HRF-convolved conditions?
Looking forward to any comments on this.

Pedro[/quote]

Multi-session pre-post design QC_ValidScans for each separate session

Good afternoon

I have a pre-post design with resting-state scans. For both pre and post I have two resting-state sessions, so my setup in conn has 183 subjects, each with 4 sessions (2 pre and 2 post scans). Within the condition setup I then differentiate between the pre scans and the post scans. When I run ART and look at the 2nd-level covariate QC_ValidScans, I find that it contains the total number of valid scans collapsed across all 4 sessions. However, I am interested in making sure there is enough usable scan data (at least 5 minutes) for pre and for post separately.

Is there any way to separate QC_ValidScans into pre and post variants as defined by the condition setup, or, if not, is there a way to just split it by session so that I can calculate the total number of valid scans for each from there?
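In case it helps clarify what I am after, here is a rough MATLAB sketch of the per-session count I would like to compute (assuming ART's per-session art_regression_outliers_*.mat outputs, each containing an outlier-indicator matrix R; the file names below are placeholders):

  files = {'art_regression_outliers_session1.mat', ...   % placeholder names
           'art_regression_outliers_session2.mat'};
  for s = 1:numel(files)
      d = load(files{s}, 'R');                 % R: [nscans x noutliers] indicators
      nvalid = size(d.R,1) - nnz(any(d.R,2));  % scans not flagged in any column
      fprintf('session %d: %d valid scans\n', s, nvalid);
  end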

Thank you!
Daniel

RE: CONN Pipeline: ART Implicit Mask Error?

Strangely, I could not replicate the error using the instructions you provided, Alfonso.  

I'll keep an eye out for the error and post it here if it crops up again.

Warm regards,
Jeff



[i]Originally posted by Jeff Browndyke:[/i][quote]Will do.  Thanks, Alfonso.

BTW - I could really use your help on the spm_crossvalidation script output explanation.  I'm not certain if you saw my email to the alfnie account.

Warm regards,
Jeff[/quote]

weird numerical labels on second level folders

Hi Experts,

I am using conn 17f, and for some of my second-level comparisons the created folders have strange numerical labels rather than the name of the contrast, which is listed correctly in the gui. This happens for some but not all comparisons.
For instance, one comparison with, say, the QIDS_sum covariate will correctly create a folder called:
OCD_groupIntercept(0).QIDS_sum(1).SITE(0)

but if I choose a different covariate, say, a variable called "MAIA_Noticing_average" (which is displayed correctly in the gui), the folder is called:
OCD_groupIntercept(0).MAIA_Noticing_av5684504775734756
(see attached screenshot)

This is not specific to these variables but happens with several others as well. I think it may have to do with the length of the covariate name: long covariate names give me the numbered folders, while short ones correctly label the folder with the contrast name. Is there any fix for this (in newer versions of conn?) or should I just shorten my variable names?


Thanks,
Emily

Setting covariates (Second-level) in 4 groups analysis

Hi Alfonso, thank you for all the help and support you're giving to the CONN community.

We're currently running a group analysis with four groups on a database of 220 subjects.
We have specified our second-level covariates as follows (here I'm reporting just 12 subjects as an example):

All_Subj  [1 1 1 1 1 1 1 1 1 1 1 1]
Group_A [0 1 0 0 1 0 1 1 0 0 0 0]
Group_B [1 0 0 0 0 0 0 0 1 1 0 0]
Group_C [0 0 1 0 0 1 0 0 0 0 0 0]
Group_D [0 0 0 1 0 0 0 0 0 0 1 1]
Age*  [12 5 7 8 12 10 13 9 5 8 5 5]
Gender [0 1 1 0 0 1 1 0 1 1 0 0]

*Since we know only the age span of each subject (e.g. 60-65 years), we've coded age as a multiple of 5 years (i.e. 55-60 = "11", 60-65 = "12", 65-70 = "13", etc.).
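(In other words, each code is the lower bound of the subject's 5-year span divided by 5; as a one-line MATLAB illustration, with hypothetical lower bounds chosen to match the example values above:)

  ageLower = [60 25 35 40 60 50 65 45 25 40 25 25];  % hypothetical span lower bounds (years)
  ageCode  = ageLower / 5                            % -> [12 5 7 8 12 10 13 9 5 8 5 5]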

We are running an ROI-to-ROI analysis on 11 ROIs and we are planning to do pairwise comparisons in order to explore connectivity differences between groups.
Unfortunately, every time we open the second-level analysis GUI, we encounter this error:

======================[quote]
ERROR DESCRIPTION:

Error using repmat
Too many input arguments.
Error in conn_menu_montage (line 71)
p1=repmat(ref(:,i1(mask)),1,1,4,4)+repmat(permute(D,[1,3,2,4]),1,nnz(mask),1,4);
Error in conn (line 9395)
[connx0,conny0]=conn_menu_montage('plotline',CONN_h.menus.m_results.xsen1n2,repmat(xyz1(:,n1),1,size(xyz2,2)),xyz2,max(.2,1/CONN_h.menus.m_results.xsen1n2(6)));
Error in conn (line 7426)
else conn gui_results_r2r;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN18.b
SPM12 + Anatomy DEM FAVBS FieldMap MEEGtools TFCE aal cat12 conn vbm5 vbm8 wfupickatlas
Matlab v.2013a
project: CONN17.f
storage: 532.1Gb available
spm @ P:\SPM\spm12_v6225
conn @ P:\SPM\spm12_v6225\toolbox\conn
[/quote]
======================

-[i]First question[/i]: have we made any error in covariate specification?

-[i]Second question[/i]: what is the "too many input arguments" error about?

-[i]Third question[/i]: if we want to compare two groups, would this design work:

- Variables: Group A; Group B; Age (covariate); Gender (covariate).
- T-contrast: [1 -1 0 0]

Should we also create a new variable "Included subjects" to specify that we're running the analysis only on a subgroup of patients?

(example:
Group_A [0 1 0 0 1 0 1 1 0 0 0 0]
Group_B [1 0 0 0 0 0 0 0 1 1 0 0]
Age [12 5 7 8 12 10 13 9 5 8 5 5]
Gender   [0 1 1 0 0 1 1 0 1 1 0 0]
Included_subjs [1 1 NaN NaN 1 NaN 1 1 1 1 NaN NaN])


Thank you immensely for the help,
Davide

Errors post-Preprocessing on a network drive

Hi,

I wanted to run 162 participants in CONN but didn't have enough space to do so locally on my desktop, so I created and ran the project from my database on a network drive. It ran through the preprocessing steps fine, with all of the proper files created, but after uploading my ROIs, setting the conditions and covariates, and hitting Done, it kept freezing on step 3 of 7 (Importing conditions/covariates).

I went back and realized that for some reason CONN had not made a folder on my network drive corresponding to the .mat file, with all the additional data, results, and sub-folders. I created those folders manually and tried again, and it started to write to those folders until it froze at step 5 of 7 (Importing ROI data). I then copied 2 subjects onto my desktop to test, and ran into the same problem, freezing at step 5.

What is CONN doing during this step? Is there something else I need to add to the folders manually to get it to finish this step before denoising? Or do I have an issue because those folders weren't created by CONN during preprocessing? Would it be better if I started over and did everything on an external drive, in case there are permission issues with the network drive?

Thanks,
Hanna

RE: Lesion Masks

Hi Giuliana,

Any luck solving this problem? I also need advice on how to modify the TPM.nii file. So far, all I have found is a paper demonstrating that masking a lesion is crucial to analyzing functional connectivity in a brain with a lesion: https://journals.sagepub.com/doi/abs/10.1177/0271678X17709198

Thanks,
Matthew

RE: Lesion Segmentation

Hi Jeff,

Any luck using lesion masks in CONN? I saw elsewhere on this forum that you need to modify the TPM.nii file, but I haven't found any sources on how to do so: https://www.nitrc.org/forum/message.php?msg_id=26112

So far, all I have found is a paper demonstrating that masking a lesion is crucial to analyzing functional connectivity in a brain with a lesion: https://journals.sagepub.com/doi/abs/10.1177/0271678X17709198

Thanks,

Matthew