Channel: NITRC CONN : functional connectivity toolbox Forum: help

RE: Batch for preprocessing with a text-only linux environment

Thank you, Alfonso!

I will try that!

kind regards and hello from Berlin :-)

Till

gPPI versus weighted GLM for Block design

Hello,

Thanks so much for the excellent posts explaining gPPI versus weighted GLM for use with block task designs! They have been very helpful! I had some further follow-up questions that have arisen from a reviewer for a manuscript for publication.

1. I chose to do a weighted GLM, but the reviewer insisted that I do a gPPI analysis instead. However, I noticed in one of your posts that weighted GLM and gPPI are both perfectly valid approaches for block task designs and are about equally prevalent in the literature. I was wondering if you know of any relevant literature that might help back up these statements?

2. I wanted to ensure that I have correctly set up my weighted GLM.

My task consists of a "trauma" block in which each trial included: a) a 60-second fixation period, b) a 30-second presentation of a trauma narrative, c) a 30-second recall period, and d) a 60-second relax period during which participants were asked to let go of the traumatic event.

*The same structure was present in the neutral block, except that the trauma narrative was replaced with a neutral narrative.

In my analysis I modeled the 30 second narrative presentation, the 30 second recall period, and the 60 second recovery period for both trauma and neutral blocks. I did not model the initial 60 second fixation period. Does this seem correct?

3. I examined both "absolute" and "relative" connectivity measures. We were specifically interested in the "recall" period, so I examined amygdala-to-whole-brain resting-state connectivity in relation to PTSD symptoms during the trauma and neutral recall periods separately (absolute connectivity), and also looked at PTSD symptoms in relation to connectivity during trauma recall relative to neutral recall (1, -1 contrast; relative connectivity). Are there merits to looking at both relative and absolute measures of connectivity, or perhaps just relative?

I really appreciate any guidance! And my apologies for the long winded post!

Emily

Mean functional connectivity

Hello,
I am attempting to analyze the mean functional connectivity of specific regions (PCC) across multiple sessions and between multiple groups. I assume this value can be found somewhere in the second-level seed-to-voxel analysis, but I can't seem to isolate that specific value; the only output I can find is the specific clusters that show significant changes with the ROI. Any information on how to find global changes in the mean FC of a specific ROI would be much appreciated. Thanks!
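
One possible way to pull a single mean-FC value per subject out of CONN, sketched here under the assumption that the first-level seed-to-voxel maps follow the usual BETA_Subject###_Condition###_Source###.nii naming in results/firstlevel (check your own project folder; the mask file name below is a placeholder, and spm_vol/spm_read_vols come with SPM12):

% Hedged sketch: average a seed-to-voxel connectivity map within a mask to get
% one mean-FC value per subject/condition. File names below are assumptions.
betaFile = fullfile('conn_project','results','firstlevel','SBC_01', ...
                    'BETA_Subject001_Condition001_Source001.nii');  % PCC seed map (assumed name)
maskFile = 'grey_matter_mask.nii';                                  % placeholder mask, same grid as the map

Vb = spm_vol(betaFile);  beta = spm_read_vols(Vb);
Vm = spm_vol(maskFile);  mask = spm_read_vols(Vm) > 0.5;
meanFC = mean(beta(mask & ~isnan(beta)));                           % global mean FC of the PCC seed
fprintf('Mean PCC connectivity: %.4f\n', meanFC);

Repeating this per subject and session gives one value per scan, which can then be compared across sessions and groups in any standard statistics package.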

how to create new ROI: hypothalamus ?

Dear all

I would like to create new ROIs of bilateral hypothalamus (not in the standard atlas tools of CONN) 

according to a research paper
https://doi.org/10.1007/s11695-019-03822-7
and the hypothalamic MRI atlas by Baroncini:
LH: x = ±6, y = −9, z = −10, with a 2 mm sphere
MH: x = ±4, y = −2, z = −12, with a 2 mm sphere
The coordinates of LH and MH are in standard (MNI) space.


1. Can I define this directly in CONN, or do I have to go via e.g. MRIcron?

2. How do I get it into CONN?

3. Can I also use these ROIs in second-level ROI-to-ROI analyses?

THANKS !!

Sven
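
A minimal sketch of how such a sphere could be built outside CONN, assuming SPM12 is on the MATLAB path (the output file name is a placeholder; the coordinates are the right-hemisphere LH values quoted above, and the same code with center = [-6 -9 -10] gives the left side). The resulting .nii can then be imported in CONN's Setup -> ROIs tab and used in second-level ROI-to-ROI analyses like any other ROI:

% Hedged sketch: write a 2 mm spherical ROI in MNI space as a NIfTI mask.
ref    = fullfile(spm('Dir'),'tpm','TPM.nii,1');   % any image defining an MNI grid
center = [6 -9 -10];                               % LH (right), MNI mm
radius = 2;                                        % mm

V       = spm_vol(ref);
[x,y,z] = ndgrid(1:V.dim(1), 1:V.dim(2), 1:V.dim(3));
xyz     = V.mat * [x(:) y(:) z(:) ones(numel(x),1)]';   % voxel -> mm coordinates
d       = sqrt(sum((xyz(1:3,:) - center').^2, 1));
roi     = reshape(d <= radius, V.dim);

Vout = struct('fname','hypothalamus_LH_right.nii', 'dim',V.dim, ...
              'dt',[spm_type('uint8') 0], 'pinfo',[1;0;0], 'mat',V.mat, ...
              'descrip','LH 2mm sphere');
spm_write_vol(Vout, double(roi));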

Error using zeros

I'm trying to run an MVPA analysis on a large sample (600 subjects, 2 sessions per subject).

After about 40 hours, I got the following error:

ERROR DESCRIPTION:

Error using zeros
Out of memory. Type "help memory" for your options.
Error in conn_process (line 2790)
c=zeros([Y1.size.Nv(slice),repmat([numel(validsubjects),numel(validconditions)],[1,2])]);
Error in conn_process (line 43)
case 'analyses_gui_vv', disp(['CONN: RUNNING ANALYSIS STEP (voxel-to-voxel analyses)']); conn_process([13],varargin{:});
Error in conn (line 6561)
else conn_process('analyses_gui_vv',CONN_x.vvAnalysis); ispending=false;
Error in conn_menumanager (line 120)
feval(CONN_MM.MENU{n0}.callback{n1}{1},CONN_MM.MENU{n0}.callback{n1}{2:end});
CONN v.17.f
SPM12 + DEM FieldMap MEEGtools
Matlab v.2018b
storage: 696.6Gb available


Any suggestions? I have 32 GB of RAM.

How to import files preprocessed in MATLAB into CONN to skip the preprocessing steps in CONN?

Hi guys, I know this is perhaps simple, but I have no idea how to import files that I have already preprocessed in MATLAB into CONN. I am new to CONN. I watched YouTube tutorials to learn it, but they mostly start from the raw data and preprocess it in CONN. I tried to import the data that had already been preprocessed, but I realized the MNI canonical didn't fit my normalized brain structure.

Thank you
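
A minimal conn_batch sketch for this situation, assuming the data are already realigned and normalized to MNI space outside CONN (all file names are placeholders, and the field names should be double-checked against "help conn_batch" for your CONN version). The key point is simply not to add any batch.Setup.preprocessing step:

% Hedged sketch: set up a CONN project directly from preprocessed images.
clear batch
batch.filename                = 'conn_myproject.mat';
batch.Setup.isnew             = 1;
batch.Setup.nsubjects         = 1;
batch.Setup.RT                = 2;                           % your TR in seconds
batch.Setup.functionals{1}{1} = 'wau_functional_sub1.nii';   % normalized 4D functional (placeholder)
batch.Setup.structurals{1}    = 'wstructural_sub1.nii';      % normalized structural (placeholder)
batch.Setup.conditions.names  = {'rest'};
batch.Setup.conditions.onsets{1}{1}{1}    = 0;
batch.Setup.conditions.durations{1}{1}{1} = inf;
batch.Setup.done              = 1;     % run Setup only; no preprocessing step is added
conn_batch(batch);

If the MNI boundary display still does not match, it usually means the images were not actually normalized to MNI space, in which case CONN's normalization step (or analyses in subject space) would still be needed.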

What is the difference between Subject and session?

What is the difference between Subject and session?

Conn18b SLURM issue

Hi,

I'm trying to analyze HCP data (n=223) and am having issues submitting jobs to our cluster (https://hprc.tamu.edu/wiki/Terra). All jobs submit and, as far as conn is concerned, stay in limbo perpetually. Slurm however returns the jobs as complete after a few seconds with no real errors. When I reopen conn, it says I have jobs running then tries to merge and nothing more happens. 

I think the issue revolves around how the info.mat file is created. I'm using default SLURM settings (except for mem and cpus). Looking through the logs after submissions fail, the JOBID variable in the GRID config is pulling my account number, not the job id. I attempted to change this by setting an env variable, but that was lost when submitting to other nodes, and needs to be dynamically set. I imagine conn looks for a jobid using one of the slurm commands and is just reading from the wrong column or something similar. Any ideas?

Error in preprocessing (Segment)

Hello,

I have tried to preprocess my data in the toolbox and this message appears:

------------------------------------------------------------------------
11-Feb-2020 17:03:26 - Running job #1
------------------------------------------------------------------------
11-Feb-2020 17:03:26 - Running 'Segment'

SPM12: spm_preproc_run (v7670) 17:03:26 - 11/02/2020
========================================================================
Segment C:\Users\Akhila\Desktop\Brain data\Nii files\functional\art_mean_au2013-02-12-17-functional-sub1.nii
11-Feb-2020 17:03:28 - Failed 'Segment'
Index exceeds the number of array elements (1).
In file "C:\Users\Akhila\Downloads\spm12\spm12\spm_maff8.m" (v7377), function "loadbuf" at line 95.
In file "C:\Users\Akhila\Downloads\spm12\spm12\spm_maff8.m" (v7377), function "spm_maff8" at line 26.
In file "C:\Users\Akhila\Downloads\spm12\spm12\spm_preproc_run.m" (v7670), function "run_job" at line 118.
In file "C:\Users\Akhila\Downloads\spm12\spm12\spm_preproc_run.m" (v7670), function "spm_preproc_run" at line 41.
In file "C:\Users\Akhila\Downloads\spm12\spm12\config\spm_cfg_preproc8.m" (v7629), function "spm_local_preproc_run" at line 474.
11-Feb-2020 17:03:28 - Running 'Normalise: Write'
11-Feb-2020 17:03:29 - Failed 'Normalise: Write'
Error using read_hdr (line 39)
Error reading header file "C:\Users\Akhila\Desktop\Brain data\Nii files\functional\y_art_mean_au2013-02-12-17-functional-sub1.nii".
In file "C:\Users\Akhila\Downloads\spm12\spm12\@nifti\private\read_hdr.m" (v7504), function "read_hdr" at line 39.
In file "C:\Users\Akhila\Downloads\spm12\spm12\@nifti\nifti.m" (v7758), function "nifti" at line 26.
In file "C:\Users\Akhila\Downloads\spm12\spm12\@nifti\nifti.m" (v7758), function "nifti" at line 97.
In file "C:\Users\Akhila\Downloads\spm12\spm12\config\spm_run_norm.m" (v7406), function "write_norm" at line 98.
In file "C:\Users\Akhila\Downloads\spm12\spm12\config\spm_run_norm.m" (v7406), function "spm_run_norm" at line 29.
The following modules did not run:
Failed: Segment
Failed: Normalise: Write
ERROR DESCRIPTION:
Error using MATLABbatch system
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line showing the exact #job as displayed in this error message)
------------------
Running job #1
------------------
CONN18.b
SPM12 + DAiSS DEM FieldMap MEEGtools
Matlab v.2019b
project: CONN18.b
storage: 15.2Gb available
spm @ C:\Users\Akhila\Downloads\spm12\spm12
conn @ C:\Users\Akhila\Downloads\conn

SBC denoised data

Dear all,

I would like to do a seed based connectivity analysis. My data is already denoised outside of CONN as we obtained multi-echo data. I am wondering if I can still use CONN, as the GUI requires completion of its own denoising step before connectivity analysis.
Is there a way to skip the denoising in CONN entirely, or should the parameters just be 'tweaked' in a way that no additional denoising is performed? If the latter is true, which denoising settings would have to be adjusted?

Best regards,
Bram
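
A hedged conn_batch sketch of the second option (letting the mandatory Denoising step run but making it effectively a no-op); the field names follow typical conn_batch usage and should be verified against "help conn_batch" for the installed version:

% Hedged sketch: run CONN's Denoising step without adding any extra denoising.
clear batch
batch.filename                  = 'conn_myproject.mat';   % existing project (placeholder)
batch.Denoising.confounds.names = {};                     % no nuisance regressors
batch.Denoising.detrending      = 0;                      % no detrending
batch.Denoising.filter          = [0 inf];                % no band-pass filtering
batch.Denoising.done            = 1;
conn_batch(batch);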

CONN New Release

Alfonso,

Congratulations on the new release and wonderfully detailed manual!  

I just wanted to take a moment to thank you for all your assistance and sage advice over the years.  Your willingness to field questions on this forum has been invaluable to many in their research efforts.  You have created a wonderful resource, and it is clear from your regular engagement that CONN is your passion.

I look forward to giving some of the new CONN features a whirl (e.g., FNC).

Warm regards,
Jeff

Export ROI-to-ROI matrices in csv format

Dear Developers, 

I really enjoy working with the CONN toolbox, so first of all thank you for that!

For some machine learning I would like to convert the per-subject ROI-to-ROI matrices (.mat) into CSV files, then flatten the upper triangle into one dimension and finally merge everything into one CSV containing all subjects. Do you have an idea of how to convert the .mat files containing the connectivity matrices into those CSV files? I have been working on this for quite some time but have not managed it.

Thanks in advance, best regards,
Hans
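
A minimal MATLAB sketch of one way to do this, under the assumption that the first-level ROI-to-ROI results follow the usual resultsROI_Subject###_Condition###.mat naming and contain an ROI-by-ROI matrix Z plus the ROI names (open one file first to confirm; the analysis folder name is a placeholder):

% Hedged sketch: flatten each subject's upper-triangle ROI-to-ROI matrix and
% stack all subjects into a single CSV (one row per subject).
resdir = fullfile('conn_myproject','results','firstlevel','RRC_01');   % placeholder
nsub   = 10;
rows   = [];
for s = 1:nsub
    f = fullfile(resdir, sprintf('resultsROI_Subject%03d_Condition001.mat', s));
    d = load(f, 'Z', 'names');               % Z: Fisher-z connectivity matrix
    n = numel(d.names);
    Z = d.Z(1:n, 1:n);                       % keep the square ROI-to-ROI part
    rows(s, :) = Z(triu(true(n), 1))';       % upper triangle, excluding the diagonal
end
writematrix(rows, 'roi2roi_all_subjects.csv');   % use csvwrite on MATLAB < R2019a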

RE: how to create new ROI: hypothalamus ?

Hello Sven -

Since you have the 3D coordinates of your ROI, you can also use the Wake Forest (WFU) PickAtlas. I created ROIs for the pre-SMA and SMA this way.

best,

Alan Francis

Conversion to double from struct

ERROR DESCRIPTION:

Error using get
Conversion to double from struct is not possible.
Error in conn (line 3897)
nsess=get(CONN_h.menus.m_setup_00{3},'value');
CONN18.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2019b
project: CONN18.b
storage: 840.1Gb available
spm @ C:\Program Files\MATLAB\spm12
conn @ C:\Program Files\MATLAB\conn

May I know how I can solve this issue? It occurs under ROI --> network. Thank you.

RE: gPPI versus weighted GLM for Block design

Thank you so much Alfonso! This was incredibly helpful and thorough! And thank you for running those simulations. I am very grateful.

I went ahead and ran a gPPI on my data to compare the results with the weighted GLM. I then looked at the trauma recall versus neutral recall contrast in relation to PTSD symptoms. For both gPPI and weighted GLM, at a height threshold of p < 0.0001 with FWE cluster correction at p < 0.05, nothing was significant (which is what I had expected based on my prior analysis). For further exploration, for comparison purposes, I set the height threshold to p < 0.05 with FWE cluster correction at p < 0.05. With both gPPI and the weighted GLM I got the same cluster in the postcentral gyrus, but with the weighted GLM I got a few additional clusters.

Also, is examining one single condition (e.g., trauma recall) the same for gPPI versus weighted GLM? Here is where I saw things start to diverge more. I had findings when doing weighted GLM but no findings with gPPI. Is this because with weighted GLM you can/are looking at absolute connectivity but with gPPI you can still only look at relative connectivity (trauma recall relative to the implicit baseline)?

I am still trying to wrap my head around absolute connectivity. What exactly are we looking at in the context of a weighted GLM? Is it not still looking at the single condition relative to baseline (or what you did not model)?

Thanks again for your guidance, it has been very helpful!


Emily

Removing ROI in batch

Hi all

does anyone know how to remove an ROI via conn_batch? I know how to add one, but not the other way round...

Thank you!

Eugenio
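
To my knowledge conn_batch has no explicit "remove ROI" call, but one hedged workaround is to re-specify the full list of ROIs you want to keep and ask CONN to replace (rather than append to) the existing list; the rois.add flag used below should be verified against "help conn_batch" for your version, and all file names are placeholders:

% Hedged sketch: drop an ROI by re-defining the ROI list without it.
clear batch
batch.filename         = 'conn_myproject.mat';           % existing project (placeholder)
batch.Setup.rois.names = {'atlas','PCC'};                % every ROI you want to KEEP
batch.Setup.rois.files = {'atlas.nii','pcc_seed.nii'};   % matching files (placeholders)
batch.Setup.rois.add   = 0;                              % 0 = replace the existing ROI list
batch.Setup.done       = 1;
conn_batch(batch);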

Plotting the values for each group (interaction)

Dear all,

I have a 2 (before intervention vs. after intervention) x 2 (patients vs. controls) design. With the salience network as a seed (for a seed-to-voxel analysis), I get a significant result for the interaction but not for the t-tests (effect of time within either patients or controls). I would like to work out what is driving this significant result: i.e., is it that performance slightly increases over time for patients while it slightly decreases for controls? Is there a way I can plot the results, with a breakdown of the values for each of the 4 groups (before-patients, before-controls, after-patients, after-controls)?

Thanks

Marie
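
One possible way to produce such a plot, sketched under the assumption that one connectivity value per scan has already been exported from the significant cluster (for example via the results explorer's "import values" option); the variables below are placeholders standing in for those exported values:

% Hedged sketch: plot group-by-time means of cluster connectivity values.
vals  = randn(40,1);                    % placeholder: one value per scan
group = [ones(20,1); 2*ones(20,1)];     % 1 = patients, 2 = controls (placeholder coding)
time  = repmat([1;2], 20, 1);           % 1 = before, 2 = after (placeholder coding)

m = zeros(2,2);                         % rows: group, columns: time
for g = 1:2
    for t = 1:2
        m(g,t) = mean(vals(group==g & time==t));
    end
end
bar(m');                                % two bars (patients/controls) per timepoint
set(gca, 'XTickLabel', {'before','after'});
legend({'patients','controls'});
ylabel('Mean cluster connectivity (Fisher z)');

Whichever group increases while the other decreases (or stays flat) in this plot is what drives the interaction.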

RE: Problem displaying surface-level analyses

Hi again Alfonso,
I re-conducted the surface-based analyses using CONN v.19.b; however, when I select the 3D display in the second-level analyses, the ROIs are still outside the brain. I associated the ROI files with subject-space functional data as you suggested.
Could you please advise on how to fix this? I am open to using another atlas file or another procedure.
Thanks,
Amy

Extracting aCompCor parameters for use in SPM

Hi Alfonso,

Is there a simple way to extract the aCompCor parameters, along with the ART parameters, through CONN? I would like to use these parameters (in 1 file? 2 files?) as regressors to include when creating a new SPM first-level model (outside of CONN).

I selected the 'create confound-corrected time series' option in the Setup -> Options tab, but I can't seem to find these files, unless I am not looking for the right file?


Thanks for your time,

Yann
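
However those parameters are exported, a hedged sketch of how they can be packaged for SPM is shown below: SPM's first-level "Multiple regressors" field accepts either a plain text file or a .mat file containing a single variable R. Both input file names and the field name R in the aCompCor file are assumptions; substitute whatever CONN actually wrote for your project (the ART outlier file is created by ART next to the functional data):

% Hedged sketch: combine aCompCor and ART regressors into one SPM-ready file.
acomp = load('aCompCor_components_sub01_ses01.mat');         % placeholder export
art   = load('art_regression_outliers_au_functional.mat');   % ART outlier regressors

R = [acomp.R, art.R];                            % [nScans x nRegressors], assumed field name R
save('spm_multiple_regressors_sub01.mat', 'R');  % select this file as "Multiple regressors"
                                                 % in the SPM fMRI model specification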

Between Subjects analysis

Hi, I am new to imaging and CONN and just want to make sure I am doing things right.

I have 30 subjects, each with just 1 session (15 seizure-free, 15 not), and I want to do a between-subjects analysis comparing resting-state connectivity between the two groups. I know that in the Setup there is an option to create two conditions for between-subjects comparisons, and then in the second-level analysis I can specify contrasts.

My first question: does it matter in which order I input my data when I later analyze them? I.e., based on how I specify the groups, I would need to input my non-seizure-free patients first and then the seizure-free ones. And thus, the conditions would look something like this:

sz free:        00000011111
non sz free: 1111110000

I am unsure how CONN would otherwise be able to identify the different groups.

Second question: to perform the between-subjects analysis, would I select my contrast as [1 -1] to analyze differences between the groups?

Hopefully that made some sense. Thanks so much!
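
On the first question, one hedged conn_batch sketch that makes group membership explicit (so it does not depend on input order) is shown below; the field names follow typical conn_batch usage and should be checked against "help conn_batch". In the second-level analysis, both covariates are then selected with a between-subjects contrast of [1 -1] (or [-1 1]) to compare the two groups:

% Hedged sketch: define the two groups as second-level covariates (1/0 indicators).
clear batch
batch.filename                    = 'conn_myproject.mat';          % existing project (placeholder)
batch.Setup.subjects.effect_names = {'SeizureFree','NonSeizureFree'};
batch.Setup.subjects.effects{1}   = [ones(1,15), zeros(1,15)];     % 1 for each seizure-free subject
batch.Setup.subjects.effects{2}   = [zeros(1,15), ones(1,15)];     % 1 for each non-seizure-free subject
batch.Setup.done                  = 1;
conn_batch(batch);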