Channel: NITRC CONN : functional connectivity toolbox Forum: help

error help please!

Hi,

This is the third time I have run into this problem; please help! Thank you.

Computing mask... : ...done
Reslicing images... : ...done
Writing mean image... : ...done
Completed : 18:26:07 - 23/09/2021
23-Sep-2021 18:26:07 - Done 'Realign & Unwarp'
The following modules did not run:
Failed: Realign & Unwarp

ERROR DESCRIPTION:
Error using MATLABbatch system
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line showing the exact #job as displayed in this error message)
------------------
Running job #1
------------------
CONN20.b
SPM12 + DEM FieldMap MEEGtools
Matlab v.2018b
project: CONN18.b
storage: 3854.6Gb available

abcd/hcp prepro import

Hello,
I was wondering whether there is an effort underway, or scripts already available, to import preprocessing output from the HCP or ABCD pipelines into CONN (similar to the helpful fmriprep import function).
Thank you!
David

2nd Level Analysis Error Text Conn20b

Hello,

When I run my second-level analysis, it runs into an error and then gives me no output/plots. The error output suggested I post it here:

ERROR DESCRIPTION:

Error using text
Value should be a finite number greater than 0
Error in conn_vproject (line 1175)
hold(h,'on'); hf=text(.95*size(tplot,2),.95*size(tplot,1),titleclusternames,'color','k','horizontalalignment','right','fontsize',4+CONN_gui.font_offset,'parent',h); hold(h,'off');
Error in conn_display (line 325)
conn_vproject(param,nonparam,[],[],THR,side,parametric,[],[],[],.50,[],SPMfilename,voxeltovoxel,issurface);
Error in conn_process (line 5051)
fh=conn_display('SPM.mat',ncon,style);
Error in conn_process (line 60)
case 'results_voxel', [varargout{1:nargout}]=conn_process(16,varargin{:});
Error in conn (line 11497)
conn_process('results_voxel','readsingle','seed-to-voxel');
Error in conn (line 10649)
if state==2, conn gui_results_wholebrain;
CONN20.b
SPM12 + AAL3 DAiSS DEM FieldMap MEEGtools cat12 gift marsbar wfupickatlas
Matlab v.2019b
project: CONN20.b
storage: 7969.6Gb available
spm @ /usr/local/MATLAB/tools/spm12
conn @ /usr/local/MATLAB/tools/conn20b/conn

I tried to Google this, but I do not understand what I could do differently (or what I might be doing wrong). Why would the input to text not be a finite number greater than 0? How can I fix this?
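For reference, a minimal MATLAB sketch of one way this error can arise, assuming the offending argument is the 'fontsize' value (which the trace shows CONN computing as 4+CONN_gui.font_offset); everything below is illustrative, not CONN code:

    % MATLAB requires a text object's FontSize to be a finite positive number,
    % so a zero or negative computed size raises exactly this error message:
    figure; h = axes;
    try
        text(0.5, 0.5, 'label', 'fontsize', 0, 'parent', h);   % throws
    catch err
        disp(err.message)   % "Value should be a finite number greater than 0"
    end
    % If that is the cause, CONN_gui.font_offset would have to be -4 or lower;
    % with CONN's GUI loaded, the global can be inspected directly:
    global CONN_gui
    disp(CONN_gui.font_offset)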

Matlab 2019b, spm12, conn20b

Thanks!

Error extracting motion measures after scrubbing

Hi Alfonso,

I hope you're doing well!

I was wondering if you would be able to advise me on how to best proceed with the following issue.

I tried extracting the motion measures after scrubbing using the setup described in a previous response (https://www.nitrc.org/forum/message.php?msg_id=21510), and it has worked for my prior datasets. However, with this new dataset, I'm getting the following warning as soon as I hit Subject 23:

Error in conn_process (line 21)
case 'setup_conditions',conn_disp(['CONN: RUNNING SETUP STEP (condition-setup only)']); conn_process([1.5 2 5]); % propagate conditions

I checked the NIfTI files and noted that I only have 178 slices for this subject (instead of the usual 180). I selected "allow missing data" when I completed the preprocessing step. I'm wondering whether this may be the issue (or maybe not), and how best to circumvent the error message.
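If it helps to confirm the mismatch, one way to count the volumes (timepoints) SPM sees in a subject's 4D file from the MATLAB prompt (the filename below is a placeholder):

    % spm_vol returns one header struct per volume of a 4D NIfTI, so the
    % length of the returned array is the number of timepoints in the file:
    V = spm_vol('func_subject23.nii');        % placeholder filename
    fprintf('%d volumes found\n', numel(V));  % e.g. 178 here vs 180 elsewhere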


Thank you so much in advance, and sending you my best!
Maria

RE: Extremely high values of contrast estimates in SPM after using CONN preprocessed and denoised data

Hi everyone,

I am also facing the same issue of really high beta values, as described by Avantika, and it isn't getting resolved even after switching off "grand mean scaling", "global normalization", and "Serial Correlations" in SPM.

I preprocessed the data in SPM first, then denoised in CONN and used the niftiDATA files (the confound-corrected files generated by CONN) as input to SPM for the first-level analysis. The SPM second-level output looks significantly cleaner after these steps compared to the traditional approach of using only SPM-preprocessed files for the GLM analysis (and skipping denoising in CONN).
However, when I try to conduct an ROI analysis in MarsBar, I get really high beta values/parameter estimates from the denoised version of the SPM analysis. On the other hand, these values look normal for files both preprocessed and analyzed only in SPM (and without denoising in CONN).

Any ideas on what I can do?

Thanks
Sneha

Originally posted by avantika mathur:[quote]Hi Alfonso,

The Global Normalization is already set to "None" in SPM first-level. Attached is an image of the same.

Do you mean something else is supposed to be changed?

Avantika
Originally posted by Alfonso Nieto-Castanon:[quote]Hi Avantika,

I would suggest trying to switch the "grand mean scaling" option off in SPM first-level estimation, since that looks like a possible culprit for this behavior (after band-pass filtering, the mean functional data is zero at every voxel, so global signal scaling, and similarly any other default mechanism that relies on the average BOLD signal containing anatomical information/features, is likely to fail in rather unexpected ways). Let me know if that works.
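(A minimal numerical sketch of this failure mode, in illustrative MATLAB rather than CONN/SPM code; the 100/mean factor mirrors SPM's convention of scaling each session's grand mean to 100:)

    % Grand mean scaling multiplies the data by 100/mean(signal); on band-pass
    % filtered data the mean is ~0, so the factor (and every beta) explodes.
    y_raw      = 1000 + randn(200,1);         % raw BOLD-like signal, mean ~1000
    y_filtered = y_raw - mean(y_raw);         % filtered data: mean ~0
    scale_raw      = 100/mean(y_raw)          % ~0.1, a harmless rescaling
    scale_filtered = 100/mean(y_filtered)     % astronomically large (divide by ~0)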

Best
Alfonso
Originally posted by avantika mathur:[quote]Hi CONN users,

After following this post:
https://www.nitrc.org/forum/message.php?msg_id=12021

I used the alternative method of importing CONN-preprocessed data into SPM, namely entering the preprocessed/denoised timeseries into SPM to perform the first-level analyses.

The data I am analyzing are from children; thus, ART was used at a liberal threshold in preprocessing [global signal z-value threshold 10, subject motion 5 mm]. I did not enter the "effect of Condition X" as a confounding effect during denoising.
I used the file generated after CONN preprocessing and denoising (niftiDATA_Subject001_Condition000), further defined the first-level design matrices within SPM, and set the masking threshold to -Inf in the first-level analysis [https://www.nitrc.org/forum/message.php?msg_id=14852].

After running the first-level and group-level analyses [10 subjects], I get weird beta estimates, which are extremely high. Attached are the corresponding bar plots [1st bar chart: single subject; 2nd bar chart: group of 10 subjects]. Beta values should not be this high.

Can someone point out where I am going wrong?

Avantika[/quote][/quote][/quote]

RE: Undefined function 'spm_inv' for input arguments of type 'double'.

Bumping this again, as I have yet to determine the root of these issues; I have not been able to move forward for a week and a half.

art_regression_timeseries vs 1st-level scrubbing covariate

Hi Alfonso and CONN users,

I'm trying to calculate, for each subject, the following:

1. #volumes with global signal change > 3 SD
2. #volumes with FD > 0.5 mm
3. #volumes scrubbed based on the criteria listed above (GS change > 3 SD OR FD > 0.5 mm)

I want to use this information to identify and exclude any participant with > 20% volumes that are scrubbed.

My understanding is that this information is created and saved as outputs during preprocessing. If I open the file labeled "art_regression_timeseries", I believe I get the GS change and FD for each volume of each run. When I open a file called "scrubbing_art_regression_timeseries", it looks as though it indicates the volumes that have been scrubbed in the given run. I assumed that this scrubbing variable was based on whatever movement thresholds I had set.

The issue is that when I compare the "scrubbing" file to the "art_regression" file, the number of volumes scrubbed is greater than the number of volumes that exceed the 3 SD and 0.5 mm thresholds.

For example, for subject 1, when I open the "art_regression_timeseries" file, the first few GS changes and FD values are:

0           0
0.477     0.053
0.0827   0.0787
3.3294   0.1097
1.785     0.084

In the same subject's "scrubbing_art_regression_timeseries" file, the values are:

0           0
0           0
1           0
0           1
0           0

Why are 2 volumes scrubbed here, if according to the art regression file, there is only one volume that exceeds GS change or FD values of 3 or 0.5 mm, respectively? Are volumes scrubbed for any other reasons?

Finally, when I look at "QC_InvalidScans", it appears to give me the number of volumes that were dropped across all sessions for a subject. Is there a way to calculate this information per session?
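In case it is useful, here is a hedged MATLAB sketch of how I imagine these counts could be computed for one run, assuming the art_regression_timeseries file stores its two columns (GS change, FD) in a matrix R; the filename and variable name are assumptions:

    % Count volumes exceeding each scrubbing criterion for a single run.
    d  = load('art_regression_timeseries_subject001_run1.mat');  % assumed name
    gs = d.R(:,1);                        % global-signal change (SD units)
    fd = d.R(:,2);                        % framewise displacement (mm)
    n_gs  = nnz(gs > 3);                  % volumes with GS change > 3 SD
    n_fd  = nnz(fd > 0.5);                % volumes with FD > 0.5 mm
    n_any = nnz(gs > 3 | fd > 0.5);       % volumes flagged by either criterion
    pct   = 100*n_any/numel(gs);          % percent scrubbed, for the 20% cutoff
    fprintf('GS>3: %d, FD>0.5: %d, either: %d (%.1f%%)\n', n_gs, n_fd, n_any, pct);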

Thank you!

First level co-variates not there

Hello community! 

I am new to CONN and to fMRI analysis in general. For this project I am using already-preprocessed data. After importing everything into CONN, I realized that, since the data are already preprocessed, no first-level covariate was populated; it just says "enter covariate here".
I'm confused as to what my next steps should be. Thank you for your help!

P.S. See attached!

-Alex-

Error while running CONN

Dear Alfonso and CONN community,

I tried running CONN for around 30 AD subjects, and while it was unzipping the NIfTI files MATLAB showed me the following error. May I know what this error means?

ERROR DESCRIPTION:

Error using spm_slice_vol (line 32)
spm_slice_vol.c not compiled - see Makefile
Error in spm_read_vols (line 34)
Y(:,:,p,i) = spm_slice_vol(V(i),spm_matrix([0 0 p]),V(i).dim(1:2),0);
Error in conn (line 773)
CONN_gui.refs.canonical=struct('filename',filename,'V',V,'data',spm_read_vols(V));
Error in conn_batch (line 690)
conn init; % initializes CONN_x structure
Error in conn_program_for_whole_dataset (line 180)
conn_batch(batch);
CONN20.b
SPM12 + DAiSS DEM FieldMap MEEGtools
Matlab v.2021a
project: CONN20.b
spm @ /Users/padmapriyavijayakumaran/spm12
conn @ /Users/padmapriyavijayakumaran/conn

and in the MATLAB command window I had this:

unzipping gz files...Error using matlab.io.internal.archive.core.builtin.uncompressgz
An internal error has occurred.

Error in gunzip (line 92)
names = matlab.io.internal.archive.core.builtin.uncompressgz({entries.file}, outputDir);
Error in conn_gz2nii (line 12)
filename(redo)=gunzip(filename(redo));
Error in conn_getinfo (line 13)
filename=conn_gz2nii(filename);
Error in conn_file (line 11)
[nV,str,icon,filename]=conn_getinfo(filename,doconvert);
Error in conn_batch (line 913)
else CONN_x.Setup.structural{nsub}{nses}=conn_file(temp{min(numel(temp),nses)});
Error in conn_program_for_whole_dataset (line 180)
conn_batch(batch);
An error was encountered while saving the command history
Please visit http://www.nitrc.org/forum/forum.php?forum_id=1144 if the page is not automatically loaded
>>

Can anyone help me? Initially I thought it was a lack of space, so I deleted most of my old files, but I still get this error.
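For reference, the "spm_slice_vol.c not compiled" message is the error SPM's .m stub throws when the platform-specific compiled MEX binary cannot be found, so a hedged first check from the MATLAB prompt is to see what MATLAB actually resolves:

    % spm_slice_vol ships as a compiled MEX file; the .m stub only throws the
    % "not compiled - see Makefile" error when the binary is missing/shadowed.
    which spm_slice_vol -all   % should list a .mex* binary, not only the .m stub
    mexext                     % the MEX extension expected on this platform
    % If only the .m file appears, reinstalling SPM12 from the official archive
    % (or recompiling via the Makefile in spm12/src) should restore the binary.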

Cerebellum ROI analysis in CONN

Hi all,

I am brand new to CONN Toolbox, and I'm currently making my way through the documentation and tutorials. 

I have manually traced the dentate nuclei in a group of patients with cerebellar ataxias, and I want to run a seed-to-voxel resting-state functional connectivity analysis in CONN.

I have a couple of questions that I haven't been able to figure out from the documentation. First, will CONN accept ROIs in native space, or do they have to be normalised to MNI space before importing them? I am also contemplating whether normalisation to SUIT space is required for accurate registration of the cerebellum. Would anyone have some tips on what the workflow would look like? For example, would I need to normalise my images to SUIT space first (which normalises and segments the cerebellum and then crops the cerebellum out of the rest of the brain) and then use those images as my functional images for denoising, or would I be better off doing all my preprocessing first and then using CONN just for the extraction of my ROI time series and their functional connectivity with the rest of the brain?

Thank you in advance,

Rebecca

RE: Analysis of interaction of categorical variables

Dear Mayron,

Yes, exactly. In general, dichotomous factors (defining two groups in your data) can be treated exactly the same way as continuous factors when defining GLM analyses (this is not true for factors that define more than two groups). For example, if you have a "group" factor defining four groups (HYPER, HYPO, EUPH, HC) and a "sex" factor defining two groups (MALE, FEMALE), the following two analyses are equivalent (both testing the group*sex interaction):

1) subject effects: [HYPER*MALE HYPO*MALE EUPH*MALE HC*MALE HYPER*FEMALE HYPO*FEMALE EUPH*FEMALE HC*FEMALE]
    between-subjects contrast: d2 x d4  (i.e. [1 -1 0 0 -1 1 0 0;0 1 -1 0 0 -1 1 0;0 0 1 -1 0 0 -1 1])

and 2) subject effects: [HYPER HYPO EUPH HC HYPER*SEX HYPO*SEX EUPH*SEX HC*SEX]
     between-subjects contrast: [0 1] x d4 (i.e. [0 0 0 0 -1 1 0 0;0 0 0 0 0 -1 1 0;0 0 0 0 0 0 -1 1])

where variables like HYPER*MALE, HYPO*MALE, etc. are simple interaction terms (e.g. HYPER*MALE is 1 for subjects that are HYPER & MALE and 0 for everybody else; you may simply define these variables in CONN's Setup.Covariates 2nd-level tab by selecting HYPER/HYPO/EUPH/HC/MALE/FEMALE and then clicking on the 'Create interaction of selected covariates' menu), and the variable SEX in the second analysis can be defined in any meaningful way (e.g. -1/+1 values, 0/1 values, etc.), all resulting in the same statistics (and also the same as in analysis (1) above). Of course, approach (1) can easily be extended to covariates with more than two levels while approach (2) cannot, but in the case of dichotomous factors both approaches work perfectly fine.

Also, just for reference, if you go ahead and use CONN's 'create interaction of selected covariates' menu to define all those covariates in case (1) above, then in the second-level analysis tab you should be able to find, in the 'choose analysis description' menu, an entry that reads:

   'does the connectivity differences between HYPER, HYPO, EUPH and HC subjects depend on MALE/FEMALE-status'

which will automatically fill out the proper values for analysis (1) above, evaluating the group*sex interaction.
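(As a side illustration, a hedged MATLAB sketch rather than CONN code: the interaction covariates above are element-wise products of indicator vectors, and the d2 x d4 contrast is a Kronecker product; the six-subject indicator vectors below are made up.)

    % Interaction covariates as element-wise products of 0/1 indicators:
    HYPER = [1 1 0 0 0 0]';            % toy example with 6 subjects
    MALE  = [1 0 1 0 1 0]';
    HYPER_x_MALE = HYPER .* MALE;      % 1 only for subjects who are HYPER & MALE
    % The d2 x d4 between-subjects contrast, reproducing the 3x8 matrix above:
    d4 = [1 -1 0 0; 0 1 -1 0; 0 0 1 -1];   % differences among the 4 groups
    C  = kron([1 -1], d4)                   % = [d4 -d4]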

Hope this helps
Alfonso
Originally posted by Mayron Pereira Picolo Ribeiro:[quote]Dear Alfonso,

Thank you for all the support offered here!
We used CONN to investigate fALFF and SBC in four different groups: hyperphagic MDD, hypophagic MDD, euphagic MDD, and healthy controls (HC). Because data were collected at four sites, we included site as a covariate, as well as sex (due to differences between groups in sex).
We received a request from a reviewer to explore sex in a moderation analysis given the sex differences between groups, so we would like to test the group*sex interaction for fALFF and SBC in our sample using CONN.
We saw a post in which you explain this process using continuous variables (https://www.nitrc.org/forum/forum.php?thread_id=11287&forum_id=1144), but it is not very clear to us how to do that with categorical variables. We thought of adding separate 2nd-level covariates to Setup, including each group and each group separated by males and females (e.g., hyper females, hyper males, hypo females, hypo males, euphagic females, euphagic males, HC females, HC males), and setting up the between-subjects contrast in 2nd-level results to explore the interaction. Would this be a way to explore this interaction? How exactly could we set up the main effect of group and the group*sex interaction?

Thank you so much!

Mayron[/quote]

RE: matc2nii using standalone installation of CONN

Hi Paul,

The .matc format expects to find an associated .mat file with the same name and in the same folder, containing the header information for these data (similar to the old .img/.hdr file pairs). Could it be that you have moved these .matc files from a different location and are missing the associated .mat files?
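One hedged way to check this from the MATLAB prompt (the path is the placeholder from your message, and this assumes conn_matc2nii accepts a filename argument, as your command line suggests):

    % Each .matc data file should sit next to a .mat header with the same base
    % name; dropping the trailing 'c' gives the expected header path:
    f   = '/path/to/data/DATA_Subject001_Condition000.matc';
    hdr = f(1:end-1);                        % '...Condition000.mat'
    assert(exist(hdr,'file')==2, 'missing header file: %s', hdr);
    conn_matc2nii(f);                        % should succeed once both files exist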

Best
Alfonso
Originally posted by Paul Thomas:[quote]Hello all,

I would like to convert some preprocessed/denoised time series from .matc to .nii file formats. I do not have access to a Matlab license, so I am using the standalone version of CONN with MCR (the GUI works so I am assuming that the installation was successful). I've tried to use the "run_conn.sh" shell script to run the "conn_matc2nii" function from command line (MacOS) via: './run_conn.sh matc2nii "/path/to/data/DATA_Subject001_Condition000.matc"'. I get the following error:

"Error using load
Unable to read MAT-file /path/to/data/DATA_Subject001_Condition000.mat. File might be corrupt.

Error in conn_vol (line 3)
Error in conn_matc2nii (line 28)
Error in conn (line 9907)
CONN19.b
SPM compiled
Matlab v.2019b
project: CONNundetermined
ans =
logical
1"

Where it looks like the file path string has the last character ('c') removed and CONN therefore tries to open the .matc file as a .mat file. I've tried several combinations of using 'conn_matc2nii' instead of 'matc2nii', removing quotes from the file path, encapsulating the entire argument in quotes, etc. I am also fairly confident that the .matc files I have are not corrupt. If anyone has any advice on how to do this I would greatly appreciate it!

Thanks!

Paul[/quote]

RE: REX data issue

Dear Alfonso,
When you get a chance, can we please revisit this issue?
Thanks,
Haixia
Originally posted by haixia zheng:[quote]Dear Alfonso,
I used the 'Plot effects' button and enabled the REX GUI to extract the data. I have tried the 'import data' button too; that gave me the same data as the REX GUI. Placebo/200/600 are three different within-subject conditions. Because this was a cross-over design, 20 subjects visited 3 times for different doses. Please see the attached PDF file, which shows screenshots of all the settings.
Thank you very much for your help.
Haixia Zheng
Originally posted by Alfonso Nieto-Castanon:[quote]Dear Haixia Zheng,

Could you please provide a few more details on how exactly the data were extracted using REX (e.g. did you use the 'import data' button and then export the resulting covariates to a file, or did you use the 'export mask' button and then use REX manually to extract the original data?), and also about your second-level analyses (e.g. are placebo/200/600 three different subject groups or three different within-subject conditions?)

Thanks
Alfonso
Originally posted by haixia zheng:[quote]Dear Alfonso and community members,
I ran some seed-to-voxel analyses using CONN and identified a significant cluster. CONN's REX showed a nice bar plot, but when I extracted the data from REX and plotted it in R, it looked very different from the plot given by REX (please see the attached image: https://www.nitrc.org/Users/hzheng/Documents/IBU/Manuscript_VIA/REXdata.jpg). I wonder how the plot given by REX was generated, and why it looks so different from the R plot using the data extracted from REX.
Also, when I ran the statistical test in R using these extracted data, there was no significant effect at all. Did I do anything wrong?
Any input or guidance is much appreciated!
Thank you[/quote][/quote][/quote]

RE: MAT to nifti format

Hi Alfonso,

Is it possible to use the matc2nii command to convert a .matc file to NIfTI if ONLY the .matc file is available, or does this command also require the presence of the associated .mat file? Further, even if the .matc and .mat files are available in the same directory, does the matc2nii command require that these files be in the directory structure created by CONN during the earlier preprocessing steps?

I can currently only get subsets of the files created by CONN, as they are stored on a departmental server on which I cannot run CONN, and I likely cannot store all of the data locally.

Thank you,

Paul

RE: 2nd level analysis (seed to voxel) display result shows error and fails to compute connectivity maps

Hi, thanks for pointing out the problem. However, I tried the same thing with CONN 18b, and it worked nicely and generated connectivity maps. So I was wondering whether it is an issue with CONN 20 that it can't connect with SPM properly?

RE: CONN Preprocessing very slow

Dear community,

I'm facing a similar issue.

In my case, I have 1 session for 75 subjects, each with 1200 volumes.

It seems an error using spm_bsplinc came up, but CONN kept running.

Will this affect the preprocessing results? Should I stop the preprocessing and reinstall SPM12 and CONN?

I'm using MATLAB R2021a and the latest stable versions of SPM12 and CONN.

Please find the Command Window output attached to this message.

Thank you in advance.

Francisco
Originally posted by emmanuelle_b:[quote]Dear community,

I am currently running a default preprocessing pipeline in CONN (minus the slice-timing step, which I removed), with 24 subjects * 5 sessions * ~750 functional volumes.
My issue is, 12 hours after launching preprocessing, it looks like I'm still at the Realignment step, and for Subject 2 only...

A similar preprocessing pipeline on the same dataset, launched from SPM without using the CONN toolbox, takes about 1 hour per subject (of which ~15 minutes are for the realignment step, if I remember correctly), so I'm not sure why CONN preprocessing is taking so long here...

(Eventually, I want to use CONN for its denoising/CompCor method.)


Many thanks in advance for your help!

Emma[/quote]

RE: Error extracting motion measures after scrubbing

Excellent - thank you so much, Alfonso!

Have a wonderful afternoon, and sending you my best,
M

Error with SPM

Dear Alfonso and CONN community,

While running the CONN toolbox on Alzheimer's data, I experienced the following error. It ran well for 14 of 30 subjects; then, while subject 15 was running, I got this error:

The following modules did not run:
Failed: Segment
Failed: Image Calculator
Failed: Image Calculator

Error using MATLABbatch system
Job execution failed. The full log of this run can be found in MATLAB command window, starting with the lines (look for the line showing the exact #job as displayed in this
error message)
------------------
Running job #1
------------------
Does anyone know what this implies? Previously I had a permission error for SPM12; I modified the permissions according to instructions from the internet, and then this happened.

Thank you

Display slice viewer with MNI boundaries

Hey CONN team!

I am working on a pediatric dataset, so I am using a pediatric TPM instead of the default TPM. However, after preprocessing, when I check the registration of the structural image via 'Display slice viewer with MNI boundaries', the overlay boundary seems to be smaller than the structural image and does not cover the whole brain. What should I do here?

I would be very grateful if someone can provide any inputs or suggestions.

Functional images cut-off after realignment

Hi,

I am trying to run an analysis with 8 sessions per subject. After realignment, parts of the brain tend to be cut off. I noticed that this does not occur when I preprocess a single session separately. Is there any known cause of or solution for this?

On a (likely) related note, is it possible to force motion correction to be limited to each session? The default pipeline seems to realign all sessions to each other; I'm using session-specific structurals and would like to use the normalization to align the sessions instead.

Thanks in advance,

- Rik