Channel: NITRC CONN : functional connectivity toolbox Forum: help

Invalid or incomplete JSON during preprocessing

Hello!



I am new to CONN and I am using the compiled version.

I am trying to complete a preprocessing step using the preprocessing pipeline for volume-based analysis (direct normalization to MNI-space) when FieldMaps are available. When entering functional data, for each subject I am including two resting state scan files in separate sessions as my primary data set. I created a secondary data set, independent from the primary data set, where for each subject I add their .img fieldmap file (in Hz) to each session.

When asked about slice order, I chose interleaved (Siemens). When entering VDM settings, I select the secondary data set as my fieldmap location and pre-computed fieldmap file as my fieldmap type. I then enter my EPI total readout time. All other parameters are left unchanged.

When I set up the pipeline, I receive this error following a prompt that says "Preparing functional Creation of voxel-displacement map (VDM) for Susceptibility Distortion Correction":



ERROR DESCRIPTION:
Error using spm_jsonread
Invalid or incomplete JSON.
Error in conn_jsonread (line 29)
Error in conn_setup_preproc (line 2804)
Error in conn (line 1105)
Error in conn_menumanager (line 120)

CONN19.b

SPM compiled

Matlab v.2019b

project: CONN19.b

storage: 848.8Gb available

Following the error message chain above, in conn_setup_preproc (line 2804) I found

if nses==1&&isempty(BLIP), conn_disp('fprintf','warning: unable to find PhaseEncodingDirection information in %s\n',filename); end

However, it looks like the issue may begin earlier, on line 2771: it does not seem to recognize what we entered for vdm_type, and it is not registering that we entered a number for BLIP.

Does anyone see an obvious error, or perhaps have a suggestion on how I can resolve this issue? As far as I am aware, I do not need a JSON file here, so I am curious why this error keeps appearing and how the number entered in the BLIP field relates to it.
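
In case it helps: my (possibly wrong) understanding is that this VDM step looks for a .json sidecar next to each functional file to read the PhaseEncodingDirection field mentioned in the warning above, so a missing or malformed sidecar might be what triggers the spm_jsonread error. A minimal MATLAB sketch of the kind of sidecar I mean (the file name and field values below are placeholders, not taken from my data):

% Sketch only (assumption, not verified): write a minimal .json sidecar containing the
% PhaseEncodingDirection field that the warning on line 2804 refers to.
info = struct('PhaseEncodingDirection','j-', ...   % placeholder; e.g. 'j-' for A>>P
              'TotalReadoutTime',0.05);            % placeholder; in seconds, from your sequence
fid = fopen('func_run1.json','w');                 % same basename as the corresponding functional .nii
fprintf(fid,'%s',jsonencode(info));
fclose(fid);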

Thanks,
Ted

RE: CONN and fmriprep - error loading confounds

[color=#000000]Hi Ana,[/color]

[color=#000000]Yes, you are right, one alternative would be to manually edit those files and change the N/A values to 0. Another alternative would be to recreate those Framewise Displacement timeseries from the realignment data directly (e.g. you can do that in the [i]Setup.Covariates.First-level[/i] tab, by selecting the '[i]covariate tools. compute new/derived first-level covariates[/i]. [i][b]compute FD_power[/b][/i]' menu there; that will create a new FD_Power covariate which should be identical to your current fmriprep-imported measure but with 0 instead of NaN values on the first scan).[/color]
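
[color=#000000]If you go with the first option, a minimal sketch of what I mean is below (the file name and column name are just examples from a typical fmriprep output; please adapt them to your own files):[/color]

% Sketch only: replace the N/A value in the fmriprep framewise-displacement column by 0
% before importing the confounds file into CONN. File/column names are examples only.
fname = 'sub-01_task-rest_desc-confounds_regressors.tsv';      % example fmriprep confounds file name
T = readtable(fname,'FileType','text','Delimiter','\t','TreatAsEmpty','n/a');
T.framewise_displacement(isnan(T.framewise_displacement)) = 0; % zero the N/A on the first scan
% note: any other N/A values in the file will be written back as NaN by writetable
writetable(T,fname,'FileType','text','Delimiter','\t');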

[color=#000000]Hope this helps[/color]
[color=#000000]Alfonso[/color]

[i]Originally posted by Ana Souza:[/i][quote]Hi all, 


I am using CONN 19c for connectivity analysis in a resting state dataset where I ran fmriprep for preprocessing.
I get an error in CONN when loading one of the confounds from the confounds.tsv file (the file itself was imported without errors). The error happens in the Denoising step, when I select which confounds to use as regressors. The framewise displacement confound has an N/A value in its first row, and CONN passes it to an SVD method which throws an error. This first scan is also flagged as an outlier in the fmriprep output, whose outlier column I had thought of using as a regressor as well. So my question is whether I can remove this first line before loading it into CONN, or zero-pad it. If so, is there anything else I should be aware of? 

Any help is much appreciated.
Best,
Ana[/quote]

RE: Pairwise time-series correlations in a mask

Hi Tamir,

[color=#000000]Sorry that cannot be done simply from the GUI, but you could compute those average-correlation measures using something like:[/color]


Maskfile = '/data/mymask.nii'; % ROI mask; this file should contain 0/1 values with the same dimensions as the functional data (e.g. [91 109 91])

AvgR = [];
MASK = spm_read_vols(spm_vol(Maskfile));                        % read the mask volume
for nsub=1:conn_module('get','Setup.nsubjects')                 % loop over all subjects in the project
   a = conn_vol(fullfile(conn_prepend('',conn_module('get','filename'),''), 'results','preprocessing',sprintf('vvPC_Subject%03d_Condition001.mat',nsub)));
   b = conn_get_volume(a);                                      % data matrix (one column per voxel in a.voxels)
   AvgR(nsub) = norm(b*MASK(a.voxels))^2/nnz(MASK(a.voxels))^2; % average correlation among voxels within the mask
end

where the variable [b]AvgR[/b] will be a vector (one value per subject) with the average correlation among all voxels specified by the mask file (like Global Correlation, but within a mask).

Hope this helps
Alfonso

[i]Originally posted by Tamir Eisenstein:[/i][quote]Hi CONN experts,

I would like to compute the average of all pairwise correlations between the time series of all the voxels in a specific mask for each subject - is it something I can get using the GUI? Or does it need to be implemented directly in MATLAB?

Thanks!
Tamir[/quote]

Regional Homogeneity (ReHo) in CONN

Hello Alfonso,

Can ReHo be computed in CONN?
Will computing LCORR with a kernel size equal to the voxel width be the same as regional homogeneity (ReHo)?

Many thanks in advance,
Mohammad

RE: Problem with ALFF and fALFF analysis

Thank you Alfonso,

The issue was fixed. I appreciate your help greatly.
It was an issue with the Lab's drive and the subjects with missing ALFF values actually had missing BOLD data.

Thanks again,
Mohammad

Extract BOLD timeseries physiological factor gPPI

Hi there,

Would you be able to help with the following (and apologies if this has been asked before)?

I have set up a gPPI model with 5 conditions and I was wondering whether it is possible to extract the individual BOLD time series for the physiological factor between the seed and my ROI (equivalent to PPI.Y / the VOI eigenvariate in an SPM PPI analysis).

From reading the forum, I understand it is possible to obtain BOLD time series in CONN (both before and after denoising), but they seem to be condition-specific. Is there a variable in CONN that contains the individual BOLD time series regressor for the physiological factor between the seed and my ROI, irrespective of condition?

Many thanks in advance!

Best,

Diede

RE: First-level analysis and exporting results

[i]Originally posted by Benson Stevens:[/i][quote]Hi Bryant,

There is a brief description using the GUI in the CONN forum here:
https://www.nitrc.org/forum/message.php?msg_id=13304

The raw files can be found within the .../results/firstlevel/ANALYSIS_01/ folder inside your main CONN project folder. They are called things such as "resultsROI_Subject002_Condition001.mat" (this is subject 2, condition 1). If you know any MATLAB scripting, you could extract values from these, but you will have to build your contrast of interest in the script yourself, as they are saved per condition/stimulus type. These files contain the Fisher Z values, under "Z", if you load one of them in the MATLAB command line; "names" holds your source names and "names2" the targets. They are saved as a correlation matrix.
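
For example, a minimal sketch (the project path below is just a placeholder, and I'm assuming the default folder layout and variable names):

% Sketch only: load one first-level ROI-to-ROI results file and inspect its contents.
f = fullfile('/myproject/results/firstlevel/ANALYSIS_01', ...   % placeholder project path
             'resultsROI_Subject002_Condition001.mat');         % subject 2, condition 1
r = load(f);
size(r.Z)    % [sources x targets] matrix of Fisher-transformed correlation values
r.names      % source ROI names (cell array)
r.names2     % target ROI names (cell array)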

Many Blessings,
Benson[/quote][color=#000000]Hi Benson,[/color]

[color=#000000]I am trying to export first-level results from CONN, but I am unfamiliar with MATLAB scripting. Is there a way to export these results within the GUI? Thank you for your help![/color]

[color=#000000]Best,[/color]
[color=#000000]Subin[/color]

Seed based Connectivity Analysis in CONN

Hello Everyone,

I am performing an SBC analysis on resting-state data from 3 groups of subjects.
After preprocessing my data in CONN, I tried to export my ROI via the ROI option in the Setup tab and proceeded with denoising, first-level, and second-level analyses. However, in the second level I do not find any results: all of the glass brains appear empty, without any clusters, for every contrast (t and F). Clusters do appear, though, as soon as the any-effect (F-test) contrast is selected in the second level.

As stated in the manual, in the first-level tab I selected weighted GLM, bivariate correlation, seed-to-voxel analysis, and no weighting.
 
Can someone please help me out or suggest another approach?


Thanks for your time.

RE: Selecting Voxel Mask

Hi,

I have the same question; just bumping it in case someone can verify. From what I understand, specifying an implicit mask means that the voxel-to-voxel analysis will be in subject space and not in MNI space. Does that mean that you don't need to normalize your data to MNI space if you choose an implicit mask?

I have searched the forum and the CONN book, but this hasn't been mentioned clearly. I would be grateful if someone can help.

Thank you.

Zaeem

RE: Studying rs- fMRI in individual subjects is the new mainstream?

Dear Kasia,

Thank you for sharing your experience. I would say that most types of analysis are done to make predictions about a group, because we usually want to generalize the results to a population.
The only reason to study individual subjects would be that one wants to make predictions only about those particular subjects. That being said, the purpose of the 2nd-level analysis is to make predictions at the group level rather than about a single subject.

Also, I think that in the GLM implementation the variance terms used are for "mixed effects", which include both fixed effects and random effects; that is so that one can generalize the results.

I hope someone else can verify.

Best,

Zaeem

RE: use default Tissue Probability Map option

Dear Marianna,

Tissue probability maps are included at the segmentation step, so if you choose one of the default preprocessing pipelines that contain segmentation, a GUI will pop up when you "run" the preprocessing.
The default TPM file is already selected there, and you can change it.

I cannot say whether you need a different TPM for different subjects. My guess would be yes, if you have subjects with lesions of different types.

Best,

Zaeem

Export mask

Hi Alfonso and all CONN experts,

In the Results Explorer window under the clusters list there is a list of the corresponding areas from the Harvard-Oxford atlas for each identified cluster in the analysis.

Is there a way to export/create masks of these specific "sub-cluster" areas? For example, if I have a significant cluster composed of 250 voxels in the precuneus and 150 in the PCC, is there a way to export/create two separate masks of these precuneus and PCC clusters?
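
(In case there is no built-in option, I imagine something along these lines outside of CONN could work, intersecting the exported cluster mask with the corresponding atlas label, but I would prefer a GUI solution if one exists; the file names and label index below are placeholders:)

% Rough sketch only: intersect an exported cluster mask with one Harvard-Oxford label
% to obtain a region-specific sub-cluster mask. Assumes both images are in the same
% space with the same dimensions; file names and the label index are placeholders.
Vclu = spm_vol('cluster_mask.nii');                 % binary cluster mask exported from the results
Vatl = spm_vol('HarvardOxford_atlas.nii');          % atlas volume with integer labels
clu  = spm_read_vols(Vclu) > 0;
atl  = round(spm_read_vols(Vatl));
precuneus_label = 30;                               % hypothetical label index for the precuneus
Vout = Vclu; Vout.fname = 'cluster_precuneus.nii';  % output sub-cluster mask
spm_write_vol(Vout, double(clu & atl==precuneus_label));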

Thanks!
Tamir

Problem while I work with another's data

Hello,
I have this problem while I work with another's data:
ERROR DESCRIPTION:

Error using conn_menu>conn_spm_read_vols (line 1091)
Error reading file /Applications/conn/utils/surf/referenceT1.nii. File may have been modified or relocated. Please load file again
Error in conn_menu (line 666)
[temp,volhdr]=conn_spm_read_vols(title);
Error in conn (line 2450)
conn_menu('updateimage',CONN_h.menus.m_setup_00{5},vol);
CONN19.c
SPM12 + DAiSS DEM FieldMap MEEGtools
Matlab v.2016a
project: CONN19.c
storage: 334.8Gb available
spm @ /Applications/spm12
conn @ /Applications/conn
Please help me sort out this problem.

Good atlas for ROI-to-ROI analysis

Dear all,

As things evolve with time, I wanted to ask for current suggestions for a well-established brain atlas for ROI-to-ROI analysis.

Thank you,
Lucas

QC variables are not moved to 2nd level analysis tab

Hi everyone, I have recently switched to CONN 19c. It is awesome as usual, except that the QC variables calculated during preprocessing and denoising are not automatically added to the 2nd-level analysis tab. Is this the expected behaviour or is it a bug? And how could I move the QC variables (or at least those I am interested in) to the 2nd level, possibly without rerunning everything?
Thanks in advance for any tip,
F.

Import condition information from a text file in an event-related design + batch to select files for analysis + question about denoising files

Dear Dr Alfonso Nieto-Castanon,

I would like to carry out a gPPI analysis with the CONN toolbox. I am facing several issues:

[u]1. Import condition information from a text file[/u]
To set Setup.conditions I tried to import the information about the conditions from a .csv file.
I followed the structure given as an example in help conn_import condition (I am attaching an extract from my file here), relating to "FILE FORMAT 1 (CONN single-file condition format; *.csv or *.txt extensions)". However, during the import, the error "multiple rows for condition human subject 1 session 1" seems to indicate that there cannot be several onsets and durations for the same condition.

Nevertheless, in my event design paradigm, within a run several trials belonging to different conditions follow one another, in random order for each participant.

Could you please help me to understand how to model this in an automated way with a .csv file import in CONN toolbox?

[u]2. Batch to select functional and structural files[/u]
When I executed the batch to select the structural and functional files (with the paths to my files), the MATLAB terminal indicates "Unzipping files..." but nothing is written to the folder and the script hangs at this stage (even though I have write permissions in the folders). Unzipping the .gz files manually does not solve the problem.

You can find just below the corresponding block of code:

% FIND functional/structural files
% note: this will look for all data in these folders, irrespective of the specific download subsets entered as command-line arguments
NSUBJECTS=3;
cwd=pwd;
%FUNCTIONAL_FILE=cellstr(conn_dir('SSU_archeoneuro_*/TRACE*_BOLD/wvol0000_warp_merged_blur.nii'));
%STRUCTURAL_FILE=cellstr(conn_dir('archeoneuro_*_wmcropped_T1.nii'));
%if rem(length(FUNCTIONAL_FILE),NSUBJECTS),error('mismatch number of functional files %d', length(FUNCTIONAL_FILE));end
%if rem(length(STRUCTURAL_FILE),NSUBJECTS),error('mismatch number of anatomical files %d', length(STRUCTURAL_FILE));end
%nsessions=length(FUNCTIONAL_FILE)/NSUBJECTS;
%FUNCTIONAL_FILE=reshape(FUNCTIONAL_FILE,[nsessions, NSUBJECTS]);
%STRUCTURAL_FILE={STRUCTURAL_FILE{1:NSUBJECTS}};
%disp([num2str(size(FUNCTIONAL_FILE,1)),' sessions']);
%disp([num2str(size(FUNCTIONAL_FILE,2)),' subjects']);
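
For context, my understanding (from the CONN workshop example scripts; not yet verified on my data) is that these variables would then be assigned to the batch structure roughly as follows:

% Rough sketch only (adapted from the CONN example scripts): assign the files found
% above to the conn_batch structure; 'conn_myproject.mat' is a placeholder name.
nsessions = 1;                                       % placeholder; normally length(FUNCTIONAL_FILE)/NSUBJECTS
batch.filename = fullfile(cwd,'conn_myproject.mat'); % placeholder project file name
batch.Setup.nsubjects = NSUBJECTS;
for nsub = 1:NSUBJECTS
    for nses = 1:nsessions
        batch.Setup.functionals{nsub}{nses} = FUNCTIONAL_FILE{nses,nsub};
    end
    batch.Setup.structurals{nsub} = STRUCTURAL_FILE{nsub};
end
conn_batch(batch);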

[u]3. Configure the denoising step in the CONN Toolbox[/u]
Could you please tell me where the files used in the Denoising step are stored, i.e. the files that contain the nuisance regressors: CSF, white matter, and motion regressors?
Are these files replaceable and modifiable?

Many thanks in advance.
Best Regards,
Mathilde Salagnon

RE: ROI-to-ROI menu error.

[color=#000000]Dear Romke,[/color]

[color=#000000]That is curious, it seems there is a mismatch between the number of subjects in your study and the number of subjects' data stored in your ROI-to-ROI first-level results folder (e.g. I imagine this may happen if you delete some subjects). Could you please try running the command:[/color]

[color=#000000]conn_process prepare_results_roi[/color]

[color=#000000]from Matlab command-line window (that will repeat the "preparing second-level ROI analyses" step) and seeing whether that fixes this mismatch?[/color]

[color=#000000]Thanks[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by Romke Hannema:[/i][quote]Dear Conn Toolbox community, 

thank you for your being so consistently helpful. I am trying to perform an ROI-to-ROI analysis in the Conn interface. Whenever I so much as try to open the ROI-to-ROI menu, however, it immediately throws an error: 

ERROR DESCRIPTION:

Matrix dimensions must agree.
Error in conn_process (line 4685)
nsubjects=find(any(X(:,nsubjecteffects)~=0,2)&~any(isnan(X(:,nsubjecteffects)),2)&~any(any(all(isnan(y),2),3),4));
Error in conn_process (line 57)
case 'results_roi', [varargout{1:nargout}]=conn_process(17,varargin{:});
Error in conn (line 9255)
CONN_h.menus.m_results.roiresults=conn_process('results_ROI',CONN_x.Results.xX.nsources,CONN_x.Results.xX.csources);
Error in conn_menumanager (line 134)
feval(CONN_MM.MENU{n0}.callback2{n1}{1},CONN_MM.MENU{n0}.callback2{n1}{2:end});
CONN18.b
SPM12 + DAiSS DEM FieldMap MEEGtools
Matlab v.2018b
project: CONN18.b

Based on the variable names in the error lines, I assumed it might have something to do with a mismatch in the number of files in my functional and structural setup, but I determined that not to be the case. It also throws this error whenever I try 'results explorer' or 'graph theory' from that same menu. Would someone be able to tell me what could be causing this issue? 

Kind regards, 
Romke Hannema[/quote]

RE: Regional Homogeneity (ReHo) in CONN

[color=#000000]Hi Mohammad,[/color]

[color=#000000]Yes, LCOR, computed as the average connectivity between a voxel and its neighbors, is a measure of regional homogeneity, as it measures the similarity between the BOLD timeseries of neighboring voxels (with a user-defined neighborhood-size parameter). [/color][color=#000000]That said, LCOR, as computed in CONN (see Nieto-Castanon 2020 for details) and extending the "Integrated Local Correlation" definition of Deshpande et al. 2007, is not identical to ReHo as computed in C-PAC following the Zang et al. 2004 definition. In general LCOR can be considered a generalization of ReHo that behaves more robustly across different data resolutions and neighborhood sizes (see for example Deshpande et al., "Integrated local correlation: A new measure of local coherence in fMRI data", for specific details, in particular some direct comparisons between ReHo and LCOR).[/color]
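
[color=#000000]Roughly speaking (this is just a sketch of the idea, not the exact implementation; please see the references above for the precise definitions), LCOR at a voxel x can be thought of as a Gaussian-weighted average of its correlations with neighboring voxels:[/color]

\mathrm{LCOR}(x) \;=\; \frac{\sum_{y} w_\sigma(\lVert x-y\rVert)\, r(x,y)}{\sum_{y} w_\sigma(\lVert x-y\rVert)}

[color=#000000]where r(x,y) is the correlation between the BOLD timeseries at voxels x and y, and w_\sigma is a Gaussian kernel whose width sigma is the user-defined neighborhood size; ReHo (Zang et al. 2004), in contrast, is defined as Kendall's coefficient of concordance among the timeseries of a voxel and its immediate neighbors (e.g. a fixed 27-voxel neighborhood).[/color]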

[color=#000000]Hope this helps[/color]
[color=#000000]Alfonso[/color]
[i]Originally posted by Mohammad Iyas Kawas:[/i][quote]Hello Alfonso,

Can ReHo be computed in CONN?
Will computing LCORR with a kernel size equal to the voxel width be the same as regional homogeneity (ReHo)?

Many thanks in advance,
Mohammad[/quote]

Permutation testing in ROI-ROI analyses

Hi,

In past versions of CONN (~17f), I was able to threshold and enable permutation testing in the ROI-to-ROI second-level results explorer. In the newest version of CONN (19c), the results explorer has been substantially updated and I'm having trouble finding the option to permute the results in the same way. I'm trying to replicate the results of a past experiment with a new sample and would like to use exactly the same analysis pipeline as before. Is that possible without downgrading CONN (which I'm hesitant to do for several reasons)? More specifically, is there a way to simply threshold and permute my ROIs?

Thanks

adjusted betas after 2nd-level analysis

Dear Alfonso,


thank you for the great toolbox!

How can I extract the adjusted/corrected betas of my 2nd-level analysis?

As I have understood from this post (https://www.nitrc.org/forum/message.php?msg_id=13459), the "Import values" in the 2-nd level analysis tab, imports the original beta values of the 1st-level analysis.

I read this post (https://www.nitrc.org/forum/message.php?msg_id=18633) explaining the ROI.mat output, but I did not understand whether it includes the adjusted betas as well. A short clarification on this topic would be very much appreciated.

Thank you in advance!
Alexandra