Hi,
I plan on running 750 subjects, each with 1 MPRAGE and 1 BOLD run, through CONN.
I did a test run of 10 subjects: the raw data was 1.1 GB, and after preprocessing I am at 17.4 GB of data plus 9.2 GB of results.
So, multiplying this by 75, I am at roughly 2 TB. Is this a reasonable estimate? And is there a way to reduce file sizes as I go along?
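For what it's worth, here is the back-of-the-envelope arithmetic behind that ~2 TB figure, assuming disk usage scales linearly with subject count (the per-subject sizes are taken from the 10-subject test above):

```python
# Disk-space estimate from the 10-subject CONN test run,
# scaled linearly to the full 750-subject study.
test_subjects = 10
preproc_gb = 17.4   # data size after preprocessing (from the test run)
results_gb = 9.2    # results size (from the test run)
total_subjects = 750

scale = total_subjects / test_subjects            # 75x
estimate_gb = (preproc_gb + results_gb) * scale   # linear-scaling assumption
print(f"{estimate_gb:.0f} GB, i.e. about {estimate_gb / 1000:.1f} TB")
# prints "1995 GB, i.e. about 2.0 TB"
```

This ignores the 1.1 GB of raw input data per 10 subjects (another ~83 GB at full scale), so the true total would be slightly higher.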