Please use this identifier to cite or link to this item: https://ahro.austin.org.au/austinjspui/handle/1/35405
Full metadata record
DC Field | Value | Language
dc.contributor.author | Tahayori, Bahman | -
dc.contributor.author | Smith, Robert | -
dc.contributor.author | Vaughan, David | -
dc.contributor.author | Pierre, Eric | -
dc.contributor.author | Jackson, Graeme | -
dc.contributor.author | Abbott, David F | -
dc.date.accessioned | 2024-07-29T00:22:32Z | -
dc.date.available | 2024-07-29T00:22:32Z | -
dc.date.issued | 2024-07 | -
dc.identifier.uri | https://ahro.austin.org.au/austinjspui/handle/1/35405 | -
dc.description | ResearchFest 2024 | en_US
dc.description.abstract |
Aim: Multi-Band Multi-Echo (MBME) imaging in functional MRI (fMRI) provides fast data acquisition and improves the signal-to-noise ratio by leveraging the echo-time (TE) dependence of the signal to distinguish Blood Oxygenation-Level Dependent (BOLD) from non-BOLD signals. TE Dependent ANAlysis (TEDANA) is an open-source tool that processes MBME fMRI data to produce a denoised dataset. However, previous studies indicated that TEDANA might inadvertently remove BOLD signal along with noise. This study aimed to improve the TEDANA framework by introducing a Modified Independent Component Analysis Denoising (MICAD) framework. The enhancements included 1) applying thermal denoising to the raw data, 2) using a robust method for component analysis, and 3) refining the classification algorithm.
Methods: We used the Australian Epilepsy Project (AEP) language-task fMRI data of 240 participants to compare the denoising frameworks. We employed fMRIPrep for typical fMRI preprocessing and used the iBrain Analysis Toolbox for SPM (iBT) to estimate the activation map for each subject. We compared the pipelines using the mean t-score and Activation Volume (AV) within a language region of interest. Furthermore, two clinicians from our centre with expertise in language-network identification evaluated the performance of the pipelines for a subset of the subjects.
Results: A significant proportion of subjects, over 80%, showed a higher AV/t-score within the language region when processed with MICAD. More importantly, in contrast to TEDANA, the proposed method did not result in any significant AV reduction and achieved reasonable results at the individual-subject level. Furthermore, the clinicians' evaluations aligned well with the AV measure.
Conclusion: The MICAD pipeline is sufficiently robust to reliably yield acceptable results for individual subjects, a prerequisite for clinical research in the AEP. The MICAD pipeline can be extended to resting-state fMRI data in an automated pipeline.
Impact: The proposed automated pipeline will be applied to the AEP resting-state fMRI data to achieve more consistent, reliable results. Furthermore, the framework can be applied to any large fMRI dataset, as it does not require manual inspection of the denoising results. | en_US
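The TE dependence that the abstract refers to is commonly modelled as mono-exponential signal decay, S(TE) = S0 · exp(−R2* · TE): BOLD fluctuations scale with TE while non-BOLD artefacts do not, which is what TE-dependent analysis exploits. As a minimal illustration of that principle only (the function name, echo times, and parameter values below are illustrative and not taken from the TEDANA or MICAD code), a log-linear fit recovers S0 and R2* from multi-echo samples:

```python
import numpy as np

def fit_monoexponential(tes_ms, signals):
    """Log-linear fit of S(TE) = S0 * exp(-R2* * TE) across echoes.

    tes_ms  : echo times in milliseconds (1-D array-like)
    signals : signal intensities at those echoes (1-D array-like)
    Returns (s0, r2star) with r2star in 1/ms.
    """
    tes = np.asarray(tes_ms, dtype=float)
    log_s = np.log(np.asarray(signals, dtype=float))
    # Taking logs linearises the decay: log S = log S0 - R2* * TE,
    # so an ordinary least-squares line fit gives both parameters.
    slope, intercept = np.polyfit(tes, log_s, 1)
    return np.exp(intercept), -slope

# Synthetic noise-free example: S0 = 1000, T2* = 30 ms (R2* = 1/30 per ms)
tes = np.array([14.0, 38.0, 62.0])       # illustrative echo times
signals = 1000.0 * np.exp(-tes / 30.0)
s0, r2s = fit_monoexponential(tes, signals)
```

With noise-free data the fit recovers the ground-truth parameters to floating-point precision; real multi-echo voxel data would of course carry noise, which is one reason robust component-analysis methods matter.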
dc.title | An automated denoising pipeline for multi-band multi-echo fMRI data | en_US
dc.type | Conference Presentation | en_US
dc.identifier.affiliation | The Florey Institute of Neuroscience and Mental Health | en_US
dc.identifier.affiliation | Medicine (University of Melbourne) | en_US
dc.identifier.affiliation | Austin Health | en_US
dc.description.conferencename | ResearchFest 2024 | en_US
dc.description.conferencelocation | Austin Health | en_US
dc.type.content | Text | en_US
dc.type.content | Image | en_US
dc.identifier.orcid | 0000-0002-4927-0023 | en_US
dc.identifier.orcid | 0000-0002-7259-8238 | en_US
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
item.openairetype | Conference Presentation | -
crisitem.author.dept | The Florey Institute of Neuroscience and Mental Health | -
Appears in Collections:ResearchFest abstracts
Files in This Item:
File | Description | Size | Format
ResearchFest_2024_tedana.pptx | Poster by B. Tahayori et al. | 4.41 MB | Microsoft Powerpoint XML

Items in AHRO are protected by copyright, with all rights reserved, unless otherwise indicated.