Bioengineering. 2023 Aug 26;10(9):1012. doi: 10.3390/bioengineering10091012

Table 5.

MRI Reconstruction Models Based on Data-Consistency Layers and Unrolled Networks.

| Ref. | Year | Network | Contributions | Unsolved Challenges |
|---|---|---|---|---|
| [33] | 2018 | VN | Preserved essential features of MR images, including pathologies not present in the training dataset | Suffers from residual artifacts that are particularly evident in axial sequences |
| [34] | 2018 | VN | Achieved rapid reconstruction of approximately 0.2 s per section | Reconstruction times vary across hardware models; relies on constant regularization; lacks fully sampled reference data |
| [76] | 2018 | MoDL | Achieved faster convergence per iteration by using numerical optimization blocks for data consistency, and required less training data | The many conjugate-gradient steps in the data-discrepancy layers may increase computational time, reducing reconstruction speed |
| [77] | 2019 | PC-CNN | Improved image accuracy by enforcing data consistency; enhanced convergence | Computational complexity, data dependency, limited interpretability, and sensitivity to noise and artifacts |
| [78] | 2020 | jVN | Improved image quality and reduced blurring by learning efficient regularizers | Generalization to unseen data or different acquisition scenarios |
| [79] | 2020 | DeepcomplexMRI | Resolved aliasing and channel correlations without computing coil-sensitivity information | Persistent blurriness in the reconstructed MRIs at high acceleration factors |
| [80] | 2020 | Dense-RNN | Showed potential for capturing long-range dependencies among image units | Does not fully address the slow convergence inherent in proximal gradient descent methods |
| [81] | 2020 | TVINet | Ensured data consistency and preserved fine details in the reconstructed MRI | Time-consuming, computationally expensive hyperparameter tuning; no uncertainty quantification in its deterministic predictions |
| [82] | 2020 | FlowVN | Accurately reconstructed pathological flow in a stenotic aorta within 21 s | Large training-data requirement; limited interpretability |
| [83] | 2022 | CNN & UNet | Enhanced unfolding structures without increasing complexity, using an adaptively calculated noise parameter to improve reconstruction performance | Training instability, slow convergence, and limited explainability hinder practical applicability |
| [84] | 2022 | DEMO | Efficiently removed CS-MRI artifacts such as motion, zebra, and herringbone artifacts | High computational requirements (GPUs) for training and inference |
| [85] | 2023 | DIRCN | Used long-range skip connections to improve gradient and information flow | Trained on retrospective public-domain data; needs validation on clinically valid prospective data |
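The common ingredient of the models in Table 5 is a data-consistency layer interleaved with learned denoising blocks: after each network stage, the estimate is pushed back toward agreement with the acquired k-space samples. Below is a minimal sketch of the "hard" single-coil Cartesian variant (measured k-space locations simply overwrite the network's prediction); the function name and shapes are illustrative, not taken from any of the cited implementations, which typically use multi-coil operators and a soft, weighted form.

```python
import numpy as np

def data_consistency(x_rec, k_measured, mask):
    """Hard data-consistency step for single-coil Cartesian MRI.

    x_rec      : current image estimate from the network (2D complex array)
    k_measured : undersampled k-space measurements (2D complex array,
                 zero at non-acquired locations)
    mask       : boolean sampling mask (True where k-space was acquired)
    Returns an image whose k-space exactly matches the measurements
    at every acquired location.
    """
    k_rec = np.fft.fft2(x_rec)        # forward Fourier transform of the estimate
    k_rec[mask] = k_measured[mask]    # re-impose the acquired samples
    return np.fft.ifft2(k_rec)        # back to image space
```

In an unrolled network, this operation (or a conjugate-gradient solve generalizing it, as in MoDL [76]) is applied after every learned regularization block, which is what guarantees the final reconstruction cannot contradict the measured data.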