Deep, Deep Learning with BART
Moritz Blumenthal1 and Martin Uecker1,2,3,4
1Institute for Diagnostic and Interventional Radiology, University Medical Center Göttingen, Göttingen, Germany, 2DZHK (German Centre for Cardiovascular Research), Göttingen, Germany, 3Campus Institute Data Science (CIDAS), University of Göttingen, Göttingen, Germany, 4Cluster of Excellence “Multiscale Bioimaging: from Molecular Machines to Networks of Excitable Cells” (MBExC), University of Göttingen, Göttingen, Germany
By integrating deep learning into BART, we have created a reliable framework that combines state-of-the-art MRI reconstruction methods with neural networks. For MoDL and the Variational Network, we achieve performance similar to that of implementations based on TensorFlow.
Figure 2: Nlops and automatic differentiation. a) A basic nlop modeled by the operator $$$F$$$ and its derivatives $$$\mathrm{D}F$$$. When applied, $$$F$$$ stores data internally such that the derivatives $$$\mathrm{D}F$$$ are evaluated at $$$F$$$'s last inputs. b) Chaining two nlops $$$F$$$ and $$$G$$$ automatically chains their derivatives according to the chain rule. The adjoint of the derivative can be used to compute the gradient by backpropagation. c) Composition of a residual structure $$$F(\mathbf{x}, G(\mathbf{x},\mathbf{y}))$$$ by combine, link and duplicate.
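The mechanism in Figure 2 can be sketched in a few lines: each operator stores its last input so its derivative is evaluated there, chaining operators chains the derivatives, and applying the adjoints in reverse order yields the gradient. This is an illustrative Python sketch under assumed names (`Nlop`, `Square`, `Chain` are hypothetical and not BART's actual C interface).

```python
import numpy as np

class Nlop:
    """Hypothetical non-linear operator F with derivative DF and its adjoint.
    Mirrors the idea of Fig. 2, not BART's real nlop API."""
    def apply(self, x): raise NotImplementedError
    def derivative(self, dx): raise NotImplementedError  # DF(x) dx at last input
    def adjoint(self, dy): raise NotImplementedError     # DF(x)^H dy (backprop)

class Square(Nlop):
    # F(x) = x^2, so DF(x) dx = 2 x dx
    def apply(self, x):
        self.x = x            # store input: derivative is evaluated at F's last input
        return x ** 2
    def derivative(self, dx):
        return 2 * self.x * dx
    def adjoint(self, dy):
        return 2 * self.x * dy  # real diagonal operator: adjoint equals derivative

class Chain(Nlop):
    # H = G ∘ F: chaining nlops automatically chains their derivatives (chain rule)
    def __init__(self, F, G):
        self.F, self.G = F, G
    def apply(self, x):
        return self.G.apply(self.F.apply(x))
    def derivative(self, dx):
        return self.G.derivative(self.F.derivative(dx))
    def adjoint(self, dy):
        # adjoints are applied in reverse order: this is backpropagation
        return self.F.adjoint(self.G.adjoint(dy))

H = Chain(Square(), Square())        # H(x) = (x^2)^2 = x^4
x = np.array([2.0])
y = H.apply(x)                       # forward pass stores the inputs
grad = H.adjoint(np.ones_like(y))    # gradient of x^4 is 4 x^3
```

Here `H.apply` returns 16 and `H.adjoint` returns 32 (i.e. 4·2³), showing that composing the stored linearizations in reverse reproduces the analytic gradient.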
Figure 1: Basic structure of the BART libraries used for deep learning: Networks are implemented as nlops supporting automatic differentiation. The nn-library provides deep-learning-specific components such as initializers. Training algorithms have been integrated into BART's library for iterative algorithms. MD-functions act as a flexible, unified interface to the numerical backend. We added many deep-learning-specific nlops, such as convolutions, using efficient numerical implementations. Solid blocks represent the major changes enabling deep learning.