Fig. 1

From: Deep learning-assisted PET imaging achieves fast scan/low-dose examination

Network architecture of HYPER DLR. relu: rectified linear unit, an activation function defined as the positive part of its argument, f(x) = max(0, x), where x is the input to a neuron. conv: convolution, the basic operation of convolutional neural networks; the input image is convolved with learned kernels to extract image features. bn: batch normalization, a technique that improves the speed, performance, and stability of a neural network by normalizing and rescaling the activations fed to a layer. dropout: a regularization technique that reduces overfitting by preventing complex co-adaptations on the training data. skip connection: a shortcut that jumps over some layers, mitigating the vanishing-gradient problem and allowing faster training.
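The figure itself is not reproduced here. To show how the components named in the caption typically compose, the following is a minimal PyTorch sketch of one residual conv/bn/relu/dropout block; the class name, channel count, kernel size, and dropout rate are illustrative assumptions and are not taken from the HYPER DLR design.

import torch
import torch.nn as nn


class ConvBnReluBlock(nn.Module):
    """Hypothetical residual block built from the components in the caption."""

    def __init__(self, channels: int, p_drop: float = 0.1):
        super().__init__()
        # conv: slide learned 3x3 kernels over the input to extract features
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # bn: normalize activations to stabilize and speed up training
        self.bn = nn.BatchNorm2d(channels)
        # relu: f(x) = max(0, x), applied element-wise
        self.relu = nn.ReLU()
        # dropout: randomly zero whole feature maps to reduce overfitting
        self.dropout = nn.Dropout2d(p_drop)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.dropout(self.relu(self.bn(self.conv(x))))
        # skip connection: add the input to the output so gradients can
        # bypass the layers above, easing training of deep networks
        return out + x


# Usage on a batch of two 32-channel feature maps of size 128 x 128
block = ConvBnReluBlock(channels=32)
y = block(torch.randn(2, 32, 128, 128))
print(y.shape)  # torch.Size([2, 32, 128, 128])

The additive skip connection requires the block to preserve the channel count and spatial size, which is why the convolution here maps channels to channels with padding 1.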