ResNet18
ResNet18 is a residual neural network architecture, the shallowest member of the ResNet family introduced in 2015 by Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Residual learning mitigates vanishing gradients through skip connections, enabling effective training of deep models; ResNet18 demonstrates that even relatively shallow networks benefit from this design.
The network consists of 18 weight layers and uses the basic residual block. Each block contains two consecutive 3x3 convolutional layers, each followed by batch normalization, with a skip connection that adds the block's input to its output before the final ReLU activation.
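The structure of the basic block can be sketched in NumPy as follows. This is a minimal forward-pass illustration, not the library implementation: the convolution is a naive loop, batch normalization is omitted for brevity, and all weights and shapes are made-up example values.

```python
import numpy as np

def conv3x3(x, w):
    """Naive 3x3 'same' convolution: x is (C_in, H, W), w is (C_out, C_in, 3, 3)."""
    c_out = w.shape[0]
    _, h, wd = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))  # zero-pad H and W by 1
    out = np.zeros((c_out, h, wd))
    for o in range(c_out):
        for i in range(h):
            for j in range(wd):
                out[o, i, j] = np.sum(xp[:, i:i + 3, j:j + 3] * w[o])
    return out

def basic_block(x, w1, w2):
    """ResNet basic block: two 3x3 convs plus an identity skip connection.
    (BatchNorm after each conv is omitted in this sketch.)"""
    out = np.maximum(conv3x3(x, w1), 0)  # first conv -> ReLU
    out = conv3x3(out, w2)               # second conv
    return np.maximum(out + x, 0)        # add the input (skip), then ReLU

# Example with made-up shapes: 64 channels, 8x8 spatial resolution.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 8, 8))
w1 = rng.standard_normal((64, 64, 3, 3)) * 0.01
w2 = rng.standard_normal((64, 64, 3, 3)) * 0.01
y = basic_block(x, w1, w2)
print(y.shape)  # (64, 8, 8) -- the skip connection requires matching shapes
```

When the number of channels changes between stages, the real architecture replaces the identity shortcut with a 1x1 convolution so the shapes still match before the addition.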
ResNet18 has about 11.7 million parameters. It is commonly trained on ImageNet and is widely used as a lightweight backbone for transfer learning and as a baseline in computer vision research.
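The ~11.7 million figure can be checked by summing the layers of the standard ImageNet configuration (7x7 stem, four stages of two basic blocks with 64/128/256/512 channels, 1000-class head). A short self-contained tally, counting convolution weights and the per-channel scale and shift of each batch norm:

```python
def conv_params(k, c_in, c_out, bias=False):
    """Parameters in a k x k conv layer (ResNet convs have no bias)."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def bn_params(c):
    """BatchNorm learns a scale and a shift per channel."""
    return 2 * c

def basic_block(c_in, c_out):
    """Two 3x3 convs with BN; a 1x1 projection shortcut when channels change."""
    p = conv_params(3, c_in, c_out) + bn_params(c_out)
    p += conv_params(3, c_out, c_out) + bn_params(c_out)
    if c_in != c_out:  # downsample shortcut: 1x1 conv + BN
        p += conv_params(1, c_in, c_out) + bn_params(c_out)
    return p

total = conv_params(7, 3, 64) + bn_params(64)  # stem
for c_in, c_out in [(64, 64), (64, 128), (128, 256), (256, 512)]:
    total += basic_block(c_in, c_out) + basic_block(c_out, c_out)
total += 512 * 1000 + 1000                     # fully connected head
print(total)  # 11689512, i.e. ~11.7 million
```

The result, 11,689,512, matches the commonly quoted count for the ImageNet variant; changing the number of output classes alters only the final fully connected term.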