Analyzing the Effect on Residual Learning of Gradually Narrowing Fully-Connected Layer Width and Implementing Inception Blocks in Convolution Layers
- Department of Computer Science, Sir Padampat Singhania School of Engineering, India
Abstract
Research on advancing CNN architectures for computer vision problems has focused on strategically choosing and modifying convolution hyperparameters (kernel size, pooling, etc.). However, these works do not exploit the advantage of employing multiple fully-connected layers after the core schema to obtain further performance improvements; this is identified as the first research gap. Studies have also addressed the challenge of vanishing gradients in deep networks by employing residual learning via skip connections, and have lowered model training computational costs by using parallel rather than sequential convolution operations through inception blocks. These studies likewise do not discuss in detail the impact of sparse features on feature learning, which is identified as the second research gap. Diagnosis of infectious patterns in chest X-rays using residual learning is chosen as the problem statement for this study. Results show that the ResNet50 architecture improves accuracy by 0.6218% and reduces the error rate by 2.6326% when gradually narrowing fully-connected layers are employed between the core residual learning schema and the output layer. Independently, implementing inception blocks (GoogLeNet v2) before the skip connections in the ResNet50 architecture boosts accuracy by 0.961% and lowers the error rate by 4.2438%. These performance improvements were achieved without regularization and thus encourage future work in this direction.
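To make the first modification concrete, the following is a minimal PyTorch sketch of a ResNet50 backbone whose single linear head is replaced by a taper of fully-connected layers. The abstract does not specify the exact layer widths used in the study, so the `fc_widths` values here (1024, 512, 256) are illustrative assumptions, as is the choice of PyTorch/torchvision.

```python
import torch.nn as nn
from torchvision.models import resnet50


class NarrowingFCResNet(nn.Module):
    """ResNet50 followed by gradually narrowing fully-connected layers.

    The widths below are illustrative; the paper's exact taper is not
    given in the abstract.
    """

    def __init__(self, num_classes=2, fc_widths=(1024, 512, 256)):
        super().__init__()
        backbone = resnet50(weights=None)      # untrained backbone
        in_features = backbone.fc.in_features  # 2048 for ResNet50
        backbone.fc = nn.Identity()            # strip the original head
        self.backbone = backbone

        layers = []
        for width in fc_widths:                # 2048 -> 1024 -> 512 -> 256
            layers += [nn.Linear(in_features, width), nn.ReLU(inplace=True)]
            in_features = width
        layers.append(nn.Linear(in_features, num_classes))
        self.head = nn.Sequential(*layers)

    def forward(self, x):
        return self.head(self.backbone(x))
```

A model built this way replaces ResNet50's single 2048-to-classes linear head with a gradually narrowing stack (2048 → 1024 → 512 → 256 → classes), which is the structure the abstract describes between the core residual schema and the output layer.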
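The second modification, an inception-style block placed before a skip connection, can be sketched in the same spirit. This is one plausible reading of the abstract, not the paper's exact block: the branch channel counts are assumptions, and the stacked 3×3 pair stands in for a 5×5 convolution in the GoogLeNet v2 (Inception-v2) manner.

```python
import torch
import torch.nn as nn


class InceptionResidualBlock(nn.Module):
    """Residual block whose transform branch is an inception-style block.

    Parallel 1x1, 3x3, and stacked-3x3 paths are concatenated, projected
    back to the input width, and merged with the identity via a skip
    connection. Channel counts are illustrative assumptions.
    """

    def __init__(self, channels, branch_channels=32):
        super().__init__()
        self.b1 = nn.Conv2d(channels, branch_channels, kernel_size=1)
        self.b3 = nn.Sequential(
            nn.Conv2d(channels, branch_channels, kernel_size=1),
            nn.Conv2d(branch_channels, branch_channels, kernel_size=3, padding=1),
        )
        # Two stacked 3x3 convolutions replace a 5x5, as in Inception-v2.
        self.b5 = nn.Sequential(
            nn.Conv2d(channels, branch_channels, kernel_size=1),
            nn.Conv2d(branch_channels, branch_channels, kernel_size=3, padding=1),
            nn.Conv2d(branch_channels, branch_channels, kernel_size=3, padding=1),
        )
        # 1x1 projection so the concatenation matches the input width.
        self.project = nn.Conv2d(3 * branch_channels, channels, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)
        out = self.project(out)
        return self.relu(out + x)  # skip connection: residual learning
```

Keeping the projection back to the input channel count is what lets the skip connection remain a plain identity addition, so the parallel-convolution block slots into the residual schema without changing tensor shapes.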
DOI: https://doi.org/10.3844/jcssp.2022.339.349
Copyright: © 2022 Saurabh Sharma. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Keywords
- Fully-Connected Layer
- Neuron Layer Width
- ResNet50
- Residual Network
- Skip-Connections
- Inception Blocks