World Journal of Advanced Research and Reviews


eISSN: 2581-9615 | CODEN: WJARAI | Impact Factor 8.2


Effect of excessive neural network layers on overfitting


Caleb Isaac * and Kourosh Zareinia

University of Alabama, USA.
 
Review Article
World Journal of Advanced Research and Reviews, 2022, 16(02), 1246-1257
Article DOI: 10.30574/wjarr.2022.16.2.1247
DOI url: https://doi.org/10.30574/wjarr.2022.16.2.1247
 
Received on 08 October 2022; revised on 21 November 2022; accepted on 25 November 2022
 
Deep neural networks have transformed artificial intelligence by learning complex representations directly from data. However, an excessive number of layers can cause overfitting: the model memorizes the training data instead of generalizing to new inputs, fitting noise as well as meaningful patterns. Vanishing and exploding gradients, increased computational cost, and the curse of dimensionality compound the problem and make deeper networks harder to train effectively.
In this paper, we discuss how neural network layers learn representations and why learning becomes problematic as depth grows. We further discuss regularization techniques (L1/L2 regularization, dropout, batch normalization, etc.) that prevent overfitting and promote generalization. Training efficiency and stability can also be improved with adaptive learning-rate optimizers, gradient clipping, and early stopping. Additionally, transfer learning is introduced as a powerful way to exploit pre-trained networks while avoiding excessively deep architectures.
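As a minimal illustration of the L2 regularization mentioned above (a generic sketch, not the authors' experimental setup), the closed-form ridge regression below fits an over-parameterized polynomial model to noisy data with and without an L2 penalty; the penalty shrinks the coefficient norm, which is the mechanism that curbs overfitting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying linear function.
x = np.linspace(-1, 1, 20)
y = 0.5 * x + rng.normal(scale=0.1, size=x.shape)

# Over-parameterized design matrix: degree-9 polynomial features.
X = np.vander(x, N=10, increasing=True)

def fit(X, y, lam):
    """Closed-form ridge regression; lam=0 recovers ordinary least squares."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ols = fit(X, y, lam=0.0)      # unregularized: free to chase the noise
w_ridge = fit(X, y, lam=1e-2)   # L2 penalty shrinks the coefficients

print("unregularized weight norm:", np.linalg.norm(w_ols))
print("ridge weight norm:        ", np.linalg.norm(w_ridge))
```

The ridge solution always has a smaller (or equal) weight norm than the unregularized one, since the penalty damps each singular direction of the design matrix; the same intuition carries over to weight decay in deep networks.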
We also examine real-world case studies in which deep networks failed to generalize and the problem was mitigated with NAS, sparse networks, and meta-learning. The future of deep learning lies in models that are efficient, adaptable, and able to generalize, achieving high performance at the right depth. By applying best practices in architecture design and optimization, researchers can construct strong models that deliver accuracy without over-complication.
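Early stopping, one of the training-stability tools listed in the abstract, can be sketched in a few lines (a hypothetical helper with synthetic loss values, not code from the paper): training halts once the validation loss has not improved for a fixed number of checks, and the weights from the best epoch are kept.

```python
def early_stop_index(val_losses, patience=3):
    """Return the index of the best epoch, stopping once `patience`
    consecutive checks show no improvement in validation loss."""
    best, best_i, waited = float("inf"), 0, 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_i, waited = loss, i, 0
        else:
            waited += 1
            if waited >= patience:
                return best_i  # restore weights from the best epoch
    return best_i

# Synthetic curve: validation loss falls, then rises as the model overfits.
losses = [1.0, 0.6, 0.4, 0.35, 0.36, 0.38, 0.41, 0.45]
print(early_stop_index(losses))  # → 3 (the epoch with the lowest loss)
```

Frameworks implement the same idea with extra bookkeeping (minimum improvement deltas, checkpointing), but the core logic is just this patience counter.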
 
Keywords: Overfitting in deep learning; Neural network layers; Regularization techniques; Transfer learning; Neural architecture search (NAS)
 
https://wjarr.com/sites/default/files/fulltext_pdf/WJARR-2022-1247.pdf


Caleb Isaac and Kourosh Zareinia. Effect of excessive neural network layers on overfitting. World Journal of Advanced Research and Reviews, 2022, 16(2), 1246-1257. Article DOI: https://doi.org/10.30574/wjarr.2022.16.2.1247

Copyright © Author(s). All rights reserved. This article is published under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and source, a link to the license is provided, and any changes made are indicated.


All statements, opinions, and data contained in this publication are solely those of the individual author(s) and contributor(s). The journal, editors, reviewers, and publisher disclaim any responsibility or liability for the content, including accuracy, completeness, or any consequences arising from its use.

