The Effects of Bi-Label Classification on the Learning Efficiency of Feed-Forward Neural Networks

EasyChair Preprint 15138 • 9 pages • Date: September 28, 2024

Abstract

This study examines the impact of bi-label classification on the performance of feed-forward neural networks (FFNNs) by comparing it with single-label classification models. The research focuses on key performance metrics such as convergence speed, loss reduction, and model stability across three case studies of varying complexity. The findings indicate that bi-label classification consistently achieves faster convergence, lower loss values, and greater stability than single-label classification, particularly on simpler datasets. As data complexity increases, the performance gap between the two models narrows, though bi-label classification retains a slight advantage in loss reduction, generalization, and the ability to handle diverse datasets. These results suggest that bi-label classification can significantly improve the efficiency of deep learning models, making them more effective on tasks involving multiple dependent variables. The study's findings are particularly relevant to fields such as predictive analytics and large-scale data classification, where processing speed and model stability are critical.

Keyphrases: Multi-label Classification, Bi-label Classification, Single-label Classification, Deep Learning, Neural Networks
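To make the comparison concrete, here is a minimal sketch of the bi-label setup described in the abstract, under the assumption that "bi-label classification" means a single FFNN predicting two dependent binary labels jointly from a shared hidden layer (as opposed to training one single-label network per target). The toy data, network sizes, and learning rate are illustrative choices, not the paper's actual experimental configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 input features, 2 correlated binary labels
# (both depend on X[:, 0], so the labels are not independent).
X = rng.normal(size=(200, 2))
Y = np.stack([(X[:, 0] > 0).astype(float),
              (X[:, 0] + X[:, 1] > 0).astype(float)], axis=1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(P, Y):
    # Mean binary cross-entropy over samples and labels.
    return -np.mean(Y * np.log(P + 1e-9) + (1 - Y) * np.log(1 - P + 1e-9))

# One hidden layer shared by both labels; two sigmoid output units.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 2)); b2 = np.zeros(2)

lr = 0.5
losses = []
for _ in range(300):
    H = np.tanh(X @ W1 + b1)       # shared hidden representation
    P = sigmoid(H @ W2 + b2)       # one probability per label
    losses.append(bce(P, Y))
    # Backprop: for sigmoid outputs with BCE loss, dL/dz2 = P - Y.
    dZ2 = (P - Y) / len(X)
    dW2 = H.T @ dZ2; db2 = dZ2.sum(0)
    dH = (dZ2 @ W2.T) * (1 - H**2)  # tanh derivative
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```

Because both output units share one hidden layer, gradient updates from each label shape the same representation, which is one plausible mechanism for the faster convergence the study reports on simpler datasets.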