
Forced Selective Information Reduction for Interpreting Multi-Layered Neural Networks

17 pages. Published: September 20, 2022

Abstract

The present paper aims to reduce unnecessary information obtained through the inputs, supposed to be inappropriately encoded, in order to produce easily interpretable networks with better generalization. The proposed method mainly consists in the forced reduction of selective information, even at the expense of a larger cost, so as to eliminate unnecessary information coming from the inputs in the initial stage of learning. In the later stage of learning, selective information is then increased to produce a small number of really important connection weights. The method was preliminarily applied to two business data sets, the bankruptcy and the mission statement data sets, for which interpretation is considered as important as generalization performance. The results show that selective information could be decreased, though the cost of realizing this reduction became larger. However, the accompanying increase in selective information could compensate for this expensive cost, producing simpler and more interpretable internal representations with better generalization performance.
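The two-stage scheme described above can be sketched in code. The snippet below is a minimal illustration, not the authors' formulation: the concrete selectivity measure (deviation of the normalized absolute weights from a uniform distribution, in bits) and the function names `selective_information` and `phase_coefficient` are assumptions made for this sketch. In the early phase a positive coefficient would penalize (reduce) selective information; after a switch point a negative coefficient would reward (increase) it, concentrating importance on a few connection weights.

```python
import numpy as np

def selective_information(weights, eps=1e-10):
    """Illustrative selectivity measure (an assumption, not the paper's
    exact definition): how far the normalized absolute weights deviate
    from the uniform distribution, in bits. Zero means all weights
    contribute equally; values near log2(n) mean one weight dominates."""
    p = np.abs(np.asarray(weights, dtype=float)).ravel()
    p = p / (p.sum() + eps)
    n = p.size
    uniform_entropy = np.log2(n)            # maximum possible entropy
    entropy = -np.sum(p * np.log2(p + eps)) # entropy of the weight profile
    return uniform_entropy - entropy

def phase_coefficient(epoch, switch_epoch, strength=0.1):
    """Two-stage schedule: a positive coefficient added to the training
    cost penalizes (forces down) selective information in the initial
    stage; a negative one rewards (increases) it in the later stage."""
    return strength if epoch < switch_epoch else -strength
```

In a training loop, the regularized cost would be the usual task loss plus `phase_coefficient(epoch, switch_epoch) * selective_information(W)` for each layer `W`; the sign flip at `switch_epoch` realizes the reduction-then-increase behavior the abstract describes.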

Keyphrases: cost, generalization, interpretation, multi layered neural networks, selective information

In: Tokuro Matsuo (editor). Proceedings of the 11th International Congress on Advanced Applied Informatics, vol. 81, pages 24-40.

BibTeX entry
@inproceedings{IIAIAAI2021-Winter:Forced_Selective_Information_Reduction,
  author    = {Ryotaro Kamimura and Ryozo Kitajima},
  title     = {Forced Selective Information Reduction for Interpreting Multi-Layered Neural Networks},
  booktitle = {Proceedings of 11th International Congress on Advanced Applied Informatics},
  editor    = {Tokuro Matsuo},
  series    = {EPiC Series in Computing},
  volume    = {81},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {/publications/paper/wKKF},
  doi       = {10.29007/n4kz},
  pages     = {24-40},
  year      = {2022}}