
Unraveling the Black-box Magic: An Analysis of Neural Networks’ Dynamic Local Extrema

Abstract: We argue that neural networks are not black boxes: their generalization stems from the ability to dynamically map a dataset onto the local extrema of the model function. We further prove that the number of local extrema of a neural network is positively correlated with the number of its parameters, and on this basis we propose a new algorithm, distinct from back-propagation, which we call the extremum-increment algorithm. Difficult situations such as vanishing gradients and overfitting can be reasonably explained and handled within this framework.
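The abstract's claim that the number of local extrema grows with the parameter count can be probed numerically. The sketch below is only an illustration of that correlation, not the paper's extremum-increment algorithm (which the abstract does not specify): it counts the local extrema of a one-input, one-output tanh network with randomly drawn weights and reports how the count varies with hidden width. The architecture, weight scales, and grid resolution are all assumptions made here for illustration.

```python
# Illustrative sketch (assumed setup, not the paper's method): count local
# extrema of a random 1-hidden-layer tanh MLP on a 1-D grid and observe how
# the count tends to grow as the hidden width (parameter count) increases.
import numpy as np

rng = np.random.default_rng(0)

def mlp_output(x, W1, b1, w2, b2):
    """Forward pass of a 1-hidden-layer tanh network: R -> R, evaluated on a grid."""
    h = np.tanh(np.outer(x, W1) + b1)   # shape (n_points, width)
    return h @ w2 + b2                  # shape (n_points,)

def count_local_extrema(y):
    """Count sign changes of the discrete first difference along the grid."""
    signs = np.sign(np.diff(y))
    signs = signs[signs != 0]
    return int(np.sum(signs[1:] != signs[:-1]))

x = np.linspace(-5.0, 5.0, 20001)
for width in (2, 8, 32, 128):
    counts = []
    for _ in range(20):                 # average over random parameter draws
        W1 = rng.normal(0, 2.0, size=width)
        b1 = rng.normal(0, 2.0, size=width)
        w2 = rng.normal(0, 1.0, size=width)
        b2 = rng.normal()
        counts.append(count_local_extrema(mlp_output(x, W1, b1, w2, b2)))
    print(f"width={width:4d}  mean local extrema ~ {np.mean(counts):.1f}")
```

Random weight draws are used here rather than trained networks, so the script only illustrates the counting idea on a finite grid; it says nothing about how the paper's algorithm places data points at those extrema.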

Version history

[V1] 2025-07-08 11:01:44 ChinaXiv:202507.00082V1
Peer review status: Pending review
Metrics
  •  Views: 303
  •  Downloads: 67