Affiliation: Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences
Source: Journal of Software (《软件学报》), 2003, No. 5, pp. 930-935 (6 pages)
Abstract: This paper studies the discrete-time, continuous-state Hopfield network model in the case where the neuron activation function is monotonically increasing but not necessarily strictly increasing. By defining a new energy function and analyzing the properties of such activation functions, sufficient conditions are derived for convergence under both serial and parallel update modes, as well as a sufficient condition for the network to have a unique global stable point. By establishing when the energy function is convex in the network state variables, the operation of the Hopfield network can be viewed as solving a constrained convex optimization problem, which yields a sufficient condition for the network to have only a global minimum. Specifically: the network converges under serial updating when each neuron's self-feedback weight is greater than the reciprocal of the derivative of its activation function; it converges under parallel updating when the smallest eigenvalue of the connection weight matrix is greater than the reciprocal of the derivative of the activation function; and if the energy function of the network is convex, the network has exactly one global stable point. These results widen the range of admissible neuron activation functions when Hopfield networks are applied to optimization problems and associative memory.
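The model under discussion is the usual discrete-time, continuous-state Hopfield iteration, in which each neuron computes its activation from a weighted sum of the current states. The following is a minimal sketch of the two update modes the abstract compares; the names `g`, `W`, and `theta` are illustrative assumptions (tanh stands in for a monotonically increasing activation), and the sketch demonstrates only the serial and parallel dynamics, not the paper's sufficient conditions.

```python
import numpy as np

# Illustrative sketch (not the paper's code) of a discrete-time,
# continuous-state Hopfield network run in serial or parallel mode.
# g, W, and theta are assumed names; tanh is a convenient choice of
# monotonically increasing activation function.

def g(x):
    """Monotonically increasing activation function."""
    return np.tanh(x)

def serial_step(x, W, theta):
    """Serial (asynchronous) update: neurons change one at a time,
    each seeing the already-updated states of earlier neurons."""
    x = x.copy()
    for i in range(len(x)):
        x[i] = g(W[i] @ x - theta[i])
    return x

def parallel_step(x, W, theta):
    """Parallel (synchronous) update: all neurons change at once."""
    return g(W @ x - theta)

def run(step, x0, W, theta, iters=500, tol=1e-9):
    """Iterate until successive states differ by less than tol."""
    x = x0
    for _ in range(iters):
        x_new = step(x, W, theta)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, True
        x = x_new
    return x, False
```

For a symmetric weight matrix with spectral radius below 1 and a bounded activation such as tanh, both modes contract to a fixed point satisfying x* = g(Wx* - θ); the abstract's conditions address the broader setting, including activations that are not strictly increasing.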
Field: [Automation and Computer Technology]