This thesis presents a stability analysis for recurrent neural networks implemented in fixed-point arithmetic. The proposed neural network architecture is built from local-feedback state-space realizations. To minimize pole sensitivity, a similarity transformation converts the initial state-space realization with distinct poles into a normal-form matrix realization, giving the transformed structure excellent robustness. Exploiting the properties of normal-form matrices, we then use a fixed-point arithmetic model to derive a stability criterion that guarantees the recurrent neural network remains stable under finite-precision implementation; the criterion estimates the minimum word length required to ensure stability. Finally, numerical examples verify the effectiveness of the proposed method.
In this thesis, a stability analysis is proposed for recurrent neural networks (RNNs) under fixed-point implementations. The proposed RNN is composed of local feedback paths and state-space realizations. Based on the minimization of pole-sensitivity measures, similarity transformations convert the original state-space realizations with distinct poles into normal-form ones, yielding a structure with high robustness under finite-precision implementation. We then derive a mathematical model of the fixed-point state-space representation. Using the properties of normal-form matrices, we establish a stability criterion in terms of word-length estimates: the criterion evaluates the minimum word length that guarantees stability of the finite-precision RNN structure. Finally, numerical examples illustrate the effectiveness of the proposed approach.
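The ideas above can be illustrated with a minimal sketch. The thesis's actual criterion and network equations are not reproduced here; the example below only shows, under simplifying assumptions, (a) a 2×2 normal-form state matrix for a complex-conjugate pole pair p = a ± jb (such a matrix commutes with its transpose, the defining property of normality), and (b) a quantized state recursion x ← Q(Ax) that emulates fixed-point rounding with a chosen number of fractional bits. All names (`quantize`, `step`, the pole values) are hypothetical choices for illustration, not the thesis's notation.

```python
import math

# Hypothetical example: normal-form realization of a stable complex
# pole pair p = a ± jb with |p| = sqrt(a^2 + b^2) ≈ 0.943 < 1.
a, b = 0.8, 0.5
A = [[a, b],
     [-b, a]]          # normal form: A @ A.T == A.T @ A

def quantize(x, frac_bits):
    """Round x to a fixed-point grid with `frac_bits` fractional bits."""
    scale = 2 ** frac_bits
    return round(x * scale) / scale

def step(A, x, frac_bits):
    """One quantized state update: x <- Q(A x), component-wise rounding."""
    return [quantize(sum(A[i][j] * x[j] for j in range(2)), frac_bits)
            for i in range(2)]

# Simulate the finite-precision recursion from a nonzero initial state.
x = [1.0, -1.0]
for _ in range(200):
    x = step(A, x, frac_bits=12)

norm = math.hypot(x[0], x[1])
```

Because |p| < 1 and the rounding error injected per step is bounded by half a quantization step, the state contracts toward a small neighborhood of the origin instead of diverging; with too few fractional bits the same recursion can sustain large limit cycles, which is the kind of effect a word-length criterion is meant to rule out.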