

    Please use this identifier to cite or link to this item: http://asiair.asia.edu.tw/ir/handle/310904400/112219


    Title: 遞迴式與前饋式多層感知機之研究與實作
    Investigations and Implementations for Recurrent Neural Networks and Feedforward Multiple Layer Perceptron
    Authors: 王原彬
    WANG, YUAN-BIN
    Contributors: 光電與通訊學系 (Department of Photonics and Communication Engineering)
    Keywords: 多層前饋式類神經網路;多層遞迴式類神經網路
    multiple layer feedforward neural networks;multiple layer recurrent neural network
    Date: 2019
    Issue Date: 2019-10-28
    Publisher: 亞洲大學 (Asia University)
    Abstract: In this thesis, the learning efficiency of single-layer and multiple-layer feedforward neural networks (FNN) and recurrent neural networks (RNN) is investigated. In the RNN structure, an infinite impulse response (IIR) digital filter plays the role of the signal recursion, and piecewise linear activation functions are used. In the RNN implementation, the pole and L2-norm sensitivities are minimized jointly. The weight and bias of every neuron are updated with the back-propagation learning algorithm. The learning performance on a simple sinusoidal (transcendental) function and on a more complicated composite function is then compared. The simulation results show that the single-layer FNN and RNN have similar learning efficiency on the simple sinusoidal function, whereas on the more complicated function the single-layer RNN needs markedly fewer learning iterations, indicating that an RNN with few neurons can effectively reduce the number of training iterations. When the number of layers is increased, however, the FNN requires fewer learning iterations than the RNN.
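    The abstract describes a recurrent network whose feedback path acts as an IIR filter, with a piecewise linear activation and back-propagation updates of the weights and biases. The following is a minimal illustrative sketch of that kind of setup, not the thesis implementation: it assumes a single recurrent neuron with a first-order recursion, a hard-limited (hard-tanh style) piecewise linear activation, and plain back-propagation through time on a sinusoid. The pole-L2 sensitivity minimization step is omitted, and all names and hyperparameters are assumptions.

    # Minimal sketch (not the thesis code): one recurrent neuron whose feedback
    # acts as a first-order IIR recursion, with a piecewise linear activation,
    # trained by back-propagation through time to fit a sinusoid.
    import numpy as np

    def pwl(x):
        """Piecewise linear activation: identity on [-1, 1], clipped outside."""
        return np.clip(x, -1.0, 1.0)

    def pwl_grad(x):
        """Derivative of the piecewise linear activation (1 inside the linear region)."""
        return ((x > -1.0) & (x < 1.0)).astype(float)

    rng = np.random.default_rng(0)
    w_in, w_rec, bias, w_out = rng.normal(scale=0.3, size=4)  # scalar weights and bias
    lr, T = 0.05, 64                                          # learning rate, sequence length

    xs = np.linspace(-1.0, 1.0, T)                            # input samples
    targets = np.sin(np.pi * xs)                              # sinusoidal function to learn

    for epoch in range(2000):
        # Forward pass: s[t] = pwl(w_in*x[t] + w_rec*s[t-1] + bias) is the recursion,
        # y[t] = w_out * s[t] is the network output.
        pre = np.zeros(T)
        s = np.zeros(T)
        prev = 0.0
        for t in range(T):
            pre[t] = w_in * xs[t] + w_rec * prev + bias
            s[t] = pwl(pre[t])
            prev = s[t]
        ys = w_out * s
        err = ys - targets

        # Backward pass: back-propagation through time, written out for one neuron.
        g_in = g_rec = g_b = g_out = 0.0
        ds_next = 0.0                                         # gradient flowing back through the recursion
        for t in reversed(range(T)):
            ds = err[t] * w_out + ds_next
            dpre = ds * pwl_grad(pre[t])
            g_out += err[t] * s[t]
            g_in += dpre * xs[t]
            g_rec += dpre * (s[t - 1] if t > 0 else 0.0)
            g_b += dpre
            ds_next = dpre * w_rec

        # Gradient-descent update of the weights and bias.
        w_in -= lr * g_in / T
        w_rec -= lr * g_rec / T
        bias -= lr * g_b / T
        w_out -= lr * g_out / T

    print("final MSE:", float(np.mean(err ** 2)))

    A feedforward counterpart of this sketch would simply drop the w_rec term, which is the comparison the abstract reports in terms of learning iterations.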
    Appears in Collections: [光電與通訊學系] 博碩士論文 (Theses and Dissertations)

    Files in This Item:

    File: index.html (HTML, 0Kb)

