Nonlinear Prediction Based on an Improved Echo State Neural Network (基于改進(jìn)的回聲狀態(tài)神經(jīng)網(wǎng)絡(luò)的非線性預(yù)測)


ID: 36455588
Size: 2.46 MB
Pages: 63
Uploaded: 2019-05-10


Nanjing Tech University, Master's Degree Thesis
Title: Nonlinear Prediction Based on an Improved Echo State Neural Network
Author: Wang Se (王瑟)
Degree sought: Master
Major: Computer Application Technology
Advisor: Wei Chengjian (蔚承建)
Date: 2006-05-15

Keywords: echo state neural network; wavelet neural network; wavelet decomposition; chaotic time series prediction; prior knowledge; PSO; swarm intelligence

ABSTRACT

Nonlinear system prediction with neural networks is highly effective and has abundant applications. Among these models, recurrent neural networks (RNNs) show the clearest advantages on such prediction tasks, although their learning methods saw little improvement for a long time. The echo state network (ESN) is both a novel RNN architecture and a novel RNN training method. It is structurally similar to biological neural networks and, as an RNN, has excellent short-term memory (STM) capability. It employs a large-scale RNN, called the dynamical reservoir (DR), as an information store; training then reduces to computing a simple regression weight matrix from the internal states to the output unit that minimizes the mean squared error (MSE).

However, the ESN contains an inherent contradiction: employing nonlinear neurons raises the nonlinear capability of the ESN but simultaneously reduces its STM. For tough tasks that require both high nonlinearity and good memory capacity (MC), such as chaotic time series prediction, a very large DR must be employed, which slows the ESN down and makes it less stable during the exploitation period.

Following the theory of prior knowledge in artificial neural networks (ANNs), the ESN can employ other neuron types to improve its performance; this thesis chooses the wavelon used in wavelet neural networks (WNNs). Adding suitably tuned wavelons enlarges the internal state space. The resulting SWHESN can predict 46% further than the original ESN without typical deviation, while consuming only 30% of the time the ESN needs to learn the same data sample; recall that the ESN itself had already improved on the best previous technology [1] by a factor of 700.

This thesis highlights three contributions:
1. We introduce the wavelon, which traditionally appears in feedforward ANNs, into an RNN.
2. We reduce the diversity among wavelons in order to smooth the working conditions inside the ESN, rather than augmenting it as feedforward ANNs must, since they need a larger embedded basis-function vector.
3. The parameters of echo state networks used in applications are commonly set by ESN experts, which usually wastes computational resources; in this thesis we present a method to optimi…
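The training procedure the abstract describes (drive a large fixed random recurrent reservoir with the input, then fit only a linear MSE-minimizing readout from the internal states to the output unit) can be sketched as follows. This is a generic ESN illustration, not the thesis's SWHESN; all dimensions, scalings, and the toy sine task are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the thesis does not give these values.
n_inputs, n_reservoir, washout = 1, 200, 100

# Dynamical reservoir: sparse random recurrent weights, rescaled so the
# spectral radius is below 1 (the usual echo state property condition).
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= rng.random((n_reservoir, n_reservoir)) < 0.1    # ~10% connectivity
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius 0.9
W_in = rng.uniform(-1, 1, (n_reservoir, n_inputs))

def run_reservoir(u):
    """Collect internal states x(t+1) = tanh(W x(t) + W_in u(t))."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(1000)
u = np.sin(0.2 * t)
y = np.sin(0.2 * (t + 1))          # teacher signal: next value

X = run_reservoir(u)[washout:]     # discard the initial transient
Y = y[washout:]

# Training is just a least-squares (MSE-minimizing) regression from
# internal states to the output unit; the reservoir is never retrained.
W_out, *_ = np.linalg.lstsq(X, Y, rcond=None)

pred = X @ W_out
mse = np.mean((pred - Y) ** 2)
print(f"training MSE: {mse:.2e}")
```

Only `W_out` is learned; this is what makes ESN training so much cheaper than classical RNN gradient methods, and it is the property the improved network in the thesis builds on.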
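The keyword list names PSO (particle swarm optimization) and swarm intelligence, presumably for the parameter-setting method that the truncated third contribution refers to. As a hedged sketch, a minimal generic PSO loop looks like this; the quadratic objective, swarm size, and coefficients are placeholders, since in the thesis the objective would instead be the ESN's prediction error as a function of its parameters (e.g. spectral radius, input scaling).

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in objective; assumed "good" settings at (0.9, 0.5).
def objective(params):
    target = np.array([0.9, 0.5])
    return np.sum((params - target) ** 2)

n_particles, n_dims, n_iters = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5              # inertia / cognitive / social weights

pos = rng.uniform(0, 2, (n_particles, n_dims))
vel = np.zeros((n_particles, n_dims))
pbest = pos.copy()                              # per-particle best position
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()      # swarm-wide best position

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_dims))
    # Standard velocity update: inertia + pull toward personal and global bests.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best parameters found:", gbest)
```

PSO is attractive for this use because it needs no gradients: each ESN hyperparameter candidate is evaluated simply by training and scoring a network, which matches the expert-free tuning the abstract motivates.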

Note: the online preview shows at most five pages; download the document to view the full text.

