Convolutional Neural Networks for Sentence Classification


ID:40352759

Size: 135.97 KB

頁數(shù):6頁

Date: 2019-07-31

Resource description:

"Convolutional Neural Networks for Sentence Classification" was uploaded and shared by a member and can be read online for free; more related content is available under Academic Papers at 天天文庫.

Convolutional Neural Networks for Sentence Classification

Yoon Kim
New York University
yhk255@nyu.edu

Abstract

We report on a series of experiments with convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks. We show that a simple CNN with little hyperparameter tuning and static vectors achieves excellent results on multiple benchmarks. Learning task-specific vectors through fine-tuning offers further gains in performance. We additionally propose a simple modification to the architecture to allow for the use of both task-specific and static vectors. The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, which include sentiment analysis and question classification.

1 Introduction

Deep learning models have achieved remarkable results in computer vision (Krizhevsky et al., 2012) and speech recognition (Graves et al., 2013) in recent years. Within natural language processing, much of the work with deep learning methods has involved learning word vector representations through neural language models. Convolutional neural networks (CNN) utilize layers with convolving filters that are applied to local features (LeCun et al., 1998). Originally invented for computer vision, CNN models have subsequently been shown to be effective for NLP and have achieved excellent results in semantic parsing (Yih et al., 2014), search query retrieval (Shen et al., 2014), sentence modeling (Kalchbrenner et al., 2014), and other traditional NLP tasks (Collobert et al., 2011).

In the present work, we train a simple CNN with one layer of convolution on top of word vectors obtained from an unsupervised neural language model. These vectors were trained by Mikolov et al. (2013) on 100 billion words of Google News, and are publicly available. We initially keep the word vectors static and learn only the other parameters of the model. Despite little tuning of hyperparameters, this simple model achieves excellent results on multiple benchmarks, suggesting that the pre-trained vectors are 'universal' feature extractors that can be utilized for various classification tasks. Learning task-specific vectors through fine-tuning results in further improvements. We finally describe a simple modification to the architecture to allow for the use of both pre-trained and task-specific vectors by having multiple channels.

Our work is philosophically similar to Razavian et al. (2014) ...
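To make the architecture described above concrete, the following is a minimal sketch of the CNN-static variant: pre-trained word vectors are frozen, a single layer of convolution with several window sizes is applied over them, followed by max-over-time pooling, dropout, and a linear classifier. This is not the author's original implementation; it assumes PyTorch, and names such as SentenceCNN and pretrained_vectors are illustrative. The filter windows (3, 4, 5), 100 feature maps per window size, and dropout of 0.5 match the hyperparameters the paper reports.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceCNN(nn.Module):
    def __init__(self, pretrained_vectors, num_classes,
                 filter_sizes=(3, 4, 5), num_filters=100, dropout=0.5):
        super().__init__()
        # CNN-static: initialize the embedding from pre-trained vectors and freeze it,
        # so only the convolution and classifier weights are learned.
        self.embedding = nn.Embedding.from_pretrained(pretrained_vectors, freeze=True)
        embed_dim = pretrained_vectors.size(1)
        # One layer of convolution: filters with several window sizes over the word vectors.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, kernel_size=k) for k in filter_sizes]
        )
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        x = self.embedding(token_ids).transpose(1, 2)   # (batch, embed_dim, seq_len)
        # Convolution + ReLU, then max-over-time pooling for each window size.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)             # (batch, num_filters * len(filter_sizes))
        return self.classifier(self.dropout(features))

# Usage sketch: random vectors stand in for the 300-dimensional word2vec vectors.
vectors = torch.randn(10000, 300)                       # hypothetical 10k-word vocabulary
model = SentenceCNN(vectors, num_classes=2)
logits = model(torch.randint(0, 10000, (8, 40)))        # batch of 8 sentences of length 40

The multichannel variant mentioned in the abstract would add a second copy of the embedding that is not frozen and apply each filter to both channels; that extension is omitted here to keep the sketch short.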

Only the first five pages of this document can be previewed; download the document to view the full text.

