Word Semantic Representations using Bayesian Probabilistic Tensor Factorization

Jingwei Zhang and Jeremy Salwen
Columbia University, Computer Science
New York, NY 10027, USA
{jz2541,jas2312}@columbia.edu

Michael Glass and Alfio Gliozzo
IBM T.J. Watson Research
Yorktown Heights, NY 10598, USA
{mrglass,gliozzo}@us.ibm.com

Abstract

Many forms of word relatedness have been developed, providing different perspectives on word similarity. We introduce a Bayesian probabilistic tensor factorization model for synthesizing a single word vector representation and per-perspective linear transformations from any number of word similarity matrices. The resulting word vectors, when combined with the per-perspective linear transformations, approximately recreate, while also regularizing and generalizing, each word similarity perspective.

Our method can combine manually created semantic resources with neural word embeddings to separate synonyms and antonyms, and is capable of generalizing to words outside the vocabulary of any particular perspective. We evaluated the word embeddings on GRE antonym questions, where they achieve state-of-the-art performance.

… word distributions. For instance, they are believed to have difficulty distinguishing antonyms from synonyms, because the distributions of antonymous words are close: the contexts of antonymous words are often similar to each other (Mohammad et al., 2013). Although some research claims that in certain conditions there do exist differences between the contexts of different antonymous words (Scheible et al., 2013), the differences are subtle enough that they can hardly be detected by such language models, especially for rare words.

Another important class of lexical resource for word relatedness is the lexicon, such as WordNet (Miller, 1995) or Roget's Thesaurus (Kipfer, 2009). Manually producing or extending lexicons is much more labor intensive than generating VSM word vectors from a corpus. Thus, lexicons are sparse, with missing words and multi-word terms as well as missing relationships between words. Considering the synonym/antonym perspective as an example, WordNet directly answers less than 40% of the GRE antonym questions provided by Mohammad et al. (2008). Moreover, binary entries in lexicons do not ind…
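The core idea in the abstract — one shared word-vector matrix plus a learned linear transformation per similarity perspective — can be sketched in code. The following is a minimal point-estimate (MAP-style) simplification using plain gradient descent with L2 regularization, not the paper's full Bayesian inference; all sizes, hyperparameters, and names (`fit_shared_factorization`, `rank`, `reg`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_shared_factorization(sim_matrices, rank=8, lr=0.01, epochs=500, reg=0.1):
    """Jointly factor K word-similarity matrices S_k ~= V @ W_k @ V.T,
    sharing one word-vector matrix V across perspectives while each
    perspective gets its own transformation W_k.  Trained by gradient
    descent on squared reconstruction error plus L2 regularization
    (a MAP-style simplification of the Bayesian model in the paper)."""
    n = sim_matrices[0].shape[0]
    V = rng.normal(scale=0.1, size=(n, rank))
    Ws = [rng.normal(scale=0.1, size=(rank, rank)) for _ in sim_matrices]
    for _ in range(epochs):
        grad_V = reg * V
        for S, W in zip(sim_matrices, Ws):
            R = V @ W @ V.T - S            # residual for this perspective
            grad_V += R @ V @ W.T + R.T @ V @ W
            W -= lr * (V.T @ R @ V + reg * W)
        V -= lr * grad_V
    return V, Ws
```

Because each perspective only contributes its own `W_k`, a word observed in one similarity matrix still receives a shared vector usable under every other perspective — the generalization property the abstract claims. For example, two opposing perspectives (akin to synonymy vs. antonymy) can share one `V` while their `W_k` differ in sign.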
