Discover Feature Engineering
How to Engineer Features and How to Get Good at It

Importance of Feature Engineering
● Better features mean flexibility.
● Better features mean simpler models.
● Better features mean better results.

What is Feature Engineering?
● Feature engineering is the process of transforming raw data into features that better represent the underlying problem to the predictive models, resulting in improved model accuracy on unseen data.

Sub-Problems of Feature Engineering
● Feature Importance (correlation, random forest) – An estimate of the usefulness of a feature
● Feature Extraction (PCA) – The automatic construction of new features from raw data (a sketch follows this list)
● Feature Selection (ranking score, wrapper, LASSO) – From many features to a few that are useful
● Feature Construction – The manual construction of new features from raw data
● Feature Learning – The automatic identification and use of features in raw data
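As a concrete instance of the feature extraction sub-problem, here is a minimal sketch using scikit-learn's PCA. The Iris data and the choice of two components are illustrative assumptions, not part of the slides.

# Minimal sketch of automatic feature extraction with PCA (scikit-learn).
# The Iris data and the two-component choice are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)

# Project the four raw measurements onto two new, uncorrelated features.
pca = PCA(n_components=2)
X_new = pca.fit_transform(X)

print(X.shape, "->", X_new.shape)       # (150, 4) -> (150, 2)
print(pca.explained_variance_ratio_)    # variance captured by each component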

Iterative Process of Feature Engineering
● Brainstorm features
● Devise features
● Select features
● Evaluate models

General Examples of Feature Engineering
● Decompose Categorical Attributes – e.g. an "Item_Color" that can be Red, Blue or Unknown
● Decompose a Date-Time – e.g. 2014-09-20T20:45:40Z
● Reframe Numerical Quantities – e.g. Num_Customer_Purchases → Purchases_Summer, Purchases_Fall
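The three examples above can be sketched with pandas as follows. The column names and toy values are illustrative assumptions rather than data from the slides.

# Sketch of the three general examples above, using pandas.
# Column names and toy values are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "Item_Color": ["Red", "Blue", "Unknown"],
    "Purchase_Time": pd.to_datetime(
        ["2014-09-20T20:45:40Z", "2014-06-01T09:10:00Z", "2014-11-05T13:00:00Z"]
    ),
    "Num_Customer_Purchases": [12, 3, 7],
})

# Decompose a categorical attribute into binary indicator features.
df = pd.get_dummies(df, columns=["Item_Color"])

# Decompose a date-time into components a model can use directly.
df["Purchase_Month"] = df["Purchase_Time"].dt.month
df["Purchase_Hour"] = df["Purchase_Time"].dt.hour

# Reframe a numerical quantity, e.g. split total purchases by season.
df["Purchases_Summer"] = df["Num_Customer_Purchases"].where(
    df["Purchase_Month"].isin([6, 7, 8]), 0)
df["Purchases_Fall"] = df["Num_Customer_Purchases"].where(
    df["Purchase_Month"].isin([9, 10, 11]), 0)

print(df)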

Feature selection in sklearn
● Removing features with low variance – VarianceThreshold
● Univariate feature selection – Regression: p-values; Classification: ANOVA F-value
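A minimal sketch of these two scikit-learn selectors follows. The Iris data, the 0.2 variance threshold and k=2 are illustrative assumptions.

# Sketch of the two sklearn selectors named above.
# The Iris data, the 0.2 threshold and k=2 are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, VarianceThreshold, f_classif

X, y = load_iris(return_X_y=True)

# 1. Remove features whose variance falls below a threshold.
vt = VarianceThreshold(threshold=0.2)
X_vt = vt.fit_transform(X)

# 2. Univariate selection: score each feature on its own with the ANOVA
#    F-value (classification); f_regression plays the same role for regression.
skb = SelectKBest(score_func=f_classif, k=2)
X_best = skb.fit_transform(X, y)

print("variance threshold:", X.shape, "->", X_vt.shape)
print("ANOVA F-value (k=2):", X.shape, "->", X_best.shape)
print(skb.scores_)   # per-feature ANOVA F-values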

Variable Ranking
● Correlation Criteria – Pearson correlation coefficient
● Single Variable Classifiers – ROC curve (x: FPR, y: TPR) and its AUC
● Information Theoretic Ranking Criteria
● Noisy (non-informative) features
● Applying univariate feature selection before the SVM increases the SVM weight attributed to the significant features
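A minimal sketch of the first two ranking criteria, scoring each feature by its absolute Pearson correlation with the target and by the ROC AUC it achieves when used alone as a classifier score. The breast-cancer data set and the folding of AUC around 0.5 are illustrative assumptions.

# Sketch of two variable-ranking criteria: |Pearson r| with the target and
# the ROC AUC of each feature used on its own as a classifier score.
# The breast-cancer data set is an illustrative assumption.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score

X, y = load_breast_cancer(return_X_y=True)

# Correlation criterion: absolute Pearson correlation of each feature with y.
pearson = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])

# Single-variable classifier: treat the raw feature value as a score and
# measure its ROC AUC; fold around 0.5 so negatively related features rank too.
auc = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
auc = np.maximum(auc, 1.0 - auc)

# Rank features by each criterion (best first).
print("top 5 by |Pearson r|:", np.argsort(-pearson)[:5])
print("top 5 by single-variable AUC:", np.argsort(-auc)[:5])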

Limitations of variable ranking
● Can Presumably Redundant Variables Help Each Other?
● How Does Correlation Impact Variable Redundancy?
● Can a Variable that is Useless by Itself be Useful with Others?

Feature selection in sklearn
● Recursive feature elimination – start with all features; at each step the features with the smallest absolute weights are pruned (e.g. with an SVC)
● L1-based feature selection – Lasso (the higher the alpha, the fewer features); SVMs and logistic regression (the smaller the C, the fewer features)
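Both approaches can be sketched as follows, using RFE around a linear SVC and SelectFromModel around an L1-penalized logistic regression. The data set, n_features_to_select=5 and C=0.1 are illustrative assumptions.

# Sketch of the two sklearn approaches above: recursive feature elimination
# with a linear SVC, and L1-based selection through SelectFromModel.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE, SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Recursive feature elimination: fit, prune the features with the smallest
# absolute weights, refit, and repeat until 5 features remain.
rfe = RFE(LinearSVC(dual=False), n_features_to_select=5)
X_rfe = rfe.fit_transform(X, y)

# L1-based selection: a smaller C means a sparser model, hence fewer features
# (with Lasso the knob is alpha: the higher the alpha, the fewer features).
l1 = SelectFromModel(LogisticRegression(penalty="l1", solver="liblinear", C=0.1))
X_l1 = l1.fit_transform(X, y)

print("RFE:", X.shape, "->", X_rfe.shape)
print("L1 :", X.shape, "->", X_l1.shape)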
