1、神經(jīng)網(wǎng)絡(luò)的編程基礎(chǔ)BasicsofNeuralNetworkProgramming主講教師:秦曉飛上海理工大學(xué)光電學(xué)院SomethingYouwillLearningthisWeekHowtoavoidusing“for”looptostepthroughmtrainingexamples.Whyalearningprocesscanbeorganizedasforwardpropagationstep,followedbybackpropagationstep.Andhowtorealizethem.Weuselogisticregressionalgorismasanex
2、ampletoprocessbinaryclassificationproblems.2.1BinaryClassification2.1BinaryClassificationIfimagesizeis64×64,incomputeritisrepresentedasy=1(cat)vs0(noncat)Unrollitintoavectorthatcanbefedintoalearningalgorism2.1BinaryClassificationNotationsusedinthesecourses2.2LogisticRegression2.2LogisticReg
3、ressionLogisticregressionisalearningalgorithmusedinasupervisedlearningproblemwhentheoutput?arealleither1or0.Thegoaloflogisticregressionistominimizetheerrorbetweenitspredictionsandtrainingdata.Example:CatvsNo-catGivenanimagerepresentedbyafeaturevector?,thealgorithmwillevaluatetheprobabilityo
4、facatbeinginthatimage.2.2LogisticRegression2.3LogisticRegressionCostFunction2.3LogisticRegressionCostFunctionNon-convexNote:thislosefunctioncanbeseenasakindofcrossentropy,whichisdefinedasameasureofsimilarityoftwodistributions.2.3LogisticRegressionCostFunctionCostfunction:tomeasurethewholetr
5、ainingseterror.Thecostfunctionistheaverageofthelossfunctionoftheentiretrainingset.Wearegoingtofindtheparameters?????thatminimizetheoverallcostfunction.2.4GradientDescent2.4GradientDescentConvex2.4GradientDescentUse1dimensiontodemonstratehowgradientdescentwork:Gradientdescentprocessinmultidi
6、mensionalspace:2.5Derivatives2.6MoreDerivativeExamples2.7ComputationGraph2.7ComputationGraphForwardpropagation:tocomputetheoutput/inference/predictBackwardpropagation:tocomputethederivativeswhichisusedtoupdateparameters.Weusecomputationgraphtocomputeforwardpropagationandbackwardpropagation.
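The forward/backward idea can be sketched on a toy computation graph. The function J = 3(a + bc) and all variable names below are illustrative assumptions, not taken from the slides; the point is that the forward pass caches intermediate values and the backward pass reuses them with the chain rule.

```python
# Toy computation graph (illustrative example, not from the slides):
#   u = b*c,  v = a + u,  J = 3*v
def forward(a, b, c):
    u = b * c
    v = a + u
    J = 3 * v
    return J, (u, v)          # cache intermediates for the backward pass

def backward(a, b, c, cache):
    u, v = cache
    dJ_dv = 3.0               # J = 3v        -> dJ/dv = 3
    dJ_du = dJ_dv * 1.0       # v = a + u     -> dv/du = 1
    dJ_da = dJ_dv * 1.0       # dv/da = 1
    dJ_db = dJ_du * c         # u = b*c       -> du/db = c
    dJ_dc = dJ_du * b         # du/dc = b
    return dJ_da, dJ_db, dJ_dc

J, cache = forward(5, 3, 2)           # J = 3 * (5 + 3*2) = 33
da, db, dc = backward(5, 3, 2, cache) # (3.0, 6.0, 9.0)
```

Each derivative is obtained locally at a node and multiplied along the path back to the inputs; this is exactly the pattern backpropagation generalizes.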
2.8 Derivatives with a Computation Graph
- We use the Chain Rule to carry out backward propagation and calculate the derivatives.

2.9 Logistic Regression Gradient Descent
- Logistic regression recap: Suppose …
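Applying the chain rule to the logistic regression graph gives the single-example gradients dz = a − y, dw = x·dz, db = dz. A minimal NumPy sketch (the two-feature example values are made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_single(x, y, w, b):
    """Loss and gradients for one training example via the chain rule."""
    z = np.dot(w, x) + b                               # linear part
    a = sigmoid(z)                                     # prediction a = P(y=1 | x)
    loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))  # cross-entropy loss
    dz = a - y                                         # dL/dz (chain rule)
    dw = x * dz                                        # dL/dw_i = x_i * dz
    db = dz                                            # dL/db = dz
    return loss, dw, db

x = np.array([1.0, 2.0]); y = 1.0        # made-up example
w = np.zeros(2); b = 0.0
loss, dw, db = grad_single(x, y, w, b)
# With w = 0, b = 0: a = 0.5, so dz = -0.5, dw = [-0.5, -1.0], db = -0.5
```

One gradient-descent step is then w ← w − α·dw, b ← b − α·db for a learning rate α.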
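The week's first goal, stepping through all m training examples without an explicit Python for loop, can be sketched with vectorized NumPy operations. The (n_x, m)-shaped X and the tiny dataset below are assumptions chosen for illustration:

```python
import numpy as np

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def gradient_step(X, Y, w, b, alpha=0.1):
    """One vectorized gradient-descent step over all m examples at once.
    X: (n_x, m) features, Y: (1, m) labels, w: (n_x, 1), b: scalar."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                               # forward pass, (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dZ = A - Y                                             # (1, m)
    dw = (X @ dZ.T) / m                                    # (n_x, 1)
    db = np.sum(dZ) / m
    return w - alpha * dw, b - alpha * db, cost            # descent update

# Tiny made-up dataset: 2 features, m = 4 examples
X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 0.0]])
Y = np.array([[0, 0, 1, 1]])
w, b = np.zeros((2, 1)), 0.0
for _ in range(500):
    w, b, cost = gradient_step(X, Y, w, b)
# cost has dropped from the initial value of about 0.693
```

The per-example for loop is replaced by matrix products over the whole batch, which is the pattern the later vectorization lectures build on.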