A New Supervised Competitive Learning Algorithm for Probabilistic Neural Network
a hidden-layer neuron is defined by each training sample. As a result, a huge number of training samples leads to a large-scale network structure, which limits the wider adoption and application of the PNN.
For this reason, many scholars have researched how to simplify the topology of the PNN. For example, the K-means algorithm [2] or LVQ (Learning Vector Quantization) [3] is used to cluster the training samples, and the cluster centers are taken as the PNN hidden-center vectors. The EM (Expectation Maximization) algorithm [4] assumes that the training samples come from a mixture of normal distributions; the means of those distributions are estimated and then used as the PNN hidden-center vectors.
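As a rough illustration of this center-reduction idea, the sketch below clusters one class's training samples with K-means and keeps only the centroids as hidden centers. It is a minimal assumption-laden sketch, not the implementations cited in [2]–[4]; the function name and parameters are illustrative.

```python
import numpy as np

def kmeans_centers(samples, k, iters=20, seed=0):
    """Cluster one class's samples and return the k centroids
    that would replace them as PNN hidden-center vectors."""
    rng = np.random.default_rng(seed)
    centers = samples[rng.choice(len(samples), k, replace=False)]
    for _ in range(iters):
        # assign every sample to its nearest current center
        d = np.linalg.norm(samples[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centers[j] = samples[labels == j].mean(axis=0)
    return centers

# 200 training samples collapse to 3 hidden-center vectors
X = np.random.default_rng(1).normal(size=(200, 2))
print(kmeans_centers(X, 3).shape)  # (3, 2)
```

The hidden layer then holds 3 neurons instead of 200, which is precisely the topology simplification these methods aim at.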
None of the above approaches uses the learning capability of neural networks for parameter adjustment; they belong to unsupervised learning. Compared with unsupervised learning, a supervised learning network has advantages in classification accuracy, adaptivity, and generalization.
A new supervised competitive learning algorithm for the PNN is developed: the classification result of the PNN is employed to adjust the locations of the hidden-center vectors. Experimental results on PNN-based spam filtering indicate that the proposed algorithm gives the PNN classifier better classification performance.
2 Probabilistic Neural Network
In essence, a probabilistic neural network uses a neural-network structure to implement the Parzen window method. As shown in Fig. 2-1, a typical PNN can be divided into an input layer, a hidden layer, a summation layer, and an output layer [1].
Fig. 2-1 Structure of PNN
Input Layer:
Accepts the input vector x to be classified; no calculation is performed.
Hidden Layer:
The nodes of this layer are connected to all inputs of the first layer. Each hidden unit has the Gaussian basis function as its activation function:

\phi_{ij}(x) = \exp(-\|x - c_{ij}\|^2 / (2\sigma^2))    (1)

where i = 1, …, M, j = 1, …, N_i; M is the number of categories and N_i is the number of training samples in the i-th class. Here \sigma is the standard deviation, also known as the smoothing factor. The input vector x and the centers c_{ij} of the kernel are of dimensionality d.
Summation Layer:
This layer computes the class probability functions through a combination of outputs from the hidden layer:

f_i(x) = \frac{1}{(2\pi)^{d/2} \sigma^d N_i} \sum_{j=1}^{N_i} \phi_{ij}(x)    (2)
Output Layer:
Output

\hat{c} = \arg\max_i \, P_i f_i(x)

where P_i is the prior probability of class i and \hat{c} is the category estimated by the PNN.
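The four layers above can be sketched as one forward pass. This is a minimal illustrative implementation under the stated uniform-σ Gaussian kernel; the function name, the example centers, and the equal priors are assumptions, not part of the original design.

```python
import numpy as np

def pnn_classify(x, centers, priors, sigma=1.0):
    """centers: one (N_i, d) array per class, holding that class's
    training samples. Returns the index of the class maximizing P_i * f_i(x)."""
    d = x.shape[0]
    norm = (2 * np.pi) ** (d / 2) * sigma ** d
    scores = []
    for P_i, C in zip(priors, centers):
        # hidden layer: Gaussian kernel at every stored center, Eq. (1)
        phi = np.exp(-np.sum((C - x) ** 2, axis=1) / (2 * sigma ** 2))
        # summation layer: Parzen estimate of the class density, Eq. (2)
        f_i = phi.sum() / (norm * len(C))
        scores.append(P_i * f_i)
    # output layer: arg max of P_i * f_i(x)
    return int(np.argmax(scores))

centers = [np.array([[0.0, 0.0], [0.2, 0.1]]),   # class 0 samples
           np.array([[3.0, 3.0], [2.9, 3.1]])]   # class 1 samples
print(pnn_classify(np.array([2.8, 3.0]), centers, [0.5, 0.5]))  # 1
```

Note that every training sample becomes a hidden neuron here, which is exactly the scalability problem motivating this paper.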
3 Supervised Competitive Learning Algorithm
Methods that use the K-means, EM, or LVQ algorithm to select the hidden-center vectors are unsupervised. Compared with unsupervised learning, a supervised learning network has advantages in classification accuracy, adaptivity, and generalization. In [5], Farshid Delgosha and Mohammad B. Menhaj estimate the probability density function with a mixture of normal distributions over the training samples. All normal distributions share one covariance matrix. They first vanish all cross-covariances of the center vectors and then adjust these vectors by a competitive learning process. In [6], Cai Qulin developed a new learning algorithm for the PNN: learning vector quantization is employed to group the training samples, and genetic algorithms (GAs) are used to train the network's smoothing parameters and hidden-center vectors and thereby determine the hidden neurons. Both training processes are based on the learning capability of neural networks for parameter adjustment. However, they also have the following shortcomings: as the number of hidden centers increases, the order of the covariance matrix in [5] becomes very high; as a result, it takes up a great deal of storage space and the calculation is time-consuming. Genetic algorithms [6] depend to a certain degree on the choice of the initial population and cannot use the network's feedback in a timely manner, so the search speed of the algorithm is slow. On the basis of the research in [5] and [6], a new supervised competitive learning algorithm for the PNN is developed. In this algorithm, the PNN model assumes that the training samples of every category follow a symmetric unimodal distribution, which avoids the complex calculation in [5] caused by the use of the covariance matrix. At the same time, the algorithm adjusts the original hidden-center vectors by competitive learning, which overcomes the slow search speed in [6].
In this chapter, the detailed work comprises an analysis of the PNN's decision boundary for the case of overlapping categories, the concrete design of the algorithm, the determination of the convergence condition, and a method for selecting the smoothing parameters.
3.1 Analysis of PNN's Decision Boundary
In the case of non-overlapping categories, the classification results of the PNN are reliable. When there is overlap between classes, the common portions of the partially overlapped classes must be left out, and only the non-overlapping sections may be applied to the PNN classifier.
Suppose that a sample vector x, belonging to the j-th class, is presented to the PNN, and the risk of choosing this class is calculated as R_j(x). Let the j'-th class have the minimum risk R_{j'}(x) among all classes excluding the correct class j, i.e.:

R_{j'}(x) = \min_{i \ne j} R_i(x)    (3)
A measure of misclassification can be introduced as:

m = R_j(x) - R_{j'}(x)    (4)
Three scenarios are of interest:
1) If m is negative, the classification is performed correctly;
2) If m is positive, the sample vector x is misclassified;
3) A small magnitude of m signals a sample vector near the decision boundary, which may overlap with the regions of other classes.
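Equations (3) and (4) and the three scenarios can be checked numerically. The sketch below assumes the per-class risks R_i(x) are already available as a list; the function name and the example values are illustrative only.

```python
def misclassification_measure(risks, true_class):
    """m = R_j(x) - R_{j'}(x): negative when the true class j has the
    lowest risk (correct), positive when some other class beats it."""
    r_j = risks[true_class]
    others = [r for i, r in enumerate(risks) if i != true_class]
    r_jprime = min(others)   # Eq. (3): best competing class j'
    return r_j - r_jprime    # Eq. (4)

print(misclassification_measure([0.25, 0.5, 0.9], 0))   # -0.25 (correct)
print(misclassification_measure([0.75, 0.25, 0.9], 0))  #  0.5  (misclassified)
```

A value of m near zero in either direction flags the boundary samples that scenario 3) warns about.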
3.2 Supervised Competitive Learning for PNN
Methods that use the K-means, EM, or LVQ algorithm to select the hidden-center vectors are unsupervised. To overcome their shortcomings in classification accuracy, adaptivity, and generalization, a new supervised competitive learning algorithm for the PNN is developed as follows:
1) Cluster the training samples. Adjacent sample vectors are grouped into a cluster that is represented by a prototype vector. In this method, the activation function may be estimated as:

f_i(x) = \frac{1}{(2\pi)^{d/2} \sigma^d} \sum_{q=1}^{Q} \pi_q \exp(-\|x - w_q\|^2 / (2\sigma^2))    (5)

where Q is the total number of clusters, and \pi_q and w_q are respectively the proportion and the prototype vector of the q-th cluster in the estimation. The \pi_q's must obviously sum to unity.
The K-means algorithm, the EM algorithm, the LVQ algorithm, or any other data clustering algorithm can be used to cluster the samples.
2) Define the cluster center vectors as the PNN hidden centers to design the PNN network.
3) Present the training sample vectors to the PNN classifier designed in 2), one by one.
4) For a sample vector x of the j-th class, suppose the classifier's output class index is j'. One of the following two cases may take place:

a) if j' = j, then update the winning hidden-center vector w as follows (0 < \alpha < 1):

w(t+1) = w(t) + \alpha (x - w(t)),

where w is the hidden-center vector nearest to x;

b) if j' \ne j, then

w(t+1) = w(t) - \alpha (x - w(t)),

where w is again the winning (nearest) hidden-center vector.
5) Repeat steps 2), 3), and 4) until the specified termination conditions are met or the required number of learning iterations is reached.
In essence, this algorithm uses competitive learning to adjust the locations of the clustered hidden-center vectors according to the classification result of the PNN. Competitive learning is well suited to discovering the statistical characteristics of the data.
In the case of non-overlapping categories, the learning algorithm is terminated once all the training data vectors are correctly classified. However, when there is overlap between categories, the algorithm presented above will not converge.
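The loop of steps 3)–5) can be sketched as follows. The attract/repel update of the winning center is one reading of step 4), and `predict` is a hypothetical stand-in for the full PNN classifier of step 2); both are assumptions for illustration.

```python
import numpy as np

def supervised_competitive_update(centers, x, y, predict, alpha=0.05):
    """One pass of step 4): move the winning hidden center toward x if
    the PNN classified x correctly, away from x otherwise."""
    y_hat = predict(x)                                    # step 3): classify x
    win = np.argmin(np.linalg.norm(centers - x, axis=1))  # competition winner
    if y_hat == y:
        centers[win] += alpha * (x - centers[win])        # case a): attract
    else:
        centers[win] -= alpha * (x - centers[win])        # case b): repel
    return centers

centers = np.array([[0.0, 0.0], [4.0, 4.0]])
labels = np.array([0, 1])
# hypothetical predictor: nearest-center rule standing in for the full PNN
predict = lambda x: labels[np.argmin(np.linalg.norm(centers - x, axis=1))]
supervised_competitive_update(centers, np.array([1.0, 1.0]), 0, predict)
print(centers[0])  # winner moved toward the correctly classified sample
```

Unlike the unsupervised clustering of step 1), the sign of each update is driven by the network's own classification feedback, which is what makes the learning supervised.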
3.3 Convergence Condition
When there is overlap between categories, a small magnitude of m, defined by (4), signals a sample vector near the decision boundary that may overlap with the regions of other classes, and can thus degrade the learning algorithm. To resolve this degradation problem, the effect of the misclassification measure must be included in the training process through the introduction of a penalty function. The PNN disregards this problem and makes use of the following penalty function, which is, in fact, a hard decision:

\ell(m) = 0 for m < 0, and \ell(m) = 1 for m \ge 0    (6)
In other words, the PNN shows equal sensitivity to all sample vectors regardless of their relative distance to the decision boundary. A non-sharp (smooth) penalty function of the log-sigmoid type, defined below, performs a soft decision instead:

\ell(m) = 1 / (1 + \exp(-(m + b)/s))    (7)

In the above equation, b is the bias introduced for more robustness and s is the softness parameter. The curves of the two penalty functions above are depicted in Fig. 3-1.
Fig. 3-1 Hard and soft penalty functions
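The two penalty functions can be compared numerically as below; the sign convention for the bias b in (7) and the sample parameter values are assumptions.

```python
import math

def hard_penalty(m):
    """Eq. (6): step function, equally sensitive to every sample."""
    return 0.0 if m < 0 else 1.0

def soft_penalty(m, b=0.1, s=0.5):
    """Eq. (7): log-sigmoid, graded by distance to the decision boundary."""
    return 1.0 / (1.0 + math.exp(-(m + b) / s))

for m in (-2.0, -0.1, 0.0, 0.1, 2.0):
    print(m, hard_penalty(m), round(soft_penalty(m), 3))
```

The hard penalty jumps from 0 to 1 at m = 0, while the soft penalty rises gradually through the boundary region, so samples far from the boundary contribute little and the learning algorithm can converge despite class overlap.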