
Neural Network Theory and Applications
Homework Assignment 1
oxstar, SJTU
January 19, 2012

Problem one

One variation of the perceptron learning rule is

    W_new = W_old + α e p^T
    b_new = b_old + α e

where α is called the learning rate. Prove convergence of this algorithm. Does the proof require a limit on the learning rate? Explain.

Proof. We combine the weight matrix and the bias into a single vector:

    x = [W; b],    z_q = [p_q; 1].    (1)

So the net input and the perceptron learning rule can be written as

    W^T p + b = x^T z,    (2)
    x_new = x_old + α e z.    (3)

We take into account only those iterations for which the weight vector is changed, so the learning rule becomes (WLOG, assume that x(0) = 0)

    x(k) = x(k-1) + α dz(k-1)    (4)
         = α dz(0) + α dz(1) + ... + α dz(k-1),    (5)

where dz ∈ {-z_Q, ..., -z_1, z_1, ..., z_Q}. Assume that a correct weight vector x* exists; then for every dz we can say

    x*^T dz > δ > 0.    (6)

From Equation 5 and Equation 6 we can show that

    x*^T x(k) = α x*^T dz(0) + α x*^T dz(1) + ... + α x*^T dz(k-1)    (7)
              > k α δ.    (8)

From the Cauchy-Schwarz inequality we have

    ‖x*‖² ‖x(k)‖² ≥ (x*^T x(k))² > (k α δ)².    (9)

The weights are updated only if the previous weights were incorrect, so we have x^T(k-1) dz(k-1) ≤ 0, and

    ‖x(k)‖² = x^T(k) x(k)    (10)
            = [x(k-1) + α dz(k-1)]^T [x(k-1) + α dz(k-1)]    (11)
            = ‖x(k-1)‖² + 2 α x^T(k-1) dz(k-1) + α² ‖dz(k-1)‖²    (12)
            ≤ ‖x(k-1)‖² + α² ‖dz(k-1)‖²    (13)
            ≤ α² ‖dz(0)‖² + ... + α² ‖dz(k-1)‖²    (14)
            ≤ α² k max(‖dz‖²).    (15)

From Equation 9 and Equation 15 we have

    (k α δ)² < ‖x*‖² ‖x(k)‖² ≤ ‖x*‖² α² k max(‖dz‖²),    (16)
    k < ‖x*‖² max(‖dz‖²) / δ².    (17)

We have thus found that the proof does not require a limit on the learning rate, because the bound on k does not depend on the learning rate. Intuitively, running the rule with learning rate α is equivalent to multiplying all inputs by the learning rate (α e z = e(αz)), and the cost of finding the correct boundary for proportionally scaled data does not change.
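
As a quick numerical check of this scale-invariance argument, here is a minimal Python sketch (not part of the original report; the single hardlim neuron and the small linearly separable toy set are assumptions chosen for illustration). It counts the error-driven updates at several learning rates; starting from zero weights, x(k) is just α times the α = 1 trajectory, so the decisions, the mistake sequence, and hence the update counts should coincide for every α.

    import numpy as np

    def count_updates(P, T, lr, max_epochs=100):
        """Perceptron trained with W <- W + lr*e*p, b <- b + lr*e (hardlim output);
        returns the number of error-driven updates until convergence."""
        W, b = np.zeros(P.shape[1]), 0.0
        updates = 0
        for _ in range(max_epochs):
            mistakes = 0
            for p, t in zip(P, T):
                a = 1 if W @ p + b >= 0 else 0   # hardlim
                e = t - a
                if e != 0:                       # update only on misclassified inputs
                    W, b = W + lr * e * p, b + lr * e
                    updates += 1
                    mistakes += 1
            if mistakes == 0:                    # a full error-free pass: converged
                break
        return updates

    # Illustrative linearly separable toy data (not the homework's vectors).
    P = np.array([[2., 2.], [1., 2.], [-2., -1.], [-1., -2.]])
    T = np.array([1, 1, 0, 0])

    for lr in (1.0, 0.8, 0.5, 0.3, 0.1):
        print(f"learning rate {lr:.1f}: {count_updates(P, T, lr)} updates")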

Problem two

We have a classification problem with three classes of input vectors. The three classes are

    class 1: p1 = [1; 1], p2 = [0; 2], p3 = [3; 1]
    class 2: p4 = [2; -1], p5 = [2; 0], p6 = [1; -2]
    class 3: p7 = [-1; 2], p8 = [-2; 1], p9 = [-1; -1]

Implement the perceptron network based on the learning rule of problem one to solve this problem. Run your program at different learning rates (α = 1, 0.8, 0.5, 0.3, 0.1), compare and discuss the results.

Ans. In my experiment, the learning rates are set from 0.1 to 1 with step 0.05. Figure 1 shows the number of iterations at each learning rate. Note that this perceptron algorithm can only handle two-class problems, so I use two phases of classification to solve the three-class problem (see the sketch below). I can hardly find any relationship between the learning rate and the number of iterations. Just as proved above, the cost of finding the correct boundary is hardly influenced by the learning rate. I also provide the classification results at learning rate α = 1 (Figure 2).

[Figure 1: Times of Iteration at Different Learning Rates (learning rate vs. number of iterations, for Classification 1 and Classification 2)]
[Figure 2: Result for Classification (P2), showing Class 1, Class 2, and Class 3]
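
The two-phase scheme mentioned above could look like the following minimal Python sketch (an illustration, not the author's original code). It assumes the class vectors as listed in the problem statement and one possible decomposition: a first perceptron separates class 3 from classes 1 and 2, and a second separates class 1 from class 2; the learning-rate value and helper names are arbitrary choices.

    import numpy as np

    def train_perceptron(P, T, lr=1.0, max_epochs=1000):
        """Single-neuron perceptron trained with W <- W + lr*e*p, b <- b + lr*e."""
        W, b = np.zeros(P.shape[1]), 0.0
        for _ in range(max_epochs):
            errors = 0
            for p, t in zip(P, T):
                e = t - (1 if W @ p + b >= 0 else 0)   # hardlim output
                if e:
                    W, b = W + lr * e * p, b + lr * e
                    errors += 1
            if errors == 0:        # converged: all training points classified correctly
                break
        return W, b

    c1 = np.array([[1, 1], [0, 2], [3, 1]], dtype=float)
    c2 = np.array([[2, -1], [2, 0], [1, -2]], dtype=float)
    c3 = np.array([[-1, 2], [-2, 1], [-1, -1]], dtype=float)

    # Phase 1: class 3 vs. classes 1 and 2.  Phase 2: class 1 vs. class 2.
    W1, b1 = train_perceptron(np.vstack([c1, c2, c3]), np.array([0]*6 + [1]*3))
    W2, b2 = train_perceptron(np.vstack([c1, c2]), np.array([1]*3 + [0]*3))

    def classify(p):
        if W1 @ p + b1 >= 0:                   # phase 1: is it class 3?
            return 3
        return 1 if W2 @ p + b2 >= 0 else 2    # phase 2: class 1 or class 2

    print([classify(p) for p in np.vstack([c1, c2, c3])])   # expected: [1]*3 + [2]*3 + [3]*3

Other decompositions (for example, splitting off class 2 first) would work equally well, as long as each binary split is linearly separable.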

Problem three

For the XOR problem as follows,

    class 1: p1 = [1; 0], p2 = [0; 1]
    class 2: p3 = [0; 0], p4 = [1; 1]

and the two-spiral problem provided as the course material, could the perceptron algorithm correctly classify these two problems? If not, explain why.

Ans. As the results (Figure 3 and Figure 4) show, the perceptron algorithm cannot correctly classify these two problems. As we know, single-layer perceptrons can only classify linearly separable vectors, while the vectors in these two problems are linearly inseparable.

[Figure 3: Result for Classification (XOR Problem), showing Class 1 and Class 2]
[Figure 4: Result for Classification (Two Spiral Problem), showing Class 1 and Class 2]
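
The XOR part of this claim is easy to check numerically. The short Python sketch below (again an illustration, not the original code) trains the same single-neuron perceptron on the four XOR points; since no separating line exists, at least one point remains misclassified no matter how long training runs.

    import numpy as np

    def misclassified_after_training(P, T, lr=1.0, max_epochs=200):
        """Train a single hardlim neuron, then count remaining misclassifications."""
        W, b = np.zeros(P.shape[1]), 0.0
        for _ in range(max_epochs):
            for p, t in zip(P, T):
                e = t - (1 if W @ p + b >= 0 else 0)
                W, b = W + lr * e * p, b + lr * e      # no-op when e == 0
        return sum(int(t != (1 if W @ p + b >= 0 else 0)) for p, t in zip(P, T))

    # XOR: class 1 = {[1;0], [0;1]}, class 2 = {[0;0], [1;1]}
    P = np.array([[1, 0], [0, 1], [0, 0], [1, 1]], dtype=float)
    T = np.array([1, 1, 0, 0])

    print(misclassified_after_training(P, T))   # always >= 1: XOR is not linearly separable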

