Title: Stochastic Configuration Networks: Theory and Applications
Speaker: Prof. Dianhui Wang
Time: December ?, 2017, 15:30-16:30
Venue: Room ?00, Gezhi Middle Building

Abstract:
Randomized methods for developing neural networks have great potential for big data processing, offering a trade-off between effectiveness and efficiency. Over the past decades, it has been common practice to randomly assign the weights and biases of a neural network without any constraint, which results in poor modelling performance due to the existence of junk nodes. This talk reports our findings on the constraint condition and visually demonstrates the significance of our proposed supervisory mechanism for performance improvement. An original, innovative and effective randomized learning algorithm and the resulting randomized learner model, termed deep stochastic configuration networks (DeepSCNs), are briefly introduced.
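For readers unfamiliar with the supervisory mechanism mentioned in the abstract: in the published SCN work (Wang and Li, IEEE Transactions on Cybernetics, 2017), a randomly generated hidden node is added to the network only if it satisfies an inequality constraint tied to the current residual error. The sketch below is a minimal single-hidden-layer illustration of that idea, not the speaker's implementation; the parameter values (r, lam, T_max) and the global least-squares refit of output weights follow one common variant, and the toy data and all names are illustrative.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def scn_fit(X, y, L_max=25, T_max=200, r=0.99, lam=2.0, tol=1e-3, seed=0):
    # Incrementally build a single-hidden-layer SCN: keep a random candidate
    # node only if it passes the supervisory inequality, then refit the
    # output weights by least squares.
    rng = np.random.default_rng(seed)
    N, d = X.shape
    H = np.empty((N, 0))               # hidden outputs, one column per node
    W, b = [], []                      # accepted input weights and biases
    beta = np.zeros(0)                 # output weights
    e = y.copy()                       # current residual
    for L in range(1, L_max + 1):
        mu_L = (1.0 - r) / (L + 1)     # relaxing sequence from the SCN paper
        best_xi, best = -np.inf, None
        for _ in range(T_max):         # sample T_max candidate nodes
            w_c = rng.uniform(-lam, lam, d)
            b_c = rng.uniform(-lam, lam)
            h = sigmoid(X @ w_c + b_c)
            # supervisory inequality: <e,h>^2/<h,h> >= (1 - r - mu_L)<e,e>
            xi = (e @ h) ** 2 / (h @ h) - (1.0 - r - mu_L) * (e @ e)
            if xi > best_xi:
                best_xi, best = xi, (w_c, b_c, h)
        if best_xi < 0:                # no candidate met the constraint
            break
        w_c, b_c, h = best
        W.append(w_c); b.append(b_c)
        H = np.column_stack([H, h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
        e = y - H @ beta
        if np.linalg.norm(e) < tol:
            break
    return np.array(W), np.array(b), beta

# Toy usage: approximate a 1-D function.
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0])
W, b, beta = scn_fit(X, y)
pred = sigmoid(X @ W.T + b) @ beta
print("nodes:", len(b), "residual norm:", np.linalg.norm(y - pred))

Without the inequality test, this reduces to unconstrained random node assignment, which is exactly the "junk nodes" failure mode the abstract describes; the constraint guarantees each accepted node reduces the residual by a quantifiable margin.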
Speaker Biography:
Prof. Dianhui Wang received his Ph.D. in Industrial Automation from Northeastern University in 1995. From 1995 to 1997 he was a postdoctoral fellow in the School of Electrical and Electronic Engineering at Nanyang Technological University, Singapore, and from 1998 to 2001 he was a researcher in the Department of Computing at The Hong Kong Polytechnic University, working on machine learning, data mining, and image processing. Since 2001 he has been engaged in teaching and research in the Department of Computer Science and Information Technology at La Trobe University, Australia. His main research interests are applications of computational intelligence and data mining to big data processing and intelligent systems, and he has published over 200 research papers. He is a Senior Member of the IEEE and a doctoral supervisor, serves as Editor-in-Chief of the International Journal of Machine Intelligence and Sensory Signal Processing, and is an Associate Editor of several international journals, including IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Cybernetics, Information Sciences, and Neurocomputing.

All faculty and students are welcome to attend!
School of Science
December ?, 2017