Neural networks have been reborn as Deep Learning thanks to big data, improved processors, and modifications to training methods. Earlier neural networks initialized their weights poorly and used unsuitable non-linear activation functions. Weight initialization is a significant factor in both the final quality of a network and its convergence rate. This paper discusses different approaches to weight initialization. The MNIST dataset is used in experiments that compare the approaches in order to find the technique that achieves the highest accuracy in the shortest training time.
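The initialization schemes the paper compares differ only in the scale of the random weights. A minimal NumPy sketch of the three scaling rules (function names and the small-constant scale are illustrative assumptions, not the paper's code):

```python
import numpy as np

def random_init(fan_in, fan_out, scale=0.01, rng=None):
    """Naive random initialization: small fixed scale, independent of layer size."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.standard_normal((fan_in, fan_out)) * scale

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier/Glorot initialization: variance 1/fan_in keeps activation
    variance roughly constant across layers (suits tanh/sigmoid)."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.standard_normal((fan_in, fan_out)) * np.sqrt(1.0 / fan_in)

def he_init(fan_in, fan_out, rng=None):
    """He et al. initialization: variance 2/fan_in compensates for ReLU
    zeroing half of the activations."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
```

For an MNIST input layer (784 inputs), He initialization draws weights with standard deviation sqrt(2/784) ≈ 0.05, noticeably larger than a fixed 0.01 scale, which is one reason ReLU networks converge faster with it.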
Table of Contents
Abstract
1. INTRODUCTION
2. MULTI-LAYER NEURAL NETWORKS TRAINING
2.1. Structure of Multi-layer Neural Networks
2.2. Neural Networks Training
3. WEIGHTS INITIALIZATION OF NEURAL NETWORKS
3.1. Random Initialization
3.2. Xavier Initialization
3.3. He et al. Initialization
3.4. Batch Normal Initialization
4. EMPIRICAL RESULTS AND OBSERVATION
5. CONCLUSION
ACKNOWLEDGEMENT
국제문화기술진흥원 [The International Promotion Agency of Culture Technology]
Year Founded
2009
Field
Engineering > General Engineering
About
The agency is a non-profit organization composed of industry, academia, research institutes, and government bodies related to Culture Technology (CT). Culture Technology is defined as a New Convergence Technology (NCT) that combines information and communication technology (ICT), arts grounded in cultural thinking, the humanities, design, and social science and technology. Its purpose is to improve the quality of human life, bring about progressive change, and contribute to the academic and technical development and promotion of culture-technology-related fields by carrying out the projects required under Article 3.
Publications
Journal Title
International Journal of Advanced Culture Technology(IJACT)
Frequency
Quarterly
pISSN
2288-7202
eISSN
2288-7318
Coverage
2013~2025
Indexing
KCI indexed
Decimal Classification
KDC 600, DDC 700