
SPAF-network with Saturating Pretraining Neurons

Posted on: 2017-03-02
Degree: M.S.
Type: Thesis
University: Trent University (Canada)
Candidate: Burhani, Hasham
Full Text: PDF
GTID: 2448390005473950
Subject: Artificial Intelligence
Abstract/Summary:
In this work, various aspects of neural networks pre-trained with denoising autoencoders (DAEs) are explored. To saturate neurons more quickly during feature learning in the DAE, an activation function that offers higher gradients is introduced. The effect of applying sparsity functions to the hidden-layer representations is also studied. Most importantly, a technique that swaps the activation functions of a fully trained DAE to logistic functions is investigated; networks trained with this technique are referred to as SPAF-networks. For evaluation, the popular MNIST dataset as well as all three sub-datasets of the Chars74k dataset are used for classification. The SPAF-network is also analyzed for the features it learns with a logistic, a ReLU, and a custom activation function. Lastly, a roadmap of future enhancements to the SPAF-network is proposed.
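The core SPAF idea described above can be illustrated with a minimal sketch. The following Python/PyTorch code is an assumption-laden illustration, not the thesis's implementation: the 784-unit input assumes MNIST-style data, and since the abstract does not specify the custom high-gradient activation, a hypothetical steepened logistic (`steep_logistic`) stands in for it. Pretraining itself is omitted.

```python
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    """Single-layer DAE whose hidden activation can be swapped after training."""
    def __init__(self, n_in, n_hidden, activation):
        super().__init__()
        self.encoder = nn.Linear(n_in, n_hidden)
        self.decoder = nn.Linear(n_hidden, n_in)
        self.activation = activation  # hidden-layer nonlinearity

    def forward(self, x, noise_std=0.3):
        # Corrupt the input (Gaussian noise here), as in standard DAE training.
        corrupted = x + noise_std * torch.randn_like(x)
        hidden = self.activation(self.encoder(corrupted))
        return torch.sigmoid(self.decoder(hidden))

# Hypothetical stand-in for the thesis's custom high-gradient activation:
# a logistic with a steeper slope, which pushes units toward saturation faster.
def steep_logistic(z):
    return torch.sigmoid(4.0 * z)

dae = DenoisingAutoencoder(n_in=784, n_hidden=256, activation=steep_logistic)
# ... pretrain `dae` on a reconstruction loss here (omitted) ...

# SPAF step: after pretraining, swap the hidden activation to a plain logistic
# before the pretrained encoder is reused (e.g., fine-tuned for classification).
dae.activation = torch.sigmoid
```

The design point the sketch captures is that the activation is held as a swappable attribute rather than baked into the architecture, so the same learned weights can be driven by one nonlinearity during pretraining and another afterwards.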
Keywords/Search Tags: SPAF-network, DAE