
Research On Deep Convolutional Computation Models For Feature Learning Of Big Data

Posted on: 2020-02-25
Degree: Doctor
Type: Dissertation
Country: China
Candidate: P Li
Full Text: PDF
GTID: 1368330602450114
Subject: Software engineering
Abstract:
Feature learning, a main research branch of big data analytics, aims to mine the intrinsic information of events of interest, which can support business decision-making and scientific research. However, big data, characterized by high volume, high variety, high velocity, and high veracity, poses vast challenges for feature learning. To address these challenges, this dissertation proposes deep convolutional computation models for feature learning on big data with these four characteristics. The main contributions are summarized as follows:

(1) Current deep learning methods cannot adequately learn fused representations within and across modalities of heterogeneous data. To address this problem, a deep convolutional computation model is proposed based on the tensor representation. In detail, a tensor convolution is designed based on the tensor multi-dot product, and a tensor convolutional layer is introduced to model big data features in a high-order space, reducing the number of parameters through parameter sharing. Furthermore, a tensor fully-connected layer is devised based on the tensor multi-dot product, which learns multi-layer fused representations of the intrinsic features of big data. To train the deep convolutional computation model, a high-order back-propagation algorithm is extended from the vector space to the tensor space by designing back-propagation rules for the loss in the tensor convolutional, pooling, and fully-connected layers. Experiments show that the proposed deep model can effectively capture multi-layer features of heterogeneous big data.

(2) Existing deep learning methods based on high-performance computing architectures ignore the redundancy of deep learning models. To solve this problem, a canonical polyadic (CP) decomposition deep convolutional computation model is proposed. In detail, the canonical polyadic tensor convolution is designed by reducing the
redundancy of the parameters in the tensor convolutional kernels. In addition, a canonical polyadic tensor weight is devised by reducing the correlation of the tensor fully-connected features. Then, a CP back-propagation rule for the loss is introduced to train the CP deep convolutional computation model in the tensor space. Finally, experiments show that, by compressing the redundancy in the model parameters, the proposed CP deep model can efficiently capture multi-layer features of fast big data without losing much accuracy.

(3) Existing deep learning methods are static models that cannot efficiently learn features of incremental data without losing historical knowledge. To tackle this problem, an incremental deep convolutional computation model is proposed based on an online learning strategy. In detail, a parameter-incremental algorithm is introduced to capture features of data with similar distributions by designing an improved dropout method. In addition, a computation method for the initial loss is introduced to speed up the incremental learning. For dynamic data, updating rules for the tensor convolutional, pooling, and fully-connected layers are proposed to transfer historical knowledge into the new model. Furthermore, the standard dropout method is extended into the tensor space to improve the robustness of the model. Finally, experiments show that the proposed deep model can incrementally capture multi-layer features of new big data without losing the knowledge of historical data.

(4) Existing clustering methods cannot adequately learn unsupervised value patterns. To address this problem, a deep fuzzy c-means algorithm is proposed based on the deep convolutional computation model. By exploiting the independence of each private modality, a de-noising auto-encoder is introduced to capture the single-layer features of each private modality. Then, a high-order convolutional auto-encoder is proposed to learn deep features of the fusion
representations generated from the single-layer features of each private modality. A high-order fuzzy c-means is designed by extending the updating equations of the clustering centers and memberships into the tensor space based on the tensor distance. Finally, experiments show that the proposed deep clustering algorithm can effectively recognize unsupervised value patterns.
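As an illustration of the parameter compression underlying contribution (2), the sketch below shows how CP decomposition reduces the parameter count of a 4-way convolutional kernel. This is a minimal sketch under assumed conventions, not the dissertation's implementation: the kernel shape, the rank, and the helper names are hypothetical.

```python
# Minimal sketch of CP-based kernel compression (hypothetical sizes).
# A 4-way kernel W of shape (C_out, C_in, kH, kW) is approximated by a
# sum of R rank-1 terms, cutting the parameter count from
# C_out*C_in*kH*kW down to R*(C_out + C_in + kH + kW).
import numpy as np

def cp_reconstruct(factors):
    """Rebuild a 4-way tensor from CP factors A, B, C, D of shape (dim_k, R):
    W[i,j,k,l] = sum_r A[i,r] * B[j,r] * C[k,r] * D[l,r]."""
    A, B, C, D = factors
    return np.einsum('ir,jr,kr,lr->ijkl', A, B, C, D)

def cp_param_count(shape, rank):
    """Parameters stored when the kernel is kept in factored CP form."""
    return rank * sum(shape)

shape = (64, 32, 3, 3)   # (C_out, C_in, kH, kW), hypothetical
rank = 8                 # CP rank R, hypothetical
rng = np.random.default_rng(0)
factors = [rng.standard_normal((dim, rank)) for dim in shape]
W = cp_reconstruct(factors)

full = int(np.prod(shape))                 # dense kernel: 64*32*3*3 = 18432
compressed = cp_param_count(shape, rank)   # CP factors: 8*(64+32+3+3) = 816
print(W.shape, full, compressed)
```

The factored form is what would be stored and trained; in a forward pass the four small factor matrices can be applied as a sequence of cheap convolutions instead of reconstructing the dense kernel.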
Keywords/Search Tags:Big Data Feature Learning, Deep Learning, Deep Convolutional Computation Model, Incremental Learning, Fuzzy C-means