
Research On Neural Network Parallel Computing Method And Applications Based On Cloud Computing Platform

Posted on: 2021-05-18    Degree: Master    Type: Thesis
Country: China    Candidate: M J Zeng    Full Text: PDF
GTID: 2428330611966949    Subject: Computer Science and Technology
Abstract/Summary:
The training of neural networks, especially deep learning models, is very time-consuming and requires large amounts of training data, which makes it well suited to a cloud computing platform. A cloud computing platform can store training data in a distributed manner and can use multiple nodes to train a neural network in parallel, improving training speed. However, there is still a large gap between the speedup currently achieved and linear speedup. To improve the training speed of neural networks on a cloud computing platform, this paper proposes a parallel acceleration method and applies it to keyword spotting and speaker verification.

This paper uses Hadoop and Spark to build the cloud computing platform and implements distributed parallel training of neural networks on the TensorFlowOnSpark framework. The parallel computing method in this paper has two aspects: launching multiple workers on each cloud node, and asynchronous data parallelism based on Downpour SGD. Asynchronous data parallelism is common when training neural networks on a cloud computing platform, but the communication overhead between computing nodes and the parameter server severely limits the speedup. Downpour SGD reduces the number of communications between computing nodes and the parameter server.

For keyword spotting, this paper proposes a new model, DenseNet-BiLSTM, which combines DenseNet and BiLSTM, and compares its accuracy on the Google Speech Commands dataset with the results of related papers; DenseNet-BiLSTM achieves better results. By training DenseNet-BiLSTM on the cloud computing platform, the proposed method improves training speed by 65% while reaching 96.7% accuracy. For speaker verification, this paper uses two neural network models with similar numbers of parameters and improves accuracy through transfer learning and score fusion. After deploying the two models on the cloud computing platform, the proposed method improves training speed by more than 100%. The experiments also verify that Downpour SGD yields a larger acceleration for neural networks with a higher proportion of communication cost.

The parallel computing method proposed in this paper achieves a good training acceleration ratio in keyword spotting and speaker verification without significantly affecting the accuracy of the neural network models, which verifies the effectiveness of the method.
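The communication-reducing idea behind Downpour SGD can be sketched in plain Python. This is a single-process toy model, not the thesis implementation (which runs on TensorFlowOnSpark): the class names, the learning rate, and the `n_fetch`/`n_push` intervals are illustrative assumptions. The key point it shows is that each worker accumulates gradients locally and only talks to the parameter server every few steps, rather than on every step:

```python
class ParameterServer:
    """Holds the shared model parameters, updated asynchronously by workers."""
    def __init__(self, dim, lr=0.1):
        self.params = [1.0] * dim  # toy initialization away from the optimum
        self.lr = lr

    def apply_gradients(self, grads):
        # In real Downpour SGD, updates from different workers interleave
        # without locking; this sketch just applies one update at a time.
        for i, g in enumerate(grads):
            self.params[i] -= self.lr * g

    def fetch(self):
        return list(self.params)


class DownpourWorker:
    """Pulls parameters every n_fetch steps and pushes accumulated
    gradients every n_push steps, cutting communication frequency."""
    def __init__(self, server, n_fetch=5, n_push=5):
        self.server = server
        self.params = server.fetch()
        self.accum = [0.0] * len(self.params)
        self.n_fetch, self.n_push = n_fetch, n_push
        self.step = 0

    def train_step(self, grad_fn):
        if self.step % self.n_fetch == 0:
            self.params = self.server.fetch()       # pull fresh parameters
        grads = grad_fn(self.params)                # local gradient computation
        self.accum = [a + g for a, g in zip(self.accum, grads)]
        self.step += 1
        if self.step % self.n_push == 0:
            self.server.apply_gradients(self.accum)  # push accumulated gradients
            self.accum = [0.0] * len(self.params)
```

With `n_fetch = n_push = 5`, the worker performs one pull and one push per five local steps instead of one per step, which is why the method helps most for models whose communication cost is a large share of total training time.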
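The score-fusion step used for speaker verification can also be illustrated with a minimal sketch. The fusion weight `alpha`, the threshold, and the function names below are illustrative assumptions rather than values from the thesis; the sketch only shows the common weighted-sum form of combining the trial scores produced by two models:

```python
def fuse_scores(scores_a, scores_b, alpha=0.5):
    """Weighted-sum fusion of per-trial scores from two verification
    models: alpha weights model A, (1 - alpha) weights model B."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(scores_a, scores_b)]

def verify(fused_scores, threshold=0.5):
    """Accept a trial when its fused score exceeds the threshold."""
    return [s > threshold for s in fused_scores]
```

In practice `alpha` and the decision threshold would be tuned on a development set so that the fused system outperforms either model alone.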
Keywords/Search Tags: Cloud Computing, Neural Network Parallelism, Keyword Spotting, Speaker Verification, Downpour SGD