
Earthquake Big Data Machine Learning Platform

Posted on: 2022-10-23
Degree: Master
Type: Thesis
Country: China
Candidate: Z H Wang
Full Text: PDF
GTID: 2480306320484404
Subject: Geological Engineering
Abstract/Summary:
Earthquakes are a common natural disaster. About 5 million earthquakes occur worldwide every year, bringing serious economic losses and casualties to society. China's research on earthquakes has never stopped: since 1980, thousands of earthquake observation stations have been built across the country, accumulating a large volume of earthquake precursor observation data in deformation, geoelectricity, geomagnetism, gravity, fluid, and other disciplines. These data are of great significance to researchers in the earthquake field, but storing and analyzing them quickly is a difficult problem for seismologists. The emergence of big data technology brings hope for the storage of massive earthquake data, and the arrival of the artificial intelligence era has advanced machine learning and deep learning, providing new ideas for earthquake data analysis. For earthquake researchers, building an earthquake big data machine learning platform with big data and artificial intelligence technology, one that integrates the access, processing, and analysis of earthquake data, will play a great role in their research work.

In terms of data storage, this thesis proposes an HBase-based storage strategy for earthquake precursor time-series observation data. The key to designing the HBase storage model is designing a reasonable Rowkey. Because the precursor time-series data have multiple sampling rates, the Rowkey is designed as a composite of station ID, point ID, item ID, sampling rate, and data date, and the length of the column family varies with the sampling rate. The HBase-based storage scheme shows good performance in both query and insertion operations.

In terms of data analysis, this thesis proposes a highly scalable algorithm-model implementation scheme based on template files. Drawing on research in traditional machine learning and deep learning, the core business functions of seismic data analysis are abstracted into four parts: a dataset selection module, an algorithm model setting module, a hyperparameter setting module, and a model training and result display module. Platform users can customize their own algorithm models by providing a template file, a front-end parsing rule file, and the corresponding Python algorithm program, which is an innovation of the platform. The scheme provides data mining, in-depth analysis, and other big data services to the earthquake monitoring, prediction, and data management departments served by the earthquake big data platform, along with the corresponding technical exploration and verification.

In terms of development technology, the platform adopts a B/S (browser/server) architecture. The web front end uses the easy-to-use Vue framework, and the server side is implemented with the Django framework combined with Spark components, giving the platform simplicity and ease of use, high performance, and strong scalability.

The research and implementation of the earthquake big data machine learning platform aims to provide a convenient, fast, and reliable earthquake big data processing platform for earthquake researchers. This work will greatly save the time of researchers in the earthquake industry, improve their work efficiency, and promote the research and development of China's earthquake industry.
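As an illustration of the composite Rowkey design described above, the sketch below concatenates the five key components into a single key. The field values, widths, and the idea of keeping components fixed-width are illustrative assumptions, not the thesis's actual schema:

```python
def make_rowkey(station_id: str, point_id: str, item_id: str,
                sample_rate: str, data_date: str) -> bytes:
    """Concatenate the five components (station ID, point ID, item ID,
    sampling rate, data date) into a single HBase Rowkey.

    Keeping the leading components fixed-width makes rows for one
    instrument lexicographically contiguous, so a date-range scan
    stays within a single key prefix.
    """
    return f"{station_id}{point_id}{item_id}{sample_rate}{data_date}".encode()

# Example: station 45001, point 1, item 2221, sampling-rate code "01",
# on 2022-01-01 -- all values hypothetical.
key = make_rowkey("45001", "1", "2221", "01", "20220101")
# With an HBase client such as happybase, the row could then be written
# via table.put(key, {...}); connection setup is omitted here.
```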
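A minimal sketch of how a user-supplied template file might map onto the four abstracted modules (dataset selection, model setting, hyperparameter setting, training/result display). The JSON layout and every field name here are assumptions for illustration, since the abstract does not specify the template format:

```python
import json

# Hypothetical template; section and field names are illustrative only.
TEMPLATE = """
{
  "dataset": {"station": "45001", "item": "gravity",
              "date_range": ["2020-01-01", "2020-12-31"]},
  "model": {"type": "LSTM", "layers": 2},
  "hyperparameters": {"learning_rate": 0.001, "epochs": 50},
  "output": {"metrics": ["mse"], "plot": true}
}
"""

def parse_template(text: str) -> dict:
    """Parse a template file and check that all four module sections
    (dataset, model, hyperparameters, output) are present."""
    cfg = json.loads(text)
    missing = [k for k in ("dataset", "model", "hyperparameters", "output")
               if k not in cfg]
    if missing:
        raise ValueError(f"template missing sections: {missing}")
    return cfg

cfg = parse_template(TEMPLATE)
```

The front-end parsing rule file mentioned in the abstract would presumably drive how these sections are rendered as form controls; that mapping is not shown here.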
Keywords/Search Tags: Earthquake Big Data Storage Scheme, HBase, Artificial Intelligence, Vue Framework, Django Framework, Big Data Platform