Surrogate Optimization of Deep Neural Networks for Groundwater Predictions
Frontiers in Water
With the growing use of machine learning (ML) techniques in hydrological applications, there is a need to analyze the robustness, performance, and reliability of predictions made with these ML models. In this paper, we analyze the accuracy and variability of groundwater level predictions obtained from a Multilayer Perceptron (MLP) model with optimized hyperparameters for different amounts and types of available training data. The MLP model is trained on point observations of features such as groundwater levels, temperature, precipitation, and river flow in various combinations, for different periods and temporal resolutions. We analyze the sensitivity of the MLP predictions at three different test locations in California, United States, and derive recommendations for training features to obtain accurate predictions. We show that using all available features and data for training the MLP does not necessarily ensure the best predictive performance at all locations. More specifically, river flow and precipitation data are important training features for some, but not all, locations. However, we find that predictions made with MLPs trained solely on temperature and historical groundwater level measurements, without additional hydrological information, are unreliable at all locations.
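To make the feature-sensitivity experiment concrete, here is a minimal sketch (not the authors' code) of comparing MLP feature subsets for next-step groundwater-level prediction. The synthetic DataFrame, layer sizes, and the 80/20 split are all illustrative assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 1000
# Synthetic stand-ins for the point observations named in the abstract
df = pd.DataFrame({
    "gwl": np.cumsum(rng.normal(0, 0.1, n)),
    "temp": 15 + 10 * np.sin(np.arange(n) * 2 * np.pi / 365),
    "precip": rng.exponential(2.0, n),
    "flow": rng.gamma(2.0, 5.0, n),
})
df["gwl_lag1"] = df["gwl"].shift(1)   # historical groundwater level
df["target"] = df["gwl"].shift(-1)    # next-day groundwater level
df = df.dropna()

feature_sets = {
    "gwl+temp only": ["gwl_lag1", "temp"],
    "all features": ["gwl_lag1", "temp", "precip", "flow"],
}
split = int(0.8 * len(df))
for name, cols in feature_sets.items():
    X_tr, X_te = df[cols][:split], df[cols][split:]
    y_tr, y_te = df["target"][:split], df["target"][split:]
    mlp = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000,
                       random_state=0).fit(X_tr, y_tr)
    print(name, mean_squared_error(y_te, mlp.predict(X_te)))
```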
Related papers

Deep Learning Based Modeling of Groundwater Storage Change
Computers, Materials & Continua, 2022
Understanding changes in water resources and properly projecting their future availability are necessary elements of sustainable water planning. Monitoring groundwater storage (GWS) change and future water resource availability is crucial, especially under changing climatic conditions, and traditional in situ groundwater well measurement remains a significant challenge due to data unavailability. The present investigation utilized Long Short-Term Memory (LSTM) networks to monitor and forecast Terrestrial Water Storage Change (TWSC) and Groundwater Storage Change (GWSC) based on Gravity Recovery and Climate Experiment (GRACE) datasets from 2003-2025 for five basins of Saudi Arabia. An attempt has been made to assess the effects of rainfall and water use in net budget modeling of groundwater. Analysis of GRACE-derived TWSC and GWSC estimates indicates that all five basins show water depletion from 2003-2020 at rates ranging from −5.88 ± 1.2 to −14.12 ± 1.2 mm/year and from −3.5 ± 1.5 to −10.7 ± 1.5 mm/year, respectively. Forecasting with the developed LSTM model indicates that the investigated basins are likely to experience serious water depletion from 2020-2025, at rates ranging from −7.78 ± 1.2 to −15.6 ± 1.2 mm/year for TWSC and from −4.97 ± 1.5 to −12.21 ± 1.5 mm/year for GWSC. An interesting observation was a minor increase in rainfall during the study period for three basins.
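As a rough illustration of this forecasting setup (not the paper's implementation), the sketch below trains a small Keras LSTM on a synthetic monthly storage-anomaly series and rolls it forward 60 months; the window length, layer size, and depletion trend are assumptions.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(1)
months = np.arange(216)                      # ~18 years of monthly anomalies
series = (-0.8 * months / 12                 # synthetic depletion trend, mm
          + 3 * np.sin(2 * np.pi * months / 12)
          + rng.normal(0, 0.5, months.size))

window = 12                                  # 12-month lookback
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]                             # (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=16, verbose=0)

# Recursive multi-step forecast, e.g. 60 months ahead
hist = list(series[-window:])
for _ in range(60):
    nxt = model.predict(np.array(hist[-window:])[None, :, None], verbose=0)
    hist.append(float(nxt[0, 0]))
print("first forecast steps:", hist[window:window + 5])
```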
Artificial Intelligence Hybrid Deep Learning Model for Groundwater Level Prediction Using MLP-ADAM
arXiv, 2021
Groundwater is the largest store of freshwater resources, serving as the major inventory for most human consumption through agricultural, industrial, and domestic water supply. In the field of hydrology, some researchers applied neural networks to forecast rainfall intensity in space-time and introduced the advantages of neural networks compared to numerical models. Since then, much research has been conducted applying data-driven models. Some extended an Artificial Neural Network (ANN) model to forecast groundwater level in a semi-confined glacial sand and gravel aquifer under variable state, pumping extraction, and climate conditions with significant accuracy. In this paper, a multilayer perceptron is applied to simulate groundwater level, and the adaptive moment estimation (Adam) optimization algorithm is used for this purpose. The root mean squared error, mean absolute error, mean squared error, and coefficient of determination (R²) are used to evaluate the ...
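A minimal sketch of the MLP-plus-Adam setup described above follows; the data, layer sizes, and learning rate are hypothetical, and only the optimizer choice and evaluation metrics mirror the abstract.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))          # four assumed hydrological inputs
y = X @ np.array([0.5, -1.0, 0.3, 0.8]) + rng.normal(0, 0.1, 500)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
# Adam = adaptive moment estimation
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="mse", metrics=["mae"])
model.fit(X[:400], y[:400], epochs=50, verbose=0)

pred = model.predict(X[400:], verbose=0).ravel()
rmse = float(np.sqrt(np.mean((y[400:] - pred) ** 2)))
print(f"RMSE={rmse:.3f}  R^2={r2_score(y[400:], pred):.3f}")
```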
Hydrology
Machine learning and deep learning have demonstrated usefulness in modelling various groundwater phenomena. However, these techniques require large amounts of data to develop reliable models. In the Southern African Development Community, groundwater datasets are generally poorly developed. Hence, the question arises as to whether machine learning can be a reliable tool to support groundwater management in the data-scarce environments of Southern Africa. This study tests two machine learning algorithms, a gradient-boosted decision tree (GBDT) and a long short-term memory neural network (LSTM-NN), to model groundwater level (GWL) changes in the Shire Valley Alluvial Aquifer. Using data from two boreholes, Ngabu (sample size = 96) and Nsanje (sample size = 45), we model two predictive scenarios: (I) predicting the change in the current month’s groundwater level, and (II) predicting the change in the following month’s groundwater level. For the Ngabu borehole, GBDT achieved R2 scores o...
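For scenario (I), a gradient-boosted decision tree can be fit even at these sample sizes; the sketch below is a generic illustration on synthetic monthly data (n = 96, matching Ngabu), not the study's pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n = 96                                      # Ngabu-sized monthly record
rain = rng.exponential(50, n)               # monthly rainfall, mm (synthetic)
gwl = np.cumsum(0.01 * rain - 0.5 + rng.normal(0, 0.2, n))
dgwl = np.diff(gwl)                         # change in monthly GWL

# Scenario (I): predict this month's GWL change from rainfall and last GWL
X = np.column_stack([rain[1:], gwl[:-1]])
X_tr, X_te, y_tr, y_te = train_test_split(X, dgwl, test_size=0.25,
                                          random_state=0)
gbdt = GradientBoostingRegressor(n_estimators=200, max_depth=2,
                                 learning_rate=0.05).fit(X_tr, y_tr)
print("test R^2:", r2_score(y_te, gbdt.predict(X_te)))
```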
A Comprehensive Review of Deep Learning Applications in Hydrology and Water Resources
2020
The global volume of digital data is expected to reach 175 zettabytes by 2025. The volume, variety, and velocity of water-related data are increasing due to large-scale sensor networks and increased attention to topics such as disaster response, water resources management, and climate change. Combined with the growing availability of computational resources and the popularity of deep learning, these data are being transformed into actionable and practical knowledge, revolutionizing the water industry. In this article, a systematic review of the literature is conducted to identify existing research that incorporates deep learning methods in the water sector, with regard to the monitoring, management, governance, and communication of water resources. The study provides a comprehensive review of state-of-the-art deep learning approaches used in the water industry for generation, prediction, enhancement, and classification tasks, and serves as a guide for how to utilize available deep learning methods fo...
Natural Resources Research
Uncertainty quantification (UQ) is an important benchmark to assess the performance of artificial intelligence (AI), and particularly of deep learning ensemble-based models. However, the ability to perform UQ with current AI-based methods is not only limited in terms of computational resources but also requires changes to topology and optimization processes, as well as multiple runs to monitor model instabilities. From both geo-engineering and societal perspectives, a predictive groundwater table (GWT) model presents an important challenge, where a lack of UQ limits the validity of findings and may undermine science-based decisions. To overcome these limitations, a novel ensemble, an automated random deactivating connective weights approach (ARDCW), is presented and applied to retrieved geographical locations of GWT data from a geo-engineering project in Stockholm, Sweden. In this approach, the UQ was achieved via a combination of several derived ensembles from a fixed...
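The ARDCW method itself is not reproduced here; the sketch below only illustrates the general family it belongs to, using standard Monte Carlo dropout (randomly deactivated connections kept active at inference) to derive an uncertainty band. Data and network sizes are invented.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 3)).astype("float32")   # e.g. easting, northing, depth
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.1, 300).astype("float32")

inputs = tf.keras.Input(shape=(3,))
h = tf.keras.layers.Dense(64, activation="relu")(inputs)
h = tf.keras.layers.Dropout(0.2)(h)               # deactivates random weights
h = tf.keras.layers.Dense(64, activation="relu")(h)
h = tf.keras.layers.Dropout(0.2)(h)
outputs = tf.keras.layers.Dense(1)(h)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=30, verbose=0)

# Keep dropout active (training=True) and sample an ensemble of predictions;
# the spread across samples is the uncertainty estimate.
samples = np.stack([model(X[:5], training=True).numpy().ravel()
                    for _ in range(100)])
print("mean:", samples.mean(axis=0))
print("std: ", samples.std(axis=0))
```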
CNN-BiLSTM Neural Network for Simulating Groundwater Level
Computational Research Progress in Applied Science & Engineering, 2022
Managing groundwater resources affected by varying climatic conditions requires reliable and precise forecasts of groundwater levels. Hence, we investigated a deep learning neural network called CNN-BiLSTM, which combines convolutional neural network layers and bidirectional long short-term memory (BiLSTM) layers, for forecasting groundwater levels in a well affected by pumping for irrigation. The CNN-BiLSTM model was trained with hourly groundwater level data for Jan 2021-Dec 2021, divided into 70% for training and 30% for testing. Bayesian optimization was used to find the best range of model variables, such as the number of BiLSTM units, the number of BiLSTM layers, and the initial learning rate, and Adaptive Moment Estimation (Adam) was used to compute adaptive learning rates. The model showed promising results in the training stage, with a regression value of 0.9173, and acceptable results in the testing stage, with a regression value of 0.6324; the optimization lasted 21 hours. The optimization found that the best number of BiLSTM units is 192, the best number of BiLSTM layers is two, and the best initial learning rate is 0.01.
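The reported optimum (two BiLSTM layers of 192 units, initial learning rate 0.01) can be expressed as a compact Keras stack; everything else below (the convolution width, pooling, and 24-hour input window) is an assumption rather than the paper's exact architecture.

```python
import tensorflow as tf

window = 24                                  # assumed hours of hourly GWL input
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(192, return_sequences=True)),   # best: 192 units
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(192)),  # two BiLSTM layers
    tf.keras.layers.Dense(1),                # next groundwater level
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
              loss="mse")
model.summary()
```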
Field-Scale Application of Three Types of Neural Networks to Predict Ground-Water Levels
Journal of the American Water Resources Association, 2007
In this paper, the field-scale applicability of three forms of artificial neural network algorithms in forecasting short-term ground-water levels at specific control points is presented. These algorithms are feedforward back propagation (FFBP), radial basis networks (RBN), and generalized regression networks (GRN). Ground-water level predictions from these algorithms are in turn to be used in an Optimized Regional Operations Plan that prescribes scheduled wellfield production for the coming four weeks. The models are compared on the accuracy of their ground-water level predictions at lead times ranging from one to four weeks, on ease of implementation, and on execution times (mainly training time). In total, 208 networks of each of the three algorithms were developed for the study. It is shown that although learning algorithms have emerged as a viable solution at field scales much larger than previously studied, no single algorithm performs consistently better than the others on all criteria. On average, FFBP networks are 20% and 26% more accurate than RBN and GRN, respectively, in forecasting water levels one week ahead, and this advantage drops to 5% and 9% for four-week-ahead forecasts. GRN, however, posted a training time that is only 5% of that of the FFBP networks. This suggests that in field-scale applications one may have to trade off between the type of algorithm used and the degree to which a given objective is honored.
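The GRN's training-time advantage is easy to see in code: a generalized regression network has no iterative training loop, it only stores the training patterns and performs a Gaussian-kernel weighted average at prediction time. The sketch below is a generic NumPy illustration; the smoothing parameter and data are assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Generalized regression network: Gaussian-kernel weighted average."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2 * sigma ** 2))       # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)     # summation/division layers

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, (200, 2))             # e.g. lagged levels, pumping rate
y = np.sin(3 * X[:, 0]) + 0.3 * X[:, 1]
print(grnn_predict(X[:150], y[:150], X[150:155]))
```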
Groundwater Level Assessment in an Alluvial Aquifer Using Neural Networks
Civil Engineering and Architecture, 2022
Groundwater is an important source of water worldwide due to its wide availability and generally good quality. Groundwater was once easily accessible to meet various domestic demands, but recently it has become vulnerable to depletion in many areas due to overexploitation and mismanagement of groundwater resources. This study used an Artificial Neural Network (ANN) model to forecast groundwater (GW) levels near Varanasi. An ANN is a way to develop a prediction model loosely based on the functioning of the human brain. This research provides accurate predictions using the LM (Levenberg-Marquardt) and GDX (adaptive learning rate with backpropagation) training algorithms. Data from eight wells, annual precipitation, maximum and minimum temperatures, and relative humidity are used as inputs, while the output is the predicted groundwater level. The R (regression coefficient) and RMSE (root mean square error) values were used to measure model competency and precision. With the LM technique, the observed R values for the majority of the wells approached unity. The LM technique is effective when a limited amount of data is available, and it is believed that this strategy will also produce precise results for large amounts of data. Under data constraints, the LM approach is found to be appropriate for forecasting water level fluctuations. This technique produces accurate results when the river location is used as an input to the artificial neural network (ANN).
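As a worked illustration of why Levenberg-Marquardt suits small datasets, the sketch below fits a one-hidden-layer ANN with SciPy's LM solver, which operates on the full residual vector. The network size, inputs, and data are hypothetical; this is not the study's code.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, (60, 3))     # e.g. scaled precipitation, temp, humidity
y = np.tanh(X @ np.array([1.0, -0.5, 0.8])) + rng.normal(0, 0.05, 60)

n_hidden = 5                         # 3*5 + 5 + 5 + 1 = 26 parameters

def unpack(p):
    W1 = p[:15].reshape(3, n_hidden)
    b1 = p[15:20]
    W2 = p[20:25]
    b2 = p[25]
    return W1, b1, W2, b2

def residuals(p):                    # LM minimizes the sum of these squares
    W1, b1, W2, b2 = unpack(p)
    return np.tanh(X @ W1 + b1) @ W2 + b2 - y

fit = least_squares(residuals, rng.normal(0, 0.5, 26), method="lm")
print("RMSE:", np.sqrt(np.mean(fit.fun ** 2)))
```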