Predicting Shallow Water Table Depth at Regional Scale: Optimizing Monitoring Network in Space and Time
Abstract
Shallow water table levels can be predicted using several approaches, based either on climatic records, on field evidence of soil morphology, or on the outputs of physically based models. In this study, data from a monitoring network in a relevant agricultural area of Northern Italy (ca. 12,000 km²) were used to develop a data-driven model for predicting water table depth in space and time from meteorological data and long-term water table characteristics, and to optimize sampling density in space and time. Evolutionary Polynomial Regression (EPR) was used to calibrate a predictive tool based on climatic data and on the records from 48 selected sites (N = 5,611). The model was validated against the water table depths observed at 15 independent sites (N = 1,739), resulting in a mean absolute error of 30.8 cm (R² = 0.61). The model was then applied to the whole study area, using geostatistical estimates of the average water table depth as input, to produce spatio-temporal maps of water table depth. The impact of degrading the input data in the temporal and spatial domains was then assessed following two approaches. In the first, three different EPR models were calibrated on 25%, 50% and 75% of the available data, and their error indices compared. In the second, an increasing number of monitoring sites was removed from the initial data set, and the associated increase in kriging standard deviation was assessed. Reducing the average sampling frequency from 1.5 samples per month to one every 40 days did not significantly affect the prediction capability of the proposed model. Reducing the sampling frequency to one every 4 months resulted in a loss of accuracy <3%, while removing more than half of the locations from the network resulted in a global loss of information <15%.
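The second degradation test can be pictured as kriging the long-term mean water table depth with the full monitoring network and again with a thinned network, then comparing the resulting kriging standard deviations. The sketch below is only an illustration of that idea, not the authors' code: the site coordinates, depths, and the spherical variogram model are placeholder assumptions, and ordinary kriging via the pykrige package stands in for whatever geostatistical tool was actually used.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
n_sites = 48                                  # full network size, as in the abstract
x = rng.uniform(0, 120e3, n_sites)            # easting [m], hypothetical
y = rng.uniform(0, 100e3, n_sites)            # northing [m], hypothetical
depth = rng.uniform(50, 250, n_sites)         # mean water table depth [cm], hypothetical

# Regular prediction grid covering the (hypothetical) study area
gridx = np.linspace(0, 120e3, 60)
gridy = np.linspace(0, 100e3, 50)

def mean_kriging_std(xs, ys, zs):
    """Krige mean depth on the grid and return the average kriging standard deviation."""
    ok = OrdinaryKriging(xs, ys, zs, variogram_model="spherical")
    _, variance = ok.execute("grid", gridx, gridy)
    return float(np.sqrt(variance).mean())

full_std = mean_kriging_std(x, y, depth)

# Thin the network by dropping half of the sites and re-krige
keep = rng.choice(n_sites, size=n_sites // 2, replace=False)
reduced_std = mean_kriging_std(x[keep], y[keep], depth[keep])

loss = 100 * (reduced_std - full_std) / full_std
print(f"Mean kriging std: full = {full_std:.1f} cm, "
      f"reduced = {reduced_std:.1f} cm (+{loss:.1f}%)")
```

The relative increase in the mean kriging standard deviation plays the role of the "global loss of information" quoted in the abstract; repeating the thinning for progressively smaller networks traces how that loss grows as sites are removed.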