J. M. Vali Samani; H. Radmehr; M. Delavar
Abstract
Introduction: Embankment dams make up the greatest share of constructed dams, and there are many examples of their failures throughout history. About one-third of the world's dam failures have been caused by flood overtopping, which indicates that overtopping is an important factor affecting the safety of reservoir projects. Moreover, because of a poor understanding of the randomness of floods, reservoir water levels during flood seasons are often lowered artificially in order to avoid overtopping and protect the lives and property of downstream residents. Estimating the risk of dam overtopping with due regard to the uncertainties involved is therefore essential for assessing dam safety. This study presents a procedure for evaluating the risk of dam overtopping due to uncertainties in inflows and in the reservoir's initial condition.
Materials and Methods: This study presents a practical approach and compares different uncertainty analysis methods for evaluating the risk of dam overtopping due to floods. For this purpose, Monte Carlo simulation (MCS) and Latin hypercube sampling (LHS) were used to calculate the overtopping risk, evaluate the uncertainty, and determine the highest water level during different flood events. To assess these methods from a practical point of view, the Maroon dam was chosen as the case study. Figure 1 shows the work procedure, which comprises three parts: 1) identification and evaluation of the factors affecting flood routing and dam overtopping; 2) data collection and analysis for reservoir routing and uncertainty analysis; 3) uncertainty and risk analysis.
Figure 1- Diagram of dam overtopping risk evaluation
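As a minimal sketch of how the two sampling schemes differ (the sample size, dimension labels, and the final mapping step are illustrative assumptions, not the study's calibrated inputs), the design uniforms could be generated as follows:

```python
import numpy as np
from scipy.stats import qmc

N, D = 10_000, 3  # samples x uncertain inputs (flood peak, initial level, spillway coeff.)

# Crude Monte Carlo: independent pseudo-random uniforms over the unit cube
u_mcs = np.random.default_rng(1).random((N, D))

# Latin hypercube: exactly one draw per equal-probability stratum of each
# marginal, which covers the input space more evenly at the same sample size
u_lhs = qmc.LatinHypercube(d=D, seed=1).random(N)

# Either set of uniforms is then mapped through the inverse CDFs of the fitted
# input distributions (e.g., a Gumbel for flood peaks) before flood routing.
```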
Results and Discussion: Figure 2 shows the computed overtopping risks for the Maroon dam, without considering the wind effect, for an initial water level of 504 m as an example. As shown in Figure 2, the risk curves computed by the different uncertainty analysis methods follow similar trends, with the curves computed by LHS lying slightly above those computed by MCS; the differences between the risk values of the two methods grow at longer return periods. Table 1 shows the variation of overtopping risk with increasing initial water level and return period, relative to the risk at the 2-year return period and an initial water level of 470 m. The results show that lengthening the return period plays a more important role in increasing the risk than raising the initial water level.
Table 1- Variation of overtopping risk with increasing initial water level and return period, relative to the risk at T = 2 yr and initial level 470 m

Method   Level (m)   T: 2→2   2→50   2→100   2→1000   2→5000   2→10000
MCS      470→470     1        5      9       23       42.36    58
MCS      470→478     2        7      15.6    37       58.34    79
MCS      470→485     5.6      13.6   28.6    55.6     85.67    112.6
MCS      470→493     10.3     32.6   54      95.6     127.34   152
MCS      470→504     40.3     83     117.3   165      200.34   224.3
LHS      470→470     1        5.34   11      25.3     43       60.3
LHS      470→478     2.3      8.6    18      39.3     60.67    84
LHS      470→485     5.3      17.3   32.6    58.3     89       114.6
LHS      470→493     13.3     37.6   57.6    97       133.34   160.3
LHS      470→504     41.6     87.3   119.6   173      205      233.3
Figure 2- Overtopping risk in the initial water level of 504 m, without considering the wind effect
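As a loose illustration of how such a risk figure can be assembled (sample the uncertain inputs, route each sampled flood through the reservoir, and count the fraction of runs whose peak level exceeds the dam crest), the sketch below uses an assumed level-pool model; all geometry, distributions, and parameter values are illustrative assumptions, not the Maroon dam's data:

```python
import numpy as np
from scipy.stats import qmc, gumbel_r, uniform

def max_level(q_peak, h_init, c_spill, dt=3600.0, hours=72):
    """Level-pool routing of an assumed triangular inflow hydrograph;
    returns the peak reservoir water level. Storage-elevation and
    spillway geometry are illustrative, not the Maroon dam's."""
    area = 25e6                         # assumed water surface area, m^2
    spill_crest, width = 499.0, 60.0    # assumed spillway crest (m) and width (m)
    t_peak = 24 * 3600.0
    h = h_max = h_init
    for k in range(int(hours * 3600 / dt)):
        t = k * dt
        # triangular hydrograph: rises to q_peak at t_peak, recedes to 0 at 3*t_peak
        q_in = (q_peak * t / t_peak if t <= t_peak
                else max(q_peak * (1 - (t - t_peak) / (2 * t_peak)), 0.0))
        head = max(h - spill_crest, 0.0)
        q_out = c_spill * width * head ** 1.5   # standard weir equation
        h += (q_in - q_out) * dt / area         # continuity, level-pool
        h_max = max(h_max, h)
    return h_max

# Latin hypercube sample of the uncertain inputs (assumed distributions)
u = qmc.LatinHypercube(d=3, seed=1).random(5000)
q = gumbel_r.ppf(u[:, 0], loc=2000.0, scale=600.0)   # flood peak, m^3/s
h0 = uniform.ppf(u[:, 1], loc=470.0, scale=34.0)     # initial level, 470-504 m
c = uniform.ppf(u[:, 2], loc=1.9, scale=0.4)         # spillway coefficient

crest = 510.0                                        # assumed dam crest elevation, m
levels = np.array([max_level(*args) for args in zip(q, h0, c)])
risk = np.mean(levels > crest)                       # overtopping probability
```

Replacing the LHS draws with plain pseudo-random uniforms gives the corresponding MCS estimate, so the two methods can be compared on identical routing runs.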
Conclusions: This study applied the MCS and LHS methods to analyze uncertainty and evaluate dam overtopping risk, considering the uncertainties in input variables such as the quantile of flood peak discharge, the initial water level, and the spillway coefficient. The results show that the uncertainty of the water level calculated by MCS is higher than that calculated by LHS, while the overtopping risk calculated by LHS is higher than that calculated by MCS. Furthermore, the increase of the inflow rate influences the variation of the overtopping risk more than the increase of the return period. Evaluation of the results also indicates that overtopping risk is an important issue for the Maroon dam. A comprehensive risk analysis procedure, carried out in conjunction with uncertainty analysis, therefore gives decision makers important information for better judgments in dam operation under input uncertainty.
A. Jafari; Norair Toomanian; R. Taghizadeh Mehrjerdi
Abstract
Introduction: Methods of soil survey are generally empirical and based on the mental model of the surveyor, who correlates soils with the underlying geology, landforms, vegetation, and air-photo interpretation. Since there are no statistical criteria for traditional soil sampling, the areas being sampled may be biased. In digital soil mapping, soil samples are used to build quantitative relationships, or models, between soil attributes and soil covariates. Because these relationships are based on the soil observations, the quality of the resulting soil map also depends on the quality of those observations. An appropriate sampling design for digital soil mapping depends on how much data is available and where the data is located. Several statistical methods have been developed for optimizing sampling for soil surveys, some of which make use of ancillary information. The purpose of this study was to evaluate the sampling quality of existing (legacy) data.
Materials and Methods: The study area is located in the central basin of the Iranian plateau (Figure 1). The geologic infrastructure of the area is mainly Cretaceous limestone, Mesozoic shale, and sandstone. Air-photo interpretation (API) was used to differentiate geomorphic patterns based on their formation processes, general structure, and morphometry. The patterns were differentiated through a nested geomorphic hierarchy (Figure 2): a four-level hierarchy is used to break down the complexity of the different landscapes of the study area. At the lowest level of the hierarchy, the geomorphic surfaces, each formed by a unique process during a specific geologic time, were defined. A stratified sampling scheme was then designed based on the geomorphic mapping: the area was divided into sub-areas (strata) corresponding to the geomorphic surfaces, and within each stratum, sampling locations were selected at random (Figure 2), as sketched below. This resulted in 191 profiles, which were described, sampled, analyzed, and classified according to the USDA soil classification system (16). The sampling was then evaluated with the HELS algorithm, whose basic rationale is to set up a hypercube whose axes are the quantiles of the rasters of environmental covariates (e.g., the digital elevation model). The algorithm was written following the study of Carre et al., 2007 (3) and run in R.
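A minimal sketch of the stratified random selection step, assuming the geomorphic surfaces are available as an integer raster (the raster, its IDs, and the per-stratum sample count are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

def stratified_sample(strata_raster, n_per_stratum=3):
    """Pick random cell locations within each geomorphic stratum.
    `strata_raster` is a 2-D integer array of geomorphic-surface IDs."""
    samples = {}
    for s in np.unique(strata_raster):
        rows, cols = np.nonzero(strata_raster == s)
        k = min(n_per_stratum, rows.size)          # stratum may be small
        idx = rng.choice(rows.size, size=k, replace=False)
        samples[int(s)] = list(zip(rows[idx].tolist(), cols[idx].tolist()))
    return samples

# e.g., with a toy raster of three geomorphic surfaces:
toy = np.array([[1, 1, 2], [1, 3, 2], [3, 3, 2]])
print(stratified_sample(toy, n_per_stratum=2))
```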
Results and Discussion: The covariate dataset comprises elevation, slope, and wetness index (Table 2). All data layers were interpolated to a common grid of 30 m resolution; the raster layer is 421 by 711 grid cells. Each of the three covariates was divided into four quantiles (Table 2), so the hypercube character space has 4³ = 64 strata (Figure 5), and the average stratum therefore contains about 4677 grid cells. The map of the covariate index (Figure 6) shows patterns representative of the covariate variability. The values of the covariate index range between 0.0045 and 5.95, meaning that some strata are very dense compared to others. This index allows us to judge whether a high or low relative weight of the sampling units (see below) is due to the soil sampling or to the covariate density. The strata with the highest density lie in the areas with high geomorphic diversity, suggesting that geomorphic processes drive the diversity and variability, in line with the geomorphology map (Figure 2). Of the 64 strata, 30.4% are under-sampled, 60.2% are adequately sampled, and 9.4% are over-sampled. Regarding the covariate index, most of the under-sampling appears at high covariate index values, where the soil covariates are highly variable; it is difficult to collect field samples in these highly variable areas (Figure 7). Most of the over-sampling, by contrast, was observed in areas with a low covariate index (Figure 7). We calculated the weights of all the sampling units and show the results in Figure 8. Sixteen of the 64 strata contained no legacy sample units at all; therefore, if the number of samples is to be increased, it is best to take the new samples from these empty strata.
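A loose sketch of the hypercube evaluation idea behind this check, following the description above rather than the exact R implementation of Carre et al. (2007); the array names and the relative-weight interpretation are assumptions:

```python
import numpy as np

def hypercube_report(covariates, sample_idx, n_quantiles=4):
    """Assign every grid cell to a hypercube stratum defined by the
    quantiles of each covariate, then compare the sample density with
    the cell density per stratum. `covariates` is an (n_cells, n_cov)
    array; `sample_idx` holds the cell indices of the legacy profiles."""
    n_cells, n_cov = covariates.shape
    qs = np.linspace(0, 1, n_quantiles + 1)[1:-1]      # inner quantile edges
    # quantile bin (0..n_quantiles-1) of each cell for each covariate
    bins = np.stack([np.searchsorted(np.quantile(covariates[:, j], qs),
                                     covariates[:, j])
                     for j in range(n_cov)], axis=1)
    # flatten the (4 x 4 x 4) bins to a single stratum index 0..63
    strata = np.ravel_multi_index(bins.T, (n_quantiles,) * n_cov)
    n_strata = n_quantiles ** n_cov
    cell_density = np.bincount(strata, minlength=n_strata) / n_cells
    samp_density = np.bincount(strata[sample_idx], minlength=n_strata) / len(sample_idx)
    # relative weight > 1: over-sampled stratum; < 1: under-sampled; 0: empty
    with np.errstate(divide="ignore", invalid="ignore"):
        weight = np.where(cell_density > 0, samp_density / cell_density, 0.0)
    return strata, weight
```

Empty strata show up as zero weights, which is how candidate locations for additional sampling can be flagged.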
Conclusion: Since we assume that the soil attributes to be mapped can be predicted from the environmental covariates, our estimation of the sample units is based on those covariates; the results therefore depend strongly on them (their number, spatial resolution, and the quality of their measurement or description). Hypercube sampling provides the means to evaluate the adequacy of sampling units with respect to the soil covariates. The main advantage of the method is that every sample unit can be assessed according to its density in the feature space that represents soil variability. From the results, it is possible to add new sampling units in order to cover the whole feature space; thus, the parts of the feature space that are missing or appear under-sampled can be reinforced.
Keywords: Environmental variables, Latin hypercube, Soil sampling, Soil survey