Authors: Alidad Hashemi, Nicholas Gregor, and Farhang Ostadan

Uncertainty can be separated into “aleatory” and “epistemic” components. Within the framework of a probabilistic seismic hazard analysis (PSHA), both components can and should be accounted for. Aleatory variability refers to the natural randomness of a process; its assessment can be improved, but it can only be reduced as far as the natural process allows. Epistemic uncertainty, in contrast, refers to the engineering uncertainty in modelling the process due to a lack of sufficient data and knowledge; in theory, it can be reduced to zero. The goal of this study is to investigate the impact of epistemic uncertainty in site response analysis when computing Ground Motion Response Spectra (GMRS).

One general principle perceived in engineering practice is that the less information engineers have, the larger the uncertainty and the higher the estimated seismic hazard. As a result, the effort spent collecting more data and reducing uncertainty is rewarded with a lower seismic hazard. The current practice of seismic hazard analysis violates this basic principle: the state-of-practice can result in a reduction in computed seismic hazard as epistemic uncertainty increases. This could lead to the folly of avoiding site-specific investigations, because more precise knowledge of the site conditions (i.e., a decrease in epistemic uncertainty) may lead to an increase in the estimated seismic hazard. Following this current state-of-practice approach, the use of a large epistemic uncertainty, for cases in which limited information is known about the characterization of the site, can lead to a lower mean amplification with a broad bandwidth. With improved data, and thus lower epistemic uncertainty, the mean amplification factors may increase with a narrower bandwidth. This observation contradicts the general principle described above, that less information implies higher uncertainty and results in higher computed seismic ground motions.
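The mechanism behind the lower, broader mean amplification can be illustrated with a minimal numerical sketch: averaging amplification curves whose resonance frequencies are widely spread (a stand-in for large epistemic uncertainty in the velocity profile) flattens and lowers the resonance peak. All frequencies, peak amplitudes, and curve shapes below are assumed for illustration and are not taken from the paper.

```python
import numpy as np

freqs = np.logspace(-1, 1.5, 400)  # 0.1-31.6 Hz

def amplification(f, f0, peak=3.0, width=0.25):
    """Hypothetical bell-shaped (in log frequency) amplification curve peaking at f0."""
    return 1.0 + (peak - 1.0) * np.exp(-0.5 * (np.log(f / f0) / width) ** 2)

# Small epistemic uncertainty: candidate resonance frequencies tightly clustered.
narrow = np.mean([amplification(freqs, f0) for f0 in (4.5, 5.0, 5.5)], axis=0)
# Large epistemic uncertainty: candidate resonance frequencies widely spread.
broad = np.mean([amplification(freqs, f0) for f0 in (2.0, 5.0, 12.0)], axis=0)

print(f"peak of mean amplification, small uncertainty: {narrow.max():.2f}")
print(f"peak of mean amplification, large uncertainty: {broad.max():.2f}")
# The large-uncertainty mean has a lower, flatter peak spread over a wider band.
```

This is only a caricature of the weighted-profile averaging used in practice, but it reproduces the qualitative effect: broadening the assumed range of site conditions trades peak amplification for bandwidth.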

For this study, a test case was generated comparing five base case profiles with the more standard use of three base case profiles. Weights for each base case are estimated following the current approach for both the five- and three-profile cases. The weighted average amplifications for the five base case profiles do not differ significantly from those based on the three base case profiles; the test case shows that the mean-based ground motions are relatively insensitive to the assigned variations in the site response analysis (e.g., three profiles vs. five profiles). However, a key conclusion is that when evaluating the ground motions at a higher fractile level, for example the 84th percentile, the analysis based on additional site-specific data can lead to lower ground motions. This observation is consistent with the premise that the collection and use of additional site-specific data can lead to lower uncertainties and ultimately lower, more refined ground motions. To capture this benefit, however, the seismic provisions controlling the development of ground motions would need to consider fractile levels higher than the mean currently defined in the guidelines. The seismic input motion calculated at the higher fractile level (e.g., the 84th percentile) can be scaled down to the mean level using an appropriate scale factor for a well-investigated site, so that no additional conservatism is implied for well-investigated sites, while the lack of information at sparsely investigated sites is penalized.
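The fractile argument can be sketched with a simple lognormal model. Assuming epistemic uncertainty is summarized by a single natural-log standard deviation (a simplification of the paper's profile-weighting scheme, with all numbers below assumed), the 84th percentile sits one sigma above the median, so reducing sigma lowers the 84th percentile far more than it lowers the mean:

```python
import math

median_amp = 2.0      # assumed median site amplification
sigma_sparse = 0.5    # sparsely investigated site: large epistemic sigma (ln units)
sigma_dense = 0.2     # well-investigated site: reduced epistemic sigma

def fractile(median, sigma_ln, z):
    """z-th standard-normal fractile of a lognormally distributed amplification."""
    return median * math.exp(z * sigma_ln)

# Mean of a lognormal variable: median * exp(sigma^2 / 2)
mean_sparse = median_amp * math.exp(0.5 * sigma_sparse ** 2)
mean_dense = median_amp * math.exp(0.5 * sigma_dense ** 2)
# 84th percentile: one standard normal deviate (z = 1) above the median
p84_sparse = fractile(median_amp, sigma_sparse, 1.0)
p84_dense = fractile(median_amp, sigma_dense, 1.0)

print(f"mean amplification: sparse={mean_sparse:.2f}, dense={mean_dense:.2f}")
print(f"84th percentile:    sparse={p84_sparse:.2f}, dense={p84_dense:.2f}")
```

Under these assumed numbers the mean barely moves when sigma drops, while the 84th percentile falls substantially, which is why a higher-fractile criterion, unlike a mean-based one, rewards additional site investigation.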
