Begell House Inc.
International Journal for Uncertainty Quantification
IJUQ
2152-5080
5
2
2015
A NONSTATIONARY COVARIANCE FUNCTION MODEL FOR SPATIAL UNCERTAINTIES IN ELECTROSTATICALLY ACTUATED MICROSYSTEMS
99-121
Aravind
Alwan
Department of Mechanical Science and Engineering, Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, 405 N. Mathews Avenue, Urbana, IL 61801, USA
Narayana R.
Aluru
Department of Mechanical Science and Engineering, Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, 405 N. Mathews Avenue, Urbana, IL 61801, USA
This paper presents a data-driven method of estimating stochastic models that describe spatial uncertainties. Relating these uncertainties to the spatial statistics literature, we describe a general framework that can handle heterogeneous random processes by providing a parameterization for the nonstationary covariance function in terms of a transformation function and then estimating the unknown hyperparameters from data using Bayesian inference. The transformation function is specified as a displacement that transforms the coordinate space to a deformed configuration in which the covariance between points can be represented by a stationary model. This approach is then used to model spatial uncertainties in microelectromechanical actuators, where the ground plate is assumed to have a spatially varying profile. We estimate the stochastic model corresponding to the random surface using synthetic profilometric data that simulate multiple experimental measurements of ground plate surface roughness. We then demonstrate the effect of the uncertainty on the displacement of the actuator as well as on other parameters, such as the pull-in voltage. We show that accounting for nonstationarity is essential when performing uncertainty quantification in electrostatic microactuators.
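The coordinate-deformation construction described in the abstract can be sketched in a few lines: a nonstationary covariance is obtained by evaluating a stationary kernel at warped coordinates. The warp and the squared-exponential kernel below are illustrative assumptions only; the paper estimates its transformation function and hyperparameters from data via Bayesian inference.

```python
import numpy as np

def warp(x):
    # Hypothetical transformation function d(x): a smooth displacement
    # of the coordinate space (illustrative choice, not the paper's).
    return x + 0.5 * np.sin(np.pi * x)

def stationary_cov(u, v, sigma2=1.0, ell=0.3):
    # Squared-exponential covariance in the deformed coordinates.
    return sigma2 * np.exp(-0.5 * (u - v) ** 2 / ell ** 2)

def nonstationary_cov(x, xp):
    # Nonstationary covariance via the deformation: k(x, x') = k_s(d(x), d(x')).
    return stationary_cov(warp(x), warp(xp))

x = np.linspace(0.0, 1.0, 5)
K = nonstationary_cov(x[:, None], x[None, :])  # 5 x 5 covariance matrix
print(K.shape)
```

Because the stationary kernel is positive definite, the warped version remains a valid covariance for any one-to-one transformation function.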
BIVARIATE QUANTILE INTERPOLATION FOR ENSEMBLE DERIVED PROBABILITY DENSITY ESTIMATES
123-137
Brad Eric
Hollister
Computer Science Department, Jack Baskin School of Engineering, 1156 High Street, University of California, Santa Cruz, California 95060, USA
Alex
Pang
Computer Science Department, Jack Baskin School of Engineering, 1156 High Street, University of California, Santa Cruz, California 95060, USA
Probability density functions (PDFs) may be estimated from members in an ensemble. For an ensemble of 2D vector fields, this results in a bivariate PDF at each location in the field. Vector field analysis and visualization, e.g., streamline calculation, require an interpolation to be defined over these 2D density estimates. Thus, a nonparametric PDF interpolation must advect features rather than cross-fade them, since cross-fading can introduce arbitrary modalities into the distribution. This is already achieved for 1D PDF interpolation via inverse cumulative distribution functions (CDFs). However, there is no closed-form extension to bivariate PDFs. This paper presents one such direct extension of the 1D closed-form solution to bivariates. We show an example of physically coupled components (velocity) and correlated random variables. Our method does not require a complex implementation or expensive computation, as does displacement interpolation [Bonneel et al., ACM Trans. Graphics (TOG), 30(6):158, 2011]. Additionally, our method does not suffer from ambiguous pairwise linear interpolants, as does Gaussian mixture model interpolation.
HIERARCHICAL SPARSE BAYESIAN LEARNING FOR STRUCTURAL HEALTH MONITORING WITH INCOMPLETE MODAL DATA
139-169
Yong
Huang
Division of Engineering and Applied Science, California Institute of Technology, Pasadena, California 91125, USA
James L.
Beck
Division of Engineering and Applied Science, California Institute of Technology, Pasadena, California 91125, USA
For civil structures, structural damage due to severe loading events such as earthquakes, or due to long-term environmental degradation, usually occurs in localized areas of a structure. A new sparse Bayesian probabilistic framework for computing the probability of localized stiffness reductions induced by damage is presented that uses noisy incomplete modal data from before and after possible damage. This new approach employs system modal parameters of the structure as extra variables for Bayesian model updating with incomplete modal data. A specific hierarchical Bayesian model is constructed that promotes spatial sparseness in the inferred stiffness reductions in a way that is consistent with the Bayesian Ockham razor. To obtain the most plausible model of sparse stiffness reductions together with its uncertainty within a specified class of models, the method employs an optimization scheme that iterates among all uncertain parameters, including the hierarchical hyperparameters. The approach has four important benefits: (1) it infers spatially sparse stiffness changes based on the identified modal parameters; (2) the uncertainty in the inferred stiffness reductions is quantified; (3) no matching of model and experimental modes is needed; and (4) solving the nonlinear eigenvalue problem of a structural model is not required. The proposed method is applied to two previously studied examples using simulated data: a ten-story shear building and the three-dimensional braced-frame model from the Phase II Simulated Benchmark problem sponsored by the IASC-ASCE Task Group on Structural Health Monitoring. The results show that the occurrence of false-positive and false-negative damage detection is clearly reduced in the presence of modeling error (differences between the real structural behavior and the model of it). Furthermore, the identified most probable stiffness loss ratios are close to their actual values.
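The sparsity mechanism invoked here, the Bayesian Ockham razor acting through per-parameter hyperparameters, can be illustrated with a generic sparse Bayesian (automatic relevance determination) linear regression sketch. The random design matrix and the evidence-based hyperparameter iteration below are a textbook stand-in, not the paper's hierarchical model for modal data: hyperparameters of parameters that do not help explain the data are driven to large values, pruning those parameters toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 100, 10
Phi = rng.normal(size=(N, M))      # design matrix (stand-in for a structural basis)
w_true = np.zeros(M)
w_true[[2, 7]] = [1.0, -1.5]       # spatially sparse "stiffness changes"
y = Phi @ w_true + 0.05 * rng.normal(size=N)

alpha = np.ones(M)                 # one ARD hyperparameter per parameter
beta = 100.0                       # noise precision, re-estimated below
for _ in range(50):
    # Posterior covariance and mean of the weights given the hyperparameters.
    Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)
    mu = beta * Sigma @ Phi.T @ y
    # Evidence-maximizing updates (Ockham-razor pruning of irrelevant weights).
    gamma = 1.0 - alpha * np.diag(Sigma)
    alpha = np.clip(gamma / np.maximum(mu ** 2, 1e-12), 1e-6, 1e6)
    beta = (N - gamma.sum()) / np.sum((y - Phi @ mu) ** 2)

print(np.round(mu, 2))  # near zero except at the two truly damaged entries
```

Only the two parameters that are actually nonzero survive the iteration; the rest are suppressed, mirroring the localized stiffness reductions inferred in the paper.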
POLYNOMIAL-CHAOS-BASED KRIGING
171-193
Roland
Schöbi
Chair of Risk, Safety and Uncertainty Quantification, Department of Civil Engineering, ETH Zurich, Stefano-Franscini-Platz 5, 8093 Zurich, Switzerland
Bruno
Sudret
ETH Zurich, Institute of Structural Engineering, Chair of Risk, Safety and Uncertainty Quantification, Stefano-Franscini-Platz 5, CH-8093 Zurich, Switzerland
Joe
Wiart
WHIST Lab, Institut Mines Telecom, 46 rue Barrault 75634 Paris Cedex 13, France
Computer simulation has become the standard tool in many engineering fields for designing and optimizing systems, as well as for assessing their reliability. Optimization and uncertainty quantification problems typically require a large number of runs of the computational model at hand, which may not be feasible with high-fidelity models directly. Thus surrogate models (a.k.a. meta-models) have been increasingly investigated in the last decade. Polynomial chaos expansions (PCE) and Kriging are two popular nonintrusive meta-modeling techniques. PCE surrogates the computational model with a series of orthonormal polynomials in the input variables, where the polynomials are chosen in accordance with the probability distributions of those input variables. A least-squares minimization technique may be used to determine the coefficients of the PCE. Kriging assumes that the computer model behaves as a realization of a Gaussian random process whose parameters are estimated from the available computer runs, i.e., input vectors and response values. These two techniques have been developed more or less in parallel so far, with little interaction between the researchers in the two fields. In this paper, PC-Kriging is derived as a new nonintrusive meta-modeling approach combining PCE and Kriging. A sparse set of orthonormal polynomials (PCE) approximates the global behavior of the computational model, whereas Kriging manages the local variability of the model output. An adaptive algorithm similar to the least angle regression algorithm determines the optimal sparse set of polynomials. PC-Kriging is validated on various benchmark analytical functions which are easy to sample for reference results. From the numerical investigations it is concluded that PC-Kriging performs better than, or at least as well as, the two distinct meta-modeling techniques. A larger gain in accuracy is obtained when the experimental design has a limited size, which is an asset when dealing with demanding computational models.
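In its simplest form, the combination described above can be read as universal Kriging with a PCE trend. The minimal sketch below assumes a fixed Gaussian correlation length and a hand-picked Hermite basis on a 1D toy function, whereas the paper selects the sparse polynomial set adaptively and estimates the Kriging parameters from the runs.

```python
import numpy as np

def hermite_basis(x):
    # First three probabilists' Hermite polynomials as the PCE trend.
    return np.stack([np.ones_like(x), x, x ** 2 - 1.0], axis=1)

def corr(a, b, theta=1.0):
    # Gaussian (squared-exponential) correlation for the Kriging part.
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

def pck_fit_predict(x, y, x_new, theta=1.0):
    # Universal-Kriging form of PC-Kriging: the PCE polynomials act as
    # the global trend, and a Gaussian process interpolates the residual
    # to capture local variability of the model output.
    F = hermite_basis(x)
    R = corr(x, x, theta) + 1e-10 * np.eye(len(x))  # tiny nugget for stability
    Ri = np.linalg.inv(R)
    beta = np.linalg.solve(F.T @ Ri @ F, F.T @ Ri @ y)  # generalized least squares
    resid = Ri @ (y - F @ beta)
    return hermite_basis(x_new) @ beta + corr(x_new, x, theta) @ resid

x = np.linspace(-2.0, 2.0, 8)
y = np.sin(2.0 * x) + 0.5 * x ** 2     # toy "computational model"
x_new = np.linspace(-2.0, 2.0, 41)
y_hat = pck_fit_predict(x, y, x_new)
```

Like plain Kriging, the surrogate interpolates the experimental design exactly (up to the nugget), while the polynomial trend carries the global shape between the points.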