Begell House
International Journal for Uncertainty Quantification
ISSN: 2152-5080
Volume 4, Issue 2, 2014
ENABLING THE ANALYSIS OF FINITE ELEMENT SIMULATION BUNDLES
We propose a methodology that enables fast simultaneous evaluation of thousands of finite element design variants. The approach uses a high-dimensional analysis concept, namely diffusion maps, that has been applied successfully for years in many areas of science. The purpose of this paper is to apply this analysis concept to feature vectors, extracted from a bundle of finite element simulations, that contain information on the design variables at different mesh sizes. This approach enables the identification of a set of parameters (reduction coordinates) along which (i) geometrical variants can be identified and (ii) for random time-dependent problems, slow variables can be identified that reveal the variables with the most significant impact on a design. We demonstrate the application of this approach in several industrial examples in the areas of metal forming and vibration analysis, as well as vehicle crash simulation, which is a noisy stochastic process. Finally, we show by example that this approach can identify and expound on the occurrence of a bifurcation point, a very important issue in vehicle design.
Rodrigo
Iza Teran
Fraunhofer Institute for Scientific Computing, 53757 Sankt Augustin, Germany
95-110
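The diffusion-map embedding described in the abstract above can be sketched in a few lines of NumPy. The following is a minimal illustration on synthetic feature vectors; the kernel bandwidth `epsilon`, the toy data, and the function name are assumptions for illustration, not values or code from the paper:

```python
import numpy as np

def diffusion_map(X, epsilon, n_coords=2):
    """Compute diffusion-map reduction coordinates for feature vectors X.

    X: (n_samples, n_features) array, one row per simulation variant.
    epsilon: Gaussian kernel bandwidth (problem dependent).
    Returns the leading non-trivial diffusion coordinates.
    """
    # Pairwise squared distances between feature vectors
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / epsilon)                 # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)      # row-normalize: Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)            # sort by decreasing eigenvalue
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1); scale the rest
    return vals[1:n_coords + 1] * vecs[:, 1:n_coords + 1]

# Toy "bundle": 20 variants, each described by a 5-dimensional feature vector
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
coords = diffusion_map(X, epsilon=10.0)
```

Variants that are geometrically similar map to nearby reduction coordinates, which is what makes clustering and bifurcation detection possible on the embedded bundle.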
DATA-FREE INFERENCE OF UNCERTAIN PARAMETERS IN CHEMICAL MODELS
We outline the use of a data-free inference procedure for estimating the uncertain parameters of a chemical model of methane-air ignition. The method involves a nested pair of Markov chains, exploring both the data and parametric spaces, to discover a pooled joint posterior consistent with the available information. We describe the highlights of the method and detail its particular implementation in the system at hand. We examine the performance of the procedure, focusing on the robustness and convergence of the estimated joint parameter posterior with an increasing number of data-chain samples. We also comment on comparisons of this posterior with the missing reference posterior density.
Habib N.
Najm
Sandia National Laboratories, Livermore, California 94551, USA
Robert D.
Berry
P.O. Box 969, MS 9051, Sandia National Laboratories, Livermore, California 94551, USA
Cosmin
Safta
P.O. Box 969, MS 9051, Sandia National Laboratories, Livermore, California 94551, USA
Khachik
Sargsyan
P.O. Box 969, MS 9051, Sandia National Laboratories, Livermore, California 94551, USA
Bert J.
Debusschere
P.O. Box 969, MS 9051, Sandia National Laboratories, Livermore, California 94551, USA
111-132
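The nested-chain structure described in the abstract above can be illustrated with a deliberately simple stand-in: an outer "data" chain that generates hypothetical data sets consistent with limited summary information (here, only a nominal value and an error bar), and an inner "parameter" chain that samples the posterior for each hypothetical data set, with the inner samples pooled across the outer chain. All model choices and numbers below (Gaussian model, flat prior, `data_mean`, etc.) are illustrative assumptions, not the paper's chemical model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Available information: a nominal value and error bar for the missing data
# (illustrative numbers, not from the paper)
data_mean, data_err, n_obs = 1.5, 0.2, 10

def log_post(mu, data, sigma=0.3):
    # Gaussian likelihood with a flat prior on the model parameter mu
    return -0.5 * np.sum((data - mu) ** 2) / sigma**2

pooled = []
for _ in range(50):                     # outer "data" chain: hypothetical data sets
    data = rng.normal(data_mean, data_err, size=n_obs)
    mu = data_mean                      # inner "parameter" chain: Metropolis
    lp = log_post(mu, data)
    for _ in range(200):
        prop = mu + 0.1 * rng.normal()
        lp_prop = log_post(prop, data)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        pooled.append(mu)

pooled = np.array(pooled)               # pooled joint posterior samples
```

The pooled samples mix posterior uncertainty (inner chain) with uncertainty about the unobserved data themselves (outer chain), which is the essence of the data-free construction.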
GAUSSIAN PROCESS ADAPTIVE IMPORTANCE SAMPLING
The objective is to calculate the probability, PF, that a device will fail when its inputs, x, are randomly distributed with probability density p(x), e.g., the probability that a device will fracture when subject to varying loads. Here failure is defined as some scalar function, y(x), exceeding a threshold, T. If evaluating y(x) via physical or numerical experiments is sufficiently expensive, or PF is sufficiently small, then Monte Carlo (MC) methods to estimate PF will be infeasible due to the large number of function evaluations required for a specified accuracy. Importance sampling (IS), i.e., preferentially sampling from "important" regions in the input space and appropriately down-weighting to obtain an unbiased estimate, is one approach to assess PF more efficiently; the inputs are sampled from an importance density p'(x). We present an adaptive importance sampling (AIS) approach that endeavors to adaptively improve the estimate of the ideal importance density, p*(x), during the sampling process. Our approach uses a mixture of component probability densities, each of which approximates p*(x). An iterative process is used to construct the sequence of improving component probability densities. At each iteration, a Gaussian process (GP) surrogate is used to help identify areas in the space where failure is likely to occur. The GPs are not used to directly calculate the failure probability; they are only used to approximate the importance density. Thus, our Gaussian process adaptive importance sampling (GPAIS) algorithm avoids the limitations of using a potentially inaccurate surrogate model directly in IS calculations. This robust GPAIS algorithm performs surprisingly well on a pathological test function.
Keith R.
Dalbey
Department of Optimization and Uncertainty Quantification, Sandia National Laboratories, Albuquerque, New Mexico 87123, USA
Laura P.
Swiler
Optimization and Uncertainty Quantification Department, Sandia National Laboratories, P.O. Box 5800, MS 1318, Albuquerque, New Mexico 87185, USA
133-149
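The adaptive-mixture reweighting at the core of the abstract above can be sketched on a one-dimensional toy limit state. Note one deliberate simplification: where the paper uses a GP surrogate to locate likely failure regions, this sketch simply centers each new mixture component on the failure samples observed so far. The limit state, threshold, and all numbers are illustrative assumptions:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

# Toy 1-D limit state: failure when y(x) > T, nominal input density p = N(0, 1)
y = lambda x: x
T = 3.0
norm_pdf = lambda x, m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Adaptively build an equal-weight mixture importance density: each iteration
# adds a unit-variance Gaussian component near the observed failure region.
means = [0.0]
for _ in range(4):
    xs = rng.normal(means[-1], 1.0, size=2000)
    fail = y(xs) > T
    # Center the next component on observed failures, or drift toward T if none
    means.append(xs[fail].mean() if fail.any() else means[-1] + 1.0)

# Final IS estimate: sample the mixture, reweight by p(x)/q_mix(x) for unbiasedness
xs = np.concatenate([rng.normal(m, 1.0, size=2000) for m in means])
q_mix = np.mean([norm_pdf(xs, m, 1.0) for m in means], axis=0)
w = norm_pdf(xs, 0.0, 1.0) / q_mix
pf_is = np.mean(w * (y(xs) > T))
pf_true = 0.5 * (1.0 - erf(T / sqrt(2.0)))   # exact tail probability, for reference
```

Keeping the nominal density as one mixture component bounds the importance weights, which is what makes the estimator robust even when later components overshoot the failure region.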
INFERENCE AND UNCERTAINTY PROPAGATION OF ATOMISTICALLY-INFORMED CONTINUUM CONSTITUTIVE LAWS, PART 1: BAYESIAN INFERENCE OF FIXED MODEL FORMS
Uncertainty quantification techniques have the potential to play an important role in constructing constitutive relationships applicable to nanoscale physics. At these small scales, deviations from laws appropriate at the macroscale arise due to insufficient separation between the atomic and continuum length scales, as well as fluctuations due to thermal processes. In this work, we consider the problem of inferring the coefficients of an assumed constitutive model form from atomistic information, and of propagating the associated uncertainty. A nanoscale heat transfer problem is taken as the model problem, and we use a polynomial chaos expansion to represent the thermal conductivity with a linear temperature dependence. A Bayesian inference method is developed to extract the coefficients in this expansion from molecular dynamics (MD) samples at prescribed temperatures. Importantly, the atomistic data are incompatible with the continuum model because of the finite probability of heat flowing in the opposite direction of the temperature gradient; we present a method to account for this in the model. The fidelity and uncertainty of these techniques are then examined. Validation is provided by comparing a continuum Fourier model against a larger all-MD simulation representing the true solution.
Maher
Salloum
Sandia National Laboratories, 7011 East Avenue, MS 9158, Livermore, California 94550, USA
Jeremy A.
Templeton
Sandia National Laboratories, 7011 East Avenue, MS 9409, Livermore, California 94550, USA
151-170
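The inference step in the Part 1 abstract above can be illustrated with a minimal sketch. A first-order expansion of a conductivity with linear temperature dependence reduces to a two-coefficient linear model, so the sketch infers those two coefficients by random-walk Metropolis from synthetic flux samples. The data are a synthetic stand-in for MD output, the noise model is a plain Gaussian (the paper's treatment of counter-gradient flux is not reproduced), and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for MD heat-flux samples at prescribed temperatures.
# Linear conductivity about a reference temperature: k(T) = c0 + c1*(T - 300)
c0_true, c1_true, gradT, noise = 5.0, 0.01, -5.0, 0.5
temps = np.linspace(100.0, 500.0, 9)
dT = temps - 300.0
q_obs = -(c0_true + c1_true * dT) * gradT + rng.normal(0, noise, temps.size)

def log_post(c):
    # Gaussian likelihood (q = -k * gradT), flat priors on the coefficients
    q_model = -(c[0] + c[1] * dT) * gradT
    return -0.5 * np.sum((q_obs - q_model) ** 2) / noise**2

# Random-walk Metropolis over (c0, c1), initialized at a crude flux-based estimate
c = np.array([q_obs.mean() / -gradT, 0.0])
lp = log_post(c)
chain = []
for _ in range(20000):
    prop = c + rng.normal(0, [0.05, 0.0005])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        c, lp = prop, lp_prop
    chain.append(c.copy())
chain = np.array(chain)[5000:]           # discard burn-in
c0_hat, c1_hat = chain.mean(axis=0)      # posterior means of the coefficients
```

The retained chain characterizes the joint coefficient posterior, which can then be propagated through a continuum heat-conduction solve to quantify uncertainty in the predicted temperature field.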
INFERENCE AND UNCERTAINTY PROPAGATION OF ATOMISTICALLY INFORMED CONTINUUM CONSTITUTIVE LAWS, PART 2: GENERALIZED CONTINUUM MODELS BASED ON GAUSSIAN PROCESSES
Constitutive models in nanoscience and engineering often poorly represent the physics due to significant deviations in model form from their macroscale counterparts. In Part 1 of this study, this problem was explored by considering a continuum scale heat conduction constitutive law inferred directly from molecular dynamics (MD) simulations. In contrast, this work uses Bayesian inference based on the MD data to construct a Gaussian process emulator of the heat flux as a function of temperature and temperature gradient. No assumption of Fourier-like behavior is made, requiring alternative approaches to assess the well-posedness and accuracy of the emulator. Validation is provided by comparing continuum scale predictions using the emulator model against a larger all-MD simulation representing the true solution. The results show that a Gaussian process emulator of the heat conduction constitutive law produces an empirically unbiased prediction of the continuum scale temperature field for a variety of time scales, which was not observed when Fourier's law was assumed to hold. Finally, uncertainty is propagated in the continuum model and quantified in the temperature field so that the impact of errors in the model on continuum quantities can be determined.
Maher
Salloum
Sandia National Laboratories, 7011 East Avenue, MS 9158, Livermore, California 94550, USA
Jeremy A.
Templeton
Sandia National Laboratories, 7011 East Avenue, MS 9409, Livermore, California 94550, USA
171-184
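The emulation idea in the Part 2 abstract above, a heat-flux model with no assumed Fourier-like form, can be sketched as plain GP regression on (temperature, temperature gradient) inputs. The training data below are a synthetic stand-in for MD flux samples, and the kernel length scales, variance, and noise level are assumed values, not the paper's hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for MD heat-flux samples:
# inputs are (temperature, temperature gradient), output is the heat flux
X = np.column_stack([rng.uniform(100.0, 500.0, 40),   # temperature
                     rng.uniform(-10.0, 10.0, 40)])   # temperature gradient
q = -(2.0 + 0.01 * X[:, 0]) * X[:, 1] + rng.normal(0, 0.2, 40)

def sq_exp(A, B, ls=(100.0, 5.0), var=1000.0):
    # Squared-exponential kernel with per-input length scales (assumed values)
    d2 = sum(((A[:, None, i] - B[None, :, i]) / l) ** 2
             for i, l in enumerate(ls))
    return var * np.exp(-0.5 * d2)

# GP regression: no Fourier-like functional form is imposed on the flux
K = sq_exp(X, X) + 0.2**2 * np.eye(len(X))   # observation noise on the diagonal
alpha = np.linalg.solve(K, q)

def emulate(Xs):
    # Posterior mean of the heat flux at new (T, grad T) points
    return sq_exp(Xs, X) @ alpha

q_pred = emulate(np.array([[300.0, 5.0]]))[0]
```

Because the emulator is nonparametric, well-posedness of the resulting continuum model has to be checked empirically (e.g., that the emulated flux decreases monotonically with the gradient over the training range) rather than inherited from a closed-form law.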