
Hawkins Memorial Lecture: Predictive Simulation in the Thermal Sciences: Opportunities and New Directions. Dr. Jayathi Y. Murthy, Ernest Cockrell Jr. Chair and Professor of Mechanical Engineering, University of Texas at Austin. October 17, 2013, 4:30 pm, ME 1061

Abstract: Over the past few decades, computational fluid dynamics (CFD) and computational heat transfer (CHT) have evolved from research tools, developed and used by a few experts in research laboratories, into essential adjuncts of the industrial design and analysis process. In this talk, I trace the historical evolution of CFD and CHT, from the early work of pioneers like Lewis Fry Richardson, through the foundational advances of the World War II era by giants like von Neumann and Courant, to the maturation and widespread adoption of CFD in industry. Central to this ubiquitous spread was the development of unstructured solution-adaptive finite volume methods and the generalization of these methods to a wide range of industrially relevant physics. More recently, as interest in the heat transfer community has shifted to micro- and nanoscale transport, these methods have been used to simulate flow and heat transfer at sub-micron scales. Recent computational advances in these areas are described. Finally, I turn to the notion of “predictive simulation,” i.e., computational predictions with quantified uncertainty bounds, an emerging and largely unexplored opportunity for CFD and CHT. It encompasses areas such as local and global sensitivity analysis, probabilistic simulations accounting for aleatoric and epistemic uncertainties, and, ultimately, the melding of decision theory with simulation and experiments. Specific examples from the speaker’s PRISM center in the area of microsystem simulation are adduced to illustrate the power of these ideas, and research opportunities for predictive simulation in multiscale, multiphysics systems are discussed.
Bio: Jayathi Murthy is Ernest Cockrell Jr. Department Chair and Professor of Mechanical Engineering at The University of Texas at Austin and Director of PRISM: NNSA Center for Prediction of Reliability, Integrity and Survivability of Microsystems. She received her Ph.D. from the University of Minnesota in the area of numerical heat transfer and has worked in both academia and industry. During her employment at Fluent Inc., a leading vendor of CFD software, she developed the unstructured solution-adaptive finite volume methods underlying their flagship software Fluent and the electronics cooling software package ICEPAK. More recently, her research has addressed sub-micron thermal transport, multiscale multiphysics simulations of MEMS and NEMS, and uncertainty quantification in these systems. She is the recipient of the IBM Faculty Partnership Award (2003-2005), the 2004 Journal of Electronics Packaging Best Paper Award, the 2007 ASABE Best Paper Award, the 2008 ASME HTD Best Paper Award, the 2009 ASME EPPD Woman Engineer of the Year Award and the 2012 ASME EPPD Clock Award. In 2012, she was named a distinguished alumna of IIT Kanpur, India. Prof. Murthy serves on the editorial boards of Numerical Heat Transfer and International Journal of Thermal Sciences and is an editor of the 2nd edition of the Handbook of Numerical Heat Transfer. She has served on numerous national committees and panels on electronics thermal management and CFD, and is the author of over 280 technical publications.
submitted by rdx313 to PurdueSeminars

[D] Difference between probabilistic and Bayesian approaches for modeling uncertainty?

I notice there are approaches (such as the Hierarchical Probabilistic U-Net, https://arxiv.org/pdf/1905.13077.pdf) which use the generated samples to imply uncertainty, and there are Bayesian models (such as Bayesian SegNet, https://arxiv.org/pdf/1511.02680.pdf) for directly modeling epistemic/aleatoric uncertainty.
I'm slightly confused: are the two approaches essentially modeling the same uncertainty? How can we differentiate between them? As far as I understand, the probabilistic generative approach cannot directly represent model uncertainty but only implies it through the generated samples - is that a correct interpretation? Thanks
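For intuition, here is a minimal sketch of the two sampling schemes; `generative_net` (taking an input plus a latent code) and `dropout_net` are hypothetical stand-ins, not the actual models from either paper:

```python
import torch

def generative_uncertainty(generative_net, x, latent_dim, n_samples=20):
    """Latent-variable approach (HPU-Net style): diversity comes from sampling
    a latent code z, so the sample spread mixes all sources of uncertainty."""
    samples = torch.stack([generative_net(x, torch.randn(x.shape[0], latent_dim))
                           for _ in range(n_samples)])
    return samples.mean(0), samples.var(0)

def mc_dropout_uncertainty(dropout_net, x, n_samples=20):
    """Bayesian approximation (Bayesian SegNet style): diversity comes from
    sampling effective weights via dropout."""
    dropout_net.train()  # keep dropout active at test time
    samples = torch.stack([dropout_net(x) for _ in range(n_samples)])
    return samples.mean(0), samples.var(0)
```

Mechanically, both compute a variance over stochastic forward passes; the difference is what is being sampled (latent codes vs. effective weights), which is why only the second is conventionally read as model/epistemic uncertainty, while the first yields a predictive distribution that entangles both kinds.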
submitted by agoevm to MachineLearning

Uncertainty estimates from deep neural networks - opinions wanted

There have been many approaches proposed over the years for obtaining uncertainty estimates from neural networks (e.g. MC dropout, concrete dropout, Stein variational gradient descent, stochastic gradient Langevin dynamics, deep ensembles, BNNs via variational inference).
Aleatoric uncertainty can be estimated by using a parametric distribution as the output and minimising the negative log likelihood, but estimating epistemic uncertainty requires some kind of distribution over network weights (or distribution over functions).
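A minimal sketch of that parametric-output approach for regression (standard heteroscedastic Gaussian NLL, not tied to any one paper): the network predicts a mean and a log-variance, and minimising the NLL lets the predicted variance absorb the aleatoric noise.

```python
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    """Predicts both the mean and the log-variance of y given x."""
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, 1)
        self.log_var_head = nn.Linear(hidden, 1)  # log-variance for numerical stability

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.log_var_head(h)

def gaussian_nll(mean, log_var, y):
    # Negative log-likelihood of y under N(mean, exp(log_var)), up to a constant.
    return 0.5 * (log_var + (y - mean) ** 2 / log_var.exp()).mean()
```

The predicted exp(log_var) is the aleatoric estimate only; it says nothing about uncertainty in the weights themselves, which is what the weight-distribution methods listed above target.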
Am I missing something or are we still a way off from reaching any kind of consensus on what would work best here? Is there a current SOTA, or approach that just 'generally works' in terms of estimating uncertainty from neural networks? I'm primarily interested in regression.
MC dropout had sounded promising, and is simple to implement, but seems to have fallen out of favour due to not living up to its claims.
Or should we all just be using deep Gaussian processes?
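For what it's worth, deep ensembles combine naturally with the parametric-output trick above; a sketch of the usual decomposition (in the spirit of the deep-ensembles recipe, not anyone's exact code), assuming each member is a heteroscedastic net returning a mean and a log-variance:

```python
import torch

def ensemble_predict(models, x):
    """Combine M heteroscedastic networks into one predictive mean
    plus decomposed uncertainties."""
    outputs = [net(x) for net in models]                      # list of (mean, log_var)
    means = torch.stack([mu for mu, _ in outputs])            # (M, batch, 1)
    variances = torch.stack([lv.exp() for _, lv in outputs])  # (M, batch, 1)
    pred_mean = means.mean(dim=0)
    aleatoric = variances.mean(dim=0)  # average predicted noise level
    epistemic = means.var(dim=0)       # disagreement between ensemble members
    return pred_mean, aleatoric, epistemic
```

Total predictive variance is aleatoric + epistemic, and the epistemic term shrinks where the members agree, which in practice is why ensembles are often the "generally works" baseline people reach for in regression.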
submitted by BelligerentCatharsis to deeplearning

Discussion and Help. Bayesian Neural Network in unsupervised learning

Is there any sense in using BNNs in unsupervised tasks? As I understand it, a BNN models both epistemic and aleatoric uncertainty in its predictions.
For unsupervised tasks, there is no way to measure aleatoric uncertainty (or is there?), so apart from epistemic uncertainty, will there be any advantage? In general, what would be the scope of the model?
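One way to see where each kind of uncertainty could still live in an unsupervised setting (a speculative sketch under loose assumptions, not an established recipe): if the unsupervised model defines a reconstruction likelihood, e.g. hypothetical autoencoders that each return a reconstruction and a log-variance, then the learned observation noise plays the aleatoric role and ensemble disagreement the epistemic one:

```python
import torch

def unsupervised_uncertainties(autoencoders, x):
    """Assumes each model returns (reconstruction, log_var), i.e. a Gaussian
    likelihood over the inputs themselves. Purely illustrative."""
    outputs = [ae(x) for ae in autoencoders]
    recons = torch.stack([r for r, _ in outputs])
    noise = torch.stack([lv.exp() for _, lv in outputs])
    aleatoric = noise.mean(0)   # learned observation noise, same role as in regression
    epistemic = recons.var(0)   # disagreement between independently trained models
    return aleatoric, epistemic
```

So aleatoric uncertainty becomes measurable as soon as the model assigns a likelihood to the data; without that, ensemble or BNN disagreement (epistemic) is indeed the main thing a Bayesian treatment adds.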
submitted by invoker96_ to deeplearning

[D] What is the current state of dropout as Bayesian approximation?

Some time ago already, Gal & Ghahramani published their Dropout as Bayesian Approximation paper, and a few more follow-up papers by Gal and colleagues about epistemic vs. aleatoric risks etc. There they claim that test-time dropout can be seen as Bayesian approximation to a Gaussian process related to the original network. (I would not claim to understand the proof in all of its details.) So far so good, but at the Bayesian DL workshop at NIPS2016 Ian Osband of Google DeepMind published his note Risk versus Uncertainty in Deep Learning: Bayes, Bootstrap and the Dangers of Dropout, where he claims that even for absurdly simple networks you can analytically show that the 'posterior' you get using MC dropout doesn't concentrate asymptotically -- which I take as saying that there's no Bayesian approximation happening, since almost any reasonable prior on the weights should lead to a near-certain posterior in the limit of infinite data.
Alas, there are still papers popping up using the MC dropout approach, without even mentioning Osband's note. Did I miss something? Is there a follow-up to Osband's note? A rebuttal? I didn't attend NIPS2016, and I am thus not aware of any discussions that might have happened there, but would certainly appreciate any pointers (-- and given that Yarin Gal was co-organizing that workshop, I am pretty sure that he has seen Osband's note).
Edit: For completeness, here is Yarin Gal's thesis on this topic and the appendix to their 2015 paper containing the proof. Additionally, the supplementary material (section A) of Deep Exploration via Bootstrapped DQN contains some more of Ian's thoughts on this issue
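Osband's claim is easy to probe empirically; the following is a rough sketch of such a check (my construction, not from his note): fit the same dropout network on increasingly large samples of a noise-free function and watch the MC-dropout predictive spread, which stays pinned by the dropout rate instead of shrinking with the data:

```python
import torch
import torch.nn as nn

def mc_dropout_spread(n_train, p=0.5, n_mc=200, steps=2000):
    x = torch.rand(n_train, 1) * 2 - 1
    y = torch.sin(3 * x)  # noise-free data: a true posterior should concentrate
    net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                        nn.Dropout(p), nn.Linear(64, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        ((net(x) - y) ** 2).mean().backward()
        opt.step()
    net.train()  # dropout stays active for the MC samples
    preds = torch.stack([net(torch.zeros(1, 1)) for _ in range(n_mc)])
    return preds.std().item()

for n in (100, 10_000):
    print(n, mc_dropout_spread(n))  # spread barely moves with 100x more data
```

If the MC-dropout spread were an honest posterior, the second number would be much smaller than the first; in practice it is governed by p, which is the sense in which the 'posterior' fails to concentrate.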
submitted by sschoener to MachineLearning

Using deep learning to design a 'super compressible' material

The system uses a comparatively little-used method called Bayesian machine learning. The researcher (Miguel Bessa, Assistant Professor in Materials Science and Engineering at Delft University of Technology) thought probabilistic techniques were the way to go when analyzing or designing structure-dominated materials because they deal with uncertainties that he categorizes as "epistemic" and "aleatoric". Standard deep learning methods are non-probabilistic.
"Epistemic or model uncertainties affect how certain we are of the model predictions (this uncertainty tends to decrease as more data is used for training). Aleatoric uncertainties arise when data is gathered from noisy observations (for example, when different material responses are observed due to uncontrollable manufacturing imperfections)."
Structure-dominated materials "are often strongly sensitive to manufacturing imperfections because they obtain their unprecedented properties by exploring complex geometries, slender structures and/or high-contrast base material properties."
https://www.youtube.com/watch?v=cWTWHhMAu7I
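To make the epistemic/aleatoric split concrete in this setting, here is a hedged sketch using Gaussian process regression (an illustration of the idea, not Bessa's actual pipeline; the data below are invented):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy stand-in for noisy simulated material responses.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (30, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(30)

# The WhiteKernel fits the observation noise; the RBF part models the response.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)
mean, std = gp.predict(np.linspace(0, 1, 100)[:, None], return_std=True)

# With WhiteKernel in the kernel, sklearn's predictive std includes the noise.
aleatoric_var = gp.kernel_.k2.noise_level  # fitted observation-noise variance
epistemic_var = std ** 2 - aleatoric_var   # spread left after removing the noise
```

The fitted noise level is the aleatoric estimate (manufacturing-like scatter), while the remaining predictive spread shrinks as more designs are evaluated, which is the epistemic part a non-probabilistic network would not expose.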
submitted by waynerad to u/waynerad

Bayes by Backprop - Likelihood function variance for regression

This question isn't necessarily specific to Bayes by Backprop, but it's an example of a model where my question comes up.
So the modified objective function (the negative ELBO) can be broken down into the KL divergence (which serves as a complexity cost) and the likelihood term. For regression, the likelihood is a normal distribution with a mean equal to the model output and a variance fixed to a constant, assumed noise variance of the original data. So this assumes that the original data is inherently noisy (aleatoric uncertainty).
If we have a dataset that is not noisy, and we only want to model the model uncertainty (epistemic uncertainty), then presumably the variance in the likelihood is 0, which weights the likelihood term infinitely more than the KL divergence term, getting us back to the original squared loss. Does this mean that Bayes by Backprop is only useful for epistemic uncertainty and does not capture aleatoric uncertainty?
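For reference, a minimal sketch of the objective in question (a Bayes-by-Backprop-style negative ELBO with a Gaussian likelihood, my paraphrase rather than the paper's code), written with a learnable log-variance so the noise level need not be fixed in advance:

```python
import torch

def negative_elbo(pred_mean, y, kl_divergence, log_sigma2, n_batches):
    """KL complexity cost plus Gaussian negative log-likelihood.
    If log_sigma2 is a fixed constant, only the weight posterior (epistemic)
    is learned; making it a trainable parameter (or a per-input network
    output) lets the model absorb aleatoric noise as well."""
    nll = 0.5 * (log_sigma2 + (y - pred_mean) ** 2 / log_sigma2.exp()).mean()
    return kl_divergence / n_batches + nll
```

So, under the usual fixed-variance setup, the answer is essentially yes: Bayes by Backprop captures epistemic uncertainty through the weight posterior, and the aleatoric part is whatever constant you assumed, unless you make the likelihood variance learnable as above.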
submitted by SamStringTheory to learnmachinelearning

aleatoric and epistemic uncertainty - collected excerpts

Epistemic uncertainty, or subjective uncertainty, refers to uncertainty that can be reduced. [3] Despite these apparent distinctions in uncertainty, probability theory alone has traditionally been used to characterize both forms of uncertainty in engineering applications [Apostolakis, 1990; Helton et al., 2004].

In particular, this includes the importance of distinguishing between (at least) two different types of uncertainty, often referred to as aleatoric and epistemic. In this paper, we provide an introduction to the topic of uncertainty in machine learning as well as an overview of attempts so far at handling uncertainty in general and formalizing this distinction in particular.

We provide single-model estimates of aleatoric and epistemic uncertainty for deep neural networks. To estimate aleatoric uncertainty, we propose Simultaneous Quantile Regression (SQR), a loss function to learn all the conditional quantiles of a given target variable. These quantiles can be used to compute well-calibrated prediction intervals.

Various papers discuss the need to decompose uncertainty into aleatoric and epistemic components. In the context of reliability and risk analysis, uncertainty quantification is commonly ...

Aleatory variability and epistemic uncertainty are terms used in seismic hazard analysis that are not commonly used in other fields, but the concepts are well known. Aleatory variability is the natural randomness in a process. For discrete variables, the randomness is parameterized by the probability of each possible value.

[Figure: Aleatoric and epistemic uncertainty for logistic regression as a function of sample size.] In another experiment ("5.4. Comparing predicted and reported uncertainty"), we analyzed to what extent the uncertainty "predicted" by our model is in agreement with the GP's level of uncertainty in a classification.

Note that aleatoric uncertainty describes information that the data cannot explain and can only be removed by gathering more precise data, whereas epistemic uncertainty describes what the model does not know because of a lack of training data, and can be reduced by collecting more training data.

To model and compute uncertainty, we introduced Monte Carlo and ensemble methods for modeling epistemic uncertainty, and probabilistic deep learning for computing (predicting) aleatoric uncertainty. For further modeling approaches, see the references cited in the corresponding figures.

Epistemic uncertainty derives from a lack of knowledge of a parameter, phenomenon or process, while aleatory uncertainty refers to uncertainty caused by probabilistic variations in a random event. Each of these two types of uncertainty has its own set of characteristics that separate it from the other, and each can be quantified through different methods.
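Since one of the excerpts above mentions Simultaneous Quantile Regression (SQR), here is a minimal sketch of the pinball (quantile) loss it builds on, with the target quantile drawn per example so one model learns all conditional quantiles (my paraphrase of the idea, not the authors' code; `net` is a hypothetical network that accepts the quantile level as an extra input feature):

```python
import torch

def pinball_loss(pred, y, tau):
    """Quantile loss: its minimiser is the tau-th conditional quantile of y."""
    diff = y - pred
    return torch.maximum(tau * diff, (tau - 1) * diff).mean()

def sqr_step(net, x, y):
    # One quantile level per example, drawn uniformly and fed as an input,
    # so a single network learns the whole conditional quantile function.
    tau = torch.rand(x.shape[0], 1)
    pred = net(torch.cat([x, tau], dim=1))
    return pinball_loss(pred, y, tau)
```

A 90% prediction interval is then [net(x, 0.05), net(x, 0.95)], and its width serves as the aleatoric estimate.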
