Inference on Phase-type Models via MCMC
1 Inference on Phase-type Models via MCMC, with application to networks of repairable redundant systems. Louis J. M. Aslett and Simon P. Wilson, Trinity College Dublin, 28th June 2012.
2 Toy Example: Redundant Repairable Components, C1 and C2
State 1: both C1 and C2 work
State 2: C1 failed, C2 working
State 3: C1 working, C2 failed
State 4: system failed
[Figure: a general stochastic process Y(t) against t, with the time at which the system fails marked.]
3 Continuous-time Markov Chain Model
State 1: both C1 and C2 work; State 2: C1 failed, C2 working; State 3: C1 working, C2 failed; State 4: system failed.
[Diagram: state-transition diagram, with component failures at rate λ_f, component repairs at rate λ_r and system repair at rate λ_u.]
π = (1, 0, 0, 0),  T = [ -2λ_f, λ_f, λ_f, 0 ; λ_r, -(λ_r+λ_f), 0, λ_f ; λ_r, 0, -(λ_r+λ_f), λ_f ; 0, 0, 0, 0 ]
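As a concrete illustration (not part of the slides), the intensity matrix above can be assembled in a few lines; numpy is assumed, and the rate values are the ones used later in the toy example.

```python
import numpy as np

def toy_generator(lam_f, lam_r):
    """Full intensity matrix T for the 2-component redundant repairable system.

    States (1-4 on the slide, 0-3 here): both up; C1 down; C2 down;
    system failed (absorbing, so its row is zero)."""
    return np.array([
        [-2 * lam_f,            lam_f,             lam_f,   0.0],
        [     lam_r, -(lam_r + lam_f),              0.0,  lam_f],
        [     lam_r,              0.0, -(lam_r + lam_f),  lam_f],
        [       0.0,              0.0,              0.0,    0.0],
    ])

T = toy_generator(1.8, 9.5)
print(T.sum(axis=1))   # every row of an intensity matrix sums to zero
```

The zero-row-sum check is a useful sanity test whenever a generator is built by hand.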
4 Inferential Setting
Cano & Rios (2006) provide conjugate posterior calculations in the context of analysing repairable systems when the stochastic process leading to absorption is observed.
Data: for each system failure time, one has:
- starting state;
- length of time in each state;
- number of transitions between each pair of states;
- ultimate system failure time.
5 Inferential Setting (continued)
Reduced information scenario: only the system failure times are observed. Then Bladt et al. (2003) provide a Bayesian MCMC algorithm, or Asmussen et al. (1996) provide a frequentist EM algorithm.
6 Definition of Phase-type Distributions
An absorbing continuous-time Markov chain is one in which there is a state that, once entered, is never left. That is, the (n+1)-state intensity matrix can be written
T = ( S  s ; 0  0 )
where S is n × n, s is n × 1 and 0 is 1 × n, with s = -S e.
A Phase-type distribution (PHT) is then defined to be the distribution of the time to entering the absorbing state:
Y ~ PHT(π, S)  =>  F_Y(y) = 1 - π^T exp{yS} e,  f_Y(y) = π^T exp{yS} s
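These two formulas are a matrix exponential away from code. A minimal sketch using scipy's expm (the function names are mine, not from the talk):

```python
import numpy as np
from scipy.linalg import expm

def pht_pdf(y, pi, S):
    """f_Y(y) = pi' exp(yS) s, with exit-rate vector s = -S e."""
    s = -S @ np.ones(len(pi))
    return pi @ expm(y * S) @ s

def pht_cdf(y, pi, S):
    """F_Y(y) = 1 - pi' exp(yS) e."""
    return 1.0 - pi @ expm(y * S) @ np.ones(len(pi))

# One transient state with exit rate 2 is just an Exponential(2) distribution:
pi, S = np.array([1.0]), np.array([[-2.0]])
print(pht_pdf(0.5, pi, S), pht_cdf(0.5, pi, S))   # 2*exp(-1), 1 - exp(-1)
```

The single-state check against the Exponential distribution is an easy way to validate any PHT implementation.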
7 Relating to the Toy Example
State 1: both C1 and C2 work; State 2: C1 failed, C2 working; State 3: C1 working, C2 failed; State 4: system failed (absorbing).
S = [ -2λ_f, λ_f, λ_f ; λ_r, -(λ_r+λ_f), 0 ; λ_r, 0, -(λ_r+λ_f) ],  s = (0, λ_f, λ_f)^T
f_Y(y) = π^T exp{yS} s,  F_Y(y) = 1 - π^T exp{yS} e
8 Bladt et al.: Gibbs Sampling from the Posterior
The strategy is a Gibbs MCMC algorithm which achieves the goal of simulating from p(π, S | y) by sampling from p(π, S, paths | y) through the iterative process:
p(π, S | paths, y)
p(paths | π, S, y)
9 Bladt et al.: Metropolis-Hastings Simulation of the Process
In summary:
- One can simulate a chain from p(path | Y_i ≥ y_i) trivially by rejection sampling.
- A Metropolis-Hastings acceptance ratio (a ratio of exit rates) exists such that truncating the chain at time y_i (at which point it absorbs) yields a draw from p(path | Y_i = y_i).
Hierarchy of samplers:
CTMC sampling: p(path | π, S)
Rejection sampling: p(path | π, S, Y ≥ y)
Metropolis-Hastings: p(path | π, S, Y = y)
10 Motivation for Modifications
1. Certain state transitions make no physical sense (e.g. 2 → 3 in the earlier example).
2. When part of a larger system, it is highly likely there will be censored observations.
3. Where there is no reason to believe distributional differences between parameters, they should (in an idealised modelling sense) be constrained to be equal. This is as much to assist with reducing parameter dimensionality.
4. Computation time!
11 Toy Example Results
100 uncensored observations simulated from the PHT with λ_f = 1.8, λ_r = 9.5.
[Figure: posterior densities of the entries constrained to equal λ_f (S_12, S_13, s_2, s_3), with the true parameter value marked.]
12 Toy Example Results (continued)
100 uncensored observations simulated from the PHT with λ_f = 1.8, λ_r = 9.5.
[Figure: posterior density of λ_r, with the true parameter value marked.]
Reliability is less sensitive to λ_r (Daneshkhah & Bedford 2008).
13 Toy Example Results 2
100 uncensored observations simulated from the PHT with λ_f = 1.8, λ_r = 9.5.
[Figure: posterior densities of S_23 and S_32.]
14-19 The Big Issue
Intractable computation time for many applications!
1. Longer chains, and MCMC jumps to states for which the observations are far in the tails, can stall the rejection sampling step of the MH algorithm when drawing p(paths | π, S, y): for example, with P(Y_i ≥ y_i | π, S) = 10^-6, each accepted path needs on the order of 1/10^-6 = 10^6 rejection-sampling iterations in expectation (the CI for E(RS iter.) shown on the slide has lower bound 2537).
2. There are states from which absorption is impossible, so it is wasteful to resample the whole chain just because the state at time y_i is unsuitable for truncation.
[Table: probability of being in each of states 1-3 at time y_i; E(MH iter.) with CI = [36, 5267].]
3. The time for the MH algorithm to reach stationarity can grow rapidly.
[Figure: total variation distance against iterations.]
20 Solution: Exact Conditional Sampling
Idea: sample from p(path | π, S, Y = y) exactly, rather than via rejection sampling of p(path | π, S, Y ≥ y) followed by a Metropolis-Hastings correction, as in the hierarchy:
CTMC sampling: p(path | π, S)
Rejection sampling: p(path | π, S, Y ≥ y)
Metropolis-Hastings: p(path | π, S, Y = y)
21 Solution: Exact Conditional Sampling (continued)
Each step of the standard CTMC sampler for p(path | π, S) is modified to condition on Y = y:
i) Starting state ~ discrete: Y(0) ~ π
ii) Advance time ~ Exponential: δ ~ Exp(-S_{Y(t),Y(t)})
iii) Select next state ~ discrete: Y(t + δ) = j with probability S_{Y(t),j} / (-S_{Y(t),Y(t)})
iv) If not absorbed, set t = t + δ and loop to ii)
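Steps i)-iv) of the unconditioned CTMC sampler can be sketched as follows (a hypothetical helper assuming numpy; not the authors' implementation):

```python
import numpy as np

def sample_ctmc_path(pi, S, rng):
    """Draw one path from p(path | pi, S), run until absorption.

    Returns the visited transient states and the sojourn time in each."""
    n = len(pi)
    exit_rates = -S @ np.ones(n)                  # s = -S e
    state = rng.choice(n, p=pi)                   # i) starting state ~ pi
    states, sojourns = [state], []
    while True:
        rate = -S[state, state]
        sojourns.append(rng.exponential(1.0 / rate))   # ii) delta ~ Exp(-S_ii)
        # iii) next state: transient j w.p. S_ij/rate, absorption w.p. s_i/rate
        probs = np.append(S[state] / rate, exit_rates[state] / rate)
        probs[state] = 0.0                        # no self-transition
        nxt = rng.choice(n + 1, p=probs)
        if nxt == n:                              # entered the absorbing state
            return states, sojourns
        state = nxt                               # iv) advance and loop
        states.append(state)

rng = np.random.default_rng(1)
states, sojourns = sample_ctmc_path(np.array([1.0]), np.array([[-2.0]]), rng)
```

With a single transient state the path is a single Exponential sojourn, which makes the helper easy to test.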
22 Solution: Exact Conditional Sampling (continued)
e.g. the starting-state mass function changes with the conditioning:
P(Y(0) = i | π, S) = π_i
P(Y(0) = i | π, S, Y = y) = π_i e_i^T exp{Sy} s / (π^T exp{Sy} s)
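This conditional mass function is directly computable; a self-contained sketch (scipy assumed, names my own):

```python
import numpy as np
from scipy.linalg import expm

def start_given_absorption_at(pi, S, y):
    """P(Y(0) = i | pi, S, Y = y) = pi_i e_i' exp(Sy) s / (pi' exp(Sy) s)."""
    s = -S @ np.ones(len(pi))
    w = pi * (expm(S * y) @ s)        # numerator, elementwise over i
    return w / w.sum()                # denominator is pi' exp(Sy) s

# Two independent exit routes with very different rates: a late absorption
# time y = 2 makes the slow state (rate 1) the far more likely start.
p = start_given_absorption_at(np.array([0.5, 0.5]),
                              np.array([[-1.0, 0.0], [0.0, -4.0]]), 2.0)
```

The reweighting by e_i^T exp{Sy} s is what lets the exact sampler avoid proposing starting states that are implausible given the observed absorption time.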
23 Tail Depth Performance Improvement
[Figure: log time (secs) against upper tail probability 10^-x, for methods ECS and MH.]
24 Overall Performance Improvement
This shows the new method keeping pace in nice problems and significantly outperforming otherwise. [Two test matrices T shown: one exhibiting none of problems i-iii, one exhibiting all of them.]
No problems i-iii:  MH t̄ = 1.6 µs, s_t = 104 µs;  ECS t̄ = 7.2 µs, s_t = 19 µs
All problems i-iii: MH t̄ = 0.2 hours, s_t = 9.4 hours;  ECS t̄ = 0.06 secs, s_t = 0.05 secs
2,300,000× faster on average in the hard problem.
25 Networks/Systems of Components
Quick overview now of current research focus: inference for networks/systems comprising nodes/components which may be of Phase-type.
[Sequence of diagrams: an example network built up node by node.]
36 Networks/Systems of Components (continued)
[Diagram: the network with unknown node lifetimes, observed to fail at T = t.]
Again, a reduced information setting: only the overall network failure time is observed, so-called 'masked system lifetime data'.
37 Direct Masked System Lifetime Inference
Even a very simple setting is quite hard to tackle directly. Take X_1 and X_2 in series, in parallel with X_3, with X_i iid Weibull(scale = α, shape = β), and observed system lifetimes t = {t_1, ..., t_n}:
F_T(t) = (1 - \bar F_{X_1}(t) \bar F_{X_2}(t)) F_{X_3}(t)
=> L(α, β; t) = ∏_{i=1}^{n} t_i^{-1} β (t_i/α)^β exp{-3(t_i/α)^β} [ 2 exp{(t_i/α)^β} + exp{2(t_i/α)^β} - 3 ]
so p(α, β | t) ∝ L(α, β; t) p(α, β) is awkward.
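Awkward analytically, but straightforward to evaluate numerically. A sketch of the log-likelihood for this three-component structure, T = max(min(X_1, X_2), X_3) (numpy assumed; the function name is mine):

```python
import numpy as np

def system_loglik(alpha, beta, t):
    """Log-likelihood of masked system lifetimes t for three iid
    Weibull(scale=alpha, shape=beta) components with X1, X2 in series,
    in parallel with X3, i.e. T = max(min(X1, X2), X3)."""
    t = np.asarray(t, dtype=float)
    u = (t / alpha) ** beta
    return np.sum(np.log(beta / t) + np.log(u) - 3.0 * u
                  + np.log(2.0 * np.exp(u) + np.exp(2.0 * u) - 3.0))
```

For a single observation this agrees with differentiating the system CDF F_T(t) = (1 - e^{-2u})(1 - e^{-u}) with respect to t, which is a handy cross-check.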
38 MCMC Solution (Independent Case)
The proposed solution is in the tradition of Tanner & Wong (1987), since inference is easy in the presence of (augmented) component lifetimes. Thus, for X_i iid F_X(·; ψ), sample from the natural completion of the posterior distribution, p(ψ, x_1, ..., x_n | t), by blocked Gibbs sampling using the conditional distributions:
p(x_1, ..., x_n | ψ, t)
p(ψ | x_1, ..., x_n, t)
where x_i = {x_{i1}, ..., x_{im}} are the m component failure times for the i-th of n systems (with x_{ij} = t_i for some j).
39 p(ψ | x_1, ..., x_n, t) is now simple: Bayesian inference for lifetime distributions given complete component data is well understood, and in the Phase-type case the algorithm above slots in here. The problem is shifted to sampling p(x_1, ..., x_n | ψ, t).
40 Sampling p(x_1, ..., x_n | ψ, t)
Propose using Samaniego's system signature, s_j = P(T = X_{j:m}), e.g.
p(x_{i1}, ..., x_{im} | ψ, t_i) = Σ_{j=1}^{m} p(x_{i1}, ..., x_{im} | ψ, X_{j:m} = t_i) P(T = X_{j:m} | ψ, t_i)
where
P(T = X_{j:m} | ψ, t_i) ∝ j s_j (m choose j) F_X(t_i)^{j-1} (1 - F_X(t_i))^{m-j}
41 Algorithm to sample p(x_1, ..., x_n | ψ, t)
For each system i = 1, ..., n:
1. Sample j ∈ {1, ..., m} from the discrete probability distribution defined by the conditioned system signature, P(T = X_{j:m} | ψ, t_i). This samples the order statistic indicating that the j-th component failure caused system failure.
2. Sample:
- j-1 values, x_{i1}, ..., x_{i(j-1)}, from F_{X | X < t_i}(·; ψ), the distribution of the component lifetime conditional on failure before t_i;
- m-j values, x_{i(j+1)}, ..., x_{im}, from F_{X | X > t_i}(·; ψ), the distribution of the component lifetime conditional on failure after t_i;
and set x_{ij} = t_i.
Each iteration provides x_i.
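For a component distribution with an invertible CDF, the two steps above can be sketched with inverse-CDF draws for the truncated samples. Everything here is an illustrative assumption (names, helper signature, example system), not the authors' implementation:

```python
import numpy as np
from math import comb

def augment_system(t_i, m, sig, F, F_inv, rng):
    """One draw of the m component lifetimes of a system that failed at t_i.

    sig is the system signature (s_1, ..., s_m); F and F_inv are the
    component CDF and quantile function."""
    Ft = F(t_i)
    # Step 1: conditioned signature, prop. to j s_j C(m,j) F^(j-1) (1-F)^(m-j)
    w = np.array([j * sig[j - 1] * comb(m, j) * Ft ** (j - 1) * (1 - Ft) ** (m - j)
                  for j in range(1, m + 1)])
    j = rng.choice(m, p=w / w.sum()) + 1
    # Step 2: j-1 lifetimes below t_i, m-j above, via inverse-CDF sampling
    x = np.empty(m)
    x[:j - 1] = F_inv(Ft * rng.random(j - 1))
    x[j:] = F_inv(Ft + (1.0 - Ft) * rng.random(m - j))
    x[j - 1] = t_i                                 # the j-th order statistic
    return x

# e.g. a parallel system of 3 Exponential(0.4) components has signature (0, 0, 1)
rng = np.random.default_rng(0)
F = lambda x: 1.0 - np.exp(-0.4 * x)
F_inv = lambda u: -np.log(1.0 - u) / 0.4
x = augment_system(2.0, 3, (0.0, 0.0, 1.0), F, F_inv, rng)
```

For the parallel system the last order statistic is always the system lifetime, so the draw must return two values below t_i and x_m = t_i exactly.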
42 All Systems of 4 Components, Exponential (λ = 0.4)
[Figure: posterior of the component failure rate, by network.]
43 Exchangeable Failure Rate Parameters
It is straightforward to allow the more general setting of exchangeable failure rate parameters between networks:
[Diagram: Ψ_i → X_{ij} → T_i, for components j = 1, ..., m and networks i = 1, ..., n.]
However, exchangeability within a network would break the signature-based sampling of node failure times, so this is probably as general as this particular approach can go.
44 All Systems of 4 Components, Ψ ~ Gamma(α = 9, β = 2)
[Figure: posterior of the population mean of the rate distribution, by network.]
45 All Systems of 4 Components, Ψ ~ Gamma(α = 9, β = 2)
[Figure: posterior of the population variance of the rate distribution, by network.]
46 Extend to Topological Inference
It is now possible to add topology detection to the inferential process.
[Diagram: a network of unknown topology, observed to fail at T = t.]
It is simple to sample from p(m | ψ, t), with m indexing the network topology, as an additional Gibbs update, moving between network topologies. With care, reversible jump between component numbers is also feasible.
47 True Topology with Signature No. 2 (40 observations)
[Figure: posterior probability of each network topology.]
48 References
Asmussen, S., Nerman, O. & Olsson, M. (1996), 'Fitting phase-type distributions via the EM algorithm', Scand. J. Statist. 23(4).
Bladt, M., Gonzalez, A. & Lauritzen, S. L. (2003), 'The estimation of phase-type related functionals using Markov chain Monte Carlo methods', Scand. Actuar. J. 2003(4).
Cano, J. & Rios, D. (2006), 'Reliability forecasting in complex hardware/software systems', Proceedings of the First International Conference on Availability, Reliability and Security (ARES 06).
Daneshkhah, A. & Bedford, T. (2008), 'Sensitivity analysis of a reliability system using Gaussian processes', in T. Bedford, ed., Advances in Mathematical Modeling for Reliability, IOS Press.
Tanner, M. A. & Wong, W. H. (1987), 'The calculation of posterior distributions by data augmentation', Journal of the American Statistical Association 82(398).