BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20250712T084849EDT-07850xjk3V@132.216.98.100
DTSTAMP:20250712T124849Z
DESCRIPTION:TITLE / TITRE\n\nFree lunches and subsampling Monte Carlo\n\nABSTRACT / RÉSUMÉ\n\nIt is well-known that the performance of MCMC algorithms degrades quite quickly when targeting computationally expensive posterior distributions\, including the posteriors for even simple models when the dataset is large. This has motivated the search for MCMC variants that scale well for large datasets. One simple approach\, taken by several research groups\, has been to look at only a subsample of the data at every step. This method is known to work quite well for optimization\, and variants of stochastic gradient descent are the workhorse of modern machine learning. In this talk\, we focus on a simple 'no-free-lunch' result which shows that no algorithm of this sort can provide substantial speedups for Bayesian computation. We briefly sketch the main steps in the proof\, illustrate how these generic results apply to realistic statistical problems and proposed algorithms\, and discuss some special examples that can avoid our generic results and provide a free (or at least cheap) lunch. We also mention recent work 'in both directions\,' extending our basic conclusion to some non-reversible chains and showing explicitly how it can be avoided for more complex posteriors. (Based on joint work with Patrick Conrad\, Andrew Davis\, James Johndrow\, Zonghao Li\, Youssef Marzouk\, Natesh Pillai\, Pengfei Wang and Azeem Zaman.)\n\nPLACE / LIEU\n\nHybrid - CRM\, Salle / Room 5340\, Pavillon André Aisenstadt
DTSTART:20241129T203000Z
DTEND:20241129T213000Z
LOCATION:Burnside Hall\, CA\, QC\, Montreal\, H3A 0B9\, 805 rue Sherbrooke Ouest
SUMMARY:Aaron Smith (University of Ottawa)
URL:/mathstat/channels/event/aaron-smith-university-ottawa-361371
END:VEVENT
END:VCALENDAR