The better way to do statistics

Published 2024-04-03
To try everything Brilliant has to offer—free—for a full 30 days, visit brilliant.org/VeryNormal. You’ll also get 20% off an annual premium subscription.

Non-clickbait title: A gentle, but progressively rough introduction to Bayesian statistics

Stay updated with the channel and some stuff I make!
👉 verynormal.substack.com/
👉 very-normal.sellfy.store/

This video was sponsored by Brilliant

All Comments (21)
  • @RomanNumural9
    Math finance PhD student here. Great video! Just so you know there's a book called "Deep Learning" by Ian Goodfellow et al. It covers Bayesian stats, including MCMCs and other things. It's a great resource and if you wanna know more about this stuff I found it a pretty reasonable read! :)
  • @kadaj131313
    Half my professors would fight you over this title, the other half would agree with you
  • @Tom_Het
    Wow, nobody has ever explained to me that Bayes' Theorem is derived from p(a|b)p(b) = p(b|a)p(a). That makes way more sense now.
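The identity mentioned in the comment above can be checked numerically. A minimal sketch (the joint distribution below is hypothetical, not from the video): both factorizations p(a|b)p(b) and p(b|a)p(a) recover the joint p(a,b), and dividing through by p(b) yields Bayes' theorem.

```python
# Check that p(a|b)p(b) = p(b|a)p(a) = p(a,b), and that dividing
# by p(b) gives Bayes' theorem. Hypothetical joint over binary A, B.
p_joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.40, (1, 1): 0.20,
}

# Marginals p(a) and p(b) by summing out the other variable
p_a = {a: sum(p for (ai, _), p in p_joint.items() if ai == a) for a in (0, 1)}
p_b = {b: sum(p for (_, bi), p in p_joint.items() if bi == b) for b in (0, 1)}

for (a, b), p_ab in p_joint.items():
    p_a_given_b = p_ab / p_b[b]   # p(a|b) = p(a,b) / p(b)
    p_b_given_a = p_ab / p_a[a]   # p(b|a) = p(a,b) / p(a)
    # Both factorizations recover the joint:
    assert abs(p_a_given_b * p_b[b] - p_ab) < 1e-12
    assert abs(p_b_given_a * p_a[a] - p_ab) < 1e-12
    # Bayes' theorem: p(a|b) = p(b|a) p(a) / p(b)
    assert abs(p_a_given_b - p_b_given_a * p_a[a] / p_b[b]) < 1e-12
```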
  • @qwerty11111122
As an introduction to Bayes' theorem, I think 3b1b really helped me form an intuition about these statistics with his "Bayes ratio": multiply your prior by a ratio formed from the likelihood and the margin to get the posterior, a new prior
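The "ratio" view of the update described in the comment above can be sketched with the classic disease-screening example (the numbers below are hypothetical): the posterior is the prior scaled by likelihood/evidence, and the posterior from one test becomes the prior for the next.

```python
# Posterior = prior * (likelihood / evidence), applied twice to show
# that yesterday's posterior is today's prior. Numbers are made up.
prior = 0.01    # P(disease)
sens = 0.95     # P(positive | disease): the likelihood of a positive test
fpr = 0.05      # P(positive | no disease)

def update(prior):
    # The "margin"/evidence: total probability of a positive test
    evidence = sens * prior + fpr * (1 - prior)
    # Scale the prior by the ratio of likelihood to evidence
    return prior * (sens / evidence)

post1 = update(prior)   # after one positive test
post2 = update(post1)   # the posterior becomes the new prior
print(post1, post2)
```

After one positive test the belief jumps from 1% to roughly 16%; feeding that posterior back in as the prior, a second positive pushes it much higher again.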
  • @ricafrod
    As a PhD student who has used frequentist statistics for as long as I remember, I’d only ever heard gossip and rumours about Bayesian statistics, but your video hooked me from start to finish on such a fascinating subject! Great video!!!
  • @avenger1825
    I always get excited when I see one of your uploads; I've been studying heavily about statistics coming from a pure mathematics background, and your videos are always very helpful to build the conceptual foundations that textbooks often obscure in favor of specialized, theoretical language. This has already cleared up several things I didn't quite understand about Bayesian statistics, so thank you (for this and your other videos)! :^)
  • @hyunsunggo855
The cool thing about variational inference is that it converts the problem of computing the intractable integral into a more manageable optimization problem: optimizing a quantity called the variational free energy with respect to the variational parameters. This not only makes the problem often easier (through a more flexible variational graphical model) and more tractable (than e.g. MCMC), but also lets you borrow insights from mathematical optimization theory to solve the particular formulation of the problem. By the way, this connection to optimization is why it's called "variational" inference in the first place: it's directly connected to the calculus of variations! VI also has amazing applications in deep learning, namely variational autoencoders (VAEs), where it's applied to the latent space so that the induced probability distribution explaining the data can become much, much more complex than the classical examples you've shown in this video. For example, diffusion models, which can create those amazing images, can indeed be seen as an instance of a VAE! Thank you for this great video! I learned a lot! :)
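The "integral becomes optimization" idea in the comment above can be sketched on a toy conjugate model (my own hypothetical example, not from the video): data from N(theta, 1) with a N(0, 1) prior, whose exact posterior is Gaussian, so a Gaussian variational family can recover it. We maximize a Monte Carlo estimate of the ELBO using the reparameterization trick with fixed base samples, so the objective is deterministic and an off-the-shelf optimizer applies.

```python
# Minimal variational inference sketch: fit q(theta) = N(mu, sigma^2)
# to the posterior of a Gaussian-mean model by maximizing the ELBO.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)   # observed data, x_i ~ N(theta, 1)
eps = rng.normal(size=1000)         # fixed base samples: theta = mu + sigma*eps

def neg_elbo(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    theta = mu + sigma * eps        # reparameterized draws from q
    # log p(x | theta) and log p(theta), each up to additive constants
    log_lik = -0.5 * ((x[None, :] - theta[:, None]) ** 2).sum(axis=1)
    log_prior = -0.5 * theta ** 2
    entropy = log_sigma             # entropy of q, up to a constant
    return -(np.mean(log_lik + log_prior) + entropy)

res = minimize(neg_elbo, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sd_hat = res.x[0], np.exp(res.x[1])

# Exact posterior for comparison: N(sum(x)/(n+1), 1/(n+1))
n = len(x)
post_mean, post_sd = x.sum() / (n + 1), (n + 1) ** -0.5
```

Because the true posterior lies inside the variational family here, the optimized (mu_hat, sd_hat) land very close to the analytic posterior mean and standard deviation; in realistic models the family is too small and VI returns the closest approximation instead.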
  • I'm an astrophysicist, and in our field Bayesian statistics is the way. Great and all, except everyone seemingly expected me to know what an MCMC analysis was (it wasn't mentioned anywhere in the refresher lectures at the start of my PhD) despite never having heard of it before I started. This video was a massive help.
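For anyone in the same position as the commenter above, the core of an MCMC analysis fits in a few lines. A hypothetical sketch using random-walk Metropolis (one simple MCMC algorithm) on a coin-flip model with a uniform Beta(1,1) prior, where the exact posterior Beta(1+heads, 1+tails) is available for comparison:

```python
# Random-walk Metropolis: propose a nearby theta, accept with
# probability min(1, posterior ratio), otherwise stay put.
import math
import random

random.seed(1)
n, heads = 100, 62            # observed coin flips

def log_post(theta):
    # Unnormalized log posterior: Bernoulli likelihood * uniform prior
    if not 0 < theta < 1:
        return -math.inf      # zero density outside the support
    return heads * math.log(theta) + (n - heads) * math.log(1 - theta)

samples, theta = [], 0.5
for _ in range(20000):
    prop = theta + random.gauss(0, 0.05)   # symmetric proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                        # accept the move
    samples.append(theta)

burned = samples[5000:]                     # discard burn-in
mcmc_mean = sum(burned) / len(burned)
exact_mean = (1 + heads) / (2 + n)          # Beta posterior mean
```

The chain's sample mean converges to the exact posterior mean; the normalizing constant never has to be computed, which is exactly why MCMC is so useful when the evidence integral is intractable.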
  • I used Bayes Theorem for a simple learning model for establishing categories for various phrases that were similar but not exactly the same. Going through thousands of records manually was possible, but using this allowed me to do it in a day with the help of excel and python.
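The kind of phrase-categorization workflow the comment above describes is commonly done with a naive Bayes classifier. A hypothetical sketch (the categories and training phrases below are made up, not the commenter's data): score each category by log p(category) plus the sum of log p(word|category), with add-one smoothing for unseen words.

```python
# Tiny naive Bayes text classifier over a handful of labeled phrases.
import math
from collections import Counter, defaultdict

labeled = [
    ("invoice payment overdue", "billing"),
    ("payment received thanks", "billing"),
    ("password reset request", "support"),
    ("cannot reset my password", "support"),
]

word_counts = defaultdict(Counter)   # per-category word frequencies
cat_counts = Counter()               # category frequencies
vocab = set()
for phrase, cat in labeled:
    words = phrase.split()
    word_counts[cat].update(words)
    cat_counts[cat] += 1
    vocab.update(words)

def classify(phrase):
    scores = {}
    for cat in cat_counts:
        total = sum(word_counts[cat].values())
        # log prior p(cat), then add log p(word|cat) with add-one smoothing
        score = math.log(cat_counts[cat] / len(labeled))
        for w in phrase.split():
            score += math.log((word_counts[cat][w] + 1) / (total + len(vocab)))
        scores[cat] = score
    return max(scores, key=scores.get)

print(classify("overdue invoice"))   # prints "billing"
```

Scaled up to thousands of records, the same scoring loop is what turns a manual labeling job into a batch one, which matches the speedup the commenter describes.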
  • For some reason I always love when someone says “hi mom” in a video. It’s just wholesome and nice to know they are getting their mom’s support.
  • @nzt29
Best video I've seen on this so far. I like the comparison between the two methods and the fact that you map the data and parameter variables back to the typical A and B seen in the Bayes' thm definition. edit: I should have phrased this instead as how you connected Bayes' thm to distributions.
  • @tomalapapa100
I've studied math as a degree and specialized in statistics and finance. Had the same experience with numerous frequentist classes but few Bayesian ones. I've studied on my own and with a couple of classes that were available to me in grad school. Struggled a lot to get the gist of Bayesian statistics. This video is perfect for people with knowledge of the frequentist view who wish to then learn Bayesian
  • @justdave9195
    Could you please make a video on Generalized Linear Models too? These explanations are soooo helpful.
  • @elinope4745
YouTube put this video in my recommended feed. Only about one in twenty videos is any good, so the odds were low that someone would make one worthwhile. Subbed, liked.
  • @thegimel
Great video, like the rest of your content. You have a pleasantly simple, intuitive, and concise way of presenting :D I would very much like you to dive deeper into the likelihood in particular, and why it isn't a real PDF even though it can look like one. Cheers!
Loved this! Definitely a subscriber now 🎉 I got confronted with Bayes in a seminar where we used various machine learning and deep learning models, with the expectation that we already knew all this before starting. It led to me having no confidence in the model results, even though they outperformed some other approaches.
  • @jrlearnstomath
    Looking forward to more on variational inference, it's really doing my head in
  • Its so cool to see MCMCs get some love. The only use I’ve ever seen of it in my field (Psychology) is in Item Response Theory. Awesome video!
  • @entivreality
    Really great explanation! Love the progression from elementary to more advanced topics. A video on empirical Bayes methods could also be cool :)