statisticsmatt
  • 974
  • 1 247 330
Parameter Estimation with Backfitting (part 1/2): R illustration with two predictors
The videos on this UA-cam Channel are not affiliated with The University of Missouri or my role as a professor at the University.
Here's a link for PDFs of certain videos: statisticsmatt.gumroad.com. If the PDF for a video you want isn't uploaded yet, please leave a comment asking for it and I'll upload it.
Help this channel remain great! Donating via Patreon or PayPal can do this!
www.patreon.com/statisticsmatt
paypal.me/statisticsmatt
Views: 126

Videos

ims54 - Using MGFs to derive the distribution of the sample variance.
Views 240 · 3 months ago
ims53 - Using MGFs to show that the sample mean and variance are independent
Views 447 · 4 months ago
ims52 - Using the Normal Distribution to Derive Distributions
Views 510 · 8 months ago
Deriving the t Distribution
Views 1.1K · 8 months ago
cv12 - Amplitude of the Sum of the Sine and Cosine Functions
Views 174 · 10 months ago
cv11 - Sine and Cosine of the Inverse Tangent
Views 81 · 10 months ago
cv10 - Periodic Trig Functions: Period, Phase Shift, Vertical Shift, Amplitude, and Frequency
Views 66 · 11 months ago
cv9 - Dot Product and Cross Product of Complex Numbers
Views 102 · 11 months ago
cv8 - Complex Polynomial of Degree n
Views 53 · 11 months ago
cv7 - Quadratic Equation with Complex Coefficients
Views 92 · 11 months ago
cv6 - The Complex Exponential Function
Views 157 · 11 months ago
cv5 - Roots of a Complex Number
Views 112 · 11 months ago
cv4 - Polar Form of a Complex Number
Views 126 · 11 months ago
cv3 - Modulus of a Complex Number
Views 141 · 11 months ago
cv2 - General Equation of a Circle in the Complex Plane
Views 96 · 11 months ago
cv1 - Introduction to Complex Numbers
Views 164 · 11 months ago
ims51 - Limiting Distributions (7/7): Slutsky's Theorem & Delta Method
Views 2.6K · a year ago
ims50 - Limiting Distributions (6/7): Convergence in Probability
Views 350 · a year ago
ims49 - Limiting Distributions (5/7): Asymptotic Normal Order Statistic
Views 813 · a year ago
ims48 - Limiting Distributions (4/7): Normal Approximation to a Binomial
Views 308 · a year ago
ims47 - Limiting Distributions (3/7): Central Limit Theorem
Views 458 · a year ago
ims46 - Limiting Distributions (2/7): Stochastic Convergence
Views 636 · a year ago
ims45 - Limiting Distributions (1/7): Sequence of Random Variables
Views 1.5K · a year ago
ims44 - Order Statistics (2 of 2)
Views 191 · a year ago
ims43 - Order Statistics (1 of 2)
Views 347 · a year ago
Inverse Binomial Sampling
Views 240 · a year ago
ims42 - Transformation of Bivariate Random Variables
Views 207 · a year ago
amv60 - Test Comparing k Population Covariance Matrices
Views 157 · a year ago
amv59 - One Sample Tests for the Sphericity of a Covariance Matrix
Views 241 · a year ago

COMMENTS

  • @gchesterton
    @gchesterton 7 hours ago

    Hi Matt, Suppose I have a sample of data that I believe are sampled from a Cauchy distribution. Let's suppose it's symmetrical about 0, so it's Cauchy (0, gamma). However, let's suppose my observations of the underlying Cauchy (0, gamma) are truncated to the relatively limited range (0,2) -- that is, positive values from 0 to 2. All other potential values are not captured in my case. Do you have a way of estimating gamma from this set of limited observations?
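The question above (estimating the Cauchy scale gamma from observations truncated to the window (0, 2)) can be attacked by maximum likelihood with a truncation-corrected density. The sketch below is entirely my own illustration and is not from any of the videos; the sample size, seed, and grid search are arbitrary choices.

```python
import numpy as np

# Hypothetical sketch: MLE for the scale gamma of a Cauchy(0, gamma)
# when only values falling in the window (0, 2) are ever observed.
rng = np.random.default_rng(2)
gamma_true = 1.0
u = rng.uniform(size=50_000)
raw = gamma_true * np.tan(np.pi * (u - 0.5))  # inverse-CDF draws from Cauchy(0, gamma)
x = raw[(raw > 0) & (raw < 2)]                # truncation: only (0, 2) is captured

def nll(g):
    # Truncated density: f(x|g) = [g / (pi (g^2 + x^2))] / [arctan(2/g) / pi],
    # so the log-likelihood carries a -n*log(arctan(2/g)) correction term.
    return -(np.sum(np.log(g) - np.log(g**2 + x**2))
             - x.size * np.log(np.arctan(2.0 / g)))

grid = np.linspace(0.2, 3.0, 600)             # crude grid search over gamma
gamma_hat = grid[np.argmin([nll(g) for g in grid])]
print(gamma_hat)  # should land near gamma_true for this sample size
```

The shape of the Cauchy density inside (0, 2) still depends on gamma, so gamma is identifiable from the truncated sample alone.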

  • @prabirkumardas6927
    @prabirkumardas6927 2 days ago

    Hi, can you please share the lecture notes here, in a link? It would be really helpful.

    • @statisticsmatt
      @statisticsmatt 2 days ago

      Here's a link for pdf's of certain videos. statisticsmatt.gumroad.com Help this channel to remain great! Donating to Patreon or Paypal can do this! www.patreon.com/statisticsmatt paypal.me/statisticsmatt

  • @Comrade-wv1lu
    @Comrade-wv1lu 7 days ago

    Disgusting

    • @statisticsmatt
      @statisticsmatt 6 days ago

      I'm not sure what your comment means. I'm going to assume the "disgusting" is so "bad" that it means "good." Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @mdazim08
    @mdazim08 9 days ago

    Hi. A question. In the Graybill (1976) book, chapter 6 is called General Linear Model while chapter 10 is called Multiple Regression. Could you please clarify the difference between the general linear model and multiple regression?

    • @statisticsmatt
      @statisticsmatt 9 days ago

      The general linear model is a comprehensive framework that includes multiple regression as one of its special cases. While multiple regression deals specifically with the linear relationship between a dependent variable and multiple continuous predictors, the general linear model can encompass a wider range of models and predictor types. Many thanks for watching!

  • @cizbargahjr.8401
    @cizbargahjr.8401 11 days ago

    Damn man. You're my hero. :')

    • @statisticsmatt
      @statisticsmatt 11 days ago

      I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @wolpumba4099
    @wolpumba4099 12 days ago

    *Summary: Parameter Estimation with Backfitting (Part 1/2)*
    * *Goal:* Estimate parameters in multiple linear regression using only simple linear regression.
    * *Method:* Backfitting, an iterative process of estimating parameters one at a time while holding the others fixed.
    * *Steps:*
      1. *Data generation (0:00):* Create 100 data points with two predictors (X1, X2) and one response variable (Y).
      2. *Initialization (3:00):* Make an initial guess for one parameter (e.g., beta 1).
      3. *Iteration (3:00):*
         * Use the fixed value of beta 1 to estimate beta 2 via simple linear regression.
         * Fix beta 2 at its new estimate and re-estimate beta 1.
         * Use both beta 1 and beta 2 to estimate the intercept (beta 0).
         * Store these estimates and repeat the process for a set number of iterations (e.g., 50).
    * *Convergence (6:00):* The estimates for beta 0, beta 1, and beta 2 converge to the least squares estimates from multiple linear regression after a few iterations.
    * *Visualization (8:42):* The convergence of the parameter estimates across iterations can be visualized with a plot.
    * *Comparison (9:30):* The backfitting estimates are shown to be identical to those obtained by directly fitting a multiple linear regression model.
    *Key takeaway:* Backfitting provides a way to estimate parameters in situations where only simple linear regression tools are available.
    (I used Gemini 1.5 Pro to summarize the transcript.)

    • @statisticsmatt
      @statisticsmatt 12 days ago

      That's amazing. Many thanks, and many thanks for watching. Don't forget to subscribe and let others know about this channel.
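The backfitting loop summarized in the comment above can be sketched in a few lines. This is my own NumPy illustration with made-up data (the video itself uses an R script), exploiting the fact that the slope of a simple regression of a partial residual on one predictor updates that predictor's coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n) + 0.5 * x1               # correlated predictors
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(scale=0.5, size=n)

def simple_slope(x, r):
    # slope of a simple linear regression of r on x (the intercept is handled separately)
    return np.cov(x, r, bias=True)[0, 1] / np.var(x)

b0, b1, b2 = 0.0, 0.0, 0.0                       # initial guesses
for _ in range(50):                              # backfitting sweeps
    b1 = simple_slope(x1, y - b2 * x2)           # regress partial residual on x1
    b2 = simple_slope(x2, y - b1 * x1)           # regress partial residual on x2
    b0 = np.mean(y - b1 * x1 - b2 * x2)          # intercept from the residual mean

X = np.column_stack([np.ones(n), x1, x2])
ols = np.linalg.lstsq(X, y, rcond=None)[0]       # direct multiple regression fit
print(np.allclose([b0, b1, b2], ols, atol=1e-6)) # the two solutions agree
```

Each sweep is one Gauss–Seidel pass over the normal equations, which is why the iterates converge to the least squares solution whenever the predictors are not perfectly collinear.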

  • @whatever--
    @whatever-- 13 days ago

    really interesting, thanks for the video

    • @statisticsmatt
      @statisticsmatt 13 days ago

      You're welcome. Many thanks for watching! Don't forget to subscribe and let others know about this channel.

  • @syz911
    @syz911 15 days ago

    Your videos are excellent and helped me a lot in understanding mathematical statistics. How can I make a one-time donation to your channel?

    • @statisticsmatt
      @statisticsmatt 14 days ago

      I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel. Here's a link for pdf's of certain videos. statisticsmatt.gumroad.com Also note that if a pdf of the video you are wanting is not uploaded yet, please reply in a comment that you'd like me to upload and I'll do it. Help this channel to remain great! Donating to Patreon or Paypal can do this! www.patreon.com/statisticsmatt paypal.me/statisticsmatt Many thanks in advance for your kind donation.

  • @TomasPaixao-sx8gw
    @TomasPaixao-sx8gw 16 days ago

    Your classes were really helpful, Matt! Thanks!

    • @statisticsmatt
      @statisticsmatt 15 days ago

      I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @anangelsdiaries
    @anangelsdiaries 21 days ago

    Since the slope could be negative, why do we do a right-side test on the t-statistic? Unless specified shouldn't the two-sided test be the default?

    • @statisticsmatt
      @statisticsmatt 21 days ago

      Good question. If we think that B1 is negative, we would conduct a left-tailed test. The default in most statistical summaries is a two-tailed test. Many thanks for watching.

  • @anangelsdiaries
    @anangelsdiaries 21 days ago

    Great video! My only question is: what do the new derivations achieve, precisely? While they make sense, I don't see how they, on their own, lead to the conclusion that b0 and b1 are normally distributed (that is, in a way we couldn't have inferred from their being linear operators). Was it just a fun exercise, or am I missing something?

    • @statisticsmatt
      @statisticsmatt 21 days ago

      I assumed a fact in the video that I didn't cover, which is that a linear combination of independent normally distributed random variables is itself a normally distributed random variable. Since we showed that b0 and b1 are linear combinations of independent normal random variables, they are themselves normal random variables.
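The fact in the reply above (a linear combination of independent normals is normal, hence so are b0 and b1) can be checked numerically. The design points, parameter values, and sample sizes below are my own made-up choices, not the video's:

```python
import numpy as np

# Across repeated samples, the least squares slope b1 is a fixed linear
# combination of the independent normal errors, so it should be normal with
# mean B1 and standard deviation sigma / sqrt(Sxx).
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)
sigma, B0, B1 = 0.5, 1.0, 2.0
Sxx = np.sum((x - x.mean()) ** 2)

reps = 20_000
b1s = np.empty(reps)
for i in range(reps):
    y = B0 + B1 * x + rng.normal(scale=sigma, size=x.size)
    b1s[i] = np.sum((x - x.mean()) * (y - y.mean())) / Sxx

print(b1s.mean(), b1s.std())  # near B1 = 2.0 and sigma/sqrt(Sxx) ≈ 0.368
```

Matching the first two moments does not by itself prove normality, of course; the point of the reply is that the linear-combination property supplies the distributional shape.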

  • @anangelsdiaries
    @anangelsdiaries 21 days ago

    Math is not a spectator sport, but your videos are about the closest you can get to one. I keep a sheet of paper beside me for when a step doesn't quite make sense, or when I want to make sure I understood the rationale, but your explanations make so much sense that I could probably get away with just watching. I still try to derive the important results on my own after watching because it's always good to get that practice in. Thank you a billion.

    • @statisticsmatt
      @statisticsmatt 21 days ago

      Many thanks for your kind comment, much appreciated. Many thanks for watching!

  • @anangelsdiaries
    @anangelsdiaries 21 days ago

    Great video, limpid explanations!

    • @statisticsmatt
      @statisticsmatt 21 days ago

      Many thanks for your kind comment, much appreciated. Many thanks for watching!

  • @user-hq3vz7pq6c
    @user-hq3vz7pq6c 23 days ago

    • @statisticsmatt
      @statisticsmatt 23 days ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel.

  • @nkosisampson5555
    @nkosisampson5555 25 days ago

    Didn't you only prove in "Derivatives of a Normal Density: Useful Identities" that x^3 f(x) goes to zero as x goes to ±inf for the standard normal distribution? If so, it wouldn't necessarily apply to non-standard normal distributions.

    • @statisticsmatt
      @statisticsmatt 24 days ago

      You're correct. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @ZhuodiaoKuang-ht7zg
    @ZhuodiaoKuang-ht7zg 27 days ago

    You are my god, literally.

    • @statisticsmatt
      @statisticsmatt 27 days ago

      You're so kind. Many thanks for watching! Don't forget to subscribe and let others know about this channel.

  • @nannanwang9094
    @nannanwang9094 a month ago

    This is wonderful~ Thank you for explaining EM in terms of sufficient statistics!

    • @statisticsmatt
      @statisticsmatt a month ago

      You're welcome. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @leeris19
    @leeris19 a month ago

    I really love this series. I told myself to stop getting sidetracked, and I only came here to study the proof for the parameters of the multivariate Gaussian. But oh boy, I never regretted watching these videos.

    • @statisticsmatt
      @statisticsmatt a month ago

      Your comment made my day. I love hearing that the videos are helpful. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @nannanwang9094
    @nannanwang9094 a month ago

    Please keep making videos on advanced stats! I've learned so much from you. Sometimes when I can't understand a concept in my class, I can always find one of your videos helpful. Thank you for making these videos and making them available to us.

    • @statisticsmatt
      @statisticsmatt a month ago

      Will do. I plan to make hundreds more videos. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @aztecterp
    @aztecterp a month ago

    I've been reading proofs about this chi-square test, and not one of them is written well. Your video is the easiest explanation I've seen so far. It seems to me that the biggest stumbling block is how to deal with the lack of independence in the multinomial distribution. I suppose that's why you created that diagonal matrix.

    • @statisticsmatt
      @statisticsmatt a month ago

      Definitely the trick is to find the transformation that creates independent observations. Many thanks for watching.

  • @Random-sm5gi
    @Random-sm5gi a month ago

    Thanks boss you are the best!

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @Songvbm
    @Songvbm a month ago

    Hi Matt. A long time subscriber here. Hope you are fine. I have a question → Can we use Helmert transformation to derive chi-square pdf ?

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching and subscribing! The quick answer is yes. Here's a link to math stack exchange that should be helpful. math.stackexchange.com/questions/47009/proof-of-fracn-1s2-sigma2-sim-chi2-n-1
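To make the Helmert idea in the reply above concrete, here is a small numerical sketch of my own (assumed n = 5, not from the video): the n-1 Helmert contrasts are orthonormal and orthogonal to the constant vector, so the sum of their squares reproduces (n-1)s², which is the key step in deriving the chi-square distribution of (n-1)s²/σ².

```python
import numpy as np

n = 5
# Helmert sub-matrix: rows k = 1..n-1, each orthonormal and orthogonal to the 1-vector
H = np.zeros((n - 1, n))
for k in range(1, n):
    H[k - 1, :k] = 1.0 / np.sqrt(k * (k + 1))
    H[k - 1, k] = -k / np.sqrt(k * (k + 1))

print(np.allclose(H @ H.T, np.eye(n - 1)))       # rows are orthonormal
print(np.allclose(H @ np.ones(n), 0.0))          # rows are contrasts

rng = np.random.default_rng(3)
x = rng.normal(size=n)
z = H @ x
# the n-1 squared Helmert coordinates sum to (n-1) * sample variance
print(np.isclose(np.sum(z**2), (n - 1) * np.var(x, ddof=1)))
```

When x is an i.i.d. N(mu, sigma²) sample, the z coordinates are i.i.d. N(0, sigma²), so Σz²/σ² = (n-1)s²/σ² is a sum of n-1 independent squared standard normals, i.e. chi-square with n-1 degrees of freedom.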

  • @anangelsdiaries
    @anangelsdiaries a month ago

    Hi Matt, thank you very much for your videos! Sorry if that's a dumb question, but how is saying that the error terms are i.i.d according to N(0, sigma^2) different from saying E(e_i) = 0 and Var(e_i) = sigma^2?

    • @statisticsmatt
      @statisticsmatt a month ago

      That's a good question. Let X be a random variable with mean zero, E(X)=0, and variance sigma^2, V(X)=sigma^2. What do we know about the distribution of X? Nothing. However, if we say X is a normal random variable with E(X)=0 and V(X)=sigma^2 we are providing more information about the random variable X. Many thanks for watching.

    • @anangelsdiaries
      @anangelsdiaries 21 days ago

      @@statisticsmatt Oh I see, thank you very much. This makes sense!
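The point in the exchange above (mean and variance alone do not pin down a distribution) is easy to illustrate numerically. This is my own example, not from the video: a N(0, 1) and a Uniform(-√3, √3) share the same mean and variance but are clearly different distributions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
norm = rng.normal(0.0, 1.0, n)
unif = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), n)  # also mean 0, variance 1

print(norm.var(), unif.var())     # both near 1
print(np.mean(np.abs(norm) > 2))  # positive tail mass beyond 2
print(np.mean(np.abs(unif) > 2))  # exactly 0: the support ends at sqrt(3) < 2
```

So stating E(e_i) = 0 and Var(e_i) = sigma² constrains only two moments, while the normality assumption fixes the entire distribution, including its tails.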

  • @riskamulyani4801
    @riskamulyani4801 a month ago

    Hi, in the last minute, around 30:04, you cancelled both g'(ui); however, the denominator has g'(ui)^2 while the numerator has only g'(ui)...

    • @statisticsmatt
      @statisticsmatt a month ago

      Good question. Here is the basic equation you are asking about: 0 = (1/g'(ui))*(...). Multiply both sides by g'(ui); since 0*g'(ui) = 0, one factor of g'(ui) cancels. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @bhawikajain4022
    @bhawikajain4022 a month ago

    Hi Matt, can you explain why you have written (n*lambda) to the power {sum xi} in ex1, around 1:31? Thank you for such helpful videos!!

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching! You have found an error in the video, which I highlighted in the description for the video.

  • @Daniel-ve8oi
    @Daniel-ve8oi a month ago

    Another thing I just realized: the F statistics should (according to you, cf. around 9:35) be identical for an RCB with fixed and random treatment effects. However, in this video you wrote F = (SS_trt/(a-1)) / (SS_E/((a-1)(b-1))), while in video #46 in the playlist it is F = ((SS_trt/sigma²)/(a-1)) / ((SS_E/sigma²)/((a-1)(b-1))). PS: I can also stop writing these things along my way through your playlists.

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching! I don't mind the questions and comments at all. I'm hoping you can see that the F statistics you point out are really the same quantity.

    • @statisticsmatt
      @statisticsmatt a month ago

      @Daniel-ve8oi, at 11:30 in the video, I discuss why I divided by sigma^2, which makes SStrt/sigma^2 distributed as a chi-square random variable. Then, dividing by the degrees of freedom, we start to create an F random variable.

    • @Daniel-ve8oi
      @Daniel-ve8oi a month ago

      @@statisticsmatt Ok, I see. Thx!
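The equality discussed in this thread can be written out in one line: the factor 1/σ² appears in both the numerator and the denominator of the second form and cancels, leaving the first form.

```latex
F \;=\; \frac{\left(SS_{\mathrm{trt}}/\sigma^2\right)/(a-1)}
             {\left(SS_E/\sigma^2\right)/\big((a-1)(b-1)\big)}
  \;=\; \frac{SS_{\mathrm{trt}}/(a-1)}
             {SS_E/\big((a-1)(b-1)\big)}
```

Dividing by σ² is what makes each sum of squares a chi-square random variable, but the ratio itself is free of σ², which is why both videos' statistics are the same quantity.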

  • @Daniel-ve8oi
    @Daniel-ve8oi a month ago

    At 17:30 you said "it becomes a central F distribution," but you've written a chi-square.

    • @statisticsmatt
      @statisticsmatt a month ago

      That's an amazing catch! Thanks! It's a central F distribution. Not sure why I wrote chi-square. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @raltonkistnasamy6599
    @raltonkistnasamy6599 a month ago

    Thanks a lot, man.

    • @statisticsmatt
      @statisticsmatt a month ago

      You're welcome. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @AlexeyMatushevsky
    @AlexeyMatushevsky a month ago

    Here is the missing piece I would like to share with those who are watching. At 6:44: why is $\mathbb{E}[\ell'^2]=\text{Var}(\ell')$? We should look at the definition of the variance: $\text{Var}(\ell')=\mathbb{E}[\ell'^2]-(\mathbb{E}[\ell'])^2$. Since $(\mathbb{E}[\ell'])^2=(0)^2=0$, we get $\text{Var}(\ell')=\mathbb{E}[\ell'^2]$. I always forget about that property of the variance!

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching and sharing your thoughts! Much appreciated. Don't forget to subscribe and let others know about this channel.

  • @AlexeyMatushevsky
    @AlexeyMatushevsky a month ago

    May I ask: at 11:24, in that piece, haven't we lost the square/second power?

    • @statisticsmatt
      @statisticsmatt a month ago

      Again, many thanks for watching. If you work it out by hand, you'll find that the first power was factored out front.

  • @AlexeyMatushevsky
    @AlexeyMatushevsky a month ago

    @statisticsmatt Sorry for posting so many questions under your video. So, when you say that the MLE theta satisfies "first derivative of the log-likelihood = 0," do you mean that the equation equals 0 once we take the first derivative of the log-likelihood and plug in hat-theta as the argument?

  • @AlexeyMatushevsky
    @AlexeyMatushevsky a month ago

    I have a question about 8:40, where you talk about the new formula. In the formula we now have $\hat\theta$, and $\theta^*$ is between $\hat\theta$ and $\theta_0$. My question is: how did we transition to $\hat\theta$? Is that connection through Note (2), where $\hat\theta_{MLE}$ satisfies $\ell'(\hat\theta)=0$?

  • @bhawikajain4022
    @bhawikajain4022 a month ago

    Love your videos so much!! Perfect revision before exams <3

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching! I love hearing that the videos are helpful!

  • @AlexeyMatushevsky
    @AlexeyMatushevsky a month ago

    I just love it! Many thanks for sharing this. May I ask you about the last black square on the page - what is that? I notice them in other stats books. Is it like "The End"? Thanks

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel. In regard to the black square, it indicates the end of a proof. Often you might see "QED" at the end of a proof too, which is short for the Latin phrase quod erat demonstrandum.

    • @AlexeyMatushevsky
      @AlexeyMatushevsky a month ago

      @statisticsmatt Many thanks for your answer.

  • @fletton_man
    @fletton_man a month ago

    Great video, thanks. May I ask: when calculating the externally studentized residual for the ith data point, is the leverage hii calculated from a model with all data points (not with the ith data point removed)?

    • @statisticsmatt
      @statisticsmatt a month ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel. You're correct, hii is calculated from the full model fit. Unfortunately, some of the videos in the playlist require watching a previous video in the playlist.

  • @jamalnuman
    @jamalnuman 2 months ago

    This is the best lecture I've ever seen that explains how the factors are mathematically calculated, but I couldn't figure out how the lambdas are calculated. It's not clear.

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching and for your nice comment. The next two videos in the playlist "Applied Multivariate Analyses Using R Software" provide the methods for estimating the lambda parameters. Don't forget to subscribe and let others know about this channel.

  • @user-zd7id9rx3f
    @user-zd7id9rx3f 2 months ago

    Is there a textbook you can recommend that shows these proofs?

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel. There are several online sites that have formulas with some derivations. Here's a nice pdf www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf In my videos I try to provide details as the way I think about things.

  • @Daniel-ve8oi
    @Daniel-ve8oi 2 months ago

    A general question regarding alpha error inflation (maybe you talk about this later on, but I've just made it up to here right now): It is usually said that once you perform multiple tests on the same data, you should adjust either the p-values or the alpha level. What I don't get is this: whenever 1 test is performed, you have a chance of drawing the correct conclusion of (1 - alpha)^(k = 1). So why should one only correct when using the same data? Wouldn't it be more plausible to correct whenever a test is performed? This would of course mean that k has to increase with every test a person performs, and thus a scientist would hardly find any significant results towards the end of her career. Is there a (mathematical) reason why you only correct when you perform tests on the same data?

    • @statisticsmatt
      @statisticsmatt a month ago

      This is a topic that would require more typing than I want to do in a comment. Therefore I'm going to point you to a book chapter and a website to research this further. 1) Website link: en.wikipedia.org/wiki/Multiple_comparisons_problem 2) Book link (see chapter 13): hastie.su.domains/ISLR2/ISLRv2_corrected_June_2023.pdf.download.html www.statlearning.com/resources-second-edition Many thanks for watching.

    • @Daniel-ve8oi
      @Daniel-ve8oi a month ago

      @@statisticsmatt Ok, thanks a lot. I've always had this in mind since I first learnt about the multiple comparison problem - but I never saw anyone talk about the problem of alpha error inflation when testing different data sets ...

  • @ujjawalmanocha1768
    @ujjawalmanocha1768 2 months ago

    I had an assignment to submit explaining the minimax test and comparing it with the MP and UMP tests. After spending weeks searching for a book/PDF, a friend found your channel and suggested it to me. I am glad to tell you I watched not only the minimax lectures but even beyond my presentation topic. Your delivery was great and I got clarity on earlier topics as well. Thank you.

    • @statisticsmatt
      @statisticsmatt 2 months ago

      I love hearing that the videos are helpful!! Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @Daniel-ve8oi
    @Daniel-ve8oi 2 months ago

    Around 5:20 - 5:30, I think it should be 1/sigma² ~ MVN(1/sigma² hat beta, I). Multiplying by 1/sigma should give sigma I for the variance, or 1/sigma² as coefficients for hat Y and the MVN average, shouldn't it? Nevertheless, great video as always!

    • @statisticsmatt
      @statisticsmatt 2 months ago

      First, many thanks for watching! After re-watching the video, it seems to be correct. Note the following property: (1) let Y ~ N(Mu, Sigma^2); (2) let c be a constant; then (3) c*Y ~ N(c*Mu, c^2*Sigma^2). Property (3) is the part that I think you are confused about (a guess). Be sure to subscribe and let others know about this channel.

  • @markneumann381
    @markneumann381 2 months ago

    Really great job. Appreciate you. Your presentation. Clarity. Thank you.

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Thanks for your kind words. Much appreciated. Many thanks for watching. Don't forget to subscribe and let others know about this channel.

  • @eigensmith8316
    @eigensmith8316 2 months ago

    Hey Matt, I really love your videos; they help me a lot. I am currently watching your videos and following Casella and Berger, but I don't know exactly which playlists from your channel to follow. Could you please list the playlists that would align with Casella and Berger (Statistical Inference)?

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! This is a great suggestion! After the semester, I'll see what I can make for this. Don't forget to subscribe and let others know about this channel.

  • @DionisioAlejandro-vq5xl
    @DionisioAlejandro-vq5xl 2 months ago

    Hi, it was so useful! Thank you for the detailed explanation. Do you have videos related to Quantiles Tests?

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel. If I have videos on quantile test, it would be in the playlist Nonparametrics.

  • @dragoneer8756
    @dragoneer8756 2 months ago

    I think there's a problem with the FTP site? It gave an error, then I went online and saw they had changed the URL, but that gave an error too. Your videos are very clear to follow along with even without the data. I'm just asking in case I'm doing something wrong.

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! I'm not sure what is happening with the site. This link works for me. statisticsmatt.gumroad.com/

  • @jeffk1722
    @jeffk1722 2 months ago

    The website didn't show probabilities (or it was hidden) on the multiplier. I was thinking, "none" is only 1 of 5 options?? Of course I want it multiplied! But yeah, if 2x is more probable than 1x, I guess that's still worth it right?

    • @statisticsmatt
      @statisticsmatt 2 months ago

      First, are you looking at the Missouri lottery website? Or another state lottery?

  • @ccuuttww
    @ccuuttww 2 months ago

    If someone doesn't understand the expectation (alternative formula), it is actually very easy. At 4:22:
    First summation term: P + P(1-P)^1 + P(1-P)^2 + P(1-P)^3 + ...
    Second summation term: P(1-P) + P(1-P)^2 + P(1-P)^3 + ...
    Third summation term: P(1-P)^2 + P(1-P)^3 + ...
    This is just x*P*(1-P)^(x-1) with x = 1 (the rightmost first column), x = 2 (the rightmost second column), and so on. If you continue and sum it all up, it is the same as the expectation formula with x > 0.

    • @statisticsmatt
      @statisticsmatt 2 months ago

      You got it! Many thanks for watching. Don't forget to subscribe and let others know about this channel.
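The rearrangement in the comment above is the tail-sum identity E[X] = Σ_{k≥1} P(X ≥ k) for the geometric distribution. A quick numerical check (my own sketch, with an arbitrary p):

```python
import numpy as np

p = 0.3
k = np.arange(1, 2000)
pmf = p * (1 - p) ** (k - 1)   # geometric pmf on x = 1, 2, 3, ...
tail = (1 - p) ** (k - 1)      # P(X >= k): the k-th column sum in the comment

print(np.sum(k * pmf), np.sum(tail))  # both equal E[X] = 1/p ≈ 3.333
```

Summing the columns instead of the rows is exactly the swap of summation order the comment walks through; both sides converge to 1/p.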

  • @Daniel-ve8oi
    @Daniel-ve8oi 2 months ago

    Really like it and could only repeat everything the others already commented ... Please continue!

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel.

  • @markmoon1237
    @markmoon1237 2 months ago

    nice work!

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Many thanks for watching! Don't forget to subscribe and let others know about this channel.

  • @vobogodlovecaleb4559
    @vobogodlovecaleb4559 2 months ago

    This is a great video, Sir, but I hardly understand this mapping when looking for the limits of integration. How are the lines drawn and how is the shading done?

    • @statisticsmatt
      @statisticsmatt 2 months ago

      Are you referencing the plot at 3:40 in the video? If yes, we are plotting two order statistics. For example, and to make the notation easier, let's assume we are plotting y1 and y2. Since these are order statistics, y1 < y2, and the plot has to illustrate this relationship. The same argument applies to M and S: we must have M < S, which is the plot on the right.