statisticsmatt
United States
Joined 7 May 2009
Briefly, this channel will be used to disseminate statistical and mathematical results.
I like to break statistics into 3 broad classes of "low", "medium", and "high" with regard to mathematical rigor. There are some amazing channels in the range of "low" to "medium" level statistics. These channels include some incredible graphical illustrations and low-tech videos with incredibly intuitive explanations of statistical concepts. Adding another channel in this arena would add little for the average viewer to an already statistically rich playing field.
There are fewer channels dedicated to "medium" level statistics and only a small handful of sites dedicated to "high" level statistics. So this is where I decided to focus: "medium" to "high" level statistical videos.
Hopefully my passion for statistics comes through in my videos. Enjoy!
Help this channel to remain great! Donating to Patreon can do this! www.patreon.com/statisticsmatt
Parameter Estimation with Backfitting (part 1/2): R illustration with two predictors
The videos on this YouTube channel are not affiliated with The University of Missouri or my role as a professor at the University.
Here's a link for PDFs of certain videos: statisticsmatt.gumroad.com. If the PDF of the video you want is not uploaded yet, please leave a comment and I'll upload it.
Help this channel to remain great! Donating to Patreon or Paypal can do this!
www.patreon.com/statisticsmatt
paypal.me/statisticsmatt
Views: 126
Videos
ims54 - Using MGFs to derive the distribution of the sample variance.
240 views · 3 months ago
ims53 - Using MGFs to show that the sample mean and variance are independent
447 views · 4 months ago
ims52 - Using the Normal Distribution to Derive Distributions
510 views · 8 months ago
Deriving the t Distribution
1.1K views · 8 months ago
cv12 - Amplitude of the Sum of the Sine and Cosine Functions
174 views · 10 months ago
cv11 - Sine and Cosine of the Inverse Tangent
81 views · 10 months ago
cv10 - Periodic Trig Functions: Period, Phase Shift, Vertical Shift, Amplitude, and Frequency
66 views · 11 months ago
cv9 - Dot Product and Cross Product of Complex Numbers
102 views · 11 months ago
cv8 - Complex Polynomial of Degree n
53 views · 11 months ago
cv7 - Quadratic Equation with Complex Coefficients
92 views · 11 months ago
cv6 - The Complex Exponential Function
157 views · 11 months ago
cv5 - Roots of a Complex Number
112 views · 11 months ago
cv4 - Polar Form of a Complex Number
126 views · 11 months ago
cv3 - Modulus of a Complex Number
141 views · 11 months ago
cv2 - General Equation of a Circle in the Complex Plane
96 views · 11 months ago
cv1 - Introduction to Complex Numbers
164 views · 11 months ago
ims51 - Limiting Distributions(7/7): Slutsky's Theorem & Delta Method
2.6K views · 1 year ago
ims50 - Limiting Distributions(6/7): Convergence in probability
350 views · 1 year ago
ims49 - Limiting Distributions(5/7): Asymptotic Normal Order Statistic
813 views · 1 year ago
ims48 - Limiting Distributions(4/7): Normal Approximation to a Binomial
308 views · 1 year ago
ims47 - Limiting Distributions(3/7): Central Limit Theorem
458 views · 1 year ago
ims46 - Limiting Distributions(2/7): Stochastic Convergence
636 views · 1 year ago
ims45 - Limiting Distributions(1/7): Sequence of Random Variables.
1.5K views · 1 year ago
ims42 - Transformation of Bivariate Random Variables
207 views · 1 year ago
amv60 - Test Comparing k Population Covariance Matrices
157 views · 1 year ago
amv59 - One Sample Tests for the Sphericity of a Covariance Matrix
241 views · 1 year ago
Hi Matt, Suppose I have a sample of data that I believe are sampled from a Cauchy distribution. Let's suppose it's symmetrical about 0, so it's Cauchy (0, gamma). However, let's suppose my observations of the underlying Cauchy (0, gamma) are truncated to the relatively limited range (0,2) -- that is, positive values from 0 to 2. All other potential values are not captured in my case. Do you have a way of estimating gamma from this set of limited observations?
Hi, can you please share the lecture notes here in a link? It would be really helpful.
Here's a link for PDFs of certain videos: statisticsmatt.gumroad.com. Help this channel to remain great! Donating to Patreon or Paypal can do this! www.patreon.com/statisticsmatt paypal.me/statisticsmatt
Disgusting
I'm not sure what your comment means? I'm going to assume the "disgusting" is so "bad" that it means "good". Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Hi, a question: in the Graybill (1976) book, chapter 6 is called "General Linear Model" while chapter 10 is called "Multiple Regression". Could you please clarify the difference between the general linear model and multiple regression?
The general linear model is a comprehensive framework that includes multiple regression as one of its special cases. While multiple regression deals specifically with the linear relationship between a dependent variable and multiple continuous predictors, the general linear model can encompass a wider range of models and predictor types. Many thanks for watching!
Damn man. You're my hero. :')
I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel.
*Summary: Parameter Estimation with Backfitting (Part 1/2)*
* *Goal:* Estimate parameters in multiple linear regression using only simple linear regression.
* *Method:* Backfitting, an iterative process of estimating parameters one at a time while holding the others fixed.
* *Steps:*
  1. *Data generation (0:00):* Create 100 data points with two predictors (X1, X2) and one response variable (Y).
  2. *Initialization (3:00):* Make an initial guess for one parameter (e.g., beta 1).
  3. *Iteration (3:00):*
     * Use the fixed value of beta 1 to estimate beta 2 via simple linear regression.
     * Fix beta 2 at its new estimate and re-estimate beta 1.
     * Use both beta 1 and beta 2 to estimate the intercept (beta 0).
     * Store these estimates and repeat the process for a set number of iterations (e.g., 50).
* *Convergence (6:00):* The estimates for beta 0, beta 1, and beta 2 converge to the least squares estimates from multiple linear regression after a few iterations.
* *Visualization (8:42):* The convergence of the parameter estimates across iterations can be visualized with a plot.
* *Comparison (9:30):* The backfitting estimates are shown to be identical to those obtained from directly fitting a multiple linear regression model.
* *Key takeaway:* Backfitting provides a way to estimate parameters in situations where only simple linear regression tools are available.
I used Gemini 1.5 Pro to summarize the transcript.
That's amazing. Many thanks, and many thanks for watching. Don't forget to subscribe and let others know about this channel.
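The backfitting loop in the summary above is easy to sketch in code. Here is a minimal illustration (in Python rather than the video's R, with invented coefficients and simulated predictors, so the numbers are not the video's): each pass re-estimates one slope from a simple regression of the partial residual on one predictor, and the estimates converge to the multiple-regression least squares fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 100 observations with two predictors (the coefficients are invented).
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

def slope(x, r):
    """Simple-linear-regression slope of r on x (centering handles the intercept)."""
    xc = x - x.mean()
    return xc @ (r - r.mean()) / (xc @ xc)

# Backfitting: estimate one coefficient at a time, holding the other fixed.
b1 = b2 = 0.0                          # initial guesses
for _ in range(50):
    b2 = slope(x2, y - b1 * x1)        # regress the partial residual on x2
    b1 = slope(x1, y - b2 * x2)        # regress the partial residual on x1
b0 = (y - b1 * x1 - b2 * x2).mean()    # intercept from the remaining mean

# Compare with the direct multiple-regression least squares fit.
X = np.column_stack([np.ones(n), x1, x2])
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose([b0, b1, b2], beta_ls))  # → True
```

With only two (nearly uncorrelated) predictors, the loop converges to machine precision well before 50 iterations.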
really interesting, thanks for the video
You're welcome. Many thanks for watching! Don't forget to subscribe and let others know about this channel.
Your videos are excellent and helped me a lot understanding mathematical statistics. How can I make a one-time donation to your channel?
I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel. Here's a link for PDFs of certain videos: statisticsmatt.gumroad.com. If the PDF of the video you want is not uploaded yet, please leave a comment and I'll upload it. Help this channel to remain great! Donating to Patreon or Paypal can do this! www.patreon.com/statisticsmatt paypal.me/statisticsmatt Many thanks in advance for your kind donation.
Your classes were really helpful, Matt! Thanks!
I love hearing that the videos are helpful! Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Since the slope could be negative, why do we do a right-side test on the t-statistic? Unless specified shouldn't the two-sided test be the default?
Good question. If we think that B1 is negative, we would conduct a left tailed test. The default in most statistical summaries is a two tailed test. Many thanks for watching.
Great video! My only question is what do the new derivations achieve precisely? While they make sense, I don't see how them on their own lead to the conclusion that b0, b1 is normally distributed (that is in a way we couldn't have inferred from them being linear operators?) Was it just a fun exercise, or am I missing something?
I assumed a fact in the video that I didn't cover, which is that a linear combination of independent normally distributed random variables is itself a normally distributed random variable. Since we showed that b0 and b1 are linear combinations of independent normal random variables, they are themselves normal random variables.
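The fact assumed in the reply can also be checked by simulation: with a fixed design, b1 = Σ kᵢyᵢ is a weighted sum of independent normals, so its sampling distribution should be N(beta1, sigma²/Sxx). A quick Monte Carlo sketch (the design and parameter values here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed design and (invented) true parameters.
x = np.linspace(0, 10, 25)
beta0, beta1, sigma = 2.0, 0.5, 1.0
Sxx = ((x - x.mean()) ** 2).sum()

# b1 = sum_i k_i * y_i with weights k_i = (x_i - xbar) / Sxx, i.e. a linear
# combination of independent normals, hence itself normal with mean beta1
# and variance sigma^2 / Sxx.
k = (x - x.mean()) / Sxx
b1_draws = np.array([
    k @ (beta0 + beta1 * x + rng.normal(scale=sigma, size=x.size))
    for _ in range(20_000)
])

print(abs(b1_draws.mean() - beta1) < 0.01)                 # → True
print(abs(b1_draws.std() - sigma / np.sqrt(Sxx)) < 0.01)   # → True
```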
Math is not a spectator sport, but your videos are about the closest you can get to that. I keep a sheet of paper beside me for when a step doesn't quite make sense, or when I want to make sure I understood the rationale, but your explanations make so much sense that I could probably get away with just watching. I still try to derive the important results on my own after watching, because it's always good to get that practice in. Thank you a billion.
Many thanks for your kind comment, much appreciated. Many thanks for watching!
Great video, limpid explanations!
Many thanks for your kind comment, much appreciated. Many thanks for watching!
❤
Many thanks for watching! Don't forget to subscribe and let others know about this channel.
Didn't you only prove in "Derivatives of a Normal Density: Useful Identities" that x^3 f(x) goes to zero as x goes to ±inf for the standard normal distribution? If so, it wouldn't necessarily apply to non-standard normal distributions.
You're correct. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
You are my god, literally.
You're so kind. Many thanks for watching! Don't forget to subscribe and let others know about this channel.
This is wonderful~ Thank you for explaining EM in terms of sufficient statistics!
You're welcome. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
I really love this series. I told myself to stop getting side tracked and I only came here to study the proof for the parameters of the multi variate gaussian. But oh boy, I never regretted watching these videos.
Your comment made my day. I love hearing that the videos are helpful. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Please keep making videos of advanced stats! I learned so much from you. Sometime when I can't understand some concepts in my class I can always find some of your videos helpful. Thank you for making these videos and making them available to us.
Will do. I plan to make hundreds more videos. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
I've been reading proofs about this chi-square test, and not one of them is written well. Your video is the easiest explanation I've seen so far. It seems to me that the biggest stumbling block is how to deal with the lack of independence in the multinomial distribution. I suppose that's why you created that diagonal matrix.
Definitely the trick is to find the transformation that creates independent observations. Many thanks for watching.
Thanks boss you are the best!
Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Hi Matt, a long-time subscriber here. Hope you are fine. I have a question: can we use the Helmert transformation to derive the chi-square pdf?
Many thanks for watching and subscribing! The quick answer is yes. Here's a link to math stack exchange that should be helpful. math.stackexchange.com/questions/47009/proof-of-fracn-1s2-sigma2-sim-chi2-n-1
Hi Matt, thank you very much for your videos! Sorry if this is a dumb question, but how is saying that the error terms are i.i.d. N(0, sigma^2) different from saying E(e_i) = 0 and Var(e_i) = sigma^2?
That's a good question. Let X be a random variable with mean zero, E(X)=0, and variance sigma^2, V(X)=sigma^2. What do we know about the distribution of X? Nothing. However, if we say X is a normal random variable with E(X)=0 and V(X)=sigma^2 we are providing more information about the random variable X. Many thanks for watching.
@@statisticsmatt Oh I see, thank you very much. This makes sense!
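The distinction in this exchange can be made concrete: a normal and a rescaled Laplace error distribution share E = 0 and Var = 1, yet they are different distributions, as their tail behavior shows. A small sketch (the Laplace comparison is my choice of example, not from the video):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Two error distributions with the same first two moments: E = 0, Var = 1.
normal_errs = rng.normal(size=N)
laplace_errs = rng.laplace(scale=1 / np.sqrt(2), size=N)  # Var = 2 * scale^2 = 1

def excess_kurtosis(z):
    """Sample excess kurtosis; about 0 for a normal, about 3 for a Laplace."""
    z = z - z.mean()
    return (z ** 4).mean() / (z ** 2).mean() ** 2 - 3

# Matching mean and variance does not pin down the distribution:
print(abs(laplace_errs.var() - normal_errs.var()) < 0.05)                # → True
print(excess_kurtosis(laplace_errs) - excess_kurtosis(normal_errs) > 2)  # → True
```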
Hi, in the last minute, around 30:04, you canceled both g'(ui); however, the denominator has g'(ui)^2 while the numerator has only g'(ui)...
Good question. Here is the basic equation you are asking about: 0 = (1/g'(ui))*(...). Multiply both sides by g'(ui): 0*g'(ui) = 0, so the equation still holds. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Hi Matt, can you explain why you have written (n*lambda) to the power {sum xi} in ex1, around 1:31? Thank you for such helpful videos!!
Many thanks for watching! You have found an error in the video, which I highlighted in the description for the video.
Another thing I just realized: the F statistics should (according to you, cf. around 9:35) be identical for an RCB with fixed and random treatment effects. However, in this video you wrote F = (SS_trt/(a-1)) / (SS_E/((a-1)(b-1))), while in video #46 in the playlist it is F = ((SS_trt/sigma²)/(a-1)) / ((SS_E/sigma²)/((a-1)(b-1))). PS: I can also stop writing these things along my way through your playlists.
Many thanks for watching! I don't mind the questions and comments at all. I'm hoping that you can see that the two F statistics you point out are really the same quantity: the sigma² factors in the numerator and denominator cancel.
@Daniel-ve8oi, in 11:30 in the video, I discuss why I divided by sigma^2, which makes SStrt/sigma^2 distributed as a chi-square random variable. Then dividing by degrees of freedom we start to create an F random variable.
@@statisticsmatt Ok, I see. Thx!
At 17:30 you said "it becomes a central F distribution", but you've written a chi-square.
That's an amazing catch! Thanks! It's a central F distribution. Not sure why I wrote chi-square. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Thanks a lot, man.
You're welcome. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Here is my missing piece, which I would like to share with those who are watching (6:44): $-\mathbb{E}[\ell'^2]=-\text{Var}(\ell')$. Why? Take a look at the definition of variance: $\text{Var}(\ell')=\mathbb{E}[\ell'^2]-(\mathbb{E}[\ell'])^2$. Since $(\mathbb{E}[\ell'])^2=(0)^2=0$, we get $\text{Var}(\ell')=\mathbb{E}[\ell'^2]$. I always forget about that property of the variance!
Many thanks for watching and sharing your thoughts! Much appreciated. Don't forget to subscribe and let others know about this channel.
May I ask: at 11:24, in the third piece, haven't we lost the square/second power?
Again, many thanks for watching. If you work it out by hand, you'll find that the first power was factored out front.
@statisticsmatt Sorry for the stream of questions under your video. So when you say that the MLE theta-hat satisfies "first derivative of the log likelihood = 0", you mean that the equation equals 0 once we take the first derivative of the log likelihood and use theta-hat as the argument?
Correct.
I have a question: at 8:40, when you talk about the new formula, the formula now has $\hat\theta$, and $\theta^*$ is now between $\hat\theta$ and $\theta_0$. My question is: how did we transition to $\hat\theta$? Is that connection through Note (2), where $\hat\theta_{MLE}$ satisfies $\ell'(\hat\theta)=0$?
Oh, you actually say theta at 8:15.
Many thanks for watching!
Love your videos so much!! Perfect revision before exams <3
Many thanks for watching! I love hearing that the videos are helpful!
I just love it! Many thanks for sharing this. May I ask about the last black square on the page: what is that? I've noticed them in other stats books. Is it like "The End"? Thanks
Many thanks for watching! Don't forget to subscribe and let others know about this channel. In regard to the black square, it indicates the end of a proof. Often you might see "QED" at the end of a proof too, which abbreviates the Latin phrase "quod erat demonstrandum" ("which was to be demonstrated").
@statisticsmatt Many thanks for your answer.
Great video, thanks. May I ask: when calculating the externally studentized residual for the ith data point, is the leverage hii calculated from a model with all data points (not with the ith data point removed)?
Many thanks for watching! Don't forget to subscribe and let others know about this channel. You're correct, hii is calculated from a full model fit. Unfortunately, some of the videos in the play list require watching a previous video in the play list.
This is the best lecture I've ever seen that explains how the factors are mathematically calculated, but I couldn't figure out how the lambdas are calculated. Not clear.
Many thanks for watching and your nice comment. The next two videos in the play list "Applied Multivariate Analyses Using R Software" provide the methods of how to estimate the lambda parameters. Don't forget to subscribe and let others know about this channel.
Is there a textbook you can recommend that shows these proofs?
Many thanks for watching! Don't forget to subscribe and let others know about this channel. There are several online sites that have formulas with some derivations. Here's a nice pdf www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf In my videos I try to provide details as the way I think about things.
A general question in regard to alpha error inflation (maybe you talk about this later on, but I've just made it up to here right now): it is usually said that once you perform multiple tests on the same data, you should adjust the p-values or reduce the alpha level. What I don't get is this: whenever 1 test is performed, you have a chance of (1 - alpha)^(k = 1) of drawing the correct conclusion. Thus, why should one only correct when using the same data? Wouldn't it be more plausible to correct whenever a test is performed? This would of course mean that k has to increase with every test performed by a person, and thus a scientist would hardly find any significant results toward the end of her career. Is there a (mathematical) reason why you only correct if you perform tests on the same data?
This is a topic that would require me typing more than I want to type in a comment. Therefore I'm going to point you to a book chapter and a website to research this further. 1) Website: en.wikipedia.org/wiki/Multiple_comparisons_problem 2) Book (see chapter 13): hastie.su.domains/ISLR2/ISLRv2_corrected_June_2023.pdf.download.html www.statlearning.com/resources-second-edition Many thanks for watching.
@@statisticsmatt Ok, thanks a lot. I've had this in my mind since I first learned about the multiple comparison problem, but I never saw anyone talk about the problem of alpha error inflation when testing different data sets...
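The inflation raised in the question above can be quantified: if k independent tests are each run at level alpha and every null is true, the probability of at least one false rejection is 1 - (1 - alpha)^k, which is why corrections shrink the per-test level as k grows. A tiny illustration (the values of k are arbitrary):

```python
# Family-wise error rate for k independent tests, each at level alpha,
# when every null hypothesis is true.
alpha = 0.05
for k in (1, 5, 20):
    fwer = 1 - (1 - alpha) ** k
    print(k, round(fwer, 3))  # → 1 0.05, then 5 0.226, then 20 0.642
```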
I had an assignment to submit explaining the minimax test and comparing it with the MP and UMP tests. After spending weeks searching for a book/pdf, a friend found your channel and suggested it to me. I am glad to tell you I watched not only the minimax lectures but went even beyond my presentation topic. Your delivery was great and I got clarity on earlier topics as well. Thank you.
I love hearing that the videos are helpful!! Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Around 5:20 - 5:30 I think it should be 1/sigma² ~ MVN(1/sigma² hat beta, I). Multiplying by 1/sigma should give sigma I for the variance, or 1/sigma² as coefficients for hat Y and the MVN mean, shouldn't it? Nevertheless, great video as always!
First, many thanks for watching! After re-watching the video, it seems to be correct. Note the following property: (1) let Y ~ N(Mu, Sigma^2); (2) let c be a constant; then (3) c*Y ~ N(c*Mu, c^2*Sigma^2). Property (3) is the part that I think you are confused about (a guess). Be sure to subscribe and let others know about this channel.
Really great job. Appreciate you. Your presentation. Clarity. Thank you.
Thanks for your kind words. Much appreciated. Many thanks for watching. Don't forget to subscribe and let others know about this channel.
Hey Matt, I really love your videos; they help me a lot. I am currently watching your videos and following Casella and Berger, but I don't know exactly which playlists to follow from your channel. Could you please list the playlists that would align with Casella and Berger (the Statistical Inference book)?
Many thanks for watching! This is a great suggestion! After the semester, I'll see what I can make for this. Don't forget to subscribe and let others know about this channel.
Hi, it was so useful! Thank you for the detailed explanation. Do you have videos related to quantile tests?
Many thanks for watching! Don't forget to subscribe and let others know about this channel. If I have videos on quantile tests, they would be in the Nonparametrics playlist.
I think there's a problem with the FTP site? It gave an error; then I went online and saw they had changed the URL, but that gave an error too. Your videos are very clear to follow along with even without the data. I'm just asking in case I'm doing something wrong.
Many thanks for watching! I'm not sure what is happening with the site. This link works for me. statisticsmatt.gumroad.com/
The website didn't show probabilities (or it was hidden) on the multiplier. I was thinking, "none" is only 1 of 5 options?? Of course I want it multiplied! But yeah, if 2x is more probable than 1x, I guess that's still worth it right?
First, are you looking at the Missouri lottery website? Or another state lottery?
If someone doesn't understand the expectation (alternative tail-sum formula), it is actually very easy (4:22):
P + P(1-P) + P(1-P)^2 + P(1-P)^3 + ... (the first summation term)
P(1-P) + P(1-P)^2 + P(1-P)^3 + ... (the second summation term)
P(1-P)^2 + P(1-P)^3 + ... (the third summation term)
Summing down the columns instead, the xth column contains x copies of P(1-P)^(x-1), which is x*P*(1-P)^(x-1): the rightmost first column gives it for x = 1, the next for x = 2, and so on. If you continue and sum it all up, it is exactly the usual expectation formula summed over x > 0.
You got it! Many thanks for watching. Don't forget to subscribe and let others know about this channel.
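The column-by-column argument above, i.e. the tail-sum formula E[X] = Σ_{x≥1} P(X ≥ x) for a geometric variable on x = 1, 2, ..., can be verified numerically; with success probability p both forms should approach 1/p. A quick check (p = 1/4 is an arbitrary choice):

```python
from fractions import Fraction

p = Fraction(1, 4)
terms = 200  # truncate the infinite sums; the remainder is negligible here

# Tail-sum form: sum of P(X >= x) = (1-p)^(x-1) over x >= 1.
tail_sum = sum((1 - p) ** (x - 1) for x in range(1, terms + 1))
# Direct form: sum of x * p * (1-p)^(x-1) over x >= 1.
direct = sum(x * p * (1 - p) ** (x - 1) for x in range(1, terms + 1))

print(float(tail_sum))  # → 4.0, i.e. 1/p
print(float(direct))    # → 4.0
```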
Really like it and could only repeat everything the others already commented ... Please continue!
Many thanks for watching! Don't forget to subscribe and let others know about this channel.
nice work!
Many thanks for watching! Don't forget to subscribe and let others know about this video.
This is a great video, sir, but I have a hard time understanding this mapping when looking for the limits of integration. How are the lines drawn and how is the shading done?
Are you referencing the plot at 3:40 in the video? If yes, we are plotting two order statistics. For example, and to make the notation easier, let's assume we are plotting y1 and y2. Since these are order statistics, y1 < y2, and the plot has to illustrate this relationship. The same argument applies to M and S: we must have M < S, which is the plot on the right.