00:01
Okay, great.
00:02
Let's continue by introducing the concept of
an estimator of a population
parameter. It is a rule, depending
solely on sample information, that
approximates the parameter. A specific value
it produces is called an estimate.
00:17
There are two types of estimates.
00:19
Point estimates and confidence interval
estimates.
00:23
A point estimate is a single number, while a
confidence interval
naturally is an interval.
00:31
The two are closely related.
00:33
In fact, the point estimate is located
exactly in the middle of the confidence
interval. However, confidence intervals
provide much more information
and are preferred when making inferences.
00:45
Don't worry, we will have separate lessons
dedicated to confidence intervals.
00:49
All right. Have we seen estimates so far?
Sure we have.
00:55
The sample mean x bar is a point estimate of
the population mean mu.
01:01
Moreover, the sample variance s squared was
an estimate of the population variance
sigma squared.
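To make these two point estimates concrete, here is a minimal sketch in Python. The sample data are hypothetical, invented for this illustration; `statistics.variance` uses the n minus one divisor, matching the sample variance s squared from the lecture.

```python
import statistics

# Hypothetical sample (illustrative data, not from the lecture)
sample = [4.8, 5.1, 5.5, 4.9, 5.2, 5.0, 5.3]

x_bar = statistics.mean(sample)          # point estimate of the population mean mu
s_squared = statistics.variance(sample)  # point estimate of the population variance
                                         # sigma squared (n - 1 divisor)
print(x_bar, s_squared)
```

Each run of this formula on a new sample would produce a different estimate; the formula itself is the estimator.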
01:08
There may be many estimators for the same
parameter.
01:11
However, they all have two properties, bias
and efficiency.
01:16
We will not prove these properties, as the
associated mathematics is beyond the scope of
this course. However, you should have an idea
about the concepts.
01:25
Estimators are like judges.
01:27
We are always looking for the most
efficient, unbiased estimators.
01:32
An unbiased estimator has an expected value
equal to the population parameter.
01:38
Let's think of a biased estimator to
explain that point.
01:42
What if somebody told you that you will find
the average height of Americans by taking a
sample, finding its mean, and then adding
one foot to that result?
So the formula is x bar plus one foot.
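The height example can be checked with a quick simulation. This is only a sketch with an assumed true mean (the figure 5.75 feet is invented for illustration): draw many samples, apply the estimator x bar plus one foot to each, and average the results to approximate its expected value.

```python
import random

random.seed(0)

TRUE_MEAN = 5.75  # assumed true average height in feet (hypothetical value)

# Apply the biased estimator x_bar + 1 to many simulated samples
estimates = []
for _ in range(10_000):
    sample = [random.gauss(TRUE_MEAN, 0.3) for _ in range(30)]
    x_bar = sum(sample) / len(sample)
    estimates.append(x_bar + 1)  # sample mean plus one foot

expected_value = sum(estimates) / len(estimates)
print(expected_value)  # lands near TRUE_MEAN + 1, not TRUE_MEAN
```

The average of the estimates settles about one foot above the true mean, which is exactly the bias the lecture describes.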
01:57
Well, I hope you won't trust them.
02:00
They gave you an estimator, but a biased
one.
02:04
It makes much more sense that the average
height of Americans is approximated by just
the sample mean, right?
02:11
We say that the bias of this estimator is one
foot.
02:16
Clear. Great.
02:19
Let's move on to efficiency.
02:22
The most efficient estimators are the ones
with the least variability of outcomes.
02:28
Among the estimators we have seen so far,
none has had a problematic variance, so it is
hard to give an example.
02:36
It is enough to know that the most efficient
estimators are the unbiased estimators with
the smallest variance.
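Although the lecture's own examples don't exhibit this, efficiency can still be illustrated with a classic comparison not mentioned in the lecture: for normally distributed data, both the sample mean and the sample median are unbiased estimators of the population mean, but the mean varies less across samples. A minimal simulation sketch:

```python
import random
import statistics

random.seed(1)

# Hypothetical normal population with mean 10 and standard deviation 2
means, medians = [], []
for _ in range(5_000):
    sample = [random.gauss(10, 2) for _ in range(25)]
    means.append(statistics.mean(sample))      # estimator 1: sample mean
    medians.append(statistics.median(sample))  # estimator 2: sample median

# Both estimators center on 10, but their outcomes spread differently
print(statistics.variance(means), statistics.variance(medians))
```

The sample mean shows the smaller variance of outcomes, so for normal data it is the more efficient of the two unbiased estimators.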
02:43
A final note worth making is about the
difference between estimators and statistics.
02:49
The word statistic is the broader term.
02:53
A point estimate is a statistic.
02:57
All right. This is how we can describe
estimators and point estimates.
03:02
In the next lecture, we will explore
confidence intervals.
03:06
So stick around.