2. POINT ESTIMATION
Point estimate: an estimate of a population parameter given by a single number is called a point estimate.
Point estimator: a point estimator is a statistic used for estimating a population parameter $\theta$, and will be denoted by $\theta^*$.
3. Example: point estimation of the population mean $\mu$
The statistic chosen will be called a point estimator for $\mu$. A logical estimator for $\mu$ is the sample mean; hence
$\mu^* = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$
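As a minimal Python sketch of the idea above (the population parameters `mu_true` and `sigma_true` are assumed for illustration, not taken from the slides):

```python
import random

random.seed(1)

# Hypothetical population: normal with assumed mean 50 and sd 10.
mu_true, sigma_true, n = 50.0, 10.0, 200
sample = [random.gauss(mu_true, sigma_true) for _ in range(n)]

# Point estimate of mu: the sample mean x_bar = (1/n) * sum(x_i).
mu_hat = sum(sample) / n
print(round(mu_hat, 2))
```

The single number `mu_hat` is the point estimate; the rule "take the sample mean" is the point estimator.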
4. UNBIASED ESTIMATOR
Unbiased estimator: if the mean of the sampling distribution of a statistic equals the corresponding population parameter, the statistic is called an unbiased estimator of that parameter, i.e. $E(\theta^*) = \theta$.
Biased estimator: if $E(\theta^*) \neq \theta$, i.e. the estimator is not unbiased.
Bias of estimator: $\text{Bias} = E(\theta^*) - \theta$
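A small simulation, assuming a standard normal population, can make the definition concrete: averaging the statistic over many samples approximates $E(\theta^*)$. The sample mean comes out unbiased for $\mu$, while the variance statistic that divides by $n$ (rather than $n-1$) shows the bias $E(\theta^*) - \theta = -\sigma^2/n$:

```python
import random

random.seed(42)

mu, sigma, n, trials = 0.0, 1.0, 5, 20000

mean_of_xbar = 0.0    # running average of the sample-mean statistic
mean_of_var_n = 0.0   # running average of the divide-by-n variance statistic
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    var_n = sum((x - xbar) ** 2 for x in xs) / n   # divides by n, not n - 1
    mean_of_xbar += xbar / trials
    mean_of_var_n += var_n / trials

# E(xbar) = mu, so xbar is unbiased for mu.
# E(var_n) = (n-1)/n * sigma^2 = 0.8 here, so its bias is -sigma^2/n = -0.2.
print(round(mean_of_xbar, 2), round(mean_of_var_n, 2))
```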
5. STANDARD ERROR OF THE MEAN
Let $\bar{x}$ denote the sample mean based on a sample of size n drawn from a distribution with standard deviation $\sigma$. The standard deviation of $\bar{x}$ is given by $\sigma/\sqrt{n}$ and is called the standard error of the mean.
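The formula $\sigma/\sqrt{n}$ can be checked empirically: the standard deviation of many simulated sample means should come close to it. A sketch (the population below is an assumed normal with $\sigma = 4$ and $n = 64$, so the theoretical standard error is $0.5$):

```python
import math
import random

random.seed(0)

sigma, n = 4.0, 64
se_theory = sigma / math.sqrt(n)   # standard error of the mean = sigma / sqrt(n)

# Simulate many sample means and measure their spread.
trials = 5000
means = []
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    means.append(sum(xs) / n)
grand = sum(means) / trials
se_empirical = math.sqrt(sum((m - grand) ** 2 for m in means) / (trials - 1))
print(se_theory, round(se_empirical, 2))
```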
6. METHODS FOR FINDING ESTIMATORS
- Method of maximum likelihood estimation
- Method of moments
7. METHOD OF MAXIMUM LIKELIHOOD ESTIMATION
Likelihood function: let $x_1, x_2, \dots, x_n$ be a random sample of size n from a population with density function $f(x)$ and parameter $\theta$. Then the likelihood function of the sample values $x_1, x_2, \dots, x_n$, denoted by L, is their joint density function:
$L(\theta) = f(x_1)\, f(x_2) \cdots f(x_n)$
8. METHOD OF MAXIMUM LIKELIHOOD ESTIMATION
The principle of maximum likelihood consists in finding an estimator (of the parameter) which maximizes L. Thus, if there exists a function $\theta^* = \theta^*(x_1, x_2, \dots, x_n)$ of the sample values which maximizes L, then $\theta^*$ is taken as an estimator of $\theta$.
9. METHOD OF MAXIMUM LIKELIHOOD ESTIMATION
Thus $\theta^*$ is the solution, if any, of
$\dfrac{\partial L}{\partial \theta} = 0 \quad (1)$ and $\dfrac{\partial^2 L}{\partial \theta^2} < 0 \quad (2)$
Equation (1) can be rewritten as
$\dfrac{1}{L}\dfrac{\partial L}{\partial \theta} = 0$, i.e. $\dfrac{\partial \log L}{\partial \theta} = 0 \quad (3)$
10. METHOD OF MAXIMUM LIKELIHOOD ESTIMATION
Since $L > 0$, $\log L$ is defined, and because the logarithm is monotone increasing, L and $\log L$ attain their extreme values at the same value of $\theta$; the maximizing value $\theta^*$ is called the maximum likelihood estimator.
Note: equation (3) is more convenient than (1) from a practical point of view.
11. METHOD OF MAXIMUM LIKELIHOOD ESTIMATION
Example: for a random sample $x_1, \dots, x_n$ from a Poisson population with parameter $\lambda$, the likelihood equation for estimating $\lambda$ is
$\dfrac{\partial \log L}{\partial \lambda} = -n + \dfrac{\sum x_i}{\lambda} = 0$, giving $\lambda^* = \dfrac{1}{n}\sum x_i = \bar{x}$.
Thus the M.L.E. for $\lambda$ is the sample mean.
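A numerical sketch of this example (the true $\lambda$ below is assumed for illustration; the Poisson sampler uses Knuth's method since the standard library has none): the closed-form MLE $\lambda^* = \bar{x}$ should give a log-likelihood at least as large as nearby candidate values.

```python
import math
import random

random.seed(7)

lam_true, n = 3.0, 500

def poisson_draw(lam):
    # Knuth's method for one Poisson variate.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

xs = [poisson_draw(lam_true) for _ in range(n)]

def log_lik(lam):
    # log L(lambda) = -n*lambda + (sum x_i)*log(lambda) - sum log(x_i!)
    return -n * lam + sum(xs) * math.log(lam) - sum(math.lgamma(x + 1) for x in xs)

lam_mle = sum(xs) / n   # closed-form MLE: the sample mean

# The sample mean should beat nearby candidate values of lambda.
assert log_lik(lam_mle) >= log_lik(lam_mle * 0.9)
assert log_lik(lam_mle) >= log_lik(lam_mle * 1.1)
print(round(lam_mle, 2))
```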
12. METHOD OF MOMENTS
Let $f(x;\, \theta_1, \theta_2, \dots, \theta_k)$ be the density function of the parent population with k parameters. If $\mu_r'$ denotes the r-th moment about the origin, then
$\mu_r' = \displaystyle\int_{-\infty}^{\infty} x^r f(x;\, \theta_1, \dots, \theta_k)\, dx, \quad r = 1, 2, \dots, k \quad (1)$
13. STEPS OF METHOD OF MOMENTS
Let $x_1, x_2, \dots, x_n$ be a random sample of size n from the given population.
Step 1: solve the k equations (1) for $\theta_1, \dots, \theta_k$ in terms of $\mu_1', \dots, \mu_k'$.
Step 2: replace these moments $\mu_r'$, $r = 1, 2, \dots, k$, by the sample moments $m_1', m_2', \dots, m_k'$, i.e.
$\theta_i^* = \theta_i(\mu_1'^*, \mu_2'^*, \dots, \mu_k'^*) = \theta_i(m_1', m_2', \dots, m_k'), \quad i = 1, 2, \dots, k$
Step 3: $\theta_1^*, \theta_2^*, \dots, \theta_k^*$ are the required estimators.
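The three steps can be sketched for an assumed $N(\mu, \sigma^2)$ population (two parameters, so $k = 2$; the true values below are illustrative). Step 1 solved analytically gives $\mu = \mu_1'$ and $\sigma^2 = \mu_2' - (\mu_1')^2$; steps 2 and 3 plug in the sample moments:

```python
import random

random.seed(3)

# Hypothetical N(mu, sigma^2) population; parameters assumed for illustration.
mu_true, sigma_true, n = 10.0, 2.0, 5000
xs = [random.gauss(mu_true, sigma_true) for _ in range(n)]

# Step 2: sample moments m_1' and m_2' replace the population moments.
m1 = sum(xs) / n
m2 = sum(x * x for x in xs) / n

# Step 3: method-of-moments estimators from the Step 1 solution.
mu_hat = m1
sigma2_hat = m2 - m1 * m1
print(round(mu_hat, 1), round(sigma2_hat, 1))
```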
14. ERROR OF ESTIMATE
When we use a sample mean to estimate the population mean, we know that although we are using a method of estimation with certain desirable properties, the chances are slim, virtually nonexistent, that the estimate will actually equal the population mean $\mu$.
The error of estimate is the difference between the estimator and the quantity it is supposed to estimate: $\bar{x} - \mu$ is the error of estimate for the population mean.
To examine this error, let us make use of the fact that for large n,
$Z = \dfrac{\bar{x} - \mu}{\sigma/\sqrt{n}}$
is a random variable having approximately the standard normal distribution.
16. ERROR OF ESTIMATE
[Figure: the large-sample distribution of $\bar{x}$ — a standard normal curve with area $1 - \alpha$ between $-z_{\alpha/2}$ and $z_{\alpha/2}$ and area $\alpha/2$ in each tail.]
As shown in the figure, we can assert with probability $1 - \alpha$ that the inequality
$-z_{\alpha/2} \le \dfrac{\bar{x} - \mu}{\sigma/\sqrt{n}} \le z_{\alpha/2}$
will be satisfied, or that
$|\bar{x} - \mu| \le z_{\alpha/2} \cdot \dfrac{\sigma}{\sqrt{n}}$,
where $z_{\alpha/2}$ is such that the normal-curve area to its right equals $\alpha/2$.
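The probability statement can be checked by simulation (a sketch assuming a standard normal population, $\alpha = 0.05$, and $n = 50$): the fraction of samples whose error stays below the bound $z_{\alpha/2}\,\sigma/\sqrt{n}$ should be close to $1 - \alpha = 0.95$.

```python
import math
import random

random.seed(11)

mu, sigma, n = 0.0, 1.0, 50
z_half_alpha = 1.96                       # z_{alpha/2} for alpha = 0.05
bound = z_half_alpha * sigma / math.sqrt(n)

trials = 10000
covered = 0
for _ in range(trials):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    if abs(xbar - mu) <= bound:
        covered += 1
print(covered / trials)   # should be close to 1 - alpha = 0.95
```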
17. Determination of sample size:
Suppose that we want to use the mean of a large random sample to estimate the mean of a population, and we want to be able to assert with probability $1 - \alpha$ that the error will be at most some prescribed quantity E. The sample size can be determined by setting $E = z_{\alpha/2}\,\sigma/\sqrt{n}$ and solving for n:
$n = \left(\dfrac{z_{\alpha/2}\,\sigma}{E}\right)^2$
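A short sketch of the formula (the helper name `sample_size` and the example values $\sigma = 15$, $E = 2$ are illustrative; the result is rounded up, since n must be a whole number at least as large as the formula's value):

```python
import math

def sample_size(sigma, E, z=1.96):
    # n = (z_{alpha/2} * sigma / E)^2, rounded up to the next integer.
    return math.ceil((z * sigma / E) ** 2)

# Example: sigma = 15, maximum error E = 2, probability 0.95 (z_{alpha/2} = 1.96).
n = sample_size(15.0, 2.0)
print(n)
```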