Method of Moments

Stat 428 AU2008


STATISTICS 428 – Spring 2008
Section 6.2: Methods of Point Estimation
Key topics: moments, method of moments, likelihood, maximum likelihood estimation.

In Section 6.1 we studied a few criteria that can be used to judge the worth of point estimators. Here, we study how to construct point estimators.

Method of Moments

The basic idea is to equate certain sample statistics to the corresponding population expected values.

Population moment: Let X1, X2, ..., Xn be a random sample from a pmf or pdf f(x). For k = 1, 2, 3, ..., the k-th population moment, or k-th moment of the distribution f(x), is E(X^k).

Sample moment: The k-th sample moment is

$$\frac{1}{n}\sum_{i=1}^{n} X_i^k.$$

Examples:

• The first population moment is E(X), the population mean; the first sample moment is the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$.

• The second population moment is E(X^2); the second sample moment is $\frac{1}{n}\sum_{i=1}^{n} X_i^2$.

Method of moments (MoM) estimation: Let X1, X2, ..., Xn be a random sample from a pmf or pdf f(x; θ1, θ2, ..., θm), where θ1, θ2, ..., θm are parameters whose values are unknown. The moment estimators θ̂1, θ̂2, ..., θ̂m are obtained by equating the first m sample moments to the corresponding first m population moments and solving for θ1, θ2, ..., θm.

1. Example: Let X1, X2, ..., Xn represent a random sample from Uniform(0, τ); that is,

$$f(x; \tau) = \frac{1}{\tau}, \quad 0 < x < \tau.$$

Find the method of moments estimator of τ.
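The notes leave this derivation as an exercise; the standard answer equates the first population moment E(X) = τ/2 to the sample mean, giving τ̂ = 2X̄. A minimal simulation sketch (the function name, seed, and sample size are illustrative, not from the notes):

```python
import random

def mom_uniform_tau(sample):
    """Method-of-moments estimate of tau for Uniform(0, tau).

    The first population moment is E(X) = tau / 2; equating it to the
    sample mean X-bar and solving gives tau-hat = 2 * X-bar.
    """
    xbar = sum(sample) / len(sample)
    return 2 * xbar

random.seed(428)
true_tau = 10.0
sample = [random.uniform(0, true_tau) for _ in range(5000)]
tau_hat = mom_uniform_tau(sample)   # should be near true_tau
```

With 5000 draws the estimate lands close to the true τ, though note τ̂ = 2X̄ can fall below the sample maximum, a known quirk of the MoM estimator for this model.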


2. Example: Let X1, X2, ..., Xn represent a random sample of the service times of n customers at a bank. Assume the underlying distribution is exponential with parameter λ; that is, X1, X2, ..., Xn ∼ f(x; λ), where

$$f(x; \lambda) = \begin{cases} \lambda e^{-\lambda x}, & x > 0 \\ 0, & \text{elsewhere.} \end{cases}$$

Find the method of moments estimator of λ.
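For this parameterization the first population moment is E(X) = 1/λ, so the standard MoM answer is λ̂ = 1/X̄. A quick simulation check (seed, rate, and sample size are illustrative assumptions):

```python
import random

def mom_exponential_rate(sample):
    # E(X) = 1/lambda for the rate parameterization, so equating the
    # first sample moment to the first population moment gives
    # lambda-hat = 1 / X-bar.
    xbar = sum(sample) / len(sample)
    return 1.0 / xbar

random.seed(428)
true_lam = 0.5   # e.g., a mean service time of 2 minutes
sample = [random.expovariate(true_lam) for _ in range(5000)]
lam_hat = mom_exponential_rate(sample)
```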

3. Example: Let X1, X2, ..., Xn represent a random sample from a gamma distribution; that is, X1, X2, ..., Xn ∼ f(x; α, β), where α and β are the two parameters. Find the method of moments estimators of α and β.
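Since the gamma has two unknown parameters, MoM equates the first two sample moments to E(X) = αβ and E(X²) = αβ² + (αβ)² (shape–scale parameterization) and solves. A sketch of that solution, with illustrative names and sample size:

```python
import random

def mom_gamma(sample):
    """MoM estimators for Gamma(alpha, beta) with shape alpha, scale beta.

    E(X) = alpha*beta and E(X^2) = alpha*beta^2 + (alpha*beta)^2, so with
    m1, m2 the first two sample moments:
        beta-hat  = (m2 - m1**2) / m1
        alpha-hat = m1**2 / (m2 - m1**2)
    """
    n = len(sample)
    m1 = sum(sample) / n
    m2 = sum(x * x for x in sample) / n
    beta_hat = (m2 - m1 * m1) / m1
    alpha_hat = m1 * m1 / (m2 - m1 * m1)
    return alpha_hat, beta_hat

random.seed(428)
# random.gammavariate takes (shape, scale), matching alpha and beta here
sample = [random.gammavariate(2.0, 3.0) for _ in range(20000)]
alpha_hat, beta_hat = mom_gamma(sample)
```

Note that m2 − m1² is just the (biased) sample variance, so the solution can equivalently be read as β̂ = s²/X̄ and α̂ = X̄²/s².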


Maximum Likelihood Estimation

Maximum likelihood estimation is a widely used technique. The basic idea: having observed data from some distribution, we ask, 'Given the data, what are the most likely values of the parameters of the distribution from which the data were obtained?'

1. Example: A random sample of 20 voters was chosen from a large city. Twelve of the 20 said that they favored candidate A for mayor. Let p be the proportion of voters in the city who favor candidate A. For what value of p is the observed sample most likely to have occurred?
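The intended answer is the sample proportion 12/20 = 0.6, which maximizes the binomial likelihood. A brute-force grid search (the grid resolution is an illustrative choice, not from the notes) makes the idea concrete:

```python
from math import comb

# Likelihood of observing 12 successes in 20 independent Bernoulli(p)
# trials, evaluated over a grid of candidate values of p.
n, x = 20, 12
grid = [i / 100 for i in range(1, 100)]          # p = 0.01, ..., 0.99
likelihood = {p: comb(n, x) * p**x * (1 - p)**(n - x) for p in grid}
p_hat = max(likelihood, key=likelihood.get)      # grid maximizer
```

The maximizing value is p = 0.60, matching x/n; calculus on the log-likelihood gives the same answer exactly.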

• Likelihood function: Let X1 , X2 , . . . , Xn have joint pmf or pdf f (x1 , x2 , . . . , xn ; θ1 , θ2 , . . . , θm ) where θ1 , θ2 , . . . , θm are parameters with unknown values. When x1 , x2 , . . . , xn are the observed sample values and the joint pmf or pdf is regarded as a function of θ1 , θ2 , . . . , θm , then the joint pmf or pdf is called the likelihood function.


• Maximum Likelihood Estimates (MLEs): The maximum likelihood estimates θ̂1, θ̂2, ..., θ̂m are the values of θ1, θ2, ..., θm that maximize the likelihood function. Replacing the observed xi by the random variables Xi in these estimates gives the maximum likelihood estimators.

1. Example: Let X1, X2, ..., Xn be a random sample from an exponential distribution with parameter λ. Find the MLE of λ.
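For the rate parameterization the log-likelihood is log L(λ) = n log λ − λ Σxi, and setting its derivative to zero gives λ̂ = n/Σxi = 1/X̄. A small numeric check that this candidate beats nearby values (seed and sample size are illustrative):

```python
import math
import random

def exp_loglik(lam, sample):
    # log L(lambda) = n * log(lambda) - lambda * sum(x_i)
    return len(sample) * math.log(lam) - lam * sum(sample)

random.seed(428)
sample = [random.expovariate(1.5) for _ in range(2000)]
lam_hat = len(sample) / sum(sample)   # candidate MLE: 1 / X-bar

# The log-likelihood is strictly concave in lambda, so the candidate
# should dominate nearby perturbed values.
better = all(exp_loglik(lam_hat, sample) >= exp_loglik(lam_hat * f, sample)
             for f in (0.9, 0.95, 1.05, 1.1))
```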

2. Example: Let X1 , X2 , . . . , Xn be a random sample from a normal distribution with parameters µ and σ 2 . Find the MLEs of µ and σ 2 .
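The standard answers here are µ̂ = X̄ and σ̂² = (1/n) Σ(Xi − X̄)², with divisor n rather than n − 1. A simulation sketch (names, seed, and parameters are illustrative):

```python
import random

def normal_mles(sample):
    # Maximizing the normal log-likelihood gives mu-hat = X-bar and
    # sigma2-hat = (1/n) * sum((X_i - X-bar)**2)  -- divisor n, not n-1,
    # so sigma2-hat is a biased (but consistent) estimator.
    n = len(sample)
    mu_hat = sum(sample) / n
    sigma2_hat = sum((x - mu_hat) ** 2 for x in sample) / n
    return mu_hat, sigma2_hat

random.seed(428)
sample = [random.gauss(5.0, 2.0) for _ in range(10000)]
mu_hat, sigma2_hat = normal_mles(sample)   # near 5.0 and 4.0
```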


• Estimating functions of parameters (invariance): Let θ̂1, θ̂2, ..., θ̂m be the MLEs of the parameters θ1, θ2, ..., θm. Then the MLE of any function h(θ1, θ2, ..., θm) of these parameters is the function h(θ̂1, θ̂2, ..., θ̂m) of the MLEs.

• Example: For the N(µ, σ²) distribution, we found the ML estimators to be $\hat{\mu} = \bar{X}$ and $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$. Find the maximum likelihood estimator of $h(\mu, \sigma^2) = \sqrt{\sigma^2} = \sigma$.
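By the invariance property the answer is simply σ̂ = √(σ̂²): plug the MLE of σ² into h. A short sketch (seed and parameters are illustrative assumptions):

```python
import math
import random

random.seed(428)
sample = [random.gauss(0.0, 3.0) for _ in range(10000)]
n = len(sample)

# MLEs of mu and sigma^2 for the normal model.
mu_hat = sum(sample) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in sample) / n

# Invariance: the MLE of h(mu, sigma^2) = sqrt(sigma^2) = sigma is
# h evaluated at the MLEs.
sigma_hat = math.sqrt(sigma2_hat)
```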


• Many books use the following form of the exponential distribution:

$$f(x; \lambda) = \begin{cases} \dfrac{1}{\lambda}\, e^{-x/\lambda}, & x > 0 \\ 0, & \text{elsewhere.} \end{cases}$$

Find the maximum likelihood estimator of λ.
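In this mean parameterization λ is E(X) itself, and maximizing the log-likelihood gives λ̂ = X̄, the reciprocal of the rate-parameterization answer. A quick check (seed, mean, and sample size are illustrative):

```python
import random

# For f(x; lambda) = (1/lambda) * exp(-x/lambda), lambda is the MEAN of
# the distribution, and the MLE works out to lambda-hat = X-bar.
random.seed(428)
true_mean = 4.0
# random.expovariate takes the RATE, i.e. 1 / mean.
sample = [random.expovariate(1.0 / true_mean) for _ in range(5000)]
lam_hat = sum(sample) / len(sample)   # MLE of lambda in this form
```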