# Parametric Estimation

This section discusses how to estimate the parameters of a distribution, e.g. $$\mu$$ and $$\sigma^2$$ of a Gaussian, or the coefficients $$w\_0, w\_1$$ of the line $$f(x) = w\_0 + w\_1x$$.

We denote the parameters by $$\Theta = (\mu, \sigma^2)$$

The **likelihood** of $$\Theta$$ given a sample X is given by:

$$l(\Theta|X) = p(X|\Theta) = \prod\_t \, p(x^t|\Theta)$$

Therefore, the **Log Likelihood** of $$\Theta$$ given a sample X is denoted by:

$$L(\Theta|X) := \log l(\Theta|X) = \sum\_t \log p(x^t|\Theta)$$

Both expressions assume that the observations in X are independent, so the joint probability factors into a product (and its log into a sum).

The **Maximum Likelihood Estimator** (MLE) is given by:

$$\Theta^\* := \arg\max\_\Theta L(\Theta|X)$$
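To make the definitions concrete, here is a minimal Python sketch (not part of the original notes) that computes the Gaussian log likelihood as a sum of log densities, which is valid under the independence assumption. The sample values are made up for illustration.

```python
import math

def log_likelihood(sample, mu, sigma2):
    """Log likelihood L(Theta|X) of Gaussian parameters Theta = (mu, sigma2),
    computed as the sum of log densities over the sample."""
    return sum(
        -0.5 * math.log(2 * math.pi * sigma2) - (x - mu) ** 2 / (2 * sigma2)
        for x in sample
    )

sample = [2.1, 1.9, 2.4, 1.6]  # sample mean is 2.0

# The MLE is the Theta with the highest log likelihood; here we just
# compare two candidate parameter settings instead of optimizing.
print(log_likelihood(sample, 2.0, 0.1) > log_likelihood(sample, 0.0, 0.1))
```

Because the log is monotonic, maximizing $$L(\Theta|X)$$ gives the same $$\Theta^\*$$ as maximizing $$l(\Theta|X)$$, but the sum of logs is numerically far more stable than a long product of small probabilities.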

## Estimating the Parameter $$P\_h$$ of a Bernoulli Distribution

X is a Bernoulli Random Variable.

$$P\_h = P\[X=1]$$

For example, consider the following:

Let 1 denote Heads, and 0 denote Tails.\
Say X = {1,1,0}\
We need to determine $$\Theta$$ i.e. $$P\_h$$.

We have $$l(P\_h|X) = P(X|P\_h) = P\_h \cdot P\_h \cdot (1-P\_h) = P\_h^2(1-P\_h)$$

More generally, for $$X = \{x^t\}\_{t=1}^N$$, we have:

$$p(X|p\_h) = \prod\_{t=1}^N p\_h^{x^t}(1-p\_h)^{(1-x^t)}$$

It can be proved that the MLE is given by $$p\_h = \frac{\sum\_t x^t}{N}$$, i.e. the fraction of 1s in the sample.
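A minimal sketch of the Bernoulli MLE, applied to the X = {1, 1, 0} example above:

```python
def bernoulli_mle(sample):
    """MLE of p_h: the fraction of 1s (heads) in the sample."""
    return sum(sample) / len(sample)

# X = {1, 1, 0} from the text: two heads out of three tosses
print(bernoulli_mle([1, 1, 0]))  # p_h = 2/3
```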

## Estimating the Parameters of a Multinomial Distribution

Consider a die with 6 faces numbered from 1 to 6.

If X is a Multinomial Random Variable, there are k>2 possible values of X (here, 6).

Say X={5, 4, 6}. We can imagine indicator vectors for each observation as \[0 0 0 0 1 0], \[0 0 0 1 0 0] and \[0 0 0 0 0 1].

Say X={4,6,4,2,3,3}. The MLE of $$p\_i$$, the probability that side **i** shows up, is given by:

$$p\_i = \frac{\sum\_{t=1}^N x\_i^t}{N}$$

where $$x\_i^t$$ is 1 if observation t shows side i and 0 otherwise, i.e. $$p\_i$$ is the fraction of observations that show side i.
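A minimal sketch of the multinomial MLE, using the X = {4, 6, 4, 2, 3, 3} example for a 6-sided die. Counting occurrences of each side is equivalent to summing the indicator vectors described above.

```python
def multinomial_mle(sample, k):
    """MLE of (p_1, ..., p_k): the fraction of observations equal to each side i."""
    n = len(sample)
    return [sample.count(i) / n for i in range(1, k + 1)]

# X = {4, 6, 4, 2, 3, 3} from the text: sides 3 and 4 each appear twice,
# sides 2 and 6 once each, so the estimates are [0, 1/6, 1/3, 1/3, 0, 1/6]
print(multinomial_mle([4, 6, 4, 2, 3, 3], 6))
```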

## Estimating the Parameters of a Gaussian Distribution

The MLE for the mean m is $$\frac{\sum\_t x^t}{N}$$ and the MLE for the variance $$\sigma^2$$ is $$\frac{\sum\_t (x^t-m)^2}{N}$$.

However, dividing by N-1 instead of N (for the variance) yields the **unbiased estimate**; the MLE slightly underestimates the true variance because m is itself estimated from the same sample.
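A minimal sketch of both Gaussian estimators side by side, on a made-up sample, showing how the N and N-1 denominators differ:

```python
def gaussian_mle(sample):
    """Returns (m, variance MLE, unbiased variance) for a sample.
    The MLE divides by N; the unbiased estimate divides by N - 1."""
    n = len(sample)
    m = sum(sample) / n
    sq_dev = sum((x - m) ** 2 for x in sample)
    return m, sq_dev / n, sq_dev / (n - 1)

# Illustrative sample: mean 4.0, squared deviations sum to 8,
# so the MLE variance is 8/3 and the unbiased estimate is 8/2 = 4.0
print(gaussian_mle([2.0, 4.0, 6.0]))
```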

