Robert Ricci / Evaluating Networked Systems
Commit b9642f00, authored Mar 03, 2014 by Robert Ricci
Finish part of lecture about parameter estimation
parent 85ff8570
Showing 1 changed file with 36 additions and 1 deletion
lectures/lecture14/lecturenotes.tex +36 -1
@@ -20,8 +20,43 @@
\begin{outline}
\1 From last time
\1 Today: How well does your data fit a line?
\2 More complicated regressions exist, of course, but we'll stick with this one for now
\2 Eyeballing is just not rigorous enough
\1 Basic model: $y_i = b_0 + b_1 x_i + e_i$
\2 $y_i$ is the response being predicted
\2 $b_0$ is the y-intercept
\2 $b_1$ is the slope
\2 $x_i$ is the predictor
\2 $e_i$ is the error
\2 \textit{Which of these are random variables?}
\3 A: All but $x_i$: the $b$s are estimated from random variables, and $e$ is the difference between random variables
\3 So, we can compute statistics on them
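\2 To make the pieces concrete, a made-up instance (these numbers are illustrative, not from the text): with $b_0 = 1$, $b_1 = 2$, an observed point $(x_i, y_i) = (3, 7.5)$ has error
\begin{displaymath}
e_i = y_i - (b_0 + b_1 x_i) = 7.5 - (1 + 2 \cdot 3) = 0.5
\end{displaymath}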
\1 Two criteria for getting the $b$s
\2 Zero total error
\2 Minimize SSE (sum of squared errors)
\2 Example of why the first alone is not enough: two points, infinite lines with zero total error (see the check below this list)
\2 Squared errors are always positive, so that criterion alone could overshoot or undershoot
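\2 A concrete check of the two-point example above (points made up for illustration): for $(0, 0)$ and $(2, 2)$, the total error is
\begin{displaymath}
e_1 + e_2 = (0 - b_0) + (2 - b_0 - 2 b_1) = 2 - 2 b_0 - 2 b_1 ,
\end{displaymath}
which is zero for \textit{every} line with $b_0 = 1 - b_1$, so infinitely many lines meet the first criterion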
\1 Deriving $b_0$ is easy
\2 Solve for $e_i$: $e_i = y_i - (b_0 + b_1 x_i)$
\2 Take the mean over all $i$: $\overline{e} = \overline{y} - b_0 - b_1 \overline{x}$ (averaging step expanded below)
\2 Set the mean error to 0 to get $b_0 = \overline{y} - b_1 \overline{x}$
\2 Now we just need $b_1$
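\2 Spelling out the averaging step (nothing new, just the algebra expanded):
\begin{displaymath}
\overline{e} = \frac{1}{n} \sum_{i} e_i = \frac{1}{n} \sum_{i} \left( y_i - b_0 - b_1 x_i \right) = \overline{y} - b_0 - b_1 \overline{x}
\end{displaymath}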
\1 Deriving $b_1$ is harder
\2 SSE = the sum of the squared errors over all $i$
\2 We want the minimum value of this
\2 It's a function of $b_1$ with one local minimum (a parabola opening upward)
\2 So we can differentiate and look for a zero
\2 $s_y^2 - 2 b_1 s^2_{xy} + b_1^2 s_x^2$, then take the derivative (worked out below, after this block)
\2 $s_{xy}$ is the covariance of $x$ and $y$ (see p. 181)
\2 In the end, this gives us $b_1 = \frac{s^2_{xy}}{s_x^2}$
\3 Covariance of $x$ and $y$ divided by variance of $x$
\3 Equivalently, $b_1 = \frac{\sum xy - n\overline{x}\,\overline{y}}{\sum x^2 - n(\overline{x})^2}$
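\2 Carrying out the differentiation that the notes compress (same quantities as above):
\begin{displaymath}
\frac{d}{d b_1} \left( s_y^2 - 2 b_1 s^2_{xy} + b_1^2 s_x^2 \right) = -2 s^2_{xy} + 2 b_1 s_x^2 = 0
\quad \Rightarrow \quad b_1 = \frac{s^2_{xy}}{s_x^2}
\end{displaymath}
\3 The second derivative is $2 s_x^2 > 0$, confirming this is a minimum
\2 A numeric check with made-up data $(1, 2), (2, 3), (3, 5)$: here $n = 3$, $\sum xy = 23$, $\overline{x} = 2$, $\overline{y} = 10/3$, $\sum x^2 = 14$, so
\begin{displaymath}
b_1 = \frac{23 - 3 \cdot 2 \cdot \frac{10}{3}}{14 - 3 \cdot 2^2} = \frac{3}{2} ,
\qquad
b_0 = \overline{y} - b_1 \overline{x} = \frac{10}{3} - 3 = \frac{1}{3}
\end{displaymath}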
\1 For next time
\end{outline}