\documentclass[12pt]{article}

\input{../../texstuff/fonts.sty}
\input{../../texstuff/notepaper.sty}
\usepackage{outlines}

\title{CS6963 Lecture \#20}
\author{Robert Ricci}
\date{April 16, 2015}

\begin{document}

\maketitle

\begin{outline}

\1 Project status updates

\1 Review of topics covered
    \2 Overall point of systems evaluation
        \3 Providing data necessary to make a decision
        \3 Presenting it in a convincing manner
        \3 Performance is not the only relevant data, but it's
            important
    \2 Understand what you are measuring
        \3 Understand what makes convincing evidence
        \3 Understand what you need to measure and how to measure it
        \3 Do some baseline tests of your workload generator and monitors to
            make sure they exhibit expected results
    \2 Understand the system under test
        \3 Where are the boundaries?
        \3 What is inside those boundaries that you are actually measuring?
        \3 How do the SUT boundaries relate to a deployment environment?
        \3 How do the SUT boundaries relate to the claims of the paper?
    \2 Evaluation should be a part of the research and development process
        \3 Convince yourself with data, not just bias
        \3 Often need preliminary evaluations
        \3 Understand the scope and limitations of your work
        \3 The more evaluation you do along the way, the less biased you are
            likely to be
    \2 Recognize the strengths and weaknesses in evaluations that you read
        \3 There is a huge difference between active and passive reading
        \3 Think actively about what you need to see to be convinced
        \3 Look for biases or basic mistakes
    \2 Common mistakes in systems evaluation
        \3 No goals or biased goals
        \3 Ignoring significant factors
        \3 Analysis without understanding the problem---a reason to have a
            concrete problem statement
        \3 No sensitivity analysis
        \3 Ignoring variability
    \2 Use the statistical tools available to you
        \3 For selecting the number of runs
        \3 For understanding the confidence in your results
        \3 For showing difference or sameness in results
        \3 The question is ``have you found something real?''
        \3 Don't just eyeball graphs
            \4 Compute CIs
            \4 Do linear regressions
            \4 Compare means, percentiles, etc.
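As a hypothetical illustration of the points above (the numbers and function names are invented, not from the lecture), here is a standard-library-only sketch of computing confidence intervals instead of eyeballing a graph. It uses the normal approximation via \texttt{statistics.NormalDist}; small sample counts really call for a t-distribution, which needs a library like SciPy.

```python
# Illustrative sketch, not from the lecture: normal-approximation CIs
# for the mean of repeated runs, using only the standard library.
from math import sqrt
from statistics import NormalDist, mean, stdev

def confidence_interval(samples, confidence=0.95):
    """Normal-approximation CI for the mean of a list of measurements."""
    m = mean(samples)
    se = stdev(samples) / sqrt(len(samples))        # standard error
    z = NormalDist().inv_cdf((1 + confidence) / 2)  # e.g. ~1.96 for 95%
    return (m - z * se, m + z * se)

# Made-up latencies (ms) from two system configurations.
baseline = [101, 99, 103, 98, 102, 100, 104, 97]
modified = [95, 93, 96, 92, 94, 95, 97, 91]

lo1, hi1 = confidence_interval(baseline)
lo2, hi2 = confidence_interval(modified)
print(f"baseline 95% CI: ({lo1:.1f}, {hi1:.1f})")
print(f"modified 95% CI: ({lo2:.1f}, {hi2:.1f})")

# Non-overlapping CIs suggest the difference is real; overlapping CIs
# call for a proper difference-of-means test rather than a guess.
print("difference likely real:", hi2 < lo1 or hi1 < lo2)
```

If the intervals overlap, a two-sample t-test (e.g. \texttt{scipy.stats.ttest\_ind}) answers "have you found something real?" more rigorously than comparing point estimates.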
    \2 Work done to make work repeatable helps everyone
        \3 Not just for others, but for yourself too
        \3 Lets you re-run things when something changes
        \3 Makes it easier for others to build on your work
        \3 These things bitrot incredibly fast
        \3 The closer you can get to ``experiment as function call'', the 
            happier you will be
        \3 Assuming your environment is fragile forces you to build more
            repeatable research
        \3 Keep track of everything 
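The "experiment as function call" idea above can be sketched as follows; all names here are illustrative assumptions, with a placeholder standing in for the real workload driver. The point is that every run is parameterized, and the parameters are persisted right next to the results so the run can be repeated or re-analyzed later.

```python
# Hypothetical sketch: an experiment as a function call, with every
# parameter and result logged for repeatability.
import json
import time

def do_one_trial(workload: str, num_clients: int) -> None:
    """Stand-in for driving the real system under test."""
    time.sleep(0.001)

def run_experiment(workload: str, num_clients: int, trials: int = 5,
                   log_path: str = "results.jsonl") -> dict:
    """Run one configuration; log and return results with exact parameters."""
    latencies = []
    for _ in range(trials):
        start = time.perf_counter()
        do_one_trial(workload, num_clients)
        latencies.append(time.perf_counter() - start)
    record = {"workload": workload, "num_clients": num_clients,
              "trials": trials, "latencies_s": latencies,
              "timestamp": time.time()}
    # "Keep track of everything": parameters live alongside results.
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Because the configuration is an argument rather than a manual setup step, re-running after a change is a single call, and sweeping a factor is a loop over parameter values.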
    \2 You are running experiments all the time; the only question is whether
        you are learning anything from them

\1 For next time
    \2 Weekly 3 due Friday midnight
    \2 Papers 4 due next Tuesday
    \2 Talks next Thursday (Anil Kumar, Jithu, George, Jonathon)
        \3 Schedule a time to meet with me
        \3 Any questions about what's expected?

\end{outline}

\end{document}