
Just finished! I just finished the CMU OLI Probability & Statistics course, which I started... somewhere back in March or June. Overall, I think it's a pretty good statistics course. What I like best about it is that it is heavy on quizzes and exercises with real-world datasets, so I picked up a bit more R along with the statistics basics.

It covers a broad syllabus (https://oli.cmu.edu/jcourse/webui/syllabus/outline.do?section=66b1a66880020ca600b8f6e6295753ca&from=66b1bfc080020ca6012390014e52d1b7) from a fairly practical standpoint: data graphing; stuff like means, medians, and distributions; the rules of probability; conditional probability; probability trees; Bayes's theorem; the binomial and normal distributions in particular; confidence intervals; z-tests; t-tests; ANOVA F-tests; the chi-squared test; and linear models.
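To give a flavor of the conditional-probability material, the classic Bayes's-theorem exercise is a diagnostic-test problem. A minimal sketch in Python (the course itself uses R, and these numbers are made up for illustration):

```python
# Bayes's theorem on a hypothetical diagnostic test:
#   P(D | +) = P(+ | D) * P(D) / P(+)
prevalence = 0.01   # P(D): base rate of the condition (assumed)
sensitivity = 0.95  # P(+ | D): true-positive rate (assumed)
false_pos = 0.05    # P(+ | not D): false-positive rate (assumed)

# Law of total probability for the denominator P(+):
p_pos = sensitivity * prevalence + false_pos * (1 - prevalence)

# Bayes's theorem for the posterior P(D | +):
posterior = sensitivity * prevalence / p_pos
print(round(posterior, 3))  # ~0.161: most positives are false positives at a 1% base rate
```

The punchline such exercises drive at: even a fairly accurate test yields a posterior far below its sensitivity when the base rate is low.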

It has some drawbacks, of course: it's largely NHST-based, as one would expect; the Java applets make copy-and-paste impossible on my Linux system, which made answering questions a bit annoying; the R code is not really explained, so you have to figure things out yourself; and parts of it can be very repetitious (if I never have to specify again what the null hypothesis is and what H_1 is, it will be too soon) and trivial, leading to occasional '-_- yeah, whatever' reactions.
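The null-vs-alternative ritual mentioned above boils down to a short calculation every time. A one-sample t-test, written out by hand in Python with hypothetical data (the course's exercises do this in R, where it's a single `t.test()` call):

```python
# One-sample t-test by hand: H0: mu = mu0 vs H_1: mu != mu0.
# Illustrative sketch with made-up data.
import math
import statistics

data = [5.2, 4.8, 5.1, 5.5, 4.9, 5.3, 5.0, 5.4]  # hypothetical sample
mu0 = 5.0                                        # mean under the null hypothesis

n = len(data)
xbar = statistics.mean(data)
s = statistics.stdev(data)  # sample standard deviation (n - 1 denominator)

# t statistic with n - 1 = 7 degrees of freedom
t = (xbar - mu0) / (s / math.sqrt(n))
print(round(t, 3))  # 1.732; compare against the t distribution for a p-value
```

The repetition complaint is that every exercise restates H0 and H_1 before reaching this one-line statistic.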

But overall I'm pretty glad I did it. I now understand much better the tools I was using to analyze my self-experiments, and hopefully it'll be a good base for tackling a Bayesian textbook like Kruschke's 2010 Doing Bayesian Data Analysis.