[ml] Stanford Machine Learning Course CS229

Paul Oppenheim paul at pauloppenheim.com
Wed Aug 25 08:32:43 UTC 2010


I'm interested, but I'm about to move to Europe for several months, which 
will take me out of any face-to-face meetings. Could I recommend using the wiki 
for class discussion and updates? That way others could follow along and 
learn from what you do at their own pace, and beginners would have a 
clear place to start.

+ p

On 2010-08-19 4:39 PM, Glen Jarvis wrote:
> I'm obviously incredibly interested too. However, here are a few words
> of caution:
>
> * I have another course starting in a few weeks, taught by my boss, that
> I'm required to take (and obviously need to do incredibly well on -- it
> has priority and I'm already a very busy beaver),
> * Although it may not be true for the ML group, I found when I taught a
> Linux Certification Course at Noisebridge that people were interested (very
> interested) for about two weeks. They generally weren't willing to put in
> the homework to get a better understanding for the next week. But we had a
> new starter every week who was very *very* eager and whom I didn't want
> to leave behind. So we were constantly in beginning mode, either losing
> people by progressing too far or boring people by catering to the
> newcomers.
>
> Just a few thoughts in case we hadn't considered this yet..
>
> Cheers,
>
>
> Glen
>
> On Thu, Aug 19, 2010 at 3:36 PM, Joe Hale <joe at jjhale.com> wrote:
>
>     Hi,
>
>     I was wondering if anyone out there wanted to form a study group to
>     work through the Stanford Machine Learning course. The videos of the
>     lectures are on iTunesU and all the handouts and problem sets are
>     online.
>
>     The course consists of 20 lectures, each 1h 15m long. I've
>     pasted the syllabus at the end. It seems like it would provide a
>     really solid foundation for future ML projects at Noisebridge for
>     those interested in getting into ML but who maybe didn't get round to
>     studying it at school.
>
>     I figure we'd watch lectures on our own time and get together to
>     discuss them and the problem sets.
>
>     Let me know if you'd be interested.
>
>     - Joe Hale
>
>     :::The course details:::
>
>     Machine Learning CS229
>     http://www.stanford.edu/class/cs229/
>
>     Course Description
>
>     This course provides a broad introduction to machine learning and
>     statistical pattern recognition. Topics include: supervised learning
>     (generative/discriminative learning, parametric/non-parametric
>     learning, neural networks, support vector machines); unsupervised
>     learning (clustering, dimensionality reduction, kernel methods);
>     learning theory (bias/variance tradeoffs; VC theory; large margins);
>     reinforcement learning and adaptive control. The course will also
>     discuss recent applications of machine learning, such as to robotic
>     control, data mining, autonomous navigation, bioinformatics, speech
>     recognition, and text and web data processing.
>
>     Prerequisites
>
>     Students are expected to have the following background:
>     Knowledge of basic computer science principles and skills, at a level
>     sufficient to write a reasonably non-trivial computer program.
>     Familiarity with basic probability theory. (CS109 or Stat116 is
>     sufficient but not necessary.)
>     Familiarity with basic linear algebra. (Any one of Math 51, Math
>     103, Math 113, or CS 205 would be much more than necessary.)
>
>     Course Materials
>     There is no required text for this course. Notes will be posted
>     periodically on the course web site. Several books are recommended
>     as optional reading.
>
>     Syllabus
>     Introduction (1 class)
>     Basic concepts.
>
>     Supervised learning. (7 classes)
>     Supervised learning setup. LMS.
>     Logistic regression. Perceptron. Exponential family.
>     Generative learning algorithms. Gaussian discriminant analysis.
>     Naive Bayes.
>     Support vector machines.
>     Model selection and feature selection.
>     Ensemble methods: Bagging, boosting, ECOC.
>     Evaluating and debugging learning algorithms.
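>
>     (As a taste of the supervised-learning block, here is a minimal
>     sketch of logistic regression fit by batch gradient ascent in
>     NumPy. The toy data, learning rate, and iteration count are all
>     made up for illustration; this is not code from the course
>     handouts.)
>
>     import numpy as np
>
>     def sigmoid(z):
>         return 1.0 / (1.0 + np.exp(-z))
>
>     def fit_logistic(X, y, lr=0.1, iters=1000):
>         """Batch gradient ascent on the logistic log-likelihood.
>         X: (m, n) feature matrix, y: (m,) labels in {0, 1}."""
>         X = np.hstack([np.ones((X.shape[0], 1)), X])   # add intercept term
>         theta = np.zeros(X.shape[1])
>         for _ in range(iters):
>             grad = X.T @ (y - sigmoid(X @ theta))       # gradient of log-likelihood
>             theta += lr * grad / X.shape[0]
>         return theta
>
>     # Two synthetic Gaussian blobs, one per class (purely illustrative).
>     rng = np.random.default_rng(0)
>     X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
>     y = np.concatenate([np.zeros(50), np.ones(50)])
>     theta = fit_logistic(X, y)
>     preds = sigmoid(np.hstack([np.ones((100, 1)), X]) @ theta) > 0.5
>     print("training accuracy:", (preds == y).mean())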
>
>     Learning theory. (3 classes)
>     Bias/variance tradeoff. Union and Chernoff/Hoeffding bounds.
>     VC dimension. Worst case (online) learning.
>     Practical advice on how to use learning algorithms.
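>
>     (The workhorse result in the learning-theory lectures is the
>     Hoeffding bound: for m i.i.d. Bernoulli(p) draws with sample mean
>     p_hat, P(|p_hat - p| > gamma) <= 2 exp(-2 gamma^2 m). A two-line
>     check of how quickly that bound shrinks with m, using gamma = 0.05
>     purely as an example:)
>
>     import math
>
>     gamma = 0.05
>     for m in (100, 1000, 10000):
>         bound = 2 * math.exp(-2 * gamma**2 * m)
>         print(m, bound)     # ~1.21 (vacuous), ~0.013, ~3.9e-22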
>
>     Unsupervised learning. (5 classes)
>     Clustering. K-means.
>     EM. Mixture of Gaussians.
>     Factor analysis.
>     PCA. MDS. pPCA.
>     Independent components analysis (ICA).
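>
>     (Similarly for the unsupervised block, a bare-bones k-means
>     (Lloyd's algorithm) on synthetic data; again just a sketch, not
>     the course's code:)
>
>     import numpy as np
>
>     def kmeans(X, k, iters=100, seed=0):
>         """Alternate nearest-centroid assignment and centroid updates."""
>         rng = np.random.default_rng(seed)
>         centroids = X[rng.choice(len(X), size=k, replace=False)]
>         for _ in range(iters):
>             # Assign each point to its nearest centroid.
>             dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
>             labels = dists.argmin(axis=1)
>             # Recompute each centroid as the mean of its assigned points.
>             new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
>             if np.allclose(new, centroids):
>                 break
>             centroids = new
>         return centroids, labels
>
>     # Three well-separated synthetic clusters (made up for illustration).
>     rng = np.random.default_rng(1)
>     X = np.vstack([rng.normal(c, 0.5, (40, 2)) for c in (0.0, 3.0, 6.0)])
>     centroids, labels = kmeans(X, k=3)
>     print(centroids)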
>
>     Reinforcement learning and control. (4 classes)
>     MDPs. Bellman equations.
>     Value iteration and policy iteration.
>     Linear quadratic regulation (LQR). LQG.
>     Q-learning. Value function approximation.
>     Policy search. Reinforce. POMDPs.
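>
>     (And for the reinforcement-learning block, value iteration on a
>     toy five-state chain MDP; the states, rewards, and discount factor
>     are invented for illustration:)
>
>     import numpy as np
>
>     # Deterministic chain: states 0..4, actions -1 (left) / +1 (right),
>     # reward 1 for entering the terminal state 4, discount gamma = 0.9.
>     n_states, gamma = 5, 0.9
>
>     def step(s, a):
>         s2 = min(max(s + a, 0), n_states - 1)
>         r = 1.0 if (s2 == n_states - 1 and s != n_states - 1) else 0.0
>         return s2, r
>
>     V = np.zeros(n_states)
>     for _ in range(100):                  # Bellman backups until convergence
>         V_new = np.zeros(n_states)
>         for s in range(n_states - 1):     # state 4 is terminal
>             V_new[s] = max(r + gamma * V[s2]
>                            for s2, r in (step(s, a) for a in (-1, 1)))
>         if np.max(np.abs(V_new - V)) < 1e-8:
>             break
>         V = V_new
>     print(V)    # values grow toward the goal: [0.729, 0.81, 0.9, 1.0, 0.0]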
>
>
>
>
> --
> Whatever you can do or imagine, begin it;
> boldness has beauty, magic, and power in it.
>
> -- Goethe
>
>
>
> _______________________________________________
> ml mailing list
> ml at lists.noisebridge.net
> https://www.noisebridge.net/mailman/listinfo/ml



