This semester I went wild: I took up both Machine Learning for Trading AND Machine Learning. Many people questioned this decision, commenting that it would be very tough and rigorous and that I'd have no time. But thinking back, I did enjoy life quite a bit. No doubt the effort of doing these two classes together was enormous, but I still found time to go on family vacations and workcations (where I didn't touch my coursework), spent a week in Qatar for the FIFA World Cup 2022, and even had some free weekends.

Why did I take this course? Firstly, it counts towards the ML Specialization. Secondly, I expected it to be easy for me given my background. Since my motivation was to complete the program faster, I paired it with ML (CS7641), thinking the two would complement each other.

Now, about the course - it is a Prof. Joyner course, so it came with a very neatly structured curriculum and calendar. The course is broken down into 8 projects, plus quizzes and two exams.

With some python experience and basic ML knowledge, you can breeze through this course. Fairly easy. The course covers trading jargon in some detail - stocks, indicators for transactions, options, etc. If you have never gotten into trading, this course can prep you on the terms and the theory behind the trades.

Newsflash: You will not end up building ML algorithms that can mint you money.

OMSCentral, at the time of writing, tags this course with a difficulty of 2.5 and a workload of approximately 11.2 hours per week. It took me way less than that, probably 6-7 hours per week.

The projects differ in weightage - some are valued less while one project holds 20% of your grade - so think of it as a mini-project-heavy course. The projects are fairly simple - again, just python, nothing fancy. Half of the projects require you to write a report. This mostly involves creating various plots and then comparing and analyzing them. I liked this bit, deriving insights and understanding the trading ideas from plots. Spend a few extra minutes on how your plots look: graders are very strict about the colors you use for the lines, missing legends, axis labels, and other details. These are supposed to be easy projects, but losing points is also easy since these minute things are strictly graded. One bonus I found in this course is that you are free to share your output plots and artifacts on Ed (with some watermarks to avoid reuse) and compare with other students. The course encourages this to create a peer-review sort of atmosphere. The sketch below shows the kind of fully labeled chart that keeps graders happy.
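Here's a minimal sketch of what I mean - the data and file names are my own made-up placeholders, not anything from the course templates, but the point is the title, axis labels, legend, and distinct line colors that graders look for:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical data standing in for normalized portfolio vs. benchmark values
dates = pd.date_range("2022-01-01", periods=100)
df = pd.DataFrame(
    {
        "Portfolio": np.cumprod(1 + np.random.normal(0, 0.010, 100)),
        "Benchmark": np.cumprod(1 + np.random.normal(0, 0.008, 100)),
    },
    index=dates,
)

fig, ax = plt.subplots(figsize=(8, 4))
df["Portfolio"].plot(ax=ax, color="blue", label="Portfolio")    # distinct colors
df["Benchmark"].plot(ax=ax, color="purple", label="Benchmark")
ax.set_title("Normalized Portfolio Value vs. Benchmark")        # title
ax.set_xlabel("Date")                                           # axis labels
ax.set_ylabel("Normalized Value")
ax.legend(loc="best")                                           # legend
ax.grid(True)
fig.savefig("portfolio_vs_benchmark.png", bbox_inches="tight")
```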

There are two exams - a midterm and a final. Like the other Prof. Joyner classes I've taken, both are MCQ style (closed book though). If you paid attention to the lectures (which are easy to follow), these should be easy too. Sample questions and previous tests are provided for practice. If you go through those even once, you can score 100% on the exams.

The content is fairly basic and easy. It starts off by teaching you how to manipulate dataframes with pandas and numpy, goes into the details of the trading jargon (indicators and shizz), and then finally gets to ML in the last third of the course. Here, regression, bagging and boosting, decision trees, etc. are discussed. After another 10 feet of jargon like CAPM and EMH, reinforcement learning is touched upon, with the primary focus being Q-Learning. This was the most exciting part for me - writing a Q-Learning agent from scratch. You may want to refer to the texts for detailed knowledge here.
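To give a flavor of that last part, here's a minimal sketch of a tabular Q-Learning agent - my own simplified version with made-up parameter names, not the course's project API or template:

```python
import numpy as np

class QLearner:
    """Minimal tabular Q-Learning agent (simplified sketch, not the course template)."""

    def __init__(self, num_states, num_actions, alpha=0.2, gamma=0.9, epsilon=0.1):
        self.alpha = alpha        # learning rate
        self.gamma = gamma        # discount factor
        self.epsilon = epsilon    # exploration rate
        self.num_actions = num_actions
        self.Q = np.zeros((num_states, num_actions))

    def choose_action(self, state):
        # Epsilon-greedy: explore with probability epsilon, otherwise exploit
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.num_actions)
        return int(np.argmax(self.Q[state]))

    def update(self, state, action, reward, next_state):
        # Standard Q-Learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
        best_next = np.max(self.Q[next_state])
        target = reward + self.gamma * best_next
        self.Q[state, action] += self.alpha * (target - self.Q[state, action])
```

In the course you'd wrap something like this around discretized market indicators as states and buy/hold/sell as actions, but the environment and discretization are up to you.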

If you are good at python and already have some ML skills (or at least understand some of the concepts), this course should be a cakewalk. I'd highly recommend that such students pair it with another course to use their time better (unless, of course, you have other commitments). Others could take it as a standalone course.

A small shoutout to Shubham - my friend, colleague, and classmate for this course. It is fun if you are doing it with someone!