@KulvinderSingh-pm7cr

So informative!!!
Thanks, professor, for making ML so fun and intuitive!!!

@PhilMcCrack1

Starts at 1:22

@vasileiosmpletsos178

Nice lecture! Helped me a lot!

@saikumartadi8494

Thanks a lot for the video. Please upload the other courses offered by you if they are recorded. I would love to learn from teachers like you.

@rameezusmani1294

Sir, I can't explain in words how helpful your videos are and how easy they are to understand. People like you are true heroes of humanity and science. Thank you so much. I have one question: these days machine learning seems to be limited to deep learning with neural networks. Do people still use maximum likelihood (ML) or MAP or Naive Bayes?

@giulianobianco6752

Great lecture, thanks Professor! I'm supplementing the online certificate with the deeper math and concepts here.

@yukiy.4201

Magnificent!

@doyourealise

Sir, did you find out where the handouts went? (10:30)

@sooriya233

Brilliant

@benw4361

Great lecture!

@fuweili5320

Learned lots from it!!!

@mehmetaliozer2403

Thanks for this amazing lecture 👍👍

@coolarun3150

awesome!!

@maharshiroy8471

I believe that in practical implementations, a poorly conditioned Hessian can cause huge numerical errors during convergence, ultimately rendering second-order methods like Newton's quite unreliable.
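
A minimal numerical sketch of this point (the numbers are made up, not from the lecture): when the Hessian's condition number is huge, a perturbation of the gradient at floating-point noise scale changes the Newton step drastically.

    import numpy as np

    H = np.diag([1.0, 1e-12])          # Hessian with condition number ~1e12
    g = np.array([1.0, 1e-12])         # gradient at the current iterate
    g_noisy = g + 1e-9                 # tiny perturbation, as from round-off

    step = np.linalg.solve(H, g)             # clean Newton step: [1, 1]
    step_noisy = np.linalg.solve(H, g_noisy)
    print(step)        # [1. 1.]
    print(step_noisy)  # second component jumps from 1 to ~1000

This is one reason implementations typically damp or regularize the Hessian (e.g., solving with H + lambda*I, Levenberg-Marquardt style) before taking the step.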

@mikejason3822

Why was an approximation of l(w) (using a Taylor series) used at 13:20?
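
For what it's worth, the usual reasoning (sketched here in standard notation, which may differ from the lecture's): Newton's method replaces l(w) with its second-order Taylor expansion around the current iterate w_t, because that quadratic surrogate has a closed-form minimizer while l(w) itself generally does not:

    \ell(w) \approx \ell(w_t) + \nabla \ell(w_t)^\top (w - w_t) + \tfrac{1}{2}(w - w_t)^\top H(w_t)\,(w - w_t)

Setting the gradient of the right-hand side to zero gives the Newton update:

    w_{t+1} = w_t - H(w_t)^{-1} \nabla \ell(w_t)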

@adosar7261

At 39:53 the example seems a little bit off to me. I would expect Newton's method to converge in a single step (the function looks like a quadratic).
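
That intuition is right for an exact quadratic; a minimal sketch (made-up matrices, not the lecture's example) confirming one-step convergence:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])        # positive definite Hessian
    b = np.array([1.0, -1.0])

    # f(w) = 0.5 * w @ A @ w - b @ w has gradient A @ w - b and constant Hessian A
    w0 = np.array([10.0, -7.0])                 # arbitrary starting point
    w1 = w0 - np.linalg.solve(A, A @ w0 - b)    # one Newton step

    print(np.allclose(w1, np.linalg.solve(A, b)))   # True: w1 is the exact minimizer

So if the function at 39:53 really were quadratic, one step would suffice; presumably it only looks quadratic over the plotted region.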

@JoaoVitorBRgomes

At around 23:00, how do we check if the function is convex?
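
One standard answer (a sketch, not necessarily the lecture's method): for a twice-differentiable function, convexity is equivalent to the Hessian being positive semi-definite everywhere. Numerically you can spot-check eigenvalues at sample points, which can disprove convexity but never prove it:

    import numpy as np

    def hessian(w):
        # Hessian of the made-up f(w) = w[0]**4 + w[0]*w[1] + w[1]**2
        return np.array([[12.0 * w[0]**2, 1.0],
                         [1.0,            2.0]])

    for w in [np.array([0.0, 0.0]), np.array([1.0, 0.0])]:
        eigs = np.linalg.eigvalsh(hessian(w))
        print(w, eigs, bool((eigs >= -1e-12).all()))   # PSD at this point?

Here the check fails at the origin (one eigenvalue is 1 - sqrt(2) < 0), so this f is not convex even though its Hessian is positive definite elsewhere. Certifying convexity requires an argument valid for all w, e.g., showing the Hessian is PSD symbolically or building the function from convex pieces via composition rules.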

@meghnashankr9340

Just saved me hours of digging through books to understand these concepts...

@analyticstoolsbyhooman6963

I'd heard a lot about Cornell, and now I understand why.