Starts at 1:22
Nice lecture! Helped me a lot!
Thanks a lot for the video. Please upload other courses offered by you if they are recorded. Would love to learn from teachers like you.
Sir, I can't explain in words how helpful your videos are and how easy they are to understand. People like you are true heroes of humanity and science. Thank you so much. I have one question: these days machine learning seems to be limited to deep learning using neural networks. Do people still use ML (maximum likelihood), MAP, or Naive Bayes?
Great lecture, thanks Professor! I'm supplementing the online certificate with the deeper math and concepts here.
Magnificent!
Sir, did you find out where the handouts went? (10:30)
Brilliant
Great lecture!
Learned lots from it!!!
Thanks for this amazing lecture 👍👍
awesome!!
Amazing!
I believe that in practical implementations, a poorly conditioned Hessian can cause huge numerical errors while converging, ultimately rendering second-order methods like Newton's very unreliable.
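For anyone curious, here is a minimal NumPy sketch of that failure mode (my own toy example, not from the lecture): one exact Newton step on a quadratic whose Hessian is a Hilbert matrix, a classically ill-conditioned case. In exact arithmetic the step lands exactly on the minimizer; in floating point the linear solve can lose nearly all of its significant digits.

```python
import numpy as np

n = 12
# Hilbert matrix: a classic ill-conditioned matrix (condition number ~1e16 for n = 12).
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

# Quadratic objective f(w) = 0.5 w^T H w - b^T w with a known minimizer w_star,
# constructed so that grad f(w_star) = H w_star - b = 0.
w_star = np.ones(n)
b = H @ w_star

w0 = np.zeros(n)
grad = H @ w0 - b
newton_step = np.linalg.solve(H, grad)  # Newton system: solve H d = grad
w1 = w0 - newton_step                   # exact arithmetic would give w1 == w_star

print(f"cond(H) = {np.linalg.cond(H):.2e}")
# The error is typically many orders of magnitude above machine precision.
print(f"error after one Newton step = {np.linalg.norm(w1 - w_star):.2e}")
```

This is essentially why practical implementations damp the Newton step or regularize the Hessian (e.g. adding a small multiple of the identity) instead of trusting the raw solve.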
Why was an approximation of l(w) (using a Taylor series) used at 13:20?
At 39:53 the example seems a little bit off to me. I would expect Newton's method to converge in a single step (the function looks like a quadratic).
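For reference, the one-step claim for an exact quadratic is easy to verify (a standard derivation, using generic symbols $A$, $b$ rather than the lecture's notation). For $\ell(w) = \frac{1}{2} w^\top A w - b^\top w$ with $A$ positive definite, the gradient is $\nabla \ell(w) = A w - b$ and the Hessian is $\nabla^2 \ell(w) = A$, so one Newton step from any starting point $w_0$ gives

$$
w_1 = w_0 - A^{-1}(A w_0 - b) = A^{-1} b,
$$

which is the exact minimizer. So if the plot at 39:53 takes multiple steps, the function is presumably not exactly quadratic (or the steps are damped).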
At around 23:00, how do we check if the function is convex?
Just saved me hours of digging through books to understand these concepts...
Heard a lot about Cornell, and now I see why.