
You do not need any matrix calculus for deep learning, and micrograd (https://github.com/karpathy/micrograd), which implements backpropagation for neural nets in about 100 lines of code, is proof. Everything else is just vectorization.
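To make the point concrete, here's a minimal sketch of scalar reverse-mode autodiff in the spirit of micrograd: just `+`, `*`, and the chain rule, no matrix calculus anywhere. The class and method names are illustrative, not micrograd's exact API.

```python
class Value:
    """A scalar node in a computation graph, carrying its value and gradient."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # local chain-rule step, set by each op
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply each node's
        # local chain-rule step in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# z = x*y + x, so dz/dx = y + 1 and dz/dy = x
x, y = Value(3.0), Value(4.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Stacking many of these scalars into arrays and replacing the loops with matrix ops is the "vectorization" part; the calculus itself never gets past single-variable derivatives and the chain rule.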


Yes, you'll probably need more information theory and statistics than linear algebra. The focus seems to be in the wrong place.




